


Thank you for being a part of the global VFX Voice community!
Coming out of a much-anticipated awards season, this issue of VFX Voice shines a light on the 23rd Annual VES Awards gala. Congratulations again to all of our outstanding nominees, winners and honorees!
Our cover story takes us inside the stop-motion animated Memoir of a Snail. We share a behind-the-scenes look at The Electric State with the Russo brothers and delve into trends around retro effects, real-time VFX, the unsung technical heroes of the pipeline and the rise of high-quality indie film VFX. We highlight the latest in LED volumes, virtual production and Gen AI in schools. And circling the globe, we deliver a special focus on emerging VFX markets in Europe, South Africa and the Middle East… and we put our New Zealand Section in the spotlight.
Share the joy with our VES Awards photo gallery capturing the festive celebration that awarded outstanding artistry and innovation in 25 categories and honored Virtual Reality/Immersive Technology Pioneer Dr. Jacquelyn Ford Morie with the Georges Méliès Award, Golden Globe-winning actor-producer Hiroyuki Sanada with the VES Award for Creative Excellence, and Academy Award-winning filmmaker and VFX Supervisor Takashi Yamazaki with the Visionary Award.
Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible and advance the field of visual effects.
Cheers!
Kim Davidson, Chair, VES Board of Directors
Nancy Ward, VES Executive Director
P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X at @VFXSociety.
8 VIRTUAL PRODUCTION: REAL-TIME VFX
How real-time is impacting VFX workflows and pipelines.
18 VIRTUAL PRODUCTION: LED VOLUMES
VFX supervisors help provide a bridge to LED volumes.
24 COVER: MEMOIR OF A SNAIL
Filmmaker Adam Elliot advances traditional stop-motion.
30 VFX TRENDS: RETRO EFFECTS
Striking a balance between in-camera and digital effects.
36 FILM: THE ELECTRIC STATE
The Russo brothers create a new interpretation of the 1990s.
42 EDUCATION: VFX & NEW TECH
VFX/animation schools sharpen their focus on VP and AI.
46 PROFILE: TAKASHI YAMAZAKI
Recipient of the VES Visionary Award.
47 PROFILE: DR. JACQUELYN FORD MORIE
Recipient of the VES Georges Méliès Award.
48 THE 23RD ANNUAL VES AWARDS
Celebrating excellence in visual effects.
56 VES AWARD WINNERS
Photo Gallery.
62 VFX TRENDS: UNSUNG HEROES
Technical directors reveal their complex role in the pipeline.
70 SPECIAL FOCUS: EMERGING MARKETS
VFX in Eastern Europe, South Africa and the Middle East.
78 VFX TRENDS: INDIE VFX
Indie films impress with remarkable effects on a limited budget.
84 TECH & TOOLS: VFX SOFTWARE
Catching up with some essential tools shaping the industry.
2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION SPOTLIGHT – NEW ZEALAND
94 VES NEWS
96 FINAL FRAME – STOP MOTION
ON THE COVER: Grace Pudel is a melancholy young woman who collects snails in Adam Elliot’s stop-motion Memoir of a Snail (Image courtesy of Arenamedia Pty Ltd. and IFC Films)
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR
Ross Auerbach
CONTRIBUTING WRITERS
Naomi Goldman
Trevor Hogg
Chris McGowan
Barbara Robertson
Oliver Webb
ADVISORY COMMITTEE
David Bloom
Andrew Bly
Rob Bredow
Mike Chambers, VES
Lisa Cooke, VES
Neil Corbould, VES
Irena Cronin
Kim Davidson
Paul Debevec, VES
Debbie Denise
Karen Dufilho
Paul Franklin
Barbara Ford Grant
David Johnson, VES
Jim Morris, VES
Dennis Muren, ASC, VES
Sam Nicholson, ASC
Lori H. Schwartz
Eric Roth
Tom Atkin, Founder
Allen Battino, VES Logo Design
VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Kim Davidson, Chair
Susan O’Neal, 1st Vice Chair
David Tanaka, VES, 2nd Vice Chair
Rita Cahill, Secretary
Jeffrey A. Okun, VES, Treasurer
DIRECTORS
Neishaw Ali, Fatima Anes, Laura Barbera
Alan Boucek, Kathryn Brillhart, Mike Chambers, VES
Emma Clifton Perry, Rose Duignan
Dave Gouge, Kay Hoddy, Thomas Knop, VES
Brooke Lyndon-Stanford, Quentin Martin
Julie McDonald, Karen Murphy
Janet Muswell Hamilton, VES, Maggie Oh
Robin Prybil, Lopsie Schwartz
David Valentin, Sean Varney, Bill Villarreal
Sam Winkler, Philipp Wolf, Susan Zwerman, VES
ALTERNATES
Fred Chapman, Dayne Cowan, Aladino Debert, John Decker, William Mesa, Ariele Podreider Lenzi
Visual Effects Society
5805 Sepulveda Blvd., Suite 620 Sherman Oaks, CA 91411 Phone: (818) 981-7861 vesglobal.org
VES STAFF
Elvia Gonzalez, Associate Director
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Eric Bass, MarCom Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Mark Mulkerron, Administrative Assistant
Shannon Cassidy, Global Manager
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
By TREVOR HOGG
TOP AND BOTTOM: Chaos is developing real-time software programs such as the ray tracing renderer Vantage and Arena, which does ray tracing for in-camera effects. (Images courtesy of Chaos)
OPPOSITE TOP TO BOTTOM: The introduction of full ray tracing to the virtual production process removes the need for rasterized rendering. Source: Ray Tracing FTW. (Image courtesy of Chaos)
An ambition for real-time visual effects is to have the ability to visualize, explore and iterate quickly without closing the door on the visual effects team finishing it off to get the final image. Previs from The Witcher Season 3. (Image courtesy of Cinesite and Netflix)
Real-time is most useful at the concepting stage. (Image courtesy of Vū Technologies)
Virtual production could not exist without real-time rendering, customarily associated with game engines such as Unreal Engine and Unity. Still, real-time technology is also impacting workflows and pipelines constructed to produce visual effects on a daily basis. As the tool is refined to become more cinematically proficient, new challenges and opportunities have emerged for visual effects artists and production teams. “My first job in the visual effects industry was working on Star Wars: Episode I – The Phantom Menace, the first movie to do previs,” recalls Kevin Baillie, Vice President and Head of Creative at Eyeline Studio. “Our real-time capabilities back then were quite limited, but now fast forward to where we have these images that can look near to final quality in real-time. Not just previs, but a virtual art department to build set designs whether we’re looking at them through a camera, VR goggles or any other means. These incredibly powerful tools allow a filmmaker to accelerate some of the physical process, start it digitally and iterate on it quickly before we get into the tedious, expensive physical phase. When I worked with Robert Zemeckis on Pinocchio, we previs’d the entire movie. As we were shooting it, we did real-time on-set composites of the scenes that involved live-action, laid down cameras for everything that was a fully virtual shot, then those cameras went into the visual effects post-production process. We made the movie three times using these real-time technologies, and that iteration helped Zemeckis narrow down exactly what he wanted.”
Unreal Engine became the answer when pandemic restrictions meant that not everyone could go into the same vehicle together to scout locations for The Handmaid’s Tale. “I would go out, scan the locations, rebuild them in Unreal Engine, and we would walk through in sessions,” recalls Brendan Taylor, President & VFX Supervisor at Mavericks VFX. “I like to say that we are making a game called ‘Let’s make a movie.’ What’s awesome about that is you can create all the rules for this world. The thing about a game is you need to be able to see it from all angles and be able to change things on the fly. When we’re working in film, we’re dealing with what’s
here and in the camera.” Virtual scouting led to some discoveries that Elisabeth Moss applied when directing her first episode of The Handmaid’s Tale. Taylor explains, “What we were able to do was build the set on the bluescreen stage from the plans, sit with a monitor on a little handheld rig [in our screening room] and explore the space with Elisabeth. She tried things out with just me, Stuart Biddlecombe [Cinematographer] and Paul Wierzbicki [Unreal Engine Specialist]. Elisabeth said, ‘There’s something missing. We’re so monochrome.’ Paul responded, ‘Sometimes these buildings have red lights on them.’ He quickly put a flashing red light in the corner, and it changed the tone of the scene to give it this devilish look. It made this guy pushing women off of the roof even more menacing. We would have never known until we lived within this game we had created. For me, that was a real a-ha moment where it became collaborative again.”
Simplification is taking place when it comes to game engines and real-time. “We don’t have enough people who know Unreal Engine to drive a virtual production because it’s such a beast of a software that has been in development forever,” observes Jason Starne, Owner, SHOTCALLER and Director of Virtual Production for AMS Pictures. “We need some simplified things, and that’s what we are starting to see with what companies like Chaos are doing. They’re building something that allows you to have a 3D world scene that is truly a real-time path tracer, and the path tracer gives the best quality you can out of a rendered image. Real-time is an aspect of the pipeline. It’s a tool just like virtual production is another toolset a studio would have.” Misconceptions are an issue. “The con is that the marketing has made even our clients believe this is easy to do and can be achieved without a whole lot of work going into it. In real life, we have to put work into it and make or build things in a way where we can get speed out of it. It’s not just going to be real-time because it’s coming out of Unreal Engine. It could be, but it will look like crap. How do we get the quality versus the speed that we need?”
TOP TO BOTTOM: The mantra for Vū Technologies is ‘content at the speed of thought,’ which they believe will be the next evolution of communication. (Image courtesy of Vū Technologies)
Real-time allows digital artists to iterate way faster, which means more options for clients. Scene from Sweet Tooth. (Image courtesy of Zoic Studios and Netflix)
Real-time has shifted the involvement of Zoic Studios toward the front end of the production, resulting in far less in the back end. Scene from The Sympathizer. (Image courtesy of Zoic Studios and HBO)
The Chaos Group is developing real-time software programs, such as the ray tracing renderer Vantage and Arena, which does ray tracing for in-camera effects. “For us, Arena is an extension of the camera that the DP already has, and as long as the DP can talk to the people who are running the stage, like to a grip or camera operator, then we’re in good shape,” remarks Christopher Nichols, Director of Chaos Labs at the Chaos Group. “We looked at what they needed to do to get the correct video on the LED walls. Essentially, we needed a system that synchronizes renders across multiple nodes and can track a camera so you can get the correct parallax. That’s the fundamental thing we added to Vantage, enabling it to become an in-camera effect solution. By introducing full ray tracing to the process that removes the need for rasterized rendering, you can make a better duplicate of the camera and don’t need to optimize your data or geometry in the same way that you need for video games. Almost everything that is done in post-production uses full ray tracing, either V-Ray or Arnold. That massively cuts down on how much time and energy is used to put the CG elements behind people because it’s the same asset for everything. The virtual art department can focus on compositing the shot correctly or creating the right environment and not on, ‘How do I remake this to work for a game engine?’”
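Chaos has not published Vantage’s internals, but the camera-tracked parallax Nichols describes rests on a well-known piece of math: a generalized off-axis projection that rebuilds the view frustum every frame from the tracked camera position and the physical corners of the LED panel. Below is a minimal sketch of that calculation in Python with numpy; the function name and corner convention are illustrative, not Chaos’s API.

    import numpy as np

    def led_panel_frustum(pa, pb, pc, pe, near=0.1):
        """Off-axis ("generalized perspective") projection extents for an LED
        panel with corners pa (lower-left), pb (lower-right), pc (upper-left),
        seen from the tracked camera position pe. As the camera moves, the
        frustum shears so the wall shows the correct parallax in-camera."""
        vr = (pb - pa) / np.linalg.norm(pb - pa)          # panel right axis
        vu = (pc - pa) / np.linalg.norm(pc - pa)          # panel up axis
        vn = np.cross(vr, vu)
        vn /= np.linalg.norm(vn)                          # panel normal, toward camera

        va, vb, vc = pa - pe, pb - pe, pc - pe            # camera -> corner vectors
        d = -np.dot(va, vn)                               # camera-to-panel distance

        # Frustum extents on the near plane (glFrustum-style l, r, b, t).
        left   = np.dot(vr, va) * near / d
        right  = np.dot(vr, vb) * near / d
        bottom = np.dot(vu, va) * near / d
        top    = np.dot(vu, vc) * near / d
        return left, right, bottom, top

    # Example: a 10m x 5m wall, camera 4m back and 1m left of center.
    if __name__ == "__main__":
        pa, pb = np.array([-5., 0., 0.]), np.array([5., 0., 0.])
        pc, pe = np.array([-5., 5., 0.]), np.array([-1., 2., 4.])
        print(led_panel_frustum(pa, pb, pc, pe))

Because the frustum shears rather than rotates as the camera moves, the wall always renders the scene from the camera’s true vantage point, which is what makes the background parallax hold up in-camera.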
More options have become available to be creative. “We’re seeing concepts emerge now that would have been nearly impossible without the use of real-time tools to plan and execute, like digital twins, which are changing the game for creators, especially when budget and ambition are both high and there’s no room for miscommunication,” states Brian Solomon, Creative Technology Director at Framestore. “Another area advancing rapidly revolves around how we utilize characters. Real-time allows us to previs and utilize dynamic 3D characters earlier in feature film production, especially with character-driven live-action pictures. Similarly, there are now advantages coming
from production-grade real-time variants of characters. These are benefiting larger brands and animated IP owners, as a host of new formats are emerging that allow these characters to interact with the world in ways they couldn’t prior and at turnaround speeds not hitherto possible. Real-time overall is broadening the horizon for characters.”
Real-time technology is positively transforming production pipelines. “In the traditional visual effects world, it is allowing for faster iterations which enable additional exploration of creative options,” notes Paul Salvini, Global Chief Technology Officer at DNEG. “These advances are most critical in areas like animation and creature and character effects [such as the simulation of muscle, skin, hair, fur and cloth]. In cases where the final output from real-time solutions needs further processing, seamlessly connecting real-time and non-real-time tools becomes critical. The role of artists doesn’t fundamentally change, but the tools will allow a more interactive workflow with better feedback. Real-time visual effects are also transforming more areas of production than ever before from previs through final render.” Audience members are getting to enjoy even more immersive and interactive experiences. Salvini remarks, “Some recent live and virtual concert experiences have done a great job of bringing together the best of the real and computer-generated worlds to deliver experiences never before possible for audiences, such as allowing a current artist’s performance to be mapped visually onto their younger selves.”
Storytelling and being able to present clients with the best possible imagery are the main technological goals for Sony Pictures Imageworks, which meant figuring out how to get close to real-time with their GPU renderer Arnold. “The more the client is educated with real-time and sees what the studios are doing, the more they want you to push the envelope,” states Gregory Ducatel, Executive Director, Software Development at Sony Pictures Imageworks. “The magic you get when you work
TOP TO BOTTOM: The visual effects pipeline at Zoic Studios has always been modular. Scene from The Boys. (Image
Technology is an ecosystem that is constantly evolving because of innovation. (Image courtesy of Vū Technologies)
Real-time visual effects are here to stay because they are the best way to get feedback from clients or collaborators. Composite from 9-1-1. (Image courtesy of Zoic Studios and ABC)
with good creatives, clients and technology is that the creativity of those people jumps. It’s crazy. Currently, if you go outside of Unreal Engine, the quality of the imagery drops, and then with lighting, it goes back up; that was not acceptable for us because artists lose the context of their work, and the creatives don’t like that. This is why Spear [Sony Pictures Imageworks’ version of the Arnold Renderer] was brought to the table. How can we always have the highest quality possible at each given step but never go back to the previous one?” The feature animation and visual effects applications are somewhat different; however, the principles remain the same. “We always want better quality, more iterations. We don’t want to wait for notes and for the artists to do something, then go back to notes. If you can do that in real-time, the artist can move forward, and it’s exactly what you want,” states Ducatel.
Real-time visual effects are here to stay. “People who don’t see that real-time is where we all should go are stuck in the past,” believes Julien Brami, VFX Supervisor & Creative Director at Zoic Studios. “There is time for finishing and concepting; all of these take time, but when we need the interactivity and get feedback, whether from clients or collaborators, real-time is the best tool. Real-time allows us to iterate way faster, and faster means more options. Then you can filter what is working. Instead of saying ‘no’ to a client, now you have an opportunity to work with them. There are more iterations, but it’s less painful to iterate.”
The pipeline is evolving. Brami says, “The visual effects pipeline at Zoic Studios has always been modular. We try to make the pipeline procedural so it can be crafted per show and be more efficient. Real-time has shifted our involvement toward the front end of the production, and we have way less in the back end. With a traditional pipeline we would have a bluescreen or greenscreen
TOP TO BOTTOM: The more the client is educated with real-time and sees what the studios are doing, the more they want the envelope pushed. Scene from K-Pop: Demon Hunters. (Image courtesy of Sony Pictures Animation and Sony Pictures Imageworks)
Real-time is allowing the utilization of dynamic 3D characters earlier in the process of feature film production, especially with character-driven live-action pictures. Scene from Paddington in Peru. (Image courtesy of Framestore and Columbia Pictures)
Three years ago, it was all about using game engines for real-time, but with the advances in generative AI, people are doing things even more instantly. (Image courtesy of Vū Technologies)
and have to key everything; all of that would have been at the tail end, which is usually more stressful.”
Technology is constantly advancing along with the growth of expectations. “Virtual production, machine learning and real-time rendering engines; all of these have been around for decades,” observes Mariana Acuña Acosta, SVP Global Virtual Production and On-Set Services at Technicolor. “It’s not like it just happened overnight. What has continued to advance is our computing power. I can’t even comprehend how we’re going to be able to maintain all of the machine learning and AI with these new generational GPUs. What has pushed these advancements forward has been virtual production, cloud workflows, machine learning, AI and the game engines themselves.” To avoid obsolescence, hardware has to be constantly updated. “It’s costly for a studio to be constantly updating hardware. Maybe at some point, you get a project or want to create your own project and realize you don’t have enough hardware to go and run with it. That’s when the cloud comes in, as you can scale and have the best spec machines. This is crucial because then the cloud service providers are the ones that have a lot of resources to go around when it comes to RAM and GPUs.”
Rendering improves with each new release of Unreal Engine and Unity. “Advances in real-time rendering, such as virtualized geometry with Unreal Engine’s Nanite, have significantly reduced the time required to optimize assets for real-time performance while enhancing their visual fidelity,” observes Dan Chapman, Senior Product Manager, Real-Time & Virtual Production at Framestore. “Looking ahead, Gaussian Splatting is setting a new standard for photorealism in real-time applications. By moving away from traditional polygon-based 3D models and building on Neural Radiance Fields [point clouds that encode light information], Gaussian Splatting offers a more efficient and accurate approach to rendering complex, photorealistic scenes in real-time.” Real-time visual effects have raised the expectations of audiences when it comes to immersive, interactive and personalized experiences.
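To make Chapman’s point concrete: once the 3D Gaussians have been projected to the screen and sorted by depth, the per-pixel work in Gaussian Splatting is a standard front-to-back alpha blend, which is why it runs in real-time. A toy single-pixel version in Python follows; the production rasterizers do this massively in parallel on the GPU.

    import numpy as np

    def composite_pixel(colors, alphas):
        """Front-to-back 'over' compositing used by 3D Gaussian Splatting:
        C = sum_i c_i * a_i * prod_{j<i}(1 - a_j), splats sorted near-to-far."""
        color = np.zeros(3)
        transmittance = 1.0                      # how much light still gets through
        for c, a in zip(colors, alphas):
            color += transmittance * a * np.asarray(c, dtype=float)
            transmittance *= 1.0 - a
            if transmittance < 1e-4:             # early out once the pixel is opaque
                break
        return color

    # Two splats: a red one in front at 60% opacity over a blue one behind.
    print(composite_pixel([(1, 0, 0), (0, 0, 1)], [0.6, 0.9]))  # -> mostly red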
Chapman remarks, “Technologies like augmented reality, virtual reality and projection mapping allow attractions to respond to guest movements and decisions in real-time, creating personalized storylines and environments that feel unique to each visitor. This shift is also taking place online, where audiences are actively participating in experiences in a way that they can shape and share with others. This is particularly evident in platforms like Fortnite and Roblox, where users engage in live events, socialize with friends and collaborate on creative projects.”
Sometimes, real-time solutions slow down to the speed of a traditional visual effects renderer. “It can go in the wrong direction if you’re pushing it too far,” notes Richard Clarke, Head of Visualization & VFX Supervisor at Cinesite. “I’m curious if we can evolve this two-stage process where you can visualize, explore, iterate quickly, and have a good idea of what your end product is going to be, but still not closing the door on allowing the visual effects team to finish it off or push it to the cloud for higher processing. What you get back is closer to a final version. One little wrinkle at the moment is the various render passes that the visual effects team will be utilizing can’t be replicated as easily. The more AOVs [Arbitrary Output Variables] you’re pushing out, the more you’re going to slow down the real-time. Postvis is a real melding of real-time technology and visual effects pipeline workflows. The nice thing about postvis is it’s not an end product. We’ve got a little trick where we make a beautiful scene in Arnold, bake all of the lights and textures, output shots in minutes direct from Maya and go straight into comp. They almost look final. That’s pre-packaging things. Game engines
pre-capture a lot of their lighting to make real-time. That’s where you can save on a lot of processing. The more I use real-time technology, the more I think it’s going to be a cornerstone of everything. Autodesk showed us a beta version of Unreal Engine in Maya. I got excited about that because we’ve been doing it the other way around. Having Unreal Engine in your viewport was like a hallelujah moment for me because most visual effects artists are Maya-centric at the moment.”
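Clarke’s Arnold bake and the light pre-capture he describes in game engines share one principle: pay for lighting once, offline, and store the result so playback is just a lookup. The following is a toy illustration of that split, assuming simple Lambertian point lights rather than anything from Cinesite’s actual pipeline.

    import numpy as np

    def bake_lighting(points, normals, lights):
        """Offline step: evaluate (potentially expensive) lighting per sample
        point once and store it, e.g. as a lightmap texture."""
        baked = np.zeros(len(points))
        for light_pos, intensity in lights:            # point lights
            to_light = light_pos - points
            dist = np.linalg.norm(to_light, axis=1)
            n_dot_l = np.einsum('ij,ij->i', normals, to_light / dist[:, None])
            baked += intensity * np.clip(n_dot_l, 0.0, None) / dist**2
        return baked

    def shade(albedo, baked):
        """Runtime step: shading collapses to a multiply with the stored bake,
        which is why pre-lit scenes play back in real-time."""
        return albedo * baked[:, None]

    points  = np.array([[0., 0., 0.], [1., 0., 0.]])
    normals = np.array([[0., 1., 0.], [0., 1., 0.]])
    baked   = bake_lighting(points, normals, [(np.array([0., 2., 0.]), 10.0)])
    print(shade(np.array([[0.8, 0.8, 0.8], [0.5, 0.2, 0.1]]), baked))

The trade-off is the one Clarke notes: baked lighting is fixed, so anything that must change after the bake goes back through the slower path.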
As with nature, technology is an ecosystem. “What we’re seeing right now at the top level is the merging of many new innovative technologies,” states Tim Moore, CEO of Vū Technologies. “Three years ago, it was all about using game engines for real-time, and with the advances in generative AI, you now see people doing things even more instantly. The merging of those two is interesting; to be generative inside a 3D environment where you have all the perspectives and control.” Real-time is most useful at the concepting stage. “For people who have simple thoughts and want an extravagant output, AI is amazing because you can give it a little and the AI will fill in the rest. For people who have a specific vision and want it to come to life, AI becomes challenging because you have to figure out how to communicate to this thing in a way where it sees what you see in your head, and you have to use words to do that.” The future can be found in the mantra of Vū Technologies. Moore comments, “The vision for our company is ‘content at the speed of thought,’ and to me that is the next evolution of communication. Encoding and decoding language into sounds and words is an inefficient way to communicate, whereas the ability to use visuals as a communication layer is the most universal language in the world. Everyone perceives the world in a visual way. That ability to make visuals at the speed of thought is the big evolution of storytelling we will see in the next 10 years.”
By TREVOR HOGG
TOP: Preparing for a virtual production shoot of a Vertibird featured in Fallout. (Image courtesy of All of it Now)
OPPOSITE TOP TO BOTTOM: LED walls are beneficial for rendering content for backgrounds but often fall short as a lighting instrument. (Image courtesy of Disney+ and Lucasfilm Ltd.)
Limitations still exist regarding how much you can put on the LED wall in terms of computational power. (Image courtesy of Dimension Studio, DNEG and Apple TV+)
Westworld Season 4 made use of virtual production technology to expand the scope of the world-building. (Image courtesy of Technicolor and HBO)
Has virtual production revolutionized filmmaking, beginning with The Mandalorian in 2019, and accelerated by the COVID-19 pandemic a year later? The answer is ‘no,’ but the methodology has become an accepted alternative to bluescreen and greenscreen. Even though technology continues to advance rapidly, some things have remained the same. “It’s a mixed bag,” states Matt Jacobs, VFX Supervisor. “What’s on my mind now when talking to people is building brick-and-mortar facilities. There was a project constructing a backlot in France, and I asked, ‘Did you set up an LED volume because you’ve sunk a lot of money into this?’
And they’re like, ‘No, because every time we do an LED volume, it seems that the ask is different for what the volume needs to do.’ Everybody comes in and says, ‘I need it for process shots for cars.’
Or, ‘I’m doing playback, and I need the volume to be this size and configuration.’ The ability to pop up a volume, be flexible and build the volume out to case-specific specs seems to be the way to go these days.”
Companies like Magicbox offer a tractor-trailer studio setup. “The pop-up trailer is an interesting thing, but you also have to look at that as a set configuration,” Jacobs notes. “Yes, it’s mobile, but it’s what the tractor trailer looks like. Do you need a volume that is semicircle? Do you need the ceiling, or is that lighting?
How are you going to work a volume with a known configuration of width and height? Is it squared-off walls or a circular volume? Does it have ceiling panels that you need for reflections in a car? How are those ceiling panels configured? I was on a Netflix shoot, and we had this great volume at Cinecittà Studios outside of Rome. It was a cool setup and a big stage. The floor was a Lazy Susan, so it actually spun around. The ceiling was great, but because the tiles didn’t line up perfectly, there were lines and seams across the car
where there were no reflections. We had to bring in walls to do fill reflection on the front of the car. We had to do a lot of work to reconfigure that stage and bring in certain elements. Thankfully, they were nimble and had a lot of great pieces and solutions for us to work with. But it goes back to the point that the stage was probably too big for certain things, and maybe it wasn’t perfect for our car shoot.”
Generally, people think that virtual production is synonymous with the LED volume. “I think virtual production is anytime that you’re using real-time technologies in conjunction with normal production,” remarks Ben Lumsden, Executive Producer at Dimension Studio. “The biggest single change is you can push a lot more through Unreal Engine. You’ve got a whole suite of tools specifically addressing LED volume methodologies. There’s the Switchboard app and Level Snapshots that allow you to go back to a period of time when there was that particular load on the volume and understand exactly where everything was, which animation was where and what the lighting setup was. On Avatar, James Cameron would get so frustrated because everything was done using MotionBuilder. Cameron would return to post-production after being on set, and all the creative changes he made on the day got lost in translation through the pipeline.” MegaLights from Unreal Engine 5.5 is a huge step forward. Lumsden says, “Beforehand, it was geometry, which was too expensive. But then Nanite came along with Unreal Engine 5, meaning geometry was no longer an issue. Our experiments with MegaLights so far suggest that lights will no longer be an issue.”
Limitations still exist regarding how much you can put on the LED wall in terms of computational power. “You don’t want to drive too many MetaHumans, for instance, but you can put loads
of volumetrically-captured people and make sure that their card is pointed back to the camera or their rendered view is relative to the position of the camera,” Lumsden notes. “One thing that we did that was cool regarding R&D is marrying our performance-capture technology with the LED virtual production. We’ve been doing some tests where we can actually drive MetaHumans on the wall as digital extras being live-puppeteered on a mocap stage and interacting with the real talent; that’s a new technology or workflow that we may well bring into production going forward.” Sound remains problematic. “There is a real issue with capturing audio because you’ve got this big echo chamber. There are some fantastic new LED panels coming out all of the time. But the great new panels are always expensive. Over time, that will change, as with all of these things. There are also some new and interesting technologies of people doing projector-based methodologies, which are intriguing because the price point is more applicable to indie filmmakers.” Interest rates have made productions more cost-conscious and less adventurous. “The early stories of the volume being a cost-saving mechanism put volume shoots at a disadvantage because producers came in expecting to see a 10x savings in cost or whatever number they had in mind, and it’s dramatic but not that dramatic,” observes Danny Firpo, CEO & Co-Founder of All of it Now. “Now, people are realizing what the volume does well, which are process shoots for vehicles or being able to create a lot of environments in a short amount of time or being able to move the environment around talent.” Hardware and software have greatly improved. “Cheap graphics cards are increasing in power at an expansive rate, helping to keep the dream of a real-time holodeck-style volume within arm’s reach. The quality of real-time graphics is increasing exponentially, and the time it takes to create those real-time environments is decreasing due to the impressive tools that have come out on the software side. Nanite and some of the impressive tools that have come out from Unreal Engine 5.3 and all of the way up to 5.5 are creating a much better environment for artists to create the best version of what they can possibly create now. In addition, we’re seeing a better understanding across the board of LED and camera providers and even lighting vendors
of what types of equipment flourish in an LED volume environment as opposed to trying to take live show or film rental inventory and cramming it into the volume, which we saw in the volumes during the pandemic.”
One particular department head remains central in being able to understand and communicate the capabilities of the LED volume to other members of the production team. “The visual effects supervisor is an ideal bridge because they already exist in this hybrid or mixed reality of 2D and 3D, real-time, physical and digital environments colliding to create the finished product,” Firpo states. “That type of thinking is more challenging for somebody from a different department, like Art, Camera or Lighting, who is only used to dealing with one physical reality in a real-world space. What we have discovered is specialists are emerging in those departments who have a real understanding of that and are willing to take an extra day and pre-light or go through a virtual scout and ultimately help explore those worlds more and use the same mentalities of what they would do in a physical scout.” An effort has been made to make the virtual production process more intuitive for the various departments. Firpo notes, “We’re moving all of the extraneous tools and features that we deal with and making a simplified UI. For example, giving a DP doing a virtual location scout using an iPad, which is ubiquitous on set, a sense of a rigged virtual camera, which feels like operating a physical one but is essentially a digital portal into that world. Getting that buy-off and sense of translation from the physical into the digital world and vice versa is where it’s helped bridge that communication and culture gap.”
LED walls are great for rendering content for backgrounds but often fall short as a lighting instrument. “LED volumes have a limited brightness, and the light spreads out, so you can’t create harsh shadows,” notes Lukas Lepicovsky, Director of Virtual Production at Eyeline Studios. “They’re also not full spectrum light. LED walls are only RGB instead of RGBW Amber like you would get from an on-set light. You can maybe use the LED wall as fill light, but then you definitely want to be working with on-set lighting for the actual key light.” Virtual production excels with
Technicolor, in cooperation with the American Society of Cinematographers, conducts an in-camera visual effects demo. (Image courtesy of Technicolor)
Virtual production has not revolutionized filmmaking, but the methodology has become an accepted alternative to bluescreen and greenscreen. (Image courtesy of Technicolor)
TOP TO BOTTOM: The quality of real-time graphics is increasing exponentially, and the time it takes to create those real-time environments is decreasing. (Image courtesy of All of it Now)
Those About to Die was shot on the LED volume stage at Cinecittà Studios in Rome. (Image courtesy of Dimension Studio, DNEG and Peacock)
Interest rates have made productions more cost-conscious and less adventurous. (Image courtesy of All of it Now)
short turnaround projects such as commercials because all the decisions are made upfront. “If you’re a massive visual effects project, then you’re probably going to want to lean on it more for lighting capabilities, like projecting an explosion that lights up the actor’s face in a nice way, but then leave yourself room in visual effects to augment the background with giant building destruction. This is what we ended up doing with Black Adam. We made the wall near final, or in some cases just a previs in the background that had good lighting, which had explosions and lightning elements. We used it as a lighting instrument, knowing we would replace the background afterward. It depends on the production because, in those cases, you don’t always know what your final asset looks like while you’re shooting a large feature production. Because it’s a real-time process, you have constraints of polygon budget and render time, so you can’t just fill the world with all sorts of assets. You have to have strong planning when it comes to these things.”
Game engines have been a game-changer and are constantly improving. “Where it can stand to improve still is the integration of some visual effects technology like USD and the ability to quickly share assets between departments and make layered, modifiable changes in the pipeline,” Lepicovsky remarks. “Also, over time, we’ve seen this with visual effects; things started from a rastering approach, and eventually everything turned into ray tracing. So, I’m excited to see that there are also ray tracing possibilities in real-time that are coming forward both from Epic Games and Chaos Vantage, a new entrant in the virtual production market.” It is still too early to judge the impact of machine learning on virtual production. Lepicovsky adds, “There are machine learning tools that generate the backgrounds, but right now, they often want nice animation with all the leaves blowing and trees swaying; that is easier to do in actual game assets. Machine learning has been interesting for us in a new process called Gaussian Splatting, which is like a new version of photogrammetry based on a machine learning process. What is different from traditional photography is that you can have reflective and see-through surfaces and capture hair. Another interesting one involves a relighting process that allows you to capture actors in neutrally-lit lighting conditions, like volumetric capture, but then change the lighting afterwards using machine learning.”
“The LED panel is excellent because it’s an incredibly high output, so people like to use it for the lighting, and companies like ROE Visual are adding additional colors into the diode cluster to get better skin tones,” remarks Jay Spriggs, Managing Partner at Astra Production Group. “But that’s not going to replace a conventional lighting instrument. We know people who are researching projection in volumes because the cost to run that is much lower, and you also have additional benefits. For LEDs, the diodes light up and shoot light out, whereas, in a projection-oriented environment, they are reflective, so you have a different quality of light and mixing, which comes from that. The Light Field Lab stuff is fascinating. I don’t want to even think about what the volume would cost for that!” The
central question is, how do you help with what is happening in the frame? “From there, you reverse engineer that into what products are not just the best for what’s going to happen but also the most money-efficient so that they have enough money to bring in their people.” The most cost-effective way is projecting plate photography, as there are so many more complications with real-time tracking, says Spriggs. However, Unreal Engine is making major strides with a new grading workflow. “That is going to be huge for making better pictures out of the game engine because one of the biggest things has always been: how do you do a final polish pass on what is already a good lighting engine but is not perfect?”
Not everything gets treated the same way. “If Greig Fraser [Cinematographer] wants to get the highest quality lighting effect for the best skin tone, but we’re only doing a couple of tight shots, and he has a generous post budget, then we look at the background of the LED,” Spriggs explains. “We build it with the highest quality LED with the smallest pitch we can find. Don’t worry about the final color that you see in the picture because the post budget will kick all of that stuff out so they can post-render and grade. All we focus on is the skin tone. If someone is trying to shoot a car commercial, they’re trying to get the closest to final pixel for the reflections. You build a volume around the car that they’re looking at with the smallest pitch so that you will not be able to see individual pixels on an LED wall with a ceiling. Shoot that and walk away. You wouldn’t use that same configuration for the other one because the benefits wouldn’t be there.” Fundamentals should not be forgotten. Advises Spriggs, “If we focus too much on revolutionizing and democratizing or any such big-picture thoughts, we forget about what we have to do right in front of us, which is to make a damn pretty picture!”
BOTTOM: The visual effects supervisor remains the bridge in understanding and communicating the capabilities of the LED volume to the other heads of the departments. From Time Bandits. (Image courtesy of Dimension Studio, DNEG and Apple TV+)
By TREVOR HOGG
TOP: Along with being surrounded by snail memorabilia, Grace finds herself responsible for an ever-growing population of frisky guinea pigs.
OPPOSITE TOP TO BOTTOM: Pinky helps Grace to break out of her shell and experience life.
Director Adam Elliot works on the adult Grace puppet surrounded by her character designs. The most dynamic character is Pinky, who required a selection of heads.
Making the most of the global shutdown caused by the COVID-19 pandemic, Australian filmmaker Adam Elliot mapped out what would become an Oscar nominee for Best Animated Feature and an Annecy winner. Memoir of a Snail tells the tale of Grace, a hoarder of snail memorabilia who longs to be reunited with her twin brother Gilbert while experiencing the trials and tribulations of becoming an adult. Elliot made his feature film directorial debut in 2009 with Mary and Max, but interestingly, the methodology and technology between the two productions have not altered much.
“The technology has changed,” states Adam Elliot, Producer, Director, Production Designer and Writer. “Dragonframe is a wonderful tool. The animators love it because they can do all sorts of tricks, and it’s got all sorts of bells and whistles. I apply many restrictions on my animators and try to get them not to rely on the software too much, animate from intuition and celebrate happy accidents. We have LED lights now, so the stages aren’t as hot. The globes don’t burn out as quickly. Sound editing and design are far more digitized. The sound libraries are bigger. Cameras have gotten higher megapixels, so the resolution is much higher. Having said all that, they’re just tools. We still try to animate in a traditional manner. Everything in front of the camera is done traditionally. We don’t do any CGI additions. However, we certainly do cleanup digitally, like removing rigs. All our special effects, like fire, rain and water, are all handmade. The fire is yellow cellophane. We celebrate the old, but we certainly embrace the new.”
Visual effects have come a long way, allowing for more creative freedom. “In the old days, we would use fishing line to have things airborne, and now we can have a big metal rod and the visual effects artists remove that digitally,” Elliot notes. “That’s about it.
“We can say to the audience that you can hold in your hand every prop and character you have just seen. However, there was a lot of post-production to make it look that way! There were roughly 1,500 storyboard panels. Then I drew and designed all of the characters [200], most of the props [5,000 to 7,000] and sets [200]. I drew by hand because I was in lockdown during COVID-19 and had a lot of time!”
—Adam Elliot, Producer, Director, Production Designer and Writer
It’s just cleanup. There is a lot of compositing. For elements like fire, we do it on a piece of glass with the camera looking down, often with a greenscreen background, then we composite that in post. We had 600 visual effects shots in the film, and a lot of money spent on the visual effects, but it was mostly basic stuff.” One of the more complex effects was the burning church. “Those flames are recycled and layered,” Elliot explains. “We do one set of flames, then cut, paste and layer them. The claim is that we can say to the audience that you can hold in your hand every prop and character you have just seen. However, there was a lot of post-production to make it look that way!” Every shot was storyboarded. “There were roughly 1,500 storyboard panels. Then I drew and designed all of the characters [200], most of the props [5,000 to 7,000] and sets [200]. I drew by hand because I was in lockdown during COVID-19 and had a lot of time!”
Compositing was only used where necessary. “Most of the skies you see in the film were on set on giant canvases,” Elliot remarks. “There were only one or two skies or maybe more where we did greenscreen then composited in one of the canvas skies. It is a wonderful tool that now liberates us as stop-motion animators. When I left film school in 1996, I was told I was pursuing a dying art form and that stop motion would be obliterated by CGI. The complete opposite has happened. CGI and digital tools have liberated us. You have to be careful not to get carried away. There is a hybrid look that has gone a bit far, and now with 3D printers, too. Some of these stop-motion films almost look computer-animated because they’re so slick. We’re trying to celebrate the lumps and bumps, brushstrokes, and fingerprints on the clay.”
The design of the characters was based on what Elliot was able to accomplish in his one-room apartment. “Adam started making everything out of Apoxie Sculpt, which is this material that sculpts like clay, then goes rock-hard in about an hour,” explains Animation Supervisor John Lewis. “We had stylized the characters around the fact that they were solid and budgeted around it as well. Creating a character’s costume out of real fabric or silicone that can bend is time-consuming and expensive. Adam wanted us to have these rock-hard-solid puppets. They’re like statues; in some sense, that’s easy because it restricts what the puppet can do. Sometimes, when you don’t move the head, the ear clunks into the shoulder, and you can’t move it where you want it to move, or body movements are stiffer or different than how you might move it otherwise. Adam wrote ‘walking’ out of the film as a stylistic and budgetary choice. If we did have characters walking full frame where you can see their feet, we would have had to have legs that could bend and pants. Instead, we cropped it with the camera, removed the legs and put a little up-and-down rig underneath, which was a cheap microscope stand. It’s about the same size as the puppet’s legs. We put
that on the table and slide the character along; that’s how we get dynamic movement.”
Voiceover narrative figures prominently in the storytelling. “Every shot is cut into the animatic so it has the piece of narration that goes with it as well as the music ideally. Some of that [narration] is scratch and some of it is filled with the real stuff as we go,” Lewis remarks. “You will listen to the narration for every shot that is only five to 10 seconds long. You will time your animation out and express your character differently, depending on what the narrator is saying. They play into each other.” A rhythm gets established with the animation. “Each character will start to lend itself to different expressions and movements. The actor’s voice is a huge guide for that. As an animator, you find that some things are working and keep doing them. Some things don’t work, so you might cut back. Between all the animators, we talk and look at each other’s work. Slowly, a language of that character will develop. Because Grace doesn’t have confidence, she’s often got her hands tucked up to her chest, has her arms in and holds herself tightly. I talked to the animators about how much tension was in the character. Gilbert is bolder, so he’s more likely to be striking big poses with his arms out. Adam has nuanced rules about how he likes each character to look and move when it comes to the blinks and shapes of their eyelids. Certain things make characters look like Adam Elliot characters.”
One of the nine stages had an under-camera rig where a camera could swing down onto a glass tabletop. “Every time an animator had some downtime, they would go onto that stage and do a bit of effects work,” explains Production Manager and VFX Supervisor Braiden Asciak. “The fire was orange and yellow cellophane, and the smoke was cottonwood. They would do these effects,
A surreal moment is when Gilbert and Grace appear inside their mother’s womb, which serves to emphasize the strong bond that exists between the siblings.
Pinky also required a wide variety of mouths.
sometimes to the shot. They’d load the shot we had completed into Dragonframe as a reference and animate those effects elements on top of the animation they had already animated prior. It meant that quite a few of the elements we shot were specific to the shot. However, we could reuse certain elements from those other shots.” The visual effects team consisted of Asciak, Visual Effects Editor Belinda Fithie, Gemila Iezzi [Post Producer at Soundfirm] and four to five digital artists at Soundfirm. “They were simple 2D visual effects, so we didn’t need Maya or Houdini. The visual effects artists mostly used Nuke or DaVinci Resolve – Fusion [a node-based workflow built into Resolve with 2D and 3D tools].”
The church burning-down sequence was challenging but rewarding. “Once we shot all the plates we needed throughout that sequence, we put it into the timeline. Everything was worked out together,” Asciak states. “How was Gilbert going to move around that church? Then we slowly animated the effects elements. Once we finished production, John Lewis spent several weeks doing as many effects elements as possible. We built an extensive effects library of cellophane fire, and he did two big shots – one of Gilbert in the church – and did some compositing in Adobe After Effects himself. Then, John did an exterior shot of the church on fire with the huge fire bursting out of the roof. When John left, it was up to me to layer out the rest of that sequence and ensure continuity. I was doing everything in DaVinci Resolve. Some shots have 15 layers of fire, smoke and embers to try to get it right to a level that Adam was happy with, and it looked compositionally right.”
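Asciak doesn’t spell out the merge math, but self-luminous elements such as cellophane fire and embers shot against black are conventionally layered with screen or additive blends, which brighten the plate without hard clipping. Here is a minimal sketch of stacking such element passes in Python with numpy; the function names are illustrative, not the team’s actual Resolve setup.

    import numpy as np

    def screen(base, layer):
        """Screen blend: 1 - (1-a)(1-b). Brightens like 'plus' but rolls
        off instead of clipping, handy for fire and ember passes."""
        return 1.0 - (1.0 - base) * (1.0 - layer)

    def stack_elements(plate, elements):
        """Layer element passes (fire, smoke, embers...) over a background
        plate, nearest layers last. All arrays are float RGB in [0, 1]."""
        out = plate.astype(float)
        for element in elements:
            out = screen(out, element)
        return np.clip(out, 0.0, 1.0)

    plate = np.full((4, 4, 3), 0.2)             # dark background plate
    fire = np.zeros((4, 4, 3))
    fire[1:3, 1:3] = [1.0, 0.5, 0.1]            # one fire element pass
    print(stack_elements(plate, [fire])[1, 1])  # -> bright orange over the plate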
Creating a sense of peril was critical for the moment Gilbert attempted to rescue a snail in the middle of a busy road. “We had a number of cars animated on greenscreen so we could composite those in later,” Asciak reveals. “But the angle of those plates wasn’t matching the shots. Adam wanted the cars going in from the left and out from the
right to speed by. You had to match that up with the performance to avoid covering the key moments of a laugh or yell. I spent a good two-to-three weeks trying to time all those cars and each of those shots. Also, we didn’t have any sound at that point, so the sound was done to the visual effects work. I felt like we needed to amp up the intensity of that sequence and bolster it with a lot of cars. One way of doing that was [to transition between shots] using side swipes with the light poles, or there was a van that drove by. That was a good way to improve the cutting in the sequence to make it flow and feel chaotic.”
Close attention had to be paid to ensure all the rigs were painted out. “If there was an element of the shot before the rig appeared, then we wouldn’t need a clean plate,” Asciak states. “But for something like Pinky tap-dancing on a table doing the burlesque, we had to go frame-by-frame and paint out that rig from the mirror. In some cases, it’s a little sliver of a rig. When she’s in the pineapple suit and holding out the plate of pineapple chunks, there is a rig that is like a straight line behind her. If you weren’t paying attention, you probably wouldn’t have noticed, but there was a sliver of a rig, and even the visual effects artists asked, ‘Where is the rig?’”
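As the quote suggests, when earlier frames or a separately shot clean plate provide rig-free pixels, the paint-out reduces to a matted merge of the clean plate over the rig; the frame-by-frame painting is reserved for shots with no clean element. A sketch of the clean-plate case, in Python with hypothetical names rather than the production’s actual Nuke or Resolve scripts:

    import numpy as np

    def remove_rig(frame, clean_plate, rig_matte):
        """Replace rig pixels with the clean plate.
        rig_matte: float mask in [0, 1], 1.0 where the rig is visible;
        soft edges in the matte give a soft blend at the patch border."""
        m = rig_matte[..., None]                     # broadcast over RGB
        return frame * (1.0 - m) + clean_plate * m

    frame = np.random.rand(4, 4, 3)                  # shot with a rig in it
    clean = np.random.rand(4, 4, 3)                  # same framing, rig-free
    matte = np.zeros((4, 4))
    matte[2, 2] = 1.0                                # the sliver of rig
    print(remove_rig(frame, clean, matte)[2, 2] == clean[2, 2])  # patched pixel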
Grace’s bedroom had the most scenes, which were shot chronologically. “That set was there for 16 weeks, over half of the shoot period. It goes from being empty to being full of [snail memorabilia], then we return to it being empty. When Grace is sitting in her bed and says she is surrounded by her snail fortress and the camera pulls out, as soon as we finished that shot, we pulled everyone into the kitchen where we had the TV screening room. We watched that shot together for the first time, and it was magical. At that point, we knew we had something on our hands, as we could clearly see the work of the art, animation, lighting and camera departments.”
By TREVOR HOGG
OPPOSITE TOP TO BOTTOM: Having practical elements for actors to interact with is an essential part of the filmmaking process for
One of the hardest tasks for MPC was matching digital soldiers with the on-set extras in the battle sequences for Napoleon. (Image courtesy of Columbia Pictures)
Shooting real planes grounded the camerawork, which was reflected in the visual effects for Top Gun: Maverick. (Image courtesy of Paramount Pictures)
In the digital age, where photorealism is achievable virtually and is being further refined with machine learning, the visual effects industry finds itself viewed as a double-edged sword. When used properly, visual effects are interwoven into a cinematic vision that could not be achieved otherwise; when deployed badly, they become an instrument of laziness. This perception has been accentuated by the global depository of human knowledge and ignorance known as the Internet. In the middle of all this is the question of whether there is an actual trend of filmmakers favoring practical effects, or whether it is simply a marketing ploy taking advantage of public opinion.
“Like with any new technology, people went a bit overboard with CGI. CGI is powerful, but it can have some limits,” states director Denis Villeneuve, who has created everything from a talking fish in Maelström and a spider in a closet in Enemy to a sandworm ride in Dune: Part Two and a traffic-congested highway in Sicario. “It’s all about the talents of the artists you’re working with. I’m not the only one. Many filmmakers realize it’s a balance, and the more you can capture in-camera, the better. A great visual effects supervisor will tell you the same. The pendulum is going more towards the center, in the right way, between what you can capture in-camera and what you can improve with CGI. If it was up to me, there would be no behind-the-scenes. I feel like you spend years trying to create magic, specifically with CGI, which is so delicate and fragile that it can quickly look silly. So much work has been done to make it look real that I’m always sad when we show behind the curtain.”
Villeneuve is not entirely against the idea of unmasking the illusion as it helps to inform and inspire the legendary filmmakers of tomorrow. “When I was a kid, I read Cinefex and was excited to know how the movie had been made, but it was a specialized publication. It wasn’t wide-open clips that can be seen by millions.
It was something that if you were a dedicated nerd who wanted to know about it, you had to dig for the information, but now it is spread all over the place.”
“There is this need for directors or studios to diminish the visual effects departments and put forth, ‘We did it all in camera,’” notes Bryan Grill, Production VFX Supervisor for Beverly Hills Cop: Axel F, “when we all know that’s not the case. You put your best foot forward to do stunts and practical effects, but there’s always something in there that needs some clean-up or enhancement. It has been this juxtaposition. You’ve had superhero movies, which are nothing but visual effects, environments and multi-dimensions. It’s overbearing. Then you have the other side, which is traditional filmmaking.” Along with Eddie Murphy reprising his role of the quick-witted rogue detective named Axel Foley for the fourth time, an effort was made to recapture the 1980s roots of the franchise. “What always stuck with me about the original Beverly Hills Cop was the opening scene where the truck is barreling down, hitting car after car. One of my other favorite movies from that era was The Blues Brothers, with all of the police cars hitting each other and falling off the bridges. It was a carnage of special effects. That’s what they wanted to bring into this version as well, and they damaged a lot of cars! At least 20 more cars that didn’t make the edit got destroyed. The filmmakers went all out to relive and show the next generation of that type of movie.”
“There is a trend in the marketing of these films where audiences seem to crave an authentic experience, so they’re emphasizing the practical aspects even if most of what you’re watching has been digitally replaced in post,” remarks Paul Franklin, Senior VFX Supervisor at DNEG. “If you think back 30 years when Jurassic Park came out, that film was marketed on
Allowing enough time for the various crafts, including visual effects, leads to successfully encapsulating the vision of the director, which was the case with Barbie. (Image courtesy of Warner Bros.)
Even when dealing with the artificial-looking environments found in Barbie, practical sets provide a solid foundation for the seamless integration of visual effects. (Image courtesy of Warner Bros.)
the fact that they had computer-generated dinosaurs in it for the first time. Those of us in the visual effects world who are familiar with that film know that the majority of the dinosaurs that we saw on the screen were Stan Winston’s animatronics that were created practically. If that film was being released today, it would be all of this stuff about Stan Winston building these dinosaurs as animatronics.” That being said, practical effects aspirations do exist. “There are a lot of filmmakers who have seen the success of Interstellar, The Dark Knight movies and recently Oppenheimer, and the way that Christopher Nolan leans into the practical aspect of what he does. They’re going, ‘That’s an effective way to tell your story.’ A lot of filmmakers aspire to that. Whether there are so many of them who can pull it off is a different thing because it turns out to be quite difficult to do that balancing act. I got lucky and worked with Chris for 10 years, and he is a genius filmmaker. I don’t know if there is anybody else quite like him. Steven Spielberg in the days when he was making films such as Saving Private Ryan and Jurassic Park; he’s a filmmaker who knew the value of doing things practically, which is why he would always want to work with Stan Winston. You look back at E.T. the Extra-Terrestrial, and it still holds up because they used state-of-the-art animatronics and practical effects at the time.”
“For a good decade or more, there was this ability for visual effects teams to provide directors and producers with shots that were extraordinary in their ability to break some of the traditional filmmaking rules and to move away from some of the things that made cinema look the way it had for the decades before that,” notes Ryan Tudhope, Production VFX Supervisor on Top Gun: Maverick. “That created a bit of a look and, just like anything, looks come and go. One thing, if you think about it, over the course of all of filmmaking, is that every single shot of every single movie has one thing in common, which is that it was shot through a camera. Then came along the digital ability to create digital cameras and shots. That freed us up for a long time to be able to do things with those cameras that had never been done before. When I think about it in terms of what I’m trying to do, it is to recognize what that camera means to the artform and to honor that by trying to design a shot that appreciates what the camera can do and should do, how it
visualizes the world, and how the audience sees the film or shot or whatever the action might be through that limitation. When you don’t respect that, the shots can be visually stunning and impressive, but the audience immediately knows that it’s not real.”
“We’re trying to have our cake and eat it too,” believes Aaron Weintraub, VFX Supervisor at MPC. “The highest compliment we can ever be paid is if people have no idea that we did anything – that’s what we strive for. What we do is stagecraft. We’re trying to fool the audience into thinking that something is completely real and was there in front of the camera; they recorded it, and that’s what you get to see on the screen. If we have done it correctly, nobody knows.” Real-life examples are the starting point. Weintraub explains, “Everything that we do is looking at photographs and film footage and trying to replicate how the light and surfaces react. We’re nothing without reference of the real world, if the real world and photorealism is our goal.” Technology is the means to the end. “Every iteration and every step that we take with the technology is something new that the audience may not ever have seen before. Reacting to the newness is part of saying that the technology is driving it, but it’s a story that we’re trying to tell that we couldn’t do in the past.” Reality can provide a sense of spontaneity to animation. Weintraub notes, “There are mistakes and happy accidents that can happen when you shoot real stuff that you might not get otherwise. When we did Guillermo del Toro’s Pinocchio, which was stop-motion animation, one of the guiding principles of the animation was to try to anticipate all of those weird little accidents that would happen if you were shooting this live-action and put those into the animation. In advance of shooting the stop-motion, the animators would shoot these little videos of their clips to see what would happen.”
“It’s a stylistic thing. When we’re talking about the marriage of practical, what’s shot on set and where we come into play, either augmenting or completely replacing it in some cases, there is always this desire to maintain this visual characteristic that is inherent in practical shooting,” observes Robin Hackl, Co-Founder and Visual Effects Supervisor at Image Engine. “It’s being art-directed and driven by the DP, lighting director and the director himself, and they have hands on the physicality of that
Actual cars were flipped and digitally augmented for Beverly Hills Cop: Axel F. (Image courtesy of Netflix)
Joseph Gordon-Levitt and Eddie Murphy prepare for a scene that takes place inside of a helicopter cockpit for Beverly Hills Cop: Axel F. (Image courtesy of Netflix)
being on set and getting that look. It’s always that apprehension almost of committing fully to CG and leaving it to the hands of the visual effects vendors and artists, even things that they are implementing in practical set photography, like virtual sets that give a higher degree of reality to the lighting of the characters. From the feedback that I’ve gotten from the people on set, the actors in particular react well to virtual production in the sense that they have something tangible to react to and see, to be part of that little world, which is sometimes hard for them to wrap their heads around. What is that emotion tied around that environment they’re in? It heightens that. Of course, it’s not all done that way. The Mandalorian is all over the place, from on-set photography to giant bluescreens and virtual production. The Mandalorian is a good example of aesthetic. The creators wanted to retain as much as possible the flavor and vibe of the original Star Wars films. There were a lot of optical effects done back in the early days and stop-motion that was live in-camera; for the most part, they tried to shoot everything in-camera the best they could and then augment it. It’s a real harkening back to that era.”
“It’s more about having a good dialogue with the people you are working with and making sure that they understand how to get the best out of the tools they’re using,” states Glen Pratt, Production VFX Supervisor on Barbie. “If I’m blunt, it’s often because bad choices are made. If you allow all the crafts that are involved in filmmaking the time that they say [they need], you get a good result, whether it be building a set, creating pyrotechnic explosions, then equally whatever aspects of visual effects you’re adding into that.”
Open communication is important. “Greta Gerwig hadn’t done visual effects before, so I sat down early with her and talked through various sets of tools that we have at our disposal. I could
tell she was overwhelmed by some of those things. But it’s honing it down so that it’s just a step in the process of how we will eventually get to the end result. That comes with experience. The more that you work with bringing visual effects into it, the more well-versed the director becomes with the language. A lot of the time, they don’t want to know that level of detail. They only want to know that you have their back and you can do this for them.”
Barbie Land was an artificial environment, but the same photorealistic principles applied. “We captured everything so we could recreate whether it be the actual stages themselves or miniature models, and often we embellished further what was there to ground it, make it feel like it belongs in that world and had a cohesive, consistent aesthetic running through it.”
“Personally, when the conditions and the type of effect to be achieved allow me to use the practical, I jump at it,” remarks Mathieu Dupuis, VFX Supervisor at Rodeo FX. “We’re fortunate to have a studio at our disposal here at Rodeo FX, and I can’t imagine executing some of the large-scale effects on our recent projects without the support of practical effects. I’m not just talking about blood splattering on a greenscreen or crowd duplication, which, by the way, is always highly effective. Being able to rebuild scaled-down set pieces [painted green] to capture the precise interaction of, say, glass breaking on a table or to recreate an organic dream effect by filming floating debris in macro within an aquarium allows us to achieve quick, cost-effective results that are both efficient and innovative. There’s also the advantage of avoiding endless discussions with clients by capturing how a flag moves in the wind or how a plate shatters. There’s no need to imagine or convince anyone how these elements would behave because we’ve captured them in real life. There’s nothing more authentic than reality, right!?”
“It’s a stylistic thing. When we’re talking about the marriage of practical, what’s shot on set and where we come into play, either augmenting or completely replacing it in some cases, there is always this desire to maintain this visual characteristic that is inherent in practical shooting.”
—Robin Hackl, Co-Founder and Visual Effects Supervisor, Image Engine
By TREVOR HOGG
Even though the term ‘retro-future’ is nothing new, Swedish artist and musician Simon Stålenhag has been able to create a unique vision where discarded technology is scattered across vast landscapes situated in an alternative universe. His illustrations and accompanying narratives have captured the attention of filmmakers such as Nathaniel Halpern with Tales from the Loop for Prime Video and siblings Anthony and Joe Russo with The Electric State for Netflix, which they spent seven years developing. The latter revolves around the aftermath of a robot uprising where an orphaned teenager goes on a cross-country journey to find her lost brother. The human cast consists of Millie Bobby Brown, Chris Pratt, Stanley Tucci, Giancarlo Esposito, Ke Huy Quan and Jason Alexander, while Woody Harrelson, Anthony Mackie, Brian Cox, Alan Tudyk, Hank Azaria and Colman Domingo voice the many mechanical co-stars.
OPPOSITE TOP TO BOTTOM: A complex sequence to execute was 20-foot Herman carrying a Volkswagen campervan containing Michelle.
The drones went through a major design change where the heads resembled neurocasters and had a screen that projected the face of the pilot.
Inspiring the imagery was Swedish artist-musician Simon Stålenhag, who has developed a retro-tech and alternative-world visual aesthetic.
“Simon Stålenhag’s original artwork electrified us,” producer/ director Anthony Russo remarks. “It’s this strange feeling of familiarity in what he’s drawn and also strangeness. It’s a historical period that you can recognize whether or not you lived through it, but it’s not exactly what that period was.” There are elements from the 1990s. “The story is a parable,” producer/director Joe Russo notes. “It’s less about nostalgia than it is about the idea that technology could have developed faster and maybe deviated humanity from its main path. That was the fun part, thinking through those little elements that allowed us to create a new interpretation of the 1990s.” The story taps into the present-day fears of AI usurping its human creators. “Part of what we want to do is explore the fact that you can find humanity in technology and inhumanity in humans,” states Anthony Russo. “We have both of those experiences in our lives and world. Joe and I are technologists. We use technology throughout our lives to tell stories, but at the same time, we all know that technology is powerful and can cause problems, whether the nuclear bomb or social media. It’s us recognizing the complex relationship that we all have with technology as human beings.”
Determining how the robots would be created and executed was a major topic of discussion as they are 80% of the cast. “This is true for all of our projects when you’re dealing with a fantasy world that needs to be created from whole cloth that doesn’t exist,” states Anthony Russo. “The methodology used to create that is always a question. It is driven by how Joe and I see the movie. What are we trying to achieve? What do we want to do with the scenes? How do we want to stage things? How do we want the actors to interact with other characters in the movie who may not be played on the screen by physical actors? These all become questions in terms of what is the right methodology to use to create the film. We were involved with our Visual Effects Supervisor, Matthew Butler, to determine the proper methodologies. Because there are so many characters in it that don’t exist in reality, we had to rely upon visual effects to create a huge portion of the film.”
Dennis Gassner and Richard Johnson, who consulted robotics companies, shared production designer duties. “I was in charge of making a real walking and moving Cosmo,” states Production Designer Richard Johnson. “I had to go to every robotics company in the world that would listen to me. The immediate refrain was: ‘The head is a deal-breaker. It throws him immediately out of balance.’ If you look at all of the real robots that are popping up on the Internet today, they all have tiny heads. The other limiting factor was height. They were all in the zone of 5’ 7” or less. I now know more about real robots than I ever expected to know in my entire life!” The robots had to be distinct in their own right. “We felt the robots with more screen time needed to be more iconic-looking, so we looked at iconic products or services or things from the last two, three or four decades. Mr. Peanut is a very well-known brand name. We thought, ‘He could be a robot. Baseball player. Very iconic. Be a robot.’”
Approximately 2,000 visual effects shots are in the final film, with Digital Domain and ILM being the main vendors, followed by Storm Studios, One of Us, Lola VFX and an in-house team.
“We’re not in a world of magic,” observes Visual Effects Supervisor Matthew Butler. “The idea is that these robots were often designed to make us feel comfortable about them serving us. I fought tooth and nail to put in little piston rod push-pulls and things that could justify that Cosmo could actually move. If we designed a particular ball joint or cylindrical actuator or pitch actuator, we made sure that the motion of these robots was restricted to what that could do.” Artistic license was taken with the original design by Simon Stålenhag. “I wanted Cosmo’s eyes to have some emotion. Rather than be just a painted pupil as in the book, we made a smoked-glass lens for the pupil so you can see behind it that there is a camera. Artistically, we let those eyes have a gratuitous green light to them. Now, you have a twinkle of an eye and can get emotion into that eye. That was another tricky thing. It was about getting enough emotion into them without breaking the silhouette of the design of the robots that we needed to adhere to – that was hard,” Butler says. Keats’s (Chris Pratt) robot sidekick, Herman, comes in different sizes. “It was always the Russian doll thing where the one size smaller fits into the one size bigger,” Butler remarks. “We did honor the style and personality but not at the expense of physics. Most of the movie is four-foot Herman with Anthony Mackie’s [vocal] and Martin Klebba’s [mocap] performances. It’s also coming out of the script. He’s this sarcastic character. I love his personality, and it came through extremely well. Herman borrowed the power extension cable for his devices and forgot to return it. Meanwhile, all of Keats’ food in the fridge has gone bad. Herman has messed up, and he’s like a guilty teenager shuffling around on the couch, deliberately avoiding eye contact with Keats because he’s this busted little kid. That character is amazing, and it carries through the movie well. Chris Pratt was perfect for this, and it works so well for the two of them. It’s most people’s favorite relationship in the movie.”
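As a toy illustration of the principle Butler describes – restricting animation to what a mechanism could physically do – consider this minimal Python sketch, which clamps a joint’s rotation curve to invented actuator limits. The numbers and function are hypothetical, not the production’s actual tools.

# Toy sketch: clamp a per-frame joint rotation curve to hypothetical
# actuator limits so the animation never exceeds what the mechanism
# could physically do. All limits and values are invented.

ACTUATOR = {"min_deg": -45.0, "max_deg": 90.0, "max_deg_per_frame": 6.0}

def constrain(curve, limits):
    """Return the curve clamped to the actuator's range and speed."""
    clamp = lambda v: min(max(v, limits["min_deg"]), limits["max_deg"])
    out = [clamp(curve[0])]
    for target in curve[1:]:
        # limit how far the joint may travel in a single frame
        step = max(-limits["max_deg_per_frame"],
                   min(limits["max_deg_per_frame"], target - out[-1]))
        out.append(clamp(out[-1] + step))
    return out

print(constrain([0.0, 30.0, 120.0, 10.0], ACTUATOR))  # [0.0, 6.0, 12.0, 10.0]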
Following the example of animated features, Lead Storyboard Artist Darrin Denlinger storyboarded and assembled the entire film into an animatic. “I had a version of the movie before we started shooting, or close to when we started shooting, that was a layout,” states Jeffrey Ford, Executive Producer and Editor. “It had all the storyboards cut together in sequence with sound, music and subtitles instead of dialogue. I used that as a layout to guide
us throughout production so we knew roughly how scenes would play out. Of course, things change on the day; actors re-block the scenes.” The amount of footage captured surpassed Avengers: Infinity War and Avengers: Endgame. Ford explains, “This is a film that had to be made multiple times because when you deal with animated characters carrying this much weight dramatically, those performances are created in passes. You may shoot a proxy pass where Millie interacts with Cosmo, and it’s a motion capture actor. Then you may shoot a pass where she is interacting with nothing. We might go back and shoot that same performance on the mocap stage multiple times. We may end up with various iterations of those visual effects as they come in over the months. An enormous number of iterations go on, and when you do that, you generate enormous amounts of footage.”
Atlanta doubled as the American Southwest. “Tumbleweeds and sagebrush, the basic things a person sees in the Southwest, do not exist in Atlanta or anywhere near there,” Johnson notes. “We had to fill several trucks with all that stuff and bring it into Atlanta.”
Parts of Atlanta have not changed for years. “Through the camera’s eye, if you wanted to say that it was the 1970s or 1980s, it was easy because nobody had built modern or contemporary homes in those neighborhoods for whatever reason. The same thing happened when selecting the locations for the battles in the city. If you put in the right period of car, voilà! You’re in that era.” A question that had to be answered was where exiled robots responsible for the uprising would live. “The X was in the script and is a large area in the Southwest. Where would these guys go? A country club? A department store? Football stadium? We landed on a shopping mall. It dawned on me one day that the only reason they would go to a place like this is to recharge themselves or maybe for repairs. That’s why in the shopping mall, you see a lot of wires, batteries and charging stations.”
Along with the final battle, which is almost entirely synthetic apart from Keats’s interaction with the grass, a virtual meeting occurs between antagonists Ethan Skate (Stanley Tucci) and Colonel Bradbury (Giancarlo Esposito). “It’s where Ethan pitches the Colonel to go into the X to get Cosmo back,” Butler recalls. “Bradbury puts on the neurocaster and teleports into this virtual
BOTTOM TWO: It was imperative for believability that Herman’s movements be grounded in real physics.
environment that takes place in this beautiful mountain lake scene. Skate is standing in the middle of the lake while Bradbury is on terra firma and gently steps out to have a conversation with him. Production didn’t go to Norway. It’s just a couple of guys and girls from Storm Studios [located in Oslo] with some backpacks hiking out into some beautiful locations in the summer for reference. We had a shallow pool with plexiglass an inch under the surface so we could have Stanley and Giancarlo stand in the water. The only thing that we kept was the little bit of interaction of water locally right at their feet while the rest was digital water. The ripples needed to come out and propagate out into the lake as it would for real. It’s absolutely stunning work.”
The hardest shot to execute was when Cosmo, Herman, Keats and Michelle (Millie Bobby Brown) have been captured and are escorted into the mall. “They walk into this forecourt and finally see that the whole mall is filled with robots,” Ford recalls. “That shot was incredibly hard and took us months to do. The only things in that shot are Chris Pratt, Millie Bobby Brown and an empty mall. I drew maps of each frame, and we did a whole progression. We talked about where the different robots were, what they were doing, what their day was like, where they were going next and why they were moving in a certain way. We wanted it to feel like a real city. If you were to block out extras, those people would all come up with all of their mini-stories, and they would work it out. But we didn’t have that. We had to figure it all out as animators. It was fun but brutal. Digital Domain did an incredible job on it. I hope people will stop and rewind the shot because it’s beautifully detailed and feels completely real.”
WINNER OF THE 2024 FOLIO: OZZIE AWARD
Best Cover Design VFXV Winter 2024 Issue (Association/Nonprofit/Professional/Membership)
HONORABLE MENTIONS FOR THE 2024 FOLIO: EDDIE & OZZIE AWARD
Best Cover Design VFXV Fall 2023 Issue and Best Full Issue for Fall 2023 Issue
The Folio: Awards are one of the most prestigious national awards programs in the publishing industry. Congratulations to the VFXV creative, editorial and publishing team!
Thank you, Folio: judges, for making VFXV a multiple Folio: Award winner.
By CHRIS McGOWAN
“As far as VFX education, we are constantly seeing new pieces of software and technology being implemented into the pipeline. That is something we are always grappling with when it comes to learning,” says Professor Flip Phillips of The School of Film and Animation at Rochester Institute of Technology (RIT). Each VFX and animation school explores the implementation of new tech, such as virtual production and AI, differently.
“The film and media industry is experiencing a significant shift toward LED stages and virtual production technologies, fundamentally changing how stories are told and content is created,” comments Patrick Alexander, Department Head at Ringling College Film Program. “Real-time visualization capabilities, combined with the seamless integration of physical and digital elements, provide creators with unprecedented creative control and production efficiency. At Ringling College, we’ve recognized this industry transformation by installing LED walls this semester and are actively developing a series of virtual production courses. From our campus in Sarasota, Florida, we can now create entire worlds and environments once thought impossible to achieve, enabling students to realize their creative visions in real-time. This curriculum development paired with our new technology will prepare students for the rapidly evolving production landscape, where virtual production technologies enhance creative possibilities while maintaining the fundamental principles of cinematic craft.”
“We have a soundstage in the MAGIC Center that houses a 32’ x 16’ LED wall, and the center provides technical support for motion capture, camera tracking, virtual art department and real-time in-camera visual effects. Having the opportunity to see and work with a virtual production stage is a great asset for our graduates.”
—Professor Flip Phillips, The School of Film and Animation, Rochester Institute of Technology
RIT has worked “to ensure our students have experience working in virtual production before they leave our campus,” Phillips states. “We have a soundstage in the MAGIC Center that houses a 32’ x 16’ LED wall, and the center provides technical support for motion capture, camera tracking, virtual art department and real-time in-camera visual effects. Having the opportunity to see and work with a virtual production stage is a great asset for our graduates.”
To meet the growing emphasis on LED stages and virtual production, Vancouver Film School is “launching a VP course alongside Pixomondo in 2025 to meet the industry needs in this area,” says Colin Giles, Head of The School for Animation & VFX. The 12-week certificate program will teach the fundamentals of content creation for virtual production in VFX.
NYU’s Tisch School of the Arts has opened the Martin Scorsese Virtual Production Center, made possible by a major donation from the Hobson/Lucas Family Foundation of Mellody Hobson, Co-CEO of Ariel Investments, and filmmaker George Lucas. The facility features two 3,500-square-foot double-height, column-free virtual production stages, with motion capture and VP technology outfitted by Vicon and Lux Machina, and two 1,800-square-foot television studios and state-of-the-art broadcast and control rooms.
Savannah College of Art and Design has two full-size LED volume stages where students can get their hands on Mandalorian-style production techniques, as well as classes in virtual production, photogrammetry and real-time lighting, according to Gray Marshall, Chair of Visual Effects at SCAD. “We have even launched a multidisciplinary minor in virtual production, bringing visual effects, production design and film students together to create complete and inventive ‘in-camera VFX’ projects.”
The amount of class time devoted to AI is also rapidly growing. “Educational institutions have a unique opportunity to learn from
industry standards and histories while pushing the boundaries through emerging technologies,” notes Jimmy Calhoun, Chair of BFA 3D Animation and Visual Effects at The School of Visual Arts in New York City. “Our students understand this responsibility. They’re not only exploring the potential of AI but also reflecting on its impact on their rights as artists, the future of their mentors’ jobs and the environment.”
SCAD’s Marshall comments, “AI is the most exciting trend in VFX since the advent of the computer itself. AI has already found itself ingrained in many aspects of our day-to-day tools and will continue to do so. It also creates new opportunities to rapidly try ideas, modify them and get stimulated in new directions, but it is still all under your control. Yes, there are some challenges to be faced, both regarding IP and resource utilization, but those can be worked out. I am not one of those who feels we’ll lose jobs.”
Marshall continues, “I watched as computers displaced some older-style workers, only for a whole new style of artist to emerge in greater numbers, driving greater demands for their services. Computers have always been good at replacing repetitive jobs, and I don’t think losing that class of jobs will be a loss. Since the basic premise of AI-generated images is to aggregate toward the middle ground, if you’re concerned it will take your ‘creative job,’ I wouldn’t be. If you are truly creative, then AI isn’t an exit ramp, it’s a launch ramp.”
“Currently, I think there are two groups of educators that I am seeing when it comes to AI,” says RIT’s Phillips. “There is one group that is hesitant to adopt AI into their curriculum due to the lack of knowledge of how it could benefit them and their students, and another group that sees the benefits of using AI to make the VFX pipeline more efficient. I am part of the latter group. I have seen many use cases for AI to allow me and my students to deal with problems that are tedious or inefficient. There will be many
more beneficial situations for AI in the VFX field, but we still have to be mindful of the ethical issues that arise.”
“We embrace emerging technologies like AI as valuable tools to be utilized when appropriate,” notes Christian Huthmacher, Professor, The Department of Motion Design, specializing in VFX at Ringling College of Art and Design. “In our Visual Effects courses, students are introduced to cutting-edge tools such as The Foundry’s CopyCat and Topaz AI. However, our approach goes beyond merely teaching technical proficiency. We engage students in critical discussions about the ethical considerations, potential biases and security implications surrounding AI usage. By addressing these complex topics, we ensure our students are uniquely equipped to navigate the evolving landscape of AI in the industry.”
“We are embracing [AI] when it makes sense,” says Ria Ambrose Benard, Director/Senior Educational Administrator at The Rookies Academy (formerly Lost Boys Studios). “Tools are being created to help artists with the mundane portion of the job and offer more time for the more difficult and rewarding shots. Much like spell check, it is a tool people use regularly, and sometimes it is great, but sometimes it is not. AI is a tool that can be utilized correctly. It’s not always the best solution, and often an artist will get better results faster, but it is a tool that can be used in some circumstances to make things easier for the artists.”
“One goal that I always strive for in my classrooms is allowing students to problem-solve using any and all tools available,” comments RIT’s Phillips. “I believe this will allow for the industry to continue to evolve and become a place where creativity, design and innovation will help tell the stories in a more unique and beautiful way. Our field is constantly evolving, and that is what makes it exciting. The unknown can be a scary place for some, but I see it as an opportunity to make great strides in VFX.”
By NAOMI GOLDMAN
Takashi Yamazaki is a renowned talent in Japanese cinema who accomplished a significant feat when he became only the second director to win an Academy Award for Best Visual Effects for Godzilla Minus One – and in the process reinvigorated a legendary kaiju franchise. As a filmmaker and visual effects supervisor, he is regarded as one of Japan’s leading film directors. Yamazaki is set to make his Hollywood debut with Grandgear for Bad Robot and Sony Pictures, and recently announced that he is working on the screenplay and storyboards for the much-anticipated next Godzilla movie.
For his consummate artistry, expansive storytelling and profound ability to use visual effects to bring his unique visions to life, the Society honored Takashi Yamazaki with the VES Visionary Award at the 23rd Annual VES Awards. “Takashi has been at the forefront in using visual effects to tell remarkable stories that transfix audiences and create unforgettable cinematic experiences,” said VES Chair Kim Davidson. “As a creative force who has made an indelible mark in the world of filmed entertainment, we are honored to award him with the prestigious VES Visionary Award.”
Michael Dougherty, director of Godzilla: King of the Monsters, gave this tribute in presenting the award to his colleague in the kaiju genre: “Takashi is a filmmaker whose work is both humbling and inspiring. He was so moved by Star Wars and Close Encounters that he started his career building miniatures and working tirelessly as a visual effects supervisor before directing his own films. Takashi’s work pushes the boundaries of visual effects, blending technical innovation and compelling storytelling to create immersive and iconic films. Godzilla Minus One resurrected the king of the monsters. Yes, Godzilla has a soul, and Takashi captured it in such a way that moved the world, earning Japan its first Academy Award for Visual Effects.”
In accepting his award, Yamazaki remarked, “I’m truly surprised and overjoyed to receive such a wonderful award. I started this career with a dream of working with people around the world like a pioneer in visual effects; however, in Japan, there were hardly any opportunities for this type of work. I kept telling myself that as long as I was born in Japan, bringing spaceships, robots and kaiju to the screen was already a dream come true. But then Godzilla brought me to this incredible place. Thank you, Godzilla! And I believe many of you can relate when I say – I want to praise my young self for choosing to pursue this career. Thank you for this great honor.”
By NAOMI GOLDMAN
Dr. Jacquelyn Ford Morie is a luminary in the realm of virtual reality and a defining voice of immersive technology. Standing at the intersection of art, science and technology, she has been shaping the future of virtual experiences worldwide since the 1980s, and remains dedicated to mentoring the next generation of VR innovators while forging ahead with new paradigms in immersive media.
For her immense contributions to the computer animation and visual effects industries, by way of artistry, invention and groundbreaking work, the Society honored Dr. Morie with the Georges Méliès Award at the 23rd Annual VES Awards. “Dr. Jacki Morie redefined the possibilities of immersive media, infusing them with profound emotional and educational resonance,” said VES Chair Kim Davidson. “Her visionary leadership has shaped pivotal projects across health, space exploration, art and education, underscoring her as a transformative innovator and thought leader.”
Chief Research Officer of Eyeline Studios Paul Debevec, VES gave this heartfelt introduction to his longtime friend: “I had the privilege of working with Jacki for over a decade at USC’s Institute for Creative Technologies. We were two of the first research leaders hired in 2000, and I was lucky to have a colleague who helped set the tone that this institute would not just be about developing technology, but also about exploring its creative possibilities. Jacki has developed computer animation training programs that have educated a generation of digital artists. But perhaps her most impactful contributions have been pioneering work in VR experiences, creating new forms of multisensory feedback, like helping NASA astronauts combat isolation on long-duration space missions by keeping them emotionally connected to Earth. It is a theme of Jacki’s work to develop technology and experiences to make life better for others.”
In accepting her award, Morie remarked, “To say I am honored to receive this award is hardly representative of what I am feeling. It is not everyone who gets to be at the birth of a new medium, and yet here tonight we celebrate both film and computer graphics and effects. I have spent my entire career trying to make this new medium of VR more meaningful, more emotional, more purposeful. I’d like to think that my work also honors those who are trying to follow the desire lines of where we want to take this medium. I am glad to have started this journey and look forward to many more artists taking it up and making it truly something spectacular! My sincere thanks to the VES for recognizing me and the birth of this new art form.”
Dr. Morie with the Georges Méliès Award.
Dr. Morie and Rob Smith.
On February 11, the VES hosted the 23rd Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.
Captions list all members of each Award-winning team even if some members of the team were not present or out of frame. For more Show photos and a complete list of nominees and winners of the 23rd Annual VES Awards, visit vesglobal.org.
All photos by Moloshok Photography.
2. Nancy Ward, Executive Director of the Visual Effects Society, took the stage with a heartfelt greeting, celebrating an incredible year for the VES and the groundbreaking achievements of the visual effects community.
3. Kicking Off the Celebration. Kim Davidson, founder of SideFX and Chair of the Visual Effects Society, took the stage to present the first round of awards, setting the tone for an unforgettable night of VFX excellence.
4. The Sklar Brothers (Randy and Jason) brought their signature wit and energy to their debut as VES Awards hosts.
Industry guests gathered at The Beverly Hilton hotel to celebrate VFX talent in 25 awards categories and special honorees. Kingdom of the Planet of the Apes received the top photoreal feature award. The Wild Robot was named top animated film, winning four awards. Shōgun; Anjin was named best photoreal episode, winning three awards.
Comedy duo The Sklar Brothers made their debut as VES Awards hosts. Acclaimed actor Keanu Reeves presented Golden Globe-winning actor-producer Hiroyuki Sanada with the VES Award for Creative Excellence. Chief Research Officer of Eyeline Studios Paul Debevec, VES, presented Virtual Reality/Immersive Technology Pioneer Dr. Jacquelyn Ford Morie with the Georges Méliès Award. Writer-director Michael Dougherty presented Academy Award-winning filmmaker and VFX Supervisor Takashi Yamazaki with the Visionary Award. Award presenters also included Kelvin Harrison, Jr., Krys Marshall, Mary Mouser, Russell Hornsby, Tanner Buchanan, Eric Winter and Tia Carrere, and Autodesk’s Senior Director of Business Strategy Rachael Appleton presented the VES-Autodesk Student Award.
“As we celebrate the 23rd Annual VES Awards, we’re honored to shine a light on outstanding visual effects artistry and innovation,” said VES Chair Kim Davidson. “The honorees and their work represent best-in-class visual effects – work that engages audiences and enhances the art of storytelling. The VES Awards is the only venue that showcases and honors these outstanding global artists across a wide range of disciplines, and we are extremely proud of all our winners and nominees.”
5. Kingdom of the Planet of the Apes won the Award for Outstanding Visual Effects in a Photoreal Feature, led by the team of Erik Winquist, Julia Neighly, Paul Story, Danielle Immerman and Rodney Burke.
6. The Award for Outstanding Visual Effects in an Animated Feature went to The Wild Robot and the team of Chris Sanders, Jeff Hermann, Jeff Budsberg and Jakob Hjort Jensen.
7. The Award for Outstanding Visual Effects in a Photoreal Episode went to Shōgun; Anjin and the team of Michael Cliett, Melody Mead, Philip Engström, Ed Bruce and Cameron Waldbauer.
8. Cobra Kai stars Tanner Buchanan and Mary Mouser brought energy and charm to the stage, acting as an engaging duo of presenters.
9. The Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to The Penguin; Bliss and the team of Johnny Han, Michelle Rose, Goran Pavles, Ed Bruce and Devin Maggio.
10. The Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Civil War and the team of David Simpson, Michelle Rose, Freddy Salazar, Chris Zeh and J.D. Schwalm.
11. Lisa Cooke, VES, former VES Chair and the first woman ever elected to the role, took the stage to present multiple VES Awards.
12. Actress and singer Tia Carrere took the stage to present several awards, bringing charm, energy and star power to the celebration.
13. The Award for Outstanding Visual Effects in a Commercial went to Coca-Cola; The Heroes and the team of Greg McKneally, Antonia Vlasto, Ryan Knowles and Fabrice Fiteni.
14. The Award for Outstanding Visual Effects in a Special Venue Project went to D23; RealTime Rocket and the team of Evan Goldberg, Alyssa Finley, Jason Breneman and Alice Taylor.
15. The Award for Outstanding Character in a Photoreal Feature went to Better Man; Robbie Williams and the team of Milton Ramirez, Andrea Merlo, Seoungseok Charlie Kim and Eteuati Tema.
16. The Award for Outstanding Character in an Animated Feature went to The Wild Robot; Roz and the team of Fabio Lignini, Yukinori Inagaki, Owen Demers and Hyun Huh.
17. The Award for Outstanding Character in an Episode, Commercial, Game Cinematic or Real-Time Project went to Ronja the Robber’s Daughter; Vildvittran the Queen Harpy and the team of Nicklas Andersson, David Allan, Gustav Åhren and Niklas Wallén.
18. Liv Hewson, star of Yellowjackets, was a captivating presenter in celebrating VFX, as she bestowed multiple awards.
19. The Award for Outstanding Environment in a Photoreal Feature was won by Dune: Part Two; The Arrakeen Basin and the team of Daniel Rhein, Daniel Anton Fernandez, Marc James Austin and Christopher Anciaume.
20. The Award for Outstanding Environment in an Animated Feature was presented to The Wild Robot; The Forest and the team of John Wake, He Jung Park, Woojin Choi and Shane Glading.
21. The Award for Outstanding Environment in an Episode, Commercial, Game Cinematic or Real-Time Project went to Shōgun; Osaka and the team of Manuel Martinez, Phil Hannigan, Keith Malone and Francesco Corvino.
22. The Award for Outstanding CG Cinematography went to Dune: Part Two; Arrakis and the team of Greig Fraser, Xin Steve Guo, Sandra Murta and Ben Wiggs.
23. The Award for Outstanding Visual Effects in a Real-Time Project went to Star Wars Outlaws and the team of Stephen Hawes, Lionel Le Dain, Benedikt Podlesnigg and Andi-Bogdan Draghici. Presenter Kelvin Harrison, Jr. accepted the Award.
24. The Award for Outstanding Model in a Photoreal or Animated Project went to Alien: Romulus; Renaissance Space Station and the team of Waldemar Bartkowiak, Trevor Wide, Matt Middleton and Ben Shearman.
25. The Award for Outstanding Effects Simulations in a Photoreal Feature went to Dune: Part Two; Atomic Explosions and Wormriding and the team of Nicholas Papworth, Sandy la Tourelle, Lisa Nolan and Christopher Phillips.
26. The Award for Outstanding Effects Simulations in an Animated Feature went to The Wild Robot and the team of Derek Cheung, Michael Losure, David Chow and Nyoung Kim.
27. Krys Marshall, known for her role in For All Mankind, lit up the stage with her charisma and grace as she presented multiple awards.
28. The Award for Outstanding Effects Simulations in an Episode, Commercial, Game Cinematic or Real-Time Project was won by Shōgun; Broken to the Fist; Landslide and the team of Dominic Tiedeken, Heinrich Löwe, Charles Guerton and Timmy Lundin.
29. The Award for Outstanding Compositing & Lighting in a Feature went to Dune: Part Two; Wormriding, Geidi Prime and the Final Battle and the team of Christopher Rickard, Francesco Dell’Anna, Paul Chapman and Ryan Wing.
30. The Award for Outstanding Compositing & Lighting in an Episode was presented to The Penguin; After Hour and the team of Jonas Stuckenbrock, Karen Cheng, Eugene Bondar and Miky Girón.
31. Actor Russell Hornsby, featured in The Woman in the Yard, showcased his charisma and love for VFX as a standout presenter.
32. The Award for Outstanding Compositing & Lighting in a Commercial went to Coca-Cola; The Heroes and the team of Ryan Knowles, Alex Gabucci, Jack Powell and Dan Yargici.
33. The Award for Outstanding Special (Practical) Effects in a Photoreal Project went to The Penguin; Safe Guns and the team of Devin Maggio, Johnny Han, Cory Candrilli and Alexandre Prod’homme.
34. The Emerging Technology Award went to Here; Neural Performance Toolset and the team of Jo Plaete, Oriel Frigo, Tomas Koutsky and Matteo Olivieri-Dancey.
35. Championing the Next Generation. Rachael Appleton, Autodesk’s Senior Director of Business Strategy, took the stage to present the prestigious VES-Autodesk Student Award, honoring emerging talent in the world of VFX.
36. The Award for Outstanding Visual Effects in a Student Project went to Pittura (entry from ARTFX, The Schools of Digital Arts) and the team of Adam Lauriol, Titouan Lassère, Rémi Vivenza and Helloïs Marre.
37. Keanu Reeves honors his friend, award-winning actor-producer Hiroyuki Sanada, and presents him with the distinguished VES Award for Creative Excellence, a tribute to his extraordinary contributions to film, VFX and storytelling.
38. Golden Globe-winning actor-producer Hiroyuki Sanada, one of Japan’s most celebrated actors, proudly holds his VES Award for Creative Excellence. Known for his unforgettable portrayal of Lord Toranaga in the acclaimed series Shōgun, Sanada continues to leave an indelible mark on cinema and television.
39. Paul Debevec, VES, Chief Research Officer at Eyeline Studios, presents Virtual Reality and Immersive Technology pioneer Dr. Jacquelyn Ford Morie with the VES Georges Méliès Award, recognizing her trailblazing contributions to immersive storytelling.
40. Virtual Reality/Immersive Technology Pioneer Dr. Jacquelyn Ford Morie was honored with the VES’s prestigious Georges Méliès Award.
41. Writer-director Michael Dougherty presented Academy Award-winning filmmaker and VFX Supervisor Takashi Yamazaki with the VES Visionary Award.
42. Takashi Yamazaki, recipient of the Visionary Award, is one of Japan’s leading filmmakers and VFX supervisors, earning the 2024 Academy Award for Best Visual Effects for his groundbreaking work on Godzilla Minus One.
43. Hosts The Sklar Brothers share a laugh backstage with award presenter Tia Carrere.
44. The celebrated group of Keanu Reeves, VES Executive Director Nancy Ward, Takashi Yamazaki, Hiroyuki Sanada and VES Chair Kim Davidson share a star-studded backstage moment, radiating talent and camaraderie.
45. The dedicated volunteers of the VES Awards Committee – flanking Den Serras (Chair), Lopsie Schwartz (Co-Chair) and Scott Kilburn (Co-Chair) – turned their hard work into a spectacular evening celebrating the VES and the best in VFX.
Outstanding Visual Effects in a Photoreal Episode went to Shōgun; Anjin, which won three VES Awards including Outstanding Effects Simulations in an Episode, Commercial, Game Cinematic or Real-Time Project (Broken to the Fist; Landslide) and Outstanding Created Environment in an Episode, Commercial or Real-Time Project (Osaka). (Photos courtesy of FX Network)
By TREVOR HOGG
The visual effects pipeline aims to allow artists to focus on creative challenges rather than technical ones. However, technicians must continually update the technology to ensure the best tools are used, and that implementation doesn’t cause costly production delays. The pipeline department comprises developers, engineers, leads, architects, technical directors and supervisors from various career backgrounds. The major goal is to translate artistic needs into code.
“We work in the shadows, but our teams are great developers passionate about visual effects and can do a ton of magic,” states Anne-Gaëlle Jolly, Head of Technical Directors at Rodeo FX. “You have to juggle 20 things at once but always give the feeling that everything is under control. A key part is to stay calm and don’t hesitate to take a step back to make the right decision when I’m trying to ‘put out fires.’” The pressure can be intense. “We had to roll out an OS change and Python version change across the farm and the floor at the same time while having a full slate of running shows and making sure that this would not impact the artists,” remarks Julien Dubuisson, Head of Pipeline at Rodeo FX. “We work with every department from production to the artists on the floor. Some are more technical than others. We will get detailed specs on all aspects of a tool or system or a rough outline of the expected outcome. They could work without us, but the systems we provide significantly enhance their ability to work at a much greater scale and efficiency.”
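How does a studio pull off that kind of rollout without stalling running shows? One common mitigation – sketched below in Python with an invented config format and resolve() helper, not Rodeo FX’s actual system – is to pin every active show to a known-good software environment and migrate shows one at a time.

# Hedged sketch: pin each running show to a known-good environment so
# an OS or Python rollout can proceed show by show. The config format
# and helper are invented for illustration.

SHOW_ENVIRONMENTS = {
    # shows mid-delivery stay pinned to the proven stack
    "show_a": {"python": "3.9", "houdini": "19.5.640"},
    # a lower-risk show trials the new stack first
    "show_b": {"python": "3.11", "houdini": "20.0.547"},
}

def resolve(show: str) -> dict:
    """Return the tool versions an artist's session should launch with."""
    try:
        return SHOW_ENVIRONMENTS[show]
    except KeyError:
        raise SystemExit(f"no environment pinned for {show!r}")

print(resolve("show_a"))  # {'python': '3.9', 'houdini': '19.5.640'}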
OPPOSITE TOP: Pipeline is a critical component of the visual effects process to make the transition of files and information seamless between DCCs. VFX from The Lord of the Rings: The Rings of Power (Image courtesy of Outpost VFX and Prime Video)
Python is a prominent pipeline programming language. “Ninety-nine per cent of our pipeline code is written in Python. The rest is configuration files and the occasional shell script,” explains Nicholas Hampshire, Lead Pipeline TD at Framestore. “That hasn’t changed in over a decade. We use Maya, Houdini, 3D Equalizer, Blender, Mari, Mush3D and other great off-the-shelf software for CGI. All require deep in-house integrations and
“We work in the shadows, but our teams are great developers passionate about visual effects and can do a ton of magic. You have to juggle 20 things at once but always give the feeling that everything is under control. A key part is to stay calm and don’t hesitate to take a step back to make the right decision when I’m trying to ‘put out fires.’”
—Anne-Gaëlle Jolly, Head of Technical Directors, Rodeo FX
customization to ensure we can deliver the best for production. We’re picking up ML and formalizing our CI/CDs to make it easier to test our code before rolling it out.” Mental strength is important.
“It’s reasonable to say that you must be dogged and have a grizzled determination to succeed. Software development is only half the problem. The rest is an ongoing battle against time!” Hampshire gives an example where he had to conquer the clock. “I was left with no alternative but to re-implement a deprecated C++ render-time point instance format within a distributed set of Python expressions embedded in a VEX SOP inside of Houdini and get it to match renderer results within three hours, one idle Thursday night for a major feature film. That was reasonably tricky to do. Worked alright, though.”
The cornerstone of the pipeline is the asset management system.
“The most important thing for the studio is how you manage your assets,” remarks Elizabeth Shiers, Lead Pipeline TD at Outpost VFX. “Where is the central database you store that information? For Outpost VFX, that’s Flow for production tracking. Your DCCs need to plug into that database. It’s about how you get from that database into a DCC. Do what we need to do in the DCC and pass it back to the database. The next person who comes along pulls in again from that database to another DCC, does what they need to do and gets it back out. That process of getting it in and out is where the pipeline comes in, trying to make it seamless between
DCCs. The aim is to make it feel familiar. If you’re a generalist and need to switch from Maya to Houdini to Nuke, you’re already familiar with it: ‘This is how I get things in and out.’ They know the DCCs enough in a vanilla capacity to do what they need to do, and the pipeline is there to pass data from the database into the DCC and then to someone else. That’s how I see it working in the ideal scenario.” Outpost VFX came into existence after most third-party standards were established. “DCCs support the things we’re trying to use. It’s a matter of getting them from one to the other.”
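To make that round-trip concrete, here is a minimal, self-contained Python sketch of the pattern Shiers describes – resolve the latest publish from the central database, work in the DCC, publish a new version back. The AssetDB class and its methods are hypothetical stand-ins, not the actual Flow/ShotGrid API.

from dataclasses import dataclass, field

@dataclass
class Publish:
    asset: str
    version: int
    path: str  # the file on disk the DCC will actually open

@dataclass
class AssetDB:
    """Stand-in for the central production-tracking database."""
    publishes: list = field(default_factory=list)

    def latest(self, asset: str) -> Publish:
        versions = [p for p in self.publishes if p.asset == asset]
        return max(versions, key=lambda p: p.version)

    def register(self, asset: str, path: str) -> Publish:
        version = max((p.version for p in self.publishes
                       if p.asset == asset), default=0) + 1
        pub = Publish(asset, version, path)
        self.publishes.append(pub)
        return pub

db = AssetDB()
db.register("robot_rig", "/publish/robot_rig/v001.usd")

# "Get things in": resolve the latest publish and hand its path to the DCC.
incoming = db.latest("robot_rig")
print(f"open in DCC: {incoming.path}")

# ...artist works in Maya/Houdini/Nuke...

# "Get things out": register the new version so the next artist, possibly
# in a different DCC, pulls from the same database.
outgoing = db.register("robot_rig", "/publish/robot_rig/v002.usd")
print(f"published v{outgoing.version:03d}")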
“The most important part of my job is leading a team to automate all of the boring stuff so that artists can focus on making art,” notes Chad Dombrova, Global Head of Pipeline at Scanline VFX.
“Another significant aspect is taking all of the stories you hear from the different artists in the various departments, seeing the overall trend of these problems and finding the right technology to pair up with them. You’re trying to build a set of technological capabilities that can solve almost any problem. That’s your goal.”
Reducing the time for artists to get iterative feedback has been the goal for the past 30 years. Dombrova adds, “The bar is always being raised as we get closer to that nirvana, which makes it harder to achieve real-time. However, as a principle, the pipeline should be built in a way that technologies are ideally interchangeable. If there’s this new real-time technology and we feel it will have a huge impact, we should hopefully be able to excise the technology
TOP TO BOTTOM: Striking the right balance between human tasks, metadata and automation is incredibly complex, especially with the unique requirements and priorities of different shows. From The Fall Guy (Image courtesy of Rising Sun Pictures and Universal Pictures)
For Brave, Pixar switched to its proprietary animation software Presto. (Image courtesy of Pixar)
Hair was a major technological challenge for Brave. (Image courtesy of Pixar)
it replaces and slot it in. One of the things that machine learning is good at is simply taking a process and making it more efficient, like de-noising. As a pipeline engineer or architect, what you’re looking at is choosing an ecosystem and investing in those types of technologies because, often, the easiest way to get those kinds of technologies into the pipeline is not by making some massive switch. It’s by adopting a DCC software vendor that is already investing in those things for you.”
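Dombrova’s interchangeability principle is easy to sketch in the Python that dominates pipeline code: downstream code depends on a narrow interface, so a new backend – say, an ML denoiser – can be slotted in without touching anything else. The Denoiser protocol and both backends below are invented placeholders, not Scanline’s actual code.

# Minimal sketch of interchangeable backends behind a narrow interface.
from typing import Protocol

class Denoiser(Protocol):
    def denoise(self, pixels: list) -> list: ...

class BoxFilterDenoiser:
    """Placeholder for the existing, conventional implementation."""
    def denoise(self, pixels: list) -> list:
        return [sum(pixels) / len(pixels)] * len(pixels)

class MLDenoiser:
    """Placeholder for a newer, ML-based implementation."""
    def denoise(self, pixels: list) -> list:
        return pixels  # pretend a trained model runs here

def render_post(pixels: list, denoiser: Denoiser) -> list:
    # The caller never knows which backend it got, which is what lets a
    # studio "excise the technology it replaces and slot it in."
    return denoiser.denoise(pixels)

print(render_post([0.2, 0.9, 0.4], BoxFilterDenoiser()))
print(render_post([0.2, 0.9, 0.4], MLDenoiser()))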
“There is this sense that if you are entrenched using a particular software, then before moving away from it you need to weigh the cost-benefit of introducing something completely new or a change to the workflow,” states Claude Martins, Director of Pipeline, VFX at Digital Domain. “About five or six years ago, we switched from using Maya as our primary lighting platform to Houdini. Houdini had just come out with their lighting platform Solaris, which was USD-based, and what we saw there were the advantages of getting our foot in the door for introducing USD into our pipeline, which more and more of the industry is switching to. And the ability to switch to sequence-based workflows. Before, we would have lighters light each individual shot. We were able to introduce this huge efficiency to lighters. While it was quite painful in terms of the effort to switch over, once we switched over we immediately started seeing the benefits. The other example is our crowd software. Despite investing a bunch of effort into our Houdini Crowds pipeline, which is simulation-centric, we looked at the costs and benefits. We decided to switch over to using agent-centric Golaem. It’s always a case-by-case basis.”
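As a rough illustration of the sequence-based workflow Martins credits to USD, here is a short sketch using the open-source pxr Python bindings: a sequence-level layer carries the lights once, and each shot layer stacks on top with only its own overrides. The file, prim and attribute values are invented, and this is not Digital Domain’s actual setup.

# Hedged sketch of sequence-based lighting with OpenUSD layering.
from pxr import Sdf, Usd, UsdLux

# Sequence-level layer: lit once, shared by every shot in the sequence.
seq_stage = Usd.Stage.CreateNew("seq010_lighting.usda")
key = UsdLux.DistantLight.Define(seq_stage, "/World/Lights/keyLight")
key.CreateIntensityAttr(5000.0)
seq_stage.Save()

# Shot-level layer: sublayers the sequence and overrides only what this
# shot needs, so lighters no longer have to light every shot by hand.
shot_stage = Usd.Stage.CreateNew("seq010_sh020.usda")
shot_stage.GetRootLayer().subLayerPaths.append("seq010_lighting.usda")
over = shot_stage.OverridePrim("/World/Lights/keyLight")
over.CreateAttribute("inputs:intensity", Sdf.ValueTypeNames.Float).Set(7500.0)
shot_stage.Save()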
Generally, the fundamental tools used by Pixar have remained relatively stable. “There are moments where we have a big transition point,” states Ariela Fedorov, Technical Director at Pixar. “For Brave, we switched to our new proprietary animation software Presto, and for Finding Dory, we changed to a new version of RenderMan RIS and Katana.” Technological development is
divided into three areas: R&D explores cutting-edge technology, tools take a long-term view of pipeline and software engineering, and global technology deals with technological issues and provides support for specific shows. “Soul was our first film with volumetric characters, and we’ve had other shows since then like Elemental. We didn’t have a way to convert geometry into a volume, so we had to make a pipeline for it; that pipeline has been developed and become a central pillar of our workflow.” Art and technology impact each other. “What is healthy for a show is to have the art influence the technology that we need to build and support, but also have the technology influence the art because we’re not working in a vacuum. There are constraints such as schedules and team size. We need to ensure that the artistic look we land on is something that can be executed and scaled.”
A specialized approach is taken at Sony Pictures Imageworks where the pipeline team supports layout and animation. At the same time, look development creates the specialized tools to achieve the desired visual aesthetic of a project. “We mostly work within Maya, at least for CG shows, while in visual effects, we have more support for the back-end department such as Katana and Nuke,” states Diana Lee, Pipeline TD at Sony Pictures Imageworks, who also assists in providing assets to other visual effects studios. “Especially if a rig is being shared, we have a workflow that pulls out all of our SPI plug-ins before we send it out. That file can then be opened in a vanilla version of Maya. You don’t need any plug-ins. If we’re sending just the geometry, it can be sent using Alembic caches that don’t have proprietary info. We have also recently been using USD publishes to send info to other studios. It’s a challenge to keep track of everything we need to send, what we can send, and how we can send that efficiently because those requests always come in.” The really difficult part of the pipeline is understanding the issue being brought forth. “We have to be good communicators and, on top of that, be good on the technical side.
The pipeline team works in the shadows doing a ton of magic for shows. (Image courtesy of Netflix)
Even though there are many crossovers between animation and visual effects pipelines at Sony Pictures Imageworks, differences between them do exist. From Ghostbusters: Frozen Empire (Image courtesy of Sony Pictures Imageworks and Columbia Pictures)
TOP: One of the most challenging parts of pipeline is understanding the issue being brought forth, which makes communication critical. Screenshot from Spider-Man: Across the Spider-Verse. (Image courtesy of Sony Pictures Imageworks and Sony Pictures)
BOTTOM: Sony Pictures Imageworks has a unique setup where the pipeline team supports layout and animation while look development creates the specialized tools to achieve the desired visual aesthetic of a project. Screenshot from Spider-Man: Across the Spider-Verse (Image courtesy of Sony Pictures Imageworks and Sony Pictures)
“We have to be good communicators and, on top of that, be good on the technical side. The fun part of being in the pipeline is being that glue between the artists and development. The artist tells you something that they want, and it’s your job to clarify, ‘Do you need A, B and C?’ Sometimes, what they’re asking for is already possible, but maybe they don’t know about it, or other occasions, it’s like, ‘Wow. That’s a good idea. We’re going to put our heads together and make something out of it.’”
—Diana Lee, Pipeline TD, Sony Pictures Imageworks
The fun part of being in pipeline is being that glue between the artists and development. The artist tells you something they want, and it’s your job to clarify, ‘Do you need A, B and C?’ Sometimes, what they’re asking for is already possible, but maybe they don’t know about it, or other occasions, it’s like, ‘Wow. That’s a good idea. We’re going to put our heads together and make something out of it,’” Lee says.
Standardization exists within the visual effects industry. “At the moment, the file format standards are open-sourced, so all of the applications we use can write the same file formats, meaning that the things we generate are highly portable,” observes Elliott Smith, Head of Pipeline at One of Us. “That has now been put on steroids with USD where it’s not only things we can share with other companies or people like models or renders but full complex scenes with lots of things going on.” A lot of information must be stored in a database such as ShotGrid. Remarks Smith, “What are the relevant details about that thing the next person needs to know? Preserving that information is important because we can only automate if we’ve got the right input and what we’re trying to output. How do we move things through the pipeline and make it all work? We have to concentrate on, ‘Is the thing that we made technically correct?’ There is a big focus on validation checks. We do need someone to look at it in dailies and say, ‘Yes. Ready for the next person.’” An overall view of the workflow has to be taken into consideration when devising solutions. “If you’re making something for mass use, you have to go through all the departments and ask, ‘What do you need?’ You can’t assume that everyone needs the same thing. We don’t want to bias the pipeline to one particular individual, shot or show. We rely a lot on our HODs and supervisors on shows to guide us,” Smith adds.
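Those validation checks are simple to picture. Below is a minimal Python sketch of a publish-time QC pass – each check confirms the published thing is “technically correct” before a human signs off in dailies. The individual checks and the asset dictionary are invented examples, not One of Us’s actual suite.

from typing import Callable, Optional

Check = Callable[[dict], Optional[str]]  # returns an error string or None

def naming_convention(asset: dict) -> Optional[str]:
    if not asset["name"].islower():
        return "asset name must be lowercase"
    return None

def required_metadata(asset: dict) -> Optional[str]:
    missing = {"show", "shot", "version"} - asset.keys()
    return f"missing metadata: {sorted(missing)}" if missing else None

CHECKS = [naming_convention, required_metadata]

def validate(asset: dict) -> list:
    """Run every check; an empty list means the publish may proceed."""
    return [err for check in CHECKS if (err := check(asset)) is not None]

for err in validate({"name": "Hero_Robot", "show": "demo", "version": 3}):
    print("FAIL:", err)
# FAIL: asset name must be lowercase
# FAIL: missing metadata: ['shot']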
Even though there are a lot of crossovers between animation
and visual effects, pipeline differences do exist. “All of the major tools are the same,” notes Larry Gritz, Software Architect & Sony Distinguished Engineer at Sony Pictures Imageworks. “There are stages of the pipeline that are more relevant to one than the other. Digital doubles or face replacements are not required for animated shows. The biggest difference these days in many ways is that on the animated shows we do the whole thing, whereas for the live-action effects, almost everything is split between four or five major studios.” Each department picks a central software program based on what works best for them. Says Gritz, “Pipeline is concerned with figuring out how things will flow through these different applications that come from several different companies. The custom software is written either by the pipeline or software group. When I started, every DCC that existed was its own universe, and every studio was a siloed thing with a bunch of custom development that was nothing like what you find in a different studio, even though they were working on similar things. Now, there’s a lot more commonality, and part of this is the commonality of the foundational software that we’re all using and working on together. Also, on the visual effects side, the need to work with the other studios on the same shows and be able to exchange assets and deliver uniform results to clients is driving more standardization.”
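The USD interchange that Gritz and Smith describe is concrete enough to sketch. The snippet below writes a tiny plain-text USD stage with Pixar’s open-source pxr Python bindings; the prim names and output path are arbitrary examples, not any studio’s actual publish layout.

    # Minimal sketch of a portable USD hand-off, assuming Pixar's pxr
    # Python bindings are installed. Paths and prim names are hypothetical.
    from pxr import Usd, UsdGeom

    stage = Usd.Stage.CreateNew("/tmp/asset_handoff.usda")
    root = UsdGeom.Xform.Define(stage, "/Asset")
    UsdGeom.Mesh.Define(stage, "/Asset/Body")  # geometry authored here
    stage.SetDefaultPrim(root.GetPrim())
    stage.GetRootLayer().Save()  # plain-text .usda any USD-aware DCC opens

Because the result is an open format rather than a native scene file, the receiving studio needs no proprietary plug-ins to read it, which is exactly the portability the open-source standards provide.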
Complex tasks revolve around cross-departmental workflows. “No one works in isolation, so any change in process can have significant impacts both upstream and downstream,” remarks Anthony Winter, Pipeline and Software Supervisor at Rising Sun Pictures. “One area we’ve been refining is the QC workflow. The goal is to ensure that each department’s output meets the needs of all subsequent teams while managing the smooth flow of assets through the pipeline. This involves determining how
TOP TO BOTTOM: Maya, Nuke and Houdini remain the core software programs for visual effects studios. From Masters of the Air. (Image courtesy of Whiskytree and Apple TV+)
Whiskytree deploys a mixture of generalists and specialized digital artists when working on productions like Masters of the Air. (Image courtesy of Whiskytree and Apple TV+)
Updates are done by Whiskytree’s tool developers in the middle of the night or when fewer people are working. From Masters of the Air. (Image courtesy of Whiskytree and Apple TV+)
TOP TO BOTTOM: Rodeo FX had to roll out an OS change and a Python version change across the render farm and floor while running a full slate of shows. From Monarch: Legacy of Monsters. (Image courtesy of Rodeo FX and Apple TV+)
It is important to remain calm when putting out fires in order to make the right decisions. Screenshot from House of the Dragon. (Image courtesy of Rodeo FX and HBO)
The systems provided by the pipeline enable artists to work at a higher level on productions such as Foundation Season 2. (Image courtesy of Rodeo FX and Apple TV+)
to present the right visual representations to the right people at the right time, gaining approvals efficiently and quickly moving assets forward in the pipeline. Technology alone cannot address all these challenges. Striking the right balance between human tasks, metadata and automation is incredibly complex, especially when accounting for the unique requirements and priorities of different shows. Improvements are ongoing, and there’s still much to be done to optimize this process.” There have been technical highlights. Recalls Winter, “A recent proud moment came when we successfully scaled our render farm to nearly 10 times its compute capacity, leveraging both on-premises hardware and multiple cloud-based compute facilities. This achievement wasn’t just about increasing compute power. It meant handling 10 times the jobs, 10 times the data and countless high-priority requests. Essentially, everything ramped up to 11, and we managed to keep it all running seamlessly.”
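Scaling a farm across on-premises hardware and cloud pools, as Winter describes, ultimately comes down to a dispatch decision per job. Below is a minimal sketch of that decision; the pool names, costs and capacity numbers are invented, and a real facility would use a farm scheduler rather than hand-rolled logic like this.

    # Minimal sketch of bursting render jobs from an on-prem farm to
    # cloud pools when local capacity is exhausted. All names and
    # numbers are hypothetical.
    def pick_pool(job, pools):
        """Prefer the cheapest pool with enough free slots; burst to
        cloud only when the on-prem farm is saturated."""
        for pool in sorted(pools, key=lambda p: p["cost_per_core_hour"]):
            if pool["free_slots"] >= job["cores"]:
                return pool["name"]
        return "queued"  # nothing free: wait rather than over-provision

    pools = [
        {"name": "onprem",  "free_slots": 0,   "cost_per_core_hour": 0.02},
        {"name": "cloud-a", "free_slots": 512, "cost_per_core_hour": 0.09},
    ]
    print(pick_pool({"cores": 64}, pools))  # -> "cloud-a"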
Each project at Cinesite is given a separate pipeline, though there are occasions when the same one is used. “We use something called Git, which is the backbone for all of the software,” states Sherwin Shahidi, Head of Production Technology at Cinesite. “If overnight one of these pipelines breaks, we have GitLab, which allows us to place a temporary pipeline right away that doesn’t have the problem so the project can continue. We have a timestamp of all the steps that we have done.” While visual effects and animation utilize the same pipeline, there are quite a lot of differences. “Feature animation requires better rigging and blend shapes. The lighting depends on the vision of the art director and the director. We have to go back and forth quite a lot versus visual effects, where some shots are full CG, but for the majority of them, we need to match something that is in real life. Cinesite made a conscious decision to use the same pipeline, and we make customizations at the department level for what they need. All of our pipeline engineers work on the same pipeline, whether it’s feature animation
or visual effects. If you look at the water in Moana, it is exemplary and uses visual effects techniques. Over the past 10 years, quite a lot of elements have become shared. Last year, we started a visualization division, which covers techvis, previs and postvis. The idea is to turn the director’s vision into a workable visualization as quickly as possible, and we use Maya and Unreal Engine for that. The turnaround to get something is usually a maximum of four weeks. That’s where real-time has helped.”
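Shahidi’s overnight fix is essentially what release tags in Git enable: when a morning’s pipeline release breaks, the show is pointed back at the last tag that worked while the bad release is repaired. A minimal sketch, with hypothetical repository paths and tag names rather than Cinesite’s actual layout:

    # Minimal sketch of rolling a show back to a known-good pipeline
    # release tag. Repository path and tag name are hypothetical.
    import subprocess

    def pin_pipeline(repo_path, good_tag):
        """Check out the last release tag that passed validation so the
        show keeps running while the broken release is fixed."""
        subprocess.run(["git", "-C", repo_path, "fetch", "--tags"],
                       check=True)
        subprocess.run(["git", "-C", repo_path, "checkout", good_tag],
                       check=True)

    # e.g. pin_pipeline("/pipe/show_a", "release-2024.06.2")

Because every release is a commit, the history doubles as the timestamped record of steps that Shahidi mentions.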
Whiskytree has a mixture of generalists and specialized digital artists. “In our case, it has worked out well where the 3D artists are generalists and can dabble in a lot of different things,” explains JP Monroy, CG Supervisor and Director of Artist Development at Whiskytree. “But there are occasions when you need someone who is so specialized and focused on one tool that they’re the only ones who can work on it, so you need to cater to them.” Pipeline alterations are handled delicately. “Updates are done by our tool developers in the middle of the night or when fewer people are working. That way, it’s the least disruptive. Software is getting updated all the time. Most studios and people want to stay on a version to complete a project, so you can take the time to update and test out new things during that in-between period.” For the past three decades, the core computer programs in the visual effects industry have remained the same. Observes Monroy, “Maya, Nuke and Houdini stick around because they’ve been the basis for a lot of creative work, and a lot of studios have built a pipeline and workflow around them. Changing all of that at once is like trying to turn a tanker. It will take way too long and way too many resources. However, as other studios come up with people who have no experience in those tools, they might start to try new things and make less use of them. For now, it seems they’re the standards, and it’s how many of us are used to working, but things are constantly changing. Who knows what five years from now will look like!”
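Staying on one version for the life of a project, as Monroy describes, is usually enforced by per-show pins rather than discipline alone. A minimal sketch of the idea follows; the package names and versions are invented, and studios commonly use a package manager such as Rez for this in practice.

    # Minimal sketch of per-show software pinning so mid-project updates
    # never change what artists launch. Versions here are hypothetical.
    SHOW_PINS = {
        "show_a": {"maya": "2024.2", "nuke": "14.0v5", "houdini": "20.0.625"},
        "show_b": {"maya": "2025.1", "nuke": "15.1v1", "houdini": "20.5.278"},
    }

    def resolve(show, app):
        """Return the version a show is locked to; new releases are tested
        in the quiet in-between period before a pin is moved."""
        return SHOW_PINS[show][app]

    print(resolve("show_a", "nuke"))  # -> "14.0v5"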
The volumetric characters of Elemental were built off the technology developed for Soul. (Image courtesy of Pixar)
The asset management system is the cornerstone of the pipeline. From Knuckles. (Image courtesy of Outpost VFX and Paramount+)
By OLIVER WEBB
Visual effects are in high demand across several industries worldwide and are now present in developing regions that are continuously expanding and evolving. Regions including Eastern Europe, South Africa and the Middle East have seen major shifts in demand in recent years. As these businesses grow, they have the potential to become significant players in the global VFX community.
Eastern Europe has become a key player in the industry over the last 10 years. Imaginary Pixels is a boutique VFX studio in Poland, formed in 2019 by Lukas Remis while he was still actively working abroad as a freelance compositor. “By the end of 2021, running the studio became my main occupation,” he states. “We are located in Gliwice, Poland, far from Warsaw, the main film production center. We mainly work on film and episodic productions; only around 30% are Polish productions. Between 2023 and 2024, we worked on a large Polish war movie, Red Poppies, for which we created more than 120 VFX shots of various types. Some of our most interesting international projects include the 60th Anniversary Doctor Who episode ‘Wild Blue Yonder,’ and we just finished working on [the film] Return to Silent Hill. We also had a small gig on the Paris Olympics closing ceremony clip with Tom Cruise. Our main areas of work are compositing and CG environments, but we’re able to cover every aspect of VFX production. So far, our teams have been up to 15 artists, depending on the project.”
Remis’ main roles within the company include VFX producer and supervisor. He believes the growth opportunities involve larger participation in the Polish film market. “There are around 100 movies being produced in Poland each year, and some streaming platforms are very active in producing local content here,” he says. “Especially Netflix, which opened its central and eastern European office in Poland. However, there’s quite a bit of competition here, with around 10 major local VFX studios. As
“There is an immense interest/passion for film, VFX, animation and interactive technologies [in Africa], meaning that Africa – with a 1.5 billion population – is poised to be a major creative hub producing content and consuming its own content. We already see this trend in relation to music.”
—Nosipho Maketo-van den Bragt, Owner and CEO, Chocolate Tribe
most of our projects come from abroad, we’re still affected by the slowdown in U.S. production caused by the strikes and COVID. Hopefully, we’ll finally see a rise in 2025.”
Another Eastern European VFX studio that has emerged in the last few years is StaticVFX Studio. Founded in 2014, StaticVFX is a Bucharest-based VFX studio whose portfolio includes projects for Netflix, HBO and ZDF, as well as animation projects like short films, proof-of-concept pieces and promotional storytelling for immersive installations and tourism activations. Co-Founder and VFX Supervisor Alexandru-Alin Dumitru is in charge of business development, while Liviu Dinu, also Co-Founder and VFX Supervisor, is in charge of CG and the pipeline. “At our size, one of our strengths is that everyone still has a hand in the day-to-day creation of the work, so we’re not disconnected from the process,” Dumitru notes. “None of us are very far from the clients, the tools or the technology we use to push our work forward. It also creates a very team-oriented atmosphere where we are all in constant dialogue, which leads to strong bonds and long-term relationships. These responsibilities change depending on the project, experience and how the team size develops, but our values remain the same.”
Dinu and Dumitru wanted to create an environment where artists could feel like part of a team that grows from project to project and can take on challenging creative and technical ways of expressing themselves. “It’s always been our dream to have all the tools at our hands for testing and developing interesting visuals to support stories. So, in 2014, we thought we should give it a go and see where this path would take us. This vision of a world-class VFX and animation environment in Romania relies heavily on the wonderfully hard-working and creative artists from here, creating an opportunity to work on amazing projects in Romania,” Dumitru says.
VFX has matured into a global market where companies collaborate to tell stories worldwide. “When I think about the region, I first think of the European scale,” Dumitru explains. “Each culture has a unique identity and stories that are just now starting to find their way to audiences around the world. Soon, small and large media companies will have the opportunity to tap into these stories and tell them authentically. On the service side of the business, one advantage that Europe provides is varied rebates and tax incentive structures, so most projects can put together packages that are competitive in value and price. We are fortunate that the Romanian government is committed to the industry and has established a strong 30% rebate program because they see the value it can bring to the economy. We often partner with other European studios to create combined solutions for our clients. There are numerous examples where companies across Europe and Eastern Europe have risen to the occasion when given the opportunity,
TOP TO BOTTOM: Established in 2020 in Dubai Studio City, Nested VFX is a post-production and visual effects studio offering world-class video editing, color grading, motion design, 2D and 3D computer graphics, VFX, video compositing and video mastering. (Images courtesy of Nested VFX)
delivering outstanding results and setting a positive precedent for future collaborations.”
VFX activity has been relatively slow globally, and the Romanian VFX industry has not been immune to that. “On the positive side, we have been able to capitalize on some of the slower periods to invest in our pipeline and tooling, experiment with Unreal and USD workflows, and explore how we can integrate AI efficiencies,” Dumitru adds. “Bearing in mind that Romania is slowly starting to emerge as a player in the VFX and animation industry, there is a lot of potential to explore, which has been untapped so far. In talking to my peers in Romania and Europe more broadly, many share this outlook; pent-up excitement is waiting to be released. Another important aspect of any industry is the surrounding community, which often acts as a catalyst and fosters a symbiotic relationship with the companies and players in that sector. This idea has inspired us to take an active role in building and supporting a VFX community in Romania.”
The African VFX industry has shown the potential to make the continent a leading region for visual effects. Countries including South Africa, Nigeria, Kenya and Egypt lead the way. While the visual effects industry on the continent is still developing, the animation industry is booming.
For Nosipho Maketo-van den Bragt, Owner and CEO of Johannesburg-based Chocolate Tribe VFX & Animation, the future for VFX and animation in South Africa looks bright despite the challenges. “There is a narrative shift emerging from South Africa with creators aiming for world-class, high-production-value content. This mindset reflects a growing confidence in local talent and the desire to compete on a global stage, showcasing Africa’s capacity for excellence in VFX, animation and storytelling. There is an immense interest/passion for film, VFX, animation and interactive technologies, meaning that Africa – with a 1.5 billion population – is poised to become a major creative hub producing content and consuming its own content. We already see this trend in relation to music.”
Explains Maketo-van den Bragt, “South Africa started as a live-action film service destination for major blockbusters due to its diverse landscapes, favorable exchange rates, highly skilled on-set crews and strong infrastructure. However, of late, the creative industry has expanded into post-production and original content development, which signals its broader capacity in relation to VFX, animation and creative storytelling. According to the National Film and Video Foundation Movies and Entertainment Market Report, there has been growth in animation studios in 2023 [499.36 million USD].”
Maketo-van den Bragt notes that South Africa’s appeal as an active partner has increased with direct flights from film hubs in North America, Europe and Asia. From a VFX perspective, South Africa is a tried and tested film destination. Some of the most successful global blockbusters, such as The Kissing Booth, Black Panther, The Woman King and the upcoming Mission: Impossible – The Final Reckoning, have been partly or wholly filmed in South Africa. Since English is one of the primary languages, there are no language barriers, “meaning there exist
vast opportunities for collaboration and co-productions.”
“The global appetite for high-quality African content is a major driver of growth in South Africa’s VFX and animation industry,” Maketo-van den Bragt states. “International audiences are increasingly interested in stories that reflect the African continent’s diversity, culture and unique perspectives. According to market projections, South Africa’s animation industry is set to grow at a robust rate of 7.5%–7.78%, and the market reflects steady growth in Africa as a whole. This has created opportunities for South African VFX and animation studios to position themselves as key players in producing and delivering compelling content that resonates on a global scale.”
Film and creative festivals like AVIJOZI, Comic Con Africa and other local festivals have become vital for showcasing thought leadership and African talent, allowing collaboration and driving industry growth, according to Maketo-van den Bragt. AVIJOZI, a Chocolate Tribe and Netflix collaboration, focuses on animation, VFX and interactive technology and has seen exponential growth since its inception in 2022, doubling attendance each year and drawing a combined audience of 4,200 over two days in 2024. Held in Johannesburg, a hub of 15 million people, the festival connects creatives, tech innovators and industry leaders while highlighting cutting-edge fields like AR, VR and gaming.
Sunrise Animation Studios, a leading South African studio, has historically been a small company that has done primarily episodic and feature work for streaming platforms as well as some sports promotionals. Phil Cunningham and his wife Jacqui started Sunrise in 2001 in Harare, Zimbabwe, where they lived at the time.
“Apart from the fact that Zimbabwe’s film industry and content market is relatively tiny, we were also completely unqualified at the time to try to set up a film business,” Cunningham admits. “We had built an agricultural trading business that was doing well, but for as long as I can remember I’ve loved story and specifically film
TOP: JJ Agency is an international video content production house based in Dubai specializing in production and postproduction for social media platforms, a wide range of VFX and commercial video production. (Image courtesy of JJ Agency)
BOTTOM: JJ Agency Films offers cutting-edge drone videography services for film, commercials and events, capturing aerial footage that showcases “the beauty and grandeur” of Dubai. (Image courtesy of JJ Agency Films)
TOP: Impreuna, Trup Si Suflet (Together As One) (2022) is a full-CG stop-motion short film co-produced by StaticVFX and co-directed by StaticVFX’s Liviu Dinu. (Image courtesy of Imaginary Pixels and Saga Film)
MIDDLE AND BOTTOM: StaticVFX contributed to the 2022 Netflix film Troll, with a sequel in production. (Image courtesy of StaticVFX Studio and Netflix)
and had this crazy dream of becoming a filmmaker. I also have an incredibly brave wife who encouraged me not to play it safe and to chase that dream. She had read a saying that if an eagle chases two rabbits, both will get away, so we closed down our agricultural business and focused on starting an animation business.”
Their first project was a stop-motion animated feature film called The Legend of the Sky Kingdom, which was shot from a first-draft script without any storyboards. “Mainly because we didn’t know any better,” Cunningham admits. “Unsurprisingly, it wasn’t a huge commercial success, but I will say that it was visually quite unique and got a lot of attention, having worked with amazing Zimbabwean street artists to create models and our set from all kinds of recycled objects and materials. By the time we finished Legend, I knew that CG animation would be a far more versatile and powerful medium for us to focus on. So we built a tiny CG department and made a five-minute animation test that initially wasn’t intended for anything more than trying our hand at CG animation. That short became the first episode of the Jungle Beat series, now into its eighth season, and we’ve also recently completed our second Jungle Beat feature film.”
Sunrise decided to move the studio to Cape Town mainly to position itself strategically in terms of access to talented local animation professionals and being more closely connected to the global industry. “We’ve been based in Cape Town for 20 years now and have had the privilege of working with world-class professionals locally and from all over the world who have all played a vital role in our growth as a studio,” Cunningham says. “Initially, we took on a lot of service work, mainly working with New York and London-based agencies, which helped us grow our capabilities while keeping the business afloat. Still, we always tried to be very intentional about developing IP and trying to move increasingly towards film and series projects. Our studio motto is ‘Inspire Through Story,’ which is the best way to describe the kind of projects we gravitate towards.”
Cunningham observes, “The industry, in general, has been disrupted quite significantly over the last decade or so, with the rise of the major streaming platforms and the shift of audiences towards short-form content – especially younger audiences, which has a bearing on animation. The commercial model for various levels of film and TV production is less clearly defined, and with the relatively long production timeframes of CG animation, there’s always the possibility of quite significant market shifts while a film is in production. It’s quite a balancing act to build permanent capacity for ambitious productions without having overheads become too bloated or losing the agility to adapt to dynamic market conditions. In 2025, the rise of AI will create challenges and opportunities, and we are trying to set ourselves up to be agile and able to adapt.”
Driven by its diverse locations, skilled workforce, competitive production costs and strong government support, Cape Town offers vibrant opportunities for visual effects and animation. “The city attracts international productions and serves as a growing hub for animation,” says Tim Keller, Producer at Sunrise. “Cost-effective yet high-quality production capabilities supported by a
skilled talent pool trained in CG, VR and motion capture further enhance Cape Town’s appeal. Institutions like The Animation School and SAE Institute offer specialized programs in animation and VFX. Facilities such as Cape Town Film Studios provide world-class infrastructure, while government incentives, including production rebates and co-production treaties with major countries, encourage local and international projects. Institutions like the Cape Town Film Commission and National Film and Video Foundation assist with funding, training and logistics, while festivals like the Cape Town International Animation Festival (CTIAF) showcase local talent. Despite challenges in financing for smaller studios, Cape Town’s potential to expand into global markets and integrate advanced technologies positions it as a competitive player in the global entertainment industry.”
Keller continues, “South Africa is an emerging hub for animation talent, and while the local pool isn’t yet large enough to fully support premium feature animation, we are proud to contribute to its growth. With the global animation and visual effects industry evolving rapidly, we are keeping a close eye on the international labor market. In all of these areas, the challenges we face are accompanied by tremendous opportunities for growth, innovation and impact.”
The Middle East is another region gaining global recognition, accelerated by government and media support, with a high ceiling for growth over the next decade. Founded in 2016, JJ Agency is a Dubai-based visual effects and content production studio with 80 employees specializing in animation, commercials, TV series and feature films. “The United Arab Emirates is a major center for film and television production, thanks in part to the generous film tax breaks that have made it attractive,” states Gary Fedorenko, Co-Founder, CEO and VFX Supervisor of JJ Agency. “Dubai and the UAE are rapidly becoming major players in the global media landscape. This region is characterized by its forward-thinking approach and strategic investments in infrastructure and technology, making it a hotbed for innovation and creativity. The UAE’s commitment to economic diversification has led to significant support for the media and entertainment sector, attracting international studios, talent and productions.”
Fedorenko continues, “Dubai’s strategic location as a global hub offers numerous advantages, including its accessibility to diverse markets, world-class infrastructure and business-friendly environment. The UAE’s tax incentives, such as the low VAT and corporate tax rates, further enhance its attractiveness for businesses and investors in the creative industries. This combination of factors creates a fertile ground for the continued growth of video production, VFX and animation in the region.”
With growth comes growing pains. Fedorenko comments, “While the UAE has made significant strides, the availability of a skilled local workforce remains a key challenge. To further solidify its position in the global market, the UAE needs to invest in nurturing local talent and attracting experienced professionals from around the world.” He says education and training programs in VFX, animation and video production are needed. “By fostering a robust talent pool, the UAE can reduce its reliance
StaticVFX contributed 128 shots that encompassed cleanups, set extensions and greenscreens to the 2023 Spy/Master HBO Max series.
(Image courtesy of StaticVFX and HBO Max)
A collaboration of Johannesburg, South Africa-based Chocolate Tribe and Netflix, AVIJOZI is a popular two-day, free-entry, social-impact initiative and industry event that focuses on animation, VFX and interactive technology. The festival, which has seen exponential growth since its inception in 2022, connects creatives, tech innovators and industry leaders while highlighting cutting-edge fields like AR, VR and gaming.
(Image courtesy of Chocolate Tribe)
on outsourcing projects to other countries and establish itself as a self-sufficient hub for creative excellence. Despite the talent gap, working in the UAE’s VFX and video production industry offers unique opportunities. While the UAE is still in the process of building a robust local talent pool, the opportunities for knowledge transfer and skill development are immense, paving the way for a bright future in the industry.”
When Samer Asfour, Chief Executive Officer of Nested VFX in Dubai, first arrived in the UAE 12 years ago, finding talented professionals in the region was a challenge. “Whenever we needed to hire new talent, we often had to source them from overseas. Reliable freelancers were equally scarce,” he says. “However, the landscape has transformed significantly. The UAE’s introduction of initiatives for freelancers, such as flexible residency visas and the absence of individual income tax, has positioned the country as a thriving hub for creative and production teams.”
Asfour notes, “On another level, the advertising industry has witnessed significant growth over the past decade. Dubai, long recognized as a hub for pan-Arab and international commercials due to its diverse pool of multi-ethnic talent, unique desert landscapes and the presence of regional headquarters, has further solidified this reputation. This dynamic is more pronounced than ever, as the city continues to attract global campaigns and productions, making it a focal point for the advertising world.”
Local governments in Abu Dhabi and Dubai offer incentives to attract international films and series to be shot locally, Asfour says. “These initiatives take full advantage of the region’s year-round sunny weather and distinctive landscapes, further enhancing the UAE’s appeal as a premier filming destination.”
Asfour is candid about the challenges. “The VFX and animation industries still have significant ground to cover, as many producers continue to outsource their post-production work abroad. However, the few locally-based studios are making notable efforts to build trust in the local VFX market. These studios are striving to attract a growing share of local projects that are currently being lost to international post-production houses.
“One of the biggest challenges we face is competing with international VFX studios that have greater access to local talent, benefit from a lower cost of living and operate with reduced overall expenses.”
A youth movement is energizing growth. “Universities have played a pivotal role in this evolution,” Asfour remarks. “Many students born in the early 2000s to expatriate families in the UAE have pursued studies in media and production at local institutions. Today, these graduates are becoming vital contributors to the region’s production market. The practice of mentoring interns and training them on the job has further enriched the industry, fostering a new generation of skilled professionals and strengthening the local talent pool.”
Based in Giza, Egypt, Trend VFX, with 50-200 employees, offers high-end VFX services for feature films, TV, animation, commercials and post-production. Mohamed Tantawy, CEO and VFX Supervisor, also sees the Middle East as a region on the rise. “I see significant positive indicators for the growth of visual effects
production in Egypt in particular, as well as across the Middle East. Focusing on Egypt specifically, it’s worth noting that the country is one of the earliest film producers worldwide, with a cinematic history that exceeds 100 years. Over the decades, Egypt has developed a wealth of talented professionals and high-quality skills in various fields of filmmaking.”
Tantawy believes Egypt has caught up with the best production facilities in the world. “In the past 10 years, VFX in Egypt has advanced notably and now rivals global standards. There have already been numerous international projects from Canada, the U.S., several East Asian countries and Europe that have relied on Egyptian teams to handle their VFX work. We at Trend have been fortunate to collaborate on projects from these regions. Additionally, in the last decade, you can observe the presence of many Egyptian and Arab VFX artists in some of the world’s largest studios, and local productions have also shown remarkable improvements in the quality of their visual effects.”
Egypt’s emergence in the global production and VFX marketplace has been a personal dream of Tantawy for years. He believes Egypt can become a key player because of its highly competitive pricing, large talent pool, government support and technological shift. Tantawy explains, “Due to the exchange rate between the Egyptian pound and the U.S. dollar, production costs – such as rent, overhead and salaries – are often significantly lower in Egypt. Taxes can also be lower than in other markets, which provides further financial incentives.”
Tantawy continues, “In recent years, there has been a major expansion in film and VFX education in Egypt, driven by both government and private initiatives. The rise of the internet and digital resources has made high-level knowledge and tools more accessible. Younger generations are especially enthusiastic about learning these skills as they see VFX and related fields as a promising creative industry with strong earning potential. Egypt, like many Arab countries, has a young population eager to pursue opportunities in film and television – industries in which Egypt has deep historical roots and cultural significance. Language barriers are not a significant obstacle since a large segment of the population speaks English [taught from early school years]. Additionally, hiring employees in Egypt is relatively streamlined, and there’s a healthy pool of well-trained candidates available to join VFX and production teams.”
Through various initiatives and educational programs, the government has been actively encouraging young people to begin careers in technology-driven fields, including VFX. Tantawy adds, “There is also growing recognition of the value VFX and film production bring to the economy, prompting efforts to facilitate and streamline company operations in these domains. Over the last decade, we have seen a surge in graphics-centric projects requiring globally competitive quality. With continued support, training and investment, I believe Egypt is positioned to play a substantial role in the global VFX and production marketplace.”
BOTTOM TWO: Based in Giza, Egypt, Trend VFX offers high-end VFX, animation and design services for feature films, TV and advertising. The company also provides compositing, matte painting, rig removal, set extension, environments and other VFX services. (Images courtesy of Trend VFX)
By OLIVER WEBB
The visual effects industry is soaring, with effects in high demand. While the average budget for an effects-heavy Hollywood film is estimated at $65 million, the number is much lower for independent films, which fall below the $15 million mark and are often far lower. Regardless of the project, visual effects have become an essential tool for filmmakers, and many independent films have shifted towards relying on effects over the last few years. Impressively, Godzilla Minus One, winner of the 2024 Academy Award for Best Visual Effects, was made on a $15 million budget; among that year’s nominees, The Creator had the next-lowest budget at around $80 million. Despite high costs and pressure for high-quality visual effects, a variety of independently-produced films and series throughout the last few years have impressed with their exceptional effects on a limited budget.
TOP: 2024 Best Visual Effects Academy Award winner Godzilla Minus One, made on a $15 million budget, required 610 visual effects shots. (Image courtesy of Toho Co., Ltd.)
OPPOSITE TOP TO BOTTOM: All Quiet on the Western Front, nominated for Best Visual Effects at the 2023 Academy Awards, required approximately 500 VFX shots. (Image courtesy of Netflix)
Evil Dead Rise (2023), a film reliant on VFX, was realized by managing repetition and being aware that more shots could add to the budget. (Image courtesy of Warner Bros. Pictures)
The Favourite (2018) was an example of SSVFX working on a storytelling film where the VFX needed to be invisible. Approximately 150 visual effects shots were required for the film. (Image courtesy of Searchlight Pictures)
Moon (2009) was renowned for its impressive visual effects captured on a modest budget, an example of Cinesite delivering high-quality visual effects for a low-budget film. (Image courtesy of Sony Pictures)
At the 2023 Academy Awards, All Quiet on the Western Front was nominated for Best Visual Effects with a budget of only $20 million. To put that into context, Avatar: The Way of Water, which scooped the award, had an estimated budget of $350 million. From the beginning, it was clear that All Quiet on the Western Front would have a modest budget both during filming and in post-production. Viktor Müller of UPP VFX served as the film’s Visual Effects Supervisor. “What helped was that director Edward Berger and I knew from the start that digital effects should not heavily impact this film, as that would undoubtedly diminish its credibility and realism,” he explains. “We agreed from the beginning to use as many traditional methods and practical elements as possible. This meant limiting the use of bluescreens, not only due to production challenges but also to enhance the realism of the final composition. What initially seemed like a compromise turned out to be an advantage for us.”
All Quiet on the Western Front – along with Parasite, Fanny and Alexander and Crouching Tiger, Hidden Dragon – is one of the most-awarded non-English-language films in Academy history. As a result, it has helped boost the German film industry, and the German government has also pledged a new tax plan aimed at attracting more international productions to shoot in the country. Similarly, the U.K. government has announced the U.K. Independent Film Tax Credit, strengthening its support for indie film productions. One of the main changes is an enhanced rate of credit payable on visual effects costs: qualifying VFX spending will get an additional 5% credit rate, and the 80% qualifying spending cap will not apply to VFX costs. These tax incentives make the U.K. more appealing for international co-productions and have boosted independent cinema.
As a result of the tax credit for independent films, producers Andy Paterson (Girl with a Pearl Earring, The Railway Man) and Annalise Davis (Up There, No Way Up) have teamed up with Dimension Studio and Vue, the largest privately-owned cinema operator in Europe, to deliver British independent films to the big screen. “The Virtual Circle partnership is a way to start financing and co-producing independent films that leverage the U.K. tax incentive, which means they need to be below £15m and go straight to exhibition,” says DNEG 360 and Dimension Managing Director Steve Griffith. “It’s a non-exclusive partnership with Vue Cinema, but it does mean that the films that we produce will see a theatrical release, and that’s hugely important. In general, we believe that the distribution model is a little bit broken. Foreign pre-sales are primarily what independent producers use to finance their films.”
The first forthcoming films from the partnership are 2040 and Campbeltown ‘69.
No Way Up is a 2024 disaster movie shot at Dimension Studio that follows survivors of a plane crash trapped underwater. The film embraced virtual production techniques to capture the story. Unreal Engine was used to previsualize all the technical details, and during the physical shoot, the sequence was shot dry-for-wet within an LED volume. Shooting the film at Dimension Studio helped the filmmakers capture the required effects while saving costs.
One of Dimension Studio’s recent projects included the independently produced series Those About to Die, with the studio completing around 2,000 visual effects shots on the stage at a fraction of the traditional cost. While the budget for the show was high, it proves that virtual production can be a valuable tool in reducing visual effects costs. “For Those About to Die, they did 22 takes or so of Golden Hour for the production. If you did it as normal visual effects, it would also be very costly as it’s a four-minute-long shot, so virtual production was quicker and cheaper,” Griffith says. “There is this magic triangle of cost, time and quality. We can do it cheaper, faster, and it’s high quality. That’s the benefit to independent films and why Andy and Annalise came to us and proposed this partnership. There are projects coming off the shelf that are traditionally bigger-budget projects that now fit that indie film budget that we can execute on. People think it has to be The Mandalorian or some big-budget thing, but virtual production is definitely a tool for independent filmmakers.”
“A bigger budget obviously buys you scale and capacity; if your means are limited, a 15-minute sequence featuring thousands of CG aliens invading a futuristic city probably won’t be feasible. But, if the history of filmmaking has taught us anything, it’s that creativity and resourcefulness go a long way – a good VFX supervisor will be able to advise, troubleshoot and help ensure you maximize what you have at your disposal.”
—Fiona Walkinshaw, CEO of Film & Episodic, Framestore
The monster movie Ick is a good example of one of Ingenuity Studios’ latest projects completed on a tight budget. Ick follows science teacher Hank as he reconnects with his first love and faces an alien threat in their town. “It’s a low-budget film directed by Joseph Kahn with tons of CG shots,” says David Lebensfeld, President of Ingenuity Studios and Ghost VFX. “We were able to stretch the dollar because Joseph knew what he wanted. We had a clear path for execution, so we didn’t waste a lot of time. There weren’t a lot of voices in the room, which means you don’t have a lengthy approval process. With Ick, we served a director with a clear vision, making the process incredibly efficient. We have been working with Joseph for almost 20 years, including all his Taylor Swift music videos, a range of commercials and some of his prior movies (Detention, Bodied). We know how to deliver what he’s after.”
According to Lebensfeld, you’re always on a budget, regardless of whether you have $10 million or $10 to play with. You still never have enough money for what you want to do, and the challenges remain the same. “You need proper planning and time to execute. Often, lower-budget projects are passion projects for the
filmmakers, and you have to manage personalities and help them separate emotion from the work. Higher-budget projects tend to have less of that. Sometimes, there is a disconnect between what indie filmmakers want and what they can afford. Proper planning can help ensure the best use of visual effects from a strategic level, leveraging the budget in the smartest way. Getting us involved early, listening to us and trusting the process, and sticking to the plan is a real benefit to our indie clients.”
SSVFX is an internationally-acclaimed visual effects house based in Dublin, Ireland. SSVFX not only caters to larger-scale productions but also to independent projects. Recently nominated for an Emmy for their work on Shōgun, SSVFX has worked extensively on a range of projects, including Game of Thrones, The Penguin and Gladiator II. Head of Studio Nicholas Murphy believes that any budget, big or small, has its challenges, though when budgets are limited, getting the visual effects team onto a show early is paramount – especially when there is no independent VFX team and the vendor is responsible for managing the budget. Having early creative discussions with the director, cinematographers, production designers, etc. is key for any project, “so that the VFX team or vendor is involved in the design and planning of the visuals, so it doesn’t become an afterthought where the shopping list grows but the budget doesn’t align,” Murphy remarks. “If decisions have been made without VFX but involve VFX, then this is where challenges can arise. Tax breaks, such as in Ireland, where the rate is 32%, can also be another solution to help a show get the best out of its budget. It is about managing expectations, aligning all the creatives to the budget while designing the best shots to maximize the budget.”
Murphy argues that it’s all relative when it comes to completing a project on a tight budget. “Big-budget shows generally have big-budget VFX. Small-budget shows want big-budget VFX, so we need to communicate and get everyone on board to understand what is achievable for the budget. Some indie shows we’ve done are probably good examples of this. We needed to plan and design a show to get the best visuals possible within the agreed budget.”
Nominated for 10 Academy Awards, The Favourite is a prime
MIDDLE AND BOTTOM: Framestore was the primary VFX vendor on His House (2020), delivering environments, effects, photorealistic animation and VFX-enhanced creatures. Framestore contributed 185 VFX shots to the low-budget film. (Images courtesy of Netflix)
TOP TO BOTTOM: Despite their impressive effects, Alex Garland’s films have all had relatively low budgets like Men (2022). (Image courtesy of A24)
Alex Garland’s Ex Machina (2014), budgeted at $15 million, is renowned for its impressive array of effects. (Image courtesy of A24)
Ick (2024), a low-budget monster horror film, was completed by Ingenuity Studios on a tight budget. (Image courtesy of Ick LLC)
example of SSVFX working on a storytelling film where the VFX needed to be invisible. Despite its appearance as a big-budget production, The Favourite’s budget was recorded at £15 million. “Every department on the show was worthy of an Oscar, so the VFX needed to maintain that level of quality, not take away from the story and be seamlessly integrated.” Evil Dead Rise was another of SSVFX’s projects. The film is full of gore and VFX. “Trying to achieve a film that is so reliant on VFX was all about managing repetition and being aware that more shots may add to the budget. We had to change eyes, add blood and add gore to many scenes, so shot counts can easily grow, and so can the budget. We just needed to be efficient, design workflows that helped with the show’s repetitive nature and have an open line of communication with the editorial team.”
Drew Jones, Chief Business Development Officer at Cinesite, believes that indie filmmaking is always challenged to produce a compelling story on screen with visual imagery that delivers the director’s vision to an often-tight budget and schedule. “Recent changes to the U.K. tax rebate for indie production up to £15M means that this area of filmmaking will continue to thrive,” he notes. “VFX content for indie films, like technology, is always improving, but has to be a very collaborative process, managing the director’s vision in association with potential budget limitations. It is about working with experienced VFX supervisors to understand the storytelling and to work with production to bring creativity to the screen while being mindful of budget and time constraints. The Cinesite Group brings its experienced VFX supervisors to the mix to work together with a production potentially utilizing Cinesite’s VIS [visualization] services to help find the best creative solutions at the right budget level while not diminishing the narrative. Using VIS across concept, previsualization, techvis and postvis offers many options to keep the vision and budget on track along with the Cinesite Group’s worldwide footprint for tax rebates and cost centers. All of these are designed to assist the indie production’s needs.”
Over the years, the Cinesite Group of companies has worked on many indie projects with varying VFX content. The 2009 film Moon is one example of a Cinesite project that delivered outstanding visual effects for a film with a budget of $5 million. “Where need be, our studio network may split work or sequences to aid with budget expectations or, indeed, simply propose that the work is completed in another Cinesite partner studio or location to take advantage of tax rebates or a lower cost center,” Jones states. “The work will be exemplary, overseen by executive supervisors, and always delivered on budget and schedule. The Cinesite Group has a seamless pipeline between the studio locations; this makes for efficient communication and eases the sharing of working processes.”
VFX is an essential tool for modern filmmakers, whether working on an action blockbuster, a period drama or an indie feature. Films such as Civil War, Men, Downton Abbey and His House are notable examples of Framestore’s projects completed on a tight budget. Men’s budget was just over $5 million, and Framestore won Best Effects at the British Independent Film Awards for their work on the film. Despite their impressive effects, Alex Garland’s films have all had relatively low budgets. Released in 2014, Ex Machina had a budget of $15 million and was renowned for its impressive array of effects.
For Men and His House, the caliber of the work for Framestore was extremely high and perfectly fit the films in question, but they were delivered on budgets that were modest compared to any modern tentpole feature. Fiona Walkinshaw, CEO of Film & Episodic at Framestore, explains, “When you’re a major VFX studio, budget and capacity always have to be a consideration. At the same time, though, we’re also film fans, and we want to help new generations of filmmakers tell the stories they want to tell. So, if we’re excited by a script, an idea or a director, then we will do what we can to make it work. A bigger budget obviously buys you scale and capacity; if your means are limited, a 15-minute sequence featuring thousands of CG aliens invading a futuristic city probably won’t be feasible. But, if the history of filmmaking has taught us anything, it’s that creativity and resourcefulness go a long way – a good VFX supervisor will be able to advise, troubleshoot and help ensure you maximize what you have at your disposal.”
Tax incentives such as the U.K. Independent Film Tax Credit and Ireland’s tax break help boost the independent film industry. Virtual production is also a vital tool for filmmakers looking for alternative ways of capturing effects as new tools and technologies continue to advance VFX. As seen at the Academy Awards over the last few years, independent films are gaining more recognition for their impressive visual effects shot on a budget. Films such as Godzilla Minus One, All Quiet on the Western Front, Ex Machina, Men, No Way Up and Ick have all demonstrated that high-quality visual effects can be captured on a budget. However, with high costs and limited means to achieve the same range of effects as big-budget Hollywood movies, there is a long way to go before apples-to-apples comparisons can be made. Nevertheless, the future for independent filmmaking and VFX is brimming with optimism and opportunity.
“There are projects coming off the shelf that are traditionally bigger-budget projects that now fit that indie film budget that we can execute on. People think it has to be The Mandalorian or some big-budget thing, but virtual production is definitely a tool for independent filmmakers.”
—Steve Griffith, Managing Director, Dimension Studio
By OLIVER WEBB
It is paramount for visual effects artists to keep up with the latest tools and technology in a rapidly changing industry. A wide range of software is available to artists in the VFX industry. Adobe After Effects, Maya, Blender, Nuke, Fusion and Houdini are some examples of the industry-leading software the VFX industry relies on to create exceptional visual effects. In the last few years, virtual production and AI have also become essential tools in the toolbox, playing a key role in shaping the industry’s future.
Blackmagic Design is one of the world’s leading innovators and manufacturers of creative video technology. DaVinci Resolve is end-to-end post-production software that includes editing, color grading, VFX and audio post-production, making it an industry leader used on more productions than any other post software. Similarly, Fusion is an industry favorite and one of the world’s most advanced compositing applications for visual effects artists and 3D animators. DaVinci Resolve and Fusion Studio have received major updates, and the new features open up a wide range of possibilities.
TOP: Fusion was the software of choice when it came to Muse VFX creating the effects for FX’s What We Do in the Shadows. (Image courtesy of FX Network)
OPPOSITE TOP: For What We Do in the Shadows, which required ‘on the fly’ problem solving, it was easy to get great first results quickly in Fusion then refine them later, providing the client multiple options in the time it often takes to produce only one. (Image courtesy of FX Network)
Blackmagic Design acquired Eyeon’s Fusion in 2014. “It was 2017 when we ultimately did reveal Version 9,” explains Blackmagic Product Specialist Shawn Carlson. “The first thing we did was to make it cross-platform. That major version update [Version 9] included a unification of tools. There were further major tool revisions and updates in Version 18 and then again in 18.5. Fusion 19 has added volumetric VDB support, so you can import and composite with VDB volumetrics very fluidly in the same composite. More than 200 different 2D and 3D compositing and motion graphics tools are included.”
Blackmagic Design has released DaVinci Resolve 19, DaVinci Resolve Studio 19 and Fusion Studio 19. DaVinci Resolve 19 is a
major new update that adds new DaVinci Neural Engine AI tools, over 100 feature upgrades such as IntelliTrack AI, Ultra NR noise reduction, ColorSlice six-vector grading, film look creator FX and multi-source editing on the cut page. For Fusion, there are new USD tools and multipoly rotoscoping tools, plus Fairlight adds IntelliTrack AI tracking for audio panning to video, ducker track FX and ambisonic surround sound. Fusion Studio 19 adds support for Volume VDB files, multipoly rotoscoping, a new multi-merge tool and an expanded set of USD tools.
There are slight differences between the two versions, in terms of the Fusion page within the Resolve workflow and the Fusion Studio software that exists as a standalone. “For one example,” Carlson adds, “Fusion Studio 19 allows for render farms and lets you install multiple render nodes from the same installer. The ability to populate and distribute to a render farm across multiple nodes is one of the great advantages of the standalone version. We also added a multipoly node to 19, which allows you to control multiple polygon nodes together in one view when building a roto. These tools are used in Hollywood and other industry hotbeds around the globe for roto, paint, rig removal, sky replacement, set extensions, 2D paint, IntelliTrack or 3D camera tracking with point clouds for 3D object integration. We also have great looping and animation controls. The animation controls themselves are separated into two different windows.”
Major films and television series such as Game of Thrones, Breaking Bad, Only Murders in the Building, What We Do in the Shadows, Dumb Money, Avatar and the Harry Potter movies have all relied on Fusion. “Other benefits to a giant Hollywood production, or even a small production, is the cross-platform aspect because you have various collaborators working around the globe,”
Carlson remarks. “They can all work on whatever platform they prefer and deliver within the same software, and the tools present themselves the same way across all three platforms. The ability to jump between any platform and continue in Fusion is all there. Permanent licensing versus the subscription model is also a big win for productions. Resolve has improved its licensing scheme to include activation codes, which are available immediately after purchase, so you can begin installing without waiting. The activation codes are also flexible. It’s not node-locked.”
Fred Pienkos, Founder and VFX Supervisor at Muse VFX, explains that his team relies on Fusion for its proven speed and accuracy, enhanced by its mature integration with Resolve. “This combination empowers us to efficiently address our clients’ challenges in today’s demanding industry landscape. Fusion’s versatility allows us to maintain high standards while adapting to tightening budgets and accelerated timelines. By leveraging this toolset alongside our creative expertise, we continue to deliver compelling visual storytelling, meeting the current market’s artistic and economic demands.”
Senior Compositors at Muse VFX Kelly McSwain and Toby Newell rely on the software when working on projects. McSwain first started using Fusion when she started working at Muse VFX. “I had never touched the program before, and they assured me it would be easy to learn and translate my knowledge from other programs. They were absolutely right; working in Fusion has been a wonderful surprise for me.”
For Senior Compositor Toby Newell, affordability and practicality are first and foremost, so Fusion is his go-to software. “I don’t think we’re the only ones who look at the cost of renting
DaVinci Resolve 19 is a major new update that adds new DaVinci Neural Engine AI tools and more than 100 feature upgrades. (Image courtesy of Blackmagic Design)
There were major tool revisions and updates in Fusion Version 18 and again in 18.5. (Image courtesy of Blackmagic Design)
licenses from other software companies as a major downside to using their products, especially when factoring in render nodes,” he explains. “Fusion is incredibly low-priced in comparison, and it’s massive that they don’t charge separately for render nodes with the purchase of a license. Quickly scaling up your render farm on any given project is a major challenge for any visual effects company. The reality is that Fusion does at least 90% of what other software does. Anything it doesn’t do well hasn’t been a dealbreaker for us. They have a phenomenal user group that’s quick to offer helpful tools or workarounds if you encounter a problem. As someone who started as a compositor using Fusion 20+ years ago and then spent quite a few years using Nuke, I’ve found it’s not that hard to train good compositors to transition to either. With the wide-scale adoption Resolve has seen, using software that’s integrated into it also brings quite a bit of appeal.”
Fusion was the obvious choice when it came time for Muse VFX to create the effects for FX’s What We Do in the Shadows. “I like all of the custom tools that we’re able to make,” McSwain notes. “We’ve built some incredible macros that help us as compositors and add to the quality of work we can do. With a click of a button I can pull in what I get from 3D and roto, swap out old renders, and quickly enhance stock elements – all with just some of the macros we’ve been able to build.”
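To give a flavor of what those macros and utility scripts automate, here is a minimal sketch of a render-swap step written against Fusion’s comp scripting API; the Loader name “CG_Beauty” and the versioned-path convention are hypothetical, invented for illustration.

```python
# Sketch: retarget a Loader to a newer render version in one step --
# the kind of repetitive task compositors wrap in macros and scripts.
# Assumes Fusion's console, where the `fusion` global is predefined.
comp = fusion.GetCurrentComp()

loader = comp.FindTool("CG_Beauty")  # hypothetical Loader name
old_path = loader.Clip[1]

# Swap the version token in the file path (the convention is hypothetical).
loader.Clip[1] = old_path.replace("/v012/", "/v013/")
print("Rewired", loader.Name, "to", loader.Clip[1])
```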
“A project like What We Do in the Shadows doesn’t come along that often,” Newell adds. “Creativity starts from the top down, and the process isn’t always a straight line. The range of work involved differs drastically from episode to episode, and it often calls for a good deal of ‘on the fly’ problem-solving. There’s a lot of work required from our compositors as well as our 3D department. It’s the kind of work where you often can’t rely on widely adopted techniques. You end up having to throw the book out and start throwing things at the wall to see what sticks. Fusion
is great in scenarios like this. It’s easy to get great first results quickly, then refine them later. This allows us to give our clients multiple options in the time it often takes to get one.”
Christopher Hebert is Senior Director of Marketing at SideFX, an Academy Award-winning industry leader in 3D animation and visual effects software development, best known for its flagship product, Houdini. Founded in Toronto over three decades ago, SideFX has consistently pushed the boundaries of what’s possible in digital content creation. The company is known for its deep expertise in procedural workflows, which allow artists to create complex effects and animations with a high degree of control and flexibility.
With a team that blends seasoned developers, technical artists and VFX industry veterans, SideFX operates as a mid-sized company that punches well above its weight. “Despite its relatively small size compared to some of the industry’s giants, SideFX maintains a close-knit culture of innovation and collaboration,” Hebert says. “This allows the company to respond quickly to the evolving needs of artists and studios from major Hollywood productions to independent creators and game developers. SideFX’s influence extends beyond its software – the company is deeply involved in the media and entertainment community, regularly contributing to industry events, educational initiatives and collaborations that drive the field forward.”
Recently, SideFX released Houdini 20.5, which brings a wide range of new features and enhancements. KineFX Rigging and Animation tools now feature tags, making it easier for riggers to set up procedural character and creature rigs using pre-built rig components. Animators will appreciate a new suite of tools designed to streamline the keyframing of character motion, making the animation process more intuitive and efficient.
“Another major addition is the MPM (Material Point Method) Solver, which strengthens the software’s simulation capabilities
TOP TO BOTTOM:
Fusion Studio’s 3D tracker was used to get rid of unwanted boom mics, crew and camera reflections in windows for Only Murders in the Building. (Image courtesy of Disney+)
Major films and TV series, such as Game of Thrones, Breaking Bad, Only Murders in the Building, What We Do in the Shadows, Dumb Money, Avatar and the Harry Potter movies, have all relied on Fusion. (Image courtesy of Blackmagic Design)
for snow, soil and mud – allowing for more realistic and dynamic interactions and opening up new creative possibilities for artists working on natural environments,” Hebert says. “The brand-new Copernicus feature set is a 2D and 3D GPU image processing framework for real-time image manipulation, ideal for tasks such as building texture maps and setting up Slap Comps – a powerful and flexible platform for artists who need immediate visual feedback. These are just a few of the hundreds of new enhancements.”
As one of the leading software packages for VFX, Houdini appeals to major Hollywood productions for numerous reasons. First among them is its procedural nature, which allows studios to make rapid adjustments without starting from scratch. “This approach is particularly valuable in large-scale content creation, where tight deadlines and evolving creative directions are the norm,” Hebert notes. “The software’s ability to handle everything from large-scale explosions to intricate particle effects makes it a versatile choice for VFX artists needing reliability and depth. Houdini’s integration with production pipelines allows it to fit seamlessly into various studio workflows, further cementing its place in high-end VFX work. Beyond the software itself, SideFX is well-known for its incredibly responsive technical support, tight collaboration between its R&D team and studio customers, long-term product planning, and investment in community skill-building and knowledge sharing.”
Houdini 20.5 is helping to change the VFX landscape. Hebert highlights five key ways the software achieves this: Procedural Workflows Becoming the Norm, Integration of Real-Time and GPU Acceleration, Expanding the Scope of Simulations, Bridging the Gap Between VFX and Other Creative Industries, and Adapting to AI and Machine Learning Integration. “Houdini’s procedural approach allows artists to build effects and models using rules and parameters rather than manual step-by-step methods,” Hebert says. “This shift towards proceduralism reduces the time spent on repetitive tasks, making the creative process more efficient. Artists can iterate quickly, adjust on the fly and even automate complex tasks, which is essential in high-pressure production environments where constant changes occur. This trend democratizes the ability to create high-end VFX, making advanced effects accessible to smaller studios and independent artists.”
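To make the procedural idea concrete, here is a minimal sketch using hou, Houdini’s documented Python module; the network itself (a grid displaced by a Mountain SOP) and the parameter values are illustrative rather than drawn from the article.

```python
import hou

# Build a small procedural network: a grid displaced by a Mountain SOP.
geo = hou.node("/obj").createNode("geo", "terrain")
grid = geo.createNode("grid")
grid.parm("rows").set(200)
grid.parm("cols").set(200)

mountain = geo.createNode("mountain")
mountain.setInput(0, grid)
mountain.setDisplayFlag(True)
mountain.setRenderFlag(True)

# The procedural payoff: when a note arrives, the fix is a parameter
# change that re-cooks the network downstream -- not a manual rebuild.
mountain.parm("height").set(2.5)
```

The last line is the point: a revision becomes a parameter edit that propagates through the network, which is what lets artists iterate quickly and adjust on the fly.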
Hebert also notes that with the growing use of GPU acceleration, tools like Houdini’s Copernicus framework are transforming workflows by providing real-time feedback during the creative process. This is particularly important for VFX artists working on time-sensitive projects, such as episodic television or advertising, where turnaround times are tight. “Real-time capabilities are blurring the lines between pre-production, production and post-production, enabling artists to work more interactively and make decisions faster, ultimately improving the quality and pace of production.”
“Houdini 20.5’s MPM Solver is a good example of how VFX software pushes the boundaries of what can be simulated digitally. Because it offers more realistic simulations of complex materials like snow, soil and mud, artists can achieve a higher level of detail and realism
in their scenes,” Hebert explains. “These advancements mean that previously difficult or time-consuming effects can now be achieved more easily, allowing for more creative experimentation and enabling artists to bring their visions to life in ways that were not possible before.”
Unreal Engine, billed as the world’s most advanced real-time 3D creation tool, is revolutionizing the VFX landscape by enabling real-time rendering, which drastically speeds up the production process. Traditionally, rendering high-quality visual effects took hours or days, but Unreal allows for instant feedback and iteration. “This real-time capability also facilitates virtual production, where filmmakers can visualize and interact with CGI elements on set, merging the physical and digital worlds seamlessly,” explains Paul Martin Eliasz, Unreal Technical Director and Co-Founder at Umewe Collective. “Unreal’s high-quality rendering, paired with its powerful tools for simulation, lighting and animation, provides VFX artists with the flexibility to create photorealistic environments and complex effects more efficiently. The engine’s accessibility and continuous updates make cutting-edge VFX more affordable and adaptable, allowing studios of all sizes to compete at a high level. Additionally, its integration with other software through plug-ins and its open-source nature encourage innovation and foster new creative possibilities in the VFX industry.”
Productions using Unreal Engine benefit from real-time rendering, which significantly accelerates the iteration process and shortens production timelines. “This real-time feedback enables directors and artists to make creative decisions on the spot, improving the overall quality and coherence of the visual narrative,” Eliasz says. “Unreal Engine also supports virtual production techniques, such as LED wall-based filming, where digital environments are rendered live on set, blending physical and virtual worlds seamlessly. This reduces the need for greenscreens and post-production compositing, saving time and costs. The engine’s photorealistic capabilities and its vast library of assets and tools for animation, simulation and lighting allow for high-quality results across various genres. Additionally, Unreal’s flexibility and integration with other industry-standard tools make it easier to adapt to different workflows, while its cross-platform compatibility ensures that assets can be reused across multiple media formats.”
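For a sense of how such workflows are scripted, here is a minimal sketch using Unreal’s editor Python API, available when the Python Editor Script Plugin is enabled; the asset path and placement values are hypothetical.

```python
import unreal

# Sketch: place an asset in the open level via editor scripting.
# Runs inside the Unreal Editor, not as a standalone script; the
# /Game/Props/SM_Crate path is hypothetical.
asset = unreal.EditorAssetLibrary.load_asset("/Game/Props/SM_Crate")
location = unreal.Vector(0.0, 0.0, 100.0)

# With real-time rendering, the result is visible in the viewport
# immediately -- no offline render pass is needed to review it.
actor = unreal.EditorLevelLibrary.spawn_actor_from_object(asset, location)
actor.set_actor_scale3d(unreal.Vector(2.0, 2.0, 2.0))
```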
Staying ahead of VFX trends and technological advancements is vital for those in the VFX industry, and constantly evolving software like Unreal Engine requires VFX artists and studios to continuously learn and adapt to new features, workflows and best practices. “This can be resource-intensive, requiring time and investment in training and experimentation,” Eliasz details. “Additionally, the demand for higher-quality visuals and more immersive experiences pushes the boundaries of current hardware capabilities, making it essential to stay updated with the latest advancements in GPUs, CPUs and other rendering technologies. The integration of new technologies like AI and machine learning into VFX also presents both opportunities and challenges, as it requires a deep understanding of these fields. Balancing the need
to innovate while maintaining production efficiency and meeting tight deadlines is an ongoing challenge for the industry.”
McSwain agrees that one of the biggest challenges of keeping ahead is growing with the latest technology as it evolves. “You’re learning or hearing of this awesome new tool, but then the tool itself is also learning, growing and improving all the time. When Fusion’s Magic Mask was first introduced, I was wary about the quality of results I would get, based on other tools I’ve seen in the past. I used to treat it as a way to get a rough roto for a work-in-progress shot. Recently, I’ve used it multiple times very successfully, with more refined details, for final shots. It’s all about balancing your expectations and the reality of where the technology is at that moment in time.”
There are many tools and technologies available to visual effects artists, and Fusion and Houdini are just two examples of the software that has helped shape the industry. Newer approaches such as virtual production and AI/machine learning are also becoming key parts of the pipeline. In a rapidly changing industry, staying ahead of the game is vital.
By IAN MILHAM, ILM
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Virtual Production
Edited by Susan Zwerman, VES and Jeffrey A. Okun, VES
The Virtual Production Supervisor is the head of a new on-set department that overlaps and interacts with many others. The work of interacting with other heads of departments may represent the majority of the VP Supervisor’s time, beyond the time they spend with the virtual production team itself. The VP Supervisor is the team’s advocate, champion and early warning system.
The virtual production work may be integrated just as much with the Production Designer, Art Department, the Director of Photography or Gaffer – not just with the VFX Supervisor. He/she needs to build specific relationships with multiple key partners across a production, both in pre-production and on set on the day.
The goal of the VP Supervisor during pre-production is to lay the groundwork for a successful shoot day. They need to accomplish specific goals with key partners. The Director of Photography is perhaps the most important partner. The VP Supervisor needs to understand their visual goals so they are able to suggest how virtual production may help achieve those goals. They need to work with the DP to light the content that will be on the LED walls since it is an extension of the practical set that the DP will also be lighting. They also need to work with the DP on how that content lights talent and the practical set since it will be supplying the majority of lighting on the day.
Another key aspect with the DP is making sure that the DP can get his/her shots. This means discussing which parts of the virtual set might need to be dynamic, operable on the day, or virtually wild (so they can be hidden), and flagging anything in the shoot plan that could challenge camera tracking, such as occlusion in the set or an unusual camera head.
Second only to the DP, the key relationships in prep are with the Production Designer and Art Department. On the day, these partners need to stand in front of the virtual set and have it be just as much a part of their vision as the practical one. The VP Supervisor needs to understand the visual goals of the set design and, as the Art Department creates it, ensure that the virtual version extends it seamlessly. Since the content team frequently is not on site, the VP Supervisor is the point person to make sure their content blends seamlessly with the practical set. This may involve delivering feedback and notes to them, or if the shoot is close, having the on-set team make those changes themselves.
Less prominent but still key in prep is the relationship with the Director(s): the VP Supervisor represents the shooting experience to the Director(s) and learns any specifics of their shooting style that may need to be accounted for. Another key relationship in prep is with the VFX Supervisor – establishing goals for finals and any notable aspects of the scene, for example, unusual frame rates or elements that may be added in post.
The First AD is key to getting the shooting days in order and making them successful, and this may be one of the most important relationships the VP Supervisor will have. The First AD is responsible for the schedule and the set logistics. The virtual production team’s operations are largely invisible to the AD department, which counts on the VP Supervisor for information on their timing. That key information flows both ways: timely information from the First AD allows the VP Supervisor to give early warning of upcoming stage operations so the VP team is ready to roll.
Order yours today! https://bit.ly/VES_VPHandbook
By NAOMI GOLDMAN
TOP TO BOTTOM: VES New Zealand members participate in a VES Awards Nominations Event at Wētā FX.
VES New Zealand hosts a TEQ Talk featuring former Wellington Mayor Celia Wade-Brown, stuntman Shane Rangi and Floating Rock founder Lukas Niklaus.
Section Board of Managers member Jason Galeon with out-of-this-world TEQ Talk guest NASA Astronaut Dr. Yvonne Cagle.
The Visual Effects Society’s international presence gets stronger every year – and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. Founded in 2011, the VES New Zealand Section has grown to more than 105 members and cites its diverse international membership as a point of pride. The overwhelming majority of members are in Wellington – a community originally born out of the Wētā Group of Companies, the major regional visual effects employer – with a small but growing contingent in Auckland.
In the last few years, smaller visual effects facilities and VR companies have populated the New Zealand market, including Lane Street Studios and Scale Studios in Wellington, and Fathom VFX in Auckland. In addition, many educational institutions, including Canterbury University, are installing VFX and virtual sound stages, allowing for cross-use by students and industry professionals. The business landscape is one of resilience and optimism, with new projects flowing in after a long period of uncertainty in a dynamic global economy.
“Growing and diversifying the membership is a high priority for the Section,” said Lance Lones, VES New Zealand Section Co-Chair and Creative Director & CEO at Alterspace Entertainment. “Our membership is primarily comprised of VFX practitioners working in feature film with increasing representation from the gaming community, tapping into Rocket Works in Auckland and new companies emerging in Wellington. Everything in the VFX community in Wellington revolves around the eastern suburbs surrounding Wētā, making it easy to push word-of-mouth and get good turnout at events. And we are excited about building up our presence in Auckland with more events and member opportunities.”
The Section hosts a strong array of pub nights, screenings and other social gatherings – but its signature TEQ Talks series is a unique and wildly popular standout. Celebrating its 10th anniversary this year, TEQ Talks was the brainchild of Jason Galeon, New Zealand Section Board of Managers member and Motion Graphics Supervisor at Wētā FX. “A decade ago, the idea of inviting interesting people to speak to our VFX community converged with having a friend who ran a local Mexican restaurant near Wētā. That first night was a tequila-infused gathering of a concept artist, a games designer and a former VFX Supervisor – and a highly engaged audience. The interest grew quickly and we hosted more events; suddenly we had an astrophysicist, then NASA heard about it and offered us an astronaut… then Victoria University and Double Vision Brewery started hosting events… and on and on. It started with VFX and always has that focus, but we try to keep the brand: fascinating people talking about their passion projects.”
TEQ Talks speakers have included: NASA Astronaut Dr. Yvonne Cagle; Head Astronomer, SETI, Dr. Seth Shostak; ballerina and motion capture performer Kate Venables; comic book artist Dylan Horrocks; former Mayor of Wellington Celia Wade-Brown; beekeeper Chris Baring; and many more. The Section is planning
“Where else can you mingle among astronauts and beekeepers and raise a glass toasting creativity and invention?”
—Lance Lones
a 10-year celebration with events in Wellington and Auckland, and is exploring an ongoing TEQ Talks series in Auckland. It has also focused on expanding partnerships with education providers to raise the visibility of VFX within that community and among students, helping to build the next-generation pipeline of artists and practitioners.
The New Zealand Section boasts a number of meaningful partnerships around educational, networking, social and career development opportunities, including an educational series of talks with the Media Design School in Auckland; collaborative events with Women in Film and Television New Zealand (WIFT NZ) and the Visual Effects Professionals Guild New Zealand (VFXG); and screenings hosted at Park Road Post and The Roxy Cinemas in Wellington. The Section also ran VES Awards Nominations events in January (virtual and in-person at Wētā FX) – and notes with pride that its event was the first in the 30 hours of continuous nominations panels held around the globe.
“More exciting events ahead. Later this summer, we are excited to bring together VFX professionals, filmmakers and other creatives for a special networking event at the Big Screen Symposium in Aotearoa [New Zealand],” said Rachel Copp, New Zealand Section Co-Chair and Visual Effects Producer at Rising Sun Pictures. “By co-hosting this event, we have the opportunity to strengthen the connections between storytellers and visual effects artists, foster new creative collaborations and bridge the gap between vision and innovation.”
Section leaders shared their personal perspectives on the VES and their membership.
Lance Lones: “With five million people in our country, having a VFX industry that attracts such high-caliber talent is remarkable. This is such an interesting community with the ebb and flow of talented people and events that take us in unexpected directions. Where else can you mingle among astronauts and beekeepers and raise a glass toasting creativity and invention?”
Rachel Copp: “Our strong loyalty and camaraderie keep the Section alive and growing. Everyone is deeply committed here. I’m proud that we have multiple New Zealand Board members on the global Board – we’re an incubator! I’m also proud of bringing Auckland into the mix and of spearheading that small but mighty membership… that we will keep growing.”
Jason Galeon: “I’m proud that the VES has brought me connections to amazing people around the world and tethered us closer in our regional industry. I’ve been nominated at the VES Awards and get to see the ‘best of the best’ in our field being recognized. And I’m thankful for a community where we can collaborate and connect.”
By NAOMI GOLDMAN
The 2025 VES Board of Directors Officers, who comprise the VES Board Executive Committee, were elected at the January 2025 Board meeting. The Officers include Kim Davidson, who was re-elected Board Chair, and is the first Board member from outside the U.S. to hold this role since the Society’s inception.
“It is my privilege to serve as Chair of our worldwide community of visual effects artists and innovators,” said VES Chair Kim Davidson. “Since I joined the Society 18 years ago, I have seen the VES grow from a California-based organization to a global society with 16 regional Sections and members in more than 50 countries. As the first Chair elected from outside the U.S., I am representative of our thriving globalization, and I look forward to further championing our expansion worldwide. Our Board leaders bring enormous commitment, expertise and enthusiasm to their volunteer service, and I’m excited about what we can achieve, together.”
“Our Society is fortunate to have strong leadership represented on our Executive Committee,” said Nancy Ward, VES Executive Director. “I am honored to work alongside these exceptional, dedicated professionals, especially amidst this time of dynamic change. We appreciate their commitment to further advance the Society’s global initiatives and impact.”
The 2025 Officers of the VES Board of Directors are:
• Chair: Kim Davidson
• 1st Vice Chair: Susan O’Neal
• 2nd Vice Chair: David Tanaka, VES
• Secretary: Rita Cahill
• Treasurer: Jeffrey A. Okun, VES
Kim Davidson is the President and CEO of SideFX, the company he co-founded in 1987 and a world-leading innovator in advanced 3D animation and visual effects software, best known for Houdini. Davidson has received three Scientific and Technical Awards from the Academy of Motion Picture Arts and Sciences. He was the first Chair of the VES Toronto Section and served on its board for eight years. He has served on the global VES Board of Directors for seven years and on the Membership Committee for seven years, as well as on a number of other VES committees. He has been a VES mentor and played an active role in the
Society’s strategic planning process on its Impact & Visibility and Globalization sub-committees.
Susan O’Neal, a VES Founders Award recipient, has served on the global Board of Directors as Treasurer, 2nd Vice Chair and 1st Vice Chair. For many years, she served as the Chair for the legacy global Education Committee and currently chairs the Membership Committee, playing an instrumental role in the Society’s growth. She is currently a recruiter for BLT Recruiting, Inc. and has worked as an Operations Manager at The Mill, Operations Director at Escape Studios in Los Angeles and as an Account Manager at SideFX.
David H. Tanaka, VES, a VES Lifetime Member, is an Editor, Producer and Creative Director with experience ranging from visual effects and animation to documentary and live-action feature filmmaking. Tanaka currently holds a VFX Editor position with Tippett Studio. He has served three terms as Chair of the VES Bay Area Section and two previous terms as 2nd Vice Chair on the global Board of Directors. He is an active contributor to the VES Archives, Outreach and Work From Home Committees and the VES Handbook of Visual Effects publications.
Rita Cahill, a VES Founders Award recipient and Lifetime Member, is an international business and marketing/PR consultant and has worked with a number of U.S., Canadian, U.K., EU and Asian companies for visual effects and animation projects. Previously, Cahill was the Vice President of Marketing for Cinesite where she oversaw the marketing of four divisions of the pioneering digital studio’s services. This is Cahill’s ninth term as Secretary on the global Board of Directors. She also served as Chair or Co-Chair of the VES Summit for eight years.
Jeffrey A. Okun, VES, a VES Founders Award recipient, has been creating outstanding visual effects for film and television for over 30 years. He co-edited the award-winning VES Handbook of Visual Effects as well as the newly released and acclaimed VES Handbook of Virtual Production. Okun also created the VES Awards, now in their 23rd year. He served as VES Board Chair for seven years, as Chair of the Los Angeles Section for two years, and as 1st Vice Chair and Treasurer on the global Board of Directors for several years.
As an organization born in Los Angeles, our hearts are with everyone affected by the devastating wildfires that took hold in January. The region is in the early stages of a long road to recovery and rebuilding, and we will continue to support our members, friends, colleagues and neighbors impacted by this disaster. Our deep gratitude goes out to the firefighters, first responders, relief workers, nonprofits and volunteers who have come together to safeguard and give comfort to our communities in this time of need.
To contribute to their long-term relief effort, the VES has made donations to the Los Angeles Regional Food Bank and the Motion Picture and Television Fund. We encourage people to donate their time and resources to organizations providing aid and services to people impacted across Southern California. Together, we are #LAStrong.
Please scan the QR code to join us in supporting relief efforts.
In an era dominated by CGI and cutting-edge digital effects, stop motion continues to thrive as a vibrant, hands-on art form, captivating both filmmakers and audiences with its tangible, handcrafted charm. Visionary auteurs like Australian director Adam Elliot are keeping this timeless technique alive, infusing their stories with a unique, tactile aesthetic that digital effects simply can’t replicate.
Our cover story on Memoir of a Snail (page 24) dives into this enduring craft, which brings inanimate objects to life through meticulously photographed frame-by-frame movements. Stop motion has roots stretching back to cinema’s earliest days when Georges Méliès wove it into his iconic A Trip to the Moon (1902), blending magic with motion.
The technique truly found its stride in the 1920s and ‘30s with trailblazers like Willis O’Brien, whose work on The Lost World (1925) and King Kong (1933) revolutionized the integration of stop motion with live action. Ray Harryhausen took it to new heights with the legendary Jason and the Argonauts (1963), crafting unforgettable sequences like the skeleton battle that still inspire awe today.
Fast forward to the 1990s, when Tim Burton’s The Nightmare Before Christmas (1993) and James and the Giant Peach (1996) gave stop motion a darker, more whimsical twist. Laika Studios pushed boundaries even further with Coraline (2009) and introduced 3D printing innovations in Kubo and the Two Strings (2016), redefining what’s possible within the medium.
Meanwhile, Aardman Animations charmed the world with its signature Claymation style in the Wallace & Gromit films and Chicken Run (2000), while Wes Anderson’s Isle of Dogs (2018) showcased the seamless fusion of traditional stop motion with modern digital compositing. Exquisitely crafted and elegantly designed, Guillermo del Toro’s Pinocchio won the Oscar for Best Animated Feature in 2023. Del Toro co-directed Pinocchio with Mark Gustafson, an animation director on another modern stop-motion classic, Wes Anderson’s Fantastic Mr. Fox (2009).
Despite the rise of digital technology, stop motion isn’t just surviving – it’s evolving, thriving and proving that there’s nothing quite like the magic of handcrafted storytelling.