THE WILD ROBOT
VFX IN ASIA • THE PENGUIN • UNSUNG HEROES OF SFX PROFILES: RACHAEL PENFOLD & TAKASHI YAMAZAKI
Welcome to the Fall 2024 issue of VFX Voice!
Thank you for being a part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX.
In this issue, our VFX Voice cover story celebrates The Wild Robot, the animated adventure film following Roz the robot’s journey of survival. We shine a light on the booming Asian VFX industry in a special focus feature. And we explore a multitude of VFX trends: the evolving role of SFX artists; the process of hiring VFX vendors; the generational impact of film history on today’s VFX practitioners; the dynamic use of CGI to enliven prehistoric creatures; and a vital sampling of amazing artwork that VFX artists have created in their spare time.
We share personal profiles of Godzilla Minus One filmmaker Takashi Yamazaki and One of Us Co-Founder Rachael Penfold. VFX Voice goes behind the scenes of HBO’s limited series The Penguin, delves into the latest in tech & tools around facial animation and real-time rendering for animation, and measures the impact of the latest mixed-reality VR headsets. We also put the VES Los Angeles Section in the spotlight, highlight VFX studios helping to foster work-life balance and more.
Dive in and meet the innovators and risk-takers pushing the boundaries of what’s possible and advancing the field of visual effects.
Cheers!
Kim Davidson, Chair, VES Board of Directors
Nancy Ward, VES Executive Director
P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X (Twitter) at @VFXSociety.
FEATURES
8 VFX TRENDS: DOCUMENTING PREHISTORY
Photorealistic CGI captures iconic beasts in the wild.
14 TECH & TOOLS: FACIAL ANIMATION
Advancements break new ground with greater realism.
20 PROFILE: RACHAEL PENFOLD
One of Us Co-Founder calls humanity a key to success.
24 COVER: THE WILD ROBOT
Reviving an analog feel delivers a warmth and presence.
30 VFX TRENDS: UNSUNG HEROES
The changing role of SFX artists in defining visual reality.
36 PROFILE: TAKASHI YAMAZAKI
Godzilla Minus One director aims to keep audiences riveted.
40 TELEVISION/STREAMING: THE PENGUIN
Ruling the night and the devastated streets of Gotham City.
46 VFX TRENDS: HIRING VENDORS
Maturing selection process becomes more sophisticated.
50 V-ART: PERSONAL PERSPECTIVES
Industry talent lead rich creative lives outside of work.
60 SPECIAL FOCUS: VFX IN ASIA
The Asian VFX industry is experiencing a meteoric rise.
66 VFX TRENDS: GENERATIONAL CHANGE
VFX artists weigh how film history influenced their careers.
76 VR/AR/MR TRENDS: EMERGENT IMMERSIVE
Meta Quest 3 and Apple Vision Pro headsets power growth.
80 TECH & TOOLS: REAL-TIME RENDERING
Animators explore new techniques to spark innovation.
86 HEALTH & WELL-BEING: GETAWAYS
VFX studios help artists achieve a work-life balance.
DEPARTMENTS
2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION SPOTLIGHT – LOS ANGELES
94 VES NEWS
96 FINAL FRAME – THE POWER OF SFX
ON THE COVER: Shipwrecked on a deserted island, a robot named Roz must learn to adapt to its new surroundings in The Wild Robot (Image courtesy of DreamWorks Animation and Universal Pictures)
VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR
Ross Auerbach
CONTRIBUTING WRITERS
Naomi Goldman
Trevor Hogg
Chris McGowan
Barbara Robertson
Oliver Webb
ADVISORY COMMITTEE
David Bloom
Andrew Bly
Rob Bredow
Mike Chambers, VES
Lisa Cooke
Neil Corbould, VES
Irena Cronin
Kim Davidson
Paul Debevec, VES
Debbie Denise
Karen Dufilho
Paul Franklin
Barbara Ford Grant
David Johnson, VES
Jim Morris, VES
Dennis Muren, ASC, VES
Sam Nicholson, ASC
Lori H. Schwartz
Eric Roth
Tom Atkin, Founder
Allen Battino, VES Logo Design
VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Kim Davidson, Chair
Susan O’Neal, 1st Vice Chair
Janet Muswell Hamilton, VES, 2nd Vice Chair
Rita Cahill, Secretary
Jeffrey A. Okun, VES, Treasurer
DIRECTORS
Neishaw Ali, Alan Boucek, Kathryn Brillhart
Colin Campbell, Nicolas Casanova
Mike Chambers, VES, Emma Clifton Perry
Rose Duignan, Gavin Graham
Dennis Hoffman, Brooke Lyndon-Stanford
Quentin Martin, Maggie Oh, Robin Prybil
Jim Rygiel, Suhit Saha, Lopsie Schwartz
Lisa Sepp-Wilson, David Tanaka, VES
David Valentin, Bill Villarreal
Rebecca West, Sam Winkler
Philipp Wolf, Susan Zwerman, VES
ALTERNATES
Andrew Bly, Fred Chapman
John Decker, Tim McLaughlin
Ariele Podreider Lenzi, Daniel Rosen
Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861 vesglobal.org
VES STAFF
Elvia Gonzalez, Associate Director
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Eric Bass, MarCom Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Mark Mulkerron, Administrative Assistant
Shannon Cassidy, Global Manager
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
BLENDING CG CREATURES & WILDLIFE PHOTOGRAPHY TO BRING LOST GIANTS TO LIFE
By TREVOR HOGG
A quarter century ago, the landmark production Walking with Dinosaurs was released by BBC Studios Science Unit, Discovery Channel and Framestore. The six-part nature docuseries pushed the boundaries of computer animation to envision the iconic prehistoric beasts in a more realistic fashion than the expropriated Hollywood DNA of Jurassic Park and the sentimental cartoon cuteness of The Land Before Time. Building upon that nature documentary miniseries, Netflix and Apple partnered with BBC Studios Natural History Unit, Silverback Films, paleontologists Dr. Tom Fletcher and Dr. Darren Naish, narrators Sir David Attenborough and Morgan Freeman, cinematographers Jamie McPherson, Paul Stewart and David Baillie, and digital creature experts ILM and MPC to fuse photorealistic CGI with wildlife photography in Life on Our Planet and Prehistoric Planet.
OPPOSITE TOP TO BOTTOM: The giraffe-sized flying predator Hatzegopteryx courting on its own love island in the “Islands” episode of Prehistoric Planet 2. (Image courtesy of Apple Inc.)
The T-Rex was clever, as its large brain reveals, so its young were doubtless curious and even playful; Cinematographer Paul Stewart imagined this scene in the “Coasts” episode of Prehistoric Planet 1. (Image courtesy of Apple Inc.)
GSS and RED Monstro camera on location in Morocco to film Terrorbirds hunting for Life on Our Planet. (Image courtesy of Jamie McPherson)
Shooting CG rather than live-action creatures was educational for Jamie McPherson, Visual Effects Director of Photography for Life on Our Planet, who had previously taken a GSS (Gyro-Stabilizer System) normally associated with helicopters, attached it to a truck and captured wild dogs running 40 miles per hour in Zambia for the BBC One documentary series The Hunt. “I had never done visual effects before this, so it was a big learning curve for me, and ILM had never tried to do visual effects in the style that we came up with, which was to make it feel like a high-end, blue-chip documentary series. It was also sitting alongside Natural History. If you’re doing pure visual effects, you’ve got more leeway of people not seeing how those two worlds mix. In terms of the creatures, the process
was incredibly long. We worked out what the creature was going to be between the producer, director, myself, ILM and Amblin. Then you have to work out how it interacted with another creature of the same or different species. You have all of these parameters that you’re trying to blend together and make them feel believable. When we were out filming the back plates for this, we made sure that they felt reactive, so we were careful to work out where the creature was going. As for behavior, we had to dial it back from what I filmed in the real world because we didn’t want to break that believability and take people out of the moment of them noticing a crazy camera move, like a T-Rex walking over you.”
Being able to rely upon predetermined CG creatures provided the opportunity to utilize the best of narrative and Natural History cinematography. “To me, it’s a drama, and you don’t give the audience anything more by pretending that you’re in a hide 160 million years ago with a long lens,” observes David Baillie, Director of Photography for Prehistoric Planet and Life on Our Planet. “It’s more important to tell the story.” Coloring his perspective is the fact that Baillie continually shifts between narrative projects like Munich: The Edge of War and Natural History productions such as Frozen Planet. “My job as a cinematographer is to tell the story with all of the emotion, and I do that by using focal length, camera movement and framing.” Some limitations need to be respected. “We’re in a location that we maybe haven’t recced before. There is a bit of rock and river and you say, ‘Let’s do this.’ Everybody says, ‘That looks nice.’ Then, the Visual Effects Supervisor will say, ‘Using that shot will cost another £100,000 because we’ve planned it the other way.’
That can be quite frustrating.” A slightly different mental attitude had to be adopted. “I have to be more disciplined with things like changing focal length and stop. If I’m doing a documentary or even a drama, I might think, ‘I’ll tighten it up a bit here.’ But they’ve already done one pass on the motion control,” Baillie notes. Everything had to be choreographed narratively and visually before principal photography commenced. “To ensure our plates matched the action we wanted, we had reference previsualization video prepared by the animators for every backplate we planned to shoot,” remarks Paul Stewart, writer, Producer and Director of Photography for Prehistoric Planet, who won Primetime Emmy Awards for The Blue Planet, Planet Earth and Planet Earth II. “Using accurate scale ‘cut-out’ models, or [for the big ones] poles on sticks, we could see how big the dinosaur was in the actual scene. Sometimes shockingly huge! Knowing size helped us figure out the speed and scope of any camera moves; we would capture a ‘good plate,’ then plates where we mocked up environmental interactions like kicking dirt when running, brushing bushes and picking up twigs. We also tried where possible to isolate foreground elements using bluescreen. In some cases, we even used beautifully made blue puppets and skilled puppeteers to create complex interactions; for example, a baby Pterosaur emerging from a seaweed nest. This all went back to the MPC wizards, together with LiDAR scans and photogrammetry, to make the magic happen.”
There was a lot of unforeseen rethinking later about how things worked out. “For instance, when the Natural History Unit is on location, it’s one guy with the camera shooting for maybe months,” explains Kirstin Hall, Visual Effects Supervisor at MPC. “There’s no DIT [Digital Imaging Technician] or a script, and he’s not even taking notes on his cards. We didn’t think about that as a reality. When we showed up with our huge crew, and all of these shots
that we had to get and things we had to do, it was a huge shock for everybody involved. It caused us to think, ‘We need to plan this a different way. How are we going to get the data?’ Also, they need to shoot a script, whereas most of the narratives in Natural History come in the edit. Because we wanted to keep it as holistic as possible, the biggest thing for us was working together and becoming one healthy team. The NHU started doing their own charts, and we did HDRIs on the side. The shoots became like clockwork for Prehistoric Planet 2.”
Even with the experience of working on The Lion King and The Jungle Book, MPC still had room for improvement when it came to depicting CG creatures in a naturalistic and believable manner. “We had to get up to speed on animal behaviors and instinct and tailor our whole kit to that, like the lenses and cameras we used and filming off-speed; everything is slightly slow-motion, about 30 fps,” Hall states. “In Prehistoric Planet 2, we went 100 fps or more, which is hard to do with feathers and fur, but it helped us to get the full experience of these blue-chip Natural History Unit productions.” It was not until Jurassic World Dominion that feathered dinosaurs appeared in a Hollywood franchise, but this was not the case for Prehistoric Planet. “We knew from the beginning we would have to [do feathers] for it to be scientifically accurate,” Hall explains. “We were lucky enough to work with Darren Naish and didn’t realize how integrated he would be in our team. It felt like Darren was a member of MPC because he was in every asset and animation review. If something was not authentic in how something moves or blinks, we would catch it early on and rectify going forward. We learned a lot, even with the plants. When shooting on location, we made sure to rip out holly, and we couldn’t film on grass because it did not exist [during the Late Cretaceous period].”
Some unexpected artistic license was taken given the nature
of the subject matter of Life on Our Planet. “We tried to be as authentic as possible, certainly in the cinematography,” remarks Jonathan Privett, Visual Effects Supervisor at ILM. “Where it varied, because we could control what the creatures did, there are quite a few places where we used continuity cuts that you wouldn’t be able to do if you were shooting Natural History, because it would have required multiple cameras, which they rarely use. We didn’t start off like that. It was a bit of a journey. From the outset, we said we wouldn’t do that. However, there’s something about the fact that the creatures are not real even though they look real; it felt slightly odd not to have those continuity cuts when you watched the edits back, so we ended up putting them in.”
Essentially, the virtual camera kit emulated the physical one. “We shot a lot of it on the Canon CN20, which is a 50mm to 1000mm lens,” Privett states. “Jamie has a doubler [lens extender] that can make it 1500mm. An incredible bit of kit. We also used a Gyro-Stabilizer System. The process is the same as if we were making a feature. We took Jamie’s GSS and shot lens grids for it. It’s optically quite good because it’s quite long, so everything is quite flat, but we mirrored the distortion, and inside the GSS is a RED camera, so that is a relatively standard thing. The other $300,000 worth of equipment is the stabilization bit, and our traditional methods of camera tracking work fine for that. The hard bit is we never use lenses that long. In a drama, nobody is breaking out the 600mm lens. That’s quite interesting to have to deal with because you probably don’t have much information. It could be a blurry mass back there, so our brilliant layout department managed to deal with those well.”
“What made my hair go gray is the language of wildlife photography and doing long, lingering close-ups of creatures,” Privett laughs. “You’re right in there, so there’s nowhere to hide in terms of your modeling and texturing work. We had to spend a lot of time on the shapes. For instance, a lizard has a nictitating membrane, so when closing its eye all the muscles around it move, and actually the whole shape of the face almost changes. We had to build all of those into the models probably in more detail than we would normally expect.” The image is more compressed as well. “Generally, the crane is panning with the GSS, and Jamie will counter-track around the creature so you get this great sense of depth. You can also see the air between you and the subject because you’re so far away from it. Any kind of temperature gradient shows up as heat haze in the image. In some of the shots, we’re warping the final rendered image to match it with what’s happening in the background because you can get some crazy artifacts,” Privett remarks.
There was lots of creativity but less freedom. “The fusion of science with the creatives at MPC paid off in a spectacular way,” Stewart notes. “Creativity could never come at the expense of accuracy, and surprises and beauty had to be hard-baked into the sequences rather than serendipitously revealed in the course of filming. Giving ourselves hard rules about what could and could not happen in the animal world helped set limits and improve the believability of the films. We might have wanted the animal to
jump or run, but the bones tell us it could not, so it didn’t. I even found myself checking the craters on the moon for any we should erase because they happened in the last 65 million years! There was also the matter of cost. We could never afford to make all the creatures we wanted or to get the creatures to do everything we would have liked. Interaction with water, vegetation, even shadows and the ground, required huge amounts of art and render time to get right but would be hardly noticed by the audience. We got savvy quickly at how to get impact without costing the sequence out of existence. But the thrill of recreating a world that disappeared so long ago never dulled. Even the scientists and reviewers said they soon forgot they were watching animation.”
Standard visual techniques had to be rethought. “The easiest way to explain it is, if I’m filming a tiger in the jungle, I would want to be looking at it so you get a glimpse into its world,” McPherson explains. “I tend to shoot quite a long lens and make all of the foliage in the foreground melt so you’re looking through this kaleidoscopic world of this tiger walking through a forest. But you can’t do that with a visual effects creature because they can’t put the creature behind melty, out-of-focus foliage. The best example is the opening shot of Episode 101 of Life on Our Planet of a Smilodon walking through what looks to be grass. There is a lot of grass in front and behind it. The only way to achieve that was to shoot where the creature was going to be on this plate. You shoot it once clean. Then we add in and shoot multiple layers of out-of-focus grass and then those shots are all composited together so it looks like the creature is walking behind the grass, and we match the frame speed, which then makes it feel like you’re looking into that world.”
Having limitations is not a bad thing. “There are restrictions, but they also make you more creative,” McPherson observes. “You have to overcome the limitations of a limited number of shots by making sure that every shot works together and tells the story in the best possible way.” The usual friend or foe had to be dealt with throughout the production. “Because of weather, some shoots were hard, which had nothing to do with visual effects,” Baillie states. “We had to do some stormy cliff shots in Yesnaby, Scotland, and had winds of nearly 100 miles per hour, which was great for crashing waves. In Sweden, when we were doing the ice shots for the ‘Oceans’ episode of Prehistoric Planet, it was great to begin with because it was -28°C, but there weren’t any holes in the ice. We nearly flew to Finland to try to find one. Then overnight the temperature went up to 3°C, the wind picked up and all of the ice broke up and melted, and we couldn’t find any ice without a hole!” Dealing with the requirements for visual effects led to some surreal situations. “The Pterosaur cliffs sequence was one of the most complex sequences because it involved shoots on land, sea, aerial and practical effects,” Stewart recalls. “Animation Supervisor Seng Lau worked with me in the field to help direct the plate work, and it was a fun collaboration. Watching our Smurfblue baby puppet Pterosaurs emerge from their seaweed nests was a bonding moment!”
TOP TO BOTTOM: Proxies ranging from 3D-printed heads to cut-outs allowed Paul Stewart to frame shots properly for Prehistoric Planet. (Image courtesy of Paul Stewart)
Cinematographer Paul Stewart describes “mimicking other cameras, like thermal cameras,” to point out that many dinosaurs were warm-blooded and insulated by feathers. (Image courtesy of Apple)
Cinematographer Jamie McPherson on location to film Komodo dragons with the cine-buggy. (Image courtesy of Jamie McPherson)
REALISTIC FACIAL ANIMATION: THE LATEST TOOLS, TECHNIQUES AND CHALLENGES
By OLIVER WEBB
BOTTOM: Vicon’s CaraPost system can track points using only a single camera. Single-camera tracking engages automatically when just one camera can see a point; if the point becomes visible to two cameras again, tracking reverts to multi-camera mode. (Image courtesy of Vicon Motion Systems Ltd. UK)
Realistic facial animation remains a cornerstone of visual effects, enabling filmmakers to create compelling characters and immersive storytelling experiences. Facial animation has come a long way since the days of animatronics, as filmmakers now have access to a range of advanced facial motion capture systems. Technologies such as performance capture and motion tracking, generative AI and new facial animation software have played a central role in the advancement of realistic facial animation. World-leading animation studios are utilizing these tools and technologies to create more realistic content and to break new ground in the way characters are depicted.
A wide range of facial capture technology is in use today. One of the pioneers of facial animation technology is Vicon’s motion capture system, which played a vital role in films such as Avatar and The Lord of the Rings trilogy. Faceware is another leading system that has been used in various films, including Dungeons & Dragons: Honor Among Thieves, Doctor Strange in the Multiverse of Madness and Godzilla vs. Kong, as well as games such as Hogwarts Legacy and EA Sports FC 24. ILM relies on several systems, including the Academy Award-winning Medusa, which has been a cornerstone of ILM’s digital character realization. ILM also pioneered the Flux system, which was created for Martin Scorsese’s The Irishman, as well as the Anyma Performance Capture system, which was developed with Disney Research Studios. DNEG, on the other hand, uses a variety of motion capture options depending on the need. Primarily, DNEG uses a FACS-based system to plan, record and control the data on the animation rigs. “However, we try to be as flexible as possible in what method we use to capture the data from the actors, as sometimes we could be using client vendors or client methods,” says Robyn Luckham, DNEG Animation Director and Global Head of Animation.
Developing a collaborative working understanding of both the technology and the human aspects, with all their nuances and complexities, was particularly challenging for Framestore on 2023’s Wonka. “Part of that was building an animation team that had a good knowledge of how FACS [Facial Action Coding System] works,” remarks Dale Newton, Animation Supervisor at Framestore. “While as humans we all have the same facial anatomy, everyone’s face moves in different ways, and we all have unique mannerisms. When it came to generating a CG version of an actor as recognizable as Hugh Grant, that raised the bar very high for us. At the core of the team was sculptor and facial modeler Gabor Foner, who helped us to really understand the quirks of muscular activation in Hugh’s face, [such as] what different muscle combinations worked together with what intensities to use for any particular expression. We ended up with a set of ingredients, recipes if you like, to re-create any particular facial performance.”
Masquerade3 represents the next level of facial capture technology at Digital Domain. “This latest version brings a revolution to facial capture by allowing markerless facial capture without compromising on quality,” Digital Domain VFX Supervisor Jan Philip Cramer explains. “In fact, it often exceeds previous standards. Originally, Masquerade3 was developed to capture every detail of an actor’s face without the constraints of facial markers. Utilizing state-of-the-art machine learning, it captures intricate details like skin texture and wrinkle dynamics. We showcased its outstanding quality through the creation of iconic characters such as Thanos and She-Hulk for Marvel. Eliminating the need for markers is a natural and transformative progression, further enhancing our ability to deliver unmatched realism. To give an example of the impact of this update: normally, the CG actor has to arrive two hours early on set to get the markers applied. After each meal, they have to be reapplied or fixed. The use of COVID masks has made this issue infinitely worse. On She-Hulk, each day seemed to have a new marker set due to pandemic restrictions, and that caused more hold-ups on our end. So, we knew removing the markers would make a sizeable impact on production.”
TOP: ILM relies on the Academy Award-winning Medusa system, a cornerstone of ILM’s digital character realization, the Anyma Performance Capture system, which was also developed with Disney Research Studios, and the Flux on-set system. (Image courtesy of ILM)
BOTTOM: Wētā FX’s FACET system was developed primarily for Avatar, where it provided input to the virtual production technology. Other major facial capture projects include The Hobbit trilogy and the Planet of the Apes trilogy. (Image courtesy of Wētā FX)
Ensuring that the emotions and personalities of each character are accurately conveyed is critical when it comes to mastering realistic facial animation. “The process would loosely consist of capture, compare, review and adjust,” Luckham explains. “It would be a combination of the accuracy of the data capture, the amount of adjustments we would need to make in review of the motion capture data against the actor’s performance and then the animation of the character face rig against the actor’s performance, once that data is put onto the necessary creature/character for which it is intended. Once we have done as much as we can in motion capture, motion editing and animation, we would then go into Creature CFX – for the flesh of the face, skin folds, how the wrinkles would express emotion, how the blood flow on a face would express color in certain emotions – to again push it as close as we can to the performance that the actor gave. After that would be lighting, which is a huge part of getting the result of facial animation and should never be overlooked. If any one of these stages is not respected, it is very easy to not hit the acting notes and realism that is needed for a character.”
For Digital Domain, the most important aspect of the process is combining the actor with the CG asset. “You want to ensure all signature wrinkles match between them. Any oddity or unique feature of the actor should be translated into the CG version,” Cramer notes. “All Thanos’ wrinkles are grounded in the actor Josh Brolin. These come to life especially in his expressions, as the wrinkle lines created during a smile or frown exactly match the
actor. In addition, you don’t want the actor to feel too restricted during a capture session. You want them to come across as natural as possible. Once we showed Josh that every nuance of his performance comes through, he completely changed his approach to the character. Rather than over-enunciating and overacting, he underplayed Thanos and created this fantastic, stoic character. This is only possible if the actor understands and trusts that we are capturing the essence of his performance to the pixel.”
Performance capture is a critical aspect of realistic facial animation based on human performance. “Since Avatar, it has been firmly established as the go-to setup for realistic characters,” Cramer adds. “However, there are unique cases where one would go a different route. On Morbius, for instance, most faces were fully keyframed with the help of HMCs [head-mounted cameras], as they had to perfectly match the actor’s face to allow for CG transitions. In addition, some characters might need a more animated approach to achieve a stylistic look. But all that said, animation is still needed. We get much closer to the final result with Masquerade3, but it’s important to add artistic input to the process. The animators make sure the performance reads best to a given camera and can alter the performance to avoid costly reshoots.”
For Framestore’s work on Disney’s Peter Pan & Wendy, the facial animation for Tinker Bell was generated through a mixture of facial capture elements using a head-mounted camera worn by the actress Yara Shahidi. “This, then, underwent a facial ‘solve,’ which involves training deep learning algorithms to translate her facial motion onto our facial rigs,” Newton says. “The level of motion achieved by these solves required experienced animators to tighten up and finesse the animation in order to achieve the quality level for VFX film production. In contrast, performance capture on Wonka meant that we had good visual reference for the animators working on the Oompa Loompa as voiced by Hugh Grant. Working with Hugh and [co-writer/director] Paul King, we captured not only Hugh’s performance in the ADR sessions but also preparatory captures, which allowed us to isolate face shapes and begin the asset build. We had a main ARRI Alexa camera set up that he performed towards. Additionally, we had a head-mounted 1k infrared camera that captured his face and a couple of Canon 4k cameras on either side of him that captured his body movements.”
TOP: The facial animation for Tinker Bell was generated through a mixture of facial capture elements using a head-mounted camera worn by the actress Yara Shahidi, which then underwent a facial solve involving deep learning algorithms trained to translate her facial motion onto Framestore’s CG facial rig. (Image courtesy of Walt Disney+)
BOTTOM: Masquerade3 was developed by Digital Domain to capture every detail of an actor’s face without the constraints of facial markers, as showcased through the creation of iconic characters such as Thanos from the Avengers films. All Thanos’ wrinkles are grounded in the actor Josh Brolin. (Image courtesy of Marvel)
According to Oliver James, Chief Scientist at DNEG, generative AI systems, which can generate data resembling that which they were trained on, have huge potential applications in animation. “They also have the potential to create huge legal and ethical problems, so they need to be used responsibly,” James argues. “Applied to traditional animation methods, it’s possible to generate animation curves which replicate the style of an individual, but can be directed at a high level. So instead of having to animate the motion of every joint in a character over time, we could just specify an overall motion path and allow an AI system, trained on real data, to fill in the details and generate a realistic full-body animation. These same ideas can be applied to facial motion too, and a system could synthesize animation that replicated the mannerisms of an individual from just a high-level guide. Face-swapping technology allows us to bypass several steps in traditional content creation, and we can produce photorealistic renderings of new characters driven directly from video. These techniques are typically limited by the availability of good example data to train the networks on, but this is being actively tackled by current research, and we’re already starting to see convincing renders based on just a single reference image.”
Newton suggests that given how finely tuned performances in film VFX are today, it will take some time for AI systems to become useful in dealing with more than the simplest animation blocking. “A personal view on how generative AI is developing these days – some companies create software that seems to want to replace the artist. A healthier attitude, one that protects the artists and, thereby, also the business we work in at large, is to focus AI development on the boring and repetitive tasks, leaving the artist time to concentrate on facets of the work that require aesthetic and creative input. It seems to me a safer gamble that future creative industries should have artists and writers at their core, rather than machines,” Newton says.
For Digital Domain, the focus has always been to marry artistry with technology and make the impossible possible. “There is no doubt that generative AI will be here to stay and utilized everywhere; we just need to make sure to keep a balance,” Cramer adds. “I hope we keep giving artists the best possible tools to make amazing content. Gen AI should be part of those tools. However, I sure hope gen AI will not be utilized to replace creative steps but rather to improve them. If someone without an artistic background can make fancy pictures, imagine how much better an amazing artist can utilize those tools.”
There has been a surge in new technologies over the past few years that have drastically helped to improve realistic facial animation. “The reduced cost and complexity of capturing and processing high resolution, high frame rate and multi-view video of a performance have made it easier to capture a facial performance with incredible detail and fidelity,” James says. “Advances in machine learning have made it possible to use this type of capture to build facial rigs that are more expressive and lifelike than previous methods. Similar technology allows these rigs to perform in real-time, which improves the experience for animators; they can work more interactively with the rig and iterate more quickly over ideas. Real-time rendering from game engines allows animators to see their work in context: they can see how a shadow might affect the perception of an expression and factor that into their work more effectively. The overall trend is away from hand-tuned, hand-sculpted rigs, and towards real-time, data-driven approaches.”
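The FACS-based rigs James describes are typically driven by blendshapes: the posed face is a neutral mesh plus a weighted sum of per-action-unit vertex offsets. A minimal sketch of that arithmetic (the two-vertex mesh and the shape names are invented for illustration, not any studio's rig):

```python
import numpy as np

def evaluate_blendshapes(neutral, deltas, weights):
    """neutral: (V, 3) rest mesh; deltas: (K, V, 3) per-shape offsets;
    weights: (K,) activations in [0, 1]. Returns the posed (V, 3) mesh."""
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy rig: 2 vertices, 2 shapes.
neutral = np.zeros((2, 3))
deltas = np.array([
    [[0.0, 1.0, 0.0], [0.0, 0.0, 0.0]],  # shape 0 lifts vertex 0 (a "brow raise")
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],  # shape 1 pushes vertex 1 out (a "jaw open")
])
posed = evaluate_blendshapes(neutral, deltas, np.array([0.5, 1.0]))
print(posed)  # vertex 0 at (0, 0.5, 0); vertex 1 at (1, 0, 0)
```

A capture solve, whether hand-tracked or machine-learned, is essentially a search for the `weights` vector that best reproduces the actor's expression on each frame.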
Cramer supports the view that both machine learning and AI have had a serious impact on facial animation. “We use AI for high-end 1:1 facial animation, let’s say, for stunts. This allows for face swapping at the highest level and improves our lookdev for our 3D renders. In addition, we can control and animate the performance. On the 3D side with Masquerade3, we use machine learning to generate a 4D-like face mask per shot. Many aspects of our pipeline now utilize little training models to help streamline our workflow and make better creative decisions.”
On Peter Pan & Wendy, Framestore relied on advanced facial technology to capture Tinker Bell. “We worked with Tinker Bell actress Yara Shahidi, who performed the full range of FACS units using OTOY’s ICT scanning booth. The captured data was solved onto our CG facial rig via a workflow developed using a computer vision and machine learning-based tracking and performance retargeting tool,” Newton details. “This created a version of the facial animation for Tinker Bell the animators could build into their scenes. This animation derived from the solve required tightening up and refinement from the animators, which was layered on top in a non-destructive way. This workflow suited this show, as the director wanted to keep Yara’s facial performance as it was recorded on the CG character in the film. Here, the technology was very useful in reducing the amount of time it might have taken to animate the facials for particular shots otherwise.”
Luckham concludes, “For facial capture specifically, I would say new technologies have generally improved realism, but mostly indirectly. It’s not the capturing of the data literally, but more how easy we can make it for the actor to give a better performance. I think markerless and camera-less data-capturing is the biggest improvement for actors and performances over technology improvements. Being able to capture them live on set, rather than on a separate stage, adds to the filmmaking process and the involvement of the production. Still, at the moment, I think the more intrusive facial cameras and stage-based capturing do get the better results. Personally, I would like to see facial capture become a part of the on-set production, as the acting you would get from it would be better. Better for the actor, better for the director and better for the film as a whole.”
TOP TO BOTTOM: DNEG primarily uses a FACS (Facial Action Coding System) based system to plan, record and control the data on the animation rigs, but tries to be as flexible as possible in what method they use to capture the data from the actors, as sometimes they could be using client vendors or client methods. (Images courtesy of DNEG)
Creating Ariel’s digital double for The Little Mermaid was one of the most complicated assets Framestore ever built. It involved replacing all of Halle Bailey’s body, at times also her face, into her mermaid form.
(Image courtesy of Walt Disney Studios)
On Morbius, where Masquerade3 was used for facial capture, most faces were fully keyframed with the help of HMCs, as they had to perfectly match the actor’s face to allow for CG transitions. (Image courtesy of Digital Domain and Columbia Pictures/Sony)
FORMING A STRONG BOND WITH RACHAEL PENFOLD
By OLIVER WEBB
Rachael Penfold grew up in Ladbroke Grove in West London where she attended the local comprehensive school. “At the time, it was less of an institute for education and more like a social experiment,” Penfold reveals. “A symptom of the same problems we see today because of the lack of funding for state education. A small example, but it’s easy to see why diversity is a problem in the wider film industry.”
Penfold didn’t initially study to become a visual effects artist, instead she learned on the job. “I had the best apprenticeship really,” Penfold says. “It was at the Computer Film Company (CFC), a pioneering digital film VFX company. I think they may have been the first in the U.K. It was a sort of melting pot of scientists, engineers, filmmakers and artists. It was weird and fun and pretty unruly to be honest, but always and without compromise, it was about image quality. I obviously made a half-decent impression as a runner at CFC and was moved to production assistant. Either that or they simply needed bodies in production. So, in that sense, I caught a lucky break with my entry into the industry.”
Penfold’s lucky break came in 1997 with the British fantasy film Photographing Fairies, serving as Visual Effects Producer. This was followed by other CFC projects, which saw Penfold work as a Visual Effects Producer on acclaimed films such as Tomorrow Never Dies, Spice World, The Bone Collector, Mission: Impossible II, Chicken Run and Sexy Beast. Penfold’s last CFC project was the 2002 film Resident Evil for which she was Head of Production.
In 2004, Penfold co-founded London visual effects studio One of Us alongside Dominic Parker and Tom Debenham, where she currently serves as Company Director. Working across TV and film, One of Us currently has capacity for over 300 artists, but is looking to expand across more exciting projects. One of Us also launched its Paris studio in 2021, which houses 70 artists. The company won Special, Visual & Graphics Effects at the 2022 BAFTA TV Awards for their work on Season 2, Episode 1 of The Witcher, and in 2022 they were also nominated for Outstanding Visual Effects in a Photoreal Feature at the VES Awards for their work on The Matrix Resurrections. Some of their recent work includes Damsel, The Sandman, Fantastic Beasts: The Secrets of Dumbledore and Bridgerton, Season 2.
Images courtesy of Rachael Penfold, except where noted.
TOP: Rachael Penfold, Co-Founder and Company Director, One of Us
OPPOSITE TOP: Some of the work that One of Us completed for Damsel included digi-doubles, burnt swallows, armor swords, dragon fire, melting and cracking ice and huge-scale cave environments and DMP set extensions. (Image courtesy of Netflix)
OPPOSITE MIDDLE, LEFT TO RIGHT: The One of Us leadership team. From left: Tom Debenham, Rachael Penfold and Dominic Parker. One of Us won the BAFTA Craft Award in 2018 for their work on The Crown.
Penfold at the 2022 Emmy Awards where One of Us was nominated for Special Visual Effects in a Single Episode for Episode 1 of The Man Who Fell to Earth as well as for Special Visual Effects in a Season or a Movie for The Witcher, Season 2.
OPPOSITE BOTTOM: The Zone of Interest won Best International Feature Film at the 96th Academy Awards and was also nominated for Best Picture. (Image courtesy of A24)
Setting up the company was a very organic process for Penfold. “We set up a small team to deal with the visual development of a particular project,” Penfold says. “We kept ourselves small for quite a few years, always wanting to engage with more of the outsider work. It was great fun, lots of risk, and my two partners, Dominic Parker and Tom Debenham, are also two wonderful friends. We still work so closely together.”
Penfold’s first film as One of Us Visual Effects Producer was the 2007 film When Did You Last See Your Father? starring Jim Broadbent and Colin Firth. Since launching One of Us, Penfold has worked on an array of projects including The Tree of Life, Cloud Atlas, Under the Skin, Paddington, The Revenant and The Alienist. One of Us also served as the leading vendor on the Netflix original series The Crown. Their work for the show included digital set extensions, environments, crowd replication and recreating Buckingham Palace. They have also contributed to key scenes throughout the series, including Queen Elizabeth and Prince Philip’s Royal Wedding of 1947, the funeral of
King George VI and the subsequent coronation of Queen Elizabeth II. Penfold served as Visual Effects Executive Producer across 20 episodes. That work brought the company wider recognition: they won the BAFTA Craft Award in 2018, and received Emmy nominations for Outstanding Special Visual Effects in a Supporting Role in 2017 for Season 1 and again in 2018 for Season 2. The company ethos of One of Us is one of creative intelligence and the ability to select and adapt ways of approaching the practical problems involved in bringing ideas to life. Since their launch in 2004, their work has truly reflected these values.
Choosing a favorite visual effects shot from her oeuvre, however, is an almost impossible task for Penfold. With such an impressive catalog of award-winning films and series to her name, it’s easy to understand why.
“I definitely have favorite work, but not always because it’s the biggest or most ambitious. There’s so much that makes an experience great, not just the outcome, but the journey and who you go on that journey with,” Penfold explains. “But, from a professional pride perspective, Damsel is a huge achievement.”
MIDDLE: One of Us enjoyed its creative involvement with Mirror Mirror (2012). (Image courtesy of Relativity Media)
BOTTOM LEFT: Penfold and Jonathan Glazer at the premiere of Under the Skin at the 70th Venice International Film Festival in Venice in 2013. (Photo: Aurore Marechal)
BOTTOM RIGHT: Penfold taught a masterclass in VFX for “Becoming Maestre – A springboard for a new generation of professionals in cinema and seriality” in Rome in 2023. The mentoring program, aimed at Italian female audiovisual talent, was conceived and developed by Accademia del Cinema Italiano, David di Donatello Awards and Netflix as part of the Netflix Fund for inclusive creativity. (Image courtesy of Accademia del Cinema Italiano)
Some of the impressive work that One of Us completed for the film included digi-doubles, burnt swallows, armor swords, dragon fire, melting and cracking ice, huge-scale cave environments and DMP set extensions. “Again, I can’t choose a favorite shot from the film,” Penfold explains. “There are so many massive shots in that film, but the shot where the dragon flattens the knight underfoot feels like the culmination of many years of hard work, growing One of Us to the point where it can take on the toughest challenges. From a simple aesthetic perspective, I love the work we did on Mirror Mirror – a perfectly told story in a beautifully designed world.”
Another recent project that Penfold is particularly proud of is The Zone of Interest, directed by her partner Jonathan Glazer. The film marks their third feature collaboration after Sexy Beast (2000) and Under the Skin (2013). The Zone of Interest follows Rudolf Höss, the commandant of Auschwitz, and his wife, Hedwig, as they strive to build a dream life for their family beside the extermination camp that Höss helped to create. “Sometimes you are proud to be associated with a project because it’s a great piece of filmmaking, even if our work is a relatively minor contribution,” Penfold remarks. “I’m very proud to have been a part of The Zone of Interest. I believe it is an important film. There are also 660 visual effects shots in it, and I’m delighted that no one knows that!”
On the experience of setting up a company in a male-dominated industry, Penfold explains that the important thing is to look after the work and look after the people and the rest should follow. “Parity/equality is best served by looking after your people – and by looking after all people so you create an environment where everyone can thrive. Key roles for women, and a diverse team, will come through a genuine commitment to value everyone which, in turn, will enrich everything that we do,” Penfold states.
VFX tools and technology never stop developing, and there have been many advancements since the beginning of Penfold’s career. “Thinking back to the rudimentary tools we used to have – massive clunking hardware that seemed to constantly fall over or fail. The most basic software… we have come a really long way,” she notes. “Many of today’s tools are designed to improve long-established techniques. The craft is being reinvigorated by new technologies in better and more exciting ways. Some technologies are completely new, and we are learning how to use them. But, as consumers, we have an irrepressible desire to examine our own humanity, one way or another. I can’t see a world where human storytellers are not at the heart of creating and bringing to life our own stories.”
For Penfold, that humanity is key to her definition of success. “Without question, the thing that I enjoy most about my role is the absolutely wonderful people I work with. Whether that’s internally – our teams, or whether that’s part of the ‘film family’ – these endeavors are often hard and long and unknown. So, you really do form extremely strong bonds. Producing exciting imagery is thrilling; it’s a real buzz. So, to do that as part of a tight-knit team is rewarding in so many ways.”
PROGRAMMING THE ILLUSION OF LIFE INTO THE WILD ROBOT
By TREVOR HOGG
What happens when a precisely programmed robot has to survive the randomness of nature? That is the premise that allowed author Peter Brown to bring a fresh perspective to the ‘fish-out-of-water’ scenario that captured the attention of filmmaker Chris Sanders and DreamWorks Animation. Given the technological innovations that were achieved with The Bad Guys and Puss in Boots: The Last Wish, the timing proved to be right to adapt The Wild Robot in a manner that made distinguishing the concept art from the final frame remarkably difficult.
“When I read the book for the first time, I was struck by how deep the emotional wavelengths were, and I became concerned that if the look of the film wasn’t sophisticated enough, people would see it as too young because of the predominance of animals and the forest setting,” explains director/writer Chris Sanders. “We’ve always talked about Hayao Miyazaki’s forests, which are beautiful, sophisticated, immersive and have depth. We wanted that same feeling for our film.”
Achieving that desired visual sophistication meant avoiding the coldness associated with CG animation. “I’m absolutely thrilled about the analog feel that we were able to revive,” Sanders notes. “It’s one of the things that I’ve missed the most about traditional animation. The proximity of the humans who created it makes the look resonate. When you have hand-painted backgrounds, the artist’s hand is evident; there’s a warmth and presence that you get. When we moved into full CG films, we immediately got these wonderful gifts, like being able to move the camera in space and change lenses. However, we also lost that analog warmth and things got cold for a while.” The painterly approach made for an interesting discovery. “A traditionally done CG tree is a structure that has millions of leaves stuck to it, and we’re fighting to make those leaves not look repetitive. Now we’re able to have someone
paint a tree digitally. They can relax the look and make it more impressionistic. The weird and interesting thing is, it looks more realistic to my eye,” Sanders remarks.
CG animation was not entirely avoided as a visual aesthetic, in particular when illustrating the character arc of the island-stranded ROZZUM Unit 7134 robot, which goes by the nickname ‘Roz.’ “From the first frame of the movie to the last frame, Roz is dirtier and growing things on her,” notes Visual Effects Supervisor Jeff Budsberg. “But if you look at the aesthetic of how we render and composite her, there is a drastic difference. Roz is much looser. You’ll see brushstroke highlights and shadow detail removed just like a painter would. We slowly introduce those things over the course of the movie. When the robots are trying to get her back, it becomes a jarring juxtaposition; she now fits with the world around her while the robots have that more CG look to them.”
Something new to DreamWorks is authoring color in the DCI-P3 color space rather than sRGB. Budsberg explains, “It allows us a wider gamut of available hues because we wanted to feel more like the pigments that you have available as a painter. It allows us to hit way more saturated things than we’ve been able to do at DreamWorks before. The same with the greens. They’re so much richer in hue. Maybe to the audience, it’s imperceptible, but the visceral experience is so much more impactful.”
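The wider gamut Budsberg describes can be checked numerically. As a simplification, the sketch below uses Display P3 (the D65 variant of the P3 primaries common on consumer displays) rather than theatrical DCI-P3, and works in linear light; the matrices are the standard linear-RGB-to-XYZ conversions for each space.

```python
import numpy as np

# Linear-light RGB -> XYZ matrices (both relative to a D65 white point).
SRGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])
P3_TO_XYZ = np.array([
    [0.4865709, 0.2656677, 0.1982173],
    [0.2289746, 0.6917385, 0.0792869],
    [0.0000000, 0.0451134, 1.0439444],
])

def convert(rgb, src, dst):
    """Convert a linear RGB triple between primaries by way of XYZ."""
    return np.linalg.inv(dst) @ (src @ rgb)

green_srgb_in_p3 = convert(np.array([0.0, 1.0, 0.0]), SRGB_TO_XYZ, P3_TO_XYZ)
green_p3_in_srgb = convert(np.array([0.0, 1.0, 0.0]), P3_TO_XYZ, SRGB_TO_XYZ)
print(green_srgb_in_p3)  # all components in [0, 1]: sRGB green fits inside P3
print(green_p3_in_srgb)  # a negative component: P3 green lies outside sRGB
```

The second result is the point: the most saturated P3 green has no valid sRGB encoding, which is the extra headroom a P3 authoring pipeline buys.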
Elevating the themes of The Wild Robot is the shape language. “We’ve always loved the potential that Roz was going to be a certain fish out of water, and that influenced the design,” notes Production Designer Raymond Zibach. “There’s the simple, circular, clean way we know a lot of our technology, from the iPhone to the Roomba – everything is simple shapes – whereas nature is jagged, or pretty with flowers, but everything is asymmetrical.” The trio of Nico Marlet, Borja Montoro and Genevieve Tsai were responsible for the character designs. “All three of them were studying animals and doing the almost classic Disney approach, like when they studied deer for Bambi. Our style landed somewhere in-between a realistic drawing and a slightly pushed style. You can see that in the character, Fink. How big his tail is to how pointy his face is. Those things are born out of an observation that Nico made on all of his fox drawings. We have quite a few species. I heard somebody say 60, but it’s because we have quite a few birds, so maybe that’s why it ended up being that many,” Zibach says.
Eliminating facial articulation was an important part in making Roz a believable and endearing robot.
“It’s limiting not to have a mouth, but that’s red meat for us as animators because that’s when we start to imagine and emphasize pantomime. We looked at Buster Keaton, who has very little facial expression, but his body language conveys all of the emotions, as well as other masters of pantomime and comedy, Charlie Chaplin and Jacques Tati. There is definitely comedy in this film, but all of it is grounded in some form of reality.”
—Jakob Jensen, Head of Character Animation
While the watercolor paintings Tyrus Wong did for Bambi influenced the depiction of the island, a famous industrial designer was the inspiration for the technologically advanced world of the humans and robots. “Syd Mead is the father of future design from the 1960s through the 1980s,” Zibach observes. “The stuff before Blade Runner was optimistic. We wanted to bring that sense to our human version of the future, which to the humans is optimistic. However, for the animals, it’s not a great place for them. That ended up being such a great fit.” As for the robots, Ritchie Sacilioc [Art Director] did the final designs for Roz, RICOs and Vontra.
“Ritchie designed most of the future world except for a neighborhood that I did. Ritchie loves futurism, and you can see it reflected in all of those designs because he also helped matte painting do all
of the cityscapes. I couldn’t be happier because I always wanted to work on a sci-fi movie in animation, and this is my first one. We got to blow the doors off to do cool stuff.”
Tools were made to accommodate the need for wind to blow through the vegetation. “In Doodle, where you can draw whatever foliage assets that you want, we have Grasshopper, which can build rigs to deform these plants on demand, and you can build more physical rigs of geometry that don’t connect or flowers that are floating,” Budsberg explains. “We can build different rigs at various wind speeds, or you can do hero-generated geometry.” Water was tricky because it could not look like a fluid simulation. “It’s one thing to make the water, but how do you make the water look like a painter painted it? It’s not good enough to make physically accurate water. You have to take a step back to be able to dissect it: How would Hayao Miyazaki paint this or draw that splash? How would a Hudson River painter detail out this river? You have to forget what you know about computer-generated water and rethink how you would approach some of those problems. You want to make sure that you feel the brushstrokes in that river. Look at the waterfall shot where the water starts to hit the sunlight; you feel that the brushstrokes of those ripples are whipping through the river. There’s a little Miyazaki-style churn of the water that is drawn. Then the splashes are almost like splatter paint. It’s an impressionistic version of water that allows the audience to make it their own. You don’t want to see every single micro ripple or detail,” Budsberg remarks.
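Doodle and Grasshopper are DreamWorks' in-house tools, but the core idea of a wind rig (a deformer, driven by wind speed, that displaces geometry more the farther it sits from its rooted base) can be sketched generically. Everything below, the gust model included, is an illustrative stand-in:

```python
import math

def wind_bend(points, t, wind_speed, gust_hz=0.8, stiffness=4.0):
    """Bend a stem's 2D control points (base at y = 0) under a gusting wind.

    Displacement grows with height squared so the base stays planted,
    roughly how a cantilevered stem deflects under a distributed load.
    """
    # A steady component plus a sinusoidal gust, both scaled by wind speed.
    gust = wind_speed * (0.6 + 0.4 * math.sin(2 * math.pi * gust_hz * t))
    return [(x + gust * y * y / stiffness, y) for x, y in points]

stem = [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)]
print(wind_bend(stem, t=0.0, wind_speed=1.0))
# [(0.0, 0.0), (0.0375, 0.5), (0.15, 1.0)]
```

A per-species rig would swap in different stiffness profiles, and a hero plant would get hand-animated geometry on top, but the shape of the problem is the same.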
A conscious decision was made not to give Roz any facial articulation to avoid her appearing ‘cartoony.’ “It’s limiting not to have a mouth, but that’s red meat for us as animators because that’s when we start to imagine and emphasize pantomime,” states Jakob Jensen, Head of Character Animation. “We looked at Buster Keaton, who has very little facial expression, but his body language conveys all of the emotions, as well as other masters of pantomime and comedy, Charlie Chaplin and Jacques Tati. There is comedy in this film, but all of it is grounded in some form of reality.”
It was also important to incorporate nuances that revealed a lot about the characters and added to their believability. “One of our Chief Supervising Animators, Fabio Lignini, was the supervisor of Roz chiefly, but also was with me from the beginning in developing the animal stuff. He did so much wonderful work that was so inventive and grounded in observations of how otters behave. We would show the story department, ‘This is what we’re thinking.’ That would then make them pivot from doing a lot of anthropomorphic hand-acting of certain creatures that was not the direction Chris Sanders or our animation department felt that it should go in. By seeing what we were doing, they adapted their storyboards, because you have to stage your shot. How do otters swim? There is so much fun stuff to draw from nature, so why not use that?”
Locomotion had to be understood to the degree that it became second nature to the animator. “The animal starts walking and trots over here, or maybe gallops and then stops,” Jensen explains. “For that to become second nature for an animator, you have to study hard, and you’ll see shots where it’s amazing how the team
did it because you never pay attention to it. You just believe it. That was my first pitch to Chris Sanders as to how far I saw the animation approach and style. I wanted the animation to disappear into the story and for no one to concentrate on the fact that we are watching animation. We’re just watching the story and characters.” Crowds were problematic. Jensen comments, “Sometimes, we have an immense number of characters who couldn’t be handled by the crowds department alone. We threw stuff to each other all of the time. But in order to even have a session in our Premo software that would allow for more than five to 10 characters in the shot, they had to come up with all kinds of solutions to have a low-resolution version of the character that could be viewed while animating all of the other characters, because there are a ton of moments when they all interact with each other.”
TOP: Effects are added to depict Roz fritzing out in the forest.
BOTTOM: An animation storyboard that explores the motion of Roz.
Animation tends to have quick cuts, which was not appropriate for The Wild Robot. “From the start, it became evident that we needed time [to determine] how to express that in a way when you don’t have a final image to say, ‘See this is gorgeous and you’re going to want to be here,’” notes Editor Mary Blee. “We were using tricks and tools like taking development art and using After Effects to put characters in it moving, or getting things on layers to say, ‘She is walking through the forest. You guys are going to be interested one day, but it’s hard to see right now.’ There was a lot of invention to get across the flavor, tone and pace of the movie. We had a couple of previs shots, one in particular where Roz stands on top of the mountain and sees the entirety of the island. Chris Stover [Head of Cinematography Previz/Layout], with help from art, made that up out of nothing before we had a sequence.” Boris FX was utilized to create temporary effects. “The first sequence I cut was Roz being chased by a bear, falling down a mountain and discovering that she has destroyed a goose nest and there is only an egg left. We needed to show that her computer systems were failing, so we made a fake HUD and started adding effects to depict it fritzing out and an alarm going off. That sequence was amazing because it encapsulated what the movie was going to be, which was a combination of action, stress, excitement, devastation, sadness and silence. All that happens within two and a half minutes,” Blee says.
Illustrating scenes involving crowds was a major task for the
previs team. “We have a lot of naturalistic crowds, whether it was all of the animals in the lodge, the migration scenes, flying flocks of birds and the forest fire with all of the animals running,” Stover notes. “For the geese, it was fairly simple. It was like a sea of birds. When you are looking at moments like the lodge, it was impactful for the audience to understand that there were a lot of animals that were going to be affected if Roz didn’t step in and help the island over this harsh winter.” Each of the three ‘oners’ was complex to execute. Stover explains, “Those types of shots were often tricky because I don’t want anybody to ever look at it and go, ‘That’s a single shot. It’s really cool.’ What we want to do is allow you to be with that character for an extended period of time in a way that would ground you to that character’s challenges, as well as the cinematic moment that we’re trying to achieve in the storytelling.” Throughout the story, various camera styles were adopted. “At the beginning, we’re on a jib arm and it feels controlled. It wasn’t until we created the shot where the otters jump into the water and pop up that we then realized we are now in the island’s point of view. The wildness of the island became a loose camera style. The acting was going to drive the camerawork. When Vontra tries to lure Roz onto the ship, we use this flowy camera style. We let the action move in a much more dynamic way. The camera feels deliberate in its choices. That sensibility of being deliberate versus the sensibility of reactionary camerawork was the contrast that we had to play with throughout the film.”
To create the illusion of spontaneity, close attention was paid to the background characters. “At one point, a pair of dragonflies are behind Roz,” Sanders states. “I never sat down and said, ‘I must have dragonflies.’ This was something that was built in as people were working on it. It was perfectly placed and was so well thought through because it looked believable. Things fly in; they’re asymmetrical, off to the side and dart out. It feels like the dragonflies flew through a shot that we were shooting with a camera on that day.”
Jensen is partial to the character of Fink. “That was one of the first characters that we developed that was supervised by Dan Wagner [Animation Supervisor], who is a legend. I liked animating him because it’s difficult to do a fox. They don’t quite move like dogs or cats. It’s not even an in-between. It’s something interesting.” Even moments of silence were carefully considered, such as during the migration scene where Brightbill is at odds with his surrogate mother Roz before he flies away with a flock of geese, which was aided by a shifted line of dialogue that reinforces the idea that things are not good between them.
“If we have done our job, there is storytelling in the silence,” Blee observes. “It’s not just a pause or beat. It’s the weight of what we’ve all had in our lives when we needed to say something to somebody, but we can’t. It’s awkward and difficult, and there are too many emotions. Hundreds of storyboards were drawn to try to get that moment right over time. Because we don’t want it to sit there and use exposition to have people just talk to explain, ‘You’re supposed to be feeling this right now.’ No. It’s a lot of work to make it so you don’t have to say anything but the audience understands what’s happening.”
Concept art for the pivotal sequence where Roz aids Brightbill in joining a flock of migrating geese.
Concept art of Roz and Fink attempting to survive a brutal snowstorm, which served as a visual template for the painterly animation style.
Roz appears to be oblivious to a looming wave as her attention is focused on the coastline activities of crabs.
Depicting the size and scale of Roz in comparison to the beach formations.
UNSUNG HEROES: SFX ARTISTS KEEP IT REAL IN A DIGITAL WORLD
By OLIVER WEBB
The role of the special effects artist is to create an illusion, practically or digitally, to enhance the storytelling. Practical effects include the use of makeup, prosthetics, animatronics, pyrotechnics and more, while digital effects rely on computer-generated imagery. With the two mediums having blended together in more recent years, the role of the special effects artist is often overlooked. Following are just a few of the many artists working today who are responsible for providing audiences with outstanding effects and immersing us in the world of film. From concept designers to special effects supervisors, they are working tirelessly behind the scenes to make movie magic happen.
Neil Corbould, VES, Special Effects Supervisor
TOP: Dominic Tuohy was nominated for an Academy Award for his visual effects work on The Batman (2022). (Image courtesy of Dominic Tuohy)
OPPOSITE TOP: Neil Corbould supervised this train-over-the-cliff effect for Mission: Impossible – Dead Reckoning Part One (2023). (Image courtesy of Neil Corbould and Paramount Pictures)
OPPOSITE BOTTOM: Neil Corbould feels that practical effects are “in a great place,” and he embraces all the new technologies that come along that add to his toolkit of options. (Image courtesy of Neil Corbould)
My uncle, Colin Chilvers, was the Special Effects Supervisor on Superman back in 1978. Being a big fan of Superman, I bugged Colin into taking me to see some of the sets. The first set he showed me was the Fortress of Solitude built on the 007 stage at Pinewood Studios. I arrived just at the right time to see Superman, aka Christopher Reeve, flying down the length of the stage on wires through the smoke, mist and dry ice. From that moment on, I knew that I wanted a job in special effects. After Superman, I went on to work on Saturn 3, which starred Kirk Douglas and Farrah Fawcett-Majors, then Superman 2. After that, I started working with a company called Effects Associates Ltd., run by Martin Gutteridge. This was an amazing place to learn the art of practical effects both in small and large-scale productions.
I feel that practical effects are now in a great place. I embrace all the new technologies that come along, and I am always on the lookout for the next generation of machines and materials I can use and integrate into my special effects work. YouTube has been a great source of information for me. Whenever I have any free time, I search through clips on the platform, and it’s amazing what people come up with. Then I try to figure out how I can use it. With the influx of streaming platforms and the need for product, I have seen a surge in the need for more practical effects personnel. This has meant an increase of crew coming through the ranks at a fast pace, which has been a concern of mine here in the U.K. To be graded as a special effects supervisor takes a minimum of 15 years. During those 15 years, you need to have completed a certain number of movies in the various grades: trainee, assistant technician, technician, senior technician and then on to supervisor. The same applies to pyrotechnic effects, which has similar criteria, with additional independent explosives and handling courses that need to be completed before you are allowed to handle and use pyrotechnics and explosives. There is a worry that some have been fast-tracked through the system, which is where my concern lies. Because of the nature of the work we do, safety is paramount. You need to have put in the time and have the experience to say, ‘This is not right.’ That comes with time spent working alongside seasoned supervisors who have been in the industry a long time. We need to create a safe working environment for everyone who is in and around a set.
Dominic Tuohy, Special Effects Artist
If you look at when I first started, everything we did was in-camera, i.e. filmed for real. It was harder back then – nearly 40 years ago – than it is today because we had almost no CGI and relied on matte painting and miniature work, etc. That’s where the saying ‘It’s all smoke and mirrors’ comes from.
Now the goal is to create a seamless collaboration between special effects and visual effects so that you don’t question the effects within the film. Visual effects always have this problem: If you say to someone, “Draw me an explosion,” your image will be different from mine. Whereas, if I create a real explosion, you, the audience, will accept it, and it grounds the film in reality. That’s the starting block of the ‘smoke and mirrors’ moment. Now, using the existing SFX explosion, visual effects can augment that image to fit the shot and continue to convince the audience it’s real. All of this cannot be achieved without a great team effort, and that’s something the British film industry has in abundance. [Note: Tuohy won the Academy Award in 2020 for Best Achievement in Visual Effects for his work on 1917.]
Nick Rideout, Co-Founder and Managing Director, Elements Special Effects
I am one of the few of my generation who wanted to be in SFX from an extremely early age. I was completely taken by Star Wars and Hammer [Film Productions] movies and wanted to be involved in making monsters and models for the film industry. I was lucky enough to go to art school and had the good fortune of a very honest tutor who pointed out that at best I was an under-average sculptor, but that I shouldn’t let that stop me from pursuing a career in effects work, as it is such a varied department. With that, it was a matter of writing a lot of letters and hanging in there once given a chance within a physical effects department. Weather effects, mechanical rigs, fire and pyrotechnics – it felt like the greatest job ever, and still does most days. I worked hard in the teams that I was part of and was fortunate enough to work alongside some of the best technicians of the time, who took the time to explain the how and why, along with giving me the chance to express my ideas. It’s a long road, as no two days are ever the same, and even now I’m not sure when you become an expert in the field, as it is a constant learning curve.
With all HETV [high-end television], the challenges start with script expectation and the director’s vision versus the schedule, budget and location. The challenges can be so varied: the physicality of getting equipment onto a location, Grade I listed buildings, not having the means to test prior to the shooting day – all this before cast and cameras are present.
There are so many times that I am proud of my crew and their accomplishments. Without romanticizing our department, the pressure to deliver on the day is huge with nowhere to hide when the cameras are turning. Concentration, expertise and, at times, sheer grit get these effects over the line. Film and television production is always evolving and reinventing. SFX is at times an old technology but remains able to integrate itself with the most modern of techniques. That’s not to say we have not developed alongside the rest of the industry, but we will always be visual and physical.
Max Chow, Concept Designer, Wētā Workshop
I didn’t know much about SFX. I wanted to try different stuff, and it just so happened that Wētā Workshop had opportunities. Because I’m quite new to this, learning about the harmony between physical manufacturing and the digital pipeline is very important. It is exciting to play a part in any of these projects at Wētā Workshop. I think those physical limits make it relatable to us and give a realistic feel, which I think we all appreciate. Being fairly new to special effects, one of the most challenging things was storyboarding for Kingdom of the Planet of the Apes. Seeing how great those storyboards were, I had to try my best to match that standard while learning how important pre-visualization and boarding are for VFX.
John
It’s easy to get lost and want to reinvent things and put in alternate creative designs – putting a lot of yourself in it – but we have to pull back. We have to trust in our textiles and leather workers, because they know every stitch, every seam, better than us. We have to design with the realities of SFX in mind. Later, when something is made wet, when cloth physics is applied to a design, or when rigging is applied in VFX, there’s no doubt that it works, because it worked in real life. As VFX progresses, SFX progresses in tandem at the same rate – we are using new materials, workflows, tech and pipelines for physical manufacture. The world has started to demand more of the unseen in theaters, like Avatar, Kingdom of the Planet of the Apes and Godzilla vs. Kong, where we’re getting fully made-up worlds and less human ones. We pay to see something that we can’t experience in real life. As that evolves, as we demand greater entertainment, there’s more work needed to ground these films and make everything a spectacle but also believable.
Iona Maria Brinch, Senior Concept Artist, Wētā Workshop
Being a fan of The Lord of the Rings books and films, and fascinated by Wētā Workshop’s work on the films, it was a thrill when I was introduced to Wētā Workshop Co-founder Richard Taylor, VES, through a friend while I was backpacking through New Zealand. He saw my paintings, pencil sketches and wood carvings that I had been creating on the road, and invited me in. I began as an intern, working through different departments before eventually landing a role in the 3D department. From there, I worked my way into our design studio, initially doing costume designs for Avatar: The Way of Water, working closely with Costume Designer Deborah Scott. I tend to lean towards the more organic and elegant side of things when designing, such as the Na’vi costumes, which have a natural feel to them. So, working on the designs with Deborah Scott for the RDA [Resources Development Administration], who are a military unit in Avatar, was a fun challenge, from having to consider hard surfaces, to thinking about futuristic yet realistic-looking gear. I was part of the team that designed a lot of the female characters’ clothing in Avatar, like Tsireya, Neytiri and Kiri, with Deborah Scott. Getting to properly dive into the world of Pandora, then eventually see my designs go through our costume department, where they’d build them physically, was incredible. They brought it to a completely new level. You can’t take ownership of a design – it’s a collaboration that goes through so many hands, which is an absolute joy to see and be a part of.
There’s a lot to consider when you’re designing stuff that is going to be made digitally and what’s going to look good on a screen. There needs to be volume and 3D textures, interesting shapes and silhouettes that also read well at a distance, such as adding strands that can sway in the wind, shells that can give a slight dangling sound, or certain objects that can catch the light in an interesting way. All these things make it feel immersive and realistic. It’s amazing to see how SFX is blending with VFX, but how there’s still a desire to make as much as possible physically. SFX adds a certain grounded-ness to the films, perhaps because there are still limits to how much you can do in SFX.
Abbie Kennedy, Layout and Matchmove Supervisor, ILM
I had mainly focused on modeling and texturing during my Computer Animation degree, so I had no experience with matchmove or layout when I joined the industry in 2010. I distributed my showreel to the London VFX studios with the dream of becoming a texture artist and soon realized that a graduate position in that field was hard to come by. At that time, the entry-level route across the 3D departments was via matchmove. I landed a junior matchmove artist role at DNEG working on Walt Disney Studios’ John Carter and soon recognized the importance of this skill. You could have the most amazing textured models, but if the matchmove wasn’t correct, the shots wouldn’t look convincing. I discovered that I enjoyed the problem-solving aspect of replicating exactly what took place on set. I decided to commit to a path in the matchmove department there, eventually progressing to Matchmove Supervisor. When ILM opened its London office, the layout department appealed to me as it encompasses matchmove and layout, satisfying both the technical and creative sides of my brain. I came in to work on Star Wars: The Last Jedi and quickly adapted to the ILM pipeline and proprietary software. The scope of work in the department is varied, and each project has different challenges. One month you could be body-tracking an actor to transform them into someone else and the next you’re flying a camera through space. I now manage the department as a whole, growing and nurturing a team of highly skilled career layout artists while working on really exciting projects.

TOP TO BOTTOM: On the set of Napoleon with Special Effects Supervisor Neil Corbould. Corbould believes that creating a safe working environment for everyone on and around a set is paramount. (Image courtesy of Neil Corbould and Columbia Pictures/Sony)

The digital 3D modeling work for Kingdom of the Planet of the Apes was completed by the Aaron Sims Company with Wētā Workshop designer Max Chow, who was also a concept and storyboard artist on the film. (Image courtesy of Walt Disney Studios Motion Pictures)

Hayley Williams served as Special Effects Supervisor on Wonka. (Image courtesy of Warner Bros. Pictures)
Layout is right at the beginning of the shot pipeline. If our work isn’t complete or correct, it holds up all the other departments, so there can be a lot of pressure at the beginning of a show to output a lot of work. The job of layout supervisors/leads is to make sure artists have all the ingredients to complete their shot work. We’re processing LiDAR scans so that they are user-friendly, separating out movable set pieces so we can reposition them; solving the lens grids so we have a library of lens distortion we can add to our cameras; and liaising with production to schedule the shots in an order that allows for the most efficient workflows both in layout and downstream, to name but a few! Organization is key, and spreadsheets are my friend! The work that comes our way in the layout department is constantly evolving, and we’re always developing new technology to push the boundaries of what we can achieve. In the last few years, we have taken on some innovative projects such as the ‘ABBA Voyage’ immersive experience. This presented many technical challenges and involved putting together a team of artists to capture thousands of seconds of facial performance. I’m proud of being able to give a wave of junior artists a foot in the door at ILM. I have a passion for growing emerging talent, and I’m excited to see what projects come our way next.
Gem Ronn Cadiz, Senior Creature Technical Director, ILM
I got my role as a Creature TD in visual effects when I applied for the Jedi Masters Apprenticeship program at Lucasfilm Singapore back in 2009. It was an eight-month training course that taught me all the basics for cloth, flesh, hair and rigid body simulations. I was also fortunate to have amazing mentors from ILM San Francisco who guided me and provided invaluable insights throughout the program. Their expertise and support helped me build a solid foundation, and that experience not only kickstarted my passion for VFX but also set me on the path to where I am today. It was an incredible opportunity that shaped my career in ways I couldn’t have imagined.
I think the most challenging aspect is making sure our simulations look as natural as possible while juggling the technical side of asset development. It’s a balancing act to maintain and update those assets throughout the entire timeline of the show and still deliver quality shots within the timeframe. It can be challenging, but it’s also a lot of fun and very rewarding when everything comes together. As a hobbyist garment maker in my spare time and a Clo3D user, it’s exciting to see how our discipline is slowly evolving. The setup for simulated clothes is getting closer to proper sewing patterns, and the level of detail we can achieve now is amazing. Houdini and its simulation solvers are gradually being adopted industry-wide, which is another exciting development. There’s a strong motivation to push for physically accurate simulations and to adopt realistic techniques, which makes this an exciting time to be in this field.
Hayley Williams, Special Effects Supervisor
I have been around the film industry and SFX since I was a child as my father and uncles were/are in the business. I developed a love of SFX as I got older and was always very interested in the mechanical side of things, so I went to college and qualified as a mechanical engineer, then went on to become a project engineer at a company in the Midlands. After gaining an extensive skill base outside of the film industry, I felt it was a good time to use those skills in SFX. I joined my father’s team on Charlie and the Chocolate Factory and built my career in SFX from there.
The most challenging and exhilarating sequence I have supervised is a recent film with Steve McQueen directing, involving a huge water setup and lots of stunt choreography linked to big-scale effects. Another nail-biting effect I have been involved in is launching a military tank – built by SFX – out of a huge carrier down a motorway in Tenerife on Fast & Furious 6. There was only one go at this as the entire front of the carrier was breakaway and the tank would likely be damaged beyond repair once it hit the road, so we tested and prepped a huge amount to give us the best outcome possible.
The world of physical in-camera effects took a hit around 10 years ago, but I believe that the desire for on-set effects is strong again, and directors along with VFX supervisors are keen to have as much physically in-camera as is practical with time and financial constraints. Advances in things like 3D printing are changing how we work all the time and allowing us faster turnarounds and the ability to test more ideas out within better timescales.
John Richardson, VES, Special Effects Designer
My father was an FX man who started in the industry in 1921, so I got into SFX through nepotism. My first film as a supervisor was Duffy in 1968. I had already worked with Bob Parrish, the director, on Casino Royale (1967) the year before. There are too many challenging special effects sequences that I’ve worked on to name them all, but A Bridge Too Far was challenging, as were Lucky Lady and Aliens. Sequence-wise, the Nijmegen River Crossing in A Bridge Too Far and the Bond films I did were very challenging. I’m proud of all 60 years of my career, but I’m most proud of A Bridge Too Far, the Bond movies, Cliffhanger, Aliens and the Harry Potter films.
As special effects shift more into CGI, it is sadly losing reality, which is something I have always strived to put on screen. [Note: Richardson authored the book Making Movie Magic.]
John Richardson, in the red hat, with a safety climber cliffside on Cliffhanger (1993). The majority of the movie was shot in Cortina d’Ampezzo, Italy, in the Dolomite Mountains, which doubled for the Colorado Rockies. (Image courtesy of Carolco Pictures and TriStar/Columbia/Sony)
TAKASHI YAMAZAKI ACHIEVES KAIJU-SIZE SUCCESS
By TREVOR HOGG
When people think about Japanese cinema, Akira Kurosawa and Hayao Miyazaki often get mentioned, but that is not the entire picture, as renowned talent has emerged from younger generations, such as Hirokazu Kore-eda, Mamoru Hosoda, Makoto Shinkai and Takashi Miike. Another name to add to the list is Takashi Yamazaki, who matched a feat previously achieved only by Stanley Kubrick, becoming the second director to win an Academy Award for Best Visual Effects, and in the process reinvigorated a legendary kaiju [giant monster] franchise with Godzilla Minus One. What impressed him most was not being handed the golden statue but getting the opportunity to brush shoulders with his childhood idol. “Receiving the Academy Award for Best Visual Effects was a great honor, but meeting Steven Spielberg at the Nominees Luncheon was perhaps an even more exciting moment,” Yamazaki admits. “It was a chance encounter with the God I had longed for since childhood.”
Previously, Yamazaki had established himself by adapting manga, such as Parasyte and Always: Sunset on Third Street, with the sequel of the latter foreshadowing his feature-film involvement with the King of the Monsters, as Godzilla appears in an imagined scene.
“That scene was a short one, but it was just about as much as we could do with the technology and computing power we had. At that time, it was impossible to complete the visual effects for a two-hour Godzilla film with our capabilities. As time went by, we became able to process amounts of information incomparable to back then, so I thought I could finally create the Godzilla I envisioned and started this project. It was a good decision to wait until this happened and make the Godzilla I envisioned.”
Like the kaiju, manga are a cultural phenomenon. “The best way to succeed as a creator in Japan is to become a manga artist. Therefore, the best talent is concentrated in manga. Furthermore, the ones who survive in the very tough competition are the ones who become known to the most people. There is no reason why the stories told by those at the top of the giant pyramid should not be interesting. Adapting a comic book into a film potentially requires the characters to be the comic book itself, which is difficult,” Yamazaki says.
Images courtesy of Takashi Yamazaki, except where noted.
TOP: Takashi Yamazaki and Stanley Kubrick are the only directors to have ever won an Oscar for Best Visual Effects.
OPPOSITE TOP TO BOTTOM: To help define Godzilla’s look, Yamazaki and the animator spent time developing Godzilla’s walk in Godzilla Minus One. (Image courtesy of Toho Company)
Four-year-old Takashi Yamazaki stands in front of Matsumoto Castle with his family.
Takashi Yamazaki started off as a model maker for Shirogumi in 1986.
Growing up in Matsumoto, Japan, Yamazaki had a childhood fascination with insects and crafts. “I was surrounded by nature, so I collected insects and lizards and observed them. I was also a child who preferred drawing paper to toys and would request 100 sheets of drawing paper as a Christmas present.” Neither of his parents had much to do with the arts. “My father was good at drawing, and I remember that when I asked him to do something, he would do his best to draw me Ultraman or some other character.” A cinematic turning point was getting the opportunity to watch Steven Spielberg’s sci-fi classic Close Encounters of the Third Kind. “What was shocking was the scene where the giant mothership flips over. With the visual effects before this, it took some effort to believe they were real, but this was the first moment when I had the illusion that it was real.”
Yamazaki became part of the Japanese film industry while studying film at Asagaya College of Art and Design. “When I was at
“The science fiction genre is interesting in that it can create things that do not exist in this world. I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child.”
—Takashi Yamazaki, Director, Godzilla Minus One
art school, many expos were being held in Japan, and Shirogumi, which was skilled in creating unique visuals, was producing visuals for many of the pavilions,” Yamazaki explains. “There was a part-time job offer for this, and I was able to join Shirogumi as a result. Visual effects were led by TV commercials, which had relatively large budgets to work with. We were also trying to introduce the techniques we had tried in TV commercials into film. Around the time I made my debut as a director, CG became more readily available. At that time, it was very difficult to scan live-action footage at theatrical quality, so we even built a scanner in-house, converted from an optical printer.” The pathway to becoming a director began when there was a call for pitches within Shirogumi, leading to the production of Juvenile [2000], which revolves around a tween having an extraterrestrial encounter. “The president of the company showed the idea I submitted there to Producer Shuji Abe, who was the president of another company; he liked it and worked hard on it, leading to my debut film.”
Science fiction goes beyond spectacle. “The science fiction genre is interesting in that it can create things that do not exist in this world,” Yamazaki observes. “I also like the fact that it can be used as an allegory with various messages. The biggest reason for my attraction is that it excites my inner child.” With science fiction comes the need to digitally create what does not exist in reality. “I decided to become a director because I wanted to make films with the type of visual effects I wanted to make in the first place. When I made my debut as a visual effects director, most Japanese films didn’t have spaceships or robots in them. I think that having three jobs at the same time is economical because I can judge things quickly and write scripts with the final image in my mind, so there is no loss of time.”

a young boy prays for courage to a great

A major reason that Godzilla Minus One won the Oscar for Best Visual Effects is that there were both practical and digital effects.
Yamazaki has directed 20 feature films. “You never know what will be a hit, so when I have an original story, I only base it on whether it excites me or not. Making a film means you have to live with the original story for a number of years, so if it’s not a good match, it becomes hard to get on with it. I simply ask for good actors to join the cast. I am basically a person who wants to do everything myself. When it comes to the staff, I try to ask for people who are at least more skilled than me, people who have talent that I can respect.”
International markets are rarely taken into consideration when approving film budgets in Japan. “This is because for a long time it was said that Japanese films could not go mainstream even if they were released overseas, and that was probably true,” Yamazaki states. “It was a great surprise that Godzilla Minus One was seen by so many people overseas, and to put it bluntly, it was a joyful experience that opened up new possibilities for future film production in Japan. Hopefully, the budget will reflect that element. I guess we’ll just have to build up our track record and prove that pouring big budgets into it is not a bad option.” Stories scripted and directed by Yamazaki have ranged from siblings trying to learn about their
grandfather who died as a kamikaze pilot in World War II in The Fighter Pilot, to contributing to the Space Battleship Yamato franchise, where an interstellar crew attempt to locate a device to make a devastated Earth habitable again, to a forbidden book that can grant any wish but at the cost of a life-threatening ordeal in Ghost Book. The growing popularity of video games has not altered the essence of storytelling. “Interesting stories are interesting in any media, and the core of stories that can be developed in various media continues to be influenced by stories that have been around for a long time.”
An extremely complex shot to design, create and execute is found in Godzilla Minus One, where a kamikaze pilot has to overcome survivor guilt in order to protect those he loves and Japan from the rampaging title character. “The sea battle between Shinsei Maru, the mine disposal ship, and Godzilla was difficult because we had to combine a live-action small boat with CG waves and a giant Godzilla,” Yamazaki reveals. “The boat in the foreground is live-action, so it was a very time-consuming job to build the waves at a level that would blend in with it. I’m glad it worked out.”
When asked what the essential traits of a successful director are and what has allowed him to have a long career, he responds, “What it takes to be a successful film director is to keep everything interesting all the time, but I am not sure about the career. It would be bad if a film failed, so I think it’s easier to prolong my life if I get the next project off the ground before the next film is released.” Yamazaki is appreciative of his good fortune. “Thanks to the many people around the world who liked Godzilla Minus One, the film has received many wonderful awards. I will continue to make films, treasuring the memories of the days I created with you all. Thank you very much. Arigato.”
Back in the early digital age when Takashi Yamazaki was learning how to create visual effects.
At the age of 10, Takashi Yamazaki ventures to downtown Matsumoto with his sister Satsuki.
The sea battle between the mine disposal ship Shinsei Maru and Godzilla was difficult because CG waves and Godzilla had to be integrated with the practical vessel.
THE PENGUIN: GUNNING FOR THE TOP OF GOTHAM CITY
By TREVOR HOGG
A secondary villain who was the object of an explosive highway pursuit by the Batmobile gets his own Max limited series to bridge the narrative gap between The Batman and its upcoming sequel directed by Matt Reeves. The Penguin consists of eight episodes executive produced by Reeves, Dylan Clark and Lauren LeFranc, who also serves as the showrunner.
OPPOSITE TOP TO BOTTOM: Complicating the daily prosthetic makeup process for Mike Marino was having to flatten, cover and integrate the real hair of Colin Farrell.
Rather than peering down upon Gotham City, the perspective of the show is from the street level.
Bridges, subways, underpasses and overpasses are a visual motif for The Penguin.
Colin Farrell dons the prosthetic makeup and body suit again as an aspiring crime lord nicknamed after an aquatic flightless bird with a distinct waddle. “It’s great to be in the Batman universe and in the version of Gotham City that Matt Reeves created,” states Lauren LeFranc, Executive Producer and Showrunner. “It felt real, less comic book and more crime drama. We took that and ran with it. The film always takes place at night because it’s the Batman and he’s not a guy who comes out in the day. But Oswald “Oz” Cobb does. We tried to embrace that aesthetic, but we were more handheld in how we shot. I always say that Batman is up high looking down on Gotham City. Oz is the guy who’s down in the streets in the muck of the city looking up wanting to rise to power; that allows for us to enter seedier worlds in our show. The French Connection and Taxi Driver were parallels that we talked about a lot. Batman moves and thinks methodically while Oz is unpredictable in his actions, so we tried to have the camerawork and feel of our show mirror more who Oz is rather than Batman.”
Contributing to the difficulty of the production was having lead actor Colin Farrell go from transforming into his character 30 times for the feature film to 100 times for the limited series. “I’ve seen every single frame of this show, have been on set and watched this from every angle,” LeFranc notes. “Colin Farrell can be an actor, as he normally is, with his own face. Oz is a man to me. I know it’s Colin because I talk to him all of the time. But Mike Marino [The Penguin Prosthetic Makeup Designer] and Colin together created a completely new person. Crew members would come on set, especially when we first started shooting, and would stare at him because you kept looking for seams or something to understand how this was a mask, and you couldn’t find it. We have shots that are extremely close up, and the detail on Oz’s face is incredible. Mike always wanted to challenge himself. I would write something and he’d say, ‘Keep writing like that because it’s so hard and I want to see what I can do.’ Oz is naked in the first episode. That is a full body prosthetic, which Mike and his team created. There are little specific hairs on his chest put in place with tweezers. The detail is remarkable.” The Oscar-nominated actor never felt that his performance was impeded. “That’s the alchemy of what Mike Marino does,” observes Colin Farrell, who plays Oswald Cobb. “The mix of what I was doing beneath the makeup and the sense of tragic wrath that Oz carries himself through the world with, it just seemed to work. I was somewhere in there, but the face just moved beautifully. It may be a bit idealistic, but I do think if everyone approaches their work from a place of integrity and purity, and wants to do best by the project, and understands you’re a significant spoke of the project being born to life, then things do work out.”
Five body suits were in constant rotation. “They get sweaty and have to be cleaned and dried, then get the next one on,” remarks Mike Marino. “We tried to plan as much as we can based on experience, but you never know what’s going to happen. There is so much movement, and bubbles that form, and he’s sweating, and the environment. We tried to keep him cool, and we planned for him to stay in a tent that is hardcore air-conditioned; he’s basically wearing a whole snowsuit every day.” The nude suit was a one-off. “It’s blended at the wrists and ankles, and all of the hair that goes on. We didn’t get to go through the finalizing process of a second suit because we got it all within one day and it maintained itself.” The makeup process for the nude suit was even more laborious. “We had pre-painted and pre-done a lot of hair, but the connection of it all had to be done on the day. Colin was in five or six hours of makeup, and the whole day was dedicated to that scene, which became its own entity. We see why he limps. There is a whole leg involved that is strange-looking. Colin is 95% covered in prosthetics during that nude reveal,” Marino adds. Some digital augmentation was required. “I’m working with the visual effects team, personally circling and pointing out unavoidable errors. My eye is going to see things that they won’t see because I’m looking at where things begin and end that may seem seamless to someone else,” he notes.

BOTTOM TWO: Approximately 2,200 visual effects shots were created over a period of months for the eight episodes.
Accenture Song VFX, Pixomondo, ReDefine, Stormborn Studios and Frost FX created approximately 2,200 visual effects shots over a six-month period. “We developed an in-house asset library that was searchable and filterable, and shoehorned it into ShotGrid,” reveals Visual Effects Supervisor Johnny Han. “The vendors could peruse our assets, not just elements like sparks, glass, smoke and blood, but also cityscapes, sunrises, sunsets and looking at Manhattan in 100 different directions. We thought that was key to give the vendors the freedom to find what they thought was useful that perhaps I didn’t even think would be necessary for shots.”
Various elements had to be recreated from The Batman, such as lens flares. “I got on the phone with Dan Lemmon, the Visual Effects Supervisor on The Batman, and said, ‘Tell me how you did this.’ He sent me this great document where he fleshed it all out. I went to the hardware store and bought all kinds of different silicone gels, hand sanitizers, soaps and anything we could stick onto the glass of the matte box of the ALEXA LF camera to smear different patterns. I also bought a bunch of different flashlights, LED and incandescent, to get wet caustic lens flares by shining them through the silicone gel smeared onto the glass in front of the camera. We got some amazing material that we felt even expanded upon what Wētā FX had done in the film because we have a larger variety of scenes,” Han says.
The first shot after the prologue in Episode 101 was digitally altered after it was captured to achieve the desired thematic effect. Han describes, “We’re beginning way up above the ground looking at a classic Gotham skyline as if it was the same one from the film. But hold on; this is not where we are. Let’s come down, and the camera starts descending through the rain, passing train tracks, and now coming into much smaller and rougher-looking buildings that have seen better days, all the way down to almost six inches off the ground. Literally, as low as you can get. Those same skyscrapers that we saw at the start of the shot are almost looking down at us from afar, like the rich looking down at a safe distance. Of course, a moment later, Oz’s Maserati pulls right up to the nose of the camera.” A different perspective was adopted for a flashback sequence in Episode 103 when the Riddler destroys the seawall causing a devastating flood. “In The Batman, we were always far away from the water. We wanted to make this flood feel like Victor Aguilar was at ground zero. He saw the seawall break open, the water rush in and consume all of the cars and people on the street. And to make it a horrific and traumatic experience for him, which is a lot of the basis for his backstory of having lost everything.”
Conveying weight is always hard. “We filled the water with chunks of debris from the concrete seawall and played that up, so the water itself ended up having this inky black depth to it,” Han adds.
Given the nature of the subject matter, a certain element was unavoidable. “It’s a show about gangsters who like to shoot guns often at night or in shadowy places,” Han notes. “The first week of shooting, we had a hero gunshot. I got an old camera flash and a sound trigger that sports photographers use, hooked them together, brought it on set and crossed my fingers. When the blank shot off, the bang of the trigger would sound-trigger the camera flash I had. It was this big ‘Eureka!’ moment of an interactive light that we needed for the show. Visual effects still added a little muzzle flash cloud. We also obtained some technology to phase-synchronize the timing of the flashes with the rolling shutter of the camera so that it would never flash halfway through a frame. I felt from a visual effects perspective, it was a nice little visual stamp for the show, and we could take our gunfire to the next level.” Atmospherics added a touch of life to shots. “We were inserting birds, airplanes, helicopters, trains, cars and digital people to give the scene the right tone of life. We don’t want it to feel too active. It’s just that Gotham City itself is a living, breathing
city, so there has to be some pulse to it at any given time.” The Gotham City envisioned by Matt Reeves had to be honored. “Part of the formula was Gothic architecture and skyscrapers. You might call it Neo-Gothic, where you have buildings that look like they have cathedrals on top of them.”
Genre does not impact the design language. “I find a character that I like and approach it from a character arc,” reveals Production Designer Kalina Ivanov. “I don’t connect with Batman, but I get the Penguin. I get him with a chip on his shoulder for being poor and wanting to become rich. For me, everything that informs the design is that character: what he’s about, his angst and his emotional journey. I design from an emotional state.” The Penguin occurs a couple of weeks after the flood. Ivanov explains, “The story takes place in locations where the water has receded, so what you’re left with is the mud and muck. That’s where Katrina helped because the scale of that disaster was so huge and the way it pushed things around and the way cars were pushed into each other by the water and created these structures of three cars on top of each other. The special effects team created rigged cars to be like that. Because FEMA had started to clean up, we were also showing the dumpsters where people were throwing stuff, but what we needed to create was mud.” Blackgate Prison went through many iterations. Ivanov recalls, “I did this black-and-white sketch of a prison embedded into a rock formation, playing with the idea that it’s the Palisades across from Manhattan. Matt and Lauren thought the design was terrific but not grounded enough in reality. They encouraged me to go and look at more real prisons. It was the time that I realized they’re not approaching this as a comic book. I looked at some Eastern European prisons and Brutalism. There is an idea that I’m going after, but it’s not vertical. It’s horizontal. And it’s on an island. I looked at prisons on islands. I started creating it from there.”
Ivanov loves collaborating with visual effects. “Johnny Han and I had a great relationship from the beginning. We are collaborating on creating some of the sets and translating them into VR sets and putting the Oculus goggles on our director Craig Zobel and have him walk through them,” Ivanov states. “I find that to be useful. We all agreed that in a show where you want the characters to feel like real people, you want to create as much around them for real and have fewer visual effects, [digitally] extend from there and [lean on visual effects] for the things you couldn’t create, like the prison.”
Originally, the project was daunting. “I thought I was supposed to follow the film, which was scary, to be honest. It’s an eight-part series, not a two-hour film. It became clear that Matt, Dylan and Lauren were not looking to duplicate the film but capture the spirit of it. There was another reason we couldn’t duplicate the film. The film was shot in England, and we were shooting in New York. The whole motif of The French Connection and shooting under tracks, subways and passages came from that original discussion. It became a theme,” Ivanov says. Singling out an environment is not easy. “Truly there are so many wonderful sets in this show, and they’re all my children,” she acknowledges. “It’s hard for me to play my favorites. The Penguin is so rich for a designer. It’s a true visual gift.”
CATCHING A RIDE ON THE VFX VENDOR-GO-ROUND
By TREVOR HOGG
TOP: Shows such as Vikings: Valhalla are built on previous collaborations that enabled visual effects supervisors and producers to deliver shots within budget and on schedule. (Image courtesy of Netflix)
OPPOSITE TOP TO BOTTOM: Producers and directors look for vendors and artists whose style and capabilities match the project’s needs, as with The Wheel of Time. (Image courtesy of Prime Video)
Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process for shows like Fallout (Image courtesy of Prime Video)
The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project such as Asteroid Hunters. (Image courtesy of IMAX and Spin VFX)
Compared to the 1990s, when only a few visual effects companies were capable of doing film-quality work, the number of options around the world has exploded, fueled by the ability to achieve photorealism and the growing acceptance of CGI as a filmmaking tool. As a consequence, the success or failure of a project often depends on hiring the right group of supervisors, producers, digital artists and technicians, whether through a single vendor or several aiming at the same goal. In some ways the vendor selection process has remained the same; in others it has grown more sophisticated, reflecting the maturity of a visual effects industry traveling the path already taken by cinematography, editing, production design and costume design on their way to becoming entrenched members of the entertainment establishment.
Vendor connections begin with the production companies, studio visual effects executives or visual effects producers and supervisors. “On the studio side, we break down a script; we are typically the first ones, and we tend to do this before a director is hired,” states Kathy Chasen-Hay, Senior Vice President of Visual Effects at Skydance Productions. “We work closely with a line producer to discuss shoot methodology, then we’ll send the breakdown out to four or five trusted visual effects companies. We pick these vendors based on their specialties, shot count and the size of the budget.” Finding and hiring vendors is a group effort.
“The VFX studio executives work closely with the visual effects teams when picking vendors. Since studio visual effects executives work with tons of vendors, we know and trust certain vendors. Did that vendor deliver on time? Was the work stellar? Did we get change-ordered to death? Relationships are key, and several VFX supervisors have built relationships with vendor supervisors, and it’s important to support these relationships; after all, they are the ones in the trenches, day after day.” Agents are typically not involved. “Relationships are built on past projects. Successful vendors have someone at the top who communicates with studios,
production companies and the visual effects producers. We trust these folks as we have worked with them on prior projects. It’s all about previous projects.”
Established relationships are favored given the difficult nature of delivering finished shots within budget and on time. “Depending on the type of work required in the visual effects breakdown, the visual effects production team would work together with their production company and/or studio to start understanding how many vendors may be needed and which ones have the capacity and capabilities to handle that type of work in the given timeframe,” explains Jason Sperling, Creative Director/VFX Supervisor at Digikore Studios. “This can help narrow the list of possible vendor candidates significantly, and at that point, visual effects production teams begin the specific task of reviewing vendor demos and sample bidding numbers and expressing the creative and logistical expectations. If individual artists are needed for an in-house visual effects production team, they begin to assemble and reach out to their known available freelance crew or other resource lists.”
“The selection process for visual effects teams can vary significantly depending on the structure and needs of a particular project,” states Neishaw Ali, CEO and Executive Producer at Spin VFX. “While sometimes studio preferences might dictate the choice, more commonly, the decision-making is often led by the VFX supervisor and the VFX producer. These key figures play crucial roles due to their expertise and their understanding of the project’s technical and artistic requirements. The visual effects supervisor is primarily responsible for ensuring that all visual effects are seamlessly integrated into the live-action footage and align with the director’s vision. Meanwhile, the visual effects producer manages the budget, scheduling and logistics of the visual effects work, coordinating between the studio and the creative teams. Their collaboration is essential in choosing the right visual effects team[s] that can deliver high-quality results within the constraints of the project’s timeline and budget.”
Scale and budget have an impact on the audition and hiring process. “For independent films, I found there’s more flexibility, while the big studio productions may have predetermined criteria or preferences,” notes Julian Parry, VFX Supervisor and VFX Producer. “Producers and directors typically seek out talent based on their track record and previous work. Artists or visual effects houses with impressive portfolios are usually approached for potential collaborations. It’s not uncommon when vetting a visual effects vendor that the artists are promoted in the pitching. Breakdown reels showcasing previous work and expertise play a significant role in the hiring process. Producers and directors look for visual effects houses or artists whose style and capabilities match the project’s needs and offer a detailed insight into their experience in specific disciplines, such as creating monsters, which can be crucial for achieving desired visual results.”
TOP TO BOTTOM: Scale and budget have an impact on the audition and hiring process for vendors on projects like
Another part of the vendor equation for films like Dungeons & Dragons: Honor Among Thieves is in-house visual effects teams that can consist of a designated vendor or a group of handpicked digital artists. (Image courtesy of Paramount Pictures)
Generally, for VFX Producer Tyler Cordova, one to three major vendors are brought on board during the majority of prep and shoot to develop assets for films like Dungeons & Dragons: Honor Among Thieves (Image courtesy of Paramount Pictures)
Considering the capacity of the vendor to meet deadlines and handle the complexity of the work is the first crucial step in the selection process. “Can the vendor commit to delivering a specific number of shots by a set date, and do they have the necessary
resources to handle the project?” notes Pieter Luypaert, VFX Producer at Dream Machine FX. “Competitive pricing is important, as multiple vendors are bidding for the same work. The vendor’s location also plays a role, as tax incentives can significantly impact cost. Breakdowns are a big part of the bidding process, as they provide the vendors with all the essential information needed to provide a first bid and propose a methodology. Does the vendor believe they can achieve a certain effect with a 2D solution? The chosen methodology can drive the cost and schedule. Lastly, pre-existing work relationships, mutual connections and shared history are important. Due to the interconnected nature of the visual effects industry, personal connections can ultimately be the deciding factor.” Multiple vendors are often used to mitigate risks. “The main vendor typically handles the majority of the work, while the studio’s visual effects production team oversees the coordination among the different vendors. This is becoming more common as the vendor providing virtual production services needs to collaborate closely with other vendors using their assets.”
In many ways, the career trajectories of individuals determine future studio and vendor collaborations. “I was a compositor by trade and knew a lot of the people at Pixomondo who went on to form their own companies such as Crafty Apes,” states Jason Zimmerman, Supervising Producer and VFX Supervisor at CBS. “Me bouncing around working at Zoic Studios and Eden FX, you meet a lot of people along the way and collect those people with whom you resonate. I’ve been fortunate to meet a lot of awesome people along the way who have either started a company or gone to a company. To me, it’s all about the artists, having been one myself. I keep track of all my favorite people, and they have all moved around and done great work at different places.” Not everything is about past relationships. “If someone has great visuals then you’re going to try them out, regardless. Having a reel that has got a good variety is important because you know that they can do more than one type of shot or effect or discipline. And how does it look to your eye? Do you agree with the integration and lighting? All of those shots were driven by a supervisor, studio, timelines and budget. You take it for what it is, and every decision made was not only one person because there are a lot of people who go into making a visual effect shot work.”
Setting up a visual effects company has become more economical. “The technology is at a point where if you’re an independent artist, you can buy the software and render it on the cloud,” notes Jonathan Bronfman, CEO at MARZ. “You don’t need infrastructure. But it has been that way for a while. It’s quite homogenous. Everyone is using the same tech stack. We have artists who have worked at Wētā FX and vice versa. What is that differentiator? Which is why we ended up developing the AI. That’s true differentiation, if you can nail the tech. Outside of the AI that we’re developing, we’re very much a creature/character shop. We still do environments because creatures and characters need to live in environments. There are other companies like Zoic Studios which are television-focused. But if you go to Wētā FX or ILM, they do everything.” Everything stems from reliability. “Word of mouth is the result of doing a good job and executing at a high level. You have to produce good quality on time and budget. If you can do those things then it spreads.” Certain stakeholders have to be impressed. “You have the visual effects supervisor, visual effects
producer, production company and studio. If you have all three levels of stakeholders, that is ideal. But ultimately, it is the visual effects supervisor who gets the final say.”
Conversing with potential vendors actually commences before the studio assembles a visual effects team. “I will get a look at the scripts early, know what type of work it is, and I can reach out to my counterparts at some of those vendors,” explains Todd Isroelit, Senior Vice President, Visual Effects Production at 20th Century Studios. “I’d say, ‘We have project ‘X,’ which has a creature/character or water effects simulation. Here is the rough schedule that we’re looking at.’ It’s important to plant a flag in some of these vendors so your project is on their radar as they’re talking to all of the other studios and filmmakers about other things that might be happening in a similar timeframe and looking for similar resources. As we start to identify the team or the candidates for the team, we’ll look at what projects they’ve done and what relationships they have. Sometimes, we’ll look at actually creating a package scenario where we are talking to a vendor and vendor supervisor.” The proliferation of visual effects has led to more agent representation. “In the early days, all of the visual effects supervisors were tied to facilities like ILM, Digital Domain and Sony. There wasn’t this big freelance pool. As the industry grew and people started moving around, it became this emerging piece of the business that gave the supervisor a head of department status that fits into that below-the-line approach to filmmaking where you are looking at DPs and costume designers. Visual effects supervisors started having a bigger stake and voice in how the projects were coming together. That’s when I saw people getting agents started to evolve, even to the point where big below-the-line talent agencies who represent DPs, editors and costume designers started realizing the same thing.” Agent representation is not as significant for the vendors as a point of contact for the studios.
“Executive producers or business development executives at the vendors; those are the relationships that we have,” Isroelit says.
Another part of the vendor equation is in-house visual effects teams that can consist of a designated vendor or a group of handpicked digital artists. “In my experience, an in-house team usually comes in closer to the end of post-production to do easier, mostly non-CG shots,” remarks VFX Producer Tyler Cordova. “Typically, opticals, re-times, split screens and simple paint-outs, things of that nature. It’s important because it’s a cost-effective solution to have a small team do simpler shots after the edit has settled. I’ve hired in-house artists on past shows through contacts I’ve worked with for years and years. In-house artists will suggest other artists they’ve worked with as well. There are some legendary in-house artists that a lot of visual effects producers know about – Johnny Wilson, John Stewart, looking at you! – though some studios and producers prefer going to a vendor instead of using in-house artists to give some accountability to a company performing efficiently, rather than relying on individual artists to manage themselves; it depends. In-house teams are rarer these days since COVID-19 hit, and a lot of productions seem to be hiring smaller in-house-type vendors rather than individual artists so they can do the work securely and efficiently while working remotely.”
TOP TO BOTTOM: Rather than hire agents, vendors tend to have a top executive communicating with production companies and studios to work on series such as Foundation (Image courtesy of Skydance Television and AppleTV+)
Having a reel that has a good variety is important because it demonstrates the ability to do more than one type of shot, effect or discipline when attempting to work on series such as Star Trek: Discovery. (Image courtesy of Paramount+)
Conversations with potential vendors actually commence before the studio assembles a visual effects team, reveals 20th Century Studios’ Todd Isroelit, who worked on Prometheus (Image courtesy of Twentieth Century Fox)
ARTISTIC ALCHEMY: THE PERSONAL CREATIONS OF VFX TALENT
By TREVOR HOGG
In essence, an alchemist can transform matter into something else, which, oddly enough, describes the creative process by which pieces of paper, canvas, clay or a blank computer screen are turned into works of art, combining and applying different materials guided by human imagination. In the world of visual effects, digital toolsets reign supreme, but that does not mean the traditional mediums of oil paint, pottery and watercolor have been cast aside. Outside of office hours, private collections are being assembled that, in some cases, have entered the public consciousness through art exhibitions and published children’s books. To showcase the sheer breadth of artistic ingenuity, we have selected seven individuals, each of whom demonstrates a unique perspective and talent, and whose work we have the privilege to share with you.
Liz Bernard, Senior Animation Supervisor, Digital Domain
OPPOSITE TOP: Parker Ridge (Image courtesy of Zoe Cranley)
OPPOSITE BOTTOM: Venetian Caprice (Image courtesy of Andrew Whitehurst)
Art has been a part of the life of Liz Bernard ever since her graphic designer parents placed an X-Acto knife in her hands as a child. The creative inclinations have culminated in a career that has seen her animate the Alien Queen in Ender’s Game, video game characters in Free Guy and a lawyer with gamma-ray issues for She-Hulk: Attorney at Law. A major source of inspiration is a deep love for
nature, which Bernard draws upon when producing each piece of ceramic, either through the art of wheel throwing or utilizing flaming trash cans.
“I took a day off because there is a workshop that only happens a couple of times per year at a local community arts center where you can do an alternative firing called Raku, which originated in Japan,” Bernard explains. “The idea is that you fire things in a kiln. While they’re still yellow hot, you open the kiln up, reach in with tongs and quickly take your pottery over to a prepared nest of newspaper situated in a sandpit; it instantly catches on fire, and you up-end a miniature metal trash can, which has even more combustibles, over your piece to create a reduction atmosphere. You get these crazy metallic reds and coppers, beautiful colors that are hard to achieve with other firing techniques. It’s an unpredictable, chaotic, elemental experience.”
“I find that my animation background influences me heavily because I’m always wanting to find an interesting pose for something,” Bernard notes. “You can do a straight-on profile of an eagle or find something that has more personality to it. I love finding those personalities in animals, and I try to put that into my work.” There is a lot of experimentation. “One of my favorite things to do right now is called Sgraffito, where I formed a piece of clay into a
bowl, painted the entire interior surface in black and scraped away the lighter parts. What I’ve been doing with these particular pieces is to begin with a footprint of a local animal, like a heron, and then use the negative space to start drawing in random shapes.” A different aspect of the brain gets creatively stimulated. “The reason I like this so much is because it’s so tactile and real. The images we make in the computer, you can’t interact with using your hands. This is a nice counterpoint to what I do daily.” Visit: www.lizupclose.com
Zoe Cranley, Head of CG, beloFX
Major franchises such as Jason Bourne, MonsterVerse, The Hunger Games, Wonder Woman and Mission: Impossible appear on the CV of Zoe Cranley, who has transitioned from digital artist to CG supervisor to a more managerial role. Throughout all of this, her passion for oil painting has remained, leading to an exhibition at the Seymour Art Gallery in Vancouver showcasing landscapes transformed into geometric shapes and blurred lines.
“It’s being in them,” Cranley observes. “You can paint or draw anything you want. I used to do a lot of still life and flowers, which look pretty, but they don’t mean anything to you. Landscapes are so epic, and generally most of the paintings I’ve done, I’ve been there, so I’m drawn back to them and can remember that exact moment. Being in Vancouver, beautiful landscapes are abundant wherever you go.” Unlike visual effects, the goal is not to achieve photorealism. “When you look at a picture that is real, I don’t have that desire to keep looking at it because you go, ‘Oh, yeah. That’s
what it looks like.’ I love it when people recognize not instantly what it is, but then have an attachment. I feel like I’ve done what I had set out to do, which is to capture the essence of that place in an abstract way.”
“I’ve been using oils for at least 20 years and won’t go back to anything,” states Cranley, who is not a fan of digital painting. “There is something so magical about putting a paintbrush to a canvas. I like that it takes so long to dry and is so malleable for so long. You can do so many different things to it based on the stage of drying. Also, I like the science of the various solvents that you can use. So much of it is the fundamentals of design, color, negative space and composition. Generally, the meaning to me is what makes a nice picture.” The quality of the work and brushstrokes have improved. “I’ve gotten a lot more critical and precise. The edges are neater and I have learned to varnish properly. I have refined the process. A lot of people have said that I’ve gotten more abstract. Last year, I learned how to digitize everything, which was a whole process in itself.” Visit: https://www.zoecranley.art
Sigri de Vries, Compositor, Important Looking Pirates
There is no shortage of high-profile projects to work on, whether it be Shōgun, Avatar: The Last Airbender, Ahsoka or Foundation, but this has not stopped Sigri de Vries from embarking on a journey to discover her medium of choice. Along the way, she was hired to create the cover and 12 illustrations for the children’s book Balduin, die Entdecker-Fledermaus by Bianca Engel. “I was
expecting more kickbacks and having to re-do things and such, instead I was given a lot of freedom with how I wanted to do the illustrations and what parts of the story I wanted to paint,” de Vries states. “I started with a few sketches of the various characters, and once I got the green light on those, the rest of the illustrations were smooth sailing.”
Experimentation is the only constant. “I always start with a sketch,” de Vries remarks. “I erase the sketch so you can almost not see the lines and then do the watercolor and a pen on top. I found that to be what I like aesthetically, but I’m still at the beginning of this journey where I’m experimenting a lot and looking at YouTube videos for inspiration and techniques. I follow a number of artists on the Internet and want to do what they do. I want to try everything. I’ve done watercolor, clay sculpting, digital art, acrylic and ink. It’s my hobby, so I’m just having fun!” Initially, the plan was to learn digital painting to do concept art. “I did a lot of landscapes and played around with compositions. I also did a lot of matte paintings at work, but matte paintings are more photo collaging than painting. As my journey progressed, I got interested in characters and creating them in a cute illustrative style.”
“When I finally had enough money to buy an iPad, I switched from Photoshop to Procreate,” de Vries states. “Since then, I’ve been painting so much more. Procreate is so easy and intuitive, and I can paint and draw directly on the screen, which I love. What a lot of artists do is paint with an actual brush on paper, scan that and use it as a texture for a brush in Procreate. My next big project is a scanner/printer so I can do that stuff as well because it sounds fun to make your own brushes.”
Visit: https://www.artstation.com/sigri
Mariana Gorbea, Senior Modeller, Scanline VFX
Modeling creatures and characters is something that Mariana Gorbea does on a daily basis for Scanline on projects such as Game of Thrones, X-Men: Dark Phoenix and Terminator: Dark Fate, but that all occurs within the digital realm. This expertise has also been taken into the physical world, where clay is literally shaped and transformed into figments of her imagination. “I started with ZBrush and then moved onto clay,” Gorbea states. “The biggest difference is that you have to be mindful of what you’re doing with clay because if you mess up, those hours cannot be taken back.” Lessons have been learned from working with clay. “It has made me observe more of the whole picture, to be more careful with details, composition and how a sculpture looks from all angles; that has helped me to make better sculptures in ZBrush. The tools I use with clay, I try to replicate in ZBrush and vice versa.”
BOTTOM: Deathly Silence (Image courtesy of Mariana Gorbea)
Gorbea is attracted to crafting fantastical creatures. “Mexican culture is fascinated with death, and some artists can turn dark things into something beautiful. I’m drawn to that, and that’s why I try to sculpt creatures and characters.” Designs are simplified for clay. “Building armatures is the hardest and trickiest part with clay. It has to be able to stand. You have to be familiar with skeletons. For example, if I’m making a horse, I’m looking at horse anatomy, how the bones are built and proportions. I build the armature first because if that is not done properly, it’s not going to work.”
Three types of clay are utilized: oil-based, polymer and water-based. “All of them are quite different, so I have to think about how I’m going to make a structure and base for it,” Gorbea remarks. “Water-based clay dries quickly, and I use it to make bigger sculptures that have fewer details. With polymer or oil-based clay, you get to spend more time with it and put in more detail; I use them for smaller sculptures. The sculptures are usually made of several pieces, and I create layers of detail.” Depending on the size, a sculpture can take five to 10 hours. “The hardest part of making a sculpture is to give it personality and convey emotion. If the face, especially the eyes, don’t work, then the sculpture is not going to work.” Visit: https://www.instagram.com/margo_sculptures/
Adam Howard, Visual Effects Supervisor
Interwoven with the space exploration universe envisioned by Gene Roddenberry, Adam Howard has been lauded with Primetime Emmys for Star Trek: Voyager and Star Trek: The Next Generation as well as nominations for Star Trek: Deep Space Nine and Star Trek: Enterprise. However, his artistic eye has gone beyond the realm of Federation and Klingon starships as he paints with light to produce character studies of friends, colleagues and celebrities.
“The human face is a never-ending source of wonderful detail and surprise,” Howard explains. “Based on a photograph, I start with a detailed pencil outline that determines the overall shape of the face. Within that outline, I also mark out areas for shadow and highlights. I paint masks for each major area: face, eyes, ears,
neck, hair, beards and clothing. Once each area has a clean mask, then I start the actual painting. First come base colors and areas of shadow and highlight, followed by middle-ground detail and eventually finer detail. I paint in digital oils because I love being able to blend my paint to help give subtle form to each area. I also love the fact that by painting on my iPad, I can paint anywhere. I am not restricted to a physical studio or materials.”
Howard begins each portrait by painting the eyes. “Eyes truly are the window to the soul, and I try to capture the real essence of each subject by painting the fine detail and shape of eyes. Sometimes, it can be a really tiny detail like a single point of highlight on an eyelid that makes the person feel real. I love those moments when the face pops off the page at me as the person I am painting. Depending on the portrait, I sometimes work in additional detail over the final painting from the original pencil outline. This can assist in deepening lines around the eyes and accentuating hair detail. I used to do colored pencil and ink portraits on a plain white background.” The backgrounds have become more detailed. “This plays a big part in portraits, like my paintings of Ve Neill and Steven Spielberg, where so many films they have created are represented in the background. Sometimes, the backgrounds take longer to paint than the person.” Visit: www.adamhoward.art
Marc Simonetti, Art Director, DNEG
Initially trained as an engineer, Marc Simonetti decided to become a concept artist and has made contributions to Valerian and the City of a Thousand Planets, Aladdin and Transformers: Rise of the Beasts. He has also illustrated Discworld by Terry Pratchett, Royal Assassin (The Farseer Trilogy) by Robin Hobb and The Name of the Wind by Patrick Rothfuss. “When I started my career, the only job available was for book covers in sci-fi or fantasy,” Simonetti notes. “I grew up with that trend. Maybe I would have had a completely different style if I had tried fine art first. But that’s life.”
“Sometimes, I start with watercolors or pastels, but that is rare because we have to be fast,” Simonetti remarks. “The only thing that I try to do all of the time is to change my process because I need to have fresh options. If I stick to something then my picture will always look the same. At the same time, it’s trying to be as honest as possible. Most of the time, I start with pencil and paper because it’s the easiest one. Once the composition is set in my mind, there is an app, Substance 3D Modeler, that allows you to sculpt in VR, which is a lot like ZBrush. I use my Meta Quest headset to scout everything. I can put lighting on the model and find different cameras. I also can create a library by sculpting a tower or window that are used later on. Once I have the model, I can use Octane, Cinema 4D, Blender or Unreal Engine. Then I render and paint it in Procreate or Photoshop.”
Sketches are conceptualized without references. “I want to be as free as possible to set up a good composition,” Simonetti states. “However, when I need to fill the composition with elements, I try to have lots of references whether it’s architecture or anatomy. Everything has to be grounded. Even when I’m making an alien, it has to be believable. Same thing with architecture. I want people to connect with it. If you don’t have any reference for
the scale, it takes people out of the painting.” Lighting is critical. “When I’m using 3D, it’s a huge help. I’m trying so many different lighting scenarios to fit the correct mood and to be as impactful as possible.” Visit: https://art.marcsimonetti.com/
Andrew Whitehurst, VFX Supervisor, ILM
Given that Andrew Whitehurst studied Fine Arts before becoming an Oscar-winner for Ex Machina, his belief that music, pictures, lunch and ball sports are the greatest achievements of humanity is not entirely surprising. The enjoyment of studying faces and drawing caricatures has come in handy. “If I know that we’re doing a digital face for someone, literally the first thing that I will do is type ‘an actor’s name + caricatures’ and search the Internet,” Whitehurst reveals. “If there are loads of good caricatures then it’s going to be an easier job because something is capturable about their likeness. If there aren’t that many good caricatures then it’s going to be much harder. There aren’t many good caricatures of Harrison Ford, and it was hard.”
“There is an interplay between the way that I paint and what I understand about the world, which I have gleaned from doing visual effects for a long time,” Whitehurst notes. “I’m always trying to make something psychologically interesting. I love abstract art, but I’m not good at doing it. I started doing a lot of landscape paintings, and I discovered what painting is to me; it’s a way for me to meditatively spend time somewhere I find special or engaging in some way; and to have the opportunity to think about it, enjoy it, and try to capture something of it, but in a reflective way.
“If I’m going on location or holiday, I have a sketchbook with me,” Whitehurst remarks. “I will do black-and-white pen-and-ink drawings. Some of them I will scan and add color in Procreate later if I feel like it. The drawings tend to be a more immediate reaction to a place and have more of a comic book style because that is generally how I draw. I like to exaggerate and use a little bit of imagination.” The paintings consist of casein on wooden panels. “Casein has the advantage over gouache because when it’s properly dry, it doesn’t reactivate as easily, so you can paint over the top of it, and it’s slightly creamier in texture, so it’s a little bit like oil paint but is water soluble and dries quickly. I would paint in oil but for the fact I can’t have my house stinking of turpentine!”
Contact: @andrewrjw on Cara and Instagram.
VFX IN ASIA: BOOM TIME
By CHRIS McGOWAN
TOP: Mysterious creatures fall from space, prey on humans and use them as hosts in Parasyte: The Grey. The Netflix live-action sci-fi/horror series is based on the Hitoshi Iwaaki manga Parasyte. Dexter Studios provided the VFX. (Image courtesy of Dexter Studios and Netflix)
OPPOSITE TOP TO BOTTOM: South Korea’s Gulliver Studios handled the VFX for the Emmy-winning suspense/horror/survivalist series Squid Game. Season 2 is scheduled for December release. (Image courtesy of Gulliver Studios and Netflix)
A volcano erupts on the China/North Korea border and a special team tries to save lives and prevent further eruptions in Ashfall. (Image courtesy of CJ Entertainment and Netflix)
Parasite, directed by Bong Joon-ho, won South Korea’s first Academy Award for Best Picture in 2020. Dexter Studios provided the VFX for the class-conscious black comedy, most of it invisible. (Image courtesy of Dexter Studios and CJ Entertainment)
“The Asian VFX industry is experiencing a meteoric rise driven by a confluence of powerful forces,” says Merzin Tavaria, Co-Founder and President, Global Production and Operations for DNEG. “The region possesses a vast pool of highly skilled and technically adept VFX artists, a critical foundation for producing top-tier visual effects.”
Jay Seung Jaegal, VFX Supervisor for Seoul-based Dexter Studios, adds, “I believe that the Asian region will become a new core base for the content and VFX industries in the future. As Asian VFX studios increasingly participate in global projects, their presence is expanding. Although they have already proven significant competitiveness and potential, I think there is still immense room for growth.”
Asia is playing an evolving role in shaping the global VFX ecosystem. Key regions and cities driving the growth of the Asian VFX industry include India, South Korea, Japan, China, Taiwan and Singapore, with Bangkok and Vietnam beginning to gain traction. Homegrown VFX studios like Dexter are on the rise, and multinational firms with VFX branches in Asia include DNEG, BOT VFX, Framestore, ILM, Digital Domain, Rotomaker Studios, Mackevision (Accenture Song), The Third Floor, Tau Films, Method Studios, MPC and Outpost VFX.
South Korea has risen to become one of the most important Asian VFX hubs, and the trajectory of Dexter, founded in 2012, is one of the most impressive in South Korea. Jaegal says, “As of the first half of 2024, the company has grown into a group with six subsidiaries connected. Dexter’s headquarters alone employs
about 330 people, including around 200 VFX artists. Currently, Dexter Studios is active as a comprehensive content studio with an all-in-one system covering content planning, development, production, post-production, DI and sound. We are also expanding our business areas to new technology fields such as immersive content, AR/VR/XR and the metaverse.”
Along the way, Dexter has provided visual effects for several noteworthy films, including Parasite (2019), which captured four Academy Awards, including Best Picture. Jaegal comments, “Parasite is a significant film in Korean cinema history as it was the first [Korean] film to win an Academy Award. It marked a pivotal moment that showcased the excellence and prestige of Korean films to the world. Notably, Parasite is famous for its invisible VFX. Many people think that little VFX was used, but in reality, much of it was created after filming, including the two-story house of Mr. Park, the driving scenes and the neighborhood where Ki-Taek’s family lives. Our company designed the VFX and DI [Digital Intermediate], and our subsidiary, Livetone, handled the sound, making us an all-rounder in post-production.”
Dexter also provided VFX for Space Sweepers (2021), “which holds a special meaning as a Korean-style SF [sci-fi] film,” Jaegal explains. “It successfully [put together] a unique story involving space, the spaceship Victory and robots, which had not been commonly attempted in Korea at that time. We also handled all three post-production parts of this film. I think it redefined the standards for the space/SF genre that can be produced in Korea. Based on this experience, we [went on] to handle KSF-VFX for
Netflix’s JUNG_E, the Alienoid series, and The Moon.” Recently, Dexter has worked on Knights of the Zodiac [with DNEG], YuYu Hakusho with Scanline VFX and Japan’s Digital Frontier, Gyeongseong Creature for Netflix and Parasyte: The Grey.
Gulliver Studios handled the VFX for Squid Game, winner of the 2022 Primetime Emmy Award for Outstanding Special Visual Effects in a Single Episode, which was among the six total Emmys garnered by the series. Squid Game VFX Supervisor Cheong Jai-hoon notes, “After Squid Game was released on Netflix, it was gratifying and meaningful to see that viewers worldwide loved it, especially considering that they couldn’t tell which parts were created with VFX.”
Gulliver Studios is a VFX company (also called C-Jes Gulliver Studios) established in 2019 in Goyang by C-Jes Studio. The latter “manages actors, singers, and K-pop artists, and is involved in the
planning and production of movies, dramas, and musicals,” notes the firm. At the end of 2022, Gulliver Studios and C-Jes Studio merged to become a comprehensive content group that extends its scope from planning and producing theatrical films and OTT [Over-The-Top] content to post-production VFX.
Looking at the growth of VFX in South Korea, Jai-hoon explains, “Around 2015, there was a notable increase in the production of large-scale fantasy and action films within China, yet there weren’t many proficient VFX companies in China at the time. As a result, the majority of VFX work during that period was handled by Korean companies. As Korean VFX companies gained significant experience through working on various Chinese films, it led to substantial growth in the Korean VFX industry.”
As the volume of work in Korea increased exponentially, “Korean VFX companies established satellite companies in countries like Vietnam and China, where labor costs were lower, and they also outsourced a significant portion of their work to India and Ukraine. As a result, the VFX industry in Asia experienced growth during this period,” Jai-hoon remarks. “By the late 2010s, the Chinese film industry faced a slowdown, which also halted the growth of the Korean VFX market. However, in the 2020s, the production of Asian content by platforms like Netflix and Disney+ revitalized the industry. Successes such as Squid Game and [prior to that] Bong Joon-ho’s Parasite [also] energized the global OTT production scene in Asia.”
Jai-hoon adds, “Recently, there have been talks about Netflix increasing its investment in Korean content production and, following Disney+, even Amazon Prime is outsourcing a lot of work to Korean VFX companies. This signifies that the level of Korean VFX has already been recognized worldwide. Additionally, some global VFX companies like Scanline, Eyeline Studios and The Mill
have recently entered the Korean market, gradually increasing their investment in Korean artists’ potential. As a result, existing Korean VFX companies are building pipelines according to Hollywood’s VFX pipeline and standard production processes, different from the Korean system. Also, Korean artists with experience from abroad are gradually returning to Korean VFX companies.”
Westworld VFX in Goyang, Korea, was established in 2019 and has about 200 employees. Westworld handled the VFX for the Netflix sci-fi series The Silent Sea, the first large-scale project in Korea to use virtual production and LED walls. Asked about Asia’s VFX growth, Managing Director Koni Jung responds, “It’s difficult to say exactly, but the growth of young artists and the entry of global OTT platforms into Asia seem to be factors driving growth. [And] as Korean films and series achieve global success, an increasing number of overseas projects are being filmed and post-produced in Korea. Honestly, isn’t it because it costs less than the North American market?”
Wooyoung Kim, Director of Global Business at Seoul-based VA Corporation, comments, “As the investment of OTT platforms in the Asian market expanded during the pandemic, the budget for content rose significantly, and many content projects [were] planned that [could] expand expression in a technical direction. This led to successful outcomes for VFX companies in each country, allowing them to showcase the technical skills that they may not have had in their home markets.” VA successfully launched a Netflix series called Goodbye Earth, participated in Netflix Japan’s project YuYu Hakusho in 2023 and is working on the movie New Generation War: Reawakened Man.
In India, DNEG has teams of thousands of talented artists spread across 10 locations (including Chennai, Bangalore, Mumbai and Chandigarh), encompassing both DNEG and ReDefine studios, according to DNEG’s Tavaria. “This strategic network allows for seamless collaboration with our Western counterparts on every DNEG and ReDefine film and episodic VFX and animation project. We’re incredibly proud of the vital role that India plays in DNEG’s global success.” Tavaria continues, “Our talented Indian teams play a pivotal role in all our top-tier international projects, from feature films to episodic television series. Just to name a few, our Indian teams have recently brought their magic to Dune: Part Two, Furiosa: A Mad Max Saga and The Garfield Movie, among others, showcasing their versatility across genres. Their expertise has also been instrumental in projects like Oppenheimer, NYAD, Masters of the Air, Ghostbusters: Frozen Empire, Godzilla x Kong: The New Empire, Borderlands and many others.”
Tavaria notes, “Many Asian governments are actively nurturing the industry’s growth. Take India’s AVGC Promotion Task Force, for example. This initiative recognizes the significant contribution VFX makes to the economy and aims to propel India further onto the global stage. By establishing a national curriculum focused on VFX skills development, the Task Force paves the way for India to produce even more world-class content.”
Larger Asian studios are staying ahead of the curve by rapidly
Extraordinary Attorney Woo follows an autistic lawyer with a photographic memory and a great love of whales. Westworld VFX contributed effects to the Netflix series. (Image courtesy of Westworld VFX and Netflix)
South Korea’s Dexter Studios co-produced and helmed the VFX for Jung_E, a near-future post-apocalyptic story about a warring faction on a desolate Earth that attempts to clone the brain of a legendary warrior to develop an AI mercenary and stop a civil war.
(Image courtesy of Dexter Studios and Netflix)
TOP TO BOTTOM: Space Sweepers was South Korea’s first big-scale sci-fi film, proving they could handle the genre. (Image courtesy of Dexter Studios and Netflix)
DNEG’s Indian teams worked on VFX for Ghostbusters: Frozen Empire. (Image courtesy of DNEG and Columbia Pictures/Sony)
Westworld VFX contributed effects to the Netflix series Black Knight, a South Korean television series based on a webtoon, a digital comic book read on smartphones in Korea. (Image courtesy of Westworld VFX and Netflix)
embracing cutting-edge technologies. This ensures their VFX offerings and capabilities remain at the forefront of the global landscape. Tavaria says, “This confluence of a skilled workforce and a commitment to technological innovation has solidified Asia’s position as a major player in the ever-evolving world of VFX.”
About DNEG, Tavaria comments, “We’re continually working hard to refine our global pipeline to open the door to a new era of creative collaboration across our locations. This allows our Western studios and Asian teams to work seamlessly together to push the boundaries of what’s possible in VFX.”
Streaming has been another factor in the Asian VFX rise. Tavaria explains, “The rise of streaming, along with a flourishing domestic film market, has fueled a surge in high-quality content, presenting a thrilling opportunity for the Asian VFX industry. This explosion of content demands ever more exceptional visual effects for Asian audiences that are hungry for stories that reflect their own cultures and aesthetics.”
BOT VFX has four locations in India (Chennai, Coimbatore, Pune and Hyderabad) and one in Atlanta. “Our total team size is 800,” says BOT VFX CEO and Founder Hitesh Shah. The firm has been working on many high-profile projects, including Kingdom of the Planet of the Apes, Fallout, 3 Body Problem, Shogun, Monarch: Legacy of Monsters, Knuckles, Civil War, The Boys 4, Furiosa: A Mad Max Saga and Bad Boys: Ride or Die.
The size of the talent pool has been growing significantly in India thanks to nearly two decades of RPM [roto, paint and matchmove] work that motivated many new entrants to join the industry. According to Shah, “What was a pool of several hundred artists in 2005 is in the tens of thousands today. Also, there is now a large base of training institutions that continually feed new talent into the ecosystem. From the large pool of talent, a portion has had the skills and the ambition to fill highly specialized and technical roles required to build full-service facilities.”
About the move to provide full-service VFX for Western clients, Shah comments, “That shift is from three segments. First, those facilities that have been historically providing full-service VFX to the Indian domestic market are turning part of their attention to Western productions. Second, independent facilities that have historically been point-services providers – for example, RPM: roto, paint, matchmove – are shifting towards full-service VFX. Finally, even large multinational VFX companies that have set up a footprint in India initially for point-services support [are] leveraging more Indian talent towards the full VFX value chain.”
Shah states, “For India specifically, the growth of the VFX industry is driven by the strong value proposition it offers to Western productions in the form of three compelling components: strong cost advantage, a large talent pool and a broadly English-speaking talent pool that has an affinity with Western content.”
He adds, “Despite strong tax incentives in other global regions and trends toward rising talent costs in India, the overall cost advantage of India is still compelling. It seems most Western productions implicitly, or sometimes explicitly, expect their VFX vendors to bake in the lower costs of getting RPM work done in India into their bids. Finally, the affinity to Western content and English has had a subtle
but notable impact on VFX growth in India. Many young artists are bi-cultural and equally motivated to work on Western content as they are in working on Indian domestic films. There is a swifter cross-pollination and travel between India-based artists and team members in Canada, the U.S., the U.K. and Australia.”
In addition, VFX has gained prominence in Indian content, streaming and theatrical. “The availability and affordability of VFX in the content creator’s toolbox have opened up whole new genres and the ability to tell epic Indian tales that were out of budget reach previously,” Shah says. “Keep in mind that India is not just one monolithic content market, but multiple mature regional markets with their own vibrant histories of storytelling, all of whom have taken a fondness towards what VFX can enable. The fact that the Indian VFX market is well poised to serve both Western content, as well as the expanding Indian domestic content, gives it a firm footing in the global VFX ecosystem.”
Peter Riel, Owner and CEO of Basecamp VFX, a small studio founded in 1996 and based in Kuala Lumpur, Malaysia, argues that it’s important to understand how the SEA (Southeast Asian) market works. Riel says, “Each nation here is quite sovereign both in terms of language and culture. While it’s easy to quickly take a glance and see the various countries’ similarities, it’s a mistake to think they all share the same cultural sentiments that perhaps the Western world does to films made in the U.S. As an example, one would expect Malaysian movies to be popular in Indonesia and vice-versa, due to their similar language and culture, but that’s far from the case. I do still think there is tremendous value to be found in SEA VFX. The artists here are extremely dedicated and are used to working fast and efficiently.”
Kraken Kollective CEO Dayne Cowan has over 30 years of experience in the VFX business, worked with DNEG and other major VFX firms, managed VHQ Media’s feature film divisions in Singapore, Kuala Lumpur and Beijing, and worked for Amazon Studios South East Asia as the Post Production Lead for Local Original Movies. Earlier this year, Cowan founded a new VFX firm in Singapore. “Kraken Kollective is a next-generation post-production and visual effects management company,” he says.
“It leverages the cost and skill benefits of Asia for foreign productions that would like to have the work done here but are unfamiliar with business environments, cultures and capabilities locally. Asia is a massive diverse region with so many countries. Working here may appear straightforward, but there are many unseen challenges that we help to navigate. The skill of the talent [here] continues to grow and develop, almost at an exponential rate. When combined with technology advancements like generative AI and the sheer size of the talent pool here, it represents a serious value add.”
Cowan comments, “Parts of Asia have long been known for handling the entry-level work like roto, matchmove, paint work. As new technology changes the shape of things, I am seeing smaller companies emerge with stronger, specialized skill sets.
I think people forget that nearly 4.5 billion people live in Asia, and with a rapidly developing talent base, it will play a huge role going forward in production and post-production. Broadly speaking, I believe we are looking at a boom time for Asian VFX.”
Dexter Studios handled the VFX for Space Sweepers, a 2021 South Korean space western film regarded as the first Korean space blockbuster.
Dexter Studios supplied VFX fuel for The Moon, a 2023 South Korean space survival drama film written, co-produced and directed by Kim Yong-hwa. (Image courtesy of Dexter Studios and CJ ENM Studios)
ACROSS VFX GENERATIONS: WITNESSES TO THE EVOLUTION
By CHRIS McGOWAN
How has the VFX industry changed over time? Veteran artists who have witnessed the evolution firsthand discuss milestone trends, innovations and changes they have experienced during their careers – and select a film or series that has served as a landmark in VFX or their own professional journey.
Rob Bredow, Senior Vice President, Creative Innovation, Lucasfilm & Chief Creative Officer, ILM
The visual effects industry is going through a substantial period of change right now. It’s useful to look back at the last time we had that much transition happening, which was, if I can be dramatic, back to the age of the dinosaurs. Jurassic Park (1993) was significant in its use of CG to bring living, breathing characters to the screen, and it changed everything about the way visual effects were created. Just prior to Jurassic Park, major advances were made in the application of computer graphics on Terminator 2: Judgment Day (1991). Over the next several years, most VFX transitioned from models and optical effects to digital. I’m fortunate to get to work with many people today at Industrial Light & Magic who made that transition from the model era to the digital era, and to see how their skills and experience continue to advance the industry, contributing to some of the latest developments we are seeing today.
The movie that had the biggest impact on me personally was
Star Wars: Episode V – The Empire Strikes Back (1980). There were many moments that stood out to me, but none more than the opening scene of the AT-AT walkers attacking the rebel base on Hoth. One of the most memorable images from that film is a BTS photograph of Phil Tippett popping up through a trap door in the miniature Hoth set to pose the AT-AT for the next frame – which taught me then that there were real artists who got to make these films.
Phil Tippett, VES, Founder, Tippett Studio
The advent and development of computer graphics changed everything, opening up an entire field and engendering a global workforce to handle a multitude of media. While controversial, I’m very curious [about] the ethical application of AI, which is already finding its way into VFX production. I am investigating how to train models based on my own style and work to create a new original film.
From my own career, I think Jurassic Park (1993) and Starship Troopers (1997) were my most impactful projects. They both featured breakthroughs in early CGI, proving to ourselves – and others – that we could create believable creatures – dinosaurs and arachnids – on screen.
Rob Hifle, CEO & Creative Director, Lux Aeterna
It’s mind-blowing where we are now with artists working across the same project tasks in different countries using super-efficient
TOP: Jurassic Park (1993) was significant in its use of CG to bring living, breathing characters to the screen, notes ILM’s Rob Bredow. (Image courtesy of Universal Pictures)
BOTTOM: For Wētā’s Guy Williams, Jurassic Park showed filmmakers that anything, finally, was possible and that creativity, with the help of CG, could be unlimited. (Image courtesy of Universal Pictures)
TOP: Spin VFX’s Neishaw Ali believes that the impact of Avatar on motion capture, 3D technology, virtual production, environmental creation and machine learning integration has driven the VFX industry toward new heights of creativity and technical excellence. (Image courtesy of Twentieth Century Fox)
BOTTOM: VFX Supervisor Jason Zimmerman worked on Game of Thrones (2011-2019) and celebrates the show for blurring the lines between feature and TV VFX and making feature-level VFX attainable in the TV realm. (Image courtesy of HBO)
workflows. I started working in the early ‘90s when digital cameras were just starting to enter the marketplace. Most of the edits I was working in and out of were still using film, so Steenbecks were the tool of choice. However, it was only a matter of time before all the editors were using non-linear editing systems allowing video to be rearranged on computers. Edits were easily transferable without being spliced together, now that the source material was on tape. My entry into the business was at such a pivotal time in the television and film industry. The speed at which the tech was advancing was phenomenal.
Flash forward 30 years and it seems nothing has slowed down. It actually feels like it’s ramping up. I am constantly mind-blown by the tech advancement leading to the reinvention of VFX workflows. Real-time virtual studio productions are starting to manifest themselves in our drama and documentary workflows, as we have seen recently working on The Crown for Netflix. We also do a lot of Houdini VFX work here at Lux Aeterna, and we’ve managed to significantly reduce our render times using the latest XPU render integrated with USD [Universal Scene Description].
These game-changing developments are happening over days, not weeks or months. AI has been integrated into our software toolsets over the last 20 years to speed up the more laborious tasks, but what’s happening now is very different. Generative AI has developed at such a rate that the ethical aspects have not had time to catch up… perceptions of it have polarized the industry. Just like the introduction of non-linear workflows back in the late ‘80s and early ‘90s, AI is the present-day disruptor! I have always
been an eternal optimist when it comes to technological advancement, and I’m sure the AI storm will calm with regulations and legalities.
Dayne Cowan, CEO, Kraken Kollective
There have been many incredible advancements lately, like NeRF, or Neural Radiance Fields, for example, but during my 30 years in the industry, the most standout evolution for me arrived in a cluster. When these four developments – global illumination, ambient occlusion, subsurface scattering and image-based lighting – became production-usable, it catapulted visual effects to a whole new level. We had been struggling for years to make our CG look realistic and fit live-action scenes. This cluster made realism possible. Lighting from the real scene could drive the lighting in the CG scene, tiny contact shadows were possible and realistic-looking skin was achievable. As a VFX artist, it was a dream come true!
Jurassic Park III (2001) was one of the first times we had seen techniques like global illumination and ambient occlusion used on a project. As mentioned, this completely changed our workflow when it became more widely production-ready some years later and made it possible to realistically and more easily integrate CG into live-action scenes. Developments in subsurface scattering would enhance it even further, but this was the beginning of an amazing evolution in the industry.
Sylvain Théroux, VFX Supervisor and Partner, Raynault VFX
I will name the two changes that I think had the biggest impact on my career so far. I started in 2001, so the move to fully digital was
which he believes “sets a flag in the sand for quality VFX production.” (Image courtesy of Warner Bros. Pictures)
BOTTOM: From huge polar bears to shape-shifting dæmons believably interacting with humans, His Dark Materials (2019-2022) pushed the envelope for episodic, according to Framestore’s Prashant Nair. (Image courtesy of BBC One and HBO)
TOP TO BOTTOM: Pixar’s Steve May celebrates how Toy Story (1995) and Jurassic Park (1993) changed the landscape of film in the early-to-mid-1990s and sparked the imaginations of May and countless other artists, technologists and filmmakers. (Image courtesy of Pixar/Disney)
Jason Zimmerman, VFX Supervisor on Star Trek: Strange New Worlds, marvels at how memorable sci-fi moments in the series and movies that followed the original Star Trek were created by a who’s who of VFX veterans who were part of creating the foundation of today’s VFX industry. (Image courtesy of Paramount+)
For Kraken Kollective’s Dayne Cowan, Jurassic Park III marked the beginning of “an amazing evolution” in the industry, changing workflows to make it possible to realistically and more easily integrate CG into live-action scenes. (Image courtesy of Wētā FX and Universal Pictures)
already underway before I joined the industry. The first has been the progressive lowering of the cost of entry. Hardware, software and knowledge used to be a lot more expensive, making it very difficult to get started. Now, Blender and Linux are free, can run on low-cost hardware, and there are thousands of hours of free training available everywhere on the internet. It makes it cheaper to get started, and it also means that a lot more people can try to get in.
The second biggest change is remote work. It’s given us a lot of flexibility [in] how we work and much easier management of work-life balance. It also helps to be able to hire across a larger geographic area and reduces the need for large offices. In our day-to-day work, it has been by far the biggest change. Wasting time waiting for your shot in in-person dailies or staying very late at the office waiting for renders are things I am definitely not missing.
[For a landmark film] I would go with the 1998 movie What Dreams May Come starring Robin Williams. It was a creative and innovative way of using visual effects, applied to a dramatic subject that you don’t usually associate with a visual effects movie. It was something very different from spaceships and monsters. The paint-smearing effects and great matte paintings created a different world. I think it was ambitious for the time. It would still be a challenging brief today.
Ben Magana, VFX Supervisor, Framestore (Montreal)
Our trade is constantly evolving. We’re always devving new tools and technologies, so every year we have breakthroughs and innovations. If I had to choose one change, though, it would be a more personal one: how we’ve adapted our ways of working following the COVID pandemic. This was a huge accomplishment on the part of our tech and systems teams, and it allowed us to continue delivering work for major Hollywood projects despite the entirely new environment we found ourselves operating in. Working from home allowed us to create a better balance between our work and personal life. In many ways, we’re still in the learning phase of this new scenario: while we’ve achieved more balance in some areas, we do need to make sure we don’t lose sight of the benefits of in-studio working: bonding, mentoring, training and ensuring younger artists get access to and input from the wider team so that they can grow, learn and reach their full potential.
For me, [a landmark VFX film] has to be Gravity (2013). It was the first movie to successfully use on-set interactive lighting. This helped to create a seamless integration of CG backgrounds but also allowed the actors and everyone on set a better understanding of where things were spatially. Needless to say, it was a great tool on set and back in the VFX studio. Gravity is a stunning film where the VFX work is not only spectacular but also crucial to tell a compelling story.
Prashant Nair, CG Supervisor, Framestore (Mumbai)
The 21st century has witnessed a remarkable CGI revolution, fueled by technological advancements that allow filmmakers to create awe-inspiring visuals once deemed implausible. For me, the advent of virtual production represents a major change. LED screens, VR and portable options like Framestore’s fARsight GO all ensure we can better empower directors by allowing them to preview everything from camera moves to CG characters and
creatures at a far earlier stage. This means they’re able to more fully actualize their creative vision as they strive to transport viewers to novel and captivating worlds and craft more immersive storytelling.
Being able to work on The Jungle Book (2016) from recce and shoot to the creation of lifelike creatures was a real game-changer for me. It was an exciting challenge, from the realistic interactions with Mowgli to ensuring the CG animals’ expressions and gestures were natural rather than exaggerated. The film showcased the enormous potential of CGI when it comes to producing immersive and visually spectacular experiences. On the episodic front, His Dark Materials (2019-2022) represented a real raising of the bar when it came to VFX for the small screen. From huge polar bears to shape-shifting dæmons believably interacting with humans, it felt like the envelope was being pushed for episodic.
Neishaw Ali, CEO and Executive Producer, Spin VFX
The VFX industry has undergone significant transformations in the past decade, influencing workflows, competition and collaboration, thus creating a borderless economy. [For one], real-time rendering, powered by tools like Unreal Engine and Unity, is blurring the lines between pre-production and post-production. It enables interactive feedback and reduces iteration time, which can enhance the filmmaking process. This technology allows for more dynamic visual storytelling, transforming traditional workflows. Creative tools developed through AI and machine learning software can automate many repetitive tasks such as rotoscoping, tracking and even creating realistic digital doubles. These advancements have leveled the playing field, allowing smaller studios to produce high-quality VFX that were once only achievable by larger, well-funded studios.
As cloud-based workflows and cloud computing become more cost-effective, they have the ability to foster a borderless economy in the VFX industry. These technologies enable remote collaboration and instant scalability, allowing teams from different parts of the world to work together seamlessly, as was proven during COVID-19, when VFX artists worked remotely. These trends have collectively transformed the VFX industry.
Avatar (2009) was a landmark moment in the VFX industry, setting new standards and influencing future filmmaking with its groundbreaking visual effects. The film utilized cutting-edge motion capture technology to create realistic character movements, including facial motion capture to record subtle expressions, resulting in highly lifelike CGI characters. James Cameron pioneered virtual filmmaking by showcasing real-time representations of the CG world viewable on set through custom cameras. This allowed him to frame shots and preview action in ways previously unseen. These innovations have since become integral to modern filmmaking. The impact of Avatar on motion capture, 3D technology, virtual production, environmental creation and machine learning integration has driven the industry to new heights of creativity and technical excellence.
Jason Zimmerman, VFX Supervisor, Star Trek: Discovery and Star Trek: Strange New Worlds
Working on the newest iterations of Star Trek is daunting because
TOP TO BOTTOM: Richard Clarke declares Star Wars: Episode I – The Phantom Menace a “lightbulb moment for filmmakers” that resulted in the creation of the first previsualization sequence, making quick iterations of shots and sequences possible. (Image courtesy of Lucasfilm Ltd.)
Breathtaking VFX and spectacular action made the iconic Battle of Hoth a standout sequence in Star Wars: Episode V – The Empire Strikes Back and the Star Wars Universe. (Image courtesy of Lucasfilm Ltd.)
Jurassic Park changed everything about the way visual effects were created for movies and changed the industry around it, ILM’s Rob Bredow and Wētā FX’s Guy Williams agree. (Image courtesy of Universal Pictures)
Ghost VFX’s David Lebensfeld is one of many who believe The Matrix (1999) influenced a generation of VFX practitioners. (Image courtesy of Warner Bros. Pictures)
Raynault VFX’s Sylvain Théroux believes What Dreams May Come (1998) elevated a creative and innovative way of using visual effects, applied to a dramatic subject not usually associated with a visual effects movie. (Image courtesy of Universal Pictures)
Framestore’s Ben Magana heralds Gravity (2013) as the first film to successfully use on-set interactive lighting, which helped to create a seamless integration of CG backgrounds. (Image courtesy of Warner Bros. Pictures)
it has always been a mainstay in VFX. Starting almost 60 years ago, the original TV show – and the movies and series that followed – delivered incredible sci-fi moments made possible by a who’s who of VFX veterans who helped create the foundations of the VFX industry: Douglas Trumbull, VES, John Dykstra, Richard Yuricich, ASC, Dan Curry, VES, Ronald B. Moore, Gary Hutzel, Rob Legato, ASC, Bill Taylor, VES, ASC, Syd Dutton, Michael Okuda. All VFX Hall of Famers. That’s pretty much been the norm with each series and movie since. They have been contenders for Emmys and Oscars, often with VFX innovations along the way. The ships themselves are beloved by fans and treated almost as characters, so showing them through the years – whether via models and miniatures or, later, CG – has always been central to the shows, and by extension has pushed the boundaries of VFX to deliver the most realistic representations for storytelling. That work laid the foundation for how we try to emulate not just ships but space, planets and aliens today. Star Trek laid the groundwork for a lot of what we see in TV and film to this day, and it has always been at the forefront in one way or another.

For my career, the show that comes to mind is Game of Thrones (2011-2019), specifically Season 2. It was the first time I, as an artist and supervisor, felt like the lines between feature and TV VFX began to blur. Not that there weren’t great VFX before that – there were. To me, the scope and scale of what Game of Thrones started producing felt like a big moment in VFX and TV in general. It made feature-level VFX attainable in the TV realm, and out of that a lot of other shows started to get much more ambitious. Today, you have so much great work out there. Every year there are many shows doing thousands of shots a season at an incredibly high level, and I believe, to an extent, Game of Thrones kicked the door open for that to happen.
David Lebensfeld, President & VFX Supervisor, Ingenuity Studios and Ghost VFX
For me, the answer to [what are the biggest changes you’ve experienced in VFX] is what allowed me to start my company. The game-changer was the democratization of hardware and software tools. The barrier to entry came down enough to start a business without a significant financial investment, thanks to desktop hardware and software. At the time, in 2004, desktop adoption was in its relative infancy. Prior to that, we used to need beefy workstations or turnkey solutions, and the evolution continues even now. Today, all we need is a laptop and YouTube. Cloud-backed services allow rendering and storage in the cloud, so you can pay as you go. This lower barrier to entry brings creativity to the forefront. There are no real gatekeepers. This paradigm is based on talent, just like basketball, where everyone has a basketball and access to a court.
The Matrix (1999) was a big visual effects movie around the time when desktop solutions became more widely available. This film influenced a generation. One of the big things about that movie is the sheer quantity of VFX work and the fact that VFX is really part of the storytelling. It is VFX-forward, rather than playing a supporting role. There are so many other iconic examples, ranging from Jurassic Park (1993) to Star Wars (1977) and beyond.
Matthew Butler, VFX Supervisor, Digital Domain
I’ll deliberately avoid the knee-jerk reaction to the current and future
involvement with AI, but it is clear that it will create a massive change. Instead, I think it’s interesting to go way back in time and reference something that we take for granted now. Here’s one: digital cameras! Can you imagine a time when everything was shot on film and had to be scanned to get into a digital world to be manipulated? As an example, during Titanic (1997), not only was everything shot on film but even on-set reference photography, including stereographic reconstruction photography.
Due to disc space and throughput limitations at the time, the film was scanned into 8-bit RGB color space – a massive limitation – and all “finals” had to be reviewed on a loop through a film projector, and we were at the mercy of the randomness of the chemical bath that processed our film overnight. So, the advent and improvement of digital cameras, along with high-bit-depth color pipelines, catapulted the quality and eased the production of VFX work massively.
A second advancement I must note is the improvement of simulated effects animation and rendering. This procedural computation of natural phenomena such as fluid simulations and gas combustion has reached levels where it’s often better to solve it synthetically than it is to use real life! Not that long ago, we would assume that the “real” photography of any natural phenomena would always be superior to what we could synthesize using mathematical algorithms, but I don’t believe that is the case anymore.
For fun, let’s pick The Day After Tomorrow (2004) [as a landmark film]. While the water simulations are now very crude by comparison to today’s capabilities, at the time they were groundbreaking. It’s a very different thing to solve the motion and light behavior of an “un-broken” continuous water surface than to compute the volumetric, “white water” look of particulate spray. So, to tell the story of tsunami waves crashing through New York, big advancements in water simulation had to be made, and how it was animated and rendered re-thought. Again, it’s crude compared to what we would be able to do today but landmark at the time.
Guy Williams, Senior VFX Supervisor, Wētā FX
When I started in VFX at Boss Film [Studios] in 1992, film was king and TV/commercials were seen as sort of a lesser part of the field – we did that work too, but our goal was to get the big film projects. The entire industry was built around this idea, and because of this, more companies sprouted up, globalization kicked in, and we had even more companies around the world servicing the growing film industry.
Enter the streamers – TV was no longer fixed to a delivery schedule or block size, and TV content could now include more work close to the scope of film. It could be consumed on demand instead of at a fixed time. We started to question the strength of film in this space, but with the likes of Marvel showing such strong box office returns, we held to the belief in large-scale film. Then COVID hit and we spent a year not being able to go to theaters – this one event massively accelerated the timeline of events. Streaming demand exploded and the difference between TV and film got even more blurred. Streamers were presenting their material at 2K, then 4K. Due to the success of the content on the streaming platforms, their budgets continued to rise, and soon, the
TOP TO BOTTOM: The experience of working on The Jungle Book (2016), from recce and shoot to being involved in the creation of lifelike creatures, was “a game-changer” for Framestore’s Prashant Nair. (Image courtesy of Disney Enterprises)
While the water simulations for The Day After Tomorrow (2004) are crude compared to today’s capabilities, they were groundbreaking at the time, states Digital Domain’s Matthew Butler.
(Image courtesy of Twentieth Century Fox)
Highly praised for its groundbreaking visuals, Disney’s Tron (1982) was one of the earliest movies to feature CGI, making it a landmark for Cinesite’s Richard Clarke. (Image courtesy of the Walt Disney Company)
TOP TO BOTTOM: Spin VFX’s Neishaw Ali honors Avatar (2009) as “a landmark moment” in the VFX industry, setting new standards and influencing future filmmaking with its groundbreaking visual effects. (Image courtesy of Twentieth Century Fox)
Wētā FX’s Guy Williams points to the massive growth of the industry on the heels of Jurassic Park (1993) because it showed filmmakers that, finally, anything was possible. Creativity, with the help of CG, could be unlimited. (Image courtesy of Wētā FX and Universal Pictures)
Lux Aeterna’s Paul Silcox found the combination of spectacular cinematography and the jaw-dropping VFX of Dune: Part Two (2024) an “intoxicating experience” to watch for the first time. (Image courtesy of Warner Bros. Pictures)
OPPOSITE TOP: VFX Supervisor Jason Zimmerman believes that Game of Thrones kicked the door open for a wave of other ambitious shows to produce episodic VFX at a high level. (Image courtesy of HBO)
main difference between TV and film was purely the location of the audience.
Because of this, the industry has had to shift its view on high-end content. We see content agnostically now, and we approach the work the same, regardless of the final viewing platform – it is all content. This has strengthened the VFX industry as it has democratized exceptional work. More democratized tools and more companies capable of doing great work also factor in. This is just the latest major shift, and we need to be self-aware about it.
I can get it down to two [landmark films]. Jurassic Park (1993) – I saw this movie in college. It is the reason I headed to Los Angeles to get a job in film VFX. I wasn’t the only one; the industry grew massively on the heels of this film. It showed filmmakers that, finally, anything was possible. Creativity, with the help of CG, could be unlimited. Avatar (2009) is special in many ways, but the idea of virtual production was proven here. Earlier movies had touched upon the idea, but Avatar was the first major success, making the concept hard to ignore. I still believe we haven’t reached the upper bounds of what virtual production can do for creative content. In time, Avatar will be remembered as the first step down that path.
Richard Clarke, Head of Visualization/VFX Supervisor, Cinesite
Planning and evolving shots is nothing new in filmmaking. However, it is definitely one area that has been around since before VFX appeared and will always be a key part of the filmmaking process. These days, visualizing shots right up to the start of final pixel production is almost taken for granted. Historically, storyboarding started at Disney in the 1930s where storyboards were filmed and edited to the movie’s soundtrack. It is the predecessor of modern previsualization.
By the time VFX was emerging in the late 1980s, previsualization of shots was already part of the shot design process. Movies such as Star Trek V: The Final Frontier (1989) and The Abyss (1989) were early adopters of using primitive 3D to explore shot design. However, it was not until David Dozoretz created the first-ever previsualization sequence for Star Wars: Episode I – The Phantom Menace (1999) that quick iterations of shots and sequences became possible. This was the lightbulb moment for filmmakers.
Over the next two decades, visualization has become an essential pre-production planning tool, mainly using Maya and simple playblasting to build an edit. Over time, it has grown to cover many aspects of shot and sequence design, including techvis, stuntvis and now postvis. Over the last decade, game engines have become another key tool for quick iterations. They also allow real-time virtual production, with LED volumes becoming an integral part of the process. Productions can now iterate live at the shooting stage in real-time by facilitating scene editing with instant feedback. This allows filmmakers to compose and stage shots as they do with actors and physical sets.
When considering VFX [landmark] movies, the further back in time you go, the bigger and more considerable the landmark. With that in mind, one of the very earliest movies to feature CGI [is] Disney’s Tron, a 1982 movie directed by Steven Lisberger. The movie was highly praised for its groundbreaking visuals. However,
it was disqualified from the Best Visual Effects category because at the time the Academy felt that using computer animation was “cheating.”
The movie features less than 20 minutes of CGI and was intercut with live action. Much of the technology we take for granted today to combine CGI and live-action did not exist back then. Computers were very primitive, with only 2MB of memory and no more than 330 MB of storage. They created “depth cueing” to fade to black as the computers could only represent limited amounts of detail. The computers could not create image sequences, so the frames were produced one by one. Some frames took up to six hours to render. Outputting the final shots was also difficult because there were no digital film recorders. Instead, a motion picture camera was placed in front of a computer screen to capture each individual frame.
Most scenes were created using a process known as “backlit animation.” Live-action scenes were filmed in black and white on an entirely black set, and negatives were placed on an enlarger and transferred to large-format Kodalith high-contrast film [created by Kodak, who went on to open Cinesite in 1991]. Many layers of film footage, mattes, rotoscopes and CGI were merged together. Color was added by using gelatin filters in a similar procedure. Some shots utilized up to 50 passes, all optically assembled by hand to create the final VFX shot.
Steve May, CTO, Pixar
I’m happy to say that Pixar’s OpenUSD is one of the most fundamentally important technologies of the last decade. OpenUSD solved a problem that had faced the film industry for years – the seamless interchange of complex 3D scenes. It’s a technology that enables artists to do more by providing powerful ways to assemble, edit and view 3D content, by allowing interoperability of data between different 3D authoring tools and by supporting collaboration between artists working simultaneously on the same artistic content. OpenUSD is the backbone of Pixar’s pipeline and the core of our animation system, Presto. By sharing it, OpenUSD
has become the de facto standard in animation and VFX. As the Chair of the Alliance for OpenUSD (AOUSD), I’m also happy to say that the mission of the Alliance is to expand the capabilities of OpenUSD further and broaden the reach of OpenUSD into other exciting and rapidly-evolving industries, including immersive and interactive content.
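For readers curious what that interchange looks like in practice, here is a minimal, hypothetical .usda layer (filenames invented for illustration) sketching the layer composition that lets different artists and tools contribute to one scene non-destructively:

```usda
#usda 1.0
(
    # Stronger layers override weaker ones without destroying them,
    # so animation edits can sit on top of set dressing.
    subLayers = [
        @anim_overrides.usda@,
        @set_dressing.usda@
    ]
)

def Xform "Robot" (
    # A reference pulls a separately published asset into this stage.
    references = @assets/robot.usda@
)
{
}
```

Because each department edits only its own layer, the same stage can be opened in any OpenUSD-aware tool and the composed result is identical.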
Jurassic Park (1993) and Toy Story (1995) changed the landscape of film in the early to mid-1990s and sparked the imaginations of me and countless other artists, technologists and filmmakers. They both achieved what had previously been considered impossible – breathing life into photorealistic dinosaurs and creating a 90-minute computer-animated feature film. The movie industry was never the same.
Paul Silcox, VFX Director, Lux Aeterna
Combining spectacular cinematography with jaw-dropping VFX, my choice is a recent blockbuster and surely a contender for the next round of VFX awards, Dune: Part Two (2024). The experience of watching this movie for the first time at the cinema was intoxicating and demonstrated innovation and excellence across many disciplines. From the incredible score to the brutalist set design, the use of infra-red cameras or the VFX creature work, it was an immensely entertaining sci-fi spectacle.
I love the scope and the seamless blend of practical and digital FX, flawlessly executed throughout the movie. From innovations such as the inverted sandscreen – sand-colored screens used instead of the traditional blue or green – and huge practical pyro FX used purely for lighting and then combined with simulated VFX explosions, to machine learning algorithms augmenting the blue in the eyes of the Fremen, Dune: Part Two sets a flag in the sand for quality VFX production.
From the exciting and mind-blowing scenes of Paul Atreides riding the grandfather worm to the destruction of Arrakeen, it is a joy to take in the expansive worlds of Arrakis and a huge inspiration to keep creating cinematic VFX.
META QUEST 3 AND APPLE VISION PRO SPARK SURGE IN VR/AR HEADSETS
By CHRIS McGOWAN
While sales of VR and AR headsets were gloomy in 2023, this year has seen a solid recovery, with growth in standalone headsets powered by Meta Quest 3 and excitement stoked by the launch of the flashy Apple Vision Pro in February.
Global shipments for augmented reality and virtual reality (AR/VR) headsets declined 23.5% in 2023, but 2024 shipments “are forecast to surge 44.2% to 9.7 million units,” according to the International Data Corporation (IDC)’s “Worldwide Quarterly Augmented and Virtual Reality Headset Tracker.” VR headsets are forecast to reach 24.7 million units by the end of 2028 with a five-year CAGR of 29.2%, and AR headsets will grow to 10.9 million in 2028, representing an 87.1% CAGR over the same period, according to IDC. In addition, Meta, which introduced Quest 3 in October 2023, thoroughly dominates the VR headset market, with an 86% market share in 2023, according to Zreality.com. Its distant rivals include Sony PlayStation VR2, ByteDance’s Pico, Valve Index and the Apple Vision Pro. Oculus VR’s Beat Saber, officially launched in 2019, is most likely the biggest VR title of all time, having earned $255 million by October 2022, according to the Wall Street Journal. VR and AR have shown their many uses –from home entertainment to LBE (location-based entertainment) to enterprise applications, to usage of AR outdoors or at big public events. Now the Apple Vision Pro is another reason to get into AR/MR/VR.
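IDC’s projections compound annual growth; as an illustrative sketch (not IDC’s own model, and with rounded inputs), the arithmetic behind a figure like “9.7 million units growing at roughly 29.2% a year” is simply:

```python
def cagr(start_units, end_units, years):
    """Compound annual growth rate implied by start and end unit counts."""
    return (end_units / start_units) ** (1.0 / years) - 1.0

def project(start_units, rate, years):
    """Units shipped after `years` of compounding at annual `rate`."""
    return start_units * (1.0 + rate) ** years

if __name__ == "__main__":
    # 9.7M units (the 2024 forecast) compounding at ~29.2% for four
    # more years lands in the same ballpark as IDC's 24.7M 2028 figure.
    print(round(project(9.7, 0.292, 4), 1))
```

The small gap between this back-of-envelope number and IDC’s 24.7 million comes from IDC computing its CAGR from a 2023 baseline rather than the 2024 forecast.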
“Overall, I’m optimistic about the future of immersive and HMDs in general,” says Tuong Huy Nguyen, Director Analyst at research firm Gartner Inc. “The future success of a specific device or solution will come down to a combination of price and value. Value depends on a variety of factors. Finding the correct balance while evolving the marketing is what vendors, including Apple, are trying to figure out.”
There are many reasons why VR adoption slowed prior to 2024. One is that “more content [volume and variety] is needed,” Nguyen says. In addition, more compatibility is needed. “Currently, instead of an ecosystem of VR, it’s more like a medley of standalone apps. The analogy I like to use here is to imagine what the web would be like if instead of a common interface [web browsers] to access the internet, we had apps. As in, every time you want to share something, the recipient is required to download a different program [or custom-made browser]? Also, certain browsers only work with certain hardware, similar to what we have with console gaming. Without a common delivery mechanism, VR adoption will continue to be hindered,” he adds.
Nguyen continues, “AR growth is still siloed and sporadic. One of the reasons is the same as I mentioned for VR. The majority of AR experiences are apps in the sense that they do a very specific thing involving a specific organization, brand or product and can’t be applied to anything else. There are other challenges, like creating standards so that more AR content can share a browser. For example, two competing brands could be AR-enabled within a single view, so a user can pan from one product to another and get relevant overlay information on both.”
Nguyen praises Apple Vision Pro, saying, “Apple’s official entry into the market [immersive and HMD] has validated the market
by bringing needed hype and exposure at a time when both were on the decline. On the other hand, the introductory price point makes it nearly impossible for any [regular consumer] to justify adoption.”
WHAT IF?
Marvel Studios and ILM Immersive have teamed to release the immersive experience What If…? – An Immersive Story for its Vision Pro. Pat Conran, Supervisor of VFX and Tech Art/Visual Effects Supervisor at ILM, comments, “Our teams [at Marvel and ILM] are taking full advantage of all the spatial computing tools that Apple has created for its Vision Pro. Some parts of the experience take place within your own room, meaning that you’ll see your real-world environment and digital elements at the same time, and at a very high fidelity. Characters materialize or step out of portals, and at other times, you will transition through various portals from your room into fully VR environments.”
Conran continues, “We were really lucky to have a great foundation with Marvel Studios’ series What If...? which has two successful seasons of traditional linear episodes. Working with our director Dave Bushore and the series’ showrunners, we created a story in this animated universe where you get to interact and participate in the progression of the story. This experience is unique in that it places you in the same space as these amazing characters, and you’re a part of the story.”
The team wanted the shift from AR to VR and back out again to be a key part of the experience. Conran explains, “At certain points, you will see stereo vignettes playing on large multiversal ‘shards’ in your room, with the impression that you are looking through the shard and into the scene that is playing out. These then transition to actual environments: as the shard you’ve been watching from glides forward, it brings you into a fully immersive environment in full VR.”
Regarding the platforms, he says, “Each VR/AR/MR has its strengths and weaknesses. It’s great to be able to combine them and transition between mixed reality and full VR so you keep grounded in your own reality, but that has to make sense within the narrative. Successful VR has to be as compelling a story as we would expect from linear, more traditional content, but should also be one that can be told uniquely on that platform.” Conran calls What If…? “a new type of blend in narrative and interaction.”
BAOBAB STUDIOS
“We are excited about Apple Vision Pro, but it’s not going to be until V2 or V3 that it will make major consumer headway, due to the high price right now. The potential is there to disrupt the VR/AR industry, but it still needs to figure out the killer apps that make it a must-own,” comments David Kahn, Head of Product and Games at Baobab Studios.
Baobab is a VR animation studio and creator of titles such as the Emmy-winning Baba Yaga. Kahn notes, “The biggest change for animation technology, especially for VR and AR, is the use of 3D game engines like Unity and Unreal to create real-time animation pipelines. Developers need to rethink how to tell a story when you don’t have control over the frame and need to do it on a mobile chipset. Part of our studio focus has been to develop new tools and processes to work in a 360 real-time animated environment.”

Other entertainment available on the Apple Vision Pro includes NBA action through an official NBA app. (Image courtesy of Apple Inc.)

Oculus VR’s Beat Saber is most likely the biggest VR game of all time, having earned $255 million by October 2022, according to the Wall Street Journal. (Image courtesy of Oculus VR)

TOP TO BOTTOM: From Baobab Studios’ Invasion! Part of Baobab’s focus has been to develop new tools and processes to work in a 360 real-time animated environment. (Image courtesy of Baobab Studios)

Framestore has been creating immersive entertainment sites around the world, including VR attractions based on Marvel, the Wizarding World and Lionsgate’s film properties, as well as creating theme park environmental and queue media. (Image courtesy of Framestore and Lionsgate Entertainment)

In What If...?, stereo vignettes play on large multiversal ‘shards’ in the user’s room, lending the impression that you are looking through the shard into the scene that is playing out. These then transition to actual environments where the shard glides forward into a fully immersive VR environment. (Image courtesy of Marvel Studios)
In terms of the consumer market, Kahn notes, “VR needs a [greater number] of developers in the space creating unique content, and it still needs to have cheaper entry points for regular use. There is a growing trend toward both mixed-reality experiences, thanks to Apple entering the space, and more social interactive experiences that appeal to a younger generation of users. These users have grown up on Roblox, TikTok and YouTube, so it’s more about quicker gratification and compelling player-driven experiences than the highest fidelity.”
FRAMESTORE VR
“Without a doubt, I think XR has greatly influenced both storytelling and filmmaking,” says Brian Solomon, Creative Technology Director for Framestore, which has worked on immersive and VR experiences such as Warner Bros.’ Fantastic Beasts and Where to Find Them and (with Thinkwell Group) several LBVR experiences for Lionsgate Entertainment World theme park in China.
Solomon comments, “Within the last 10 years, we’ve seen a sharp jump in the amount of creative and engineering talent entering the scene with knowledge of tracking systems and real-time concepts. Everyone from indie creators to major studios has received a big boost from all the advancements and research aimed at advancing spatial technology.
“On the consumer side, it feels like it’s also brought on a new expectation that the stories we produce can exist in multitudes, and that our interactions with our stories and concepts benefit from new ways of inviting the audience to play and explore. With this new expectation comes a ton of creative opportunities where we have a chance to realign and play with creator vs. audience roles. We’re getting more used to this idea of moving participants into the driver’s seat, making them into creative partners in a way,” Solomon continues.
“I fell in love with VR around the first Vive/Oculus wave. I was obsessed with the idea that you could all of a sudden ‘enter a world,’ and with this intermediary digital layer of fiction, people could walk around and you could take them somewhere new, or that their body could matter in how we tell stories and remember them. Powerful stuff. For consumers, I feel like those types of experiences can be fully transformative, and in some ways have the potential to make even real places that aren’t ‘active’ valuable again.”
LBVR
Solomon comments, “For specific venues, I do feel VR makes total sense, and people are always excited about it. Themed entertainment and rides are a wonderful starting place where you have a dedicated audience who’s so ready to have their mind blown and wants to use this technology. Location-based is so powerful for pushing to the bleeding edge and unleashing what you can do with HMDs. That space is still rich with life for audiences, and there’s a mechanism there for commerce. Framestore is working in this space frequently.”
The location-based entertainment market was estimated at USD 3.5 billion in 2023 and is projected to reach USD 11.8 billion by 2028, at a CAGR of 27.3%, according to a recent report by MarketsandMarkets.
“With the current crop of hardware, we’re seeing the need for much more lifelike fully immersive environments that run with very little memory usage or computational overhead. As these devices are starting to become full-fledged computers, the need to operate with utmost quality while leaving room for all the applications people want to use is paramount,” Solomon says. “Framestore is a leader in this space, having created some of the most efficient yet lifelike environments for many generations of these devices.”
Solomon adds, “For the greater public and indie spectrum, there is, however, very little educational material on how to properly configure and design for these strict requirements while maintaining quality. So, it can feel unattainable to some to convert their products and ideas into a ‘VR-ready’ format. I suppose the greatest trend is that we’re moving toward the ease of creation being most important, so more can take advantage of the tech and bend it to fit their needs.”
SMART GLASSES
AR and VR smart glasses are also generating excitement. The AR and VR smart glasses market will grow from $15.93 billion in 2023 to $18.58 billion in 2024 at a compound annual growth rate (CAGR) of 16.6%, and to $34.62 billion in 2028 at a CAGR of 16.8%, according to The Business Research Company.
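The growth rates quoted in this section follow from the standard compound annual growth rate formula, end/start raised to 1/years, minus one. A minimal Python sketch using the rounded market estimates cited above (small discrepancies reflect rounding in the published figures):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1.0 / years) - 1

# Smart glasses market (The Business Research Company, USD billions):
# $18.58B (2024) -> $34.62B (2028), 4 years
print(f"{cagr(18.58, 34.62, 4):.1%}")  # ~16.8%, matching the cited rate

# Location-based entertainment (MarketsandMarkets, USD billions):
# $3.5B (2023) -> $11.8B (2028), 5 years
print(f"{cagr(3.5, 11.8, 5):.1%}")  # ~27.5%, close to the cited 27.3%
```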
HYPE CYCLE
“The hard part is that tech companies have geared audiences into this hype-cycle mentality, and new devices are only big news for a short time nowadays with a consumer base with short attention spans. It’s getting even harder for consumers lately who are in many ways overwhelmed with technology and its rapid advancement in general,” Solomon remarks. “I don’t think they want to feel like they’re wearing a tech company on their face; they want to experience something that enables them to go beyond that feeling and, like all good tech, let it fade into the background.”
Solomon continues, “For it to catch on publicly, there must be more to do that is so cool that it draws the audience’s attention [as must-see], and the fact it’s in VR or AR is less of the story. You almost don’t even want to make that a big deal because the public at large isn’t as blown away by the novelty of the medium itself anymore. They get it; they just need more to do with the devices.”
PEAKS AND TROUGHS
Nguyen says, “There have been many peaks and troughs in the past decade and there will be many more in the coming 10 years. We can count the peaks as milestones towards mass adoption. We can look at the troughs as lessons learned and moments when the industry can focus on quietly making progress without the distraction of the hype, rather than considering them as setbacks.”
The Meta Quest 3 standalone VR headset and joysticks were introduced in October 2023 and helped drive the VR market this year. (Image courtesy of Meta)
Sony launched its VR2 headset for the PC and PS5 in early 2023. (Image courtesy of Sony Interactive Entertainment)
THE CHANGING LANDSCAPE OF CUTTING-EDGE VFX AND ANIMATION TECHNOLOGY
By OLIVER WEBB
With streaming services constantly releasing content and more demand than ever for visual effects – especially with most series and features relying on effects, whether invisible or extensive – staying abreast of the latest tools and tech trends in VFX is key for industry professionals. In the last few years, tools such as virtual production have taken off and are quickly becoming a staple in the industry, while AI is increasingly being used as a creative tool. Other innovations such as real-time rendering, cloud-based workflows, augmented and extended reality and new software are reshaping the way stories are told.
TOP: Writer/director Chris Sanders was thrilled with the analog feel they were able to achieve in The Wild Robot, which offers rare warmth and presence within a digital environment. (Image courtesy of DreamWorks Animation and Universal Pictures)
OPPOSITE TOP TO BOTTOM: Using Unreal Engine as a real-time compositing tool on Chicken Run: Dawn of the Nugget allowed the director, DPs and animators to make creative and performance choices in context when the environment wasn’t built or was partially or completely CG in the background. (Image courtesy of Aardman Animations Ltd.)
Cinesite worked on Iwájú, which was a collaboration of Walt Disney Animation Studios and Lagos-based Kugali Media. (Image courtesy of Disney Enterprises Inc.)
Cinesite contributed VFX to True Detective: Night Country. The effects caches for the polar bear in Episode 404 totaled 15.6 terabytes of data. (Image courtesy of Cinesite and HBO)
Aardman Animations, based in Bristol, England, is known for its stop-motion and clay animation techniques with Wallace & Gromit, Chicken Run, Flushed Away, Shaun the Sheep and Early Man. “We have a wide range of innovative projects within the VFX/CG context,” states Aardman CTO Steven Shapiro. “One exciting area of innovation centers around digital twins and the acquisition of physical assets [such as models and puppets]. As is typical in production, we have been using photogrammetry for some time. However, the limitations in the technique around fine detail, especially in vegetation, and lack of useful texture sets have pushed us to explore several other techniques. We recently completed an R&D and technology project around photometrics and how to use both photogrammetry and photometrics to generate useful digital representations of physical assets. We are exploring additional techniques further with an R&D project around the use of Gaussian splatting for acquisition, complementary to our existing techniques. Our digital twins projects aim to create a robust toolset that allows our acquisition technicians and VFX supervisors to quickly and efficiently capture our physical assets for robust digital use across media.”
Another project Aardman recently completed, on Chicken Run: Dawn of the Nugget, explored the use of virtual production within physical production in a stop-motion animation context. “We have a very different set of constraints for the use of virtual production because we work at a different measure of time than live action,” Shapiro says. “In our project, we used Unreal Engine as a real-time compositing tool to allow our director, DPs and animators to make creative and performance choices in context when the environment was not built or would be partially or completely CG in the background. It was a very interesting and important project because it brought existing techniques into a new time-lapsed environment and proved the viability and overhead. We are taking these ideas further in upcoming R&D projects.”
Cinesite’s most notable recent VFX-related projects include The Fall Guy, Road House and True Detective: Night Country. “We’ve also been busy in animation, having recently completed the first animated adaptation of George Orwell’s take on Animal Farm, directed by Andy Serkis, and audiences have been enjoying Iwájú, which is streaming now on Disney+ and is a collaboration with Walt Disney Animation Studios and Kugali,” remarks Cinesite CTO Michele Sciolette. “Away from the cinemas and TVs, our immersive work can be enjoyed in Frameless, an immersive multi-sensory gallery in London where we’re bringing Hollywood-style effects to classical paintings such as Rembrandt’s The Storm on the Sea of Galilee, for which we won a VES Award earlier this year.”
One area that has emerged as a game-changer in the world of VFX and animation is real-time rendering. “Real-time rendering has made some difference in our animation process, though not for the majority of projects,” Shapiro explains. “The reality remains that in order for real-time rendering to impact our process, we need digital assets early on in the production process. That is why
we are working to build up our digital asset library, improve digital acquisition of physical assets and find other creative ways to get assets into the rendering context. Real-time rendering is just another tool in our toolbox and not yet a critical component of our pipeline across all project types. We are exploring additional use cases that would add value to the type of interactive feedback and creative discussions that lead to more time for creative work. We have partnered with NVIDIA on an R&D project to use their Omniverse platform to drive on-set virtual production workflows. This speculative project is another way that real-time rendering could impact the animation process.”
For Cinesite, improvements in real-time rendering made a huge impact in many areas. “However, when it comes to animation, probably not as much as some of us expected a few years ago,” Sciolette says. “Beyond the usual challenges associated with adopting new technologies, some of the restrictions that come with real-time rendering mean that it is often easier to rely on more established workflows that are well understood and proven to support high complexity and rich visuals. In addition, the incredible level of photorealism achieved by game engines has become less critical as the animation industry shifted towards more stylized looks. The potential to significantly impact the animation workflow is still there; it just proved to be harder than expected.”
Cinesite delivered around 355 VFX shots for The Fall Guy. The majority of Cinesite’s production infrastructure is hosted on premise, in their offices or data centers, but they rely heavily on cloud-based collaboration tools, particularly to support a hybrid working environment. (Image courtesy of Universal Pictures).
Real-time has the potential to make a big impact when it’s available to all workflows. Betsy Nofsinger, VFX Supervisor at DreamWorks, home to such franchises as Shrek, Kung Fu Panda, How To Train Your Dragon and Madagascar, describes the process. “As of now, we have seen it help with early visualization by bringing artist specialties typically involved later in the pipeline up to the early phases to get a more complete review of the concepts still in development. On Kung Fu Panda 4, our strategy for the
busy big city included taking very early looks at crowd sizes and variety and bringing the full street scene to life in motion while we were still modeling both the environment and the characters. We took advantage of a real-time space to see crowds in motion in the actual set and from many angles and in different lighting conditions. We were able to make important creative choices early, modify the set to suit, get the relative scale of each character all working together, work on traffic patterns and density issues, as well as see depth-of-field and atmosphere all before any production shots were available to test.”
In a traditional pipeline, many artists contribute to the completion of a shot and work is reviewed during each step in the department order. “Sometimes the steps are weeks or months apart, depending on the overall readiness of the inventory,” Nofsinger continues. “When new elements are added, it becomes necessary to loop back to a previous step to revise work. Those loops can take hours or days to go through new approvals and get back to the current department. Collaboration across workflows can be faster and more creative in a real-time space when iterations that have those complicated dependencies in an offline renderer can be seen live and played back in session. We had a sequence with paper that had gold leaf on it, and when we didn’t quite hit the gold look at first, it was very helpful when we sat down with lookdev and lighting to do a live session in the real-time space. While these teams do generally work closely together, the immediate feedback from the live collaboration was a huge success.”
The majority of Cinesite’s production infrastructure is hosted on premise, in their offices or data centers, but they rely heavily on cloud-based collaboration tools, particularly to support a hybrid working environment. “For instance,” Sciolette adds, “tools like SyncSketch or cineSync play a key role in our distributed review workflows. More generally, Google’s Workspace is one of the cornerstones of our global communication and collaboration.”
Similarly, due to cloud technology and the ability to tap into an international labor pool, Sunrise Animation Studios is able to produce its first theatrical animated feature David, which will be released in 2025. Based in Cape Town, Sunrise is one of the many growing animation companies in Africa, having produced Africa’s first animated feature film, The Legend of the Sky Kingdom, in 2003. “This is an amazing opportunity for a scrappy emerging market company that would normally be shut out of those labor markets due to geography. But at the same time, those opportunities may not be what American and Canadian artists are accustomed to from a compensation perspective. And, then, time zones matter quite a lot for how and when you interact with artists. So, while it is an amazing, enabling technology, you can’t forget about the other macro factors that come into play,” says Sunrise Feature Animation Cinematographer Dave Walvoord.
AI-driven VFX is revolutionizing content creation and offers many advantages, despite the concerns surrounding the tool. “The term AI covers a broad technology landscape, and while there are parts of this landscape that are very exciting and we want to maximize the opportunity they represent for us, there are other parts that are clearly more problematic,” Sciolette notes. “We are ramping up our investment in machine learning and will be relying on it to produce exciting new tools that enable our artists to be more efficient or to push the creative boundaries even further. At the same time, while we fully recognize the incredible potential of generative AI solutions, we are cautious about integrating any form of generative AI workflow until we have solutions that are fully respectful of creators’ rights and we are confident that all the legal challenges that these solutions pose today are fully addressed.”

The making of A Shaun the Sheep Movie: Farmageddon. Shaun the Sheep is one of the longest-running animated series in British television and has produced two feature films. (Image courtesy of Aardman Animations Ltd.)

Aardman Animations used VR technology on Chicken Run: Dawn of the Nugget. (Image courtesy of Aardman Animations Ltd.)
Walvoord believes that while AI is getting all the buzz now, there is still much progress to be made before it becomes transformative.
“My naive guess is that we are 10 years away from it being a truly transformative technology. Where we are now is that AI denoising is helping artists make informed decisions interactively and faster
– and is perhaps helping me put up a more legible concept proposal using Photoshop’s generative fill. But, it isn’t really eliminating the hard creative work of putting a stunning image on screen at this point. Ten years from now, that will probably be very different. We haven’t really explored AI in any other areas so far.”
Shapiro argues that newer technologies, like generative AI, are not yet mature enough to make a significant impact, and it is also unclear how valuable these advancements will be, specifically in stop-motion. “Advances in color workflows, such as continued advancements in ACES, have ensured consistency across our studio and the industry,” Shapiro adds. “New LED lighting technology, DSLR cameras and motion control equipment continue to improve our efficiency, power consumption and ability to work in smaller spaces. Within the VFX context, new ways to capture, create and reproduce materials and textures that match the look of our physical assets are ensuring we can preserve the artistic choices for the worlds we build. The most impactful technological advancements have actually been the incremental and foundational improvements. Combinations of better up-res algorithms, denoising of ray-traced renders, increased CPU and GPU capacities, and software optimizations to better leverage computer resources have all allowed us to do more great work interactively or at reductions of time by an order of magnitude.”
Remote and hybrid toolsets and workflows have been impactful for stop-motion. “The pandemic forced a lot of workflows to be remote-capable. However, many of the products and tools had constraints that caused friction or prevented remote work,” Shapiro notes. “We have partnered with several software and hardware providers to deploy some tools that reduce latency and improve the video and audio quality to allow for more remote artists who are drawing with input devices that require immediate feedback. We have mature workflows for collaborative review on platforms that are feature-rich, secure and integrated to our existing workflows. When we have key creative staff that are traveling or off-site, they can now actively participate and drive production decisions in offline processes, even from the production floor during shooting.”
Cinesite has implemented a number of new technologies within the company. “In terms of core pipeline, our main focus has been on USD adoption,” Sciolette says. “Our pipeline is entirely based on USD, and we are now focusing on improving our shot production workflow taking full advantage of the powerful features USD provides. This has been a great opportunity for us to rethink and streamline many parts of our pipeline. From a ‘tools’ perspective, among the many initiatives, we developed a broad set of tools that allows us to represent clothing and garments based on woven fabric strands, rather than more traditional texturing-based approaches, significantly improving the realism and the level of detail we can produce. We also invested in expanding our tools for non-photorealistic rendering, developing a comprehensive toolkit for procedurally generating or hand-drawing and animating ink lines to support a broad range of visual styles. Finally, we continue to invest heavily in Gaffer, our free and open-source application for lookdev, lighting and pipeline automation. While last year’s
focus was on a real-time ray-traced viewport and native support for the Cycles renderer, more recently we developed a powerful and extensible system for render pass management, improved tools for interactive light placement and much more.”
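For readers unfamiliar with USD, the pipeline approach Sciolette describes rests on layer composition: departments author separate layers that compose non-destructively into a shot. A hypothetical minimal `.usda` shot layer might look like the following (file names, prim paths and the override value are invented for illustration and do not reflect Cinesite's actual pipeline):

```
#usda 1.0
(
    # Stronger layers appear first; lighting overrides animation,
    # which overrides layout.
    subLayers = [
        @./lighting.usda@,
        @./animation.usda@,
        @./layout.usda@
    ]
)

over "ShotRoot"
{
    over "HeroCharacter"
    {
        # A per-shot tweak layered over the referenced asset.
        double3 xformOp:translate = (0, 0, 12.5)
    }
}
```

The `over` specifier is what makes the workflow non-destructive: it sparsely overrides opinions from weaker layers without modifying the source asset files.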
The landscape of VFX and animation has changed significantly over the last few years and has been accelerated by large events like the pandemic and the recent strikes. Shapiro observes, “The reduction of budgets and schedules without reducing complexity or quality has been a long-time pressure in our industry. However, recent economic pressures have exacerbated the problem. The only way to meet the moment is to innovate and rethink the way we make productions. Virtual production has gotten some negativity recently, but it is really a set of tools, technologies and workflows that can be used either effectively or poorly. The other way that the landscape has changed significantly is that most companies can now effectively support remote VFX artists and CG animators/generalists. This empowers people to look for work globally and for studios to find talent from a global talent pool. We do prefer hybrid working to connect our people and teams, but if that is not viable, we can – and often do – bring on remote artists. This agility is key to meeting the budget and scheduling challenges. It is not about finding lower-cost talent; it is about being able to get the right people at the right time to jump into the work effectively.”
Nofsinger argues that for a long time, specialization and proprietary tools were both sort of the standard. “Artists became extremely skilled in a narrow specialty, and tools were developed to provide the best opportunity to elevate their creative contribution, reduce repetitive tasks and facilitate the ever-growing appetite for complexity,” Nofsinger adds. “In recent years, we’ve seen a move toward standardization or, at least, more shared tools through open-source initiatives and some DCCs [digital content creation tools] adding additional features and support to allow department crossovers that were difficult before. I think we are also seeing some more generalist artist roles becoming popular. There has definitely been a desire to allow an individual artist to do a larger share of the creative work concurrently when it makes sense both for their own interest and the creative goals. The fewer handoffs and disruptions to the creative process the better.”
Concludes Sciolette, “We are at a very interesting point in time. If I look back at the last few years, we have seen many significant changes, including the incredible progress with real-time rendering, which led to the powerful virtual production workflows in VFX, the exciting exploration of very diverse visual styles in feature animation, the broad adoption of open-source foundational technologies such as USD. But at a very high level, the way high-end VFX and animation are made today is not drastically different from what it was five or even 10 years ago, with the notable exception of hybrid and remote work being the norm today when it was completely absent at that time. On the other hand, looking forward, we can expect that more significant changes are coming, driven by the incredible innovations enabled by machine learning. It is important for our industry to shape those changes around our needs and, more importantly, our artists.”
TOP TO BOTTOM: Authoring color in the DCI-P3 color space rather than sRGB allowed The Wild Robot to enjoy a wider gamut of available hues, adding more ‘pigments’ to artists’ palettes, dramatically saturating the animation more than ever before at DreamWorks. (Image courtesy of DreamWorks Animation and Universal Pictures)
Although Trolls Band Together was predominantly CGI animation, the film includes some 2D animation sequences by Titmouse, Inc., with animation styles inspired by Yellow Submarine and Fantasia. (Image courtesy of DreamWorks Animation and Universal Pictures)
Darren Dubicki, Production Designer on Chicken Run: Dawn of the Nugget, used VR in the set design process. (Image courtesy of Aardman Animations Ltd.)
LIVING ON DEADLINE:
FINDING A HEALTHY BALANCE
By CHRIS McGOWAN
“Our industry is not one of stability,” says Emma Clifton Perry, Co-Chair of the VES Health & Wellbeing Initiative. “Most of the workers in our global VFX community do not have retirement plans. Many are contractors responsible for their own healthcare and well-being. Our industry is also no longer one of purely fresh graduates with a passion to succeed. We have families, we have mortgages, we have [worries] for our future, and often we have no job security. This is all a lot for one person to manage, and that is why we were keen to try to offer some assistance in this area. It can be overwhelming for anyone, managing all that alone.”
Many VFX studios are trying to help artists navigate often heavy work demands and worries and achieve a work-life balance. Perry, with Philipp Wolf, established the VES Health & Wellbeing Initiative “to ensure that all VES members worldwide, and indeed the wider VFX industry, felt that there was someone out there who cared about them. We care about their physical health, mental
health, financial well-being and general work-life balance. Our initiative has implemented a range of resources, not only for members but also for the wider VFX community.”
“Our industry is fast-paced, and driven by tight deadlines, tighter budgets and an insatiable appetite by consumers for entertainment content, be it games, TV shows, commercials or feature films,” Perry comments. “Despite all this, it is imperative [that] organizations of all types within the industry remember that those creating this content are people with families, with homes, with dreams, with desires and lives outside of work. There must be a balance.”
Perry adds, “If someone is working overtime for prolonged periods of time, their productivity actually drops [and] the quality of their work drops too. This is not financially beneficial to a company. Managing schedules, work hours and expectations is a recipe for a good outcome for organizations, both fiscally and in terms of crew retention. When [the] crew feels their company cares, the morale among the workers is higher. This in turn leads to more productivity, greater efficiency and a better fiscal and creative outcome for companies.”
Cinesite’s Chief HR Officer Sashka Jankovska says, “Usually, the biggest pressure in the VFX world is the fact that a lot of the work is project-based, which requires a lot of planning and changing throughout the process and can lead to job insecurity. Managing tight deadlines in VFX is another pressure that we are facing constantly. We do our best to alleviate pressures wherever possible; to act as a buffer easing the impact on the wider team, where we are able to do so.”
Jankovska continues, “We foster an open working environment where people are encouraged to share their concerns with their peers or our dedicated mental health first-aiders and to also make use of our employee assistance program, which provides access to clinical-based mental health and well-being solutions.”
Jankovska explains, “We continue to focus on social and team bonding events in the studios in order to bring the crew together. We encourage our team members to have outside interests, which often reduces their stress levels and feeds their creativity. A great example is the London VFX football league, which Cinesite participates in. It’s a great way to network, socialize and improve physical and mental health at the same time. We also encourage them to book their annual leave. Time out is very important for everyone. Time management is also a very useful skill that is important to develop.”
For Lux Aeterna CG Supervisor Timmy Willmott, while the obvious pressure is deadlines, “The most consistent pressure is to produce a standard of work that you, your team and the client are happy with. That’s not always easy. Communication and active listening are key to a supportive culture and assist in the mindful delegation of tasks. Know your artists and give them the work that’s going to light them up. You’ve got to recognize people’s passions and goals. Help them on their path to achieve them. Realistic bids and realistic deadlines are pretty key, too.”
Lux Aeterna Operations & Marketing Manager Laura Ashley notes, “Our staff is our biggest asset, so creating a workplace that they look forward to coming to is a real focus for us. I’m one of a few Mental Health First Aiders at Lux Aeterna. Having people in the workplace who have this training is a must for us. It’s so important that as a business we can see the early signs when others might need some extra support, then either offer a safe space for listening or be able to signpost to resources or organizations who can help further.”

RSP (Rising Sun Pictures) artists enjoy a Diwali celebration. The company is headquartered in Adelaide, South Australia. (Image courtesy of Rising Sun Pictures)

The Cinesite London VFX League football team at a workout. (Image courtesy of Cinesite)

TOP TO BOTTOM: Throughout the year, Vine FX hosts entertainment, games and events for its employees. (Image courtesy of Vine FX)

RSP artists relax with social rock-climbing. (Image courtesy of Rising Sun Pictures)

RSP artists participate in a Pride march. (Image courtesy of Rising Sun Pictures)
“When it comes to workplace dynamics, at ILP we prioritize nurturing a positive culture and are steadfastly against bullying or any negative behavior,” says Måns Björklund, Important Looking Pirates Executive Producer. “Our company is proudly artist-driven, focusing on supporting and celebrating artistic talent.”
Björklund notes, “We at ILP have dedicated talent managers that artists can reach out to, and they also develop an individual development plan for each artist. VFX is a collaborative and team effort. Part of ILP culture is to put artists first and to give them the tools and resources to be able to do their best work.” He adds that VFX artists can achieve a better work-life balance by “setting boundaries and making sure to try to [take time] off work. It is a tough business with [artists] sometimes needing to put in more time, but I think the company should try their utmost to plan around that.”
Located in Sweden, ILP has some built-in national stress reduction. Björklund notes, “The Swedish general health system is based on the principle of universal healthcare for all residents. It is largely funded by taxes. So, anyone in Sweden who is working is covered by that. The same goes if you have kids in daycare and in school. Balance of life is also added through removing the stress of not having enough funds to send your kids to a good school or to university. There is also a public pension system. In addition, ILP provides private healthcare insurance for Swedish employees and a private pension, both paid by the studio.”
“The VFX industry is known for its intense pressures, like long hours during crunch time and job insecurity because of contract-based work,” says Laura Usaite, Vine FX Managing Director. “At Vine FX, we really push back against this norm. We offer flexible hours and lots of opportunities for learning and growth. It’s important to emphasize open communication and teamwork, creating a space where artists can thrive creatively while maintaining a healthy work-life balance, whether they’re working remotely or in the studio. It’s also about providing resources for mental health and knowing who to reach out to with any questions, queries or concerns so everyone feels supported.”
Mike Ring is a Rigging Artist at Rising Sun Pictures’ Adelaide studio. He also leads RSP’s Social Crew, an employer-sponsored group of volunteers who run and support staff-led initiatives aimed at making RSP a fun and inclusive place to work. Ring comments, “I think it’s imperative to care about what we do, but don’t take it too seriously. Rigs will break, sims will be uncooperative, code will have bugs, human errors happen, and things will be kicked back. Learn to step back from it, don’t take it personally and, if you can, have a good laugh when something ridiculous happens. ... Finding a good support network of people to go for a lap of the block or a coffee when we need to debrief or have a laugh can be invaluable.”
The Social Crew’s initiatives have included “potlucks for Harmony Day, a Diwali feast run by our Indian community, fundraisers supporting important charities, gaming tournaments, a hiking club, screenings in the theater, supporting our LGBTQIA+ people and allies to march under an RSP banner at the Pride march, and our weekly tradition of Friday night Happy Hour,” Ring says.
“While remote working has been embraced and well-received by the majority of our crew, feeling isolated is something that some people experience,” Jankovska comments. “For the management teams, it can be harder to spot if someone is struggling at home so they can offer the necessary support.”
Living where you work can be a blessing and a curse. Willmott explains, “When the working day ends and you’re still in the environment where you work, it’s too easy to keep going and not give yourself proper down time. Burnout can creep up on you pretty fast in that scenario. It’s harder to supervise remote artists for that reason, too. When you’re all in a studio, it’s easy to see when someone’s struggling with something.”
Usaite remarks, “Vine FX consists of a majority of remote artists. For them, it can be tricky to communicate without face-to-face chats, so we’ve integrated all of our communications into Slack, which helps keep conversations organized, open and clear, with plenty of huddles throughout the day. We also understand that working alone can feel pretty isolating without the usual office vibe, so throughout the year we host some entertainment, such as games nights and quizzes, and invite everyone in the company to Cambridge for our Summer and Christmas parties. It’s such a pleasure to meet everyone in person; some of them fly in from all parts of the world and have a great time together!”
Ashley adds, “We have a broad mix of ages and people here at Lux Aeterna, so we recognize that one work-day might not suit all. We have a few core working hours each day, where we ask people to be in the studio or online, and we try to make sure all meetings happen within that time. Around that, each person can make up their remaining hours when it suits them best, which allows a late start for a night owl or an early start for someone doing the school run.”
Perry says, “I feel sometimes as artists we forget to pause and self-reflect. We jump from one shot to the next, ticking off the tasks in ShotGrid, skipping coffee breaks, skipping lunch, all to get that next task done. We put off that holiday, that appointment with the accountant or doctor, or that conversation with our partners till later because we’re tired when we get home.
“When we let the life tasks build up,” Perry continues, “it’s just as stressful and overwhelming as when we miss a shot deadline. It is imperative as VFX artists, or indeed any VFX professional, that we take a step back and say to ourselves, ‘What do I need in order to do my job better?’ [and] ‘What do I need in order to be happier in my day-to-day?’”
Perry observes, “Sometimes it is hard to find your voice and confidence. That’s where the VES Members Assistance Program can help. You can call and get a free session with a counselor or advisor to help you navigate your day-to-day stresses and help you formulate a plan.”
Smart Simulated Characters (Digital Extras)
By JEFF FARRIS, Epic Games
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from Chapter 10 of The VES Handbook of Virtual Production
Edited by Susan Zwerman, VES, and Jeffrey A. Okun, VES
How Characters React to Their Virtual Environment and How Virtual Characters React to the Physical Environment
Smart Simulated Characters
Smart AI-driven characters have been a staple in the video game industry for decades. But this technology is also well-suited to virtual production applications. Many useful tools and resources are widely available. And commercially available game engines come fully equipped with deep technology stacks, capable of producing sophisticated and believable AI characters that can be simulated in real-time on consumer hardware.
A Crash Course in Real-Time Artificial Intelligence
Fundamentally, there are a few important concepts to consider when creating an AI character:
• How an AI agent perceives the world.
• How an AI agent decides what to do.
• How an AI agent executes that decision.
Systems to define these elements are all fundamental, and game engines will reliably have the tools to implement the behaviors needed.
Perception
The key challenge of modeling AI perception is crafting which data to use and which data to ignore. For an AI agent to perceive its environment it needs access to state data for the simulation where it exists. This is typically the easy part, as it is common that an AI system has nearly the entire state of the simulation available to query. But omniscient AI is rarely interesting. How an AI reacts to limited information is what makes it seem lifelike and believable.
Decision-Making
Decisions happen in the “brain” of the AI character. This is the internal logic that processes the gathered perception data and chooses what to do. The output of this step is a goal or command that the character knows how to execute.
Action
The ability to execute a decision is the final step. The actions an AI needs to perform can seem simple, such as walking towards a destination or playing a custom animation. But making them happen convincingly can be complex, often involving many interacting systems.
Finally, a spatial navigation system is necessary to make sure the AI character can take a valid route to a goal destination. This system must be able to generate paths around obstacles in the environment, and it must be able to move a character realistically along these paths. It may even need to be able to detect and avoid dynamic obstacles in the environment, including other AI agents. This system relies on another set of custom metadata to define traversable areas in the environment. One common approach to this is called a navigation mesh.
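The perceive/decide/act loop and path-based movement described above can be sketched in a few lines. The grid world, class names and breadth-first path search below are illustrative stand-ins, not engine code: a production system would use a true navigation mesh and an engine's perception stack rather than a walkable grid.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest walkable path on a grid of 0s (free) and 1s (obstacles).

    A simple stand-in for navigation-mesh pathfinding: it routes
    around obstacles and returns the cell sequence from start to goal.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:          # walk parents back to start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cur
                frontier.append((nr, nc))
    return None  # goal unreachable

class Agent:
    """A minimal AI character: perceive, decide, act."""

    def __init__(self, pos, sight=3):
        self.pos = pos
        self.sight = sight  # limited perception radius (not omniscient)

    def perceive(self, world_goals):
        # Perception: only goals within Manhattan distance `sight` are seen.
        return [g for g in world_goals
                if abs(g[0] - self.pos[0]) + abs(g[1] - self.pos[1]) <= self.sight]

    def decide(self, visible):
        # Decision-making: head for the nearest visible goal, else idle.
        if not visible:
            return None
        return min(visible,
                   key=lambda g: abs(g[0] - self.pos[0]) + abs(g[1] - self.pos[1]))

    def act(self, goal, grid):
        # Action: take one step along a valid route around obstacles.
        if goal is not None:
            path = bfs_path(grid, self.pos, goal)
            if path and len(path) > 1:
                self.pos = path[1]
        return self.pos
```

Running the loop once per frame moves the character step by step; swapping breadth-first search for A* over navigation-mesh polygons, and the distance check for cone-of-vision and line-of-sight tests, are the usual upgrades toward the engine-grade systems the chapter describes.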
Trade-Offs
Smart digital characters can be very powerful in certain situations but are not appropriate for every situation. One trade-off to consider is interactivity versus fidelity. A core strength of real-time character simulation is the ability to get good results from unpredictable inputs. On the other hand, more interactivity can sometimes mean lower fidelity. This is partially due to the natural patterns that can emerge in algorithmic approaches and partially due to the limited processing time available due to the need for interactive frame rates.
Another major trade-off is short-term cost versus long-term benefit. There can be a significant initial investment to develop, test and deploy a smart character system. But as is true with many systemic processes, this type of system shines at scale and can enable certain types of workflows or results that might not be otherwise achievable.
Of course, both trade-offs are highly situational and content-dependent, but understanding them is key to making effective choices.
Order yours today!
https://bit.ly/VES_VPHandbook
VES Los Angeles Revisited: Celebrating a Proud VFX Metropolis
By NAOMI GOLDMAN
The Visual Effects Society’s worldwide presence gets stronger every year – and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. The VES Los Angeles Section, the largest of the Society’s regional hubs, now counts upwards of 1,635 members. The Section is Los Angeles County-centric but encompasses a diversity of Southern California members from neighboring Orange and Ventura Counties, with a presence from Santa Barbara down to San Diego.
“Looking at the state of the visual effects industry, we’re experiencing the aftermath of what I call ‘strike-demic,’ coming off the dual labor strikes and continuing readjustments of how people work post-pandemic,” said Charlotte Nelson, Chair, VES Los Angeles Section, Vice President Strategic Organizational Planning at Digital Domain and Co-Founder and CEO of Slapworks Animation. “Because we are all dealing with multiple challenges – how remote work impacts our ability to nurture up-and-coming talent, the lag in expected work pick-up, the backlash around invisible CGI, and evolving issues around massive industry disruptor AI – the VES is so important, especially now. As our colleagues work to adapt to shifting dynamics, having this consistent feeling of community, a central group wherever you are in the world to connect, our Society network is tremendously valuable.”
Earlier this year, the Section issued a member survey to solicit feedback on the most sought-after programming and activities, which has proven to be instrumental in the planning process. “We are focused on reinvesting in our Section as a community to support our members, and aligning our programming to meet their needs and desire for networking, socializing and educational events,” said Theresa Patten-Koeckert, Co-Chair, VES Los Angeles Section and Business Development Manager at Halon Entertainment. “Our membership spans a wide geography across Southern California, and we are also diverse in terms of the disciplines our members represent and where they fall in their career trajectory, from people looking for work, new practitioners, hiring managers, to VFX Supervisors. So we are intentional in mapping our activities to differing social and educational demands.”
The VES Los Angeles Section has consistently fulfilled that mission with a wide spectrum of programs and events. Recent highlights include: a webinar about multi-vendor collaboration – “The VFX of The Hunger Games: The Ballad of Songbirds & Snakes,” featuring VFX Supervisor Adrian de Wet and representatives from Ghost, ILP, Halon, Rise and Outpost; the Fuzzy Door demo at Gnomon with members from the TV Academy; a series of Unreal Engine crash courses for VFX-related tasks, including environment creation, sequencing and effects; and a special Jim Henson’s Creature Shop Demo of the Henson Digital Puppetry Studio plus a Q&A on Henson Real-Time Animation at The Jim Henson Company’s historic Charlie Chaplin Sound Stage. The Section’s holiday party casino night was wildly popular. And Section members and guests benefit from an ongoing slate of exclusive film screenings and Q&A panels, thanks to longstanding support from Sony Imageworks, Disney and Netflix.
“Continuing with our programming, our members are enthusiastic about being a part of the VES Awards nomination process, and our annual judging panels at FotoKem draw members from out of town to converge and participate in this prestigious VES event,” Nelson said. “We also take great pride in celebrating our global VFX community at SIGGRAPH through our hosting parties that bring our community together, and I want to give thanks to last year’s partners for our Los Angeles event in the LA Live entertainment district, sponsored by Netflix, Eyeline and Scanline.”
Both Nelson and Patten-Koeckert cited the exceptional work of the Section’s hands-on volunteer leadership. “Heather Baker, Social Chair, and Mark Spatny, Education Chair, put a huge amount of effort into organizing the events and the attention to detail that makes them so successful. We also want to give special thanks to our current Treasurer, Sarah McGrail, without whom none of our events would be possible, and our Secretary Ihimu Ukpo, who keeps us all organized.”
“The VES in Los Angeles is unique, not just because we are the largest Section, but because we live in the hub of film and TV,” Nelson said. “We take our roles seriously and have an eye towards trying new things that can be rolled out to other Sections and contributing to our global expansion. From our early launch of a Discord channel to connect our members, to spotlighting upcoming talent, which is now a feature in VFX Voice, we want to continually demonstrate a group that is proactive, caring and highly attentive to the changing market and the changing needs of our members.”
LEFT AND RIGHT: VES LA members gather for a live demo of Fuzzy Door Tech’s ViewScreen suite of ProViz tools at Gnomon.
MIDDLE LEFT: VES LA members enjoy a demo of the Henson Digital Puppetry Studio at The Jim Henson Company’s historic Charlie Chaplin Sound Stage.
MIDDLE RIGHT: VES LA hosts a Pub & Trivia Night for current and prospective members at 33 Taps.
BOTTOM LEFT: VES LA hosted the SIGGRAPH 2023 gala party in the LA Live entertainment district.
Said Patten-Koeckert, “Once I joined the Section Board, I felt a real responsibility to contribute, to do something important and provide value to our members, whether that is amplifying the benefits of overall VES membership or coming up with fun and enriching local and regional opportunities. We are all in the same boat, and providing support to our members is paramount. We take on these volunteer roles because we care, we understand the industry and we want to give something back – it’s a very rewarding position to be in.”
“I have a very personal point of view,” shared Nelson. “As a Brit working in Los Angeles, I could not imagine this opportunity when I was just starting out. Because of my involvement with the VES, I am continuing to expand my global network. I continue to draw from and learn from people who work in every aspect of the industry. It is an absolute honor, and I am still giddy with excitement finding myself working in visual effects and with the VES, because we are where creation meets technology, and you never know what will happen next.”
New York VES Celebrates VFX Excellence
By NAOMI GOLDMAN
On June 27th, the VES New York Section hosted its 10th Annual Awards Celebration. Held at the 230 Fifth rooftop bar in Manhattan, the VES Awards “after-after party” celebrated the finest achievements in visual effects artistry – from New York’s VES Award winners and nominees to local VFX heroes.
Vico Sharabani, Director, VFX Supervisor and Founder of The-Artery, received the VES NY Empire Award, which recognizes individuals who have made significant and impactful contributions to visual effects artistry in film, animation, television, commercials, video games and special venues throughout the greater New York area. He was honored for his groundbreaking contributions to the advertising and film industry over the past 26 years. Sharabani’s impressive visual effects work spans dozens of films with renowned directors such as Wes Anderson, Steven Soderbergh and Noah Baumbach. His portfolio also includes high-profile commercials and experiential work for brands like Mercedes-Benz, Nike and AT&T, along with iconic music videos for artists including Beyoncé, Coldplay, Nicki Minaj, Kanye West and Bob Dylan. As one of the world’s foremost Autodesk Flame® artists, Sharabani has mentored hundreds of emerging talents throughout his celebrated career in New York City.
Dr. Russ Andersson, Creator and Developer of Boris FX SynthEyes, was honored with the VES NY Empire Technology Award, which honors outstanding achievement in visual effects technological innovation. Andersson is a distinguished computer scientist and expert in real-time computer vision and robotics. In 2003, he began developing SynthEyes, an innovative 3D camera-tracking software application that has made its mark on Hollywood blockbusters including Black Panther, The Guardians of the Galaxy series, The Curious Case of Benjamin Button, Pan’s Labyrinth, and the renowned TV shows Lost, Stranger Things and Game of Thrones. In 2023, Andersson and SynthEyes joined VFX software company Boris FX.
During the event, VES New York Section members whose work in recent years has been recognized at the prestigious VES Awards were also celebrated for their creative excellence. “As we celebrated the 10th Annual VES New York Awards, we were honored to bestow the VES NY Empire Award and VES NY Empire Tech Award to two prominent individuals whose work over the decades continues to elevate the art of visual effects storytelling and production,” said Jose Marin, VES New York Section Board Chairman. “Sharabani’s creative vision and leadership consistently push the boundaries of what is possible in visual effects filmed entertainment, while Dr. Andersson’s pioneering software development of SynthEyes has revolutionized 3D camera tracking, enabling countless successful film and television productions.”
The VES New York Section Board of Managers celebrates at the Manhattan rooftop tribute to VFX artistry.
Dr. Russ Andersson, Creator and Developer of Boris FX SynthEyes, was honored with the VES NY Empire Technology Award, flanked by VES New York Section Co-Chairs Jose Marin, left, and Aryeh Reisner, right.
Sponsors of the festive event included: Platinum & Gold Sponsors –Dell Technologies, SideFX Software, School of Visual Arts; Product & Prize Donations – ActionVFX, Boris FX, Blackmagic Design, Foundry, SideFX Software, JangaFX, Louper, aescripts + aeplugins, Time in Pixels; Media & Education Partners – Stash Media Inc., befores & afters, Animation World Network, Computer Graphics World, Post Magazine, VFXWorld, garius.media, Picture Shop and The Offline Sessions.
VES Celebrates SIGGRAPH 2024
The VES celebrated its global community with VES members from around the world at a high-energy VES SIGGRAPH mixer on July 30th in downtown Denver, Colorado, co-hosted with SideFX.
SIGGRAPH is the premier conference and exhibition on computer graphics and interactive techniques worldwide. The annual conference has ushered in new breakthroughs, bolstered a community of perpetual dreamers, and mapped a future where exponential innovation isn’t just possible – it’s inevitable.
VES members from around the globe and their guests gathered at the festive mixer during the conference to meet fellow members and colleagues who are at the forefront of visual effects, computer graphics, digital art, animation, new realities, artificial intelligence, research and more. The hugely popular event, free to VES members, gave SIGGRAPH attendees the opportunity to enjoy one of the many benefits that the VES offers its global membership.
The Explosive Power of SFX
Today’s visual effects and special effects artists are pushing the boundaries of cinematic magic like never before. While advancements in CGI have transformed the industry, the timeless techniques of SFX – dating back to the 1930s – remain integral to creating breathtaking visuals. Techniques like miniatures, prosthetics, forced perspective, stop-motion and pyrotechnics continue to captivate audiences, often blending seamlessly with CGI for unforgettable scenes. The recent Mad Max series is a testament to this spellbinding combination of VFX and SFX. In this issue, we shine a light on several unsung SFX masters who bring these incredible effects to life. One standout example of SFX is the extraordinary scene from the 2015 James Bond film Spectre, orchestrated by the legendary Chris Corbould. This scene set a Guinness World Record for the largest film stunt explosion in cinematic history. (Some filmmakers, however, like Michael Bay, think their explosions are bigger.) Filmed in Morocco, the detonation of the villain’s headquarters involved an astounding 8,140 liters of kerosene and 24 charges, each containing a kilogram of high explosives.
In an interview on the official 007 website, Corbould, who has earned four VES Awards, shared insights into this monumental feat: “The explosion in Morocco was interesting in that I didn’t just want to do a huge explosion. I wanted it to have shape and progression as it spread through all the connecting buildings over an area the size of 10 football fields. Conveniently, we were filming in the desert in Morocco without a pane of glass or habited building for miles around, both of which are normal causes for concern when doing an explosion of this size. It also gave me the opportunity to use the fairly new technology of programmable detonators, where each detonator is programmed to a thousandth of a second.”
Chris Corbould’s meticulous planning and innovative use of technology exemplify the enduring power and artistry of SFX in modern filmmaking.
Image from Spectre courtesy of Columbia Pictures/Sony.