



Welcome to the Winter 2023 issue of VFX Voice!
Thank you for your continued and enthusiastic support of VFX Voice. We’re proud to keep shining a light on outstanding visual effects artistry and innovation worldwide and the creative talent who never cease to inspire us all.
In this issue, our cover story goes inside fantastical director Guillermo del Toro’s stop-motion musical tale Pinocchio. We preview the films anticipated to compete for VFX Oscar gold. We delve into VFX trends in continuing education for VFX artists, spotlight a global roundtable on the future of VFX and animation, and look at how VFX companies are forging into the future of the Metaverse.
We deliver up close and personal profiles on acclaimed director Peter Weir and veteran VFX producer Diana Giorgiutti, and sit down with legendary creature master Phil Tippett. We profile the Vancouver VES Section and share practical know-how from The VES Handbook. And we go back to the world of Pandora for Avatar: The Way of Water with visionary director James Cameron.
VFX Voice is proud to be the definitive authority on all things VFX.
Cheers!
Lisa Cooke, Chair, VES Board of Directors
P.S. Please continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.


FEATURES
FILM: THE VFX OSCAR
Previewing the top prospects for this year’s prize.
EDUCATION: CONTINUING TO LEARN
A look at several upskilling paths for VFX artists.
FILM: AVATAR: THE WAY OF WATER
For James Cameron, visual effects are his craft.
VFX TRENDS: GLOBAL ROUNDTABLE
Industry leaders weigh the future of VFX and animation.
COVER: GUILLERMO DEL TORO’S PINOCCHIO
Director del Toro finds new meaning in an old classic.
PROFILE: DIANA GIORGIUTTI
Veteran VFX producer shares the keys to her success.
PROFILE: PETER WEIR
A retrospective of the director’s brilliant film career.
LEGENDS: PHIL TIPPETT
The creature master’s 30-year journey to make Mad God.
VFX TRENDS: THE METAVERSE
Shaping the VFX industry’s role in an evolving Metaverse.
ON THE COVER: Pinocchio is made of wood, but very human, in Guillermo del Toro’s Pinocchio. (Image courtesy of Netflix)


WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.
Follow us on social media
VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR
Nancy Ward
CONTRIBUTING WRITERS
Naomi Goldman, Trevor Hogg,
Jim McCullaugh, Chris McGowan
ADVISORY COMMITTEE
David Bloom, Andrew Bly, Rob Bredow,
Mike Chambers, VES, Lisa Cooke,
Neil Corbould, VES, Irena Cronin,
Paul Debevec, VES, Debbie Denise, Karen Dufilho,
Paul Franklin, David Johnson, VES, Jim Morris, VES,
Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz,
Eric Roth
VISUAL EFFECTS SOCIETY
Nancy Ward, Interim Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Lisa Cooke, Chair
Emma Clifton Perry, 1st Vice Chair
Susan O’Neal, 2nd Vice Chair
Rita Cahill, Secretary Laurie Blavin, Treasurer
DIRECTORS
Jan Adamczyk, Neishaw Ali, Nicolas Casanova
Mike Chambers, VES, Bob Coleman, Dayne Cowan, Michael Fink, VES
Gavin Graham, Dennis Hoffman, Thomas Knop
Kim Lavery, VES, Brooke Lyndon-Stanford
Josselin Mahot, Arnon Manor, Andres Martinez
Tim McGovern, Karen Murphy
Janet Muswell Hamilton, VES, Maggie Oh
Jim Rygiel, Richard Winn Taylor II, VES
David Valentin, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES
ALTERNATES
Colin Campbell, Himanshu Gandhi, Johnny Han, Adam Howard, Robin Prybil, Dane Smith, Philipp Wolf
Visual Effects Society
5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861
vesglobal.org
VES STAFF
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Shannon Cassidy, Global Coordinator
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
Tom Atkin, Founder
Allen Battino, VES Logo Design
VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address changes to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2023 The Visual Effects Society. Printed in the U.S.A.

MEET THE 2023 VFX OSCAR CONTENDERS IN A YEAR OF VARIETY AND VARIABLES
By TREVOR HOGG

There are strange variables in play in the 2023 Oscar race for Best Visual Effects. For one, filmmaker Taika Waititi and actress Tessa Thompson did a scene breakdown video for Vanity Fair and made fun of some visual effects work in Thor: Love and Thunder, without mention of the groundbreaking camera and lighting techniques utilized by Marvel Studios Visual Effects Supervisor Jake Morrison to create six separate lighting passes simultaneously for the Moon of Shame sequence without interrupting principal photography. In an interesting twist, the comments sparked an Internet frenzy about the unreasonable demands and deadlines that digital artists have to contend with on a daily basis, with the backlash possibly having a ripple effect on other MCU contenders Doctor Strange in the Multiverse of Madness and Black Panther: Wakanda Forever.
TOP: Visual Effects Supervisor Ryan Tudhope honored and preserved the messiness of the practical aerial photography for Top Gun: Maverick, which in turn made the 2,400 visual effects shots seamless. (Image courtesy of Paramount Pictures)
OPPOSITE TOP TO BOTTOM: The major technical innovations for Avatar: The Way of Water have been facial capture, the ability to do performance capture underwater and the recreation of realistic CG water. (Image courtesy of 20th Century Studios)
Every single shot required digital augmentation, from rig cleanup to water simulations to digital skies, while retaining the handcrafted stop-motion aesthetic of Guillermo del Toro’s Pinocchio. (Image courtesy of Netflix)
Stunning CG environments will be the hallmark of Black Panther: Wakanda Forever, with the futuristic African homeland being the centerpiece. (Image courtesy of Marvel Studios)
Then, there is the matter of Top Gun: Maverick, where the filmmakers and studio are marketing how everything was done practically. However, to achieve the desired cinematic scope, over 2,000 visual effects shots were seamlessly integrated into the remarkable aerial plate photography, such as the opening Blackstar scene; that in itself should make the blockbuster, which received critical acclaim and earned $1.37 billion worldwide as of mid-August, a favorite to win. Curiously, though, the visual effects team led by Visual Effects Supervisor Ryan Tudhope has been grounded from promoting the film, and there is unlikely to be any campaign support from the VFX team to add fuel to the nomination fire. Nevertheless, there is a strong possibility that no perceived lack of VFX team publicity can stop Top Gun: Maverick from topping the field, as demonstrated by Dunkirk’s Oscar win in 2018.

Avatar: The Way of Water is certainly getting studio support, with the original Avatar (2009) being re-released to theaters to remind audiences of the highest-grossing film of all time. It is never wise to bet against director James Cameron, who knows how to push and invent technology to enhance his storytelling, in this case facial capture. With Cameron out to build on the spectacle of Avatar, the visual effects for The Way of Water are sure to be amazing. Some theatergoers will be stunned by the visuals, but repeat viewings will likely depend on the story development, which Cameron did not rush, as he understands the multi-film endeavor is pointless without a solid narrative foundation.

As for those filmmakers who showed ingenuity and a unique perspective on how to incorporate visual effects, two in particular stand out. To begin, director Jordan Peele is having a major impact on redefining the horror genre and making it a mirror that reflects the beauty and ugliness of society. Nope addresses the issue of spectacle and elevates the UFOs of B-movies into an aerial creature wreaking havoc on the world below. Massive wind machines were needed, so a helicopter was brought in to generate practical dust for the shots. Day-for-night photography consisted of a 3D rig that synced a color film camera with an infrared digital camera under the guidance of innovative Cinematographer Hoyte van Hoytema and Production Visual Effects Supervisor Guillaume Rocheron. As an added bonus, there is the rampaging monkey brought to life by performance capture legend Terry Notary.

Some might argue that the visual effects are not Oscar caliber, but one has to be impressed by how filmmakers Dan Kwan and Daniel Scheinert were able to depict a believable multiverse without an MCU budget for Everything Everywhere All at Once. That in itself is award-worthy. The visual effects team consisted mainly of five digital artists producing 80% of the over 500 bespoke shots, which include an acrobatic juggling hibachi chef who in reality was an actor pantomiming his actions, with the culinary tools and ingredients added later in CG. There was also a case where a character had to be removed from an entire scene. Shots like the first “verse-jumping” shot of Michelle Yeoh’s character benefited from the extra time afforded by the pandemic. Michelle Yeoh is an amazing practical effect in herself, having at one time overshadowed another martial arts icon, Jackie Chan, who actually rejected the lead role of what has become the first A24 movie to earn over $100 million worldwide.

Speaking of multiverses, director Sam Raimi brings his own sense of dimensional mayhem with Doctor Strange in the Multiverse of Madness, which stands to be the top contender for the MCU. Raimi has a distinct blend of horror and comedy, which is appropriate for a story that centers around the egotistical and sardonic Master of the Mystic Arts portrayed by Benedict Cumberbatch. An incursion occurs that sees two dimensions disintegrate, a mirror trap is sprung with shards of glass, and a magical, pristine orchard is revealed to be a twisted forest conjured out of a Brothers Grimm fairy tale. Found in the heart of darkness is an evil doppelganger, run-amok scarlet witchery and a beloved 1973 Oldsmobile Delta 88.
Black Panther: Wakanda Forever looks as impressive as the original film and carries the legacy of Chadwick Boseman. If anyone can give his late colleague a worthy send-off, it is filmmaker Ryan Coogler, who returned to direct the sequel. The world-building has made the futuristic African land of Wakanda a wonder to behold. And let us not forget all of the amazing technological toys that can be created using Vibranium, which rival anything Q has produced for English compatriot James Bond.

TOP TO BOTTOM: A crowning accomplishment for Thor: Love and Thunder was the ability to capture six different lighting passes simultaneously and not interrupt principal photography during the Moon of Shame sequence. (Image courtesy of Marvel Studios)

Sonic the Hedgehog 2 embraces its video game and cartoon heritage, demonstrating that not everything has to be photorealistic. (Image courtesy of Paramount Pictures and Sega of America)

Among the environmental work in Elvis is a visit to the Graceland estate across three decades and three different seasons. (Image courtesy of Warner Bros. Pictures)

Nope features a creature of the sky, major dust simulations, day-for-night photography and a raging monkey, all done in a photorealistic manner. (Image courtesy of Universal Pictures and Monkeypaw Productions)

If visual mayhem is what you seek, there is Moonfall, which sees gravity go sideways as the lunar neighbor is revealed to house a dwarf star and becomes the target of a nanobot AI determined to exact its vengeance on humanity. When it comes to destroying Earth, no one does it more creatively and consistently than director Roland Emmerich. Very few sets were built physically, but a shuttle cockpit was brought in from a museum to assist with the flying scenes. Inflicting massive damage with an arsenal of weaponry rather than a cosmic event are the Russo siblings, Anthony and Joe, in The Gray Man, which ups the ante for assassins with no sense of covert activities, as their missions become news headlines and entire city blocks get decimated in the effort to kill one person!
Ryan Reynolds enters the fray with The Adam Project, where he gets to encounter his younger, sarcastic self and attempts to destroy the invention of time travel, much to the chagrin of author H.G. Wells. There is cool tech involved for some blockbuster flying and badass hand-to-hand combat sequences, but the visual effects do not capture the innovation of the zanier Free Guy, which is also the product of Reynolds partnering with fellow Canadian director Shawn Levy. As for award-worthy video game adaptations, Uncharted emerges from development hell with Ruben Fleischer shepherding the big-screen adaptation. Tom Holland’s acrobatic antics as Nathan Drake would impress even his most famous character, Spider-Man, especially in the aerial daisy-chain sequence, and there is a different spin on a naval battle: while two long-lost 16th century ships are being transported through the air by heavy-duty cargo helicopters, the opposing forces swing on ropes from one seafaring vessel to another.

Battling for supremacy in the DC Universe are The Batman and Black Adam. Black Adam hopes to dodge the critical and visual effects backlash of The Scorpion King, which also starred Dwayne Johnson getting involved in Egyptian god shenanigans. No doubt the technology has greatly improved since then, but hopefully the rush to finish the visual effects on time won’t undermine quality. Johnson is such a likable person that it will be interesting to see him portray an antihero. In The Batman, the rain-soaked car chase and the flooding of the streets of Gotham, by Wētā FX and Scanline VFX, are standout environmental moments when it comes to visual effects. Watch out for The Batman punching and grappling his way to a nomination.
Sony continues to spotlight comic book villains with the doctor turned bloodsucking creature of the night in Morbius, which reveals that the cure is worse than the disease. The vampire faces are the most impressive digital work, but the toughest task was honoring physical dynamics, as the entire third act was rewritten and had to be reconstructed with bluescreen. Sonic the Hedgehog 2 was nearly derailed when the restart after the pandemic lockdown led to a digital artist talent shortage and capacity issues with different vendors around the world. Fortunately, no character redesigns were required this time around, so the focus could be on introducing new characters and environments. Jim Carrey is let loose once more as the evil Dr. Robotnik, twirlable mustache and all, abetted by a gigantic mech robot and by Idris Elba channeling the adversarial Knuckles.
The recreation of ancient Egypt mixed with superpowers leads to stunning visuals in Black Adam that could only be accomplished with the support of CG. (Image courtesy of Warner Bros. Pictures)

The high tech that goes along with time travel gets an imaginative spin in The Adam Project. (Image courtesy of Netflix)

Elba gets to literally punch a malevolent lion in Beast, which features a CG antagonist in the vein of the infamous grizzly bear attack in The Revenant. Think Jaws on safari. Icelandic filmmaker Baltasar Kormákur has gained a reputation for being able to shift between blockbusters like Everest and indie films such as The Oath; he is aware of the importance of visual effects and of using them wisely, as reflected by his ownership of the effects house RVX and his ongoing collaboration with Framestore. Other creatures unleashing havoc on the human population are the prehistoric ones brought back to life in Jurassic World Dominion, which ties back to the seminal Jurassic Park and its groundbreaking photorealistic digital effects. Production Designer Kevin Jenkins, Visual Effects Supervisor David Vickery and Creature Effects Supervisor John Nolan worked closely together to ensure that the dinosaurs were anatomically correct and that as much of the animatronics could be maintained as possible. The major innovation was finally introducing feathered dinosaurs, like the Pyroraptor, to the franchise, which required ILM to build a new feather system.

Curiously, the live-action version of Pinocchio by director Robert Zemeckis and starring Tom Hanks is going directly to Disney+, so it will not qualify for the Oscars. However, there will be a brief theatrical run before Guillermo del Toro’s Pinocchio takes up permanent residence on Netflix. It might seem a stretch to include a stop-motion animated feature on the contender list, but the feat was actually achieved by Kubo and the Two Strings. The realm of Limbo and the interior of the dogfish are two major CG environments, and there is also a minor fully-CG character, while atmospherics range from realistic mist to snow given a paper-like quality, as well as surrealistic skies, set extensions and plenty of clean-up resulting from set shifts, light flickers, dust and hair. All of this is done while maintaining a live-action approach to both the camerawork and the animation of the puppets.

Other award-worthy possibilities include the prequel Fantastic Beasts: The Secrets of Dumbledore, which casts spells of good and evil and features various magical creatures from the imagination of J.K. Rowling. Bullet Train recreated Japan and a speeding locomotive with LED screens and a soundstage in Los Angeles. Idris Elba appears as a wish-fulfilling genie in Three Thousand Years of Longing, directed by George Miller. Three decades of the life of Elvis Presley get the Baz Luhrmann treatment in Elvis, which is basically a film about a showman by a showman. The visual effects are faithful to the period while also having an element of hyper-realism when depicting Graceland in its various stages and seasons across the 1950s, 1960s and 1970s.

The winner for Best Visual Effects at the 95th Academy Awards, being held on March 12, 2023 at the Dolby Theatre in Los Angeles, is not a foregone conclusion, and that will make an interesting change from last year, when Dune was the runaway favorite and the only question was which runner-up films would get nominated.


VFX CONTINUING EDUCATION: WHERE DO VFX ARTISTS GO TO KEEP LEARNING?

By CHRIS McGOWAN

In these days of rapidly developing VFX technologies and processes, it can be hard to keep up in your field, let alone master new realms. Fortunately, there are many options for continuing education – on the job or via schools or online tutorials. “Since a majority of VFX workflows are closely allied with emerging computing technologies, VFX artists, production management and pipeline staff need to constantly keep abreast of change in workflows and tools resulting from new technologies,” says Shish Aikat, Global Head of Training for DNEG. “Depending on the depth of commitment a VFX artist has to a specific topic or workflow, the sources of learning can be anywhere from a series of tutorials on YouTube, online courses at portals such as fxphd or Gnomon Online [to] an undergraduate/graduate program in VFX at a degree/diploma-granting institution.”
Oftentimes, one need not go further than one’s job to find new training. Dijo Davis, Senior Training Manager for DNEG, comments, “Once the artists make it into DNEG, a variety of in-house training resources is available for their reference via our intranet. In addition to this, custom ‘upskilling’ training programs are organized and conducted to support employees and keep the existing and upcoming shows in line.”
Here is a look at several upskilling paths for the VFX artist.
ONLINE RESOURCES
A vast variety of visual effects courses is available on the internet. For starters, “The tutorials you can find on independent sites or YouTube are super helpful to help fill in any targeted foundational gaps an artist may have. I would say the companies and artists that provide these tutorials are champions for these applications, and they are a great way to learn focused skills,” says Matthew Cruz, Global Creative Training and Development Manager for Technicolor Creative Studios (TCS).
Aikat adds, “Thanks to a lot of open-source tools for VFX, a lot of tools and learning materials are available, and going by the views and comments on VFX tutorials on YouTube, it appears that a surging number of VFX artists are taking advantage of these courses. Game engines, GPU rendering and real-time technologies have piqued the interest of a multitude of VFX artists. Companies like Epic Games and Unity offer hundreds of courses and channels for artists to interact, and VFX artists are thronging in large numbers to these sites and channels.”
Supported by Netflix, the VES Virtual Production Resource Center offers access to free educational resources and information on the latest trends and technologies. Aimed at both current and future VFX professionals, the center is an effort of the Visual Effects Society in collaboration with the VES Technology Committee and the industry’s Virtual Production Committee (https://www.vesglobal.org/virtual-production-resources).
“When it comes to tools, there’s a tremendous amount of autodidactic training that the artists do on their own on their personal time with both paid or free master classes and a lot of online videos,” comments Sylvain Nouveau, Rodeo FX Head of FX. “VFX artists spend a lot of time sharing their knowledge with each other both during and after work hours – scouring for videos online and posting in relevant forums, checking Reddit.”

Unreal Fellowship is a free 30-day blended experience for learning Unreal Engine. (Image courtesy of Epic Games)

SOFTWARE COMPANY TUTORIALS
Some software company tutorials (for VFX, animation and related creative tools) charge a fee, but many are free. Autodesk offers thousands of free tutorials across YouTube and AREA, according to a company spokesperson. Its YouTube channels include Maya Learning Channel, 3ds Max Learning Channel, Flame Learning Channel and Arnold Renderer. And there are AREA tutorials and courses at https://area.autodesk.com.

“SideFX offers free lessons about Houdini and Houdini Engine, which plugs into applications such as Unreal, Unity, Autodesk Maya and 3ds Max,” says SideFX Senior Product Marketing Manager Robert Magee. The SideFX website (https://www.sidefx.com) is host to 2,758 lessons designed to support self-learning within the community. Magee explains, “To help navigate all of these lessons, the SideFX site has 18 curated ‘Learning Path’ pages, which highlight the best lessons to explore for a variety of topics.” The 450 lessons created and published by SideFX are all free. The other 2,300+ tutorials are “submitted by the community.”
The Unity Learn platform offers “over 750 hours of content, both free live and for-fee on-demand learning content, for all levels of experience,” according to the company (https://learn.unity.com/). There are hundreds of Red Giant tutorials on its YouTube channel (https://www.youtube.com/c/Redgiant) and free Maxon webinars at https://www.youtube.com/c/MaxonTrainingTeam. Foundry’s site, learn.foundry.com, has Nuke, Modo, Flix and Katana tutorials, among others.
For VFX and video-editing tutorial options, Adobe offers over 100 free tutorials for After Effects and Premiere Pro users (https://creativecloud.adobe.com/learn/app/after-effects and https://creativecloud.adobe.com/learn/app/premiere-pro). These tutorials range in skill level from beginner to experienced and cover an array of topics, including how to get started, animate text, and add special and immersive effects to a video, among others (also see: https://creativecloud.adobe.com). And Adobe offers training and certification programs, at additional cost, for those looking to further their learning and skill sets (https://learning.adobe.com).
The Unreal Fellowship is a free 30-day blended learning experience for learning Unreal Engine. Julie Lottering, Director of Unreal Engine Education at Epic Games, comments, “The Unreal Fellowship started as a pandemic relief strategy when many artists in film were furloughed. Epic wanted to support the creative community by giving them a chance to expand their skills with real-time tools for free. We transitioned the Fellowship into a permanent program because it was beloved by the industry.”

TOP: TCS and its studios have a trainer for each department who is specialized in the tools of that department and pipeline. (Image courtesy of Technicolor Creative Studios)
Lottering adds, “In April 2022 we welcomed select partners, whom we call Connectors, to offer regional Fellowships so that artists can access our curriculum and training methods in their own time zones and languages.” The Fellowship program (https://www.unrealengine.com/en-US/fellowship) combines live-led instruction and labs, with mentors coaching participants on their personal projects. Lottering continues, “The majority of our programs are online so that we can include people from a wide variety of locations and backgrounds.” She notes, “Our intention is to elevate the level of creativity in games, film, animation and VFX by offering world-class educational programs.”
Also of note, disguise is set to launch a Virtual Production Accelerator Program, a comprehensive on-set learning program for all skill levels, in partnership with ROE Visual, to be held at the latter’s new LED volume in Los Angeles.



CONFERENCES
Conferences such as SIGGRAPH and FMX also help keep artists current. DNEG’s Aikat notes, “Many smaller VFX houses do not have the resources to house an R&D department, and they often lean on technologies and workflows seeded through academic research. Conferences like ACM SIGGRAPH are opportunities for VFX artists to learn about the products of such research. Many VFX studios send delegates to these technology conferences to soak in the future trends, workflows and solutions predicated by emerging technology.”
FMX, held in Stuttgart each spring, draws thousands of professionals and students. “At FMX, we curate conference tracks on the latest and greatest developments in projects and processes as well as in hardware, software and services,” says Mario Müller, FMX Project Manager. “Conferences play an important part in continuing education [about] the art, craft, business and technology of visual effects. Compared to the readily available information by vendors on the internet, a conference can provide a more concentrated, curated and neutral perspective as well as a community platform.”
The RealTime Conference, founded in 2020 by Jean-Michel Blottière, is fully virtual and free. The live gathering explores real-time technologies with live demos, panels, workshops, classes and keynote speeches (https://realtime.community/conference).
VIEW Conference (https://www.viewconference.it), another notable event, is set in Turin (Torino), Italy, and focuses on computer graphics, interactive techniques, digital cinema, 2D/3D animation, VR and AR, gaming and VFX.
VFX/ANIMATION SCHOOLS
Schools offer various levels of courses, which can be utilized by VFX artists currently working or between jobs. “VFX artists are constantly learning,” says Colin Giles, Head of School of Animation at Vancouver Film School (VFS). “As the industry changes, it is important that people who work as VFX artists have the outlets to keep learning while they’re working. This is why, in addition to our one-year intensive program, we offer online, part-time certificate courses and workshops – called VFS Connect – to help people learn part-time without having to leave their jobs.”

Students on set in the on-campus greenscreen room at Vancouver Film School. (Image courtesy of the Vancouver Film School)

Technicolor Creative Studios, through MPC and its other studios, has a training academy to educate new employees in specific VFX tools. (Image courtesy of Technicolor Creative Studios)

The Unreal Fellowship teaches about virtual production and the mechanics of using Unreal Engine for storytelling. (Image courtesy of Epic Games)
For VFX artists to keep up-to-date at work, “some of the bigger studios have quite extensive internal training programs,” while “smaller studios and freelancers depend on online resources [free and paid for] and upskilling via training providers like ourselves,” says Dr. Ian Palmer, Vice Principal at Escape Studios, part of Pearson College in London.
According to Scott Thompson, Co-founder of Think Tank Training Center (TTT) in Vancouver, “In some cases, studios do work with schools like Think Tank to increase their understanding of new ideas. As an example, Think Tank had Mari well before the studios did, so our grads often became teachers to catch them up.”
Escape Studios provides continuing education for VFX professionals “through our short courses, most of which are in-person. We do short daytime courses for those who can take a break from their day job and evening courses for those who are fitting around other commitments. Many of our evening classes are available online,” Palmer says.
VFX artists are often looking for an upgrade. Palmer explains, “Sometimes it’s just to get a fresh skill set to enhance their career. We also have people that want to change direction and need some guidance in that.”
A lot of students at New York City’s School of Visual Arts (SVA) “are working professionals,” says Adam Meyers, a producer at the school. “When you have that allotted time after work hours each week, it seems easier than fitting it in at work. Most of them are looking to get training that is more focused.” Meyers adds that “my current continuing education classes are online since COVID.” Asked if he often sees visual artists leaving the industry and going to school for an upgrade to get a “next level” job or to change their career path, Meyers responds, “Every semester. Education is about growth. Artists evolve just like the software.”
At the Savannah College of Art and Design in Georgia, students in the visual effects department take on assignments that reflect the most current working studio practices, such as virtual production. Dan Bartlett, SCAD Dean of the School of Animation and Motion, notes, “The LED volumes in Atlanta and in Savannah are industry standard, so what we’re able to do in these spaces is create learning experiences that are absolute mirrors of what they would be doing if they were working for a major studio on features or on long-form television. Students work on everything from negotiating the production design components to developing both the digital and the physical assets that go into a shoot, to working in-engine – in our case Unreal Engine – to build the virtual cameras and the virtual lighting setups in order to bring those on-set shoots to life.”
At some VFX studios, like Framestore, visual effects artists do extensive in-house training and also benefit from some specialized classes. “Framestore has a real ‘melting pot’ of learning styles and preferences,” Simon Devereux, Framestore Director, Global Talent Development, says. “Historically, their studios have always offered their employees everything from life drawing and clay sculpting masterclasses to software mastery and behavioral skills training.”


Framestore recently hired a new Global Head of Content & Curriculum, Chris Williams, “who now leads the development of technical and software training, building new learning pathways, and he will ultimately develop the teaching capabilities of all our employees in order to support our global network of offices,” Devereux says. “That, along with a dedicated Unreal and 3D trainer and two production trainers, means we’re in a unique position to build on what is already an incredible investment in the personal and professional growth of our colleagues. In addition to this, we invest in a range of technical tutorial-based training platforms that are accessible across all of our studios.”
Aikat adds, “Constant in-house upskilling with customized training programs and keeping a close eye on the new trends in the market is the key to success.” DNEG also supplements its training curriculum with talks from experts on a range of topics that cover technical tools, product technologies and creative pursuits.
Rodeo FX helps broaden its employees’ horizons with evening classes (paid for by Rodeo) at official partner schools and a few technical colleges. But the greatest learning may come from the other artists. “Fifty percent of what we learn comes from others – whether it is new talent arriving from other companies or Rodeo employees sharing their knowledge with other studios,” Sylvain Nouveau, Rodeo FX Head of FX, comments. She adds, “In many ways, it’s all of these exchanges that keep the industry moving forward. With an average of about two years in a studio, there’s a lot of information flowing.” Marie-Denise Prud’homme, Rodeo FX Manager of Learning and Development, remarks, “Formally, artists share information with each other [tutorials, videos how-tos] through an information-sharing platform we use called Confluence. But shared learning can even be as simple as arranging meetings and calls on the fly.”



For on-the-job training, TCS has had a training Academy for years “for college leavers to be trained on photorealism and specific VFX tools and pipeline for features,” Cruz comments. Recently it started a new initiative. “We have a trainer for each department who are specialists in the tools of that department and pipeline. For example, for the FX department we have a dedicated FX trainer to help the department on the floor, so they would be an expert in Houdini, for example, and FX sims.” (TCS’s network of studios includes the Mill and MPC.)
Ron Edwards, TCS Global Head of Commercial Development – L&D, notes, “We encourage our artists to master cutting-edge technology and tools so they can produce content at the highest caliber. The company will also sponsor these efforts and provide employees the chance to learn from accredited institutes and alongside their peers to further their careers.” Edwards adds, “We always want to stay on top of the latest tech to future-proof ourselves and our students.”
Escape’s Palmer comments, “It’s always an exciting field to work in. Just when you think you’ve seen it all, something comes along to amaze you. The industry is full of very bright people with a thirst for knowledge, so while it’s challenging to keep up, that’s what makes it so interesting. Long may it continue!”



CREATING WAVES WITH AVATAR: THE WAY OF WATER

After recreating a famous nautical disaster that became the highest-grossing film of all time, filmmaker James Cameron turned his focus towards the skies and imagined what galactic colonialism would look like if humans discovered other habitable planets and moons. Avatar went on to unseat Titanic at the box office and is now being expanded into a franchise consisting of four planned sequels. The first of these, Avatar: The Way of Water, takes place 14 years after the original movie, in which wheelchair-bound mercenary Jake Sully (Sam Worthington) left behind his crippled body for a genetically engineered human/Na’vi hybrid body to live with the indigenous Neytiri (Zoe Saldana) on the lunar setting of Pandora.
“Visual effects allow us to put up on the screen compelling and emotive characters that could not be created with makeup and prosthetics, and cannot be as engaging with robotics, and it allows us to present a world that doesn’t exist in a photoreal way as if that world really exists,” Producer Jon Landau remarks. “Those two things, combined with the story that we have, creates a compelling cinematic experience.”
TOP: Images courtesy of 20th Century Studios
OPPOSITE TOP TO BOTTOM: Fourteen years have gone by since we last saw Neytiri and Jake Sully, who have gone on to have a family that includes Kiri, Neteyam, Lo’ak and Tuk.
Even when the scripts were being developed, James Cameron had his team of concept artists at Lightstorm Entertainment visualizing settings, such as a bio reef from a high angle.
From left: Sigourney Weaver as Dr. Grace Augustine, James Cameron and Joel David Moore as Norm Spellman rehearse a scene in one of the many practical sets. (Photo: Mark Fellman)
The biggest technical advancement has been in facial capture. “On the first movie, we recorded facial performance with a single standard-definition head rig,” Landau details. “This time around we’re using two high-definition head rigs. We’re capturing quadruple, or more, the type of data to drive the performance. Wētā FX has a smart learning algorithm that trains on what the actors do after we put them through a FACS session.”
A template is constructed before principal photography commences. “We probably have 160 people here in Los Angeles that build these files, that are a slightly cruder representation of the movie but describe exactly what it is we want to achieve,” Production Visual Effects Supervisor Richard Baneham explains.
“We acquire our performances, go through an editorial process selecting the preferred performances, and put them into what we call camera loads, which are moments in time as if they are happening. A scene might be made up of 10 ‘camera loads,’ or just one depending on the consistency of the performances that we need to deal with and the intended cutting pattern. It is a true representation of our intended lighting, environments and effects to the point when we’re done with it, Jim often asks, ‘Does it match the template?’”
Returning as the main vendor to the franchise is Wētā FX, which handled the vast majority of the 3,200 visual effects shots, with ILM contributing 36. The performance capture took into account that a number of scenes take place above and below water. “When actors are performing underwater, almost always you can use the body capture, which is helpful for getting that sense of how characters would be behaving in water,” Joe Letteri, Senior Visual Effects Supervisor for Wētā FX, states. “The facial gets more complicated because the face rig that we can use above ground won’t hold up below ground. We used a lighter system underwater, like a single GoPro, rather than a pair of high-resolution stereo cameras that we would normally use. If there was a particular emotional beat where we were on the characters and Jim needed their performance, as soon as he got the takes done in the water, he would have them come up, put on the normal head rig and repeat the performance for their facial capture above the water. Then we would stitch the two together later.”


Adopted into the Sully family is the human character of Spider, portrayed by Jack Champion. “We had Jack onstage for performance capture and captured everything with him in scale, which meant building multiple scale sets and getting the interaction from character to character in a nonuniform scale, which is a hard thing to do,” Baneham remarks. “Hopefully that culminates in a singular performance that has an integrated representation of Spider at the right scale. Then, Jim is able to take the cameras that he or I did and repeat it on set with the live-action crew. Jack as Spider live-action is often heavily informed by what he did in the capture session, but there is lots of room to maneuver or make choices on set that are different from what we did. Exterior forces impinging on a character is actually how you trick an audience into believing that something is really integrated when in fact it’s a full CG representation of Quaritch and a live-action representation of Spider. Pantomime is your enemy when it comes to integration.”

Each sequel will introduce new cultures, with the coastal-dwelling Metkayina making an appearance in Avatar: The Way of Water. “The Metkayina skin is slightly greener and has stripes that have more of an aquatic feel, while the forest Na’vi are bluer and have a stripe pattern based on tiger stripes,” Letteri explains. “The Metkayina have a stronger dorsal ventral coloring, where they are lighter in the front and darker in the back, more similar with what you would see with aquatic creatures. And they are evolved for swimming, so they have tails that have a wide end to them that can be used to help propel them through the water. The Metkayina have what we call stripes on their arms and legs that are like thicker skin, almost like flanges that can be used for propulsion. There were a number of design changes to make them distinctly adaptive to their environment, and that is part of the story. When Jake and his clan go to visit, they have to adapt, despite not being physically built for the environment.”

Marine life has been incorporated into the world-building of Pandora. “You figured that something like a fern would have probably evolved on another planet,” Letteri notes. “There are a lot of plants that we spread out to give you that familiarity, and interspersed are what we call the exotic plants, which are the Pandora-only plants. That got set up during the daylight scenes, and then for the nighttime scenes we turned on the bioluminescence, which was on both plants [ferns and exotic], which gave it that distinction. That was definitely inspired by bioluminescence underwater, and bringing it above ground and having it illuminate this whole forest. Underwater we took a similar approach by having different types of coral-like fan or tree corals mixed with new ones that give it more of that Pandora feel, and adding bioluminescence.”

TOP TO BOTTOM: Everything is based on reality. Even the stripe pattern on the Na’vi was inspired by tiger stripes.

Returning as an antagonist is the character of Miles Quaritch, played by Stephen Lang.

From left: Ronal, Tonowari and the Metkayina clan have a skin tone that is greener than the bluer forest-dwelling Na’vi.
A sentient creature called Payakan has a pivotal role. “Don’t call him a whale!” Baneham laughs. “He’s a Tulkun. What we do is try to never allow design to be for design’s sake. The kinematic structures, environmental aspects of how and where they live, what they would eat and how they would hunt, all inform the design, which is the outward expression. What is bringing them to life is the motion side of things. The motion design is understanding how something locomotes, emotes and expresses itself on a physical level. We may need to make changes to where the fins are and how big they are. Then you ask, ‘What would make sense for this creature and character?’ You can start to manipulate the design into its final end stage. I always try to ground everything in terrestrial reference, and Jim is the same. Even though we know that it’s not real, our job is to make the audience walk away feeling that place and those characters can exist [on Pandora].”
On set was Wētā FX Senior Visual Effects Supervisor Eric Saindon, who had a close working relationship with Cinematographer Russell Carpenter. “In The Hobbit, there were a few shots in the troll sequence where we put a background plate behind a bluescreen and did a quick Ncam situation so you could see what was going on,” Saindon remarks. “In this film, every live-action shot was captured with Simulcam using depth-based compositing within the live-action camera. When you are looking through the lens, our live-action and CG elements are composited together at the correct depth and placed in the proper space. We know exactly where Spider is walking in and amongst five or six Na’vi. You know where the background falls and where the eyelines of the live-action characters could be. It’s not even using the bluescreen.”
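The depth-based compositing Saindon describes can be pictured as a per-pixel nearest-element merge. The sketch below is purely illustrative (NumPy, not Wētā FX’s actual tooling): given color and depth for the live-action plate and the CG element, each pixel keeps whichever element is closer to the camera, which is how a CG Na’vi can pass both in front of and behind a live-action actor in the same frame.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth merge: the element nearer the camera wins.

    live_rgb / cg_rgb: (H, W, 3) float color arrays
    live_depth / cg_depth: (H, W) float distances from camera
    """
    # Mask of pixels where the live-action element is in front
    live_in_front = live_depth < cg_depth
    # Broadcast the mask across the color channels and select per pixel
    return np.where(live_in_front[..., None], live_rgb, cg_rgb)

# Tiny 1x2-pixel example: on the left the live plate is nearer,
# on the right the CG element is nearer.
live = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])  # red live plate
cg = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]])    # blue CG element
live_z = np.array([[1.0, 5.0]])
cg_z = np.array([[2.0, 3.0]])
out = depth_composite(live, live_z, cg, cg_z)
# left pixel keeps the live red; right pixel shows the CG blue
```

A production system would of course add soft edges, transparency and motion blur handling, but the core occlusion decision is this depth comparison.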

Carpenter found his third collaboration with Cameron different from lensing True Lies and Titanic. “On a regular film, the director of photography is working with the production designer from the word ‘go,’” Carpenter observes. “Here, I come in and had to make sure that the language is already in the virtual world, because I was bringing live lighting to that. Towards the beginning of the film, there were definitely scenes populated by humans that had hardly anything virtual about them, but as the story branches out into the Na’vi world, then you’re into this world of compositing. That’s the point where it gets painstaking. Especially painstaking are the jungles, because what kind of light is coming down through the canopy? We don’t have a canopy onstage, but we do have branches that we can put that are fairly close. And if you’re running through a jungle, your actor is hit by this kind of light, and he goes 15 feet and it’s a different kind of light, and then there is something else. Not only does the light have to happen at the right time, but it also has to be the proper color temperature and right quality.”
Action takes place out at sea. “That big Picador we actually built with two HamiltonJet engines in it,” Saindon reveals. “The boats were put on the same gimbal that we use for flying, but at a much bigger scale. We did simulations of those boats in the water so we got the proper motion of them going over the waves, how the bow goes up over the waves, and the way they jump the waves and move through.” Water interaction was achieved by utilizing spray cannons. “We could have faked it, because we had to add water anyway for the bow of the boat,” Saindon notes, “but what we wanted was Mick Scoresby [played by Brendan Cowell] up on the bow being hit by a wave, clearing his mask, being absolutely drenched, and knowing how to act if he got hit by a wave. We hit him with four spray cannons that just about knocked him over, because it was a lot more water than he was expecting! It puts you into the shot.”
Coordinating everything was a massive enterprise. “What was difficult for me, though it was satisfying watching what it eventually became, was not having that payoff at the end of the day when you’ve done a live-action film and go, ‘That was such a great scene. I loved watching the actor do that,’” Carpenter states. “The payoff comes much later when you’ve seen the whole thing done by Wētā FX.” The important aspect is making sure that the technology does not overshadow the storytelling. “That’s the thing,” Carpenter emphasizes. “This goes back to Jim. The miracle is that there was five and a half years of blood, sweat and tears that went into this, and there is a ton of technology, and when I look at the scenes, I don’t see any of the technology. It’s just an immersive experience. I see Jim as a North Pole explorer. He tries some challenges that he doesn’t know how to do, but is betting that he can do it by the time the film is finished.”
“Visual effects allow us to put up on the screen compelling and emotive characters that could not be created with makeup and prosthetics, and cannot be as engaging with robotics, and it allows us to present a world that doesn’t exist in a photoreal way as if that world really exists. Those two things combined with the story that we have creates a compelling cinematic experience.”
—Jon Landau, Producer

James Cameron Conducts a Deeper Exploration of Pandora




Unlike the marker-based facial capture used by Robert Zemeckis for Beowulf and The Polar Express, James Cameron decided to go the image-based route for Avatar. “It was an intense learning curve,” Cameron admits. “When we finished Avatar, what I requested was for the studio to continue to pay everybody for a couple of months so that we would have time to do a full and complete download and debrief. We did a three-day retreat during which I asked everybody to bring a white paper and notes of their three or four years of work and R&D so we could figure out how to do it better this time. Better in two ways: the end result on the screen looking better and a process that was more straightforward, efficient, intuitive and user friendly for artists.”
Along with merging their pipelines, Lightstorm Entertainment (a production company founded by Cameron and producer Lawrence Kasanoff) focused on the performance capture in water while Wētā FX looked after the CG water. “Wētā FX was responsible for developing all of the tools necessary for the computational fluid dynamic simulations that were necessary to simulate water and figuring out exactly the layers of simulation technology that would be required for that,” Cameron remarks. “In the early stages, my in-house team had to develop the methodology for capturing in the water. Our air volume used infrared. However, infrared doesn’t propagate underwater. We wanted something from a nonvisible spectrum so that our reference camera lighting didn’t interfere with the performance-capture camera system. We tested ultraviolet and that turned out to work quite well. Then we had to create the code to knit the two volumes together in real-time so that we would solve for the entire figure.”

Water tanks were built specifically for the underwater performance capture. (Photo: Mark Fellman)

For James Cameron, visual effects are part of the image-making process.



Ocean conditions had to be recreated to get the proper performance capture interactions. “We created an underwater wind tunnel so we could have actors riding on their creatures and acting and interacting with each other where we could shoot the reference camera so that we could see their facial and body performances clearly,” Cameron states. “Capture data is useless if you don’t have great reference, because that reference footage is what we cut with as editors and is how we ground truth the dataset. It’s like a triangulation process. In the end, we used that quite extensively in the animation phase.” The technology enhances rather than hinders performances. “In live-action, an actor has to maintain a performance across multiple times, whereas with performance capture they only have to do it once. When we do additional takes, those takes are exploratory, they’re not about matching. When people see this film, they’ll be captivated by the world-building, the visuals, and all of the things that we couldn’t do with the camera. But what they’ll be astonished by is the sense of connection to the characters.”
Helping to drive the storytelling is a group of in-house concept artists at Lightstorm Entertainment. “On the sequels, I started by working for six months generating 800 pages of notes on the characters and the general ideas of the story,” Cameron explains. “I gave those notes to my co-writers and was already in the design process. It was a parallel processing between the story-building and the world-building. I knew that they were going to need creatures to ride, so I came up with two creatures – an Ilu and a Skimwing. I said, ‘Start playing with these ideas.’ Meanwhile, I’ve started to write characters jumping on an Ilu and doing this and that. It emerged that the Skimwing would be more of a warrior’s mount and the Ilu is more of a local horse-like creature, not in appearance but in behavior. We have this idea from Hitchcock on down that auteurs already have the movie running in their head and it’s just a process of communicating it to everybody else. It’s not like that at all. You have this fuzzy picture. That’s why I work so well with all of these artists, because they know that there’s something there and that their input will be the final thing that people see.”
“As I was coming up as a director, there were the regular sequences and then there were the visual effects sequences,” Cameron observes. “An Avatar movie works differently. The Way of Water is three hours long, and there is not one second of that which is not a visual effect. It more plays by the rules of an animated film, except the end result looks like photography. We used to call it special effects; however, when they’re not special anymore, what do you call them? To me, they’re not visual effects anymore, but the image-making process. My live-action cast of Jack Champion, Brendan Cowell and Edie Falco worked in the performance-capture volume first, roughing in the scene so that later they wouldn’t be disoriented when shooting in front of a bluescreen and a partial set. When you ask me, ‘What is my attitude towards visual effects now?’ It’s my craft. Do I fantasize about just going out and grabbing a camera because I love to handhold it and shoot a live-action film down and dirty? Yeah, I love that. But the truth is that I get to do that within the greater story of an Avatar production.”
“When people see this film, they’ll be captivated by the world-building, the visuals, and all of the things that we couldn’t do with the camera. But what they’ll be astonished by is the sense of connection to the characters.”
—James Cameron

TOP TO BOTTOM: James Cameron describes visual effects as being the fabric of his craft when it comes to making Avatar movies. (Photo: Mark Fellman)

Edie Falco joins the Avatar franchise as General Ardmore. (Photo: Mark Fellman)

2023 GLOBAL PERSPECTIVE: THOUGHT LEADERS ON THE VFX AND ANIMATION SUPERHIGHWAY
By JIM McCULLAUGH

What does the road ahead look like for the VFX and animation industries? The subjects that are most top of mind for those at the nexus of VFX and animation include: real-time, virtual production, LED volumes, AI, machine learning, AR, the Cloud, hybrid working, tangible effects of the pandemic, global expansion and the search for talent. With the world canvas now a bullet train of VFX-infused movies, via streaming and related platforms, a global cross-section of industry leaders meets in this VFX Voice virtual roundtable to discuss the outlook for the new year.
Paul Salvini, Global Chief Technology Officer, DNEG
When I think of global trends happening in the VFX community, the one that excites me most at DNEG is how real-time technology is establishing itself within our film pipeline. With the determination to provide our artists with the best possible content creation tools, DNEG’s UX (User Experience) and R&D teams have been working closely with our artists to find innovative and better ways of working by leveraging the power and immediate feedback of real-time technologies.
TOP: Iwájú, a comics-style animated series set in a futuristic Lagos, Nigeria, is the first production between Disney Animation and an outside studio, African animation company Kugali. CG animation is provided by Cinesite. (Image courtesy of Disney+)
Over the last year, we completed several successful projects using our new hybrid real-time pipeline: an animated short film (“Mr. Spam Gets a New Hat”), final pixel environments for a major feature film (The Matrix Resurrections), and digital backgrounds for various virtual production projects. Thanks to the immediate feedback that real-time technologies provide, artists have more time to iterate and explore creative possibilities. The results speak for themselves. The quality of real-time output today is impressive.


TOP: Mission: Impossible – Dead Reckoning Part One is the seventh entry in the series. Visual effects and animation are supplied by ILM London.
BOTTOM: Shazam! Fury of the Gods follows in the wake of 2019’s Shazam! SFX vendors include DNEG, OPSIS, RISE Visual Effects Studios, Scanline VFX and Weta Digital. (Image courtesy of Warner Bros. Pictures)
Johnny Fisk, President, FuseFX
This new renaissance of entertainment touches every one of us. We’ve seen our industry explode in all directions as the use of VFX continues to proliferate throughout all aspects of the market. High-end content is now being produced for all platforms in media. With the growing work, we, too, are stepping into the next generation of VFX. We’ve only just begun scratching the surface of how emerging tools and techniques can be utilized to tell bigger and more engaging stories moving forward. As artists, we’re forging new territory, such as utilizing real-time software and deep learning technology in our imagery and workflows. Bringing innovation to the table is bringing a renewed energy to all of our work. I don’t think there’s a more exciting time to be working in VFX than right now.
Christopher Edwards, Founder & CEO, The Third Floor
Regarding location-based entertainment, experiential entertainment and the real promise of the Metaverse... At The Third Floor we have always been storytellers at heart. We love helping visionaries take audiences on visceral, emotional journeys, but this isn’t strictly limited to linear media. Modern audiences are increasingly obsessed with quality interactive and immersive experiences that can take their sense of engagement to the next level.
For over 18 years, The Third Floor team has crafted cinematic moments for AAA video games and world-class theme parks, including Super Nintendo World at Universal Studios Japan. The pandemic sequestered so many people for so long that there has been a societal shift towards appreciating communal events. Whether this is a physical gathering, such as a live concert or a trip to an amusement park, or a virtual gathering in an MMO (Massively Multiplayer Online) game, there is nothing quite as satisfying as a shared experience with friends and family. So, content creators are beginning to adapt and expand their IPs to formats that complement traditional media formats and encourage social engagement and viral marketing.
Tram Le-Jones, Vice President of Solutions, ftrack

The pandemic has encouraged a lot of new thinking, which is very exciting for us as an industry. With the pivot to working from home, we’ve realized that making changes wasn’t as hard as we thought. The pandemic disrupted our norms, made us realize what’s really important in our lives, and forced us to do things personally and professionally we hadn’t done before. We’re making new connections that have opened us to adjacent and nonadjacent industries. It isn’t an entirely new concept, but the pandemic has accelerated it. Not only are we learning from others, but also they are learning from us. We’re working more collaboratively and finding that we all have much more in common than we originally thought. We’re going to see a lot more from this intersection.
Kim Libreri, Chief Technology Officer, Epic Games
Real-time technology will continue to have significant impacts on filmmaking and entertainment. As we’ve seen with the explosion of virtual production and in-camera visual effects in particular, VFX crews are becoming more and more of an integral part of the primary on-set filmmaking process. VFX artists are now joining the ranks of cinematographers, production designers, costume designers and other roles that shape production from its earliest stages. As the VFX process becomes more immediate and tactile to key creatives, artists are collaborating and iterating more with other departments. This dynamic reduces miscommunication and repetition, as creative decisions can be made interactively while in production.

“The most recent example of a game-changer for us was the way our artists leveraged our latest AI face-swapping tools as an element to create the youthful Skywalker in The Book of Boba Fett. Combining the best of our digital facial animation techniques with the latest in machine learning really lets us achieve a combination of likeness and detail that wouldn’t have been possible just a few years ago.”

—Rob Bredow, Senior Vice President & Chief Creative Officer, ILM

Furthermore, emerging tools and workflows are starting to make transmedia production a reality. With Unreal Engine, for example, you only need to create your content once, and then you can easily deploy it across film, games, immersive experiences and other forms of art and multimedia. Real-time production is making it easier than ever to completely rethink how and where your IP can be consumed. As filmmakers become more comfortable with this new reality, adapting game content for film, and vice versa, will become the norm.
Michael Ford, Chief Technology Officer, Sony Pictures Imageworks
I’m incredibly excited about the continued industry adoption and participation in open source software initiatives. With the leadership and structure provided by the Academy Software Foundation (ASWF), the VFX and animation industry is making great strides to work together as a community to build software, libraries and processes that benefit us all. At Imageworks, we like to say that open source is the “engine of innovation” that allows us to leverage not just our talents, but also the talents of an entire industry. Open source also allows us to reach a more diverse group of people that might otherwise not have the opportunity to work in our industry, and we need this more than ever in order to build and strengthen our global workforce.
With the expanding use of game engines and faster compute via GPUs and distributed CPU rendering, the industry is moving towards a real-time future where creative decision-making is being made at a much higher rate. ICVFX (in-camera visual effects), animation and VFX workflows are all being influenced by these enabling technologies. I think the next few years are really going to change the way we think about computer graphics, especially when we look to the future of generating new and innovative looks via AI and machine learning.

In the last few years, remote work has proved to be very efficient for many studios in the VFX industry, including Hybride. It opened the door to new possibilities and access to a broader and more diverse talent pool. For instance, since 2020, Hybride has managed to significantly grow its workforce with team members working from all over Quebec. I am interested to see how the VFX industry will continue to adapt to this new reality.
In 2023, we will start to see a democratization of the virtual production pipeline. The technical complexity and extensive pre-production are becoming easier to manage while industry expertise and know-how are increasing immensely. This will open new creative opportunities for a wider spectrum of productions. AI is increasingly part of our processes, but I think it will also lead to impactful technological evolutions.

TOP: Mario (voiced by Chris Pratt) travels through an underground labyrinth in The Super Mario Bros. Movie. Illumination Studios Paris provide character animation and computer graphics. (Image courtesy of Universal Pictures)

BOTTOM: The Little Mermaid goes live-action. VFX/SFX vendors include Framestore, ILM, DNEG, Rodeo FX, MPC, Lifecast and Clear Angle Studios. (Image courtesy of Walt Disney Studios)
“Things that would have taken months of cooking in the animation process can now take seconds. We can switch style sheets instantly, and a machine can re-code entire projects instantly as opposed to hours of hand, frame-by-frame labor. AI is an amazing way to start a conversation or to drive inspiration.”
—Andrew Melchior, Executive Vice President/ Global Director of Brand Experience, The Mill

BOTTOM: John Wick: Chapter 4 is the fourth installment of the one-man-takes-on-the-entire-underworld series. Primary VFX vendor is The Yard VFX, with contributions from One of Us and Mavericks VFX. (Image courtesy of Lionsgate)

Frank Montero, Managing Director/President, ROE Visual US, Inc.
The exploration of virtual production borrows from the sentiment ‘With great risk often comes great reward.’ While advancements are continuously taking place, a certain amount of ambiguity is naturally associated with this budding technology. In truth, the concept of virtual production in film is far from revolutionary; however, the ways in which it has expanded to include digital artwork and LED displays are. Today, VP techniques facilitate production in a myriad of departments throughout studios worldwide. Most prominently, the use of LED displays for backgrounds on set simplifies the work for VFX teams in post-production while increasing real-time engagement for the cast and crew. The dynamic nature of the digital content ensures production can move forward in the desired direction while on-set modifications to the LED canvas can take place at any point in the process.
Adrianna ‘AJ’ Cohen, Senior Vice President/Global Head of Production, Mikros Animation
The technology in animation is improving at an incredible rate. Creating uniquely beautiful animation is becoming easier and more accessible. This allows previously unattainable ideas to come to life (e.g., mocap, animation/live-action hybrids, 2D and 3D content, etc.), and studios can take more chances on ideas they couldn’t previously afford to.
In addition to the advancement in technology, the movement to streaming providers has created a demand never seen before. Everything is changing. The need for resources will drive opportunities for every artist across the globe, which is an excellent opportunity for Mikros Animation and everyone working in the field. The challenge as a studio, however, will be to figure out how to attract and train talent, and make them part of our family.
Wayne Brinton, Business Development Director, Rodeo FX
The consumers’ high standards are not just a question of visuals. It’s about getting the same emotional experience you had when you consumed the content on your screen the first time – no matter the platform or the format. The sheer amount of content consumption in the past few decades has created expectations of fidelity in visual effects. When they/we don’t get that in an experience (ads, movies or even Snapchat filters), the experience becomes less than what it “should have been.” Like trying to redo the dragons from Game of Thrones in a Snapchat filter. Of course, users are going to be disappointed. Consumers are expecting a very high standard in terms of imagery and visuals, yes, but that’s not really what the expectation is rooted in – they expect great storytelling.
Dennis Kleyn, NVX, Founder/CEO/VFX Creative Director, Planet X
Virtual production is gaining noticeable traction in the Netherlands as well as in this part of Europe in general. Our Dutch film industry is relatively small and not on the most progressive/innovative side, so it feels a bit like most producers have just caught up with considering ‘traditional VFX’ as a creative department within the filmmaking process (rather than a problem-fixing one), and now an even newer technique is on the doorstep. Planet X has been involved in the founding of the first VP/ICVFX studio in the Netherlands: ReadySet Studios.

“The benefits of virtual production are driving the growth, with the number of LED volumes likely to double in the next three to four years. While the need for post-production will shrink, digital pre-production work will increase in order to optimize the on-set shoot. Novel uses for LED volumes beyond virtual production will also help drive the growth of LED volumes.”
—Kim Davidson, President & CEO, SideFX Software

Rob Bredow, Senior Vice President & Chief Creative Officer, ILM
We’re very fortunate at ILM to get to work with world-class innovative filmmakers and showrunners who push us to invent new techniques on nearly every new show. Just yesterday, I was on set on one of our shows with a talented director of photography who was inventing new workflows on that day, and seeing our StageCraft team respond with just the right artistic and technical solutions just in time to shoot. It was inspiring.
AI and machine learning techniques are transforming the way software is written – and the way our artists interact with our tools. The most recent example of a game-changer for us was the way our artists leveraged our latest AI face-swapping tools as an element to create the youthful Skywalker in The Book of Boba Fett. Combining the best of our digital facial animation techniques with the latest in machine learning really lets us achieve a combination of likeness and detail that wouldn’t have been possible just a few years ago.
Andrew Melchior, Executive Vice President/Global Director of Brand Experience, The Mill
AI is such a sophisticated and wild beast. There is always the question of whether the thing you’ve created will stay within the black boxes, and there is a lot of concern about that. One thing is for sure: AI certainly creates a stir and an interest.
Regarding AR, VR and AI and their relationship to the evolution of VFX/animation, from a visual effects point of view, we have been having lots of conversations. Things that would have taken months of cooking in the animation process can now take seconds. We can switch style sheets instantly, and a machine can re-code entire projects instantly as opposed to hours of hand, frame-by-frame labor. AI is an amazing way to start a conversation or to drive inspiration.
We can address and scale down huge 3D geometries and make them real-time assets that can run on local devices. With NeRF [Neural Radiance Fields] – instead of taking very detailed geometry and point clouds, you can now take 2D photos, and machine learning can take them and build 3D models on the fly. This means you can easily create characters with true-to-life shadows and textures that would have taken ages before. The price of entry used to be the restricting factor, but now we can access these technologies on the browser. It has completely democratized the process, which will change the game and open accessibility to everyone. However, there will always be a market for hand-made content. It is still obvious when humans make something versus a machine. When we automate everything, it does run the risk of all looking the same or similar. Hand-made content will still be considered ‘magical’ and ‘special’ because of its uniqueness and visceral qualities. It will be a smaller industry, but it will always exist.
Kim Davidson, President and CEO, SideFX Software
Emerging and evolving technologies, such as AI/ML or AR/VR, will not replace current solutions. Rather, they will continue to complement and improve current approaches. In digital modeling, for example, VR and ML are nice complements to procedural and interactive modeling techniques. By combining these technologies, modeling in the future will be more versatile, interactive and intuitive. At SideFX, we look to incorporate ML into future releases of Houdini as a complement to procedural workflows in modeling, environment generation, layout, animation, character effects and lighting.

“Due to the increased demand for animated series, traditional animation will have to investigate leaner techniques, such as bypassing the 2D process altogether and jumping straight into 3D, using game engine technology to be able to scale creative output.”
—Mariana Acuña Acosta, Senior Vice President, Global Virtual Production, Technicolor
TOP: Dungeons & Dragons: Honor Among Thieves deploys the latest VFX to honor the franchise. ILM, MPC, Clear Angle Studios and Legacy Effects contribute special effects. (Image courtesy of Hasbro and Paramount Studios) BOTTOM: Trek across post-pandemic America with Joel and Ellie in the HBO Max TV series The Last of Us. Primary VFX/SFX vendors are DNEG and RISE Visual Effects Studios. (Image courtesy of HBO Max)

Virtual production is rapidly changing current production pipelines. The benefits of virtual production are driving the growth, with the number of LED volumes likely to double in the next three to four years. While the need for post-production will shrink, digital pre-production work will increase in order to optimize the on-set shoot. Novel uses for LED volumes beyond virtual production will also help drive the growth of LED volumes. Who needs the “Metaverse” when you can spend time with friends inside a giant LED half-dome? The shortage of talent, particularly technical, is the biggest issue our industry is currently facing, and it is unlikely to subside over the next three to five years.
Danny Turner, Executive Producer, Yannix Co., Ltd.
The industry has recovered in a big way and, in turn, Yannix enjoyed unprecedented demand for our services in 2021-22, particularly our Character Rotomation (RotoAnim) service for which we’ve seen exponential increases in client demand. There are no signs of things slowing down any time soon.
Throughout the global health crisis, Yannix remained open for business without any interruption by implementing a “seven-days-a-week” work strategy. By splitting our teams into two separate shifts, Yannix complied with social-distancing guidelines and, most importantly, we kept our people safe and healthy. Through it all, we never had a need to implement a “work from home” strategy. As the industry and consumers adjusted to the “new normal,” we focused on keeping the lines of communication open with our clients and strived to remain ready.
Mathieu Raynault, Founder & CEO, Raynault VFX
With the need for VFX productions at an all-time high, the landscape of the cinema and television industries is changing at an unprecedented pace. VFX companies have to reinvent and streamline their pipelines and technologies to account for both remote work and labor shortages. All this movement creates engaging challenges, thrilling opportunities and, without a doubt, uncertainty. We believe that keeping the human aspect of our business at the center of the VFX conversation is key to surfing this wave in the future. Raynault’s bet is to create a model where artists have greater ownership over their work, shots and assets. Our team thrives on overcoming their most complex tasks while maintaining a healthy work environment and VFX/life balance. This concept may sound cliché, but it’s actually at the core of our philosophy now and for the many years to come.
Steve Read, Head of Studio and Executive Producer, Versatile Media Company Ltd.
Creative drives technology, and it all begins with a great story. The purpose of virtual production is to align both under one clear vision: to see results in real-time, in-camera and at the hands of key creatives.
Lensing shots on LED volumes allows directors and DPs to obtain immediate results and keep full control of both the practical and digital elements. This level of collaboration naturally brings the principal photography and VFX components together under the control of one creative drive. It promotes a transparent workflow in real-time to capture the best results possible. Our industry has evolved. Audiences today have a tremendous appetite for more quality content. The distribution and platforms are also quickly evolving.

“Bringing innovation to the table is bringing a renewed energy to all of our work, and I don’t think there’s a more exciting time to be working in VFX than right now.”
—Johnny Fisk, President, FuseFX
TOP: A toy company robotics engineer builds a life-like, almost-living doll in M3GAN. Concept design and specialty props are provided by Wētā Workshop. (Image courtesy of Universal Pictures) BOTTOM: The excesses of early Hollywood are on full display in Babylon. ILM handles VFX and animation. (Image courtesy of Paramount Pictures)
Patrick Davenport, President, Ghost VFX

Talent remains our key priority. There’s always a shortage of artists, but this is exacerbated in the current climate. So, the focus has to be on employee retention, not just recruiting, but paying fairly (including overtime) and providing a supportive work environment and culture. The industry is still in a dizzying state of flux, with so many companies for sale, being acquired or merged. We would like to get on with the work and enjoy a stable environment, especially with everything going on in the world.
Nearly three years since the start of the pandemic, it feels like the shift to full-time WFH (Work from Home) or hybrid has become permanently embedded in our industry, which means studios have to optimize the employees’ remote work experience through technology and enhanced people support to maintain creative collaboration and productivity.
Mariana Acuña Acosta, Senior Vice President, Global Virtual Production, Technicolor
Traditional VFX pipelines will continue to evolve, moving to the Cloud, enabled by machine learning for automated data wrangling. Automation will continue to be a key piece of the VFX puzzle as it relates to performance transfers, keying, rotoscoping, de-noising, data clean-up, etc. Due to the increased demand for animated series, traditional animation will have to investigate leaner techniques, such as bypassing the 2D process altogether and jumping straight into 3D, using game engine technology to be able to scale creative output.
Given the challenges of on-set virtual production, standardization is key, which is why there’s a movement towards SMPTE 2110 (this is the equivalent of when physical tapes moved to digital files for content storage). Virtual production will continue to grow in other areas than just film and episodic. VP will see wider adoption in animation and advertising.
Hitesh Shah, Founder and CEO, BOT VFX
VFX embraces the gig economy – more seriously this time. Three years ago, most facilities (with rare exceptions) could not conceive of artists doing their work from remote locations because of the technical constraints of the infrastructure itself, let alone other factors. Then the pandemic forced a reluctant embrace of PCoIP (PC over IP) technology out of sheer desperate necessity. Soon, this tool of necessity became the transformative tool for facilities to free themselves of geographic constraints in hiring artist talent. Pipelines and infrastructure built in one city could now leverage artist talent in far-off places without much incremental cost, setup time or process changes. Concurrent with this new enabling technology were two other factors: the surge in VFX service demand that exceeded the readily available industry capacity, and the wider social movement normalizing work-from-home. Facilities have embraced these changes by changing their operating model. They have begun recruiting remote talent to augment their base office teams.

“The pressure to deliver large amounts of high-quality shots around tight deadlines [for high-end episodic productions] has set recruitment teams on fire – with no signs of slowing down anytime soon.”
—Gaurav Gupta, Managing CEO, FutureWorks Media Limited
Christophe Casenave, Head of Category Management and Sales Cinema Products, Carl Zeiss AG
Film productions in general, and VFX productions in particular, are striving for efficiency, driven by the high demand for streaming content. The well-established VFX workflows are being updated with new technologies like virtual production, which allow for the final production of pixels on set and offer new levels of flexibility. One of the major challenges these teams are encountering is the matching of the look of the CGI with the look of the physical lens, especially when the glass shows very strong characteristics, as seen with popular vintage lenses. Matching the look is mainly achieved with a lot of manual work in post-production and relies on guesswork to reproduce lens characteristics, which makes it difficult to scale and to use with real-time tools. Optimizing virtual production processes and making the images produced even more cinematic will be the major challenge in the near future.
Markus Manninen, Managing Director, Goodbye Kansas Studios
At Goodbye Kansas Studios, we see a continued high demand from clients to get visual effects work done during 2023. In particular, the episodic segment is continuing the trend of setting higher expectations of visual complexity and quality, with the effect being that clients are reaching out earlier to secure resources that are able to accomplish complex shots and scenes. We expect to see more strategic relationships between vendors and clients as a result.
The open-sourcing of core tools and capabilities will become a much more integral part of next-generation tools, workflows and processes in 2023. Virtual production is clearly here to stay, even as on-location work continues to grow during 2023.

TOP: The Quantum Realm and its strange creatures challenge Ant-Man in Ant-Man and the Wasp: Quantumania. SFX vendors are Digital Domain, ILM, Method Studios and Sony Pictures Imageworks. (Image courtesy of Marvel Studios and Walt Disney Studios)
David Patton, CEO, Jellyfish Pictures
As we navigate the expanding VFX landscape, 2023 will see us continue to build new global collaborative workflows – not only to break down the geographical barriers when it comes to sourcing talent, but also to allow us to build more speed, scale, agility and sustainability in delivering new projects.
With this in mind, Jellyfish Pictures has been harnessing the power of Cloud technology, working closely with providers such as Microsoft Azure, Hammerspace and HP to optimize our internal pipelines and boost productivity. Using modern Cloud-based solutions empowers our artists to create the same standard of work as they would within a studio environment, no matter where they are in the world. In addition, the rapid rise of virtual production has also taken the industry by storm.
Gaurav Gupta, CEO, FutureWorks Media Limited
This year, we’ll be back to ‘business as usual,’ though the world we live in is not the same. The deep transformation our lives and our industry have been through will continue to permeate throughout 2023.
Thanks to the new ways in which consumers access content, there’s never been so much demand for our services. I’m talking not only about traditional feature films, but also high-end episodic productions. The pressure to deliver large amounts of high-quality shots around tight deadlines has set recruitment teams on fire –with no signs of slowing down anytime soon.

STOP-MOTION BREATHES NEW LIFE INTO GUILLERMO DEL TORO’S PINOCCHIO

When Italian writer Carlo Collodi published the novel The Adventures of Pinocchio in 1883 about a wooden marionette who desires to become a real boy, the movie industry did not exist. The story has taken on a life of its own on the big screen, most famously in the 1940 Disney animated classic, but that did not deter filmmaker Guillermo del Toro (The Shape of Water) from putting forth his own version of the fairy tale, utilizing stop-motion animation and partnering with co-director Mark Gustafson (The PJs) and Netflix. “The biggest challenge is that it took almost two decades to get this made,” del Toro notes. “It was a completely new approach to the material that made the pilgrimage through the world of financing and logistics difficult.”
In Guillermo del Toro’s Pinocchio, the title character does not transform into flesh and blood. “That was clear to me from the start,” del Toro explains. “In a way, it’s about subverting and finding new meaning on the themes of Pinocchio. One is that disobedience is actually the beginning of thought and conscience, which goes against the idea that a good boy is a boy who obeys. The second one is the idea that you don’t have to transform yourself to be loved. You can be loved for exactly who you are.”
Images courtesy of Netflix.
TOP: Animator Peggy Arel repositions the Geppetto puppet on the doctor’s office set.
OPPOSITE TOP TO BOTTOM: A color concept of Death and the realm of Limbo, practical Death sculpture, and Pinocchio in Limbo where he goes upon dying and is subsequently revived by Death.
Character traits influenced the design of the puppets. “There are practical considerations because they do physically exist,” Gustafson notes. “Some of those limitations can play to your advantage. Pinocchio is a perfect character to do in stop-motion because he is a puppet. There is something simpler about Pinocchio that makes the audience lean in and connect with him. His face isn’t all over the place. We wanted him to be made of wood, and that brings a power once you figure out this language.”
Handling the production design duties were Guy Davis and
Curt Enderle. “One of my favorite characters I got to design with Guillermo was Death,” Davis reveals. “Death was fun because she was mythic like the Wood Sprite, and we designed both of them as sisters. Death and the Wood Sprite are mythology come to life. They’re their own thing. We originally thought of Death being portrayed as a Chimera mythical beast, and then she was more sphinx-like as far as the body, not as the culture. We started with the idea of the Greek mask for her face. It gave us a lot of freedom to come up with our own mythology. The Wood Sprite went through a couple of changes, too. Guillermo had a definite idea in mind as far as angels with the eyes on the wings. Death went through a lot of iterations getting her to where she was ready to be a puppet, and the same with the Wood Sprite. Even Pinocchio, as we first had him, was based on the original Gris Grimly design, but then there are other things, like Black Rabbits, that clicked from the first design pass and carried over from 2012 with not any changes to the concept art.”

Surrealism prevailed with the skies. “We went through the show and asked, ‘How many unique skies do we need?’” Art Director Robert DeSue remarks. “There is a journey montage, night and day requirements, and considerations for mood to help reinforce happier times versus somber times. It ended up being in the neighborhood of 38, maybe 40 unique skies. We decided to do a keyframe painting for each one of those skies, and we made these sheets: ‘Here is the storyboard and the shot this applies to,’ so we had the composition information. The director of photography, production designer and myself went through to decide upon the type of sky, like cloud forms and color.” Del Toro did a course correction. “He said, ‘I want you to look at the skies by Grant Wood, Georgia O’Keeffe and the Group of Seven,’” DeSue describes. “‘The Italian ones, I like the colors and atmosphere, but that style is too realistic.’ We had keyframes, two images and a painting. That helped the skies get a nice runway. The first time Guillermo would see a sky, he might make a color correction. But in terms of style, fluffy clouds should look like that cotton batting that you use in pillows because they have a level of artificiality that looks handmade.”

A memo was circulated consisting of eight rules of animation. “Mark and I discussed, ‘What are we going to do differently?’” del Toro recalls. “We said, ‘We’re going to try to give a depth to the acting style of this puppet that makes them become human.’ The goal is this: 15 to 20 minutes into the movie, you would forget that they are puppets. You’re just watching actors act and humans think and feel. We always emphasized with the animators to do micro-gestures, things that are brief and changing, because most of the animation is key poses and pantomime. It’s characters looking defiant, skeptical and happy. It’s all about emojis! Hayao Miyazaki said, ‘If you animate the ordinary, it will be extraordinary.’ It’s about stopping the plot and allowing life to enter the film.” Animators were cast. “We found that some of them were much better at doing characters or they had a real affinity for it,” Gustafson states. “That was useful to get some sort of consistency out of them, too. We tried to keep them in scenes as much as possible, as a sense of ownership is important. They can come away from this film going, ‘That sequence is mine.’ That feels really good.”


Printed faces for the puppets were something that Animation Supervisor Brian Leif Hansen wanted to avoid. “When you are working with the printed face, you’ve got a limited stack of money and a limited stack of faces, therefore your facial expressions would probably be on the stiffer side of things, with the budget that we had,” Hansen notes. “A silicone face you can move around all of the time. All of our main characters had the mechanical silicone face, but Pinocchio has a printed face, which is a stroke of genius because it keeps him in the hard world.”

Sebastian J. Cricket getting captured in glass by Pinocchio was accomplished practically. “It’s so wild,” Hansen describes. “There are seven different composite layers in it, because the paper is big and because Cricket needs to walk on it. Pinocchio draws a sun on the paper. Pinocchio’s hand size was shot in a different plate. The drawing itself was a different plate. The glass is a different plate as well. Cricket needed to be animated inside the glass. We couldn’t have the glass there. We animated Cricket first and animated the glass afterwards. Cricket is pounding on the glass. It works brilliantly. It was fun to put the shot together without [anything] other than the technology of stop-motion.”
Visual effects were mainly utilized for atmospherics. “It’s easier to do some of the smoke, skies, fire embers, and even then, we did a lot of it with physical embers, miniatures, and silk to recreate a physical haze in the forest,” del Toro states. “When it’s a landscape made of water, that’s going to be rendered faster in CG. Then the trick is to art direct it not like a real piece of water. You have to make it artificial in order to match the world.”
There were 1,374 visual effects shots created by MR. X, an in-house team, with additional support provided by BOT VFX. “The benefit of stop-motion is the fact that there is not another take,” observes Visual Effects Producer Jeffrey Schaper. “You basically have the shot and have passes for it. We all used ShotGrid as our shot-tracking software, from the stages all the way through post, editorial and effects. We would generally speak with our first AD scheduler and make sure that everything was approved. As soon as it was approved, we would pile the shots and try to turn it over. At first, it was once every month, but then it became every two weeks to every week, to try to get shots out the door as quickly as we could. The good thing is that you have a shot that is turned over to the full length, and we work with 10 frame handles and let editorial decide what they’re going to use of those handles.”


Two major visual effects environments were the interior of the dogfish and the realm of Limbo inhabited by Death. “There was a practical boat and half of a lighthouse,” Visual Effects Supervisor Aaron Weintraub reveals. “Those were the pieces of the set that the characters would interact with, and everything else was always planned to be digital inside of a dogfish. The idea was to create a feeling of this dank, wet cavern. The dogfish swallows them, and they do this water-slide trip down the throat and come out into the inside and fall down these various levels, trudge through the goo, find the ship and make their camp there.” The inside of the aquatic creature had to feel tactile. “We had little organic, fungal growths and things like that scattered throughout,” Weintraub adds. “There was a hanging mist in there too, as well as streams of standing water and weird temperature changes. Because it’s organic, like skin, when the light hits it, it had to feel like there is a subsurface scattering. It was always a question of how thick the skin is to the outside world, so there is a red glow from the sun that is coming from the outside that breathes through some of the diffused ambient light in there when the lighthouse isn’t shining directly on something.”


Originally, Limbo was not intended to be CG. “They were going to build these shelves that were filled with hundreds of hourglasses made out of laser-cut acrylic, but then COVID-19 hit and there was a massive acrylic shortage because it became sneeze guards in banks and post offices,” Weintraub explains. “We were always doing the sky dome in there, which is like planetarium-type space.” Early tests were conducted to get the look of the flowing sand. “We figured out the right speed and frame rate,” Weintraub recounts. “What they had on set was the wood frame of the hourglass, and we would do the glass insert, composite it inside, and render all of the reflections of the environment and characters moving around. We had a CG version of Pinocchio for all of the collisions and reflections.” The Death puppet had ping pong balls placed where the eyes were supposed to go. “We replaced each of those with


A greenscreen test of Geppetto’s boathouse set.

BOTTOM: Many of the sets made use of motion-control cameras, such as this scene between Volpe and Spazzatura.

animated eyeballs,” Weintraub says, “so we had to rotomate the wings so that our models would line up exactly and the eyeballs would track in, do the animation, and match the performance for pupils following [the action] and blinking at the right moment.”
One character is fully digital. “Before the Wood Sprite becomes a form, she has small eyes, and they essentially make up her wings,” reveals On-set and In-house Visual Effects Supervisor Cameron Carson. “The eyes float through a couple of scenes and interact with things. We wanted them to feel as close to our stop-motion puppets as we could. We built a couple of practical eyes out of polyurethane and did a couple of different things, which we then scanned and sent over to MR. X to try to replicate that, as well as add the ethereal trail that follows behind them. There was a lot of back and forth with that in terms of the look and how that is supposed to feel in the movie. That was probably one of our biggest ones in terms of designing because it was a little more of an unknown.” The fully-formed Wood Sprite is luminescent. “She has practical lights behind her eyes, which is helping to cause that glow,” Carson states. “In most of her shots, the Wood Sprite is captured on greenscreen, and that enabled us to separate her out and give some digital glow and atmospherics to her so it feels like she is ethereal and moving through the space.”

As many as 56 sets were shot simultaneously, though not all of them had motion control cameras. “When our camera is six inches away from the set, the smallest motion reads massively onstage,” Carson remarks. “We will shoot for the day, come back in the morning and the camera would have moved. Just the slightest atmospheric change of temperature, or the lights coming on and warming up the set, will actually swell or shrink the wood, and it creates micro-stutters in our tracks. Almost every single shot in our production had to be stabilized, and we have to worry about light flicker.” Dust is a big problem. “It’s small particles on the set; when an animator touches something on the puppet, it all chatters or moves around. You see that in Fantastic Mr. Fox, characters with fur chatter. That movement comes from animators touching the puppet and moving their finger, and the pieces don’t get back to the same spot,” Carson adds. “We’re trying to strike a balance of removing things that are distracting to the viewer while leaving as much charm or stop-motion aesthetic as possible.”

“[Co-director] Mark [Gustafson] and I discussed, ‘What are we going to do differently?’ We said, ‘We’re going to try to give a depth to the acting style of this puppet that makes them become human.’ The goal is this: 15 to 20 minutes into the movie, you would forget that they are puppets. You’re just watching actors act and humans think and feel.” —Guillermo del Toro

Being a veteran visual effects producer, Diana Giorgiutti is used to managing time zones for an industry that literally works 24/7, which means her workday begins at the ungodly hour of 4:30 a.m. while in production for the role-playing game adaptation Dungeons & Dragons: Honor Among Thieves, which stars Chris Pine, Michelle Rodriguez, Hugh Grant and Jason Wong, and is directed by John Francis Daley and Jonathan Goldstein. “That’s about as early as I can get up!” laughs Giorgiutti, who is working from her hometown of Sydney, Australia. “2 a.m. is like 9 a.m. in Los Angeles, so I’m usually coming into the day two or three hours behind them all. I’m not an early bird! Never have been. I’m a night owl.”
Art and math were the subjects that appealed to her most as a student. “Hence the art and producer combination!” Giorgiutti notes. “My parents are both Italian, and they came to Sydney in 1961 when Australia was appealing to immigrants from Europe to help build the country. A year later, I was born [followed by three sisters]. Sydney is wonderful. The industry was small here, so I knew that if I wanted to grow in visual effects and learn, I would have to go to England. The U.S. wasn’t obtainable. I have Italian citizenship, so Europe was much easier. Leaving Sydney behind was hard, and I had always planned to come back.”
VFX PRODUCER DIANA GIORGIUTTI: FROM POP VIDEOS AND TALKING ANIMALS TO BULLET TIME
By TREVOR HOGG

Star Wars was what caused Giorgiutti to become involved with the visual effects industry. “I was 15 years old in 1977 and could only talk one of my sisters into going with me [to see Star Wars]. I walked out of the cinema saying, ‘I don’t know what it is and how they do it, but that’s what I want to do!’ The next couple of years, I was investigating it through newspapers or magazines because there was no Internet.” The first job in the film and television business for the high school graduate was as a production assistant for a tiny company called [Sydney] Panorama Productions, which did a local TV show called Variety Italian Style. “There was a five-minute cooking segment that we shot and some local adverts. Being in Newcastle at a TV station watching all of the cameras, that was my love: the tech of it all.”
Images courtesy of Diana Giorgiutti.
TOP: Visual Effects Producer Diana Giorgiutti.

OPPOSITE TOP: On set shooting the Neo/Agent Smith crater scene from The Matrix Revolutions – “a joyous wet and muddy shoot,” according to Giorgiutti.
OPPOSITE BOTTOM: During The Matrix Revolutions (2003) and The Matrix Reloaded (2003), shooting with the ‘yak’ rig, so named “because the stunt guys were barely able to keep their breakfast down,” Giorgiutti recalls.
Getting hired as a tape operator at VideoLab, thanks to a colleague’s personal recommendation, was the big career break for Giorgiutti. It exposed her to Grass Valley Vision Mixers, Bosch FGS-4000, Softimage 3D and a Quantel Paintbox. However, the real boom in visual effects was happening in London. “I joined an editor friend and we bought one of the first Avids in London. Running that business was not my thing and didn’t keep me very busy, so I sent a few letters out to six of the top visual effects houses. Rushes offered me a three-month freelance gig as a visual effects producer which then became five years. I did mostly pop videos because none of the other producers liked doing them [as commercials were more profitable]. I still call London the music capital of the world. I was going to gigs all of the time. I worked on some great pop videos during my time, such as ‘Frozen’ by Madonna. We did a shot where she falls backwards and turns into hundreds of black crows.”
Artists back then were generalists doing everything from previs and lighting to compositing. “It would be one artist who would take the shot all of the way through,” Giorgiutti reflects. “The visual effects producer back then had more creative involvement, but
over time, when it became more departmentalized, you lose touch of that. Also, the projects got bigger, and it became more of money management and working with the vendor doing the work.” The paradigm has shifted back to the creative side for the blockbuster visual effects productions that are too big for a visual effects supervisor to handle alone. “It’s important for visual effects shows to have the supervisor and producer because they balance each other well. Whereas the supervisor is more creative and technical, the producer is creative and technical, too. We have to be creative with numbers at times, but it’s good for the supervisor to have someone else to go, ‘How do you think we’ll get this shot finalled by our director quicker?’ There are all of these strategies and plans we have to discuss and come up with. If it was somebody doing that on their own, it wouldn’t be as successful.”
Two significant films that Giorgiutti worked on early in her career won Oscars for Best Visual Effects: Babe and The Matrix. “[Writer/Producer] George Miller always intended to shoot with real animals and some animatronics, but also knew that he needed visual effects, so he waited for the technology. At that time, I was at VideoLab as a telecine colorist, which is now what a DI person is but much more analog, so no DaVinci Resolve. A film roll would log in, I would put it up and if I thought it looked good, I would call George and get him to have a look. Every now and then, he would ring to check if particular companies had sent something and ask me what I thought. One day, this box came from Rhythm & Hues [Studios]. I finished whatever commercial I was doing, put it up and went, ‘Ahhhhhh.’ I called George right away and he came zooming over. I was in the room when he called them to say, ‘This is mind-blowing.’ Then onward and upwards! Babe was made.”


A personal recommendation by Sally Goldberg (Computer Animation Supervisor at DFilm) led to Giorgiutti becoming involved with The Matrix, which was in a state of turmoil made worse by the visual effects producer leaving the production. “The Matrix was a whole bunch of mind-blowing things. My big job during the shooting of that film was managing the Bullet Time shots, and we had no idea what we were doing. The guys setting up the rig would go, ‘This is what we’re going to need on the day.’ I said, ‘Okay, I’ll come up with the chart.’ On the day when they took the photos, it was my job to run around to all of the 120 still cameras and record what frame they landed on, and do it quick enough because the actor would be itching to do the next take. I had all of my precious rolls of film that I had to take over to the lab to get processed. This is hundreds of thousands of dollars in the making! I like to say that I was one of the world’s first data wranglers.”
For a brief time, Giorgiutti returned to working for a visual effects company. “Luma Pictures was one of the vendors I used on every single show that I did for Marvel. The owner is a smart guy and thought having someone like myself to represent the company would help expand beyond Marvel and into other things like its own content. I was enticed over and it was great, but I was only there for a year and a half in the end because I missed the studio side. The overall management is what I love about what I do, being able to wrangle everything and everyone which you don’t do as a vendor. You just have your patch of shots, and if you’re lucky you’ll get time with the director here and there. Mostly you deal with the supervisor.” The visual effects industry has been significantly impacted by the streaming services. “The advent of Netflix has changed visual effects in another way,” Giorgiutti observes. “There is too much work and not enough people like myself with good solid experience, so a lot of green individuals are being thrown into managing these shows. I’m already talking to vendors for my potential next film, which doesn’t start shooting until May 2023.”
Key skills for a visual effects producer are mathematics, communication and counseling. “I am protective of the vendors because the studio side does not do the shots,” Giorgiutti remarks. “I’ve always felt if you look after people, they’ll look after you. If we’re going to need them at the eleventh hour to pump out the extra shots, give us a few freebies or throw in that extra effort, you make them feel like an important part of the process, which they are. Then it all fits and works nicely.” Being able to delegate responsibilities is important. “I’m not a micromanager,” she says. “I copy the coordinators on a lot of my emails so they can learn and see how things are handled. I’m always giving them lessons. If you delegate and have trust in your team, it’s wonderful. Going back to, ‘there is too much work and not enough experience,’ some of our crew on Dungeons & Dragons haven’t done much of the role before, but you get enough time in the early beginnings of post to teach, train and hopefully instill some of the better ideas and ways of doing things.”

Virtual production excels in specific situations, she says. “If you are doing The Mandalorian-type stuff, that only really works if the filmmakers are prepared to essentially post the movie to a degree before they shoot; all of the environments have to be designed and not change.” Audiences are a lot savvier about visual effects. Giorgiutti adds, “If you read any of the [film fan] blogs or Reddit things, there are all these people out there giving their own opinion on why it looks so shitty. Interestingly, most of them say it’s probably because of not having enough time. How do they know this stuff? I blame the Internet. There are so many behind-the-scenes [articles] where they reveal a lot of our stuff, so these people are learning from all of that. It starts with having and shooting a solid script. D&D and even Mulan were solid scripts, and we improved both films by doing a bit of additional photography. Each of them came in on budget. A lot of movies barely have a script, or the script is too long and they go in shooting. Unless the filmmakers plan better and have better scripts, it’s going to be more of the same. Maybe films won’t be successful because of the visuals now, especially if they’re visual effects heavy.”




One shot took 117 versions to get approved, Giorgiutti reveals. “It was all about a character being yanked into a wall, falling down, then that comical Roadrunner moment, and the dust falling a beat after he falls. We could not make them happy on the dust! I’ve got a lot of those kinds of fond memories.” After over 40 years in the business, The Matrix remains a career highlight. “Back then, you always had a family in post, because the post-production team – director, editing, sound and visual effects – was a group of 20 to 30 people. Shooting these days, your crews go up to 500 to 600 and even more if you have a lot of extras involved. It becomes too many people. On The Matrix, I got to know all of the names of the grips and electrics. A lot of the guys doubled up with their jobs because that’s how it was back then. There weren’t that many people in the industry, certainly in Australia.”
“I think I have five more years in me,” Giorgiutti reflects. “I have seen it all. I was sticky-taping two-inch videotape together, and I remember the one-inch machines coming in that were vacuum operated so they would move really fast. Any of us girls with long hair had to wear it in a ponytail because ‘whoosh’ your hair could get caught. Now, of course, it’s digital, volume and metaverse! I’ve got to look that up!” As for what avatar she would choose for herself, Giorgiutti responds, “I would be a woman on a unicorn. Nobody has ever asked me that before, so that immediately came to my head! I love horses. If only we could all have unicorns in our lives!”

LOST IN THE MOVIES OF PETER WEIR

One does not think about spectacle when watching the movies of filmmaker Peter Weir, who creates the desired mood and atmosphere by infiltrating the subconscious with subtle visual and sonic cues rather than by overloading the senses with eye candy. Even when digital and practical effects play an integral role in achieving the necessary epic scope, there is an organic quality to the image being presented on the screen. “The most work that I did with special effects or CGI was Master and Commander: The Far Side of the World, and that was quite something to make [ocean scenes] look real. Most of the oceans are composites. We were only at sea for 10 days,” notes Weir, who received an Honorary Oscar at the Governors Awards for a career consisting of 13 films and six Academy Award nominations, and for being a key member of the Australian New Wave that turned a cinematic hinterland into an internationally renowned film industry.
Shooting in the water tank built for Titanic in Rosarito, Mexico, was the preferred option for Master and Commander, which takes place during the Napoleonic Wars. “It was probably one of the most difficult films I ever worked on,” remarks Russell Boyd, who reunited with Weir after two decades and received an Oscar for Best Cinematography. “There were an awful lot of mechanical special effects in it like water explosions, all that fun. I remember about six weeks before shooting started, we all looked at each other and asked, ‘How are we going to make this picture?’ because there were so many variables. Everything was scaled up. The visual effects certainly played a part. It was a huge learning curve for all of us, but in the end, we stuck to our guns. Peter made the most genius call by commissioning these huge models that were 1/6th scale to be built at Wētā Workshop, which I believe the studio didn’t want to do. They wanted to digitally create the boats, and it honestly would have been a disaster because those models turned out to be fantastic.”
Good fortune occurred when shooting plate footage. “The Endeavour replica was sailing around Cape Horn from west to east during our pre-production time, so I managed to get a cameraman on board,” Weir remarks. “It is hard to buy 35mm or high-quality visual shots of the ocean with swells or huge waves all around. People shoot on 16mm, [camera operators did] in the old days anyway, or on inferior digital video cameras. He didn’t get the storm but did get some fabulous ocean plates with big swells. Those were valuable for Asylum VFX [the visual effects company].” The only time CG ships appeared was in the wide shots. “Peter was so precise in what he wanted,” recalls Nathan McGuinness, former Creative Director/Visual Effects Supervisor of Asylum, who received an Oscar nomination for Best Visual Effects. “He came in early in the meetings with an oil painting of that period ship in a storm. Peter goes, ‘I want the storm to look like this.’ That’s exactly what I used as my reference.”
“It was a monolithic compositing show,” McGuinness states. “We had 17 Flames running. Everyone was doing their roto and sitting there sifting through all of the ocean plates we had and blending. What we did was to create two or three master shots that were exactly what Peter wanted and that then helped us to control
the look. [Editor] Lee Smith and his team were in my building, so we were together. That took the communication [concern] of not knowing what the editors are doing completely away.” Digital double work was minimal, McGuinness observes. “We did shoot libraries of doubles so that we could stack them onto the ship, especially for the models and anything that wasn’t live. Also, we were able to take pieces from what we shot off of the live shoot with the crew on and put that on the models.” There was a limit to what could be done on set with the actors. “We mapped it all out, had the actors go everywhere, recreated that layout with the explosions going off, and drop-comped it all in,” McGuinness describes. “We would pick up the elements that were needed, like the wood, embers and the cannon fire. We had libraries of footage that the compositors could grab and add in. We had a lot of atmospherics.”
A real-life plane crash was recreated for Fearless, but Weir does not view it in the same light as his nautical adventure. “Oh, that was a different thing,” Weir explains. “That was in the earlier days before CGI. There was a company in Los Angeles [Introvision International] that specialized in making these plates for you, and they were very good. When Jeff Bridges is standing on the roof of the building, they built the actual corner of the building at the studio and then made up plates of the traffic in the background and seamlessly married them. I could see it through the camera in that case, so we were able to fold the background into the viewfinder. For the plane crash, I bought materials that had been used for a plane cabin, and we created a lot of it in the studio because it was mostly interior, so you could create the chaos and debris.”

Introvision International altered a model that had been previously used for the television movie Miracle Landing. “We shot it


TOP TO BOTTOM: Orchestrating the special effects, which included dump tanks to add to the realism.
Ocean plates and live-action footage of Russell Crowe are composited together.

Water cannons were utilized to get the proper interaction between the ocean and ship.
Some shots were captured using a gimbal and bluescreen rather than in the water tank.


Water tanks in action during principal photography of Master and Commander.


upside down so that anything breaking off would fall down to help make it look like it was blowing away,” remarks William Mesa, who was the Visual Effects Supervisor on Fearless. “All kinds of wires were connected to little parts of the plane body so when the camera started rolling, we could pull off luggage compartments or different windows to make it look as if it’s being ripped away. Then we shot many different plates from various angles with Jeff Bridges and the boy next to him.” Plates were shot outdoors. “We had a truck with a VistaVision high-speed camera mounted on the roof and crashed through a cornfield as fast as we could possibly go until it got all clogged up underneath.” Weir wanted the passengers to see that the plane was out of control. “For a lot of the shots, we used a Sabreliner jet, took the door off and mounted a VistaVision camera in it,” Mesa states. “Then we made runs back towards Bakersfield and literally took that plane upside down and then back up again. I almost got sick doing it!”
The roof of a 12-story building in San Francisco was the location for when Jeff Bridges’ character stands on a ledge. “We couldn’t get certain angles there because the set was way inland to the actual building,” Mesa explains. “Plus, the ledge was much higher than the real one. Jeff could climb up on top of that and be totally safe. The biggest concern was for me to shoot the plates hanging over the end of the building and looking down. It was a safety concern for the camera. But we worked that rig out.” A number of onlookers appeared in the surrounding windows. “Tons of women would put up their phone number saying, ‘Call me tonight,’” Mesa comments. “The production manager had to go over to the building and tell them we couldn’t do anything because all of this stuff was in the background; that ended up delaying us for a while.” The set was reconstructed inside a studio. “You go through a certain process of having to get what you call ambient light,” Mesa adds. “[DP] Allen Daviau left it up to me to light a lot of that, then he would go in and tweak the lighting on the face to be the look that he wanted it to be.”
Experimenting with visual and sonic trickery such as camera speeds and earthquake sounds dates back to the adaptation of Picnic

TOP: An intricate series of wires was connected to the miniature to make sure that the pieces came off at the right time for Fearless.

MIDDLE LEFT: A miniature plane engine was placed outside of a helicopter and shot from the perspective of a passenger for Fearless.

MIDDLE RIGHT: Visual Effects Supervisor William Mesa with Weir for the studio shoot of the high-rise sequence in Fearless when Jeff Bridges stands on the ledge, which involved Introvision plate projection.

BOTTOM: Preparing a shot of the interior of the miniature plane used for the crash sequence in Fearless.


at Hanging Rock. At the turn of the 20th century, a group of schoolgirls disappears upon entering a mysterious volcanic formation in Australia. “Not only did the film not have an ending, it was a whodunnit with no ending,” Weir observes. “I had to somehow strive to make it so that you want to live in the mystery. But to do that I had to make it dreamy and not overemphasize the investigation from the police.” [Cinematographer] Russell Boyd finds it inspiring to work with a director who is adventurous with camera angles. “Peter has always liked to experiment with using different lenses and different heights with the lenses in getting a shot, and the speed of the camera. He likes slow motion just to heighten one little mannerism or a little movement. In Picnic at Hanging Rock, when the girls were crossing the river, we shot at 32 frames, which gives it that slight motion effect.” Lens distortion was utilized to create an impression that a magnetic field might be present. “There was a bit of that,” remarks John Seale, who was the Cinematographer on the film, as well as on The Last Wave and Gallipoli. “Also, the use of the rock formations, finding a face and having it quietly sitting in the top left corner where the audience might suddenly say, ‘Did you see that?’” There are voyeuristic shots. “It’s as though the rock is watching and preying on them,” Seale says. “The simplicity of girls walking up to a rock enhanced by Peter Weir is something awesome to watch.”
Visions of an impending disaster dominate the narrative of The Last Wave, where a criminal defense lawyer (Richard Chamberlain) discovers that he has a mystical connection to his Aboriginal clients. “At the time,” Weir states, “I was influenced by reading the works of Immanuel Velikovsky, who believed the world is changed often or several times by catastrophes, which were acts with bodies from space at one time or another, like from asteroid collisions or coming close to stars that moved out of their alignment. I also wanted to talk to these Aboriginal elders, which was the most interesting part of making the film.” Seale believes that Weir always looks for that ethereal sense he can get out of a normal emotion. “The Last Wave was full of that right from the word ‘go’ because it was from the imagination of a man that comes into reality,” Seale explains. “Peter invented a tracking shot we called the ‘imperceptible move.’ The camera moves in almost subliminally so that the audience sitting in the theater would feel as though they were leaning forward, that something is going to happen. The two grips would sit down on opposite wheels of the dolly and turn them by hand so the camera was so gently moving forward. Once it got awkward for one of the grips to turn the wheel without taking his hands off of it, the other guy on the other side would take over, keeping it moving while he re-positioned his hands for the next bit.”





Gallipoli was all about honoring the memory of the Australian soldiers slaughtered during the infamous World War I battle that left a permanent scar on the nation’s psyche. “When I went, you could walk around the battlefield,” Weir recalls. “There are bullets and broken bayonets. The trenches had fallen in, but you could still see trench lines. Having gone through the experience of that day at Gallipoli on my own with no one around, that was it. I swore that I would make this picture for them as a sort of war memorial.” Special effects handled the explosions and guns. “It was a bit of Australian ‘what if?’ because when we’re in the trenches, the shell explosions were coming down the hill towards us,” Seale remembers. “There were actually giant packets of dynamite in the ground and we all got shell shocked! The trench walls were shaking and starting to collapse. We had to have earplugs in because of the compression.”
The silo death scene in Witness was achieved practically with a stunt double and a hidden oxygen tank and mask. “The farm belonged to a family called the Krantzes,” Weir states. “I asked Mr. and Mrs. Krantz, ‘What’s that?’ They said, ‘It’s a grain silo. We store it up top, open the lever and drop what we require at various times.’ Then I said something like, ‘Can you go in through this door?’ They replied, ‘Yes, but I wouldn’t want to be in there with the door closed. If any grain fell, you’d suffocate. You have to wear a mask or something because the dust is harmful to the lungs.’ I thought, ‘My god, what a weapon!’ We quickly reconfigured it all, and Harrison Ford loved it.”

McGuinness describes his Australian countryman as a legend. “Super in the moment. You spent a lot of time in awe of the experience and nature that he had. Peter was calm and astute. You felt good being around him, which was always the case. He always thought of everybody. Peter always remembered everyone’s names. Always respected everybody from top to bottom. You could see the human side of Peter as well as commanding a full production under a lot of pressure. It was a high-pressure movie with a lot of pressure coming from all sides. Peter pushed through it.” Mesa was intrigued by how Weir works. “Before we started anything in pre-production, he interviewed everybody,” Mesa observes. “There were people who did not make it on the show because he knew that they were going to be problematic to him. And what it does is make a great experience in making the movie.”

Reflecting on his attitude towards filmmaking, Weir states, “My approach had been that I wanted to do to the audience what other filmmakers had done to me, [to make them feel] I really was there. I really believed it. And so, when I walked outside, I used to joke and say, ‘I don’t know where I parked the car. I’m so lost in the movie, I can’t remember real life.’”






PHIL TIPPETT EMERGES FROM AN INFERNO TO CREATE ‘MAD GOD’
By TREVOR HOGG

While exploring the primordial manifestations of the subconscious mind that consist of tortured souls, decrepit bunkers and wretched monstrosities, stop-motion and visual effects icon Phil Tippett experienced his own mental purgatory during the 30-year journey to make Mad God. “My biggest challenge was my own mental stability,” Tippett admits. “I discovered on this movie that I was unipolar and when unchecked without medication you just go, ‘Whoosh.’ Like that. I ended up in the psych ward for a few days, and it took me a couple of months to recover. I ended up hating the project. It was just a matter of getting behind the wheel and finishing it. My friends and relatives saw me diminish and turn into this thing that was like a homeless guy. It was a religious experience and conversion. I don’t seem different, but I know that I am. It changed me to the extent that I don’t want to make anything with my hands anymore. I want to do something else with my mind.”
Dreams were the source of the narrative structure but not the actual imagery. “When I aggressively started to reboot Mad God close to 13 years ago, I dreamt prolifically and wrote down all of the dreams that I had,” Tippett explains. “I did it with an intention because, wow! It was just handed to me on a platter. I didn’t do it to mine for content. The content of the dreams had nothing to do with Mad God. What I was looking for was story structure. I suspected from all of my readings and whatnot that stories are indeed innate within mankind, and where do you think they came from in the first place? Through visions. You see things in your mind, and it’s very likely that those first visions were dreams. I pursued that angle and found that about 80% of dreams have a first act that is a statement, a second act that is confusing – it’s very much like the unconscious mind is grappling with what the first act laid out, and there is a resolution in the third act. Sometimes it’s big. Sometimes it’s like, holy shit! Sometimes the narratives are complete stories. That gave me some confidence, and what was surprising to me was that once Mad God was done, bam! The dreams disappeared.”

Providing a narrative throughline is the protagonist known as The Assassin who descends through the various levels. “At the beginning there were a number of variations [for him], and when I hit on the cyberpunk look, that stayed there,” Tippett states. “It was always the central character. The Assassin goes through a number of stages and processes; that character is our eyes throughout the story. He is our hero on this hero’s journey.” The hellish environment is essentially a junkyard of humanity. “I would read Dante and Virgil, and study the history of Hell, which is really interesting, as well as anthropology, geology and psychology, like Freud and Jung. That was during this 20-year period from the time that I made the first three minutes of Mad God and had to abandon it. There was some force that made me continue on during that 20 years where I built this thing in my mind that I could do anything with. Then I got all of these volunteers and said, ‘What the hell!?’ I got all of this equipment. I would hate to go to my grave not having used all of the resources that I have. I’m at this time in my life where I don’t have to go out on location all of the time or supervise stuff.”
Most of the volunteers were college and high school students. “You couldn’t let them go on machines,” Tippett notes. “It was like


Tippett overlooks the creation of a
Creating creatures is something that comes naturally to Tippett.



giving five-year-olds razor blades. You had to be really careful, but they were helpful doing a lot of the heavy lifting. During the week, I would figure out processes, and then on the weekends I would say, ‘You do this or do that.’ Then they would do all of that stuff which would have taken me hundreds of years to do. One set where there are these mountains of dead guys and The Assassin’s car goes through it; it took three years to build all of that stuff.” Talent did emerge from the volunteers. “Some local guys I mentored turned into really good stop-motion animators and now they have their own company; that was rewarding,” Tippett reveals. Professionals were also involved in the production. “Alex Cox plays this character called the Last Man, and he directed Repo Man, Sid & Nancy and Walker. We’re friends. Through him, I was introduced to composer Dan Wool and sound designer Richard Beggs, who goes back to Apocalypse Now and won an Academy Award. My directorial style is to involve the best people I possibly can and stay out of their way, which is what I learned from working with George Lucas, Steven Spielberg and Paul Verhoeven. That sounds biblical, doesn’t it?!”
There is no dialogue in Mad God; however, baby sounds were incorporated into the sound design. “That was all Richard Beggs,” Tippett reveals. “I had no idea that he was going to do that. I had some stuff that was on the nose. It was stupid. We did it in 10 or 15-minute chunks of time. I might in a spotting session have half a dozen things to say. ‘Wouldn’t it be cool if we did this?’ It would tend to be general. Then it was ready and we would go to the screening room and look at it. It was like I was looking at my movie for the first time. It was like, ‘holy shit! Who made this?’ For those baby sounds,

TOP LEFT: The fire bridge is one of several environments created for Mad God.

TOP RIGHT: In order to develop the narrative for Mad God, Tippett would write down his dreams and discovered that there is an innate structure to them.
MIDDLE: Volunteers working on weekends were critical in making Mad God a cinematic reality.


BOTTOM: Mad God is a combination of 35mm film stock and digital stills, with the aspect ratio being 1.85:1.

I was laughing hysterically. It was so inventive and perfectly on the nose for Mad God.” The beginning and ending were shot right at the end of production. “It’s stop-motion, so you’re editing as you go along building the narrative. It’s like making up a story. Once upon a time, there was a little girl who wanted to eat porridge and there were bears. You go down that line and you’re building it. Just like you would make up a tale. But in this case, it is more like doing a painting or musical composition where you are working it and piecing it together like a jigsaw puzzle.” Various genres are represented. “As I built the narrative, such as it was, I tried to get different epochs of things that happen on Earth. There were dinosaurs, industrial mayhem and a war scene. That’s what it was going to be. I thought of this world as the ghost of history.”
Over the three decades, the only technology that changed was the digital cameras. “The whole point was to use these handcrafted techniques and not go digital, which I don’t like,” Tippett states. “Chris Morley, the DP, and Editor Ken Rogerson are good with color; they would lead the charge, and I would keep my eye on it and go, ‘What if we did this or that?’” Shudder acquired the exclusive streaming rights for Mad God, but success was not predestined. “The first two film festivals that we applied to were in Berlin, and they rejected it. At that point, it was like, ‘Oh, shit!’ Because I had shown it to a few friends and they liked it a lot and said, ‘But it’s not for everybody, Phil.’ I thought, ‘Okay, here we go. This is the “not for everybody” part.’ Then, once we premiered it at Locarno [Film Festival in Switzerland in August 2021], it exploded beyond my wildest dreams. It’s only going to be in that indie world.”
For Tippett, the future lies with writing. “There are some prospects out there. I don’t even care. Where I’m at right now is like the same I ever was, which for the lack of a better term is essentially a hobby that I don’t know what it’s going to turn into.” Regarding the Disney+ documentary series Light & Magic, directed by Lawrence Kasdan, about the establishment and legacy of Industrial Light & Magic, Tippett remarks, “Never thought about it [as being my formative years]. Of course, they were. I was sent a screener which I binge-watched. It was very nostalgic. I really liked the way that Larry Kasdan put it together. It expressed the camaraderie of the people. It was like looking at a family album and watching your kids grow up. Generally, I don’t look back. I put things behind me.”

VFX COMPANIES LOOK TO THE METAVERSE FOR NEW VIRTUAL WORLDS TO CONQUER

Among the landmark science fiction films released in 1982 was Tron, where software engineer Kevin Flynn (Jeff Bridges) gets transported into the mainframe of a computer and becomes an avatar interacting with and battling various programs that are personified. A decade later, the virtual world premise was further developed and christened the ‘Metaverse’ in the cyberpunk novel Snow Crash by Neal Stephenson, who imagined a digital realm where users in their CG personas socialize by hanging out, shopping and attending concerts. Theory became a reality in 2017 when Epic Games created the online video game Fortnite, which had players from around the world engaging in a Battle Royale competition with each other; the scope of activities was expanded in 2019 when DJ Marshmello performed the first in-game concert.
Banking on the Metaverse being the next big social media platform, in 2021 Facebook was rebranded as Meta to reflect its new corporate mandate and strategy. So where do things stand in 2023? Is there a role for the visual effects industry to play in the development and execution of the Metaverse? As platforms expand, so does the application of various computer assets, which raises concerns about intellectual property rights and social responsibility.
Fiction inspires reality. “Without movies and entertainment, some of these things we create wouldn’t even happen because they present ideas, then people try to figure out how to make that happen,” notes James DeFina, Co-founder of Astro Project LLC. “The Metaverse is an infinite number of digital worlds that will be interconnected by portals, and it’s also going to be a combination

TOP TO BOTTOM: The Matrix Resurrections explores the social implications of humans and technology becoming entwined as would be the case with the Metaverse.

(Image courtesy of Warner Bros. Pictures and DNEG)

The Matrix franchise envisions the Metaverse as a photorealistic virtual reality prison for humans, created by computers.
(Image courtesy of Warner Bros Pictures and DNEG)
Visual effects artists will be indispensable in constructing the Metaverse since they are accustomed to building believable environments like London in the 1960s for Last Night in Soho.

(Image courtesy of Universal Pictures and DNEG)
of AR, VR and AI. The Astro Project is going into the Web 3.0 space of NFTs [Non-Fungible Tokens]. A lot of people think that NFTs are scams, but in this digital world there has to be some kind of ownership. What we’re doing is creating content and giving people ownership of that content as well as the Unreal Engine assets and teaching them how they can be used. Everybody is going to be building the Metaverse in the future, so we want to inform them how to do it. NFTs are collectible but have to do something. From that [concept] we’re going to fund our next project. People come to us to make content, but we’re also bringing a community along for the ride that is investing in us. That’s how I see it making money. At the same time, there has to be a way for the masses to see stuff for free. We can put ads inside of our content if we want, like a logo on a building. There has to be a balance of both.”
Visual effects companies are positioned to take advantage of the Metaverse. “We have hardware now that is capable of creating some absolutely incredible photorealistic or artistic, stylized, synthetic experiences like you would see in the future animation world,” notes Paul Salvini, Global Chief Technology Officer at DNEG. “What’s amazing for companies like DNEG is we have the tools and talent who are used to creating realistic worlds and digital doubles of people. Working at that quality level has been standard in the film industry for many years.” DNEG recreated 1960s Soho for Edgar Wright’s Last Night in Soho. “In order to do that,” Salvini says, “the artists LiDAR-scanned and captured photographs through photogrammetry to recreate what Soho was like back then. You get a sense of that in the movie. To be able to walk around and experience it in a way of your own choosing is incredible. I would wish to go to various places on Earth at different points in time, but it’s not possible. The closest thing that we have to being able to capture or recreate those experiences would exist virtually. Potentially, there are also beautiful worlds that could be created that are not found on Earth, which people could invent, imagine and then share with others. It opens the door to shared experiences that we don’t have available to us.”
“The Metaverse is usually described as virtual/augmented reality, sometimes as cyberspace. I like to consider it through the

TOP: Virtual reality tools being developed by ILMxLAB with experiences such as Vader Immortal will serve as a cornerstone of the Metaverse. (Image courtesy of Lucasfilm)

BOTTOM: Impossible Objects has been responsible for Diablo ad campaigns, and views the Metaverse as an opportunity to experience stories as a community in a virtual space. (Image courtesy of Impossible Objects)
lens of ‘visual effects for interaction,’” describes David Conley, Executive VFX Producer at Wētā FX. “I’m always excited for evolutions in storytelling, so I would like to see opportunities open up for new content and narrative engagement. You could argue we’re the visual ‘architects’ [cue The Matrix wall of screens] – part designers, part engineers. At Wētā FX, we’re very keen to produce content that pushes the boundaries of audience experience and the creativity of our artists. Our real-time capacity and expertise are constantly developing. Within the industry the Metaverse offers a great space for stretching creative and technological muscles.” Intellectual property rights belong to the studio, Conley says, “but we believe that we can collaborate with them to help build out their IP across multiple environments delivering into movies, theme parks and the Metaverse. We’re always looking to explore where assets and content creation can be leveraged across arenas for audiences to enjoy.” Variety will still exist even with the tools becoming more widely available. “Creativity will never stagnate. I’d argue that tooling or access shouldn’t be the measure of success – democratization of these elements means that artistry, craft, innovation, talent and ‘voice’ actually become more important and have more opportunities to be showcased,” Conley says.
For ILMxLAB, the Metaverse is viewed as connected cross-platform storytelling. “We are trying to transcend the boundaries of the physical, digital and virtual worlds,” explains Vicki Dobbs Beck, Vice President, Immersive Content, at Lucasfilm and ILMxLAB. “For us, being part of the Disney organization, we have the opportunity to leverage the entire ecosystem. You will have personas that you can travel with across these various experiences.” VR, AR, MR and XR are building blocks for different types of Metaverse experiences, according to Dobbs Beck. “I would say that virtual and augmented reality are portals into a story world,


TOP: KitBash3D produces environmental tools that are only limited by the user’s imagination and can be used to create a wide range of worlds in the Metaverse. (Image courtesy of KitBash3D)

BOTTOM: Exterior paneling on the 100-foot-wide LED volume at Vū Productions’ virtual production studio in Tampa, Florida. (Image courtesy of Vū Studio)

and they provide unique opportunities and experiences. We always try to develop to the strengths of the individual platforms. Virtual reality allows us to transport people to other worlds. AR helps us to see our world differently. Real-time enables interactivity, and things like machine learning and AI are essentially tools that would contribute to establishing persistence; this idea of an evolving world and compelling characters. Each of those have a role, but they’re somewhat different roles.” It is important to find partners with complementary goals as no company can take on the Metaverse alone, Dobbs Beck explains. “When doing Vader Immortal [A Star Wars VR Series], we were interested in immersive storytelling on a brand-new platform, and then Oculus, now Meta, was launching a new platform. We are going to have to look for those kinds of opportunities within this evolving space.”
There doesn’t appear to be a universal vision for the Metaverse. “It seems so amorphous and abstract to so many folks because the definition of it feels so subjective,” observes Joe Sill, Founder and Director at Impossible Objects. “It feels like there will be a new opportunity to experience stories as a community in a virtual space.” Sharing assets with users, like what was done with Epic Games and The Matrix Awakens, is something to be embraced rather than discouraged, Sill believes. “If one person tells a story of The Matrix, but then releases all of these assets for anybody to reiterate and evolve upon the story with their own interpretation with the exact same assets, that feels like an exciting place to bear witness to new ideas and stories. However, it does raise a lot of questions as far as how do you make sure that your work is not taken for granted.” AI has led to the emergence of art programs like Midjourney and DALL-E. “I know that concept artists have questioned whether Midjourney is going to replace the entire identity of concept art, if AI-based art programs are developing all of these

pieces and applications,” Sill comments. “It is just a tool, a new way for a concept artist to stand up an idea quickly, and then evolve upon it themselves.” Fortnite, Midjourney, TikTok and YouTube are virtual communities occurring in real-time. “What everyone is trying to craft is a singular space that umbrellas everything. It’s essentially The Matrix, but Fortnite is a good example of what that can look like.”
KitBash3D Co-founders/Co-CEOs Banks Boutté and Max Burman view the Metaverse as the 3D version of the Internet. “For the last five or six years, we’ve been talking about what is called the virtual frontier,” Boutté states. “Fundamentally, we’re heading as a society into the screens. The Internet has existed as 2D. You scroll up and down and swipe left and right. That’s Y and X. Tomorrow, we’re going to take the Z axis and go forward and backwards in the machine. When that happens, fundamentally the fabric of society will change. Whether or not that happens in a VR headset or a contact lens or a hologram coming off of a watch doesn’t necessarily matter at this stage. What matters is that we’re adding dimensionality to our life in front of screens.” The Metaverse does not represent the downfall or rise of society, according to Burman. “When you start to look at the morality of new technology, you get into dangerous waters because inherently technology is neither good nor bad,” Burman remarks. “It’s just change on a constant curve. 3D is a powerful tool. We live our normal lives in a 3D world, so 3D is more intuitive. 3D allows us to be more immersed and have more presence in a place. You can look at the difference between the social and human interaction when playing a video game with a friend versus on a social platform today like Facebook or Instagram. One is through text and likes, and the other one is, ‘We’re going to have a shared experience doing something and create memories.’”

Virtual production methodologies can assist with worldbuilding and cross-platform applications. “What we do at Vū is in-camera visual effects, and that has evolved independently of what’s happening with Web 3.0 and Metaverse,” remarks Daniel Mallek, Director of Content & Innovation at Vū Studios. “But what’s interesting is that it does use the same building blocks. We are using Unreal Engine environments so we can create an asset for a Pepsi commercial, take that same asset and use it for an experience for people to go into, and they become the characters. There are interesting creative opportunities like that not only for marketing and commercials, but also for storytelling and filmmaking. For the most recent season of Westworld, several of their scenes were shot in the volume with Unreal Engine environments. They could take those same environments and create a cool visual experience for their fans in the Metaverse. What could end up happening is, as it becomes more democratized for people to create stories and environments for the Metaverse where they can use Metahumans, facial tracking, and do motion capture right from their phone, eventually there may not be a need for a studio. In that case, maybe it turns more into a place to experience the Metaverse in a shared setting or at a concert; those types of experiences that we don’t have today, but will have a need for in the future as the Metaverse becomes more widely adopted.”



Talent in Action
Victoria Itow-Tsering
Director, Animation Films Production Management, Netflix
VES Member: 8 Years
VES benefit most enjoyed: Networking events
Born and raised in Simi Valley, California, Victoria was about six years old when she went on the King Kong ride at Universal Studios in Florida and was stunned by “the combination of mechanics, artistry and animation that created the illusion. I immediately wondered how it all worked and how people could create that type of magic. I feel like I have been chasing that feeling of awe ever since.”
Animated projects such as Rugrats (1991), Sailor Moon (1992), Pocahontas (1995) and Anastasia (1997) ignited her passion to seek out and ultimately be employed within the industry. She is most proud of her work on Back to the Outback (2021), as well as her move in 2014 to Cinesite, Montreal, where she built their animation division. There she supported many new artists early in their careers, growing their confidence and abilities.

Victoria offers encouragement to young women hoping to become part of the industry. “You have something absolutely incredible to bring to this industry and your perspective is needed,” she states. “It can be difficult at first to jump in when everyone around you seems to have so much experience, but there are many mentorship opportunities out there through organizations such as Women in Animation, Rise Up Animation and the Visual Effects Society to help you find the network you will need to support you.”
Her advice to others: “Keep an open mind about the industry because the parts that intrigue you and bring you joy might surprise you and can change over time. If you follow your excitement, it will keep the work feeling fresh and propel you through the tricky parts of production.”
Victoria says that her four-year-old daughter keeps her on her toes and provides her with a welcome change of outlooks.

Rebecca Weigold
VFX Editor – Feature Films
VES Member: 3 Years
VES benefit most enjoyed: the annual VES Awards
Rebecca was born in Camden, New Jersey, but her family moved to Los Angeles when she was very young. Her first job out of college was as a management trainee at Warner Bros. After picking up a UCLA extension catalog, she decided to take the Assistant Editor class at UCLA with the goal of moving into post-production.
“I love post-production, and VFX in particular, because of the blend of the technical and the creative that we have. I once worked as a VFX editor on a movie that was entirely in Arabic. I don’t speak Arabic.”
The film that hooked Rebecca and made her want to be in our industry is The Shawshank Redemption (1994). Over the years, Rebecca migrated to VFX Editing.
Rebecca’s first movie as apprentice editor was Cobb (1994). From apprentice editor, she worked her way up to assistant editor, to editor, then switched to VFX editing in 2014 on 22 Jump Street.
“I get along well with all types of people, I work hard. I always have a good attitude no matter what the circumstances or number of hours. I’m engaged in my job. I have 30+ years of experience in the industry (although my VFX work is more recent), so I bring a lot of knowledge to any job that I do, whether it’s managing my department or organizing my database or anticipating needs before they happen.”
Her advice to other up-and-comers is, “Be adaptable, work hard, watch and learn from those around you, and have a good attitude.”
When not working she likes to run, hike, roller blade, water ski, kayak, draw, spend time with friends and family, travel, go to the beach, and cut independent features.
The projects she is most proud of are Crazy Rich Asians (2018), Star Trek: Strange New Worlds (2022) and Jungle Cruise (2021). Her most recent film was The Woman King (2022).

Color Management
By JOSEPH GOLDSTONE
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Visual Effects – 3rd Edition

Edited by Jeffrey A. Okun, VES and Susan Zwerman, VES
Any visual effects professional whose personal contribution is judged on color should read the three guidelines below. Anyone building a workflow from scratch or integrating existing workflows should read (or at least skim) this entire section.
The Three Guidelines
1. Make critical color decisions in the screening room, not in an artist’s cubicle.
2. At the desktop, use an open-source or commercial color management system if possible, but do not expect too much.
3. Understand and document the color image encodings produced by a production’s digital motion picture (DMP) cameras, renderers, film scanners and the encodings consumed by displays, film recorders and digital projectors that deliver production output to clients.
Turning to camera raw formats for digital still and motion picture cameras, Figure 6.10 shows that those color image encodings are quite different from that of Adobe RGB.
Like Adobe RGB, the camera raw colors represent physically measurable colors. Unlike Adobe RGB, there is no mandated breakdown into particular image channels; some raw formats used by single-sensor cameras, for example, provide for two green channels if the camera sensor contains twice the number of green-filtered pixels as it does red-filtered and blue-filtered ones. The encoded colors leaving the camera are similarly unconstrained, but typically the encoding provides for three or four channels and the bit depth (not necessarily the same for all channels) ranges between 10 and 16 bits.
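The two-green arrangement described above can be illustrated with a toy demosaic. This is a hypothetical sketch in plain Python, not any vendor's actual raw pipeline: it takes a single 2x2 RGGB Bayer tile and averages the two green-filtered samples into one green channel.

```python
# Toy demosaic of one 2x2 RGGB Bayer tile (layout: R G / G B).
# Real raw decoding involves linearization, white balance and
# sophisticated interpolation; this only shows why a single-sensor
# camera can yield two green channels per red or blue sample.

def demosaic_rggb_tile(tile):
    """tile is a 2x2 list [[R, G1], [G2, B]] of linear sensor values.
    Returns one (R, G, B) pixel, averaging the two green photosites."""
    r = tile[0][0]
    g = (tile[0][1] + tile[1][0]) / 2.0  # combine both green samples
    b = tile[1][1]
    return (r, g, b)

pixel = demosaic_rggb_tile([[0.80, 0.50], [0.54, 0.20]])
print(pixel)
```

The averaging step is the simplest possible choice; production demosaic algorithms weigh neighboring tiles as well.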
With all this uncertainty, it is reasonable to ask why anyone would want camera raw encodings. The answer is that the image data are not color rendered; in color management terms, they are scene referred. If the vendor provides some post-processing tool that converts the image data to a more standard RGB encoding without

losing the raw nature of the image data (that is, without color rendering it for some display device), then this scene-referred data may play very well with CGI renderers and compositing applications. For digital still cameras, Adobe’s DNG Converter and Dave Coffin’s open-source dcraw program can be used to perform this type of extraction. For digital motion picture cameras, dedicated tools for this type of extraction (e.g., RED’s REDCINE-X PRO, or ARRI’s ARRIRAWConverter.app) are usually freely downloadable from the vendor; and some free versions of color correction software (notably Blackmagic Design’s Resolve Lite) support several digital cinema camera raw formats.
With raw encodings, color rendering will need to be done downstream before the completed shot is shown in a theater. Color rendering is more easily done by the visual effects team, however, than it can be undone if the camera manufacturer has baked color rendering into a non-raw format. The color image encoding for camera raw implicitly has as its viewing environment the original scene itself.
Color Management at the Desktop
With the above as background, the responsibilities of any color management system are thus:
• Interpret image data according to the image’s associated color image encoding.
• Provide a path to and from some common color space in which the correctly interpreted captured image data can be combined (a “working space”).
• Display original or combined image data as an accurate reproduction of what the final consumer (e.g., moviegoer) will see.
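Those three responsibilities can be sketched in miniature. The following is a hypothetical illustration only — a real pipeline would use a system such as OpenColorIO with measured transforms — using a bare gamma-2.2 transfer function as a stand-in encoding: decode each source to linear light (interpret), blend in a scene-linear working space (combine), then re-encode for the display (reproduce).

```python
# Minimal sketch of the three desktop color management duties
# (illustrative transfer functions only, not a production system):
#   1. interpret image data per its encoding (decode to linear)
#   2. combine in a common scene-linear working space
#   3. encode the result for the viewing display

def decode_gamma22(v):
    """Interpret a gamma-2.2-encoded value as linear light."""
    return v ** 2.2

def encode_gamma22(v):
    """Re-encode a linear-light value for a gamma-2.2 display."""
    return v ** (1.0 / 2.2)

def blend(a, alpha, b):
    """Combine two decoded (linear) values in the working space."""
    return a * alpha + b * (1.0 - alpha)

# Two encoded pixel values from different sources, blended 50/50:
a_lin = decode_gamma22(0.5)
b_lin = decode_gamma22(0.9)
out = encode_gamma22(blend(a_lin, 0.5, b_lin))
print(round(out, 3))
```

Blending the encoded values directly would give a visibly different (and wrong) result, which is exactly why the working space must hold correctly interpreted, linear data.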
(Editor’s Note: The version of the article printed here is a brief overview of the in-depth version in the book. Various sections have been kludged together to give the reader an idea of what they can find in the book.)

Vancouver Revisited: Thriving in a VFX Metropolis
By NAOMI GOLDMAN
The VES’s international presence gets stronger every year, and so much of that is because of our regional VFX communities – and their work to advance the Society and bring people together. Founded in 2007, the VES Vancouver Section, one of three Canadian Sections along with Toronto and Montreal, is flourishing thanks to its strong leadership and collaborative visual effects community. Vancouver is recognized as a truly world-leading center of VFX and animation talent and technology. And the Vancouver Section has established itself as a hub for the VFX community in Western Canada.
More than 60 studios make up the VFX and animation industry in Vancouver, cementing the city as the world’s largest cluster of domestic and foreign companies, including Sony Pictures Imageworks, ILM, Rainmaker Studios, Image Engine, Stargate Studios, DNEG, Method Studios and Cinesite Vancouver. Last year in Vancouver, the explosive growth included the opening of a Walt Disney Animation studio, new Wētā FX, Electronic Arts Canada, Animal Logic and Framestore facilities and the expansion of ILM’s local footprint. The industry is also well-supported through education and training programs for artists, software engineers and other technical-related professionals.
“Filmed entertainment is thriving in Vancouver,” said VES Vancouver Co-Chair and VFX Producer Steve Garrad. “More studios and facilities are being built every year, and a vast number of projects are coming into the market for production and then staying for post-production to tap into the diverse VFX workforce. This boom is creating enormous growth and job opportunities for local VFX talent, while attracting international artists and technicians to the region. Facilities are now coming to us to sponsor our events to help attract VFX talent… and that kind of outreach is amazing.”
VES Vancouver has more than 225 members, primarily working in feature film and episodic television. Much of its member recruitment comes through monthly pub nights sponsored by local VFX facilities and screenings hosted at Sony Imageworks, all of which offer a steady stream of social events for networking among members and prospects.
“We have turnout and engagement at our events. With an expansive market of professionals, we have a tremendous opportunity to grow and diversify our membership, including tapping in to the gaming and animation community,” said VES Vancouver Co-Chair and VFX Supervisor at Framestore Mat Krentz. “We take great pride in showcasing all of the benefits and value that the VES offers in terms of networking, community building and educational development.”
This past August, Vancouver was host to SIGGRAPH, the premier conference for computer graphics and interactive techniques worldwide, and a festive VES party supported by Netflix and Scanline VFX, which drew upwards of 500 guests.



“Vancouver does not normally get the same kind of notoriety and high-visibility events like Los Angeles, so it was really nice for the VES to lift up our community and come together,” said Krentz. “The SIGGRAPH party was a worldwide event with members who all got to come here and experience our VES community at the center point of the local industry. So we got to celebrate and show off what we do and our special Vancouver culture.”

Prior to the pandemic, VES Vancouver hosted a roster of successful educational events, including a session to have reels reviewed by recruiters and VFX supervisors and an interactive presentation on the history of VFX. Since returning to in-person events, the Section has hosted a demonstration of the world’s largest virtual production LED volume at Canadian Motion Picture Park, and is working on creating new programs to support ongoing learning and career enrichment, including a series on virtual cinematography. It also brought back the Section’s popular Summer BBQ this past June and the annual December holiday party, held this past holiday season as a high-energy bowling and pizza party.

“To have a Society like the VES to help people connect, foster education and build a network is a great gift. The support and camaraderie are strong,” said Garrad. “I want people to know that we are helping to build the next generation of artists and practitioners, and we are vested in their growth and mentorship. We offer so much. I get my healthcare through the VES as a freelancer, and that benefit is so meaningful. I think in being home to SIGGRAPH and our party, we did Vancouver proud as part of the global VES family from our outpost in Canada, and I am excited about the future we are shaping together.”
TOP LEFT: Virtual Trivia Night was a great success.

TOP RIGHT TO BOTTOM: VES Vancouver welcomes guests to the festive SIGGRAPH 2022 celebration, while guests enjoy the VES party.


“Filmed entertainment is thriving in Vancouver. More studios and facilities are being built every year, and a vast number of projects are coming into the market for production and then staying for post-production to tap into the diverse VFX workforce.”
—Steve Garrad, Co-Chair, VES Vancouver
Visual Effects Society Recognizes Special 2022 Honorees
By NAOMI GOLDMAN
VES celebrated a distinguished group of VFX practitioners at its annual VES Honors Celebration this past October at the Skirball Cultural Center in Los Angeles.




“Our VES honorees represent a group of exceptional artists, innovators and professionals who have had a profound impact on the field of visual effects,” said VES Board Chair Lisa Cooke. “We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”
Founders Award recipient and Lifetime VES Member Pam Hogarth has fulfilled many roles during her more than 35 years in visual effects and high-end computer graphics, from marketing to industry relations. The majority of her career was dedicated to incubating and growing educational programs geared towards training people for careers in entertainment production. Hogarth served eight terms on the VES Board of Directors, twice elected as the first female Vice Chair, and was Chair of the Education Committee. She also served as Secretary of the VES Los Angeles Section.
Honorary Member Pete Docter is the Oscar-winning director of Monsters, Inc., Up, Inside Out and Soul and Chief Creative Officer at Pixar Animation Studios. Docter has won three Academy Awards, for Best Animated Feature winners Up, Inside Out and Soul, and earned Best Original Screenplay nominations for Up, Inside Out and WALL•E. He has won two VES Awards – for Outstanding Animation in an Animated Picture for Up and for Outstanding Visual Effects in an Animated Feature for Soul.
Lifetime VES Member Jeff Barnes is an entertainment and technology creative executive who continues to make marked impacts across Silicon Valley and Hollywood. Named one of the top 200 creative people in the world by Entertainment Weekly, he is Executive Vice President of Creative Development at Light Field Lab. Barnes has been a longtime leader of the VES, serving as Chair and Vice Chair of the global Board of Directors and Co-Chair of the VES Summit.
Lifetime VES Member Patricia “Rose” Duignan started working in visual effects on Star Wars and worked her way up the production ladder from Production Assistant to Production Supervisor on Return of the Jedi and was the first Marketing Director at ILM. She plays a key role working to uplift women, people of color and veterans in VFX through her service on the Education Committee.
Lifetime VES Member Toni Pace Carstensen is a founding member of the Visual Effects Society, joining the organization as member 0004 and serving as the Society’s first Treasurer. She was a founding member of the Executive Committee and served on the global Board of Directors, as well as Co-Chair of the Global Education Committee, Co-Editor of the first edition of the VES Handbook of Visual Effects and longtime Chair of the Vision Committee.
OPPOSITE TOP TO BOTTOM: Lifetime Member and new VES Fellow David Tanaka, VES and Lifetime Member Jeff Barnes flank Pam Hogarth, Founders Awards recipient and Lifetime Member.
VES Honorary Member Pete Docter (center) celebrates with Pixar Animation President Jim Morris and new VES Fellow Rick Sayre, VES.
VES members celebrate with new VES Fellow Jeff Kleiser, VES.
Lifetime Member Patricia “Rose” Duignan strikes a pose with VES Board Chair Lisa Cooke.
THIS PAGE, LEFT TO RIGHT: Lifetime VES Member Toni Pace Carstensen.

VES Fellow Tony Clark, VES.
VES Fellow Gene Kozicki, VES with VES Board Chair Lisa Cooke.
A beautiful night honoring distinguished VFX practitioners at the VES Honors Celebration.

Lifetime VES Member Eric Roth served as the Executive Director of the VES for 19 years and retired last September. Under his watch, the Society grew into a thriving global community, launched VFX Voice magazine, created three editions of the VES Handbook of Visual Effects, produced 20 years of the VES Awards, hosted the annual VES Honors and Hall of Fame Programs, and delivered a robust roster of educational programming.
Lifetime VES Member and VES Fellow David Tanaka, VES is an Editor, Producer and Creative Director with experience ranging from visual effects and animation, to documentary and live-action feature films. He is a VFX Editor at Tippett Studio and Adjunct Professor at the Academy of Art University, San Francisco. Tanaka has served three terms as Chair of the VES Bay Area Section, as 2nd Vice Chair on the global Board of Directors and serves on the VES Archives, Awards, Outreach and Work-From-Home Committees.
VES Fellow Tony Clark, VES is an Emmy Award-winning cinematographer and the Co-Founder and Managing Director of Rising Sun Pictures. He received the Academy Scientific and Technical Award for the creation of cineSync. Clark has served on a number of Australian boards, including those of the South Australian Film Corporation and the Royal Institution of Australia.
VES Fellow Jeff Kleiser, VES is a Visual Effects Supervisor and CEO/Co-Founder of Synthespian Studios. His pioneering work in computer animation has spanned the history of the medium, and many of his projects have involved the creation of CG humans and creatures. Kleiser served on the VES global Board of Directors and is a member of the CG Society Advisory Board.
VES Fellow Gene Kozicki, VES is a VES Founders Awards recipient and Lifetime Member, has served as a member of the Board of Directors and L.A. Section Board of Managers, and was an organizer of the VES Festival for many years. A VFX historian and Chair of the VES Archives Committee, he was instrumental in organizing the VES archives and helping to secure funding to digitize many of our assets.
VES Fellow Rick Sayre, VES is a Senior Production Scientist at Pixar Animation Studios. He received a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences for his work developing animation input systems for Jurassic Park. He was a founding member of the VES Bay Area Section and on the global and Bay Area boards.
Hall of Fame Inductee Mary Ellen Bute was a pioneering American film animator, producer and director. She was one of the first female experimental filmmakers and the creator of some of the first electronically generated film images.
Hall of Fame Inductee Alice Guy-Blaché was a pioneering French filmmaker. She was one of the first filmmakers to make a narrative fiction film, as well as the first woman to direct a film.
Hall of Fame Inductee Grace Hopper, known as “Grandma COBOL,” was an American computer scientist and U.S. Navy Rear Admiral. She was a pioneer of computer programming who invented one of the first linkers.
Hall of Fame Inductee Bill Kovacs was a pioneer of commercial computer animation technology. As Vice President of R&D at Robert Abel and Associates, he co-developed the company’s animation software.
Hall of Fame Inductee George Pal was a Hungarian-American animator, film director and producer, principally associated with the fantasy and science-fiction genres.
Happy 140th Birthday, Pinocchio!
“It’s a story you think you know, but you don’t” is a tagline from the newest film version of Pinocchio, directed by acclaimed filmmaker Guillermo del Toro and co-produced by Netflix. (See story, page 58.) Indeed, it’s true, as del Toro gives it his own artistic interpretation – stop-motion wooden characters and a somewhat darker telling than previous incarnations. The film is set in war-torn Italy during Mussolini’s era, and del Toro’s version also aims to push animation to new levels.
Pinocchio has been part of world culture since 1883, when Tuscan writer Carlo Collodi introduced the tale of a wood carver who creates a wooden puppet that he wishes would become a real boy – long nose and all. The Blue Fairy grants that wish, and Pinocchio must figure out what the world is really about. The story endures because of timeless themes such as child-rearing, middle-class virtues, morality and capitalism, with appeal across generations. No wonder there have been so many remakes, and that del Toro had been contemplating his for years.
There have been countless screen and TV adaptations, live-action and animated, of Pinocchio. Most notable, however, is the 1940 animated version by Walt Disney, still considered one of the greatest animated movies ever made. It has the distinction of being the first animated feature to win an Academy Award, taking honors for Best Original Song (“When You Wish Upon A Star”) and Best Original Score. Its animation effects were considered groundbreaking at the time, lending a sense of realistic motion to machinery and cars as well as to natural elements like rain and lightning. Legendary artist Josh Meador led the effects animation team and went on to become Director of Animation Effects at Walt Disney Studios.

Perhaps the greatest achievement of Walt Disney’s Pinocchio was its artistic ambition. It employed methods still followed today in both hand-drawn and CG animation, and no detail was too small to escape the dedicated animator’s hand. Pinocchio established Disney Feature Animation as an artistic and commercial force. Now it’s Guillermo del Toro’s turn to take it to the next level.
Image courtesy of the Walt Disney Co.

