VFXVOICE.COM SPRING 2022
GLOBAL VFX: SQUID GAME
THE MATRIX RESURRECTIONS • THE TRAGEDY OF MACBETH • THE FRENCH DISPATCH • BELLE • BLADE RUNNER: BLACK LOTUS • PROFILES: MARCUS STOKES & SUE ROWE
[ EXECUTIVE NOTE ]
Welcome to the April 2022 issue of VFX Voice! Happy Anniversary, VFX Voice! Five years ago, the Visual Effects Society published our inaugural issue of VFX Voice. After shining a light on VFX artistry and innovation for 20 years, it was a huge point of pride to create a new forum to uplift the art, the craft and the people of the global visual effects community and our thriving Society. Because of your enthusiastic support, VFX Voice quickly claimed the mantle as a must-read print and digital magazine, garnered prestige in the publishing community and keeps earning its moniker as “the definitive authority on all things VFX.” We took a longstanding goal of the organization – to create the standard-bearer publication for the VFX industry – and, with a remarkable editorial, creative and publishing team at the helm, brought this dream to life. Thank you for being an integral part of the magazine’s success.

All year long, VFX Voice is celebrating special moments in the Society’s 25-year legacy, so consider this issue another commemorative keepsake and read on for a salute to our eye-popping magazine covers from April 2017 to today.

Also in this issue: We go inside the big-screen VFX of The Matrix Resurrections, The Tragedy of Macbeth, The French Dispatch and the animated Belle, as well as the small-screen anime spinoff Blade Runner: Black Lotus. We sit down for personal profiles with VFX Supervisor Sue Rowe and director Marcus Stokes. We delve into global VFX trends, the outlook on animation and what’s ahead for VR and AR. We update virtual production developments with a filmmakers roundtable and dive deep into LED volumes. And we serve up the latest in tech & tools, highlight our Toronto Section and more.

Now, turn the page and meet the visionaries and risk-takers who push the boundaries of what’s possible and advance the field of visual effects. Cheers!
Lisa Cooke, Chair, VES Board of Directors
Eric Roth, VES Executive Director
P.S. Please continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.
[ CONTENTS ]

FEATURES
8 COVER: GLOBAL VFX The globalization of the VFX industry has kicked into high gear.
16 ANIMATION: ANIMATION OUTLOOK Technology and content demand have transformed the landscape.
20 VR/AR/MR TRENDS: VR FUTURE VR and AR continue to steadily climb the mainstream mountain.
26 FILM: THE MATRIX RESURRECTIONS Expanding ‘real’ and virtual realms while focusing on character.
34 PROFILE: MARCUS STOKES Multifaceted director entered filmmaking through visual effects.
40 VIRTUAL PRODUCTION: LED VOLUMES Five active firms with LED stages discuss latest developments.
48 VFX VOICE 5TH ANNIVERSARY: MILESTONE The industry offers feedback on the magazine’s first five years.
50 VIRTUAL PRODUCTION: ROUNDTABLE Filmmakers reveal lessons learned so far in virtual productions.
56 PROFILE: SUE ROWE VFX Supervisor is excited about the future of virtual production.
62 ANIMATION: BLADE RUNNER: BLACK LOTUS Exploring other characters and parts of the Blade Runner world.
68 TECH & TOOLS: COMPOSITING TOOLS Some new features of today’s most popular compositing tools.
72 FILM: THE TRAGEDY OF MACBETH Creative effects enhance ageless drama in Joel Coen’s adaptation.
78 ANIMATION: BELLE Three different animation styles shape innovative anime film.
84 TECH & TOOLS: EFFECTS ELEMENTS How VFX studios, stock companies shoot and utilize elements.
88 FILM: THE FRENCH DISPATCH VFX plays an essential role in Wes Anderson’s complex tableau.

DEPARTMENTS
2 EXECUTIVE NOTE
92 VES SECTION SPOTLIGHT: TORONTO
94 VES NEWS
96 FINAL FRAME – GLOBAL VFX

ON THE COVER: A scene from Squid Game. (Image courtesy of Netflix)
SPRING 2022 • VOL. 6, NO. 2
WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.
VFXVOICE
Visit us online at vfxvoice.com

PUBLISHER: Jim McCullaugh, publisher@vfxvoice.com
EDITOR: Ed Ochs, editor@vfxvoice.com
CREATIVE: Alpanian Design Group, alan@alpanian.com
ADVERTISING: Arlene Hansen, Arlene-VFX@outlook.com
SUPERVISOR: Nancy Ward

CONTRIBUTING WRITERS: Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan

ADVISORY COMMITTEE: David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, VES, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS: Lisa Cooke, Chair; Emma Clifton Perry, 1st Vice Chair; Susan O’Neal, 2nd Vice Chair; Rita Cahill, Secretary; Laurie Blavin, Treasurer
DIRECTORS: Jan Adamczyk, Neishaw Ali, Nicolas Casanova, Mike Chambers, VES, Bob Coleman, Dayne Cowan, Michael Fink, VES, Gavin Graham, Dennis Hoffman, Thomas Knop, Kim Lavery, VES, Brooke Lyndon-Stanford, Josselin Mahot, Arnon Manor, Andres Martinez, Tim McGovern, Karen Murphy, Janet Muswell Hamilton, VES, Maggie Oh, Jim Rygiel, Richard Winn Taylor II, VES, David Valentin, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES
ALTERNATES: Colin Campbell, Himanshu Gandhi, Johnny Han, Adam Howard, Robin Prybil, Dane Smith, Philipp Wolf

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF: Nancy Ward, Program & Development Director; Jim Sullivan, Director of Operations; Ben Schneider, Director of Membership Services; Jeff Casper, Manager of Media & Graphics; Colleen Kelly, Office Manager; Debbie McBeth, Global Coordinator; Shannon Carmona, Administrative Assistant; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design

Follow us on social media
VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2022 The Visual Effects Society. Printed in the U.S.A.
GLOBALIZATION LIFTS VFX INDUSTRY TO NEW HEIGHTS AND NEW HORIZONS
By CHRIS McGOWAN
TOP: Netflix distributed Girls from Ipanema, a Brazil-produced series about four Brazilian female friends in Rio de Janeiro’s bossa nova scene in the late 1950s. Sao Paulo-based Quanta Post contributed to the VFX. (Image courtesy of Netflix)
OPPOSITE TOP: As of late 2021, the Korea-produced dystopian survival drama Squid Game was Netflix’s most watched series. Seoul-based Gulliver Studios supplied VFX. (Image courtesy of Siren Pictures and Netflix)
OPPOSITE BOTTOM: Ragnarok, a Norwegian-language fantasy series from Copenhagen-based SAM Productions, is distributed by Netflix. Ghost VFX and Oslo-based Stardust Effects contributed VFX. (Image courtesy of Sam Productions and Netflix)
The growth and globalization of the visual effects industry has resulted in worldwide interconnectivity and a vast workflow spanning the planet. There is more top-notch VFX in films and series than ever before, boosted by the growth in streaming content, episodic fare becoming more cinematic in terms of quality, and a continued evolution in VFX technology. Demand for VFX artists as a whole is also growing due to the surging video game industry, amusement park visual effects and the gradual ascension of VR and AR. All of those factors have increased the work for VFX studios and the demand for skilled artists from Vancouver to London to Mumbai.

Financial incentives in certain locations have helped globalize the VFX business for some time now. And the COVID-19 crisis further accelerated home entertainment demand and remote VFX work. “The pandemic has really kicked the globalization of the VFX industry into high gear, and now even more producers know what can be achieved with VFX,” says David Lebensfeld, Founding Partner and VFX Supervisor of Ingenuity Studios, which has offices in Los Angeles, New York and Vancouver.

Local productions outside North America, such as many series funded by Netflix, are spreading work across the planet in both film production and post-production. Fiona Walkinshaw, Framestore’s Global Managing Director, Film, comments, “The streamers have made no secret about their desire for regionally-focused content and how this feeds into their business strategies.
There’s a tremendous desire for stories that could only come from a certain city or country – shows like Squid Game or Money Heist, for example, which, like the Scandi noir boom, captivate viewers by dint of their freshness and unique cultural or geographical perspectives. This will inevitably mean our worlds become larger, as we work with storytellers, producers and below-the-line talent from all over the world. It’s an exciting prospect, and it will help us all grow and learn.”

Walkinshaw adds, “In time I’m sure we’ll also see new VFX hotspots establishing themselves – you just have to look at the way the Harry Potter franchise helped turbocharge London’s VFX industry, or what the Lord of the Rings films did for New Zealand.” Framestore itself is quite globalized, with offices in London, Mumbai, Montreal, Vancouver, Melbourne, New York, Los Angeles and Chicago.

Visual effects studios have spread widely over the last two years across North America, Europe and Australia/New Zealand, and a growing number can also be found in Asia. Many facilities built initially for wire removal and rotoscoping have evolved into full-service VFX studios. BOT VFX, founded in 2008 in India, has expanded from an outsourcing facility in Chennai for rotoscoping and other detail work into a large and complete VFX business; it now has its headquarters in Atlanta and has worked on high-profile recent projects, including The Book of Boba Fett, Dune and Black Widow. Just as Korea has grown into a movie/series global powerhouse, so too have its VFX studios expanded over the last 10 years.
Gulliver Studios supplied VFX for the Netflix hit series Squid Game, while Dexter Studios contributed VFX work to Bong Joon-ho’s Parasite. Dexter and five other VFX studios worked on Space Sweepers, arguably Korea’s first high-production science fiction film. Korea’s 4th Creative Party helped with the VFX for Joon-ho’s acclaimed Snowpiercer and Okja films (along with Method Studios). And Digital Idea worked on the hit zombie film Train to Busan.

VHQ Media, founded in 1987 in Singapore, has grown into a large film studio and claims to be Asia’s largest post-production house, working on both national and international productions. It also has studios in Beijing, Kuala Lumpur and Jakarta. Many international VFX firms have opened offices in Asia, including DNEG (four offices in India), The Third Floor (Beijing), Scanline VFX (Seoul), Method Studios (Pune), ILM (Singapore), Digital Domain (Taiwan, Hyderabad and four locations in China) and MPC (Bangalore).

“The global growth of the VFX industry and VFX as a tool of technology is limitless and boundless, to say the least,” says Merzin Tavaria, President, Global Production and Operations at DNEG. The London-based firm is another example of a VFX studio with offices spread across the globe. It was formed in 2014 by a merger between Prime Focus (India-based) and Double Negative (U.K.-based) and has studios in Los Angeles, Vancouver, Montreal, Toronto and London, along with Mumbai, Bangalore, Chandigarh and Chennai in India.

VFX BOOM
TOP TO BOTTOM: Netflix globally distributed Invisible City, a Brazil-produced fantasy series about mythological creatures in the rain forest, created by Carlos Saldanha, the Brazilian director of various successful animated films, such as the Ice Age movies and Rio. (Image courtesy of Netflix)
S.O.Z.: Soldiers or Zombies, an eight-episode horror-action TV series distributed by Prime Video, is a Mexican production created by Nicolas Entel and Miguel Tejada Flores. (Image courtesy of Prime Video)
The Last Forest, Luiz Bolognesi’s movie about the Yanomami Indians of the Amazon rainforest, produced in Brazil, mixes documentary and staged scenes. It was distributed globally by Netflix. (Image courtesy of Netflix)
“There are some fantastic companies doing amazing work in all corners of the globe,” says Pixomondo CEO Jonny Slow, “and at the moment, they are all working to keep up with an unprecedented level of demand. Growing demand, driven by episodic content with a higher budget and huge creative ambition, is a big factor in all the trends affecting the market [this year] and beyond.” Pixomondo has offices in Vancouver, Toronto, Montreal, Los Angeles, Frankfurt, Stuttgart and London.

The streamers Netflix, Amazon, Hulu (majority owned by Disney) and Apple TV+ have added their VFX demand to that coming from traditional movie/TV companies and affiliated streaming services (HBO Max, Disney+, Peacock, Paramount+). Florian Gellinger, RISE Visual Effects Studios Co-Founder and Executive Producer, notes, “Right now, as the market is so saturated, the work is going to be globally distributed to whoever has availability and meets the required profile. So yes, clients will have to look increasingly globally for a good fit.” RISE has offices in Berlin, Cologne, Munich, Stuttgart and London.

Other VFX studios concur that the business has been energized. “We have too much work, which means we need more capacity, more artists and more supervisors. Right now, we’re ensuring that we continue to make our established clients happy while bringing in new clients,” says Tom Kendall, VFX Head of Business Development, Sales & Marketing for Ghost VFX, which has offices in Los Angeles, Copenhagen, London, Manchester, Toronto and Vancouver.
Executive Producer Måns Björklund of Stockholm-based Important Looking Pirates (ILP) notes, “There aren’t enough VFX companies in the world to do all the work. The demand for content has boomed, and the need for clients to seek new vendors around the world has increased.”

GLOBALIZED WORKFLOWS
DNEG is one of the pioneers in the globalization of VFX workflows. Tavaria comments, “With nine facilities working seamlessly together across three continents, I believe we’ve led by example, creating an ever-expanding global network that can deliver highly creative and compelling visual storytelling while introducing new norms of efficiency and flexibility.” He adds, “The standardization of workflows, tools and capabilities across sites allows us to move work around our network to cater to the demands of our clients and to balance the load across locations to maximize utilization. We also take full advantage of time zone differences to create efficiencies in our production scheduling.”

Framestore recently opened a studio in Mumbai, which already has 130 on-site creatives. Walkinshaw comments, “Being able to set up in Mumbai and seamlessly integrate with our new colleagues there is an incredible advantage, especially given the tremendous talent pool there. Generally speaking, increased access to amazing talent is the main consequence of this worldwide connectivity.” Another positive effect of globalization is that “exchanging work between companies has become much easier despite everyone running their own pipeline,” says Gellinger.

Slow sees the globalization of VFX as a positive trend that creates stability for those companies who are prepared to evolve continuously and adapt to constant change. “It’s not the only trend in the VFX industry, but it’s a trend in response to demand for capacity and client requirements for speed and efficiency. But there is also a quality threshold. Quality output drives stability for VFX companies, wherever their artists are located.”

BEYOND BORDERS
Lebensfeld comments, “What certainly helps with this business becoming more globalized is access to talent that isn’t in your zip code, which backfills what already makes us competitive.” To open up to talent in another country, he says, “we already have the technology – the hardware, software and methodology – to work remotely. Anything else past that are just details. We look at it less like it is a global business and more as one that breaks down borders. One of the more exciting things is getting the chance to develop artists and give opportunities to people who would not have had them otherwise. They only need a computer, inherent artistic talent, and we work with them on training in a studio environment. I’m a big believer that a combination of local studio artists and international artists is the best way to go because the industry still relies on specific locations for some projects for a variety of reasons. The business still needs a large base of talent in certain production hubs.”
TOP TO BOTTOM: Korean sci-fi mystery series The Silent Sea, written and directed by Choi Hang-yong, is distributed by Netflix globally. (Image courtesy of Artist Company and Netflix)
The Korean zombie apocalypse and coming-of-age series All of Us Are Dead is distributed internationally by Netflix. (Image courtesy of Film Monster Co., JTBC Studios/Kimjonghak Production Co. and Netflix)
Teddy Roosevelt (Aidan Quinn) pulling an arrow from the arm of Brazilian explorer Cândido Rondon (Chico Diaz) in the HBO mini-series The American Guest. The 2021 Brazilian production showcased the work of Brazilian artists, including Orbtal Studios in Sao Paulo, which supplied visual effects. (Photo courtesy of HBO Max)
Gellinger observes that it has become easier for artists to find a job in their desired ‘adventure destination’ abroad. “And since the workforce has become so flexible in where it settles, recruiting has become a lot harder and companies have to reach out further than they used to in order to meet their talent requirements.” Yet globalization also “has solved a couple of these problems by having access to top talent across borders, not being limited to one’s own backyard.”

Walkinshaw adds, “From a production perspective it means juggling more time zones, currencies and teams, so this part of the business has become more complex, and there is a need for investment in both more people and technology solutions to make it easier for production to function. The role of a producer working for a company like Framestore on a project spread globally is far more complex and demanding than it used to be.”

Producers and supervisors now must be more patient and organized because of the time differences, and they have to schedule their work around that, according to Kendall. “The projects are shot in so many diverse locations, it’s about being able to address clients’ needs in a timely manner and be flexible in terms of how we work.”

Gellinger notes that the way that business is being distributed globally is “definitely creating stability, but only as long as companies keep investing in their talent. Flying in entire teams from abroad is not a business model. Investing in education and training is more important than ever.”

Slow comments, “We have seen a lot of these consequences [of globalization] playing out for the past few years. It has allowed the formation of larger, better funded, better organized companies that are becoming attractive investment propositions. This has been very positive for the industry – for growth to happen, investment is required.”

TOP TO BOTTOM: Money Heist is a Spanish heist drama that had a successful run of five seasons and is one of Netflix’s biggest international hits. (Image courtesy of Atresmedia/Vancouver Media and Netflix)
How I Fell in Love with a Gangster is a Polish crime drama distributed globally by Netflix. (Image courtesy of Netflix)
Ghost VFX worked on Shadow and Bone, a fantasy series distributed by Netflix and based on books by Israeli novelist Leigh Bardugo. Shadow and Bone was shot in Budapest and Vancouver. (Image courtesy of 21 Laps Entertainment and Netflix)
INCENTIVES
DARK BAY Virtual Production Studio is an example of how VFX globalization has been boosted by Netflix and by government help. Baran bo Odar and Jantje Friese, the creators of Netflix’s hit series Dark – a German science fiction thriller – built an LED stage in Potsdam-Babelsberg in part to shoot 1899, their next Netflix series. Odar and Friese’s production company Dark Ways holds a majority share in DARK BAY (Studio Babelsberg has a minority share). Funding from the state of Brandenburg in Germany and a long-term booking commitment from Netflix backed the venture.

Incentives continue to play a role in the globalization of VFX. Framestore’s Walkinshaw comments, “National or regional incentives have provided a huge boost for our industry and encouraged
international collaboration. They’ve been key to growth in the U.K. and Canada – to date our biggest sites for film and episodic work – and the willingness of studios to put work in these regions helps create a virtuous circle: it allows companies to invest in their talent, facilities and infrastructure, makes those places a magnet for established talent from elsewhere, and also helps schools and universities attract ambitious students. Take Montreal, for example – Framestore was the first major studio to open a studio there [in 2013], and it’s now an established and hugely-respected hub for the global visual effects industry.”

TOP: Amazon Studios produced the epic fantasy series The Wheel of Time, another international VFX effort involving Cinesite, MPC, Automatik VFX, Outpost VFX, Union Visual Effects and RISE Visual Effects Studios, among others. (Image courtesy of Sony Pictures Television and Amazon Studios)
BOTTOM: HBO series Beforeigners is a science-fiction crime drama produced in Norway by Oslo-based Rubicon TV AS. (Photo courtesy of HBO Max)

THE PANDEMIC
“The pandemic forced studios like us to build a pipeline that works in remote environments,” says Lebensfeld. “We were able to leverage figuring out how to work remotely with talent that have previously been local to our studio locations. We have history and momentum with these artists, and we figured out processes that mirror what we have already been doing – just with remote capabilities.”

Already-existing worldwide VFX interconnectivity helped DNEG address the challenges of the pandemic, according to Tavaria. “The unprecedented speed with which our technology teams enabled global remote working was astounding, based on work that was already underway. It also, somewhat counterintuitively, brought us closer together and enabled even more collaboration across our global teams,” he comments. “These advances have positioned us well to cater to the growth in demand for visual effects and animation work this year, driven by the increases in content production by TV and OTT companies, in addition to increased demand for our VFX and animation services from our studio clients.”

Walkinshaw comments, “The pandemic has definitely encouraged us to think outside the box, be this seeking workarounds for physical shoots, having colleagues working remotely from different countries or broadening our talent pool by making hires from different territories, because so much of the workforce has spent time outside the office. I imagine this will endure, especially as we continue to seek skills beyond the ‘traditional,’ particularly in areas such as technology and gaming.”

Slow says, “In our industry, technology and innovation are the fundamental drivers of changes like globalization. We are at a very interesting point – with technology driving once-in-a-lifetime changes in content distribution and production technique – and these trends have been accelerated by a major pandemic. The consequences are significant, and the impact will largely play out over the next five years.”

Lebensfeld concludes, “Pre-pandemic, [film companies] went on location and brought on as many extras as needed. The scope of requests has expanded well beyond that. There’s a VFX solution for every aspect of a story. That’s a powerful thing.” He adds, “I think the VFX industry has transformed these past two years, with very positive changes overall. There’s no going back now. Our industry is global, and that’s here to stay.”
ANIMATION STUDIOS EMBRACE LATEST STYLES AND TECHNIQUES TO TAP CONTENT EXPLOSION
By TREVOR HOGG
TOP: A scene from A Shaun the Sheep Movie: Farmageddon from Aardman Animations, which was originally founded as a stop-frame studio in 1972. (Image courtesy of Aardman Animations)
OPPOSITE TOP TO BOTTOM: Baba Yaga won the Daytime Emmy for Interactive Media and resulted in Disney+ partnering with Baobab Studios to create a Witchverse series for the streaming service. (Image courtesy of Baobab Studios)
Veteran animator and lecturer Ken Fountain finds it exciting that the demand for animation content is allowing for experimentation such as Spider-Man: Into the Spider-Verse. (Image courtesy of Sony Pictures Entertainment and Marvel)
Netflix decided to bridge the gap between Seasons 1 and 2 of the live-action The Witcher by releasing an anime prequel, Nightmare of the Wolf. (Image courtesy of Netflix)
Being able to successfully manage an animation studio in 2022 and beyond requires foresight and the desire to be an industry leader. The landscape has been transformed by globalization, the dominance of streaming services, growing demand for content, and the development of other media platforms such as virtual reality. Being flexible has enabled Oscar-winning Aardman Animations – founded as a stop-motion studio in 1972 and responsible for Creature Comforts and Wallace & Gromit – to remain relevant while also creating CG features and AR projects. More recently there is Emmy-lauded Baobab Studios, which has specialized in interactive animation since 2015 and is creating The Witchverse anthology series for Disney+ based on Baba Yaga. Veteran animator and lecturer Ken Fountain has found himself in the middle of all of this, having worked on Megamind for DreamWorks Animation, Pearl for Google Spotlight Stories and Baba Yaga for Baobab Studios, as well as doing tutorials on SplatFrog.com.

There is still value in using real and tactile material for characters and world-building, believes Sarah Cox, Executive Creative Director at Aardman Animations. “It’s less about creating assets digitally and more about the way that technology can enable handcrafted processes to be more efficient, beautiful and effective.” Technology is embraced by Lorna Probert, Head of Interactive Production at Aardman Animations, who last year released the Wallace & Gromit AR adventure The Big Fix Up. “There is some exciting technology, like virtual production and the way that we can use real-time engines for previs and creating more flexible and iterative processes.”

Another major trend pointed out by Cox is the push for photorealism. “The distinction between live action and animation, what is
real and what’s not real, is continually going to be blurred as we use live-action information in animation.” The filmmaking process is not entirely different. “Stop-frame is effectively like a live-action asset because it’s shot in-camera, and then you’re using those assets to mix with other bits of animation,” observes Cox. “What we did with our last Christmas special was shoot all of the effects in-camera. The snow was made out of puffs of wool, but then composited together in the most technically proficient, up-to-date way.”

Virtual reality has yet to become mainstream, partly hampered by being perceived strictly as a marketing tool. “You are still wearing what is not a comfortable thing on your face,” states Probert. “Until your interface with the content becomes more natural and comfortable, that suspension of reality is always going to be broken.” The communal aspect is presently missing. “A lot of our content design is for family viewing, so that whole being in your own space is quite contrary to what we do,” notes Cox. “It quadruples the production time [because various viewer narrative decisions have to be accounted for], and the other challenge is most of our work is comedy. If there is user interaction, you can’t time the gags in quite the same way.”

It is important to recognize that each medium cannot be treated in the same way. “It’s creating content that plays to the strength of the format, and we’re doing lots of exploration,” remarks Probert. “The fact that you can, for the first time, be in one of our sets, be able to build and move things and explore the detail in VR is exciting. It’s exciting to be in the barn with Shaun the Sheep and see him reacting to you – that’s an interesting thing for us to play with.”

When co-founding Baobab Studios, CEO Maureen Fan
TOP TO BOTTOM: The Very Small Creatures is a pre-school series produced by Aardman Animations for Sky Kids. (Image courtesy of Aardman Animations and Sky Kids)
The mandate for Baobab Studios is to make viewers feel that they are an essential part of the storytelling. (Image courtesy of Baobab Studios)
Animation is no longer just for families, as illustrated by the adult-oriented anthology Love, Death + Robots. (Image courtesy of Netflix)
combined her video game experience with the filmmaking expertise of CCO Eric Darnell and the technical leadership of CTO Larry Cutler. “The reason why something is a success isn’t because it’s stop-motion versus traditional animation. It is how good the story and characters are. The style is in support of that story. For Crow: The Legend and Baba Yaga, we created both for VR and 2D. Namoo was created within the VR tool Oculus Quill, where you are literally painting 360, but the project was meant to be a 2D output. The animation is different because the director is different. We brought in Erick Oh, and it’s more like stop-motion because in Quill there is no frame interpolation and rigging. Every single project that we’ve done has had completely different methods and tools, which is fun.”

The mandate driving everything is making sure that the user feels like a protagonist. “Certain parts of Baba Yaga and Bonfire were straight animation, but our animators also built in a bunch of cycles that we fed into the AI engine. We built a character-emotive AI system similar to games, so whatever the audience did, the story would change and reorder itself, and the characters would do different things.”

“There is a real-time revolution that is going to come, but not everybody has embraced it yet,” remarks Fan. “Even when the big studios adopt real-time, unless you’re having the product completely change where it’s interactive, you would still animate the same way. I don’t think that animators need to change any time soon. But if you’re interested in doing interactive animation, your skillset needs to be embracing the real-time aspect.” The visual language for VR is still being developed. “The fun thing about VR is no one knows what they’re doing! You don’t need a lot of previous experience. It is finding the best animator and rigger, and find one who has a flexibility to try different things and not always have to do things the way they did previously.”

Globalization of the industry has not only had an impact on the workforce. “You will notice that all of our projects have specifically cast minorities and women,” adds Fan. “That’s because I’m a female minority and feel like if I don’t do it, who will? Crow: The Legend was inspired by a Native American legend, and it was one of the first indigenous-themed stories in animation. Instead of a hero’s journey, they’re much more about the community. Baba Yaga is based on a character well-known in Eastern European literature. I’m excited about the different types of stories that we can tell with globalization.”

Previously an animation supervisor at Baobab Studios and currently working as Animation Supervisor for DNEG Animation, Ken Fountain points out that VR is rooted in an artform that predates cinema: theatre. “If you’re talking about going into the AR/VR space, the filmmaking language is totally changing because you can’t rely on an editor anymore,” observes Fountain. “The editor is the one standing with the headset and making the choices of where to look. The way that you build a story and performance, and attract the eye and create compositions, is based around theatrical approaches rather than cinematic ones. You have to be procedural and theatrical.”

As much as it is exciting for a user to be able to choose their own narrative, there is also respect for boundaries. “Ultimately, it’s up to whoever is creating the
experience to decide if they’re giving the user room to do literally whatever they want. Personally, I like the engineered outcomes,” adds Fountain, as parameters have a positive impact on the creative process. “You make your best work when you have limitations, and that translates into writing stories for this open universe. Unless you give people a box, the outcome is not going to be as good.”

With the growing reliance on AI and machine learning, could there come a time when animation is literally created in real-time? “That seems so far away,” remarks Fountain. “The first wall to get over for AI is it being able to create empathy in animation.” Animators are not going to be replaced any time soon by machines, but the required skillset has changed somewhat, according to Fountain. “Because there’s so much demand for content as an artist, you have to be able to do so many more different things stylistically, which wasn’t always the case. Also, because there are so many start-up companies, your technical generalist knowledge is way more valuable now.”

Streaming has provided a platform for projects that are not strictly family entertainment. “Streaming has eclipsed everything right now,” states Fountain. “The Netflix release of Arcane is another bit of proof that people are craving adult-based animation, because the production value of that series is amazing. The Disney-Pixar model has had too much of a grip for too long.”

Different animation styles and techniques are being melded together. Comments Fountain, “Even in VR, we used 2D-animated effects in our engine at Baobab Studios. It’s the same thing that Arcane is doing by combining CG animation, 2D effects and rough 2D-composited motion graphics. Something like Love, Death + Robots has so many new techniques and combinations of things that are sometimes hit and miss. I am so glad that people are shooting for those things because it’s making the artform better.”
TOP TO BOTTOM: Netflix paid over $100 million to purchase the rights to The Mitchells vs. the Machines. (Image courtesy of Netflix)
The Big Fix Up is an augmented and mixed reality adventure starring Wallace and Gromit. (Image courtesy of Aardman Animations and Fictioneers Ltd.)
The animation style of Namoo was dictated by Korean filmmaker Erick Oh and the story he wanted to tell. (Image courtesy of Baobab Studios)
STEADY GROWTH EDGES VR CLOSER TO DEEPER MAINSTREAM WATERS
By CHRIS McGOWAN
TOP: The VR rhythm game Beat Saber involves slicing through cubes to the beat of popular hits from artists like Billie Eilish, available in DLC releases. Beat Saber has sold more than 4 million copies across all VR platforms and earned more than $100 million in total revenue. (Image courtesy of Beat Games and Oculus Studios/Meta)
OPPOSITE TOP TO BOTTOM: You may find yourself in a digital living room using an HP Reverb G2 VR headset, which boasts a resolution of 2,160 x 2,160 pixels per eye and a 114-degree field of view. (Image courtesy of Hewlett Packard)
Hewlett Packard’s HP Reverb G2 VR headset, developed in collaboration with Windows and Valve. (Image courtesy of Hewlett Packard)
Peaky Blinders: The King’s Ransom is a narrative-driven VR adventure developed and published by Maze Theory. (Image courtesy of Maze Theory)
VR is going mainstream next year. VR is going nowhere. AR will be bigger than VR. There is no consensus on where virtual reality and augmented reality are headed and how soon they will get there. But although the virtual reality and augmented reality platforms are still far from mass acceptance, certain positive signs indicate that they really will become large, viable businesses, growing steadily over the next several years.

Tuong H. Nguyen, Senior Principal Analyst for Gartner, Inc., comments, “AR and VR have been hyped as being ready for widespread adoption for a long time, but the technology, use cases, content and ecosystem readiness have yet to live up to the hype.” Nguyen believes that VR in particular will go mainstream when it has three Cs – content, convenience and control. He notes, “While we’ve made progress on all these fronts, we’re still far from reaching the point where each of those aspects are sufficiently mature to make VR go mainstream. It will become mainstream, but I don’t expect it to happen until five to 10 years from now.”

Others are pessimistic that VR will ever become mainstream. “The answer is never. Sorry. Here’s why. People don’t like wearing stuff on their face and getting sick doing it, and having to pay a lot of money for the privilege,” says Jon Peddie, President of Jon Peddie Research and author of the book Augmented Reality, Where We Will All Live. “The VR market has bifurcated into industrial and scientific – where it started in the 1990s – and consumer. The consumer portion is a small group of gamers and a few – very few – people who watch 360 videos.”

On the other hand, in the opinion of Maze Theory CEO Ian Hambleton, the point has passed for people to doubt VR’s future. “With over 10 million active headsets on Oculus Quest sold now, it’s an active ecosystem. The 10 million unit number is often cited
as a crucial stepping point.” Maze Theory developed the VR title Doctor Who: The Edge of Time.

In November, Qualcomm CEO Cristiano Amon announced at the company’s 2021 investor day that Meta had sold 10 million Oculus Quest 2 headsets worldwide (Qualcomm’s Snapdragon XR2 chipset powers the Quest 2). A Qualcomm spokesperson later clarified that the number wasn’t meant to be official and came from market-size estimates from industry analysts. But as Qualcomm obviously knows how many Snapdragon XR2 chips it has sold to Meta, the cat seemed to be out of the bag.

Meta’s Oculus VR app marked another key milestone at the end of 2021, when it was the most popular download on Apple’s App Store on Christmas Day. The Oculus app beat out long-standing leaders like TikTok, YouTube, Snapchat and Instagram for having the most downloads.

And the category’s size is bigger than Quest 2. There are also Sony PlayStation VR, HP Reverb G2, Valve Index, HTC Vive Pro 2 and HTC Vive Cosmos, among other headsets. And PlayStation’s Next Generation VR (NGVR) is also joining the mix. Research firm Statista estimates that the total cumulative installed base of VR headsets worldwide reached 16.4 million units in 2021 and that the cumulative installed base will surpass 34 million in 2024. And Statista predicts the global VR market size will grow from $5 billion in 2021 to $12 billion by 2024. Another firm, Reportlinker.com, foresees 62 million units shipped by 2026.

Hambleton thinks the launch of the next-generation Sony PlayStation VR (sometimes called NGVR) will give a major boost to VR. “[It’s] really important. NGVR will be huge. That’s our prediction. It’s got some great new features and sorted out many of the issues of the previous headset, including inside-out tracking and much better controllers. So long as they ensure there’s a strong
TOP: Hewlett Packard’s HP Reverb G2 VR headset with handheld controllers. (Image courtesy of Hewlett Packard)
BOTTOM: A “home environment” view inside the standalone Oculus (now Meta) Quest 2 headset. More than 10 million Quest 2s had been sold as of November, according to estimates. (Image courtesy of Oculus Studios and Facebook/Meta)
OPPOSITE TOP TO BOTTOM: Participating actively with the Oculus (now Meta) Quest 2 headset. The Quest 2 is credited with taking virtual reality closer to being a mainstream business. (Image courtesy of Oculus Studios and Facebook/Meta)
A Dalek from Doctor Who: The Edge of Time, a VR adventure developed by Maze Theory and published by Playstack. (Image courtesy of the BBC, Maze Theory and Playstack)
The Valve Index is a tethered high-end VR system, which comes with headset, hand controllers and base stations, and connects to your PC. (Image courtesy of Valve)
pipeline of content for NGVR, we think it will do really well.” The latter support is likely – the current PlayStation VR has released over 500 VR games and experiences since the format’s debut in October 2016.

Another indication of a growing market came when Oculus announced in February 2021 that the rhythm game Beat Saber had sold over four million copies across all platforms and over 40 million songs from paid DLCs. In an October Oculus blog, it was revealed that Beat Saber had surpassed $100 million in gross lifetime revenue on the Quest platform alone. In February 2021, Facebook announced that more than 60 VR games available for Oculus Quest and Quest 2 had garnered over $1 million since the beginning of 2020, with six topping $10 million, including Skydance Interactive’s The Walking Dead: Saints and Sinners, released on Oculus in January 2020. The latter title has grossed more than $50 million across all platforms, it was announced by Skydance late last year.

“VR is definitely at an inflection point. It’s starting to look like the early PC evolution, which expanded from just hardcore enthusiasts and tinkerers and hobbyists to everyday use for a whole lot of people. That’s happening with VR now – people beyond the initial core believers are buying headsets and making it a regular part of their lives,” says Johanna Peace, Manager of Technology Communications at Meta, formerly Facebook.

She continues, “A lot of that is thanks to Quest 2. Before Quest 2, there hadn’t been a high-resolution, all-in-one form factor headset at that price point yet, so when we built it, people really
responded. A big part of this is because the headset is so intuitive and approachable. With a small and portable form factor, no wires or external sensors, anyone can pick it up and in seconds they’ll be immersed in a VR experience. That simplicity is incredibly powerful, and it removes big barriers to adopting VR.” She adds, “Quest 2 sales are strong and have surpassed our expectations, and we’re thrilled to see the community’s response to Quest 2.”

Peace notes that other genres growing in popularity in VR include fitness/wellness/meditation titles such as Supernatural and FitXR, multiplayer/social games like POPULATION: ONE and adventure games like Star Wars: Tales from the Galaxy’s Edge.

Vicki Dobbs Beck, ILMxLAB Vice President of Immersive Content Innovation, believes that VR has already begun its breakout and will continue to be adopted by a more mainstream audience given the accessibility and ease of use of headsets such as the Meta Quest 2. She comments, “In addition to a robust array of premium game titles, new content categories are further helping to drive growth.” ILMxLAB has shown its interest in the format with its production of the Star Wars: Tales from the Galaxy’s Edge interactive VR game titles, compatible with the Oculus Quest systems.

Beck also sees social VR sites having a positive effect on the acceptance of VR. She comments, “Against the backdrop of the pandemic and the desire to reconnect across geographies, we’ve seen a rise in engagement through social VR sites like VRChat and AltspaceVR. Whether to experience immersive theater, remote viewing parties, engage in collaboration or just ‘be’ with friends, I expect such use will meaningfully increase in the year ahead.” Other popular social VR sites include Rec Room, Bigscreen VR, and Meta’s Horizon Home and Horizon Worlds.

The Metaverse, a predicted global network of immersive virtual worlds, is expected to boost virtual reality, one of its key components. “VR’s greatest strengths are the power of ‘being there’ and the power of ‘connection.’ While there is neither a single definition of the Metaverse nor a universal strategy for engagement, I believe that VR will be one of the most compelling ways to explore and experience new worlds [and] emerging stories and establish relationships with characters,” says Beck.

Peace adds, “VR will be one of many entry points into the Metaverse, similar to how phones, laptops and other devices are entry points to the Internet today. The Metaverse won’t happen overnight, and it will take years to be fully realized. But VR today shows a glimpse of the immersive, social experiences that can be possible in the Metaverse, and these experiences will continue to develop as VR hardware advances and as the building blocks of the Metaverse are built.”

Augmented reality has also not yet hit the mainstream, but it has plenty of believers, as it only requires special glasses or goggles, not headsets. Peddie comments, “AR has the biggest potential long-term. AR properly done will change our lives – for the better. When done right, it will be like wearing sunglasses or normal corrective lenses. It won’t be conspicuous or obnoxious, and it won’t take you out of the now – it expands it. AR has nothing in common with VR.”
Nguyen comments, “AR continues to mature. Much of the maturation and adoption are driven by the enterprise frontline work of AR. Companies like Snap and Niantic, as well as use cases like virtual try-on and Google AR dinosaurs, animals and [Google] Maps arrow [Live View], have raised the profile for consumer AR, but we’re still far from mainstream.”

Beck notes that while there is growing interest in AR and some novel applications, the pivotal shift will come with the introduction of compelling AR glasses. She comments, “The kinds of experiences we can create when people do not have to hold a phone or tablet could be truly transformational. A key will be the seamless blend of our digital and physical realities.”

Mobile AR is already widely used with smartphones, tablets and other mobile devices. The most notable example is the Niantic game Pokémon Go, developed and published in collaboration with Nintendo and The Pokémon Company for iOS and Android devices. Pokémon Go has over 150 million worldwide active users (its peak was 232 million users in 2016) and has passed one billion downloads, according to Niantic. Pokémon Go’s in-app purchases account for a large proportion of consumer mobile AR spending, according to Statista, which predicts that the number of mobile AR users will grow from 800 million in 2021 to 1.7 billion by 2024.

Microsoft HoloLens 2, Magic Leap One, Google Glass Enterprise Edition 2, Ray-Ban Stories and Vuzix Blade are examples of AR
glasses on the market. Apple is expected to launch either AR or MR (mixed reality) glasses by early 2023. Statista forecasts AR glasses sales rising from 410,000 units in 2021 to 3.9 million units in 2024. The firm predicts that enterprise spending on AR glasses will rise from $2 billion in 2021 to almost $12 billion in 2024.

Nguyen concludes, “AR and VR will continue to see forward momentum. The pace and trajectory of AR and VR haven’t changed. The difference will be the level of hype and the mismatch between that hype and the reality. Regardless, it’ll take five to 10 years.”
TOP: Skydance Interactive’s The Walking Dead: Saints and Sinners was one of the best-selling VR game titles as of late 2021. (Image courtesy of Skydance Interactive)
BOTTOM: Zombies go boom in Skydance Interactive’s The Walking Dead: Saints and Sinners, which has grossed more than $50 million across all VR platforms, according to a Skydance announcement in October. (Image courtesy of Skydance Interactive)
THE MATRIX RESURRECTIONS: WELCOME TO THE REAL WORLD … AGAIN
By KEVIN H. MARTIN
Images courtesy of Warner Bros. and DNEG.
TOP: DNEG referenced Geof Darrow’s original sketches from the first film in helping to recreate and enhance various environments being revisited in The Matrix Resurrections. These locales include the gel-filled pod containing Neo’s living body.
OPPOSITE TOP: The Dojo sequence was a callback to the Neo/Morpheus fight in the first film, and stylistically differed from the look of the real world and the matrix world.
OPPOSITE MIDDLE AND BOTTOM: DNEG added environmental effects to provide nuances such as rippling water and falling leaves.
If the original Star Wars stood as the iconic representation of visual effects innovation for the last decades of the 20th Century, then 1999’s The Matrix surely had taken up that standard in the intervening years. Two sequels – also written and directed by the Wachowskis – may have diluted some of its impact, but the sheer excitement of the first film’s innovative bullet-time scenes raised the bar on VFX in a no-going-back-from-here way that was much aped, but rarely equaled.

Working alone this time, writer-director Lana Wachowski returns to the franchise with The Matrix Resurrections, which effectively subverts the expectations of both fans and casual moviegoers, choosing to focus on character over flash while still managing to expand on both ‘real’ and virtual realms. For visual effects, this meant revisiting some creatures and environments from the original trilogy, but endowing them with greater verisimilitude; in equal parts, it called for creating new characters from a combination of on-set motion capture and CGI.

As a veteran of the original two Matrix sequels, Visual Effects Supervisor Dan Glass offered an informed perspective on the new film’s place in the Matrix pantheon. “There was some trepidation,” he admits, “in terms of meeting audience expectations. But the goal was always to treat the film as its own entity; while it was written as a continuation, Resurrections also presented its own solid story. Visually and aesthetically, there was a deliberately chosen departure in its approach, something quite apart from its predecessors.”

Part of that visual distinction came from Wachowski’s intent to shoot more in the real world and rely less on static compositions that harken back to graphic novels. “We used previs on a limited basis, mainly for the fully CG shots, so we understood how they laid out and how a camera might move through them,” states Glass. “There were also previs studies to figure out logistics in bigger sequences, but the main stylistic difference comes from being out on location rather than being studio-based, and Lana wanting to have tremendous freedom with the camera. So this one has a lot
more fluidity; not necessarily a documentary look, but with more organic aspects of shooting out in the world. It was justified from a story standpoint because we’re depicting a different and more advanced version of the Matrix, where they’ve learned lessons about what constitutes reality. So our approach, to incorporate as much reality into things as possible, permitted a lot of impressive visuals, like seeing our two lead actors jump off the 40th story of a skyscraper.”

Instead of simply revisiting bullet-time, Resurrections features scenes that essentially show the process through the other end of the scope, with time dilation being used against Neo. “To portray these moments, we built scenes using multiple frame rates for various characters and events,” Glass explains. “This involved referencing underwater photography to get ideas for how things might look and feel as Neo tries desperately to react to events transpiring faster than he can effectively defend against. We used lots of complex layering of CG over original photography, and in a few scenes employed a very subtle use of fluid dynamics, but it wasn’t about creating a big visual moment, and instead really focused on the emotional content and the acting.”
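Glass doesn’t spell out the mechanics, but the multiple-frame-rate idea can be pictured as a time-remapping step. The sketch below is a minimal conceptual illustration in Python, not DNEG’s actual method; the function name, parameters and numbers are hypothetical.

    # A minimal conceptual sketch of per-character time remapping, not
    # DNEG's actual method. A character sampled with rate < 1.0 appears
    # dilated (slowed) relative to scene time; rate > 1.0 appears
    # accelerated, as with events Neo cannot react to in time.
    def character_local_frame(scene_frame: float, rate: float,
                              sync_frame: float = 0.0) -> float:
        """Map a global scene frame to a character-local animation frame."""
        return sync_frame + (scene_frame - sync_frame) * rate

    # Example: from scene frame 100 onward, a character dilated to quarter
    # speed samples its animation at frame 105 when the scene reaches 120.
    assert character_local_frame(120.0, 0.25, sync_frame=100.0) == 105.0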
TOP TO BOTTOM: Numerous hoses plugged into Neo’s body were created in CG, and had to be tracked to actor Keanu Reeves’ moves and to his underlying musculature. The hoses also had to appear to interact with the red gel and various microbot creatures maintaining the pod.
Pre-production artwork suggested a path for realizing the environment surrounding the pods of Neo and Trinity, with production building a 10-meter-high set piece including a practical pod.
New visual sensibilities – rendered with latticeworks and curved forms – contrast with the previously established brutalist look, suggesting how the faction of Machines working and living with the humans have created a new hybrid aesthetic.
For the first time with their feature work, DNEG relied heavily throughout the scene on Epic’s Unreal engine for real-time rendering at 4K resolution.
"Neo and Trinity race a motorcycle while trying to evade menacing throngs trying to stop them," says Glass. "In terms of complexity, it was immense. We shot on location in the city with the principals on a real bike – rigged to a platform. Digitally, we went in and added even more figures to the thousands in the crowd while removing rigging."
The film utilized work from several vendors, including Framestore, Rise, One of Us, BUF, Turncoat Pictures and Studio C, with DNEG drawing a significant number of shots and sequences, including the Dojo sequence in which Neo and a new incarnation of Morpheus square off. Though DNEG created the CG dojo with Clarisse, DNEG Visual Effects Supervisor Huw Evans reports this was the first occasion when the company used real-time rendering with Epic Games' Unreal engine for finals-quality renders at 4K resolution. "That was a big deal for us," he avows. "Lana and Epic crafted the scene based on a place called Devil's Bridge in Germany. After Epic built that out, we took it off their hands, adding extra detail such as rippling water and falling leaves, then starting lighting with it before running our CG cameras and match-moving the real cameras.
"The big challenge at the time in running it all through Unreal was the limitations of version 4.25," Evans elaborates. "That was missing a lot of features we'd normally use in dealing with imagery, like OCIO color support, that can help get us into comp with all the extra sprinkles. So after we got these beautiful images from Epic, we'd always put things into our pipeline to have the ability to tweak individual grades, such as varying bloom or emphasizing some aspect of a visual effect. We could have output straight from Unreal, but it would have been missing that 10% we always strive to achieve."
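For readers curious what that kind of out-of-engine color step looks like, here is a minimal sketch using OpenColorIO's Python bindings – a generic illustration of an OCIO-managed conversion, not DNEG's actual pipeline. The config path and colorspace names are hypothetical stand-ins.

import PyOpenColorIO as ocio

# Hypothetical show config and colorspace names -- stand-ins, not DNEG's.
config = ocio.Config.CreateFromFile("/shows/example/config.ocio")
cpu = config.getProcessor("scene_linear", "ACEScg").getDefaultCPUProcessor()

# Convert a single mid-gray RGB sample; a real pipeline would run whole
# EXR frames (and per-shot grades) through the same processor.
print(cpu.applyRGB([0.18, 0.18, 0.18]))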
Glass notes that the decision to handle that sequence in Unreal was made in part because the scene represents a construct within the matrix. "There was an aesthetic choice for that to be different in look from the rest of the film, but internally consistent."
With the film's new machine version of Morpheus, live on-set facial capture was used. "The real benefit here was getting all that interaction between the actors live so the eyelines were maintained naturally during the conversations," Glass acknowledges. "The biggest R&D components on the project connected to the newer body-method capture, which used AI and machine learning to reconstruct from data these three-dimensional representations of actors. It didn't require a CG head or do the modeling and lighting, because you get all that baked in from the original photography. That was something Lana really wanted because it went along with how she might want to pan rapidly from one direction to the other. The AI/machine learning tech was also used when two actresses are supposed to be occupying the same space – occupying the same avatar, if you will – so we recorded each of them, then used machine learning to superimpose one over another while they appear to be fully synchronized with all their actions."
Determining just how different this new Morpheus would appear and behave was an iterative process for DNEG. "While he looks human when inside the matrix, his digital look flows and changes as the situation demands," says Evans. "The earliest attempts were very free-flowing, and I really quite liked the casually arrogant way he could let parts of himself flow ahead before snapping back into the human configuration. But in going down that route, we found that it was harder to relate to this major character when he was all over the place; your eye didn't always know where to look when he was in such extreme motion. I decided that if he was looking at you, his face and arms would be mostly human, but you'd see this almost seagrass-like effect visible on his back, and a different level of flowing when his muscles were in use, like when he was climbing."
Separate muscle and skin passes were required before the flowing character effect went in. "There's a lot of nuance to the performance that started with what the face camera got, in order to convey how he doesn't have the fidelity of a real human," states Evans. "When the effects team took over, they tried to proceduralize it as much as possible, but with so much custom work to make the character, there's still a lot remaining."
DNEG was also responsible for aspects involving locales recognizable from the first film, including the pod containing Neo's inert form as he lay in a pool of red gel, literally plugged into the Matrix through various hoses. "The look of that environment started with a beautiful piece of concept art that gave us the broad strokes," recalls Evans. "Then we did countless designs on the specifics of the pods and the huge turbines around them. Production actually built a massive set piece for the live action that measured 10 meters high and included Neo's pod – we reused that for Trinity's, too – which contained all that practical goo."
TOP: The resistance's new belowground digs, Io, represented a departure from their older Zion enclave, suggesting a much more expansive abode, with factories and delineated residential blocs. BOTTOM TWO: For a Machine incarnation of Morpheus, the film utilized live on-set facial capture. AI and machine learning permitted reconstructing three-dimensional representations of the actors from on-set capture data. Separate passes for skin and muscle helped finesse the look, but Morpheus' ability to fluctuate his form went through many iterations before filmmakers settled on a subdued rippling to avoid distracting or confusing viewers.
TOP TO BOTTOM: A flashback illustrating the war between two factions of Machines provided an opportunity to deliver exposition through a pair of expansive visual cuts. The Machines' armada reflected some souping-up from their earlier forms, while the opposition – operating in what were dubbed 'squid tanks' – used geometry made up of familiar detail bits from the trilogy. The faction of Machines working and living with the humans has created a new hybrid aesthetic. These constructs were only slightly 'upgraded' – both to reflect the changes in the nature of the matrix and the CG developments over the intervening years separating Resurrections from the original trilogy.
"We extended the turbine and built a whole chamber around it in CG, along with the pit below. It was a fairly standard build, but to give the scene life and a tie-in back to the original film, we added a bunch of little microbot creatures for additional texture and detail. We referenced Geof Darrow's original sketches from the first film [of the machine ecosystem] to keep these things looking familiar and appropriate, but we got to embellish things by giving them a sense of character when Neo unexpectedly wakes – they are surprised and get away really quickly!"
Evans found one of the most challenging parts of the sequence to be a fairly invisible aspect. "The hoses that attach to their bodies were all digital," he explains. "We tracked all those ports on Neo's back individually – that meant not only matching the movement, but also understanding and reproducing the musculature beneath those ports, including how his skin moves as he reacts. These CG cables also had to interact with the practical goo that was in the pod and dripping off Neo, plus those microbot creatures as they splashed around in the goo; all of this amounted to incredibly detailed work that was often painful to get just right. None of this is really in-your-face stuff, but the invisible work is often at least as challenging. It made for just a ton of fine-detail balancing."
A throwaway line of dialog led to a short but memorable sequence featuring combat between two factions of warring machines. "The machine battle was originally just referred to in conversation, and then it was decided to do a single big shot of the action, which sounded exciting," admits Evans. "What was even better was how it eventually became two massive shots. This started life as another piece of concept art that roughly blocked out what was going on. We used the armada ships from the original trilogy on one side of frame, updated with new detail, flying through the air, while on the other side we had a group in what we called squid tanks. They were a hybrid using familiar bits of geometry from the original, including elements from the harvester and sentinel, but with new weird dreadlocked bits that resembled sentinel legs. Lana was keen not to include the harvesters intact, because they are farming creatures, not built for firing lasers and doing battle."
TOP TO BOTTOM: Among the familiar 'faces' from earlier Matrix entries – upgraded for Resurrections – are the Machines' attack Sentinels and Io's hovercraft vessels, along with the Harvesters used by the Machines to service fetus fields. DNEG extended the turbine section and filled in the rest of the environment digitally. Belowground in Io.
"The Environments team did an amazing job with all the fire and smoke and debris, which was especially important in creating the spectacle, because it was just two shots; we didn't want to go crazy modeling all these forms, and mainly built to camera whenever possible."
Environments, along with the matte painting team, also contributed heavily to Io, the new city that succeeds Zion as home to the human resistance. "If Zion represented a town, then Io is a megacity – massive, with towering buildings, factories and delineated residential blocs within a sprawling cave environment," remarks Evans. "We tried to generate stuff procedurally when possible, but when there were specific story points – like being able to see people moving around in their residences, and how people here interact with the machines on their side – there had to be a great deal of detail work. Lana was very keen that this environment show how people and machines working together produced a very different visual sensibility from that of Zion, which was very run-down and clearly just a product of human minds. With the help of this faction of machines, you go beyond a Brutalist grouping of buildings into details featuring 3D-printed lattices and curved shapes made from unusual exotic materials."
To facilitate the film's less-regimented approach to shooting, DNEG always used the production imagery as a reference. "Whenever we had a fully CG shot, especially one that went between two production shots, we made sure our CG cameras could match the moving and sometimes handheld look of the production shoot," says Evans. That meant duplicating the exact kind of camera shake and bounce, but it was absolutely necessary to make everything live together in the cut (a toy version of the idea is sketched below).
"Everybody loved what the first Matrix brought to cinema," Evans concludes. "But if Lana had taken the easy way out and done the same exact thing over again 20 years later, it would have been treading on familiar territory that so many other projects have already leveraged off. This one defies a lot of conventional thinking about sequels, while having meta kinds of fun with the whole notion of sequels. That may not be what everybody expected it to be, but it is very different, and I found that aspect, along with getting to be part of the Matrix history, to be extremely worthwhile."
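The handheld energy Evans describes can be prototyped in a few lines. The sketch below is a toy version, not DNEG's match-moving toolset: smoothed random noise layered onto a CG camera's tilt and pan, with invented amplitude and smoothing values.

import random

def handheld_noise(n_frames, seed=7, smooth=0.9, amp_deg=0.35):
    """Toy handheld jitter: a one-pole-filtered random walk on tilt/pan,
    so a fully-CG camera doesn't feel locked off. Values are made up."""
    random.seed(seed)
    tilt = pan = 0.0
    curve = []
    for _ in range(n_frames):
        # The filter keeps the jitter organic rather than buzzy.
        tilt = smooth * tilt + (1.0 - smooth) * random.uniform(-amp_deg, amp_deg)
        pan = smooth * pan + (1.0 - smooth) * random.uniform(-amp_deg, amp_deg)
        curve.append((tilt, pan))
    return curve

for frame, (tilt, pan) in enumerate(handheld_noise(24)):
    print(f"frame {frame:02d}: tilt {tilt:+.3f} deg, pan {pan:+.3f} deg")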
MARCUS STOKES: PERSEVERANCE POWERS SUCCESS OF VFX VETERAN TURNED DIRECTOR By TREVOR HOGG
Images courtesy of Marcus Stokes. TOP: Director Marcus Stokes. OPPOSITE TOP: A significant experience for Stokes was teaching English to Japanese middle school students. OPPOSITE MIDDLE: Stokes proudly displays his favorite childhood toy. OPPOSITE BOTTOM: As a middle schooler, Stokes was part of a quiz bowl team.
Marcus Stokes describes himself on Twitter as a director, visual effects supervisor, screenwriter, surfer, snowboarder and martial artist whose personal journey has taken him from Atlanta to Tokyo to Los Angeles. "I was born in New Haven, Connecticut, and raised in Macon, Georgia, which is located in the suburbs of Atlanta. It was so different from Los Angeles. Nowadays, there is so much filmmaking in Georgia, they're a lot more similar." Stokes' mother was an actress in Atlanta, but upon divorcing his father (a dentist who subsequently remarried a future state senator) she became an attorney. "At the time, Atlanta was a metropolis that had a lot of Black and white people but not much of anybody else, but has since become more ethnically diverse."
He had no intention of pursuing a career in the film industry. "I got into filmmaking through visual effects, and I got into visual effects through architecture. I don't have this story where my father gave me a Bolex camera when I was seven and I made little movies with my friends." His ambition was to be an architectural revolutionary like Frank Gehry and Richard Meier. "It didn't matter if it was a big or small structure," notes Stokes. "I just wanted to do artistic interpretations of livable spaces."
After majoring in architecture, the Georgia Tech graduate went to Japan for three years to teach English to junior high school students. "I didn't want to be the foreigner forever, so I decided to go back to grad school and continue my architectural quest. While on a scholarship at the University of California, Berkeley, I realized that I didn't want to do architecture anymore." A new inspiration emerged upon seeing the walk cycle of a skeletal dinosaur at the computer lab. "It was the coolest thing that I had ever seen," he remarks. "We had the option to start designing on the computer, and I found that I had some skill at it."
"We had one class in CG animation where we put together these short films, which I used to apply for the Lucasfilm internship, and that changed everything. I interned in the training department where the new hires could learn the proprietary software. After work I would sit at the same cubicle as the new hires and learn the same proprietary software, so by the end of my internship four months later, I blinded them with content."
As part of the commercial department, the new recruit had to become a generalist. "Within a year I was rendering shots, comping and learning how to code," states Stokes. "If you're at ILM, then you've made it to the top of the mountain. What that means is there's no need to hoard information."
Aspirations soon went beyond doing visual effects. "The thing that got me to want to direct is the fact that I was in the commercial department and had access to directors like Steve Beck, Robert Caruso and Bill Timmer. One day we were doing a Pepsi tie-in for Star Wars: Episode I [The Phantom Menace], and it was the first time I saw a film being shot – that was the moment where I decided I wanted to direct." After working on the Star Wars prequels and Wild Wild West (1999), the desire to work on The Matrix sequels resulted in Stokes leaving ILM for ESC Entertainment. "That experience made me so bulletproof in how to create something from the ground up." Other projects included being a computer graphics supervisor on Peter Pan (2003) for Digital Domain and a sequence supervisor on Serenity (2005) for upstart Zoic Studios.
TOP TO BOTTOM: Stokes attends a middle school graduation in Japan. Stokes conducts a rehearsal while preparing to shoot an episode of Blindspot. Jim Brolin gets directions from Stokes while filming the ABC series No Ordinary Family. Stokes takes a selfie with the cast of No Ordinary Family.
"Serenity was another difficult show at a facility that was scaling itself up to do a feature film."
Around the time of The Matrix Reloaded (2003), Stokes joined the DGA. "While I was at ILM, on weekends I would borrow their 35mm cameras, buy some short ends and shoot fake spots. I wanted to join their stable of directors and, in order to do that, I needed to have a reel of three to five commercials. Since no one was going to hire me to make commercials, I had to go out and make fake ones. I put together three or four and posted them on my own website. Someone found them, thought they were real and hired me to do a car commercial. I found a way to struggle through that experience, which gave me the opportunity to join the DGA."
Stepping behind the camera provided a change of perspective and understanding. "I learned from doing the short films how hard it is to create the media that visual effects artists work on. In my day-to-day life, I know what are the optimum set of conditions to do post work, what are the reasonable set of conditions, and what is the minimum that we need to give them so that we can be successful. I would rather get the scene than that one perfect shot, because otherwise you don't have a movie."
Making the transition to directing television shows was assisted by creating short films such as The Catalyst (2005), Till Death Do Us Part (2013) and The Signal (2015). "Television is a fast medium, which is not always great, but it also comes with an army to help you, and that is great," notes Stokes. "In short films, you don't have an army and there isn't any time. The Catalyst was my first short film. It was too long and expensive, and looked like a film school film. Both my AD and DP came out of the AFI, and they taught me how to make films. When The Catalyst got put on HBO, I got a lot of heat. It made me think that I might be able to make a run at this."
His small-screen résumé features helming episodes of Blindspot, Station 19, Criminal Minds, The Astronauts and 9-1-1. "The process has changed because I've gotten access to more toys. I pride myself on two things: the strength of my ideas and my ability to work with less." He has also been involved with the Arrowverse, in particular Arrow, Supergirl and The Flash. "There is a lot of mediocrity in the superhero genre, but in general the shows that are doing the best are the ones taking the biggest swings, such as The Boys. As you know, visual effects have become a part of every show. It is my job to know what is going to happen to those visual effects elements so that I can fight to make them right."
Narrative storytelling is not going to become obsolete despite the push for virtual and augmented reality and the mass popularity of video games. "For me, there is a difference between playing Call of Duty, Grand Theft Auto or Fortnite and being able to relax with a drink and watch someone tell me a story," observes Stokes. "The desire for stories to be told is not going anywhere."
Black films long went unmade because it was believed that there was no market for them. "Over the past 10 years, producers like Will Packer and Tim Story have proven that there is a market for almost all types of films, including ones that have almost an entirely Black cast or almost an entirely Korean cast. Look at Parasite. Those days [of no market for them] are coming to an end."
The issue now lies with those determining which cultures are showcased. "The argument would be, 'Why are you bringing in all of these international films when we are from Detroit and can't get our film made? Culture is great, but we all still have our films.'"
There has been a content explosion, fueled in large part by the streaming services. "If you want to see a particular type of content, you're able to find it. If you want one-hour dramas about MMA, there was a show called Kingdom, on Peacock, starring Frank Grillo. You can expand that to whatever you want. There is probably a one-hour drama about someone dedicated to making French cuisine."
A common mistake for filmmakers starting out is to do too much. "You try to use your strengths to the best of your ability and have that at your disposal so that it can help even the playing field," remarks Stokes. "If you've done your job, you've hired competent people. You tell them what you want, not how to do it. As long as the methodology doesn't affect the timeline, it doesn't matter to me. The same applies with the actors. I will explain the scene, see what happens, and will guide them."
It took 10 years to get his big break in television.
TOP: An important part of the process for Stokes is blocking scenes with cast members. BOTTOM: Stokes has a conversation with cinematographer Oliver Bokelberg during production of Station 19.
TOP TO BOTTOM: Stokes studies the monitors while conducting principal photography for Station 19. Stokes blocks out an exterior shot while directing an episode of Station 19. No stranger to dealing with visual effects, Stokes prepares a scene that takes place on a burnt-out set.
"I figured if I kept winning these director fellowships [eight in total] that I would eventually get an opportunity, and that's exactly what happened. The reason I got the opportunity is that, technically, Criminal Minds was an ABC Studios show that aired on the CBS network. At the time I had done both the ABC and CBS director fellowships, so I had them surrounded!"
There was no margin for error. "When that opportunity comes, you cannot fumble it. You get one chance. As a Black male director, I have to be exceptional. Maybe that has changed now. But you have to be prepared, polite, clear, direct, and try to make everyone enjoy what they're doing. You can't be above the crew."
The next step for the former visual effects artist is his feature directorial debut. "We locked picture in November 2021, and it will be coming out in the first half of 2022. State Consciousness is a psychological thriller that stars Emile Hirsch. Without doing television, there is no way I could have done this, because it was a low-budget, run-and-gun feature. It was a good experience, and I hope to bring what I learned to the next project I have in development, which is for Disney+ and is something that I created."
A certain attitude has served Stokes well over the years. "I am good at sticking to my goals, which is necessary for directing and, to a lesser extent, visual effects. Throughout my career I have always found a way to get to the place I want to get to. I felt if I kept going at it, eventually the walls would break. I'm starting to get inbound calls with people wanting to hire me for television shows. The reason that I'm directing now is because I always thought there was a higher level of creative input that I wanted to be involved in. Now I'm on the other side, where I can determine what the shots are going to be that will make the story better, and the visual effects will back into that."
Thinking about his experience directing an episode of Magnum P.I., he adds, "When I was 10 years old in Macon, Georgia, if someone said to me, 'We're going to pay you to fly to Hawaii to shoot car chases and interrogations,' I would have thought that was the best job in the world. And I do feel like I have the best job in the world."
INTO THE VOLUME: FIVE INNOVATIVE FIRMS WITH LED STAGES EYE YEAR OF GROWTH By CHRIS McGOWAN
TOP: The set of the ship’s deck in 1899 with actors Emily Beecham and Andreas Pietschmann to the right, in front of DARK BAY’s LED Volume. The setup included input from Framestore, ARRI Solutions Group, FABER AV, ROE Visual, Vicon Motion Systems, Epic Games, Helios Megapixel VR and Netflix’s Product Innovation team. (Image courtesy of DARK WAYS and Netflix) OPPOSITE TOP TO BOTTOM: The deck of the ship in front of the LED wall seen from above during the filming of the 1899 series at the DARK BAY LED studio in Potsdam-Babelsberg, Germany. (Image courtesy of DARK WAYS and Netflix) A car commercial at the MELS LED Studio, which had over 20 productions in 2021, including Transformers 7 and Disappointment Boulevard. (Image courtesy of MELS Studios) At Studio Lab using an LED wall for a remote location and accurate reflections. (Image courtesy of Studio Lab)
These are the formative years of virtual production, full of challenges and discoveries. Here, executives of five prominent firms with LED stages – DARK BAY, MELS Studios, Virtual Production Studios (80six), Studio Lab and Pixomondo – discuss the arc of their growth and some of their latest developments and innovations.
DARK BAY
The genesis of the DARK BAY LED stage in Potsdam-Babelsberg, Germany, was driven by Baran bo Odar and Jantje Friese, the creators of Netflix's Dark, who were working on ideas for their next series. 1899 was originally imagined as a pan-European show with vast backlot builds and locations spread over four countries in Europe. Then came the pandemic. Odar and Friese came up with the idea to shoot 1899 in an LED volume, which seemed the best solution for realizing their vision while keeping the cast and crew safe. In addition, building their own stage made sense considering the scale of the series, according to DARK BAY Managing Director Philipp Klausing. Thus came about the DARK BAY Virtual Production Stage, owned in a majority share by Odar and Friese's production company Dark Ways and in a minority share by Studio Babelsberg. DARK BAY was financially supported by funding from the Ministry of Economic Affairs, Labour and Energy of the State of Brandenburg (MWAE) and backed by a long-term booking commitment from Netflix. The facility, with an LED wall 180 feet long and 23 feet high and an active shooting area of 4,843 square feet, is located on the lot of Studio Babelsberg and came online in May 2021. ARRI was involved in the DARK BAY planning and installation, and offered its know-how throughout the setup.
The ARRI Solutions Group specializes in setting up mixed reality studios around the globe and was a crucial support, according to Klausing, who says, "ARRI was a big help in finding the look of the project. One of the biggest supports we received was the customization work on the ARRI anamorphic lenses."
Framestore also played an active role in setting up the volume, which happened in parallel with the pre-production for 1899. Framestore took responsibility for the virtual content that played on the LED wall, according to On-Set VFX Supervisor Andy Scrase. "Artists, technicians and myself, all from Framestore, were involved very early in pre-production with the decision-making process for the different sets," remarks Scrase. "Our VAD team in London then started building our digital environments several months in advance, which we began to review and adjust as soon as we were set up on the DARK BAY stage in Babelsberg. Once we'd completed this, there was the important task of pre-light and making sure our virtual sets matched the physical ones [which was done at different stages during the schedule]. When shooting started, the Framestore team made sure our content played as it should on the wall and that it tracked correctly with cameras on set."
Netflix's Product Innovation team, along with its VP division, also provided assistance in setting up the stage. "Andy Fowler, Vice President of Product Innovation, and Girish Balakrishnan, Director of Virtual Production, saw the potential and guided us in the right direction," notes Klausing. "Without their trust and passion for enabling virtual production, we would not have been able to get the stage up and running in such a short time [three months]."
TOP: The MELS LED stage in Montreal has a diameter of 65 feet, a height of 20 feet and a fully motorized and computerized ceiling. The volume is composed of ROE BP2V2 and is powered by Brompton processors, Unreal Engine and disguise technology. (Image courtesy of MELS Studios) BOTTOM: Behind the scenes of Star Trek: Discovery, Season 4, filmed on a Pixomondo Toronto stage, showing the physical lighting hanging from the ceiling. The Discovery DP decided that he wanted to use physical soft boxes for more traditional control of his lighting. The perspective on the wall is skewed and stretched due to the high angle of the camera (seen at the end of the Technocrane). (Image courtesy of Paramount+ and CBS Interactive)
Other contributors to the DARK BAY setup included FABER AV (delivery and construction of all LED elements: ROE Ruby 2.3, CB3 and CB5), Vicon Motion Systems (delivery of the tracking system), Epic Games (support for any Unreal-related topics) and Helios Megapixel VR (support during processor implementation).
For productions utilizing DARK BAY's and other LED stages, Klausing explains, "The biggest underestimated benefit is that you are able to finish a day of shooting and have an almost finalized shot, which has a big impact on the precision of the edit." For 1899, creating the content for a virtual production studio was a collaboration between the art director and VFX supervisors, all under the supervision of the director, the DP and the production designer. Klausing notes, "All departments need to talk a lot to each other. [It is] a collaboration which was in the past always separated into production and post work. Working with an LED studio like DARK BAY enhances creative freedom and control. It allows the merging and definition of every aspect of a production, all at the same time. Beyond that, it's very satisfying to see actors performing in an environment that comes closer to reality than classic greenscreen setups."
One of DARK BAY's innovations is a revolving stage inside the LED volume. Almost the entire interior of the volume consists of a motor-driven turntable with a diameter of 69 feet and a load capacity of 25 tons, which allows filming the environment from all possible angles. This removes the common physical limitations of shooting in an LED volume and can reduce or remove the time needed for a set rebuild, according to Klausing. "Complete sets can be rotated 360° in just three minutes," he says, "and entrances and exits can be simulated through a trapdoor built into the stage. We are convinced that we will see this revolving floor in all studios in the future."
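Those published specs invite some quick arithmetic. The sketch below works out the wall's approximate pixel count and the turntable's speed; note that the 2.3mm pixel pitch is an assumption inferred from the 'ROE Ruby 2.3' panel name, not a figure quoted by DARK BAY.

import math

# DARK BAY specs as stated above; the 2.3 mm pitch is inferred, not quoted.
wall_w_m, wall_h_m = 180 * 0.3048, 23 * 0.3048   # 180 ft x 23 ft wall
pitch_m = 0.0023
px_w, px_h = wall_w_m / pitch_m, wall_h_m / pitch_m
print(f"wall ~= {px_w:,.0f} x {px_h:,.0f} px ({px_w * px_h / 1e6:,.0f} MP)")

# Turntable: 69 ft diameter, a full 360 degrees in three minutes.
d_m, t_s = 69 * 0.3048, 180
print(f"rim speed ~= {math.pi * d_m / t_s:.2f} m/s at {360 / t_s:.0f} deg/s")

That pencils out to roughly a 24,000 x 3,000-pixel canvas, and a rim that moves at well under half a meter per second – slow enough, presumably, to keep actors and set dressing stable while the environment turns.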
VIRTUAL PRODUCTION STUDIOS (80SIX)
80six's Virtual Production Studios stage is located in Slough, U.K., and debuted in March 2021. The facility has expanded and now has a footprint of over 10,000 square feet, according to Aura Popa, 80six Marketing Manager, "which enables us to build to spec large-scale LED volumes of any shape and form, up to eight meters high." She continues, "Rather than having a pre-configured LED stage of a standard shape, we've discovered that this is the only viable approach to make virtual production with LED screens accessible to productions of all budgets. We wouldn't be able to do this if we didn't have one of the largest ROE Visual Diamond 2.6mm inventories in the U.K., a leading LED panel currently used for in-camera VFX with incredible cinematic results."
Popa adds, "Creatively, cinematographers and the VFX teams especially embrace our flexible offering, as it gives them the chance to work alongside our tech team to decide the specs of the LED volume, depending on the size of the shot, the nature of the physical props required and the camera movement. We've installed a multitude of LED volumes for best in-camera VFX results that match the interactive light from the LED panels to the lighting in the location, capturing the reflections and shadows inherent to it."
As noted above, virtual production creates new schedules. Popa explains, "With in-camera VFX, post-production starts in pre-production. You need to come to the set ready with a solid shooting and lighting plan, after a considerable testing period."
TOP: Behind the scenes of a “100% Shutterstock” ad shoot at Virtual Production Studios by 80six, in Slough, U.K. (Image courtesy of Virtual Production Studios)
TOP: The Ni’var Science Institute, from Star Trek: Discovery Season 4, Episode 3, was shot on a Pixomondo LED stage in Toronto. (Image courtesy of Paramount+ and CBS Interactive) BOTTOM: The Studio Lab team dials in camera settings for a multi-cam LED virtual production shoot of a forest setting. (Image courtesy of Studio Lab) OPPOSITE TOP: Shooting Netflix’s live-action Avatar: The Last Airbender on a Vancouver Pixomondo stage. Unreal Engine was the primary software, and the setup includes ROE’s Black Pearl II V2 panels, Brompton processors, and OptiTrack or Vicon Tracking systems. (Image courtesy of Netflix)
"Usually, the director, cinematographer, VAD team and the Unreal Engine artist [if Unreal Engine is used] come together early in pre-production to create the virtual set and pre-light it, so all the elements of the set, physical and virtual, are known and have been tested."
Popa thinks LED volumes offer incredible flexibility in terms of simulated settings. "You are now inside an LED studio, a highly controllable space, and with the help of the LED screens and realistic graphics you can bring to life any location you can imagine – from locations that don't even exist to locations that are difficult to reach. During a single day, you can shoot a variety of different environments with the exact lighting conditions of each location. [And] inside an LED volume, you can have the 'golden hour' for seven hours. This level of control over the shooting conditions has never been achieved before."
Virtual Production Studios seeks to design a "seamless VP workflow with Unreal Engine, aided by disguise media servers which speed up the virtual production workflow." Popa comments, "It is a different approach to the VP workflow which helps streamline it by facilitating stable communication between Unreal Engine and the camera tracking system, Mo-Sys StarTracker, through disguise's innovative RenderStream."
STUDIO LAB
Studio Lab was born from a desire to provide world-class studios and gear to creatives across a variety of disciplines, according to Studio Lab Director Benjamin Davis. It officially opened to the public in April 2020. Located in Derry, New Hampshire, the stage has a 14-ft.-high x 52-ft.-long LED wall.
Davis decided early on that there are three main reasons to use an XR LED stage as opposed to using traditional sets or shooting on location. "The first is if the project is created for an environment that does not exist. Think stylized environments or an alien planet. Then, of course, XR is the perfect creative solution for the imagined world."
He continues, "The second reason for an XR [LED volume] project would be if the shoot requires a visit to an actual recognizable location that would otherwise be prohibitive due to cost or scheduling. This would be things like sports venues, landmarks or specific buildings. And thirdly, if the project requires shooting in multiple locations in a short amount of time, XR can save time and money by having all locations in one place while allowing you to move between them quickly."
In general, Davis notes, "We decided to put our focus on making the XR process and workflow as easy as possible. This equates to days of testing before a shoot and ensuring color, spatial and lens calibration is correct before the start of the shoot. We also provide each project team with easy-to-use tools that allow for changes to be made on the fly with XR Stage App integration. Teams are able to directly find exactly the settings they need using controls via an iPad. Providing DMX-controlled lighting and environmental controls, such as weather and time of day, are good examples of this."
One of the biggest breakthroughs for Studio Lab came fairly early on with the ability to support live camera switching inside of virtual environments, according to Davis. "In addition to aiding traditional shooting, especially for video/stills in the same shoot, this makes the technology available to us for live broadcasting and opens a host of possibilities. None of these options would be possible without the use of disguise, and they have been an excellent partner throughout."
To have success with virtual production on LED stages, "the first thing is to get the proper gear," says Davis. "Not all video walls, drivers and cameras work well for virtual production, and one can spend millions on a solution that seems like it should work, only to come up short because one piece of hardware doesn't perform the way you expected it to. Do the research and spend the money to get the correct solution."
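The DMX-controlled lighting Davis mentions is commonly driven over a network protocol such as Art-Net. The sketch below builds a generic single-universe Art-Net DMX frame in Python – an illustration of the protocol itself, not Studio Lab's actual control stack; the IP address, universe and channel values are hypothetical.

import socket
import struct

def artnet_dmx(ip, universe, channels):
    """Send one DMX frame over Art-Net (generic sketch; values are made up)."""
    data = bytes(channels) + bytes(512 - len(channels))  # pad to a full universe
    packet = (
        b"Art-Net\x00"                  # protocol ID
        + struct.pack("<H", 0x5000)     # OpDmx opcode, low byte first
        + struct.pack(">H", 14)         # protocol version
        + bytes([0, 0])                 # sequence, physical
        + struct.pack("<H", universe)   # 15-bit port address
        + struct.pack(">H", len(data))  # data length, high byte first
        + data
    )
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (ip, 6454))  # standard Art-Net UDP port

# e.g., dim a hypothetical 'sun' fixture as virtual time of day advances
artnet_dmx("10.0.0.42", universe=0, channels=[180])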
PIXOMONDO
Pixomondo has two LED volumes in Toronto and two in Vancouver. "We are currently planning to build more stages in North America as well as Europe," says Phil Jones, Pixomondo VFX Supervisor, Virtual Production. Avatar: The Last Airbender, Star Trek: Discovery and Star Trek: Strange New Worlds are among the major productions that used Pixomondo stages last year. Its first such stage, in Toronto, launched in January 2021 with an internal LED volume that is 72 feet in diameter, 90 feet in length and 24 feet high.
TOP: An Open Cube LED volume built to specs at Virtual Production Studios by 80six. (Image courtesy of Virtual Production Studios) BOTTOM: The Studio Lab team showcases a Ferrari in an XR virtual production using LED wall, ceiling, and moveable/trackable wall sections. (Image courtesy of Studio Lab) OPPOSITE TOP: A virtual production takes place in front of a ROE LED wall in VA Studio Hanam, near Seoul. South Korea took a big step forward in VP in late 2021 with VA Corporation’s establishment of VA Studio Hanam, a virtual production facility located in Hanam, created in partnership with South Korea’s ARK Ventures. VA Studio Hanam has three main studios with large walls featuring ROE LED panels, powered by Brompton Technology’s Tessera processing. Studio C is Asia’s largest virtual studio, according to VA Corp., with a floor area of 1,088 square meters and the biggest oval LED wall in Korea, with ROE Black Pearl 2 panels and ROE Black Marble on the floor. (Image courtesy of ROE Visuals)
In terms of the new scheduling required for productions utilizing LED stages, "We are now coordinating with the client creative team even earlier in production, as compared to traditional VFX," says Jones. "In the early script stage, our virtual art department works with the lead creatives on the production side, building 3D concepts for each environment. Starting the environment builds in the script concept stage helps to ensure the creative and aesthetic needs are met well before the shoot day. These concepts are then used as the base for the remainder of the work required to finesse and finalize the environment."
Jones notes that review sessions have changed somewhat, particularly in that many of the notes can be handled in real-time, "which obviously speeds up the process. In some of the client review sessions, after creative notes are given, we spend some time blocking out shots with the director and DP in real-time. This is very valuable for all involved, and it also identifies the areas of the environment that will require that little bit of extra love!"
For Jones, the disciplines required to create the environments for the wall are very similar to those required in a normal VFX pipeline, with a few additions. "We are using many of the same tools from our VFX toolset and have added Unreal to the artist's toolkit." He adds, "There are some areas that differ from our regular VFX pipeline, and we have some specialized pipeline and coding artists who have added new tools and processes to blend Unreal into our workflow."
"They have created specialized tools that we use throughout the process, including such areas as asset creation, element transfer, environment reviews and Unreal real-time optimization tools." He explains, "If a shot does require some post-VFX work, such as the addition of a CG character, we have developed tools and processes to ensure the data from the selected take in editorial makes its way to the appropriate VFX shot in the post pipeline. Due to our close relationship with Epic, some of the tools we have developed are being wrapped into future releases of Unreal Engine."
In working with the LED volume, Jones notes, "Probably the most critical area in the pipeline is the color workflow. We have spent a large amount of time ensuring the color remains the same as it flows through Unreal onto the LED panels, through the camera and finally to the DIT for the final look. The calibration of the LED panels is checked frequently, and we develop custom LUTs for each camera and also for each show, which is finessed with the DP and DIT."
For Jones, the most gratifying part of shooting on the LED stage "is seeing everyone's reaction when they first step into the volume. Instead of having to imagine what will be placed beyond the practical sets and props within a massive greenscreen, everyone can see and react to the filmmakers' intent right there on the day. This excitement has also been seen in all the artists who came to the stage and saw their work on an unusually large screen. It is always fun to watch them walk around inside the work they created."
MELS STUDIOS
The MELS Studios LED volume bowed in February 2021 and hosted over 20 different productions last year, including Transformers 7 and Disappointment Blvd. The volume is situated on a 10,000-sq.-ft. stage in Montreal, where MELS can offer a turnkey service and additional services, like grip, camera, lighting and an insert stage on demand, according to Richard Cormier, MELS Studios Vice President, Creative Services and Executive Producer – Virtual Production. "We have also built and designed to spec a second volume for a specific production, in addition to our permanent stage."
Cormier notes that the permanent MELS LED stage has a diameter of 65 feet and a height of 20 feet, but "what makes ours special is the ceiling, which is fully motorized and computerized. It is separated into two 20 ft. x 20 ft. panels that can be programmed to within a millimeter. It gives production full and precise control over reflection and dynamic lighting, and it can host as many SkyPanels as production wishes."
Two main collaborations have emerged from MELS' work in the volume. "One is our involvement with the set design team from the outset all the way through shoot day," says Cormier. "The other has to do with our involvement with the VFX team on the asset preparation, so we can have the ultimate flexibility on shoot day." On the innovation front, he adds, "The one I can share is that we have the possibility of recording a greenscreen and our digital set at the same time, which can give compositing the perfect greenscreen and the perfect scene lighting."
VFX VOICE CELEBRATES ITS 5TH ANNIVERSARY
In Spring 2017, the VES launched our premiere magazine, VFX Voice, which quickly claimed the mantle as a must-read publication among decision-makers and influencers in visual effects and global entertainment. The goal was to create a new platform to shine a light on outstanding visual effects artistry and innovation worldwide, advance the profile and recognition of the VFX industry and marvel at the talent who never cease to inspire us all – and do it all with top-notch editorial and eye-popping visuals. Because of your enthusiastic support, the success of our print and digital magazine, dubbed "the definitive authority on all things VFX," has exceeded our wildest expectations. In its first two years, VFX Voice won a number of national magazine awards and was recognized as Best New Magazine Launch, Best New Magazine Design and Best Website (VFXVoice.com). These achievements were a testament to everyone who played a role in getting VFX Voice off the ground, from concept to publication, and to everyone who continues to help it flourish. VFX Voice keeps us tethered through stories that celebrate our craft and our global community – so thank you for being a part of our journey, and stay tuned for all that comes next!
Here's what the VFX industry has said about five years of VFX Voice, along with the covers of every issue of the magazine.
"It's hard to imagine what the visual effects community would feel like without VFX Voice. It's a robust platform filled with rich features and news stories that I think everyone in our industry can benefit from. So whether you're a model maker, an artist or a studio executive, there's something here for you." —John Knoll, Executive Creative Director and Senior Visual Effects Supervisor, Industrial Light & Magic
"Five years in and VFX Voice has become essential reading. Keeping abreast of the latest trends, breakthroughs and conversations in the industry is crucial. Through the magazine I not only am able to track the evolution of a rapidly changing field, I'm able to read thoughtful, well-written articles about subjects that I enjoy following. It's both business and pleasure." —Jim Morris, VES, President, Pixar Animation Studios
"As we continue to move to a more connected world, where remote collaboration is part of our every day, and where the rate of technological transformation is increasing rapidly, VFX Voice and its connection to our global members will become ever more important. VFX Voice will continue to be the prominent voice that informs and inspires the generations of creatives, producers and technologists that follow." —Jennie Zeiher, VES Australia Section Board of Managers, Head of Business Development, Rising Sun Pictures
“Congratulations on five years of coverage. Thank you for your support in advancing the VFX industry and its many voices.” —Netflix “VFX Voice has given an important platform to visual effects, the artists, production teams, engineers and the community as a whole for half a decade and counting. Their impact on the industry is significant, and ILM is honored to have some of our stories told in this quality publication.” —Rob Bredow, Senior Vice President, Chief Creative Officer, Industrial Light & Magic “For most of my career, the VES has been there to honor and promote the best about our industry. VFX Voice is a natural voice for that effort, with detail and style.” —Simon Robinson, Co-founder and Chief Scientist, Foundry “VFX Voice is the definitive source for visual effects professionals, students and enthusiasts. Congratulations on five years of creative storytelling!” —Daniel Kramer, Head of CG/VFX Supervisor, Sony Pictures Imageworks “VFX Voice is the gateway to learning about our vast world of film and television. The magazine not only celebrates our greatest creative achievements, but also educates us about the latest technology advancements, celebrates our global creative community and brings us into one marketplace to learn from each other. In a world where we are constantly inundated by data and time is limited, to be able to go to one place for industry information is extremely rewarding and productive. Plus it sits nicely on the coffee table and is a good conversation starter.” —Neishaw Ali, President/Executive Producer, SpinVFX “VFX Voice is a must-read for anyone in the visual effects and animation industries, including those in the VR/AR, virtual production, gaming and other digital media spaces. From in-depth technical pieces to profiles on key industry figures, VFX Voice covers a breadth of topics that appeal to a wide readership. I am impressed with all that they have accomplished in just five years, and I am excited for the future of the magazine.” —Joel Mendias, Executive Producer, Scanline VFX “I think VFX Voice is well-researched, insightful, in-depth and often at the cutting edge.” —Pete Jopling, Executive VFX Supervisor, MPC Episodic “VFX Voice has beautifully risen to the challenge of providing a platform that showcases the incredible talents of everyone who makes our global VFX community exciting and possible. Every time I open a new issue I get re-energized about the VFX community I’m a part of, and I’m amazed at how VFX Voice showcases a wealth of content while allowing each page to breathe with simplicity.” —Andrew Bly, Principal/CEO, Molecule Visual Effects
ROUNDTABLE: VIRTUAL PRODUCTION LESSONS LEARNED ON THE GROUND By IAN FAILES
TOP: On the set of Haz Dulull’s Percival, a virtual production shoot made at Rebellion Studios. (Image courtesy of Haz Dulull) OPPOSITE TOP: Stowaway cinematographer Klemens Becker operates the virtual camera while the film’s virtual cinematographer, Jannicke Mikkelsen, looks on. (Image courtesy of Unity) OPPOSITE MIDDLE: Jannicke Mikkelsen devises a shot during the making of previs for Stowaway. (Image courtesy of Unity) OPPOSITE BOTTOM: An Unreal Engine screenshot from the animated feature film RIFT, produced by HaZimation Ltd. (Image courtesy of Haz Dulull)
Many experienced filmmakers, indie creators, visual effects supervisors and cinematographers have been diving into new virtual production techniques over the past few years as various real-time technologies become more developed and more accessible. Those who have jumped headfirst into virtual production are still learning the benefits and pitfalls of the various technologies – from LED volumes to virtual camera shooting – while also applying traditional filmmaking techniques to these new sides of the craft. With that in mind, we asked a range of filmmakers what lessons they have learned so far in their virtual production escapades that they would share with new users. Our roundtable group consisted of the following:
Kathryn Brillhart is a cinematographer and director whose work explores visual poetry, movement and new technologies. She recently directed Camille, a naturalistic live-action short captured entirely on an LED stage. She is the Virtual Production Supervisor for the upcoming film Black Adam.
Haz Dulull is a director whose virtual production work includes the live-action short Percival, filmed entirely in an LED volume, and the animated feature film RIFT, created in Unreal Engine.
Paul Franklin is Creative Director and Co-founder of DNEG. He was the Visual Effects Supervisor on Inception and Interstellar, among many others, and has recently been directing virtual productions on LED stages, including the short Fireworks.
Jannicke Mikkelsen is a director and cinematographer specializing in immersive and next-generation film productions. She was the virtual cinematographer for Stowaway, using Unity's virtual camera system to help craft previs for key space scenes.
Nguyen-Anh Nguyen is currently producing and directing his own virtual production project, a pilot for BABIRU, a post-human show about humanity. He is also developing a feature film using virtual production methods.
One of the interesting things about virtual production techniques is problem solving in this new paradigm. What are some specific things you've had to solve?
Jannicke Mikkelsen: For Stowaway, we did a lot of virtual camera shooting for the previs, which was all about trying to transition the virtual astronauts from simulated gravity to zero gravity and then back to simulated gravity. You couldn't do it with a crane shot or a jib or anything like that. So I had to get really inventive with the handheld movements. And because the spaceship is spinning as well, once you cross the center point in a game engine, all the settings on your camera – your virtual rig – flip around and mirror once you pass the halfway point, until the spaceship comes back again. So it was really confusing at first, and something we had to think about differently in the virtual world.
Kathryn Brillhart: Every project has unique challenges. For Camille, we were going for a very naturalistic look, recreating hard sunlight, extreme stormy weather conditions and fire.
Creating direct sunlight in a small room with 15-foot ceilings meant that we had to use smaller lights, bounce and negative fill to shape light differently than we would in a larger space. We also had a seven-inch gap between the LED wall and the floor, which was visible in wide shots. We needed a solution to blend the seam. After some trial and error, and conversations with other filmmakers, we used aluminum foil. The slight reflection of the screen created a soft blend between the emitted light from the LED and the practical objects receiving light. So if you angled it the right way, it really hid the seam.
Paul Franklin: I recently directed a shoot on ARRI's LED stage in London. The scenario featured a man exploring a spooky cave where he finds a monster.
TOP: A final shot from Kathryn Brillhart’s virtual production short, Camille. (Image courtesy of Kathryn Brillhart) BOTTOM: On the set of Camille, where wheat grass and an LED wall were part of the shooting setup. (Image courtesy of Kathryn Brillhart)
with one of our crew wearing a motion capture suit, tracked by the same rig that was recording the camera’s moves. Our actor could see the monster playing back in real-time on the LED wall, in the full Unreal Engine environment. I was very pleasantly surprised at how well the screen held up and how close we could get to it with the camera.

Haz Dulull: One thing in terms of problem solving that I tend to do is tech-viz before an LED wall shoot, and that is to get the stage plans from the studio you are going to be shooting at, which will have the dimensions of the stage, the length and height of the wall, etc., and then build that in Unreal Engine, like a white-box level design, but built to scale. Then I’d start placing my camera and what the foreground assets will be – by adding an image on the LED wall and making the curved plane representing the LED wall self-emissive, too – and then virtually planning the shoot that way.

Nguyen-Anh Nguyen: This isn’t so much a specific thing, but in terms of problem solving, I’ve found virtual production seems to allow for infinite possibilities, which is great, but it doesn’t mean that just because you can do it, you should. I’m trying to keep things contained and have a physical cinematic approach to the process in order to inject this with a sense of realism. That doesn’t mean to say that I can’t accomplish some extremely complicated things that I would have never been able to do on a real set. That’s true.

What have you found are the kinds of scenarios that do and don’t work for LED wall shoots or real-time workflows?

Nguyen: I think that there are definitely projects that are better suited for virtual production and others for a more traditional approach. This is something I think of a lot now when I’m
conceiving a new idea. If a project tends to gravitate to something very simple, with a few people in a room, I’ll keep things simple and just tend to shoot it live-action, but – and this happens often – if I have a crazier idea that requires scope, lots of production design and multiple set pieces, I may not limit myself as much in the writing process and really try to nail the best story I can if there’s a way to create all of these images in virtual production.

Brillhart: It really comes down to what problems you are trying to solve by using LED workflows. Creative should drive these conversations. In some cases it makes sense to capture final in-camera VFX shots and other times it works well for acquiring elements of a scene with interactive lighting, or both. Say your story requires a burning field at night. An LED wall becomes an interesting application because it doesn’t require cast or crew to breathe in fumes or work nights. If you’re shooting a vehicle, you may be able to capture final shots or separate elements with interactive lighting on actors. There is still so much experimentation yet to be had, and that’s the fun part!

Franklin: The more you do with the LED volume, the more you learn – the things it’s good at, it’s really good at, but there are things that don’t work so well. The LED can’t create the kind of hard light you get in open-air locations. It can depict the effects of that hard light within the image on the wall, but you get a very soft diffuse light on the set. So you have to use practical lights on the foreground to get defined shadows, and then you have to figure out how to balance the exposures because the LED isn’t very bright, and you need to avoid light bouncing off the set into the LED, washing out the black levels. Practice makes perfect!
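The white-box tech-viz Dulull describes lends itself to a few lines of editor scripting. The sketch below assumes Unreal’s Python Editor Script Plugin is enabled; the stage dimensions, positions and labels are illustrative stand-ins, not any real volume’s spec:

```python
# White-box tech-viz for an LED-volume shoot (Unreal Editor Python).
# Assumes the Python Editor Script Plugin is enabled; stage dimensions,
# positions and labels are illustrative stand-ins, not real stage specs.
import unreal

# Numbers you'd pull from the studio's stage plan (assumed here).
# Unreal units are centimeters.
WALL_WIDTH, WALL_HEIGHT, WALL_DEPTH = 1800.0, 600.0, 20.0

cube = unreal.EditorAssetLibrary.load_asset('/Engine/BasicShapes/Cube.Cube')

def spawn_box(label, location, size):
    """Spawn a scaled cube as a to-scale proxy (wall, floor, set piece)."""
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.StaticMeshActor, location, unreal.Rotator(0.0, 0.0, 0.0))
    actor.set_actor_label(label)
    actor.static_mesh_component.set_static_mesh(cube)
    # The engine's basic cube is 100 uu per side, so scale = size / 100.
    actor.set_actor_scale3d(unreal.Vector(size.x / 100.0,
                                          size.y / 100.0,
                                          size.z / 100.0))
    return actor

# Flat proxy for the LED wall; a later pass would swap in a curved mesh
# carrying a self-emissive material so it lights the scene like the volume.
spawn_box('LED_Wall_Proxy',
          unreal.Vector(0.0, 800.0, WALL_HEIGHT / 2.0),
          unreal.Vector(WALL_WIDTH, WALL_DEPTH, WALL_HEIGHT))

# Drop in a virtual camera and start framing shots against the proxy.
unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, -300.0, 160.0),
    unreal.Rotator(0.0, 0.0, 90.0))
```

From there, the gray proxy can give way to the curved, self-emissive screen plane and the actual plate imagery as the plan firms up.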
Does virtual production suddenly mean there are new roles on set? Or are they versions of roles that already exist?

Brillhart: Currently, virtual production requires a combination of new and existing roles – it’s critical to establish how they work together early on in the process, especially with LED volume workflows. The timing in which leads are brought onto a project counts! Many roles in visualization, reality capture and performance capture already come from VFX and integrate into production in a similar way. As projects scale in size, it’s important to make sure that the client-side production team has core team members to
TOP: A still from Nguyen-Anh Nguyen’s BABIRU, being made in Unreal Engine. (Image courtesy of Nguyen-Anh Nguyen) BOTTOM: Nguyen-Anh Nguyen on the set of his Hyperlight short. He has been using previous filmmaking experience to jump more into real-time and virtual production work. (Image courtesy of Nguyen-Anh Nguyen)
support their creative needs. For example, an art department may need a client-side VAD lead to help QC and prepare designs for the VAD team and/or help with optimization. On LED stages, be prepared to introduce roles such as stage ops, content playback, engineers and IT support to the mix.

Mikkelsen: It’s important to work out who does what now in virtual production. Early on for the previs of Stowaway, I thought, ‘Well, I can’t make a call on the lens or the camera because we need a ‘real’ DP in here. Because if I call the lens and the lighting, then what’s Klemens Becker, the cinematographer, going to do? I can’t go treading on his toes.’ And so we got Klemens in really early. We actually got the head of visual effects, the special effects supervisor, the gaffer, the grip. We got everybody in to ask them, ‘How are we going to build this? What’s it going to look like in the studio?’ And I think once everybody was living in the previs, that’s when they understood the film, because you can then see and test if something’s going to work or if something’s not going to work.

Franklin: At the moment, I think people with a VFX background who have on-set experience do well in virtual production. I see it as a hybrid VFX/live-action process – we’re in a live-action environment with real cameras, and you’re on the clock, you don’t want to waste time. So it’s good to have an understanding of the pressures of the set, but it’s also an environment where an understanding of CG and VFX workflows is of huge benefit when working with the LED and the real-time system. It’s an interesting convergence.

Dulull: I’ve had to think differently about virtual production versus visual effects in terms of what happens on set. In
non-virtual production projects, you rely on multiple passes of renders out of CG and then composite them together for the final results, but in virtual production all these things you rely on in compositing, like adding atmosphere, fire effects, controlling motion blur, bloom, lens effects, etc., are done in real-time, so you really have to treat the Unreal Engine scenes, say, as a physical film set – mentally and actually literally, too – and understand that whatever you put in the scene will be processed in real-time.

Do you have any specific words of advice for people who might be looking to get into more virtual production approaches?

Dulull: When working with an LED volume, try and get as much tech recce time as possible, do as much testing as possible, because the LED volume shoot days are expensive at the moment. You don’t want to be figuring technical stuff out on the [shoot] day and eating into your time and budget.

Brillhart: There are unlimited entry points into virtual production, and it’s OK to be multi-passionate. Coming from both physical production and post VFX, it’s exciting to work with tools that connect both worlds. Explore all your options and take time to experiment.

Nguyen: The most important thing I would say to other filmmakers out there is to get your hands dirty and explore this space. The toolset that we currently have at our disposal is virtually free and so powerful. Also, surround yourself with people who are
passionate but may not have all the intimate knowledge of VP yet. It’s such a new field that there aren’t a lot of people specifically trained for this. My team is composed of a lot of people with film backgrounds who are exploring Unreal for the first time, as well as game-oriented artists who are doing film work for the first time.

Mikkelsen: I would say, embrace the power of the technology. With the real-time previs we were doing on Stowaway, with multiplayer and ray tracing, it’s a game-changer, where you can have everybody living in the previs and you can see changes immediately. You don’t have to wait for the render. Instead of it having to go through a chain of command, anybody can just go in, especially there with a VR headset, and just go, ‘Well, what if we just moved this?’ and then just click, move, and everybody goes, ‘Oh, okay. Yeah, that does work.’ One of the big outcomes, too, was that when we moved onto set, everything just ran super smoothly, because everybody had already shot the movie five times.

Franklin: My big takeaway is be bolder. I was very aware of all the potential bear traps waiting for me on my first virtual production, and as a result I think I was more cautious than I needed to be, especially with the issue of moiré artifacts caused by the screen pixels fighting the pixels in the camera. My subsequent experience tells me that you can get away with a lot more than you might at first think, so I’m less worried about those things and more focused on the filmmaking and storytelling – you can be more adventurous with the content on the screen.
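The moiré Franklin mentions comes from the LED pixel grid and the camera’s sensor grid beating against one another, and the risk can be roughly sanity-checked before a shoot with some thin-lens geometry. A back-of-envelope sketch; all the numbers here are illustrative, not a real stage’s published specs:

```python
# Back-of-envelope moire risk check for an LED-wall shoot.
# Thin-lens approximation; all numbers are illustrative.

def projected_led_pitch_um(led_pitch_mm, wall_distance_m, focal_length_mm):
    """Size of one LED pixel as imaged on the sensor, in microns."""
    magnification = focal_length_mm / (wall_distance_m * 1000.0 - focal_length_mm)
    return led_pitch_mm * 1000.0 * magnification

def moire_risk(led_pitch_mm, wall_distance_m, focal_length_mm,
               sensor_pitch_um, safety=2.0):
    """Flag risk when the projected LED grid nears the sensor's sampling.

    Comfortably above `safety` sensor pixels per LED pixel, lens blur and
    the wall's own content tend to hide the grid; near one pixel, the two
    lattices beat against each other and moire appears.
    """
    p = projected_led_pitch_um(led_pitch_mm, wall_distance_m, focal_length_mm)
    ratio = p / sensor_pitch_um
    return ratio, ratio < safety

# Example: 2.8mm-pitch wall shot from 4m on a 35mm lens, ~6 micron photosites.
ratio, risky = moire_risk(2.8, 4.0, 35.0, 6.0)
print(f"projected/sensor pitch ratio: {ratio:.1f}, moire risk: {risky}")
```

When the ratio drops toward one, the usual escapes are backing the camera up, going wider, or pulling focus off the wall.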
OPPOSITE TOP: On the LED set of Paul Franklin’s virtual production short, Fireworks. (Image copyright © Wilder Films. Photo: Tom Oxley) TOP: Fireworks brought together several individuals and companies that have been working in virtual production, including Dimension and DNEG. (Image copyright © Wilder Films. Photo: Tom Oxley)
SUE ROWE: VFX SUPERVISOR IS THE PERSONIFICATION OF DETERMINATION By TREVOR HOGG
Images courtesy of Sue Rowe, except where noted. TOP: Sue Rowe, Visual Effects Supervisor, Scanline VFX OPPOSITE TOP: Rowe found Roland Emmerich and Volker Engel a joy to work with during the making of Independence Day: Resurgence (2016). OPPOSITE MIDDLE: Rowe holds an Oscar with the Cinesite team. OPPOSITE BOTTOM: For over 20 years Rowe was part of the U.K. visual effects industry working on productions such as Troy (2004). (Image courtesy of Warner Bros.)
Animation has become a family tradition for Scanline VFX Visual Effects Supervisor Sue Rowe. Not only was her husband also involved in the industry, but one of their two daughters studies the craft at Emily Carr University of Art + Design. Rowe’s love for the arts was furthered by the creative expertise – and chauvinistic attitude – of a particular educator. “I grew up in Bridgend, Wales which is on the coast, beautiful and rugged. I had a brilliant art teacher who used to say things to me like, ‘You are OK, but you’ll go off, get married and not have a career in this.’ I thought even then, ‘I’m going to show you that I’m good at this.’ One of his words of advice was, ‘Draw what you can see. Not what you think you can see.’ I use that with my compositors. ‘Study real life. Go outside and look at the horizon. That tone at the bottom is not going to be the same as the one at the top.’”

Despite growing up surrounded by engineers, Rowe was still exposed to the arts. “My mum would take me to the theatre once a year to Cardiff or London. I remember seeing Madame Butterfly and literally crying because of the music and visuals,” Rowe reveals. An iconic talking bear was a career inspiration for the teenager. “I wrote a letter to FilmFair saying that when I grow up I want to be an animator on Paddington Bear. They never wrote back but I persisted!” Other influences were the Dune books and the cinematic adaptation by David Lynch, as well as Pixar shorts Luxo Jr. and Tin Toy. “Just the excitement of seeing something so visually unique and wondering, ‘How did they do that?’”

The aspiring Disney animator got a Bachelor of Arts in Traditional Animation at University for the Creative Arts at Farnham. “I used to build the sets and model armatures. That was my first step into the world of filmmaking. At the time, hand-drawn and stop-frame animation was not a good market, so I decided to take a Master’s in Computer Animation at Bournemouth University. My whole life changed after that. The visual effects industry was beginning to blossom in Soho. I went from hawking my way around companies trying to get work to getting offered a job on the spot. I chose Cinesite because they had a bar!”

Despite being employed by Cinesite as an animator, Rowe had to learn how to be a generalist. “I had to model, rig, shade, light and then animate; that was definitely a calling for me to end up being a visual effects supervisor because I loved all of the sides of filmmaking. The thing I didn’t know that I would love was reading a script, walking onto a set and having a director ask, ‘How do we do that?’ I’m a big planner! As a visual effects supervisor, you have to have at least three plans cooking in your head every time you’re setting up for a shot because someone will throw you a curveball and you can’t stand there in the middle of the stage and say, ‘I don’t know the answer to that. I’ll come back to you.’”

Visual effects have become more prevalent. “We did visual effects on 20 to 30 shots [in 1994] and by 2012, John Carter had 800.” The industry has become more globalized. “So many of the companies now are in different time zones. It’s a blessing and a curse that I can start my day talking to London and finish speaking to India.” Rowe has experience both as a studio and facility visual effects supervisor. “When Eric Brevig and I did The Maze Runner, the budget was not huge, but every pixel counted and looked
amazing. What I love about being in a facility is being surrounded by smart people and new technologies. I like to dip in and out so that I can get the best of both worlds.”

Netflix has purchased Scanline VFX, which positions Rowe in the heart of the content surge. “Streaming means that you are turning around movies and series in shorter periods of time,” remarks Rowe. “That is always going to be a challenge because we want to give our best work, which probably means working longer hours to fill that need for new content.” Audiences are extremely savvy. “I sit next to my teenager and she’ll go, ‘Mum, that didn’t
TOP AND BOTTOM: A career highlight for Rowe was being the co-production VFX Supervisor with Eric Brevig on The Maze Runner (2014). (Image courtesy of Twentieth Century Fox)
look good.’ We expect so much from streaming because we’ve been delivered such quality work. I don’t know if you could ever change that.” What has not changed is the foundation for making the most believable digital imagery. “If I get asked, ‘How should we do this?’ The first thing I say is, ‘What can you do for real?’ That’s where you need to start. I have huge respect for special effects teams and art departments. We should be wary of a digital solution every time. Cinesite had its own miniature division called the Magic Camera Company in the U.K., and I loved going down there. I saw the Hogwarts scale model which was 40 feet high. Real talented artists who have perfected their art: why wouldn’t you go to them to get the best possible movie?”

The 1999 TV adaptation of Animal Farm saw Rowe go from being a digital composite supervisor to visual effects supervisor at Cinesite. “Angus Bickerton was the Overall Visual Effects Supervisor and I learned a lot from him,” acknowledges Rowe. “It was done in conjunction with Jim Henson’s Creature Shop. A lot of the talking animals were animatronics, and we would do some mouth enhancements. I spent three months in Ireland shooting that, and one day it rained so much that our set flooded and we had to get evacuated!” Historical preservation took place while making
a remake starring Johnny Depp as Willy Wonka. “We had to come up with some ideas about what the nut house would look like. I was looking at a beautiful old Victorian hospital in East Dulwich that was being knocked down; it is gone now but exists forever in Charlie and the Chocolate Factory!”

A favorite experience for Rowe was working with filmmaker Andrew Stanton on John Carter. “Andrew elevated my role so I became a joint visual effects supervisor during the making of the movie. One of my favorite assets that Cinesite built was the flying machines that had beautiful reflective scales on their wings because they fly using light.”

A particular cinematic sequence still haunts Rowe. “It is from Event Horizon and the number was M25_005. Richard Yuricich and Dave Stewart did a motion control shot that starts outside of the spaceship, goes in through the windshield, travels through the center of the spaceship and back out the other side. It starts fully CG, goes into a model miniature, then a live-action plate, and I had to smooth all of these things together. I literally worked on that shot for six months and haven’t watched Event Horizon since!” However, the experience was not entirely bad. “There is a haunted house moment when the spaceship Lewis and Clark is just outside of Neptune, so the environment around it is full of electrical
TOP: A scene from Charlie and the Chocolate Factory (2005). (Image courtesy of Warner Bros.) BOTTOM: Rowe stands among fellow colleagues at Cinesite in 1997.
TOP TO BOTTOM: Rowe has given presentations about the visual effects found in John Carter (2012). (Image courtesy of Walt Disney Pictures) Rowe poses with Patrick Stewart, who presented her with the Panalux Craft Award at the Target Women in Film and Television Awards. Rowe on set in Utah with John Carter screenwriter Mark Andrews. Rowe with director Andrew Stanton at the premiere of John Carter in 2012. Sheila Duggal and Rowe take part in the “VES Tools for Change” Seminar.
storms. I hand-painted and hand-animated the lightning strikes. This was before you could Google, ‘What does lightning look like?’ I remember thinking about the timing of it and trying to hear it in my head. I look back on that shot and think, ‘We did well there.’”

Conducting visual effects presentations is a necessity for Rowe. “I quite often do them to get exposure for women in animation and filmmaking. There is a report that has come out recently in Variety about the lack of female representation, especially in visual effects. I’m sure it is to do with unconscious bias. Unconscious bias means ‘employ that guy who was at university with you or a guy who is a friend of a friend.’ We should be aware of this and open up the field to all. There are female visual effects artists out there, but not many of them are going to the next level. We need to support other females and elevate them, and give them the confidence so that they can obtain these jobs and responsibilities. If there is nobody ahead of you who is doing it, it becomes harder to see yourself in that position. I love the Geena Davis quote, ‘If she can see it, she can be it.’ I’m married and have a family; those things didn’t stop me. Recently, I got an email from someone who saw me do a talk at a local school and she had just started a job at Framestore. It’s amazing we’re encouraging people to go for those roles.”

Positive influences have been Warner Bros. Executive Vice President, Visual Effects Anne Kolbe and Visual Effects Supervisor Volker Engel. “Anne has been a great supporter, while Volker nurtured me on Independence Day: Resurgence and gave me a chance to succeed. Angus Bickerton encouraged me to go for what I wanted early on, and I still talk to Eric Brevig when I have a challenge because he’s good at getting to the root of the problem and coming up with a solution.”

A particular tool has captured Rowe’s imagination. “What I am excited about in virtual production is when you can achieve stuff in-camera, and the developments there have been exponential. It takes some planning ahead – the craft always benefits from planning! I am particularly excited about unusual developments in virtual production. I see great strides being made there with techviz and real-time storytelling. One area I am interested in
harks back to my roots in traditional animation. I can see VP and traditional animation being the future of storytelling very soon.”

Rowe has been described as tenacious. “If somebody says, ‘No, I can’t do it,’ I think, ‘I can and I will. I’m just going to find a different way of doing it.’ That has stood me in good stead in visual effects because people will tell you it’s not possible to shoot it or that’s going to be too expensive. I will find a different way around it. I’m very pragmatic. If I can find a way, I’ll encourage everyone else to join me on that trip. There have been lots of times when I would not let things go to make sure that it looked as good as possible.”

A successful visual effects supervisor also has to be a good communicator. “I have to be able to speak in various languages to different people. I take the brief and disseminate that information out to all of the teams. You have to be organized and love detail because you can’t bluff. And enjoy some of the technical aspects. You have to be able to delegate and trust other people. The thing that I hope never changes in my life is I still love my job! I get excited reading a script and imagining how we’re going to do it. I love working with teams of people and encouraging and getting the best out of them. At the end of the day, what a privilege it is to be making movies.”
TOP: Bringing The Hitchhiker’s Guide to the Galaxy to the big screen in 2005 with filmmaker Garth Jennings was a lot of fun for Rowe. (Image courtesy of Touchstone Pictures and Spyglass Entertainment) BOTTOM: Rowe worked on Space Jam in 1996 as a digital compositor for Cinesite. (Image courtesy of Warner Bros.)
THE REPLICANT UPGRADE OF BLADE RUNNER: BLACK LOTUS By TREVOR HOGG
Images courtesy of Alcon Entertainment, Adult Swim and Crunchyroll. TOP: Introduced into the weaponry found in the Blade Runner universe is the katana wielded by Elle. OPPOSITE TOP: A character that also appears in Blade Runner 2049 is Niander Wallace Jr. OPPOSITE BOTTOM: Elle attempts to uncover her past and why she is being hunted.
As part of the marketing surrounding the release of Blade Runner 2049, a trio of shorts were produced to explain the significant events that occurred between the sequel and the original Blade Runner, which was set in 2019. Alcon Entertainment partnered with Sola Digital Arts for the anime installment Blade Runner: Black Out 2022. The two studios have reunited and are working with Warner Bros. TV, Adult Swim and Crunchyroll to produce Blade Runner: Black Lotus, which revolves around a katana-wielding amnesiac attempting to uncover her past and the reason why she is being hunted.

“Alcon picked Sola Digital Arts to work with, as they had had a great experience with them previously on the Black Out 2022 anime short, directed by Shinichiro Watanabe,” states Jason DeMarco, Senior Vice President of Anime & Action Series/Longform for Warner Bros. Animation and Cartoon Network Studios, who was the development executive overseeing the project. “Watanabe suggested that if Sola were onboard for the series, he would like to be involved. And I don’t say ‘no’ to Watanabe!”

“Alcon took their time building the story over many months and several writers,” explains DeMarco. “The intention was always to create an animated TV series. Everyone involved in the creation of the show is a huge fan of both of the films. Honoring those [involved] meant putting our blood, sweat and tears into making sure the world of Los Angeles in 2032 felt like Blade Runner. In terms of creating something distinct, we thought the opportunity to center on a female character and bring some of the existing
‘cyberpunk’ tropes that derived from anime and manga that were produced in the wake of Blade Runner, would be an interesting new addition.”

Blade Runner: Black Lotus serves as a bridge between the two films. “Blade Runner: Black Lotus takes place in 2032, after the original film, after the Black Out 2022 anime short and before Blade Runner 2049,” notes DeMarco. “Thematically, it fits within the same wheelhouse of larger philosophical discussions about what makes us human, especially in a world that consistently devalues what that means on every level.”

Going beyond the constraints of a theatrical run-time meant that co-directors Shinji Aramaki and Kenji Kamiyama, who previously collaborated on Ghost in the Shell: SAC_2045, were able to expand the narrative over 13 episodes. “Both Blade Runner stories were theatrical films, the story being about a certain individual, Deckard, and the events that happen around him within this universe,” explains Kamiyama.
TOP TO BOTTOM: Making an appearance is the test that appeared in the original Blade Runner, which determines a person to be human or replicant. Elle gains an ally in Blade Runner turned junk dealer Joseph. Alani Davis is an officer of the Los Angeles Police Department.
“The idea for the Black Lotus series was to take advantage of this profound setting of the Blade Runner universe, which has captured the imagination of many fans, and place the spotlight on someone else who may have lived in this world that we haven’t seen before, expanding the worldview,” observes Kamiyama. “We also made sure to keep certain connections with the films so all the fans would be able to enjoy.”

Aramaki relished the episodic opportunity. “Since we had the pleasure of creating this anime as a 13-episode series,” says Aramaki, “we were able to take the time to depict other parts of this world and fully create the lives of these characters that live within, which can be hard to do with a single movie. We were careful during this process so that nothing would deviate from the imagery that fans, including myself, have of this world.”

Even though Joseph Chou, CEO of Sola Digital Arts and Executive Producer of Blade Runner: Black Lotus, has been responsible for creating anime for Halo and The Matrix, working on the cinematic franchise established by Ridley Scott was a different experience. “Black Lotus came with a heavy sense of duty and responsibility, not just because the original film is a revered classic [which it is], but also because of the impact Blade Runner had on the anime industry. We would not have titles such as Akira, Cowboy Bebop and Ghost in the Shell without it. Everyone approached the project with absolute reverence and respect – and a healthy sense of fear that we cannot screw up!”

The visual aesthetic evolved. “There was a long period of visual development with a ton of concept art created as we dialed in the look of the show,” remarks DeMarco. “We originally started in a very different place visually, and all of that exploration was necessary to get a look we all felt excited about. We expected it to be hard to match the quality of animation we wanted with a budget that wasn’t huge. We did not expect our entire production having to shut down for extended periods of time, or for people to have to figure out how to create the show remotely, thanks to a global pandemic.”

Breaking away from the firearm tradition of Blade Runner, a katana is the primary weapon for the protagonist Elle (Jessica Henwick/Arisa Shida). “That was something that was present from the earliest design phases,” reveals DeMarco. “All of us felt it was a different element than one would maybe expect to see in this world, in a good way. It also helps the series lean into the essential Japanese quality of this show, which is firmly an ‘anime’ in every sense of the word.”

The iconic Japanese sword fits into the visual aesthetic. “One of the things that particularly struck me when I saw the original Blade Runner was how the city looked: night, rain, fog, steam, neon lights,” remarks Aramaki. “What surprised me even more so was how that overall coolness was combined with these Japanese-character neon lights, Asian characters and commercials to create this unique mood. We both had thoughts of extending an additional Japanese element into this and happened to think a female replicant with a Japanese sword could fit into this world very nicely, along with bringing in a new source of action into it.”

For Aramaki, there has been a blurring between the capabilities of animation and live action. “Nowadays, since it’s possible to create great CG characters that are highly indistinguishable from
the real characters in live action, I don’t think there are many things that only animation can do anymore. Maybe with anime it’s a little easier to convince the audience that this non-macho female lead is taking down these tough-looking men with her amazing fighting skills and sword battles.”

Kamiyama believes in a reverse-engineering approach. “There’s really not anything I can think of that is unique in that way to the animation,” says Kamiyama. “Rather, I approached it the other way around, trying to really move towards a more realistic representation of the world by creating a heightened sense of reality and a presence to the characters.”

Layout movies were created for every scene and shot for each episode. “In Japan, the production team makes precise cuts to the storyboard before moving on to animatics,” explains Kamiyama. “Since the motion capturing process is done in between the storyboarding and animatics, there ends up being more cutting that happens there. Although the storyboard is an important guideline in creating the animatics after the motion capture process, we ended up requiring many more additional images for the layouts, which may have been the opposite of how animatics are used in live-action titles.”

Sola Digital Arts excels in utilizing motion capture, with drama and action performances done separately. “We thought CG animation would be the best way to reproduce the Blade Runner world as much as possible,” states Aramaki. “Since the original is a live-action film, we wanted to keep that look and feel to the best of our ability through animation, which helped us decide to utilize motion capture as our animation base. And also, simply because this style is something I’m used to the most.” The technology required some creative assistance. “We made full use of motion capture but as everyone in the industry knows, motion capture still needs the hands of animators,” observes Chou. “It was used more or less as a guide in our case.”

An older technology proved to be the answer when it came to rendering. “We initially thought the only way to do this on time and on budget was to utilize real-time engine rendering,” explains Chou. “We soon found, however, that we were unprepared for the challenge. We then tried GPU render, which had its own problems because it’s not something yet ready for a large-scale production.
TOP TO BOTTOM: A lighting board for a shot with Elle staring out at the Wallace Tower. 2D background environment model sheet for a distant view of Los Angeles. Stage lighting for the market environment.
TOP TO BOTTOM: A jazz club is one of the establishments that comes alive in nighttime Los Angeles. The 13 episodes provided the time and room to explore other parts of the Blade Runner world and the lives of the characters. A major fight sequence with Elle takes place in a cemetery.
Eventually we moved back to the good old way of CPU render – our main software was Maya and Houdini. However, we are taking the lessons learned during this production and will continue to try developing ways to utilize real-time and GPU rendering in the future.” Look development tests were not possible because there was not enough time. “Things were happening in real time based on the production flow that we established with series work that we’ve done previously,” comments Chou.

Central to the success of the series is the ability for viewers to empathize with Elle. “Even though Elle’s a replicant and has these amazing combat skills, her memories are vague and she doesn’t really know much about herself or the world around her,” states Aramaki. “The key to having her become a compelling character was to have her keep facing the world head-on and be this positive force of energy that continues to move forward towards a better life, no matter how hopeless things may get.” In many ways, the journey to uncover the past is a shared experience. “I thought that if we focused only on the problems that replicants have, the fans of the films may understand these issues immediately, but the newer audience may not,” observes Kamiyama. “That’s why we came up with the idea of having her lose her memory and go on this adventure, which would put her in the same perspective as the audience. I think those watching will empathize with her that way.”

Unlike previous productions that were originally recorded with a Japanese voice cast and then dubbed in English, the process was reversed for Blade Runner: Black Lotus. “Once we completed the script, we moved on to the main cast recordings,” remarks Aramaki. “After we had that in hand, we then requested the motion capture actors to try and convey the emotions and speaking styles of the main cast recordings while filming. The performances the English voice cast provided us were the basis of the animation [although we still had to go back and re-record some areas after the animation was completed, for certain reasons]. Jessica Henwick and Will Yun Lee [Joseph], in particular, gave us so much more than just their voice-over, and influenced us even in the creation of their characters. For the Japanese voice actors, we selected most of them ourselves. We were able to cast them in a way that was the closest to the image of the character we had in mind. After the English recordings were complete, we used that as a base for the Japanese recordings and we were able to direct those sessions to have the script come through. We’re very proud the finished product came out to be such a high-quality one.”

A variety of vehicles populate the cityscape. “The technology and vehicle designs of this world were pretty much already defined in the two movies, so I used that as a base,” states Aramaki. “Also, I utilized the two art books from the films and Syd Mead’s art books for reference.” A lot of attention was paid to the Spinners flown by the LAPD. “They are an iconic character to the world of Blade Runner,” notes Chou. “We needed to be consistent but also reflect the passage of time from the first film, so a lot of attention was paid to striking that balance of ‘future tech’ and ‘consistency.’”

Signage is another iconic element. “The original film featured a lot of Japanese and Chinese kanji character signage, and the look and content were actually kind of funny to the Japanese
audience. But it was actually an endearing feature to Japanese fans, not something that was looked on derisively. Recreating that signage was actually a challenge to the creators, because it was now Japanese creators actually having to create that same awkward but endearing signage in their own language. It was an interesting challenge, but we had a lot of fun doing it and populating the city with them.”

Los Angeles is central to the storytelling. “A continued refrain from the directors was that ‘the city is the other half of our cast’ – meaning pedestrians, signage, buildings, vehicles all combine to create the look and feel of Blade Runner,” remarks Chou. “We needed to create the look of the city that the audience can look at and say right away, ‘that’s Blade Runner.’ Blending and integrating the characters with all these elements and lighting them properly was a big part of our job on this production.” DeMarco recognizes the importance of the urban setting within the franchise. “The city itself, which is always a main character in any Blade Runner, took the longest,” observes DeMarco, “particularly recreating some of the buildings that were in the original film, like the Bradbury Building or the Tyrell Corporation ziggurat. Beyond that, we spent the most time developing the character of Elle. As the main character of the show, it was important to all of us that we got her right.”

“The entire journey of the show, from reaching an agreement to make the show to releasing it, was five years,” explains DeMarco. “A year to make the deal, a year or so on pre-production, a year and a half of production, a year and a half of post.” The biggest challenge was the tight production timeline. “Each process was moving simultaneously at times,” observes Chou. “If I am being honest, it was not a ‘controlled chaos’ situation but a real chaos. Producing a series at this level – not a short or a movie – was a real challenge that really taxed our system and pushed us to our limits. We’ve done a series before, but this was a whole other level of challenge, especially accounting for the impact of the COVID-19 pandemic to the running production. I often told our directors and supervisors that this is not a planned military operation. We were commandos who parachuted into a battlefield and needed to shoot our way out. Which we did!”
LEFT TO RIGHT FROM TOP TO BOTTOM: Illegal fights take place in the underground. Exploring the character design of Niander Wallace Sr. 2D model sheet for the face of Niander Wallace Jr. A lethal Blade Runner is Brad Marlowe, with his last name reflecting the crime noir inspiration for the franchise. Various facial expressions and poses are examined for Joseph. Conceptualizing the face and full frontal design of Alani Davis.
TECH & TOOLS
KEY NEW FEATURES IN COMPOSITING TOOLS TO HELP EXPEDITE YOUR WORK By IAN FAILES
If you’re a visual effects compositor, you may tend to stick mostly to just one tool for your 2D work. So you may not be overly familiar with the latest features in some other compositing software that can help improve your productivity. By identifying some key new features of the most popular compositing tools out there right now, you can find out about a useful feature or workflow in a tool you don’t currently use. Here, we talk to representatives from Foundry (Nuke), Autodesk (Flame), Blackmagic Design (Fusion) and Adobe (After Effects), who each highlight a new or prominent feature in their respective compositing toolset. Some of these features involve machine learning or AI techniques, while others deal with specific workflow speed-ups.

NUKE’S CUSTOM MACHINE LEARNING NETWORKS WITH COPYCAT
TOP: Nuke’s CopyCat node, which relies on machine learning techniques, in action for carrying out a paint fix. (Image courtesy of Foundry) OPPOSITE TOP TO BOTTOM: CopyCat copies sequence-specific effects such as garbage matting, beauty repairs or deblurring. (Image courtesy of Foundry) Camera Analysis is a match move 3D camera solver in Flame that aids in adding objects into a scene, carrying out cleanup work with projections, or crafting set extensions. (Image courtesy of Autodesk) Face match moving using the Flame Camera Analysis solver. (Image courtesy of Autodesk)
A number of compositing tools have adopted machine learning and AI techniques to enhance image processing. In particular, Foundry’s Nuke, in conjunction with CopyCat, allows you to orchestrate bespoke training for machine learning networks specific to individual compositing problems. “The easiest way to think about it is with, for example, a cleanup shot,” outlines Juan Salazar, Nuke Product Manager at Foundry. “Say you have a really difficult object removal to do on a 100-frame shot. You could do that cleanup on four or five carefully chosen frames, and pass those into CopyCat along with the original, un-cleaned-up versions of those frames. With that data and a bit of training time, CopyCat can train a machine learning algorithm whose only job is to remove that specific object from that specific shot. “Once you run that network on your shot,” continues Salazar,
“using the new Inference node, that object will be removed from every frame. Now, it probably won’t work for any other shots, but that doesn’t matter; you’ve solved your specific problem and you’ve done 100 frames of really tricky cleanup, but only painted five or so frames manually – the rest is done by the computer.”

CopyCat came about when Foundry Research Engineering Manager Ben Kent, as part of Foundry’s Artificial Intelligence Research (AIR) team, was investigating how to develop a machine learning deblur. “He was working with a shot where the focus slipped for a few frames,” recalls Salazar. “It’s the kind of shot where you’d have to cut when the focus slipped and be locked to the frame range that was in focus. Ben realized that by taking pieces of the frame while they were in focus, and the same pieces of the frame when they were out of focus, he could train an ML network to make the out-of-focus frames look like the in-focus frames. The network he trained fixed the focus slip and made the whole frame range usable.”

Salazar adds that CopyCat is not simply just a cleanup tool, and that it can generate any image-to-image machine learning network. “That means if you have reference images for where you’re starting and what you want the result to look like, you can train an ML network to do that for you. We’ve seen people doing all sorts of really cool things with it – style transfers, deep fakes, deblurs, and applying full lighting effects to normal renders.”
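The few-frames-in, whole-shot-out workflow Salazar describes maps to a small amount of Nuke Python. The CopyCat and Inference node classes ship with recent NukeX releases; the knob names, input order and paths below are illustrative assumptions rather than a verified interface:

```python
# Sketch of CopyCat's few-shot cleanup workflow (NukeX Python).
# CopyCat and Inference are real node classes; the knob names,
# input order and paths below are assumptions for illustration.
import nuke

plate = nuke.toNode('Read1')            # original, un-cleaned plate
painted = nuke.toNode('PaintedFrames')  # the 4-5 hand-cleaned frames

# Pair raw frames with their painted counterparts and train a
# shot-specific network.
copycat = nuke.createNode('CopyCat')
copycat.setInput(0, plate)    # assumed: 0 = Input (raw frames)
copycat.setInput(1, painted)  # assumed: 1 = Ground Truth (cleaned frames)
copycat['dataDirectory'].setValue('/shots/seq010/copycat/')  # assumed knob name

# After training produces a checkpoint, Inference runs it over the
# whole frame range.
inference = nuke.createNode('Inference')
inference.setInput(0, plate)
inference['modelFile'].setValue('/shots/seq010/copycat/training.cat')  # assumed
```

FLAME GOES AI WITH MATCH MOVE 3D CAMERA SOLVER CAMERA ANALYSIS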
Autodesk, similarly, has introduced several machine learning approaches into its compositing package, Flame. A highlight is Camera Analysis, a match move 3D camera solver that combines the techniques of SFM (Structure from Motion), Visual SLAM
(Simultaneous Localization and Mapping), and Semantic Machine Learning for non-static object recognition. With it, you can complete visual effects tasks like adding objects into a scene, carrying out cleanup work with projections, or crafting set extensions.

How does Camera Analysis work? Will Harris, Flame Family Product Manager at Autodesk, explains. “Autonomous vehicle-style ‘smart’ vision and dense point cloud solving work in conjunction with machine learning object detection to discard bad data from the scene of the camera solve, such as moving people, vehicles, skies and unwanted reflections such as those found in lakes, puddles, etc. Once solved, you can build accurate z-depth maps and/or 3D geometry of objects in your scene, for example, for painting, projections or relighting.”

Camera Analysis arose inside Flame due to requests from users on the Flame Feedback website, says Harris. “In developing the tool, we knew it had to be fast – by leveraging GPUs – accurate, and predominantly automatic, to allow Flame artists to use the feature within their daily workflow. The tool came together through new industry developments, including autonomous vehicle-style spatial solving via 3D motion vector analysis, as well as near real-time SLAM point solving, based solely on video footage.

“Beyond these advancements,” adds Harris, “it was a natural progression to reuse our machine learning-trained models for a human head, body, skies, and other objects to provide automatic occlusion from the algorithm. As a result, in a very short amount of time, this offers a primarily automatic camera solve with tens of thousands of points locked to the static features of a shot, with a reliable 3D camera.”

Indeed, Harris sees big changes coming in compositing and VFX tools to continue to take advantage of AI and machine learning workflows. “Among the recent machine learning-powered features are depth and face maps, sky extraction, human head and body extraction, and Salient Keyer,” Harris notes.
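The core move Harris describes – masking out people, skies and water before solving the camera – is easy to see in generic form outside Flame. A two-frame OpenCV sketch, an illustration of the idea rather than Autodesk’s implementation, with the mask assumed to come from an ML segmentation pass:

```python
# Generic mask-aware camera solving sketch (not Autodesk's code):
# track features, drop those on moving/unreliable regions, solve the pose.
import cv2

def solve_relative_pose(frame_a, frame_b, bad_mask, K):
    """Camera rotation/translation between two frames, ignoring masked areas.

    bad_mask: uint8 image, 255 where segmentation found people, sky, water;
    K: 3x3 camera intrinsics matrix.
    """
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Detect features only where the scene should be static.
    static_only = cv2.bitwise_not(bad_mask)
    pts_a = cv2.goodFeaturesToTrack(gray_a, maxCorners=4000,
                                    qualityLevel=0.01, minDistance=7,
                                    mask=static_only)

    # Track the surviving features into the next frame.
    pts_b, status, _ = cv2.calcOpticalFlowPyrLK(gray_a, gray_b, pts_a, None)
    good_a = pts_a[status.ravel() == 1]
    good_b = pts_b[status.ravel() == 1]

    # RANSAC discards whatever outliers the mask missed.
    E, inliers = cv2.findEssentialMat(good_a, good_b, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_a, good_b, K, mask=inliers)
    return R, t  # translation is unit-scale, as in any monocular solve
```

FUSION NOW SUPPORTING MORE PLATFORMS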
TOP TO BOTTOM: The user interface setup for Blackmagic Design’s Fusion 17 Studio. (Image courtesy of Blackmagic Design) Fusion is now able to run on the Apple M1 platform. (Image courtesy of Blackmagic Design) A scene being rendered in Adobe After Effects’ new Multi-Frame Rendering architecture. (Image courtesy of Adobe) The Composition Profiler in After Effects, which displays which layers and effects in a composition are taking the most time to render. (Image courtesy of Adobe)
The standout new feature for Blackmagic Design’s compositing software, Fusion Studio, is not so much a feature as its new capability to run on the Apple M1 platform. These M1 chips are available in the latest MacBook and Mac mini systems from Apple, furthering a trend that sees visual effects creation enabled on smaller, portable machines.

As Dan May, President of Blackmagic Design, Americas, comments, the M1 platform is “a non x86-GPU system and our fourth platform we support. Fusion now runs seamlessly on Windows, Linux, OSX and M1. What makes this possible is the under-the-hood GPU compute optimizations using Apple’s Metal, AMD and NVIDIA GPUs to process more of the pipeline faster.”

Fusion was primarily Windows-based prior to its acquisition by Blackmagic Design in 2014. It has since been redeveloped to work across platforms and be more broadly available. Indeed, Fusion is closely tied in with Blackmagic Design’s DaVinci Resolve, which has both free and paid versions.
Beyond the M1 support, May notes that there are also improvements in the GPU compute image pipeline and multi-language support for the interface. “We’ve also expanded Fusion’s availability by integrating its capabilities into our DaVinci Resolve post-production software, which has been a long development path,” advises May. “So, there’s now the standalone Fusion Studio VFX and motion graphics software, and DaVinci Resolve, which is an end-to-end post-production solution that features a robust Fusion page for compositing.”

AFTER EFFECTS IS ALL-IN ON MULTI-FRAME RENDERING
A new central feature in Adobe’s After Effects is Multi-Frame Rendering. The intention of the feature is to utilize a system’s CPU cores when previewing and rendering. Inside the feature are a couple of specific attributes aimed at enhancing performance. One is the Speculative Preview, which renders compositions while the application is idle. Then there is the Composition Profiler, which displays which layers and effects in a composition are taking the most time to render.

“The Composition Profiler will show you what’s actually taking up your render time,” observes Victoria Nece, Adobe Senior Product Manager, Motion Graphics and Visual Effects. “That’s something that I think is particularly relevant for anyone who’s dealing with a complex project or handing off a project to someone else and trying to figure out, ‘Wait, what’s going on here? Why isn’t this performing like it should?’”

Nece identifies Multi-Frame Rendering as being the “No. 1 feature request for the better part of a decade. This is something that everyone’s been asking for – ‘Use more of my hardware, use it more efficiently.’ When we took a look at what people wanted, it wasn’t just to make it faster. We really saw that it was about the preview and iteration loop as the biggest things. It’s not always about your final export. So now while you’re browsing for effects or you’re writing an email, it’ll start rendering in the background so that when you get back to your timeline, it’s ready to preview.”

To enable Multi-Frame Rendering, Adobe’s internal ‘Dynamic Composition Analysis’ allows After Effects to look at the hardware that’s available and the complexity of the project from frame-to-frame, and then scale up and down the system resources that it is using to manage the rendering. “If you have a project that’s really slow through one part and really light through another part,” explains Nece, “it’ll scale up and down how much of your system it’s using for each frame. You can actually watch that number change up and down while you render.

“Multi-Frame Rendering is a workflow efficiency feature,” concludes Nece, who adds that M1 support is also moving ahead in After Effects. “It’s really all about speed, and it is the idea of helping the software get out of your way. And so, instead of being something that’s front and center – like, ‘Wow, this is a great new effect’ – now you don’t have to think about what’s going on inside After Effects because you can just think about the work you’re doing in it instead.”
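The principle underneath Multi-Frame Rendering – frames are independent work items that can fan out across CPU cores, with concurrency scaled to the hardware – can be illustrated with a generic sketch (a conceptual analogy, not Adobe’s implementation):

```python
# Conceptual analogy for multi-frame rendering (not Adobe's code):
# frames render independently, so distribute them across CPU cores.
import os
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame):
    """Stand-in for evaluating one frame's layers and effects."""
    # ... composite the frame here ...
    return frame, f"comp_v001.{frame:04d}.exr"

def render_range(first, last, workers=None):
    # Scale the worker count to the machine, the way MFR's dynamic
    # analysis scales system use with per-frame complexity.
    workers = workers or os.cpu_count()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for frame, path in pool.map(render_frame, range(first, last + 1)):
            print(f"rendered {path}")

if __name__ == "__main__":
    render_range(1001, 1100)
```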
A New Compositing Tool on the Horizon?

Arguably there are relatively few off-the-shelf compositing packages available for the wider public. But soon a new tool is looking to enter the market: Autograph, from French company Left Angle. The founders of Left Angle, Alexandre Gauthier-Foichat and Francois Grassard, have experience in software development, including the animation software Anima and the open-source compositing tool Natron. Autograph is designed to focus on motion graphics and VFX, as well as on the wider video and multi-format needs of content made for social networks. The tool is a compositor that combines both a layer-based and nodal interface, and relies on a combination of 2D and 3D approaches.

“The layer-based approach is there to help you work quickly and easily synchronize animated graphic designs with sound and music,” notes Grassard. “On top of that, Autograph will allow for the connection of parameters together and adding of different kinds of modifiers – for images and sound but also textual, numerical and geometrical – without expressions or scripting, giving you a high level of control.

“We will also provide several features for creating complex animations quickly, precise VFX tools like a keyer, re-timer and a tracker based on motion vectors. And we will integrate a full 3D renderer which can ingest huge scenes through the USD standard. The idea is that you can render your scene on the fly and then composite all passes without saving any files on your storage, in a unified solution.”

The first version of Autograph is planned for release in Q2 of 2022. It will be distributed by RE:Vision Effects.
TOP: A screenshot from the upcoming new compositing tool Autograph, from Left Angle. (Image courtesy of Left Angle)
SHADOWS AND FOG, WITCHERY AND TRICKERY HAUNT JOEL COEN’S THE TRAGEDY OF MACBETH By KEVIN H. MARTIN
Images courtesy of east side effects, Inc. TOP TO BOTTOM: Digital mimicry was used for more expansive views.
Shakespeare’s Macbeth has been a frequent subject for cinematic adaptation. Orson Welles labored under poverty-row conditions – shooting on sets left over from Roy Rogers westerns – to fashion the first sound version, released in 1948. Nearly a quarter century passed before Roman Polanski tackled the project, but after that, a bevy of versions appeared on television. In 2015, a lush new Macbeth feature was directed by Justin Kurzel. This exploited the natural beauty of expansive location work, enhanced by significant VFX background replacements by BlueBolt and augmented by Artem’s pyrotechnic floor effects and makeup prosthetics.

Joel Coen’s The Tragedy of Macbeth, starring Denzel Washington and Frances McDormand, eschews the naturalism of that most recent adaptation. Finding an appropriate tone for the visual effects was principal among the challenges for east side effects, Inc., a New York-based VFX company with credits for directors ranging from Charlie Kaufman, Darren Aronofsky and Ang Lee to Rob Marshall and Peter Berg. Having created the bulk of effects on Coen’s previous film (his last to date with brother Ethan), The Ballad of Buster Scruggs, Co-founders and Visual Effects Supervisors Alex Lemke and Michael Huber boarded the project very early in prep.

“Visual effects people, by virtue of expectation and experience, almost automatically plan and execute with an eye toward photorealism,” says Huber. “But The Tragedy of Macbeth was not about that at all – not in the slightest. The work involved matching to live action, yes, but the live action had this thrilling and unique approach that straddled Shakespeare’s theatrical history on stage with the visual splendor of films like Murnau’s Sunrise and the innovations of Orson Welles’ Citizen Kane. So that was a bit of a rethink – a welcome one, certainly – to bring our thinking in line with Joel’s, and that was the most interesting part of it. We’d enhance what they had done, but also come up with our own creative contributions that blended with production’s work.”

Given the film was being produced independently, budget was
very much an issue, so the head-start helped facilitate planning and maximize resources. “A lot of our process involved approaches we’ve used on past projects for Joel,” notes Lemke. “These often consist of looking up various visual references to get on the same common visual ground with him and achieve the exact look he is after.”

“At first, Joel just wanted to discuss conceptual aspects and kick around possible approaches,” recalls Huber. “Even without Ethan being involved this time, it was still very much a collaborative process that included Director of Photography Bruno Delbonnel and later on Production Designer Stefan Dechant for every meeting. We prevised a couple of sequences, which later helped everybody see what was needed, and even aided the actors by showing them more about what would be happening beyond the confines of the set build.”

A rather elaborate technical previs resulted. “It wasn’t anywhere near on the level of what happens on a Marvel show,” admits Lemke, “but it did reveal that nailing things down the way you do on heavy VFX pictures might not be the way this time. When Denzel sees a raven in the apparition chamber, we initially worked out an exact flight path, and production set up speakers on set so you could hear exactly where the bird would be in the room if it were actually flying through instead of being done in post. It seemed like a much more useful approach than just waving a tennis ball around. But then we all realized that when you’re dealing with these extraordinary actors, you don’t want to limit
But then we all realized that when you’re dealing with these extraordinary actors, you don’t want to limit Denzel Washington’s responses by tying him to some prearranged series of beats. So we turned things round, and in the end matched our animation to what he did.”

As in Welles’ film, the misty Scottish moors of Coen’s Macbeth were achieved on stage.
TOP TO BOTTOM: Visually, the Coen Macbeth intent fell between that of a theatrical presentation and the stylization that marked numerous Expressionist films, requiring VFX to take a more artful and less photorealistic tack. The three witches encountered by Macbeth manifest both physically and as reflections, but not with consistency, enhancing their ethereal presence.
TOP TO BOTTOM: After providing numerous VFX shots for the Coens’ The Ballad of Buster Scruggs, east side effects added both atmosphere and backgrounds to many shots requiring set extensions on the shot-on-stage feature. OPPOSITE TOP TO BOTTOM: Many of Citizen Kane’s Xanadu interiors featured limited set construction that was embellished through the use of matte art. With Macbeth, a similar approach permitted limited art department constructs to be extended digitally. East side effects also previsualized several scenes, which helped actors imagine the missing parts of the environment. The company was also responsible for creating the various birds seen in the film, as a mix of existing live-action footage and CGI.
“There was a lot of discussion with the DP about how to handle atmospherics,” Lemke remarks, “and figuring out how much would be done practically on set [by Special Effects Supervisor Scott R. Fisher]. Nobody wanted to try to do the whole shoot ‘clean’ and then add all the atmospherics in post; it certainly wouldn’t have helped the performances, and it would have been far too complicated to sustain anyway. So it was always going to be a mixture, combining a decent amount of practical atmosphere – through which Bruno’s lighting would be working – and VFX. Most sets were just hazed up to a certain level; our end was often a matter of creating some animated structure within the fog, giving it some life and movement. While we could have gone down the CG simulation route, instead we went with a few 2D-shot elements against black. These smoke elements saw a lot of reuse, slowed up and sometimes inverted, just like they might have tried if doing this back in the 1930s. This approach fit Joel’s tone best.”

Other scenes requiring visual effects were often handled as simply as possible. The opening – with three witches revealing themselves to Macbeth and uttering a prophecy about his destiny – utilized takes with and without the talent present in order to create the eerie illusion that two of them only manifested as reflections in the water. “A lot of the visuals drew upon a huge online look book,” says Huber. “That kept getting added to by Stefan, Bruno and Joel, so we always had style references to examine together. The Night of the Hunter, directed by Charles Laughton, was a really big reference; Throne of Blood by Kurosawa and Dreyer’s The Passion of Joan of Arc were others.”

One of the more expansive faux exteriors, called the crossroads set, utilized the largest stage on the Warner lot. While providing a striking showcase for the art department, it also involved east side’s assistance for a largely ‘invisible’ effect. “With respect to the background for that set, we debated the use of bluescreen,” Lemke acknowledges. “And while that made sense from an efficiency standpoint and that of consistency, it meant there would be an enormous number of visual effects required. Instead, we had a digital matte painting created, which was then printed out and enlarged to serve as a practical background on that stage. That worked fine for most shots, but inevitably the backdrop had droops at certain points. So that meant we had to do some fix-it work on certain shots, but it was a lot cheaper than doing all those screen comps.” VFX also added some subtle animation to the backdrop’s skies in order to better integrate the whole of the environment with the cinematographer’s use of moving lights in the foreground.

The other bit of largely invisible magic involved birds – single creatures seen zooming around Macbeth and a huge gathering that fills the screen during the film’s conclusion. “We used a mix of hero animation and flocking work for the birds,” says Huber. “There were plates of real birds that got mixed with the CG birds for the end shot; the latter was essentially a particle system that got us such a tremendous volume of them.” (A minimal sketch of that kind of flocking system appears at the end of this article.) Lemke suggested the use of real bird footage to the director. “I worked on a film 15 years back that featured a lot of ravens,” he relates. “Back then, doing a lot of CG birds was difficult and expensive, and we got some really good footage. I was still on good terms with the production company on that film, so I was able to get access again to what was shot. We knew the views with the single bird coming close to the camera really needed to withstand scrutiny and so would feature the real thing.”
Out of 10 bird shots, only half required CG animation.

Shooting was about two-thirds complete when COVID reared its head, shutting the film down. “Joel figured out that rather than just keep sitting around wondering what to do during the interim, he should edit the sections that he could,” says Lemke. “So, since we had all the material that was shot at home, Mikey [Michael Huber] and I started to work up temp comps, then refined them for finals.
TOP TO BOTTOM: Eschewing an entirely practical approach to atmosphere as too prosaic, production opted for VFX atmosphere extensions – largely 2D real-world smoke elements – that provided structure and animation to the Scottish murk.
“By doing 4.5K proofs, we presented very solid shots for him to cut into his edit. And during this same time, Stefan continued to work on design concepts. His background as an illustrator really helped us tremendously in terms of communicating ideas; we could send him still images of our shots and he’d just draw right on them to show us what he thought would help complete the visual. Normally, it is a race to finalize and finish once shooting has been done, but we were able to take advantage of the delay. Using a small team over a longer period of time let us explore a lot more options, and dig more deeply, than would have been possible under more normal conditions.”

Huber was pleased with how the situation permitted more sustained input from the DP. “Bruno was also very involved in post, more than was possible on Buster,” he reports. “Having that extra time to talk about things, especially as shots were getting finessed at the very end of post, was another wonderful part of all this. Bruno had been very daring with how he shot the live action, but, having worked things out in advance with his colorist and DIT, he was able to push the look even further during the DI. This was all done in service to Joel’s visual ideas for the film, which involved staying true to the play’s theatrical origins, but also utilized aspects of German Expressionist filmmaking. All in all, it was about helping bring something uniquely Coen to the perspective.”
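As promised above, here is a minimal sketch of the kind of flocking system Huber describes – CG birds driven as a particle-style simulation. This is an illustrative boids-style toy in Python/NumPy, not east side effects’ actual setup; every parameter and value here is invented for the demo.

import numpy as np

rng = np.random.default_rng(0)
N = 400                                   # number of CG birds
pos = rng.uniform(-50.0, 50.0, (N, 3))    # world-space positions
vel = rng.normal(0.0, 1.0, (N, 3))        # initial velocities

def step(pos, vel, dt=1.0 / 24.0, radius=10.0,
         cohesion=0.02, separation=0.5, alignment=0.05, max_speed=15.0):
    """Advance the flock one frame with the three classic boids rules."""
    offsets = pos[None, :, :] - pos[:, None, :]   # offsets[i, j] = pos[j] - pos[i]
    dists = np.linalg.norm(offsets, axis=-1)
    np.fill_diagonal(dists, np.inf)               # a bird is not its own neighbor
    for i in range(len(pos)):
        nbrs = np.where(dists[i] < radius)[0]
        if nbrs.size == 0:
            continue
        vel[i] += cohesion * (pos[nbrs].mean(axis=0) - pos[i])        # steer to local center
        close = nbrs[dists[i, nbrs] < radius * 0.3]
        if close.size:
            vel[i] -= separation * (pos[close].mean(axis=0) - pos[i])  # avoid crowding
        vel[i] += alignment * (vel[nbrs].mean(axis=0) - vel[i])        # match headings
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > max_speed, vel * (max_speed / speed), vel)  # clamp speed
    return pos + vel * dt, vel

for frame in range(48):   # two seconds at 24 fps
    pos, vel = step(pos, vel)

Production systems layer goals, obstacle avoidance and hand-animated hero birds on top of a core like this, which is why the mix of “hero animation and flocking work” Huber mentions is the norm.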
DISCOVERING THE VOICE OF BELLE IN THE VIRTUAL WORLD OF U By TREVOR HOGG
Images courtesy of Studio Chizu and GKIDS. TOP: Jin Kim reinterpreted the freckles on Suzu as a facial tattoo on her avatar Belle. OPPOSITE TOP TO BOTTOM: Belle witnesses the destructive side of U inhabitants. A major inspiration for Belle is Beauty and the Beast. The crescent-shaped moon creates a ‘u’ within the virtual realm.
Exploring the impact of the Internet has been an ongoing fascination for Japanese filmmaker Mamoru Hosoda, beginning with the Digimon franchise, continuing through Summer Wars and arriving now at Belle, which transposes the story of Beauty and the Beast into a virtual reality populated by avatars created from the biometrics of their users. Suzu Naito is a 17-year-old high school student, traumatized by her mother’s drowning when Suzu was a child, who rediscovers her singing voice in the virtual realm of U.

“I’ve been making movies that deal with themes of the Internet for 20 years,” notes Mamoru Hosoda through a translator while attending the Animation is Film Festival in Los Angeles. “When I was making films like Digimon, I had this sense that the Internet was a brand-new world or space where the younger generation could smash the old world and create something new for themselves. Fast forward to today, and the Internet has become a necessity for all of us to live, and because of that, our reality and what exists inside the Internet have become much closer.”

Unlike in America, where recorded dialogue serves as the basis of the animation, in Japan the voice tracks are recorded afterwards – even with singing playing a major role in the narrative. “I wrote all of the storyboards first, from which the composers were able to create the music,” reveals Hosoda. “Then we animated, tried to lip sync and added the choreography to it.”

By having the story unfold in reality and the virtual world of U, Hosoda was able to make use of 3D animation for the first time in order to contrast the environments. “It’s interesting that you bring up the three different animation styles,” notes Hosoda. “There is the 2D reality, 3D-generated U and the third one, which is the abstract labyrinth done by a company in Ireland called Cartoon Saloon and directed under the guidance of Tomm Moore [Wolfwalkers].” U was treated as a global melting pot of various cultures. “It’s a megacity and has this vitality where people can come express themselves. In contrast, the rural areas where Suzu comes from are losing their position in this globalized economy.”
Initially, U was modeled on the totem structure featured in Summer Wars, but the concept was abandoned. “At the time, the title was called Love Song,” explains Eric Wong, who served as a co-production designer on the project. “We started from a string of a harp, zoom into this world and pan down into the gathering hub. You get this beautiful central line of the equator running right down the middle of the city amongst this sea of skyscrapers and geometric shapes.”

The CAD program MicroStation made shifting between 2D and 3D sketches possible. “The sketchbook was helpful as it made the time I spent traveling to and from my full-time architectural job the most productive period,” states Wong, “as I was able to plan out the sheets and my ideas of what I was going to do each night.” A dominant visual concept was the idea of a bright night. “We went through various tests that explored changing the size of the moon, the opacity of it and the color of the night sky. Having a 3D model allowed me to adjust the night and the light around it as I pleased.”

Abstracted puzzle pieces that took the form of screens were the central elements in the construction of the retractable spherical stadium. “I looked at Big Hero 6’s San Fransokyo, which is a portmanteau of San Francisco and Tokyo,” explains Wong. “The advert ring that surrounds the stadium was made straighter, much like the advertisements you get at football stadiums.” Corporate logos and user interfaces were simplified. “Capitalization gets reduced to lowercase lettering,” he says. “Drop shadows and bling get reduced to simple block colors. Text usually gets removed from the drawing and gets placed underneath. Thin text gets replaced with thicker text.”
TOP TO BOTTOM: U is a virtual reality platform which echoes Mamoru Hosoda’s belief that the Internet mirrors real life. Avatars of various shapes and sizes populate U. Character designs of Ruka Watanabe, who serves as the main inspiration for Belle.
Even how the app icon appeared on the smartphone had to be considered, Wong reports. “I tested different options of the interface: where it is the City Strip U design, the U is along the edge of the outline, or the U touches the top, or sits simply in the middle. If it is different, does that mean you interact with it differently? Do you swipe through it? Would the interface be different if you swiped up?” The white ballroom from 2001: A Space Odyssey was an inspiration for the gateway sequence when Suzu transforms into Belle and enters U for the first time. “The walls were removed and replaced with these curtains that Suzu could push through,” describes Wong.

Several character designers, including Jin Kim (Over the Moon), worked on Belle. “We asked the character designers to design their interpretation of both Belle and Dragon/Beast,” states Hosoda. “Ultimately, we settled on Jin’s interpretation of what Belle looks like as probably the best. I recall having a lot of conversations with Jin about what it means between Belle and Suzu, and how what appear to be two extremes are actually the same person deep down.”

Kim kept in mind that he was creating an avatar. “She is not real,” states Kim. “I wanted to mix some modern feel and then fantasy. I looked at K-pop artists and opera singers. There is a famous Korean opera singer, Sumi Jo. She is a classic singer, and I studied her gestures and attitude. That was the starting point of developing Belle.” Suzu also wants to look like popular classmate Ruka Watanabe. “Suzu wants to be popular amongst the group of friends,” adds Kim. “At the time, there was no design for Ruka in the real world. Director Hosoda wanted the Belle design first and then to design Ruka.” However, a prominent physical trait of Suzu does appear on Belle. “I didn’t want to do random spots like real freckles, so I designed a cartoon face tattoo, and the director liked it.”
Belle is frequently in the presence of whales, a favorite visual theme for Hosoda. Comments Kim, “The idea that someone is able to stand on this whale and sing shows that there is something special about Belle.”

Prep time was minimal for animation. “We shaped our animation style as we created actual shots,” remarks Ryo Horibe, CG Director at Digital Frontier. “For scenes where characters sing, our animators looked at movements of Kaho Nakamura, who voiced Belle, as well as a contemporary dancer, and incorporated them into motions.” American animation was an influence. “Upon hearing the director’s vision, I came up with an idea of combining Japanese animation style [limited animation on 2s and 3s] with full-frame animation like Pixar’s,” remarks Horibe. “We kept body movements closer to Japanese style while incorporating Pixar’s style into facial expressions, especially eye movements. Obviously, we looked at Disney’s classic animation for reference as well. Having Jin Kim as the character designer [of Belle] also inspired us to create this style.”

A lot of focus was placed upon expressing emotion through facial expressions and hand poses. Adds Horibe, “Though we had 3D models as a base, adjustments on facial parts and hand models were necessary, depending on the camera angle of each shot, in order to create a better expression. Also, we employed a system that allowed us to composite background colors and tones without going back to 3D applications, in order to quickly address feedback from the director.”
TOP TO BOTTOM: A climactic moment occurs when Suzu decides to unveil herself inside of U. Hiroka Betsuyaku and Suzu Naito desperately seek the real identity of the Dragon. Moments of stillness are used for dramatic and comedic effect.
TOP TO BOTTOM: A standout comedic moment occurs when Suzu has to intervene on behalf of embarrassed classmate Ruka Watanabe. Whales are a favorite visual motif for Mamoru Hosoda. Suzu attempts to sing after being unveiled as Belle. It was important to portray the unveiling of Suzu as a painful but beautiful moment.
“With the opening shot being long, we were careful with camera movements and time distribution,” remarks Horibe. “We also paid extra attention to the layout of buildings and details to make the world of U appear vast and grand.”

Cartoon Saloon provided some surreal animation for the scenes taking place on the outskirts of U. “We received 2D background data from Cartoon Saloon,” says Horibe. “We created scenes in 3D based on that data and added motions. In the process, we looked at their works as reference and created the motions the design intended.” The castle of the Dragon gets destroyed. “For this spectacle scene where the castle burns up in flames,” Horibe explains, “we went through a lot of trial and error trying to find the right balance between frailty, as in a paper-crafted model burning up instantly, and the dynamics of flames as graphics. Flames and smoke were created in 3D, but adjustments were made so they look like 2D as well.”

Many scenes required visual effects. “We used water effects for a countless number of avatars, and confetti in the opening sequence as well as for the performance at the theater,” confirms Horibe. “There were also spectacle effects like flames and castle destruction. Every effect needed to be matched with the stylized world of U, instead of being photorealistic, and that was a challenge. Also, we did a crowd simulation for many scenes, as we had to place lots of avatars in backgrounds.”

Effects had to be created for when Suzu unveils herself as Belle. “Details of the effects used for the unveil scene were not included in storyboards,” states Horibe. “An ‘unveil’ reveals someone’s identity, and thus it causes some sort of pain. And we needed to portray Suzu’s determination to accept that pain beautifully. Using water current and bubbles as a motif, we were able to create an effect that gives Suzu the power to transform while making a link to her painful memory of her mother drowning in a river.”

The climactic scene where Belle sings on top of the whale is linked to the opening scene.
“Though her 3D model remains the same, Suzu as a character has grown a lot through the story, and we needed to communicate that through Belle’s singing and facial expressions. With the help of wonderful animators, effect artists and compositors, we were able to make this scene memorable.” Horibe adds, “As you can imagine, Belle was the most time-consuming of all, and her acting and expressions required the most iterations. As for environments, the interior of Dragon’s castle, the digital noise effects and the live performance scene at the end took many hours.”

Moments of stillness and silence are found throughout the movie. “There is this concept in Japanese culture called Ma, which in movie language translates to a gap or negative space,” explains Hosoda. “Through nonmovement you can actually express something. This is also something that you see when Suzu is on the bus and there are a lot of reflections across her face. Had Suzu been acting or moving exactly as the audience thought she would, then it would have taken the audience out of that moment.” That is not to say there are no extreme moments, as demonstrated by Suzu’s best friend in real life, Hiroka Betsuyaku, and her avatar. Comments Hosoda, “The animator did a really good job of expressing that emotion and not wanting the animation to be outperformed when voice actress Ikuta-san stepped up to the plate to deliver that evil laugh!”

Initially, Kim was not a fan of anime. “One main reason why I wasn’t a big fan of anime was the flow of the movement,” reveals Kim. “I understand that the productions want to save some money, so that’s why they started using limited animation, like using one drawing for a few frames. Hosoda does it cleverly in his movies. For the train station sequence, he used a minimum number of drawings, but the impact was so beautiful. If he used a lot of awkward movements, like in U.S. animation, it wouldn’t have had the same impact. I started to learn those things and began to appreciate anime.”
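The limited-animation timing Kim describes – one drawing held for two or three frames – is easy to see in a toy example. A hedged sketch in Python, illustrative only; the function and values are invented for the demo:

def hold_frames(samples, hold=2):
    """Resample a per-frame value list so each drawing is held for `hold`
    frames: hold=2 is animation 'on 2s', hold=3 is 'on 3s'."""
    return [samples[(i // hold) * hold] for i in range(len(samples))]

smooth = [i / 23.0 for i in range(24)]   # one second of 24 fps linear motion
on_twos = hold_frames(smooth, 2)         # 12 distinct drawings per second
on_threes = hold_frames(smooth, 3)       # 8 distinct drawings per second

The motion data stays at 24 fps, but the pose only changes every second or third frame – the stepped, punchy look of hand-drawn anime that Hosoda’s team blended with Pixar-style full animation on the faces.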
TOP TO BOTTOM: The performance of Belle was partially modeled on Korean opera singer Sumi Jo. Background artwork of the exterior of Suzu’s house. A storyboard of Suzu’s bedroom, which explores size and scale. Eric Wong made use of abstracted puzzle pieces to construct the spherical stadium.
TECH & TOOLS
THE SECRETS BEHIND STOCK EFFECTS ELEMENTS By IAN FAILES
These days, whenever we see a raging fire, or a plume of smoke, or a muzzle flash on the screen, there’s a high chance it has been added to the scene with visual effects. And, of course, it might be something added to the shot courtesy of a particular stock effects element. Effects elements have long formed part of visual effects studio libraries, and can now be purchased as pre-filmed, pre-canned elements from various providers. But how are those elements typically captured? Here, two visual effects studios – Rodeo FX and Mavericks VFX – and two stock effects companies – ActionVFX and The Light Shop VFX – explain how they shoot and utilize effects elements.

RAIN, HAIL OR SAND
TOP: Bullet-hit elements shot and provided as stock effects by ActionVFX. (Image courtesy of ActionVFX) OPPOSITE TOP: The Light Shop VFX flame stock effects elements. (Image courtesy of Tyler Kindred) OPPOSITE MIDDLE AND BOTTOM: The original plate for a scene from Don’t Breathe 2, in which VFX Legion would add fire elements from ActionVFX. And then the final shot by VFX Legion. (Images courtesy of Sony Pictures)
In the time of analog and optical effects, many VFX studios had their own ‘space’ to film effects elements. Generally, those days are long gone, but Rodeo FX is one studio that has retained a dedicated 4,000-square-foot stage in which element shoots can take place. The space in Montreal, which is also set up for bluescreen/greenscreen, motion capture and LED screen filming, enables crews to shoot elements in a controlled lighting environment.

One element shoot conducted at Rodeo’s stage was for helicopter rotor wash and rain for the film Pacific Rim. “We created cut-outs of silhouettes and propelled vapor stream against them to simulate the blade wash of the helicopter on people, melted lead with a blowtorch, and filmed it macro and high speed on greenscreen to mimic melting metal in the cockpit,” recalls Robert Bock, Visual Effects Supervisor and Director of VFX Photography at Rodeo FX. “We also built a small tabletop pool and put a vibrator underneath to create ‘fractal’ waves/textures. For water drops on the helicopters, we filmed milk and water on black screen sliding down glass. In our parking lot, we set up rain towers with sprinklers we got from Home Depot on stands, a black backdrop for showers and a collecting pool for drop textures – all filmed at high speed to help with the scale.”
Bock believes the value of these shot-specific elements – another example was a shoot on the Rodeo stage of a stand-in dressed in black, spitting out ‘sand’ to composite into a desert scene in Aquaman – lies in being able to match the lens and lighting from the principal photography. And, he says, some of the element shoots are directly referenced when the final elements end up being done in computer graphics. “A very important part of our work is lookdev. We often design and experiment with abstract practical effects in order to give the director an artistic direction before going into FX and simulations. That’s a great time-saving tool.”

FIRE ON A TENNIS COURT
Mavericks VFX Founder and Visual Effects Supervisor Brendan Taylor certainly agrees with that time-saving assessment. “Shooting something for 20 minutes is way better than trying to shoehorn an element in for two hours that might not work,” Taylor says. “We’ve got the lights, we’ve got the cameras, we’ve got everything. Honestly, it’s put up two lights, set up the camera, lensing works, we’re good to go – boom!” For the film A Dog’s Purpose, for instance, Taylor oversaw a moment that required significant fire and smoke for a front landing setting. An initial CG fire effort proved unsuccessful, so Taylor turned to the special effects company Dynamic Effects to help with a practical fire element shoot. “We rented out a space that was meant for firefighter training, built a porch and a roof that could be lit on fire at night. It was all raised up so that we could get the cameras at the same level and so the flames could be keyed against the black sky. We then took the actual roof plate and just composited those flames right in.”
It was so successful that a similar approach was repeated for a panning shot featuring the film’s star dog as it runs up some stairs past a fire. The same firefighting space was not available, so instead, Taylor found an abandoned tennis court near his family cottage to capture the fire elements at night. Another time, fire elements for a person-on-fire shot were filmed at the tennis court location on a plastic mannequin. It’s all part of Taylor’s desire to find a quick practical solution wherever possible. “A further example is for Fahrenheit 451, where we had to show a weird laser effect,” he adds.
LEFT TOP TO BOTTOM: A shoot for an explosion effects element that will be available at The Light Shop VFX. (Image courtesy of Tyler Kindred) A plate of Jake Gyllenhaal for the spider scene in Enemy, with the greenscreen spider effects element shot by Rodeo FX and the final composited Enemy shot. (Images copyright © 2013 Entertainment One. Courtesy of Rodeo FX) RIGHT TOP AND BOTTOM: Mavericks VFX Facility and Practical FX Manager Stuart Wall oversees all manner of effects element shooting, including with blood. (Image courtesy of Mavericks VFX) Stuart Wall handles some meat burning as reference elements at Mavericks VFX. (Image courtesy of Mavericks VFX)
“So we rented a laser machine and a smoke machine, and we shot something and gave it to our compositors as reference. Then on The House with a Clock in Its Walls, there was a shot where acid needed to start burning through a desk. If you put acetone on Styrofoam, it gives the same effect. We just shot a bunch of elements that way.

“I don’t come from a compositor background,” comments Taylor. “I don’t come from a CG background. I come from a set-shooting background, and that stuff is really exciting and can produce amazing results, fast. I find it really, really fun.”

FILMING STOCK EFFECTS ELEMENTS
Rodeo and Mavericks tend to film elements in response to specific shots they are working on. Productions themselves might also carry out bespoke shoots. Then there are a number of stock effects element solutions, such as the elements available from ActionVFX. The company offers many element collections – fire, explosions, smoke, muzzle flashes, bullet hits, blood hits, weather effects – that it produces out of dedicated shoots. “For one of our collections, Forest Fires, we had to source and work with large logs that were 10 feet tall and weighed close to 800 lbs. each,” details Zac VanHoy, Head of Product Development at ActionVFX.
“We tested multiple accelerants to find out how to get the fire to burn long enough while also getting that deep orange flame that looks so great on camera. Since we have the ability to film at resolutions higher than 4K and also utilize the wide dynamic range of the RED cameras, we can plan for these unexpected things much better and get great takes even when something doesn’t go exactly as planned.”

ActionVFX typically keys out the background of any effects element it shoots to produce an isolated element, as well as carrying out paint work and rotoscoping to remove extraneous details. “More than likely an element needs to interact with the environment in a shot,” notes ActionVFX CEO Rodolphe Pierre-Louis. “A fire doesn’t simply float in midair; it needs a horizontal ground plane, a wall, or a tree to burn off of. So we capture all these varieties at multiple angles, on various surfaces, to ensure that the artist has all they need to comp a realistic shot.”

One recent production that utilized ActionVFX fire elements was Don’t Breathe 2. Here, visual effects studio VFX Legion used the elements to enhance existing in-plate flames, add fireballs to rooms and rooftops, and show entire structures engulfed. There were nearly 40 VFX sequences where the elements came into play. “Due to time constraints and the sheer amount of shots, using simulated fire wasn’t a realistic option for them,” says Pierre-Louis. “Our extensive fire library helped them achieve the most realistic results.”

BANG!
A new entrant in the stock effects elements arena is The Light Shop VFX, established by visual effects artist Tyler Kindred. This marketplace currently specializes in fire effects elements, live-action explosions, sparks, smoke and larger-scale fire, acquired with ARRI cameras on anamorphic lenses. Kindred recounts a shoot for some Light Shop VFX elements involving C4 explosives. “This took place on a ridge overlooking a clear blue sky, where we then had a sniper in position to shoot the explosive into detonation from a safe distance. This allowed us to key for transparency and still have this authentic earthy feel to the billowing smoke. We rely largely on veterans from the military and navy to assist with explosives, fires and firearms. This same crew also works in the film industry, and has a very safe and dedicated protocol when handling these more dangerous procedures.”

How do artists, then, use these stock effects elements? One thing that’s consistent among the shooting of these pieces of footage by the studios and companies is that VFX artists are generally also behind the actual filming. This means they tend to understand the next compositing steps for those elements. “For example,” advises Kindred, “for fire elements, the standard process is to use blend modes, typically ‘screen,’ to composite. Smoke is added separately with the same process, as well as sparks and any debris. Most compositors add some kind of glow effect, though I often refrain because I like the sharpness of the flames. You know, digital explosions have largely taken over as the go-to asset for effects, but we believe there is still a place for live-action explosives and elements.”
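The ‘screen’ blend Kindred mentions is simple enough to show in a few lines. A hedged sketch in Python/NumPy on normalized [0,1] RGB arrays – illustrative only; in practice this happens on graded plates inside a compositing package:

import numpy as np

def screen(plate, element):
    """Screen blend: brightens the plate wherever the element is bright,
    while pure-black (empty) areas of the element leave the plate untouched."""
    return 1.0 - (1.0 - plate) * (1.0 - element)

plate = np.full((4, 4, 3), 0.2)   # a dim background plate
fire = np.zeros((4, 4, 3))
fire[1:3, 1:3, 0] = 0.9           # a hot red patch shot against black
comp = screen(plate, fire)        # flames lift the plate only where they exist

Because the math passes black through unchanged, elements photographed against black – like The Light Shop’s flames – composite without a matte, which is exactly why these libraries shoot that way.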
When One of Your Effects Elements is… a Spider

On Denis Villeneuve’s Enemy, Rodeo FX was called upon to help realize a shot where the main protagonist (played by Jake Gyllenhaal) walks into a room to find a huge iconic spider against the back wall. “This was the final shot of the film and was quite important,” states Rodeo’s Robert Bock. “We analyzed our options and budget and decided to go ‘old school.’ We hired an animal wrangler and did a casting of spiders. We sent casting photos of the different spiders to Denis – it’s fascinating how there is quite a range of looks in spiders – and he chose a very scary looking tarantula.”

Rodeo then built a miniature greenscreen room with painted foamcore using a scan of the actual set. Details Bock, “We cut out a window, put in diffusion and pumped 10,000 watts of lighting through it. We had to do so since we had to film at 300fps and with an f-stop of 16 to make the miniature work.” A small ramp against the back wall for the spider to use was also built.

And then the filming could commence, as Bock further describes. “As we cajoled the spider and delicately sprayed it with compressed air, it backed away in the corner of the miniature room. In order for it to be comfortable, since it was getting quite hot in there, we rented two portable air conditioners and streamed in cold air to cool the set. It worked wonderfully and the final product is quite a success. After the shoot, a member of our team even adopted the tarantula. Her name is Rosy.”
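Why 300fps for a miniature room? Gravity doesn’t scale, so the classic miniature-photography rule of thumb is to overcrank by the square root of the scale factor so that motion reads at full size. A hedged sketch of the arithmetic in Python – the scale figures are illustrative, not Rodeo’s actual numbers:

import math

def miniature_frame_rate(base_fps, scale_factor):
    """Frame rate needed so a 1/scale_factor miniature moves like full size."""
    return base_fps * math.sqrt(scale_factor)

print(miniature_frame_rate(24, 12))    # ~83 fps for a 1/12-scale set
print(miniature_frame_rate(24, 156))   # ~300 fps implies a very small scale indeed

The deep f-stop Bock mentions serves the same illusion: stopping down to f/16 keeps the whole miniature in focus, since shallow depth of field is one of the strongest giveaways of a small set.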
A plate of Amber Heard landing in the desert for a scene in Aquaman required her to mime spitting out sand. And the element shoot for the shot filmed by Rodeo FX. (Images courtesy of Warner Bros. Pictures and Rodeo FX)
FILM
THE FRENCH DISPATCH: TRACKING ALL THE VISUAL NUANCES AND MOVING PARTS FOR SEVERAL FILMS IN ONE By CHRIS McGOWAN
Images courtesy of Walt Disney Studios Motion Pictures and Searchlight Pictures. TOP: The French Dispatch’s gruff lead editor and founder Arthur Howitzer Jr. (Bill Murray) in his office. OPPOSITE TOP: Travel writer Herbsaint Sazerac (Owen Wilson) taking his bicycle on the metro. The Sazerac bike-riding sequence involved miniatures and live-action elements. OPPOSITE MIDDLE AND BOTTOM: Images of the front (after and before) and the rear of The French Dispatch office. In an opening scene, a waiter from a street café scales several flights of stairs in the rear of the building to deliver a tray full of cocktails and apéritifs to an editorial meeting.
Wes Anderson’s visually dazzling The French Dispatch harkens back to a pre-digital era when print magazines had wide readership and filmmakers relied on celluloid and optical effects. Writer-director Anderson’s movie is structured like an issue of The New Yorker magazine, and its visuals utilize color and black-and-white film, miniatures, practical effects, tableaux vivants, elaborate tracking shots, differing aspect ratios and an exceedingly busy art department. Yet, the Digital Age is also ever-present in the film.

Which scenes have visual effects we wouldn’t suspect? Responds Editor Andrew Weisblum, “Well, the whole film. Every scene has VFX.” Weisblum continues, “VFX becomes part of an ongoing tool for refining and tinkering with the film, both editorially and in terms of production design and all other details. It’s an essential role [in The French Dispatch]. It’s not meant to call attention to itself. It’s also not meant necessarily to be photoreal. It’s more of a design element.” He estimates there were “maybe four or five hundred” VFX shots. In addition, there is a three-minute, 2D-animated vignette that is part of a larger live-action story.

“There wasn’t a VFX department per se until late in the post-production process. It was primarily handled through editorial,” remarks Weisblum, who more or less filled the role of a visual effects supervisor, along with producer Jeremy Dawson, who had worked as a VFX supervisor before. “There was a lot of discussion on set with the art department and everyone else where we’d cobble together our different methodologies and ideas to come up with the ingredients to figure out the visual effects.”
He notes, “RISE [Visual Effects Studios] worked with us, as did Union VFX, Goodbye Kansas [Studios], the Artery in New York City and Red VFX, and then we had several in-house artists.” Robert Yeoman served as director of photography, Alexandre Desplat scored the music and Adam Stockhausen was in charge of production design. Steven Rales, Dawson and Anderson were the producers of the film, which is distributed by Searchlight Pictures.

Previs played an integral role in The French Dispatch. Weisblum explains, “The previs process was an animatic process along the lines of what we do on [Anderson’s] animated films. There is a set of storyboards for practically the whole movie that is built out and paced out and allows everyone to efficiently sort out what needs to be shot, what elements need to be shot, what sets need to be built, what miniatures need to be built, what the methodology is. It’s not an elaborate 3D process; it’s more a 2D process to get across the ideas that Wes has in his mind than it is a technical exercise.”

The French Dispatch is a fictionalized version of, and tribute to, the venerable New Yorker as well as to France and French New Wave films of the ’50s and ’60s. It also celebrates the charms and seediness of the real-world small town of Angoulême, which was transformed into the imaginary French city of Ennui-sur-Blasé, the home of the Dispatch. And, throughout, the film is enlivened by Anderson’s inimitable flair for the whimsical and the absurd.
TOP TO BOTTOM: A picturesque newsstand at a corner in the French city of Angoulême, selling The French Dispatch among other newspapers. Angoulême was transformed into the imaginary city of Ennui-sur-Blasé. Zeffirelli (Timothée Chalamet) and Juliette (Lyna Khoudri) lean against the jukebox in a café, after the wall has opened up to reveal the street behind it. Besides the practical effect, a lot of manipulation took place – from the timing of the wall suddenly opening to the re-design of the jukebox graphics to different signage on the walls outside. Prisoner and artist Moses Rosenthaler (Benicio del Toro) in a straightjacket and his muse Simone (Léa Seydoux) to the right.
Following the death of the Dispatch’s gruff lead editor and founder, Arthur Howitzer Jr. (Bill Murray), the magazine staff plans the final issue of The French Dispatch. The movie brings its four parts to life: a brief travel piece and three extended feature articles.

The opening shots are some of Weisblum’s favorites. In an opening scene, a waiter from a street café scales several flights of stairs to the magazine’s offices to deliver a tray full of cocktails, absinthe and apéritifs to an editorial meeting. “In some ways I like the simplest stuff,” Weisblum observes, “like the [shots of] mixing the drinks and the Tati shot going up the building in the beginning, just because it was fun to play with timing like that and find all those quick little splits and morphs. That kind of tinkering was a lot of fun.”

The movie’s first “article” is a local excursion with travel writer Herbsaint Sazerac (Owen Wilson), who takes us on a dizzying (and narrated) bike ride through the town, including its most colorful sections. “There were many shots in Owen Wilson’s Sazerac sequence that involved a combination of both miniature and live-action elements,” recalls Weisblum. “The subway shot. The background until we arrive at the rats [infestation] was a miniature. [While cycling, when he is] holding onto the truck before he falls into the subway station, [there was] a miniature background and then several layers of live-action cars and other moving elements. And several other shots had bits and pieces like that, but a lot of the tableau shots in that sequence itself were built of several different visual effects elements, different comps of cats, miniatures and different live-action plates to create a cityscape.”

The next entry is framed by a lecture given by writer J.K.L. Berensen (Tilda Swinton) about the notorious artist Moses Rosenthaler (Benicio del Toro), who achieves sudden fame while serving a life sentence for murder. His art has blossomed during his affair with the prison guard Simone (Léa Seydoux), his muse. Julian Cadazio (Adrien Brody) is his obstinate and calculating agent. In the story, Rosenthaler’s art is being displayed to wealthy art collectors at the prison. There is a scene where his fellow prisoners break free and run into the art crowd. At one point, the actors stop in mid-motion. “It was all done practically,” states Weisblum, “but we sometimes merged several plates together. And some actors moved a little bit more than others and had to be slowed down or stabilized so they didn’t pop out as much. But generally the actors were just told to freeze.”

And, throughout the film, “when there was more than one person on the screen at once, you’ll see there’s a split screen there, or a morph. Or a re-speed. If you see two people, three people, four people and so on, everyone gets their own kind of approach, inside a shot and inside a frame, with lots of splits and timing adjustments. That’s something I don’t think anybody would ever be cognizant of,” explains Weisblum. (A bare-bones sketch of that split-screen idea appears at the end of this article.)

In the second story, Lucinda Krementz (Frances McDormand) works on a story about the student revolt, led by the moody revolutionary Zeffirelli (Timothée Chalamet) and Juliette (Lyna Khoudri). Part of it takes place inside a café where, in one startling scene, a wall beside them opens up to the street.
“That is done primarily practically with a wall that slid open. Everything else was there on set, but a lot of manipulation happened afterwards – from the timing of the wall suddenly opening to the re-design of the jukebox graphics to different signage on the walls outside. The timing of the boys on the mopeds driving through was changed. That’s all the kind of afterthought, manipulation and sculpting that happens to a shot when there’s generally some kind of practical base to its execution on set.”

And, lastly, writer Roebuck Wright (Jeffrey Wright) profiles the legendary chef Nescaffier (Steve Park). The latter is working his culinary magic in a police department’s kitchen (the commissioner is a devoted gourmet) when a gang of thugs kidnaps the commissioner’s beloved son. That chapter includes a three-minute, 2D-animated police car chase in the streets of Ennui-sur-Blasé, with “a style that is something between Tintin and Blake and Mortimer [comics series], an aesthetic that I really like,” says Animation Supervisor Gwenn Germain (Isle of Dogs). The animation was created in Angoulême, which was the setting for the film and is France’s second city for animation after Paris. “I didn’t know Angoulême before this experience. But I felt there was a special mood during the shooting and producing of the movie. We felt it was a conversation topic in the restaurants and bars, and people were excited.”

Germain adds, “The whole process of animation took around seven months: two months of pre-production and five months of production. At the maximum there were around 15 people in the team. I was in charge of the art direction, the team management and the production follow-up. Every day, Wes Anderson had a look at the production. He was following this very closely.” Germain comments, “He was only in touch with me, and I was in charge of translating his vision to the team. I guess he doesn’t want to have too many interlocutors. He needs one manager for each pole of the production. My role was to communicate his wishes to a production team. He doesn’t necessarily have all the technical constraints in mind, so my role was to balance between his vision and the constraints of the studio [time and budget]. But he has a true vision. He’s a great director with great ideas; he knows where he is going. Most of the time he has breakthrough ideas; we get surprised by these ideas, but at the end the execution proves he’s right. It’s quite impressive.”

The addition of the animation made The French Dispatch into an even more complex project. For Weisblum, the most challenging task was “just keeping track of all the moving pieces in a film like this that’s several films and design elements in one, making sure we’re all moving forward to completion.”
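As promised above, here is a bare-bones sketch of the split-screen trick Weisblum describes: two takes of the same locked-off frame, each contributing its own region, so every performer can be retimed independently. A hedged Python/NumPy toy – real comps use soft-edged, hand-drawn splits in a compositing package, and these array shapes are illustrative:

import numpy as np

def split_screen(take_a, take_b, boundary):
    """Columns left of `boundary` come from take A, the rest from take B.
    With a locked-off camera, the seam is invisible on a static background."""
    out = take_b.copy()
    out[:, :boundary] = take_a[:, :boundary]
    return out

# Two hypothetical takes of the same 1080x1920 frame, one per performer.
take_a = np.zeros((1080, 1920, 3), dtype=np.float32)
take_b = np.ones((1080, 1920, 3), dtype=np.float32)
comp = split_screen(take_a, take_b, boundary=960)

Because each half comes from a different take, each actor’s timing can be nudged – slowed, re-sped or morphed – without disturbing the other, which is the per-person “approach” inside a single frame that Weisblum describes.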
TOP LEFT: Prison guard Simone (Léa Seydoux). Black and white film stock was used frequently in the movie, and there was a strong influence of the French New Wave cinema of the late ‘50s and ‘60s. TOP RIGHT AND MIDDLE: A three-minute, 2D animated sequence, with a style like that of Tintin and Blake and Mortimer, is part of the story about the chef Nescaffier and the kidnapping of the police commissioner’s son. The sequence was created in Angoulême, which is France’s second city for animation after Paris. BOTTOM: Owen Wilson looking out on many, many cats on the rooftop overlooking the city. Many different visual effects elements – including miniatures and live- action plates – were used to create the cityscape in different scenes.
[ VES SECTION SPOTLIGHT: TORONTO ]
Toronto Revisited: Doubling Down on Innovation By NAOMI GOLDMAN
TOP: Introducing the 2022 Toronto Section Board of Managers. MIDDLE TWO: VES Toronto members and guests at a pub night meetup pre-COVID. BOTTOM: VES Toronto gets ready for a viewing of Matrix Resurrections.
The VES’s international presence gets stronger every year, and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. The Toronto Section exemplifies adaptability, resilience and growth – continuing to thrive amidst the pandemic – as it developed interactive educational and entertainment experiences and bolstered community among its 130-plus members and the local industry. Profiled in the inaugural VFX Voice in April 2017 and now in the magazine’s 5th anniversary issue, VES Toronto is excited to mark this moment and celebrate its evolution.

“We are in a great place jump-starting 2022, poised for a rich year of programming and community-building,” says Tavia Charlton, VES Toronto Section Co-chair and Post Production Producer, Originals Marketing at Amazon Studios. “With our recent election, we have the largest and most diverse Board of Managers since the Section’s inception in 2013, with 12 passionate and extremely impressive leaders fully engaged in elevating our craft and supporting our members.”

Filmed entertainment is thriving in Toronto. More film studios and VR facilities are being built every year, and an increasing number of projects are coming into the market for production and then staying for post-production to tap into the veteran VFX workforce. This boom is creating enormous growth and job opportunities for local VFX talent, while attracting international artists and technicians to the region. “And the attraction is not limited to the production arts, but goes deep into the technology side,” says Roy C. Anthony, VES Toronto Co-chair and Global Head of Research at DNEG. “Toronto has a rich research and innovation culture. There is a lot of interaction with universities to underpin technology solutions and develop new ways of generating revenue and IP for studios.”

Continues Anthony, “The Toronto Section has always done intense hands-on, technical arts-focused presentations. A great example is a two-day session we hosted on lighting where we brought in Dave Stump, author of Digital Cinematography. He trained people working in 3D to understand how people light on set and on stage, to break down barriers between digital content creators and on-set creators. We want to help foster that common language and make sure our members are at the forefront of the industry.”

The Toronto Section continues to offer its popular “Tech Talk” series, making skills-building and information accessible for both technical artists and practitioners. Last year it held a highly attended “Tech Talk” on “USD Pipeline Successes and Challenges (Universal Scene Description)” with guest George ElKoura, former Lead Engineer at Pixar Animation Studios, and has many programs in the pipeline.
TOP LEFT: VES Toronto members and guests gather for an IMAX screening. BOTTOM LEFT: VES Toronto parties on the patio pre-COVID. TOP RIGHT: Royal Cinema readies for a VES private screening.
“The USD event is a great example of the value we bring, as there is really exciting energy around USD to streamline the production process and leverage technologies in a more flexible way,” says Kim Davidson, VES Toronto Secretary and CEO of SideFX Software. “Our Section is taking a leadership role in educating on what is now becoming a worldwide standard for sharing data and scenes across companies, and we will continue to offer hands-on workshops to improve skills and understanding across our membership, which encompasses a wide range of artists, editors, technicians and software designers.” (A bare-bones USD example appears at the end of this article.)

During the pandemic, the Section joined “the big pivot” to virtual programming, which allowed it to expand its outreach both to prospective members and across the global visual effects community. Events included a “Virtual Production Introduction” webinar to provide members with a baseline of knowledge on emerging VP technologies, how to look at VP in the future and how to be best prepared. The Section also served up some Strawberry Milkshakes – online mixers that allowed Toronto to cross-pollinate and connect with other Sections, starting with their peers in the Georgia Section. The meetups were primarily created to connect Boards of Managers across Sections to share best practices and understand other markets.

Entertainment also abounded with a steady stream of film and TV screenings, both virtual and in-person (as COVID protocols allowed, with attendees sporting VES-branded masks). Virtual meetups included a Netflix movie-night screening of Awake with a Zoom Q&A, and a showing of Amazon’s The Boys, followed by a panel with the special effects/visual effects team. Thanks to partnerships with the Paradise Theater and Royal Cinema, the Section’s in-venue events included a screening of Dune with a live Q&A with director Denis Villeneuve and the production and VFX team, and screenings of No Time to Die, The Matrix Resurrections and Ghostbusters: Afterlife.
“These high-value screenings give our members direct interaction with the visionaries and hands-on practitioners,” says Charlton. “This access provides continuity and connectivity that deepens our understanding of what it takes to successfully move from concept to execution.”

“While navigating during COVID was challenging, we are now hitting our stride,” says Lisa Sepp-Wilson, VES Toronto Treasurer and Visual Effects Producer and Supervisor. “I’m excited to keep reaching out to grow and diversify our membership, develop educational content and help our members see the value of our Section – both the benefits of belonging and what they can contribute to enrich our community and our industry.”

“Planning ahead, we are trying to leverage keen interest in the technical side of our community, as well as access to the research corridor and software development companies, to advance new innovations and get people ready for these transitions,” says Anthony. “As a Section, our vision is to stay the course on what’s working well and double down on the innovations in the space to provide more context and more awareness to help push this local industry forward.”

“We are exceptionally proud to be a part of the VES global community and celebrate an amazing 25 years of this vibrant organization,” says Charlton. “Here in Toronto, we have achieved a lot over the years and continued to strengthen our bonds and our impact, even during the pandemic. We have diversity on our Board and in our members, a depth of great programming and an exciting vision for what we can create next. I hope that as we evolve, our Section can be an incubator of ideas for the VES and our fellow Sections, while we deepen our commitment to serving the VFX community in the Greater Toronto Area.”
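For readers curious about the USD pipeline work discussed above, here is a minimal Universal Scene Description sketch using Pixar’s open-source pxr Python bindings – illustrative only; the file and prim names are hypothetical:

from pxr import Usd, UsdGeom

# Author a tiny layout layer that another department can reference or
# sublayer and override without ever touching this file.
stage = Usd.Stage.CreateNew("shot010_layout.usda")
UsdGeom.Xform.Define(stage, "/World")
hero = UsdGeom.Sphere.Define(stage, "/World/HeroProp")
hero.GetRadiusAttr().Set(2.0)
stage.GetRootLayer().Save()

Composition over layers like this – rather than passing monolithic scene files between departments – is the flexibility for “sharing data and scenes across companies” that Davidson points to.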
[ VES NEWS ]
VES Elects 2022 Board of Directors Officers: Lisa Cooke Re-Elected Board Chair; Historic All-Women Executive Committee to Lead the Global Organization By NAOMI GOLDMAN
The 2022 VES Board of Directors Officers, who comprise the VES Board Executive Committee, were elected at the January 2022 Board meeting. The Officers include Lisa Cooke, who was re-elected Board Chair and is the first woman to hold this role since the Society’s inception. And for the first time in the Society’s 25-year history, the five-member Executive Committee is comprised of an all-women group of venerated industry professionals.

“It is my honor and privilege to continue to Chair this society of outstanding artists and innovators, and I appreciate the trust placed in me,” said Lisa Cooke, VES Chair. “I’m proud to work amongst an exceptional group of impassioned and talented leaders, who are committed to advancing our mission and serving our members worldwide. As an all-women Executive Committee, I hope that this representation encourages other women to seek positions of leadership in our industry, and that organizations continue to lift up people from diverse backgrounds and experience to fully reflect our global community.”

The 2022 Officers of the VES Board of Directors are:
• Chair: Lisa Cooke
• 1st Vice Chair: Emma Clifton Perry
• 2nd Vice Chair: Susan O’Neal
• Secretary: Rita Cahill
• Treasurer: Laurie Blavin
TOP TO BOTTOM: Lisa Cooke, Emma Clifton Perry, Susan O’Neal, Rita Cahill, Laurie Blavin
A producer at Tippett Studio, Lisa Cooke has several decades of experience as an animation/VFX producer, creative consultant, screenwriter and actor. Firmly believing that it is vital for science to effectively communicate its truths to a broader audience, she started Green Ray Media, and for the last 10-plus years has been producing animation and VFX to create scientific, medical and environmental media for a broad audience. She served six years on the Bay Area Board of Managers, currently serves as Chair of the VES Archives Committee and was previously 1st Vice Chair of the Board of Directors.
Emma Clifton Perry is a Senior Compositor with more than 16 years of experience working across feature films, longform/TV series, commercials and advertising at VFX facilities worldwide. Based in Wellington, New Zealand, she offers remote compositing and VFX consulting services. Perry has served several terms as Chair, Co-Chair and Secretary/Treasurer on the VES New Zealand Board of Managers and was previously 2nd Vice Chair of the Board of Directors.

Susan O’Neal is a recruiter for BLT Recruiting and has worked as an Operations Manager at The Mill, Operations Director at Escape Studios in Los Angeles and as an Account Manager at Side Effects Software. Recipient of the 2019 VES Founders Award, O’Neal has previously served on the Board of Directors and as Chair of the Legacy Global Education Committee and Membership Committee, and has been instrumental in the Society’s membership growth.

Rita Cahill is an international business and marketing/PR consultant and has worked with a number of U.S., Canadian, U.K., EU and Chinese companies on visual effects and animation projects. Previously, Cahill was the Vice President of Marketing for Cinesite and a founding Board member of the Mill Valley Film Festival/California Film Institute. This is Cahill’s seventh term as Secretary. Previously, she served as Chair or Co-Chair of the VES Summit for eight years.

Laurie Blavin’s 20-year career spans the culture, creativity and technology of both Hollywood and Silicon Valley. She was thrilled to join Scanline VFX as their Global Head of Talent Acquisition. Blavin counsels companies and professionals in strategy, brand-value stewardship, full-cycle staffing and recruiting for international companies large and small, start-up and established. This is Blavin’s third term on the VES Board of Directors.
VES 2021 Honors Celebration a Great Success
In November 2021, the VES honored a host of distinguished visual effects practitioners at a special reception at the Skirball Cultural Center in Los Angeles. The honorees included our 2021 inductees into the VES Hall of Fame, the newest Lifetime and Honorary members, VES Fellows and this year’s recipients of the VES Founders Award. “Our VES honorees represent a group of exceptional artists, innovators and professionals who have had a profound impact on the field of visual effects,” said Lisa Cooke, VES Board Chair. “As individuals and collectively, these stellar artists, technicians and innovators are at the forefront of advancing the craft of visual effects. They are pioneers and risk-takers who push the boundaries of what’s possible. And they are the source of inspiration for legions of storytellers and creators who aspire to carry forward their rich legacy. We are very proud to honor them.” Congratulations again to our esteemed honorees:
• Founders Award Recipients: Mike Chambers, VES and Rita Cahill
• Lifetime Members: Gene Kozicki; Richard Winn Taylor II, VES; Mike Chambers, VES and Rita Cahill
• Honorary Members: James Cameron and Gary Demos
• VES Fellows: Brooke Breton, VES; Mike Chambers, VES; Van Ling, VES and Nancy St. John, VES
• VES 2021 Hall of Fame honorees: Roy Field; John P. Fulton, ASC; Phil Kellison; the Lumière brothers and John Whitney, Sr.
CLOCKWISE: Celebrating 2021 VES Fellows: Van Ling, VES; Brooke Breton, VES; Mike Chambers, VES and Nancy St. John, VES. 2021 Founders Award Recipients and Lifetime Members Mike Chambers, VES and Rita Cahill. VES Chair Lisa Cooke (center), Executive Director Eric Roth and VFX Supervisor Richie Baneham accepting VES Honors for director James Cameron. Celebrating at VES Honors: Dan Curry; Ubolzan Curry; Jonathan Erland, VES; Kay Erland; Gene Kozicki and Jeffrey A. Okun, VES. John Whitney, Jr. celebrating his father John Whitney, Sr.’s induction into the VES Hall of Fame with Lifetime Member Richard Winn Taylor II, VES. VES Hall of Fame inductee Phil Kellison’s family celebrating at VES Honors.
[ FINAL FRAME ]
Streaming is Peaking Around the World
Consider that Game of Thrones was a brash young upstart when HBO launched it in 2011. HBO helped set the tone for a U.S.-based company producing epic shows worldwide using a global pipeline of VFX vendors. Game of Thrones became one of the most successful VFX-driven streaming shows of all time, dominated the airwaves for close to 10 years, and its model has since been copied by others. At the same time, original productions have been emanating from any number of foreign companies and crossing back all over the world. Think of Casa de Papel (Money Heist), Squid Game, Borgen, Call My Agent, Killing Eve, The Crown and many others. (See the Global VFX article on page 8.) Even when these productions don’t cross over globally, they are huge hits in their local markets. They become must-see viewing, and they create work for VFX studios and artists.
One reason VFX has reached, and continues to reach, new levels is the astonishing number of U.S. and international original productions on streaming services. In a short span of time, approximately 50 services have sprouted in North America alone from such brands as Netflix, Amazon, Apple, HBO, Disney, Paramount, Hulu, NBCUniversal and others. A number of services originating outside the U.S., such as BritBox, Acorn and MHz, have also emerged, and newer streaming services are waiting in the wings worldwide. Original scripted shows now number in the hundreds yearly and should grow even more internationally in 2022. That’s great news for the VFX industry, as the majority of these shows crave VFX of one kind or another. The “Roaring ‘20s” for the worldwide VFX industry are here. (Image courtesy of HBO)