VFX Voice - Spring 2020 Issue


PREVIS

HOW PREVIS HAS GONE REAL-TIME By IAN FAILES

TOP LEFT: Halon’s virtual camera setup used for previs at the studio. (Image courtesy of Halon Entertainment)
TOP RIGHT: A previs still from Halon’s work for Ford v Ferrari, where real-time techniques aided in generating the stylized animatics. (Image copyright © 2019 20th Century Fox)
BOTTOM RIGHT: The idea behind Halon’s cel-shaded-like renders for the previs was to be able to cut easily with storyboards. (Image copyright © 2019 20th Century Fox)
OPPOSITE TOP: Behind the scenes of the Drogon oner shot in Season 8 of Game of Thrones, where The Third Floor made use of virtual production techniques to plan the shot. (Image copyright © 2019 HBO)
OPPOSITE MIDDLE: Nviz Head of Visualization Janek Lender showcases the studio’s virtual production system for previs. (Image courtesy of Nviz)
OPPOSITE BOTTOM: A previs frame from Nviz’s Vcam setup. (Image courtesy of Nviz)

32 • VFXVOICE.COM SPRING 2020

Previs studios have always had to work fast. They are regularly called upon to fashion animatics quickly and make numerous iterations to suit changing scripts and last-minute inclusions. Often this also involves delivery of technical schematics – techvis – for how a shot could be filmed. And then they tend to regularly get involved after the shoot to do postvis, where again it’s necessary to deliver quickly. While previs studios have had to be nimble, new real-time rendering tools have now enabled them to be even more so. Plus, coupling those same tools with virtual production techniques allows previs studios to deliver even more in the shot planning and actual shooting process.

For example, previs and virtual production on the Disney+ series The Mandalorian adopted real-time workflows in a whole new way. Here, a ‘volume’ approach to shooting was undertaken, in which actors were filmed against real-time rendered content displayed on LED walls. That content went through several stages of planning that involved traditional previs, a virtual art department and then readying for real-time rendering using Epic Games’ Unreal Engine on the LED walls. The studios involved in this work included ILM, The Third Floor, Halon Entertainment, Happy Mushroom and Mold3D Studio.

The Mandalorian is just one of the standout projects where previs, virtual production and real-time rendering are making significant inroads. Here’s how a number of previs studios, including those that worked on the show, have adapted to these new workflows.

HALON JUMPS INTO REAL-TIME

Like many previs studios, Halon Entertainment relied for many years almost solely on Maya and its native viewport rendering for previs. However, the studio did dabble in real-time via MotionBuilder for mocap, the use of ILM’s proprietary previs tool ZVIZ on the film Red Tails, and with Valve’s Source Filmmaker. When Epic Games’ Unreal Engine 4 was released, Halon jumped on board real-time in a much bigger way. “We saw the potential in the stunning real-time render quality, which our founder Dan Gregoire mandated be implemented across all visualization projects,” relays Halon Senior Visualization Supervisor Ryan McCoy. “War for the Planet of the Apes was our first previs effort with the game engine, and our tools and experience have grown considerably since. It has been our standard workflow for all of our visualization shows for a few years now.”

Modeling and keyframe animation for previs still happens in Maya and is then exported to Unreal. Here, virtual rigs to light the scene can be manipulated, or VR editing tools used to place set dressing. “We also have virtual production capabilities where multiple actors can be motion-captured and shot through a virtual camera displaying the Unreal rendered scene,” adds McCoy. “This can work through an iPad, or a VR headset, allowing the director to run the actors through a scene and improvise different actions and camera moves, just as they would on a real set.

“Aside from the virtual production benefits, these real-time techniques also have some significant ramifications for our final-pixel projects,” continues McCoy. “Being able to re-light a scene on the fly with ray-traced calculations allows us to tweak the final image without waiting hours for renders. Look development for the shaders and FX simulations are also all done in real-time.”

One of Halon’s real-time previs projects was Ford v Ferrari, where the studio produced car race animatics with a stylized look achieved through cel-shaded-like renders – the aim was to cut easily with storyboards. “We utilized Unreal’s particle systems and post processes to make everything fit the desired style while still timing the visuals to be as realistic as possible,” outlines Halon Lead Visualization Artist Kristin Turnipseed. “That, combined with a meticulously built Le Mans track and being faithful to the actual race and cornering speeds, allowed us to put together a dynamic sequence that felt authentic, and allowed us to provide detailed techvis to production afterward.”

THE THIRD FLOOR GOES VIRTUAL

The Third Floor has also adopted real-time in a major way, for several reasons, according to CTO and co-founder Eric Carney. “First, the technology can render high-quality visuals in real-time and works really well with real-world input devices such as mocap and cameras. We think a hybrid Maya/Unreal approach offers a lot of great advantages to the work we do and are making it a big part of our workflow. It allows us to more easily provide tools like virtual camera (Vcam), VR scouting and AR on-set apps. It lets us iterate through versions of our shots even faster than before.” The Third Floor has recently leveraged real-time in visualization

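Halon achieved its storyboard-friendly look with Unreal post processes. As a rough, hypothetical sketch of the core idea behind cel (“toon”) shading – not Halon’s actual implementation – smooth lighting can be quantized into a few flat bands, which is what gives renders that hand-drawn quality:

```python
# Toy illustration of cel ("toon") shading: instead of a smooth 0..1
# lighting response, luminance is snapped to a small number of flat
# bands, producing the posterized look that cuts well with storyboards.
def cel_shade(luminance: float, bands: int = 3) -> float:
    """Quantize a 0..1 luminance value into `bands` flat steps."""
    luminance = max(0.0, min(1.0, luminance))      # clamp to valid range
    step = min(int(luminance * bands), bands - 1)  # which band we land in
    return step / (bands - 1)                      # remap band back to 0..1

# A smooth ramp collapses into three flat tones:
ramp = [0.1, 0.2, 0.4, 0.5, 0.7, 0.9]
print([cel_shade(v) for v in ramp])  # → [0.0, 0.0, 0.5, 0.5, 1.0, 1.0]
```

In a real engine this quantization would typically run per pixel in a post-process material or shader rather than in Python; the banding step itself is the same.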


