
THE VES HANDBOOK

Virtual Production

Abstracted from The VES Handbook of Visual Effects – 3rd Edition (Chapter 2, page 57). Written by Addison Bath. Edited for this publication by Jeffrey A. Okun, VES.

Much of The Mandalorian was filmed indoors using a new technology called Stagecraft, which utilizes massive, rear-projected LED screens to create realistic background environments for scenes. (Image copyright © 2019 Disney+ and Industrial Light & Magic)

Live Action with CG Elements

Merging CG into a live-action shoot allows the director and crew to evaluate composition, lighting, timing, and more before the final plate is captured. Bringing CG elements into the real world requires a great deal of planning. Avatar (2009) was the first film to use a system that allowed the crew to see a live composite of the CG elements through the eyepiece of the camera and on on-set monitors.

To do this, the picture camera must be accurately tracked and recreated in the virtual world. The most common camera tracking techniques are optical markers, encoders, IMUs, computer vision, or some combination of these. The other key to recreating the camera is using lens encoders to feed lens information, such as focal length and focus, to the virtual camera. The real-time render is then composited with the live plate. The alignment of the virtual world to the real world must be accurate for the technique to be effective; survey data, lidar scans, and other tracked physical objects are used to achieve this lineup.
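As a rough illustration of the camera recreation step, the Python sketch below uses hypothetical function names and a simplified pinhole model, not any particular tracking system's API: it builds virtual-camera intrinsics from an encoder-reported focal length and performs a standard "over" composite of the real-time render onto the live plate.

```python
import numpy as np

def intrinsics_from_encoders(focal_mm, sensor_w_mm, sensor_h_mm, img_w, img_h):
    # Pinhole intrinsic matrix from the encoder-reported focal length.
    # Real systems also model lens distortion and focus breathing.
    fx = focal_mm / sensor_w_mm * img_w
    fy = focal_mm / sensor_h_mm * img_h
    return np.array([[fx, 0.0, img_w / 2.0],
                     [0.0, fy, img_h / 2.0],
                     [0.0, 0.0, 1.0]])

def composite_over(cg_rgb, cg_alpha, live_plate):
    # Standard "over" operation: the CG render in front of the live plate.
    a = cg_alpha[..., None]
    return cg_rgb * a + live_plate * (1.0 - a)

# Per frame: the tracked pose (optical markers, encoders, IMU, or computer
# vision) positions the virtual camera, the engine renders the CG elements
# with matching intrinsics, and the result is composited for on-set monitors.
K = intrinsics_from_encoders(focal_mm=35.0, sensor_w_mm=36.0,
                             sensor_h_mm=24.0, img_w=1920, img_h=1080)
```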

Gravity (2013) advanced virtual production by creating the light box – walls of LED lights surrounding a 10ft x 10ft acting space. The LED lights provided correct lighting for the live-action subject and allowed the actors to see the CG imagery that would be added later. The panels were driven in real time by images pre-rendered from the actor's perspective, which improved the performances: the actors were inside the CG world in real time and able to act and react to it. The Mandalorian (2019) pushed the LED wall technique further, using it extensively across the production.
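To illustrate the idea behind the light box, here is a minimal sketch, assuming the environment is pre-rendered as a lat-long panorama and the performer's head position is tracked; the function name and coordinate conventions are illustrative assumptions, not Gravity's actual pipeline. Each LED pixel displays the environment along the ray from the viewer to that pixel, which is what keeps the imagery perspective-correct from the actor's point of view.

```python
import numpy as np

def wall_pixels_to_latlong_uv(pixel_pos_world, viewer_pos):
    # For each LED-wall pixel (world-space position in meters), take the
    # direction from the tracked viewer to that pixel and convert it to
    # lat-long UVs for sampling a pre-rendered panoramic environment.
    d = pixel_pos_world - viewer_pos
    d = d / np.linalg.norm(d, axis=-1, keepdims=True)
    u = np.arctan2(d[..., 0], d[..., 2]) / (2.0 * np.pi) + 0.5
    v = 0.5 - np.arcsin(np.clip(d[..., 1], -1.0, 1.0)) / np.pi
    return np.stack([u, v], axis=-1)

# e.g. one wall pixel 2 m in front of and 0.5 m above the viewer's head
uv = wall_pixels_to_latlong_uv(np.array([0.0, 2.0, 2.0]),
                               np.array([0.0, 1.5, 0.0]))
```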

Live Action with CG Characters

Merging CG characters into live action in real time offers many advantages: camera moves can be accurately motivated by the CG characters, proper eyelines can be verified, and actors can perform against their CG counterparts. Performances from the CG characters can be prerecorded or achieved live, but they must be rendered in real time. On Real Steel (2011), performance capture was used in pre-production to record the choreography for all the fights between the boxing robots. Once shooting began, motion capture volumes were built at all fight locations to track the picture cameras; this included both indoor and outdoor sets, made possible by active LED markers used for tracking. The live composite was recorded along with the tracking information and provided to the VFX facilities as key information for producing the final shots.

In Game of Thrones (2011–2019), multiple techniques were used to bring the CG dragons to life; motion-controlled flamethrowers, for example, were driven by the animation of a dragon's head. Thor: Ragnarok (2017) took a similar real-time approach, using a live performance from an actor to drive the Hulk, with Simulcam used to view the Hulk in the context of the real world. This was key to the production because the CG character was much larger than the actor driving the performance. The actor and camera were fitted with markers, and movable towers of motion capture cameras were constructed to allow maximum flexibility during the shoot. This process allowed both the performance and the composition of physical and digital characters to be explored in real time.
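A simplified view of how a human performance can drive a much larger character is sketched below. Copying joint rotations unchanged while scaling translations by the height ratio is a common naive starting point for retargeting; the function and the height figures here are illustrative assumptions, not the actual Thor: Ragnarok setup.

```python
import numpy as np

def retarget(root_translation, joint_rotations, height_ratio):
    # Naive retargeting: copy joint rotations unchanged and scale the
    # root translation by the characters' height ratio, so a human-sized
    # stride covers character-sized distances.
    return root_translation * height_ratio, joint_rotations

# e.g. a ~6 ft performer driving a ~8.5 ft character (figures illustrative)
scaled_root, rotations = retarget(np.array([1.2, 0.95, 0.3]),
                                  {"spine": (0.0, 0.0, 0.0, 1.0)},
                                  height_ratio=8.5 / 6.0)
```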

Handing creative freedom and control back to the key decision-makers is what virtual production is all about. It is a growing area, both in adoption and in the technology that drives it.
