VFX Voice - Fall 2020 Issue


PROFILE

MATT WORKMAN: SHARING HIS VIRTUAL PRODUCTION JOURNEY AS IT’S HAPPENING

By IAN FAILES

All images courtesy of Matt Workman unless otherwise noted. TOP: Matt Workman in 2010, using the Red One camera – an early leader in digital cinema cameras that could capture up to 4K resolution – during the filming of a music video in Manhattan for Def Jam Recordings.


There’s no doubt that virtual production, and the associated area of real-time, is the hot topic in filmmaking and visual effects right now. Since virtual production is a relatively new area of VFX, information, training and advice on the latest techniques have not always been easy to find for those eager to learn. But one person, cinematographer Matt Workman (http://www.cinematographydb.com), has become something of a go-to source. He regularly shares – often daily – his own experiments in virtual production with followers on social media, also turning to the burgeoning virtual production community for advice and inspiration.

Workman worked as a commercial cinematographer in New York for more than a decade, eventually looking to transition into larger visual effects projects. This also sparked an interest in doing his own previs. “When I was working as a live-action cinematographer, the bigger projects required weeks of pre-production and planning for a one-to-three-day shoot,” Workman remarks. “Many of these projects had storyboards and previs already done, so I wanted to be able to contribute and share my ideas for camera work and lighting. I started by building tools in Maya, and I actually had a product for a short time called ‘Virtual Cinematography Tools’ – a mix of rigged models and a simple Python/MEL plug-in – that some people still use.”

That early tool segued into the development of a real-time cinematography simulator called Cine Tracer, after Workman had started a company called Cinematography Database and begun selling a Cinema 4D previs plug-in known as Cine Designer. The plug-in allowed for very accurate technical previs with rigged 3D models of cameras, lights and other film industry equipment. It caught the eye of Unreal Engine maker Epic Games.

“One day Epic Games asked me if I would develop something similar for Unreal,” relates Workman. “So I started to learn Unreal Engine 4 and I made a quick prototype into a video game. I posted the video online and it went viral. Since then I’ve kept adding to what is now called Cine Tracer as I learn Unreal Engine, and it’s turned into its own ecosystem that includes set building and digital humans on top of the cameras and lights from my Maya/Cinema 4D work.”

Having worked with Epic since the beginning of Cine Tracer, Workman has maintained that relationship. He was awarded an Unreal Dev Grant and has been part of the Unreal Engine 4 Live Training stream. When Epic Games became involved in helping productions with real-time footage delivery on LED walls (the kind popularized by The Mandalorian), they asked Workman to help demo the tech on a purpose-built stage featuring Lux Machina LED screens. Specifically, he was part of an LED wall shoot as ‘director/DP’ during SIGGRAPH 2019.

“I was already going to be presenting twice with Unreal Engine at SIGGRAPH, so I was up for one more project,” says Workman. “I had no idea how massive the LED project would be, and I spent a full month on the stage. My primary role was to be the cinematographer and consult on the camera, lens, lighting, grip, equipment, and the general approach to shooting the project. I also had to consult on crew and other typical cinematographer responsibilities to make a smooth professional shoot.

“The team on the project had already worked together on The Mandalorian, so I was the one catching up,” adds Workman. “But I contributed ideas on how I wanted to control the CG lighting environment at run time. I was also responsible for finding shots that best showed off the interaction of the LED reflections of the wall on the live-action motorcycle and actor they had there.”

“My current move into Unreal Engine-based virtual production is in a similar state to Cine Tracer when I started out. So I broadcast my whole process, failures and successes. I’ve built a community around this growing field and it accelerates my learning, and it also allows me to work with companies who are interested in virtual production to integrate their existing products or work together to create new ones.”
—Matt Workman, Cinematographer/Developer

Since the LED wall experience, Workman has continued experimenting in real-time and showcasing his results. For example, earlier this year he built his own virtual production setup at home, a fortuitous step for when the COVID-19 crisis arrived. “The main tool I’ve employed here is the HTC Vive Tracker, which allows me to track the position of a real-world camera,” explains Workman. “I bought a Blackmagic URSA Mini Pro 4.6K G2, and using a Blackmagic DeckLink 8K Pro I can bring the live footage into Unreal Engine 4.”
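The tracker-based setup Workman describes boils down to a standard bit of transform math: a Vive Tracker is rigidly mounted to the camera body, and the virtual camera’s world-space pose is the tracker’s reported pose composed with a fixed, measured tracker-to-camera offset. The sketch below illustrates that composition in Python/NumPy; it is not Workman’s actual code, and all function names and the example offset values are hypothetical.

```python
# Illustrative sketch only: composing a tracked camera transform from a
# motion-tracker pose and a fixed rigid offset (tracker mount -> camera).
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    t = np.eye(4)
    t[:3, :3] = rotation
    t[:3, 3] = translation
    return t

def camera_pose(tracker_pose: np.ndarray, tracker_to_camera: np.ndarray) -> np.ndarray:
    """World-space camera pose = world-space tracker pose composed with the
    fixed rigid offset from the tracker mount to the camera's nodal point."""
    return tracker_pose @ tracker_to_camera

if __name__ == "__main__":
    # Tracker reports it is 1 m above the origin, no rotation (made-up values).
    tracker = make_transform(np.eye(3), np.array([0.0, 1.0, 0.0]))
    # Camera nodal point measured 10 cm forward of the tracker mount.
    offset = make_transform(np.eye(3), np.array([0.0, 0.0, 0.1]))
    cam = camera_pose(tracker, offset)
    print(cam[:3, 3])  # world-space camera position -> [0.  1.  0.1]
```

In a real pipeline the tracker pose would arrive per-frame from the tracking runtime (e.g. via Unreal’s Live Link), and the offset would be calibrated once by measuring the physical mount; the composition itself stays the same.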

TOP: Workman during a demo on Epic Games’ LED wall setup. (Image courtesy of Epic Games) BOTTOM: A ‘Cyber Punk Car Scene’ made with Unreal Engine, in which Workman used the tool’s Sequence Recorder to act out the scene and actually drive the car.


