
PROFILE: MATT WORKMAN


MATT WORKMAN: SHARING HIS VIRTUAL PRODUCTION JOURNEY AS IT’S HAPPENING

By IAN FAILES

All images courtesy of Matt Workman unless otherwise noted.

TOP: Matt Workman in 2010, using the Red One camera – an early leader in digital cinema cameras that could capture up to 4K resolution – during the filming of a music video in Manhattan for Def Jam Recordings.

There’s no doubt that virtual production, and the associated area of real-time rendering, is the hot topic in filmmaking and visual effects right now. Since virtual production is a relatively new area of VFX, information, training and advice on the latest techniques have not always been easy to find for those eager to learn.

But one person, cinematographer Matt Workman (http://www.cinematographydb.com), has become something of a go-to source. He regularly shares – often daily – his own experiments in virtual production with followers on social media, while also turning to the burgeoning virtual production community for advice and inspiration.

Workman worked as a commercial cinematographer in New York for more than a decade before looking to transition into larger, visual effects-heavy projects. That ambition also sparked an interest in doing his own previs.

“When I was working as a live-action cinematographer, the bigger projects required weeks of pre-production and planning for a one-to-three-day shoot,” Workman remarks. “Many of these projects had storyboards and previs already done, so I wanted to be able to contribute and share my ideas for camera work and lighting. I started by building tools in Maya, and for a short time I actually had a product called ‘Virtual Cinematography Tools’ – a mix of rigged models and a simple Python/MEL plug-in – that some people still use.”
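That Maya workflow is easy to picture in miniature. As an illustration only – this is not Workman’s actual ‘Virtual Cinematography Tools’ code, just a sketch of the kind of scripting such a plug-in builds on – a few lines of Maya Python can create a previs camera whose focal length and film back match a real body and lens:

```python
# A minimal sketch (not Workman's plug-in) of previs-style Maya
# scripting: a camera whose focal length and film back match a real
# body/lens combo, so viewport framing matches the physical camera.
import maya.cmds as cmds

def make_previs_camera(name="previsCam", focal_length_mm=35.0,
                       sensor_w_mm=24.89, sensor_h_mm=18.66):
    """Create a camera matching a real sensor (defaults: Super 35-ish)."""
    # Maya specifies film aperture in inches, hence the /25.4.
    cam, cam_shape = cmds.camera(
        name=name,
        focalLength=focal_length_mm,
        horizontalFilmAperture=sensor_w_mm / 25.4,
        verticalFilmAperture=sensor_h_mm / 25.4,
    )
    # Lock the film back so artists change lenses, not the sensor.
    cmds.setAttr(cam_shape + ".horizontalFilmAperture", lock=True)
    cmds.setAttr(cam_shape + ".verticalFilmAperture", lock=True)
    return cam

cam = make_previs_camera(focal_length_mm=50.0)
cmds.move(0, 1.6, 5, cam)  # roughly eye height, pulled back from origin
```

Because the framing is driven by the same focal length and sensor numbers used on set, shots planned this way translate directly to the real camera package.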

That early tool eventually segued into a real-time cinematography simulator called Cine Tracer. In between, Workman started a company called Cinematography Database and began selling a Cinema 4D previs plug-in known as Cine Designer, which allowed for very accurate technical previs with rigged 3D models of cameras, lights and other film industry equipment. It caught the eye of Unreal Engine maker Epic Games.

“One day Epic Games asked me if I would develop something similar for Unreal,” relates Workman. “So I started to learn Unreal Engine 4 and turned a quick prototype into a video game. I posted a video of it online and it went viral. Since then I’ve kept adding to what is now called Cine Tracer as I learn Unreal Engine, and it’s turned into its own ecosystem, with set building and digital humans on top of the cameras and lights from my Maya/Cinema 4D work.”

Having worked with Epic since the beginning of Cine Tracer, Workman has maintained that relationship. He was awarded an Unreal Dev Grant and has been part of the Unreal Engine 4 Live Training stream. When Epic Games became involved in helping productions with real-time footage delivery on LED walls (the kind popularized by The Mandalorian), they asked Workman to help demo the tech on a purpose-built stage featuring Lux Machina LED screens. Specifically, he was part of an LED wall shoot as ‘director/DP’ during SIGGRAPH 2019.

“I was already going to be presenting twice with Unreal Engine at SIGGRAPH, so I was up for one more project,” says Workman. “I had no idea how massive the LED project would be, and I spent a full month on the stage. My primary role was to be the cinematographer and consult on the camera, lens, lighting, grip, equipment, and the general approach to shooting the project. I also had to consult on crew and other typical cinematographer responsibilities to make for a smooth, professional shoot.

“The team on the project had already worked together on The Mandalorian, so I was the one catching up,” adds Workman. “But I contributed ideas on how I wanted to control the CG lighting environment at run time. I was also responsible for finding shots that best showed off the reflections of the LED wall on the live-action motorcycle and actor they had there.”

Since the LED wall experience, Workman has continued experimenting in real-time and showcasing his results. For example, earlier this year he built his own virtual production setup at home, a fortuitous step for when the COVID-19 crisis arrived. “The main tool I’ve employed here is the HTC Vive Tracker, which allows me to track the position of a real-world camera,” explains Workman. “I bought a Blackmagic URSA Mini Pro 4.6K G2, and using a Blackmagic DeckLink 8K Pro I can bring the live footage into Unreal Engine 4.
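The tracking half of that home setup can be sketched outside the engine. Assuming the community pyopenvr bindings for SteamVR (call signatures vary slightly between versions), and noting that Workman feeds this data into Unreal rather than a standalone script, polling a Vive Tracker’s pose looks roughly like this:

```python
# Illustrative only: poll a Vive Tracker's pose via SteamVR using the
# pyopenvr bindings (pip install openvr). Workman's actual pipeline
# routes this data into Unreal Engine; here we just print positions.
import time
import openvr

vr = openvr.init(openvr.VRApplication_Other)
try:
    while True:
        poses = vr.getDeviceToAbsoluteTrackingPose(
            openvr.TrackingUniverseStanding, 0,
            openvr.k_unMaxTrackedDeviceCount)
        for i, pose in enumerate(poses):
            # Only generic trackers (pucks), not HMDs or controllers.
            if (vr.getTrackedDeviceClass(i)
                    == openvr.TrackedDeviceClass_GenericTracker
                    and pose.bPoseIsValid):
                m = pose.mDeviceToAbsoluteTracking  # 3x4 row-major matrix
                x, y, z = m[0][3], m[1][3], m[2][3]  # translation column
                print(f"tracker {i}: x={x:.3f} y={y:.3f} z={z:.3f} m")
        time.sleep(1.0 / 60)  # poll at roughly 60 Hz
finally:
    openvr.shutdown()
```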

TOP: Workman during a demo on Epic Games’ LED wall setup. (Image courtesy of Epic Games)

BOTTOM: A ‘Cyber Punk Car Scene’ made with Unreal Engine, in which Workman used the engine’s Sequence Recorder to act out the scene and actually drive the car.

TOP LEFT: Workman, working as a cinematographer, shot commercials for brands such as BMW, Kodak and General Mills in New York for several years.

TOP RIGHT: Some of the camera gear and rig Workman has for shooting in his virtual stage.

MIDDLE: Workman films, from home, with a virtual camera.

BOTTOM: Early crane-operating experience on a commercial shoot.

“I use Composure in UE 4.25 to live-composite footage into a 3D Unreal Engine environment, and I can do basic camera movement using the Vive Tracker. This same ‘indie’ approach can be used on an LED wall using nDisplay, and there are several smaller studios using this workflow today.”
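Composure performs the keying and layering inside Unreal, but the core of any live composite is easy to show in isolation. This toy NumPy chroma key – illustrative only, not the Composure implementation – swaps in a CG background wherever the camera frame is sufficiently green:

```python
# A toy chroma key, illustrating the compositing step a tool like
# Composure performs in-engine: wherever the camera frame is "green
# enough," show the CG background instead. Real keyers also handle
# spill, soft edges and color space; this is just the core idea.
import numpy as np

def chroma_key(fg, bg, threshold=40):
    """fg, bg: (H, W, 3) uint8 arrays of the same size."""
    fg = fg.astype(np.int16)  # avoid uint8 overflow in the subtraction
    # "Greenness": green channel dominates both red and blue.
    greenness = fg[..., 1] - np.maximum(fg[..., 0], fg[..., 2])
    matte = (greenness > threshold)[..., None]  # True where screen shows
    return np.where(matte, bg, fg).astype(np.uint8)

# Tiny synthetic test: a solid green frame with a gray "actor" square.
h, w = 240, 320
frame = np.zeros((h, w, 3), np.uint8); frame[..., 1] = 200   # greenscreen
frame[80:160, 120:200] = (128, 128, 128)                     # the actor
cg_background = np.full((h, w, 3), (30, 30, 90), np.uint8)   # CG plate
out = chroma_key(frame, cg_background)
print(out[0, 0], out[120, 160])  # background color, then actor gray
```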

Workman says that even before the coronavirus pandemic he had been experimenting with how to produce content remotely with virtual production techniques. To prove it could be done, he teamed up with Eric Jacobus from SuperAlloy Interactive to produce a short film.

“Eric is a professional stunt/action actor and has a mocap studio and team. He wrote the script, and we did a remote mocap session with an Xsens suit over Zoom. I could see multiple witness-camera feeds as well as the live Xsens preview of the scene. The mocap from the day was then cleaned up by [mocap artist] Mike Foster and sent to me.”

Workman has also been writing a new ‘Virtual Production Tools’ framework for Unreal Engine 4, and was able to streamline the live filming of the mocap data. “We actually streamed four hours of the shoot to YouTube. The amazing part was that it felt very similar to a director and DP working on a live-action set.

“I recently saw the rough cut of the film and it’s turning out great,” says Workman. “Our next step is to further refine the mocap based on the edit and do a final art and lighting pass, then film the project again to get a final result. This is an experimental approach, but the results are quite promising.”

From Workman’s perspective, virtual production is clearly going to be part of the future of production, especially for the benefits of collaboration. “Having everyone actually seeing the virtual environment is a breath of fresh air,” he observes. “It brings all the creative decision-making back into one space and one moment, whereas production is typically distributed around the world and over weeks or months in a typical greenscreen virtual environment pipeline.”

There are, however, some cautions that remain. “LED walls have their limits,” Workman states. “They won’t work for every shoot, but I think directors and writers are interested in using them, and they will find ways of writing and shooting that make great use of them in production.”

Workman hopes there will also be continued developments in matching real-time tools to real-world tools. “Right now, matching real-world cameras and lenses in Unreal Engine is quite difficult. It’s been solved by companies like Mo-Sys and Ncam, but it requires very expensive hardware and software. The hope is that camera and lens manufacturers will map lens distortion and other important variables internally and make them accessible at runtime, genlocked to the camera’s shutter. Once that process is made simpler and cheaper, doing mixed reality on greenscreen and LED walls will be much more accessible.”
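The difficulty is that a CG camera must reproduce each lens’s curvature exactly, or the real and virtual layers slide apart toward the frame edges. The widely used Brown-Conrady radial model gives a feel for the per-lens variables that would have to travel with every frame (a generic sketch, not Mo-Sys’ or Ncam’s actual calibration):

```python
# A generic sketch of radial lens distortion (Brown-Conrady model),
# the kind of per-lens data that would need to be mapped and shipped
# with each frame. Not any vendor's actual calibration format.
import numpy as np

def distort(points, k1, k2, fx, fy, cx, cy):
    """Map ideal (undistorted) pixel coords to where the lens puts them.

    points: (N, 2) pixel coordinates; fx, fy: focal length in pixels;
    cx, cy: principal point; k1, k2: radial distortion coefficients.
    """
    # Normalize to camera coordinates (unitless, centered on the axis).
    x = (points[:, 0] - cx) / fx
    y = (points[:, 1] - cy) / fy
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2  # radial distortion factor
    # Back to pixel coordinates.
    u = x * scale * fx + cx
    v = y * scale * fy + cy
    return np.stack([u, v], axis=1)

# A point near the corner of a 1920x1080 frame moves noticeably;
# one at the exact center doesn't move at all.
pts = np.array([[960.0, 540.0], [1800.0, 1000.0]])
print(distort(pts, k1=-0.1, k2=0.02, fx=1000, fy=1000, cx=960, cy=540))
```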

The cinematographer observes, too, that there is a strong crossover between virtual production and what VFX practitioners already do on live-action sets. “Completely CG/engine virtual production is very similar to a traditional VFX or 3D animation pipeline, and it is where most VFX artists will probably enter real-time production. Once they are comfortable in a real-time engine, they can move on to being an on-set real-time tech artist or artist and help run the virtual environment that is on the LED walls or live-composited on a greenscreen shoot.”

To many newcomers to real-time and virtual production – and to many already in the industry – Workman has been a significant source of energy. And while he certainly acknowledges being in the business of selling products, via his company Cinematography Database and its main product Cine Tracer, Workman says the community aspect of virtual production right now is very important.

“As a developer and artist, I thrive on sharing my work daily. It provides the feedback I need to continue to learn and grow. I livestreamed the first year of Cine Tracer development every day on Twitch. I didn’t make much money from the stream, but I met a community of Unreal Engine developers who literally taught me Unreal Engine as I was streaming it.

“My current move into Unreal Engine-based virtual production is in a similar state to Cine Tracer when I started out,” Workman adds. “So I broadcast my whole process, failures and successes. I’ve built a community around this growing field and it accelerates my learning, and it also allows me to work with companies who are interested in virtual production to integrate their existing products or work together to create new ones.”

TOP: Workman has been acquiring a wealth of virtual production gear in the past few years.

MIDDLE: A game-engine-rendered scene is captured.

BOTTOM: The resulting frame.
