


TOP TO BOTTOM: The LED wall, provided to Zoic on loan from Fuse Technical Group, gave the visual effects studio a chance to test several shooting scenarios.

The LED wall, a ROE Black Pearl 2, was 28 tiles wide and 8 tiles high.

An on-set tracking system utilizing Vive controllers allowed for camera movement and real-time re-adjustment of the imagery on the LED wall. “The panels create light, but not strong enough to get shadows. So, you can have an overall global lighting and the value’s just right on the character, but there is no shadow for the character. That’s why we wanted to also have practical lighting. We used the DMX plugin from Unreal, which would send information for correct intensity and color temperature to every single practical light. So, when you move in real-time, the light actually changes, and we can create the real cast shadows on the characters.” —Julien Brami, Creative Director and Visual Effects Supervisor, Zoic Studios

Fuse Technical Group (FTG) loaned Zoic an LED wall, installing it at their studio space in Culver City. It was a big leap for Zoic, which predominantly delivers episodic, film and commercial VFX work. Subsequently, the studio launched a ‘Real-Time Group’ to explore more virtual production offerings and use Unreal Engine for visualization, virtual art department deliveries, animation and virtual production itself.

“What we said was, ‘We have the time, we have the technology, we need to learn it,’” relates Brami, in terms of the initial push into LED wall filmmaking. “FTG lent us an LED wall for about three months, and it was really exciting because they said, ‘We can give you the panels and we can do the installation, but we can’t really give you anyone who knows how to do it. You have to learn yourself.’ So it was just a few of us here at Zoic doing it. We each tried to figure everything out.”

The LED wall setup at Zoic was used for a range of demos and tests, as well as for a ‘Gundam Battle: Gunpla Warfare’ spot featuring YouTuber Preston Blaine Arsement. The spot sees Preston interact with an animated Gundam character in a futuristic warehouse. The character and warehouse environment were projected on the LED wall. “That was the perfect scenario for this,” says Brami.

The wall was made up of around 220 panels. Zoic employed a Vive tracker setup for camera tracking and as a way to implement follow focus. The content was pre-made for display on the LED wall and optimized to run in Unreal Engine in real-time.
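The follow-focus part of that tracker setup comes down to simple geometry: given the tracked camera position and a reference point on the subject, the focus distance is just the distance between the two. A minimal sketch of the idea in Python, with made-up positions standing in for live Vive tracker data (reading the trackers and driving the lens motor are hardware-specific and not shown):

import math

def focus_distance_m(camera_pos, subject_pos):
    # Distance from the tracked camera to the subject, in meters.
    return math.dist(camera_pos, subject_pos)

# Placeholder poses; in practice these would come from the Vive tracker mounted
# on the camera and a second tracker (or a fixed mark) near the subject.
camera_pos = (0.0, 1.6, 3.2)
subject_pos = (0.4, 1.5, 0.8)

print(f"Set lens focus to {focus_distance_m(camera_pos, subject_pos):.2f} m")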

“We had the LED processor and then two stations that were running on the set,” details Brami. “We also had a signal box that allowed us to have genlock, because that’s what matters the most. Everything has to work in tandem to ensure the refresh rate is right and the video cards can handle the playback.”

There were many challenges in making the LED wall setup work, notes Brami, such as dealing with lag on the screen caused by the real-time tracking of the physical camera. Another challenge involved lighting the subject in front of the LED panels.

“The panels create light, but not strong enough to get shadows. So, you can have an overall global lighting and the value’s just right on the character, but there is no shadow for the character. That’s why we wanted to also have practical lighting. We used the DMX plugin from Unreal, which would send information for correct intensity and color temperature to every single practical light. So, when you move in real-time, the light actually changes, and we can create the real cast shadows on the characters.”
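As a rough illustration of what that Unreal-to-fixture link has to do each frame, the sketch below maps a virtual light’s intensity and color temperature onto 8-bit DMX channel values for a hypothetical two-channel fixture (dimmer plus color temperature). The channel layout and value ranges are illustrative assumptions, not the actual Unreal DMX plugin API or any specific fixture profile:

def to_dmx(value, lo, hi):
    # Clamp and linearly map a value in [lo, hi] to an 8-bit DMX value (0-255).
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 255)

def light_to_dmx(intensity, kelvin, max_intensity=1000.0, kelvin_range=(2700.0, 6500.0)):
    # Hypothetical two-channel fixture: channel 1 = dimmer, channel 2 = color temperature.
    return to_dmx(intensity, 0.0, max_intensity), to_dmx(kelvin, *kelvin_range)

# Each time the virtual scene updates, resample the virtual light and resend
# the channels so the practical fixture follows the content on the wall.
print(light_to_dmx(intensity=450.0, kelvin=4300.0))  # -> (115, 107)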

Thoughts on an LED Wall Future

Arising from his LED wall and virtual production experiences so far, Zoic Studios Creative Director and Visual Effects Supervisor Julien Brami believes there are many advances still to come in this space.

One development Brami predicts is more widespread remote operation of LED walls, something that became apparent in part during the pandemic. “All of this technology just works on networks. My vision is that one day I can have an actor somewhere in Texas, an actor somewhere in Germany, I can have the director anywhere else, but they all look at the same scene. As long as it can be all synchronized, we’ll be able to do it. And then you won’t need to all travel to the same location if you can’t do that or if it’s too expensive.”

Another advancement that Brami sees as coming shortly is further development in on-set real-time rendering to blend real and virtual environments. “This is going to be like an XR thing. You shoot with an LED wall, but then you can also add a layer of CG onto the frame as well – that’s still real-time. You can sandwich the live-action through the background that’s Unreal Engine-based on the wall and then add extra imagery over the wall.

“Doing this,” Brami says, “means you can actually create a lot of really cool effects, like particles and atmospherics, and make your sets bigger. It does need more firepower on set, but I think this is really what’s going to blend from an actor in front of a screen to something that’s fully completed. You could put in a creature there, you can put anything your space can sandwich. I’m really excited about this.”

Zoic tested several cameras on the LED wall set, including an ALEXA Mini, Sony VENICE and RED KOMODO. Brami says they noticed each camera sensor produced different responses to the LEDs. “We learned really early on that to make it this beautiful, magical treatment when everything’s still degraded, that the grading of the scene through the monitor was extremely important. Being able to match the blackness level, and also the hue and saturation values of the screen, was extremely important to make it believable.”
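That per-camera matching can be thought of as a simple grade applied to the wall content before it is displayed: lift the black level and scale the saturation until the screen matches what the sensor sees through the monitor. A toy version of that adjustment on a float RGB image, with placeholder lift and saturation values that would in practice be dialed in by eye:

import numpy as np

def grade_for_wall(img, black_lift=0.02, saturation=0.95):
    # Lift the black floor and scale saturation of a float RGB image in [0, 1].
    img = black_lift + (1.0 - black_lift) * img
    luma = img @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luminance weights
    img = luma[..., None] + saturation * (img - luma[..., None])
    return np.clip(img, 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3)                  # stand-in for a rendered wall frame
graded = grade_for_wall(frame, black_lift=0.03, saturation=0.9)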

Ultimately, Brami came away from the experience more enthusiastic about the role of LED walls in production and visual effects than he had ever been previously. That was both because of what he saw could be achieved with in-camera visual effects and interactivity on set, but also what it meant for ‘world-building,’ something VFX studios are regularly called upon to do.

“What’s really interesting from my point of view is the power of virtual production to save the backgrounds forever,” declares Brami.

TOP TO BOTTOM: LED panels with a 2.8mm pixel pitch made up the wall.

Zoic Studios has ultimately established a ‘Real-Time Group’ at the studio to handle virtual production and real-time game-engine solutions for VFX work.

Filming the ‘Gundam Battle: Gunpla Warfare’ spot featuring YouTuber Preston Blaine Arsement, left.

TOP TO BOTTOM: Preston Blaine Arsement performs on the LED wall set.

An Unreal Engine view of the Gundam character for the ‘Gundam Battle: Gunpla Warfare’ commercial.

The video village, or ‘brain bar’ area of the LED wall stage, during filming of the Gundam commercial at Zoic. “We know how to create VFX and we know how to create content and assets, we just need to get more involved in preparing this content for real-time on LED walls, since the assets need to run at high frame rates...” —Julien Brami, Creative Director and Visual Effects Supervisor, Zoic Studios

“When you shoot on location, you may have the best location ever, but you don’t know, if there’s another season next year, whether this location will still be there or will remain untouched. Now, even 10 years from now, you might want a location in Germany – well, we can come back to the same scene built for virtual production. Exactly the same scene, which really is mind-blowing.”

THE STATE OF PLAY WITH ZOIC AND LED WALLS

It became apparent to Zoic during this test and building phase that a permanent build at their studio space may not be necessary, as Brami explains. “We realized a couple of things. First, this technology was changing so fast. Literally, every three months there was a new development, which meant that if you bought something now and had it delivered in three months, it would already be obsolete.

“Take the LED panels, for example,” adds Brami. “The panels we had at the time were 2.8mm pixel pitch. The pixel pitch is basically the distance between LEDs on a panel, and these define the resolution you can get from the real estate of the screen and how close you can get to it and all the aberrations you can get. When we started shooting, 2.8 was state-of-the-art. But then we’ve seen pixel pitches of 1.0 appearing already. Everybody has seen the potential of this technology, and the manufacturers want to make it even better.”
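Pixel pitch translates directly into how much resolution a wall of a given size can carry. As a back-of-the-envelope check, assuming the ROE Black Pearl 2’s commonly quoted 500mm square tile with 176 x 176 pixels (roughly the 2.8mm pitch mentioned above), the 28 x 8 tile wall works out as follows; the tile figures are assumptions from public spec sheets, not numbers confirmed in the article:

# Back-of-the-envelope wall resolution from tile count and assumed tile specs.
tile_size_mm = 500        # assumed Black Pearl 2 tile: 500mm x 500mm
pixels_per_tile = 176     # assumed 176 x 176 px per tile, i.e. ~2.84mm pixel pitch
tiles_wide, tiles_high = 28, 8

width_px = tiles_wide * pixels_per_tile      # 4928
height_px = tiles_high * pixels_per_tile     # 1408
width_m = tiles_wide * tile_size_mm / 1000   # 14.0
height_m = tiles_high * tile_size_mm / 1000  # 4.0

print(f"{width_px} x {height_px} px across {width_m} m x {height_m} m")
# A finer 1.0-1.5mm pitch packs roughly two to three times the pixels per meter,
# which is what lets the camera move closer before individual LEDs resolve.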

Zoic therefore decided that, instead of buying an LED wall for permanent use, it would utilize the loaned wall as much as possible during the three months to understand how it all worked. “Our main focus then became creating really amazing content for LED walls,” states Brami. “We know how to create VFX and we know how to create content and assets, we just need to get more involved in preparing this content for real-time on LED walls, since the assets need to run at high frame rates, etc.”

Already, Zoic has produced such content for a space-related project and also for scenes in the streaming series Sweet Tooth, among other projects. Indeed, Zoic believes it can take the knowledge learned from its LED wall experience and adapt it for individual clients, partnering with panel creators or other virtual production services to produce what the client needs. “So, if they need a really high-end wall, because everything needs to be still in focus, we know where to go and how to create a set for that,” says Brami. “But if they just need, let’s say, an out-of-focus background for a music video, we know where to go for that approach.

“I love where it’s all going, and I’m starting to really love this new technology,” continues Brami. “The past three years have really been the birth of it. The next five years are going to be amazing.”

VIRTUAL PRODUCTION: AARON SIMS CREATIVE

DEEP-DIVING INTO REAL-TIME CAN OFFER A CONTENT CREATOR INSTANT BENEFITS

By IAN FAILES

Images courtesy of Aaron Sims Creative, except where noted.

TOP: A concept for the Demogorgon creature in Stranger Things by Aaron Sims Creative (ASC). (Image copyright © 2016 Netflix)

BOTTOM: Aaron Sims, founder of Aaron Sims Creative.

OPPOSITE TOP: Sims paints on a puppet used during the filming of Gremlins 2: The New Batch.

Aaron Sims Creative (ASC), which crafts both concept designs and final visual effects for film, television and games, bridging the worlds of practical and digital effects, has long delivered content relying on the ‘traditional’ workflows inherent in crafting creatures and environments. More recently, though, ASC has embraced the latest real-time methods to imagine worlds and create final shots. In fact, a new series of films ASC is developing uses game engines at its core, as well as related virtual production techniques.

The first of these films is called DIVE, a live-action project in which ASC has been capitalizing on Epic Games’ Unreal Engine for previs, set scouting and animation workflows. Meanwhile, the studio has also tackled a number of commercial and demo projects with game engines and with LED volumes, part of a major move into adopting real-time workflows.

So why has ASC jumped head-first into real-time? “It’s exactly in the wording ‘real-time,’” comments ASC CEO Aaron Sims, who started out in the industry in the field of practical special effects makeup with Rick Baker before segueing into character design at Stan Winston Studio, introducing new digital workflows there. “You see everything instantaneously. That’s the benefit right there. With traditional visual effects, you do some work, you add your textures and materials, you look at it, you render it, and then you go, ‘OK, I’ve got to tweak it.’ And then you go and do that all again.

“You build it, and you can go with your team to this virtual world and go scout out your world before you go make it – we never really had that luxury in traditional visual effects. We can build just enough that you can actually start to scout it with a basic camera, almost like rough layouts, and then use Unreal’s Sequencer to base your shots, and work out what else you need to build.” —Aaron Sims

“But with real-time, as you’re tweaking you’re seeing everything happen with zero delay. There’s no delay whatsoever. So, the timeframe of being able to develop things and just be creative is instant.”

The move to real-time began when ASC started taking on more game-related and VR projects, with Sims noting they have now actually been using Unreal Engine for a number of years. “However,” he says, “it wasn’t until COVID hit that I saw some really incredible things being produced by artists and other visionaries who were coming up with techniques on how to use the engine for things beyond games. That’s what excited me, so I started getting into it and learning Unreal in more detail to help tell some of our stories. Then I realized, ‘OK, wait, this is more powerful than I expected it to actually be.’ On every level, it was like it was doing more than I ever anticipated.”

The other real benefit of real-time for Sims has been speed. “The faster I can get from point A to point B, I know it’s true to my original vision. The more you start muddying it up, the more it becomes something else and the less that you can actually see it in real-time. You’re just guessing until the end. So for me it’s been fascinating to see something that’s available now, especially during COVID when I’ve been stuck at home. It’s given me even more reason to dive into it as much as I can and learn as much as possible.”

Over the past few years, ASC has regularly produced short film projects. Some have been for demos and tests and some as part of what Sims calls the studio’s ‘Sketch-to-Screen’ process, designed to showcase the steps in concept design, layout, asset creation, lookdev, previs, animation, compositing and final rendering. Sims has even had a couple of potential feature films come and go.

DIVE is the first project for which the studio has completely embraced a game engine pipeline, and is intended as the start of a series of ‘survival’ films that Sims and his writing partner, Tyler Winther (Head of Development at ASC), have envisaged.

“The films revolve around different environment-based scenarios,” outlines Sims. “This first one, DIVE, is about a character who’s put in a situation where they have to survive. We’re going to see diving and cave diving in the film, but we wanted to put an everyday person into all these different situations and have the audience go, ‘That could be me.’”

The development of the films has been at an internal level so far (ASC has also received an Epic Games MegaGrant to help fund development), but still involves some ambitious goals. For example, DIVE, of course, contains an underwater aspect to the filmmaking. Water simulation can be tricky enough already with existing visual effects tools, let alone inside a game engine for film-quality photorealistic results.

“It’s very challenging to do water,” admits Sims, although he adds that the technology has progressed dramatically in game engines.
