TV: THE WITCHER
By IAN FAILES
All images © copyright 2019 Netflix.
TOP: The swamp arachnid creature known as the Kikimora, which Geralt battles right at the top of the series.
OPPOSITE TOP TO BOTTOM: A clash between soldiers was filmed with minimal actors and horses.
The final shot made use of crowd simulation software to fill the frame.
CG geometry for Cintra Castle.
The final shot of the city under siege.

Already a popular book and game series, The Witcher has now also made the move to a streaming series on Netflix, created by Lauren Schmidt Hissrich. With a story that centers around monster hunter Geralt of Rivia (Henry Cavill) making his way in a fantastical world, the show was always going to involve some level of visual effects work.
That effort was overseen by Visual Effects Supervisor Julian Parry and Visual Effects Producer Gavin Round, with a host of visual effects studios delivering shots for the show. These included Cinesite, Framestore, One of Us, Nviz and Platige Image.
“When I read the scripts, I realized I was not looking at a standard TV series,” says Parry. “I realized it had a whole worldbuilding sub-story. I did some further research and saw that it was actually a really big book series, and it dawned on me that we were going to have to do this justice.”
The show was to be filmed in Hungary, the Canary Islands and Poland. From a visual effects point of view, live-action photography was a jumping-off point for environment extensions, battles, creatures, magical effects and a host of weapons, blood and gore.
Says Parry, “Cinesite supplied a couple of our creatures and environments. Framestore worked on mainly environments and 2D effects. One of Us had a mixture of 3D creatures and environments. Nviz were supplying predominantly the eye coloration effects. Platige Image, who were closely tied into the show from the game series, also supplied us with 2D and 3D effects.”
CREATURE FEATURES
One of the first major VFX sequences audiences encounter in
The Witcher is a fight between Geralt and a swamp-based creature called the Kikimora. The scene plays out almost like hand-to-hand combat. “Our aquatic arachnid here lived in a bog swamp, so I knew we had to deal with water interaction and other occlusions,” outlines Parry. “We started out with previs to get an idea of what action was required, and from that we worked backwards to work with Production Designer Andrew Laws and Stunt Coordinator Franklin Henson to plan things out.
“Alex Pejic, the Visual Effects Supervisor from Cinesite, came out to the shoot,” continues Parry. “We chose to have a couple of prosthetic pieces made. We felt we would benefit from the interaction. The eyelines and scale of the creature were all previs’d so Henry could see what this thing looked like and what space it would take up in the environment. Then it went to Cinesite for post.”
“The shots that challenged us the most,” details Pejic, “were mainly to do with the movements of the creature in relationship to Henry. Also, the viscosity of the water, since the movement would drive the water simulations.”
Cinesite was also responsible for visual effects related to the Striga, which Geralt battles. Here, both a practical prosthetic suit creation and a completely CG creature were utilized for final scenes. “They shot everything, but there were some safety and physical limitations with what could be done practically,” outlines Pejic. “We ended up with a hybrid version of the creature. Certain shots were kept as is, but we’d sometimes trim down or extend particular parts of the body.
“We body tracked every single shot where we had to do those adjustments, and also built a fully 3D creature,” says Pejic. “Then, with a specific script that we created in Nuke, the walk was blended between the plates and CG elements to create this particular look. A number of shots also involved a fully CG creature where we wanted the animation to be a little ‘crazier.’”
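The exact Nuke tool is Cinesite’s own, but as a minimal, hypothetical sketch of that kind of plate/CG blend, keyframing a Dissolve node through Nuke’s Python API might look like the following (file paths, frame numbers and the blend curve are placeholders, not the studio’s script):

```python
# Hypothetical sketch: keyframing a blend between a plate and a CG render in Nuke.
# Paths and frame ranges are placeholders; the real tool likely did far more
# (per-limb masks, body-track-driven warps, etc.).
import nuke

plate = nuke.nodes.Read(file="plates/striga_walk.####.exr")
cg = nuke.nodes.Read(file="renders/striga_cg.####.exr")

blend = nuke.nodes.Dissolve(inputs=[plate, cg])
blend["which"].setAnimated()
blend["which"].setValueAt(0.0, 1001)   # fully the practical suit at the start
blend["which"].setValueAt(1.0, 1020)   # fully CG by frame 1020

out = nuke.nodes.Write(inputs=[blend], file="comp/striga_blend.####.exr")
```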
Cinesite’s creature work continued with the green and gold dragons seen later in the series. Gold, in particular, presented a number of design hurdles for the team, as did the various components making up the dragons, including alligator-like features and wings. “It was a major challenge to make it look gold, but not look ridiculous,” admits Pejic. “But I don’t think it had been done before – I don’t remember having seen a golden dragon in any other work.”
BUILDING THE MAGIC

The Witcher covers vast tracts of land. Some locations were built practically, or took advantage of existing locations. For instance, the exterior of Cintra Castle in the show included a fort near the Slovakian border in Komárom, which was then extended with visual effects.
“Cintra was one of the main builds we did,” attests Framestore Visual Effects Producer Christopher Gray. “There was a lot of research that went into medieval castles. We looked at 11th century Paris and the logic of the way a city and castle needed to feel together.”
“Our original inspiration was Carcassonne in France, which is obviously a real place,” adds Framestore Visual Effects Supervisor
Pedro Sabrosa. “Our version just became a bigger version of that city, a more exaggerated version treading the line between it being fantasy and being real.”
Establishers were part of this environment work, as were moments when the city is under siege. Fire and smoke simulations were required here. Many of the final shots during the siege ended up being split across multiple vendors.
Among the many other environments Framestore delivered for the show was the sorceresses’ academy at Aretuza, set on a rocky outcrop on Thanedd Island in Temeria. Framestore took a live-action location and added brutalist-like CG buildings to the shots.
“This was a tricky one because it was always envisaged as two structures that would sit on an island with water in the middle,” details Sabrosa. “It was purely by chance that they found this location in the Canary Islands when they were filming a whole other sequence.”
Aretuza is, of course, the location for a number of magical effects. One of those is the portal created by the characters Yennefer and Istredd. This was produced by Platige. “It was one of the first magical effects that we started developing very early in the production,” notes Platige Visual Effects Producer Krzysztof Krok. “It was evolving and needed some adjustments when we got plates from set. We understood then how flexible the FX setups, as well as our thinking about magic, had to be.”
Just as practical locations often informed the final environments, practical photography was also a major part of shooting the various battle sequences. Where vast numbers of soldiers could not be filmed, these shots were sometimes supplemented with digital crowds, including work by Cinesite.
“On this show we decided to introduce a new crowd system, which was Atom Crowd from Toolchefs,” states Pejic. “We had a lot of groundwork to do first, although we had experimented with this a little in the past. It worked out really well for us, making sure we could accommodate all kinds of creatures, integrate with Houdini and other tools, for driving different secondary dynamics, hair simulations and so forth.”
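As a generic illustration of what a crowd layout pass automates (this is not Atom Crowd’s or Houdini’s API; the clip names and parameters below are invented), a system like this scatters agents, assigns each a randomized animation clip, offset and scale, and flags a subset for heavier secondary passes such as hair simulation:

```python
# Hypothetical sketch of a crowd layout pass: not Atom Crowd or Houdini code.
import random
from dataclasses import dataclass

CLIPS = ["run_sword", "charge_spear", "stumble", "idle_shout"]  # invented clip names

@dataclass
class Agent:
    position: tuple
    clip: str
    clip_offset: float              # desynchronize agents sharing a clip
    scale: float                    # small size variation hides repetition
    needs_hair_sim: bool = False    # handed off to a secondary dynamics pass

def scatter_agents(count, area=(200.0, 200.0), seed=7):
    rng = random.Random(seed)
    agents = []
    for _ in range(count):
        agents.append(Agent(
            position=(rng.uniform(0, area[0]), rng.uniform(0, area[1])),
            clip=rng.choice(CLIPS),
            clip_offset=rng.uniform(0.0, 2.0),
            scale=rng.uniform(0.95, 1.05),
            needs_hair_sim=rng.random() < 0.1,  # only camera-near agents get full sims
        ))
    return agents

crowd = scatter_agents(500)
print(len(crowd), "agents;", sum(a.needs_hair_sim for a in crowd), "queued for hair sim")
```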
Platige was also a contributor to environments, including the Shan Keyan tree in the enchanted forest. “We had to show it in shots during the day and in the character Ciri’s vision as well,” says Platige Visual Effects Supervisor Mateusz Tokarz. “It has very vibrant color correction in trailers and fans loved it. On the other hand, in some shots from this sequence we’ve replaced around 75% of the screen with CG renders, and it was demanding to match the very stylized look that was used there.”
TOP TO BOTTOM: A plate shot in the Canary Islands for the sorceresses academy at Aretuza.
The composited shot with the added structures.
A plate showing the result of an attack from the Roach Hound.
The rendered Roach Hound, realized as a Frankenstein’d creature with extra armor and appendages.
Rearing a Roach Hound
During the series, the sorceress Yennefer at one point attempts to evade a Mage and his Roach Hound. One of Us tackled the Roach Hound as a CG creation.
“I remember asking at the time, what is the Roach Hound?” recalls The Witcher Visual Effects Supervisor Julian Parry. “Is it a hound that looks like a moth-bitten threadbare mangy dog, or was it literally a roach and a hound? The answer that came back was, ‘No, it’s a Roach Hound.’ We sent over some notes to One of Us and very simply said to the team, ‘We need a Roach Hound, something that looks like a cockroach but moves like a hound.’
“We did a bit of tinkering with the design,” continues Parry. “We embellished it with some markings and gave it a backstory with regards to the way that the Mages took these creatures and then turned them into these little Frankenstein’d things. Some of their limbs have been replaced with blades and some of their shells had been replaced with armor. So there’s a bit of a hybrid thing going on there with the poor souls.”
One of Us also worked on the portals that Yennefer and the Mage make their way through during that sequence. “We wanted these portals to look very different from other shows,” notes Parry. “They are formed from the elements that surround their environments. There’s a portal made up of water from the pool, then sand when they’re in the desert, and so on with snow and dust and dirt.”
MANAGING A TV VFX SHOW
The VFX shot count on Season 1 of The Witcher approached 2,000 over the seven hours of content. This was post-produced in a six-month period, a relatively fast turnaround. “And the expectation isn’t any lower because it’s television,” notes Parry.
Most of the vendors were based in London; Platige was the only remote vendor. “But in this day and age,” says Parry, “sharing the work like that and having different sorts of review tools available such as cineSync and Skype, that’s all pretty straightforward now. By dividing the work up, you get the opportunity to cherry pick where work can go, you don’t over-stress one particular company, and you just manage a nice workflow with the multiple vendors.”
TOP TO BOTTOM: Actress Anya Chalotra as Yennefer performs a mock leap through a portal.
The final shot where the portal appears as if made up of sand.
CG golden dragons featured in Episode 6 of the series.
The dragon completed with fire effects simulation.
ILLUMINATING THE PATH AHEAD FOR REAL-TIME RAY TRACING
By TREVOR HOGG
TOP: The Speed of Light is a real-time cinematic experience that makes use of NVIDIA Turing architecture, RTX technology, and the rendering advancements of Unreal Engine. (Image courtesy of Epic Games)

Ray tracing is critical to simulating the realistic physical properties of light as it moves through and interacts with virtual environments and objects, but it can become cumbersome at render time, especially with complex imagery. A huge technical and creative effort has been made by the likes of NVIDIA, Unity Technologies, Chaos Group, Epic Games and The Future Group to be able to trace those paths of light, and translate them into pixels, in real-time.
Natalya Tatarchuk, Vice President of Global Graphics at Unity Technologies, has a Master’s in Computer Science from Harvard University, with a focus in Computer Graphics. “Having a graphics background is key to being able to understand the technology needs for Unity graphics, in order to deeply engage with our customers to understand how they want to create their entertainment, and what features and functionality they will need,” states Tatarchuk.
Real-time ray tracing is not confined to video games, she adds. “Real-time ray tracing will benefit any creative vision where maximal quality of surfaces or lighting is needed. You certainly can improve the look of video games with the hybrid ray-tracing pipeline we offer in High Definition Render Pipeline (HDRP), which selects the highest-quality effect with the highest performance as needed; for example, rendering high-fidelity area lights and shadows with ray tracing, but using rasterization for regular opaque material passes. Incredible fidelity lighting and photorealistic surface response can be created in architectural or industrial scenarios with our High Definition Render Pipeline Real-Time Ray Tracing (HDRP RTRT). You can also render Pixar-quality stylized
movies, or photorealistic cinema, or broadcast TV with beautiful visuals, and the most accurate medical visualizations with real-time ray tracing.”
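To make the hybrid idea concrete, here is a minimal, purely illustrative Python sketch (not Unity’s HDRP API; the effect names, costs and 16.6 ms budget are invented) that ray traces the effects that benefit most while the frame budget allows, and rasterizes the rest:

```python
# Illustrative toy "hybrid pipeline" scheduler: ray trace where it pays off,
# rasterize the rest. Effects, costs and budget are hypothetical.
from dataclasses import dataclass

@dataclass
class Effect:
    name: str
    benefits_from_rt: bool   # e.g. area-light shadows, reflections
    rt_cost_ms: float        # estimated GPU cost if ray traced
    raster_cost_ms: float    # estimated GPU cost if rasterized

def schedule(effects, hw_supports_rt, frame_budget_ms=16.6):
    """Pick ray tracing per effect where hardware and the frame budget allow."""
    plan, spent = {}, 0.0
    # Consider the ray-tracing candidates first.
    for fx in sorted(effects, key=lambda e: e.benefits_from_rt, reverse=True):
        use_rt = (hw_supports_rt and fx.benefits_from_rt
                  and spent + fx.rt_cost_ms <= frame_budget_ms)
        plan[fx.name] = "ray traced" if use_rt else "rasterized"
        spent += fx.rt_cost_ms if use_rt else fx.raster_cost_ms
    return plan, spent

effects = [
    Effect("area light shadows", True, 3.0, 1.0),
    Effect("reflections",        True, 4.0, 1.5),
    Effect("opaque materials",   False, 6.0, 2.0),
]
print(schedule(effects, hw_supports_rt=True))
```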
Chaos Group colleagues Lon Grohs, Global Head of Creative, and Phillip Miller, Vice President of Project Management, not only both studied architecture, but also share the same alma mater, the University of Illinois at Urbana-Champaign. The duo is excited about the prospects for Project Lavina, which is a software program that allows for V-Ray scenes to be explored and altered within a fully ray-traced and real-time environment. “It’s definitely a tougher path to go down for sure and one filled with landmines, like the uncanny valley,” admits Grohs. “There’s a built-up demand for realism because we’ve become a more astute audience.”
Another element needs to be taken into consideration, notes Miller. “There is also the issue during real-time that it is baked-in so much that things are inaccurate. That bothers people as well.” A streamlined workflow is important. “Two big things are at work here,” Miller adds. “There’s the resulting imagery and the path to get to it. Especially with designers, we’re finding it so much easier for people to stay in pure ray tracing because they can take their original data or design and move it over easily. If you’re working physically based, which most designers are, all you have to do is put lights where they’re supposed to be, use the correct materials, and you’ll get the real result. That’s much easier for people to relate to than to think about, ‘What computer graphics do I need to get a certain look or performance?’”
Juan Cañada, Ray Tracing Lead Engineer at Epic Games, studied Mechanical Engineering at Universidad Carlos III de Madrid. “My studies gave me a background in math and simulation of physical phenomena that has proven very valuable,” he says. “The real-time quest for ray tracing is practical even with constantly growing creative demands for imagery. The question is whether traditional digital content creation pipelines based on offline techniques are still a practical quest, considering the cost, iteration times and lack of flexibility. Real-time techniques are not the future, but the present.”
Unreal Engine drives the real-time technology for Epic Games. “Unreal Engine’s ray-tracing technology has been built in a non-disruptive way, so users can quickly go from raster to ray tracing, and vice versa, without being forced to do things such as changing shader models, or spend time on project settings to make things look right,” Miller explains. “From the general point of view, the workflow would still be the same. However, ray tracing allows creators to light scenes intuitively and closer to how they would light things in the real world. Besides, ray tracing systems are typically easier to use. Ray tracing makes it possible to focus more on the artistic side of things and less on learning technical aspects of algorithms that are difficult to control.”
Unity allows creators to set up custom ray-tracing LOD for materials so that the most effective computation can happen. “We simply use their regular materials when real-time reflections are enabled with real-time ray tracing, and fall back to screen-space reflections when it’s not available,” remarks Tatarchuk. “In Unity, real-time ray tracing is part of the HDRP, and we offer the same unified workflows as we do for everything in that pipeline.”
TOP TO BOTTOM: By binning together directions in the same spatial location for the rays that are shot, Unity can optimize performance for the ray computations on GPU. (Image courtesy of Unity Technologies)
Filmmaker Kevin Margo collaborated with 75 artists and performers to produce 210 fully CG shots for the 12-minute teaser Construct. (Image courtesy of Chaos Group)
NVIDIA RTX-powered ray tracing combined with real-time facial animation enabled the movements and reactions of the AR character to match the presenters and dances onstage. (Image courtesy of NVIDIA)
TOP: Unity allows creators to set up custom ray-tracing LOD for materials so that the most effective computation can happen, such as for reflection effects. (Image courtesy of Unity Technologies)
BOTTOM LEFT: Natalya Tatarchuk, Vice President of Global Graphics, Unity Technologies. (Image courtesy of Unity Technologies)
BOTTOM RIGHT: Juan Cañada, Ray Tracing Lead Engineer, Epic Games
A key component is the use of ray binning technology. “By binning together directions in the same spatial location for the rays that we shoot, we can optimize performance for the ray computations on GPU,” Tatarchuk says. “We do this by binning by direction of rays in octahedral directional space. This may be commonly found in offline renderers, but to adopt this method to the GPU architecture was a great achievement for optimization in real-time, where we took advantage of the modern GPU compute pipeline with atomics execution for fast performance.” A focus was placed on shooting only the rays that were needed. “Specifically, we employed variance reduction techniques in order to evaluate the lighting for ray directions more effectively. We reduced variance with analytic approximation and precomputation passes to improve performance. It is important to try to reduce the variance of whatever you are integrating during the ray-tracing computations. You can use importance sampling, consistent estimators or even biased estimators if you make sure to compare your result to a reference.”
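As a rough, purely illustrative sketch of the binning idea (the grid resolution and data layout here are invented, and Unity’s actual implementation runs as GPU compute with atomics rather than NumPy), ray directions can be mapped onto an octahedral square and grouped by quantized cell so that nearby directions are traced together:

```python
# Illustrative sketch only: binning ray directions by quantized octahedral coordinates.
import numpy as np

def octahedral_encode(d):
    """Map unit direction vectors (N,3) onto the [-1,1]^2 octahedral square."""
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    p = d[:, :2] / np.abs(d).sum(axis=1, keepdims=True)
    # Fold the lower hemisphere (z < 0) onto the outer triangles of the square.
    lower = d[:, 2] < 0.0
    folded = (1.0 - np.abs(p[:, ::-1])) * np.sign(p)
    p[lower] = folded[lower]
    return p

def bin_rays(directions, grid=16):
    """Group ray indices by octahedral grid cell for a more coherent GPU dispatch."""
    uv = (octahedral_encode(directions) * 0.5 + 0.5).clip(0.0, 1.0 - 1e-6)
    cells = (uv * grid).astype(int)
    keys = cells[:, 0] * grid + cells[:, 1]
    order = np.argsort(keys, kind="stable")
    return np.split(order, np.flatnonzero(np.diff(keys[order])) + 1)

rays = np.random.normal(size=(1000, 3))
print(len(bin_rays(rays)), "direction bins out of", 16 * 16)
```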
Project Lavina breaks away from the tradition of raster graphics. “That’s a different thing for people to wrap their heads around,” states Grohs, “because we’re so used to having these raster tradeoffs where you have certain geometry or shader limitations.”
Project Lavina is designed to take advantage of new ray-tracing hardware. “The hardest thing in the world is to make something easy,” observes Miller. “Project Lavina is a central gathering spot for people to share their projects regardless of the tool that it was made in. Our goal is, if you can play a 3D game you should be able to use Project Lavina. We’re adding functionality with the measure being, ‘How easy can we make it?’ That’s the core tenet behind real-time.” The denoiser runs in DirectX. “The denoising is a love-hate relationship,” observes Miller. “It’s silky smooth when you’re moving, but as soon as you stop, then you want to see the full detail. One thing that we wanted to make sure was in there before we went to general beta was crossfading from denoising to the actual finished result. That’s in there now. People will be able to enjoy the best of both worlds without having to choose. It’s very clean.”
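The general idea of such a crossfade can be sketched as follows (this is not Lavina’s code; the fade thresholds and images are invented): show the denoised result while the camera moves, then fade in the raw accumulation as samples converge.

```python
# Hypothetical sketch: cross-fading from a denoised image to the converged
# ray-traced accumulation as samples build up after the camera stops.
import numpy as np

def resolve(denoised, accumulated, samples_per_pixel, fade_start=8, fade_end=64):
    """Fully denoised while moving; full detail fades in as samples accumulate."""
    t = np.clip((samples_per_pixel - fade_start) / (fade_end - fade_start), 0.0, 1.0)
    return (1.0 - t) * denoised + t * accumulated

# Example: while the camera moves the accumulator resets each frame (few samples),
# so the output stays denoised; after ~64 still frames it is the raw render.
denoised = np.full((4, 4, 3), 0.50)
accumulated = np.full((4, 4, 3), 0.52)
print(resolve(denoised, accumulated, samples_per_pixel=32).mean())
```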
“The first phase in our real-time ray-tracing implementation was focused on basic architecture and required functionality,” remarks Cañada. “This technology was released almost a year ago in beta in Unreal Engine 4.22. We have since shifted the focus to put more effort into both stability and performance. We just released UE 4.24, where many ray-tracing effects are two to three times faster than in previous versions. Of course, we are not done. The know-how among Epic’s engineers is staggering, and we have many ideas to try that surely will keep striving towards increased performance with each release.”
TOP: The virtual production software program Pixotope, developed by The Future Group, makes use of a single pass render pipeline which enables ray-tracing processing while maintaining video playback rates. (Image courtesy of NVIDIA)
BOTTOM LEFT: Lon Grohs, Global Head of Creative, Chaos Group
BOTTOM RIGHT: Phillip Miller, Vice President of Project Management, Chaos Group
“Ray tracing allows creators to light scenes intuitively and closer to how they would light things in the real world. Besides, ray tracing systems are typically easier to use. Ray tracing makes it possible to focus more on the artistic side of things and less on learning technical aspects of algorithms that are difficult to control.” —Phillip Miller, Vice President of Project Management, Chaos Group
TOP: Construct was seen as the means to test the real-time ray-tracing and virtual production capabilities of Project Lavina. (Image courtesy of Chaos Group)
OPPOSITE TOP: Goodbye Kansas used no custom plug-ins when putting together the short Troll, which demonstrates the cinematic lighting capabilities of the real-time ray-tracing features of Unreal Engine 4.22. (Image courtesy of Epic Games)
OPPOSITE BOTTOM: Troll follows a princess on a journey through a misty forest full of fairies. (Image courtesy of Epic Games)
Real-time ray tracing in Unreal Engine requires an NVIDIA graphics card compatible with DXR. “When other hardware systems implement further support for real-time ray tracing, we will work hard to make them compatible with Unreal Engine,” states Cañada. Compatibility with other software programs and plug-ins is part of the workflow. “Unreal Engine provides a plug-in framework, so this is already possible. Many of the latest internal developments done at Epic have been using this system. On top of that, the whole Unreal Engine code is available on GitHub, so anyone can modify it to make it compatible with third-party applications.”
Case studies are an integral part of the research and development. “We were wondering how fast can an artist put together sets from their KitBash3D models and create extremely complex images, almost two billion polygons, without sparing or optimizing any details, and could we do it?” explains Grohs. “Evan Butler, an artist at Blizzard Entertainment, put together a 3ds Max scene. There are scenes that take place in a futuristic utopia, Neo-Tokyo, and a dock. They were KitBash’d together and exported out as a V-Ray scene file. We were in Lavina moving around in real-time just by drag-and-drop, which was awesome.” Miller adds, “The important thing here is that you cannot handle that level of
geometry within the creation tool, but you’re getting it real-time with Lavina.” Another experiment was called Construct. “Kevin Margo, who at the time was with Blur Studios and is now at NVIDIA, always had this idea to take virtual production to the next stage,” states Grohs. “We used a whole cluster of GPU servers and V-Ray GPU and tried to get real-time motion capture. We did that and were able to put it inside MotionBuilder and came close to a 24 fps ray-traced view. It didn’t have the denoising and the speed that we have now.”
Real-time ray tracing is a fundamental shift in computer graphics. “To get things into a game engine right now, there are quite a few steps and procedures that people have to use,” notes Grohs. “One is applying things like UV unwrapping and UV maps across all of the channels. Another is optimizing geometry and decimating it so that it fits within polygon count budgets. People can spend days if not weeks bringing things into whatever their favorite game engine is. Whereas a real-time ray tracer like Lavina can chew through huge scenes and you don’t need UV unwrapping or any of those other in-between steps. It’s literally a drag-and-drop process.”
A World First for Real-Time Ray Tracing
During the world’s first real-time ray-traced live broadcast, an AR gaming character answered questions and danced in real-time. In order to achieve this, Riot Games utilized The Future Group’s mixed-reality virtual production software Pixotope which makes use of NVIDIA RTX GPUs.
“The raw character assets were provided by Riot Games,” states Marcus Brodersen, CTO at The Future Group. “A large aspect of our creative challenge is simply ensuring that the end result looks convincing, and seamlessly blending the real with the virtual. This includes enforcing correct use of PBR, final lookdev passes, as well as onsite HDRI captures and final lighting to ensure the virtual talent is rendered in a way consistent with the physical environment we put her in.
“For the League of Legends Pro League regional finals [in Shanghai in September 2019], the heavy use of fog machines made compositing in Akali far more difficult, as we needed to occlude her with matching virtual fog in order to blend her in seamlessly,” explains Brodersen. “Ray tracing, powered by the newest series of RTX cards from NVIDIA, allows the use of traditional rendering methods in real-time environments. However, emulating millions of photons of light is still a taxing endeavor and has severe performance implications. We had some difficult technical requirements we had to meet. We were rendering and compositing 60 frames per second at full HD and could not miss a single frame as this would be visible as a significant pop in the broadcast.” A significant amount of work went into optimizing every aspect of the performance, from dialing ray lengths to combining as many light sources as possible, selectively trimming shadow attributes for each light, and fine-tuning post-processing effects. “One of the main reasons we are even able to do all of this is because Pixotope is a single-pass rendering solution, giving us the maximum performance of the Unreal Engine,” remarks Brodersen. “With little added overhead in ingesting and compositing our virtual graphics with the camera feed in a single renderer, we are able to push the boundary of what is possible in real-time VFX, resulting in this world-first event of its kind.”
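A toy sketch of that kind of budget-driven trimming (the light names, costs and heuristics below are invented, and this is not Pixotope or Unreal code): lights are sorted by contribution, and shadow rays are cut back on the least important lights until the frame fits a 60 fps budget.

```python
# Hypothetical sketch: trimming per-light shadow cost until a 16.6 ms frame budget fits.
from dataclasses import dataclass

@dataclass
class Light:
    name: str
    contribution: float      # rough screen contribution, 0..1
    shadow_rays: int         # shadow rays per pixel budgeted for this light
    cost_ms_per_ray: float   # estimated cost of one shadow-ray pass

def frame_cost(lights, base_cost_ms):
    return base_cost_ms + sum(l.shadow_rays * l.cost_ms_per_ray for l in lights)

def trim_to_budget(lights, base_cost_ms=8.0, budget_ms=16.6):
    # Cut shadow rays from the least important lights first.
    for light in sorted(lights, key=lambda l: l.contribution):
        while light.shadow_rays > 0 and frame_cost(lights, base_cost_ms) > budget_ms:
            light.shadow_rays -= 1
    return frame_cost(lights, base_cost_ms)

stage = [
    Light("key", 0.9, 4, 0.8),
    Light("fill", 0.4, 4, 0.8),
    Light("fog rim", 0.2, 4, 0.8),
]
print("frame cost:", round(trim_to_budget(stage), 2), "ms")
```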
Synchronization and latency are technical challenges. “When we are integrating with time-based elements like singing and dancing, we have to make sure that we are in perfect sync with the rest of the production,” notes Brodersen. “This is achieved using a shared timecode. We also have to make sure that the delay between an action on stage to a finished rendered image is low enough that people can actively interact. This was especially important for the live interview segment where the virtual character was talking to an interviewer and the audience. Most importantly, we had to make it look like the character was actually in the venue. This is done by replicating the physical lighting that’s on the stage in the virtual scene, including using the onstage video feeds to light our virtual character. Now being able to use ray tracing as one of the tools makes that job both easier, faster and with a better result.”
TOP: Unity Technologies’ Natalya Tatarchuk believes that real-time ray tracing will benefit any creative vision where maximal quality of surfaces or lighting is needed. (Image courtesy of Unity Technologies)
Tatarchuk sees game engines rising to the technological challenge. “With the requirements of orchestration of dynamically load-balanced multi-threaded execution in real-time, taking advantage of all available hardware to nanoseconds, modern game engines perform a ridiculous amount of complex calculations. I believe game engines are the right vehicle for all ambitions in real-time experiences and will continue to be, as they are innovating along the right axis for that domain.”
“Our goal was to avoid forcing the user to follow different workflows for raster and ray tracing,” states Cañada. “Users should be able to adopt ray tracing quickly, without investing significant amounts of time learning new methodologies. We are proud of our accomplishments so far, but of course it does not mean we are done here. Maintaining usability while adding new features is a key goal for everyone involved. It is important that the system can accommodate rendering, denoising and ray tracing without crashing. Having great engineering, production and QA teams is the reason all these components work seamlessly. The support from NVIDIA has also been amazing, with incredible graphics hardware, as well as solid DXR specs and drivers for emerging tech.”
Machine learning is integral to the future of real-time ray tracing. “In a way,” Cañada says, “techniques that use reinforcement learning have been popular in graphics for a long time. Some of these techniques, such as DLSS, are already suitable for real-time graphics and others will be soon. Machine learning will become an important component of the real-time rendering engineer’s toolset.” The technology continues to evolve. “Rendering a forest with hundreds of thousands of trees, with millions of leaves moved by the wind, is a major challenge that will require more research on the creation and refit of acceleration structures. Additionally, solving complex light transport phenomena, such as caustics or glossy inter-reflections, will require new and inventive techniques.”
In regards to visual effects, real-time ray tracing can be found in previs and shot blocking. “The biggest area in visual effects is virtual production where people want to converge the offline and real-time render pipelines from as early on as possible,” notes Grohs. “One of the benefits with Lavina is that you’d be able to get a real-time ray-tracer view of your production asset for the most part. The challenge there being that there are tools from a visual effects standpoint that they’re looking for in terms of live motion capture and some other things that we’ve got to do more homework on to get it there.”
There is no limit to how the technology can be applied, according to the experts. “It’s fast enough to be used for the quickest iteration and previz, as an exploration tool, or any part of the virtual cinematography pipeline,” remarks Tatarchuk. “It is of high enough quality that any discerning final frames will be well received if rendered with it. Our job is to deliver the best technology, and then it’s up to the creator of the frame to drive the vision of how to apply it. It’s really been an inspiring journey to see what people all over the world have been able to accomplish with this technology already, and it’s just at the beginning of what we can do!”