VFX Voice - Summer 2020 Issue


LOST IN SPACE • UNDONE • THE PLOT AGAINST AMERICA • 5G ADVANCES • VFX BIDDING • PROFILE: BEN GROSSMANN

VFX STREAMING

VFXVOICE.COM SUMMER 2020




[ EXECUTIVE NOTE ]

Welcome to the Summer issue of VFX Voice! Now, more than ever, we appreciate being a part of this strong, vibrant global community. We take to heart the opportunity we have, through VFX Voice, to keep us all tethered through success stories that celebrate hard-working, visionary visual effects talent across the world.

In this issue, we go deep on the explosion of streaming content, explore the VFX bidding process, and break down the VFX behind HBO’s Westworld and The Plot Against America, CBS All Access’ Star Trek: Picard and Netflix’s Lost in Space. We also get animated, delving into the anime revival of Netflix’s Ghost in the Shell and Amazon’s live-action hybrid Undone. Catch our special profiles of Magnopus Co-founder and CEO Ben Grossmann and VFX Supervisor Jake Morrison, known for his stellar work in the Marvel Cinematic Universe. In our Tech & Tools section, we look at new approaches in TV VFX and the 5G revolution, and in VR/AR/MR Trends we feature Cloudhead Games on the frontiers of gaming and VR. And we continue to shine a light on our regional sections, with a spotlight on VES France in the City of Light.

Thank you again for being a supporter of VFX Voice. We’re proud to be the definitive authority on all things VFX. We continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety. Wishing you well in these uncertain times. Stay safe.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director



SUMMER 2020 • VOL. 4, NO. 3

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER: Jim McCullaugh, publisher@vfxvoice.com
EDITOR: Ed Ochs, editor@vfxvoice.com
CREATIVE: Alpanian Design Group, alan@alpanian.com
ADVERTISING: VFXVoiceAds@gmail.com
SUPERVISOR: Nancy Ward
CONTRIBUTING WRITERS: Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan
ADVISORY COMMITTEE: David Bloom; Andrew Bly; Rob Bredow; Mike Chambers; Neil Corbould, VES; Irena Cronin; Paul Debevec, VES; Debbie Denise; Karen Dufilho; Paul Franklin; David Johnson; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Lori H. Schwartz; Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS: Mike Chambers, Chair; Lisa Cooke, 1st Vice Chair; Emma Clifton Perry, 2nd Vice Chair; Laurie Blavin, Treasurer; Rita Cahill, Secretary
DIRECTORS: Brooke Breton; Kathryn Brillhart; Colin Campbell; Bob Coleman; Dayne Cowan; Kim Davidson; Rose Duignan; Richard Edlund, VES; Bryan Grill; Dennis Hoffman; Pam Hogarth; Jeff Kleiser; Suresh Kondareddy; Kim Lavery, VES; Tim McGovern; Emma Clifton Perry; Scott Ross; Jim Rygiel; Tim Sassoon; Lisa Sepp-Wilson; Katie Stetson; David Tanaka; Richard Winn Taylor II, VES; Cat Thelia; Joe Weidenbach
ALTERNATES: Andrew Bly, Gavin Graham, Charlie Iturriaga, Andres Martinez, Dan Schrecker

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media.
VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2020 The Visual Effects Society. Printed in the U.S.A.



FILM

A DAY IN THE LIFE OF MARVEL VFX SUPERVISOR JAKE MORRISON By TREVOR HOGG

TOP: Jake Morrison, VFX Supervisor, Marvel Studios


As with any member of a film crew, the core responsibility of a visual effects supervisor is to facilitate and improve the cinematic vision of the director. The daily journey to achieve this goal varies depending on whether it is pre-production, principal photography or post-production. “I liken the whole thing to a triathlon more than anything else in the sense that you have three distinct sections of the movie and each one is completely different,” notes Marvel Studios VFX Supervisor Jake Morrison (Thor: Ragnarok) while in pre-production for Thor: Love and Thunder, which reunites him with the God of Thunder (Chris Hemsworth) and filmmaker Taika Waititi (Jojo Rabbit). “The thing that is probably most misunderstood is that in pre-production you don’t end up talking to any other visual effects people because there aren’t any. There’s only me and my producer.”

“Pre-production is the blue sky period where no idea is a bad one and everything could be the best thing that has ever been done,” states Morrison. “You do a mental post mortem of every movie that you’ve ever worked on or seen. The interesting thing about the visual effects supervisor’s job at this stage of design is you’re focused more on building set pieces and the cool moments. A lot of it is visually driven but not all of it. You’re thinking, ‘If we did this in the first act and set it up visually then we can pay it off in the second act battle because the audience will be educated enough to get the next level of the gag.’ You can effectively build an ‘aha moment’ into it.”

During pre-production and principal photography, a close relationship is developed with the production designer and cinematographer to establish a visual language for the project. “It becomes an open discussion. ‘What are we actually going to have there, what is it going to look like ultimately, and what tools do we need to bring to play to be able to create the best and most cost-effective way to do the film.’” Generally, Morrison spends two years on a project. “You build a family and develop a shorthand, especially if you work with the same people again. You have to build a trust level because the production designer needs to know that I understand what the vision is for the world or environment that we’re going to build. They can only construct so much as a set or backlot piece. The virtual cinematography for most of the environment that the audience is going to see falls on my shoulders. Then in post things change and we’re relying on our visual effects vendors to do a lot of the design and conceptualizing, but the trouble with that is it rarely dovetails as neatly as you want with the original vision of the production designer. Every production designer has a signature look regardless of the medium or time period. You have to be able to glean that and carry it over the finish line, much like if I’m directing second unit for Taika, then I will make sure that I’m using his signature camera moves.”

“The nice thing about art departments now is that they tend to be working in 3D software much more and will do paint overs,” states Morrison. “You’d get a lot of 3D architecture in Maya or Cinema 4D taken up to a certain level and then we would flesh the whole thing out with lighting and textures. I make sure that there is a clear flow of information that comes from the production designer that heads into previs. I’ll also do a previs version of the costume designs. I will start building the lighting model so when the DP gets onboard everybody can see the sort of scene that we are aiming for. You begin to build up these mini-sequences and the cool action beats that you’ve been designing. You start to see the movie actually work.”

TOP: Morrison with director Taika Waititi and crew while shooting a scene for Thor: Ragnarok. (Photo: Jasin Boland. Copyright © 2016 Marvel Studios)
BOTTOM: Hulk (Mark Ruffalo) battles Surtur (voiced by Clancy Brown) in Thor: Ragnarok. (Image copyright © 2017 Marvel Studios)


“I make sure that there is a clear flow of information that comes from the production designer that heads into previs. I’ll also do a previs version of the costume designs. I will start building the lighting model so when the DP gets onboard everybody can see the sort of scene that we are aiming for. You begin to build up these mini-sequences and the cool action beats that you’ve been designing. You start to see the movie actually work.” —Jake Morrison, Visual Effects Supervisor, Marvel Studios

TOP: An aerial shot of Asgard from Thor: Ragnarok. (Image copyright © 2017 Marvel Studios)
MIDDLE: Asgard as it appears in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)
BOTTOM: A signature action sequence in Thor: Ragnarok is a gladiator battle between Hulk (Mark Ruffalo) and Thor. (Image copyright © 2017 Marvel Studios)


“At some point in the pre-production phase the first AD comes on board. The previs is an immensely powerful tool for them, as they can slice it into pieces to work out how long it will take to shoot the stuff, look at the availability of the actors, and put together a proper shooting schedule.”

A critical partnership is the one between the visual effects supervisor and visual effects producer. “My producer needs to know that I’m going to be responsible,” remarks Morrison. “I’m not going to ask for a large number of expensive things that have no payoff. Likewise, I have to believe that my producer trusts me enough that when I am asking for something expensive it’s something that I need. The budget is the only budget. You look at how much you can afford to do and then make sure that you do the best version of each of those.”

The second unit visual effects supervisor is brought in later on in pre-production. “You want them to have enough time to understand the flavor of the film. Performance is everything. You have to put on your future goggles, look at the scene as the finished thing with the CG character and ask, ‘Do I believe the actor’s performance against this thing that isn’t there?’”

Once principal photography takes place, the majority of the time is spent away from the production office in Los Angeles. “You live on set,” remarks Morrison. “We get a trailer for conference calls or cineSync sessions with vendors. I’m there from call to wrap.” A mobile office is set up in the video village with a laptop connected to the Internet. “There are monitors that show you the feed from each of the cameras. It’s a matter of keeping an eye on everything, and making sure that things are being shot in the way you want them to be. If there’s something questionable I’ll politely knock on the door of the DIT [digital imaging technician].”

During production, bluescreen is favored over greenscreen. “If you get a little bit of blue spill and you’re shooting an exterior it’s actually forgiving because the sky component is in the blue register. But there are times when I will go white. The lighting is more important than the edges because you can never recover from bad lighting. Often what I will do on a large bluescreen sequence is go through and do a neutralizing pass, which takes the essence of what I know the director of photography wanted.”

“If you know your cast upfront, and that there are going to be well-presented digital doubles that hold up for relative closeups, you try as hard as you can to get LightStage and Medusa captures in pre-production,” notes Morrison. “As of right now the only place you can get LightStage scans is in Los Angeles, so at some point you’re going to have to get the actors out to you. The male actors need to be clean shaven for that, otherwise the data is useless. The LightStage captures all of the reflective values of the skin, which allows us to get a good quality render of the actor in any lighting environment. But then the facial performances, like actual acting, and the physical detail of the face are captured with a rig called the Medusa that creates high resolution 3D meshes. For the overall scanning of the body and costumes, we hire a company that will be there for the entire principal photography. The same company will scan every single prop doing before and afters. We also LiDAR scan every set build and then do a full neutral texture pass so we can reconstruct any of them in post.”

As actors are cast for roles, so are visual effects vendors according to their specialties.

TOP TO BOTTOM: Thor (Chris Hemsworth) and Hulk (Mark Ruffalo) spend some quality time together in Thor: Ragnarok. (Image copyright © 2017 Marvel Studios)
Skurge (Karl Urban) travels across the Bifröst to conquer Asgard in Thor: Ragnarok. (Image copyright © 2017 Marvel Studios)
Heimdall (Idris Elba) stands before a spaceship used by the Dark Elves known as The Harrow in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)
Eir (Alice Krige) attempts to identify what has afflicted Jane Foster (Natalie Portman) in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)


TOP TO BOTTOM: From left: Kurse (Adewale Akinnuoye-Agbaje) and Malekith (Christopher Eccleston) attempt to conquer the Nine Realms in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)
The Harrow on a collision course with Greenwich, England, in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)
Thor (Chris Hemsworth), Loki (Tom Hiddleston) and Jane Foster (Natalie Portman) travel different realms in Thor: The Dark World. (Image copyright © 2013 Marvel Studios)


“When you’re in the design phase, the single most important thing is making sure that you’re thinking of the right vendors for the right type of work because the skillsets are not across the board,” states Morrison. “You’re looking at Weta Digital, ILM and Digital Domain for digital human work. At the same time, there are hundreds of other vendors out there who are great. I used Fin Design + Effects in Sydney on Thor: Ragnarok for the ‘Willy Wonka’ sequence where we shot Chris Hemsworth in a wheelchair that didn’t move on a bluescreen set, with a guy holding a little fan. Fin sketched out this whole two-minute sequence out of absolutely nothing. The bigger houses won’t necessarily be able to afford to do something that is so bespoke and not pipeline. You go to the big houses because they can take the final battle, which can easily be 400 or 500 shots. You need to know that the machine has been created in order to be able to create the characters, environments, lighting and effects, and be able to work in parallel with 1,000 artists.”

Shots are constantly rebid as the photography and edit evolve. “You build this shot list for what you think the movie will be [based on storyboards and the script],” explains Morrison. “For each of the big sequences or any world or creatures, I will do a two-page synopsis with some concept art to explain, ‘This is what the creature does and how it moves. This is what it is made of, and these are its powers. This is what production is going to build, this is what we’re going to get, and this is what you can expect to happen.’ I will send that over to the vendors, and it answers a lot of the questions before we even get on the phone. But then you do the shoot. Some of it happens and some of it doesn’t. If it’s Taika, we’re doing a lot of improv and the entire script can change. The editor gets two weeks followed by 10 weeks of director’s cut. The studio is going to have a look at it and send some notes. We’re doing postvis, filling in the bluescreen and putting in CG characters. It’s a whole other round of the previs into postvis. Only at that stage do you really know what the movie is going to be.”

“When you’re shooting, you’ll often refer to the ‘circle take,’” remarks Morrison. “The script supervisor notes that and it goes to the edit suite. It’s the quickest way for the editor to know what the director thought was the best take. And the further you get through principal photography, there are more and more requests for specific shots or coverage from the editor. It’s quite intuitive. I prefer for the editor to cut the previs because there is a certain amount of ownership to that. Some editors are very cutty while others are much more relaxed and the shots are longer.”

The visual effects editor acts as an intermediary between visual effects and editorial. “They’re usually part of our crew. For a director’s cut, they’ll get rid of bluescreens and put in temporary backgrounds alongside the postvis crew. When I’m doing presentations at the end of the process and showing stuff for final to the director and studio, I’ll show it in context. You never show character animation as a single shot. I’ll show a six or 10 shot run; especially if it’s a funny scene you’ve got to have the context to see if it plays. The visual effects editor is driving that stuff in the back of the visual effects screening room. That relationship doesn’t tend to kick in until just before the shoot.”

It’s like a marathon at the end of post-production. “We had 2,700 visual effects shots on Thor: Ragnarok and finaled 2,500 of them in the last month. In the end there were 18 vendors on Thor: Ragnarok. We would start our day in L.A. with a cineSync and a Polycom. I have a big projection screen hooked up to a Mac, plus a Wacom tablet with a pen. There’s a row for my reviews – my producer is sitting across the room – and then I have a row of coordinators with each one looking after three visual effects vendors. I would start off with Munich followed by London, New York City, Montreal, Vancouver, Los Angeles, Wellington, Sydney, Melbourne and Adelaide. In the beginning I would have reviews with the key vendors on a Tuesday and Thursday, and closer to final delivery I will have reviews with each of the vendors every day.”

“Marvel has built tools to enable me to quickly and easily review shots offline,” Morrison continues. “This means I can leave at a reasonable hour, go home, get some exercise in, have dinner with my wife and then send further notes out. If, for example, I’m working with Weta Digital, which is five hours back from tomorrow, they might send a delivery at the end of the Los Angeles day, but still afternoon in New Zealand. If I can get notes out in a timely manner, an artist can implement them and set off an overnight render, thus saving us a day. I’m either on the phone directly so people can hear me, ask questions and get clarification, or I write the notes with annotated stills because it is important to convey the proper tone.”

In post-production the key relationship is with the editorial department, with some editors wanting the visual effects shots to exactly replicate the postvis while others embrace the potential of a different take. “While I’m sending all of the notes to the vendors and telling them where to make improvements, on the other side it is important to try to get the work that the vendors are delivering, even though it’s not complete, into the cut,” states Morrison. “There’s a great tendency to leave the postvis in there until the shot is done, but you’ve missed a thousand opportunities as you’re not working with the vendors’ animation and rendering. I’ll actively ask the vendors, especially if it’s an animated sequence, to send an alternative version. I will also be suggesting sound design ideas because so much of visual effects is about sound. When I’m putting together edits for previs I will layer on sound effects to make sure it feels like an emotive piece.”

When it comes to deliverables, Morrison works in a special format that falls between 2K and 4K. “I found that to be an immensely fair way of doing it because you’re not asking the vendors to do work that 95% of people will never see. You can also make a nice IMAX print from that as well, as I have already made it a higher resolution. We are always finishing with EDR [Extended Dynamic Range] in the back of our minds. The 2K DCP is graded SDR followed by a trim pass where you do a refit for the levels that can be put on the screen, and finally an aesthetic pass. We will also do a home video grade.”

The value of contributing in the creative and shooting process can never be overestimated, attests Morrison. “Putting the dots in the right order and making the pictures look good are crucial, but you’re there to help with ideas, promote creativity and suggest things that people aren’t thinking of. The visual effects supervisor works for the movie, first and foremost, and we’re the ones who have to carry that final product over the finish line, look the director in the eye and proudly say, ‘This is absolutely the best version of the movie we could have possibly done.’”

TOP TO BOTTOM: Microphotography of Yellowjacket/Darren Cross (Corey Stoll) in Ant-Man. (Image copyright © 2015 Marvel Studios)
Valkyrie (Tessa Thompson) attempts to defeat and capture Hela in Thor: Ragnarok. (Image copyright © 2017 Marvel Studios)
Ant-Man/Scott Lang (Paul Rudd) riding Antony, with the flying insect given a more appealing aesthetic for Ant-Man. (Image copyright © 2015 Marvel Studios)



TV/STREAMING

TURNING BACK TIME IN THE PLOT AGAINST AMERICA By TREVOR HOGG

All images courtesy of HBO.
TOP: There was not a lot of architecture from the 1940s still existing in New York and New Jersey.
OPPOSITE TOP: Air conditioners, satellite dishes and modern traffic lights needed to be painted out of live-action plates.


Upon learning that some radical Republican senators wanted famous aviator Charles Lindbergh to run against Franklin D. Roosevelt for the presidency of the United States, novelist Philip Roth wrote The Plot Against America, a revisionist history of what life would have been like for his Jewish family living under the White House leadership of an international celebrity with anti-Semitic views and sympathy for the Nazi government. The Wire and Generation Kill creator David Simon and Ed Burns, producer of The Wire, have adapted the drama into a six-part limited series for HBO with a cast that features Zoe Kazan, Morgan Spector, Winona Ryder, John Turturro, Ben Cole, Anthony Boyle, Azhy Robertson and Caleb Malis.

Overseeing the digital transformation of present-day locations were Visual Effects Supervisor Jim Rider (Fahrenheit 451) and Post-Production Producer Claire Shanley (The Deuce), with Nina K. Noble (Treme) serving as an executive producer. “The main difference when we started to put this project together was the challenge of shooting in New York and New Jersey for a 1940s period piece,” notes Noble. “There is not a lot of architecture from that period still existing and that meant increased travel time between practical locations and the necessity to use visual effects more generously than on previous projects.” The novel provided extensive information about the characters and their environments. “Additionally, we reviewed photos and interviewed local residents about the Weequahic neighborhood of Newark, New Jersey, in particular.”

Rider and his team used period maps to build Lower Manhattan for the Lindbergh flight. Storyboards and previs were created for when Charles Lindbergh (Ben Cole) flies the Spirit of St. Louis over New York City while on a campaign tour and lands at Newark Airport. “The storyboards and previs helped us to figure out which shots we needed to get and the best strategy to shoot them,” explains Rider. “Greenscreen was wrapped around a replica cockpit of the Spirit of St. Louis, which was on a rigged gimbal that could be rotated within the lighting environment, and we did an aerial helicopter shoot replicating his tour. Even though what we were flying over has changed considerably since the 1940s, what reality we were able to get out of the real photography was invaluable because we could hang our hat on the realities of the city and then make the necessary changes. At the end of the day, we did a complete CG Lower Manhattan. The mandate was to get as many practical elements in camera.

“Whenever possible, we like to create a real environment for our actors to inhabit, so our general approach is to build practical sets on location 12 feet or so high and leave the rest to post. There were period elements like airplanes that must be created, but also decisions had to be made regarding period cleanup and CG crowds. We always have to weigh the time and labor required to accomplish these things practically versus the cost and impact to the post schedule if we do it later, but first understanding what the results will be in each scenario. Reality and authenticity are the goal first and foremost, and then we consider the financial efficiency of each alternative.”

It was not a difficult decision selecting the visual effects supervisor for The Plot Against America. “We had been working with Jim Rider since the early days of The Deuce, our HBO series set in the 1970s and ’80s,” explains Noble. “He understands what’s important to us and has the expertise to be able to offer different ways to tackle the practical visual challenges we faced on The Plot Against America. Our main visual effects vendor, Phosphene, has been an enthusiastic partner since Treme, which was set in post-Katrina New Orleans and was done without a full-time visual effects supervisor. We did not have all of the scripts when we started prep on Plot, so the budget was an estimate based on Jim’s experience on The Deuce, combined with the story The Plot Against America by Philip Roth. We assumed that the amount of cleanup [removal of air conditioners and satellite dishes] would be similar. Episode 2 turned out to be the most expensive episode for visual effects, since we were depicting Lindbergh’s flight around New York City and then landing at a 1940s Newark airport with hundreds of spectators.”

Old movie theater newsreels are used as a means to anchor the story for viewers. “That was a strategic move [on the part of David Simon] as it reminds us of how people got news in the 1940s and what visuals they saw at the time,” remarks Shanley. “We had the challenge of integrating production footage of our Lindbergh (Ben Cole) with archival newsreel footage. It is presented in the visual idiom of the time. What would we see of a political candidate in 1940? How would they be photographed? What sort of voiceover would that have? What kind of music would be playing? How would that be cut together? What story elements is that giving us as well?”


Archive footage can be costly and not easily accessible. “We have an incredible archival researcher who has been able to identify footage that was unique and accurate,” explains Shanley about referencing old footage formats that have long since disappeared. “The newsreels that we’re working with are from the period. Some of the film masters were scanned once to a format that would terrify you! Other times there were ones we could work from, which was a huge gift. We couldn’t remove every pixel of dust and scratch because, in reality, those reels were shown many times a day for stretches of time. There was never a pristine print by the time it got to the theater where Shepsie Tirchwell (Michael Kostroff) is a projectionist. It also had footage that was shot in a war zone, so the negative would reflect that as well.”

Across the entire series there were around 800 visual effects shots, ranging from 80 to 200 per episode, created by Phosphene, LVLY VFX and The Molecule. “It was a case of matching skills to our needs,” observes Shanley. “We had an incredible experience working with our primary vendor, Phosphene, on The Deuce. When you’re trying to get a lot done in a short period of time, that pre-established understanding of the aesthetic priorities is important. Phosphene was responsible for asset-heavy work. We also worked with LVLY VFX on The Deuce, and they have an expertise in texture mapping and cleanup. The Molecule was our third vendor.”

The post-production schedule overlapped with principal photography and began in earnest at the beginning of November 2019, concluding in early March 2020. The tight schedule did not allow for extensive greenscreen as the production crew had to quickly get in and out of locations. “For general location work there would be no greenscreen,” states Rider. “It was straight rotoscoping and tracking. Where possible we would shoot plates for additional cars. We had to recreate a number of large crowds, with one being at Madison Square Garden and another in Manhattan for a big funeral scene.”

“We also did a large greenscreen shoot at an airport on Long Island,” reveals Rider. “The 1940s terminal at Newark Airport still exists, but it wasn’t feasible to film there, so we completely created that building in CG. We had crowds and principal actors on greenscreen. We shot a number of crowd tiles for our ground-level work, but for aerial shots you’re seeing digital doubles. We wanted to show that thousands of people are coming to see Charles Lindbergh.”

OPPOSITE TOP: The Newsreel Theater where the character of Shepsie Tirchwell (Michael Kostroff) is a projectionist.
OPPOSITE BOTTOM: The marquee signage about the presidential campaign of Charles Lindbergh is added digitally.
TOP: Greenscreen was wrapped around a replica cockpit of the Spirit of St. Louis, which was on a rigged gimbal that could be rotated within the lighting environment.
BOTTOM: A CG Lower Manhattan was created as the area has changed considerably since the 1940s.


fast. The idea was that Lindbergh is landing at Newark Airport with the Spirit of St. Louis, so we filmed at a small airstrip in Upstate New York where they have a perfect replica of the plane. We brought our Lindbergh actor, so when you see the final product it is a combination of a complete CG environment and elements that were shot at Long Island and Upstate New York.” Editorial had a significant role in the development and integration of the visual effects, with Joe Hobeck (Homeland) cutting scenes in Washington, D.C. and Madison Square Garden while Brian Kates (The Marvelous Mrs. Maisel) assembled the Lindbergh flyover sequence. “Brian cut the sequence taking pieces of Lindbergh and potential pieces of the background to map out the story in a way that flowed editorially,” notes Rider. “I would see that we were using a piece of foreground of our


Lindbergh greenscreen where the sun appears to be coming from the wrong direction,” adds Rider, “and I’d say, ‘I might be able to find another section that tells the story that you want to tell that hangs together better visually.’ I would take what he had edited and create new temps that had background and foreground more in sync visually in terms of lighting and angles. I would give them back to editorial to put into the cut. We did that a number of times.” There was a constant back and forth between visual effects and editorial. “As we would refine a cut, depending on the shot, we would work out in either After Effects or Avid temp to give a sketch of what we intended the work to be,” remarks Shanley. “Enough was done practically that we weren’t caught in a situation where we needed a build to come in from visual effects.” Simon and Noble prefer to capture practical elements in-camera

as often as possible. “David and Nina brought in enough background and stunt people to completely photograph the riot,” remarks Rider. “The biggest visual effects component was general cleanup of the area.” Buildings, street lights and store signage needed to be altered to reflect the 1940s. “Even a location where you would think there isn’t any visual effects work very often has a complete ground and road replacement.” The mountaintop scene in Episode 6 was shot day for night. “What that enabled our DP Martin Ahlgren (House of Cards) to do is shoot it in such a way that we could get a proper exposure for the foreground and the valley around us and still have it read photographically and beautifully. The sunshine was played as moonlight. The sky was completely bright, so we had to do a full sky replacement.” A limb amputation was done in CG. “One of our characters

OPPOSITE TOP: Madison Square Garden from the 1940s needed to be recreated for a presidential campaign rally for Charles Lindbergh. OPPOSITE BOTTOM: Extras were shot against greenscreen to produce crowd tiles. TOP: A small number of extras are part of a scene that takes place at Madison Square Garden. BOTTOM: A massive crowd is replicated to fill the frame.
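The crowd replication mentioned in the caption above – and in Rider’s description of greenscreen crowd tiles for ground-level work – comes down to scattering copies of a pre-keyed element across the plate. A bare-bones sketch follows, with file names, counts and placement values invented purely for illustration; real tiling is done on tracked geometry, with per-copy grading and digital doubles for aerial views.

```python
# Toy crowd-tile replication: scatter randomly flipped/scaled copies of a
# pre-keyed RGBA crowd element over a background plate. Everything here
# (file names, 40 copies, the placement band) is a made-up example.
import cv2
import numpy as np

plate = cv2.imread("garden_plate.png").astype(np.float32)                      # BGR background plate
tile = cv2.imread("crowd_tile.png", cv2.IMREAD_UNCHANGED).astype(np.float32)   # BGRA keyed element

rng = np.random.default_rng(7)
for _ in range(40):
    elem = tile[:, ::-1] if rng.random() < 0.5 else tile   # random horizontal flip for variety
    scale = rng.uniform(0.6, 1.0)                          # vary apparent distance
    elem = cv2.resize(elem, None, fx=scale, fy=scale)
    eh, ew = elem.shape[:2]
    y = int(rng.integers(plate.shape[0] // 2, plate.shape[0] - eh))  # keep copies on the lower band
    x = int(rng.integers(0, plate.shape[1] - ew))
    alpha = elem[..., 3:4] / 255.0                         # per-pixel matte of the element
    roi = plate[y:y + eh, x:x + ew]
    plate[y:y + eh, x:x + ew] = elem[..., :3] * alpha + roi * (1.0 - alpha)

cv2.imwrite("garden_crowd_comp.png", plate.astype(np.uint8))
```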


TOP: Azhy Robertson, Zoe Kazan, Morgan Spector and Winona Ryder star in the HBO miniseries The Plot Against America. (Photo: Michele K. Short) MIDDLE: John Turturro portrays Rabbi Lionel Bengelsdorf, a supporter of Charles Lindbergh. (Photo: Michele K. Short) BOTTOM, LEFT TO RIGHT: Nina K. Noble, Executive Producer; Claire Shanley, Post-Production Producer; Jim Rider, Visual Effects Supervisor

is missing a leg, and I worked closely with the makeup department. That was done with a prosthetic stump below the knee of the actor, and then below that his shin and foot were wrapped in a custom-made greenscreen sock,” reveals Rider. “We also did a full 3D body scan of him because there are other places where his greenscreen leg is crossing with his existing leg. In a number of cases we had to create a CG intact leg.” On-screen smoke and the fire in Episode 6 were provided by Special Effects Supervisor Doug Coleman (The Knick). “I’ve known Doug from a number of projects, and we were often on set together. He was instrumental in creating a hydraulic motion gimbal for the Spirit of St. Louis cockpit that enabled it to bank and roll in a realistic way. Doug also produced a fire rig to simulate a burning building, and we were prepared to do visual effects enhancements as needed, but it ended up not being necessary because what he had created looked so good on camera.” A logistical challenge was conducting principal photography at the Lincoln Memorial Reflecting Pool. “It’s a location that is always open to the public,” explains Rider. “We had to eliminate and replace non-period people. When you’re inside the Lincoln Memorial there are strict rules about filming. You are only allowed to have five people in there including the camera operator at any one time, which meant we couldn’t even shoot our whole family together. Rigged camera equipment is not permitted, so it had to be handheld with Steadicam. We would shoot a scene with two of our main actors and they would literally step out. We would keep rolling and the children would step in where they should be. Visual effects had to stitch that together later as well as taking out and replacing the non-period people with period ones shot at another time.” “I’m proud of the work on the whole show,” states Rider. “Due to some location constraints we ended up shooting a street in Paterson, New Jersey to double as West 24th Street in Manhattan. Forty percent of what we’re seeing was worthy of being on camera, and the rest of it was inappropriate for 1940 and certainly didn’t look like New York City. We recreated the entire view of West 24th Street looking towards Bryant Park with the Sun Life Building beyond it. Even though it is a relatively short shot of the show, I’m proud of how that came out. “My favorite is the entire Lindbergh flyover sequence, as Phosphene did an amazing job of creating the 1940s aerial view of New York City. There were so many components involved and so much planning required that to see it completed was satisfying.” Success was achieved by adopting a different approach to how the various departments interacted with one another. “There has traditionally been a separation of production and post, which is not helpful in making the best creative and financial decisions,” observes Noble. “Encouraging and facilitating collaboration during all phases has helped us achieve high-quality results on ambitious projects like The Plot Against America.”

PROFILE

BEN GROSSMANN: OSCAR-WINNING VFX VETERAN EMBRACES IMMERSIVE FUTURE By TREVOR HOGG

Images courtesy of Ben Grossmann except where noted.


Even though he has won a pair of VES Awards and an Oscar, the pathway to Hollywood was not an obvious one for Magnopus Co-founder and CEO Ben Grossmann, who, shortly after being born in Washington, D.C., moved to Alaska, where reading books was his primary entertainment. “I lived in a small town called Delta Junction,” recalls Grossmann. “If you looked in any direction there was nothing for at least 100 miles. I lived in a log cabin in the woods, surrounded by nature and fighting the cold. For most of my childhood we didn’t even have a television as there were no TV channels. I decided to buy a television after getting a job at a TV station where I was a cameraman, editor and the evening news weatherman.” Going to the cinema was not a major event growing up. “I can count on both hands the number of times I went to a movie theater when I was a kid.” An interest in the film industry developed while Grossmann was working as a photojournalist for the Associated Press and studying political science and international relations at the University of Alaska, Fairbanks. “There’s a scene in Wag the Dog where it’s decided that visual effects are needed to create this farce of a war that will distract the population from the president’s misdeed and get him re-elected. They go into this room filled with Hollywood producers and computer technicians, and start creating this visual effects shot in real-time of a child running across a bridge in war-torn Albania. The actress is given a bag of chips that is going to be turned into a kitten. I was stunned

and thought, ‘Forget politics. I’m going to Hollywood to change the world by other means.’ I threw all of my stuff in my car and drove straight to California. Turns out visual effects wasn’t like that at all.” While the computer-savvy Grossmann was working as a temp receptionist at a Hollywood talent agency, an agent connected him with Bob Coleman, the President and CEO of Digital Artists Agency. “I got a job working at Digital Artists Agency formatting the résumés of visual effects artists, encoding their demo reels and putting their stuff on websites. What that did was give me the ability to see what skills the successful visual effects artists had and which ones were in most demand.” A job opportunity arose for a roto paint artist. “I went into the interview and said, ‘I’m not qualified for this job, but I’ll tell you this. If I’m not delivering work

OPPOSITE TOP LEFT: Ben Grossmann, Co-founder and CEO, Magnopus. OPPOSITE TOP RIGHT: Grossmann and his wife, Ariane Rosier, get acquainted with the Oscar for Best Visual Effects awarded to Hugo, at the Governor’s Ball in Hollywood in 2012. (Photo: Valerie Macon/AFP/Getty Images) TOP: Alex Henning, Robert Legato, ASC and Ben Grossmann with their Oscars for Hugo in 2012.


TOP LEFT: A storm sequence is shot against bluescreen for Shutter Island (2010). (Image courtesy of Paramount Pictures) MIDDLE LEFT: Shooting background plates on Shutter Island. BOTTOM LEFT: From left: Alex Henning, Rob Legato, ASC, Karen Murphy and Grossmann at the 10th Annual VES Awards in 2012, where they won the award for Outstanding Supporting Visual Effects in a Feature Motion Picture. (Photo: Brandon Clark) TOP RIGHT: With the train crash miniature on Hugo.


that makes you happy in 30 days, I’ll give you all of your money back. But I guarantee you that I can figure out whatever it is you want me to do and I can deliver it. Just give me a chance.’ They hired me, and that’s how I got into visual effects.” In order to accelerate the learning process, Grossmann sought out the most difficult projects. “Bob had fun finding the most complex, impossible circumstances for me. I would routinely show up for a job that I was unqualified for and spend the whole week pulling all-nighters learning the software. As a result of those experiences I ended up working with some insanely talented people you either did or didn’t want to emulate.” A rare experience was had on Master and Commander: The Far Side of the World when the young roto paint artist was invited to sit next to filmmaker Peter Weir to review his work. “Anytime I went in that screening room he treated me with respect and appreciation, no matter how insignificant my shots were. I felt like I was a part of the movie, so I worked extra hard to make sure that my work was phenomenal. It’s hard to find that acknowledgement from a director when you’re so junior. But it’s like what Lao Tzu said, ‘If you fail to trust the people, you make them untrustworthy.’” “When it comes to role models I think of Volker Engel (Independence Day) and Marc Weigert (White House Down), as I learned so much from them in my early days. Nothing could stop those guys,” remarks Grossmann. “Then I was fortunate to work with Rob Legato [ASC] (Titanic), who is also a director,

cinematographer and editor. Rob does all of these things because they are a means to an end. Whenever anyone tells him, ‘You can’t do that,’ you can rest assured that Rob will be sitting in his basement for the next 36 hours learning everything there is to know about why he can’t do that. The next morning Rob comes out and says, ‘This is how we’re doing it.’ And it works. Rob is not motivated by ego, but devoted to preserving the integrity of the story and executing the vision of the director in the best way possible. That sole ambition makes him one of the most invincible people I have ever worked for.” Eventually, Grossmann became a facility visual effects supervisor at The Syndicate, CafeFX and Pixomondo, with a résumé that includes collaborating with Martin Scorsese on a series of commercial shorts and Shutter Island and Hugo, as well as with J.J. Abrams and Tim Burton on Star Trek Into Darkness and Alice in Wonderland, respectively. “I didn’t have a fervent devotion to cinema, but what I did develop was an obsession for solving hard problems that people said couldn’t be done. As a consequence, a visual effects supervisor is two things: a translator and problem-solver. You have to translate what is inside the director’s head, not what is coming out of their mouth, to the people who have to create the vision. A problem-solver knows how to make effective compromises without sacrificing the greater good. Sometimes you have to play creativity off of technology, the schedule off the budget, and there are occasions when you have to play all of those things off of each other and balance them all out.”

TOP LEFT: With a render farm built at Uncharted Territory in a weekend to make a trailer delivery. TOP RIGHT: Hugo Cabret (Asa Butterfield) and Isabelle (Chloë Grace Moretz) enjoy a view of Paris behind the clock tower at Gare Montparnasse railway station in Hugo. (Image courtesy of Paramount Pictures) MIDDLE RIGHT: Hugo Cabret tries to escape from Inspector Gustave Dasté (Sacha Baron Cohen) in Hugo. (Image courtesy of Paramount Pictures) BOTTOM: Directing a team of Chinese acrobatic performers during a commercial shoot for Six Flags in Hollywood in 2005.


TOP AND MIDDLE: Grossmann worked with J.J. Abrams as Visual Effects Supervisor for Pixomondo on Star Trek Into Darkness (2013), which was nominated for an Oscar and VES Award. It was Grossmann’s last film as a visual effects supervisor before launching Magnopus. (Image courtesy of Paramount Pictures) BOTTOM: Grossmann worked with director Peter Weir on Master and Commander: The Far Side of the World, an experience Grossmann found inspiring. (Photo: Stephen Vaughan. Copyright © 2002 Twentieth Century Fox)


Upon the release of Hugo, Grossmann won an Oscar and VES Award for being part of the visual effects team, and subsequently received Oscar and VES Award nominations for Star Trek Into Darkness. “For Hugo, we needed four times our budget to make the movie that Marty wanted, and we launched a Hail Mary plan for the visual effects schedule, budget and crew while it was being shot. It took artists in a dozen offices around the world at Pixomondo to deliver the vision, but it was close. We had one shot that we worked on for a year and a half and only ever managed to get one render out. We finaled Version 1 of that shot, although Rob Legato comped more snow into it at the DI using a color matte, so I guess that counts as Version 1.5.” Star Trek Into Darkness was an entirely different experience. “That was the last movie I worked on as a visual effects supervisor. I love J.J. Abrams, but not the machinery of what it takes to get a movie like that done. At the end of the day, we finished on budget and did remarkable work that got nominated for an Oscar.” A factory mentality prevails in the visual effects industry, observes Grossmann. “People like process, but that’s usually the death of innovation because every movie and challenge is unique. Visual effects companies are inherently about needing to process things because they’ve got to automate stuff as much as possible in order to survive financially. As a consequence, you have a generation of visual effects supervisors who come up through that system where optimism can be hard to find.” In 2013, Grossmann ventured into the emerging medium of VR and AR by co-founding Magnopus with Alex Henning and Rodrigo Teixeira. “I noticed that my daughter was more interested in playing with interactive things than staring at the television. I thought, ‘Something new has to be created that goes beyond theaters, where each audience member gets to have their own unique perspective and role to play in a movie.’ We started this company to figure out how to get audiences inside of those experiences and get those experiences out into the real world. We’ve done experiments like Pixar’s Coco VR, Alcon’s Blade Runner 2049: The Memory Lab and Mission: ISS with NASA. Magnopus is checking off boxes in the problem process to get to that canvas where movies are a world you can experience from the inside. That’s what motivated our work on The Lion King.” A significant creative and technical partnership has been forged between Magnopus, Visual Effects Supervisor Rob Legato and filmmaker Jon Favreau (Iron Man), which has led to virtual production methods being utilized when making photorealistic adaptations of Disney animated classics The Jungle Book and The Lion King. “It’s hard to find a filmmaker like Jon who is the passion and energy behind a team,” notes Grossmann. “He is fearless about technology but not foolish, which is hard. There are a lot of challenges with technology on a film like The Jungle Book. Because real-time filmmaking and virtual production is such a niche industry it was clear to us on The Jungle Book that no company could make enough money from it to properly invest in it. Jon had to struggle to see the picture while they were shooting and editing, and everybody was doing the best they possibly could to show him what it looked like. All filmmakers face that challenge when you’re trying to shoot in two worlds, physical and digital, at the same time.”

An opportunity arose for Magnopus to further develop the virtual production technology with MPC on The Lion King, which resulted in Grossmann, Legato, cinematographer Caleb Deschanel, ASC (The Right Stuff) and virtual production producer Adrian J. Sciutto (The Amazing Spider-Man) winning the 2020 VES Award for Outstanding Virtual Cinematography in a CG Project. “We wanted to see what would happen if the world was built in a real-time engine. We constructed the tools that everyone expects from live-action filmmaking so they don’t have to learn it, and made hardware which adapted to that,” explains Grossmann. “Our idea was, ‘What if we put the film crew in the movie that’s a prototype for what it would be like to put the audience in there?’ In that sense it was successful because the film crew had a great time. The visual effects work was amazing.” A by-product of the virtual production innovation is that filmmakers will be able to figure out what they want better and present it to the visual effects artists to improve. “More than half of visual effects budgets are wasted on bad communication and people not knowing what they want. I don’t blame the filmmakers because it is hard to understand a shot that is incomplete. We have to fix that.” The next question is how the virtual production technology can be applied to live-action filmmaking. “The idea came from rear-screen projection, but powering the content that’s on the screens with a real-time game engine like we did on The Lion King and going for photorealism,” states Grossmann. “What would the future look like for live-action films if you were essentially shooting them on a Star Trek holodeck? The techniques of real-time film production need to be aligned with consumer technologies, though. When you come up with innovation that you target towards consumers, your market potential is immense. It’s super cool to see people’s faces when you put them into something for the first time. The film crew inside of The Lion King was like, ‘This is awesome! I’m standing on Pride Rock.’”

TOP AND MIDDLE: Grossmann served as Visual Effects Supervisor for CafeFX on Tim Burton’s Alice In Wonderland (2010). (Images courtesy of Walt Disney Studios) BOTTOM: Grossmann, cinematographer Caleb Deschanel, ASC, Rob Legato, ASC and virtual production producer Adrian J. Sciutto (not pictured) received the 2020 VES Award for Outstanding Virtual Cinematography in a CG Project for The Lion King. (Photo: Danny Moloshok and Phil McCarten)

TV/STREAMING

BEYOND THE PARK IN WESTWORLD WITH VFX SUPERVISOR JAY WORTH By IAN FAILES

All images copyright © 2020 HBO except where noted.


After two seasons of HBO’s Westworld set mostly inside the show’s complex Wild West-themed artificial park, the latest season has moved out into the ‘real’ world. We now get to glimpse a whole new futurescape and the complex people – and machines – within it. For the visual effects team on Season 3, in particular, this meant contributing a wealth of new environments, vehicles, robots and action. It also offered up new opportunities for how certain scenes could be imagined. For instance, a number of flight sequences and office environments made use of LED walls and real-time rendering technologies to provide interactive backgrounds during filming. The new locations of Season 3 were in many ways a continuation of the VFX design challenge faced by Visual Effects Supervisor Jay Worth. “Every inch of every frame is designed,” he says. “You can’t really point the camera anywhere in the [Westworld] world without [visual effects], the art department or costume having to touch it. Even when we were back in the park, we had to remove everything in the sky. We’d be up on a hillside and say, ‘Oh, this is great!’ and then all of a sudden you see a random telephone pole in the background. Those times happen a lot where we’re at now this season.” To handle the large visual effects assignment on Season 3 – more than 3,000 shots over eight episodes – Worth realized early on that he needed to concentrate on city environments and robots. “So we said, ‘Which hard-surface company do we want to use and which city environment do we want to use?’ I actually got reels from all

the different vendors that I worked with and did a ‘blind taste test’ with Jonah [series co-creator Jonathan Nolan]. Thankfully, we were on the same page with who we wanted to go with.” In the end, Pixomondo was chosen to carry out environment VFX, with DNEG handling the new robots. Other vendors on the show included CoSA, Important Looking Pirates, RISE FX, Crafty Apes, Deep Water/Yafka and Bot VFX.

VISUALIZING THE OUTSIDE WORLD

OPPOSITE TOP LEFT: Visual Effects Supervisor Jay Worth. (Image courtesy of Jay Worth) OPPOSITE TOP RIGHT: Aaron Paul as Caleb Nichols with the construction robot George. DNEG was responsible for Season 3’s various robots. TOP: An airpod appearing in Episode 1 of Season 3. Pixomondo handled city landscapes this series.

Since the beginning of the series, there were many discussions between Worth and Nolan about how outside the park would appear, Worth says. “We always built everything within the park with a very clear understanding of what the outside world looked like. If you think about it, the park is set up for the uber-rich to come and figure out how to play. So we thought, ‘What does the world look like if the .01% have been able to have their way more than is probably healthy?’”


Inspiration for that look and feel came from locations that tended to already have a futuristic presence, including Singapore and the City of Arts and Sciences in Valencia, Spain. Production filmed in those two cities, and then final scenes would be a mix and match between them and Los Angeles, with visual effects led by Pixomondo carrying out the merger. While there would be several hero environment shots, a large part of the background work also involved readying backgrounds for LED wall projection during ‘airpod’ flying scenes and other sequences. That approach was something that had been attempted, but not fully developed, for previous seasons. “Jonah and I have always been passionate about projection and LED technology,” states Worth. “That goes back to the park map, and just generally moving away from throwing up a bluescreen or greenscreen. With the progress in LED technology, as well as real-time tools like Unreal Engine, we really wanted to embrace that new approach. “We have tests from four years ago,” adds Worth, “where we built our own little supercomputer and had projectors and built a real-time environment. And we put a sensor on the camera and sensors around the room to see how it worked to have it translate to camera. Unfortunately, at the time we just couldn’t quite get it there.”
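The camera-tracked approach Worth describes hinges on continuously re-projecting the background for wherever the tracked camera happens to be. As a rough illustration only – not the show’s actual Unreal Engine pipeline, and with all names and values invented for the example – the core “off-axis” projection math for a flat wall can be sketched like this, following Kooima’s generalized perspective projection:

```python
# Minimal sketch: build the asymmetric frustum that keeps a flat LED wall's
# rendered background in correct perspective for a tracked camera position.
# Units are metres; corner/eye values below are illustrative only.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def offaxis_projection(pa, pb, pc, pe, near, far):
    """pa, pb, pc: lower-left, lower-right, upper-left wall corners (world space).
    pe: tracked camera (eye) position. Returns a combined 4x4 projection matrix."""
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
    vr = normalize(pb - pa)            # wall's right axis
    vu = normalize(pc - pa)            # wall's up axis
    vn = normalize(np.cross(vr, vu))   # wall normal, pointing toward the camera
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                # eye-to-wall distance
    l = np.dot(vr, va) * near / d      # asymmetric frustum extents at the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    P = np.array([[2*near/(r-l), 0, (r+l)/(r-l), 0],
                  [0, 2*near/(t-b), (t+b)/(t-b), 0],
                  [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
                  [0, 0, -1, 0]])
    M = np.eye(4)                      # rotate world into wall space...
    M[:3, :3] = np.vstack([vr, vu, vn])
    T = np.eye(4)                      # ...then move the eye to the origin
    T[:3, 3] = -pe
    return P @ M @ T

# Example: a 10 m x 4 m wall, camera 3 m back and 1 m to the left of center.
proj = offaxis_projection(pa=[-5, 0, 0], pb=[5, 0, 0], pc=[-5, 4, 0],
                          pe=[-1, 1.7, 3], near=0.1, far=500.0)
```

Fed per-frame with the tracked camera position, a matrix like this keeps the rendered city in correct perspective for the lens, which is what lets the parallax, interactive light and reflections hold up in-camera.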

The Westworld production took on more of a partner approach this time around for the LED screens, collaborating with Epic Games (the makers of Unreal Engine), Fuse Technical Group for LED screens, and Profile Studios to handle the on-set calibration and operation of screens and imagery. The screens first came into use for the airpod flying shots. Production designer Howard Cummings’ art department produced the sleek airpods, designed by concept designer Thang Lee. Special effects contributed a gimbal rig that allowed the actors to be filmed inside the airpods and go through banking and other moves. This was filmed against a large semi-circle LED wall, onto which were projected augmented helicopter plates of Los Angeles. “It meant we were really able to put the camera wherever we wanted and get real-time playback, real interactive light and real reflections – all those things that make it really feel like these people are flying,” says Worth. “In fact, we basically built a mini-amusement park ride. The actors are in this world and they’re looking around and they really feel like they’re flying through the city.” While extra reflections were composited into the windows, along with additions to the practical water droplets, most of the views outside were achieved in-camera. This particular setup did

OPPOSITE TOP: On Season 3, LED screens and real-time rendered backgrounds were utilized for flying scenes, allowing VFX shots to be captured in-camera. OPPOSITE BOTTOM: Westworld co-creator Jonathan Nolan with Lena Waithe, who plays Ash in Season 3. TOP: Existing scenes of Los Angeles were augmented digitally to include new buildings, as well as the flying airpod. BOTTOM: An airpod flies through neo-Los Angeles in an Episode 1 VFX shot crafted by Pixomondo.


not require real-time rendering, but other scenes did utilize that approach. “There’s a sequence in Episode 3 with Charlotte Hale (Tessa Thompson) in her office at Delos,” advises Worth, “where we built the entire asset in Unreal. All of those shots are with the massive 150-foot LED wall outside a piece of glass, so we had to test what shooting through glass would involve, making sure that there were no aberrations and that it translated correctly to camera.”

ROBOTS RISING

Westworld has already seen its fair share of robots for the park’s ‘hosts,’ although of course those are of the kind that look almost like humans. In the ‘real’ world, robots were envisaged as metallic-surfaced versions, i.e. more like the classic robots we might come to think of. “One of the challenges we had on the show was actually creating those new robots who are standing right next to these humanoid robots that we’ve designed that obviously are so much more human than anything we’ve ever seen,” identifies Worth. “It was somewhat of a weird dichotomy – how to create a robot that you’ve not quite seen before but still feels like a more utilitarian robot that’s still next to one of these robots that’s so human and lifelike. It was a fun challenge.” To help bring the robots to life, the production at first considered whether to build them practically, but ultimately went down a motion capture route. “We got our suits and got our motion capture system set up for being on construction sites and different places and got really good performers to be these robots for us. It ended up being a process where we used some of the motion data acquired on the shoots, some hand-animation, and then DNEG’s team did some separate capture during post. “We took a lot of what the performers did,” continues Worth. “Then there’s definitely times when it’s hand animation, and there’s definitely times where DNEG and their crew jumped right in and did some additional motion capture for us. I’m really happy with how they all turned out.”

TOP: Bernard (Jeffrey Wright) stands with a drone host in a scene from Episode 2 of Season 3. MIDDLE: LED screens became a larger part of the filmmaking process for Season 3. BOTTOM: Architecture at the City of Arts and Sciences in Valencia, Spain informed final shots.
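The robot performances described above ended up as a mix of on-set motion capture, hand-keyed animation and additional capture in post. As a purely illustrative sketch – the clip data, weighting and names are invented, and this is not DNEG’s pipeline – a per-joint blend between a captured take and an animator’s override could look like this:

```python
# Minimal sketch of blending captured and hand-keyed joint animation:
# per frame, one joint's rotation is interpolated between the mocap take
# and the animator's override by a weight curve. Quaternions are xyzw.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp   # quaternion-safe interpolation

def blend_clips(mocap_quats, keyed_quats, weights):
    """mocap_quats, keyed_quats: (frames, 4) xyzw quaternions for one joint.
    weights: (frames,) with 0 = pure mocap, 1 = pure hand animation."""
    out = []
    for q_mo, q_key, w in zip(mocap_quats, keyed_quats, weights):
        slerp = Slerp([0.0, 1.0], Rotation.from_quat([q_mo, q_key]))
        out.append(slerp([w]).as_quat()[0])
    return np.array(out)

# Example: ease a joint from the performer's capture into a hand-keyed pose
# over 24 frames (values are made up for illustration).
frames = 24
mocap = Rotation.from_euler("xyz", [[0, 0, 4 * f] for f in range(frames)], degrees=True).as_quat()
keyed = Rotation.from_euler("xyz", [[0, 0, 90]] * frames, degrees=True).as_quat()
weights = np.linspace(0.0, 1.0, frames) ** 2          # ease-in toward the keyed pose
blended = blend_clips(mocap, keyed, weights)
```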


ADDITIONAL EFFECTS

Westworld is one of those series that might be considered to have a set of principal visual effects (such as the environments and robots), while also including a plethora of other supporting VFX work. For example, says Worth, “every time you see a tablet or a phone, we have to burn it in. We have an amazing graphics team that design all these wonderful things. Anytime you see a screen, we do try and shoot as many things practically and then we either replace some of it or all of it. We want our screens to be very thin, so the actors tend to be just holding a piece of metal.”

A number of the vehicles in the ‘real’ world were built props, like the airpods, which could be suspended on a crane and filmed coming to the ground, or put on the gimbal as described above. There was also a rideshare vehicle that could actually drive. Then there were augmentations made to live-action vehicles, such as the autonomous motorbike seen this season. “The motorcycle is a fully operational, custom-built electric motorcycle,” explains Worth. “We ended up painting out things when it’s automated. There’s a rider in a black suit on it for the shoot, which we paint out. We also did a couple of modifications to remove a chain that felt a little too antiquated, and did a little bit of touch-up work. But it’s a real thing there in the frame.”
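The screen ‘burn-ins’ Worth mentions are, at their core, a tracked corner-pin: a flat graphic is warped onto the four tracked corners of the prop in every frame. The fragment below shows the underlying projective (homography) math in Python with NumPy; it is a generic sketch with made-up corner coordinates, not the show’s actual compositing setup.

```python
# Minimal sketch of the corner-pin math behind screen 'burn-ins': solve for the
# 3x3 homography that maps a graphic's unit square onto four tracked corners,
# then use it to warp any point of the graphic into the plate. Corner values
# here are invented for illustration.
import numpy as np

def homography_from_corners(dst):
    """dst: four (x, y) plate-space corners for the graphic's unit square,
    ordered (0,0), (1,0), (1,1), (0,1). Returns the 3x3 homography."""
    src = [(0, 0), (1, 0), (1, 1), (0, 1)]
    A, b = [], []
    for (sx, sy), (dx, dy) in zip(src, dst):
        A.append([sx, sy, 1, 0, 0, 0, -dx * sx, -dx * sy]); b.append(dx)
        A.append([0, 0, 0, sx, sy, 1, -dy * sx, -dy * sy]); b.append(dy)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Map a point from graphic space into plate space."""
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]

# Invented tracked corners of a handheld 'piece of metal' in a 1920x1080 plate.
corners = [(812, 400), (1105, 415), (1098, 587), (806, 570)]
H = homography_from_corners(corners)
print(warp_point(H, 0.5, 0.5))   # where the center of the graphic lands
```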


Westworld: A Pioneer in VFX

Westworld on film and in TV is uniquely placed, both as a game-changer in VFX and as a major adopter of the latest filmmaking techniques to help tell its future dystopian tales.

Westworld and Futureworld: The original 1973 film is regarded as the first feature to rely on digital image processing. Shots by Information International, Inc. utilized digitally processed, pixelated versions of real photography to simulate an android POV. The 1976 follow-up, Futureworld, employed one of the world’s first 3D animations – a CG hand, rendered in 1972 by Ed Catmull, VES and Fred Parke. It, along with a polygonal CG head, played on monitors in a scene.

Digital de-aging: A large trend in the VFX industry has been the aging and de-aging of actors, and Westworld jumped on board in the first season of the show to feature park founder and creative director Robert Ford (Anthony Hopkins) as a younger man. This work was achieved with an entirely CG face crafted by Important Looking Pirates, using a cyberscan of Hopkins as a base.


LED walls: This third season of Westworld has taken advantage of perhaps the latest big trend in VFX, a virtual production process for realizing ‘in-camera’ visual effects, notably by filming actors on sets in front of large LED screens. Pre-built environments could be played back on those screens and, where necessary, synchronized with a moving camera using real-time rendering techniques.

THE WESTWORLD ‘LOOK’

Westworld is shot on film, which, Worth suggests, helps give the show its distinctive look. It also brings its own set of challenges for the visual effects crews. “The technologies that we utilize now, they just lean so far into digital,” he notes. “Simple things like pulling keys can get a lot more challenging on film than they can on digital. And of course we have to go through film scanning and all that kind of thing.”

Still, film provided the show’s creators with a surprising benefit in terms of the LED wall sequences, which many thought should be tackled with digital cameras. “But,” recalls Worth, “Jonah felt that doing those scenes on film was just going to actually end up looking better, that it was going to get a softness and a cinematic quality that was really beautiful. And it turned out to be right.”

“We shot tests on both and it’s still shocking to me how good the LED walls look on film compared to digital. They just end up looking so much more beautiful and cinematic. The same reason why we like shooting on film for our actors and the way you just feel how light looks more beautiful. It really ended up being to our advantage to shoot it all on film.”
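The key-pulling Worth refers to, which grain makes harder from film scans, reduces in its most stripped-down form to deriving a matte from how much the green channel dominates a pixel. The toy example below, in Python with NumPy, shows only that core idea; production keyers are far more sophisticated, and the gains and sample values here are invented.

```python
# Toy green-screen key: matte from green dominance, plus a simple despill.
# This is only the core idea behind 'pulling a key'; real keyers handle grain,
# edges and spill far more carefully. Gains/limits below are invented.
import numpy as np

def pull_key(rgb, gain=3.0):
    """rgb: float image (H, W, 3) in 0-1. Returns (foreground matte, despilled rgb)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    screen = np.clip((g - np.maximum(r, b)) * gain, 0.0, 1.0)   # how 'green screen' each pixel is
    matte = 1.0 - screen                                        # foreground alpha
    despilled = rgb.copy()
    despilled[..., 1] = np.minimum(g, np.maximum(r, b) + 0.02)  # clamp stray green spill
    return matte, despilled

# Invented 2x2 plate: screen green, skin tone, dark hair, spill-contaminated edge.
plate = np.array([[[0.1, 0.8, 0.1], [0.8, 0.6, 0.5]],
                  [[0.2, 0.2, 0.2], [0.3, 0.6, 0.3]]], dtype=float)
alpha, fg = pull_key(plate)
```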

TOP: Exterior of Marina One in Singapore serving as a plate for an explosion shot.
BOTTOM: A central theme in Westworld imagery over three seasons has been the 3D printing of hosts.





TV/STREAMING


STAR TREK: PICARD WARPS THE GAMUT FROM SUBTLE TO SPECTACULAR By KEVIN H. MARTIN

Images courtesy of CBS All Access.
TOP: Among the references for the USS Enterprise NCC-1701-D drawn upon by VFX Supervisor Jason Zimmerman were the huge volume of TV imagery shot at Image-G, plus ILM’s work on the series pilot for Picard and again on the Star Trek Generations feature film.


During its downright rocky first two seasons, Star Trek: The Next Generation was able to only occasionally capture the magic of its legendary but short-lived predecessor, and few could be faulted for thinking history would treat it less kindly. But the direct-to-syndication series – already a ratings juggernaut right out of the gate, featuring lead actor Patrick Stewart as Captain Jean-Luc Picard – saw drastically improved storylines in later seasons. This helped many viewers imprint on it as the definitive vision of Star Trek, set within a near-Utopian 24th Century featuring the powerful but benign United Federation of Planets. While some darker textures emerged in the inevitable NextGen feature films, the much-beloved Picard character and his crew were still viewed as virtuous types.

Perhaps taking a page from the original Star Trek’s writer’s guide – the one indicating its lead character feels “responsibilities strongly and is fully capable of letting the worry and frustration lead him into error” – the makers of Star Trek: Picard have continued down a darker path, one that presents him as an older, somewhat infirm and long-retired Starfleet officer at odds with the powers that be and operating without his usual coterie of stalwarts.

Spearheading Picard’s visual effects was Supervisor Jason Zimmerman, who previously undertook that task for the first two seasons of CBS All Access’ series Star Trek: Discovery and its Short Treks installments.

A multiple Emmy and VES award nominee, and a winner of the latter for Terra Nova, Zimmerman brought a breadth of experience with contemporary action-drama (the Hawaii Five-0 reboot, Mission: Impossible - Ghost Protocol) as well as the fantasy genre from his time at Pixomondo, making him an ideal choice for the new series.

TOP: Civilian Picard (Patrick Stewart) uses one of a series of transporter booths in future San Francisco. Delivering bright materialization/dematerialization effects in broad daylight was just one of the aspects of the production that required greater finesse on the part of the VFX artists.
BOTTOM: Seven of Nine, aka Annika Hansen (Jeri Ryan), an ex-Borg featured in Star Trek: Voyager, now a peacekeeping space ranger.






Picard, however, eschews some of the starship-battle bombast of Discovery in favor of a sometimes more intimate, character-driven storyline. “There was a specific tone planned for the show’s visuals right from the beginning,” states Zimmerman, “and that aesthetic breaks some from what had been seen in past series. We felt justified, given that a lot of time has passed for this character between his last appearance [in 2002]. Making it look more cinematic was an important concern, plus our new dramatic situation informed our choices, with more happening on various planets.”

That tone helped Zimmerman differentiate the new series from Discovery, which for its first two seasons took place in the 23rd Century. “We’re less about the spectacle and more about invisibly integrating elements into the live-action environments,” he states. “There’s a lot of work that goes into removing elements that aren’t supposed to be there and tweaking others, essentially modernizing contemporary aspects when necessary. It’s been fun to take these kinds of challenges on, and it makes for a nice change of pace for us from Discovery, where we had this huge 85-ship battle last year.”

Zimmerman cites the outdoor transporter booths seen on Earth as an example of effects work that doesn’t call attention to itself, but still requires an expert touch. “When you’re out in broad daylight, the flash of dematerialization doesn’t allow for as much of an interactive effect,” he relates. “Going to absolute white on the flash in those already-bright conditions in the frame meant we needed to do a lot of finessing to get a good blend.”

VFX is able to weigh in on possible approaches and solutions very early in the show’s prep period. “We are very involved up front,” he declares, “not just the writers but also the executives. We can help by giving them our read on a given scene, and they in turn ensure we understand exactly what is needed from us. Getting a bead on things prior to shooting means we can offer production an informed opinion. If they determine warp [speed] is made up of blue and white streaks, both we and those shooting live-action can create work that reflects that and dovetails together, so the thing really sings. And there’s always an appetite on our part to work with [production designer] Todd Cherniawsky’s team, so they can start us off on the right foot with a lot of practical elements that VFX can build upon. Same with the cinematography, as we carry on from the lighting done by the DPs [Philip Lanyon and Darran Tiernan] and also with the [VVDFX] prosthetic makeup team.”

The title character is not the only familiar silhouette on Picard. The immense, cube-shaped ship of the implacable Borg – a hybrid form of machine and organic life – is also back. Illusion Arts portrayed the vast interior of the cube in the series’ seminal Borg installment, “Q Who,” then topped themselves by delivering a spectacular pullback reveal of the cube that expanded its scope exponentially at the start of Star Trek: First Contact. Zimmerman chose to revisit this territory with another impressive pullback. “Sometimes we work with camera lockoffs shot by the main unit, and then ‘postage stamp’ what has been shot into a CG realm to make the space look larger,” he explains.


“But in a lot of instances, such as when the characters are on a catwalk with the camera booming up from beneath them to wind up looking down at them, production executes crane moves that can address shifts in perspective and parallax changes. From there, we can continue the move in CG. That’s a matter of making sure there is an ease-in and ease-out from the physical crane move, so when we take over with the digital camera it is as smooth as possible.”

To populate the cube, VFX creates digidoubles, an approach also seen for planetary crowd scenes. “Having more people moving around in the background really seems to help make the scene come alive,” Zimmerman acknowledges. “We’re guided by the lights set by the DP for the actual set. Taking a cue from that work, we try to ripple that same approach into the vaster area seen in the comp. It’s often not a cut-and-dried matter. Again, we are finessing the image. Sometimes it is as simple as adding a bit of halo to the lights to indicate atmosphere in larger spaces, or perhaps some lens flare – though we try not to overdo those – if they’re facing something bright in-frame. We map all of production’s lenses in order to mimic what and how they see, and to get some idea of what it would look like if there were dirt on the lens. Occasionally this requires an additional CG pass, but we do a lot of our finishing touches in the comp phase.”

Another aspect of the main-unit shoot relates to the creation of interactive light effects. “We have to anticipate what might happen after the shooting,” says Zimmerman. “What if, in the edit, something changes the story – but we’re still tied to the flash that was done on set? When we foresee such a possibility, we prefer to not use an on-set interactive. But the DPs definitely help our integration with interactives done right, and with the programmable light effects now possible, the gaffer’s work can be very precise and repeatable, much like a lot of the show’s physical effects gags.”

No Trek series would be complete without spaceships. In the case of Picard, many designs emerge from the production office. “Sometimes we’ll have modelers working with production to create a first-pass look, though they also have their own,” reveals Zimmerman. “From there, we make it camera-ready. For our hero ship, the La Sirena, production started the design process. That ship is smaller and more maneuverable, so that impacts how we animate it. This is well-removed from how the big Federation starships move, which were more like [naval] battleships. Our first animation tests revealed this ship could do banks and more exciting turns.”
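The ease-in and ease-out Zimmerman describes for handing a physical crane move over to a digital camera can be pictured as blending two position curves with a smoothstep weight across the last tracked frames. The sketch below, in Python with NumPy, is a generic illustration with invented values rather than the show’s actual matchmove or layout tooling.

```python
# Sketch of handing off a tracked physical crane move to a CG camera move:
# over a short overlap, blend the two position curves with a smoothstep weight
# so there is an ease-in/ease-out rather than a visible pop. Values invented.
import numpy as np

def smoothstep(t):
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)          # zero slope at both ends of the blend

def blend_camera_paths(tracked, cg, overlap):
    """tracked, cg: (N, 3) per-frame camera positions; overlap: frames to blend
    at the end of the tracked move before the CG path fully takes over."""
    n = len(tracked)
    out = cg.copy()
    for f in range(n):
        w = smoothstep((f - (n - overlap)) / float(overlap))   # 0 -> tracked, 1 -> cg
        out[f] = (1.0 - w) * tracked[f] + w * cg[f]
    return out

# Invented example: a 48-frame boom-up, with the CG extension rising further.
frames = 48
tracked = np.stack([np.zeros(frames), np.linspace(0, 2.0, frames), np.full(frames, 5.0)], axis=1)
cg      = np.stack([np.zeros(frames), np.linspace(0, 3.5, frames), np.full(frames, 5.0)], axis=1)
blended = blend_camera_paths(tracked, cg, overlap=12)
```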

ALL PHOTOS: Without Starfleet resources at his disposal, Picard sets out on his mission with a rag-tag crew on a privateer-style vessel, La Sirena. The vessel is deliberately utilitarian and starkly functional, as opposed to the relatively lavish trappings seen elsewhere in Star Trek’s United Federation of Planets. Instead of traditional instrumentation, the controls are of the ‘air guitar’ variety seen throughout the science fiction genre ever since Minority Report. Joining him are a pair of ex-Starfleet officers: Raffi (Michelle Hurd) and ship’s owner Rios (Santiago Cabrera), whose crew includes a series of holograms bearing his likeness. Others aboard include cybernetics whiz Dr. Jurati (Alison Pill) and Soji (Isa Briones), the surviving member of a pair of twin androids.






TOP: A small craft on a landing pad within the Borg cube. Lighting for the series’ VFX scenes takes its lead from live-action, with ships often illuminated by their specular highlights. Haze and glare are often added as finishing touches during the comp phase. The many smaller craft on Picard demonstrate a wider variety of dynamic maneuvers that would have looked less credible if executed by the bulky starships previously seen in the various series.
MIDDLE: The interior of the Borg cube contains countless drones being assimilated during conquest, each in its own regeneration chamber. Picard showcases advances in VFX technique by delivering an even more ambitious pullback reveal, echoing similar shots from the series Star Trek: The Next Generation and the feature film Star Trek: First Contact.
BOTTOM: Countless Borg drones are ejected out into space. Digidoubles were often used to populate stellar and planetbound vistas. Several VFX vendors, most already familiar with its universe through work on Star Trek: Discovery, handle the wide variety of effects challenges, which range from ship design to particle and character animation.

“The other key aspect for the ships is the lighting,” Zimmerman continues. “If the ship is out in deep space, where is the lighting coming from? And then you have to figure out key and fill for when the ship is near a planet, or in a solar system with multiple suns. There is a huge bulk of real space imagery from NASA in addition to all the previous Trek [incarnations], so what emerged was a kind of amalgamation of these influences. It’s kind of daunting when you consider the amount of reference material amassed with Trek series and movies down through the years – a lot of which still holds up very well. We have to consider how to represent these staple Trek elements in an updated way, using the tools available today.”

One example is Picard’s former command, the USS Enterprise NCC-1701-D, which appears to him in a dream, moving very close to camera in a way somewhat reminiscent of John Knoll and ILM’s work in Star Trek Generations. “Both the Borg cube and the Enterprise are daunting challenges because they are so well loved and well known, but we try to up the ante. We’ve been trying to solve the issues with the cube all season long, and I think we’re getting much better at it with the later episodes. Showing it in relation to La Sirena is a genuine challenge. You have to create depth cues, because there’s nothing [no atmosphere] in space to give you that automatically. And if this thing moves too fast – say, zooming up from five miles away to just 100 yards – the scale can easily get destroyed. We’ve had previs iterations that make it look like a two-inch model a foot from the lens. But that’s all part of the process. We previs everything, so the CG shots go through multiple iterations for speed, scale and maneuverability to nail down all aspects prior to going to full render.”

Zimmerman employs multiple VFX vendors, a Trek tradition dating back to the original series when work was spread among four principal optical houses. “Since we’ve been doing Trek for a few years at this point, we have a shorthand with several vendors we enjoy working with,” Zimmerman affirms. “If a vendor gets burdened with a ton of work on a particular episode, then somebody else we know is reliable can be brought on, which permits a degree of overlap. The division of labor is based on knowing who does the best work, who has the supervisors that are most experienced with a particular kind of effect and which facilities have a workflow and the personnel availability to execute.” Veteran Trek providers include Pixomondo, Crafty Apes and Gentle Giant Studios, while Technicolor VFX, Filmworks/FX and DNEG TV are more recent additions.

To ensure consistency in looks, Zimmerman ensures the vendors all start with CDLs and a spec sheet. “When we review their dailies, this guarantees their shots will match what is coming from editorial on the Avid,” explains Zimmerman. “Preserving our color pipeline from start to finish means there won’t be unwelcome surprises for the execs and the episode directors, as well.”

Musing about the show’s ‘brain trust,’ Zimmerman notes that, “Pretty much everybody working on the show has some kinship with Trek. [Creator and supervising producer] Kirsten Beyer is such an incredible resource for us in this regard that we check in with her weekly to make sure our work is going to pass muster. Executive producers Alex Kurtzman, Akiva Goldsman and Michael Chabon all started as fans, so there’s an expectation to be met on all our parts, and that challenge is also part of the fun of the show.”
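The CDLs Zimmerman distributes to vendors boil down to three per-channel numbers plus a saturation value, applied in a fixed order so every facility’s dailies grade the same way. The snippet below applies a color decision list in that standard order (slope, offset, power, then saturation using Rec. 709 luma weights); it is a generic reference sketch in Python with NumPy, and the sample values are invented rather than taken from the show.

```python
# Generic ASC CDL application: per-channel slope/offset/power, then saturation
# using Rec. 709 luma weights. Sharing these numbers (plus a spec sheet) is what
# keeps shots from different vendors matching editorial. Sample values invented.
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation):
    """rgb: float image (..., 3) in 0-1. Returns the graded image."""
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = (out * np.array([0.2126, 0.7152, 0.0722])).sum(axis=-1, keepdims=True)
    out = luma + saturation * (out - luma)
    return np.clip(out, 0.0, 1.0)

# Invented example grade on a single pixel.
pixel = np.array([[0.40, 0.35, 0.30]])
graded = apply_cdl(pixel,
                   slope=np.array([1.05, 1.00, 0.95]),
                   offset=np.array([0.01, 0.00, -0.01]),
                   power=np.array([1.10, 1.00, 0.95]),
                   saturation=0.9)
```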





ANIMATION

TOP: Illustrator Ilya Kuvshinov was given the responsibility of designing the iconic character of Motoko Kusanagi.
BOTTOM LEFT: Concept art of the Foreigner District in Fukuoka, Japan. Designed by Senju Kobo Co., Ltd.
BOTTOM RIGHT: Concept art of the Foreigner District in Fukuoka, Japan, where Togusa makes his first appearance in the series. Designed by Senju Kobo Co., Ltd.

GHOST IN THE SHELL ANIME REVIVAL FEATURES ‘NEW LOOK’ IN 3D CG By TREVOR HOGG

Images courtesy of Netflix, Production I.G and Sola Digital Arts.
TOP LEFT: 3D Character Supervisor Hiromi Matsushige modifies the 2D character design of Motoko Kusanagi for the CG version.
TOP RIGHT: The final CG render of Motoko Kusanagi as executed by 3D Character Supervisor Hiromi Matsushige.


The Ghost in the Shell franchise, established by manga artist Shirow Masamune, revolves around a counter-cyberterrorist organization called Public Security Section 9 battling criminals and conspiracies in 21st century Japan, where people have cybernetic implants to enhance their capabilities. As part of this franchise, Japanese anime director Kenji Kamiyama created the Stand Alone Complex television series in 2002, which ran for two seasons and spawned a feature film. The project has recently been revived, with Production I.G and Sola Digital Arts partnering with Netflix to produce Ghost in the Shell: SAC_2045, which will consist of two 12-episode seasons. This time around, the disbanded members of Section 9, led by Major Motoko Kusanagi, are a mercenary group known as Ghost that reunites with their former Chief, Lt. Colonel Daisuke Aramaki, in an effort to contain the mysterious global threat of highly evolved beings referred to as ‘Post Humans.’

Kamiyama shares directorial duties with Shinji Aramaki (Appleseed), with the duo responding as one via translated emails, courtesy of Netflix. “Every time Ghost in the Shell is adapted, each director and production team explores a different way to represent the series. This time, however, the concept of creating a series in full 3D CG itself was a completely different approach from the previous projects. While this project was based on the S.A.C. series directed by Kenji Kamiyama and its world, it tells a story that exists parallel to the previous stories in the franchise.

“After we shared our ideas on the project, director Kamiyama took on the series composition at the pre-production stage and would consult with director Aramaki at the episodic screenplay development stage. For the development of character and production design and all stages after storyboarding, we took on the same position, and would carry out checks and proceed together until each part was complete.”






Adopting a realistic approach for the animation was important for Kamiyama and Aramaki. “Conveying the drama of the story to the audience through animation was a top priority. To do so, we utilized motion capture and tried to have life-sized characters act as much as possible. Specifically, we avoided the unnaturally shaped bodies and faces that are common in full CG animation and opted for proportions similar to those of real humans. This is because we wanted to use a style that would allow the audience to concentrate more on the story.”

Russian illustrator Ilya Kuvshinov provides his own interpretation of the Stand Alone Complex characters that were originally designed by Hajime Shimomura. “We instructed Ilya that while Ghost in the Shell is sci-fi, we weren’t looking for out-there clothing designs and asked him to keep within the confines of clothes that actually exist or look like they would exist. Also, after thoroughly explaining one by one the images we had in mind for the characters, we held one-hour meetings every week to check the designs. All the character designs for the entire series eventually came together after more than a year of doing this.”

Combining motion capture and computer animation was a familiar approach for Aramaki. “Aramaki was originally one of the first people in Japan to do a motion capture-based anime. He has been using motion capture for performances in animation since the release of his movie Appleseed in 2004. However, around the world, those who use motion capture in this way are in the minority, with the majority of CG production teams still considering the motion capture technique as just a tool for creating ‘action scenes.’ While Aramaki advocated that ‘it’s theatrical performances, in particular, that should use motion capture,’ he found little support in the industry.”


The reason why Aramaki thought Kamiyama would go for motion capture was that his works contained live-action-esque performances and stories that are developed via the script.

“It’s established that the members of Public Security Section 9 are a polished and professional special unit, so they don’t perform any character-specific actions,” note Kamiyama and Aramaki. “That’s because such professionals all go through the same theory-based, high-level training, so their actions and styles should resemble each other. For the members of Section 9, we aimed for them to perform reasonably streamlined and modest actions. Alternatively, by giving the actions of the enemies [for example, Post Human] unique characteristics, we were able to contrast them with the actions of the Section 9 members and make them look that much better.”

OPPOSITE TOP LEFT: A “hard-boiled” look was given to the character of Togusa.
OPPOSITE MIDDLE: Motoko Kusanagi drives a Tachikoma from the cockpit situated inside the AI robot.
OPPOSITE BOTTOM: Kenji Kamiyama and Shinji Aramaki requested that models be closer to real life than the unrealistic bodies commonly found in cel animation.
TOP: Because Batou is so iconic, illustrator Ilya Kuvshinov did not alter the character design.
BOTTOM: Tachikoma robots that have artificial intelligence are nicknamed ‘think tanks’ and have a childish eagerness to provide military support to Motoko Kusanagi.









A dramatic showdown occurs in Episode 5 on the estate of tech billionaire Patrick Huge that features robotic guard dogs, cyborg housemaids and an armored suit. “For these scenes, we had action coordinator Motoki Kawana devise special action sequences at the motion capture stage. Then for the subsequent animation process, we had our most skilled special animator on the production team take charge. By combining the skills and ideas of both an actor in charge of the action and a special animator, we were able to create some great fight scenes.”

“The use of motion capture allowed for an environment that made it easier than conventional animation production to create blank space durations, so storyboards were often off the mark in regard to the fixed episode length,” state Kamiyama and Aramaki. “Even after reaching the post-motion capture animatics [layout] stage, the average length of an episode was three to four minutes over the fixed episode length. After that, we would cut out any unneeded parts and picture lock the episode before moving on to hand animation. The longest cut we had to make from an episode was eight minutes long.”

Action takes place in the economically devastated cities of Palm Springs, California and Los Angeles. “For Palm Springs, we actually went there. In the Japanese scenes, as Fukuoka is the capital city [in the series], location designs were based on the actual city of Fukuoka. Since Aramaki happens to be from Fukuoka, he was able to realistically recreate downtown Fukuoka [Hakata] without visiting. For example, the area around the bank that appears in Episode 7 was recreated so faithfully that someone from Fukuoka would easily be able to tell what it’s modeled after.”

Among the high-tech weaponry are artificial intelligence ‘think tanks’ and attack helicopters. “Priority was given to making things seem like they could exist in the real world over giving them futuristic designs,” remark Kamiyama and Aramaki. “That’s because we thought full CG would work better with realistic designs. The armored vehicles and the vehicles that the members of Section 9 ride in Los Angeles are based on actual military ones. In contrast, the drone that appears in Episode 2 is a private drone that’s been loaded with Hellfire missiles, so actual civilian drones were referenced for the design.”

Signature elements include the ability to hack into a human mind and thermo-optical camouflage. “The way we showed brain dives and thermo-optical camouflage followed all the Ghost in the Shell series up to this point. That being said, with regard to thermo-optical camouflage, we were able to take advantage of full 3D and succeeded in producing a sense of transparency while maintaining a three-dimensional feel. This was something that was attempted for the 2D series as well, but it was difficult to achieve such a look with drawings.”

The major task for Kamiyama and Aramaki was producing an anime series of 22-minute episodes, backgrounds included, using only 3D CG. “Such scale and quantity of resources were impossible with productions in Japan up to this point. Even when we did it in 2D, we were told that making a Ghost in the Shell series was a ‘reckless endeavor’ because of the number of assets and locations that need to be produced. The story is dense, and there’s a huge amount of information packed in the visuals, so we think you’ll probably wear yourself out if you binge-watch it! But we also hope that everyone is able to enjoy those aspects too.”
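The ‘sense of transparency while maintaining a three-dimensional feel’ that full 3D made possible for the thermo-optical camouflage is commonly achieved, in generic CG terms, by sampling the background through an offset driven by the cloaked surface’s normals and blending a faint hint of the surface back in. The sketch below, in Python with NumPy, illustrates that general screen-space idea with invented data; it is not Sola Digital Arts’ actual shader.

```python
# Generic screen-space 'cloaking' sketch: bend the background lookup by the
# cloaked surface's normals, then blend a faint tint of the surface back in so
# the silhouette still reads as three-dimensional. Data below is invented.
import numpy as np

def cloak(background, normals, alpha, strength=6.0, tint=0.05):
    """background: (H, W, 3) plate; normals: (H, W, 3) surface normals of the
    cloaked object (zero outside it); alpha: (H, W) object coverage."""
    h, w, _ = background.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Offset each covered pixel's lookup along the surface normal's screen axes.
    u = np.clip(xs + normals[..., 0] * strength, 0, w - 1).astype(int)
    v = np.clip(ys + normals[..., 1] * strength, 0, h - 1).astype(int)
    refracted = background[v, u]
    # Keep a faint, normal-shaded hint of the surface so it still feels solid.
    hint = tint * np.clip(normals[..., 2:3], 0.0, 1.0)
    cloaked = refracted * (1.0 - tint) + hint
    return background * (1.0 - alpha[..., None]) + cloaked * alpha[..., None]

# Tiny invented frame: 4x4 plate, with a 2x2 cloaked patch facing the camera.
bg = np.linspace(0.0, 1.0, 4 * 4 * 3).reshape(4, 4, 3)
n = np.zeros((4, 4, 3)); n[1:3, 1:3] = [0.3, 0.1, 0.95]
a = np.zeros((4, 4)); a[1:3, 1:3] = 1.0
out = cloak(bg, n, a)
```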

The Art of Character Design

Illustrator Ilya Kuvshinov is based in Tokyo, Japan, and describes Ghost in the Shell: SAC_2045 as “the most awesome project” he has ever worked on. “In the design meetings we talk about the ‘feel’ a character should invoke in a viewer, and basic movie-watching experience is really helping a lot here because it’s a usual point of reference.” Visual research went beyond the Internet. “For example, when I designed a character in kimono, I bought a book called Kimono Guide for Novices that thoroughly explained how to put it on, which materials are used, and design flow. When designing military costumes, I bought a book just for those. I own a lot of books now!”

When designing the characters for the Netflix series, their physical and cyber abilities needed to be kept in mind. “That’s actually a big thing, because the cyborg-to-human-body ratio really affects the weight of some parts of your body in different movements and postures. Basically, you need more sturdy pants for just sitting around or stronger shoes just to stand. That’s why Motoko has heavy boots in real life but high heels in virtual space.”

No alterations were required for the iconic Batou, while Togusa was given a “hard-boiled” look for the sake of the story. “Motoko’s design logic was the most fun one,” notes Kuvshinov. “She’s a full-body cyborg, so she can actually own any body she wants. This time, I thought about what kind of body does she like for the fast, agile movements.” Getting the proper hairstyle was important for Daisuke Aramaki. “When I did the first sketches for Aramaki, director Kamiyama asked me to make his hair point up, for a more energetic feeling. That edit was so simple but so powerful in terms of how you see the character subconsciously, just basing your feelings in shapes.”

OPPOSITE TOP: It was important to directors Kenji Kamiyama and Shinji Aramaki that the clothing designs be kept within the confines of what actually exists in real life. OPPOSITE BOTTOM: Ghost in the Shell: SAC_2045 has the most assets that Sola Digital Arts has ever needed to be created for a project. TOP: Motoko Kusanagi took the longest to develop, with more attention paid to the finer details such as her eyelashes and the lines of her eyes.

SUMMER 2020 VFXVOICE.COM • 47

5/13/20 6:05 PM


ANIMATION

The Art of Character Design

“Such scale and quantity of resources were impossible with productions in Japan up to this point. Even when we did it in 2D, we were told that making a Ghost in the Shell series was a ‘reckless endeavor’ because of the number of assets and locations that need to be produced. The story is dense, and there’s a huge amount of information packed in the visuals, so we think you’ll probably wear yourself out if you binge-watch it!” —Kenji Kamiyama and Shinji Aramaki, Co-directors


them with the actions of the Section 9 members and make them look that much better.”

A dramatic showdown occurs in Episode 5 on the estate of tech billionaire Patrick Huge that features robotic guard dogs, cyborg housemaids and an armored suit. “For these scenes, we had action coordinator Motoki Kawana devise special action sequences at the motion capture stage. Then for the subsequent animation process, we had our most skilled special animator on the production team take charge. By combining the skills and ideas of both an actor in charge of the action and a special animator, we were able to create some great fight scenes.

“The use of motion capture allowed for an environment that made it easier than conventional animation production to create blank space durations, so storyboards were often off the mark with regard to the fixed episode length,” state Kamiyama and Aramaki. “Even after reaching the post-motion capture animatics [layout] stage, the average length of an episode was three to four minutes over the fixed episode length. After that, we would cut out any unneeded parts and picture lock the episode before moving on to hand animation. The longest cut we had to make from an episode was eight minutes long.”

Action takes place in the economically devastated cities of Palm Springs, California and Los Angeles. “For Palm Springs, we actually went there. In the Japanese scenes, as Fukuoka is the capital city [in the series], location designs were based on the actual city of Fukuoka. Since Aramaki happens to be from Fukuoka, he was able to realistically recreate downtown Fukuoka [Hakata] without visiting. For example, the area around the bank that appears in Episode 7 was recreated so faithfully that someone from Fukuoka would easily be able to tell what it’s modeled after.”

Among the high-tech weaponry are artificial intelligence ‘think tanks’ and attack helicopters. “Priority was given to making things seem like they could exist in the real world over giving them futuristic designs,” remark Kamiyama and Aramaki. “That’s because we thought full CG would work better with realistic designs. The armored vehicles and the vehicles that the members of Section 9 ride in Los Angeles are based on actual military ones. In contrast, the drone that appears in Episode 2 is a private drone that’s been loaded with Hellfire missiles, so actual civilian drones were referenced for the design.”

Signature elements include the ability to hack into a human mind and thermo-optical camouflage. “The way we showed brain dives and thermo-optical camouflage followed all the Ghost in the Shell series up to this point. That being said, with regard to thermo-optical camouflage, we were able to take advantage of full 3D and succeeded in producing a sense of transparency while maintaining a three-dimensional feel. This was something that was attempted for the 2D series as well, but it was difficult to achieve such a look with drawings.”

The major task for Kamiyama and Aramaki was producing an anime series of 22 minutes per episode that included backgrounds using only 3D CG. “Such scale and quantity of resources were impossible with productions in Japan up to this point. Even when we did it in 2D, we were told that making a Ghost in the Shell series was a ‘reckless endeavor’ because of the number of assets and locations that need to be produced. The story is dense, and there’s a huge amount of information packed in the visuals, so we think you’ll probably wear yourself out if you binge-watch it! But we also hope that everyone is able to enjoy those aspects too.”

Illustrator Ilya Kuvshinov is based in Tokyo, Japan, and describes Ghost in the Shell: SAC_2045 as “the most awesome project” he has ever worked on. “In the design meetings we talk about the ‘feel’ a character should invoke in a viewer, and basic movie-watching experience is really helping a lot here because it’s a usual point of reference.” Visual research went beyond the Internet. “For example, when I designed a character in kimono, I bought a book called Kimono Guide for Novices that thoroughly explained how to put it on, which materials are used, and design flow. When designing military costumes, I bought a book just for those. I own a lot of books now!”

When designing the characters for the Netflix series, their physical and cyber abilities needed to be kept in mind. “That’s actually a big thing, because the cyborg-to-human-body ratio really affects the weight of some parts of your body in different movements and postures. Basically, you need more sturdy pants for just sitting around or stronger shoes just to stand. That’s why Motoko has heavy boots in real life but high heels in virtual space.” No alterations were required for the iconic Batou, while Togusa was given a “hard-boiled” look for the sake of the story. “Motoko’s design logic was the most fun one,” notes Kuvshinov. “She’s a full-body cyborg, so she can actually own any body she wants. This time, I thought about what kind of body she would like for fast, agile movements.”

Getting the proper hairstyle was important for Daisuke Aramaki. “When I did the first sketches for Aramaki, director Kamiyama asked me to make his hair point up, for a more energetic feeling. That edit was so simple but so powerful in terms of how you see the character subconsciously, just basing your feelings in shapes.”

OPPOSITE TOP: It was important to directors Kenji Kamiyama and Shinji Aramaki that the clothing designs be kept within the confines of what actually exists in real life. OPPOSITE BOTTOM: Ghost in the Shell: SAC_2045 required more assets than Sola Digital Arts has ever created for a single project. TOP: Motoko Kusanagi took the longest to develop, with more attention paid to the finer details such as her eyelashes and the lines of her eyes.



ANIMATION

Sola Digital Arts’ Deep Dive into Animation

TOP TO BOTTOM: Public Security Bureau Section office occupied by Lt. Col. Daisuke Aramaki with a character representation of Motoko Kusanagi used as a size and scale reference. All designs by Stanislas Brunet. A detailed exploration of the layout of the Public Security Bureau Section office occupied by Lt. Col. Daisuke Aramaki. Exterior concept art of the residence of the prime minister of Japan. Ground-level concept art of the prime minister’s residence.


Founded by Shinji Aramaki, producer Joseph Chou and CG producer Shigehito Kawada, Sola Digital Arts provides insights into bringing the cyberpunk action tale conceived by Shirow Masamune into the realm of 3D CG for the first time. “Compared to our previous project, Ultraman, Ghost in the Shell: SAC_2045 is more realistic,” remark Modeling Supervisor Masamitsu Tasaki and 3D Character Supervisor Hiromi Matsushige. “Even so, since it is still an anime, characters were lined with Pencil [a rendering plug-in that outputs cartoon-like outlines]. Also, we took certain measures when we wanted to give it a little more of that anime feel, like reducing the size of Motoko’s feet.”

All of the 3D animation was done in Maya. “Basically, character designs came together after discussions between the two directors and Ilya Kuvshinov,” explain Tasaki and Matsushige. “Based on those, we would work together with the directors to add the final touches. Both directors requested that models be closer to real life than the unrealistic bodies commonly found in cel animation, so we worked on creating character models in response to requests such as ‘don’t make the legs too long.’

“At the beginning of the project, there weren’t enough solid design drawings, so we went through Ilya’s past illustrations. It was difficult to convert Ilya’s designs into 3D, and we got quite confused along the way, but in the end we managed to pull through and bring them together. Ilya’s style of coloring is quite unique, so we focused on how to add specular reflections. We also paid close attention to getting the texture of the hair right.”

Model assets were created by different animation studios. “To do this, the main staff first created the textures and base models for the male and female characters, then assets would be derived from them,” remark Tasaki and Matsushige. “This allowed us to maintain some kind of consistency.”

Motion capture improved the performance fidelity of the animation. “By using motion capture, we were able to incorporate not only the sensibilities of the animators, but also the consistent performances and emotions of the mocap actors too,” states Chief Episode Director Hiroshi Yagishita. “For this project, auditions narrowed down over 200 people to just six mocap actors who became the fixed cast for the entire series. [Through the use of mocap actors] we were able to express natural human-like movements in not only the main characters, but the supporting characters, too, while adding depth to the world of the series.”

Storyboards were designed around the fixed episode length of around 22 minutes. “In the subsequent animatics [layout] stage, it was often the case that episodes came to be around 26 minutes long because there was more captured in the motion capture than was expected,” remarks Yagishita. “After that, the animation would be cut down to fit the fixed length. The

workflow of this project avoided adhering too strictly to the details of the storyboards and animatics. Instead, we took advantage of the performance of the mocap actors to review, and create camerawork and cuts during the layout process.

“At the character design stage, we were aware that costumes and character styles needed to allow for action. In terms of action sequences, live-action animatics [video storyboards] of the mocap actors were shot in advance. We would then examine the actions and continuity.”

Body shapes and cyber abilities were taken into account with the character designs. “While acting out scenes, various expressions – such as lip movements which wouldn’t usually be necessary for wireless cyber communications – were used based on the specific situation and emotions of the characters.”

Military training was given to the mocap actors, such as how to handle guns and operate as a platoon. “We had a dedicated stunt actor perform the action scenes, and we would decide upon actions and fight styles that matched a character’s personality after consulting with the directors,” remarks Lead Animation Artist Yuya Yamaguchi. “The animators would also use this as a reference. Animators would use the actor’s performances and storyboards to create Motoko’s stylish movements, while for Batou, his powerful and swinging movements were beefed up during animation.”

The main protagonist, Major Motoko Kusanagi, took the longest to develop. “More attention was paid to the finer details, such as her eyelashes and the lines of her eyes, than to the different parts of her body. Since Ilya’s eyes are so distinctive, I created a collage of his illustrations of Motoko Kusanagi, which I used as a reference when making it possible to reproduce them. Also, since Ilya’s illustrations are cute we wanted to make Motoko cute. I think her design came out a little different from the Motoko we’ve seen until now and will be well received by newer anime fans.”

Providing military support and a childish eagerness are the Tachikoma think tanks. “The challenge that took the longest to overcome was probably making it possible for characters to fit inside its pod,” reveals background and vehicles modeling artist Vlastimil Pochop. “Since there was only a single unified size for all the Tachikomas, it wasn’t easy to figure out how the interior would fit such a broad spectrum of characters of various sizes while keeping the exterior in line with the concept art. All that while also thinking about the rig and animations. It was necessary to make my own simple rig and animations that I later shared with the riggers as reference. That way I could make sure all those geometry layers sliding over each other won’t clip through as the pod opens up. The low-poly stage of the development involved a lot of consulting between me, the concept artist, the riggers and the directors.

“As for the high-definition model, I had to be careful not to deviate too much from the concept art by adding too much detail. The new version of the Tachikomas is different from the original, yet they share some similarities, like their fairly simple spider-like shape and their color palette. Adding too much geometric detail or weathering would deviate from the concept.”
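Pochop’s quick rig-and-animation test for the Tachikoma pod amounts to a clearance check between geometry layers sampled over the opening animation. The sketch below shows the general idea with a cheap vertex-proximity test; it is only an illustration, not Sola Digital Arts’ tooling, and the function and parameter names are invented for the example.

```python
# Illustrative only -- not Sola Digital Arts' tooling. A cheap proximity check
# between two geometry layers (e.g., the pod shell and the panel that slides
# inside it), sampled once per frame of the opening animation, to flag frames
# where the layers come close enough that clipping is likely.
import numpy as np
from scipy.spatial import cKDTree


def min_clearance(layer_a: np.ndarray, layer_b: np.ndarray) -> float:
    """Smallest distance between any vertex of layer_a and any vertex of layer_b.

    layer_a: (N, 3) world-space vertex positions; layer_b: (M, 3).
    """
    tree = cKDTree(layer_b)
    dists, _ = tree.query(layer_a)  # nearest layer_b vertex for every layer_a vertex
    return float(dists.min())


def flag_possible_clipping(frames_a, frames_b, tolerance=0.002):
    """Return (frame, clearance) pairs where the layers get closer than `tolerance`.

    frames_a, frames_b: sequences of per-frame (N, 3) vertex arrays.
    Vertex proximity is only a proxy -- a production check would test actual
    surface intersection -- but it is enough to spot frames worth reviewing.
    """
    suspects = []
    for frame, (a, b) in enumerate(zip(frames_a, frames_b)):
        clearance = min_clearance(a, b)
        if clearance < tolerance:
            suspects.append((frame, clearance))
    return suspects
```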

Robodogs are lethal and relentless. “This is a simple asset, but the challenge here was to make it transform into its shape from a box,” observes Pochop. “The first thing I did was to make a very low-poly version with a rig, and transformed it using bones and a little bit of blend shapes to see how close I could get to the concept while having the ability to transform. This asset also involved quite a lot of back and forth between me and the rigger. After that, the asset was passed to another modeler for the HD pass.”

The threat level escalates when Motoko and her mercenary colleagues have to battle an adversary wearing an armored suit. “With its quite unusual silhouette it was a little bit challenging to get the upper and lower body balanced in just the right way. And again, with a pretty tall character to visibly fit inside, it took a while to match the interior and exterior.

“Because the concept design didn’t have quite enough detail in it, I also had to add some extra details while making sure these details didn’t result in undesirable outlines and overall look. This applied to all the other assets as well. While modeling, I needed to keep thinking in the back of my head how it would eventually look with outlines included. That’s why I would sometimes rather use textures and normal maps to add details instead of actual geometry. This asset also had a couple of damage states, but these weren’t particularly difficult to make. The only challenge here was to resist going too crazy with these bullet holes.”

During compositing, the color of the background and the shadows of the characters were matched first. “As it’s an anime, there was no need to rigorously match the shadows like photorealistic works, so there are times when we let characters stand out,” explains Lighting, Composite Supervisor Kouya Takahashi. “Also, since there are so many scenes in the series, the methods change slightly for each one.”

The thermo-optical camouflage had to be readable and invisible at the same time. “At first, we made something similar to a preset, but it needed to be adjusted for each shot. As the distortion of the cloaks makes them easier to see, the areas behind the characters change their impression considerably. It was necessary to choose the level of visibility of the cloaks for each shot based on whether or not the actions of the characters needed to be seen, so I don’t think there ever was a right balance to be found.”

“This had the largest number of assets of any project we’ve worked on,” state Tasaki and Matsushige. “Looking back, the battles were cool. The scenes in Episodes 1 and 2 where the Tachikoma go up against tanks looked really good in 3D CG. Also, the entire series was made with an emphasis on story, so I hope everyone enjoys that element too.” Tasaki concludes, “Although the series does have a new look, people will enjoy Ghost in the Shell: SAC_2045 when they watch it as one complete package.”
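Takahashi’s per-shot balancing act for the thermo-optical camouflage can be pictured as a single “visibility” dial applied at comp time. The following is a minimal sketch of that idea, assuming pre-rendered layers (background plate, a distorted copy of the background as seen through the cloak, the character pass and a matte); the names and the blend are illustrative, not Sola Digital Arts’ actual compositing setup.

```python
# Illustrative sketch of a per-shot "cloak visibility" dial -- not Sola Digital
# Arts' compositing setup. Assumes float images in [0, 1]: the background plate,
# a distorted copy of the background seen through the cloak, the character pass,
# and a single-channel matte of the cloaked character.
import numpy as np


def comp_thermoptic(background: np.ndarray,    # (H, W, 3) clean plate
                    distorted_bg: np.ndarray,  # (H, W, 3) background refracted through the cloak
                    character: np.ndarray,     # (H, W, 3) character render
                    matte: np.ndarray,         # (H, W, 1) cloaked-character matte
                    visibility: float = 0.15) -> np.ndarray:
    """Blend the cloaked character over the plate.

    visibility = 0.0 -> the cloaked area shows only the distorted background,
                        readable mainly through its distortion.
    visibility = 1.0 -> the character pass reads through completely.
    In production this value would be re-tuned shot by shot, as the quote notes.
    """
    cloaked_area = (1.0 - visibility) * distorted_bg + visibility * character
    return background * (1.0 - matte) + cloaked_area * matte
```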



COVER

“[T]here are more effects in every show now. Whereas you used to have some shows that were visual effects heavy, we would have as many as 103 shots in a half hour of Silicon Valley. I know that Avenue 5 is the same way. That is somewhat new. Technology has made it easier to do the visual effects, so it’s part of the language of filmmaking.” —Cynthia Kanner, Senior Vice President, Post Production, HBO

THE ART AND BUSINESS OF VFX FOR TV AND STREAMING By TREVOR HOGG

TOP: Visual Effects Supervisor Adrian de Wet believes that there are more complex visual effects shots with a cinematic style in TV shows now than ever before, such as in See on Apple TV+. (Image courtesy of Apple TV+ and Outpost VFX)


The 2019 subscriber numbers for the top seven cable companies in the U.S., which include Comcast, Charter, Cox, Altice, Mediacom, Cable One and Atlantic Broadband, totaled 45.8 million, according to the Leichtman Research Group. The number of cable subscribers can be contrasted with the explosive subscriber growth of streaming services. In 2019 Netflix reported 61 million paid memberships in the U.S., Hulu had 30 million and Amazon Prime Video had 112 million. The major driving force for viewership remains original and exclusive content. Not surprisingly, as the streaming wars heat up, BMO Capital Markets forecasts that content spending by Netflix will go from $17.3 billion in 2020 to $26.3 billion by 2028. A significant portion of that expenditure will be devoted to CGI, with the “Global Animation, VFX & Video Games Industry: Strategies, Trends & Opportunities (2020-2025)” report stating that special effects account for between 20% and 25% of production cost. The report also estimates the size of the streaming market for animation and VFX content was $3.5 billion in 2019 and is growing at an annual rate of 8%.

“The continued emergence of new streaming platforms, Disney+ and Apple TV+ among some of the most recent to launch, has created entirely new avenues for content, and much of it is prestige programming that requires quality visual effects, with the scope of a traditional feature blockbuster or high-profile cable series,” states Simon Rosenthal, Executive Vice President, Global Studio

Operations at Method Studios. “At the same time, technology advancements are enabling studios to work more quickly and efficiently, and so producers are increasingly using visual effects to support their storytelling, whether creating a full CG creature, mass destruction, or digitally altering practical locations to be period-authentic.”

Before Netflix, a major rival to cable television was specialty channel HBO, which has always sought to blur the line between film and television productions. A big part of achieving this is a willingness to embrace and push technology. “We have always built our projects around having to have the time to make excellent visual effects,” states Cynthia Kanner, Senior Vice President, Post Production at HBO. “Our schedules traditionally have been longer than network schedules. I would say that there are more effects in every show now. Whereas you used to have some shows that were visual effects heavy, we would have as many as 103 shots in a half hour of Silicon Valley. I know that Avenue 5 is the same way. That is somewhat new. Technology has made it easier to do the visual effects, so it’s part of the language of filmmaking.”

The growing demand has improved the quality of the visual effects. “This is 100% quality-driven and it’s raised the bar quite a bit,” remarks Adrian de Wet, who was the Visual Effects Supervisor on the first season of See for Apple TV+. “Audiences have extremely

TOP: The Witcher was the most-watched first season ever for Netflix, with Framestore having a major role in the production. (Image courtesy of Netflix and Framestore) BOTTOM: The Mary Powell, a fast and luxurious steamboat built in 1861, is added into a live-action plate by Freefolk for the TNT series The Alienist. (Image courtesy of TNT and Freefolk)


high standards, and so do we as artists and content providers. Audiences and providers are feeding off each other to push the bar higher every day. There are more complex visual effects shots, with a more cinematic style, in TV shows now than ever before. For instance, I recently completed an eight-part series that contained 3,000+ visual effects shots, many of which were highly complex [digital humans and creatures in digital environments with large effects simulations] in a show that would not necessarily be described as being ‘visual effects driven.’”

Television and film do not share the same workflow. “You do fewer iterations for television than you do for feature film work,” states Fiona Walkinshaw, Global Managing Director, Film at Framestore. “The iterative process is reduced because there is simply no time. The filmmakers generally understand that because they’re working to the same deadlines. They’re not thinking that these 300 shots are being delivered in four months. They’re thinking 100 of these 300 shots need to be delivered in six weeks. The post is scheduled differently as DI is being done as they go along each episode.”

The demand for higher resolution such as UHD HDR is not going to disappear for streamers, whereas networks and theatrical releases remain at 2K. “There is a perceived idea that more pixels make better pictures,” remarks Rachael Penfold, Company Director and Executive Producer at One of Us. “4K or even 6K is slower than 2K, but we strive to find ways to avoid this impacting on the free flow of creative ideas. And besides, other components of the technology go some way toward compensating for the impact of resolution inflation.” Calculating a compound annual growth rate of 11.4%, Zion


Market Research predicts that the global visual effects industry will be worth $23.8 billion by 2025. “From my standpoint, the demand for visual effects has had a positive impact,” states Duncan McWilliam, CEO and Founder at Outpost VFX. “Our growth has doubled every year for seven years now, and more than half of our business is now TV and streaming.” Image Engine has seen the amount of streaming content that it is bidding on quadruple over a period of five years. “The worlds of high-end television and feature film are fundamentally and tangibly much closer together now than ever,” states Shawn Walsh, Visual Effects Executive Producer and General Manager at Image Engine. “As a result, the impact on the industry has been deep, affecting the assigning of key staff,

“The interesting thing about Game of Thrones is we couldn’t have made each season any earlier than we did. In many cases we were driving the visual effects industry to produce some new tool or way of doing something to allow us to be able to make the show. In that respect, it’s more about setting the pace for the technology, and knowing that this is the expectation of our filmmakers and the audience.” —Stephen Beres, Senior Vice President, Media & Production Operations, HBO

OPPOSITE TOP: HBO fantasy series Game of Thrones is seen as setting the gold standard for high-end visual effects on television. (Image courtesy of HBO) OPPOSITE MIDDLE: Hugh Laurie, Andy Buckley and Rebecca Front star in the HBO series Avenue 5. (Image courtesy of HBO) OPPOSITE BOTTOM: It is not uncommon to have over 100 visual effects shots in a half hour of Silicon Valley on HBO. (Image courtesy of HBO) TOP: Jon Snow (Kit Harington) rides a dragon buck, which is transformed by Image Engine into an aerial shot featured in HBO’s Game of Thrones. (Image courtesy of HBO and Image Engine) BOTTOM: Vignette Stonemoss (Cara Delevingne) battles with a stand-in performer that is subsequently turned into a ferocious marrok by Image Engine for the Amazon Studios production of Carnival Row. (Image courtesy of Amazon Studios and Image Engine)
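The market figures quoted in this article are compound-growth projections. The following is a back-of-the-envelope sketch of the arithmetic; the base years, and the 2019 base back-computed for the Zion figure, are assumptions made for illustration rather than numbers taken from the cited reports.

```python
# Back-of-the-envelope compound-growth arithmetic for the figures quoted above.
# Base years (and the ~$12.5B 2019 base for the Zion projection) are assumptions
# made for illustration; they are not taken from the cited reports.

def project(value_billions: float, annual_rate: float, years: int) -> float:
    """Compound a value forward by a fixed annual growth rate."""
    return value_billions * (1.0 + annual_rate) ** years

# Streaming animation/VFX content market: $3.5B in 2019 growing ~8% a year.
streaming_2025 = project(3.5, 0.08, 2025 - 2019)      # roughly $5.6B

# An ~11.4% CAGR from an assumed 2019 base of about $12.5B lands near the
# $23.8B-by-2025 figure Zion Market Research cites for the global VFX industry.
global_vfx_2025 = project(12.5, 0.114, 2025 - 2019)   # roughly $23.9B

print(f"Streaming animation/VFX content market, 2025: ~${streaming_2025:.1f}B")
print(f"Global VFX industry, 2025: ~${global_vfx_2025:.1f}B")
```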


“New technology is playing a key part in removing or lowering barriers that make our stories possible across the globe. Stranger Things 3 required complex simulations that would have been impossible at such scale and quality for series a few years ago.” —Andy Fowler, Director of Visual Effects and Virtual Production, Netflix


facility resources, technologies and management.”

Increased demand for visual effects means that there is a strain on the available talent pool. “There is a challenge in getting good visual effects supervisors, producers and coordinators,” remarks Kanner. “We don’t have a visual effects department per se. The effects are part of the show and each executive oversees their show.”

There are productions where a vendor is given the responsibility of managing the visual effects. “A couple of things haven’t changed over the years,” notes Walkinshaw. “As with Gulliver’s Travels, our supervisor for His Dark Materials was also the production-side supervisor, so we were able to streamline the process as there weren’t lots of people between us and the director.”

“At Netflix we have dedicated teams specific to features and series embedded in our global offices,” states Andy Fowler, the company’s Director of Visual Effects and Virtual Production. “They help bring the right talent together, working closely with showrunners, producers and visual effects teams. We generate a lot of visual effects content; hence, a unified connected group ensures that the many international vendors and crew we employ intersect well across our content. We also look to leverage technology that helps reduce friction and ease the process of visual effects production.”

The academic mindset needs to be altered. “This is an industry-wide issue where at the entry level there is a vacuum of talent,” observes Stephen Beres, Senior Vice President, Media & Production Operations at HBO. “One thing that we’re doing is investing a lot in education to try to help post-secondary programs understand what the industry of today needs is not the traditional liberal arts film school concept but a curriculum that is

craft-focused.”

Visual effects companies are being impacted by the talent shortage. “On one hand it’s exciting to see a broader range of projects being greenlit,” states David Conley, Executive VFX Producer at Weta Digital. “Studios are taking more creative chances than ever. But this demand has put pressure on worldwide capacity. There just aren’t enough artists available to produce all the work.”

Staff salaries are rising rapidly, as is the poaching of talent. “With all of the larger facilities opening TV divisions, they are taking from the midsize vendors who have a track record in this work,” notes Lucy Ainsworth-Taylor, CEO and Co-founder at BlueBolt. “However, this must be the same in every area of the industry and [if it isn’t] it will become unsustainable.”

Shifting deadlines can complicate staffing. “One thing I would say on scheduling is networks have booked advertising around the programming slot whereas streamers aren’t bound to that rule,” remarks McWilliam. “If streamers want to slide the deadline back further then they are able to do so. That can be good or bad. If you have one job colliding with another finishing, that can be a real recruitment issue for us.”

“The scope and ambition of productions has grown exponentially along with costs,” observes Mark Binke, Executive Vice President of Production at Universal Content Productions, a division of NBCUniversal Content Studios. “As a result, series typically require more high-end visual effects, spread globally across more vendors. For television, unlike film, a high volume of visual effects shots is required on an ongoing basis, versus film, which is typically backloaded and for a percentage of the time.” Increased content spending does not necessarily mean more money allotted

OPPOSITE TOP: The Netflix production of Altered Carbon takes place in the futuristic setting of Bay City, formerly known as San Francisco. (Image courtesy of Netflix) OPPOSITE BOTTOM: Olivia Taylor Dudley portrays Alice Quinn in The Magicians, produced by Universal Content Productions for Syfy. (Image courtesy of SYFY Media) TOP: Tom Hopper portrays Luther Hargreeves/Number One, who returns to Earth upon learning of the death of his adoptive father in The Umbrella Academy, produced by Universal Content Productions for Netflix. (Image courtesy of Netflix) BOTTOM: The final shot of the Ice Age episode of the Love, Death & Robots anthology, produced by Netflix, was created by Method Studios. (Image courtesy of Netflix)


“The ability to visualize the story and characters the way the creators imagine it has given TV and streamers the opportunity to produce high-concept series at a level that was not possible a few years ago. The ability to produce material with such scope means that the costs have escalated and the time from greenlight to air has expanded significantly as a result.” —Mark Binke, Executive Vice President of Production, Universal Content Productions


to each project. “Tighter budgets and shorter schedules have created a squeeze all around on visual effects studios,” notes Meg Guidon, VFX Executive Producer at Freefolk. “It requires an extreme streamlining of workflow with increased use of project management tools such as Shotgun to monitor shot status and share data for delivery.”

Television is an innovative medium. “You get to do things easily that you might not be able to do on a big feature film,” remarks Framestore’s Walkinshaw. “Even back in the days of [1996 television miniseries] Gulliver’s Travels, we developed a motion tracking system that you wouldn’t be able to possibly do if it hadn’t been on a miniseries where you were owning all of the technology and working out how to do it. There is a lot of innovative work going on in television at the moment.”

Game of Thrones set the gold standard for high-end visual effects. “We couldn’t have made each season any earlier than we did,” notes Beres. “In many cases we were driving the visual effects industry to produce some new tool or way of doing something to allow us to be able to make the show. In that respect, it’s more about setting the pace for the technology, and knowing that this is the expectation of our filmmakers and the audience.”

“Our content can raise the bar for visual effects normally destined for studio theatrical releases,” remarks Fowler. “The Irishman, for example, would not have been possible without new thinking around facial capture and deep learning, and it was Netflix that opened the door to allow ILM to execute upon this ground-breaking work. Certainly, continued advancements in visual effects have brought about higher quality work throughout our series and feature content.” Expanding the canvas

of storytelling also means the studios have more of a vested interest, with Binke noting, “The ability to produce material with such scope means that the costs have escalated and the time from greenlight to air has expanded significantly as a result.”

Tax incentives remain essential in attracting projects to different countries and visual effects companies. “Incentives sometimes play a huge role and other times do not,” remarks Kanner. “I had a show recently where we ended up having more visual effects than we thought, and I needed to move some of the extra visual effects to a place where I could get incentives to make the overall cost less significant. In other cases, we may move visual effects not to where the visual effects are cheaper, but because we need a certain spend in that country or need a labor ratio to make it work for a different part of the incentive.”

There is uncertainty regarding international productions continuing to be drawn to the U.K. “Removing the 80% cap on the current U.K. tax credit would be vital for the visual effects industry to keep the work in the U.K.,” states Ainsworth-Taylor. “This is essential, and hopefully after Brexit this can be addressed by government with the help of U.K. Screen lobbying for it.”

The good times will not last forever. “If we’re not careful, there can be a breaking point,” states Wayne Brinton, Executive Producer at Rodeo FX. “With streaming, there’s just so much work coming. Netflix, Amazon, Disney+, Hulu and Apple TV+ are producing original content over and above the traditional studios. This means exponential growth, and so we need to find different business models. We need to invest in AI and machine learning to automate a lot of processes and we’re looking at different rendering software. Sometimes the trend is to outsource

OPPOSITE TOP: Lela Loren portrays Danica Harlan, the governor of the planet Harlan’s World in Altered Carbon, produced by Netflix. (Image courtesy of Netflix) OPPOSITE BOTTOM: Claire Foy portrays Queen Elizabeth II on her wedding day in The Crown, with visual effects created by One of Us for the Netflix production. (Image courtesy of Netflix and One of Us) TOP: Cows magically appear in HBO limited series Watchmen courtesy of Outpost VFX. (Image courtesy of HBO and Outpost VFX) BOTTOM: UPP was responsible for the cinematic visual effects featured in Neill Blomkamp’s live-action trailer for the sci-fi game, Anthem. (Image courtesy of UPP)




“The ability to visualize the story and characters the way the creators imagine it has given TV and streamers the opportunity to produce high-concept series at a level that was not possible a few years ago. The ability to produce material with such scope means that the costs have escalated and the time from greenlight to air has expanded significantly as a result.” —Mark Binke, Executive Vice President of Production, Universal Content Productions





“The innovation is in how you can build large-scale environments, rig complicated creatures and characters, and get to first pixel render within these shorter schedules, without compromising labor costs. In protecting our artists, we have to give them the best tools to succeed in this business environment.” —David Conley, Executive VFX Producer, Weta Digital

“Sometimes the trend is to outsource to cheaper regions with more available labor, but even that is starting to become harder and harder,” Brinton continues. “At some point, there won’t be enough people. Visual effects is only going to increase. At Rodeo FX, we’re always using R&D to find better, more efficient ways to do work so both the people and tech we have will be fast enough to produce the content.”

Production spending by streamers will not continue at the current level. “One of the reasons it will hit the wall is, go speak to a 10-year-old about what they’re interested in,” states Will Cohen, Co-founder and CEO at Milk VFX. “It’s not going to the movies or traditional dramas. I won’t buy everybody’s services because no one can afford to do so every month. People will go where they think they’re best served. There will be a war that lasts five to 10 years and then it will definitely change. In the last five years we’ve seen truly global companies develop that were once local, and whether the market is sustainable for that I don’t know.”

Right now, it is all about seizing the moment. “It’s an exciting time to be in the industry,” notes Conley. “There are more creative opportunities with new kinds of stories being told than ever before. There are greater visual challenges being asked of filmmakers and their creative partners. The theatrical experience is evolving, streaming experiences are evolving. As stewards of the industry we need to work together to make sure these changes don’t have a lasting impact on the economics of the industry so that we can focus instead on the exciting technical advancements and creative opportunities at present.”

TOP TO BOTTOM: Rodeo FX created the Demogorgon for the Netflix production of Stranger Things. (Image courtesy of Netflix and Rodeo FX) Milk VFX was sole vendor for the Amazon Studios production of Good Omens. (Image courtesy of Amazon Studios and Milk VFX) Renowned for creating photorealistic creatures, Weta Digital was responsible for Pogo in the Netflix adaptation of The Umbrella Academy. (Image courtesy of Netflix and Weta Digital) Number Five (Aidan Gallagher) travels to the future where he witnesses an apocalypse in the Universal Cable Productions adaptation of The Umbrella Academy. (Image courtesy of Universal Cable Productions and Netflix)


An Eye Towards the Future

Looking into the technological future, the visual effects industry will be transformed by the proliferation of real-time rendering, AI and virtual production methodologies. 5G networks will be important in allowing the shift from fiber optics to wireless. “There may be a time in the near future where things are shot in the morning and feedback is being given by creative executives by late afternoon,” remarks Stephen Beres, Senior Vice President, Media & Production Operations at HBO. “That’s where we are excited about the promise of 5G. We’re starting to look at the idea of immersive object-based sound, and there is a certain amount of conversation about interactivity in content being an important new venue for storytelling. Digital de-aging opens up an interesting realm where actors can play roles that would have been considered inappropriate given their age. The idea of digital doubles and bringing actors back from the dead is also interesting. I don’t know if it’s a good or a bad thing to do, but it’s certainly intriguing to see where that technology is going and the opportunities it presents.”

Technological advancements in distribution and exhibition need to be kept in mind. “Production teams need to keep up on advancements in areas such as camera technologies, volumetric capture, previsualization, and others that may affect how audiences consume the programming,” states Mark Binke, Executive Vice President of Production at Universal Content Productions, a division of NBCUniversal Content Studios. “Certain processes, like rendering time for visual effects and dailies, require more time. Additionally, the ability to ‘broadcast’ higher quality programming to more devices creates the need for all produced media to be of the highest quality. Even at the HD level, on-set hair and makeup has become even more important.”

The ability to do real-time rendering would reduce costs, believes Duncan McWilliam, Founder and CEO at Outpost VFX. “Something technological which would help us a lot is to get towards real-time rendering built into every workstation so that an artist can animate on a final quality image, and we’re not animating on basic rigs then putting them through a whole pipeline of six other people until we see a final result, and then realize that the muscles don’t move right and we start the loop again.”

“Visual effects production is a complex endeavor, and it is easy to lose clarity of oversight and steer the ship in the wrong direction,” remarks Shawn Walsh, Visual Effects Executive Producer and General Manager at Image Engine. “In order to address this problem, we are consistently trying to progress towards a more data-driven approach to production management using analytical tools. Expect terms like data-warehousing, business intelligence and data-driven predictive modeling to become more commonplace as companies realize the wealth inherent in this often-unused information. Also, the industry-wide adoption of open file formats and standards like USD and ACES will make it even easier to share more and more complex assets between vendors, removing technological hurdles that previously required complex ingest, conversion and delivery pipelines.”

Cloud-based technology will continue to be a significant part of the visual effects industry. “While the cloud debate is hardly new, cloud-based solutions have reached a point in development where studios are successfully leveraging them in a secure, effective and efficient way, and I think this will increasingly be the case,” states Simon Rosenthal, Executive Vice President, Global Studio Operations at Method Studios. “We’ve used the cloud for rendering and will continue to do so. There have been projects we would have been hard-pressed to deliver on time without cloud rendering, so we certainly see the value in being able to quickly scale.”

“As virtual production tools and workflows continue to mature, they’ll have a huge impact on how the business plans for and implements visual effects content,” remarks Andy Fowler, Director of Visual Effects and Virtual Production at Netflix. “LED stages, virtual studios and real-time rendering all open up even greater opportunities for filmmakers to engage with visual effects in so many new ways. Visual effects, virtual production and animation will merge more and more over time; hence, it will become harder to distinguish where one folds into the next. This is an exciting time to be in the business as these worlds align. The influence of AI will have a big impact, the limits of which are unclear. The question may become: Are there any limits to what AI could achieve in visual effects?”

AI is the way of the future. “It’s fascinating to see the inroads that AI is already making, and this will only accelerate,” states Rachael Penfold, Company Director and Executive Producer at One of Us. “Things which were inconceivable when we thought we had to code them have now become workable. This means that those aspects of an image which seemed to be utterly beyond the reach of automation are now beginning to succumb.”

The future is in real-time rendering. “Game engines will be more important in our film world,” believes Viktor Muller, CEO and Visual Effects Supervisor at UPP. “The quality of games has grown faster than the quality of filmmaking over the past years. A few years from now there will not be a big difference between feature, TV, streaming and games in terms of visual effects quality. Another important element will be AI. The tools that we use are 20 years old. The latest Nuke Studio has the same features as Flame or Inferno did 15 years ago. Yes, computers are much more capable, so some algorithms work better now, but nothing super new. I am personally investing a lot of money and energy in AI tools like automatic keying without bluescreen, rotoscoping, retouching, AI color matching, compositing, and AI scan model cleaning. This is the way to improve quality versus time versus budget for the whole industry.”
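
Walsh’s point about open standards such as USD can be illustrated with a short, hypothetical sketch. It assumes Pixar’s pxr Python bindings are installed; the asset path and prim names are invented.

from pxr import Usd, UsdGeom

# Author a tiny USD asset that any downstream vendor can open and layer over.
stage = Usd.Stage.CreateNew("creatureAsset.usda")
root = UsdGeom.Xform.Define(stage, "/CreatureAsset")
UsdGeom.Mesh.Define(stage, "/CreatureAsset/body")

# Setting a default prim lets other departments reference the asset cleanly,
# adding lookdev or animation opinions in their own layers rather than
# re-ingesting and converting the geometry.
stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()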




VFX TRENDS

DOING THE BIDDING OF VISUAL EFFECTS By TREVOR HOGG

TOP: Erin Moriarty portrays Starlight/Annie January with help from Framestore in the Amazon Studios production of The Boys. (Images courtesy of Amazon Studios and Framestore)


Imagine trying to hit a moving target rather than a static object. That is essentially what happens when visual effects companies bid on projects for television, streamers and movie studios, as what is conceived on the printed page is never entirely replicated in principal photography and post-production due to financial and logistical restrictions or creative serendipity. How does one navigate the ins and outs of the bidding process? A good place to begin is with the advice of industry veterans whose firsthand insight has enabled them to survive and thrive in what has become a highly competitive marketplace.

Duncan McWilliam, CEO and Founder, Outpost VFX
We will bid to the script and then caveat that with a document that goes into the bid contract, stating that these shots are based on these assumptions and on this visual effects breakdown, which is often supplied by the client so we’re filling in their spreadsheets. We would never be able to bid a script and then deliver exactly what we said we would, because so much can change by the time you get to the end of the shoot. Invariably you have incurred more costs. You’ve got to re-evaluate the bid as you go, [including] overage for more work, or give credit for work that didn’t end up being done. As long as you’re clear upfront that things will change and we’re prepared for it, that usually works out just fine.

The way it has changed is, in the old days there was big talk of buyout bids bankrupting visual effects houses. If we agreed to do Life of Pi for 10 million quid but it cost 14, that’s your problem and you go out of business. Since then, what everyone has understood is that we bid based on what we’re talking about this week, and if it changes next week then we rebid. We have a continual bidding department constantly adjusting, refining and resubmitting bids based on every new conversation that we have around a job, and if they can’t afford it then we’ll do our absolute best to come up with a cost-savvy way of doing it either on set or in 2D instead of 3D. Or we will help them with their edit.

Fiona Walkinshaw, Global Managing Director, Film, Framestore
Sometimes the client will tell you upfront what tax incentive rebate they’re looking for, if they have a strong production plan. That can be for all sorts of reasons. It can be that the production side through to the director and DP want to shoot in a certain place, or the production plan has been built around rebates in various places in the world. We will either bid according to that or to our capacity. You have to be able to look at your capacity globally, but also you can’t say to every client, ‘We can put 100% of your show in Montreal where there is a 40% rebate.’ We have to balance that up with a lot of our animation capacity, which might be in the U.K. Part of that work will be a 25% rebate, part of it will be a 40% rebate, and part of it will be India, which makes for a cheaper blended cost. It’s certainly always a negotiation. Clients understand that you’ve got to balance the work, but they’re always looking for some rebate. Without a doubt, a blended cost of rebate is important to winning work.
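
Walkinshaw’s “blended cost” of rebates is simple arithmetic, and a back-of-envelope sketch makes it concrete. The regional splits and rates below are invented for illustration, not taken from any real show.

# Share of a show's spend placed in each region, and that region's rebate rate.
work_split = {
    "Montreal (40% rebate)": (0.50, 0.40),
    "U.K. (25% rebate)": (0.35, 0.25),
    "India (no rebate)": (0.15, 0.00),
}

# The blended rebate is the spend-weighted average of the regional rates.
blended = sum(share * rate for share, rate in work_split.values())
print(f"Blended effective rebate: {blended:.2%}")  # roughly 29% with these made-up splits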

TOP: Freefolk had to recreate 19th century New York City for the TNT series The Alienist. (Image courtesy of TNT and Freefolk) BOTTOM: Ethan Phillips stars as Spike Martin, who claims to be the first Canadian to land on Mars in HBO’s Avenue 5. (Image courtesy of HBO)






Jeff Shapiro, Director of VFX Global Resource Strategy, Netflix
Title complexity and budget are key variables that guide the vendor selection process. Netflix’s highly diverse slate provides opportunities for all types of visual effects studios. The wide range of creative slate demands can be serviced by local and multi-regional service providers with varying specialties and competencies. Vendor diversity and inclusion are top priorities and drive assignment consideration.

Before the streaming boom, visual effects studios focused on efficiency over agility. Today, production and creative agility is critical to a vendor’s success. Production planning and scheduling, artist workflow and creative pipelines need to be able to handle varying degrees of iteration. Production tracking tools are more important than ever.

Andy Fowler, Director of Visual Effects and Virtual Production, Netflix
It comes down to market pricing and vendors being realistic about what they need to sustain their business – being mindful not to bid too far below their market price and lowering their margins. Also, those making content should respect those market prices. It’s an ecosystem, we depend on each other, and so we need to support each other. Responsible management at all levels and on all sides of the business is important for it to remain sustainable.

Shawn Walsh, Visual Effects Executive Producer and General Manager, Image Engine
At Image Engine, we bid exactly the same as always – assessing CG builds, shots and management personnel expenditures needed to execute the work from a cost perspective as a priority prior to assigning market value. We believe strongly in first assessing the cost impact on us prior to seeing what the market value for a particular scope of work might be. This is probably a more film-like approach in its considerations, but our philosophy has always been that if we are using all of the same things – facilities, technologies, people – and those things cost exactly what they did yesterday, then why would we cost out differently? That being said, our bidding process is extremely robust and can both adjust to a rapidly changing creative brief and keep us in check with historical information. We are confident that whatever work makes it through our bidding filter will be the right work regardless of what type of work it is. Lately, it seems like even our relationships are straddling the gap between television and film work, with many of our former film clients coming back to repeat work at Image Engine, only on the television side.

Looking at the debt loads and fiscal challenges of larger entities like Deluxe and Technicolor, who have a major stake in the visual effects business as a whole, is duly concerning for everyone in the industry. Without sufficient cash flows to sustain research and development, many of the top 50 or so companies worldwide will struggle to move forward, and that’s bad for our clients. Especially in an era of rapidly increasing sophistication, that’s a lose-lose scenario. Just surviving corporately is not enough, and perhaps that’s where the win-win is for the industry and the client base. Well-run companies that prove their merit technically, creatively and managerially deserve the assumption of profitability from their clients towards the goal of sustainability. That has to be a win-win, and it has to start with the client base valuing what visual effects companies bring to the table.
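
Walsh’s point about letting historical information keep a bid “in check” can be sketched in a few lines. This is purely illustrative: the shot categories, past actuals and 15% review threshold are invented, not any facility’s real data or policy.

from statistics import mean

# Past actual cost per shot, grouped by an invented shot category.
history = {
    "cg_creature": [42000, 51000, 47500, 39000],
    "set_extension": [12000, 15500, 11000, 13800],
    "comp_only": [3500, 4200, 2900, 3800],
}

# A new bid's proposed per-shot prices for the same categories.
new_bid = {"cg_creature": 38000, "set_extension": 14000, "comp_only": 4000}

for category, price in new_bid.items():
    avg = mean(history[category])
    delta = (price - avg) / avg
    flag = "review" if abs(delta) > 0.15 else "ok"
    print(f"{category}: bid {price} vs. historical avg {avg:.0f} ({delta:+.0%}) -> {flag}")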

OPPOSITE TOP: Das Boot was remade into a television series for Sky Atlantic, with visual effects provided by UPP. (Image courtesy of Sky Atlantic and UPP) OPPOSITE MIDDLE: Method Studios took part in the famous ‘Battle of the Bastards’ episode of Game of Thrones. (Image courtesy of HBO) OPPOSITE BOTTOM: A live-action plate shot in South Africa becomes populated with buildings courtesy of Milk VFX for the Amazon Studios production of Good Omens. (Image courtesy of Amazon Studios and Milk VFX) TOP: Anthony Mackie portrays Takeshi Kovacs in the second season of Altered Carbon, which streams on Netflix. (Image courtesy of Netflix) BOTTOM: Behind the scenes of the Netflix adaptation of Locke & Key with Connor Jessup and Emilia Jones. (Image courtesy of Netflix)






Will Cohen, Co-founder and CEO, Milk VFX
You need to be brutally honest with yourself about your capacity. Those of us who come from the 1990s in a service industry are used to trying to make hay when the sun shines. We’ve lived for years in a world of increasingly shorter deadlines, more ambition and higher resolution. Depending on the dynamic of each individual project, schedules can push or pull accordingly, so you need to be careful. Most projects that visual effects companies work on grow rather than shrink. Like stocks and shares they can go up or down. You’d be smart to anticipate a bit of growth on a project that you have. It can be frustrating for the client if you can’t accommodate growth on the shots or sequences. You have to be brutally honest about your capabilities, because it is better to do a great job of the work that you have than a bad job of too much work and ruin a project and relationship through greediness. You get rid of your offices, put artists at desks, and have a triage of the visual effects community to get through a busy period.

Claire Norman, Business Development Director, Milk VFX
Generally, the process is that we like to get the script at the initial stage and some sense of the director’s creative vision. Obviously on TV projects, if you have multiple directors then you’re vying with different visions all at the same time, but you try to keep everybody happy. After the initial discussions we usually break the script down and ballpark it with the best knowledge we have of the methodology of shooting and with an eye on the budget. When it’s that kind of bidding process it has to be collaborative, because you have to work with them and, equally, they have to work with us, understanding what we can achieve in visual effects for that financial cost. Sometimes they will highlight a particular sequence because it might be challenging to shoot, so they want to be clear as to how they might want to go back to it. They don’t often board the whole series, but cherry-pick certain sequences because of their complications. You do rounds and rounds of bidding. You often do alternative bids for the same shots for possible additional requirements so the client has a sense of where their money is going and can decide what the best vendors are for onscreen. There is constant bidding, and transparency as well. If you’re raising the cost of anything you have to be able to justify the need.

Viktor Muller, CEO and VFX Supervisor, UPP
Visual effects vendors will always find ways to adapt to all requests from their clients. Of course, only some will survive. It’s the same as when the rebate system came up. Vendors should worry more about what happens when one of these giants stops their production and there are a few hundred fewer projects. In the meantime, this industry is super hungry and everyone is missing people. What will everyone do when Netflix or Amazon slows down? There is a new ‘millennial’ style that connects people/freelancers into virtual companies/clouds so all of them are working on their own and only the management, supervision, rendering and IO is going through a solid VFX house. I am personally pessimistic about this model as it will lead to problems with laziness and egos.

Wayne Brinton, Executive Producer, Rodeo FX
We don’t differentiate how we work with streaming versus studio. With different constraints and timelines, you have to make sure you understand the work before giving a cost. We spend a lot of time and effort understanding the scope of work before we bid, but the process is essentially the same. Everything is schedule, but quality is absolutely paramount. How can we fit enormously complicated shots into a very tight budget? Truth is, scheduling for streaming episodic work is tight, but it’s consistent, and that’s what’s good about it. With films, once you start working on a project, you don’t know when you’re going to finish it. When it comes to episodic TV, schedules can change on a regular basis, so we work really hard to get ahead of things.

Meg Guidon, VFX Executive Producer, Freefolk
Competitive bidding has to be expected as part of handling longform work, and it is a time-consuming process. Bids for TV and longform projects are always on a larger scale in terms of shot count across multiple episodes. The time required to create accurate bids has to be allowed for as unpaid [work], even though the award is not guaranteed. Bids can go through several iterations. Money is sometimes the most important factor in winning work, rather than the creative. Being awarded visual effects shots that require the creation of bespoke CG assets will normally guarantee more work due to the ‘ownership’ of the asset across the series. So, though sharing assets happens more often, it is still thought better to keep assets with one vendor if possible. Clients expect good quality work at competitive rates. For this to work as a business model, we need to have access to a broad range of talent. If talent becomes too expensive and difficult to access then the model can quickly break down with the available budgets.

David Conley, Executive VFX Producer, Weta Digital
No doubt about it, bidding is booming! I think we’re at an inflection point in the industry where we’re seeing an increased number of bidding packages come through the facility. And with the increase in these packages, we’re being asked to turn them around in shorter timeframes to allow studios to make a lot of decisions very quickly. However, these bid packages are increasingly less detailed, with studios themselves often having to turn around breakdowns in less time to give to facilities. Which means we’re dealing in the bidding laws of averages. While we have been putting a lot of effort into our bidding analytics in recent years to give us some confidence in turning bid packages around at these speeds, the risk we’re seeing across the industry is that projects can be awarded with wildly divergent creative and financial expectations. And that can have an impact on the greater visual effects community, as the resulting overages and schedule extensions required to solve these problems have a corollary downstream impact on production pipelines around the world.

OPPOSITE TOP LEFT: One of Us was the sole vendor on the Netflix series The Crown. (Image courtesy of Netflix and One of Us) OPPOSITE TOP RIGHT: A strange doll makes an appearance in the Netflix series Black Mirror with some assistance from Outpost VFX. (Image courtesy of Netflix and Outpost VFX) OPPOSITE BOTTOM LEFT: See on Apple TV+ was not viewed as a visual effects-driven show, but Adrian de Wet supervised over 3,000 shots that featured digital humans, creatures and environments with large effects simulations. (Image courtesy of Apple TV+ and Outpost VFX) OPPOSITE BOTTOM RIGHT: Netflix provided filmmaker Martin Scorsese and ILM with the opportunity to push the boundaries of facial capture in The Irishman. (Image courtesy of Netflix and ILM) TOP: Brian J. Smith as Doug McKenna in the pilot episode of Treadstone, produced by Universal Content Productions for the USA Network. (Image courtesy of USA Network) MIDDLE: The Mind Flayer comes to life in the Netflix series Stranger Things courtesy of Rodeo FX. (Image courtesy of Netflix and Rodeo FX) BOTTOM: Eleven (Millie Bobby Brown) and Jim Hopper (David Harbour) fight demonic creatures from an alternative world known as the Upside Down, created by Method Studios for Stranger Things. (Image courtesy of Method Studios and Netflix)





TECH & TOOLS

NEW APPROACHES TO TV VFX By IAN FAILES

Today there is more filmed entertainment available on television and streaming services than ever before. Many of these shows are filled with visual effects, creating a boon for VFX studios. The challenge for visual effects studios and artists, though, has been that these TV and streaming networks are continually pushing for higher-quality deliveries in the realm of 4K and even 8K, which can actually be greater than what is required for film VFX. The budgets, meanwhile, are typically lower in comparison to film. How have those in the visual effects industry responded to the new wave of streaming and television requirements from a technical, workflow and creative point of view? The answer lies in a number of methods, from adopting new pipelines and software to embracing real-time and virtual production tools.
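
Delivery requirements like these are easy to gate mechanically. The sketch below is illustrative only: it assumes OpenImageIO’s Python bindings are available, and the filename and 3,840-pixel threshold are placeholders for whatever a given show’s delivery spec demands.

import OpenImageIO as oiio

MIN_WIDTH = 3840  # e.g. a UHD-width minimum for the active image area

def check_delivery(path):
    """Return True if the frame at 'path' meets the minimum width."""
    img = oiio.ImageInput.open(path)
    if img is None:
        raise IOError(oiio.geterror())
    spec = img.spec()
    img.close()
    ok = spec.width >= MIN_WIDTH
    print(f"{path}: {spec.width}x{spec.height} -> {'ok' if ok else 'too small'}")
    return ok

check_delivery("shot_010_comp_v003.1001.exr")  # hypothetical delivered frame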

TOP: The Volume and LED walls used for shooting Disney+’s The Mandalorian. (Image copyright © 2019 Lucasfilm Ltd.)


DEMANDING DELIVERIES

First, a perspective from one of the big streaming services itself, Netflix. The service is well known for requiring creators of its Netflix ‘Originals’ to shoot and deliver in 4K, while its VFX best practice guidelines also state that ‘VFX pulls and deliveries must have an intended active image area of at least UHD (3,840 square pixels in width).’

“We put forth comprehensive guidelines about not only camera choice, but also recording settings: codec, compression, color space,” outlines Netflix Director of Creative Technologies and Infrastructure Jimmy Fusil. “We also provide support throughout the design of the ‘dailies-to-DI’ image workflow on every project. A big part of our efforts there is ensuring alignment between on-set monitoring, dailies, VFX and picture finishing.”

Fusil singles out The King, released in November 2019, as a recent production in which the camera choice, 4K color management workflow and VFX deliveries were heavily pre-planned. DNEG was the lead visual effects vendor. The film, shot on ARRI Alexa 65, had what Fusil says was a clearly defined color-managed workflow, with color decisions baked into editorial dailies. “So when it came to pulling the plates, the turnover package included a LUT plus CDLs (ASC Color Decision Lists) per shot for the VFX vendors for them to use coming back into the editorial timeline.

“Another part of the imaging pipeline that we encourage for VFX is to work in scene-linear color space,” adds Fusil. “For The King, teams agreed upon an AWG linear color space exchanged in a 16-bit half-float EXR file format, which would then be transformed to an ACEScg working colorspace in-house, a widespread VFX facility format. For these files to be viewed by the artists and delivered back into editorial, the LUT package included more substantial versions of the show LUT that transformed the ACEScg to AWG LogC to rec709.”

What this all meant for the VFX teams, suggests Fusil, is that they were not held up trying to match proxy files with the color in the editorial timeline, something that can happen in non-color-managed image pipelines. “Plus,” he says, “having one consistent deliverable, AWG linear EXRs, meant that the final color process
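
The per-shot CDLs Fusil describes are just the ASC slope/offset/power and saturation math, which a vendor can apply to scene-linear pixels in a few lines. This NumPy sketch is illustrative only; the CDL values are invented, and clamping negatives before the power step is one common convention rather than a statement of how any particular show handled it.

import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation):
    """Apply the ASC CDL to an (..., 3) scene-linear RGB array."""
    out = rgb * slope + offset                      # slope and offset
    out = np.clip(out, 0.0, None) ** power          # power, negatives clamped
    luma = np.sum(out * REC709_LUMA, axis=-1, keepdims=True)
    return luma + saturation * (out - luma)         # saturation around Rec. 709 luma

plate = np.random.rand(4, 4, 3).astype(np.float32)  # stand-in for a linear EXR plate
graded = apply_cdl(
    plate,
    slope=np.array([1.05, 1.00, 0.97]),
    offset=np.array([0.01, 0.00, -0.01]),
    power=np.array([1.00, 1.00, 1.02]),
    saturation=0.95,
)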

TOP: A visual effects shot by Hybride for the Amazon Prime Video series Tom Clancy’s Jack Ryan. (Image copyright © 2019 Amazon) BOTTOM LEFT TO RIGHT: David Morin, Head of Epic Games Los Angeles Lab. Jimmy Fusil, Director of Creative Technologies and Infrastructure, Netflix. Russell Dodgson, Visual Effects Supervisor, Framestore. Pierre Raymond, President and Head of Operations, Hybride. Pascal Chappuis, Head of Technical Production, Image Engine.





THE VFX STUDIO PERSPECTIVE

TOP: CG creature by Image Engine for the Amazon Prime Video series Carnival Row. (Image copyright © 2019 Amazon). BOTTOM: The Witcher series on Netflix could be watched by audiences in 4K resolution with Dolby Vision and Dolby Atmos audio. (Image copyright © 2019 Netflix)



So, has having to follow more stringent color workflows and deliver in 4K and above changed the approach within VFX studios? It certainly has, according to Image Engine Head of Technical Production Pascal Chappuis. His studio, which has worked on shows for Netflix, Disney+, HBO, Amazon Prime Video and other services, has had to specifically manage areas such as rendering and reviews. In terms of the very practical impact of delivering higher resolution and more data, Chappuis says this partly involved hardware upgrades. “The compositing machines, for example, needed to be fitted with SSD (solid state drive) caches and upgraded to bigger machines, just to make sure that 4K images, sometimes rendered in deep, could be worked on. Our machines for review in the theater were updated too; there’s just more data and more bandwidth required over our network. 4K impacts us every step of the way.

“Sometimes we have had to ramp up the storage, because we thought we could off-line, say, Episode 1, but you can’t,” continues Chappuis. “They might come back and work on that episode later on. So, if you have lots of episodes, it can be like a 10-hour movie in a way, especially at 4K.” Interestingly, one benefit of the advent of more episodic work, says Chappuis, is that deliveries have become somewhat more predictable than film. Still, he adds, “it is all about scale. When it all works, it seems that suddenly they can have more shots or more episodes in their pocket to give you. So you need to be able to scale quickly for shots and with crew.”

Another VFX studio that has taken on a greater range of television and streaming work is Hybride. President and Head of Operations Pierre Raymond observes that where film VFX and television VFX were once considered quite different, they have now become quite similar. “Television series don’t have the same pace, as they are composed of many episodes, but 4K resolution is now standard,” he says. “This has had an important impact on the VFX industry, and the movie business is rapidly catching up. I’d say the main difference between the two is the pace and the expectations. That being said, if TV series come to have the same expectations in regards to quality and time investment, the two challenges will not be approached differently.” Server storage and network equipment received upgrades to handle 4K deliveries at Hybride, but, like Image Engine, few staffing changes were necessary. “All of our artists are involved on both types of projects, some of which sometimes overlap during the same period,” states Raymond.
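To put Chappuis’s point about scale in rough numbers (these are illustrative assumptions, not figures from the article), a back-of-the-envelope calculation shows why ‘a 10-hour movie at 4K’ strains storage and network bandwidth. The sketch assumes uncompressed four-channel, 16-bit half-float UHD frames; real EXRs compress considerably, so treat it as an upper bound.

# Back-of-the-envelope estimate of uncompressed 4K EXR storage (assumed parameters).
width, height = 3840, 2160          # UHD active image area
channels, bytes_per_sample = 4, 2   # RGBA, 16-bit half float
fps, hours = 24, 10                 # "a 10-hour movie in a way"

frame_bytes = width * height * channels * bytes_per_sample
total_frames = fps * 60 * 60 * hours
total_tb = frame_bytes * total_frames / 1e12

print(f"{frame_bytes / 1e6:.1f} MB per frame, ~{total_tb:.1f} TB for {hours} hours")
# Roughly 66 MB per frame and ~57 TB uncompressed for a single version of every frame.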

TOP: The final rendered creature by Image Engine for Carnival Row. (Image copyright © 2019 Amazon). BOTTOM: The ThruView system set up by Stargate Studios for the HBO series Run allowed for views to be established outside train windows. (Image courtesy of Stargate Studios)





PRODUCTION CHANGES

TOP: The original plate for a battle scene in Netflix’s The King. (Image copyright © 2019 Netflix) MIDDLE: Extra crowd army is added by DNEG. (Image copyright © 2019 Netflix) BOTTOM: The final composite. (Image copyright © 2019 Netflix)



We’ve seen, then, how delivery requirements for television and streaming services can impact VFX studios. There’s another kind of change also taking place in visual effects generally that is set to change – and in fact is already changing – the way TV and streaming shows are made: virtual production. The use of LED screens and real-time rendering engines during the making of Disney+’s The Mandalorian is the standout example of virtual production at work. Using a combination of ILM’s StageCraft technology and Epic Games’ Unreal Engine, digital backgrounds could be pre-built for projection on the LED walls while actors performed directly in front of them. These backgrounds could be changed in real-time, effectively providing for full in-camera VFX shots (with little or no post-production necessary), as well as interactive lighting on the actors themselves. It also eliminated the need for greenscreens, although the LED walls could become greenscreens themselves. This might mean more work up front to prepare the real-time-ready imagery, but the idea is that it all helps the cast and crew imagine scenes directly there on set, without the need to visit exotic locations.

In addition, notes David Morin, Head of Epic Games Los Angeles Lab, “Art directors can visualize their sets faster up front and use VR for virtual location scouting, previs teams can interactively block out scenes and camera moves, VFX teams can iterate much more quickly on their environments and characters, and so on.

“Everyone on set of The Mandalorian was able to see the digital environments with their own eyes in real-time, giving the crew the ability to make on-the-fly changes and the actors real elements to react to,” adds Morin. “Even more possibilities opened up, whether it was the set dresser easily moving objects from one place to another within the digital environment, or the cinematographer having the ability to shoot with magic-hour lighting all day long.”

Morin believes that the use of virtual production in television and streaming, by doing more earlier, can reduce the post process, which might “also help shows go to air in shorter time frames, allowing studios to keep up with audience demand. The use of LED walls in particular, though it requires a large upfront investment, can ultimately deliver significant return on investment, as filmmakers can return to the set again and again throughout production and even later on for re-shoots without having to worry about location availability, weather conditions and other disruptive factors.”

The Volume shoot on The Mandalorian is one of the more elaborate examples of virtual production use in episodic work. Other virtual production and real-time rendering examples are becoming more widespread – from previs studios to other LED screen implementations, including Stargate Studios’ ThruView system, to the many on-set simul-cam and virtual camera systems now in use. One example of the latter technology was adopted by Framestore for the signature polar bear fight in the HBO and BBC series His Dark Materials. The fight was visualized and scouted with a real-time Unreal Engine asset of the environment and the bears, with Vanishing Point’s Vector virtual production setup then utilized on set to ‘film’ the fight. Its real-time camera tracking and live compositing abilities enabled immediate feedback for the VFX and production teams. His Dark Materials Visual Effects Supervisor Russell Dodgson is adamant that virtual production was the right tool for the right job on this demanding scene, especially as it went beyond his usual experience with previs. “As we get more and more to a place where virtual production tools are really informing shot production rather than just early thoughts,” says Dodgson, “I think they can really gain relevance. Having real-time tools that give you good-looking feedback so people buy into it earlier is really important. It’s definitely the future of where things are going.”
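In an LED-volume setup like the one described, typically only the part of the wall seen through the taking camera needs full, camera-ready rendering each frame, while the rest of the wall mostly contributes lighting and reflections. The sketch below is a generic, simplified version of that inner-frustum test (not ILM’s or Epic’s actual implementation), using plain vector math to decide which wall panels a renderer would prioritize.

# Simplified sketch: flag which LED wall panels fall inside the camera frustum,
# so a real-time renderer could give them full-quality treatment. Generic math only.
import numpy as np

def panels_in_frustum(panel_centers, cam_pos, cam_forward, h_fov_deg):
    """Return a boolean per panel: is its center inside a simple conical frustum?"""
    half_angle = np.radians(h_fov_deg) / 2.0
    forward = cam_forward / np.linalg.norm(cam_forward)
    to_panels = panel_centers - cam_pos
    dist = np.linalg.norm(to_panels, axis=1)
    cos_angle = (to_panels @ forward) / np.maximum(dist, 1e-6)
    return cos_angle > np.cos(half_angle)

# Hypothetical 10m-radius curved wall sampled as a row of panel centers (meters).
angles = np.linspace(-np.pi / 2, np.pi / 2, 24)
panel_centers = np.stack([10 * np.sin(angles), np.zeros_like(angles), 10 * np.cos(angles)], axis=1)

inside = panels_in_frustum(panel_centers, cam_pos=np.array([0.0, 0.0, 0.0]),
                           cam_forward=np.array([0.0, 0.0, 1.0]), h_fov_deg=40.0)
print(f"{inside.sum()} of {len(inside)} panels get full-quality rendering this frame")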

Meeting the 4K Challenge

As visual effects studios have had to deliver higher-resolution final visual effects for TV and streaming, software providers have also had to keep up. Foundry, the makers of commonly used tools Nuke, Mari and Katana, have been adapting to 4K – and even 8K – production for some time. “While Nuke can easily process images at higher than 8K resolution, we’ve done a lot of work to improve the artist experience and pipeline efficiency when working with these large files,” notes Christy Anzelmo, Foundry’s Director of Product, Compositing and Finishing. “Recently, we’ve optimized Nuke’s handling of various EXR compression types, especially those performing over a network, resulting in faster rendering and more interactive performance in Nuke.” Anzelmo says Nuke Studio and Hiero now have a rebuilt playback engine which is optimized for playing back EXRs that are multichannel and color-managed. “We have seen more camera footage available for compositors in post, as well as compositing happening on set,” adds Anzelmo, “and have been adding support for more camera formats across the Nuke family.” Meanwhile, Foundry’s Mari is regularly used to create textures that support 4K production, while Katana has maintained its ability to render large files. “We’re seeing an uptake of Mari,” outlines Jordan Thistlewood, Director of Product – Pre-production, LookDev & Lighting at Foundry. “It’s really all driven by the increase in resolution creating a higher demand on the requirements of what’s fed into the 3D rendering process in order to support that resolution.” Thistlewood acknowledges that making and rendering all of this higher-resolution imagery can introduce a higher cost factor. “Katana can offset that,” he says, “through its sequence-based lighting workflows, where you can work on one shot and then replicate this across subsequent shots. So people are starting to use tools and workflows like that to offset the cost of rendering, because then they can make more choices upfront before they hit the final ‘make it 4K’ button.”
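On the compositing side, the EXR housekeeping Anzelmo mentions is usually scripted when hundreds of 4K shots are involved. The snippet below is a hedged sketch of a Nuke Python render setup; the node and knob names (file_type, compression, datatype) reflect common Nuke usage but should be checked against your installed version, and the file paths and values are placeholders.

# Hedged sketch: batch-convert a 4K plate sequence to half-float EXRs with DWAA
# compression to ease network playback. Knob names/values per typical Nuke builds.
import nuke

read = nuke.nodes.Read(file="/plates/show/sh010_plate.%04d.exr", first=1001, last=1100)

write = nuke.nodes.Write(file="/comp/show/sh010_precomp.%04d.exr", file_type="exr")
write.setInput(0, read)
write["compression"].setValue("DWAA")      # lossy but compact; use Zip for lossless
write["datatype"].setValue("16 bit half")  # matches a half-float delivery spec

nuke.execute(write, 1001, 1100)            # render the frame range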

TOP LEFT: Jordan Thistlewood, Director of Product Pre-production, LookDev & Lighting, Foundry. TOP RIGHT: Christy Anzelmo, Director of Product, Compositing and Finishing, Foundry. MIDDLE: A screenshot from the latest release of Foundry’s Nuke 12.1. (Image courtesy of Foundry) BOTTOM: Katana, from Foundry, is used for managing lighting and rendering. (Image courtesy of Foundry)





FILM

PHOSPHENE PULLS OUT ALL STOPS FOR MOTHERLESS BROOKLYN By IAN FAILES

Visual effects artists are regularly called upon to craft exotic landscapes, alien worlds and – sometimes – real-life locations. That was the case for Phosphene VFX, which re-created the original Penn Station in New York for Edward Norton’s 1950s crime noir Motherless Brooklyn. This work, which involved an almost completely computer-generated rendition of the iconic station, would ultimately result in a VES Award nomination for the Phosphene team (John Bair, Vance Miller, Sebastian Romero and Steve Sullivan) for Outstanding Created Environment in a Photoreal Feature.

All images copyright © 2019 Warner Bros. Entertainment TOP: Motherless Brooklyn cinematographer Dick Pope, left, with director Edward Norton. OPPOSITE TOP TO BOTTOM: A greenscreen portion of the stage used for the Penn Station scenes. The final shot, with the CG station environment crafted by Phosphene VFX. Partial props and set pieces were used during filming. The final shot, which includes signature shafts and pools of light – and retained the real pigeon.

PLANNING PENN’S PRODUCTION

Production Visual Effects Supervisor Mark Russell first oversaw the making of an animatic for the Penn Station scenes of Edward Norton’s character walking around as part of an investigation. “From that point, we found some good angles that we knew we’d want to use on the day of shooting,” relate the film’s CG Supervisor Vance Miller and Phosphene’s VFX Supervisor John Bair. “We wanted to have a little bit of freedom with moving the camera around, so we did real-time pre-visualization, meaning that we built a very rough model of Penn Station that could be dropped into the background on the set.” The set itself was a partial build on a stage fitted with a significant amount of greenscreen, with Phosphene ultimately collaborating with production designer Beth Mickle on a historically accurate design that matched the unique features of the station, while also tapping into Norton’s desire to show a ‘romanticized’ version of the famous building.

For the shoot, almost the entire floor of the station was staged in one pass, note Miller and Bair. “There were a few things that were built on set, like lockers, benches and a couple of stands where people could gather and mill around, but everything else, from the ground up to all the walls around it, was CG.

“The production team provided us with the floor plan and we based all the proportions on that,” add Miller and Bair. “In addition to the core structure, we put a lot of time and effort into the shooting day, planning movement and lighting on the people. We knew where the giant windows were going to be, and what time of the day we wanted the scene to be in. With that information, we then determined how the light entered the room and built the CG model around that, so shots of people going in and out of pools of light would match the CG.” Ncam’s camera tracking toolset was utilized on set, allowing Phosphene to drop in its rudimentary CG model behind actors set on greenscreen. The stage, which was around half the size of Penn Station, allowed for the filming of Norton and several extras. Shooting extras was a major co-ordination exercise, with the final shots also including extras filmed in separate element shoots, and some digital doubles.

BUILDING A STATION

Phosphene re-created Penn Station mostly as one large asset – sometimes it was changed slightly per camera angle. Historical photography was a crucial reference source for the build. “We were really drawn to pictures of the station when it was on the verge of opening,” advise Miller and Bair. “It was in pristine condition without a huge crowd in there, so we were able to see what the floor looked like, what the walls looked like and how it looked in a perfect state. We started from there and then added a little bit of weathering and wear and tear.” Large, separated shafts and beams of light filling up sections of the station floor were one of the signature elements added to the shots. By planning in advance the time of day they wanted the scene to be shot, Miller and Bair reveal, “we knew which direction we were going to be facing and what side to have the sun coming from. We had at least three large pools of light hitting the floor. They provided kind of generic cut-out shadows that weren’t perfect, but there were pools of light that the actors would walk in and out of.” Once the visual effects team placed the architecture and 3D model around the actors in post, they could match the sun to make sure that those pools of light worked. Atmosphere and dust in the ‘air’ of the station were also added to provide an authentic feel to the shots. While much of Penn Station became a digital creation, one element in particular remained very real: a pigeon seen frolicking on the station floor. Still, it did require some animal expertise, as Miller and Bair observe. “There were two takes of the pigeon, and it was not behaving in terms of performing and flying in and hitting its mark. They brought in an animal wrangler, but animals are still unpredictable. Thankfully, it wasn’t the main thrust of the whole scene, but it held up well for what it needed to do.”
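Miller and Bair’s point about choosing the time of day so the CG sun matches the on-set pools of light is essentially a solar-position problem. The sketch below uses a standard rough approximation of solar declination and hour angle to estimate the sun’s elevation and azimuth for a given date, hour and latitude; it is illustrative only, with New York’s latitude assumed, and a production would rely on its lighting or matchmove tools rather than this formula.

# Rough solar elevation/azimuth estimate (illustrative approximation, not production code).
import math

def sun_position(day_of_year, solar_hour, latitude_deg):
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))  # declination, degrees
    hour_angle = 15.0 * (solar_hour - 12.0)                                      # degrees from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha))
    azimuth = math.atan2(-math.sin(ha), math.tan(dec) * math.cos(lat) - math.sin(lat) * math.cos(ha))
    return math.degrees(elevation), (math.degrees(azimuth) + 360.0) % 360.0

# Mid-morning in late May at New York's latitude (assumed values).
elev, az = sun_position(day_of_year=145, solar_hour=10.0, latitude_deg=40.75)
print(f"Sun elevation ~{elev:.0f} degrees, azimuth ~{az:.0f} degrees from north")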




ANIMATION

FRACTURING REALITY IN UNDONE By TREVOR HOGG

Images courtesy of Amazon Studios. TOP: Director Hisko Hulsing. (Photo: Jaroslav Repta)



A prevailing question in Amazon Studios’ first original animated series Undone is whether Alma Winograd-Diaz (Rosa Salazar) has the ability to travel through space and time to prevent the untimely death of her father Jacob (Bob Odenkirk), or if she is suffering from mental illness. In order to seamlessly integrate the fractures in reality, creators Kate Purdy and Raphael Bob-Waksberg (BoJack Horseman) collaborated with director Hisko Hulsing (Montage of Heck) to produce a hybrid style that combines live-action performances, oil-painted backgrounds, rotoscoped characters and 3D animation. “I used comparable techniques for Montage of Heck and Junkyard,” states Hulsing, who recruited animation artists from Submarine in Amsterdam and Minnow Mountain in Austin, and a live-action crew from Lightbox in Los Angeles. “We incorporated all of those elements using projection mapping to make sure that everything seems to come from the same universe.”

While serving as a jury member at the GLAS Animation Festival in Berkeley, California, Hulsing was approached by Noel Bright and Steven Cohen from The Tornante Company about the project and subsequently flew to Los Angeles to meet with Purdy and Bob-Waksberg. “The scripts were sophisticated with a lot of depth, and the dialog was adult and realistic. I realized that the dialog was too subtle to hand off to animators and then have the actors do their acting. Most animation nowadays is family entertainment, and the majority of animated acting is physical. I proposed rotoscoping so we could get all of the micro expressions of the actors. In the beginning there was a discussion about treating the hallucinations, dreams and flashbacks in a different way than the realistic scenes. I pitched the idea that it should all look the same, as it would enable the audience to go on a trip with Alma and have them as confused as her about what is real and what is not real.”

A critical aspect was being able to convey Alma’s state of mind without trivializing mental illness. “It was my biggest challenge,” admits Hulsing. “In my opinion the whole emotional core of the series is the possibility that Alma is mentally ill. But at the same time, the other perspective of her mental state is that she is actually capable of bending the rules of the universe and traveling back in time. In the series it remains ambiguous as to which of those truths is the real one. Personally, I had a psychosis when I was 17, so it was important to take the audience on a journey with Alma so they could experience how it was to be mentally unstable.”

Even though he has a career spanning 20 years in animation, Hulsing draws inspiration from live-action cinema. “My favorite film of all time is The Tenant by Roman Polanski, which I’ve seen over 25 times and is about somebody losing his mind.” For Hulsing, it was important that when there are fractures in reality, they did not distract from the storytelling. “One of our biggest concerns at the beginning with this technique was that we would get into the uncanny valley, where the reality is hidden behind a filter,” he remarks. “I wanted to make sure that people would get used to the visual language and be completely transported by the story.”

The fractures in reality were precisely storyboarded with a team of visual effects designers. “We didn’t have much of a budget or stage, so we always used smart solutions to film it in a way that can be used for the animation. For instance, if Alma is floating in space we would put Rosa on a stool and rotate her, and that would be enough because we had already envisioned how it would be transported to animation.”

TOP: Rosa Salazar portrays Alma Winograd-Diaz in Amazon Studios’ first original animated series Undone. BOTTOM: Bob Odenkirk portrays Jacob Winograd, the deceased father of Alma who appears to her in visions.





Around 3,000 shots were storyboarded for the entire first season, which consisted of eight episodes, each with a run time of 22 minutes. “The right angle, perspective, focal length of the lens, and the lighting were described in the storyboards,” states Hulsing. “Not only did we storyboard every single shot, but we also designed the floor plans for the sets before filming began. We had a stage but not an actual set. What we did was tape all of the measurements on the ground, and there were some props, like tables. The actors had to trust me if I told them that they’re in a church or that the cross on the wall is a pyramid. I went back and forth about seven times between Amsterdam and Los Angeles to direct the cast on set with Kate and Raphael.” The live-action footage was edited to the exact length of the episode and served as the basis of the animation. “In Undone only the actors are traced because there is no set.”

Backgrounds were created utilizing the oil painting techniques of 17th century Dutch painters. “The reason why I started doing that is because I like the old Disney films, such as Bambi,” explains Hulsing. “I enjoy that warm artistic look in contrast with modern animation films being made in Hollywood, which are good entertainment but feel sterile because they’re from a computer. For Undone, we recruited a whole team of classically trained painters who then had to be trained to paint in the same style.” The lighting schemes were also influenced by famous 17th century Dutch painters. “Often we used dramatic lighting. For instance, a Rembrandt has a strong light source with a lot of dark spots. The other thing is that the series is situated in San Antonio, Texas, where I have never been. However, Kate grew up there and compiled references of places that exist in the city. The art designers used those photos loosely to create a whole world that was more dream-like and beautifully painted.”

Projection mapping allowed the characters to move through the 2D backgrounds as if they were in a 3D space. “We didn’t have many moving cameras but we would motion track them,” notes Hulsing. “The space was completely built in 3D and treated the various props as separate elements. For instance, if there was a couch in a room, certain aspects of it would be painted and projected into 3D. That way we could not only make strange special effects where an oil-painted room would suddenly collapse or blow up but still look painted. Also, if there was a location that would come back frequently, like Alma’s house, we could completely projection map it with oil paintings and use different angles in that space as backgrounds and it would still look painted.”

Looking after the cinematography was Nick Ferreiro (Truth or Dare). “If I wanted to communicate to the audience that we’re seeing a vision from Alma, I often used a wide lens, like 50mm, to exaggerate the point of view. Normal dialog scenes were filmed with 50mm or 70mm lenses.”

Compositing was instrumental in combining the different elements into a unified aesthetic. “Not only did we have 3D and 2D animation, rotoscoping and oil paintings, but our group of painters each had a slightly different style, so the compositing turned out to be important for color grading as well as merging those various techniques into one believable world,” states Hulsing. “The characters also feel painterly as they consist of the same materials that are in the backgrounds. There were a lot of methods that the compositors used, like adding light effects and contrast, and grading all of the colors.”

The most stressful aspect was the nine-hour time difference between Amsterdam and Los Angeles. “We had Skype and conference calls,” remarks Hulsing. “The only way we could get this done was through the online platform Shotgun. Everyone was connected to Shotgun, which had every single shot, storyboard and design. It meant that at the end of every day I would get a playlist with 100 rotoscoped shots done in Texas that I would have to review, approve or give notes on.”

Amazon Studios took a major risk with Undone, which is currently in production on a second season. “It was fantastic that they gave us a chance and Undone turned out to be successful. The hard thing was that we didn’t have any pre-production time. I always use the analogy of driving a car while you’re still building it. We had to train the painters while we were already in production. It was a tight schedule. We had to do three hours of high-end animation in one and a half years. It’s easier [now though]. We’re working on the second season as the painters on Season 1 were thoroughly trained to work in this style.”

Co-existing with the visuals is the score by composer Amie Doherty (Legion). “The series has a lot of dream-like psychedelic things, and Amie was able to make that more accessible by creating orchestral psychedelic sounds,” states Hulsing. “There is one scene that I’m extremely proud of, and that’s in Episode 4, where we’re transported through all kinds of happenings in the past of Alma, but also in the past of her ancestors. It was done like one continuous shot through different times, like the 1930s in Europe. Everyone was amazed when the shot was finished because it was so fluid and strange. Things are moving without you noticing them. It starts off with Alma floating in space with her father, who is explaining how she can deal with her emotions, and then she is transported to all of these past memories sewn together seamlessly.”
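The Shotgun review loop Hulsing describes, a daily playlist of rotoscoped shots waiting for approval or notes, is typically automated through the platform’s Python API. The following is a small sketch using the shotgun_api3 library; the site URL, script credentials, project name and status value are placeholders, and the exact fields depend on how a given studio has configured its site.

# Sketch: pull pending-review Versions from Shotgun (placeholder credentials and fields).
import shotgun_api3

sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                          script_name="review_bot", api_key="XXXX")  # placeholders

project = sg.find_one("Project", [["name", "is", "Undone"]])
pending = sg.find(
    "Version",
    [["project", "is", project], ["sg_status_list", "is", "rev"]],  # 'rev' = pending review (assumed)
    ["code", "user", "sg_uploaded_movie"],
)

for version in pending:
    print(f"Review needed: {version['code']}")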


“The reason why I started [utilizing the oil painting techniques of 17th century Dutch painters] is because I like the old Disney films, such as Bambi. I enjoy that warm artistic look in contrast with modern animation films being made in Hollywood, which are good entertainment but feel sterile because they’re from a computer. For Undone, we recruited a whole team of classically trained painters who then had to be trained to paint in the same style.”
—Hisko Hulsing, Director

OPPOSITE TOP TO BOTTOM: Undone attempts to unravel the question of Alma Winograd-Diaz’s sanity: Can she actually travel through space and time to save her father, or is she suffering from mental illness? Rosa Salazar was placed on a stool and rotated to capture a sense of floating in space for the animators. A team of visual effects designers storyboarded the details of the fractures in reality. The artistry of 17th century Dutch painters inspired the warm, classic look of the oil-painted backgrounds. TOP TO BOTTOM: Art designers referenced photos of places in San Antonio, Texas to create an ethereal, painted world. Undone blends live-action footage, oil paintings, rotoscoping and 2D and 3D animation to create an aesthetic that is surreal but grounded by the performances of the cast. Sam (Siddharth Dhananjay) and Alma travel through 2D environments as if they were in 3D, enabled by projection mapping.


5/13/20 6:13 PM


ANIMATION

season, which consisted of eight episodes with each having a run time of 22 minutes. “The right angle, perspective, focal length of the lens, and the lighting were described in the storyboards,” states Hulsing. “Not only did we storyboard every single shot, but we also designed the floor plans for the sets before filming began. We had a stage but not an actual set. What we did was tape all of the measurements on the ground and there were some props, like tables. The actors had to trust me if I told them that they’re in a church or that the cross on the wall is a pyramid. I went back and forth about seven times between Amsterdam and Los Angeles to direct the cast on set with Kate and Raphael.” The live-action footage was edited to the exact length of the episode and served as the basis of the animation. “In Undone only the actors are traced because there is no set.” Backgrounds were created utilizing the oil painting techniques of 17th century Dutch painters. “The reason why I started doing that is because I like the old Disney films, such as Bambi,” explains Hulsing. “I enjoy that warm artistic look in contrast with modern animation films being made in Hollywood, which are good entertainment but feel sterile because they’re from a computer. “For Undone,” he continues, “we recruited a whole team of classically trained painters who then had to be trained to paint in the same style.” The lighting schemes were also influenced by famous 17th century Dutch painters. “Often we used dramatic lighting. For instance, a Rembrandt has a strong light source with a lot of dark spots. The other thing is that the series is situated in San Antonio, Texas, where I have never been. However, Kate grew up there and compiled references of places that exist in the city. The art designers used those photos loosely to create a whole world that was more dream-like and beautifully painted.” Projection mapping allowed the characters to move through the 2D backgrounds as if they were in a 3D space. We didn’t have many moving cameras but we would motion track them,” notes Hulsing. “The space was completely built in 3D and treated the various props as separate elements. For instance, if there was a couch in a room, certain aspects of it would be painted and projected into 3D. That way we could not only make strange special effects where an oil-painted room would suddenly collapse or blow up but still look painted. Also, if there was a location that would come back frequently, like Alma’s house, we could completely projection map it with oil paintings and use different angles in that space as backgrounds and it would still look painted.” Looking after the cinematography was Nick Ferreiro (Truth or Dare). “If I wanted to communicate to the audience that we’re seeing a vision from Alma I often used a wide lens, like 50mm, to exaggerate the point of view. Normal dialog scenes were filmed with 50mm or 70mm lenses.” Compositing was instrumental in combining the different elements into a unified aesthetic. “Not only did we have 3D and 2D animation, rotoscoping and oil paintings, but our group of painters each had a slightly different style, so the compositing

76 • VFXVOICE.COM SUMMER 2020

PG 74-77 UNDONE.indd 76-77

“The reason why I started [utilizing the oil painting techniques of 17th century Dutch painters] is because I like the old Disney films, such as Bambi. I enjoy that warm artistic look in contrast with modern animation films being made in Hollywood, which are good entertainment but feel sterile because they’re from a computer. For Undone, we recruited a whole team of classically trained painters who then had to be trained to paint in the same style.” —Hisko Hulsing, Director turned out to be important for color grading as well as merging those various techniques into one believable world,” states Hulsing. “The characters also feel painterly as they consist of the same materials that are in the backgrounds. There were a lot of methods that the compositors used, like adding light effects and contrast, and grading all of the colors.” The most stressful aspect was the nine-hour time difference between Amsterdam and Los Angeles. “We had Skype and conference calls,” remarks Hulsing. “The only way we could get this done was through the online platform Shotgun. Everyone was connected to Shotgun, which had every single shot, storyboard and design. It meant that the end of every day I would get a playlist with a 100 rotoscoped shots done in Texas that I will have to review, approve or give notes.” Amazon Studios took a major risk with Undone, which is currently in production on a second season. “It was fantastic that they gave us a chance and Undone turned out to be successful. The hard thing was that we didn’t have any pre-production time. I always use the analogy of driving a car while you’re still building it. We had to train the painters while we were already in production. It was a tight schedule. We had to do three hours of high-end animation in one and a half years. It’s easier [now though]. We’re working on the second season as the painters on Season 1 were thoroughly trained to work in this style.” Co-existing with the visuals is the score by composer Amie Doherty (Legion). “The series has a lot of dream-like psychedelic things, and Amie was able to make that more accessible by creating orchestral psychedelic sounds,” states Hulsing. “There is one scene that I’m extremely proud of, and that’s in Episode 4 where we’re transported through all kinds of happenings in the past of Alma, but also in the past of her ancestors. It was done like one continuous shot through different times like the 1930s in Europe. Everyone was amazed when the shot was finished because it was so fluid and strange. Things are moving without you noticing them. It starts off with Alma floating in space with her father, who is explaining how she can deal with her emotions, and then she is transported to all of these past memories sewn together seamlessly.”

OPPOSITE TOP TO BOTTOM: Undone attempts to unravel the question of Alma Winograd-Diaz’s sanity: Can she actually travel through space and time to save her father, or is she suffering from mental illness? Rosa Salazar was placed on a stool and rotated to capture a sense of floating in space for the animators. A team of visual effects designers storyboarded the details of the fractures in reality. The artistry of 17th century Dutch painters inspired the warm, classic look of the oil-painted backgrounds. TOP TO BOTTOM: Art designers referenced photos of places in San Antonio, Texas to create an ethereal, painted world. Undone blends live-action footage, oil paintings, rotoscoping and 2D and 3D animation to create an aesthetic that is surreal but grounded by the performances of the cast. Sam (Siddharth Dhananjay) and Alma travel through 2D environments as if they were in 3D, enabled by projection mapping.



TV/STREAMING

“The CG model is massive. We knew there was going to be an incredible amount of interaction with the asset. So during prep, we actually enlisted Important Looking Pirates to rebuild the asset from scratch. We needed to be able to put it in any camera position, any lighting environment. And then, ultimately, we had to break it up.” —Terron Pratt, Visual Effects Producer

LOST IN SPACE, SEASON 2: FROM SCRIPT TO SCENE By IAN FAILES

One of the main functions of the production visual effects team on any television series is, of course, to work out how to translate the script into the final imagery. The core VFX team on Season 2 of Netflix’s Lost in Space – Visual Effects Supervisor Jabbar Raisani, Visual Effects Producer Terron Pratt and additional Visual Effects Supervisor Marion Spates – had to do exactly that, but they also found themselves much more embedded inside the creative effort on the show this season. VFX Voice asked Raisani and Pratt to break down a number of key sequences from this second season and how creative decisions were made to help tell the story of the Robinson family as they continue their adventures aboard the spaceship Jupiter 2 and on alien worlds.

A NEW SEASON OF EFFECTS

All images copyright © 2019 Netflix. TOP: The Robinsons prepare their ship for sailing, having been isolated on a thin strip of land. OPPOSITE TOP: The Jupiter 2 sets sail, part of a plan to recharge the ship’s batteries. OPPOSITE BOTTOM: The Resolute was a more significant CG asset this time around.


Getting involved earlier in production was a key desire for both Raisani and Pratt on Season 2. “We really strived to be more part of the process early on, where concepts were being generated and the visuals were being designed,” notes Raisani. “And we were an integral part of that design in the end.”

“For example,” says Raisani, “the top deck of the Jupiter 2 became something that involved visual effects and production really working tightly together. We provided a model of the Jupiter 2 asset, and it was realized that the slope of the top of the set was too dangerous for actors or even stunts to work on, which we needed for the ocean shots. So we worked together to find an angle that we felt was true to the visuals that had been established in Season 1 but was safe to work on. Then we figured out exactly where digital effects would take over on the build. Production built only the top portion of the ship and only about three quarters of the actual top deck. The rest is all in digital effects, with full CG versions of the entire asset as well.”

A central team of visual effects studios worked on Season 2, led by Important Looking Pirates, Mackevision and Image Engine, with several other vendors contributing. “On Season 1 our show grew, so we kept taking on new vendors to help with that growth,” attests Pratt. “Our plan for Season 2 was to try and reduce that to a smaller number and more of a core team.”



ABOARD THE RESOLUTE

The Resolute is the mothership spacecraft appearing in the Lost in Space series. While it had been seen in Season 1, this new season featured the station in a much larger way. Plus, it gets destroyed. “It’s huge,” comments Pratt. “The CG model is massive. We knew there was going to be an incredible amount of interaction with the asset. So during prep, we actually enlisted Important Looking Pirates to rebuild the asset from scratch. We needed to be able to put it in any camera position, any lighting environment. And then, ultimately, we had to break it up.”

The interior of the Resolute was also an important consideration. “The pod bay has to be one of my favorite sets,” outlines Raisani. “And that doesn’t exist at all, practically. It was just a doorway. That was another example of production and VFX really figuring out the best strategy. All that we built was the interior of the pod and everything else was Important Looking Pirates.”

ON THE WATER

The Jupiter 2 ocean shots were one of the more significant aspects of the VFX effort. So much so, in fact, that VFX actually began concepts even before scripts had been written. Mackevision and Important Looking Pirates delivered a number of visuals that were used, says Raisani, “to help inspire the writers and then help us all understand what that would actually look like. There was also a ton of internal previs done, overseen by our Previs Supervisor, Dirk Valk.”

Adds Pratt: “Our additional Visual Effects Supervisor, Marion Spates, also took some of the concepts and previs and immediately started developing a water simulation, dropping our ship model from Season 1 into it to determine what kind of waves it could actually deal with in a storm, how is this going to function, how is it going to cut through the waves, those kinds of things.”

Ultimately, scenes featuring the spaceship having deployed sails and navigating the currents (plus the storm and a waterfall) were shared between Mackevision, Important Looking Pirates and Digital Domain.
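As a loose illustration of the kind of early water test Pratt describes, a simple sum-of-sines swell field (in the spirit of Gerstner or FFT ocean models) is often enough to judge how steep a sea state a hull can plausibly ride before committing to full fluid simulation. The sketch below is a generic, hypothetical example, not the show’s setup; the wave values are made up.

```python
# Generic sum-of-sines swell sketch for roughing in ocean motion (illustrative only,
# not the production simulation). Sampling the surface under a hull position gives
# a quick read on how much heave a ship asset would have to survive in a storm.
import numpy as np

def swell_height(x, z, t, waves):
    """Sum of sine waves; each wave is (amplitude_m, wavelength_m, direction_deg, speed_mps)."""
    y = 0.0
    for amp, length, direction, speed in waves:
        k = 2.0 * np.pi / length                      # wavenumber
        d = np.array([np.cos(np.radians(direction)),
                      np.sin(np.radians(direction))])
        phase = k * (d[0] * x + d[1] * z) - speed * k * t
        y += amp * np.cos(phase)
    return y

# A storm-ish spectrum: a couple of long swells plus short chop (made-up values).
storm = [(2.5, 60.0, 10.0, 8.0), (1.2, 25.0, 40.0, 6.0), (0.3, 6.0, 75.0, 3.0)]

# Sample the water height under the hull over ten seconds to judge pitch and heave.
for t in np.linspace(0.0, 10.0, 6):
    print(f"t={t:4.1f}s  height={swell_height(0.0, 0.0, t, storm):+.2f} m")
```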


MEET THE SPACE WHALE

At one point, John and Maureen Robinson observe a giant creature – dubbed the ‘space whale’ – as they traverse a gas planet. Important Looking Pirates and Mackevision entered a design phase, with El Ranchito completing the final VFX shots. “We really needed this character to give us a sense of wonder,” states Raisani. “We started with conceptualization and really figuring out, who is this creature within this world? We thought of it as the earthworm of the gas planet. They are ingesting one type of gas and excreting another kind of gas. So we have this very ethereal type creature. There was a lot of back and forth to find something that felt large but didn’t feel ominous, because we really wanted to maintain that sense of wonder, not a sense of fear.”

Designing a Lost in Space Dinosaur

In Episodes 5 and 6 of Season 2 of Lost in Space, the Robinson family is on a rocky alien planet where they encounter a series of dinosaur-like creatures. Known as ‘Vivs,’ the vivacious – and arm-less – monsters were CG Image Engine creations.

In terms of design, Production Visual Effects Supervisor Jabbar Raisani says showrunner Zack Estrin wanted to craft something more alien than had been seen on the show before, while being grounded in a creature that the audience could recognize. “We try to ground everything in something that you’ve seen on Earth, usually,” acknowledges Raisani. “So that’s the furthest we’ve pushed away from an Earth-like creature. We started with looking at raptors and dinosaurs, and then mixed that with dolphins and other underwater life.”

The Vivs inhabit an ochre rock formation, which informed one of the creatures’ design features, specific barbs on their tails that they would use to navigate the rocks and cliffs. “We really tried to think deeply about design thinking when it comes to creatures,” states Raisani. “We say, ‘What does this creature do within this world?’ And then we design everything based on its function within that alien planet.

“We also wanted them to feel bird-like,” says Raisani, who adds that ostriches were an inspiration here. “Ostriches have this thing where they breathe in through their mouth or nose, but then they exhale through the back of their head or neck. We really leaned into that to try to give him something that felt like a creature that lived in a hot environment which was made for running.”

Another design element added into the Vivs was in terms of movement. It was imagined they could almost perform parkour steps around the cliffs, “which gave us an opportunity to really bounce around and use that tail to help push them off of the rocks,” notes Visual Effects Producer Terron Pratt. “Then on set we had our stunt actors running around, where they held their arms behind their back. At that point we had already determined that the Vivs would not have any upper arms. That helped everybody understand what we were eventually going to get with the final creatures.”

THEY FIGHT!

Season 2’s final episode includes a dramatic showdown between the robot known as Scarecrow and an invading legion of robots on board the Resolute. The massive sequence was, of course, a major visual effects moment. “In the script,” relates Raisani, “it literally read ‘Thirty seconds of robot awesomeness.’ And that was the only description of the action! I went over to Image Engine and I let them read that line, and then we talked about how we wanted to approach it.”

OPPOSITE TOP TO BOTTOM: Jupiter 2 sits precariously atop a waterfall. Important Looking Pirates handled the fluid simulations for this sequence. For the VFX team, crafting the space whale involved thinking heavily about what role it played on the gas planet. A practical Spectral Motion suit was used for many robot shots. (Photo: Eike Schroter) TOP: A final shot of the Vivs as they attack a campsite.


TOP TO BOTTOM: A scene from Episode 8 of Season 2, directed by Visual Effects Supervisor Jabbar Raisani. A majority of the exterior environments in the series were completely synthetic creations. The Jupiter Transport makes a rescue as the Resolute explodes.

Raisani says reference for the fight centered on action films that featured long takes. “We looked at things that had longer shots that covered a lot of different beats. Image Engine was so good – it was more like directing actors than it was directing visual effects with them. They were just so in tune with how the robots should perform.”

Given the complexity of the sequence, there was some initial doubt as to whether it could remain in the show. “When it came into post-production, that scene was really on the chopping block because it was so expensive,” admits Raisani. “I really fought hard for it, not only for the effects of the sequence, but as a big fan of sci-fi and robots, I really felt that our fans wanted and needed this moment in the show. So I pitched it to showrunner Zack Estrin as, ‘We’re not going to have a fight for the sake of having a fight. We’re going to tell a big story within a fight.’”

Pratt suggests that the completion of the sequence, while daunting, was actually made possible by the large amount of autonomy the VFX team was given. “It was pretty much a clean slate. Zack has always been on board and very supportive of our department, allowing us to engage with vendors and really bring them into the process and let them create and let them come up with ideas that they think would be exciting.”

Indeed, the visual effects side of the production had gained so much traction with the show that Raisani was given the opportunity to direct Episode 8, ‘Unknown.’ He had already directed second unit for the previous season and this new one, which gave the VFX supervisor further familiarity with the cast and crew and aided in helming the episode. “Plus,” he says, “the benefit of having such a heavy VFX background is that it allowed me to not think about that stuff – the VFX. I also had an amazing team of people that I could rely on to think about those things for me in detail so I could really focus on the actors and their performances and the camera.”



VR/AR/MR TRENDS

CLOUDHEAD GAMES: VR MULTI-TASKER AND STORYTELLER By CHRIS McGOWAN

Images copyright © Cloudhead Games. TOP: Denny Unger, Co-founder and CEO, Cloudhead Games


If you’ve teleported inside a VR experience or used snap-turns to rotate, then you’ve already experienced some of Cloudhead Games’ contributions to the modern VR industry, according to Co-founder and CEO Denny Unger. In addition, the Canada-based studio’s pioneering efforts in VR hand interactions caught the attention of Valve, which approached them to create a demo for the Valve Index hand controllers. And Cloudhead has been a pioneer in using VR performance capture, employing it for its 2016 VR adventure-fantasy The Gallery: Call of the Starseed, which won multiple awards. Its 2019 first-person-shooter VR game Pistol Whip was similarly much lauded. Cloudhead Games is nothing if not a VR multi-tasker.

Unger’s first experience of virtual reality came in 1992 on a system called Virtuality, playing a title called Dactyl Nightmare. “The VR was high-latency, low FOV, low frame rate, and made me sick as hell, but from that point forward I knew it was the future,” recalls Unger. “It just took a lot longer to enter the mainstream than I thought. From there, I became a life-long VR enthusiast and garage hacker, building collimated displays, flight simulator cabinets and crude VR headsets.”

Unger met Palmer Luckey at a VR enthusiasts’ forum called MTBS in 2009. Luckey was the designer of the Oculus Rift HMD, which many credit with reviving the virtual reality industry, and the founder of Oculus VR, later sold to Facebook. At MTBS, Unger’s fellow VR hackers would share their latest ideas about building new headsets. “It was very grassroots,” says Unger. “I even designed the first logo for the Oculus. Palmer came up with the prototypical Oculus, which at that time was just a package of off-the-shelf parts and cell phone tech that a handful of us were hoping to build in our shops. But it was clear even then that he had cracked the formula for an affordable, wide FOV, low-latency VR headset that actually lived up to the promise of what VR should be.

“When Palmer began a collaboration with John Carmack, it was a clear signal to a few of us that it was time to get ahead of the VR curve,” continues Unger. Carmack was famous in the video game industry as the Co-founder of id Software (creator of the Wolfenstein, Doom and Quake games). He became the CTO of Oculus in 2013 (he has since stepped down to be its Consulting CTO).

In 2012, Unger founded Cloudhead Games. “I reinvested money from a prior business, pulled in some trusted co-workers and we got to work, building the first VR adventure series inspired by cinema, The Gallery: Call of the Starseed, and its sequel Heart of the Emberstone.” Along the way, Unger realized that he also needed to contribute to the tech side of VR. “There were no Google searches for VR best practices in those days, so we quickly became an R&D studio out of necessity.”

“A lot of what we did early on was figure out how to keep people from feeling nauseous,” explains Unger. Moving users around virtual spaces affected the human vestibular system – the sensory system that provides the brain with information about motion, head position, balance and spatial orientation. “In the most basic terms, the golden rule in VR is to have movement be user-driven whenever possible, with very few exceptions. Artificial motion and visual velocities not driven by the user [vection] cause a disconnect between what a user sees and what their inner ear feels. That contradiction in signals causes discomfort or worse. So it turns out the less you try to control a user’s actions and let them move freely, the better off they are.

“Snap-turns, rapid or instant translation to a new orientation, can shortcut artificial rotational velocities much in the same way a ballerina spots a stationary object in the distance to avoid dizziness during a spin,” he continues.
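The snap-turn idea Unger describes reduces to applying the full rotation in a single frame instead of animating it. Below is a minimal, engine-agnostic sketch assuming a hypothetical player rig and per-frame update hook; it is an illustration of the concept, not Cloudhead’s code.

```python
# Minimal, engine-agnostic sketch of a comfort "snap turn" (illustrative only).
# Rather than rotating the view smoothly (which produces vection and discomfort),
# the whole rig's yaw jumps by a fixed increment in a single frame.
SNAP_ANGLE = 30.0      # degrees per snap
DEADZONE = 0.7         # thumbstick threshold before a snap registers

class PlayerRig:
    def __init__(self):
        self.yaw = 0.0            # degrees
        self._stick_armed = True  # require the stick to recenter between snaps

    def update_turn(self, stick_x: float) -> None:
        """Call once per frame with the horizontal thumbstick axis in [-1, 1]."""
        if self._stick_armed and abs(stick_x) > DEADZONE:
            self.yaw = (self.yaw + SNAP_ANGLE * (1 if stick_x > 0 else -1)) % 360.0
            self._stick_armed = False          # one snap per stick flick
        elif abs(stick_x) < 0.2:
            self._stick_armed = True           # re-arm once the stick recenters

rig = PlayerRig()
for x in (0.0, 0.9, 0.9, 0.1, -0.9):   # flick right, hold, recenter, flick left
    rig.update_turn(x)
    print(f"stick={x:+.1f}  yaw={rig.yaw:.0f} deg")
```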

TOP: The Gallery: Heart of the Emberstone is the sequel to Cloudhead’s The Gallery: Call of the Starseed. BOTTOM: From The Gallery: Heart of the Emberstone.


TOP: From The Gallery: Heart of the Emberstone. MIDDLE: Cloudhead pioneered the use of VR performance capture for its 2016 VR adventure-fantasy The Gallery: Call of the Starseed. BOTTOM: From The Gallery: Call of the Starseed, the first title in the Starseed trilogy.


“And teleportation – we called it ‘blink’ at the time – is a way to move a user’s entire physical volume to a new location without causing artificial forward or strafing [sideways] velocities.”

Certain that VR would utilize motion control to interact with the virtual world, Cloudhead used a system called the Razer Hydra, developed by Sixense and originally made for 2D games, and adapted it for VR. “Using that as a surrogate we were able to develop a deep set of tools for hand and object interactions. Exploring what feels good and how to trick the brain into feeling weight that didn’t exist was in and of itself a monstrous task.” In 2019, Cloudhead released Aperture Hand Lab, a five-minute demo for the Valve Index’s hand, knuckle and finger tracking technology.

When it came time to capture actor performances for The Gallery, Unger recognized that they had a rare opportunity to invert the traditional process. Instead of the director having a small window into the virtual world, with the actors using their imaginations to fill in the gaps on a greenscreen stage, Cloudhead put its actors into VR while it captured their performance data, using Noitom’s Perception Neuron full-body motion capture system. “They could see and interact with the virtual world we had created, hit their marks, read virtual teleprompters, all with no imagination required,” comments Unger. “The performances we captured in this way felt more genuine and grounded.”

Call of the Starseed, the first title of The Gallery trilogy, was released in 2016. “When thinking about the best consumer entry point into virtual reality, it always came back to finding a cinematic language in the medium,” says Unger. “But this wasn’t a passive experience, this was interacting directly with a living narrative. A big pillar for us was making the user the hero, the central figure in an adventure, something that felt familiar.” The experience had influences from the landmark Myst series, and “we took elements from our favorite films to give [it] a classic hero’s journey. If we could get them into that head space, we knew that they would have an easier time naturally interacting with the spaces we created.” Cloudhead sought to transport viewers to “another place, to the degree that you forget where you were. That sense of going somewhere and returning.” It won some two dozen nominations and awards in the VR world and was followed by the second title in the trilogy, Heart of the Emberstone, at the end of 2017.

For VR storytelling, Unger feels that telling stories through the environment is critical, “because so many perceptual filters are removed. Users can look in and around objects, crawl on the floor, jump and move through. You have to guide a user through the story by carving out a path that makes sense and fits the narrative, and at critical points along the way feed them information without any of the tools you might otherwise have. You can’t lock the camera down or frame a cutscene in a specific way.” The process should feel like an act of natural discovery. “Character interactions have to be built with the conceit that players have agency and can move around, breaking all of your best laid plans.”
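In the same spirit, ‘blink’ teleportation can be reduced to offsetting the entire tracked play space so the user’s feet land on the chosen point with no intermediate motion. The snippet below is a generic illustration under the same hypothetical rig, not Cloudhead’s implementation; a real system would also fade the view down and up around the jump.

```python
# Generic "blink" teleport sketch (illustrative, not Cloudhead's code).
# The whole tracked play space is offset so the user's feet land on the target
# point instantly; no intermediate frames means no artificial velocity (vection).
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

class PlaySpace:
    def __init__(self):
        self.origin = Vec3()                        # where the physical room sits in the world
        self.head_offset = Vec3(0.3, 1.7, -0.2)     # tracked HMD position within the room

    def feet_world(self) -> Vec3:
        """World-space point under the user's head (their feet)."""
        return Vec3(self.origin.x + self.head_offset.x,
                    self.origin.y,
                    self.origin.z + self.head_offset.z)

    def blink_to(self, target: Vec3) -> None:
        """Shift the entire play space so the user's feet coincide with the target."""
        feet = self.feet_world()
        self.origin = Vec3(self.origin.x + (target.x - feet.x),
                           self.origin.y + (target.y - feet.y),
                           self.origin.z + (target.z - feet.z))
        # A real implementation would fade to black here, then fade back in.

space = PlaySpace()
space.blink_to(Vec3(10.0, 0.0, -5.0))
print(space.feet_world())   # lands at (approximately) x=10.0, y=0.0, z=-5.0
```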

“2019 was the tipping point many of us were waiting for. VR had a tumultuous re-birth, riding the hype cycle down to more realistic expectations, but the landscape has changed over the last couple of years. VR studios are beginning to turn healthy profits, and the technology itself is reaching a high point with some solid hardware standards coming into focus. Games remain the leading driver in the space, and there’s an evolution of quality, a general industry understanding of what works and what doesn’t in the medium. As a result, there are some undeniably fun experiences that could only happen in VR.”
—Denny Unger, Co-founder and CEO, Cloudhead Games

“So where and how those moments occur is a constant balancing act, using environment to have it all make sense. In some ways it’s much closer to theme park design than game design,” Unger continues. “Simplifying interaction design to natural gestures is always better,” he adds. “The more you do to accommodate natural motion into your experience, into the acts of exploration and interaction, the better things will feel.”

Unger feels that “2019 was the tipping point many of us were waiting for. VR had a tumultuous re-birth, riding the hype cycle down to more realistic expectations, but the landscape has changed over the last couple of years. VR studios are beginning to turn healthy profits, and the technology itself is reaching a high point with some solid hardware standards coming into focus. Games remain the leading driver in the space, and there’s an evolution of quality, a general industry understanding of what works and what doesn’t in the medium. As a result, there are some undeniably fun experiences that could only happen in VR.”

He believes that the Oculus Quest is hitting “that sweet spot” in terms of affordability, matched with a full 6-DOF VR headset with tracked controllers, in a stand-alone unit (no computer required), “basically VR’s first self-contained console.” It can also plug into your gaming rig at home. “On the other end of the spectrum you have the Valve Index,” he adds, “a high-end VR headset with an emphasis on comfort, wide FOV, incredible sound, controllers with finger tracking, premium quality through and through. It relies on a PC with adequate horsepower to drive it, but you’re going to start seeing higher-budget experiences coming to the platform.”

In the near future, Cloudhead will further develop Pistol Whip and bring it to PSVR (PlayStation VR). “Beyond that we are continuing work on The Gallery franchise.”

TOP: Cloudhead continues development on the third title in The Gallery series. MIDDLE: Cloudhead’s 2019 first-person-shooter VR game Pistol Whip. BOTTOM: Cloudhead plans to bring Pistol Whip to PSVR (PlayStation VR).



VFX TRENDS

5G TURBO-CHARGES STREAMING, VR, AR AND VFX By CHRIS McGOWAN

“All of a sudden, people will have more access to high-quality data streams, very fast, with low latency. … Games that have multiple players in them are going to be superpowered. And, in terms of AR, 5G will allow for an overlaying of what’s happening in the game with the reality around you.” —David Bloom, Journalist/Consultant

With its blazing speed, 5G – the next generation of wireless technology – is expected to spark plentiful innovations and burgeoning growth in entertainment and the arts, especially for streaming media, video games, virtual reality and augmented reality. These innovations will create new opportunities for content creators and visual effects artists. “All of a sudden, people will have more access to high-quality data streams, very fast, with low latency,” according to David Bloom, a Santa Monica-based journalist and consultant who tracks the collision of technology and entertainment.

For television, there will be options galore as 5G bolsters IP services that stream “all your different video content.” That content may include new forms of interactive TV, music and advertising. One can easily imagine the interactive potential of shows like Dancing with the Stars and American Idol, as well as live music concerts and sporting events. There will be narratives that follow in the footsteps of Charlie Brooker’s 2018 branching episode Bandersnatch in the Netflix series Black Mirror. And there will be a variety of interactive ads, some inspired by the Procter & Gamble 2020 Super Bowl commercial “When We Come Together,” powered by interactive video company eko. In that commercial, visitors to the ad’s website made narrative choices that determined the ad’s final version. 5G-enabled location awareness may add another layer to the mix for advertising, TV events and other shows, according to Bloom. For most content, 5G will impact the production of the VFX underpinning it all.

THE CLOUD AND THE THIN CLIENT

TOP LEFT TO RIGHT: David Bloom, Journalist/Consultant; Neishaw Ali, President and Executive Producer, SpinVFX; Vicki Dobbs Beck, Executive in Charge, ILMxLAB; Chris Healer, CEO, The Molecule


Neishaw Ali, President and Executive Producer at SpinVFX, explains that since 5G streams high bandwidth with low latency (little or no lag time), “You are basically pulling data from your ‘studio in the cloud,’ thus allowing you to work from your laptop and a tablet anywhere in the world as long as you are connected to a 5G network. In a world where security is critical, 5G also allows you to secure your data in the cloud and only stream images to your mobile device. It also helps to reduce cost by allowing you to ‘spin up’ a different level of services only when you need it. This flexibility is exciting and challenging to VFX studios as it streamlines costs, provides efficiency and reduces barriers to entry, which in turn creates global competition.”
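Ali’s “spin up services only when you need it” model is, in practice, burst capacity on a public cloud. Purely as a generic illustration (not SpinVFX’s pipeline), a sketch like the following could launch a handful of render nodes with boto3 and terminate them once the queue drains; the AMI ID, instance type and tags are placeholders.

```python
# Hypothetical burst-render sketch using AWS EC2 via boto3 (illustrative only).
# The AMI, instance type and tag values are placeholders, not a real studio setup.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

def spin_up_render_nodes(count: int):
    """Launch short-lived render nodes sized to tonight's queue."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",      # placeholder render-node image
        InstanceType="c5.9xlarge",             # CPU-heavy instance for rendering
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "burst-render"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

def tear_down(instance_ids):
    """Terminate the nodes once the farm queue is empty, so cost stops accruing."""
    ec2.terminate_instances(InstanceIds=instance_ids)

nodes = spin_up_render_nodes(count=10)
print("launched:", nodes)
# ...submit frames, wait for the queue to drain, then:
tear_down(nodes)
```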

Chris Healer, CEO of VFX studio The Molecule, comments, “I think the first thing that comes to mind is that the ‘thin client’ revolution will happen because of this technology [5G], where it’s cheaper and easier to send a screen share or transmit a UI than it is to process on a device locally. I would say that more software will join that revolution.” He describes one possible example: “It’s really not a far stretch for Adobe to release a 5G-enabled version of Photoshop that works on an iPad Pro, where Photoshop is running in the cloud and the interface is transmitted to the iPad.”

5G could change other things in the studio, says Healer. “One simple thing that would be improved tremendously would be the ability to view dailies, previs, VFX drafts, etc., while on set. As a VFX supervisor you don’t get much time on set with a director between setups, so being able to hit play on a video and see it instantly in full resolution without waiting for it to load would be huge. I envision VFX and post-production being untethered.”

Mathias Chelebourg, director of the VR film Doctor Who: The Runaway, comments, “The race for rendering power has always been at the center of our pipeline designs. The recent rise and evolution of streamed remote workstations in the studio, with the addition of real-time engine tools, are revolutionary breakthroughs that are deeply impacting the way we approach filmmaking from set to delivery.”

“One of the most exciting things about a robust 5G network and Edge compute is how they’ll allow consumers to engage with high-fidelity immersive experiences both indoors and outdoors, where high-bandwidth data streams haven’t previously been available. 5G technology could allow us to do that on an even grander scale, across a number of platforms and locations, threading interactive narratives through multiple aspects of our daily lives.” —Vicki Dobbs Beck, Executive in Charge, ILMxLAB

ENABLING VR AND AR

Virtual reality, which fully immerses users in another place/reality, and augmented reality, which overlays text or images on the real world, are immersive media with applications ranging from arts and entertainment to education and industry.

“5G will be a huge enabler for AR/VR,” says Tuong H. Nguyen, Senior Principal Analyst for research and advisory company Gartner, Inc. “Having 5G opens up many possibilities. It provides high bandwidth, low latency, massively collaborative, on-demand, contextualized experiences. In AR, this could mean facilitating content delivery and consumption – providing contextualized experiences – giving the user relevant, interesting, actionable information based on their context.”

Lower latency and stable connections, thanks to 5G, will enhance the sense of presence in virtual reality – that you are really “there” inside the experience. In addition, 5G will cut the cord connecting many headsets to PCs, with the cloud taking over the heavy lifting.
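The latency argument can be made concrete with a back-of-the-envelope budget. The figures below are illustrative assumptions, not measurements; they simply show how a cloud rendering round trip has to fit inside the roughly 20ms motion-to-photon window usually targeted for comfortable VR.

```python
# Back-of-the-envelope motion-to-photon budget for cloud-rendered VR over 5G.
# All numbers are illustrative assumptions, not measurements.
BUDGET_MS = 20.0            # comfort target commonly cited for VR motion-to-photon

pipeline_ms = {
    "pose sample + send": 1.0,
    "5G uplink":          4.0,   # assumed radio latency each way
    "cloud render":       6.0,
    "encode":             2.0,
    "5G downlink":        4.0,
    "decode + reproject": 2.0,
}

total = sum(pipeline_ms.values())
print(f"total {total:.1f} ms of {BUDGET_MS:.0f} ms budget "
      f"({'OK' if total <= BUDGET_MS else 'over budget'})")
for stage, ms in pipeline_ms.items():
    print(f"  {stage:<22}{ms:4.1f} ms")
```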

TOP LEFT TO RIGHT: Mathias Chelebourg, VR Director; Tuong H. Nguyen, Senior Principal Analyst, Gartner, Inc.; Jason Ambler, Vice President of Digital Media, Falcon’s Creative Group; Bei Yang, Technology Studio Executive, Walt Disney Imagineering


“On-device rendering fidelity has always been an issue with AR, and 5G can shift that thinking entirely as cloud-based rendering becomes a reality. Light fields streaming, rather than frames, made possible by larger bandwidth and lower latency will enable fairly lag-free AR experiences. More accurate locations from 5G will allow AR to be solidly based on the real world and allow crowd-sourced mapping of actual structures. We [Walt Disney Imagineers] create physical manifestations of these ‘virtual worlds’ for Disney theme parks, and we are constantly exploring new and emerging technologies, like 5G, that enhance the storied experiences within.”
—Bei Yang, Technology Studio Executive, Walt Disney Imagineering

High-quality VR and AR will become possible where 5G is present. “The ability to rely on a strong 5G network to harvest the power of cloud computing will definitely be a key milestone in the evolution of VR broadly,” Chelebourg comments. “It needs to happen.”

“For VR and AR, the onus will be on a remote machine, without lugging around the hardware,” says Healer. “For instance, the Oculus Quest will basically not need much, if any, local storage. Taking it one step further is the interoperability of users who [become] untethered. With inside-out tracking you won’t need to reconfigure your living room to have a VR chat with your family. You could go to a park or the seaside and know you won’t be bumping into things.”

“Whether it’s AR or VR, 5G can enable massive collaboration. Think about something like The World of Minecraft, but everyone using it at once, or social VR, with thousands of virtual people interacting within a space,” says Nguyen. “Games that have multiple players in them are going to be superpowered,” adds Bloom. “And, in terms of AR, 5G will allow for an overlaying of what’s happening in the game with the reality around you.”

TOP AND MIDDLE: 5G could tap into the interactive potential of shows like Dancing with the Stars and American Idol, as well as live concerts and sporting events. (Images courtesy of American Broadcasting Company) BOTTOM: Future interactive narratives could follow the example of Bandersnatch, the 2018 branching episode in the Netflix series Black Mirror. (Image courtesy of Netflix)

FIXED LOCATIONS


5G will also impact the world of fixed-location entertainment. Jason Ambler is VP of Digital Media at Falcon’s Creative Group, which specializes in theme park and attraction design. “We certainly see 5G as part of our future content development strategy for location-based entertainment and immersive experiences,” relates Ambler, “not only for AR, VR and other mobile-based applications, but also for large-format and large-scale interactive content as well. One of the most exciting aspects of 5G for us is the potential to stream massive amounts of data from the cloud over extremely low latency networks, thus enabling the possibility of personalized foveated rendering [eye tracking] from the cloud. This could help to streamline the guest – or user – facing technologies.

“In addition, 5G could allow for what have been traditionally offline or pre-loaded VR and AR experiences to now become wide-area online experiences with the added benefits of real-time socialization, layers of customization and spontaneity that have not yet been seen in our industry. More and more of our projects are incorporating levels of real-time interactivity, mobile functionality, and other advanced networking systems that could all greatly benefit from the promise of 5G and allow you to engage with your environment and personalize your experience like never before.”

Bei Yang, Technology Studio Executive at Walt Disney Imagineering, comments, “On-device rendering fidelity has always been an issue with AR, and 5G can shift that thinking entirely as cloud-based rendering becomes a reality. Light fields streaming, rather than frames, made possible by larger bandwidth and lower latency will enable fairly lag-free AR experiences. More accurate locations from 5G will allow AR to be solidly based on the real world and allow crowd-sourced mapping of actual structures. We [Walt Disney Imagineers] create physical manifestations of these ‘virtual worlds’ for Disney theme parks, and we are constantly exploring new and emerging technologies, like 5G, that enhance the storied experiences within.”

“One of the most exciting things about a robust 5G network and Edge compute is how they’ll allow consumers to engage with high-fidelity immersive experiences both indoors and outdoors, where high-bandwidth data streams haven’t previously been available,” says Vicki Dobbs Beck, Executive in Charge at ILMxLAB, which together with The VOID created the fixed-location VR experience Star Wars: Secrets of the Empire. The immersive studio strives, as per its motto, to let audiences “step inside our stories.” “5G technology could allow us to do that on an even grander scale, across a number of platforms and locations, threading interactive narratives through multiple aspects of our daily lives,” says Beck.

VR and AR’s digital effects will foster new types of artists who will overlay things on the real world in semi-real-time. Bloom comments, “So what does that mean for a VFX artist? You’re creating this stuff. You’re animating the real world. Our definition of what a VFX artist is will expand and shift. There will be new types of media possible with 5G.”

Ian Hambleton, CEO of the immersive studio Maze Theory, notes that there may be glitches with 5G in the beginning, “but as with many new technologies, and particularly connectivity, often things move forward far quicker than expected.” Bloom observes that before 4G, “we couldn’t anticipate the rise of things like Uber and Lyft, and the ability to dispatch cars wherever we were and to track them and pay and do all of that. 4G enabled them. Just as that seemingly came out of nowhere with 4G, there’s a bunch of things we can’t anticipate quite yet that can be enabled with 5G.”
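Ambler’s and Yang’s points both hinge on round-trip latency, so a rough motion-to-photon budget helps show why the 5G figure matters. The sketch below is a back-of-envelope illustration only; the 20 ms comfort target, the per-stage timings and the 4G/5G round-trip figures are assumptions, not numbers supplied by anyone quoted here.

```python
# Illustrative motion-to-photon budget for cloud-rendered AR/VR.
# Every figure here is an assumption made for this sketch, not a measurement
# of any particular network, codec or headset mentioned in the article.

COMFORT_BUDGET_MS = 20.0  # often-cited comfort target; treat as a rule of thumb

def motion_to_photon_ms(network_rtt_ms, tracking_ms=2.0, render_ms=6.0,
                        encode_decode_ms=6.0, display_ms=4.0):
    """Delay from head motion to updated pixels when frames are rendered remotely."""
    return tracking_ms + network_rtt_ms + render_ms + encode_decode_ms + display_ms

for label, rtt_ms in [("assumed 4G round trip", 50.0), ("assumed 5G edge round trip", 10.0)]:
    total_ms = motion_to_photon_ms(rtt_ms)
    network_share = rtt_ms / COMFORT_BUDGET_MS
    print(f"{label}: {total_ms:.0f} ms end to end; "
          f"the network alone uses {network_share:.0%} of a {COMFORT_BUDGET_MS:.0f} ms budget")
```

On these assumptions, a 4G-class round trip by itself costs more than twice the comfort budget, while a 5G edge round trip uses about half of it – the gap the interviewees are describing; in practice, headsets also lean on on-device techniques such as late-stage reprojection to mask whatever delay remains.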

“Whether it’s AR or VR, 5G can enable massive collaboration. Think about something like The World of Minecraft, but everyone using it at once, or social VR, with thousands of virtual people interacting within a space.” —Tuong H. Nguyen, Senior Principal Analyst, Gartner, Inc.

TOP: Next-generation interactive ads may be inspired by Procter & Gamble’s 2020 Super Bowl commercial “When We Come Together,” powered by interactive video company eko. In that commercial, visitors to the ad’s website made narrative choices that determined the ad’s final version. (Image courtesy of Procter & Gamble) BOTTOM: 5G will allow multiplayer games like The World of Minecraft to be ‘superpowered,’ with potentially thousands of people using them at once. (Image courtesy of Mojang and Xbox Game Studios)





[ VES SECTION SPOTLIGHT: FRANCE ]

A Vibrant Community in the City of Lights
By NAOMI GOLDMAN

Much of the Visual Effects Society’s strong international presence is due to its network of Sections, whose members build rich communities in their local regions while advancing the Society and industry worldwide. Founded in 2018, the France Section is thriving with more than 80 members.

“France has a long history in visual effects, going back to our strong connection to VFX pioneer Georges Méliès, whose iconic Man in the Moon image is synonymous with the VES,” says France Section Co-Chair Franck Lambertz. “His amazing short films are still mind-blowing 100 years later, and he is appreciated worldwide for his artistic contributions. That pride in our French colleagues continues today, as we saw Guillaume Rocheron walk on stage this year to receive the Academy Award [for Outstanding Visual Effects] for his work on 1917.”

The Section grew out of an enthusiastic group of five VFX practitioners who began working to formally establish it in 2015, and its Board of Managers aims to reach 100 members this year. “Our Section represents a wide group of talent, including 2D and 3D artists, designers, producers, supervisors and directors,” says Lambertz. “There is a lot of solidarity among French VFX artists. A lot of them work freelance and share their time between different companies, so everybody knows each other to some degree. These are the relationships we want to bring into the VES and build upon.”

There is a diversity of VFX work taking place in France, including high-end fashion and perfume companies (Chanel, Dior and Louis Vuitton, among others), VFX for local cinema, international blockbusters (including Blade Runner 2049, Terminator: Dark Fate and Ford v Ferrari), animated features (Asterix, Minions) and the gaming industry (with a lot of work for Ubisoft, creator of Assassin’s Creed and the Tom Clancy video game series).

Lambertz continues, “Big studios like Mikros Animation, BUF and Illumination Mac Guff are the French flagships working internationally. A multitude of middle-size companies, like Mathematic, Digital District, Unit, The Yard, Trimaran, Happy Flamingos and Nightshift, are all very dynamic and attract a high level of skilled professionals. And small boutiques, such as HECAT Studio and Blacklab, continue to grow thanks to innovative creative talent.”

TOP: Franck Lambertz, France Section Co-Chair MIDDLE: The France Section was a co-sponsor of the 2020 Paris Images Digital Summit. BOTTOM: VFX pros attending the Mangrove-Open Source VFX Framework event presented by VES France in partnership with The Yard and the Centre national du cinéma et de l’image animée.


“We are excited about our future. Our visual effects and animation schools have been well appreciated by international studios, and that high profile helps our community continue to grow. There is still a lot to be done to help our industry be recognized as it should at an international level. And we are here to do that, to help our community share their experiences and passion for our industry and support one another.” —Franck Lambertz, France Section Co-Chair

“Our Section represents a wide group of talent, including 2D and 3D artists, designers, producers, supervisors and directors. There is a lot of solidarity among French VFX artists. A lot of them work freelance and share their time between different companies, so everybody knows each other to some degree. These are the relationships we want to bring into the VES and build upon.”
—Franck Lambertz, France Section Co-Chair

The France Section has hosted an ongoing roster of screenings for members and guests, thanks to the TSF Group, which hosts the events in its projection room in the Plaine Saint-Denis business district, featuring state-of-the-art sound and a 4K laser projector. The Section also hosts pub nights and an annual BBQ to bring together both existing and prospective members. It has organized several educational events, including the collaborative Megabrain Masterclass series with the VES Germany and London Sections, where each Section contributed to the content and participated through a live feed. “The idea around the Megabrain series has been teaching actual techniques that help artists improve their skills, and we are excited to offer more programs like this,” says Lambertz.

The Section was also a proud partner in the 2020 Paris Images Digital Summit, an event created by the Centre des arts d’Enghien-les-Bains dedicated to visual effects, crossing the creative, technical and economic stakes of a sector that is constantly evolving. This forum offered the VES an opportunity to share knowledge and expertise and host keynote conversations with global leaders in the VFX industry.

“We are excited about our future,” comments Lambertz, looking ahead. “Our visual effects and animation schools have been well appreciated by international studios, and that high profile helps our community continue to grow. There is still a lot to be done to help our industry be recognized as it should at an international level. And we are here to do that, to help our community share their experiences and passion for our industry and support one another.”

TOP TWO: VES France members and guests enjoy the annual Summer BBQ. MIDDLE: The Megabrain Masterclass series helps artists improve their skills. BOTTOM: VES France members and guests enjoy exclusive film screenings at the state-of-the-art TSF projection room.





[ VES NEWS ]

The VES Handbook of Visual Effects, Third Edition to be Released in July
By NAOMI GOLDMAN

Jeffrey A. Okun, VES, Visual Effects Supervisor

Susan Zwerman, VES, Visual Effects Producer

The much-anticipated third edition of the VES Handbook of Visual Effects: Industry Standard VFX Practices and Procedures will be released next month. Hailed as the most complete guide to visual effects techniques and best practices on the market today, it covers essential techniques and solutions for all VFX artists, producers and supervisors, from pre-production through production and post-production.

Edited by renowned Visual Effects Supervisor Jeffrey A. Okun, VES and Visual Effects Producer Susan Zwerman, VES, the update to the award-winning guide includes the latest industry-standard techniques, technologies and workflows in the fast-paced world of visual effects. The VES tasked the original authors to update their areas of expertise, including AR/VR moviemaking, color management, cameras, VFX editorial, stereoscopic and the digital intermediate, and to provide detailed chapters on interactive games and full animation. Additionally, 56 new contributors – representing the best and the brightest in the industry – share their proven methods, tricks and shortcuts earned through decades of real-world, hands-on experience.

“The VES sees this book’s continual update as an essential mandate,” says Okun. “Far beyond basic information on techniques for visual effects artists in general, it shares the combined practical hands-on experience and tips from leaders in all VFX verticals. It is the guide to navigate the practical day-to-day issues that will be experienced by every working professional at various points in their careers.”

“This is a must-read for all visual effects filmmakers,” says Zwerman. “The writers have combined wisdom and practicality to produce an extraordinary book that covers every aspect of visual effects techniques in a concise manner without losing sight of its art and innovation.”

This third edition has been expanded to feature lessons on 2.5D/3D Compositing; 3D Scanning; Digital Cinematography; Editorial Workflow in Animated and Visual Effects Features; Gaming updates; General Geometry Instancing; Lens Mapping for VFX; Native Stereo; Real-Time VFX & Camera Tracking; Shot/Element Pulls and Delivery to VFX; Techvis; VFX Elements and Stereo; Virtual Production; and VR/AR.

The VES Handbook of Visual Effects: Industry Standard VFX Practices and Procedures, Third Edition will be available in July at leading booksellers.

“It is the guide to navigate the practical day-to-day issues that will be experienced by every working professional at various points in their careers.” —Jeffrey A. Okun, VES, Visual Effects Supervisor





[ FINAL FRAME ]

Streaming Hits the Jackpot With House of Cards

These days, with streaming series arriving at blazing speed and volume on Amazon, Netflix, HBO, Hulu, Disney+, CBS All Access, AMC Premiere, FXNOW, USA Now and others, it’s hard to believe that the streaming TV industry is so young.

Traveling back to February 2013, we remember that Netflix’s House of Cards began the episodic streaming era and became its first big hit, as well as the most streamed piece of content in the U.S. and 40 other countries. Its episodes were released all at once, popularizing the activity known as “binge-watching.”

The show was a political thriller based upon a BBC series of the same name from 1990. It told the tale of ruthless congressman Frank Underwood (Kevin Spacey) and his equally scheming wife Claire (Robin Wright), who maneuvered themselves into the President and First Lady posts in a daring display of manipulation, power and treachery.

It went on to earn one Primetime Emmy Award (for director David Fincher) and 23 Primetime Emmy nominations, including best drama, best leading actor and best leading actress, as well as six Primetime Creative Arts Emmys. It was the first original online web series to earn Emmy nominations. It also picked up a number of Golden Globe nominations and wins.

Photo by Patrick Harbron courtesy of Netflix




