VFXVOICE.COM
SPRING 2019
DUMBO RETURNS
KING KONG ON BROADWAY • A.I.: SMART VFX ARRIVES GLOBAL VFX: SPECIAL REPORT • FOCUS: CANADA VFX • 2019 VES AWARDS & WINNERS
[ EXECUTIVE NOTE ]
Welcome to the Spring issue of VFX Voice! Over the last two decades, the Visual Effects Society has used its position as a convener to take on critical issues affecting its membership and the greater VFX community.

In this issue of VFX Voice, we take a look at “Global VFX: State of the Industry 2019” in a VFX Voice Special Report. Disruption over the past decade has irrevocably changed the visual effects industry in its business models, creative approaches, content delivery platforms and technologies. Since we issued our last report on the state of the industry, we have spoken to dozens of VFX experts around the world. Now, we present an exclusive and insightful look at the VFX industry and the evolution of key trends impacting the rapidly shifting global marketplace – and how visual effects practitioners are navigating the landscape and shaping the future. Read on and let us know your thoughts.

Coming out of an exciting awards season, this issue shines a light on the 17th Annual VES Awards celebration. Get an inside look at VFX supervisors’ picks for hot apps, the arrival of artificial intelligence and a special focus on visual effects in Canada. And we go inside King Kong on Broadway, Dumbo, Tom Clancy’s Jack Ryan, How to Train Your Dragon 3 and “Fresh Spring VFX Film and TV.” It’s all in here.

We’re continuing to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Thank you for making VFX Voice a must-read publication around the globe. We’re proud to be the definitive authority on all things VFX.
Mike Chambers, Chair, VES Board of Directors
Eric Roth, VES Executive Director
2 • VFXVOICE.COM SPRING 2019
[ CONTENTS ]

FEATURES
8 THEATRE: KING KONG ON BROADWAY – Visual effects key the show’s success and survival.
16 VFX TRENDS: SUPERVISORS’ APPS – Leading supervisors share their go-to VFX apps.
22 FILM: ARTIFICIAL INTELLIGENCE – The new wave of smart VFX software solutions.
28 FILM: FRESH SPRING VFX – A breakdown of Spring’s top VFX films and TV.
32 FILM: DUMBO RETURNS – The VFX balancing act in Tim Burton’s live circus.
38 TV: TOM CLANCY’S JACK RYAN – Maintaining authenticity from big screen to streaming TV.
44 FILM: WETA DIGITAL – The visual frontiers of Mortal Engines and Alita: Battle Angel.
50 THE 17TH ANNUAL VES AWARDS – Celebrating the best in visual effects.
58 VES AWARDS WINNERS – Photo gallery.
64 SPECIAL REPORT: GLOBAL VFX – An analysis of the state of the worldwide effects industry.
78 FOCUS: CANADA VFX – Steady growth drives facilities and companies in Canada.
86 ANIMATION: HOW TO TRAIN YOUR DRAGON 3 – Adopting visual effects and live-action film techniques.

DEPARTMENTS
2 EXECUTIVE NOTE
93 VES NEWS
94 VES SECTION: VANCOUVER
96 FINAL FRAME: DUMBO

ON THE COVER: Dumbo (Image courtesy of Walt Disney Pictures)
SPRING 2019 • VOL. 3, NO. 2
MULTIPLE WINNER OF THE 2018 FOLIO: EDDIE & OZZIE AWARDS
VFXVOICE
Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com
Maria Lopez
mmlopezmarketing@earthlink.net

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

CONTRIBUTING WRITERS
Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Mark J. Miller, Chris McGowan, Barbara Robertson

ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Lisa Cooke, 2nd Vice Chair
Brooke Lyndon-Stanford, Treasurer
Rita Cahill, Secretary

DIRECTORS
Brooke Breton, Kathryn Brillhart, Colin Campbell, Bob Coleman, Dayne Cowan, Kim Davidson, Rose Duignan, Richard Edlund, VES, Bryan Grill, Dennis Hoffman, Pam Hogarth, Jeff Kleiser, Suresh Kondareddy, Kim Lavery, VES, Tim McGovern, Emma Clifton Perry, Scott Ross, Jim Rygiel, Tim Sassoon, Lisa Sepp-Wilson, Katie Stetson, David Tanaka, Richard Winn Taylor II, VES, Cat Thelia, Joe Weidenbach

ALTERNATES
Andrew Bly, Gavin Graham, Charlie Iturriaga, Andres Martinez, Dan Schrecker

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Callie C. Miller, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
Follow us on social media.
VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2019 The Visual Effects Society. Printed in the U.S.A.
THEATRE
EFFECTS PLAY LEAD ROLE ON BROADWAY IN KING KONG By MARK J. MILLER
TOP LEFT AND RIGHT: Christiani Pitts as Ann Darrow with Kong – 20 feet tall and 1.1 tons. (Photo: Joan Marcus)
Broadway has had its share of visual effects extravaganzas – live elephants, aerial battles, a full-blown chariot race – but it has never had a 20-foot, 1.1-ton ape wandering across one of its stages. Until now.

King Kong debuted on the Great White Way in early November 2018 after 10 years of development and $35 million in investment. A good chunk of that cash went toward figuring out a way to make the big ape as realistic as possible. The show’s visual effects are the key to this production’s survival: Kong is what the audience has paid its money to see.

Part of the joy of Broadway, of course, is the visual sleight of hand designers bring to each show, whether it is a helicopter dropping into Miss Saigon, the massive chandelier swinging dramatically to the stage from the theater’s high ceiling in Phantom of the Opera, or actors parachuting from a plane to the stage in The Who’s Tommy. Audiences are far more demanding now when it comes to wow moments, but the idea of “spectacle” is relative. “When there’s a technological advance that can be incorporated onstage, the theater quickly harnesses it and exploits it,” says Laurence F. Maslon, Arts Professor at NYU’s Graduate Acting Program. Audiences came to see shows in the 19th century simply because the sets had doors and windows that opened and closed. When it became possible to cook an actual meal onstage, audiences came to see that as well.

Bringing the 20-foot, 1.1-ton behemoth to life on Broadway are 14 puppeteers and computer technicians, along with 16 microprocessors, 16,000 connections and nearly 1,000 feet of electrical cable built into the beast. Kong took years to perfect through concept design, prototyping and testing, but the final version took about six to eight months to construct, according to Creature Designer Sonny Tilders, who led the team that built the ape.
Ten of the puppeteers are known as the King’s Company and move with the beast onstage, using their weight and movements to move Kong’s lower body and arms. When the ape needs to lift an arm high, one puppeteer climbs
up his back and dives from his shoulders with a rope in hand to help create the effect. These 10 actors are also part of the cast when not on Kong duty.

There were times, of course, when it didn’t seem like it would ever get to this point, says Tilders. “This was a daunting task to do for the stage, and concerns grew at the start of the project that it might just be too difficult,” says the man who is plenty comfortable working with big puppets. He is the creative director of Australia’s Creature Technology Company, the outfit responsible for designing and building the full-size animatronic dinosaurs in the arena show Walking With Dinosaurs, which launched in 2007 and proceeded to wander across four continents. “Our history as a company was one of making very realistic creatures with an attention to detail and sophistication of movement usually reserved for film animatronics,” Tilders says. “Apart from anything else, there was a risk that the technology would overshadow everything we were trying to achieve dramatically.” It also wasn’t clear how the initial Kong design would work with actors on the stage, so the concept of being as realistic as possible was scrapped and the team moved toward a more theatrical outcome.

One of the inspirations for Kong and the King’s Company came from the 2007 play War Horse, which featured nine full-size horse puppets, each controlled by two or three people onstage. “Puppeteers seen openly manipulating the stunning horse puppets quickly disappear as they become one with the creatures they are controlling,” Tilders says. “It gave us the confidence to embrace exposed puppeteering solutions.” This old-school form of puppetry, however, is married to the highest technology theater has seen. “We thought embracing more traditional puppetry techniques would provide more practical low-tech solutions,” Tilders says. “The result is a hybrid of new and old technologies, both manual and mechanical.
Kong evolved into a giant string puppet – albeit one jam-packed with hidden tricks and technology.”

Four people known as Voodoo Operators control a fair amount of that technology. They sit in a booth at the back of the theater, up in the balcony, using computers to control Kong’s eyebrows and eyelids, mouth, nose, shoulders, jaw and hips. One member of the team also serves as the voice of Kong through a processor that lets each gentle grunt, curious
TOP: King Kong Creature Designer Sonny Tilders outside the Broadway Theatre in Manhattan. (Photo: Joan Marcus) BOTTOM: It takes 14 puppeteers and computer technicians along with 16 microprocessors, 16,000 connections, and nearly 1,000 feet of electrical cable built into the creature to create the effect that Kong is alive. (Photo: Joan Marcus)
“Our history as a company was one of making very realistic creatures with an attention to detail and sophistication of movement usually reserved for film animatronics. Apart from anything else there was a risk that the technology would overshadow everything we were trying to achieve dramatically.” —Sonny Tilders, Creature Designer
“I knew that with our surround LED screen I could then start playing around with horizon lines, waves and clouds to further enhance this sense of travel and motion dynamic. Combining all these factors I felt we could theatrically create a large boat that ‘moved’ believably – and quite possibly a boat that the audience might also feel they were on with us.” —Peter England, Scenic and Projections Designer
TOP: King Kong at the Regent Theatre, Melbourne. Ten puppeteers known as the King’s Company move with Kong onstage, using their weight and movements to move his lower body and arms. (Photo: James Morgan) BOTTOM: A scene from the Melbourne, Australia production of King Kong. A team of operators in the balcony uses computers to control Kong’s eyebrows and eyelids, mouth, nose, shoulders, jaw and hips. (Photo: James Morgan)
growl and enraged roar sound as if it’s coming from a massive ape rather than from a mere human. Three of the Voodoo Operators also appear onstage in larger group scenes. Another member of the team is the creative engineer, who spends the show monitoring all of Kong’s onboard systems, such as hydraulic oil pressures and temperatures, motor temperatures, air pressures and general creature diagnostics. “He’s there to make sure the monkey behaves itself,” says Tilders. Kong is too big to fit into the wings of the stage, so he is lifted into the flyspace above the stage when he’s not needed.

Kong’s realism was key to actually getting the show onto Broadway. Director and choreographer Drew McOnie flew from London to Australia to meet the ape before taking the job. After all, he needed to see if he could work with his star. “This probably makes me sound like a complete weirdo, but he’s definitely got an aura that has something to do with his eyes,” he told the New York Times. “He ends up reflecting the person he’s looking at, and that ends up feeling very theatrical and very beautiful.”

The gargantuan ape isn’t the only visual effect to gape at in the show. King Kong features a few other fascinating effects, including a battle with a 40-foot snake and the integration of a 90-foot video screen. “Bringing all the different departments together as well as integrating the actors is what makes it all really work from the technical side,” notes Peter England, the Scenic and Projections Designer. “Working with massive video screens and giant puppets makes things more interesting for actors these days.”

Before theatergoers meet Kong, the main characters of Ann Darrow (Christiani Pitts) and Carl Denham (Eric William Morris), the film director who initially just wants to capture Kong on camera, hop onto a boat to go find the great ape. The full expanse of the 27-inch-tall, 90-foot-wide semi-circular ROE screen called New Linx 9 creates the effect.
The screen is filled with an ocean view, and part of the stage is lifted to create the illusion of a boat’s prow cutting through the water. England and his team chose this particular setup because it is relatively lightweight, weighing just over 6,600 pounds, and very
Broadway’s History of Visual Effects

1899-1900 Ben-Hur. The production featured “two real chariots tethered to two live horses, who were themselves tethered in place to a treadmill that cranked the moving cyclorama behind them,” according to Laurence F. Maslon, Arts Professor at NYU’s Graduate Acting Program.

1935-1936 Jumbo. Staged by showman Billy Rose at the massive Hippodrome, the show was set at a circus and featured a live two-ton elephant.

1946 Around the World in 80 Days. Adapted by Orson Welles, the musical had four mechanical elephants, a train chase, a flying eagle, and five filmed segments shown on a screen near the stage.

1993-1995 Kiss of the Spider Woman. Created by Kander and Ebb with a book by Terrence McNally, this was the first show to use pixelated projections.

2010-2014 Spider-Man: Turn Off the Dark. At a cost of $75 million, the most expensive Broadway show in history, this musical wasn’t loved by critics but did feature phenomenal visual effects, particularly battle scenes fought in the air over the audience. Unfortunately, several actors were hurt in the process.

2011-2013 War Horse. This play featured life-size horse puppets that were controlled onstage by three puppeteers. By play’s end, though, theatergoers could be convinced that the horses were real.
TOP: Bathed in dramatic stagelight against a 90-foot video screen, Kong and a 40-foot snake prepare to battle. (Photo: Matthew Murphy) BOTTOM: King Kong Creature Designer Sonny Tilders with his creation.
flexible. The panels are 12 inches wide by 48 feet long, which allowed some freedom to place any needed openings. The power and data are onboard rather than remote, which reduces cabling, and the screen is fanless, so it’s quiet – an important element for the theater-going experience.

England chose imagery that he thought looked “painterly.” “My decision to create artwork that looked painted was motivated by the over-arching goal of the production to feel ‘hand-made’ and driven by human labor,” he says. “A welcome side effect of this painterly approach was that the imagery has an inherent softness to it and largely gets around the ugly pixelation that so often curses LED screens as a result.”

One of the design concerns in the boat scene was creating the illusion of the size of the boat. “In the original story, the boat journey from New York to Skull Island is epic,” England says. “It is a massive three-month journey into the unknown. From the outset it was considered essential we meet this epic quality on stage.” Since there were a few scenes and a large song-and-dance number, as well as the need for it to appear to be a boat big enough to cart a giant gorilla across the waters, the design of the boat became a challenge. That led to the idea of a small bit toward the rear of the stage being lifted to create a triangular ramp that gives off the idea of the ship’s
prow. That lift point stays set for the entire trip on the boat. With the lift machinery, the motion of the actors and the water on the screen, it gives the sense of rhythmic movement. “I knew that with our surround LED screen I could then start playing around with horizon lines, waves and clouds to further enhance this sense of travel and motion dynamic,” England says. “Combining all these factors I felt we could theatrically create a large boat that ‘moved’ believably – and quite possibly a boat that the audience might also feel they were on with us.”

Once the island is reached and the characters meet Kong, other big visual effects surprises lie ahead, such as the massive snake and its fight with Kong. The snake has six puppeteers onstage alongside Kong’s 10. The fight moves offstage and then appears as if the action is between large shadow puppets. “The shadow puppetry is actually done with front projection onto the painted PVC and steel scenic drop, which is concealing some rapid de-rigging of both creatures upstage,” England says. Each creature was photographed separately, and then a series of still images was compiled to show them “fighting” while stormy lightning-strobe effects fill the theater. Four groups of 20 images play in about six seconds. England compares the result to stop-motion animation.

The show culminates, as the original 1933 film did, with Kong escaping his captors, running through the streets of New York, climbing the recently constructed Empire State Building and doing battle with gunmen on planes. While the film famously used Willis O’Brien’s stop-motion animation, the Broadway show employs other effects to create the scene. The run through the streets is basically a marriage between the screen and the puppet. Kong runs and jumps in one spot while the screen imagery moves, leaps, shifts and changes to make it appear that Kong is moving quickly through New York.
“In many ways what we did was utilize some simple movie-making techniques that have their origins way back around the time when the original Kong movie came out,” England notes. He uses the example of a camera shot through the front windshield of a couple ‘driving’ in a stationary car. Visible through the car’s rear window is pre-shot footage of streets trailing off in the distance, creating the impression that the couple and the vehicle are actually moving.

For the climb up the Empire State Building, window framing is projected onto fabric at stage center, and audience members can look through to see Kong suspended in the background.
OPPOSITE TOP: The company of King Kong on the boat. (Photo: Matthew Murphy) OPPOSITE BOTTOM: Christiani Pitts as Ann Darrow with Kong. (Photo: Joan Marcus) TOP: The company of King Kong. (Photo: Matthew Murphy) BOTTOM: Christiani Pitts as Ann Darrow and Eric William Morris as Carl Denham in King Kong. (Photo: Joan Marcus)
“In many ways what we did was utilize some simple movie-making techniques that have their origins way back around the time when the original Kong movie came out.” —Peter England, Scenic and Projections Designer
12 • VFXVOICE.COM SPRING 2019
THEATRE
The puppeteers have his arms and legs moving in a climbing motion while his body swings from side to side. Meanwhile, the LED screen shows animated content as Kong moves higher through New York City’s skyscrapers. “The speed at which this content rises is matched to the speed at which the front projected windows descend – and both speeds are motivated by the speed at which Kong’s climbing action can be performed,” England says. “Supporting all this is the music tempo – sound being a critical element to selling all of these visual motion dynamic moments.”

Critical to making it all work, England says, is having some automation to build around. The King’s Company and Voodoo Operators can do their thing live each night, but England can’t change the timing of his animations or the length of the boat ride film. “The automation component, which controls the puppet’s entrances, exits and gross moves around the stage, is a series of pre-programmed cues and the same every show,” he says. “This is the ‘anchor’ we needed to ensure synchronicity.” The auto cues are called by the stage manager and have set timings, so the team was able to accurately build its content-animation cues to match them. According to Tilders, the ability to adjust the automation to the live environment is a vital element. “This meant we could keep the puppet alive and reactive to the variability that occurs in any live show, in particular that which comes from needing to act alongside humans,” he says.

Still, while the other visual effects are impressive, the main attraction is Kong. One of the show’s more potent moments occurs when the only performer onstage is Kong (and his puppeteers). The music and dancing have stopped and Kong moves right up to the lip of the stage. It appears that he is going to come right into the audience, and theatergoers are rapt. Kong reaches out and the audience gasps, unsure what he’ll do and where this amusement-park-ride of a show will take them next.
For that moment, after all the behind-the-scenes work, Kong is truly king.
TOP LEFT AND BOTTOM: Christiani Pitts as Ann Darrow with Kong. (Photo: Joan Marcus) TOP RIGHT: The Company of King Kong. (Photo: Joan Marcus)
VFX TRENDS
WHAT APPS DO VFX SUPERVISORS USE? By IAN FAILES
The explosion in smartphone and tablet apps has brought many benefits for visual effects production. Now VFX supervisors can – right at their fingertips – review scripts, carry out on-set surveys, record crucial data and measurements, take photographic reference, and help produce their shots from start to finish. To help wade through the vast selection available, seven leading visual effects supervisors working in film, television, shorts, commercials and game cinematics share their go-to VFX smartphone and tablet apps with VFX Voice.

APPS FOR SCRIPT REVIEW
TOP: Director and Visual Effects Supervisor Victor Perez consults his iPad on the set of the short film Echo. RIGHT: Brainstorm Digital Visual Effects Supervisor Eran Dinur.
The first thing that visual effects supervisors might do on any project is read the script and make crucial VFX notes on it. Visual Effects Supervisor Chris Harvey (Oats Studios shorts, Chappie) recommends Weekend Read, a way of reading scripts on an iPhone without having to print out hundreds of pages. “I’ll often find I have a few quiet minutes in some unexpected and random place and all I have with me is my phone,” he says. “If you have ever tried reading a script on a phone screen, it’s a
huge pain. Weekend Read has fixed that. It’s a great, simple and secure app that formats scripts specifically to be easy to read on the smaller screen.” Harvey and fellow Visual Effects Supervisor Lawren Bancroft-Wilson (The Terror [Season 2], Unspeakable, A Million Little Things) both also highlighted the app Scriptation for use during the pre-production process for breaking down scripts. It allows for mark-ups, drawing and the tracking of script changes. “It’s even great for things other than scripts,” Harvey adds. “I often create technical documents for going out on location scouts. I can load them into Scriptation and take notes right on the document as I go and even snap quick photos and insert them into the document. Then I can export and share with anyone else that needs it. You can even take notes on different layers, so maybe you are sharing a document and VFX has one layer, SFX has one, art department, etc.”
TOP: Director and Visual Effects Supervisor Victor Perez. BOTTOM LEFT: Visual Effects Supervisor Jay Worth. BOTTOM RIGHT: Visual Effects Supervisor Lawren Bancroft-Wilson.
APPS FOR VFX PLANNING
When it comes to planning visual effects shots and live-action shoots, a number of apps can assist in these important early
stages. One of those is Shot Designer, a staging tool. “This app helps get everybody on the same page,” outlines director and Visual Effects Supervisor Victor Perez (The Invisible Boy: Second Generation, Echo, Ensemble). “It creates quick animations for the cameras and allows you to use photos. I find it helpful also to work with data wranglers to explain what they should expect and exactly what I need and when.”

Blocker is a staging-like app that works in the augmented reality space. “It’s a pretty specialized AR app that’s designed to let you ‘drop’ virtual stand-ins into your locations when viewed through your phone,” says Harvey. “Then, as you walk around and view the location, the virtual stand-ins will be anchored to the location you dropped them.”

For further VFX planning help, a number of sun-surveying apps exist. Sun Surveyor is “a great tool for plotting sun position and angle for every time of the day,” notes Brainstorm Digital Visual Effects Supervisor Eran Dinur (The Woman in the Window, Uncut Gems, author of The Filmmaker’s Guide to Visual Effects). “I find it especially useful for texture acquisition of buildings and other large elements – you can plan in advance the best time to shoot with flat, ambient light. It is also useful for greenscreen orientation to avoid strong shadows.” Director, visual effects supervisor and cinematics specialist Hugo Guerra’s sun app of choice is Sun Seeker. “Using augmented reality, this app shows you the sun’s trajectory during the day; in fact, you can even check where and when the sun will be on any day of the year – a true lifesaver and always a great conversation piece on the shoot. It’s like magic.”

APPS FOR ON SET
TOP LEFT: Director and Visual Effects Supervisor Hugo Guerra. TOP MIDDLE: Visual Effects Supervisor Chris Harvey. TOP RIGHT: Sony Pictures Imageworks Visual Effects Supervisor Sue Rowe. MIDDLE: A screenshot from Magic Universal ViewFinder. BOTTOM: Light Meter by WBPhoto has free and paid versions.
Perhaps the most common – and helpful – kinds of apps in visual effects are ones that can be used directly on set for surveying locations, making measurements, mimicking cameras and recording much-needed metadata. A commonly used app for doing quick surveys is My Measures Pro. It automatically measures objects and rooms in pictures taken by your phone using augmented reality. “I normally use this tool to document the height, angle and distance from the camera to the objects,” says Guerra. “It’s also a great way to get accurate measurements for both the CG modeling and tracking departments.” Perez adds that “you can manually set the measures, and the app even helps you calculate areas and volumes. I use it a lot in combination with my distometer to get the scale of a set and positions.”
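How My Measures Pro computes its AR measurements internally isn’t documented in the article, but camera-based measurement of the kind Guerra and Perez describe ultimately rests on the pinhole-camera similar-triangles relation. A minimal sketch with illustrative numbers (none of them from the article):

```python
# Pinhole-camera similar triangles: a real-world height H at distance D
# projects to an image height h on the sensor through a lens of focal
# length f, with h / f = H / D. All numbers below are hypothetical.

def distance_to_object(real_height_mm, image_height_mm, focal_length_mm):
    """Distance from camera to object, in the same units as real_height_mm."""
    return real_height_mm * focal_length_mm / image_height_mm

# A 2,000 mm (2 m) doorway imaged 10 mm tall on the sensor, 35 mm lens:
d = distance_to_object(2000.0, 10.0, 35.0)
print(f"{d / 1000:.1f} m away")   # → 7.0 m away
```

The same relation run in reverse (solve for H given a known distance) is how a supervisor can recover set heights for the CG modeling and tracking departments from a single reference photo.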
In recent years, several VFX supervisors have adopted Ricoh’s Theta 360 camera to quickly acquire 360-degree panoramic or fisheye-like images of sets or locations. Using the Ricoh Theta V Remote App, Guerra says you can remotely control a Theta to generate bracketed HDRIs or take reference photos of light positions. “It does not replace an HDR using a full-frame DSLR with an 8mm, but when you have no time on set, or the location is too small or dangerous, this tool can make a very ‘quick and dirty’ HDRI. It’s also great for crew selfies!” Another dedicated app used to capture Theta HDRIs is Theta S Bracket for HDR, something employed regularly on set by Bancroft-Wilson. “It’s quick, fast and non-intrusive, letting me get HDRIs where otherwise ADs would complain about me slowing things down. That said, for more important HDRIs that may be used for high-res reflections, I’d use my Sony Alpha A7RIII.”

Light meter apps can equally save time on set. Dinur recommends Light Meter by WBPhoto. “I use this app to quickly check light on a greenscreen. DPs usually try to set optimal lighting on the greenscreen, but when I know that the background will need to be dark – nighttime out the window, for example – I ask to reduce the light on the screen to avoid too much backlight on hair and soft edges.” Meanwhile, Luxi produces a light meter app that works in conjunction with an attachment, something Guerra says “can convert your phone into a reasonable-quality incident light meter.” Keep in mind, he adds, “it has a margin of error of about 1/4 to 1/2 f-stop, but in my opinion, it is a great learning/testing tool before you invest in a more expensive meter.” Guerra’s other go-to in the light measurement space is LightSpectrum Pro, which allows users to check the Kelvin temperature of on-set lights.

Visual effects supervisors often need to work in sync with cinematographers, and a range of apps offer options for emulating lenses, cameras and sensors.
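The light-meter readings and bracketed HDRIs described above both revolve around exposure value (EV). A small sketch using the standard ISO 100 formula EV = log2(N²/t), with illustrative bracket settings that are not taken from the article:

```python
import math

def exposure_value(f_number, shutter_s):
    """Standard exposure value at ISO 100: EV = log2(N^2 / t)."""
    return math.log2(f_number**2 / shutter_s)

# A typical 3-shot, +/-2 EV HDRI bracket at f/8 (hypothetical numbers):
base = exposure_value(8.0, 1 / 60)                            # ~11.9 EV
bracket = [exposure_value(8.0, t) for t in (1/15, 1/60, 1/250)]
steps = [round(ev - base) for ev in bracket]
print(steps)   # → [-2, 0, 2]
```

Each 4x change in shutter time is two stops, which is why three exposures spaced this way are often enough to cover the dynamic range a “quick and dirty” on-set HDRI needs.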
Among the most popular is Artemis Pro. Guerra says, “Don’t leave home without it,” suggesting that it is an essential app for the shoot, pre-light and the ‘tech recce.’ “Beware,” adds Guerra, “it’s limited to 19mm since the iPhone can’t go lower than that, but there are fisheye adaptors to open up the field of view.” Perez likes to use the app when scouting to “get aligned with the DP regarding what we see in the frame,” while Harvey’s use of Artemis Pro extends to “capturing photos with correct camera and lens packages, sorting, making written and audio notes, filming quick movies, and recording a slew of technical information like time of day and GPS coordinates.” Another app in this area is Magic Universal ViewFinder, an app that Dinur notes “has a comprehensive range of cameras, formats and frame guides that let you frame a shot pretty accurately.” Then there’s Cadrage Director’s Viewfinder, which offers ways to preview frames based on actual film lenses. Sony Pictures Imageworks Visual Effects Supervisor Sue Rowe (The Meg, A Series of Unfortunate Events) uses the app to ‘frame up
TOP: My Measures Pro used on set during greenscreen filming. MIDDLE: AR techniques are used to follow the sun’s trajectory in Sun Seeker. BOTTOM: A screenshot from LightSpectrum Pro. The app checks the Kelvin temperature of set lights.
App Websites
Airtable: www.airtable.com
Annotable: www.moke.com/annotable
Artemis Pro: www.chemicalwedding.tv
Blocker: www.blocker.afternow.io
Cadrage Director’s Viewfinder: www.cadrage.at

TOP: The virtual stand-ins in the Blocker app.
FileMaker Go: www.filemaker.com/products/filemaker-go
Light Meter by WBPhoto: www.willblaschko.com
TOP: 360-degree output from Hugo Guerra’s Ricoh Theta camera. MIDDLE: With many different kinds of lenses available to shoot with, Artemis Pro can be a handy way to see how a shot may ultimately look. BOTTOM: A Cadrage frame that Sue Rowe has annotated in Annotable.
shots’ to illustrate to the director the relevant VFX requirements. “I tend to take a series of images on the set, pre-shoot, and send them to the director. This way the director can visualize the shot and be aware of the VFX work required.” “For example,” Rowe continues, “I will often shoot with a 21mm lens and do a quick sketch using Annotable [see below] of where the VFX extensions will be needed. Then I will flip to a longer lens like a 35 or a 50 and show the same frame. By changing the lens, the set extensions would be less extensive or the depth of field would play to our advantage, and a scenic extension would suffice rather than a digital solution.”

The Annotable app allows users to notate photos and documents, and then send them around quickly while on a production. Says Rowe: “I used it on The Meg to notate to myself where the shark was supposed to be in the chase sequence. The director was firing out ideas about each shot, and I found quick photos and a bad doodle of a shark told the story clearly. At the end of a 12-hour shoot the last thing you want to do is write up notes. With this tool I would put them into a document and make a PDF on the day of the shoot.”

APPS FOR VFX PRODUCTION
Managing the flow of information is sometimes just as important as the imagery itself. And there are plenty of apps to help. Westworld Visual Effects Supervisor Jay Worth relies heavily, for example, on Airtable, a flexible cloud-based database app. “With working on multiple projects with multiple artists, this is the best method I have found to keep track of all the elements I need,” says Worth. “Something like Shotgun is great, but I don’t need to track that much data. For our most recent work on Westworld, we fully embraced Airtable and it was great – everything from financials and shot tracking to editorial count
sheets and set data.” Bespoke solutions can also be generated with FileMaker Go, FileMaker’s own tablet and phone app. “Many people have their own custom databases they have set up prior on the computer in FileMaker’s main software application,” says Harvey. “Then in the app it’s all about being able to quickly record the needed info. If you don’t have a custom camera report database, the VES has a great free one at camerareports.org.”

Another common database solution is Setellite. According to Guerra, “this app is a complete package to create a full VFX report. It allows you to take notes, catalog all the cameras, lenses, lights, keep track of all the takes, and helps you label the VFX plates, among other things. It also has a cloud save in case you have 4G during filming. In the end, it creates a very complete PDF of everything that happens on set regarding VFX. This PDF is fundamental to keep information flowing between the VFX houses and the production.” Bancroft-Wilson says the latest versions of Setellite have improved upon initial releases of the app. “It allows sharing between accounts and a web readout which I at first was excited about, but the web interface wasn’t great for manipulation and data entry. They’ve since offered a standalone version, which is annoying after spending twice its price on subscriptions. I’ll give the app a go again and see what they’ve improved. It’s a great way to make sure data, reference and media are being gathered in a clean, searchable way.”

Finally, VFX supervisors who travel a lot during production might benefit from Trip Case, an app that manages your flights all from one place. “For work,” relates Rowe, “I fly all over the world. All you do is forward the Movement Order or E-ticket booking to trips@tripcase.com and it puts your flights into the wallet. It will update you of changes and cancellations, and, very importantly, it reminds you that you have trips coming up.”
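Neither Setellite’s schema nor the VES camera report template is reproduced in the article. Purely as an illustration of the kind of structured, per-take log these tools build and export, here is a hypothetical minimal sketch (all field names and values are invented):

```python
import csv
import io
from dataclasses import dataclass, asdict, fields

@dataclass
class VfxTake:
    # Hypothetical fields for illustration only; not Setellite's
    # or the VES camera report's actual schema.
    scene: str
    take: int
    camera: str
    lens_mm: float
    shutter_deg: float
    notes: str

takes = [
    VfxTake("42A", 3, "A-cam", 35.0, 180.0, "greenscreen, HDRI captured"),
    VfxTake("42A", 4, "A-cam", 50.0, 180.0, "tracking markers added"),
]

# Export the log as CSV, the lowest common denominator for handing
# structured on-set data to VFX houses and production.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=[f.name for f in fields(VfxTake)])
writer.writeheader()
writer.writerows(asdict(t) for t in takes)
print(buf.getvalue())
```

The point of a fixed schema like this, however it’s implemented, is exactly what Guerra describes: every take carries the same labeled metadata, so the report stays searchable downstream.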
LightSpectrum Pro: www.ampowersoftware.it/lightspectrum-pro
Luxi: www.makers4good.com/products/luxi
Magic Universal ViewFinder: www.dev.kadru.net
My Measures Pro: www.mymeasuresapp.com
Ricoh Theta V Remote App: www.theta360.com
Scriptation: www.scriptation.com
Setellite: www.setellite.nl
Shot Designer: www.hollywoodcamerawork.com
Sun Seeker: www.ozpda.com
Sun Surveyor: www.sunsurveyor.com
Theta S Bracket for HDR: www.sites.google.com/view/h360/theta-s-bracket-for-hdr
Trip Case: www.tripcase.com
Weekend Read: www.quoteunquoteapps.com/weekendread
SPRING 2019 VFXVOICE.COM • 21
2/22/19 1:02 PM
VFX TRENDS
App Websites Airtable www.airtable.com Annotable www.moke.com/annotable Artemis Pro www.chemicalwedding.tv Blocker www.blocker.afternow.io Cadrage Director’s Viewfinder www.cadrage.at TOP: The virtual stand-ins in the Blocker app.
FileMaker Go www.filemaker.com/products/filemaker-go Light Meter by WBPhoto www.willblaschko.com
TOP: 360-degree output from Hugo Guerra’s Ricoh Theta camera. MIDDLE: With many different kinds of lenses available to shoot with, Artemis Pro can be a handy way to see how a shot may ultimately look. BOTTOM: A Cadrage frame that Sue Rowe has annotated in Annotable.
shots’ to illustrate to the director the relevant VFX requirements. “I tend to take a series of images on the set, pre-shoot and send them to the director. This way the director can visualize the shot and be aware of the VFX work required. “For example,” Rowe continues, “I will often shoot with a 21mm lens and do a quick sketch using Annotable [see below] of where the VFX extensions will be needed. Then I will flip to a longer lens like a 35 or a 50 and show the same frame. By changing the lens, the set extensions would be less extensive or the depth of field would play to our advantage, and a scenic extension would suffice rather than a digital solution.” The Annotable app Rowe allows users to notate photos and documents, and then send them around quickly while on a production. Says Rowe: “I used it on The Meg to notate to myself where the shark was supposed to be in the chase sequence. The director was firing out ideas about each shot, and I found quick photos and a bad doodle of a shark told the story clearly. At the end of a 12-hour shoot the last thing you want to do is write up notes. With this tool I would put them into a document and make a PDF on the day of the shoot.” APPS FOR VFX PRODUCTION
Managing the flow of information is sometimes just as important as the imagery itself. And there are plenty of apps to help. Westworld Visual Effects Supervisor Jay Worth relies heavily, for example, on Airtable, an app that plugs into database software FileMaker Pro. “With working on multiple projects with multiple artists, this is the best method I have found to keep track of all the elements I need,” says Worth. “Something like Shotgun is great, but I don’t need to track that much data. For our most recent work on Westworld, we fully embraced Airtable and it was great – everything from financials and shot tracking to editorial count
20 • VFXVOICE.COM SPRING 2019
PG 16-21 WHAT APPS.indd 21
sheets and set data.” Bespoke solutions can also be generated out of FileMaker Go, FileMaker’s own tablet and phone app. “Many people have their own custom databases they have set up prior on the computer in FileMaker’s main software application,” discusses Harvey. “Then in the app it’s all about being able to quickly record the needed info. If you don’t have a custom camera report database, the VES has a great free one at camerareports.org.” Another common database solution is Setellite. According to Guerra, “this app is a complete package to create a full VFX report. It allows you to take notes, catalog all the cameras, lenses, lights, keep track of all the takes, and helps you label the VFX plates, among other things. It also has a cloud save in case you have 4G during filming. In the end, it creates a very complete PDF of everything that happens on set regarding VFX. This PDF is fundamental to keep information flowing between the VFX houses and the production.” Bancroft-Wilson says the latest versions of Setelitte have improved upon initial releases of the app. “It allows sharing between accounts and a web readout which I at first was excited about, but the web interface wasn’t great for manipulation and data entry. They’ve since offered a standalone version, which is annoying after spending twice its price on subscriptions. I’ll give the app a go again and see what they’ve improved. It’s a great way to make sure data, reference and media are being gathered in a clean, searchable way.” Finally, VFX supervisors who travel a lot during production might benefit from the use of Trip Case, an app that manages your flights all from one place. “For work,” relates Rowe, “I fly all over the world. All you do is forward the Movement Order or E-ticket booking to trips@tripcase.com and it puts your flights into the wallet. It will update you of changes and cancellations, and, very importantly, it reminds you that you have trips coming up.”
LightSpectrum Pro – www.ampowersoftware.it/lightspectrum-pro
Luxi – www.makers4good.com/products/luxi
Magic Universal ViewFinder – www.dev.kadru.net
My Measures Pro – www.mymeasuresapp.com
Ricoh Theta V Remote App – www.theta360.com
Scriptation – www.scriptation.com
Setellite – www.setellite.nl
Shot Designer – www.hollywoodcamerawork.com
Sun Seeker – www.ozpda.com
Sun Surveyor – www.sunsurveyor.com
Theta S Bracket for HDR – www.sites.google.com/view/h360/theta-s-bracket-for-hdr
TripCase – www.tripcase.com
Weekend Read – www.quoteunquoteapps.com/weekendread
FILM
THE NEW ARTIFICIAL INTELLIGENCE FRONTIER OF VFX By IAN FAILES
If there’s a buzz phrase right now in visual effects, it’s “machine learning.” In fact, there are three: machine learning, deep learning and artificial intelligence (A.I.). Each phrase tends to be used interchangeably to mean the new wave of smart software solutions in VFX, computer graphics and animation that lean on A.I. techniques. Already, research in machine and deep learning has helped introduce both automation and more physically-based results in computer graphics, mostly in areas such as camera tracking, simulations, rendering, motion capture, character animation, image processing, rotoscoping and compositing.

VFX Voice asked several key players – from studios to software companies and researchers – about the areas of the industry that will likely be impacted by this new world of A.I.

WHAT MACHINE LEARNING IS, AND WHAT IT CAN MEAN FOR VFX
TOP: Arraiy’s A.I.-based tracking solution being utilized to solve both camera match-moving and object tracking of a person.
What exactly is machine or deep learning? An authority on the subject is Hao Li, a researcher and the CEO and co-founder of Pinscreen, which is developing ‘instant’ 3D avatars via mobile applications with the help of machine learning techniques. He describes machine learning (of which deep learning is a subset) as the use of “computational frameworks that are based on artificial neural networks which can be trained to perform highly complex tasks when a lot of training data exists.” Neural networks have existed for some time, explains Li, but it was only relatively recently that ‘deep’ neural networks (which have multiple layers) could be trained efficiently with GPUs and massive amounts of data.

“It turned out that deep learning-based techniques outperformed many, if not most, of the classic computer vision methods for fundamental pattern recognition-related problems, such as object recognition, segmentation and other inference tasks,” says Li. Since many graphics-related challenges are directly connected to vision-related ones – such as motion capture, performance-driven 3D facial animation, 3D scanning and others – it has become obvious that many existing techniques would immediately benefit from deep learning-based approaches once sufficient training data can be obtained.

“More recently,” adds Li, “researchers have further come up with new forms of deep neural network architectures, where believable images of a real scene can be directly generated, either from some user input or even from random noise. Popular examples for these deep generative models include generative adversarial networks (GANs) and variational autoencoders (VAEs).”

MACHINE LEARNING IN ACTION

More on Pinscreen’s own implementation of these kinds of networks is below, but first a look at one of the most front-and-center examples of where machine learning has been used in VFX in recent times – Digital Domain’s Thanos in Avengers: Infinity War. Here, the visual effects studio used a type of machine learning to transform Josh Brolin’s face – captured with head-mounted cameras recording facial tracking markers – into the film’s lead character. This involved taking advantage of facial-capture training data.

“We already knew that we could build a system that will take motion-capture data and produce a result,” states Digital Domain’s Head of Digital Humans, Darren Hendler. “With machine learning, we can take the original system we built and now feed in corrections. All future results will then be corrected in the desired manner. This is a more rudimentary version of machine learning, but really shows great promise in speeding up the work and improving the quality.”

TOP: The CG character Thanos from Avengers: Infinity War. Digital Domain relied on machine learning techniques to help bring the character to life. BOTTOM: Digital Domain Senior Director of Software R&D Doug Roble in CG form – as part of a test of the studio’s facial-capture and CG humans pipeline.

Since their work on Infinity War, Digital Domain has furthered its deep learning techniques to create an entirely new facial-capture system. “Now,” says Senior Director of Software R&D, Doug Roble, “we can take a single image and in real time re-create a high-resolution version of any actor to a similar quality as our final result in the film. How this works sometimes seems like pure magic, but like all machine learning, it can be very temperamental and unpredictable, which makes the solution finding particularly rewarding.”

At Pinscreen, one of the goals of the company has been to generate photorealistic and animatable 3D avatars, complete with accurate facial and hair detail, from single mobile phone images of users. They’ve been relying on deep learning approaches to make that possible, based on extensive ‘semi-supervised’ or ‘unsupervised’ training data. The data, coupled with deep neural networks, is used to help predict the correct results for what a 3D avatar should look like – for example, to work out what facial expression should be displayed.

Pinscreen’s results are sometimes compared to ‘deep fake’ face-swapping videos, which have gained popularity by using deep learning techniques to re-animate a famous person’s face to make them say things they never actually said or appear where they never actually appeared. Li notes that “while the deep fakes code still requires a large amount of training data, i.e. video footage of a person, to create a convincing model for face swapping, we have shown recently at Pinscreen that the paGAN (photorealistic avatar GAN) technology only needs a single input picture.”

TOP: An example of Pinscreen’s 3D avatar technology, which is able to generate a 3D avatar from a single photo. BOTTOM: Pinscreen’s paGAN model allows for input expressions to be replicated in CG avatars of other people.

Research that Li is part of at the University of Southern California is looking at ways of generating photorealistic and fully-clothed production-level 3D avatars without any human intervention, and how to model general 3D objects using deep models. “In the long term,” he says, “I believe that we will fully democratize the ability to create complex 3D content, and anyone can capture and share their stories immersively, just like we do with video nowadays.”
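The “feed in corrections” workflow Hendler describes – a solver produces raw results, artists correct them, and a model learns the correction so future results come out pre-corrected – can be sketched with a simple regularized regression. This is a hypothetical toy illustration, not Digital Domain’s pipeline; all shapes, names and the synthetic data are invented.

```python
import numpy as np

# Toy stand-in: a solver emits raw animation controls per frame, artists
# supply corrected versions, and we fit a correction to reuse on new frames.
rng = np.random.default_rng(1)

n_frames, n_controls = 200, 10
raw = rng.normal(size=(n_frames, n_controls))         # raw solver output
true_fix = np.eye(n_controls) + 0.1 * rng.normal(size=(n_controls, n_controls))
corrected = raw @ true_fix                            # artist-approved output

# Learn the correction with ridge regression (regularized least squares).
lam = 1e-3
W = np.linalg.solve(raw.T @ raw + lam * np.eye(n_controls), raw.T @ corrected)

# "All future results will then be corrected in the desired manner":
# apply the learned map to frames the artists never touched.
new_raw = rng.normal(size=(5, n_controls))
auto_corrected = new_raw @ W
print(np.allclose(auto_corrected, new_raw @ true_fix, atol=1e-3))  # True
```

Production systems use far richer, nonlinear models, but the shape of the idea is the same: the training signal is the delta between what the system produced and what the artists accepted.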
MACHINE LEARNING IN CONTENT CREATION

The use of machine and deep learning techniques in the creation of CG creatures and materials is still relatively new, but incredibly promising, which is why several companies have been dipping their toes in the area. Ziva Dynamics, which offers physically-based simulation software called Ziva VFX, has been exploring machine learning, particularly in relation to its real-time solver technology. “This technology,” explains Ziva Dynamics co-CEO and co-founder James Jacobs, “makes it possible to convert high-quality offline simulations, crafted by technical directors using Ziva VFX, into performant real-time characters. We’ve deployed this tech in a few public demonstrations and engaged in confidential prototyping with several leading companies in different sectors to explore use cases and future product strategies.”

“The machine learning algorithms [enable artists] to interactively pose high-quality Ziva characters in real-time,” adds Jacobs. “The bakes produced from offline simulations are combined with representative animation data through a machine learning training process. From that, our solvers rapidly approximate the natural dynamics of the character for entirely new positions. This results in a fast, interactive character asset that achieves really consistent shapes, all in a relatively small file.”

TOP: This CG lion incorporates muscle and flesh simulations using Ziva VFX. BOTTOM: Ziva VFX is a Maya plugin. This frame shows a screenshot from the software.

Allegorithmic, which makes the Substance suite of 3D texturing and material creation tools, has also been exploring the field of A.I. to combine several material-related processes, such as image recognition and color extraction, into a single tool called Alchemist. Alchemist’s A.I. capabilities are powered by NVIDIA GPUs (NVIDIA itself is at the center of a great deal of computer graphics-related machine learning research). For one side of the Alchemist software, the delighter – created to help artists remove baked shadows from a base color or reference photo – a neural network was trained on Substance’s material library. Artists need their images to be free of such shadows in order to get absolute control over the material. The A.I.-powered delighter detects the shadows, removes
them, and reconstructs what is under the shadows.

In the motion-capture space, a number of companies are employing machine learning techniques to help make the process more efficient. DeepMotion, for instance, uses A.I. in several ways: to re-target and post-process motion-capture data; to simulate soft-body deformation in real time; to achieve 2D and 3D pose estimation; to train physicalized characters to synthesize dynamic motion in a simulation; and to stitch multiple motions together for seamless transitioning and blending. “These applications of A.I. solve a variety of problems for accelerating VFX processes, enabling truly interactive character creation, and expanding pipelines for animation and simulation data,” says DeepMotion founder Kevin He. “Machine learning has been used for years to create interesting effects in physics-based animation and the media arts, but we’re seeing a new wave of applications as computations become more efficient and novel approaches, like deep reinforcement learning, create more scalable models.”

Meanwhile, RADiCAL is also utilizing A.I. in motion capture and, in particular, challenging the usual hardware-based approach to capture. “Specifically,” notes RADiCAL CEO Gavan Gravesen, “our solution uses input from conventional 2D video cameras to produce 3D animation that requires little to no cleanup, coding, investment or training.”

“To do that,” adds Gravesen, “we’re not relying on hardware-driven detections of tons of small data points that are aggregated into larger data sums that, after some intensive cleaning up, collectively resemble human activity. Rather, we deliver learning-based, software-driven reconstructions of human motion in 3D space.”

AUTOMATION WITH A.I.

One of the promises of deep and machine learning is as an aid to artists with tasks that are presently labor-intensive. One of those tasks familiar to visual effects artists, of course, is rotoscoping. Kognat, a company started by Rising Sun Pictures pipeline software developer Sam Hodge, has made its Rotobot deep learning rotoscoping and compositing tool available for use with NUKE. Hodge’s adoption of deep learning techniques, and intense ‘training,’ enables Rotobot to isolate all of the pixels that belong to a certain class into a single mask – a process called segmentation. The effect is the isolation of portions of the image, just like rotoscoping.

“Then there is instance segmentation,” adds Hodge, “which can isolate the pixels of a single instance of a class into its own layer. A class could be ‘person,’ so with segmentation you get all of the people on one layer. With instance segmentation you can isolate a single person from a crowd.

“As an effects artist,” continues Hodge, “you might need to put an explosion behind the actors in the foreground – with the tool you can get a rough version done without interrupting the rotoscoping artists’ schedules. They can focus on final quality, and the temporary version can use the A.I. mask.”

TOP LEFT: Kognat’s Rotobot masking results. TOP RIGHT: In this demo scene, Arraiy’s software was able to extract these segmentations. BOTTOM LEFT: Rotobot demo on a number of netball players, where only the players are isolated in the footage. BOTTOM RIGHT: In a real-time demo, Arraiy’s tracking solution solved a high-quality object track of The Mill’s Blackbird car rig, which was then replaced with a photoreal CG vehicle.

Other companies are exploring similar A.I. image processing techniques, including Arraiy, which has employed several VFX industry veterans and engineers to work on machine learning and computer vision tools. “We have built software that makes it easy for creators to create custom A.I. learning algorithms to automate the manual processes that are currently required to generate VFX assets,” says Arraiy Chief Operating Officer Brendan Dowdle. “For example, we provide a tool in which an artist can train a neural network to create mattes for an entire shot by providing the network with a few labels on what the artist desires to segment. Once that network is trained for that particular shot, the algorithm can generate mattes for an arbitrary number of frames – even in real time, while on set, if desired.”

Foundry, maker of NUKE, says it is also investigating deep learning approaches that could be implemented into its software. This includes rotoscoping, the more mechanical ‘drudge work’ in VFX, and the workflow side of projects. “There are many eye-catching developments in deep learning which center around image processing,” notes Foundry co-founder and chief scientist Simon Robinson. “What’s equally interesting is the application of these algorithms to more organizational or workflow-centric tasks. Modern VFX is an extraordinarily complicated management problem, whether of human or computing resources. This is especially true when running multiple overlapping shows. Spotting scheduling patterns and improving efficiency and resource utilization is one of the areas where better algorithms could make a real difference to the industry.”

INTO AN A.I. FUTURE
There isn’t just one thing that A.I. or machine learning or deep learning is bringing to visual effects; it’s many things. Digital Domain’s Darren Hendler summarizes: “Machine learning is making big inroads in accelerating all sorts of slow processes in visual effects. We’ll be seeing machine learning muscle systems, hair systems and more in the coming years. In the future, I really see all these machine learning capabilities as additional tools for VFX artists to refocus their talents on the nuances for even better end results.”
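The distinction Hodge draws between segmentation and instance segmentation can be made concrete with a toy example. The masks below are invented stand-ins for what a trained network such as Rotobot would actually predict for a frame; this is plain NumPy, not any vendor’s API.

```python
import numpy as np

# Two hypothetical per-instance masks for the class "person" in a 6x8 frame.
h, w = 6, 8
person_a = np.zeros((h, w), dtype=bool)
person_a[1:4, 1:3] = True                  # first person occupies 6 pixels
person_b = np.zeros((h, w), dtype=bool)
person_b[2:5, 5:7] = True                  # second person occupies 6 pixels

# Instance segmentation: each person already sits on its own layer.
instances = [person_a, person_b]

# Semantic segmentation: "all of the people on one layer" is just the union.
class_mask = np.logical_or.reduce(instances)

# "Isolate a single person from a crowd": pick one instance layer.
single = instances[0]

print(int(class_mask.sum()), int(single.sum()))  # 12 6
```

The class mask is what a compositor would use to slip an explosion behind every actor at once; the instance layers are what you would use to treat one performer differently from the rest.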
TOP: A demo from DeepMotion of its A.I. motion-capture capabilities. BOTTOM: RADiCAL’s A.I. motion-capture solution is designed to work with no suits and no hardware.
FILM
Pet Sematary (Paramount Pictures) Release date: U.K., April 5; U.S., April 5 Kevin Kölsch and Dennis Widmyer direct this new film adaptation of the Stephen King novel, in which Louis Creed (Jason Clarke), his wife Rachel (Amy Seimetz) and two kids move to a rural home located near a most peculiar burial ground. Paul Docherty is the VFX Production Manager. Mr. X is handling VFX.

Shazam! (Warner Bros. Pictures) Release date: U.K., April 5; U.S., April 5 One magic word transforms young Billy Batson (Asher Angel) into superhero Shazam (Zachary Levi), but he must also learn to get serious with his newfound powers. David F. Sandberg helms the movie based on the DC Comics character. MPC (VFX Supervisor Arundi Asregadoo) and Mr. X (VFX Supervisor Aaron Weintraub) are the main visual effects houses.

Hellboy (Lionsgate) Release date: U.K., April 11; U.S., April 12 Director Neil Marshall reboots the Hellboy film series, as David Harbour (Stranger Things) portrays the title character who must save mankind from Nimue, an ancient sorceress (Milla Jovovich). Steven Begg is the Production VFX Supervisor, and Mr. X, Digital District, Rhythm & Hues, Goodbye Kansas and WWFX participated in the VFX.
FRESH SPRING VFX By CHRIS McGOWAN
TOP: Game of Thrones (Photo courtesy of HBO)
28 • VFXVOICE.COM SPRING 2019
Visual effects talent worked on a trio of high-profile movies releasing this spring that come laden with blockbuster expectations – Avengers: Endgame with Robert Downey Jr. and Chris Evans, Ad Astra with Brad Pitt and Tommy Lee Jones, and Disney’s live-action remake of Aladdin, directed by Guy Ritchie and starring Will Smith. With all that big-screen firepower, however, the biggest VFX event of spring may be the final series run of Game of Thrones on television. VFX also plays its part as humankind is threatened in Godzilla: King of the Monsters, where a variety of giant beasts once again bring verisimilitude to city-squashing creatures. And effects masters signed on to enhance Pet Sematary, the Hellboy reboot and Shazam!

VFX Voice has compiled a list of some of the top effects-infused productions due this spring along with their visual effects producers, supervisors, and the VFX houses involved. This is not intended to be a complete listing. The coverage period is April and May. Release dates are subject to change.
Game of Thrones (HBO) Release date: U.K., April 14; U.S., April 14 Visual effects royalty returns for its eighth and final season. David Benioff, D.B. Weiss, David Nutter and Miguel Sapochnik will be the directors. Joe Bauer is the Visual Effects Supervisor.
Missing Link (Annapurna Pictures) Release date: U.K., April 5; U.S., April 12 Mr. Link, eight feet tall and furry, hopes explorer Sir Lionel Frost can help him locate long-lost relatives in Shangri-La. Zach Galifianakis, Hugh Jackman, Zoe Saldana, Stephen Fry and Emma Thompson are among the vocal talents in the animated tale from director Chris Butler and the Laika stop-motion-animation studio. Richard Thwaites is Visual Effects Production Supervisor and Steve Emerson is Visual Effects Supervisor.

Avengers: Endgame (Walt Disney Studios) Release date: U.K., April 26; U.S., April 26 After Thanos wipes out half of all life in the universe, the remaining Avengers must undo the Mad Titan’s deed. With an ensemble of Marvel superheroes like this – Robert Downey Jr., Chris Evans, Chris Hemsworth, Scarlett Johansson, Don Cheadle, Gwyneth Paltrow, Chadwick Boseman and Mark Ruffalo – Thanos (Josh Brolin) is hard-pressed to wipe out the rest of the universe. Anthony Russo and Joe Russo direct. Dan DeLeeuw is Visual Effects Supervisor, Jen Underdahl is a Visual Effects Producer, and DNEG, Weta Digital, Framestore, ILM and Digital Domain contributed VFX.

Pokémon: Detective Pikachu (Warner Bros. Pictures) Release date: U.K., May 10; U.S., May 10 Ace detective Harry Goodman has gone missing and it will be up
TOP: Pet Sematary (Photo © 2018 Paramount Pictures) MIDDLE: Shazam! (Photo © 2018 Warner Bros. Pictures) BOTTOM: Hellboy (Photo © 2018 Lionsgate)
SPRING 2019 VFXVOICE.COM • 29
to his son Tim (Justice Smith) and wise-cracking Detective Pikachu (voiced by Ryan Reynolds) to discover his whereabouts in the sprawling metropolis of Ryme City. The Rob Letterman film has Visual Effects Producers Greg Baxter and Geoff Anderson, bolstered by Visual Effects Supervisor Erik Nordby for MPC, with VFX work by Framestore, Image Engine and Clear Angle Studios.

John Wick: Chapter 3 - Parabellum (Lionsgate) Release date: U.K., May 17; U.S., May 17 This stylish action series starring Keanu Reeves continues, with Chad Stahelski directing and a host of stars in the cast, including Ian McShane, Halle Berry, Laurence Fishburne and Anjelica Huston. Kerry Joseph is the Visual Effects Producer. Image Engine and Proof Inc. are on board for VFX.

Ad Astra (20th Century Fox) Release date: U.K., May 24; U.S., May 24 Twenty years after Clifford McBride (Tommy Lee Jones) left for Neptune to find signs of extra-terrestrial life, his son Roy (Brad Pitt) goes in search of him in this James Gray film. Bradley Parker is Production VFX Supervisor, Allen Maris is Visual Effects Producer, and Christopher Downs and Jiwoong Kim are Visual Effects Supervisors. Weta Digital, Vitality VFX, Mr. X, Method Studios, MPC, Capital T, Shade VFX and Brainstorm VFX are handling the VFX load.

Aladdin (Walt Disney Studios) Release date: U.K., May 24; U.S., May 24 Guy Ritchie helms the live-action remake of the 1992 animated feature with Will Smith as the voluble Genie. Jeff Capogreco, Chas Jarrett and Michael Mulholland are Visual Effects Supervisors working with the talent at ILM, Magiclab, Host VFX, One of Us, Clear Angle Studios and Nvizage.
TOP TO BOTTOM: Missing Link (Photo © 2018 Annapurna Pictures)
Godzilla: King of the Monsters (Warner Bros. Pictures) Release date: U.K., May 31; U.S., May 31 Millie Bobby Brown (Stranger Things) and Sally Hawkins (The Shape of Water) lead the cast in the latest telling of the Godzilla saga, joined by Vera Farmiga, David Strathairn and Ken Watanabe, not to mention Mothra, Rodan and three-headed King Ghidorah. Maricel Pagulayan is Visual Effects Producer, Guillaume Rocheron is Visual Effects Supervisor, and DNEG, Clear Angle Studios, Method Studios, MPC, Raynault VFX, Ollin VFX, Factory VFX and Rodeo FX all participated.
Avengers: Endgame (Photo © 2018 Walt Disney Studios) Pokémon: Detective Pikachu (Photo © 2018 Warner Bros. Pictures) John Wick: Chapter 3 (Photo © 2018 Lionsgate)
30 • VFXVOICE.COM SPRING 2019
COVER
WALKING A TIGHTROPE ON TIM BURTON’S DUMBO By KEVIN H. MARTIN
All images © Disney Enterprises Inc. TOP: Director Tim Burton had a small green-suited performer stand in for CG Dumbo on set so the actors could seem to physically interact with the character. (Photo: Jay Maidment) BOTTOM: Colin Farrell, who plays Holt Farrier, a former circus star hired to take care of Dumbo, on set with full-scale elephant models and helpers in green. (Photo: Jay Maidment)
32 • VFXVOICE.COM SPRING 2019
Dumbo first took flight in the pages of an illustrated children’s book, before Walt Disney selected it as an inexpensive follow-up to a pair of costly (and at that point not yet profitable) animated features released in 1940, Pinocchio and Fantasia. The tale of an ostracized baby circus elephant who becomes the big top’s star attraction when it’s discovered that by flapping his ears, he can fly, Dumbo warmed hearts for multiple generations, spanning four theatrical reissues as well as strong success in home video, while the ‘Dumbo the Flying Elephant’ ride has become a staple at the various Disney theme parks.

An animated sequel was announced in 2001 and quietly canceled a few years later, so it was not until 2014 that the live-action remake secured approval, with director Tim Burton coming on board the following year. In the film, this little elephant not only rejuvenates the circus’ business, but draws the attention of one V.A. Vandevere (Michael Keaton), who intends to feature Dumbo in his spectacular Dreamland facility, where some things wicked lie beneath.

Burton selected Richard Stammers (Prometheus, The Martian) as his VFX supervisor. “Tim’s original brief was that this be an expressionist movie,” Stammers recalls. “We needed to approach realism, but you can’t go too far in that direction because his take always has a heightened reality that won’t work with a documentary look, [resulting in] the constant striving for us to maintain visual credibility while also incorporating Tim’s stylistic needs.” Before overseeing the contributions of several VFX vendors, Stammers, along with key production leads, re-screened the original cartoon. “That was mainly to see how the emotions played during key character moments,” he reports. “There are certainly aspects and qualities of the animation in the original that we wanted to carry over, but most of the time our animation had real elephant behavior as a starting point. 
There was a ton of great reference provided that allowed us to study both grown and baby elephants, seeing exactly how they move. “The ears on baby elephants really do look huge, and we took note of the kind of cycle that represented their normal ear flapping. In Dumbo’s case, he’d really need the most enormous ears to carry that size and weight; they’d be dragging on the floor as he walked around, and that was going a bridge too far design-wise for us. We looked at the cartilage that supports the ears, too. We considered including some translucency in there, but Tim didn’t like seeing veins and blood vessels when we tested the look, so we decided to keep them opaque. I must say, there was a bat-wing quality to the translucent look, so the thick leathery look worked best, and carried a sense of weight with it that helped with the idea that the ears were strong enough to keep him airborne.” Among the key first calls was determining the size and shape of Dumbo’s head. “One thing that came out of our earliest explorations was that a normal-sized elephant head didn’t work with these larger ears,” says Stammers. “So the design of Dumbo evolved from a slightly larger head with correspondingly larger features like the eyes. This actually helped with the performance and allows the audience to see these expressive features even more clearly. There was some to-and-fro with facial elements being resized based
on how well they worked to create a character that was cute and relatable. I think this time up front was very well spent, so by the time we started shooting, the design issues were mostly resolved. Tim was happy with things, and we all agreed to stick with this look through principal photography. This allowed us to set up our interactives appropriately on set and to establish eyelines for the actors on a consistent basis.”

But while the elephant in the room was being dealt with, the pachyderm in the air remained an issue. “There’s just not much to rely upon regarding the physics of flying baby elephant ears,” declares Stammers. “So we took the flight cycle from a large bird and applied that for a series of animation tests that were aimed at letting us see what might work, that you would believe an elephant could fly. There wasn’t anything in the original film’s animation that helped us convey an actual physicality to his flying, so it was definitely new ground being pioneered, to make it feel right with that weight. There wasn’t going to be any single solution that was enough in and of itself. This required us to make a leap of faith when depicting something that weighs 80 kilos [176 lbs.] and flies by flapping its ears.”

Stammers recalls Burton being much impressed with MPC’s character work on The Jungle Book. “Facility skillsets are a major factor when deciding how to award the work,” he acknowledges. “MPC did a marvelous job matching the set lighting, which got us about 90% of the way there pretty quickly with their lighting model. But since Dumbo is the star, there was the matter of adding another pass, one that makes the hero character the focus of the scene, much as beauty lighting on set brings out the lead actor and actress. 
We’d often add small levels of bounce light to fill in the shadow side of Dumbo’s face.”

Production’s creature effects team was able to build a full-size practical Dumbo, textured with appropriate skin and hair, and even correct eye color, for use on set. “Even though it was not 100% accurate to what we eventually finalized during post, it was definitely close enough in scale and color to be a very useful lighting reference in every live-action setup shot by [director of photography] Ben Davis. He could see how the physical stand-in would read, and in darker scenes he would know to add lights to suggest kicks and glints in the eyes. So when we shot our HDRIs and silver/gray balls, it let us match his work with our CG lighting when adding CG Dumbo into the plates. Sometimes we might embellish the eye glint, because those caustics in the eye really helped bring his iris to life.”

Burton also made use of a small green-suited performer who could stand in for Dumbo on set. “We put a kind of tortoise shell on his back so the actors could seem to have physical interaction with the character. Sometimes the actions would have to get painted out if the scene evolved, but it was enough so that Tim could direct the scene with confidence.”

Previs was principally limited to flying scenes. “When Dumbo was on the ground, it wasn’t often needed,” affirms Stammers, “but if he went flying past camera, cues had to be worked up so special effects could bring the wind to blow off people’s hats or kick up some hay. The previs also helped with figuring out moving
“There’s just not much to rely upon regarding the physics of flying baby elephant ears. So we took the flight cycle from a large bird and applied that for a series of animation tests that were aimed at letting us see what might work, that you would believe an elephant could fly.” —Richard Stammers, VFX Supervisor
TOP: Holt Farrier’s children, Joe (Finley Hobbins) and Milly (Nico Parker) riding a “flying” Dumbo model against bluescreen, with special effects bringing in the wind. (Photo: Jay Maidment) BOTTOM: Aerial artist Colette Marchant (Eva Green) poses atop Dumbo rig as Tim Burton and lighting crew prep scene. (Photo: Jay Maidment)
SPRING 2019 VFXVOICE.COM • 33
“This feature is shot entirely onstage, which meant having to create horizons and set extensions when portraying exterior environments shot against greenscreen. Much of the film’s third act takes place in a locale called Dreamland, and Framestore handled a lot of standalone shots there.” —Richard Stammers, VFX Supervisor TOP LEFT: Burton on set directing Eva Green with a full-size, practical Dumbo, textured with appropriate skin and hair. (Photo: Jay Maidment) TOP RIGHT: Dumbo’s green-suited stand-in emerges from his circus train cage under the watchful eye of director Tim Burton. (Photo: Leah Gallo) BOTTOM: Colin Farrell, Danny DeVito, Deobia Oparei and cast encounter Dumbo in the form of a green stand-in. (Photo: Jay Maidment)
34 • VFXVOICE.COM SPRING 2019
eyelines – we could see if the old tennis-ball-on-a-stick would work for specific cuts. When flying the camera round to represent Dumbo’s POV, we also had a programmable winch system, so the crowds knew where to look as he traveled above their heads. That technique also allowed us to direct spotlights to pan along and follow his flight path.” Production utilized the ACES [Academy Color Encoding Specification] pipeline, which incorporated a film-emulation show LUT favored by cinematographer Davis. “The whole delivery process from the original pulls, to Pinewood Digital [for dailies], editorial and back to DI is working very well,” reports Stammers. “There are a lot of seamlessly shared shots where Framestore created backgrounds that MPC put their Dumbo into, and other shots involving either or both of them and Rodeo FX. Having a robust pipeline is essential when you’re taking something like this on.” Stammers also relied upon an in-house VFX team that tackled various scenes in concert with other vendors, which included Rising Sun Pictures in Australia, Atomic Arts in London and Germany’s RISE. Of the film’s 1,800 VFX shots, only about 800 feature digital elephants. “This feature is shot entirely onstage, which meant having to create horizons and set extensions when portraying exterior environments shot against greenscreen,” notes Stammers. “Much of the film’s third act takes place in a locale called Dreamland, and Framestore handled a lot of standalone shots there.” The skies of Dumbo reflect Burton’s dictum about expressionism. “Each cloud has been placed after its position has been considered for compositional value, and the same is true for the color values in the sky,” Stammers notes. “We shot about 300 skydomes to capture the range of looks demanded. And all this happened while we worked closely with Ben. He dealt with these very large set builds, but since we were adding the skies, that would have a massive effect on his lighting. 
Ben did his pre-lights while referencing the colors we had chosen. His lighting came from a giant RGB LED lightbox over the whole stage, one that could be set to particular color temperatures. If we needed a very red sunset, that
TOP AND BOTTOM: Danny DeVito, Colin Farrell and cast respond to a stick-and-ball figure of Dumbo against bluescreen, and in the final shot with Dumbo shielded by a pile of hay. (Photos: Jay Maidment)
SPRING 2019 VFXVOICE.COM • 35
COVER
“This feature is shot entirely onstage, which meant having to create horizons and set extensions when portraying exterior environments shot against greenscreen. Much of the film’s third act takes place in a locale called Dreamland, and Framestore handled a lot of standalone shots there.” —Richard Stammers, VFX Supervisor TOP LEFT: Burton on set directing Eva Green with a full-size, practical Dumbo, textured with appropriate skin and hair. (Photo: Jay Maidment) TOP RIGHT: Dumbo’s green-suited stand-in emerges from his circus train cage under the watchful eye of director Tim Burton. (Photo: Leah Gallo) BOTTOM: Colin Farrell, Danny DeVito, Deobia Oparei and cast encounter Dumbo in the form of a green stand-in. (Photo: Jay Maidment)
34 • VFXVOICE.COM SPRING 2019
PG 32-37 DUMBO.indd 34-35
eyelines – we could see if the old tennis-ball-on-a-stick would work for specific cuts. When flying the camera round to represent Dumbo’s POV, we also had a programmable winch system, so the crowds knew where to look as he traveled above their heads. That technique also allowed us to direct spotlights to pan along and follow his flight path.” Production utilized the ACES [Academy Color Encoding Specification] pipeline, which incorporated a film-emulation show LUT favored by cinematographer Davis. “The whole delivery process from the original pulls, to Pinewood Digital [for dailies], editorial and back to DI is working very well,” reports Stammers. “There are a lot of seamlessly shared shots where Framestore created backgrounds that MPC put their Dumbo into, and other shots involving either or both of them and Rodeo FX. Having a robust pipeline is essential when you’re taking something like this on.” Stammers also relied upon an in-house VFX team that tackled various scenes in concert with other vendors, which included Rising Sun Pictures in Australia, Atomic Arts in London and Germany’s RISE. Of the film’s 1,800 VFX shots, only about 800 feature digital elephants. “This feature is shot entirely onstage, which meant having to create horizons and set extensions when portraying exterior environments shot against greenscreen,” notes Stammers. “Much of the film’s third act takes place in a locale called Dreamland, and Framestore handled a lot of standalone shots there.” The skies of Dumbo reflect Burton’s dictum about expressionism. “Each cloud has been placed after its position has been considered for compositional value, and the same is true for the color values in the sky,” Stammers notes. “We shot about 300 skydomes to capture the range of looks demanded. And all this happened while we worked closely with Ben. He dealt with these very large set builds, but since we were adding the skies, that would have a massive effect on his lighting. 
Ben did his pre-lights while referencing the colors we had chosen. His lighting came from a giant RGB LED lightbox over the whole stage, one that could be set to particular color temperatures. If we needed a very red sunset, that
TOP AND BOTTOM: Danny DeVito, Colin Farrell and cast respond to a stick-and-ball figure of Dumbo against bluescreen, and in the final shot with Dumbo shielded by a pile of hay. (Photos: Jay Maidment)
COVER
TOP LEFT: Director Tim Burton wanted Dumbo to be an Expressionist film, with an emphasis on emotional shadings and compositional and color values. (Photo: Jay Maidment) TOP RIGHT: Circus owner Max Medici (Danny DeVito), circus performer Rongo the Strongo (Deobia Oparei) and the rest of the big-top team welcome a newborn Dumbo into their tight-knit family. BOTTOM LEFT: Joe (Finley Hobbins) and Milly (Nico Parker) working with a full-size, greensuited Dumbo model. (Photo: Jay Maidment) BOTTOM RIGHT: VFX Supervisor Richard Stammers and the production team studied adult and baby elephants to see exactly how they move. They studied elephants’ ears and how they flapped, their size, weight, texture and translucency.
whole lightbox could reflect that. He also had a hard key light for the sun that was sometimes in frame to give us the proper flaring, but the lightbox was crucial, giving us the color of the ambients that was so very important in creating the hues that tied into our CG skies.” While the 3D conversion was handled by Double Negative, Stammers notes that the HDR version will require careful attention. “We’re trying to be very mindful of how it will impact things, especially with the bright lights in the circus scenes, which could distract from the film’s focus. Even the glints in Dumbo’s eyes could pop too much, registering overly bright in a jarring manner. So that is going to be a matter of riding careful herd to preserve those nice rolloffs in the highlights. I make a point of letting the colorist [Darren Rae] know in advance about the shots where this might become a factor and require addressing.” “For a circus picture, it’s rather amusing to find we’re walking a bit of a tightrope through post, balancing stylization against credibility,” Stammers laughs. “I think the fact we can keep it under control shows it paid dividends to work so much out in advance of shooting.”
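The “nice rolloffs in the highlights” Stammers describes come from the soft shoulder a film-style tone curve applies above a knee point, compressing bright values toward a ceiling instead of clipping them hard. As a rough illustration only – the knee and ceiling values below are hypothetical, not the show’s actual grade – such a rolloff can be sketched like this:

```python
def soft_rolloff(x, knee=0.8, ceiling=1.0):
    """Leave values below `knee` untouched; compress values above it so
    they approach `ceiling` smoothly instead of clipping hard.
    The curve is continuous in both value and slope at the knee."""
    if x <= knee:
        return x
    overshoot = x - knee
    headroom = ceiling - knee
    # Saturating curve: large overshoots asymptote toward `ceiling`
    return knee + headroom * overshoot / (overshoot + headroom)
```

An HDR master raises the ceiling, which is why elements graded for a higher peak – circus lights, eye glints – can “pop too much” unless the shoulder is re-balanced shot by shot.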
TV
ENHANCING REALITY IN TOM CLANCY’S JACK RYAN By TREVOR HOGG
TOP: The Paris apartment explosion witnessed by Jack Ryan (John Krasinski) was created by using gas explosions and demolition references. (Photo: Jan Thijs. Image courtesy of Amazon) BOTTOM: One of the biggest set extensions for Hybride was the refugee base camp in Episode 5, which involved plate cleanups and reconstructions, ground and background replacements, designing CG tents, digital doubles, as well as dust and smoke effects. (Photo: Jan Thijs. Image courtesy of Amazon)
Shifting from the big screen to the streaming service Amazon Prime is Tom Clancy’s Jack Ryan, the further adventures of former U.S. Marine turned CIA analyst Jack Ryan, with John Krasinski taking over the role previously played by Alec Baldwin, Harrison Ford, Ben Affleck and Chris Pine. In order to visualize the espionage world originally conceived by author Tom Clancy, series creators Carlton Cuse and Graham Roland recruited Erik Henry, who has won Primetime Emmy Awards for his visual effects work on John Adams (HBO) and Black Sails (Starz). “We did have a consultant from the CIA and he was forthcoming with, ‘This is how small that would be’ or, ‘The technology that we have is much better than what you’re seeing on the Internet.’ It was an interesting balance,” says Henry. “A perfect example is surveillance videos. We made sure to give the audience something better than what they’re seeing on YouTube, but not so crazy good that it would take them out of the show.” A decision was made to shoot the eight episodes of the first season all at the same time, with the post-production period lasting six months. “We would ask the editors to cut visual effects sequences quickly, get those in front of Carlton and Graham for approval, and then show them to Paramount and Amazon. That way we could get these things in the pipeline, and start the look development and animation,” explains Henry. “It guaranteed that we had a good amount of time for sequences or shots that needed attention.” Continuity was not a major issue. “There was a bit of luck in that we did not have a lot of changes,” he adds. “If there was one scene that we experimented with a lot and even added some things that we hadn’t planned originally, it was the opening of Episode 1 with the jet airplanes dropping bombs on the kids’ home in Lebanon.” There are a little over 1,000 visual effects shots in the show, with a significant amount involving wire removals and cleanups.
“There wasn’t a lot of reuse [when it came to the digital assets],” remarks Henry. “We would have an episode where they said, ‘We need to have a 737 parked on the tarmac there.’ It was like, ‘Okay, we’ll do that.’ We absolutely modelled a 737, put it into the scene, and it’s a beautiful shot, but it’s the one and only time that we did it. The airport runway scene was shot only 40 yards from the fuselage, so there was quite a bit of detail that had to go into it to make that look real, including having the sense of dirt or water stains at the corners of windows that you see on planes that have not had a lot of maintenance.” The opening bombing run in the pilot episode was completely done with previs and thoroughly vetted before heading to Morocco to shoot the plate photography; however, other sequences came late in the production. “The ending of Episode 8 was rewritten to amp things up, so we have a CG subway sequence where a character gets shot, stumbles in front of the train and is hit,” recalls Henry. “That and the fact that our main character had to squeeze himself against the tunnel wall as the train races by him. We didn’t have enough time to do previs, but we came up with a rig to get interactive light coming from the train windows onto John Krasinski as he moves through the tunnel. You take it as it comes.
I never felt that we were in a position where there was a sense that we were not prepared for a sequence.” Helming the pilot episode was Norwegian filmmaker Morten Tyldum, who previously directed Headhunters and The Imitation Game. “The script was solid,” notes Henry. “Morten brought his film background to it, and we have some wonderful actors in John Krasinski, Wendell Pierce and Ali Suliman. They made this not your average run-of-the-mill television show.” Various kinds of visual research were conducted. “Graham Roland played an important role in vetting our research because he fought in Iraq as a member of the armed services. That helped us quite a bit. We did research into specific types of writing that would be appropriate for a Turkish airport sign and made sure to authentically name people from other countries, whether they be Chechen or Russian. We spent the time and money to bring in graphics companies that had worked on $100 million feature films. I could go on about the level of detail that Jack Ryan so elegantly puts on the screen.” A total of 222 visual effects shots amounting to 17 minutes of screen time were produced by Hybride for Tom Clancy’s Jack Ryan. “Most of the FX produced for the show involved some level of creative and technical challenge,” notes Hybride Visual Effects Supervisor François Lambert. “When the pickup truck explodes in the army base in Episode 1, for example, we needed to intercut actual pyro elements, just like we did for the Paris apartment explosion that created a massive fireball and smoke plume. In Episode 7, one of the terrorists is testing a device they plan on using to release poisonous gas in a hospital. For this scene, it was important that we understand that it’s the cell phone that will be triggering the explosion of two glass cartridges that contain a deadly gas, and that this gas will be sucked in by the fan and dispersed through the vents.
It took us a few tries before we got the dust particles exploding and swirling into the fan in a way that would make sure the audience would understand exactly what was going on.” Cinesite delivered 111 shots. “One of the biggest sequences in the show was one where we had not anticipated any effects,” reveals Cinesite VFX Supervisor Gunnar Hansen. “The scene took place in a Vegas casino. As they had shot a casino in Montreal, the architecture and black ceilings were not at all like a grand Vegas-style casino. The request from production was to replace the ceilings in all the shots. The tracking and layout team did an incredible job of reverse engineering the camera positions and placing grand, lofted CG ceilings to match those seen in the larger casinos of Vegas.” Hansen was on set when principal photography was taking place in Montreal. “I am looking forward to the pizza parlour explosion,” he says. “It was fun to be there on the shoot as they dressed up and blew up the façade of one of my favourite pizza places. Cinesite enhanced the huge fireball with realistic CG debris, and flaming bits and pieces. Thankfully, production reset the real pizza place the very next day.” Outpost VFX looked after 70 shots across three episodes. “We did a fair amount of research regarding de-aging the antagonist Suleiman [Ali Suliman],” remarks Giorgio Pitino, Visual Effects
TOP: Montreal stands in for Washington, D.C. as Jack Ryan cycles through the city. (Photo: Jon Cournoyer. Image courtesy of Amazon) MIDDLE: On the set with Jack Ryan in Paris. (Photo: Jon Cournoyer. Image courtesy of Amazon) BOTTOM: A key dynamic is the relationship between Jack Ryan and his CIA superior James Greer (Wendell Pierce). (Photo: James Minchin III. Image courtesy of Amazon)
TOP LEFT: Real aircraft such as an Elizabeth City Coast Guard helicopter, as well as digital versions of a Boeing 747 and Dassault-Breguet Super Étendard (a French strike fighter), make an appearance throughout the series. (Photo: Myles Aronowitz. Image courtesy of Amazon) TOP RIGHT: Omar Rahbini (Helmi Dridi) and Suleiman (Ali Suliman), with the latter being de-aged using a 2D approach for two flashback sequences. (Photo: James Minchin III. Image courtesy of Amazon) BOTTOM LEFT: An effort was made to go to actual locations such as Paris to add to the sense of authenticity. (Photo: Jan Thijs. Image courtesy of Amazon) BOTTOM RIGHT: John Krasinski shooting on location with series co-creator Carlton Cuse, who directed Episode 6. (Photo: Jan Thijs. Image courtesy of Amazon)
Supervisor at Outpost VFX. “We’ve done quite a lot of 2D de-aging in the past for things like Nocturnal Animals and always do plenty of photo research to ensure that any de-aging work we do is as natural as possible. Usually this involves tracking down pictures of the talent from when they were a little younger and then sort of reverse engineering the aging process with visual effects.” Critical was finding the right balance between de-aging Suleiman and making him believable as a younger man. “Retouching an actor’s face is a subtle art that needs a lot of fine tweaks and balance. The most difficult part is to make someone look younger but still maintain the key features of their face and not create a look that feels soft or airbrushed. For instance, rotating the corner of the eyes by just a couple of pixels could drastically change the final look, making the actor appear completely different.” Important Looking Pirates was responsible for 36 shots over three episodes. “The two main sequences we worked on take place in Lebanon in the 1980s, where we follow the story of two brothers who get caught in a life-threatening air raid,” remarks Important Looking Pirates VFX Supervisor Bobo Skipper. “For the first sequence our task was to create fighter-jet aircraft flying in for a bomb run that demolishes a village,” he explains.
“The plates we received were shot in Morocco with a few houses in the foreground. Part of our job was to set-extend the shots with a CG village, including trees and vegetation. We then simulated this environment together with explosions and smoke plumes rising from the bombardment. Later in the sequence, we find the younger brother stuck under a burning log. For these shots, our job was to enhance the plates with fire, embers and smoke. We created a CG version of the log that replaced the on-set prop, giving it a scorched look, and adding burning embers, flames and smoke.” Territory Studio created 100 graphic assets to support story beats involving military operations, scrubbing drone footage, data searches and facial recognition. “We mainly looked at drone reconnaissance and firing footage to make this as realistic as possible,” states Territory Studio Creative Director Nils Kloth. “The story is meant to play out now and be almost nonfiction,” Kloth says, “so realism was one of the key driving factors for all the work we did. For us, it is always important to ensure that we build a believable system that people feel comfortable operating and interacting with.” Kloth adds, “The biggest challenge was the lack of information available at the time we started work. The episodes and scripts developed and changed, thus affecting the graphics and what they
TOP LEFT: A gray model render of the carpet bombing that occurs in the opening scene of the series. (Image courtesy of Important Looking Pirates) TOP RIGHT: The final composite of the carpet bombing created by Important Looking Pirates. (Image courtesy of Important Looking Pirates) BOTTOM LEFT: 300 extras were not enough to fill the scene of a refugee camp, so Hybride created a crowd simulation that was interwoven into the live-action plate. (Image courtesy of Hybride) BOTTOM RIGHT: A close-up shot of the crowd simulation produced by Hybride for the refugee camp scene. (Image courtesy of Hybride)
TOP LEFT: Storefront of a pizza shop shot on location in Montreal. (Image courtesy of Cinesite) TOP RIGHT: Cinesite digitally augments an explosion featuring fire and glass shards coming from the interior of the pizza shop. (Image courtesy of Cinesite) BOTTOM LEFT: A command search computer screen created by Territory Studio. (Image courtesy of Territory Studio) BOTTOM RIGHT: Territory Studio was responsible for creating computer screen graphics for drone footage. (Image courtesy of Territory Studio)
needed to do.” Roto animation was favored over greenscreen largely because the handheld cinematography did not lend itself to motion-control cameras. “We have some soundstage work like when the scene is at the CIA,” notes Henry. “We knew right from the get-go that in order for this show to be successful you couldn’t use the California desert for Morocco or the Middle East. There are certain things like the people, vehicles and the ways that roads are built that give such credibility when you globetrot to Paris, Moscow, London or Washington, D.C.” Shooting extensively on location presented logistical challenges such as the jihadi explosion in a Parisian apartment. “We were able to seal the street because it was almost like an alley, but no explosions were ever going to happen. We were re-enacting to some degree an event that did occur in the Muslim neighborhood of Paris a few years back where the police stormed an apartment and a suicide vest was detonated and blew out a section of the building. We toured that area of Paris while scouting and picked a neighborhood that was close to it. Lots of great plates and simulation work by Hybride.” Digital doubles needed to be created for refugees trying to get onto boats. “There were 300 extras, and we clearly understood after looking at it that we were going to need about 1,000,” explains
“We did research into specific types of writing that would be appropriate for a Turkish airport sign and made sure to authentically name people from other countries, whether they be Chechen or Russian. We spent the time and money to bring in graphics companies that had worked on $100 million feature films. I could go on about the level of detail that Jack Ryan so elegantly puts on the screen.” —Erik Henry, Visual Effects Supervisor
TOP: John Krasinski is cornered in Tom Clancy’s Jack Ryan. (Photo: Philippe Bossé. Image courtesy of Amazon)
Henry. “Hybride has a relationship with Ubisoft in Montreal, which has a high-quality motion-capture studio. They were able to take the textures that I supplied from set and put that into their digital doubles. We also did mocap for motions such as picking things up and walking that the people on the beach would be doing. Because the digital doubles were of such a high quality, we were able to mix them throughout the drone shots to fill up the beach.” Extreme changes in the weather were encountered during the production, notes Henry. “We started in Montreal where it was subzero and our feet had become blocks of ice, and we laughed when we finished the Morocco shoot and it was 118 degrees in the desert. We definitely had some extremes and that shows on camera.” “For the opening bombing sequence, the special effects guys gave us amazing gasoline explosions that would singe the hair on your face if you got too close,” reveals Henry. “But they could only do so much. Important Looking Pirates did some simulation tests of what a bomb would do if it hits the ground or a tree. We had some good footage of bombs falling during the Vietnam War that showed the shockwave that emanates from each bomb that is dropped and the damage that it does to the buildings.” Because of several challenges, the opening bombing run was the most difficult sequence to execute. “This was a real event that happened in 1983 in Lebanon, and the whole point of it is that we need the audience to feel that there is ambiguity so things aren’t so black and white. It’s a visual effects sequence, but I hope that you forget that at one point and live in it as these two kids do.”
SPRING 2019 VFXVOICE.COM • 43
2/26/19 1:03 PM
TV
TOP LEFT: Storefront of a pizza shop shot on location in Montreal. (Image courtesy of Cinesite) TOP RIGHT: Cinesite digitally augments an explosion featuring fire and glass shards coming from the interior of the pizza shop. (Image courtesy of Cinesite) BOTTOM LEFT: A command search computer screen created by Territory Studio. (Image courtesy of Territory Studio) BOTTOM RIGHT: Territory Studio was responsible for creating computer screen graphics for drone footage. (Image courtesy of Territory Studio)
needed to do.” Roto animation was favored over greenscreen largely because the handheld cinematography did not lend itself to motion-control cameras. “We have some soundstage work like when the scene is at the CIA,” notes Henry. “We knew right from the get-go that in order for this show to be successful you couldn’t use the California desert for Morocco or the Middle East. There are certain things like the people, vehicles and the ways that roads are built that give such credibility when you globetrot to Paris, Moscow, London or Washington, D.C.” Shooting extensively on location presented logistical challenges such as the jihadi explosion in a Parisian apartment. “We were able to seal the street because it was almost like an alley, but no explosions were ever going to happen. We were re-enacting to some degree an event that did occur in the Muslim neighborhood of Paris a few years back where the police stormed an apartment and a suicide vest was detonated and blew out a section of the building. We toured that area of Paris while scouting and picked a neighborhood that was close to it. Lots of great plates and simulation work by Hybride.” Digital doubles needed to be created for refugees trying to get onto boats. “There were 300 extras, and we clearly understood after looking at it that we were going to need about 1,000,” explains
“We did research into specific types of writing that would be appropriate for a Turkish airport sign and made sure to authentically name people from other countries, whether they be Chechen or Russian. We spent the time and money to bring in graphics companies that had worked on $100 million feature films. I could go on about the level of detail that Jack Ryan so elegantly puts on the screen.” —Erik Henry, Visual Effects Supervisor
TOP: John Krasinski is cornered in Tom Clancy’s Jack Ryan. (Photo: Philippe Bossé. Image courtesy of Amazon)
Henry. “Hybride has a relationship with Ubisoft in Montreal, which has a high-quality motion-capture studio. They were able to take the textures that I supplied from set and put that into their digital doubles. We also did mocap for motions such as picking things up and walking that the people on the beach would be doing. Because the digital doubles were of such a high quality, we were able to mix them throughout the drone shots to fill up the beach.” Extreme changes in the weather were encountered during the production, notes Henry. “We started in Montreal where it was subzero and our feet had become blocks of ice, and we laughed when we finished the Morocco shoot and it was 118 degrees in the desert. We definitely had some extremes and that shows on camera.” “For the opening bombing sequence, the special effects guys gave us amazing gasoline explosions that would singe the hair on your face if you got too close,” reveals Henry. “But they could only do so much. Important Looking Pirates did some simulation tests of what a bomb would do if it hits the ground or a tree. We had some good footage of bombs falling during the Vietnam War that showed the shockwave that emanates from each bomb that is dropped and the damage that it does to the buildings.” Because of several challenges, the opening bombing run was the most difficult sequence to execute. “This was a real event that happened in 1983 in Lebanon, and the whole point of it is that we need the audience to feel that there is ambiguity so things aren’t so black and white. It’s a visual effects sequence, but I hope that you forget that at one point and live in it as these two kids do.”
FILM
WETA DIGITAL: MOVING CITIES AND BRINGING ANIME TO LIFE By TREVOR HOGG
TOP: In Mortal Engines, London consists of eight tiers, each supported by a suspension system that results in a smoother ride for the upper classes who live in the higher levels of the city. (Image © Universal Pictures and MRC)
Pushing creative and technical boundaries is a mantra for Weta Digital, which had to reimagine London, England, as a massive urban predator travelling across Europe in Mortal Engines, and in the process had to produce an entirely new pipeline. “With every project there’s always something different,” notes Animation Supervisor Dennis Yoo, who traded the biologically enhanced primates of War for the Planet of the Apes for a futuristic world where cities literally hunt and dismantle each other. “In Mortal Engines, it was literally the scale of everything. We built these moving cities to actual scale in CG. The math involved got a bit crazy. The happy spot for the software is for x, y and z to be set at zero. When these cities moved a few kilometers away, we started to lose accuracy, so they had to be cheated back to zero space.” Pipelines needed to be revised in order to accomplish the work. “There were some growing pains on how we were going to animate these cities,” states Yoo. “We developed a layout puppet.” A city was divided into different tiers consisting of sections that could be moved independently from each other. “There’s this hierarchy structure in the puppet. You can go, ‘I need this part of London.’ You could load just that part of London. Unfortunately, another aspect of that system was if you only wanted a gun situated in a particular part of London you had to load that entire layout.” Different teams on the film had to collaborate in order to resolve issues. “We got the creature lead to sit with the animation team so we could troubleshoot things on the fly with each other. Later we had the layout lead as well. It was the only way that we could communicate properly about something that complex.” Previs Supervisor Marco Spitoni has been working on Mortal Engines for a decade. “Some of the previs was on point where we literally grabbed the previs camera and put that into the correct world space,” remarks Yoo.
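The precision problem Yoo describes is a familiar one in large-world CG: a 32-bit float carries only about seven significant decimal digits, so spatial resolution degrades the farther geometry sits from the origin, and the usual remedy is to rebase the scene so the action stays near zero. A minimal NumPy sketch of the effect (illustrative only; the names and numbers here are assumptions, not Weta’s pipeline):

```python
import numpy as np

def spacing_at(x: float) -> float:
    """Smallest representable float32 step around coordinate x (meters)."""
    return float(np.spacing(np.float32(x)))

# Near the origin a float32 resolves steps of ~1.2e-7 m; a few
# kilometers out, the same format only resolves ~0.5 mm steps,
# which shows up as jitter in animation and deformation.
near = spacing_at(1.0)
far = spacing_at(5_000.0)

# "Cheating back to zero space": store positions relative to a local
# origin (e.g. the camera), so offsets stay small and precise.
camera = np.array([5_000.0, 0.0, 120.0])      # float64 world space
vertex = np.array([5_000.25, 0.0, 120.5])
local = (vertex - camera).astype(np.float32)  # small, exact offsets
```

The subtraction is done in 64-bit world space first; only the small camera-relative offsets are handed to the 32-bit stage, which is the essence of any origin-rebasing scheme.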
“Also, [producer] Peter Jackson [King Kong] went onstage, shot a bunch of virtual cameras, and we moved those into our scene files. We did try to streamline it where we could work in a 3D environment. We solved a lot of issues by
using a tech called Koru Skinning, which does calculations outside of Maya in such a way that it is easier and faster for the animation software to distinguish and organize.” Concept art developed by Nick Keller established the look of the predator city of London. “The bottom tiers were industrial-type buildings, and the further you move up, the posher and sleeker the city becomes. The concepts for that were there right from the beginning. St. Paul’s Cathedral, which contains the Medusa weapon, was switched around.” Atmospherics like fog were incorporated into the shots to add a sense of depth. “No one wants to see a snail race,” laughs Yoo. “Initially, it was thought that the cities would move a maximum of 100 km an hour [62 mph]. In the trailer, to make things more dynamic, London was sped up in the edit. When trying to recreate
TOP: A key trait for the character of Shrike in Mortal Engines is his glowing green eyes. (Image © Universal Pictures and MRC) BOTTOM: Not all of the cities in Mortal Engines are landbound, such as the aerial dwelling of Airhaven. Volumetric clouds were significant in conveying the size and scale of Airhaven. (Image © Universal Pictures and MRC)
TOP: In Alita: Battle Angel, Houdini was used to procedurally model proper irises, with each eye consisting of a million polygons. (Image courtesy of Twentieth Century Fox) MIDDLE: Zapan consists strictly of the face of Ed Skrein, with everything else, including the back of his head, replaced with CG in Alita: Battle Angel. (Image courtesy of Twentieth Century Fox) BOTTOM: Dr. Dyson Ido (Christoph Waltz) recovers the Cyber Core of Alita. (Image courtesy of Twentieth Century Fox)
it we realized that London was going Mach 2. We had to figure out how to make it more consistent so the effects team didn’t have to do a bespoke simulation for every shot. We tried to keep London within the 300 km/h mark, which is still unbelievable, but looks exciting.” It was fun animating the cities. “There is a cool tiny city called Salzhaken that gets eaten up by London right at the start.” Another significant animation challenge for Yoo was the CG character of Shrike (played by Stephen Lang), a deadly warrior and human-machine hybrid. “The performance of Stephen Lang on set was constantly changing as director Christian Rivers and the producers were trying to figure out what they needed from the character. Initially, Shrike was literally a jaw and a skull inside a metal face. Later we decided to do a full facial rig to pull some emotions out of his face.” A distinguishing trait is the character’s glowing green eyes. “Shrike wasn’t hard to rig. The only things that he had extra of were his feet, which are like claws, and the ability to flick out his fingernails.” The physical motion was simplified to be more humanoid to avoid Shrike looking like a guy in a mocap suit. “One added thing that we didn’t expect to do was his flowing jacket, which the client wanted to see in our animation, so the cloth simulation had to be baked in.” While Mortal Engines was being developed, Weta Digital was also involved in producing a believable photorealistic anime protagonist for Alita: Battle Angel, which is based on the cyberpunk manga by Yukito Kishiro that revolves around an amnesiac cyborg with lethal martial arts skills. Alita was one of the most ambitious CG characters developed at the six-time Oscar-winning Wellington, New Zealand-based visual effects facility, which has previously created Gollum (the Lord of the Rings trilogy), Neytiri (Avatar), Caesar (the Planet of the Apes trilogy) and Thanos (Avengers: Infinity War).
“She has an anime style with those big eyes, but we put a lot of energy into making sure the puppet was going to capture all the subtleties and complexities of Rosa Salazar’s performance,” explains Animation Supervisor Mike Cozens, who previously helped to bring to life the title creature in Pete’s Dragon. “This film was the first time that we’ve done a direct-match actor-puppet before putting the motion onto the actual CG character puppet where it is further refined.” There was also a movement away from FACS (Facial Action Coding System), which is a method of mapping out muscle motions in
“[Alita] has an anime style with those big eyes, but we put a lot of energy into making sure the puppet was going to capture all the subtleties and complexities of Rosa Salazar’s performance. This film was the first time that we’ve done a direct-match actor-puppet before putting the motion onto the actual CG character puppet where it is further refined.” —Mike Cozens, Animation Supervisor, Alita: Battle Angel
“We worked hard to get the correct lip behavior in relation to the teeth and gums. All of this structural work was expanded upon, reworked, and put into the puppet. With that and the phoneme work, the detail of the facial performance came together and we had a photoreal facial performance.” —Mike Cozens, Animation Supervisor, Alita: Battle Angel

relation to emotional facial poses developed by psychologist Paul Ekman. “That’s how we’ve been building puppets over a number of years. The FACS puppet build process tends to build for extreme facial expressions [for example: extremely happy, angry or sad]. However, people tend to be quite efficient with their lips when talking, and there’s a whole bunch of complexity that happens in the subtle range. We had to rethink and rework a bunch of posing in order to catch those speech phonemes.” Then there was the matter of the character’s facial structure. “The jaw is the primary bone in your face, so if that motion is incorrect everything goes askew from there,” explains Cozens. “We went back and looked at how the jaw was pulling the muscles and flesh around on the face, how the mouth structure changed and how the lip corners behaved in relation to jaw motion. The other thing that we found was that the inside of the mouth, most of which you don’t see, is a critical part of the face build; this included the inside of the lips and the modiolus [a point at the corner of the mouth where eight muscles meet]. We worked hard to get the correct lip behavior in relation to the teeth and gums. All of this structural work was expanded upon, reworked, and put into the puppet. With that and the phoneme work, the detail of the facial performance came together and we had a photoreal facial performance.” Real-time renderer Gazebo was utilized to prerender the animation with a lighting pass that included shadows. “The
TOP: In Mortal Engines, the matte painting department 3D-printed treads from the city models, marked up model clay with them, and did a photometric scan to create a palette of displacements that they could put into the ground. (Image © Universal Pictures and MRC) MIDDLE: A fleshy look was incorporated into the metallic face based on the facial features of actor Stephen Lang to enable Shrike to emote. (Image © Universal Pictures and MRC) BOTTOM: The final shot of the opening chase sequence in Mortal Engines, with clouds and lighting. (Image © Universal Pictures and MRC)
TOP: Alita examines her Cyber Girl body made out of alabaster, which was originally constructed by Ido for his daughter. (Image courtesy of Twentieth Century Fox) MIDDLE: Alita prepares to take part in the lethal gladiator sport of Motor Ball. (Image courtesy of Twentieth Century Fox) BOTTOM LEFT: A two-block set of Iron City was constructed in Austin, Texas, which still required digital augmentation to achieve the necessary scope of the environment. (Image courtesy of Twentieth Century Fox) BOTTOM RIGHT: Rosa Salazar, wearing a mocap suit and facial markers, speaking to filmmaker Robert Rodriguez and co-star Keean Johnson on the set of Alita: Battle Angel. (Photo: Rico Torres)
lighters provide the final scene lighting so we have key and fill directions correct, so when the animation evolves the lighting is not jumping all over the place. Having the final lighting setup at the animation stage also helped us to debug problems with emotional performance.” Unlike Alita, who is entirely CG, a few cyborgs have live-action faces on digital bodies, or partial live-action bodies combined with digital parts. “Because the film was shot native stereo, all of that stuff had to be bulletproof,” remarks Cozens. “All of the match moves and camera tracks had to be tight in order to make the CG and the live action stick together. Often when you’re doing a character like Zapan (played by Ed Skrein), who has a live-action face on a CG body, we tend to be quite limited with what we can do with the live-action part of the final composite. It can’t move. A couple of the animation R&D guys came up with a helpful new way of doing cards where we could move the element around in-camera without breaking the stereo or its connection to the CG. That gave us more freedom to recompose and augment performance for those characters when we needed and still have the result be believable. Developments like that opened up our ability to get the shots that director Robert Rodriguez wanted.” “Rosa’s performance evolves through the film as Alita finds her warrior spirit and discovers what it means to be human,” notes Cozens. “The stunt team, led by Garrett Warren, had done a lot of development for the on-set stunts while, at the same time, we worked through a bunch of previs for some of the larger CG scenes. I had an opportunity in Austin, Texas to meet with Garrett and fight coordinator Steven Brown to align what we were doing in terms of her motion studies.” Alita quickly progresses from an ordinary girl to someone capable of extreme feats, but all of that needed to be grounded in real-world physics.
“The interesting thing about Rosa’s performance is that it’s multilayered, and you can see that she’s saying one thing and thinking another. There’s an inner monologue that’s happening. Catching this type of complex performance on Alita while making sure that she looks good was the biggest challenge and the thing that I’m most excited to see onscreen. As Robert said, ‘Alita is a living manga character who quickly becomes more than human.’”
VES AWARDS
VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT
1. Eric Roth, Executive Director of the Visual Effects Society, welcomes the crowd. 2. Patton Oswalt hosts the VES Awards Show.
3. The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Avengers: Infinity War and the team of Daniel DeLeeuw, Jen Underdahl, Kelly Port, Matt Aitken and Daniel Sudick.
5. The VES Award for Outstanding Visual Effects in an Animated Feature went to Spider-Man: Into the Spider-Verse and the team of Joshua Beveridge, Christian Hejnal, Danny Dimian and Bret St. Clair.
Captions list all members of each Award-winning team, even if some members were not present or out of frame. For more Show photos and a complete list of nominees and winners of the 17th Annual VES Awards visit visualeffectssociety.com
All photos by: Danny Moloshok and Phil McCarten
4. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to First Man and the team of Paul Lambert, Kevin Elam, Tristan Myles, Ian Hunter and JD Schwalm.
The Visual Effects Society held the 17th Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. Comedian Patton Oswalt served as host for the sixth time to the more than 1,200 guests gathered at the Beverly Hilton Hotel, Los Angeles, on February 5, 2019, to celebrate VFX talent in 24 awards categories. Avengers: Infinity War was named the photoreal feature winner, garnering four awards. Spider-Man: Into the Spider-Verse was named top animated film, winning four awards. Lost in Space was named best photoreal episode and also garnered four awards. Jimmy Kimmel presented the VES Award for Creative Excellence to award-winning creators-executive producers-writers-directors David Benioff and D.B. Weiss for their groundbreaking work on HBO’s Game of Thrones. Westworld star Evan Rachel Wood presented the VES Visionary Award to acclaimed writer-director-producer Jonathan Nolan. And Steve Carell presented the Lifetime Achievement Award to Oscar®-nominated producer and founder and CEO of Illumination Chris Meledandri. Awards presenters also included: James Marsden, Incredibles 2 director Brad Bird, Avengers: Infinity War directors Anthony and Joe Russo, legendary director-producer Roger Corman, Allen Leech, Suzanne Cryer, Thomas Middleditch and Sydney Sweeney.
6. The VES Award for Outstanding Visual Effects in a Photoreal Episode went to Lost in Space; Danger, Will Robinson and the team of Jabbar Raisani, Terron Pratt, Niklas Jacobson and Joao Sita. 7. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to Tom Clancy’s Jack Ryan; Pilot and the team of Erik Henry, Matt Robken, Bobo Skipper, Deak Ferrand and Pau Costa. 8. The VES Award for Outstanding Visual Effects in a Real-Time Project went to Age of Sail and the team of John Kahrs, Kevin Dart, Cassidy Curtis and Theresa Latzko. 9. The VES Award for Outstanding Visual Effects in a Special Venue Project went to Childish Gambino’s Pharos and the team of Keith Miller, Alejandro Crawford, Thelvin Cabezas and Jeremy Thompson. 10. The VES Award for Outstanding Animated Character in a Photoreal Feature went to Avengers: Infinity War; Thanos and the team of Jan Philip Cramer, Darren Hendler, Paul Story and Sidney Kombo-Kintombo.
VES AWARDS
VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT
1
1. Eric Roth, Executive Director of the Visual Effects Society, welcomes the crowd. 2. Patton Oswalt hosts the VES Awards Show.
2
3. The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Avengers: Infinity War and the team of Daniel DeLeeuw, Jen Underdahl, Kelly Port, Matt Aitken and Daniel Sudick.
3
50 • VFXVOICE.COM SPRING 2019
PG 50-57 VES AWARDS.indd 50-51
5. The VES Awards for Outstanding Visual Effects in an Animated Feature went to Spider-Man: Into the Spider-Verse and the team of Joshua Beveridge, Christian Hejnal, Danny Dimian and Bret St. Clair.
Captions list all members of each Award-winning team, even if some members were not present or out of frame. For more Show photos and a complete list of nominees and winners of the 17th Annual VES Awards visit visualeffectssociety.com
All photos by: Danny Moloshok and Phil McCarten
4. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to First Man and the team of Paul Lambert, Kevin Elam, Tristan Myles, Ian Hunter and JD Schwalm.
The Visual Effects Society held the 17th Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. Comedian Patton Oswalt served as host for the sixth time to the more than 1,200 guests gathered at the Beverly Hilton Hotel, Los Angeles, on February 5, 2019, to celebrate VFX talent in 24 awards categories. Avengers: Infinity War was named the photoreal feature winner, garnering four awards. Spider-Man: Into the Spider-Verse was named top animated film, winning four awards. Lost in Space was named best photoreal episode and also garnered four awards. Jimmy Kimmel presented the VES Award for Creative Excellence to awardwinning creators-executive producerswriters-directors David Benioff and D.B. Weiss for their groundbreaking work on HBO’s Game of Thrones. Westworld star Evan Rachel Wood presented the VES Visionary Award to acclaimed writer-director-producer Jonathan Nolan. And Steve Carell presented the Lifetime Achievement Award to Oscar®-nominated producer and founder and CEO of Illumination Chris Meledandri. Awards presenters also included: James Marsden, Incredibles 2 director Brad Bird, Avengers: Infinity War directors Anthony and Joe Russo, legendary director-producer Roger Corman, Allen Leech, Suzanne Cryer, Thomas Middleditch and Sydney Sweeney.
4
5
6
7
8
9
6. The VES Award for Outstanding Visual Effects in a Photoreal Episode went to Lost in Space; Danger, Will Robinson and the team of Jabbar Raisani, Terron Pratt, Niklas Jacobson and Joao Sita. 7. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to Tom Clancy’s Jack Ryan; Pilot and the team of Erik Henry, Matt Robken, Bobo Skipper, Deak Ferrand and Pau Costa. 8. The VES Award for Outstanding Visual Effects in a Real-Time Project went to Age of Sail and the team of John Kahrs, Kevin Dart, Cassidy Curtis and Theresa Latzko. 9. The VES Award for Outstanding Visual Effects in a Special Venue Project went to Childish Gambino’s Pharos and the team of Keith Miller, Alejandro Crawford, Thelvin Cabezas and Jeremy Thompson. 10. The VES Award for Outstanding Animated Character in a Photoreal Feature went to Avengers: Infinity War; Thanos and the team of Jan Philip Cramer, Darren Hendler, Paul Story and Sidney KomboKintombo.
SPRING 2019 VFXVOICE.COM • 51
VES AWARDS

11. The VES Award for Outstanding Animated Character in an Animated Feature went to Spider-Man: Into the Spider-Verse; Miles Morales and the team of Marcos Kang, Chad Belteau, Humberto Rosa and Julie Bernier Gosselin.

12. The VES Award for Outstanding Animated Character in an Episode or Real-Time Project went to Lost in Space; Humanoid and the team of Chad Shattuck, Paul Zeke, Julia Flanagan and Andrew McCartney.

13. The VES Award for Outstanding Animated Character in a Commercial went to Volkswagen; Born Confident; Bam and the team of David Bryan, Chris Welsby, Fabian Frank and Chloe Dawe.

14. The VES Award for Outstanding Created Environment in a Photoreal Feature went to Ready Player One; The Shining, Overlook Hotel and the team of Mert Yamak, Stanley Wong, Joana Garrido and Daniel-Stefan Gagiu.

15. The VES Award for Outstanding Created Environment in an Animated Feature went to Spider-Man: Into the Spider-Verse; Graphic New York City and the team of Terry Park, Bret St. Clair, Kimberly Liptrap and Dave Morehead.

16. The VES Award for Outstanding Created Environment in an Episode, Commercial, or Real-Time Project went to Lost in Space; Pilot; Impact Area and the team of Philip Engström, Kenny Vähäkari, Jason Martin and Martin Bergquist.

17. The VES Award for Outstanding Visual Effects in a Commercial went to John Lewis; The Boy and the Piano and the team of Kamen Markov, Philip Whalley, Anthony Bloor and Andy Steele.

18. The VES Award for Outstanding Virtual Cinematography in a Photoreal Project went to Ready Player One; New York Race and the team of Daniele Bigi, Edmund Kolloen, Mathieu Vig and Jean-Baptiste Noyau.

19. The VES Award for Outstanding Model in a Photoreal or Animated Project went to Mortal Engines; London and the team of Matthew Sandoval, James Ogle, Nick Keller and Sam Tack.

20. The VES Award for Outstanding Effects Simulations in a Photoreal Feature went to Avengers: Infinity War; Titan and the team of Gerardo Aguilera, Ashraf Ghoniem, Vasilis Pazionis and Hartwell Durfor.
21. VES Chair Mike Chambers takes the stage to present several awards.

22. The VES Award for Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project went to Altered Carbon and the team of Philipp Kratzer, Daniel Fernandez, Xavier Lestourneaud and Andrea Rosa.

23. The VES Award for Outstanding Compositing in a Photoreal Feature went to Avengers: Infinity War; Titan and the team of Sabine Laimer, Tim Walker, Tobias Wiesner and Massimo Pasquetti.

24. The VES Award for Outstanding Compositing in a Photoreal Episode went to Lost in Space; Impact; Crash Site Rescue and the team of David Wahlberg, Douglas Roshamn, Sofie Ljunggren and Fredrik Lönn.

25. The VES Award for Outstanding Compositing in a Photoreal Commercial went to Apple; Welcome Home and the team of Michael Ralla, Steve Drew, Alejandro Villabon and Peter Timberlake.

26. The VES Award for Outstanding Visual Effects in a Student Project went to Terra Nova and the team of Thomas Battistetti, Mélanie Geley, Mickael Le Mezo and Guillaume Hoarau.

27. The VES Award for Outstanding Effects Simulations in an Animated Feature went to Spider-Man: Into the Spider-Verse and the team of Ian Farnsworth, Pav Grochola, Simon Corbaux and Brian D. Casper.

28. Jonathan Nolan receives the VES Visionary Award.

29. Actress Evan Rachel Wood prepares to give Jonathan Nolan the VES Visionary Award.

30. VES First Vice Chair Jeffrey A. Okun, VES, prepares to present some awards.

31. Actor Steve Carell arrives as Despicable Me’s Gru and reveals himself before giving the VES Lifetime Achievement Award to Chris Meledandri.
32. Jimmy Kimmel prepares to present the VES Award for Creative Excellence.

33. Chris Meledandri accepts the VES Lifetime Achievement Award.

34. The legendary Roger Corman prepares to bestow VES Awards.

35. David Benioff and D.B. Weiss accept the VES Award for Creative Excellence.

36. Avengers: Infinity War directors Anthony and Joe Russo prepare to present VES Awards.

37. ILM’s Rob Bredow was in attendance.

38. VES Executive Director Eric Roth with presenter Brad Bird, second from left, VES Chair Mike Chambers, third from left, and VES First Vice Chair Jeffrey A. Okun, VES.

39. Awards presenter James Marsden of Westworld.

40. Awards presenter actor Allen Leech.

41. VES Executive Director Eric Roth on the Red Carpet with Award for Creative Excellence recipients David Benioff and D.B. Weiss.

42. Awards presenter Suzanne Cryer.

43. Awards presenter Thomas Middleditch.

44. Awards presenter Sydney Sweeney.
VES AWARD WINNERS
AVENGERS: INFINITY WAR
The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Avengers: Infinity War, which won four VES Awards, including Outstanding Animated Character in a Photoreal Feature (Avengers: Infinity War; Thanos), Outstanding Effects Simulations in a Photoreal Feature (Avengers: Infinity War; Titan) and Outstanding Compositing in a Photoreal Feature (Avengers: Infinity War; Titan). (Photos courtesy of Marvel Studios. All rights reserved.)
SPIDER-MAN: INTO THE SPIDER-VERSE
Spider-Man: Into the Spider-Verse won the VES Award for Outstanding Visual Effects in an Animated Feature, and won three additional VES Awards, including Outstanding Animated Character in an Animated Feature (Miles Morales), Outstanding Created Environment in an Animated Feature (Graphic New York City) and Outstanding Effects Simulations in an Animated Feature. (Photo courtesy of Sony Pictures Animation. All rights reserved.)
LOST IN SPACE
Lost in Space won the VES Award for Outstanding Visual Effects in a Photoreal Episode (Lost in Space; Danger, Will Robinson). The series won four VES Awards, including Outstanding Animated Character in an Episode or Real-Time Project (Lost in Space; Humanoid), Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project (Lost in Space; Pilot; Impact Area) and Outstanding Compositing in a Photoreal Episode (Lost in Space; Impact; Crash Site Rescue). (Photos courtesy of Netflix. All rights reserved.)
SPECIAL REPORT
GLOBAL VFX: STATE OF THE INDUSTRY 2019 By DEBRA KAUFMAN
Though the last five to 10 years have been disruptive in the visual effects universe, the industry is currently quite robust. Movies, animation and video games rich with the latest stunning visual effects top box offices and sales charts worldwide. With the current golden age of television, and an explosion of new platforms from Amazon Prime Video to Hulu to Netflix, the need for content has mushroomed and, with it, the need for visual effects of all kinds. Furthermore, companies are exploring new technologies from virtual reality to 360-degree experiences and light-field capture, all of which require the expertise and knowledge base of the visual effects industry.

Although the work may be plentiful, some financial considerations can either help or hobble visual effects facilities. Several experts told VFX Voice that business margins are as thin as ever. Financial incentives have failed in some locations, but have become further entrenched in others. In the U.K., where incentives have been used to build a robust all-around film industry, and in other territories, incentives appear to be here to stay. Two things have arisen as a result: larger facilities have been establishing a presence in the cities where film/TV productions take place, and the number of must-be-there locations has grown. At the same time, many visual effects artists continue to go where the work is, essentially becoming nomadic as they move to different cities for different projects.

Those facilities built in India, China and elsewhere to take on wire removal and rotoscoping are also coming into their own, particularly in India, where they are offering a full range of visual effects for homegrown movies – with the aspiration that they might be able to do more than rotoscoping for out-of-town productions shooting in their countries.
As a company like Netflix produces more and more country-specific or regional movies, the global market may turn to facilities in India and China in the future. Visual effects houses always have to stay on top of new technologies, which are also opportunities to gain market share in new entertainment arenas such as virtual reality and augmented reality.

In today’s market, just as the mid-budget movie has nearly been squeezed out of existence, mid-sized VFX houses are more challenged than ever to stay afloat. They can’t afford to build up to the scale of a bigger house, but they lack the agility of a mom-and-pop operation. Still, a minority of mid-sized facilities remains, balanced in that delicate position between the two extremes.
LOOKING BACK
When the VES did its last in-depth survey of the visual effects industry in 2013, its “White Paper” portrayed the industry as being in a state of flux. “In recent months, worldwide dialogue in the visual effects community has created a sense of urgency to address the complex pressures on artists and facilities dealing with issues of frayed business models, financial instability, and an increasingly ‘nomadic’ workforce operating without a secure vision of the future,” said the report, which surveyed “four complex, independent drivers” of change: government dynamics, including tax incentives; growing competition due to technology advances, an expanding workforce and globalization; passion for the industry that leads to the creation of unsustainable business models; and issues related to how the film industry is structured.

TOP: Mike Chambers, Visual Effects Producer and Chair, VES
MIDDLE: World War II Allied soldiers face death from the sky in Christopher Nolan’s Dunkirk (2017). (Photo courtesy of Warner Bros. Pictures)
BOTTOM: The ever-changing, gravity-defying dream world of Inception (2010). (Image courtesy of Warner Bros. Pictures)

“Maintaining business at a small independent house is very tough. The pressure to grow is intense, but once you get past 100 artists, it becomes a different ball game.” —Mike Chambers, Visual Effects Producer

For many in the VFX industry, a painful moment that highlighted all of the above was the 2013 Academy Awards, when Life of Pi won numerous Oscars, including Best Achievement in Visual Effects, while Rhythm & Hues, the pioneering effects facility that provided most of the effects, was shutting its doors and declaring bankruptcy. “The globalization of our industry was really taking off and Southern California VFX people were feeling very misplaced,” notes visual effects producer and VES Chair Mike Chambers.

BOOMING BUSINESS
Data from research firm comScore reveals that North America broke box office records in 2017 – ticket sales in the U.S. and Canada earned $11.8 billion. The final tally for 2018 was $11.9 billion, good news for those who fear that streaming may be eroding movie theater attendance. Global box office in 2018 was $42 billion, according to comScore. The 2018 summer North American box-office figures broke a record with a U.S. domestic total of $4.8 billion, suggesting box office and streaming are synergistic. All 10 of the highest-grossing films in the U.S. were either VFX movies or computer-animated films. Many of the most popular streaming shows were also VFX-heavy.

Although there is plenty of uncertainty regarding numerous factors affecting the VFX industry, business is booming for a variety of reasons. Jon Peddie Research Vice President/Software Analyst Kathleen Maher credits what she calls “the Netflix boom,” referring to the rise of streaming media. “There is so much more work and there are a lot of small productions,” she says. “In the period after Life of Pi, when all those companies went bust and people were losing their jobs, it forced an improvement in the industry.”
TOP: Ed Ulbrich, President/General Manager, Method Studios
MIDDLE: Black Panther uses kinetic energy absorbed in his vibranium suit to create an energy blast in Black Panther (2018). (Image courtesy of Marvel Studios and Method Studios)
BOTTOM: The underwater kingdom of Atlantis comes to life in Aquaman (2018). (Image courtesy of Warner Bros. Pictures)
At Toronto’s Spin VFX, President/Executive Producer Neishaw Ali notes that she’s “seen a tremendous growth in the quality and complexity of visual effects, as well as constant innovation in photorealism. Because of our ability to view on demand whatever we want, wherever and whenever, and with content that appeals to all cultures, Spin VFX is responding to the increased demand by working on more complex creatures and building more exciting worlds.”

Australia’s Rising Sun Pictures’ Managing Director/Co-founder Tony Clark reports that “today’s market is buoyant, but definitely different than when we started. VFX is very much a mature market, having passed its point where demand outstripped supply,” he says. “It is now consolidating and optimizing as mature industries do. You can see this with the consolidation of vendors and the never-ending search for cost efficiencies as expectations rise faster than budgets.”

“You have a whole new breed of client emerging. There’s a paradigm shift. If you look at the landscape of major studios, you have to include Netflix, Apple and Amazon. Now you’re seeing Paramount producing movies for Netflix, which has no legacy studio infrastructure to preserve. All these companies are tech companies first, so they own massive cloud infrastructure or partner with each other to provide services.” —Ed Ulbrich, President/General Manager, Method Studios

INTERSECTING FORCES
Parsing out today’s market is complex, as the forces of globalization, tax incentives, technology and workflow interact, both in terms of the industry and individual productions. Added to that are new platforms – especially streaming services – that increase opportunity, but also destabilize traditional models. Whether it’s Netflix, Amazon, YouTube, video games, virtual reality, augmented reality, theme parks or mobile content, the huge influx of content from new players and new platforms is a big factor behind today’s robust VFX market. Netflix especially has played a role in creating high-end streaming content in numerous regional markets worldwide, boosting local production, post and VFX markets.

“You have a whole new breed of client emerging,” says Ed Ulbrich, Method Studios President and GM. “There’s a paradigm shift. If you look at the landscape of major studios, you have to include Netflix, Apple and Amazon. Now you’re seeing Paramount producing movies for Netflix, which has no legacy studio infrastructure to preserve. All these companies are tech companies first, so they own massive cloud infrastructure or partner with each other to provide services.”

Ali reports that Spin VFX has worked with Amazon and Netflix, saying the relationships have been good. “They have strong production teams that work very collaboratively with vendors,” she says. At ILM in London, Executive in Charge Sue Lyster notes that the company is doing features for streaming, most recently
working on Netflix’s Bird Box. “We’ve also hired a team of TV executives to build up the episodic TV side of the business,” she says.

ILM London Creative Director Ben Morris notes that “the world of streaming does to a degree change how we work. If you have 1,000 shots in a film and 12 months to deliver, you stack up a lot of crew towards the end of delivery,” he says. “In episodic TV, you deliver a certain number of shots per month. There is no mad dash at the end, and you might get fewer iterations.”

Rising Sun Pictures’ Clark also notes how the customers are changing. “We’re simultaneously seeing multiple new outlets with the streaming players and their demand for incredibly high quality on those platforms,” he says. “There’s been a little work for Netflix, and we’re keen to explore more in that space. A key part of our strategy is to facilitate top-end creative services for streaming platforms. We’ve invested significantly in being in the right shape to address those opportunities.”

At The Molecule in New York, Co-founder/Executive Producer Andrew Bly notes how OTT (Over the Top) services have changed the workflow – and the invoicing. “In the 2000s, the schedule was very predictable,” he says. “Now, with OTT services, seasons that used to be your downtime could be the busiest and vice versa. You can juggle all 10 episodes at once, and you have to think about how to invoice it. You’re on the hook to carry that money for a longer period. OTT services have also changed the quality. It used to be about doing the best job, but fast enough to hit the airwaves. Now, the Amazons and Hulus are all gunning for that high quality and viewers expect it.
As a company you have to figure out how to keep an efficient pipeline to achieve feature-film quality.” Framestore’s Managing Director of Integrated Advertising, Helen Stanley, describes how her company has been active in pioneering another new platform, virtual reality, beginning with a VR project in 2012 for Game of Thrones that was presented at the South By Southwest (SXSW) confab. “Since then, we’ve grown in that area,” she says. “We’ve started to work for major ride installations, for the main IP or agencies. Clients come to us with real-time requirements and the VR projects are becoming more and more ambitious.” One outlet is so-called dark rides, in which stereoscopic media is projected on to a huge screen, a very immersive replacement for animatronics. To accommodate this genre of work, Framestore has pulled project management and creatives into one team. “It’s adding a new category for the visual effects facility,” says Stanley. “We combine the creative people with those who are experts in rides, and by end of April we’ll have created a ride in every type of genre, including VR walk-through, riding and motion-based. We’ve done brand-based rides for Samsung and VW in China, and gravity-free experiences where we partnered with hardware manufacturers and worked like a digital production company for the agency.”
TOP: Kathleen Maher, Vice President/Software Analyst, Jon Peddie Research
MIDDLE: Artificial intelligence facial-capture helped detail Thanos’ facial features in Avengers: Infinity War (2018). (Image courtesy of Digital Domain)
BOTTOM: Behind the scenes on First Man (2018). (Image courtesy of Universal Pictures and DNEG)
“People have work. I’m not hearing about unionization so much. Also, sadly, labor movements all over the world are being undermined by companies and governments, but, again, people have work.” —Kathleen Maher, Vice President/Software Analyst, Jon Peddie Research
TAX INCENTIVES
While new platforms and outlets provide for a growing regional market and stretch traditional digital VFX houses into new genres, tax incentives are responsible for sending productions – and often post and VFX – to far-flung destinations, from Atlanta to Montreal,
SPRING 2019 VFXVOICE.COM • 67
2/22/19 2:57 PM
SPECIAL REPORT
TOP: Ed Ulbrich, President/General Manager, Method Studios MIDDLE: Black Panther uses kinetic energy absorbed in his vibranium suit to create an energy blast in Black Panther (2018). (Image courtesy of Marvel Studios and Method Studios) BOTTOM: The underwater kingdom of Atlantis comes to life in Aquaman (2018). (Image courtesy of Warner Bros. Pictures)
At Toronto’s Spin VFX, President/Executive Producer Neishaw Ali notes that she’s “seen a tremendous growth in the quality and complexity of visual effects, as well as constant innovation in photorealism. Because of our ability to view on demand whatever we want, wherever and whenever, and with content that appeals to all cultures, Spin VFX is responding to the increased demand by working on more complex creatures and building more exciting worlds.” Australia’s Rising Sun Pictures’ Managing Director/Co-founder Tony Clark reports that “today’s market is buoyant, but definitely different than when we started. VFX is very much a mature market, having passed its point where demand outstripped supply,” he says. “It is now consolidating and optimizing as mature industries do. You can see this with the consolidation of vendors and the never-ending search for cost efficiencies as expectations rise faster than budgets.”
“You have a whole new breed of client emerging. There’s a paradigm shift. If you look at the landscape of major studios, you have to include Netflix, Apple and Amazon. Now you’re seeing Paramount producing movies for Netflix, which has no legacy studio infrastructure to preserve. All these companies are tech companies first, so they own massive cloud infrastructure or partner with each other to provide services.” —Ed Ulbrich, President/General Manager, Method Studios
INTERSECTING FORCES
Parsing out today’s market is complex, as the forces of globalization, tax incentives, technology and workflow interact, both in terms of the industry and individual productions. Added to that are new platforms – especially streaming services – that increase opportunity, but also destabilize traditional models. Whether it’s Netflix, Amazon, YouTube, video games, virtual reality, augmented reality, theme parks or mobile content, the huge influx of content from new players and new platforms is a big factor behind today’s robust VFX market. Netflix especially has played a role in creating high-end streaming content in numerous regional markets worldwide, boosting local production, post and VFX markets. “You have a whole new breed of client emerging,” says Ed Ulbrich, Method Studios President and GM. “There’s a paradigm shift. If you look at the landscape of major studios, you have to include Netflix, Apple and Amazon. Now you’re seeing Paramount producing movies for Netflix, which has no legacy studio infrastructure to preserve. All these companies are tech companies first, so they own massive cloud infrastructure or partner with each other to provide services.” Ali reports that Spin VFX has worked with Amazon and Netflix, saying the relationships have been good. “They have strong production teams that work very collaboratively with vendors,” she says. At ILM in London, Executive in Charge Sue Lyster notes that the company is doing features for streaming, most recently
working on Netflix’s Bird Box. “We’ve also hired a team of TV executives to build up the episodic TV side of the business,” she says. ILM London Creative Director Ben Morris notes that “the world of streaming does to a degree change how we work. If you have 1,000 shots in a film and 12 months to deliver, you stack up a lot of crew towards the end of delivery,” he says. “In episodic TV, you deliver a certain number of shots per month. There is no mad dash at the end, and you might get fewer iterations.” Rising Sun Pictures’ Clark also notes how the customers are changing. “We’re simultaneously seeing multiple new outlets with the streaming players and their demand for incredibly high quality on those platforms,” he says. “There’s been a little work for Netflix, and we’re keen to explore more in that space. A key part of our strategy is to facilitate top-end creative services for streaming platforms. We’ve invested significantly in being in the right shape to address those opportunities.” At The Molecule in New York, Co-founder/Executive Producer Andrew Bly notes how OTT (Over the Top) services have changed the workflow – and the invoicing. “In the 2000s, the schedule was very predictable,” he says. “Now, with OTT services, seasons that used to be your downtime could be the busiest and vice versa. You can juggle all 10 episodes at once, and you have to think about how to invoice it. You’re on the hook to carry that money for a longer period. OTT services have also changed the quality. It used to be about doing the best job, but fast enough to hit the airwaves. Now, the Amazons and Hulus are all gunning for that high quality and viewers expect it. 
As a company you have to figure out how to keep an efficient pipeline to achieve feature-film quality.” Framestore’s Managing Director of Integrated Advertising, Helen Stanley, describes how her company has been active in pioneering another new platform, virtual reality, beginning with a VR project in 2012 for Game of Thrones that was presented at the South by Southwest (SXSW) confab. “Since then, we’ve grown in that area,” she says. “We’ve started to work for major ride installations, for the main IP or agencies. Clients come to us with real-time requirements and the VR projects are becoming more and more ambitious.” One outlet is so-called dark rides, in which stereoscopic media is projected onto a huge screen, a very immersive replacement for animatronics. To accommodate this genre of work, Framestore has pulled project management and creatives into one team. “It’s adding a new category for the visual effects facility,” says Stanley. “We combine the creative people with those who are experts in rides, and by end of April we’ll have created a ride in every type of genre, including VR walk-through, riding and motion-based. We’ve done brand-based rides for Samsung and VW in China, and gravity-free experiences where we partnered with hardware manufacturers and worked like a digital production company for the agency.”
TOP: Kathleen Maher, Vice President/Software Analyst, Jon Peddie Research MIDDLE: Artificial intelligence facial-capture helped detail Thanos’ facial features in Avengers: Infinity War (2018). (Image courtesy of Digital Domain) BOTTOM: Behind the scenes on First Man (2018). (Image courtesy Universal Pictures and DNEG)
“People have work. I’m not hearing about unionization so much. Also, sadly, labor movements all over the world are being undermined by companies and governments, but, again, people have work.” —Kathleen Maher, Vice President/Software Analyst, Jon Peddie Research
TAX INCENTIVES
While new platforms and outlets provide for a growing regional market and stretch traditional digital VFX houses into new genres, tax incentives are responsible for sending productions – and often post and VFX – to far-flung destinations, from Atlanta to Montreal,
London to Berlin – to enjoy rebates. “Vancouver was the granddaddy of all of them, and has been very successful,” says Chambers. “But look what’s happened in places like Michigan – they threw big tax incentives out there and used old warehouses for stages, and the producers said yes, but they didn’t have the crew base to support the work.” For that reason, although Atlanta is one of the destinations that has benefited from incentives, many other U.S. states that initially offered them have since recognized that tax incentives alone are not a quick fix that can create a new industry sector. “The way producers are using tax incentives has changed,” says Visual Effects Supervisor Jeffrey A. Okun, VES. “The spend has to be correct and it has to go to the right company and be the right incentive. A VFX producer’s job has become really complicated.
TOP: Neishaw Ali, President/Executive Producer, Spin VFX MIDDLE: The moving cities of Mortal Engines (2018). (Image courtesy of Universal Pictures) BOTTOM: War for the Planet of the Apes (2017). (Image courtesy of Twentieth Century Fox)
“[I’ve] seen a tremendous growth in the quality and complexity of visual effects, as well as constant innovation in photorealism. Because of our ability to view on demand whatever we want, wherever and whenever, and with content that appeals to all cultures, Spin VFX is responding to the increased demand by working on more complex creatures and building more exciting worlds.” —Neishaw Ali, President/Executive Producer, Spin VFX
My job is more fun. It’s easy to find creative talent because I have a huge pool to pull from – the entire world! What’s important to note is that if most of the incentive companies were put up against the U.S. companies, the U.S. companies are actually cheaper.” In London, ILM’s Lyster points out that the film tax break “isn’t there to support visual effects, but the growth and well-being of the British film industry. Still, there’s no doubt in many ways it plays a part in the growth of the VFX industry in London,” she says. “There’s a symbiosis. In a place like Montreal, where they have a very healthy tax break, the challenge has been finding and building the talent. I don’t think ILM would have come to London if there had been a tax break but no talent to hire. The two go hand in hand.” At Framestore, Managing Director of Feature Films Fiona Walkinshaw talks about the decision to open a Montreal facility in 2013. “There are big commitments from the government to maintain healthy incentives there,” she says. “We wouldn’t go into these places without knowing it’s a robust set-up. Incentives attract a huge number of artists to Montreal, and so many people paying taxes is good for their economy and the other businesses they’re trying to build up.” In Toronto, Ali reports that most of Spin VFX’s work comes from the U.S. “We’ve spent the past 15 years developing relationships locally and internationally,” she says. “Incentives are quite attractive, and many U.S.
companies want to shoot in Canada, which creates a great opportunity to complete VFX here.” At the same time, she notes, “many jurisdictions have similar incentives, which has resulted in greater competition within the industry.” Australia also has competitive incentives, says Clark, with a 30% federal rebate on qualifying gross spend, with various state
incentives that can be added on top. “They’re all simple and predictable,” he says. “In Rising Sun Pictures’ home base of South Australia, they can be combined to create a 40% rebate pretty much on all VFX spend – and as high as 50% on an Australian project.” In India, BotVFX Chief Executive/Co-founder Hitesh Shah notes that, although the city of Chennai has incentives, he considers incentives “a fickle thing. The dynamics of incentives create friction in global decision-making,” he says. “The accountants get into the number crunching to see where it needs to be and everyone has to work around those targets, even if it’s suboptimal in other ways.”
ROAMING SAMURAI
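The rebate stacking Clark describes works out as simple percentage arithmetic. A minimal sketch, with a hypothetical qualifying spend and an invented function name (the 30%/10%/20% rates reflect the federal and state figures cited above, not any specific project):

```python
# Hypothetical illustration of stacked screen-production rebates,
# using the percentages Clark cites. The spend figure is invented.
def combined_rebate(qualifying_spend, federal_pct, state_pct):
    """Total rebate when a state incentive stacks on top of the federal one."""
    return qualifying_spend * (federal_pct + state_pct) // 100

spend = 1_000_000  # hypothetical qualifying gross VFX spend, in AUD
print(combined_rebate(spend, 30, 10))  # 30% federal + 10% state = 400000
print(combined_rebate(spend, 30, 20))  # the up-to-50% Australian-project case: 500000
```

The percentages simply add before being applied to the same qualifying spend, which is why a 10-point state top-up turns a 30% federal rebate into 40% of gross.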
TOP: Tony Clark, Managing Director/Co-founder, Rising Sun Pictures MIDDLE: The Valkyrie attack Hela (played by Cate Blanchett) while riding winged horses amid swirling layers of atmosphere in Thor: Ragnarok (2017). (Image courtesy Marvel Studios, Walt Disney Pictures and Rising Sun Pictures) BOTTOM: Valkyrie, the Asgardian warrior (played by Tessa Thompson), in Thor: Ragnarok (2017). (Image courtesy Marvel Studios, Walt Disney Pictures and Rising Sun Pictures)
“Today’s market is buoyant, but definitely different than when we started. VFX is very much a mature market, having passed its point where demand outstripped supply. It is now consolidating and optimizing as mature industries do. You can see this with the consolidation of vendors and the never-ending search for cost efficiencies as expectations rise faster than budgets.” —Tony Clark, Managing Director/Co-founder, Rising Sun Pictures
In a way, VFX artists are probably the biggest losers in the tax incentive game, as noted in the VES’s 2013 White Paper. “Artists are roaming samurai,” says Ulbrich. “They move to where the work is. I’ve lived it, and that’s the hardship for the VFX community. It requires mobility to be where the work is.” Globalization of the VFX industry, once a trend, is now a reality. “Visual effects aren’t the first industry to be disrupted by globalization,” points out Ulbrich. “We’ve seen it happen in consumer electronics and automotive.” In the early days of digital visual effects, the VFX facilities existed in North America, Europe and Australia/New Zealand – places that had access to the latest, fastest computers and software or developed their own. Ulbrich notes that with incentives, the VFX house has to be able to “move to where you need to be,” spurring globalization. “The California visual effects industry is completely different than it was 10 years ago,” he says. “Two decades of slow, steady globalization eliminated the core business.” The landscape continues to change, sometimes in unpredictable ways. Look no further than Beijing-headquartered VHQ Media, where Dayne Cowan heads up the 31-year-old company’s feature film division. Cowan’s career was built in Europe, where he worked as Double Negative’s 3D/VFX Supervisor, Cinesite’s Digital Development Manager and Head of VFX at Molinare/Pixion. For the last six years, he’s been in charge of VHQ Media, which grew from 20 to 350+ people and also has locations in Singapore, Kuala Lumpur and Jakarta.
“With public funding from the Taiwan stock exchange, we’ve been investing in a lot of Chinese and regional film and TV,” says Cowan. “It’s driven by a boom in the Chinese film market, as the number of cinemas in China has skyrocketed and the call for content is enormous.” The facility works on “the odd North American or European film,” adds Cowan, “but we have our
hands full with work for China.” BotVFX started 10 years ago as a classic outsourcing facility in Chennai, India, offering rotoscoping, tracking, paint and prep to a variety of clients. At an earlier company he founded, Shah had a fateful conversation with Rhythm & Hues, which had a Mumbai unit. “They helped me to understand that the trick was in cultivating a full culture, have proper training and QC – and not get ahead of your headlights,” he says. “We put our head down and did that.” BotVFX, which does all its work in Chennai, has headquarters in Atlanta with offices in Los Angeles, Vancouver and Montreal. “We work with WETA, Hybride, Rodeo and ILM [among others],” says Shah. “There is still a large base that doesn’t have that dedicated back end and is looking for a reliable player with scale. With 350 people, we can take on a feature.”
TOP: Sue Lyster, Executive in Charge, ILM London MIDDLE: Mark Ruffalo on the set of Avengers: Infinity War (2018). (Image courtesy of Marvel Studios) BOTTOM: Tom Cruise as Ethan Hunt clings to a helicopter in Mission: Impossible – Fallout (2018). (Image courtesy of Paramount Pictures)
“[A.I.] will cut down on CPU and render time. We’re also investigating techniques to recognize and replace human faces in a semi-automated way, potentially a huge leap forward. Even with rotoscoping, you can ask the computer to hunt for humans and automatically rotoscope. It’s not production level yet, but pretty amazing.” —Sue Lyster, Executive in Charge, ILM London
Okun points out that, although “roto work is still big in India and China,” that’s not all those facilities do. “They’ve also stepped up their game to make regional films filled with visual effects,” he says. “India has made a couple of great sword and sorcery movies. Those countries as well as Taiwan and Korea have actively gone out and hired a lot of American and British artists to train their artists or run the company.” Globalization has also meant that the North American and European companies have had to choose – judiciously – where else to set up shop. Vancouver was an early destination, but the number of locations has expanded exponentially. ILM’s Morris notes that opening the facility in London five years ago was a big step. “Setting up in the U.K. made sense – with so many filmmakers and productions,” he says. “The VFX industry in London has grown to be the biggest in the world and the density of talent was and is still very high.” He notes that committing to a London facility also coincided with Lucasfilm making more films in the U.K. With facilities now in San Francisco, Singapore, Vancouver and London, says Morris, “we tri-sect the 24 hours of the globe, which brings challenges, but also benefits us to be able to distribute work among the global community.” Method Studios, a Deluxe company, has facilities in Los Angeles, New York, Atlanta, Vancouver, San Francisco, Pune (India), Melbourne and, with the acquisition of Atomic Fiction, now Montreal.
Framestore, in addition to its London site, has facilities in Vancouver, Singapore, Montreal and Pune, with sites servicing advertising clients in Chicago, New York and Los Angeles. In her research, Maher has found the beginning of a trend she’s calling de-globalization – which may eventually impact the VFX industry. “There is plenty of outsourcing to Vancouver or
wherever there’s an advantage,” she says, “but there is actually less outsourcing than before. Businesses are getting more comfortable working closer to home, in everything from manufacturing to the film industry. I’ve just started to see pick-up in CAD (computer-aided design) and manufacturing. I’m hearing people say they didn’t like the loss of control, the time lag and the cultural differences with other countries.” As one example, Walkinshaw reports that Framestore doesn’t usually outsource anything it does in feature films. “We manage it all ourselves,” she says. “We do outsource if we have to, but we prefer to do it all within Framestore’s operation pipelines. It’s easier to manage your own toolset and you can respond to the inevitable changes. The constant pressure is to do more and be very efficient, and we’re constantly refining our toolset.”
SIZE MATTERS
In today’s market, most would agree with Walkinshaw’s conclusion. “You need to have global locations or have partners to remain big,” she says. That brings up the issue of the size of a VFX facility – how big is big enough? Can small houses survive? Are mid-sized houses getting squeezed between the two extremes? Lyster reports that ILM has grown its London studio from “very small to quite large – bigger than we originally intended it to be. My experience is that in some ways it’s easier in the industry to be big rather than small,” she says. “Visual effects are constantly buffeted by the winds of change, and running a VFX project is constantly bending to those winds. In some ways, it’s easier to scale [when you’re big] than it is if you’re small. You have more options, more resources to be able to cope with what filmmakers throw at you in terms of schedules and creative changes.” Still, she notes, being big means “you have a lot of mouths to feed.” The last five to 10 years have also seen a tremendous amount of consolidation and other changes. Chambers refers to the VES survey done almost a decade ago. “We found about 4,000 companies around the world doing VFX,” he says. “How many of those smaller companies were absorbed?” Many VFX companies have closed doors, changed hands, been acquired by bigger companies, or sold a majority stake to equity partners. Ulbrich reports that Method Studios, originally independent, was first acquired by Ascent Media, which sold it to Deluxe when it was transitioning from being a photochemical lab into a digital creative services business. “The strategy is buy vs. build,” he says. “By buying, you scale up quickly.” Sometimes just acquiring a company isn’t enough to become a force in the VFX industry. Ulbrich notes that Method’s pedigree was commercials and music videos. Then came the decision to get into VFX for features. “You have to change the operation,” says Ulbrich. “You can’t produce a 500-shot movie like a 15-second commercial.
Thus began a journey of building and acquiring to put a network of studios in place to optimize talent, low-cost labor and government incentives.” Private equity money is another way to scale up, a path followed by Pixomondo and FuseFX. Mayfair Equity Partners acquired a majority ownership in Pixomondo, which valued the VFX
TOP: Ben Morris, Creative Director, ILM London MIDDLE: Sandra Bullock and George Clooney miles from Earth in Gravity (2013). (Image courtesy of Warner Bros. Pictures) BOTTOM: All light is moving towards the black hole in Christopher Nolan’s Interstellar (2014). (Image courtesy of Warner Bros. Pictures)
“We tri-sect the 24 hours of the globe, which brings challenges, but also benefits us to be able to distribute work among the global community.” —Ben Morris, Creative Director, ILM London
SPRING 2019 VFXVOICE.COM • 71
2/22/19 2:57 PM
SPECIAL REPORT
TOP: Sue Lyster, Executive in Charge, ILM London MIDDLE: Mark Ruffalo on the set of Avengers: Infinity War (2018). (Image courtesy of Marvel Studios) BOTTOM: Tom Cruise as Ethan Hunt clings to a helicopter in Mission: Impossible – Fallout (2018). (Image courtesy of Paramount Pictures)
hands full with work for China.” BotVFX started 10 years ago as a classic outsourcing facility in Chennai, India, offering rotoscoping, tracking, paint and prep to a variety of clients. At an earlier company he founded, Shah had a fateful conversation with Rhythm & Hues, which had a Mumbai unit. “They helped me to understand that the trick was in cultivating a full culture, have proper training and QC – and not get ahead of your headlights,” he says. “We put our head down and did that.” BotVFX, which does all its work in Chennai, has headquarters in Atlanta with offices in Los Angeles, Vancouver and Montreal. “We work with WETA, Hybride, Rodeo and ILM [among others],” says Shah. “There is still a large base that doesn’t have that dedicated back end and are looking for a reliable player with scale. With 350 people, we can take on a feature.”
“[A.I.] will cut down on CPU and render time. We’re also investigating techniques to recognize and replace human faces in a semi-automated way, potentially a huge leap forward. Even with rotoscoping, you can ask the computer to hunt for humans and automatically rotoscope. It’s not production level yet, but pretty amazing.” —Sue Lyster, Executive in Charge, ILM London Okun points out that, although “roto work is still big in India and China,” that’s not all those facilities do. “They’ve also stepped up their game to make regional films filled with visual effects,” he says. “India has made a couple of great sword and sorcery movies. Those countries as well as Taiwan and Korea have actively gone out and hired a lot of American and British artists to train their artists or run the company.” Globalization has also meant that the North American and European companies have had to choose – judiciously – where else to set up shop. Vancouver was an early destination, but the number of locations has expanded exponentially. ILM’s Morris notes that opening the facility in London five years ago was a big step. “Setting up in the U.K. made sense – with so many filmmakers and productions,” he says. “The VFX industry in London has grown to be the biggest in the world and the density of talent was and is still very high.” He notes that committing to a London facility also coincided with Lucasfilm making more films in the U.K. With facilities now in San Francisco, Singapore, Vancouver and London, says Morris, “we tri-sect the 24 hours of the globe, which brings challenges, but also benefits us to be able to distribute work among the global community.” Method Studios, a Deluxe company, has facilities in Los Angeles, New York, Atlanta, Vancouver, San Francisco, Pune (India), Melbourne and, with the acquisition of Atomic Fiction, now Montreal. 
Framestore, in addition to its London site, has facilities in Vancouver, Singapore, Montreal and Pune, with sites servicing advertising clients in Chicago, New York and Los Angeles. In her research, Maher has found the beginning of a trend she’s calling de-globalization – which may eventually impact the VFX industry. “There is plenty of outsourcing to Vancouver or
70 • VFXVOICE.COM SPRING 2019
wherever there’s an advantage,” she says, “but there is actually less outsourcing than before. Businesses are getting more comfortable working closer to home, in everything from manufacturing to the film industry. I’ve just started to see pick-up in CAD (Computer-Aided Design) and manufacturing. I’m hearing people say they didn’t like the loss of control, the time lag and the cultural differences with other countries.” As one example, Walkinshaw reports that Framestore doesn’t usually outsource anything it does in feature films. “We manage it all ourselves,” she says. “We do outsource if we have to, but we prefer to do it all within Framestore’s operation pipelines. It’s easier to manage your own toolset and you can respond to the inevitable changes. The constant pressure is to do more and be very efficient, and we’re constantly refining our toolset.”

SIZE MATTERS
In today’s market, most would agree with Walkinshaw’s conclusion. “You need to have global locations or have partners to remain big,” she says. That brings up the issue of the size of a VFX facility – how big is big enough? Can small houses survive? Are mid-sized houses getting squeezed between the two extremes? Lyster reports that ILM has grown its London studio from “very small to quite large – bigger than we originally intended it to be. My experience is that in some ways it’s easier in the industry to be big rather than small,” she says. “Visual effects are constantly buffeted by the winds of change, and running a VFX project is constantly bending to those winds. In some ways, it’s easier to scale [when you’re big] than it is if you’re small. You have more options, more resources to be able to cope with what filmmakers throw at you in terms of schedules and creative changes.” Still, she notes, being big means “you have a lot of mouths to feed.” The last five to 10 years have also seen a tremendous amount of consolidation and other changes. Chambers refers to the VES survey done almost a decade ago. “We found about 4,000 companies around the world doing VFX,” he says. “How many of those smaller companies were absorbed?” Many VFX companies have closed doors, changed hands, been acquired by bigger companies, or sold a majority stake to equity partners. Ulbrich reports that Method Studios, originally independent, was first acquired by Ascent Media, which sold it to Deluxe when it was transitioning from being a photochemical lab into a digital creative services building. “The strategy is buy vs. build,” he says. “By buying, you scale up quickly.” Sometimes just acquiring a company isn’t enough to become a force in the VFX industry. Ulbrich notes that Method’s pedigree was commercials and music videos. Then came the decision to get into VFX for features. “You have to change the operation,” says Ulbrich. “You can’t produce a 500-shot movie like a 15-second commercial. 
Thus began a journey of building and acquiring to put a network of studios in place to optimize talent, low cost labor and government incentives.” Private equity money is another way to scale up, a path followed by Pixomondo and FuseFX. Mayfair Equity Partners acquired a majority ownership in Pixomondo, which valued the VFX
TOP: Ben Morris, Creative Director, ILM London MIDDLE: Sandra Bullock and George Clooney miles from Earth in Gravity (2013). (Image courtesy of Warner Bros. Pictures) BOTTOM: All light is moving towards the black hole in Christopher Nolan’s Interstellar (2014). (Image courtesy of Warner Bros. Pictures)
“We tri-sect the 24 hours of the globe, which brings challenges, but also benefits us to be able to distribute work among the global community.” —Ben Morris, Creative Director, ILM London
SPECIAL REPORT
TOP: Andrew Bly, Co-founder/ Executive Producer, The Molecule MIDDLE: The visual effects stretch the imagination in Pacific Rim: Uprising (2018). (Image courtesy of Universal Pictures) BOTTOM: Tony Stark is Iron Man in Avengers: Age of Ultron (2015). (Image courtesy Marvel Studios/ Walt Disney Studios)
company at $65 million, partnering with company founder Thilo Kuther, who remained as Chief Executive/Executive Producer. Pixomondo was founded in 2001 in Germany, and doubled in size between 2014 and 2017. In 2018, EagleTree took a majority stake in FuseFX; no financial details were disclosed, but Founder/Chief Executive David Altenau and Co-founders, Chief Development Officer Tim Jacobsen and Chief Technology Officer Jason Fotter retain a minority stake and have stayed with the company in their current roles. Everybody notes the pressure to scale in an industry that still produces very thin profit margins. One VFX house executive reports that a VFX industry joke is that facility owners aspire to the profit margins earned by grocery stores, which typically range from 1% to 2%. Walkinshaw notes that one of the pressures to get
“[I]t is hard to run a mid-sized company. You’re not so small, so you can’t be as nimble with schedule changes as you used to be. But you’re not a big corporation where you have a plethora of resources to shuffle. It definitely has its challenges. It’s easier to form little companies to compete. And as people become more skilled and efficient, you can achieve certain tasks in a day that took a week five years ago.” —Andrew Bly, Co-founder/Executive Producer, The Molecule

big is because the projects themselves are bigger. “There are more shops than there used to be,” she says. “You can’t just focus on animation or environments or effects.” At the same time, it’s getting easier than ever to open a small house, says Chambers, who points out the existence of less expensive, more standardized equipment and software availability. “But maintaining business at a small independent house is very tough,” he admits. “The pressure to grow is intense, but once you get past 100 artists, it becomes a different ball game.” Okun also notes that the idea of a company of five or six people is appealing but, for a smaller house, expanding and contracting per project isn’t as easy to do as it is for a bigger house. Therefore, the need to keep constant work flowing in is essential to the health of the small company. The Molecule in New York inhabits the space of the mid-sized visual effects facility. “We started with five guys in a room 13 years ago,” says Bly. “Now we are up to 45 full-time employees – up to 90 between New York and Los Angeles, although we do 70% of our business in New York because of the tax incentives.” Most of the work, he explains, is for episodic series, including House of Cards, Ballers, The Americans and, currently, Happy. “I would agree that it is hard to run a mid-sized company,” he says. “You’re not so small, so you can’t be as nimble with schedule changes as you used to be. But you’re not a big corporation where you have a plethora of resources to shuffle. 
It definitely has its challenges.” He adds that, “it’s easier to form little companies to compete. And as people become more skilled and efficient, you can achieve certain tasks in a day that took a week five years ago,” he says. Even as a mid-sized house, The Molecule does outsource some of the
rotoscoping and other similar work. “When we’re allowed to,” adds Bly, “as long as the client is all right with it.”

ROLE OF TECHNOLOGY
Change is a constant, and nowhere is that more obvious than in the technologies used to create visual effects. The maturity and strength of the VFX industry help explain why visual effects work has become a booming business. “Visual effects are seeping into all the other departments – camera, lighting, costumes, sets, makeup, hair, production design and stunts,” says Okun, “so there’s more work than ever before.” Morris agrees, noting that, “visual effects now support so much of the storytelling process.” Visual effects companies are also leveraging new technologies such as virtual reality and immersive LED walls to grow their businesses. In pursuit of the ever-more-efficient pipeline, executives at visual effects houses are tasked with staying on top of trends as diverse as software-as-a-service (SaaS), light-field capture, cloud rendering and game engines for real-time production. “It’s important to remain aware of the interesting things happening in the market,” says Rising Sun Pictures’ Clark. “We all need to be open to embracing new technology and ideas, but also maintain a stable production environment. We need to test new technologies in smaller, less interdependent environments – and keep in mind that the hard things of today are ultimately commoditized.” In her research, Maher points to “new tools for collaboration and the cloud. SaaS is coming to all software and will be especially attractive to VFX studios because it’s a logical way to keep track of expenses,” she says. She adds that The Jungle Book ushered in the use of game engines for virtual production, which, encouraged by the makers of the Unreal and Unity game engines, has been gaining adoption in the VFX community. Okun notes that First Man is an example of a movie that took the tools from Gravity and pushed them further. “There are lots of creative technicians and artists working behind the scenes,” he says. 
“On First Man, they built a bigger LED wall and instead of projecting proxy images, they projected the real images and used that to light the actors inside the space ships. That’s pretty clever.” He also points to Alejandro Iñárritu’s virtual reality installation Carne y Arena as “an amazing use of technology to tell a story that’s never been told, in a way that’s so immersive that it can’t be ignored.” He also points to advances in lighting that have helped visual effects, specifically smaller LED lights that put out more light and are remote-controlled as responsible for speeding set-ups. “Being able to program the lights means I can get my interactive lighting done just right,” he says. “We’re out of the age of grips waving flags in front of lights.” Rendering in the cloud has had an uneven uptake. At Framestore, Walkinshaw reports they already do cloud rendering. Ulbrich notes that, although the cloud is a massive opportunity and works extremely well, “it is shockingly expensive unless you own it. We’ve sliced and diced it to make sense of it, and now we burst into the cloud when we’re peaking and scaled beyond our data centers,” he says, “but the money is still better spent in our
TOP: Helen Stanley, Managing Director of Integrated Advertising, Framestore MIDDLE AND BOTTOM: Framestore provided concept and production of ‘strange environments’ for Volkswagen Group China Import’s “Touareg Hyper Reality Test Drive” VR experience, unveiled last fall. (Image courtesy of Framestore)
“We’ve started to work for major ride installations, for the main IP or agencies. Clients come to us with real-time requirements and the VR projects are becoming more and more ambitious. It’s adding a new category for the visual effects facility.” —Helen Stanley, Managing Director of Integrated Advertising, Framestore
TOP: Jeffrey A. Okun, VES, Visual Effects Supervisor MIDDLE: Weta Digital is working on Avatar sequels. (Image © Twentieth Century Fox and courtesy of Weta Digital) BOTTOM: A teenage Diana Prince (played by Gal Gadot) unleashes her Amazonian powers on a competitor in Wonder Woman (2017). (Image courtesy of Warner Bros. and DNEG)
“Visual effects are seeping into all the other departments – camera, lighting, costumes, sets, makeup, hair, production design and stunts, so there’s more work than ever before.” —Jeffrey A. Okun, VES, Visual Effects Supervisor
own render farm, where it can be used again and again.” Shah adds that “the moment it becomes viable to have the artist workstation rendered out of the cloud, it’ll be a tipping point for the economic model. One of the anomalies of our business is that you have a fixed cost business with a fickle revenue flow,” he says. “When you can put things into the OPEX (operating costs) instead of the CAPEX (capital costs) side, I believe it will dramatically change the industry. The largest companies may be the initial beneficiaries, but with a highly variable model, you’re in a better position than any company with fixed infrastructure and talent. That small, talented team is no longer limited.” The future of VFX technology – and, to a degree, the VFX industry – can be found in the research some visual effects companies continue to do, teaming with key partners to make the workflow more efficient and the end results better. One is Project Sauce, a three-year EU Research and Innovation project involving a nine-partner consortium, including Double Negative, Disney Research and Foundry, that aims to greatly increase efficiency and cost effectiveness of VFX. According to its website [www.sauceproject.eu], Sauce has three “measurable” goals: “Create tools to unlock value in previously-created content by allowing to edit or automatically adapt its properties (e.g. appearance, scale, motion) to the current production needs. Create tools to allow content created in the future to be more easily re-used and re-purposed, by developing light-field technology for the creative industries in terms of capture, storage, distribution and processing. Create management, animation and production tools to keep digital content accessible, discoverable and malleable by using procedural techniques and high-level semantic knowledge.”
TOP: Fiona Walkinshaw, Managing Director of Feature Films, Framestore BOTTOM: Framestore was the production company for Samsung’s “A Moon for All Mankind,” a lunar-gravity-simulation VR experience. (Image courtesy of Framestore)
“There are big commitments from the government to maintain healthy incentives [in Canada]. We wouldn’t go into these places without knowing it’s a robust set-up. Incentives attract a huge number of artists to Montreal, and so many people paying taxes is good for their economy and the other businesses they’re trying to build up.” —Fiona Walkinshaw, Managing Director of Feature Films, Framestore

A.I. INFLUENCE

Lyster reports that ILM, with Disney Research, is working on using artificial intelligence to de-noise the images produced by ray tracing. “It will cut down on CPU and render time,” she says. “We’re also investigating techniques to recognize and replace human faces in a semi-automated way, potentially a huge leap forward. Even with rotoscoping, you can ask the computer to hunt for humans and automatically rotoscope. It’s not production level yet, but pretty amazing.” Many other companies are turning to artificial intelligence or machine learning to improve the visual effects process. Morris adds that ILM, again with Disney Research, is also looking at a technology to capture performance without helmets and balls. “An actor will move differently when you put a weight on his head,” he says. “The best performances are when an actor feels comfortable.” At Spin VFX, Ali points out the changes that 5G – the fifth-generation wireless system, in principle 100 times faster than the current 4G – will create. Ulbrich adds that 5G isn’t just a faster phone, but will take down latency to below human perception. “In a few years, it’ll be in wider use, and you’ll be able to move data in seemingly real time. The day when you’re streaming it directly into the cloud with no limits on storage will change everything.” Looking to the future, Bly points out that anything that removes the friction between the artist and his/her interaction with tools would be a great step forward. “It’s not render power, which is going to evolve naturally,” he says, “it’s finding the nuances through an artist’s day that are redundant or more technical. Taking them out will allow the artist to shine.” When it comes to the artists, Lyster hopes for a more diverse workplace in the future. “One of the big challenges in the industry is that we’re very male,” she says. “I’d like to see anyone coming back in five years to see a large percentage of females in the workforce.” Noting that, with cloud computing, real-time rendering and improved GPUs, a frame that takes two hours to render today could take two seconds, Ulbrich also looks ahead. “There’s no secret that the ecosystem has been disrupted,” he says. “Disruption isn’t always bad. Look at how the music industry was
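The render-time saving Lyster describes is easy to see with a toy Monte Carlo example. A ray-traced pixel is an average of noisy radiance samples, so its error falls only with the square root of the sample count; a denoiser lets you stop at a low count and filter out the residual noise instead of paying for more samples. ILM and Disney Research use learned denoisers; the simple neighbourhood average below is a crude stand-in, used purely to illustrate the economics.

```python
import numpy as np

rng = np.random.default_rng(7)
truth = np.ones((32, 32))  # a flat grey wall, perfectly rendered

def render(samples):
    """Toy path tracer: each pixel is the mean of noisy radiance samples."""
    noise = rng.normal(0.0, 1.0, (samples,) + truth.shape)
    return truth + noise.mean(axis=0)

def box_denoise(img, r=2):
    """(2r+1)^2 neighbourhood average, a stand-in for a learned denoiser."""
    h, w = img.shape
    p = np.pad(img, r, mode="edge")
    acc = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += p[r + dy:r + dy + h, r + dx:r + dx + w]
    return acc / (2 * r + 1) ** 2

def rmse(img):
    """Root-mean-square error against the noise-free reference."""
    return float(np.sqrt(((img - truth) ** 2).mean()))

err_cheap = rmse(render(16))                  # fast render, heavy noise
err_denoised = rmse(box_denoise(render(16)))  # same render cost plus a denoise pass
err_expensive = rmse(render(256))             # 16x the render cost
```

With these settings, the denoised 16-sample frame lands close to the 256-sample frame's error at a fraction of the render cost. (On real images a naive blur also smears detail, which is exactly what learned denoisers are trained to avoid.)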
TOP: Hitesh Shah, Chief Executive/ Co-founder, BotVFX MIDDLE: Agent K (Ryan Gosling) cruising the spinner back to base in Blade Runner 2049 (2017). Design by Jeremy Hanna for Weta Workshop. (Image © 2017 Alcon Entertainment, LLC, Warner Bros. Entertainment Inc. and Columbia Tristar Marketing Group Inc. and courtesy of Weta Workshop) BOTTOM: A crash-landed Agent K surveys a desolate L.A. landscape in Blade Runner 2049 (2017). (Image © 2017 Alcon Entertainment, LLC, Warner Bros. Entertainment Inc. and Columbia Tristar Marketing Group Inc. and courtesy of Rodeo FX)
disrupted two decades ago. It’s thriving now, but it’s different than it was. I’d say we’re in a growth industry. But you have to be nimble and situated geographically in the right place and be capitalized to be able to pivot and remain relevant. Some adapt and reign supreme, and others go the way of the dinosaur.”

GENDER PARITY AND DIVERSITY
“The moment it becomes viable to have the artist workstation rendered out of the cloud, it’ll be a tipping point for the economic model. One of the anomalies of our business is that you have a fixed cost business with a fickle revenue flow. When you can put things into the OPEX (operating costs) instead of the CAPEX (capital costs) side, I believe it will dramatically change the industry.” —Hitesh Shah, Chief Executive/Co-founder, BotVFX

At the same time, other issues are dotting the global VFX landscape. Among them: gender parity and diversity, lingering questions about unionization, and that old bogeyman – the ‘race to the bottom.’ Says Maher: “We are seeing an increase in 50/50 pledges and we’re seeing it across industries. The film and entertainment industry is much more proactive than what we’re seeing for design and engineering, probably because the customer base is already 50/50 (according to MPAA). There is also evidence that companies with equitable gender distribution are more profitable, though apparently it has taken some time for this news to trickle down to HR. I believe the #MeToo movement has done more to demonstrate that women are willing to speak up and demand fairness or at least redress and thus shake up the status quo. Not coincidentally, the film industry is the flash point for #MeToo.” “People have work,” Maher states. “I’m not hearing about unionization so much. Also, sadly, labor movements all over the world are being undermined by companies and governments, but, again, people have work.” Adds Cowan, “I think all VFX artists would love to have some of the protection of a union, but from a global perspective, it’s very hard to achieve. It’s now a bit of a case of ‘the barn door is open and the horse is long gone’ in many places. It’s hard to apply retroactively. Unions would also have to be set up not just in each country, but in very specific regions, from what I am told.” As for the ‘race to the bottom,’ Maher says, “I think a number of things are going on. I think the race to the bottom was driven primarily by studios and directors, not necessarily contract workers. Sure, people need to have work. I’m sure there was and still is underbidding. But companies who couldn’t deliver went out of business. 
Also, we’ve seen companies prefer to stay closer to home when hiring contractors (we’ve also seen this in Design and Engineering as well as Media and Entertainment) because there are fewer issues in communications, time lag, etc. However, key to this is quality not convenience, I think. The digital transformation has made it easier to hire people wherever they are.
76 • VFXVOICE.COM SPRING 2019
PG 64-77 GLOBAL VFX.indd 77
That’s going to continue.” Notes Cowan, “Yes, that is still ongoing, but what tends to happen is that some companies go in so low with bids, offering very high salaries, then collapse quite quickly and fail to deliver. The work ‘at the bottom,’ if you like, tends to then bounce back up the chain to get fixed and re-done. Clients are always under pressure to find a cheaper/better deal, but there is a point where they tend to find that you often get what you pay for, and many are now going with VFX companies with better reputations that are more expensive just to ensure quality is good and delivery is met to the required standard.” CONCLUSION
Disruption over the past five to 10 years has irrevocably changed the visual effects industry from the point of view of
TOP: Dayne Cowan, VP of Film, VHQ Media MIDDLE: Alita (played by Rosa Salazar) is a cyborg with a human brain, heart and soul, and the ability to save the world, in the anime adapted to film, Alita: Battle Angel (2019). (Image courtesy of Twentieth Century Fox) BOTTOM: Insurance salesman/ ex-cop Liam Neeson gets caught up in a criminal conspiracy during his daily train ride home in The Commuter (2018). (Image courtesy of StudioCanal and Cinesite)
“I think all VFX artists would love to have some of the protection of a union, but from a global perspective, it’s very hard to achieve. It’s now a bit of a case of ‘the barn door is open and the horse is long gone’ in many places. It’s hard to apply retroactively. Unions would also have to be set up not just in each country, but in very specific regions...” —Dayne Cowan, VP of Film, VHQ Media business models and technologies. As long as financial incentives exist – and they show no sign of going away any time soon – big facilities will fill the space by opening up subsidiaries in “hot” cities. The small shop, headed by passionate artists, will always find a way to operate in and around the margins, and the mid-sized house will always be shifting its weight, trying to find the sweet spot between big and small. New technologies also promise to disrupt the status quo even further – but it’s too early to tell how. As machine learning and artificial intelligence combine with faster 5G networks, more automated processes are likely to displace manual ones. If rotoscoping becomes a fully automated operation, what does that do to facilities that specialize in that? Will new technology create new jobs that will require the labor of individuals? And when it comes to the operation of the VFX facility, will automated processes cut jobs and costs, enabling better profit margins? These questions are ones that current VFX executives are already thinking about. On the human scale, the past few years have brought a more intense spotlight on gender parity and diversity in the workplace, two issues of new focus in the visual effects industry as well as in the larger Hollywood film/TV industry. How to make the VFX industry more diverse and protect workers from harassment are just two questions to be addressed in 2019 and beyond. Visual effects will become ever more ubiquitous as communications increasingly move from the word to the image. 
With more and more video, there’s every reason to believe that the next five years will be as busy – if not busier – than the last five to 10 years. How visual effects personnel, from executives and producers, supervisors to artists, navigate these evolving trends remains to be seen.
SPRING 2019 VFXVOICE.COM • 77
2/22/19 2:57 PM
SPECIAL REPORT
TOP: Hitesh Shah, Chief Executive/ Co-founder, BotVFX MIDDLE: Agent K (Ryan Gosling) cruising the spinner back to base in Blade Runner 2049 (2017). Design by Jeremy Hanna for Weta Workshop. (Image © 2017 Alcon Entertainment, LLC, Warner Bros. Entertainment Inc. and Columbia Tristar Marketing Group Inc. and courtesy of Weta Workshop) BOTTOM: A crash-landed Agent K surveys a desolate L.A. landscape in Blade Runner 2049 (2017). (Image © 2017 Alcon Entertainment, LLC, Warner Bros. Entertainment Inc. and Columbia Tristar Marketing Group Inc. and courtesy of Rodeo FX)
disrupted two decades ago. It’s thriving now, but it’s different than it was. I’d say we’re in a growth industry. But you have to be nimble and situated geographically in the right place and be capitalized to be able to pivot and remain relevant. Some adapt and reign supreme, and others go the way of the dinosaur.”

GENDER PARITY AND DIVERSITY
At the same time, other issues are dotting the global VFX landscape. Among them: gender parity and diversity, lingering questions about unionization, and that old bogeyman – the ‘race to the bottom.’

“The moment it becomes viable to have the artist workstation rendered out of the cloud, it’ll be a tipping point for the economic model. One of the anomalies of our business is that you have a fixed-cost business with a fickle revenue flow. When you can put things into the OPEX (operating costs) instead of the CAPEX (capital costs) side, I believe it will dramatically change the industry.”
—Hitesh Shah, Chief Executive/Co-founder, BotVFX

Says Maher: “We are seeing an increase in 50/50 pledges and we’re seeing it across industries. The film and entertainment industry is much more proactive than what we’re seeing for design and engineering, probably because the customer base is already 50/50 (according to the MPAA). There is also evidence that companies with equitable gender distribution are more profitable, though apparently it has taken some time for this news to trickle down to HR. I believe the #MeToo movement has done more to demonstrate that women are willing to speak up and demand fairness, or at least redress, and thus shake up the status quo. Not coincidentally, the film industry is the flash point for #MeToo.”

“People have work,” Maher states. “I’m not hearing about unionization so much. Also, sadly, labor movements all over the world are being undermined by companies and governments, but, again, people have work.”

Adds Cowan, “I think all VFX artists would love to have some of the protection of a union, but from a global perspective, it’s very hard to achieve. It’s now a bit of a case of ‘the barn door is open and the horse is long gone’ in many places. It’s hard to apply retroactively. Unions would also have to be set up not just in each country, but in very specific regions, from what I am told.”

As for the ‘race to the bottom,’ Maher says, “I think a number of things are going on. I think the race to the bottom was driven primarily by studios and directors, not necessarily contract workers. Sure, people need to have work. I’m sure there was and still is underbidding. But companies who couldn’t deliver went out of business.
Also, we’ve seen companies prefer to stay closer to home when hiring contractors (we’ve also seen this in Design and Engineering as well as Media and Entertainment) because there are fewer issues with communication, time lag, etc. However, the key here is quality, not convenience, I think. The digital transformation has made it easier to hire people wherever they are.
That’s going to continue.” Notes Cowan, “Yes, that is still ongoing, but what tends to happen is that some companies go in so low with bids, offering very high salaries, then collapse quite quickly and fail to deliver. The work ‘at the bottom,’ if you like, tends to then bounce back up the chain to get fixed and re-done. Clients are always under pressure to find a cheaper/better deal, but there is a point where they tend to find that you often get what you pay for, and many are now going with VFX companies with better reputations that are more expensive just to ensure quality is good and delivery is met to the required standard.”

CONCLUSION
TOP: Dayne Cowan, VP of Film, VHQ Media MIDDLE: Alita (played by Rosa Salazar) is a cyborg with a human brain, heart and soul, and the ability to save the world, in Alita: Battle Angel (2019), adapted from the anime. (Image courtesy of Twentieth Century Fox) BOTTOM: Insurance salesman/ex-cop Liam Neeson gets caught up in a criminal conspiracy during his daily train ride home in The Commuter (2018). (Image courtesy of StudioCanal and Cinesite)

“I think all VFX artists would love to have some of the protection of a union, but from a global perspective, it’s very hard to achieve. It’s now a bit of a case of ‘the barn door is open and the horse is long gone’ in many places. It’s hard to apply retroactively. Unions would also have to be set up not just in each country, but in very specific regions...”
—Dayne Cowan, VP of Film, VHQ Media

Disruption over the past five to 10 years has irrevocably changed the visual effects industry from the point of view of business models and technologies. As long as financial incentives exist – and they show no sign of going away any time soon – big facilities will fill the space by opening subsidiaries in “hot” cities. The small shop, headed by passionate artists, will always find a way to operate in and around the margins, and the mid-sized house will always be shifting its weight, trying to find the sweet spot between big and small.

New technologies also promise to disrupt the status quo even further – but it’s too early to tell how. As machine learning and artificial intelligence combine with faster 5G networks, more automated processes are likely to displace manual ones. If rotoscoping becomes a fully automated operation, what does that do to the facilities that specialize in it? Will new technology create new jobs that require human labor? And when it comes to the operation of a VFX facility, will automated processes cut jobs and costs, enabling better profit margins? These are questions that VFX executives are already thinking about.

On the human scale, the past few years have brought a more intense spotlight on gender parity and diversity in the workplace, two issues of new focus in the visual effects industry as well as in the larger Hollywood film/TV industry. How to make the VFX industry more diverse and how to protect workers from harassment are just two questions to be addressed in 2019 and beyond. Visual effects will become ever more ubiquitous as communication increasingly moves from the word to the image.
With more and more video, there’s every reason to believe that the next five years will be as busy as – if not busier than – the last five to 10. How visual effects personnel – from executives and producers to supervisors and artists – navigate these evolving trends remains to be seen.
FOCUS
BRIGHT NORTHERN LIGHTS: THE EVOLUTION OF THE CANADIAN VFX INDUSTRY By TREVOR HOGG
TOP: An in-house film project, Faster Than Light, produced by Artifex Studios. (Image courtesy of Artifex Studios)
What do Houdini, Maya and Softimage have in common? They all originated in Canada, courtesy of SideFX, Alias Systems Corporation and Softimage. “All of a sudden there was this growth in computer graphics in the mid-1990s, everyone went public except for us, and then both Alias and Softimage were purchased by Autodesk,” notes SideFX President and CEO Kim Davidson, who believes that the key to survival for an independent company such as SideFX is servicing a niche market. “The general problem in Canada is the lack of mid-market funding. But this wasn’t an issue for us as we grew organically with great support from Canadian schools, both on the technical and artistic sides, as well as from governments with R&D tax credits.”

Continuing the tradition of technological innovation is Vancouver-based Ziva Dynamics, which released Ziva VFX, a character-creation Maya plug-in that produces realistic muscle growth and deformations. “I realized that as the projects I worked on grew in complexity, the commercial rigging solutions were no longer yielding the required level of realism,” remarks James Jacobs, Co-founder and Co-CEO at Ziva Dynamics. “Scanline VFX played a pivotal role in the feature decisions we made for our MVP release back in 2015, and can now be found using Ziva VFX to create fully-simulated creatures for major productions, like the megalodon in the 2018 summer blockbuster The Meg.”

One of the early pioneers was Toronto-based Omnibus Computer Graphics, best known for producing the procedural graphics application PRISMS (the forerunner of Houdini) and digital effects for Explorers (1985) and Flight of the Navigator (1986) before going bankrupt in 1987. Seven years later, John Mariella, Bob Munroe, Kyle Menzies and William Shatner founded C.O.R.E. Digital Pictures, which was involved with a variety of movie and television projects such as Fly Away Home (1996), Wonderfalls (2004) and The Wild (2006).
“Back then, Canadian talent had to move out of the country to find work as effects artists on feature films,” notes SideFX Senior Production Consultant John Mariella. “There weren’t any companies servicing the long-form or feature-film markets in Canada. That was the key motivation for us to set up in Toronto. Ultimately, delays in production green-lighting, the strength of the Canadian dollar, and costs associated with gap financing tax credits led to our closure in 2010.”

Visual effects and animation have evolved into a billion-dollar industry in British Columbia, responsible for 12,000 jobs. “All our tax incentives are labor-based, so if you hire a permanent B.C. resident, you will receive back a portion of their salary,” remarks Nancy Basi, Executive Director, Media and Entertainment Center at the Vancouver Economic Commission. “This both encourages and sustains the growth of the talent pool and career development of the professionals working in this sector.”

A significant vote of confidence occurred when Sony Pictures Imageworks, responsible for The Polar Express (2004) and Watchmen (2009), decided to relocate its headquarters from Culver City, California to Vancouver in 2014. “In order to be more competitive, we needed to be aggressive in ramping up our footprint in tax-incentive locations, so we decided to go big in Vancouver and not just grow our office but make a commitment in terms of a physical headquarters and building up our management team,” remarks Sony Pictures Imageworks President Randy Lake. “It has worked out great. We have a beautiful 73,000-square-foot floor for our needs, and ended up needing incidental space as well – as our production has ebbed and flowed over the past four years – and we have around 1,100 artists.”

“Image Engine was founded by three people in 1995 and has since grown into a 300-person, high-end visual effects studio,” states Shawn Walsh, Visual Effects Executive Producer & General Manager at Image Engine. “This growth is emblematic of the overall trend in visual effects in both Vancouver and Canada over the past 20 years. The studio cut its teeth contributing to the growing need for video post-production services in Vancouver during the late 1990s and early 2000s, and had its coming-of-age moment with the Academy Award-nominated sophisticated creature work for Neill Blomkamp’s District 9 in 2009. In the decade since, Image Engine has focused on delivering an ever-expanding portfolio of work to mainly feature film [Skyscraper] and high-end television clientele [Lost in Space].”

“Artifex started in Vancouver as a place for me to work!” laughs Artifex Studios President Adam Stern, whose independent
TOP LEFT: Movie posters showcase the work of Framestore Montreal. (Photo courtesy of Framestore) TOP RIGHT: Framestore Montreal won an Oscar for being part of the Blade Runner 2049 visual effects team. (Image courtesy of Framestore) BOTTOM LEFT: A screening room situated at Hybride. (Photo courtesy of Hybride) BOTTOM RIGHT: Hybride contributed visual effects on Solo: A Star Wars Story. (Image courtesy of Hybride)
TOP LEFT: Image Engine has contributed to the acclaimed fantasy series Game of Thrones. (Image courtesy of Image Engine) TOP RIGHT: A group of Image Engine employees gather for a discussion. (Photo courtesy of Image Engine) BOTTOM LEFT: An example of ethereal effects produced as part of the compositing program at Lost Boys Studios by Hillary Vu. (Image courtesy of Lost Boys Studios) BOTTOM RIGHT: Co-founders/owners Ria and Mark Bénard are surrounded by up-and-coming digital artists trained by Lost Boys Studios. (Photo courtesy of Lost Boys Studios.)
visual effects studio has contributed to Showcase’s Continuum and Amazon’s The Man in the High Castle. “We’ve now been in operation for over 20 years and generally employ between 30 and 40 staff. It’s important to me to recognize that people have lives outside of work, so that generally means a 40-hour work week, paid OT, paid time off, and respect for the individual artists and the teams they’re part of. We also do our absolute best to bring in projects that we think fit the studio and that artists would want to work on.” Stern is exploring being more than a service provider. “To date, Artifex has produced two short films, and they’ve both done quite well [and are both being developed further]. It has helped inform how we approach working on other film and television projects, and has given Artifex a window into the unique challenges other creatives face within this industry.”

The total economic impact of the computer animation and visual effects sector in Ontario was $449.5 million in 2014, with 7,000 people employed. “Ontario Creates manages a variety of programs and financial incentives to support the production sector, including the Ontario Computer Animation and Special Effects Tax Credit (OCASE),” explains Justin Cutler, Film Commissioner of Ontario at Ontario Creates. “This tax credit returns 18% of Ontario-based expenditures related to animation and special effects to the production to help offset costs and encourage investment in future productions. It is available in addition to other provincial and federal tax credits and grants that support film and television production.”

“Canada as a whole has the best talent pool and tax incentives, whether you’re in British Columbia, Ontario or Quebec,” believes Soho VFX CEO and Co-founder Allan Magled. “All of the talent is either in Toronto, Montreal or Vancouver. You could do the same
thing in another province, but you would be hard-pressed to get the talent over there.” Established in 2002, Soho VFX experienced an international breakthrough completing 200 shots for the Marvel Studios blockbuster The Incredible Hulk (2008). “You’re always chasing projects, but over the years you also manage to develop relationships and keep them, provided that your work gets better or you’re doing good work and the service is good.” Magled does not have plans to expand beyond Toronto. “When the MPCs of the world take on a show, it’s an enormous chunk, like 1,000 shots. We could take on 400 shots, and are happy with our size and ability.”

Spin VFX has been part of the Toronto visual effects scene for 30 years and has been responsible for Suicide Squad (2016) and Netflix’s The Haunting of Hill House. “SIRT [Screen Industries Research and Training Center] has provided an important bridge between academics and industry in the Toronto visual effects community,” notes Neishaw Ali, President and Executive Producer at Spin VFX. “Their focus on research and new technology keeps them relevant and engaged with current and upcoming challenges. It provides students with a chance to gain valuable experience while filling a real need for companies.” Ali adds, “Toronto and Canada must continue to invest in human and physical infrastructure. Additional studio space needs to be built to ensure more shows have the flexibility to shoot in Toronto. Equally important is the need to continue to develop our creative and technical talent locally.”

The visual effects industry is worth $262 million in Quebec and is responsible for 3,000 jobs, with a rebate program managed by SODEC (Société de développement des entreprises culturelles) that gives 20% cash-back on all expenses and a 16% bonus on all CGI and greenscreen shots.
“The last five years have been quite remarkable,” remarks Eric Kucharsky, Director, Business Development, ICT Europe - Foreign Investments at Montreal International. “The industry up until then was local and niche, with few shots on larger projects, like Hybride on 300 [2006].” Foreign visual effects studios, such as Framestore, Cinesite, MPC, Digital Domain, and most recently Method Studios, Scanline VFX and Pixomondo, have been drawn to Montreal. “Not only because of the tax credit, but more importantly the large talent pool that we have, especially in the gaming sector,” notes Christian Bernard, Chief Economist and Vice President Marketing Communication at Montreal International.
TOP LEFT: An inside look at Method Studios Vancouver. (Photo courtesy of Method Studios) TOP RIGHT: Atomic Fiction, now Method Studios, has established a creative partnership with Oscar-winning filmmaker Robert Zemeckis, which includes The Walk. (Image courtesy of Method Studios) BOTTOM: Mr. X was responsible for the visual effects featured in The Shape of Water, which won the Oscar for Best Picture in 2018. (Image courtesy of Mr. X)
FOCUS
TOP LEFT: Image Engine has contributed to the acclaimed fantasy series Game of Thrones. (Image courtesy of Image Engine) TOP RIGHT: A group of Image Engine employees gather for a discussion. (Photo courtesy of Image Engine) BOTTOM LEFT: An example of ethereal effects produced as part of the compositing program at Lost Boy Studios by Hillary Vu. (Image courtesy of Lost Boys Studios) BOTTOM RIGHT: Co-founders/owners Ria and Mark Bénard are surrounded by up and coming digital artists trained by Lost Boys Studios. (Photo courtesy of Lost Boys Studios.
80 • VFXVOICE.COM SPRING 2019
PG 78-85 CANADA FOCUS.indd 80-81
visual effects studio has contributed to Showcase’s Continuum and Amazon’s The Man in the High Castle. “We’ve now been in operation for over 20 years and generally employ between 30 and 40 staff. It’s important to me to recognize that people have lives outside of work so that generally means a 40-hour work week, paid OT, paid time off, and respect for the individual artists and the teams they’re part of. We also do our absolute best to bring in projects that we think fit the studio and artists would want to work on.” Stern is exploring being more than a service provider. “To date, Artifex has produced two short films, and they’ve both done quite well [and are both being developed further]. It has helped inform how we approach working on other film and television projects, and has given Artifex a window into the unique challenges other creatives face within this industry.” The total economic impact of the computer animation and visual effects sector for Ontario was $449.5 million in 2014 with 7,000 people being employed. “Ontario Creates manages a variety of programs and financial incentives to support the production sector, including the Ontario Computer Animation and Special Effects Tax Credit (OCASE),” explains Justin Cutler, Film Commissioner of Ontario at Ontario Creates. “This tax credit returns 18% of Ontario-based expenditures related to animation and special effects to the production to help offset costs and encourage investment in future productions. It is available in addition to other provincial and federal tax credits and grants that support film and television production.” “Canada as a whole has the best talent pool and tax incentives whether you’re in British Columbia, Ontario or Quebec,” believes Soho VFX CEO and Co-founder Allan Magled. “All of the talent is either in Toronto, Montreal or Vancouver. You could do the same
thing in another province, but you would be hard-pressed to get the talent over there.” Established in 2002, Soho VFX experienced an international breakthrough completing 200 shots for the Marvel Studio blockbuster The Incredible Hulk (2007). “You’re always chasing projects, but over the years you also manage to develop relationships and keep those, provided that your work gets better or you’re doing good work and the service is good.” Magled does not have plans to expand beyond Toronto. “When MPCs of the world take on a show, it’s an enormous chunk, like a 1,000 shots. We could take on 400 shots, and are happy with our size and ability.” Spin VFX has been part of the Toronto visual effects scene for 30 years and has been responsible for Suicide Squad (2016) and Netflix’s The Haunting of Hill House. “SIRT [Screen Industries Research and Training Center] has provided an important bridge between academics and industry in the Toronto visual effects community,” notes Neishaw Ali, President and Executive Producer at Spin VFX. “Their focus on research and new technology keeps them relevant and engaged with current and upcoming challenges. It provides students with a chance to gain valuable experience while filling a real need for companies.” Ali adds, “Toronto and Canada must continue to invest in human and physical infrastructure. Additional studio space needs to be built to ensure more shows have the flexibility to shoot in Toronto. Equally important is the need to continue to develop our creative and technical talent locally.” The visual effects industry has an economic worth of $262 million in Quebec and is responsible for 3,000 jobs, with a rebate program managed by SODEC (Society of Developing Cultural Enterprises) that gives 20% cash-back for all expenses and 16% bonus on all CGI and greenscreen shots. 
“The last five years have been quite remarkable,” remarks Eric Kucharsky, Director, Business Development, ICT Europe - Foreign Investments at Montreal International. “The industry up until then was local and niche with few shots on larger projects, like Hybride on 300 [2006].” Foreign visual effects studios, such as Framestore, Cinesite, MPC, Digital Domain, and most recently Method Studios, Scanline VFX and Pixomondo, have been drawn to Montreal. “Not only because of the tax credit, but more importantly the large talent pool that we have, especially, in the gaming sector,” notes Christian Bernard, Chief Economist and Vice President Marketing Communication at Montreal International.
TOP LEFT: An inside look at Method Studios Vancouver. (Photo courtesy of Method Studios) TOP RIGHT: Atomic Fiction, now Method Studios, has established a creative partnership with Oscar-winning filmmaker Robert Zemeckis, which includes The Walk. (Image courtesy of Method Studios) BOTTOM: Mr. X was responsible for the visual effects featured in The Shape of Water, which won the Oscar for Best Picture in 2018. (Image courtesy of Mr. X)
SPRING 2019 VFXVOICE.COM • 81
FOCUS
TOP LEFT: The mysterious alien spacecraft from Arrival featuring environmental work from Raynault VFX. (Image courtesy of Raynault VFX) TOP RIGHT: An interior perspective of Raynault VFX based in Montreal. (Photo courtesy of Raynault VFX) BOTTOM LEFT: Rodeo FX helped filmmaker Eli Roth bring The House with a Clock in Its Walls to the big screen. (Image courtesy of Rodeo FX) BOTTOM RIGHT: The reception area for Rodeo FX with a number of awards on display. (Photo courtesy of Rodeo FX)
Tax credits and studio space are critical to Rodeo FX, which has a presence in Montreal, Quebec City, Los Angeles and Munich, and whose credits include Blade Runner 2049 (2017) and HBO’s Game of Thrones. “We have grown exponentially each year and added a new building in Montreal that can house up to 120 artists and connects two of our existing studios,” states Sébastien Moreau, President and VFX Supervisor at Rodeo FX. “We moved our practical shooting stage to what has become our Rodeo FX Campus in Old Montreal. We are also taking over a new floor at our main building that will add an additional 110 artists. As for tax credits, they allow us to be competitive and attract top clients; however, it’s important to note that our clients receive them, not us.”

“Post-production and VFX companies need to stay proactive and dynamic, and if they’re not, the VFX market will force them to be, because our market is in constant evolution,” notes Pierre Raymond, President, Co-founder and Head of Operations at Hybride. “I’d say that in 27 years of existence, we have probably reinvented Hybride five times.” High-profile projects include Rogue One: A Star Wars Story (2016) and Kong: Skull Island (2017). “One of the things that Montreal’s academic institutions did well over the years was opening programs to train extremely talented people in different fields. It’s mainly because of the quality of the resources we could find from Montreal’s diversity that Hybride became the first company on the North American east coast to really stand out and the first company in Quebec to work on blockbusters.”

In 2011, digital matte painter, concept artist and visual effects art director Mathieu Raynault established Raynault VFX, which specializes in environmental work that has appeared in Starz’s Black Sails, Arrival (2016), and Murder on the Orient Express
(2017). “Montreal is an attractive city being close to Europe and New York,” remarks Charlene Eberle, Executive Producer at Raynault VFX, “like Vancouver is attractive being close to California. The growth has been great. We have benefited greatly because of it. Clients now have a larger market to choose from and a larger pool of talent to utilize.” Contributing to the success of the boutique studio are a high staff-retention rate and a policy of recruiting employees who are proactive, autonomous and good team players. “Like any business, you need to stay on top of the industry in order to stay competitive. You also need to keep a keen eye on technological advances and always focus on the ever-changing demands of the industry.”

In 2018, Deluxe Entertainment Group purchased Atomic Fiction, which has worked on The Walk and Welcome to Marwen, to become part of Method Studios. “We started Atomic Fiction out of my home office in San Francisco and over the course of three years wanted to grow more,” remarks Atomic Fiction Co-founder Kevin Baillie, now Creative Director, Senior VFX Supervisor at Method Studios. “Montreal felt like uncharted territory. There weren’t any other American visual effects companies. MPC and Framestore had gotten started there, and Rodeo FX was the only established medium-size visual effects company. Fast-forward four years and every company is either in Montreal or going to be there. The talent pool has grown immensely, both in terms of what schools are doing as well as people coming from all around the world.”

Ed Ulbrich, President and General Manager, Visual Effects at Deluxe Entertainment, adds, “Most clients call and ask for Montreal first because they want that rebate. I was nearly ready to put pen to paper to sign a lease in Montreal to build our own facility, and through a mutual acquaintance I got to meet Kevin. It
TOP LEFT: A screenshot of Houdini 17. (Image courtesy of SideFX) TOP RIGHT: An explosion created by Sheridan College student Zhongyan Dai. (Photo courtesy of Sheridan College) BOTTOM LEFT: Inside the expanded studio space of Soho VFX. (Photo courtesy of Soho VFX) BOTTOM RIGHT: Soho VFX worked on the blockbuster Avengers: Age of Ultron. (Image courtesy of Soho VFX)
FOCUS
TOP: A group discussion taking place at Sony Pictures Imageworks. (Photo courtesy of Sony Pictures Imageworks) MIDDLE: Sony Pictures Imageworks had a fun and challenging time integrating the different animation styles in Spider-Man: Into the Spider-Verse. (Image courtesy of Sony Pictures Imageworks) BOTTOM: A shot from Hunter Killer created by Spin VFX. (Photo courtesy of Spin VFX)
was clear that, given the vision for the future of visual effects at scale and the challenges of Montreal, there was a mutual fit.”

Sheridan College established a computer animation program in 1981. “We’ve developed our curriculum in Visual Effects with training a generalist in mind,” states Sheridan College Program Coordinator/Professor of Computer Animation, Visual Effects & Digital Creature Animation, Noel Hooper. “We are also trying to stay ahead of industry needs by developing new programs like our Digital Creature TD program.” Lost Boys Studio, founded by Mark Bénard, has shifted toward specialized programs. “We were producing excellent generalists, capable of adapting to different production roles, but noticed that once a discipline was selected it was rare that our alumni ever shifted departments [usually staying in Lighting, FX or Compositing].”

Technicolor is actively involved with training initiatives in Canada. “Our courses are aligned to the skills framework and pipelines of the studios that we support: MPC Film, Mr. X and Mill Film,” states Jonathan Fletcher, Global Head of Learning and Development, Technicolor Academy. “It includes offering training on proprietary tools and VFX workflows. This in turn gives the artists more confidence about beginning their careers at one of the visual effects studios and helps with the transition onto the show floor.”

“The visual effects landscape is constantly evolving, and every region needs to utilize as many tools as possible to stay competitive, based on their unique strengths,” remarks Sony Pictures Imageworks Vice President, New Business Production, Shauna Bryan. “Vancouver has a labor-based rebate, so the industry has to focus on not only attracting, but retaining international talent, as well as focus on local education, training and internship programs.”

The first of the big companies to come to Montreal was Framestore in 2013.
“It made a lot of sense to open another studio in Canada,” notes Chloë Grysole, Managing Director at Framestore Montreal. “One of the factors was the lifestyle in Montreal. Knowing that you’re going to have to relocate people from all over the world, you want to make it a place where they are going to want to live. We have 600 people from 49 different nationalities on our staff.”

Canadian visual effects companies have attracted foreign ownership interest, with Ubisoft buying Hybride in 2008, Cinesite merging with Image Engine in 2015, and Mr. X being acquired by Technicolor in 2014. “If you are not strategically aligned or able to invest the capital needed in today’s global landscape, it’s going to be difficult for a mid-size studio,” notes Mr. X CEO Dennis Berardi, who supervised the visual effects for The Shape of Water (2017) and is currently producing Monster Hunter (2020). “It’s probably easier for a boutique studio that has a niche offering, like motion graphics or high-end matte paintings; they stay small and contain their costs. In my case, I wasn’t looking at being acquired or exiting the business. I was looking for growth and investment. I’ve been able to maintain creative control of the business. I’ve effectively switched from a Canadian chartered bank to the bank of Technicolor. It has been a positive result. We have maintained our support of the Canadian Film Centre, TIFF Bell Lightbox and independent films. I’m proud of that.”
Canada VFX A-Z

Major visual effects companies from around the world have a presence in Canada, including BUF, Cinesite, Digital Domain, DNEG, Framestore, ILM, Method Studios, MPC, Pixomondo, and Scanline VFX. But there is no shortage of homegrown vendors that have had a significant impact on the industry. Here is a brief guide to some of the most active companies in Canada (not meant to be a comprehensive list).

Animatrik. Founded by Brett Ineson and headquartered in Vancouver with a second facility in Los Angeles, Animatrik focuses on motion capture, pre-visualization and virtual cinematography services. Projects: Deadpool 2, Avengers: Infinity War, Ready Player One. www.animatrik.com

Artifex Studios. An independently-owned studio in Vancouver started by Adam Stern that produces visual effects for both television and film as well as producing its own content. Projects: Wayward Pines, The Company You Keep, Zoo. www.artifexstudios.com

Hybride. Quebec’s first visual effects studio, based in Piedmont and Montreal, with over a quarter of a century of experience contributing to advertising campaigns and high-end digital augmentation for film and television. Projects: Tomorrowland, Star Wars: The Force Awakens, Avatar. www.hybride.com/en

Image Engine. The Vancouver studio received an Academy Award nomination for District 9 and specializes in character/creature design and animation, digital environments, VFX supervision and concept art. Projects: Jurassic World: Fallen Kingdom, The X-Files, The Meg. www.image-engine.com

Mr. X. Located in Toronto, Montreal and Bangalore, the company established by Dennis Berardi has received Primetime Emmy and BAFTA Award nominations for Vikings and The Shape of Water and is currently entering the realm of producing with Monster Hunter. Projects: Creed II, Vice, Tomb Raider. www.mrxfx.com

Raynault VFX. Founded by Mathieu Raynault, a Canadian digital matte painter, concept artist and VFX art director, the Montreal studio specializes in CG environments, digital compositing and concept art. Projects: A Quiet Place, Assassin’s Creed, Murder on the Orient Express. www.raynault.com

Rodeo FX. Headquartered in Montreal, the privately-held company established by Sébastien Moreau also has studios in Quebec City, Munich and Los Angeles, and has received six VES Awards. Projects: Aquaman, Downsizing, Mowgli. www.rodeofx.com/en/home

SideFX. After purchasing the rights to the PRISMS code, former Omnibus employees Kim Davidson and Greg Hermanovic laid the foundation for the Toronto-based SideFX and the procedural graphics program Houdini, which has been lauded with four Academy of Motion Picture Arts and Sciences awards. www.sidefx.com

Soho VFX. Initially a boutique studio in Toronto, Soho VFX experienced a great deal of growth working on blockbuster fantasy, comics and horror productions. Projects: San Andreas, The Conjuring 2, Logan. www.sohovfx.com

Sony Pictures Imageworks. Transplanted from Culver City, California to Vancouver, SPI has been a major player not only in Canada but internationally in visual effects and computer animation, with Oscars for Spider-Man 2 and The ChubbChubbs!. Projects: Spider-Man: Into the Spider-Verse, Smallfoot, Jumanji: Welcome to the Jungle. www.imageworks.com

Spin VFX. Established in 1987, the Toronto-based visual effects studio provides on-set and in-studio supervision, 3D animation, on-set production services, concept and visualization, and CG effects for film and television productions. Projects: Spotlight, John Wick, Marco Polo. www.spinvfx.com

Ziva Dynamics. Described as a ‘software company with the vision to transform human interaction in virtual spaces,’ the enterprise founded by James Jacobs in Vancouver has gained international acclaim for its character creation tool Ziva VFX. www.zivadynamics.com

Industry Websites

www.ontariocreates.ca
www.casont.ca
www.montrealinternational.com/en
www.vfx-montreal.com
www.qftc.ca/tax-incentives/information
www.creativebc.com/programs/tax-credits/film-incentive-bc
www.vancouvereconomic.com/vfx-animation
www.sirtcentre.com
ANIMATION
BRINGING LIVE-ACTION VFX TO HOW TO TRAIN YOUR DRAGON 3 By BARBARA ROBERTSON
All photos © 2019 DreamWorks Animation LLC. TOP: Hiccup (voiced by Jay Baruchel) and Night Fury dragon Toothless in DreamWorks Animation’s How to Train Your Dragon: The Hidden World.
Studios creating a live-action blockbuster might cite more than 1,000 visual effects shots in the film, perhaps even 2,000, including some sequences in which everything – environments, effects, actors – is digital. These sequences are like animated films within the live-action film. But people rarely think in terms of the opposite – that is, they rarely think of animated features in terms of “visual effects.”

“When people who don’t know the animation industry hear I’m a visual effects supervisor, they just ask about effects,” says Dave Walvoord, Visual Effects Supervisor for DreamWorks Animation’s How to Train Your Dragon: The Hidden World, the third and final film in the popular and critically acclaimed franchise. “I have to explain that ‘visual effects’ is not just blowing things up – that in Planet of the Apes, Caesar is a visual effect – and it’s like that here, too.” In fact, a visual effects supervisor on an animated feature typically oversees everything except editorial, layout, previs, story and the animated performances, as Walvoord did for The Hidden World.

The first film of the franchise, How to Train Your Dragon, won an Annie in 2011 for Best Animated Feature, three VES Awards (for Animated Character, Animation and Effects Animation), and received an Oscar nomination for Best Animated Feature. How to Train Your Dragon 2 won an Annie for Best Animated Feature, received an Oscar nomination for Best Animated Feature, and earned four VES nominations.

Actor Jay Baruchel voices the main character, Hiccup, a young Viking who in the first film rescued and befriended an injured black dragon, a Night Fury he dubs Toothless. Hiccup bonds with Toothless, and ultimately the pair leads their communities – human and dragon – to live in harmony. In this third installment, Toothless meets a white dragon, the Light Fury. “The challenge for Hiccup is that he wants Toothless to find a mate,” says Simon Otto, head of character animation for all three Dragon films.
“But the only way Toothless can fly is with Hiccup on his back [because of an injury from the first film]. And the only way
to be with the Light Fury is when Hiccup is not there.” Dean DeBlois directed all three films.

Walvoord was Visual Effects Supervisor for Dragon 2 and Dragon 3, managing teams doing modeling, rigging, character effects, digimattes, effects, simulation, lighting, rendering, compositing, stereo, and so forth. DreamWorks Animation teams use a combination of commercial software and proprietary software tools that have evolved since the first Dragon film took flight. “We changed a lot of technology for this show,” Walvoord says. Those changes include a new character effects pipeline and the adoption of open-source USD (Universal Scene Description), which opened up their ability to handle complex scenes. The most dramatic change, though, was the studio’s new proprietary ray tracer, “Moonray,” along with a new “Moonshine” shader platform, both used for the first time on this film.

“Considering how much technology we changed, that we were using a ray tracer for the first time, and that we built an entirely new pipeline in six months to a year, the surprising thing is that this wasn’t the hardest movie I’ve made,” Walvoord says. “It all came together well and quickly. We had giant machines with 48 cores, 196 gigabytes of RAM, and a ray tracer that could take advantage of all that. We had more sophisticated algorithms and software. But we didn’t have a very good schedule. So we started to work more like a visual effects movie.”

Although artists creating 3D-animated films use many of the same tools and techniques as those creating visual effects for live-action films, their process often resembles the linear pipeline developed for 2D-animated films rather than the methods used in visual effects studios. “In visual effects, it feels like every shot is its own problem to solve – not entirely true, but more so than in animation,” Walvoord says.
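Ray tracing takes advantage of many-core machines like the ones Walvoord describes because tiles of the frame can be shaded independently of one another. The toy sketch below shows that bucket-style scheduling in miniature; the shading function is a stand-in for illustration only, not anything from Moonray or the DreamWorks pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

def shade_tile(args):
    """Stand-in for shading one horizontal bucket of the frame.
    Each bucket depends only on its own pixel coordinates, so
    buckets can run on different cores with no coordination."""
    y0, y1, width = args
    return [[(x * y) % 256 for x in range(width)] for y in range(y0, y1)]

def render(width, height, tile_rows=16, workers=8):
    """Split the frame into row buckets and shade them in parallel.
    (A production ray tracer uses native threads across all cores;
    a thread pool keeps this sketch short.)"""
    tiles = [(y0, min(y0 + tile_rows, height), width)
             for y0 in range(0, height, tile_rows)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        buckets = list(pool.map(shade_tile, tiles))  # results keep tile order
    # Stitch the shaded buckets back into one image, top to bottom.
    return [row for bucket in buckets for row in bucket]

img = render(64, 48)
print(len(img), len(img[0]))  # prints: 48 64
```

Because no bucket waits on another, doubling the core count roughly doubles throughput until memory bandwidth becomes the bottleneck, which is why a renderer "that could take advantage of all that" matters on a 48-core machine.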
“The 2D legacy sends you down a linear path with an economy of scale that visual effects don’t have: If there are two characters in a set, every shot can be approached in the same way. But we didn’t do that. We approached every shot looking
“Considering how much technology we changed, that we were using a ray tracer for the first time, and that we built an entirely new pipeline in six months to a year, the surprising thing is that this wasn’t the hardest movie I’ve made. It all came together well and quickly.” —David Walvoord, Visual Effects Supervisor
TOP: The Viking village of Berk has become a chaotic dragon utopia. BOTTOM: The villages where the previous Dragon films largely took place have roots in Nordic reality with cliffs and oceans.
ANIMATION
BRINGING LIVE-ACTION VFX TO HOW TO TRAIN YOUR DRAGON 3 By BARBARA ROBERTSON
All photos © 2019 DreamWorks Animation LLC. TOP: Hiccup (voiced by Jay Baruchel) and Night Fury dragon Toothless in DreamWorks Animation’s How To Train Your Dragon: The Hidden World.
86 • VFXVOICE.COM SPRING 2019
PG 86-92 TRAIN YOUR DRAGON.indd 86-87
Studios creating a live-action blockbuster might cite more than 1,000 visual effects shots in the film, perhaps even 2,000, including some sequences in which everything – environments, effects, actors – is digital. These sequences are like animated films within the live-action film. But people rarely think in terms of the opposite – that is, they rarely think of animated features in terms of “visual effects.” “When people who don’t know the animation industry hear I’m a visual effects supervisor, they just ask about effects,” says Dave Walvoord, Visual Effects Supervisor for DreamWorks Animation’s How to Train Your Dragon: The Hidden World, the third and final film in the popular and critically acclaimed franchise. “I have to explain that ‘visual effects’ is not just blowing things up – that in Planet of the Apes, Caesar is a visual effect – and it’s like that here, too.” In fact, a visual effects supervisor on an animated feature typically oversees everything except editorial, layout, previs, story and the animated performances, as did Walvoord for The Hidden World. The first film of the franchise, How to Train Your Dragon, won an Annie in 2011 for Best Animated Feature, three VES awards (for Animated Character, Animation and Effects Animation), and received an Oscar nomination for Best Animated Feature. How to Train Your Dragon 2 won an Annie for Best Animated Feature, received an Oscar nomination for Best Animated Feature, and four VES nominations. Actor Jay Baruchel voices the main character, Hiccup, a young Viking who in the first film rescued and befriended an injured black dragon, a Night Fury he dubs Toothless. Hiccup bonds with Toothless, and ultimately the pair leads their communities – human and dragon – to live in harmony. In this third installment, Toothless meets a white dragon, the Light Fury. “The challenge for Hiccup is that he wants Toothless to find a mate,” says Simon Otto, head of character animation for all three Dragon films. 
“But the only way Toothless can fly is with Hiccup on his back [because of an injury from the first film]. And the only way
to be with the Light Fury is when Hiccup is not there.” Dean DeBlois directed all three films.

Walvoord was Visual Effects Supervisor for Dragon 2 and Dragon 3, managing teams doing modeling, rigging, character effects, digimattes, effects, simulation, lighting, rendering, compositing, stereo, and so forth. DreamWorks Animation teams use a combination of commercial software and proprietary software tools that have evolved since the first Dragon film took flight. “We changed a lot of technology for this show,” Walvoord says. Those changes include a new character effects pipeline and the adoption of open-source USD (Universal Scene Description), which opened up their ability to handle complex scenes. The most dramatic change, though, was the studio’s new proprietary ray tracer “Moonray” along with a new “Moonshine” shader platform, both used for the first time on this film.

“Considering how much technology we changed, that we were using a ray tracer for the first time, and that we built an entirely new pipeline in six months to a year, the surprising thing is that this wasn’t the hardest movie I’ve made,” Walvoord says. “It all came together well and quickly. We had giant machines with 48 cores, 196 gigabytes of RAM, and a ray tracer that could take advantage of all that. We had more sophisticated algorithms and software. But we didn’t have a very good schedule. So we started to work more like a visual effects movie.”

Although artists creating 3D-animated films use many of the same tools and techniques as those creating visual effects for live-action films, their process often resembles the linear pipeline developed for 2D-animated films rather than methods used in visual effects studios. “In visual effects, it feels like every shot is its own problem to solve – not entirely true, but more so than in animation,” Walvoord says. 
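The layered scene-description idea that USD popularized – each department contributing its own “layer” of opinions about a shared scene, resolved by a strongest-layer-wins composition rule – can be sketched in miniature. This is not the real USD API (that lives in Pixar’s `pxr` module), and the layer names and attributes below are invented for illustration only:

```python
# Toy sketch of layered scene composition, the idea behind USD:
# each department owns a layer of opinions, and composition merges
# them with a later-(stronger)-layer-wins rule, so departments can
# work on the same scene in parallel. Names here are hypothetical.

def compose(*layers):
    """Merge per-department layers; later (stronger) layers win."""
    scene = {}
    for layer in layers:
        for prim, attrs in layer.items():
            scene.setdefault(prim, {}).update(attrs)
    return scene

modeling  = {"toothless": {"mesh": "toothless_v12.geo"}}
surfacing = {"toothless": {"shader": "nightfury_skin"}}
lighting  = {"toothless": {"key_light": "moonlight_rig"},
             "berk_set":  {"fill": "bounce_cards"}}

# Departments contribute independently; a shot composes on demand.
shot = compose(modeling, surfacing, lighting)
```

Because each layer stands alone until composition, modeling, surfacing and lighting never block one another – the property Walvoord credits with making the parallel, visual-effects-style workflow possible.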
“The 2D legacy sends you down a linear path with an economy of scale that visual effects don’t have: If there are two characters in a set, every shot can be approached in the same way. But we didn’t do that. We approached every shot looking
“Considering how much technology we changed, that we were using a ray tracer for the first time, and that we built an entirely new pipeline in six months to a year, the surprising thing is that this wasn’t the hardest movie I’ve made. It all came together well and quickly.” —David Walvoord, Visual Effects Supervisor
TOP: The Viking village of Berk has become a chaotic dragon utopia. BOTTOM: The villages where the previous Dragon films largely took place have roots in Nordic reality with cliffs and oceans.
ANIMATION
“[Cinematographer] Roger [Deakins] has a unique way of telling stories through images that may not be in line with what people expect from animated films. Animated films are often very bright. They leave little for the imagination. But Roger is not like that. He likes bold images through pushed composition, through using minimal lighting sources, and with a palette more like visual effects and live-action films than animated films. We embraced that everywhere we could.” —Pablo Valle, Lighting Supervisor

TOP: Hiccup (Jay Baruchel) and Toothless in DreamWorks Animation’s How To Train Your Dragon: The Hidden World, directed by Dean DeBlois. BOTTOM: Astrid (America Ferrera), Hiccup (Jay Baruchel) and Toothless.
at how we could make it the best it could be. We animated while we were building sets. We were lighting while we were surfacing. All the departments collaborated heavily. Our new pipeline and a common toolset made it easy to pass data back and forth and work in parallel.”

Helping the artists embrace a live-action methodology was Oscar-winning cinematographer Roger Deakins, who was a visual consultant on Dragon 2 and again on this film. “On the second Dragon, he told us to stop worrying about continuity, and we took it to heart,” Walvoord says. “Like a cinematographer on set, we just lit the shot. We realized that it’s much more interesting to work this way.”

LIGHTING
Dragon 3 is the 12th film for which Pablo Valle has been a lighting supervisor. On this film, he supervised 40 lighting artists, organizing them into four teams. Each team handled approximately 10 of the film’s 40 sequences. “This film is easily the pinnacle of what I’ve done, both in meeting our artistic goals and in our technical achievements,” he says. “Every day we wondered, ‘How are we going to do this?’ It was an absolute blast.” The lighting team leaned on Deakins to help them elevate the look of this film without deviating too far from the design language of previous films. “Roger [Deakins] has a unique way of telling stories through images that may not be in line with what people expect from animated films,” Valle says. “Animated films are often very bright. They leave little for the imagination. But Roger is not like that. He likes bold images through pushed composition, through using minimal lighting sources, and with a palette more like visual effects and live-action films than animated films. We embraced that everywhere we could.” The most spectacular lighting happens in the Hidden World, a magical dragon lair with miles of caverns made with bioluminescent mushrooms, crystals, water, waterfalls and mist. It pushed the
new ray tracer to its limit. “The Hidden World is made of all the things that are hard to do, but thankfully our lighting leads came up with clever ways to package lighting rigs into plug-and-play things that could be dropped into sequences,” Valle says. “We had a library of light types and responses and could deal with volumetrics in a collaborative way. Before, we sometimes needed eight weeks to see an image. On this film, we could sometimes turn around complex images in a day.”

The packaged lighting assets meant the lighting artists could place sources where they needed lights – as would a DP on a live-action set – rather than, as more typically on an animated feature, having the light sources predetermined in layout or by effects simulations. “For example, when the characters are camping in the middle of a forest, rather than having layout artists place sources, we could package lamps, torches and volumetric fire into an asset that the lighters would move around the set to the best places,” Valle says. “We placed the lanterns for some shots in thick fog. Also, we would light to lines of dialog in animated shots. We learned that when lighting is dependent on sources, the sources should be placed in lighting, not before, and/or we should be involved from the beginning.”
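The packaging Valle describes – bundling sources and their transform into one droppable asset – can be sketched as a tiny data structure. The class, field names and rig contents below are hypothetical, chosen only to illustrate the workflow, not DreamWorks’ actual tools:

```python
# Hypothetical sketch of a "packaged" lighting asset: a rig bundles
# its sources (lamps, torches, volumetric fire) with a transform, so
# a lighter can drop it anywhere in a set, the way a DP places
# practicals on a live-action stage.
from dataclasses import dataclass

@dataclass
class LightRig:
    name: str
    sources: list                     # e.g. ["torch", "volumetric_fire"]
    position: tuple = (0.0, 0.0, 0.0)

    def place(self, x, y, z):
        """Return a copy of the rig dropped at a new spot in the set."""
        return LightRig(self.name, list(self.sources), (x, y, z))

# A library rig is authored once...
campfire = LightRig("campfire", ["torch", "volumetric_fire"])
# ...then positioned per shot by the lighter, not predetermined in layout.
shot_rig = campfire.place(4.0, 0.0, -2.5)
```

The point of the design is that placement lives in lighting: the library asset stays untouched while each shot gets its own positioned copy.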
“Hiccup couldn’t be photoreal, but he had to look better than in Dragon 2. We came up with a way to add detail and create a believable character that reacts to light better, and still stay away from the uncanny valley.” —David Walvoord, Visual Effects Supervisor
TOP: Hiccup (Jay Baruchel) and Toothless lead the Dragon Riders. BOTTOM: The Hidden World of the Caldera is the mythical home of dragons.
WORLD BUILDERS
Dragon 3 is, in many ways, a traveling film – Hiccup’s villagers, threatened by people who want their dragons, move from place to place through the film. But the three main environments are the Isle of Berk where the previous films largely took place, New Berk to which the characters retreat, and the Hidden World. Berk and New Berk have roots in Nordic reality with cliffs, oceans and villages. The Hidden World pushes further. “In the Hidden World there are obsidian rock towers covered in bioluminescent coral,” Walvoord says. “We built 22 spires and instanced them hundreds of times over a three-mile span. On top, we have 140 million pieces of mushroom coral. It wasn’t easy
“Our algorithms are better, as is the light transport simulation and our surfacing. We even introduced peach fuzz, which we’d been cautious about. It softened the characters and helped us pull away from CG.” —David Walvoord, Visual Effects Supervisor
TOP: The female Light Fury dragon and black Toothless were challenges to light. The black Night Fury has shiny skin, but the Light Fury has an iridescence that shines when light hits her. Subtle methods were used to make Toothless shine more. BOTTOM: The Hidden World of the Caldera. OPPOSITE TOP: Astrid and Hiccup.
on the ray tracer, but because we did it with instancing, it was possible.”

A new procedural scattering system called “Sprinkles” gave surfacers tools for set dressing. “They basically painted geometry,” Walvoord says. “We used Sprinkles for ferns, trees, rocks, bark on trees, flowers – any kind of set dressing – and coupled that with the surfacers doing texturing at the same time.” In one scene, Hiccup plows through a forest, pushing aside “Sprinkled” grass, ferns and anything else in the way. Birds and bugs move aside. “If you watch animated movies closely, you often see clear paths for characters to neatly walk through,” Walvoord says. “We wanted to bring the environments into the world of the characters. Rather than having clear paths in a forest, animators started whacking.” Animators mimed the action using proxy plants. Then surfacers dressed the set and character effects artists ran the simulations.

ANIMATION
Simon Otto organized his team of 55 animators with seven supervising animators responsible for main characters and a few leads for smaller characters. “They would own the characters,” Otto says. “They knew where the libraries were, how to work with the rigs. They led the decision-making on the acting and defended their characters through the production.” Animators on Dragon 2 were the first to use the studio’s Premo Animation System. Using Premo, animators can scrub through and work in real time with multiple shots and access entire sequences. In 2018, Alex Powell, Jason Reisig, Martin Watt and Alex Wells received Academy Certificates in the Scientific and Engineering Awards for developing the system. “Of course, Premo has evolved since Dragon 2,” Otto says. “The rigs are incredibly sophisticated. We can create more complex, detailed performances. And, as we work in real time with geometry and surfaces, we can see more details. In essence, though, it’s the
“This film is easily the pinnacle of what I’ve done, both in meeting our artistic goals and in our technical achievements. Every day, we wondered, ‘How are we going to do this?’ It was an absolute blast.” —Pablo Valle, Lighting Supervisor

DRACONIAN EFFECTS

Head of Effects Lawrence Lee led a team of approximately 30 artists who created magical effects for How to Train Your Dragon: The Hidden World. “We had traditional effects – fire, water, and destruction – on this movie, but we had three big new things: the entrance to the Hidden World is a set made of water and we had many waterfalls in the New Berk environment, we gave the Light Fury dragon the ability to become invisible, and we gave a bad dragon acidic fire.”

The entrance to the Hidden World is a four-kilometer-wide hole in the ocean into which water pours, thanks to a group of effects artists led by Baptiste van Opstal. The falling water is a volume-preserving particle simulation and a fluid simulation. “We had 16 machines simulating simultaneously,” Lee says. “The result was a big level set with three billion voxels. We rendered them directly as isosurfaces all in one pass.” The entrance leads to an underground cavern filled with waterfalls. For one large shot the artists generated 600 waterfalls from 15 basic waterfalls. “We had a cool tool that let us set dress the waterfalls everywhere,” Lee says. “It had physics built into it so the water collides with the terrain, and the artists could change the speed of the water. We gave them to lighting as a gizmo. We can now give lighting artists tools for effects we traditionally do when it’s more efficient. If they want to add haze they can do it themselves. It works out really well – it feels more like a small visual effects shot than feature animation production.”

To make the Light Fury invisible, the artists turned her scales into mirrors. She blows a fireball. As she flies through it, her scales heat up and become reflective. And she emerges “invisible.” As she cools, she becomes visible again. “We had to choreograph reflections on the mirrors to have recognizable objects show on the surface,” Lee says. “We didn’t always get that out of the box because of her angle, so every shot became a hero shot.” She becomes visible again using a simulation based on cellular automata patterns developed by DreamWorks Animation CG Specialist Amaury Aubel, who also led work on the acid fire, a liquid simulation that erodes any hard surface it hits.
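A minimal cellular automaton in the spirit of that reveal effect can be sketched in a few lines. The actual DreamWorks simulation is proprietary; this toy version simply spreads visibility outward from seed cells, one neighborhood step per frame:

```python
# Toy cellular-automaton "reveal": each cell is a patch of scales
# (1 = visible, 0 = still mirrored). Per tick, a hidden patch turns
# visible if any 4-neighbor is already visible, so visibility grows
# outward from the seed points frame by frame.

def step(grid):
    """One automaton tick: hidden cells with a visible neighbor appear."""
    h, w = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            if grid[y][x]:
                continue  # already visible
            neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if any(0 <= ny < h and 0 <= nx < w and grid[ny][nx]
                   for ny, nx in neighbors):
                nxt[y][x] = 1
    return nxt

# Seed one visible patch in a 5x5 patch grid; each tick reveals the
# next ring of scales (a diamond of Manhattan radius = tick count).
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
for _ in range(2):
    grid = step(grid)
```

Driving the per-cell rule with distance to a cooling front, rather than a fixed tick, would give the organic, uneven spread seen on screen; the principle is the same.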
“We can now give lighting artists tools for effects we traditionally do when it’s more efficient. If they want to add haze they can do it themselves. It works out really well – it feels more like a small visual effects shot than feature animation production.” —Lawrence Lee, Head of Effects
“In the Hidden World there are obsidian rock towers covered in bioluminescent coral. We built 22 spires and instanced them hundreds of times over a three-mile span. On top, we have 140 million pieces of mushroom coral. It wasn’t easy on the ray tracer, but because we did it with instancing, it was possible.” —David Walvoord, Visual Effects Supervisor
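The instancing described in the quote above can be sketched simply: one master mesh is stored once, and each spire is just a lightweight transform referencing it, so hundreds of copies dress the span without duplicating heavy geometry. The figures and field names below are illustrative, not production data:

```python
# Toy sketch of geometry instancing: the renderer keeps ONE copy of
# the heavy spire mesh, and each placed spire is only a reference to
# it plus a unique per-instance transform. Memory scales with the
# instance count, not with duplicated geometry.

master_spire = list(range(10_000))  # stand-in for a heavy source mesh

# Hundreds of instances share the same mesh object; only position
# (and, in production, scale/rotation) differs per instance.
instances = [{"mesh": master_spire, "position": (i * 30.0, 0.0, 0.0)}
             for i in range(500)]

# Every instance points at the very same geometry in memory.
assert all(inst["mesh"] is master_spire for inst in instances)
```

A ray tracer exploits the same sharing: it builds one acceleration structure for the master mesh and intersects rays against transformed references, which is why the spire forest stayed tractable.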
TOP AND BOTTOM: The Hidden World of the Caldera.
same basic setup. The big change was on the lighting side.”

The tricky question for the lighting artists and shader writers was in making the characters look older and better, yet still look like they are part of the same world as before. “Hiccup couldn’t be photoreal, but he had to look better than in Dragon 2,” Walvoord says. “We came up with a way to add detail and create a believable character that reacts to light better, and still stay away from the uncanny valley.” The team changed the subsurface model they had used, changed the film LUT, and changed the shaders to have physically-based responses. “Our algorithms are better, as is the light transport simulation and our surfacing,” Walvoord says. “We even introduced peach fuzz, which we’d been cautious about. It softened the characters and helped us pull away from CG.”

The trickiest dragons to light were the black Toothless and the white Light Fury. “With a black dragon, his specular response, the shiny skin, determines how he looks,” Valle says. “Light Fury has an iridescence that shines when light hits her. Roger [Deakins] helped us carefully construct compositions when both characters were on set and with clever and subtle ways to have Toothless shine more.”

As the characters in the film move toward a new world, so did the crew creating this film. New tools made it possible to work more collaboratively in ways that resembled the way in which visual effects studios work on live-action films. “With our schedule, we didn’t have much of a choice but to collaborate,” Walvoord says. “But our new pipeline made it easy to pass data back and forth and work in parallel. That can be stressful for the artists. But for me, it was exciting. I think it’s better. It gave us the ability to have happy accidents and the flexibility to invent on the fly. We ran with it. Everything about this film was a lot more fun.”
[ VES NEWS ]
2019 VES Board of Directors Officers; Our 14th Section; New Archive Initiative
By NAOMI GOLDMAN

The VES Board of Directors officers for 2019, who comprise the VES Board Executive Committee, were elected at the January 2019 Board meeting. The officers include Mike Chambers, a renowned freelance visual effects producer who is beginning his fifth term as Chair of the VES. The 2019 Officers of the VES Board of Directors are:
• Chair: Mike Chambers
• 1st Vice Chair: Jeffrey A. Okun, VES
• 2nd Vice Chair: Lisa Cooke
• Treasurer: Brooke Lyndon-Stanford
• Secretary: Rita Cahill

The officers reflect a combination of returning Executive Committee members – Chambers, Okun and Cahill – and newly elected officers, Cooke and Lyndon-Stanford, who have served on the global board and in leadership positions with their Sections. “We are fortunate to have such esteemed leadership represented on the Executive Committee,” said Eric Roth, VES Executive Director. “Collectively, these talented professionals bring passion, diverse experience and enormous commitment to our organization. We look forward to Mike Chambers’ continued vision in taking the organization and our global membership to new heights.”

VES WELCOMES ITS 14TH SECTION – GEORGIA
As a flourishing global honor society, the Visual Effects Society owes much of its international presence to its thriving network of Sections, which lend their expertise and entrepreneurship to galvanize their regional VFX communities while advancing the reach and reputation of our organization and the industry worldwide. Our newest VES Section in Georgia (U.S.) joins those in Australia, San Francisco’s Bay Area, France, Germany, India, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington. This new Section is poised to offer exciting new opportunities to all VES members in Georgia. Congratulations and welcome!
VES ARCHIVE INITIATIVE
The Society has embarked on an exciting and wide-reaching archive project to capture, preserve and share the rich history of the visual effects community. This will be an enduring and living initiative that builds on our extensive catalog of videos and will serve VFX artists and practitioners, the entertainment industry, and the general public. Under the leadership of our new Archives Committee Chair Lisa Cooke and Co-Chair Kim Lavery, VES, the initiative will endeavor to capture the full richness of our craft – from recording our luminaries to documenting all the props and physical assets currently housed in hundreds of private collections. The goal is to shine a light on all the unique visual effects techniques that go into the intricate planning of the most difficult shots and create a speaker’s series to highlight the behind-the-scenes stories that only insiders know about. VES will showcase this treasure trove in a new online digital library. “Through this exciting project, the VES will create a legacy by documenting and showcasing the people, treasures and techniques of the visual effects industry for future generations,” said Roth.
“We are fortunate to have such esteemed leadership represented on the Executive Committee. Collectively, these talented professionals bring passion, diverse experience and enormous commitment to our organization. We look forward to Mike Chambers’ continued vision in taking the organization and our global membership to new heights.” —Eric Roth, VES Executive Director
[ VES SECTION SPOTLIGHT: VANCOUVER ]
Dynamic Growth in a Vibrant Community
By NAOMI GOLDMAN
TOP: Meeting Section members and guests. MIDDLE: Jeffrey A. Okun, VES, Susan Rowe, Steve Garrad and Shauna Bryan at Spark FX 2019. BOTTOM: Ready to enjoy a special screening of Spider-Man: Into the Spider-Verse.
The Visual Effects Society’s growing international presence owes much to its dynamic network of Sections, which galvanize their regional VFX communities while advancing the reach and reputation of the Society and the industry worldwide. Founded in 2007, the Vancouver Section – one of three Canadian Sections, along with Toronto and Montreal – is flourishing thanks to its strong leadership and collaborative visual effects community. The Vancouver Section has established itself as a hub in western Canada, as well as an active and contributing force in the global marketplace. “The Vancouver VFX community is tight knit and has a high caliber of professionals working here today,” says Shauna Bryan, former Chair, VES Vancouver Section Board of Managers. “There are a lot of people who have been here since the genesis of the VFX boom and have a strong sense of local pride in their roots and our hometown industry. And then there are those that have relocated to our beautiful, livable city. Together, that mix makes for a diverse network and support system.”

Vancouver has established itself as a truly world-leading center of VFX and animation talent and technology. It boasts one of the largest clusters of visual effects and animation studios in the world, home to more than 60 domestic and foreign-owned companies, including Sony Pictures Imageworks, ILM, Rainmaker Studios, Image Engine, Stargate Studios, Double Negative, Method Studios and Cinesite Vancouver. The industry is also well-supported through education and training programs for artists, software engineers and other technical professionals.

“Our set-up is similar to London, where everyone works in the bustling SoHo hub,” continues Bryan. “Because of our city design, we have two clusters of facilities in downtown Gastown and across the Cambie Street Bridge, and you can walk and hit most of the VFX houses. 
It makes for a community that is very social, communal and integrated, and practitioners are constantly moving from one company to another.” VES Vancouver has more than 200 members, about two-thirds working in feature film and the remainder in television. Much of its member recruitment comes through high-energy pub nights, a rich social platform to bring guests and prospective members. It also has a robust film screening series, holding events every month to foster a collegial atmosphere. Says Bryan, “We created a specific mandate to offer our members more value. Part of that was making a commitment to offer more screenings with better variety, and in downtown, accessible locations like Imageworks and DNEG.” The Section has been focused on developing more educational events, hosted forums on the history of visual effects and a career development workshop on demo reels, with an opportunity to get feedback from industry professionals on how to better structure and present the work to make an impression and stand out from the crowd.
In addition to these programs, the Board of Managers decided to focus its time and resources on two large events for its 2019 calendar. In February, the Section joined with the Spark CG Society to co-present a Diversity & Inclusion Summit at the Spark FX Conference. The “Tools for Change” summit offered practical advice on maintaining and growing a culture of diversity, and focused on issues including equal pay and equity, navigating workplace challenges and work-family balance. The Vancouver Section has a dynamic and engaged Board of Managers propelling it forward under the leadership of Co-Chairs Susan Rowe and Steve Garrad – and is enjoying a surge of new voices stepping up as volunteers for specific events or subcommittees. In summing up the locale and its unique culture, Bryan cites Vancouver’s impressive and ever-increasing concentration of companies and talent, matched by first-rate infrastructure, a collaborative environment, a business-friendly climate and an outstanding quality of life. “As with any experience, what you put in directly impacts what you get out of it. Professionally, if you want to invest in yourself, you need to be a part of an industry-collegial community where you can learn and grow and also give back. The VES is about fellowship, and it is an active way to bring together our peers to enhance their experience in this city, their social lives and professional career network. The Society is that place where ideas and relationships start to come together. At our best, we spark collaboration.”
“The Vancouver VFX community is tight-knit and has a high caliber of professionals working here today. There are a lot of people who have been here since the genesis of the VFX boom and have a strong sense of local pride in their roots and our hometown industry. And then there are those who have relocated to our beautiful, livable city. Together, that mix makes for a diverse network and support system.” —Shauna Bryan, former Chair, VES Vancouver Section Board of Managers
TOP: A full house at the Diversity & Inclusion Summit at Spark FX 2019. BOTTOM: VES members and guests enjoying Pub Night at SIGGRAPH.
SPRING 2019 VFXVOICE.COM • 95
[ VES SECTION SPOTLIGHT: VANCOUVER ]
Dynamic Growth in a Vibrant Community By NAOMI GOLDMAN
TOP: Meeting Section members and guests. MIDDLE: Jeffrey A. Okun, VES, Susan Rowe, Steve Garrad and Shauna Bryan at Spark FX 2019. BOTTOM: Ready to enjoy a special screening of Spider-Man: Into the Spider-Verse.
[ FINAL FRAME ]
Dumbo – Nearly 80 Years Ago
The original Dumbo, a story about a little circus elephant with big ears, arrived in theaters in 1941, just prior to the U.S. entry into World War II. The film’s animation cels are considered very rare. In sharp contrast to director Tim Burton’s new iteration of Dumbo, the original had a very limited budget because other Disney animated films – Pinocchio, Fantasia and Bambi – were racking up huge expenses at the time. The outbreak of World War II also had a chilling effect on the studio. To save money, the studio employed watercolor backgrounds instead of the usual oil and gouache. In addition, only a few characters appeared on screen at the same time, to economize on technical work. Supervising director Ben Sharpsteen’s orders were to keep the project simple and inexpensive, while other Disney animated films were far more complex in their design and animation techniques. Nevertheless, Dumbo is regarded as a masterpiece of its time. It won Best Animation Design at the Cannes Film Festival in 1947, as well as an Academy Award for Best Music, Scoring of a Musical Picture, in 1942. The film was added to the U.S. National Film Registry in December 2017.
Photo courtesy of Walt Disney Pictures