VFX Voice - Fall 2017 Issue


VFXVOICE.COM FALL 2017

VES 70: ALL-TIME VFX FILMS

VES 20TH ANNIVERSARY • VR ROUNDTABLE • THE FUTURE OF VFX



THE HOLLYWOOD REPORTER

congratulates

VISUAL EFFECTS SOCIETY on its

20th Anniversary

PRINT | DIGITAL | MOBILE | SOCIAL | EVENTS

HOLLYWOODREPORTER.COM


[ EXECUTIVE NOTE ]

Welcome to Our VES 20th Anniversary Special Issue! After shining a light on the artistry and innovation in the VFX community for 20 years, we are immensely proud to have launched our premiere magazine and extended our work to advance the profile and recognition of our industry. VFX Voice is a truly exciting way to embark on our next 20 years, and we thank you for being a part of the inaugural year of this endeavor. Your enthusiastic feedback is music to our ears, and we pledge to continue delivering eye-popping visuals, in-depth profiles and insightful stories that highlight the exceptional artists and innovators all across our dynamic global community.

In honor of our milestone anniversary, this issue features a special look at “The Legacy of the VES,” our updated member-created honor roll, “The VES 70: The Most Influential VFX Films of All Time,” and a compelling look at “The Future of VFX.” We cover the explosion of virtual reality across a host of industries, with stories looking at VR in museums, animation, music, medicine and cognitive rehabilitation, and new uses in Holocaust education. We look back at the original 1982 Blade Runner while eagerly awaiting its sequel. We visit “VFX/Animation Schools” and do deep dives on The Lego Ninjago Movie and the American Gods TV show. We present a rousing “Industry Roundtable” with luminaries and thought leaders from across the VR/AR world. We profile industry powerhouse ILM. And we shine a light on Georges Méliès’s Le Voyage dans la Lune, giving a nod to the seminal man-in-the-moon image that we honor every day at the VES. Add to that the latest VES News and a profile of the VES Germany Section, the latest addition to our global family.

While you’re delving into the print edition, we encourage you to spend some time on VFXVoice.com. You can access the digital version of the magazine in all its glory when and where you want, and read archived articles. But more than that, you don’t want to miss our Web exclusives. Between issues, we’re posting new and timely stories that you can only find online. So become a regular online visitor. And to get updates on new VFX Voice stories – and all of the daily VES and VFX industry news – follow us on Twitter at @VFXSociety.

So please take full advantage of all that VFX Voice has to offer. Share it with your colleagues in the industry at large and help us grow the community. And continue to send your thoughts and ideas to us. Thank you for your partnership and support in helping us promote this exciting new addition to our legacy.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director

2 • VFXVOICE.COM FALL 2017



[ CONTENTS ]

FEATURES

8 VR/AR TRENDS: VR MUSEUMS Immersive exhibits engage visitors of all ages.
16 MUSIC: VR MUSIC Evolving art form adds new listening dimension.
24 EDUCATION: VFX/VR EDUCATION Training the VFX artists of today and tomorrow.
32 ANIMATION: LEGO NINJAGO Building a film empire brick by brick.
38 TV: AMERICAN GODS When New and Old Gods battle, Cinesite VFX rules.
42 ANIMATION: VR FILMS VR’s impact on storytelling is already being felt.
48 PROFILE: DR. ALBERT “SKIP” RIZZO VR as a powerful new instrument of healing.
54 VR/AR TRENDS: USC SHOAH FOUNDATION How VR technology helps the world never forget.
58 COMPANY PROFILE: ILM Venerable studio continues to change and grow worldwide.
64 VES 20TH ANNIVERSARY: “A TRIP TO THE MOON” How Technicolor restored the 1902 VFX classic.
66 VES 20TH ANNIVERSARY: LEGACY VES leaders capture highlights of the first two decades.
74 VES 20TH ANNIVERSARY: VES 70 ALL-TIME VFX FILMS The VES adds 21 films to its “Most Influential” list.
80 VES 20TH ANNIVERSARY: THE FUTURE OF VFX Industry visionaries shed light on the path ahead.
88 INDUSTRY ROUNDTABLE Is VR the Holy Grail of new technology?
98 VFX VAULT: BLADE RUNNER 1982 Flash back with Chief Model Maker Mark Stetson, VES.
102 COMPANY PROFILE: RED CHILLIES Mumbai studio bringing Bollywood VFX to Hollywood.

DEPARTMENTS

2 EXECUTIVE NOTE
104 VES SECTION SPOTLIGHT: GERMANY
108 V-ART: LORIN WOOD
110 VES NEWS
112 FINAL FRAME: GEORGES MÉLIÈS

ON THE COVER: Avatar (2009) is one of the 21 new films recently added to the honor roll of the “VES 70: The Most Influential Visual Effects Films of All Time.” (Photo courtesy of Twentieth Century Fox. All Rights Reserved.)



FALL 2017 • VOL. 1, NO. 3

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh publisher@vfxvoice.com

EDITOR
Ed Ochs editor@vfxvoice.com

CREATIVE
Alpanian Design Group alan@alpanian.com

ADVERTISING advertising@vfxvoice.com
MEDIA media@vfxvoice.com
CIRCULATION circulation@vfxvoice.com

CONTRIBUTING WRITERS
Willie Clark, Ian Failes, Michael Goldman, Naomi Goldman, Trevor Hogg, Debra Kaufman, Chris McGowan, Ed Ochs, Paula Parisi, Barbara Robertson, Helene Seifer

PUBLICATION ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Rita Cahill, Secretary
Dennis Hoffman, Treasurer

DIRECTORS
Jeff Barnes, Andrew Bly, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Dayne Cowan, Kim Davidson, Debbie Denise, Richard Edlund, VES, Pam Hogarth, Joel Hynek, Jeff Kleiser, Neil Lim-Sang, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Barry Sandrew, Tim Sassoon, Dan Schrecker, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Susan Zwerman

ALTERNATES
Fon Davis, Charlie Iturriaga, Christian Kubsch, Andres Martinez, Daniel Rosen, Katie Stetson, Bill Villarreal

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Jeff Casper, Manager of Media & Graphics
Ben Schneider, Membership Coordinator
Colleen Kelly, Office Manager
Jennifer Cabrera, Administrative Assistant
Alex Romero, Global Administrative Coordinator
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media. VFX Voice is published quarterly by the Visual Effects Society.

Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.

Copyright © 2017 The Visual Effects Society. Printed in the U.S.A.




VR/AR TRENDS

IMMERSIVE EXHIBITS BRING HISTORY AND CULTURE TO NEW LIFE By HELENE SEIFER

TOP: “theBlu,” a VR adventure from Wevr, takes museum patrons to the depths of the ocean for virtual underwater encounters. “theBlu” made its museum debut in early 2017 at the Dubai Aquarium & Underwater Zoo, followed by an extended stint at the Natural History Museum of Los Angeles, and more locations are planned. The Natural History Museum version of “theBlu” consisted of six 9’ x 9’ pods that held one person at a time for a personal immersive experience. “VRZOO” is another Wevr VR experience. (Photos courtesy of Design I/O)



A Babylonian princess created the world’s first museum in 530 BC in what is now Iraq. This repository of Mesopotamian artifacts was neatly arranged, and each ancient treasure was accompanied by a chiseled clay cylinder with a description in three languages. Although more than 2,500 years have passed since Princess Ennigaldi preserved the past for all to see, for much of history the basic organization of museums remained the same: display some cool stuff and label it. In this age of exploding technology, however, the creative approach has kicked into hyper-drive. Welcome to the new museum experience, where systems developed for film and gaming now reign.

“How many times will you dive with a whale?” asks Jake Rowell, Director of “theBlu,” a VR adventure from Wevr that brings us to the depths of the ocean for underwater encounters without getting wet. “Most people are not scuba certified!” “TheBlu,” which started as a home application, made its museum debut in early 2017 at the Dubai Aquarium & Underwater Zoo, followed by an extended stint at the Natural History Museum of Los Angeles, and more locations are planned. Museum patrons virtually encounter a big blue whale, a jellyfish migration, and a whale fall where a deceased whale’s bones have become an undersea ecosystem. “It’s a celebration of the ocean and ocean life,” says Neville Spiteri, CEO and Co-founder of Wevr. “We worked with oceanographers and scientists to ensure a degree of accuracy.” The Natural History Museum version of “theBlu” consisted of

six 9’ x 9’ pods that held one person at a time for a personal immersive experience. “We go to great lengths to consider the audience,” continues Spiteri. “VR is a powerful medium and you can instill a high degree of empathy and can also scare people easily. We aim for the right level of awe. No cheap scares. No shark jumps out at you.” Rowell adds, “Exploration is the heart of what ‘theBlu’ is.” Meeting a full-sized whale, even virtually, is awesome. “The creature acknowledges you’re there and they’re there. The connection is very powerful.” According to Spiteri, reactive sea anemones and bioluminescent creatures are possible because, “It’s a combination of software, custom development of code and algorithms, and creating a simulation engine that determines how the fish move and swim.” Explains Rowell, “‘TheBlu’ code base lives in Unity – the rendering engine. It’s flexible and customizable and can achieve many different looks. We set a series of ‘go-to’ points in the story. Some parts are scripted, but the ocean still needs to feel alive and random.” Art is also critical, and the work of Academy Award-winning visual effects artist Andy Jones was key.

The Museum of Science, Boston has taken a deep dive into tech in order to excite visitors and inspire life-long learning. “Particle Mirror” opened this year, and inculcates an interest in physics. Kids and adults alike gambol in a snowstorm, bounce giant colorful orbs and frolic through pixie dust – and nothing is real. Created by Karl Sims, a digital media artist, computer scientist, and recipient of a MacArthur Fellowship, the wall-sized virtual mirror uses a Microsoft Xbox One Kinect depth-sensor camera to capture visitors’ motions and depth and project them into the AR environment. Sims, who previously founded software company GenArts, developed the systems for “Particle Mirror” using C, OpenGL and OpenCL, running on a Linux computer with an NVIDIA GTX 1080 graphics card.
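The setup Sims describes implies a simple per-frame loop: read the visitor’s silhouette from the depth camera, then integrate particle physics against it. The C++ sketch below illustrates that idea only; the names, types and constants are hypothetical, not Sims’s actual C/OpenCL code, and the depth camera is reduced to a boolean occupancy grid.

```cpp
#include <vector>
#include <cstddef>

// Hypothetical sketch of the "Particle Mirror" idea: particles fall under
// gravity and settle wherever the depth sensor reports a visitor's
// silhouette, the way the exhibit's "snow" collects on heads and shoulders.
struct Particle { float x, y, vy; bool settled; };

struct ParticleField {
    int width, height;
    std::vector<bool> body;           // true where the sensor sees a visitor
    std::vector<Particle> particles;

    ParticleField(int w, int h) : width(w), height(h), body(w * h, false) {}

    bool occupied(int x, int y) const {
        if (x < 0 || y < 0 || x >= width || y >= height) return false;
        return body[static_cast<std::size_t>(y) * width + x];
    }

    // One simulation step: integrate gravity; settle on the silhouette
    // or on the floor at the bottom of the screen.
    void step(float gravity, float dt) {
        for (Particle& p : particles) {
            if (p.settled) continue;
            p.vy += gravity * dt;
            float ny = p.y + p.vy * dt;
            int ix = static_cast<int>(p.x);
            if (occupied(ix, static_cast<int>(ny)) || ny >= height - 1) {
                p.y = ny >= height - 1 ? float(height - 1) : ny;
                p.vy = 0.0f;
                p.settled = true;     // "snow" piles up here
            } else {
                p.y = ny;
            }
        }
    }
};
```

A real installation would refresh `body` from the Kinect depth image every frame and hand the particle positions to the renderer; collision events could also trigger the sounds the exhibit generates.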
Participants are led through a series of nine revolving scenarios, interacting with a changing variety of dots and spots, all following the laws of physics and the properties programmed into each. Gravity causes “snow” to fall from the top of the screens – and, just like real snow, it collects on heads and shoulders, and can be scooped into snowballs. Music enhances the magic – when particles collide, different sounds are generated. “The bubbly effect makes bubbly sounds. They gurgle when pushed,” Sims explains. “The goal is to inspire kids to learn more.”

The museum also houses Sims’s “Reaction-Diffusion Media Wall,” installed in 2016. He details that the exhibit simulates “two chemicals that make emergent dynamic patterns.” Consumers at a kiosk manipulate patterns projected on 24 hi-def screens. To create the effects, Sims used consumer-gaming hardware: a graphics processor with more than 2,000 processing cores on a Linux machine.

The Museum of Science, Boston develops most things in-house with a dedicated nine- or 10-person team, including a 3D designer, a physical manipulative engineer, software developers and builders. According to VP of Exhibition Development Christine Reich, special effects excite and engage people in learning. “Neuroscience is now teaching us that emotions are the starting point for

TOP and BOTTOM: “Particle Mirror” is now on display at the Museum of Science, Boston. Participants are led through nine revolving scenarios, interacting with a changing variety of dots and spots, all following the laws of physics. In the “snow” simulation, visitors collect the falling particles, toss them around, or sweep them off the floor as they accumulate. In the “blue sparks” simulation, a “trails” effect makes the “sparks” look like they’re swimming through swirling liquid. Particles generate different musical chords when pushed. (Photos courtesy of I/O)







TOP: On Karl Sims’ “Reaction-Diffusion Media Wall” in the Museum of Science, Boston, two simulated chemicals, shown as white and dark blue, react and diffuse to generate biological-looking patterns and shapes, which are displayed on a high-resolution wall of 24 screens. A touchscreen kiosk in front of the display allows visitors to adjust parameters and create a wide range of different results. (Photos courtesy of Design I/O)
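The behavior the caption describes, two simulated chemicals reacting and diffusing into biological-looking spots and stripes, is classically modeled with Gray-Scott reaction-diffusion equations. The C++ sketch below is a generic textbook version of that model, not the media wall’s actual GPU implementation; the grid size and rate constants are illustrative.

```cpp
#include <vector>
#include <cmath>

// Generic Gray-Scott reaction-diffusion on a small wrap-around grid.
// Chemical u feeds in everywhere; v consumes u (u + 2v -> 3v) and is
// killed off at a fixed rate. With suitable feed/kill values the system
// self-organizes into emergent dynamic patterns.
struct RDGrid {
    int n;
    std::vector<double> u, v;
    RDGrid(int n_) : n(n_), u(n_ * n_, 1.0), v(n_ * n_, 0.0) {}

    // Discrete 5-point Laplacian with wrap-around edges.
    double lap(const std::vector<double>& c, int x, int y) const {
        auto at = [&](int i, int j) {
            return c[((i + n) % n) * n + (j + n) % n];
        };
        return at(x - 1, y) + at(x + 1, y) + at(x, y - 1) + at(x, y + 1)
             - 4.0 * at(x, y);
    }

    void step(double du, double dv, double feed, double kill) {
        std::vector<double> nu = u, nv = v;
        for (int x = 0; x < n; ++x)
            for (int y = 0; y < n; ++y) {
                int i = x * n + y;
                double uvv = u[i] * v[i] * v[i];   // reaction term
                nu[i] = u[i] + du * lap(u, x, y) - uvv + feed * (1.0 - u[i]);
                nv[i] = v[i] + dv * lap(v, x, y) + uvv - (feed + kill) * v[i];
            }
        u = nu; v = nv;
    }
};
```

A kiosk like the museum’s could expose `feed` and `kill` as the visitor-adjustable parameters: small changes to those two numbers produce the wide range of results the caption mentions.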

“We go to great lengths to consider the audience. VR is a powerful medium and you can instill a high degree of empathy and can also scare people easily. We aim for the right level of awe. No cheap scares. No shark jumps out at you. Exploration is the heart of what ‘theBlu’ is. The creature [a full-sized whale] acknowledges you’re there and they’re there. The connection is very powerful.” —Neville Spiteri, CEO and Co-founder, Wevr



behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.”

Equally important is referencing cultural touchstones. As Reich notes, the Museum is always trying to determine “how we can leverage pop culture to teach STEM (Science, Technology, Engineering and Mathematics).” Last year, their “Star Wars®: Where Science Meets Imagination” experience concluded its eight-year, 20-venue tour. Their “The Science Behind Pixar” exhibition elucidates the engineering and computer science behind animation technology. Seen by over 800,000 people thus far, the exhibition now travels in two versions, each with over 40 interactive components, including a simulation allowing visitors to program the grass in a movie scene to determine if blade density affects movement.

Understanding how actions affect our environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. Infrared cameras react when practical logs, wrapped in reflective material, are used to divert the path of a virtual stream to feed or starve flora and fauna. Gesture-reading Kinect cameras react to actions that the group of museum-goers take: whether interacting with animals or planting virtual seeds, every human action counts in how well the entire ecosystem works. According to Geralyn Abinader, Creative Producer at NYSCI, “When you talk about system-thinking, it’s really hard for even adults to grasp the relationships in a living system.
When an individual does an action, or there are aggregate behaviors that cause changes, sometimes the reactions are immediate, but sometimes they are over the long term. Sometimes they are nearby, sometimes on the other side of the world.”

Visitors can see that if they plant enough seeds and supply enough water, animals will appear. If food sources dry up, animals will migrate elsewhere. Although based on real scientific models about the ways water systems work, the projected worlds are fanciful, filled with imaginary critters. “We wanted children to understand one or two important concepts and not get bogged down with expectations about what they were seeing,” offers “Connected Worlds” developer/designer Theo Watson, who, with his wife Emily, founded Design I/O, the company that designed and programmed “Connected Worlds.” “The more straightforward the characters were, kids said they liked them more, but had very little to say about them. The weirder the characters, the more they had to say. That helped us because it captured their interest and attention.”

The exhibit planning included scientific collaboration, as Watson explains. “Some of the people who advised the UN on climate change were involved.” The complex responsiveness of the exhibit was a technological challenge, as Watson attests. “Almost every aspect of this was on the verge of being impossible to do.” He developed the software on openFrameworks, built in C++. “Each of the six environments




LEFT and RIGHT: In the “Star Wars: Where Science Meets Imagination” exhibition at the Museum of Science, Boston, visitors can jump to light speed in a full-size replica of the cockpit of Episode IV’s Millennium Falcon in a multimedia ride through the universe. And, they can meet a puppet of Jedi Master Yoda up close in the swamps of Dagobah from Episode V: The Empire Strikes Back. (Photo credit: Dom Miguel Photography. Copyright © 2006 Museum of Science, Boston and Lucasfilm Ltd.)

“[The VR experience] is a combination of software, custom development of code and algorithms, and creating a simulation engine that determines how the fish move and swim. ‘TheBlu’ code base lives in Unity – the rendering engine. It’s flexible and customizable and can achieve many different looks. We set a series of ‘go-to’ points in the story. Some parts are scripted, but the ocean still needs to feel alive and random.” —Jake Rowell, Director of ‘theBlu’
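The “go-to points” pattern Rowell describes, scripted story beats layered over motion that still feels alive and random, can be sketched as a small director class. This is a hypothetical C++ illustration of the pattern only (theBlu itself runs in Unity); the class, names and thresholds are invented for the example.

```cpp
#include <vector>
#include <string>
#include <random>
#include <cmath>
#include <cstddef>
#include <algorithm>

// One scripted story beat: a named point the experience steers toward.
struct GoToPoint { std::string beat; float x, y, z; };

class SceneDirector {
public:
    explicit SceneDirector(std::vector<GoToPoint> beats)
        : beats_(std::move(beats)) {}

    // Scripted track: move the viewpoint toward the current beat each
    // frame, advancing to the next beat once we arrive.
    const GoToPoint& update(float& px, float& py, float& pz, float speed) {
        const GoToPoint& t = beats_[current_];
        float dx = t.x - px, dy = t.y - py, dz = t.z - pz;
        float d = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (d < 0.5f) {
            if (current_ + 1 < beats_.size()) ++current_;
        } else {
            float s = std::min(speed, d) / d;
            px += dx * s; py += dy * s; pz += dz * s;
        }
        return beats_[current_];
    }

    // Ambient track: random heading jitter so background creatures
    // never swim the same path twice.
    float wander(float heading) {
        std::uniform_real_distribution<float> jitter(-0.2f, 0.2f);
        return heading + jitter(rng_);
    }

    std::size_t currentBeat() const { return current_; }

private:
    std::vector<GoToPoint> beats_;
    std::size_t current_ = 0;
    std::mt19937 rng_{42};
};
```

The split mirrors the quote: the beats are deterministic so every visitor meets the whale, while the ambient jitter keeps the ocean from ever replaying identically.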

is a single projector connected to a single computer. The floor is another computer, but the content comes from seven projectors, so it feels like one continuous digital surface. All the environments talk to each other over a network. When a creature flies from the desert to the forest, the forest knows that and makes sure the bird shows up on the forest screen.” Design I/O used the Xbox Kinect camera, but the interaction and tracking aspects are customized. “Most of our installations are closed source. Sometimes we’ll solve really hard problems that would be helpful to millions of people, so we’ll open source that part of it.”

An added attraction in “Connected Worlds” is the “Living Library,” a 2-foot by 4-foot physical book. Look through the book, find an interesting section, and a hi-res camera mounted above will recognize the page and prompt an interactive digital display.

The Franklin Institute in Philadelphia has embraced immersive technology as a means of engagement and learning, with VR and AR strategies. “We stay up on what the latest available technologies are,” explains Susan Poulton, the Institute’s Chief Digital Officer. “If the technology exists in the public, we want it here.” Under her guidance the museum has VR stations with HTC Vive and Oculus Rift headsets which transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR Carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. Visitors can download the app on




TOP: At the Museum of Science, Boston, a visitor changes the surface appearance of a virtual object in “The Science Behind Pixar” exhibition, which demonstrates some of the engineering and computer science behind animation technology. Seen by more than 800,000 people thus far, the exhibition now travels in two versions, each with over 40 interactive components, including a simulation allowing visitors to program elements in a movie scene. (Photo copyright © Michael Malyszko) BOTTOM: The Franklin Institute of Philadelphia has embraced immersive technology as a means of engagement and learning, with VR and AR strategies. The museum has VR stations with HTC Vive and Oculus Rift headsets that transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR Carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. The “Terracotta Warriors of the First Emperor” augmented reality experience was featured last September. (Photo courtesy of The Franklin Institute)

“Neuroscience is now teaching us that emotions are the starting point for behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.” —Christine Reich, VP of Exhibition Development, Museum of Science, Boston



their phones and access them at home, as well. To make sure that’s possible, the museum is giving away over 10,000 Google Cardboard viewers. Additionally, in a further commitment to immersive tech, two flight simulators have been upgraded to recreate the Apollo 11 landing in a 4D ride using HTC Vive. “The real goal,” according to Poulton, “is providing technology accessibility. To experience VR and all the things it can do. But the big issue is throughput. How many visitors can we put through in a day? The VR stations are limited to 24 people an hour, so that’s approximately 100 a day. It’s a challenge finding great content that works for the rhythm and flow of the museum.”

The other prong in their strategy is augmented reality experiences. For example, the “Terracotta Warriors” exhibit opened in September. Visitors can download an app to access AR on their phones. “Activate the AR and hold the phone over a warrior,” Poulton explains, “and it might show the original weaponry, created with CGI. Some will show the chemical decay process or the original coloring of the statues.

“We have to meet expectations,” she says. “This younger generation – the phone is a third eye, not a device. They need and will expect to use it, and if we’re not figuring that out now, 10 years from now, even 5 years from now, we’re in trouble.” She continues, “We are learning how to react to audience needs and do things quicker and not get so hung up on the details. We need hyper-relevant contexts – not just a hurricane exhibit, but a hurricane that’s happening right now in Japan. Museums are going to radically change.”

Even art museums are jumping aboard the technology train. Design I/O was commissioned to create two pieces for an interactive video wall at The Cleveland Museum of Art. In the reveal mode, visitors’ movements effectively “paint” an item from their collection. Zoom turns each museum-goer into a living magnifying glass.
As Watson details, “In reveal, you’re basically throwing paint strokes off your body. Zoom is the simplest idea. As you walk up to the video wall, you see an object from the collection. Your body position can zoom into it until it’s 10-20 times magnified.” In an effort to entice young adults to appreciate older art, The Montreal Museum of Fine Arts has projected an enchanted garden in their Romanticism Gallery. Leaves rustle and winds blow, creating an alluring room. They also developed a fascinating duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of a fashion icon. “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier” both feature manikins that talk with realistic articulation, but they’re not animatronic. Nathalie Bondil, Director General and Chief Curator of the museum, worked with UBU Theatre, an innovative company known for incorporating video masks into their productions. A selection of real people, including Gaultier’s top model and Gaultier himself, were tapped for the show. Bondil explains that first UBU artisans “made precise molds of their heads. Then they created a new head in plaster. After that, they recorded film of

the same characters acting.” This part of the process was not easy. “Each person must combine the perfect features for the perfect 3D head. We cannot have someone with big gestures, who can’t move their head left to right while talking, who must look in front of him.” Then UBU devised a means to realistically project the talking head onto that person’s sculpted head mold. “Magic comes from the combination of the sculpture and the recording of the same person projected on the same head. A completely unique specific system had to be created to avoid bizarre effects. In Korea, they thought we hired people to be on stage.” Stéphanie Jasmin, Artistic Co-director of UBU, adds, “There were tricks we developed. When we project an image, it’s a square. The image is flat. A face is not square – it’s 3D. And we needed to make invisible cuts [in the monologues] or there’d be a jump cut in the face!” The systems created were so precise that a team of technicians had to travel with the exhibition to set it up properly for each venue.

With their ability to engage and excite, special effects techniques in museum exhibitions are sure to expand. Wevr’s Rowell observes, “VR is a powerful form of storytelling. I’m a huge fan of film, television, video games, animation. What I get excited about with VR is that it’s a new way to communicate, to show people the story.”

TOP LEFT to RIGHT: Understanding how actions affect the environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. (Photos courtesy of I/O Design and copyright David Handschuh) BOTTOM LEFT: To entice young adults to appreciate older art, the museum has projected an enchanted garden in their Romanticism Gallery for “The Salons of the Belle Époque: Romanticism” exhibition in the Michal and Renata Hornstein Pavilion for Peace. Leaves rustle and winds blow, creating an alluring room. (Photo copyright © Marc Cramer) BOTTOM RIGHT: The Montreal Museum of Fine Arts developed a duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier.” Both feature manikins that talk with realistic articulation, but are not animatronic. (Photo copyright © MBAM/Denis Farley)

FALL 2017 VFXVOICE.COM • 15

9/14/17 3:22 PM


VR/AR TRENDS

TOP: At the Museum of Science, Boston, a visitor changes the surface appearance of a virtual object in “The Science Behind Pixar” exhibition, which demonstrates some of the engineering and computer science behind animation technology. Seen by more than 800,000 people thus far, the exhibition now exists in two traveling versions, each with over 40 interactive components, including a simulation allowing visitors to program elements in a movie scene. (Photo copyright © Michael Malyszko) BOTTOM: The Franklin Institute of Philadelphia has embraced immersive technology as a means of engagement and learning, with VR and AR strategies. The museum has VR stations with HTC Vive and Oculus Rift headsets that transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR Carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. The “Terracotta Warriors of the First Emperor” augmented reality experience was featured last September. (Photo courtesy of The Franklin Institute)

“Neuroscience is now teaching us that emotions are the starting point for behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.” —Christine Reich, VP of Exhibition Development, Museum of Science, Boston

14 • VFXVOICE.COM FALL 2017


their phones and access them at home, as well. To make sure that’s possible, the museum is giving away over 10,000 Google Cardboard viewers. Additionally, in a further commitment to immersive tech, two flight simulators have been upgraded to recreate the Apollo 11 landing in a 4D ride using HTC Vive. “The real goal,” according to Poulton, “is providing technology accessibility. To experience VR and all the things it can do. But the big issue is throughput. How many visitors can we put through in a day? The VR stations are limited to 24 people an hour, so that’s approximately 100 a day. It’s a challenge finding great content that works for the rhythm and flow of the museum.” The other prong in their strategy is augmented reality experiences. For example, the “Terracotta Warriors” exhibit opened in September. Visitors can download an app to access AR on their phones. “Activate the AR and hold the phone over a warrior,” Poulton explains, “and it might show the original weaponry, created with CGI. Some will show the chemical decay process or the original coloring of the statues.” “We have to meet expectations,” she says. “This younger generation – the phone is a third eye, not a device. They need and will expect to use it, and if we’re not figuring that out now, 10 years from now, even 5 years from now, we’re in trouble.” She continues, “We are learning how to react to audience needs and do things quicker and not get so hung up on the details. We need hyper-relevant contexts – not just a hurricane exhibit, but a hurricane that’s happening right now in Japan. Museums are going to radically change.” Even art museums are jumping aboard the technology train. Design I/O was commissioned to create two pieces for an interactive video wall at The Cleveland Museum of Art. In the reveal mode, visitors’ movements effectively “paint” an item from their collection. Zoom turns each museum-goer into a living magnifying glass.
As Watson details, “In reveal, you’re basically throwing paint strokes off your body. Zoom is the simplest idea. As you walk up to the video wall, you see an object from the collection. Your body position can zoom into it until it’s 10-20 times magnified.” In an effort to entice young adults to appreciate older art, The Montreal Museum of Fine Arts has projected an enchanted garden in their Romanticism Gallery. Leaves rustle and winds blow, creating an alluring room. They also developed a fascinating duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of a fashion icon. “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier” both feature manikins that talk with realistic articulation, but they’re not animatronic. Nathalie Bondil, Director General and Chief Curator of the museum, worked with UBU Theatre, an innovative company known for incorporating video masks into their productions. A selection of real people, including Gaultier’s top model and Gaultier himself, were tapped for the show. Bondil explains that UBU artisans first “made precise molds of their heads. Then they created a new head in plaster. After that, they recorded film of the same characters acting.” This part of the process was not easy. “Each person must combine the perfect features for the perfect 3D head. We cannot have someone with big gestures, who can’t move their head left to right while talking, who must look in front of him.” Then UBU devised a means to realistically project the talking head onto that person’s sculpted head mold. “Magic comes from the combination of the sculpture and the recording of the same person projected on the same head. A completely unique specific system had to be created to avoid bizarre effects. In Korea, they thought we hired people to be on stage.” Stéphanie Jasmin, Artistic Co-director of UBU, adds, “There were tricks we developed. When we project an image, it’s a square. The image is flat. A face is not square – it’s 3D. And we needed to make invisible cuts [in the monologues] or there’d be a jump cut in the face!” The systems created were so precise that a team of technicians had to travel with the exhibition to set it up properly for each venue. With their ability to engage and excite, special effects techniques in museum exhibitions are sure to expand. Wevr’s Rowell observes, “VR is a powerful form of storytelling. I’m a huge fan of film, television, video games, animation. What I get excited about with VR is that it’s a new way to communicate, to show people the story.”

TOP LEFT to RIGHT: Understanding how actions affect the environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. (Photos courtesy of I/O Design and copyright David Handschuh) BOTTOM LEFT: To entice young adults to appreciate older art, the museum has projected an enchanted garden in their Romanticism Gallery for “The Salons of the Belle Époque: Romanticism” exhibition in the Michal and Renata Hornstein Pavilion for Peace. Leaves rustle and winds blow, creating an alluring room. (Photo copyright © Marc Cramer) BOTTOM RIGHT: The Montreal Museum of Fine Arts developed a duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier.” Both feature manikins that talk with realistic articulation, but are not animatronic. (Photo copyright © MBAM/Denis Farley)



MUSIC

VR EXPERIENCE CATCHING ON WITH ARTISTS AND AUDIENCES By CHRIS MCGOWAN

TOP and BOTTOM: Queen: The Bohemian Rhapsody Experience is a 360-degree immersive video with spatialized sound that evokes a “dreamlike journey through Freddie Mercury’s subconscious mind” and through “surrealistic worlds.” (Images courtesy of Queen, Google Play and Enosis)


Over the last 20 years, cheap or free downloads and streaming have diminished the value and importance of commercial recorded music for many listeners. But virtual reality, with its growing technological viability and commercial promise, may change the listening equation once again, adding new levels of immersion, interaction and involvement. Live-streamed 360-degree concerts as well as the evolving new art form of interactive music VR may create significant new revenue streams for the music industry and heighten our appreciation of a favorite artist’s work. The medium may even engage us in music more deeply than back in the day when fans “immersed” in the music, cover art and liner notes of vinyl albums like Sgt. Pepper and Dark Side of the Moon. Virtual reality has come of age and titles are widely available for streaming. Most current music experiences are 360 videos that give the viewer a rotating, panoramic view of a show or setting, while full VR is interactive, with a responsive environment and real-time rendering. The medium can be experienced in various ways. The Oculus Rift and HTC Vive headsets offer higher-end VR with sufficiently powerful PCs. And Apple’s new computers this fall will be powerful enough to run VR, which will further expand the audience. Meanwhile, viewers can experience VR inexpensively via cell phones with Samsung Gear VR (powered by Oculus) and Google’s Cardboard viewer and Daydream headset. Sony also offers a PlayStation VR headset. The Consumer Technology Association (CTA) forecasts sales of 2.5 million VR headsets in 2017 (a 79% increase over the previous year). “It’s a new medium, so it will augment every message that people create and deliver in a new way, which happened with radio and TV. VR is an extension of all these efforts to communicate and will affect every possible angle of our lives,” comments Vangelis Lympouridis, the founder of Enosis V.R., based in Los Angeles. 
Commissioned by Google Play, Enosis collaborated with the rock group Queen to create an immersive music video for the classic song “Bohemian Rhapsody.” The resulting VR “experience”

was made available last year for Google Cardboard, for Android and iOS. “The app was downloaded 180,000 to 200,000 times” with “close to a million” estimated views around the world, according to Lympouridis. Prior to founding Enosis, Lympouridis oversaw VR research and productions at the MxR Studio in the USC School of Cinematic Arts. “We academics have been training with all sorts of room-scale and full-body virtual reality experiences,” he says, noting that he has nearly a decade and a half of work in the field. “Music is a good paradigm for exploring virtual reality,” he observes, “because it has a short duration and we can’t have a comfortable long session with the current state of virtual reality. And then there is a pre-established narrative and a connection with a broad audience, so we have the creative allowance to render beautiful worlds and abstractions. We can make a great synaesthetic experience.” For The Bohemian Rhapsody Experience, “Queen sent all the original [musical] stems,” recalls Lympouridis. “We had access to all the individual instruments and original Freddie Mercury voice tracks.” Enosis worked with Dolby Labs on the 360 surround sound. “It was a fascinating experience remixing ‘Bohemian Rhapsody’ in an immersive way.” The sound is spatialized as one moves one’s head, “reacting to your gaze. It is rendered in real time according to your orientation in space.” The goal was to take the audience on a “dreamlike journey through Freddie Mercury’s subconscious mind” and through a “series of surrealistic worlds.” The Bohemian Rhapsody Experience illustrates the important role that visual effects will play in VR. The experience integrated 2D and 3D animation, motion capture and digital effects, and was created with the Unity game engine and proprietary software “we wrote ourselves,” recalls Lympouridis.
“Half the project was about technical innovation and the other half about the high entertainment value we were after.” Enosis spent four months on the title, with “a team of 15 on site and another 15 contributing contractors.” “We were very pleased and exceeded Google’s expectations,” Lympouridis says. “Queen loved it, especially Brian May, who has a passion for VR. I think we created a quantum leap in understanding what is possible in mobile VR and VR in general.” Over the last two years, the Icelandic singer Björk also delved into virtual reality, participating in VR music videos for several songs (including “NotGet,” “Stonemilker,” “Quicksand” and “Mouth Mantra”) from her 2015 album Vulnicura, and launched an 18-month traveling VR exhibit, Björk Digital. “I think how VR will affect music will be more powerful than how music video changed the music experience,” comments Anthony Batt, Co-creator and Executive VP of Wevr, a VR creative studio in Venice, California. Wevr is partly headquartered in a fitting space for pushing the VR envelope: actor Dennis Hopper’s former house, an avant-garde trapezoidal structure made of corrugated steel, in which the unconventional actor kept his collection of cutting-edge modern art. Prior to music video, Batt says, “Most music experiences were non-visual unless you were going to a live concert. But after music videos entered the pop culture, artists and directors were

TOP: Björk’s immersive 360 video for “Stonemilker” from her 2015 Vulnicura album. (Image courtesy of Megaforce, Sony and Björk)

BOTTOM: Reggie Watts on a green-screen set during work on the Waves music VR experience, directed by Benjamin Dickinson, and a scene from Waves. (Images courtesy of Wevr)






TOP and BOTTOM: Future Islands’ acclaimed Old Friend, directed by Tyler Hurd. (Image courtesy of Wevr)

combining their efforts and creating stories that created narratives for culture. I think that was profound and had its effect on pop culture and society. I think VR is going to extend it because parts of it are interactive and [viewers] will be fully immersed into narratives with the music. We have to assume that will have a big effect on pop culture and art. I’m very bullish on all that’s happening and its being a real win for both the artists and their audiences.” So far, Wevr has created such VR experiences as Future Islands’ exuberant and much acclaimed Old Friend, Run the Jewels’ Crown and Reggie Watts’s Waves. Outside of music, Wevr has worked on




TOP: Coldplay in a 360 concert. (Image courtesy of Vrtify) BOTTOM: Live Nation is streaming live VR concerts with NextVR. (Image courtesy of Live Nation)


VR experiences with the likes of Jon Favreau (Gnomes and Goblins) and Deepak Chopra (Find Your True Self). The company runs Transport, an independent network for VR distribution, which charges $8 to subscribe for 360 video (for mobile) or $25 for both mobile and “room-scale” VR. “There are roughly 25 pieces there now and it’s growing.” Batt feels that both creators and audiences have to get used to the new medium. “A lot of it is just sit back and get your hair blown back, but some of it will have interactivity to it. The art form itself is new to both the creator and the observer. There will be a learning curve for everybody. Movies went through the same thing when they first came out. People had to learn what they were.” Wevr has created its VR music titles without industry support. “We’ve done all these ourselves without any record labels,” adds Batt. “The artists themselves are keen to do it, they’re interested in making their audiences happy and sharing. I think the labels are still trying to figure everything out.” “There are definitely some significant things happening” in music VR, says Tom Szirtes, Creative Technologist at Mbryonic, a digital design studio in London. Szirtes has been working with Sony Music on “how we can take existing assets and use them in a virtual environment.” He continues, “Various apps are trying to create virtual worlds where you can interact and explore music. Various artists have created 360 videos and some fairly big deals have been struck in 360 streaming.” One of those deals involves Live Nation (the global entertainment and concert booking giant), Citibank and NextVR, a production company based in Newport Beach, California. The

three companies will produce up to 10 live VR concerts with major artists. Viewers will have front-row seats and multiple other vantage points, and be able to visit the artists backstage. Live Nation streamed its first VR concert with NextVR last December with Thievery Corporation. The streams are viewable through Daydream and Samsung Gear VR headsets, with a NextVR app. Vimeo added support for 360-degree videos to its streaming video platform in 2017, enabling creators to upload, share and sell their immersive videos. It joined YouTube and Facebook in offering support for 360, which can be viewed with headsets (for immersion) or without them (for a limited experience on your computer or tablet). “I can see it working,” Szirtes says about 360 concert videos. “Your audience might not be in the same city, or it might be people who don’t fancy going to concerts anymore or paying $70 or $100. It’s the lowest hanging fruit of the VR music tree.” Also, he speculates, “why not have gigs on the moon or in a volcano? You can integrate different ways for the performer to be present in a digital venue.” In addition, multiple users, such as a group of friends, could attend a VR concert together. Szirtes comments, “I think social is going to be a really big thing and that’s something we’ll build into our product. At the moment, VR is seen as a solitary experience, but the real power of VR is to bring people together. When you think about music you think of it as a social activity. The next wave of VR is going to be social and music will play a big part in that.” Universal Music also entered the field in October of last year with a live-streamed 360, 3D performance by Avenged Sevenfold, presented through Universal’s VRTGO platform and powered by VRLive, a Los Angeles-based VR broadcast network. Queen and Adam Lambert released VR The Champions in June on VRTGO. 
Vantage.tv, based in Los Angeles, produces live 360 streams of major music festivals, and produced a ticketed 360 performance by Eric Church earlier this year. Boiler Room has begun to stream live DJ sets in VR through Daydream. It partnered with Google on the 15-minute release VR Dancefloors: Techno in Berlin. Boiler Room, based in London, is a global online music broadcasting platform that streams live music sessions around the world. About DJ VR, Szirtes asks, “Reactive environments can move and change and pulsate with the music. What will a DJ do in virtuality?” Vrtify has amassed a large music VR library. The Palo Alto-based company is transforming existing music videos into 360 experiences and shooting 360 concert videos with major artists. Vrtify 360 content includes Coldplay, Sting, Florence and the Machine, Twenty One Pilots and others, with titles available through Spotify, Deezer, Pandora and Soundcloud. Vrtify has more than 5,000 hours of concerts, interviews and music videos in virtual and mixed reality, according to the company, which pays 70% of its income to the artists or music rights holders. Challenges for the new medium include the amount of footage that must be shot. “Imagine you’re doing a 2D animation. Imagine the same duration for VR, with the sides, backs, etc. It’s at least four to six times more content,” notes Lympouridis. In addition, unique software “designed [with] VR production in mind” is needed.

TOP: Duran Duran at Lollapalooza in a 360 concert. (Image courtesy of Vrtify) BOTTOM: Lali Esposito in a 360 concert. (Image courtesy of Vrtify)



MUSIC

TOP: Coldplay in a 360 concert. (Image courtesy of Vrtify) BOTTOM: Live Nation is streaming live VR concerts with NextVR. (Image courtesy of Live Nation)

20 • VFXVOICE.COM FALL 2017

PG 16-23 VR MUSIC.indd 21

VR experiences with the likes of Jon Favreau (Gnomes and Goblins) and Deepak Chopra (Find Your True Self). The company runs Transport, an independent network for VR distribution, which charges $8 to subscribe for 360 video (for mobile) or $25 for both mobile and “room-scale” VR. “There are roughly 25 pieces there now and it’s growing.” Batt feels that both creators and audiences have to get used to the new medium. “A lot of it is just sit back and get your hair blown back, but some of it will have interactivity to it. The art form itself is new to both the creator and the observer. There will be a learning curve for everybody. Movies went through the same thing when they first came out. People had to learn what they were.” Wevr has created its VR music titles without industry support. “We’ve done all these ourselves without any record labels,” adds Batt. “The artists themselves are keen to do it, they’re interested in making their audiences happy and sharing. I think the labels are still trying to figure everything out.” “There are definitely some significant things happening” in music VR, says Tom Szirtes, Creative Technologist at Mbryonic, a digital design studio in London. Szirtes has been working with Sony Music on “how we can take existing assets and use them in a virtual environment.” He continues, “Various apps are trying to create virtual worlds where you can interact and explore music. Various artists have created 360 videos and some fairly big deals have been struck in 360 streaming.” One of those deals involves Live Nation (the global entertainment and concert booking giant), Citibank and NextVR, a production company based in Newport Beach, California. The

three companies will produce up to 10 live VR concerts with major artists. Viewers will have front-row seats and multiple other vantage points, and be able to visit the artists backstage. Live Nation streamed its first VR concert with NextVR last December with Thievery Corporation. The streams are viewable through Daydream and Samsung Gear VR headsets, with a NextVR app. Vimeo added support for 360-degree videos to its streaming video platform in 2017, enabling creators to upload, share and sell their immersive videos. It joined YouTube and Facebook in offering support for 360, which can be viewed with headsets (for immersion) or without them (for a limited experience on your computer or tablet). “I can see it working,” Szirtes says about 360 concert videos. “Your audience might not be in the same city, or it might be people who don’t fancy going to concerts anymore or paying $70 or $100. It’s the lowest hanging fruit of the VR music tree.” Also, he speculates, “why not have gigs on the moon or in a volcano? You can integrate different ways for the performer to be present in a digital venue.” In addition, multiple users, such as a group of friends, could attend a VR concert together. Szirtes comments, “I think social is going to be a really big thing and that’s something we’ll build into our product. At the moment, VR is seen as a solitary experience, but the real power of VR is to bring people together. When you think about music you think of it as a social activity. The next wave of VR is going to be social and music will play a big part in that.” Universal Music also entered the field in October of last year with a live-streamed 360, 3D performance by Avenged Sevenfold, presented through Universal’s VRTGO platform and powered by VRLive, a Los Angeles-based VR broadcast network. Queen and Adam Lambert released VR The Champions in June on VRTGO. 
Vantage.tv, based in Los Angeles, produces live 360 streams of major music festivals, and produced a ticketed 360 performance by Eric Church earlier this year. Boiler Room has begun to stream live DJ sets in VR through Daydream. It partnered with Google on the 15-minute release VR Dancefloors: Techno in Berlin. Boiler Room, based in London, is a global online music broadcasting platform that streams live music sessions around the world. About DJ VR, Szirtes asks, “Reactive environments can move and change and pulsate with the music. What will a DJ do in virtuality?” Vrtify has amassed a large music VR library. The Palo Alto-based company is transforming existing music videos into 360 experiences and shooting 360 concert videos with major artists. Vrtify 360 content includes Coldplay, Sting, Florence and the Machine, Twenty One Pilots and others, with titles available through Spotify, Deezer, Pandora and Soundcloud. Vrtify has more than 5,000 hours of concerts, interviews and music videos in virtual and mixed reality, according to the company, which pays 70% of its income to the artists or music rights holders. Challenges for the new medium include the amount of footage that must be shot. “Imagine you’re doing a 2D animation. Imagine the same duration for VR, with the sides, backs, etc. It’s at least four to six times more content,” notes Lympouridis. In addition, “unique software designed for VR production in mind” is needed.

TOP: Duran Duran at Lollapalooza in a 360 concert. (Image courtesy of Vrtify) BOTTOM: Lali Esposito in a 360 concert. (Image courtesy of Vrtify)

FALL 2017 VFXVOICE.COM • 21



MUSIC

TOP: The Oculus Rift headset BOTTOM LEFT: The Samsung Gear VR headset BOTTOM RIGHT: The HTC Vive headset



In terms of viewers, Lympouridis and Batt both see a need for lighter, more comfortable headsets to help accelerate the VR industry. Batt adds, “As the headsets become more comfortable and easy to use, you could find yourself kicking back and putting on the headset as a way to listen to music.” The VR music gold rush has not yet begun, but it’s on the horizon. About VR, Batt predicts, “Music and the way that you watch it and be a part of it will change forever. It won’t be for everybody, but it’ll certainly be an awesome way to celebrate music.”







EDUCATION

VFX/ANIMATION SCHOOLS: ANSWERING THE CALL By DEBRA KAUFMAN

TOP: Gnomon in Hollywood specializes in computer graphics education for careers in the entertainment industry and offers both a BFA and Vocational Certificate in Digital Production. (Photo credit: Phil Holland). BOTTOM: While the BFA degree at Gnomon is a film-centric generalist production art program, the vocational program allows students to specialize in one of five tracks: 3D Generalist, Character and Creature Animation, Visual Effects Animation, Modeling and Texturing, and Games. Pictured is a Digital Sculpture Lab. (Photo credit: Brad Buckman. Courtesy of Gnomon)

Many young people dream of working in the visual effects industry. Learning the unique combination of artistry and technology needed to become hirable in this demanding, fast-paced and ever-changing field is a challenge, but numerous universities, schools and diploma programs teach the software and hardware and offer real-world experience of working through a VFX pipeline, with the advice of talented mentors and teachers from the industry. VFX Voice spoke with a sampling of top VFX programs to find out more about what they teach and how they help students get a leg up into their first jobs in VFX. All the schools listed below have graduates working in the top VFX houses in the world, from ILM, Framestore, Walt Disney Animation and Weta to DreamWorks, Digital Domain, MPC, Method Studios, Sony Pictures Imageworks and Zoic Studios, among many others. Here they are, with a sidebar on numerous other top programs. This is not a complete listing of VFX schools.

NEXTGEN SKILLS ACADEMY, LONDON AND A VARIETY OF LOCATIONS IN ENGLAND; WWW.NEXTGENSKILLSACADEMY.COM

NextGen Skills Academy is not a traditional university, but rather an academy consisting of member colleges. Founded in 2014 through a co-investment from the U.K. government and the games, animation and VFX industries, NextGen Skills was created to develop a pre-university qualification (a so-called Extended Diploma) that emphasizes learning through doing. On completion of the Extended Diploma, students can go on to a three-year university degree course, take concentrated higher education courses in a specialist area like VR, or move into a VFX apprenticeship. Starting in September 2015, the Academy began expanding throughout England via partnerships with vetted Further Education Colleges; by this September, there will be nine affiliated colleges, with two more slated for September 2018. The first 106 NextGen students have just made their final submissions; a final-year student showcase in London drew attendees from Framestore, Double Negative, ILM, MPC, Microsoft, Sony Interactive Entertainment and others. Since this is the first graduating class, Academy VFX and Animation Partnership Manager Phil Attfield doesn’t have data on job placement. “We’re hoping to fill many of the apprenticeship places that London VFX companies are planning to recruit for the summer,” he says, estimating there will be 15 apprenticeships, a 50% jump from last year. The next academic year is expected to have a class of between 310 and 340 students.

DIGITAL ANIMATION & VISUAL EFFECTS (DAVE) SCHOOL, ORLANDO, FLORIDA; WWW.DAVESCHOOL.COM

The DAVE School offers a diploma and a BA degree in Visual Effects Production and a diploma in Game Production, as well as online education for BA degrees in Motion Graphics and Production Programming. Academic Director David Sushil reports that the DAVE School, which was founded in 2000, has
offered a Visual Effects Production degree since its inception; the Game Production degree was introduced in 2013, and the online degrees have been in place for a year. DAVE typically has between 150 and 180 students taking classes at any given time. To keep up with the needs of the industry, the DAVE School holds an annual industry advisory board meeting where experts describe their needs as employers and the trends they see emerging in the industry. “The curriculum is adapted accordingly,” says Sushil. For example, the DAVE School just introduced a concentration track for Mixed Reality Development and now supports more physically-based rendering, real-time rendering and previsualization tools. Students train for 12 months, “hyper-focused on developing employable skills” via a “production style” of education modeled on how they will work in the industry. Career Services Director Norman Justicia says his department teaches students networking, resume building, interview skills, portfolio presentation and more. It also acts as a liaison between students and employers, and hosts Artists & Employers events for tech talks, presentations, game tournaments and other fun activities.

SHERIDAN COLLEGE, ONTARIO, CANADA; WWW.SHERIDANCOLLEGE.CA

Sheridan College opened its Computer Animation Department in 1981, 14 years before the release of Toy Story, and now offers three 8-to-10 week post-graduate certificate programs: Computer Animation, Visual Effects and, most recently, Digital Creatures. Noel Hooper, coordinator of the computer animation program, reports that, with September and January intakes, the three programs enroll about 90 students every year. He points out that the Digital Creatures program replaced a former Digital Character program based on performance animation, in response to industry trends. “This was a way for us to differentiate our program,” he says. “It’s more of a technical directors’ program for designing, developing and rigging creatures.” Sheridan is also active with the Toronto Section of the VES and holds regular meetings with industry leaders. As part of their training, Sheridan students perform every task of creating a shot, from learning how to shoot with a camera to all the 3D modeling, rendering and dynamics. “They have full ownership of the shot from beginning to end,” says Hooper. “It’s very effective, but it’s an enormous amount of information to get across.” To move students into the VFX industry, says Hooper, Sheridan brings in VFX professionals and studios to give presentations. “That’s where they start networking,” he says. Sheridan also has two “industry days” where students can meet representatives from companies and studios.

TOP: Song of a Toad, created by Animationsinstitut’s alumni Kariem Saleh (director) and Alexandra Stautmeister (producer), was named “Best in Show” by SIGGRAPH 2017 in July. The Film Academy Baden-Württemberg in Ludwigsburg, Germany, has been home to the Institute of Animation, Visual Effects and Digital Postproduction, one of the world’s leading institutions in animation and interactive media, since 2002. (Photo courtesy of The Film Academy Baden-Württemberg) BOTTOM: The Digital Animation & Visual Effects (DAVE) School in Orlando offers intensive career training for jobs in film, TV and gaming. DAVE’s one-year program includes hard-surface and organic modeling, character animation, stereoscopic 3D effects compositing and more. (Image courtesy of the DAVE School)

SAVANNAH COLLEGE OF ART & DESIGN (SCAD), SAVANNAH AND ATLANTA, GEORGIA; HONG KONG; LACOSTE, FRANCE; WWW.SCAD.EDU

Savannah College of Art & Design offers undergraduate and graduate programs in Visual Effects, with BFA, MFA and MA degrees; the College’s School of Digital Media also offers BFA, MA and MFA degrees in Animation, Motion Media,
Game Development and Interactive Design. The Animation Department’s focus is on creating 2D and 3D animation, characters, movement and storytelling. The Visual Effects Department, founded in 2003, teaches lighting, compositing, texturing/shading, rendering, FX, scripting and pipeline skills. SCAD offers Visual Effects classes at its campuses in Savannah, Atlanta and Hong Kong. The total number of undergraduates and graduates at all campuses is approximately 280. To keep up with industry trends, SCAD has embraced AR/VR technologies and developed projects with Oculus, Samsung, Google, Unreal, Unity and others. SCAD spokesman Ramsay Horn reports that the Visual Effects Department’s close relationship with the visual effects industry means that it is “able to respond quickly to changes in the industry,” adding new courses to reflect “ever-changing technology and innovation.” SCAD also offers a Collaborative Learning Center for mentored projects, and the College has VFX industry relationships with Pixar, DreamWorks, ILM, The Mill, ESRI and NASA’s Goddard Space Flight Center. SCAD’s Career and Alumni Services division and alumni network help students find placement; alumni work at numerous facilities including Pixar, Shade VFX and Blur Studio.

VANCOUVER FILM SCHOOL, BRITISH COLUMBIA, CANADA; WWW.VFS.EDU

TOP: Located on the backlot of Universal Studios Florida, the DAVE School’s facility includes three labs, a shooting stage with a massive 65’ x 25’ green screen, and a state-of-the-art motion-capture system. The school recently introduced a concentration track for Mixed Reality Development and now supports more physically-based rendering, real-time rendering and previsualization tools. (Image courtesy of the DAVE School) BOTTOM: The New York School of Visual Arts (SVA) offers undergraduate and graduate programs in Animation, Computer Animation, Film and Visual Effects. The programs in CGI were established over 25 years ago. (Image courtesy of the New York School of Visual Arts)

At the Vancouver Film School (VFS), the Visual Effects Department offers a one-year diploma in 13 different programs, with 183 students currently studying 3D, 78 studying classical animation and 62 studying concept art. Vanessa Jacobsen, head of the classical animation, 3D, and concept art & design programs, notes that, with three starts a year, the number of students adds up quickly. VFS has close ties with the industry, which allows the programs to quickly adapt to changes in technology, hardware and software. Jacobsen adds that the focus is “more about thinking critically and problem solving,” a skill that studio heads and recruiters prize. VFS also teaches workshops in specific software programs, such as a recent 36-hour, four-week workshop on Side FX’s Houdini. VFS also partnered with performance-capture and animation studio Mimic to open the largest performance motion-capture studio in Canada. The campus-based facility gave first priority for employment opportunities to VFS alumni, and Mimic mentors students on projects involving performance capture. Jacobsen adds that VFS also collaborates with other schools, including the British Institute of Technology, which allows VFS students to get a degree, while BIT students can take classes at VFS. Strategies for getting VFS students into the workforce focus on networking: all the staff work in the industry; 3D animation students can attend a get-together where they meet industry people; and industry liaisons present workshops and guest speakers.


NEW YORK SCHOOL OF VISUAL ARTS, NEW YORK CITY; WWW.SVACOMPUTERART.COM

The New York School of Visual Arts (SVA) offers undergraduate and graduate programs in Animation, Computer Animation, Film and Visual Effects; the programs in CGI were established over
25 years ago. Enrollment in the SVA Computer Art, Animation and Visual Effects department is limited to 100 students per year, says John McIntosh, Department Chair of the BFA program. The faculty is made up of working professionals, and the curriculum and facilities are constantly updated to reflect the latest tools available and the changing needs of studio productions. One of the school’s advantages, says McIntosh, is the proximity of such world-class VFX studios as Framestore, The Mill, Method, The Molecule, Psyop, MPC, Aardman Nathan Love and Blue Sky Studios, “creating exciting opportunities for activities and events often sponsored by SVA and the Visual Effects Society.” “The greatest challenge in VFX education is in managing creative collaborations and growingly complex teams of artists,” says McIntosh. As Maya and Nuke have gained ubiquitous acceptance, he adds, the industry and education are using the same pipeline tools. “We can provide a better educational experience and the industry can hire more talented artists.” Via industry partners, SVA students are placed in internships, an experience that seasons their education and makes them more hirable.

TOP LEFT: One of the advantages of the New York School of Visual Arts is its proximity to world-class VFX studios such as Framestore, The Mill, Method, The Molecule, Psyop, MPC, Aardman Nathan Love and Blue Sky Studios, creating opportunities for activities and events often sponsored by the school and the VES. (Image courtesy of the New York School of Visual Arts) TOP RIGHT: NextGen Skills Academy is not a traditional university, but rather an academy consisting of member colleges. Pictured is a classroom on the Nescot campus in Surrey, currently one of four member college campuses. NextGen plans to have nine campuses on board by the end of this year. (Photo courtesy of NextGen Skills Academy) BOTTOM: Savannah College of Art & Design in Atlanta and Savannah, Georgia, and Hong Kong offers undergraduate and graduate programs in Visual Effects, with BFA, MFA and MA degrees. The College’s School of Digital Media also offers BFA, MA and MFA degrees in Animation, Motion Media, Game Development and Interactive Design. (Photo credit: Raftermen Photography. Courtesy of Savannah College of Art and Design.)

GNOMON, HOLLYWOOD, CALIFORNIA; WWW.GNOMON.EDU

Founded in 1997, Gnomon offers both a BFA and a Vocational Certificate in Digital Production. According to Max Dayan, Director of Education for the vocational program, the school has offered the certificate curriculum for 20 years, whereas the BFA is new this year, “the culmination of many years of planning and hard work.” While the BFA degree is a film-centric generalist production art program, the vocational program allows students to specialize in one of five tracks: 3D Generalist, Character and Creature Animation, Visual Effects Animation, Modeling and Texturing, and Games. The school’s students represent a range of levels, from beginners to industry professionals looking to broaden their skill set, with an average of 500 students attending the school per term. According to Dayan, Gnomon specializes in computer graphics education for careers in the entertainment industry, with instructors who bring real-world experience into the classroom. “One challenge in production art education is keeping up with the rate that techniques and tools evolve in our industry,” Dayan says. “We must evaluate and develop relevant curriculum on an almost-constant basis.” Gnomon also has a Placement and Alumni Relations team dedicated to finding “viable and aligned employment” for its graduates, while the Education and Placement departments work in tandem to prepare students for externships, full-time work and freelance opportunities.

OTHER VFX SCHOOLS

Academy of Art University, San Francisco. The Academy of Art University has a Department of Animation and Visual Effects where students can choose VFX, 3D modeling, 3D animation, 2D animation and stop motion, or storyboarding as a primary area of emphasis. Among its offerings is the largest green screen in Northern California. An online program offers AA, BFA, MA and MFA degrees. Alumni have been hired at ILM, DreamWorks, Walt Disney Animation Studios, Pixar and other studios and facilities. www.academyart.edu

TOP: Savannah College has embraced AR/VR technologies, and developed projects with Oculus, Samsung, Google, Unreal, Unity and others. The Visual Effects Department has a close relationship with the visual effects industry, and its alumni work at some of the top companies. (Photo credit: Chia Chong. Courtesy of Savannah College of Art and Design.) BOTTOM: Sheridan College in Ontario, Canada, opened its Computer Animation Department in 1981, 14 years before the release of Toy Story, and now offers three eight-to-10 week post-graduate certificate programs in Computer Animation, Visual Effects and now Digital Creatures. Sheridan is active with the Toronto Section of the VES and holds regular meetings with industry leaders. (Image courtesy of Sheridan College)

Carnegie Mellon University’s Entertainment Technology Center, Pittsburgh, Pennsylvania, with campuses in Silicon Valley and Qatar. Carnegie Mellon’s Entertainment Technology Center is a professional graduate program for interactive entertainment. Founded in 1998 by computer science professor Randy Pausch and drama professor Don Marinelli, ETC, as an interdisciplinary research center, offers a Master of Entertainment Technology (MET) degree, conferred jointly by the School of Computer Science and the College of Fine Arts. All ETC students start with an “Immersion” curriculum, held in Pittsburgh, in the program’s first semester. www.etc.cmu.edu

Filmakademie Baden-Württemberg, Ludwigsburg, Germany. Since it was founded in 1991, Film Academy Baden-Württemberg has become one of the world’s leading film schools. Students are taught and mentored by more than 300 media/film industry
experts. Around 250 films covering a range of genres are created by teams of students each year. The Film Academy is also home to the Institute of Animation, Visual Effects and Digital Postproduction, one of the world’s leading institutions in animation and interactive media. Student graduation films students regularly win prizes at the industry’s major festivals, including SIGGRAPH and the VES Awards. www.filmakademie.de/en/; www.animationsinstitut.de Gobelins, L’École de L’Image, Paris, France. A global leader in the fields of digital communication, interactive design and entertainment for almost 50 years, Gobelins offers courses in Photography, Animated Filmmaking, 3D Animation, Motion Design and Video Gaming. Animated Filmmaking (4 years fulltime) helps students master both traditional and digital animation techniques (2D and 3D). 3D Animation (1 year full-time) is an advanced program in 3D Character Animation featuring international animation and videogame experts, production of an individual demo reel and 3-month internship. Video Gaming (1 year Specialized Master’s degree) prepares engineers, graphic designers and IT specialists for game design and creation. Gobelins has a strong “international dimension” with global partnerships. Key courses taught entirely in English. www.gobelins-school.com DigiPen Institute of Technology, Redmond, Washington. DigiPen Institute offers a four-year BFA program in Digital Art & Animation (the school also offers a BS in Computer Science & Game Design). 
The BFA program emphasizes foundational skills, including drawing and art concepts, as these skills “remain relevant regardless of the technology or medium.” BFA graduates have become character modelers, prop and environment modelers, texture artists, 3D lighting and camera designers, character riggers, character animators, effects animators, storyboard artists, cinematic animators and conceptual illustrators. www.digipen.edu

Purdue University, West Lafayette, Indiana. Purdue offers degrees in Animation, Visual Effects Compositing, and Effects Technical Direction as part of the university’s Polytechnic Institute, one of its 10 academic colleges. The major in Animation focuses on 3D modeling, texturing, lighting, rendering, character rigging and motion, using Maya. The Visual Effects Compositing major gives students experience in creating effects for live-action and computer-generated integration, including virtual environments, merging 3D models with live-action sets and layering video and photo elements. www.purdue.edu

TOP LEFT: VFS partnered with performance-capture and animation studio Mimic to open the largest performance motion-capture studio in Canada, where Mimic mentors students on projects involving performance capture. Pictured is the Mimic MoCap room with body actor Henry Shot. (Image courtesy of the Vancouver Film School) TOP MIDDLE: The Visual Effects Department of the Vancouver Film School (VFS), located in British Columbia, Canada, offers one-year diploma programs in 13 different disciplines including 3D, classical animation and concept art. VFS also teaches workshops in specific software programs, such as a recent 36-hour, four-week workshop on Side FX’s Houdini. Pictured is a poster for the 3D program. (Image courtesy of the Vancouver Film School)


Ringling College of Art and Design, Sarasota, Florida. Ringling College of Art and Design, which describes itself as a “community of artists and designers,” offers degrees in the Business of Art & Design, Computer Animation, Creative Writing, Film, Fine Arts, Game Art, Graphic Design, Illustration, Interior Design, Motion Design, Photography & Imaging and Visual Studies. With regard to industry relations, for over a decade Pixar has visited the campus to do presentations and review student portfolios. Ringling College alumni now work at Pixar, Disney, Electronic Arts, Hasbro and many other top facilities and studios. www.ringling.edu

The University of the Arts, Philadelphia, Pennsylvania. The University of the Arts’ School of Film offers degrees in Animation, Film & Animation, and Game Art. The Animation program focuses on drawn animation, CGI 3D and photography-based stop motion for feature films, commercials, TV shows and independent shorts, among other media. Students can collaborate with musicians, dancers, actors and other artists, and have opportunities to study abroad and partake in international festivals and workshops. Alumni work in the industry as production artists, storyboard artists and directors in film and animation studios and on TV series. www.uarts.edu

TOP RIGHT: Ringling College Motion Design Department Head Ed Cheetham teaching a class. Cheetham manages the curriculum for concept, design and production of 2D and 3D motion graphics and animation. (Photo courtesy of Ringling College)



ANIMATION

MARTIAL (BRICK) ARTS: THE MAKING OF THE LEGO NINJAGO MOVIE By IAN FAILES

When Animal Logic first embarked on the computer-animated The LEGO Movie, released in 2014, it wasn’t clear how an entire world and its characters could be made in CG from the famous plastic bricks. But the studio invested heavily in a dedicated pipeline and new tools and techniques to pull it off, and then took things even further for 2017’s The LEGO Batman Movie. Now with The LEGO Ninjago Movie, inspired by the line of martial arts toys from LEGO, Animal Logic has had to once again re-think its approach to animating bricks and take on several fresh challenges, including animating kung fu action, rendering natural environments and even – spoiler alert – crafting a photorealistic cat.

NINJA STYLE, IN LEGO FORM

TOP: Garmadon (Justin Theroux) and Master Wu (Jackie Chan) battle it out among one of the natural environments made by Animal Logic.

All images copyright © 2017 Warner Bros. Pictures. All rights reserved.


Early on in production, LEGO Ninjago director Charlie Bean tasked Animal Logic with imbuing this new adventure with a particular style that combined the genres of kung fu, monster movies and the films of director John Hughes. On top of that, LEGO Ninjago had to feel – just as in the previous films – like it was made by hand and with a stop-motion feel reminiscent of ‘brick films’. Animal Logic had established that look and feel in its approach to animation, for example, by not rendering motion blur on the minifigures (minifigs) and by adhering to a number of clear rules; most importantly, that almost everything was made from ‘legal’ LEGO bricks.

Indeed, Animal Logic’s pipeline for these LEGO films is built entirely around the brick. It begins with the use of LEGO Digital Designer (LDD), a product anyone can use to construct LEGO forms from legal LEGO bricks. These are then brought into Autodesk’s Maya for further modeling and surfacing. Rigging, animation and layout are done in Softimage XSI, and effects are made possible in Side Effects Software’s Houdini. Shading and rendering is handled in Animal Logic’s proprietary path tracer Glimpse, developed during the original LEGO Movie. Compositing is done in The Foundry’s NUKE.

NEW LOCATIONS, NEW APPROACHES

Many elements of LEGO Ninjago still made use of this brick pipeline – from the film’s buildings and city to the Mechs within, which the main characters battle. But the film also introduced a natural environment full of vegetation, water, sand, rocks and soil that Animal Logic would have to create in CG, albeit at the same scale as a minifig. “That was the director’s dream and goal from the beginning,” notes CG Supervisor Greg Jowle. “The film had to feel akin to a child playing in the backyard at a macro level with their little LEGO pieces. We aimed at everything being miniaturized.” This included vegetation, re-created to appear as if from the real world, but at LEGO minifig scale. “An obvious place to start became bonsai trees,” says Jowle. “We went to a bonsai place and did reference shoots. We looked at moss for different grasses. We also mixed in common house plants to remind people of the scale that we were in. We’d throw in plants where the leaves were smaller than our hands, but for the minifigures they were bigger than their entire body.”

“That was the director’s dream and goal from the beginning. The film had to feel akin to a child playing in the backyard at a macro level with their little LEGO pieces. We aimed at everything being miniaturized.” —Greg Jowle, CG Supervisor

REAL-WORLD EFFECTS

In previous LEGO film outings, effects simulations had been replicated at a brick level. For Ninjago, there was more scope to simulate phenomena such as water, sand, explosions and smoke as real-world effects. However, it was still at a minifig scale and sometimes with a certain LEGO style. “There’s a scene in the film where the characters pop out of water,” notes FX Supervisor Miles Green. “In reality, often water just sheets off LEGO pieces and they look completely dry. But we wanted to show they were wet so we would stick a little spritz of water on them.”

TOP LEFT: Model of bamboo forest. TOP RIGHT: Characters and forest. BOTTOM LEFT: Virtual camera layout for the scene. BOTTOM RIGHT: Early render.

Simulating water, which was done in Houdini, also proved complicated because animation of the characters had a stop-motion style (mostly done on 2s). Water sims did not work well this way. “We had to work quite closely with animation to try and get objects that aren’t in the water on 2s, but objects in water ideally need to be smooth and flowing without velocity changes,” explains Green. “Otherwise things start exploding and changing. If they have to be on 2s, then we have to do a bit of re-timing and interpolation and then work backwards to make it look smooth.”

Effects artists also, of course, had to take note of scale. Says Green: “When you’re doing that for a small LEGO asset, it makes that asset feel huge, and we want it to look tiny. Everything needs to run a lot faster than you’re used to seeing it. So the drips have to almost drop away in a couple of frames.”

NINJAGO-ANIM
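The re-timing and interpolation Green describes – characters held ‘on 2s’ while the fluid solver needs continuous motion – can be pictured with a toy sketch. This is an illustration only, not Animal Logic’s pipeline code; the function names are invented:

```python
# Toy sketch: reconciling stop-motion-style animation "on 2s" with a
# water simulation that needs smooth motion. Poses held every second
# frame are interpolated back into a continuous curve before being
# handed to the solver, so velocities never jump between frames.

def on_twos(poses):
    """Hold each even-frame pose for two frames, stop-motion style."""
    return [poses[(i // 2) * 2] for i in range(len(poses))]

def smooth_for_sim(stepped):
    """Fill the held frames by averaging the surrounding keys, so the
    simulation sees gradual velocity changes instead of sudden steps."""
    out = list(stepped)
    for i in range(1, len(out) - 1, 2):
        out[i] = (stepped[i - 1] + stepped[i + 1]) / 2.0
    return out
```

For example, `on_twos([0, 1, 2, 3, 4, 5])` gives `[0, 0, 2, 2, 4, 4]`, and smoothing that stepped curve recovers near-continuous motion for the water sim while the rendered character keeps its held poses.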

TOP: Animal Logic modeled several Mechs for the film, each having different powers and each also having to be made, for real, with real LEGO bricks. BOTTOM: From left: Nya (Abbi Jacobson), Kai (Michael Peña), Lloyd (Dave Franco), Zane (Zach Woods), Jay (Kumail Nanjiani) and Cole (Fred Armisen).

GROWING THE WORLD OF LEGO NINJAGO Two new tools built by Animal Logic specifically for The LEGO Ninjago Movie aided in filling out the world with natural vegetation. The first of these, dubbed ‘Spawn’, was a scattering tool that allowed artists to “lay a field or carpet of moss as grass and vary it or add leaf debris,” describes CG Supervisor Greg Jowle. “You don’t want to do that by hand.” The second purpose-built tool was called ‘Spruce’ and was designed to grow multiple versions of randomized bushes, trees and other plants. “We break them down into component pieces called elements,” says Jowle, “and then like any L-system you can put them back together, scatter leaves on the ends, de-intersect the leaves, and put ‘keep alive’ on them to give them this gentle constant movement.”
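The ‘Spruce’ idea Jowle sketches – plant elements reassembled L-system-style with randomized variation – can be illustrated with a toy string-rewriting L-system. Every rule and symbol below is invented for illustration; this is not Animal Logic’s actual grammar:

```python
import random

# Toy L-system: "T" is a trunk element, "B" a branch, "L" a leaf;
# brackets mark branching. Each symbol expands via a randomly chosen
# rule, which is how one seed structure yields many plant variations.
RULES = {
    "T": ["T[B]T", "T[B][B]"],
    "B": ["B[L]", "B[L]L"],
}

def grow(axiom="T", iterations=3, seed=7):
    random.seed(seed)  # a fixed seed makes one variation reproducible
    s = axiom
    for _ in range(iterations):
        s = "".join(random.choice(RULES.get(c, [c])) for c in s)
    return s
```

Changing the seed yields structurally similar but randomized plants, which is the essence of growing “multiple versions of randomized bushes, trees and other plants” from shared component elements.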

Inspired by traditional kung fu movies – and with Jackie Chan featured as one of the main characters – Animal Logic was faced with the challenge of inputting martial arts action into the minifigs. “We’d watch the way Jackie Chan is in the middle of a fight, has an idea, looks at an object, looks at an opponent, looks back at the object and then figures out how to use the object,” says Animation Director Matt Everitt. “And then we’d apply that to our scenes.”

The fights also saw the use of ‘brick blur’, something Animal Logic had done even on the first LEGO Movie. This is where, instead of motion blur, animators would add in additional legal brick pieces to exaggerate a move or replicate what a ‘smear’ frame may have achieved. “When you watch it on the run you get this beautiful fluid movement, but when you frame-by-frame it, each frame is individually hand-crafted,” states Everitt. “You can’t really bend the elbows of a minifig, say, but we can make the ‘feeling’ of a bent elbow when you watch it on the run. I like to think you could stop the movie at any point and you would find a treat in there.”

Animal Logic’s animators were also faced with the challenge of animating the Mechs, the robot warriors operated by the film’s heroes and villains. Although the Mechs engage in epic battle action, they also had to operate within the realm of this miniature world. “When you pop out to a wide shot of a high angle looking down at them, you’re reminded, oh it’s actually a miniature, a toy,” says Everitt. “But when you’re down looking up at the Mechs, it’s like, oh we’re in Pacific Rim! So there’s this constant playing with the audience’s expectations of what they’re seeing on screen and reminding them that it is a toy, it’s not real.”


SHOOTING FOR REFERENCE

The evil warlord Garmadon commands an army of ‘shark Mechs’ in the film, which fire little fish brick missiles as their weapons. Looking to ground the look of these missiles in reality when simulating the effects of them being fired, but at the appropriate small scale, Animal Logic discovered the world of home-made match rockets. “People make and post these match rocket videos on YouTube,” outlines FX Supervisor Miles Green. “Basically you get a match and some tin foil and you put the tin foil over the head of the match and wrap it around. And when it’s lit, they whiz off. We used them for missile trails. As they go they leave these beautiful fine trails – very small and diffuse.”

“We’d watch the way Jackie Chan is in the middle of a fight, has an idea, looks at an object, looks at an opponent, looks back at the object and then figures out how to use the object. And then we’d apply that to our scenes.” —Matt Everitt, Animation Director

KITTY SURPRISE

The big twist in LEGO Ninjago is the arrival of a cat into the world, playing into the idea that this is all occurring within a suburban backyard. It quickly becomes a menace to the film’s heroes, knocking over buildings and causing mayhem while remaining largely impenetrable. Since ‘Meothra’ – as it became known – was intended to be a live-action house cat, Animal Logic did several tests with real cats against bluescreen. However, it soon proved difficult to make the animals perform as required. That led to a fully CG feline being crafted for the film. Animal Logic drew upon previous creature work it had done, including on furry animals, to build Meothra.

In terms of animation, the cat had to represent the movie’s monster, but at the same time not be malicious. “It’s just a cat doing cat stuff,” says Everitt. “We also recorded a couple of cats and I would manipulate a Mech like a bad guy with the cat. The cat would give it a sniff and knock it over and burrow around within the LEGO and explore within it, and so we used some of that as explicit reference in the film.”

“We spent a bit of time on YouTube researching videos of cats, working out what makes it so appealing,” Everitt explains. “People just love watching cats doing silly stuff.” And thanks to Animal Logic, that silly stuff is now immortalized in a LEGO film.

TOP to BOTTOM: Master Wu (Jackie Chan). Animal Logic closely referenced the actor’s previous films for his unique fighting style and editing of fight scenes. Real LEGO minifigs were scanned and photographed to ensure that the proportions and details were accurately replicated in CG. Garmadon (Justin Theroux) atop one of his Mechs. Lloyd (Dave Franco). For lip sync animation, the animators relied on a large facial library of 2D face shapes.



TV

DIVINE AND CONQUER: CINESITE’S AMERICAN GODS EXPERIENCE By IAN FAILES

When it hit small screens earlier this year, the Starz series American Gods caused waves with its impressive adaptation of Neil Gaiman’s 2001 novel about a war raging between Old and New Gods. Receiving equal praise was the television show’s approach to its startling and often confronting imagery – made possible with 3,000 visual effects shots across eight episodes. Cinesite was one of the show’s key contributors, delivering effects ranging from an ongoing storm to views of the afterlife, and even the addition of a cat into key scenes.

THE ENDLESS STORM

TOP: Cinesite artists studied supercell weather patterns to build up the storm that follows Mr. Wednesday and Shadow Moon. BOTTOM LEFT and RIGHT: The original plate and final shot for a storm featured in the show’s climactic Episode 8. All photos copyright © 2017 Starz Entertainment and courtesy of Cinesite.

American Gods concentrates on two main characters, Mr. Wednesday (Ian McShane) and Shadow Moon (Ricky Whittle), as they take a road trip across the U.S. A storm front consistently looms over them, making a particularly dramatic appearance in the series finale during a confrontation between the Old and New Gods. Cinesite’s digital matte painting (DMP) team, led by Philippe Langlois, initially tackled the storm as a full 2D solution, building up elements to form a supercell and shaping a library of flanking lines, wall clouds and anvils. “In order to allow our compositing department to build internal movements,” explains Cinesite Visual Effects Supervisor Aymeric Perceval, “we divided our matte paintings into multiple layers and used various 2D and 2.5D distortion maps we created to fine-tune on a per-shot basis.”

The storm featured in the series finale, where the Gods seem to control the weather, was fleshed out in 3D. Artists referenced timelapse photography of clouds forming and disappearing, as if rolling in like waves, and then replicated that look in CG. “We created a custom volume deformer on top of the layers of simulation in Houdini,” outlines FX artist Masaya Sugimura. “This allowed flexible art direction of the FX, giving us time and options to adjust the effect in multiple shots while keeping an eye out for visual inconsistencies created by the holes in the 3D noise.”

“These layers were then balanced by compositing to complete the animated matte painting without concealing it,” says Perceval. “Small lightning bolts and flashes were finally composited in before the result was shared with the other vendors who were using our comps as a backplate. Once this overall storm set-up was in place, it allowed us to deliver two complete, full CG shots where the audience travels within the storm clouds and discovers the rolling from the inside.”

INTO THE AFTERLIFE

For a scene featuring the entry of Mrs. Fadil (Jacqueline Antaramian) into Anubis’s kingdom – halfway between Earth and another dimension – Cinesite worked on visual effects for scenes right before the character’s death, and as she ascends into the afterlife. One of the first – and surprising – challenges Cinesite faced in the sequence was inserting Mrs. Fadil’s sphynx cat into the shots. Initial efforts to film a real cat on-set were not successful, and a CG cat was deemed too expensive. So instead, two trained cats from GreenScreen Animals were captured performing specific actions on bluescreen, which were then composited into the plates.

Mrs. Fadil and Anubis (Chris Obi) exit her two-story apartment, which was built on a stage, and then climb an infinite wall made possible via a Cinesite set extension. Lead modeling and texture artist Celestin Salomon says, “It was a very interesting job to first match the original building floor, then to build the transition to different construction styles: bricks, old damaged bricks, medieval rock wall, big rock blocks and finally a sculpted cliff. We created finely detailed displacement maps which gave the multiple walls a richer and more realistic look.”

For Mrs. Fadil’s arrival into the afterlife, Cinesite augmented plate photography captured on a sandy location in Oklahoma. “We replaced the sand dunes since they were not as pristine as the showrunners wanted them to be,” says Perceval. “We completed the pristine dune environment with blowing FX sand passes to give the shots a bit more life. For the sky, our Lead Compositor Remy Martin played with multiple layers of constellations and stars, using space and long-exposure night photography as reference. FX passes and other 2D elements were added to avoid it looking

“Small lightning bolts and flashes were finally composited in before the result was shared with the other vendors who were using our comps as a backplate. Once this overall storm set-up was in place, it allowed us to deliver two complete, full CG shots where the audience travels within the storm clouds and discovers the rolling from the inside.” —Aymeric Perceval, Visual Effects Supervisor, Cinesite

TOP: Mrs. Fadil’s sphynx cat was actually shot separately on bluescreen and composited into the shots by Cinesite. MIDDLE and BOTTOM: The original on-set apartment photography and a wider shot of the digital extension for Mrs. Fadil’s ascent into the afterlife.



TOP LEFT to RIGHT: Mrs. Fadil and Anubis were filmed on location in Oklahoma, with Cinesite transforming the footage into a unique, sandy and heavenly space. BOTTOM LEFT to RIGHT: Laura also makes a trip to the afterlife, but her views of the environment were crafted in a more dark and foreboding manner – and she ultimately returns to Earth.

“One key reason we were able to generate and work in Maya with that number of polygons was because of Cinesite’s in-house Meshcache/particleProxy object. It allowed us to have millions of polygons in our scene and still work with all that information in the viewport. We also used a visual trick in the shot: the further away the trees were from the camera path, the more we decreased the number of leaves and scaled up the leaf instances.” —Eric Senécal, Layout/FX Artist



too familiar. Since there was no sun in this universe, we used the constellations as light sources matching the lighting on the actors.”

Shadow Moon’s wife, Laura (Emily Browning), is also shown entering the afterlife after she dies in a car crash. Cinesite crafted a darker and murkier version of the universe here, starting this time with only bluescreen stage footage of the actors and some black dunes. “The entire dialogue sequence was shot on a bluescreen,” says Compositing Supervisor Benjamin Ribière, “so the challenge was to keep the viewer in the story by having perfect continuity between the 42 shots in terms of grading and integration between the 2D foreground and the CG background. To achieve this, we neutralized each shot and developed a strong Nuke template script which extracted the characters from the bluescreen while ensuring the despill color and regrading remained consistent.”

STEALING SPRING

Mr. Wednesday and Shadow Moon eventually visit the Old God Easter (Kristin Chenoweth), who turns spring into winter. Cinesite worked on shots here that involved hundreds of petals revolving around Easter and a massive pull-back revealing the landscape turning wintry white. “This big pull-out is most certainly the biggest shot we got to do this season,” notes Perceval. “A few array plates had been captured with a RED camera attached to a drone, which allowed us to get enough information for our tracking/layout lead, Mehdi Tadlaoui, to recreate the actual topography around the house and give the showrunners the movement, speed and framing they needed. From the trees in the woods to the flowers around the house, all of the lush vegetation had to go, and we knew the parallax would not allow us to take shortcuts. The low and rampant vegetation could

“It was amazing to have a lot of freedom to invent, create and interpret concepts for which there were not always specific references we could draw from. The challenge was finding the balance between producing high-end realistic effects to support the narrative within the fast-moving production schedule.” —Aymeric Perceval, Visual Effects Supervisor, Cinesite

certainly go 2D, but the trees had to go CG.” DMP artist JaeHee Jung recreated multiple tiles of empty ground, leaving only dirt and low grass which would later be withered by grading in compositing. Meanwhile, Cinesite employed SpeedTree to generate six types of trees with five different shape variations each, and laid out up to 2,200 of them. Eric Senécal, the layout/FX artist on the sequence, explains, “One key reason we were able to generate and work in Maya with that number of polygons was because of Cinesite’s in-house MeshCache/ParticleProxy object. It allowed us to have millions of polygons in our scene and still work with all that information in the viewport. We also used a visual trick in the shot: the further away the trees were from the camera path, the more we decreased the number of leaves and scaled up the leaf instances.” “We streamlined the process even further by only adding wind to the foreground trees, leaving the ones in the background static,” adds Perceval. “This way, the branches and trunks were imported in Maya while only the leaf points were imported in Houdini for the simulation. Once the points cache was brought back into Maya, we had all the information needed to shrink them down and move them with the wind. This offered us an efficient broadcast turnover of six hours and consequently enough time

to run multiple tests on how to best convey the story point of the shot.”

DARING VISUALS

Reflecting on Cinesite’s contributions to American Gods, Perceval remarks that the showrunners and the show’s visual effects supervisors were both “brave and daring” with the visuals. “Between the scripts and the actual plates, there was often such a creative jump that we were not fully aware of what would be coming our way until we’d seen it; this had advantages and drawbacks. It was amazing to have a lot of freedom to invent, create and interpret concepts for which there were not always specific references we could draw from. The challenge was finding the balance between producing high-end realistic effects to support the narrative within the fast-moving production schedule.”
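The distance-based trick Senécal describes translates directly into a simple level-of-detail curve. The sketch below is purely illustrative (the function name, parameters and linear falloff are assumptions, not Cinesite’s actual MeshCache/ParticleProxy code): it keeps fewer leaf instances the further a tree sits from the camera path, and scales each kept instance up by one over the square root of the kept fraction, so overall foliage coverage stays roughly constant.

```python
def leaf_lod(distance, near=10.0, far=500.0,
             full_count=20000, min_fraction=0.1):
    """Distance-based leaf decimation with compensating scale.

    Returns (leaf_count, instance_scale). Trees at or inside the
    `near` distance keep every leaf at normal size; trees at or
    beyond `far` keep only `min_fraction` of their leaves, each
    scaled up so total leaf area is roughly preserved (area grows
    with the square of scale, hence scale = 1 / sqrt(fraction)).
    """
    # Normalize distance into [0, 1] between the near and far limits.
    t = min(max((distance - near) / (far - near), 0.0), 1.0)
    # Linearly fade the kept fraction from 1.0 down to min_fraction.
    fraction = 1.0 - t * (1.0 - min_fraction)
    count = max(1, int(full_count * fraction))
    scale = (1.0 / fraction) ** 0.5
    return count, scale

# A tree on the camera path keeps all 20,000 leaves at scale 1.0;
# a very distant tree keeps roughly 10% of them at about 3.2x size.
print(leaf_lod(0))      # → (20000, 1.0)
print(leaf_lod(1000))
```

The same compensation idea applies to any instanced scatter: decimate by distance, then rescale survivors so the silhouette reads the same from the camera.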

TOP: These before-and-after frames show how Cinesite took the original aerial photography and transformed the environment as Easter spreads a wintry chill across the land.






ANIMATION

VR OPENS THE DOOR TO UNLIMITED EXPLORATION By TREVOR HOGG

TOP: The user gets up close to Chloe as she thwarts two bumbling aliens in Invasion! (Image courtesy of Baobab Studios) BOTTOM: The legend of Rainbow Crow brought to life in a serialized VR experience that makes use of sophisticated lighting and pastel colors. (Image courtesy of Baobab Studios)



Even while its tools and visual language are still being established, virtual reality already encompasses a variety of approaches and applications for storytelling and experiences. To get a better understanding of how to use VR effectively, CEOs and chief creatives from Baobab Studios, Penrose Studios, Tangerine Apps, Google ATAP, ILMxLAB and Skydance Interactive, along with acclaimed filmmaker Kel O’Neill, discuss the current state and future of the emerging medium and its impact on animated film.

Often referred to as the Pixar of VR, Baobab Studios, based in Redwood City, California, was founded by CEO Maureen Fan and CCO Eric Darnell. Daytime Emmy winner Invasion! revolves around a bunny thwarting an alien invasion and is being developed into a traditional feature-length animated film by Roth Kirschenbaum. Their current project is the serialized tale Rainbow Crow, which is based on a Lenape legend.

“Technology always takes longer than everybody expects to be disruptive, but when it does the impact is greater than anyone anticipated,” states Fan. “We raised $31 million to make sure that we had enough to last through the trough of disillusionment. In the meantime, we do make some revenue. Inside the home are the standard models of pay for watching, purchase, advertising or subscription. Outside of the home there are VR arcades popping up all over the place. People want our content specifically in those theatres, so you can see a licensing model.”

“We’re creating a lot of proprietary pipeline stuff that allows us to do the process efficiently, as well as to do creative and technical work in our medium,” remarks Darnell. “We’re also working on tools that allow us to do things creatively that are otherwise difficult.
In Rainbow Crow, we have developed a tool that allows us to have multiple lights turning off and on at various times, which is especially difficult to do when you’re running in real-time. Lights are expensive, so usually you only have one, or a lot of cheats. It’s a two-pronged attack. We’re trying to have infrastructure tools in place and improved. We’re also looking at ways to create tools that allow us to be creative.”

Penrose Studios in San Francisco has a mandate from Founder and CEO Eugene Chung to create worlds in virtual and augmented reality that allow for unlimited exploration and narrative experiences. The various environments include an asteroid in The Rose and I, a cloud city in Allumette, and a lighthouse situated in a post-apocalyptic water world in Arden’s Wake. “Virtual reality allows us to create these large expansive worlds,” notes Chung. “Allumette has a floating cloud city and that’s not something you can go to in reality. Even if you watch a picture of it, it’s different than if you are in it; that’s what virtual reality unlocks for you.”

VR has a particular advantage, Chung adds: “The incredible thing about VR is that it’s fully immersive. People tend to get lost and forget that they have other things. A great example is with Arden’s Wake. The prologue is over 15 minutes long, but 90% of viewers think it’s 5 to 10 minutes long. Some people question if there’s going to be an attention apocalypse; it’s an opportunity to go

back to the roots of undistracted storytelling.”

Co-founded by Joe Farrell and Dogan Koslu, Tangerine Apps, based in Los Angeles, combines visual effects and videogame expertise to produce VR experiences that assist with traditional filmmaking, as well as expand the scope of marketing campaigns for feature films such as Beauty and the Beast and for the Los Angeles bid for the 2024 Olympic Games. “We have been literally standing outside of the computer, staring in through a little window into the digital world,” remarks Farrell. “Now what you are able to do is stand inside the digital world and realize the scale or scope of things with a sense of perspective. There’s a lot of prep work that goes into it, but once the filmmaker is in VR, decisions are instant. Before, it was like, ‘I am going to make a decision and hope that it’s the best.’”

Google Spotlight Stories is VR content produced by Google’s Advanced Technology and Projects group located in San Francisco. Director Jan Pinkava has collaborated with acclaimed filmmakers and studios including Aardman Animations, Justin Lin and Patrick Osborne to produce shorts that include the Pink Panther-themed Special Delivery, the live-action alien-visitation tale Help and the cross-country father/daughter story Pearl.

TOP: A color key for the library located within the lighthouse featured in Arden’s Wake. (Image courtesy of Penrose Studios)

“Inside the home are the standard models of pay for watching, purchase, advertising or subscription. Outside of the home there are VR arcades popping up all over the place. People want our content specifically in those theatres, so you can see a licensing model.” —Maureen Fan, CEO, Baobab Studios






ANIMATION

TOP to BOTTOM: A still taken from Allumette, which transports the viewer to a cloud city in a re-imagining of The Little Match Girl by Hans Christian Andersen. (Image courtesy of Penrose Studios) A VR experience that allows the user to be a guest in the world of Beauty and the Beast. (Image courtesy of Disney and Tangerine Apps) Patrick Osborne situates the viewer in the passenger’s seat of a 1970s hatchback to witness the relationship between a travelling musician and his daughter in Pearl. (Image courtesy of Google Spotlight Stories)



“VR is about transporting you to somewhere else entirely, and AR is about putting you in a different version of the real world you’re in,” observes Pinkava. “If you are interested in storytelling, then you have to ask, ‘What kind of story can I tell in the real world when I don’t know what the real world is for each member of my audience? How are they going to interact with this where they are? What mood are they in? Are they travelling, at home or at the office?’ In the world of mobile VR, it’s good not to make assumptions about that, and to create whatever experience you’re offering the audience to work wherever it is, or to take them somewhere else.”

ILMxLAB in San Francisco, overseen by Executive-in-Charge Vicki Dobbs Beck and Director of Content and Platform Strategy Mohen Leo, combines the technical and creative resources of ILM, Skywalker Sound, Lucasfilm and Magic Leap. The goal is to produce immersive entertainment and experiences for theaters, theme parks and social spaces; projects include a series centering on Darth Vader and a collaboration with Alejandro González Iñárritu on Carne y Arena.

“We try to put storytelling and the creative vision front and center, and figure out what we have to do with the technology in order to bring that to life,” states Dobbs Beck. “Someday we’re likely to have a single device where we’re experiencing all shades of reality, from the real world in its simplest form all the way to fully virtual reality and everything in between. If you believe that trajectory, it’s important to understand how to create those different kinds of experiences even now, when we haven’t reached that state.”

“The next step for captured virtual reality, if we’re talking about real-world content, is light-field acquisition,” remarks Leo. “It’s early days for that, but it’s something we’ll see over the next 10 years.
It’s interesting progress on not just capturing a single viewpoint, but being able to capture a full volume of reality in which you can then move around.”

Skydance Interactive CEO Peter Akemann seeks to produce original interactive and immersive gameplay, commencing with the mech battle tale Archangel. Other projects from the Marina del Rey-based company include Life VR, based on the sci-fi thriller, which places the user inside the International Space Station. “Seeing your face and hands, and this complex high-dimensional presence that you have in the world, is compelling,” states Akemann, “as it is compelling to see another person there, even in that avatar form, or when people can chat or type in ‘dance’ to make their avatar move around. That being said, much like in standard games, the single-player experience is always going to be an important and powerful thing. A lot of individuals are alone a lot of the time and don’t always want the social pressure of being with other people. They just want a great story. You’re going to see a future that has both of those things.” Akemann adds, “Technological questions are always divergent, culminating in a broader and richer future with more different kinds of experiences in it.”

Kel O’Neill and Eline Jongsma form the award-winning Dutch-American filmmaking duo Jongsma + O’Neill, producing interactive documentaries. Their VR documentary The Ark enables viewers to witness

“Virtual reality allows us to create these large expansive worlds. Allumette has a floating cloud city and that’s not something you can go to in reality. Even if you watch a picture of it, it’s different than if you are in it; that’s what virtual reality unlocks for you.” —Eugene Chung, Founder/CEO, Penrose Studios

TOP LEFT: Aardman Animations embraces the spirit of the Pink Panther in the Christmas Eve caper, Special Delivery. (Image courtesy of Google Spotlight Stories) MIDDLE and BOTTOM LEFT: Skydance Interactive seeks to produce original interactive and immersive gameplay commencing with mech battle tale Archangel, its first VR game. (Image courtesy of Skydance Interactive) TOP and BOTTOM RIGHT: Stills taken from The Ark 360 video showcasing the efforts of scientists in San Diego to save the endangered white rhino in Kenya. (Images courtesy of Jongsma + O’Neill)

American scientists and African rangers trying to protect and preserve endangered species – in particular, the last three northern white rhinos in existence. “For The Ark, our goal was to use the immersive qualities of 360-degree video to allow users to be in two places at once,” explains O’Neill. “Using basic matting in Adobe Premiere, we were able, at various junctures of the piece, to devote 180 degrees to Kenya (in the front) and 180 degrees to our second location in San Diego (at the back). It’s this idea that you’re telling a global story by using a global approach. That didn’t come out of videogame tropes, but the work we had done in video installations, making multi-channel pieces for a gallery.

“Everyone I know who is working in this medium is conscious about managing the hype cycle, because we’re all invested in




ANIMATION

TOP to BOTTOM: A still taken from Allumette, which transports the viewer to a cloud city in a re-imagining of The Little Match Girl by Hans Christian Andersen. (Image courtesy of Penrose Studios) A VR experience that allows the user to be a guest in the world of Beauty and the Beast. (Image courtesy of Disney and Tangerine Apps) Peter Osborne situates the viewer in the passenger’s seat of a 1970s hatchback to witness the relationship between a travelling musician and his daughter in Pearl. (Image courtesy of Google Spotlight Stories)

44 • VFXVOICE.COM FALL 2017

PG 42-47 VR ANIMATION.indd 44-45

“VR is about transporting you to somewhere else entirely and AR is about putting you in a different version of the real world you’re in,” observes Pinkava. “If you are interested in storytelling then you have to ask, ‘What kind of story can I tell in the real world when I don’t know what the real world is for each member of my audience? How are they going to interact with this where they are? What mood are they in? Are they travelling, at home or the office? In the world of mobile VR, it’s good not to make assumptions about that and to create whatever experience you’re offering the audience to work wherever it is or to take them somewhere else.” Combining the technical and creative resources of ILM, Skywalker Sound, Lucasfilm and Magic Leap is ILMxLAB in San Francisco, overseen by Executive-in-Charge Vicki Dobbs Beck and Director, Content and Platform Strategy, Mohen Leo. The goal is to produce immersive entertainment and experiences for theaters, theme parks and social spaces that include a series centering around Darth Vader and collaborating with Alejandro González Iñárritu on Carne Y Arena. “We try to put storytelling and the creative vision front and center, and figure out what we have to do with the technology in order to bring that to life,” states Dobbs Beck. “Someday we’re likely to have a single device where we’re experiencing all shades of reality from the real world in its simplest form all the way to fully virtual reality and everything in between. If you believe that trajectory, it’s important to understand how to create those different kinds of experiences even now when we haven’t reached that state.” “The next step for captured virtual reality, if we’re talking about real-world content, is the light-field acquisition,” remarks Mohen Leo. “It’s early days for that but it’s something we’ll see over the next 10 years. 
It’s interesting progress on not just capturing the single view point, but being able to capture a full volume of reality in which you can then move around.” Skydance Interactive CEO Peter Akemann seeks to produce original interactive and immersive gameplay commencing with mech battle tale Archangel. Other projects from the Marina del Rey-based company include Life VR, based on the sci-fi thriller which places the user inside the International Space Station. “Seeing your face and hands, and this complex high-dimensional presence that you have in the world, is compelling,” states Peter Akemann, “as it is compelling to see another person there, even in that avatar form, or when people can chat or type in ‘dance’ to make their avatar move around. That being said, much like in standard games, the single-player experience is always going to be an important and powerful thing. A lot of individuals are alone a lot of the time and don’t always want the social pressure of being with other people. They just want a great story. You’re going to see a future that has both of those things.” Akemann adds, “Technological questions are always divergent culminating to a broader and richer future with more different kinds of experiences in it.” Kel O’Neill and Eline Jongsma form the award-winning Dutch-America filmmaking duo Jongsma + O’Neill and produce interactive documentaries. The Ark VR enables viewers to witness

“Virtual reality allows us to create these large expansive worlds. Allumette has a floating cloud city and that’s not something you can go to in reality. Even if you watch a picture of it, it’s different than if you are in it; that’s what virtual reality unlocks for you.” —Eugene Chung, Founder/CEO, Penrose Studios

TOP LEFT: Aardman Animations embraces the spirit of the Pink Panther in the Christmas Eve caper, Special Delivery. (Image courtesy of Google Spotlight Stories) MIDDLE and BOTTOM LEFT: Skydance Interactive seeks to produce original interactive and immersive gameplay commencing with mech battle tale Archangel, its first VR game. (Image courtesy of Skydance Interactive) TOP and BOTTOM RIGHT: Stills take from The Ark 360 video showcasing the efforts of scientists in San Diego to save the endangered white rhino in Kenya. (Images courtesy of Jongsma + O’Neill)

American scientists and African rangers trying to protect and preserve endangered species, in particular, the last three northern white rhinos in existence. “For The Ark, our goal was to use the immersive qualities of 360-degree video to allow users to be at two places at once simultaneously,” explains Kel O’Neill. “Using basic matting in Adobe Premiere we were able to devote at various junctures of the piece 180 degrees to Kenya (in the front) and 180 degrees to our second location in San Diego (at the back). It’s this idea that you’re telling a global story by using a global approach. That didn’t come out of videogame tropes, but the work we had done in video installations, making multi-channel pieces for a gallery. “Everyone I know who is working in this medium is conscious about managing the hype cycle, because we’re all invested in

FALL 2017 VFXVOICE.COM • 45



ANIMATION

LEFT to RIGHT: Maureen Fan, Baobab Studios; Eric Darnell, Baobab Studios; Eugene Chung, Penrose Studios; Joe Farrell, Tangerine Apps; Jan Pinkava, Google’s Advanced Technology and Projects Group (ATAP) (Photo credit: Anna Hoye); Vicki Dobbs Beck, ILMxLAB; Mohen Leo, ILMxLAB; Peter Akemann, Skydance Interactive (Photo credit: Alex J. Berliner/ABImages); Kel O’Neill, Jongsma + O’Neill; the Tangerine Apps team

this thing becoming what it could potentially become, which is a mass medium on the scale of television, cinema and radio,” remarks O’Neill. “It’s just an issue of having enough breathing space.”

There is also a sense of storytelling circling back to its origins. “We started off in human history with drawing cave paintings and telling stories around a campfire,” reflects Penrose’s Eugene Chung. “Then we moved on to books and literature. Now we’re increasingly going back to the visual medium being the basis of everything we do to communicate.”

Baobab Studios’ high-profile advisors, who include co-founders of Pixar and Pacific Data Images as well as renowned animator Glen Keane, see a connection between virtual reality and the early days of computer animation. “Alvy Ray Smith, Glenn Entis and Glen Keane are like, ‘Back in the day no one knew what they were doing. It was just artists and engineers trying to figure out how to do it,’” states Maureen Fan. “For them, VR has that same feeling. They want to help and pass it on to the next generation.”
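O’Neill’s “two places at once” trick, devoting the front 180 degrees of the frame to one location and the back 180 degrees to another, maps to a simple column operation on equirectangular 360 frames, where longitude runs left to right across the image width. This is a minimal NumPy sketch of the idea, not the team’s actual Premiere matting workflow:

```python
import numpy as np

def composite_hemispheres(front_frame, back_frame):
    """Combine two equirectangular 360 frames so the front 180 degrees
    shows one location and the back 180 degrees shows the other.

    Both frames are H x W x 3 arrays in equirectangular projection.
    """
    h, w, _ = front_frame.shape
    out = back_frame.copy()              # back location fills the frame
    # The front hemisphere occupies the middle half of the longitudes.
    left, right = w // 4, 3 * w // 4
    out[:, left:right] = front_frame[:, left:right]
    return out
```

Applied per frame of the two synchronized videos, this yields the split view O’Neill describes: turn around in the headset and the scene changes continents.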



PROFILE

DR. SKIP RIZZO AND THE RISE OF MEDICAL VR THERAPY By NAOMI GOLDMAN

TOP: Albert “Skip” Rizzo, Ph.D., Director, Medical Virtual Reality, USC Institute for Creative Technologies. BOTTOM: Virtual Iraq/Afghanistan delivers virtual reality exposure therapy for treating PTSD. All images courtesy of Dr. “Skip” Rizzo and the USC Institute for Creative Technologies.

Once thought to be a technology exclusively for entertainment, virtual reality applications pioneered by Albert “Skip” Rizzo, Ph.D. have provided life-changing therapeutic results for clients with serious anxiety disorders, and for members of the military in particular. As the Director of Medical Virtual Reality at USC’s Institute for Creative Technologies (ICT) and Research Professor in USC’s Department of Psychiatry and School of Gerontology, Dr. Rizzo has been at the forefront of dramatic innovations in clinical research and care for more than two decades, and his application of VR as a valuable tool in medical treatment underscores the broadening growth of VR beyond entertainment.

A leader in artificial intelligence, graphics, virtual reality and narrative, ICT advances low-cost immersive techniques and technologies to creatively address vital medical issues facing society. Under Dr. Rizzo’s direction, ICT has pushed the boundaries in treating people affected by debilitating anxiety disorders. Those treatments include landmark efforts to help military personnel with Bravemind, a virtual reality exposure therapy for treating Post-Traumatic Stress Disorder; Stress Resilience In Virtual Environments (STRIVE), which aims to better prepare military personnel for the emotional challenges inherent in the combat environment before they deploy; and SimCoach, a Web-based virtual human that helps people overcome barriers to seeking care. ICT has also developed interactive “Game-Based Rehab” to augment physical rehabilitation for patients recovering from stroke or traumatic brain injury, as well as for the elderly and those with a disability.

VFX Voice talked with Dr. Rizzo about the rise of VR as a powerful therapeutic tool, the bumps in the road, and the promise that lies ahead.

Q: What was the thought process that led you to pursue VR for use in clinical psychology and rehab?

A: The lightbulb ultimately went off thanks to a young man and his Game Boy.
In the early 1990s, the state of cognitive rehab was really limited by the absence of technology. I would tell my clients that if you want to recover your brain function, your attention, memory and executive function, you would have to put the same amount of effort into rehab exercises as if you wanted to learn to play the violin. It became more and more obvious to me that to do rehab at a level where you can really measure improvement, you have to do it for intense periods of time beyond what is pragmatically feasible with the cost of having humans facilitate that work. But armed with technology, where the user can practice on their own, you’re looking at a much greater possibility. I was struck by kids who played video games for hours on end and imagined, what if you could get a client engaged in well-produced, sophisticated content to do their rehab for that period of time? Then one fateful day in 1991, I watched a less than compliant rehab client transfixed for hours by his Nintendo Game Boy, and that was the ‘aha’ moment that kicked things into gear. It started with the idea of using game-based stuff to make it fun


and engaging, but when I heard of VR, I knew that context matters and that we could do it. I was naïve and had no idea about the long road ahead. Had I known how far we were from realizing that vision, I might have chickened out!

Q. You’ve described the early years in the field as the ‘nuclear winter of VR.’ Tell us about your path during that era.

A. During the first challenging wave of VR, I got tossed a life buoy to make me believe there was hope. Back in 1993, Dr. Dean Inman at the Oregon Research Institute had developed a motorized wheelchair training system for children. He built something where you put the user’s own wheelchair on a set of rollers that registered the movement of the wheels and navigated them through a virtual obstacle course. But the key thing to motivate the kids was that after they completed a certain amount of training they could virtually fly off into the clouds in their wheelchairs wearing a head-mounted display. Their faces lit up and I was sold!

By the time I got into this academically in 1995, I had seen some inspiring things and was excited about VR’s potential. But right before I got my spot at USC I got to try a headset myself – and it was really bad. I walked around a virtual city, the interface was clunky, and I got stuck inside a very primitive building and realized this wasn’t ready for prime time after all.

When I got my position at USC’s Alzheimer’s Center, it was right across the street from computer science. My strategy was to pester people to get access to equipment and programming – and that’s what happened. I saw the wreckage of the past and wanted to move forward, and as we were working on this stuff, everyone else caught on that the technology wasn’t ready for VR. Companies dissipated. VR magazines fizzled. VR conferences crashed and burned. And all of a sudden everyone was excited about the Internet and moved on as VR was labeled a failed thing. It didn’t really hit its second life until last year.

“The lightbulb ultimately went off thanks to a young man and his Game Boy. … I was struck by kids who played video games for hours on end and imagined, what if you could get a client engaged in well-produced sophisticated content to do their rehab for that period of time?” —Dr. Albert “Skip” Rizzo, Director of Medical Virtual Reality, USC Institute for Creative Technologies (ICT) TOP: “Virtual Patients” use virtual human technology to train future clinicians in therapeutic interview skills. BOTTOM: Afghan helicopter extraction. Virtual Iraq/Afghanistan.

Q. A lot of your work has focused on supporting members of the armed services – Bravemind on the therapeutic side for veterans diagnosed with PTSD, and pre-deployment combat stress management with STRIVE. What was the genesis of your emphasis on this population?

TOP: Afghan humvee. Virtual Iraq/Afghanistan. BOTTOM: U.S. Army veteran interacts with the Bravemind VR therapy to safely relive his deployment experiences. In addition to the visual stimuli presented in the VR head-mounted display, directional 3D audio, vibrations and smells can also be delivered. (Photo credit: Stephanie Davis Kleinman French. Photo copyright © Albert “Skip” Rizzo)

A. I always had an interest in helping veterans. I had previously worked with PTSD patients, primarily Vietnam-era veterans, during a clinical internship at the Veterans Administration in Long Beach back in 1985. Fast forward to 2003: I saw a video clip announcing the Xbox game Full Spectrum Warrior – and it struck me how much it looked like the situation in Iraq. I was following the events of the war and thought we simply cannot have another Vietnam. We’ve got to be ready for these folks to come home, and this technology is looking good enough that we can develop these kinds of exposure therapy applications to be prepared. Though most people weren’t thinking that mental health and re-entry issues were going to be a problem, as we were still in the ‘Mission-Accomplished’ era.

As it happens, I found out about the Full Spectrum Warrior game because it was on the ICT website. ICT had received a contract from the U.S. Army and hired Pandemic Studios to create it as a combat tactical simulation tool that the Army could use with the Xbox for training, because they had so much success with America’s Army as an online recruitment game. They wanted to go all in for training with squad tactics: you could put an interface into an Xbox and it would unlock actual missions.

I went to ICT with a mission to take the art assets from the game and build it in a way that allows the clinician to control the elements and put people in different locations. I paired up with Jarrell Pair, who was the programmer, and we pitched it to ICT and built a primitive prototype – with basically no funding – by spring of 2004. We yanked one street out of Full Spectrum Warrior and put a person in a headset running VR off of a laptop … and different keys on the keyboard would make things happen.
A toggle would change the illumination in the environment from morning to afternoon to night … push a button and a bunch of guys in a jeep with guns would come pouring out of an alleyway and start firing … or an insurgent would pop up, or a helicopter would fly over. That was the root of all that came next.

Q. How does Bravemind work?

A. Bravemind is our VR exposure therapy – a means for a patient to confront and process their trauma memories through a retelling of the experience – aimed at providing relief from PTSD, and it has been shown to produce a meaningful reduction in symptoms. Rather than relying exclusively on imagining a particular scenario, a patient can experience it again in a virtual world under very safe and controlled conditions. We see that young military personnel, having grown up with digital gaming technology, may actually be more attracted to and comfortable with a VR treatment approach as an alternative to traditional ‘talk therapy.’ The current application features a series of virtual scenarios including Afghan and Iraqi cities and desert road environments


and scenarios relevant to combat medics. In addition to the visual stimuli presented in the VR head-mounted display, we can deliver directional 3D audio, vibrations and smells. Clinicians control the stimulus presentation via a separate ‘Wizard of Oz’ interface, and are in full audio contact with the patient. The app has been distributed to more than 100 clinical sites, and we are also developing scenarios to address military sexual trauma.

Q. You’ve said ‘war sucks, but from it much flows that we can learn from.’ How have these modalities originally developed for military personnel been translated for the benefit of civilians?

A. We explored the use of VR to help treat victims and survivors of terrorist attacks after the Paris events in 2015, but funding has always been a barrier. Now we’re trying to do some things over in Turkey around attacks on airports, bars and mosques, which might take the shape of capturing environmental elements with a good spherical camera and doing 3D graphic overlays of the panoramic content … so we might have a second chance to develop tools to support the survivors of terror attacks.

I had proposed an idea after the Boston Marathon bombing to put together a team of people who would donate their time as Virtual Volunteers, so that whenever there was one of these attacks, local graphic artists would go and model some of the spaces. With Boston it might be the one street where the bombings took place, the interior of an ambulance and maybe a hospital setting; that’s all you need. Whereas Bravemind was a complex endeavor with 14 different worlds built to address the diverse needs of soldiers deployed there … for a constrained terrorist attack, spherical video with 3D graphic overlays is likely the answer to have the systematic control to ramp up the provocative nature of the event.
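The ‘Wizard of Oz’ interface Rizzo describes, a clinician keyboard that fires scripted stimuli into the scene as in the 2004 prototype, boils down to an event dispatcher. A hypothetical Python sketch (the key bindings and event names here are invented for illustration, not taken from Bravemind):

```python
# Map clinician keys to scripted stimuli, in the spirit of the early
# prototype where "different keys on the keyboard would make things happen."
STIMULI = {
    "l": "cycle time of day (morning -> afternoon -> night)",
    "j": "jeep with armed fighters emerges from the alleyway",
    "i": "insurgent pops up",
    "h": "helicopter flyover",
}

def trigger(key, session_log):
    """Fire the stimulus bound to a key, if any, and record it in the
    session log so the clinician can review what the patient was exposed to."""
    event = STIMULI.get(key)
    if event is not None:
        session_log.append(event)
    return event
```

The session log matters clinically: graded, logged exposure is what lets the therapist ramp intensity up or down in a controlled way.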

TOP: U.S. Army veteran interacts with the Bravemind VR therapy to safely relive his deployment experiences. A landmark effort to help military personnel, Bravemind has been shown to produce a meaningful reduction in symptoms of PTSD. The app has been distributed to more than 100 clinical sites.

Q. What is the outlook for funding projects like these that hold great potential for widespread application?

BOTTOM: Bravemind allows veterans to process their trauma memories in a safe, controlled setting. Clinicians control the stimulus presentation via a separate interface, and are in full audio contact with the patient.

A. Virtual Volunteers was me as a do-gooder wanting to build a community of people to step up, but ultimately you need someone to manage and integrate the work, and that takes income. I’m trying to do everything I can to move things into the commercial sector. Academia is great for invention of new ideas and doing things no one has done before, but going from advanced research prototype to production and product level requires an economic driver. The arduous process of applying for grants and negotiating agreements gets in the way of innovation. I’d like to see some of the majors endow a center where in perpetuity there would be enough money to orchestrate these types of beneficial projects to advance the use of pro-social VR.

Q. What projects are next on the horizon at ICT?

A. We’re investigating using VR for pain distraction. It has such a solid research literature and it’s now becoming more of a commodity, because we have tools that work. We’re also looking

beyond distraction to chronic pain management, which is about reaching people with the cultivation of skills they can access every day to reduce their experience of chronic pain. We are going to move forward on some things related to addiction, to help people after they quit to essentially ‘stay quit,’ known as relapse prevention. I’m looking at putting people in high-risk virtual environments such as bars, crack houses, shooting galleries or social events where they have exposure to stimuli like everyone drinking or smoking or doing coke. In rehab the person can’t use, so in a safe VR space you are reducing high levels of urge state in a setting where they can’t follow through and the exposure has no negative consequences. You’re behaviorally starting to break the cycle of addiction, similar to what we’ve done in anxiety disorder work. We’re planning to do this in concert with a pharmaceutical approach using [naltrexone], an opiate blocker, which suppresses the brain’s reinforcement schedule. So when you put an addict into a controlled VR environment and they experience an urge state, you’ve got the clinician right there to help them tap into their cognitive behavioral strategies and coping skills. The opiate epidemic is one thing, but I think we can leverage technology to reduce a wide spectrum of addictive behaviors in new ways.

And the next real big thing is to populate these environments with intelligent virtual humans that can serve a lot of roles beyond just being boxed in a virtual environment. They can be vital tools to train doctors on clinical skills by creating virtual patients, serve as healthcare guides, and work in other areas where interacting with a credible virtual human agent will have a dramatic impact on improving health care.


Q. Where are the VR artists coming from – a transition from the entertainment industry, or more organic growth in the medical VR field?

A. There is certainly some crossover from entertainment, but also a ton of interest in our field. VR in entertainment has a lot of energy and enthusiasm, but it’s also a field with heavy competition and wreckage. Now we’re seeing people looking at the evolved scientific literature, recognizing that it’s theoretically sound and pragmatically possible and something that is getting paid a lot of attention. And health care is the second biggest market after entertainment in Goldman Sachs’ report on VR. So maybe medical VR is the way to satisfy the creative urge to do VR, with the potential to make money and do something really good for humanity. Now that’s a compelling vision.

TOP TO BOTTOM: Veteran interacts with SimCoach, a Web-based virtual human that helps overcome barriers to seeking care. Albert “Skip” Rizzo, Ph.D. interacts with a veteran using the Bravemind VR exposure therapy. (Photo credit: Branimir Kvartuc) Web-based virtual humans in SimCoach help soldiers, veterans and their families overcome barriers to seeking care. The “Virtual Interactive Training Agent” is a VR job interview practice system for building competence and reducing anxiety in young adults with Autism Spectrum Disorder.




VR/AR TRENDS

Since its founding by Steven Spielberg in 1994, the USC Shoah Foundation – The Institute for Visual History and Education has captured more than 54,000 audio-visual interviews with survivors and other witnesses of the Holocaust and other genocides. This past April at the Tribeca Film Festival’s Virtual Arcade, the Foundation premiered its groundbreaking virtual reality experience, The Last Goodbye, the first-ever Holocaust survivor testimony in room-scale virtual reality. This lauded project builds upon the Foundation’s pioneering work with interactive holograms, which allowed people to interact with a Holocaust survivor avatar and receive prerecorded answers to their questions.

What are the challenges in using VR to share historical narratives? How can VR create powerful new digital archives? How does this portend the future of Holocaust education?

The creation of The Last Goodbye underscores the need to balance using VR technology simply because it is available against the responsibility of maintaining authenticity and upholding the integrity of an institute charged with preserving historical testimony. It also demonstrates robust collaboration between the creative and VFX team and documentarians in forging new ground to deepen Holocaust education through a powerful photoreal experience.

In the immersive experience, Holocaust survivor Pinchas Gutter takes audiences with him on his final visit to Majdanek Concentration Camp, where his parents and sister were murdered during World War II. Gutter stands inside each vivid environment as he recounts his story of survival and loss – from the concentration camp bunks to the communal showers to the gas chamber where his family was killed and the crematorium in which their bodies were burned. The 360-degree video allows viewers to physically walk around and experience the spaces, lending an even more powerful sense of presence while heightening the emotional impact.
VFX Voice talked with Stephen Smith, Executive Director of the USC Shoah Foundation and Co-producer of The Last Goodbye about the creative process and the power of place.

VR IN HOLOCAUST EDUCATION: BALANCING TECHNOLOGY AND AUTHENTICITY By NAOMI GOLDMAN

All photos by Lauren Carter/USC Shoah Foundation

54 • VFXVOICE.COM FALL 2017


VFX Voice: What was the genesis of The Last Goodbye?

Smith: Gabo Arora (United Nations Senior Creative Advisor and award-winning filmmaker) and I were on the same panel at the 2016 International Documentary Festival in Sheffield, England. He sought me out with an interest in doing work around the Holocaust, and we connected on some core questions: How do you tell a real story in a real place and navigate spatially? How does one experience the narrative of the same person differently when you are in the space they are referring to? Because there is a meaningful relationship between narrative and the power of place in telling a story.

VFX Voice: What were some of the upfront creative decisions that needed to be made?

Smith: First, who do you take and where? We decided not to go to the concentration camp with the highest level of awareness (Auschwitz), but to one that people do not know much about (Majdanek), so that through this vehicle we could bring something new to the field of knowledge and understanding. The team decided to feature Pinchas Gutter, the survivor who appeared as the avatar in the Foundation’s “New Dimension in Testimony” project, and traveled with him to Poland to capture tens of thousands of photos and hours of 3D video on site. Then there was the ethics of the testimony itself: How much is scripted? How much is free form versus deciding in advance and capturing vignettes? What would work with the VR medium? When shooting in a 4K 2D format we could simply let the camera roll, but we realized this medium imposes limits that forced us to come up with something meaningful. We had a hunch that if we created photogrammetry of the rooms and filmed Gutter against a green screen it would work technically, but would we be creating an artificial environment? It hadn’t been done before, so we were going out on a limb. It was our role to ensure that we didn’t cross any ethical lines in creating documented testimony.

VFX Voice: What’s an example of the balancing act you managed between VR technology and authenticity?

Smith: A key element in Gutter’s narrative is his recollection of a cattle wagon and its use in transporting people and personal effects. There was no cattle wagon available in post, but it was a key

OPPOSITE TOP: Stephen Smith, Executive Director of USC Shoah Foundation, leads a location-scout mission with producer Ari Palitz in the Warsaw Old Town during the filming of The Last Goodbye. OPPOSITE BOTTOM: The crew for The Last Goodbye makes preparations to film Holocaust survivor Pinchas Gutter at the museum on the grounds of the Majdanek death camp in Poland, where Gutter was imprisoned and his family murdered during World War II. TOP: Holocaust survivor Pinchas Gutter during filming of The Last Goodbye, the first-ever Holocaust survivor testimony in room-scale virtual reality. BOTTOM: In the 360-degree video, viewers can physically walk around and visit the vivid locations in the death camp that Gutter survived.






element of what happened. So we made the choice to allow him to tell the story, and created a photogrammetric version of a cattle wagon in Los Angeles. We created a version filling the VR wagon with authentic found pieces from the period, but they weren’t his artifacts. He was talking about what the items meant to him, but since they were not his, the applied ethic prevailed and we took them out. That’s the crux for us – just because you can create a visual effect, does it mean that you should? The team worked very collaboratively to find a balance we could all work with, remaining true to Gutter’s story and staying historically grounded.

VFX Voice: What were you hoping to achieve with the project?

Smith: The preservation of history is at the heart of what we do. We came to realize that by doing photogrammetry inside the rooms at the concentration camp, we were creating a historical legacy document in a different medium. We see this as a digital hi-res archive, a way of preserving dynamic testimony and a whole new data set to pass down to the next generation as a point of reference. This use of VR represents a new way of capturing truth for the future. We aren’t afraid to push boundaries, but we carefully consider any innovation so that the awe of technology does not overtake the story. We’re very early in our understanding of what an immersive experience can do as people become voyeurs of others’ painful experiences. And we are cautious that with issues related to human suffering and struggle, these types of exposures can sometimes elicit pity but not compassion and empathy … and I’m not sure a headset creates that connection. But I believe the potential for education and impact is there, as long as it’s contextualized and avoids the specter of technical gimmickry.

“How do you tell a real story in a real place and navigate spatially? … There is a meaningful relationship between narrative and the power of place in telling a story.” —Stephen Smith, Executive Director of the USC Shoah Foundation/Co-producer of The Last Goodbye

TOP: Stephen Smith, Executive Director of USC Shoah Foundation, works with Holocaust survivor Roman Kent, the narrator of the VR film Lala, during the filming. BOTTOM: Holocaust survivor Roman Kent narrates his story for Lala.


VFX Voice: Can audiences experience The Last Goodbye? What’s next in the realm of VR-based education?

Smith: We are being thoughtful in developing the optimal venues to share The Last Goodbye. Since we made decisions about doing real-scale photogrammetry, it is quite difficult to place in multiple locations. For Tribeca, we had a designer create an experience, including a trained actor, to take people from the film festival floor to a curated space to experience the piece, and then to a debrief afterwards. We believe there is a contextualized learning experience bigger than the VR piece in and of itself, and that people should experience it in a safe and supportive environment. So we are working with Holocaust museums around the world to create installations in that contextualized environment. And we will likely present more than one experience at a time for the public.

[Author’s note: USC Shoah Foundation is also showcasing another VR project, Lala, an animated and live-action retelling of a story told by Holocaust survivor Roman Kent. The film is narrated by Kent himself, with the story of Lala (the family dog) illustrated through animation. Lala can be viewed on a smartphone with a cardboard VR viewer, on a smartphone or mobile device on its own, or on a computer screen through YouTube, which allows viewers to click and drag the video around with a cursor to view the film from all angles. Developed in partnership with Discovery Communications, Discovery Education and Global Nomads Group, Lala is part of IWitness360, a new space on IWitness for virtual reality films and supporting educational resources that made its debut at the International Society for Technology in Education conference in San Antonio this summer.]

Smith: Lala is a short animated piece for children aged 5 to 10. It’s already breaking boundaries with young children as the target audience, and we’re thinking about the best ways to distribute it. We are trying to stay true to that ethical through line, with the survivor serving as the narrator and appearing in live action at the beginning, middle and end of the animated piece. It’s tightly scripted, but the words are all from his testimony, which is essential to staying true to our mission.

“The preservation of history is at the heart of what we do. We came to realize that by doing photogrammetry inside of the rooms at the concentration camp, we were creating a historical legacy document in a different medium. We see this as a digital hi-res archive, a way of preserving dynamic testimony and a whole new data set to pass down to the next generation as a point of reference.” —Stephen Smith, Executive Director of the USC Shoah Foundation/Co-producer of The Last Goodbye

VFX Voice: And what has your experience been like working with the VFX team?

TOP: Holocaust survivor Pinchas Gutter and USC Shoah Foundation Executive Director Stephen Smith (both facing the camera) conversing with filmmaker Gabo Arora (back to the camera) at the site of the sorting grounds at the Majdanek death camp.

Smith: The Last Goodbye was a wonderful collaboration between USC Shoah Foundation and a tremendously talented and committed team – filmmakers Gabo Arora and Ari Palitz, Here Be Dragons, MPC VR and Otoy, in partnership with LightShed. The goodwill and genuine enthusiasm in the visual effects community has been outstanding. We’ve enjoyed great partnership on our path to harness VR to create poignant and accessible testaments to hope and survival.





COMPANY PROFILE

ILM: 40 YEARS OF MAKING MEMORABLE IMAGES FOR GREAT STORIES By BARBARA ROBERTSON

TOP to BOTTOM: Deepwater Horizon (Photo copyright © 2016 Lions Gate Entertainment Inc. All Rights Reserved.)


The venerable visual effects studio Industrial Light & Magic began 2017, its 41st year, with its artists receiving 16 VES Awards nominations, three VES Awards, four Annie Award nominations, two BAFTA nominations, and three Oscar® nominations for work accomplished last year. The nominations and awards include those for best achievement in visual effects, outstanding effects simulations, compositing, animated effects, character animation, environments, virtual cinematography, modeling – just about every area in visual effects for which there is an award. In addition, four developers at ILM received an Academy Technical Achievement Award for the studio’s facial performance-capture solving system.

So what’s ILM up to now? “The short answer is that we’re doing the same thing we’ve been doing for 40 years,” says John Knoll, Chief Creative Officer and Visual Effects Supervisor. “Trying to generate and make startling and memorable imagery for clients. Trying to help create great stories.

“All the shows we’re working on have something interesting,” he continues. “That’s how we choose these projects. We ask what potential they have for being the source of striking and memorable imagery, and what can they do to help drive the technology forward. There’s always something creatively cool about the projects we pursue.”

Already this year, the studio has worked on eight films, including Transformers: The Last Knight, The Mummy, Kong: Skull Island, Valerian and the City of a Thousand Planets, Pirates of the Caribbean: Dead Men Tell No Tales, Life and Spider-Man: Homecoming. Currently in production are visual effects for another 14 films: Star Wars: The Last Jedi, Thor: Ragnarok, Mother!, Downsizing and Only the Brave for release this year. Those underway for 2018 include Jurassic World: Fallen Kingdom, Avengers: Infinity War, Black Panther, Ready Player One, Aquaman, A Wrinkle in Time, Monster Hunt 2, Cloverfield Movie, and an untitled “Han Solo Star Wars Anthology Film.”

How many films can ILM handle at one time? “Rather than the number of shows, we think about the number of shots we’re doing,” Knoll says. “It’s all about managing capacity. We could have fewer shows with a large number of shots. I did an estimate probably two years ago and came up with a number – we are probably something like a 4,000 to 6,000 shot facility. ILM type of shots.”

DIVIDE AND CONQUER

Working on these shots are approximately 1,700 artists and support staff spread nearly equally among four studios: the home base in San Francisco, plus Vancouver, London and Singapore. “We have talented supervisors and creative directors in all four studios,” Knoll says. “In the same way I keep an eye on San Francisco, Eric Barba does that in Vancouver, Nigel Sumner in Singapore, and Ben Morris and David Vickery in London.” Any combination of studios might work on any film – one, two, three or all four – and in addition to work centered in San Francisco, the other three studios often act as the hub for one or more films. For example, Vancouver hubbed Valerian and is the hub for Only the Brave, scheduled for release later this year. London has Star Wars: The Last Jedi (Episode VIII), which will release in December, and Jurassic World: Fallen Kingdom, scheduled for 2018. Singapore is hubbing a regional production, Monster Hunt 2, directed by Raman Hui of Shrek fame.

Each of the four studios has artists on staff working in ILM’s global art department to provide concept art and design for films with ILM effects and for other films. And ILM is moving technical artists – R&D engineers – into all the studios as well. “For a long time, the R&D was all in San Francisco,” Knoll says. “But there are good reasons to have engineering capacity in all the studios. R&D is easier to manage if we can say particular aspects are concentrated in different studios.” Knoll estimates that a group of about 40 people do R&D, but notes the number is difficult to pin down because artists on production teams also do R&D and vice versa. The goal is two-fold:

GLOBAL ILM

SINGAPORE

Lucasfilm opened its first international studio in Singapore in 2006, with 15 artists working in three branches: animation, LucasArts and ILM. Now solely an ILM studio, approximately 300 artists – 400 people total – work in ILM’s Sandcrawler building in Singapore’s Fusionopolis. Nigel Sumner, Creative Director, who has traveled back and forth between Singapore and San Francisco since 2007, manages the group. “For me, the great strength of the global ILM is that we can move shots freely between the studios, which means shots can be completed at any one of our four locations,” he says. “And our geographical location allows us to reach into Asia to make connections with regional filmmakers.” In addition to working with the other studios, ILM Singapore hubbed the Chinese co-production The Great Wall, and is currently hubbing the regional production Monster Hunt 2.

VANCOUVER

ILM first opened a studio in Vancouver in 2013 and within a year had expanded into a 30,000-square-foot space in Gastown. In 2017, the company opened a second studio, expanding into another 60,000 square feet. Today, approximately 425 people work at ILM Vancouver. “A lot of our technology is San Francisco based,” says Creative Director Eric Barba. “We remotely connect to the machines in San Francisco.” Although bidding for projects also goes through San Francisco, the goal is to have studios and directors come directly to Vancouver for their projects. And that has begun. “Valerian was the first show we hubbed,” Barba says, “that we had full creative ownership of. We’re doing all the work for Only the Brave in Vancouver, and we have more in the works. We’re building a team of great visual effects supervisors.

“Of course, we also have teams running with a good amount of shot work – as spokes on shows that hub elsewhere,” he adds. “ILM is much more collaborative than what I’ve been used to. It’s a different culture. I’m in constant communication with the creative directors in each facility. I can walk through shots with Dennis Muren – and the other supervisors. That’s one of the big positive things about ILM.”

LONDON

ILM opened its London studio in 2014 with Creative Director Ben Morris. The location is perfect for ILM’s work on Star Wars, which is filmed in part at nearby Pinewood Studios. ILM London is currently the hub for Star Wars: The Last Jedi. “We’re also hubbing Jurassic World and working on Ready Player One, the Han Solo spin-off, and the next Avengers movie,” says Executive-in-Charge Sue Lyster. Currently, approximately 500 people work in three buildings in central London, but work has started on a new building in Holborn – also central London – to house everyone. London ILM’ers share the same pipeline as those in the other three studios. “There is an enormous depth of experienced talent in London, and I think we have attracted some of the best into our studio,” Lyster says. “We also have a graduate program to bring the best emerging talent into the studio.” With several other award-winning studios already located in London, why would the local talent choose ILM? “I think people join ILM to be part of ILM,” Lyster says, “to join the legacy of a great and long-standing company. ILM has a reputation, which is what brought me here, for treating people well and respecting people. There is an ethos of sharing, of reaching out for an opinion, and what’s happening is that people don’t only reach to San Francisco for help, opinions and feedback, they also reach to the other three studios. We’ve all gotten a little bit used to the time zones.”



of shot work – as spokes on shows that hub elsewhere,” he adds. “ILM is much more collaborative than what I’ve been used to. It’s a different culture. I’m in constant communication with the creative directors in each facility. I can walk through shots with Dennis Muren – and the other supervisors. That’s one of the big positive things about ILM.” LONDON ILM opened its London studio in 2014 with Creative Director Ben Morris. The location is perfect for ILM’s work on Star Wars, which is filmed in part at nearby Pinewood Studios. ILM London is currently the hub for Star Wars, The Last Jedi. “We’re also hubbing Jurassic World and working on Ready Player One, the Han Solo spin-off, and the next Avengers movie, says Executive-in-Charge Sue Lyster. Currently approximately 500 people work in three buildings in central London, but work has started on a new building in Holborn – also central London – to house everyone. London ILM’ers share the same pipeline as those in the other three studios. “There is an enormous depth of experienced talent in London, and I think we have attracted some of the best into our studio,” Lyster says. “We also have a graduate program to bring the best emerging talent into the studio.” With several other award-winning studios already located in London, why would the local talent choose ILM? “I think people join ILM to be part of ILM,” Lyster says, “to join the legacy of a great and long-standing company. ILM has a reputation, which is what brought me here, for treating people well and respecting people. There is an ethos of sharing, of reaching out for an opinion, and what’s happening is that people don’t only reach to San Francisco for help, opinions and feedback, they also reach to the other three studios. We’ve all gotten a little bit used to the time zones.”

December, and Jurassic World: Fallen Kingdom, scheduled for 2018. Singapore is hubbing a regional production, Monster Hunt 2, directed by Raman Hui of Shrek fame. Each of the four studios has artists on staff working in ILM’s global art department to provide concept art and design for films with ILM effects and for other films. And, ILM is moving technical artists – R&D engineers – into all the studios, as well. “For a long time, the R&D was all in San Francisco,” Knoll says. “But there are good reasons to have engineering capacity in all the studios. R&D is easier to manage if we can say particular aspects are concentrated in different studios.” Knoll estimates that a group of about 40 people do R&D, but notes the number is difficult to pin down because artists on production teams also do R&D and vice versa. The goal is two-fold:

FALL 2017 VFXVOICE.COM • 59



COMPANY PROFILE

ILM’S GLOBAL ART DEPARTMENT

ILM’s art department traces back to the first Star Wars and artists Joe Johnston and Ralph McQuarrie. “It’s different now that we use a lot of 3D tools,” says Jennifer Coronado, Senior Manager of ILM’s Art Department. “You don’t need orthos if you can do a sculpt and spin it around, but other than that, it’s still very much the same. We’ve grown and taken on more work, but even though we’re in four countries now, we try to make sure the team feels like a team.” Twenty-two artists work in San Francisco, 14 in London, three “and growing” in Vancouver and two in Singapore. In addition to work on films, the artists have done videogame design, marketing materials, and even provided Star Wars-based designs for prosthetics for children. “We pursue our own work,” Coronado says. “For film, we can take a show from blue sky to pragmatic builds and if the film gets awarded, continue through production. We’d love to work on the visual effects for every film we design, but sometimes that doesn’t work out.” Recently, to find new artists, ILM held a Star Wars-based competition. Three thousand people entered. After phases one and two – ship and character design – only 250 were left. After phase three, only eight. “People post great concept art on the web, but taking that to a final design is different,” Coronado says. “Phase three was most important. We had them create key frames for a fake movie with fake characters and a fake brief. We gave people two weeks and in the middle, changed it on them completely. Because that’s what a director would do.” They hired the third-place winner, who now works as an art director in Vancouver. “Talented art directors are a finite resource,” Coronado says. “It’s amazing what these folks do, the way their brains work. They’re not just executing. They’re inventing under pressure. I applaud them every day.”

TOP to BOTTOM: Kong: Skull Island (Photo copyright © 2017 Warner Bros. Entertainment Inc. All Rights Reserved.)


improving day-to-day efficiency and creating new tools for the artists. “I think there’s plenty of room for improving the number of man-hours it takes to do a task, to make the workflow smoother and slicker so we can produce the same quality with fewer man-hours,” he says. That’s true inside the facilities and out as well. Lucasfilm and ILM recently announced an open-source release of the MaterialX library, developed by Lucasfilm’s Advanced Development Group and ILM engineers. MaterialX, for which both Autodesk and Foundry have voiced support, facilitates the transfer of rich materials and look-development content between applications and renderers. “But equally important is focusing on enabling technology,” Knoll says, “on tools allowing us to do things we couldn’t before.”
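The interchange idea behind MaterialX is that a material is described as a renderer-agnostic XML document rather than as settings locked inside one application. As a rough sketch only – built with Python’s standard library, not the official MaterialX API, and with element names that merely approximate the real schema – a portable material description might be assembled like this:

```python
# Toy illustration of an XML-based material interchange document.
# This is NOT the official MaterialX API; element and attribute names
# here only gesture at the format's structure.
import xml.etree.ElementTree as ET

def build_toy_material(name, base_color):
    """Serialize a minimal, renderer-agnostic material description."""
    root = ET.Element("materialx", version="1.36")
    shader = ET.SubElement(root, "standard_surface",
                           name=name, type="surfaceshader")
    # A single input: the shader's base color, stored as plain text so any
    # application that understands the schema can rebuild the look.
    ET.SubElement(shader, "input", name="base_color", type="color3",
                  value=",".join(str(c) for c in base_color))
    return ET.tostring(root, encoding="unicode")

doc = build_toy_material("hero_metal", (0.8, 0.7, 0.5))
print(doc)
```

Because the document is plain XML, the same description can be read back by any tool that parses the schema, which is the portability MaterialX aims for between applications and renderers.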

“Rather than wait for other people to figure out what the rules [of VR storytelling] are, we’re learning them ourselves. We have people very excited about the possibilities of VR who are driving that forward... .”
—John Knoll, Chief Creative Officer & Visual Effects Supervisor, ILM

ILM’s long list of Technical Academy Awards attests to that effort. Today, Knoll cites tools for facial animation, for better compositing and for effects simulation as ones he’s looking at in particular. “There are always more projects you want to do than you have engineering man-hours to do,” Knoll says. When Disney bought Lucasfilm, ILM gained some partners in that effort. Helping the studio expand its R&D talent pool are Disney’s Pixar Animation and Walt Disney Animation. “We have a commonality of mission,” Knoll says. “Each year we get together at DISGRAPH where we share technical development and general information about how we work in a way that can be somewhat more open than we can be at SIGGRAPH. We can be explicit. We share code. If we have something one of the other studios likes, they can have it and vice versa.” ILM developers can also tap into work being carried on in the labs at Disney Research. “We meet with them every week,” Knoll says. “They’re interested in making sure that what they do has application, and we suggest things we would like to see. We’re in constant communication about the state of development of some of their tools. They have all kinds of really cool things we can apply directly.”

IMMERSIVE ENTERTAINMENT

All these departments plus Lucasfilm’s Skywalker Sound feed talent and expertise into the studio’s ILMxLab in San Francisco where visual artists, sound artists, researchers and interactive storytellers create immersive entertainment. Knoll is intrigued by VR and AR, but personally hasn’t jumped

TOP to BOTTOM: Rogue One: A Star Wars Story (Photo copyright © 2016 Industrial Light & Magic, a division of Lucasfilm Entertainment Company Ltd. All Rights Reserved.)

ILMxLAB

Launched in June 2015, ILMxLAB now boasts a team of approximately 80 people creating immersive entertainment experiences. “We develop, produce and release real-time content,” says Vicki Beck, Executive-in-Charge. “We have about 25 people doing R&D on real-time graphics and another 50 or so doing the actual production and development. We had been doing some pioneering R&D work in this area for a number of reasons. We realized we had a unique opportunity to leverage all that ILM and Lucasfilm had to offer and bring it together in ILMxLAB. We have world-class production at ILM and [nearby] Skywalker Sound, a technical foundation and master storytellers.” ILM based the group in San Francisco in part due to the proximity to Silicon Valley. “We have strategic relationships with a lot of the big players who are pioneering in this space,” Beck says. One of the highest-profile experiences created at ILMxLAB is Iñárritu’s ‘Carne y Arena,’ featured at Cannes and the Fondazione Prada in Milan and now at the LA County Museum of Art. “It uses VR technology to tell a story that could not be told in any other way,” Beck says. “It’s extremely powerful storytelling. No one has used VR in quite this way before.” The crew has created an experience for the recent Transformers film and has other projects in production. Some are obvious candidates. “You can assume we’re working on many Star Wars-related projects,” Beck says. Others, like ‘Carne y Arena,’ are likely to come from outside Disney. “We anticipate working concurrently on at least two projects at a time,” she says.

TOP and MIDDLE: The Great Wall (Photo copyright © 2016 Industrial Light & Magic, a division of Lucasfilm Entertainment Company Ltd., All Rights Reserved.) BOTTOM: The Mummy (Photo copyright © 2017 Universal Studios. All Rights Reserved.)

feet first into the virtual world. “I went to the installation for the border-crossing project we did and I think it’s really good, one of the best uses of VR I’ve seen yet,” he says, referring to Iñárritu’s ‘Carne y Arena’ hybrid of art exhibition, virtual reality simulation, and historical re-enactment that sends participants on a terrifying run across the US-Mexico border. The project debuted at the Prada Foundation in Milan in June and is now at the LA County Museum of Art. “I’ve got 37 years of experience doing storytelling where we control what you’re looking at, and where we can compress time through edits and so forth,” Knoll says. “So I don’t know exactly how storytelling works when you subtract those things, you can look anywhere and navigate anywhere, and where you can’t hide time jumps in a cut, or maybe you can. Rather than wait for other people to figure out what the rules are, we’re learning them ourselves. We have people very excited about the possibilities of VR who are driving that forward, and I’m curious. But I’m staying more focused on traditional ILM. That’s an important business to us.”

“We have a commonality of mission. Each year we get together at DISGRAPH where we share technical development and general information about how we work in a way that can be somewhat more open than we can be at SIGGRAPH. We can be explicit. We share code. If we have something one of the other studios likes, they can have it and vice versa.”
—John Knoll, Chief Creative Officer & Visual Effects Supervisor, ILM

GROWING THE BUSINESS

In the past, ILM tended to look primarily for senior talent to help the studio grow and change, and the studio’s recruiters still do, but they now also consider more junior artists. All four studios are hiring. “We’re always looking for talented artists and engineers, who I think of as artists as well,” Knoll says. “But sometimes you just cannot find experienced people in some disciplines. You just have to make them. We’ve experimented with hiring junior people right out of college and taking them through a training period, and it was successful. Some of that is just a way of thinking and problem solving. You don’t have to have a lot of experience to be a good problem solver. You just have to have the potential. What we need is talent.” From a small company in San Rafael, ILM has grown into an enviable global enterprise with 1,700 employees and walls filled with awards for creative and technical accomplishments over the years. “Physically, we’ve changed pretty dramatically,” Knoll says. “And the industry has matured. But our general philosophy of always trying to question the status quo and ask if there is a better way hasn’t changed. The collegial spirit of openness and sharing and helping each other is the same.”

TOP to BOTTOM: Doctor Strange (Photo copyright © 2016 Marvel Studios. All Rights Reserved.)


VES 20TH ANNIVERSARY

A TRIP TO THE MOON: A BLOCKBUSTER RESTORED By PAULA PARISI

BOTTOM: Decomposed frames from A Trip to the Moon.

A blockbuster of its time, Georges Méliès’s 1902 film A Trip to the Moon is considered the first visual effects film, setting the template for the technical manipulation of images to create onscreen magic. The tale of a group of astronomers fired to the lunar surface inside a projectile shot from a huge artillery cannon inspired generations of filmmakers as well as the VES, which took as its logo the signature image of the bullet-like craft poking the man in the moon’s eye. Among the VFX techniques pioneered by the French auteur were substitution splices, multiple exposures, forced perspective, time-lapse photography and dissolves. He employed all manner of practical effects – explosions and trap doors – and was one of the first to use storyboards. Although Méliès made more than 500 films between 1896 and 1912, competing with Edison and the Lumière brothers through the tumultuous birth of cinema, A Trip to the Moon remains the work for which he is best remembered. While it has screened continuously since its release, the color version of the print was believed to have been lost. The discovery in 1993 of a badly deteriorated color print that had been donated to the Filmoteca de Catalunya in Barcelona led to a meticulous restoration made possible by Technicolor.

TOP LEFT: The iconic shot of the man in the moon with a rocket in his eye is one of the best-known images in film history.
TOP RIGHT: An example of the poor condition the print of Le Voyage dans la Lune (A Trip to the Moon) was in. More than 13,000 frames needed to be resurrected; the frames were in an advanced state of decomposition and many were stuck together.


“In the history of restoration, I’ve never seen such a crazy project,” says Séverine Wemaere, who was running the now-defunct Technicolor Film Foundation, which partnered in the restoration with Lobster Films, whose principals – Serge Bromberg and Eric Lange – were the private collectors who found and acquired the tattered treasure. “It was 13,375 frames that were completely fragmented and we had to put together like a huge puzzle,” Wemaere adds. This was done in Los Angeles by Tom Burton, who headed up Technicolor Creative Services at the time of the restoration, and his team. It would take 20 years and $1 million from the discovery of the ruined masterpiece until its reappearance, in all its original splendor, before an appreciative audience on the opening night of the 2011 Festival de Cannes. The film then toured the world, exhibited at museums and film festivals, often as a double bill with The Extraordinary Voyage, the restoration documentary directed by Bromberg and Lange. “A rocket in the eye of moon, what a bizarre idea!” Bromberg says early in the film, and the notion of ever being able to repair Méliès’s lost masterpiece must have seemed no less strange. Contained on a single reel, the 15 minutes of footage had melted and congealed into what was described as “a large hockey puck.” The sticky mass owed something to the fact that in Méliès’s day, black-and-white films were hand-painted using dyes mixed with glue. The hand-coloring process, reserved for A-list movies, was undertaken by assembly lines of up to 200 women. Separating the rigid, compact mass was arduous and took team Lobster many years. Although they were initially told the print was irretrievable, their experiments prying it apart led to the discovery that only the edges were stuck; the central frames, though scratched and dirty, were relatively intact. Progress was measured in centimeters and took weeks.
Once unwound, the brittle reel was sent to Haghefilm Digitaal in the Netherlands, where it was treated with chemicals to restore pliability. After several months of effort, nearly a third of the film was saved onto an internegative. The portion that could not be salvaged uniformly was returned to Lobster in shards – some 10,000 pieces that would require the patience of a jigsaw-puzzle specialist to recombine. In early 2000, Bromberg and Lange got their hands on a new three-million-pixel digital camera, which they used to capture the fragmented images – work that kept them busy through 2005. When finished, they found themselves with a disjointed masterpiece on a hard drive, but were forced to wait for digital restoration

TOP LEFT: Tom Burton, Executive Director of Technicolor Restoration Services in Hollywood, helmed the restoration of A Trip to the Moon at the Technicolor laboratories in Los Angeles. (Photo: Ahmad Ouri)
TOP RIGHT: Technicians labor lovingly during restoration of A Trip to the Moon at Technicolor labs in Los Angeles.

techniques to advance to a level suited to the sensitive task. It was in 2010 that the heavy lifting began for Burton and the group at Technicolor Creative Services. As it happened, at the time of the restoration Technicolor was also working on Martin Scorsese’s 2012 film Hugo, about the life of Méliès, and the director wound up incorporating the footage into his project, which received 11 Academy Award nominations and earned a Visual Effects trophy for Rob Legato. That happy accident was a long way from Burton’s mind when he began the restoration. “The files we received were in various formats – TIFF, TGA, JPG – and different resolutions. Some were captured via digital camera, frame by frame, and some were from the digital scanner,” Burton recalls. “There was no way to play back a continuous image, nor was the data sequentially organized.” Using an HDCAM telecine of an existing B&W print for reference, the team carefully mapped the individual frames and image shards to their proper locations. From then on, it became a process of iteration. “Playing back this timeline of newly assembled color-tinted images that no one had had the opportunity to view for many decades – possibly a century or more – was very exciting,” Burton says. Ensuing steps included image stabilization – more precisely aligning the shattered fragments – “de-flickering,” which conformed frame-to-frame densities, and a color pre-grade. With this baseline look established, Technicolor set about recreating the missing color shards and, in some cases, frames; about 10% of the film had simply crumbled to dust. Sampling the original color palette, Burton’s crew copied data from the black-and-white print, carefully preserving the anomalies of the hand-tinting, which included smudging and other quirks. The reconstruction utilized VFX tools including Digital Vision Phoenix/DVO, MTI and After Effects.
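The “de-flickering” step – conforming frame-to-frame densities – has a simple general form, even though the production used dedicated tools. As a generic sketch only (an illustration of the idea, not the Digital Vision or MTI software Technicolor actually employed), each frame’s overall brightness can be pulled toward a moving-average target so exposure jitter disappears while real lighting changes survive:

```python
# Generic de-flicker sketch: normalize each frame's mean brightness
# toward a smoothed (moving-average) brightness curve.
import numpy as np

def deflicker(frames, window=5):
    """frames: list of 2D float arrays. Returns brightness-conformed copies."""
    means = np.array([f.mean() for f in frames])
    kernel = np.ones(window) / window
    # Moving-average target brightness for every frame.
    target = np.convolve(means, kernel, mode="same")
    # At the edges, 'same'-mode convolution averages over fewer real
    # samples, so renormalize by the effective sample count.
    counts = np.convolve(np.ones_like(means), kernel, mode="same")
    target = target / counts
    # Scale each frame so its mean lands on the smoothed curve.
    return [f * (t / m) for f, m, t in zip(frames, means, target)]

# Example: frames whose brightness alternates 1.0 / 2.0 come out smoothed.
flickery = [np.full((8, 8), v) for v in (1.0, 2.0, 1.0, 2.0, 1.0, 2.0)]
steady = deflicker(flickery)
```

A multiplicative scale is used rather than an additive offset so that black stays black, which matches how density corrections behave on film material.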
“It’s really more a visual effects project in a way than a restoration project,” Burton told Smithsonian magazine in 2011. “A lot of the technology that we used to rebuild these frames is the technology that you would use if you were making a first-run, major visual effects motion picture. You would never have been able to pull this off 10 years ago, and certainly not at all with analog, photochemical technology.”

FALL 2017 VFXVOICE.COM • 65

9/14/17 3:42 PM


VES 20TH ANNIVERSARY

A TRIP TO THE MOON: A BLOCKBUSTER RESTORED By PAULA PARISI

BOTTOM: Decomposed frames from A Trip to the Moon.

A blockbuster of its time, George Méliès’s 1902 film A Trip to the Moon is considered the first visual effects film, setting the template for the technical manipulation of images to create onscreen magic. The tale of a group of astronomers fired inside a projectile from a huge artillery canon to the lunar surface inspired generations of filmmakers as well as the VES, which took as its logo the signature image of the bullet-like craft poking the man in the moon’s eye. Among the VFX techniques pioneered by the French auteur were substitution splices, multiple exposures, forced-frame perspective, time-lapse photography and dissolves. He employed all manner of practical effects – explosions and trap doors – and was one of the first to use storyboards. Although Méliès made more than 500 films between 1896 and 1912, competing with Edison and the Lumière brothers through the tumultuous birth of cinema, A Trip to the Moon remains the work for which he is best remembered. While it has screened continuously since its release, the color version of the print was believed to have been lost. The discovery in 1993 of a badly deteriorated color print that had been donated to the Filmoteca de Catalunya in Barcelona led to a meticulous restoration made possible by Technicolor. TOP LEFT: The iconic shot of the man in the moon with a rocket in his eye is one of the best-known images in film history. TOP RIGHT: An example of what poor condition the print for Le voyage dans le lune (A Trip to the Moon) was in. More than 13,000 frames needed to be resurrected. This is an example of what frame technicians had to work with. The frames were in an advanced state of decomposition while many were stuck together.

64 • VFXVOICE.COM FALL 2017

PG 64-65 VES 20th MOON.indd All Pages

“In the history of restoration, I’ve never seen such a crazy project,” says Séverine Wemaere, who was running the now-defunct Technicolor Film Foundation that partnered in the restoration with Lobster Films, whose principals – Serge Bromberg and Eric Lange – were the private collectors who found and acquired the tattered treasure. “It was 13,375 frames that were completely fragmented and we had to put together like a huge puzzle,” Wemaere adds. This was done in Los Angeles by Tom Burton, who headed up Technicolor Creative Services at the time of the restoration, and his team. It would take 20 years and $1 million from the time of the ruined masterpiece’s discovery until its reappearance, in all its original splendor, to an appreciative audience at the opening night of the 2011 Festival de Cannes. The film then toured the world, exhibited at museums and film festivals, often as a double bill with The Extraordinary Voyage, the restoration documentary directed by Bromberg and Lange. “A rocket in the eye of moon, what a bizarre idea!” Bromberg says early on in the film, while the notion of ever being able to repair Méliès’s lost masterpiece must have seemed no less strange. Contained in a single reel, the 15 minutes of footage had melted and congealed into what was described as “a large hockey puck.” The sticky configuration was helped along by the fact that in Méliès’s day, black-and-white films were hand-painted using dyes that were mixed with glue. The process, which was reserved for A-list movies, was undertaken by assembly lines of up to 200 women. Separating the rigid, compact mass was arduous, and took team Lobster many years. Although they were initially told the print was irretrievable, their experiments prying it apart led to the discovery that only the edges were stuck; the central frames, though scratched and dirty, were relatively intact. Progress was by the centimeter and took weeks.
Once unwound, the brittle reel was sent to Haghefilm Digitaal in the Netherlands, where it was treated with chemicals to restore pliability. After several months of effort, nearly a third of the film was saved onto an internegative. The part that could not be uniformly salvaged was returned to Lobster in shards – some 10,000 pieces, which would require the patience of a jigsaw-puzzle specialist to recombine. In early 2000, Bromberg and Lange got their hands on a new three-million-pixel digital camera, which they used to capture the fragmented images, work that kept them busy through 2005. When finished, they found themselves with a fragmented masterpiece on a hard drive, but were forced to wait for digital restoration

TOP LEFT: Tom Burton, Executive Director of Technicolor Restoration Services in Hollywood, helmed the restoration of A Trip to the Moon at the Technicolor laboratories in Los Angeles. (Photo: Ahmad Ouri) TOP RIGHT: Technicians labor lovingly during restoration of A Trip to the Moon at Technicolor labs in Los Angeles.

techniques to advance to a level suited to the sensitive task. It was in 2010 that the heavy lifting began for Burton and the group at Technicolor Creative Services. As it happened, at the time of the restoration, Technicolor was also working on Martin Scorsese’s 2012 film, Hugo, about the life of Méliès, and the director wound up incorporating the footage into his project, which received 11 Academy Award nominations, earning a Visual Effects trophy for Rob Legato. That happy accident was a long way from Burton’s mind when he began the restoration. “The files we received were in various formats – TIFF, TGA, JPG – and different resolutions. Some were captured via digital camera, frame by frame, and some were from the digital scanner,” Burton recalls. “There was no way to play back a continuous image, nor was the data sequentially organized.” Using for reference an HDCAM telecine of an existing B&W print, the individual frames and image shards were carefully mapped to their proper locations. From then on, it became a process of iteration. “Playing back this timeline of newly assembled color-tinted images that no one had had the opportunity to view for many decades – possibly a century or more – was very exciting,” Burton says. Ensuing steps included image stabilization – more precisely adjusting the shattered fragments – “de-flickering,” which conformed frame-to-frame densities, and a color pre-grade. With this baseline look established, Technicolor set about recreating the missing color shards and, in some cases, entire frames; about 10% of the film had simply crumbled to dust. Carefully sampling the original color palette, Burton’s crew copied data from the black-and-white print, preserving the anomalies of the hand-tinting, which included smudging and other quirks. The reconstruction utilized VFX tools including Digital Vision Phoenix/DVO, MTI and After Effects.
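The “de-flickering” step Burton describes – conforming frame-to-frame densities – can be illustrated with a toy sketch. This is not Technicolor’s actual pipeline (which used dedicated tools like Digital Vision Phoenix/DVO); it is a minimal illustration of the underlying idea, with the function name and parameters invented for the example: each frame’s brightness statistics are matched to a temporally smoothed target, so that overall density no longer jumps from one frame to the next.

```python
import numpy as np

def deflicker(frames, window=5):
    """Toy density conform: match each frame's mean/std of luminance
    to the average statistics of its temporal neighborhood."""
    frames = np.asarray(frames, dtype=np.float64)  # shape (n, height, width)
    means = frames.mean(axis=(1, 2))               # per-frame brightness
    stds = frames.std(axis=(1, 2))                 # per-frame contrast
    out = np.empty_like(frames)
    n = len(frames)
    for i in range(n):
        # Target statistics: average over a window of neighboring frames
        lo, hi = max(0, i - window), min(n, i + window + 1)
        target_mean = means[lo:hi].mean()
        target_std = stds[lo:hi].mean()
        # Normalize the frame to zero mean/unit std, then rescale to targets
        norm = (frames[i] - means[i]) / (stds[i] + 1e-8)
        out[i] = norm * target_std + target_mean
    return out
```

Real restoration tools work with far more sophisticated spatially local models, but the principle – pulling each frame’s density toward a temporally smoothed reference – is the same.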
“It’s really more a visual effects project in a way than a restoration project,” Burton told Smithsonian magazine in 2011. “A lot of the technology that we used to rebuild these frames is the technology that you would use if you were making a first-run, major visual effects motion picture. You would never have been able to pull this off 10 years ago, and certainly not at all with analog, photochemical technology.”



VES 20TH ANNIVERSARY

A POWERFUL LEGACY: VISUAL EFFECTS SOCIETY CELEBRATES ITS 20TH ANNIVERSARY By NAOMI GOLDMAN

TOP: VES Lifetime Achievement Award Winners Frank Marshall and Kathleen Kennedy celebrate with director David Fincher at the 7th Annual VES Awards in 2009. BOTTOM: Director Steven Spielberg at the 6th Annual VES Awards. He was given a VES Lifetime Achievement Award in 2008.

From its auspicious beginnings with a few hundred members in Los Angeles to a flourishing global honorary society with a presence in 35 countries, the Visual Effects Society is proud to celebrate its milestone 20th Anniversary and the exemplary VFX community whose contributions have made the Society what it is today. As the entertainment industry’s only organization representing the full breadth of visual effects practitioners, the VES has built a rich legacy by advancing the arts, science and application of visual effects, improving the welfare of its members, celebrating VFX excellence and serving as a resource to the ever-changing global marketplace. At its core are a number of essential elements:

• The VES membership, now 3,500+ strong, includes experts in all areas of entertainment from film, TV, commercials and animation, to games, theme parks, VR/AR and new media.

• Dynamic volunteer leadership from all corners of the globe and a dedicated staff who bring their enthusiasm, commitment and deep expertise to nurture and grow the Society.

• A high-impact roster of educational programs, panels, screenings, networking and signature events hosted all over the world.

• And a truly global mindset that drives the organization, focused on the needs and interests that unify the rapidly expanding worldwide VFX community.

A LEGACY IS BORN: HONORING OUR ROOTS AND WINGS

The VES origin story is one of community building and creating stronger appreciation and respect for the artists and innovators doing remarkable work, which today drives box office like never before. “We were an all-volunteer organization at the beginning with a good cross-section of representatives and a key rabble-rouser as our Founding Board Chair – me,” says Jim Morris, VES, Founding VES Board Chair and President, Pixar Animation Studios. “We knew the VFX community needed better recognition and were willing to be experimental to achieve that through our programs and initiatives.” “The early VES Board meetings were filled with legendary movers and shakers – [Dennis] Muren, [Jim] Morris, [Phil] Tippett, [Harrison] Ellenshaw, [Jonathan] Erland,” says Jeffrey A. Okun, VES, former VES Board Chair and Los Angeles Section Chair. “They chose to create an apolitical organization and keep it about the artists, honor and education.” “Because we loved doing the work and liked each other, that led to greater camaraderie and collaboration as practitioners across company and individual lines,” adds Morris.

WE CAN BE HEROES: THE VES AWARDS

Under the leadership of founding Executive Director Tom Atkin, Founding Chair Jim Morris and then-Board member Jeffrey A. Okun, the Annual VES Awards was the first large-scale undertaking the VES tackled, with the hope that it would have a meaningful impact on the entertainment industry. Now in its 16th year, the VES Awards show is recognized as a world-class

66 • VFXVOICE.COM FALL 2017

PG 66-73 VES HISTORY.indd 66-67

event that fulfills the Society’s promise to celebrate extraordinary artistry and the artists who create it. The VES Awards have become the industry’s must-attend event, recognizing outstanding VFX talent from around the world and fostering the next generation of filmmakers through its Steven Spielberg-inspired and Autodesk-supported student award. And two VES Sections thus far, New York and London, have hosted regional awards celebrations, further extending the reach and impact of the VFX awards season. From the beginning, the VES Awards marked a watershed moment for the VFX community wherein below-the-line talent got their deserved moment in the spotlight. “In the early days – seeing the geeks show up in tuxes was awesome,” says Kim Lavery, VES, Co-founder VES Awards and VES Board 2nd Vice Chair. “These guys and gals work in dark spaces and wear worn-out superhero t-shirts most of the time, so they were thrilled to get dressed up and attend a black-tie gala where every nominee walks the red carpet, let alone be nominated or win an award.” From an operational standpoint, the VES pioneered the electronic view-and-vote for its global membership, which is now industry standard. And today, the VES red carpet and pressroom bustle, and a growing global audience of 175,000+ social media followers, track the results in real-time, while an ever-expanding media footprint celebrates its pedigreed roster of honorees, nominees and winners. But back then… “I recall vividly when we honored Robert Zemeckis and Tom Hanks agreed to present it to him; that was our ‘we have arrived’ moment,” adds Okun, VES, Founder of the VES Awards. “But while Hanks was illuminating people on who we are and what we do and we’re all feeling good in tuxedos – cars were being towed outside the Palladium. We’ve come a long way from those early days.”

TOP: Director James Cameron at the 8th Annual VES Awards. He was given a VES Lifetime Achievement Award in 2010. BOTTOM: Director Taika Waititi backstage with Visionary Award recipient Victoria Alonso at the 15th Annual VES Awards.

GOING GLOBAL: BIRTH OF THE VES SECTIONS

Today, the VES boasts 11 diverse Sections – Australia, Germany, London, Los Angeles, Montreal, New York, New Zealand, San Francisco Bay Area, Toronto, Vancouver and Washington State. Many among the VES leadership cite the creation of the Sections as one of the biggest achievements of the VES to date, if not the biggest. The aim is always for members to feel that no matter their location, they are part of a unified community, a VFX cloud clubhouse. “While there were a lot of requests for us to address runaway production in California, it was a new day of operating as a global organization and focusing on the issues that cut across geographic lines,” says Jeff Barnes, former VES Board Chair. “Ultimately, it is





our consistency and commitment to the mission that has brought real value to our members and industry colleagues.” In the next three to five years the VES expects to add another half dozen Sections. It has received a petition from India and predicts a number of markets that might be next, including France, Massachusetts, Atlanta, Chicago and possibly China. “As Chair, I’m focused on ensuring we can successfully handle large organizational growth in the future and that we inspire new generations to lead us forward,” says Mike Chambers, current VES Board Chair. “Our determination to outreach to all corners of the globe and to all of the disciplines across the VFX spectrum has yielded us a very rich, talented membership, and that commitment to diversity will continue to be a driving force of the organization.”

ADDRESSING THE DOLLARS AND SENSE: VES AS INDUSTRY CONVENER

Over the last two decades, the VES has used its position as a convener to take on critical issues affecting its membership and the greater VFX community through forums, white papers, technical platforms, educational events and other opportunities to serve as a resource. Five years ago, the industry hit a crucial moment that called out for leadership – one with the ability to bring leaders together and help forge solutions to complex challenges. After a series of industry disruptions in 2012/2013 (Rhythm & Hues going out of business, the Oscars protest march, renewed calls for VFX unionization) and fervent worldwide dialogue, VES saw an urgent need to take a wholesale look at the uncertain business climate and conducted a rigorous analysis of commoditization, tax incentives, government dynamics, the impact of technological advancement and inadequate pipeline and pricing models. This resulted in its publication of The State of the Global VFX Industry 2013, a significant strategic analysis of the business drivers impacting all sectors of the VFX industry working in film production. To get to its final product, the VES assembled a working group of more than three dozen industry representatives

TOP to BOTTOM: VES Board members celebrating on the VES Awards red carpet – Pam Hogarth, Katie Stetson, Kathryn Brillhart, Kim Lavery, VES, Debbie Denise, Brooke Breton and Rita Cahill. Gathered at the 2016 VES Summit, from left: former Board Chair Jeff Barnes, current Board Chair Mike Chambers, VES Executive Director Eric Roth, Founding Board Chair Jim Morris, VES, and former Board Chair Jeffrey A. Okun, VES. Sir Ridley Scott, VES Lifetime Achievement award recipient, enjoying the festive celebration with Kate Mara at the 14th Annual VES Awards.


TOP: Visionary director Darren Aronofsky receives the 2017 VES Empire Award from the VES New York Section at its 3rd Annual Regional Awards Celebration. MIDDLE LEFT: Founding VES Board Chair Jim Morris, VES, kicks off the 1st Annual VES Awards at the Skirball Museum in 2003. MIDDLE RIGHT: Director Christopher Nolan receives a VES Visionary Award in 2011. BOTTOM: VES Executive Director Eric Roth and Executive Committee Members Jeffrey A. Okun, VES, Kim Lavery, VES, Mike Chambers, Rita Cahill and Dennis Hoffman at the 15th Annual VES Awards. (Photo copyright © 2017 Danny Moloshok)

including artists, studio, business and labor leaders and facility executives whose companies had operations all over the globe. “There were things that hadn’t been said out loud at the time as we were experiencing the race to the bottom,” says Carl Rosendahl, VES, former VES Board Chair, co-author of The State of the Global VFX Industry 2013, and Associate Professor and Director at Carnegie Mellon University Entertainment Technology Center’s Silicon Valley campus. “I’m glad that this vital effort served as an important conversation starter across the industry around the complexities of the system. Some of the issues we identified have normalized and we’re living in a new reality, and I’m keenly interested in how the industry has adapted or innovated to keep pace with our ‘new normal’.”

IN THE ROOM WHERE IT HAPPENS – THE VES SUMMIT

The VES Summit is an annual one-day forum bringing together top thought-leaders – creative geniuses, business executives, technical wizards, enlightened storytellers, fearless problem-solvers, VR pioneers and VFX visionaries – to explore the dynamic evolution of visual imagery and the VFX industry landscape. “My vision was in line with creating more prevalence for the VFX community within the greater entertainment community,” says Rita Cahill, VES Summit Chair and Board Secretary. “I thought if we could create a TED Talks-like atmosphere that would offer thoughtful and mind-boggling perspectives, it would fill an important niche and raise our profile in the process. Our speakers have included a stellar list of luminaries, but some definitely stand out. NASA Interplanetary Robotics Expert Nagin Cox taking us to outer space… our panel of three esteemed AMPAS Presidents… Victoria Alonso speaking on equity for women… last year’s speaker [Dr. Albert] “Skip” Rizzo on using VR to treat PTSD.” The 9th Annual Summit is just weeks away, on October 28, offering a rich program of experts and provocateurs and its special yearly recognition program honoring the VES’s newest Fellows, Honorary and Lifetime members and Founders Award recipients. And building on the global model, for the past two years the Bay Area Section has hosted galvanizing regional Summits, creating a rich forum to showcase their homegrown expertise and build community.




THE GO-TO VFX REFERENCE – THE VES HANDBOOK

TOP: Director Ang Lee beams after receiving the VES Visionary Award in 2013. He is flanked by VES Executive Director Eric Roth, left, and Jeffrey A. Okun, VES. BOTTOM LEFT: Actress Sandra Bullock is all smiles with director Alfonso Cuarón, who received a Visionary Award at the 2014 VES Awards. BOTTOM RIGHT: Director George Lucas, left, helps Dennis Muren, VES, ASC, celebrate his 2007 VES Lifetime Achievement Award.

And in its ongoing commitment to provide resources that elevate the craft of VFX and its practitioners, the VES achieved a milestone with its publication of The VES Handbook of Visual Effects in 2010 and its second edition in 2014. Hailed as the most complete guide to visual effects techniques and best practices on the market, it covers essential solutions for all VFX artists, producers and supervisors, from pre-production through production and post-production. The award-winning guide covers areas including stereoscopic moviemaking, color management, facial capture, virtual productions, 3D conversions, compositing of live-action and CG elements and the digital intermediate, as well as detailed chapters on interactive games and full animation. On the book’s genesis, Co-editor Jeffrey A. Okun pointed to the ASC’s [American Society of Cinematographers] go-to manual of photography, which became a core educational tool, and the parallel need for the VFX community to have a comparable document to serve as a reference and ‘calling card’ to the other verticals. “In developing this with Susie [Co-editor and VFX producer Susan Zwerman], I wanted it to hit home that there is an enormous amount of science and math and planning, and the result of what you get is based on what you prep. And I hoped that this resource would advance us towards the overall goal of respectability – better credits, better positions in the credits, stronger voice in production meetings.” A third edition of The VES Handbook is slated for release in 2018.

THE LAST WORD

One of the Society’s longstanding goals was realized this year with the publication of VFX Voice. “After shining a light on the artistry and innovation in the VFX community for 20 years, we are immensely proud of launching our premiere magazine, extending our work to advance the profile and recognition of our industry,” says Eric Roth, VES Executive Director.

70 • VFXVOICE.COM FALL 2017

And a second goal, which gets to the heart of broadening the public’s understanding of visual effects, is well underway. The VES is working diligently to develop an immersive exhibition experience in a museum setting, so that the general public can truly appreciate the artistry and innovation that goes into the visual effects work they enjoy in record numbers. Looking ahead, the VES is focused on continuing its path of smart, diversified global growth and providing outstanding programs and value to its members and the broader VFX community. “I’ve not met a group of artists more committed to their art than VFX practitioners. They may be the purest moviemakers of them all,” says Morris. “Consumers have increased their appreciation, but don’t fully understand that VFX creates the entire fabric of filmed entertainment. I’d love for them to understand what is under the hood.” “The global effects industry does not employ drones that operate mysterious black boxes, simply creating VFX out of thin air,” adds Chambers. “It’s been mischaracterized for years as ‘magic’ and ‘wizardry,’ which greatly undervalues the craft, and we need to make that clear with the industry at large in order to enjoy a broader level of respect and recognition. Visual effects are created by talented, experienced artists and technologists who work very, very hard behind the scenes. And while audiences are ever more savvy to VFX, I would really love for them to realize that some of our members are indeed their rock stars. We’ll continue to lead the charge to bring that awareness to all.”

TOP to BOTTOM: Zoe Saldana and VES Visionary Award recipient J.J. Abrams ham it up backstage at the 13th Annual VES Awards in 2015. (Photo by Rene Macura) The VES New York Section celebrating at the 3rd Annual VES Regional Awards Celebration in 2017. Director Robert Zemeckis receiving the Lifetime Achievement Award at the 3rd Annual VES Awards in 2005, flanked by good friend and collaborator Tom Hanks. VES Board member Richard Edlund, VES, ASC, receiving the Lifetime Achievement Award at the 11th Annual VES Awards in 2013, celebrating with presenter Harrison Ford. TOP RIGHT: The VES hosted three AMPAS Presidents at the 2013 VES Summit – VES Executive Director Eric Roth flanked by Sid Ganis, Hawk Koch, Cheryl Boone Isaacs and panel moderator Bill Kroyer.


THE VES 70: THE MOST INFLUENTIAL VISUAL EFFECTS FILMS OF ALL TIME

By NAOMI GOLDMAN

All-Time VES VFX Films gallery compiled with commentary by Ian Failes. Directors’ quotes by Paula Parisi.

The Visual Effects Society has released its definitive “VES 70: The Most Influential Visual Effects Films of All Time.” The original VES member-chosen “VES 50” list was created in 2007, marking one decade since the organization’s inception. In commemoration of the VES’s milestone 20th anniversary, the global membership – now almost triple the size of the group first polled – has added 20 more films from 2015 and earlier to the VFX honor roll. The two polls were intended to yield 50 and 20 films, respectively, but ties for the final slots in each bring the total to 72. “The ‘VES 70’ represents films that have had a significant, lasting impact on the practice and appreciation of visual effects as an integral element of cinematic expression and storytelling,” says Mike Chambers, VES Board Chair. “We see this as an important opportunity for our members, leading visual effects practitioners worldwide, to pay homage to our heritage and help shape the future of the global visual effects community. In keeping with our mission to recognize and advance outstanding art and innovation in the VFX field, the ‘VES 70’ now forms a part of our legacy that we can pass down to future generations of filmmakers as a valuable point of reference.” Films included in the “VES 70” span from the early 1900s to 2015. The earliest entry on the list is A Trip to the Moon (Le Voyage dans la Lune), the seminal 1902 French silent film directed by Georges Méliès, whose iconic image exemplifies the VES’s legacy – past, present and future. The most recent entries are the Academy Award winner for Best Visual Effects, Ex Machina, and the critically acclaimed Mad Max: Fury Road, both from 2015. The ballot, voted on by VES members in Summer 2017, was limited to features from 2015 and earlier, to help ensure that the candidates have had a lasting impact and that voting was not unduly influenced by the most recent VES Award winners.

COVER NOTES: WE’RE ABOUT TO SEE A LOT MORE OF AVATAR

Among the newest additions to the VES’s list of most influential visual effects films is James Cameron’s Avatar. Released in 2009, it set the standard both in virtual production techniques and in transforming human actors into photo-realistic creatures. Right now, the lead visual effects studio on Avatar – Weta Digital – is also at work on the four sequels Cameron is making. The first of those is set for release in December 2020, with the fourth sequel coming out in 2025. “What Joe Letteri and Weta Digital bring to these stories is impossible to quantify,” Cameron says in a release announcing the start of production at Weta Digital. “Since we made Avatar, Weta continued to prove themselves as doing the best CG animation, the most human, the most alive, the most photo-realistic effects in the world. And of course, that now means I can push them to take it even further.” “Avatar is the ideal type of film for us,” adds Weta Digital’s Senior Visual Effects Supervisor Joe Letteri. “Jim’s vision for the world of Pandora was always so much bigger than what we created for the first film. Helping him expand the language of cinema through new narratives set in such an expansive universe is the type of opportunity that rarely comes along twice. Projects like this allow everyone involved to push themselves to do their best work and you can’t ask for anything more than that.” —Ian Failes

DIRECTORS ON VFX

Gravity (2013). For Gravity, Framestore was called upon to craft almost impossibly long shots in space, often fully CG, but also occasionally incorporating live-action sections of Sandra Bullock and George Clooney. The actors were predominantly filmed against custom LED light panels that carried pre-animated scenes designed to aid in interactive lighting. (Photo copyright © 2013 Warner Bros. Pictures. All rights reserved.)

“Filmmakers imagine for a living. It all starts with a shot and a dream. But sometimes that dream doesn’t quite have a picture frame around it and isn’t quite in focus. The visual effects fill in the colors and bring our dreams into focus. It’s an amazing collaboration that has resulted in bringing the world some of the most astounding images the world has ever seen.” —Steven Spielberg, Director, 6th Annual VES Awards

Avatar (2009). James Cameron’s blockbuster Avatar took performance capture, virtual production and the realization of digital characters and environments to new levels, with Weta Digital crafting the majority of visual effects for the film. It was also filmed in native stereo, further bringing new challenges to the visual effects vendors in terms of live-action integration, compositing and rendering stereo images. (Photo copyright © 2009 Twentieth Century Fox Film Corporation. All rights reserved.)

Inception (2010). Director Christopher Nolan made use of full-scale practical effects, miniatures and digital visual effects to tell the mind-bending story of Inception. This miniature of the hospital fortress was built by New Deal Studios on its backlot and rigged to collapse in sections as part of the film’s ‘kick’ conceit in which the characters would wake themselves from a dream within a dream. (Photo copyright © 2010 Warner Bros. Pictures. All rights reserved.)

(in alphabetical order; newly added films noted in italics)

300 (2007)
2001: A Space Odyssey (1968)
20,000 Leagues Under the Sea (1954)
A Trip to the Moon (1902)
Abyss, The (1989)
Alien (1979)
Aliens (1986)
An American Werewolf in London (1981)
Apollo 13 (1995)
Avatar (2009)
Babe (1995)
Back to the Future (1985)
Blade Runner (1982)
Citizen Kane (1941)
Close Encounters of the Third Kind (1977)
Curious Case of Benjamin Button, The (2008)
Darby O’Gill and the Little People (1958)
Day the Earth Stood Still, The (1951)
District 9 (2009)
E.T. the Extra-Terrestrial (1982)
Empire Strikes Back, The (1980)
Ex Machina (2015)
Fantastic Voyage (1966)
Fifth Element, The (1997)
Forbidden Planet (1956)
Forrest Gump (1994)
Gertie the Dinosaur (1914)
Ghostbusters (1984)
Godzilla (1954)
Gravity (2013)
Inception (2010)
Independence Day (1996)
Jason and the Argonauts (1963)
Jaws (1975)
Jurassic Park (1993)
King Kong (1933)
King Kong (2005)
Life of Pi (2012)
Lord of the Rings: The Fellowship of the Ring (2001)
Lord of the Rings: The Return of the King (2003)
Lord of the Rings: The Two Towers (2002)
Lost World, The (1925)
Mad Max: Fury Road (2015)
Mary Poppins (1964)
Mask, The (1994)
Matrix, The (1999)
Metropolis (1927)
Pirates of the Caribbean: Dead Man’s Chest (2006)
Planet of the Apes (1968)
Raiders of the Lost Ark (1981)
Return of the Jedi (1983)
Rise of the Planet of the Apes (2011)
Seventh Voyage of Sinbad, The (1958)
Sin City (2005)
Snow White and the Seven Dwarfs (1937)
Star Wars (1977)
Starship Troopers (1997)
Superman: The Movie (1978)
Ten Commandments, The (1956)
Terminator, The (1984)
Terminator 2: Judgment Day (1991)
Thing, The (1982)
Titanic (1997)
Total Recall (1990)
Toy Story (1995)
Transformers (2007)
Tron (1982)
War of the Worlds, The (1953)
What Dreams May Come (1998)
Who Framed Roger Rabbit (1988)
Wizard of Oz, The (1939)
Young Sherlock Holmes (1985)


Life of Pi (2012). Rhythm & Hues crafted a stunningly convincing photoreal CG tiger, among other animals, for Ang Lee’s Life of Pi. The effect was made even more complicated because the tiger had to co-exist with a young boy on a lifeboat in the open ocean. The visual effects teams also had to solve water simulations and the tracking of characters – shot in water tanks – on the moving waves. (Photo copyright © 2012 Twentieth Century Fox. All rights reserved.)

“My films have gone from having no visual effects to being completely visual effects. When I started off in independent film, ‘visual effects’ wasn’t even a line item. You were lucky to raise barely enough money to film the script, and visual effects were new and expensive. … The Jungle Book was truly a creative partnership between film and visual effects. The Lion King takes that partnership a step further, as the production and characters are all completely virtual. By including the effects artists in every step of the process, with meaningful collaboration, I have found that these new capabilities open up vast storytelling opportunities. Innovation in film has always been the dance between building new tools to tell a particular story and then allowing these new tools to inspire new stories that could never be told before.” —Jon Favreau, Director

“There’s no way to quantify the importance of visual effects in the films I’ve been involved in. VFX have become as crucial and ubiquitous as any element. Just as one relies on a fine actor to deliver a moving performance, or world-class DP to shoot a film beautifully, one depends on their visual effects supervisor to provide anything that’s necessary to help believably tell the story. I’ve been especially lucky in that Roger Guyett is a true storyteller. That Venn diagram is critical in a VFX supervisor. A film needs someone who is a technical wizard, certainly, but also someone who understands the inside-out intention of a sequence, scene or shot. But VFX have become as critical as any element of the making of a film.” —J.J. Abrams, Director

Back to the Future (1985). Industrial Light & Magic model shop supervisor Steve Gawley works on a miniature flying DeLorean for the final scenes of Back to the Future. The film featured extensive miniature and optical effects from ILM, which progressed into complex motion-control shots, split screens, early digital wire removal and paint effects for sequels to the Robert Zemeckis movie. (Photo copyright © 1985 Universal Studios. All rights reserved.)

“Moving image, in all its formats, has been with us no more than 130 years. Its evolution has been exponential, and along the way there have been two great milestones – the advent of sound in the ‘20s and, since the late ‘80s, digital visual effects. These are the ‘tools of enchantment’ which allow movies to be utterly persuasive, and to be working in a time when this digital dispensation is flourishing is one huge privilege. ... It had been 30 years since the last Mad Max movie and everything had changed. Although we shot old-school live action, not one frame (in 2015’s Mad Max: Fury Road) was left untouched by VFX. Apart from epic dust storms, landscapes and such, we kept continuity of skies, erased the safety harnesses of our cast and stunt performers, plus removed the wheel tracks of previous takes. The plasticity of the image was impossible to imagine all those years ago. Making the movie felt, in some ways, nostalgic, yet most of it could not have been accomplished before this era. For me it was a kind of time traveling.” —George Miller, Director

The Curious Case of Benjamin Button (2008). Turning Brad Pitt into an old man, and then reversing his aging, required CG characters and de-aging effects that were unparalleled on the screen at the time. This image shows a completely synthetic character made by Digital Domain based on life casts of an older actor and facial motion-capture of Pitt. (Photo copyright © 2008 Paramount Pictures. All rights reserved.)

“Alien was the first I got involved with visual effects, which included a little matte painting, and all the universes done with skilled hands and a bristle brush randomly sprinkling stars onto black color board, and then we just photographed it. Prometheus was my first long-range planning to capture ambitious events in other universes, and I marveled at the artistic skill set, and the importance of digital animation in helping to guide this story.

Alien: Covenant took it to yet another level. Over the years, I’ve watched, listened, learned and admired the ingenuity of the artistic science of visual effects. Now anything is possible, providing you use the tools properly.” —Sir Ridley Scott, Director

“I feel I’m one of you, even though I haven’t been a practitioner for many years. I love visual effects. Arthur C. Clarke said, ‘Any sufficiently advanced technology is indistinguishable from magic.’ And that’s what we do. I’ve been asked a lot about my inspiration for Avatar, and I think back to being seven years old, sitting in a movie theater seeing films like Mysterious Island, Jason and the Argonauts and The Seventh Voyage of Sinbad. Then came 2001: A Space Odyssey, and the shock of the incomprehensible, that feeling of how did they do that? I started building models and trying to figure it out. The Avatar experience made me realize that the stuff we were doing back in the day – stop motion, glass painting, models – now it’s all done with supercomputers. When I wrote Avatar we couldn’t make the film because the CG wasn’t quite there yet, so I shelved it because I knew that every year that went by the visual effects community would create new tools, write new code, and the reality level would increase. CG characters and worlds pushed the envelope, but computers don’t make visual effects, people do. It’s still the artist, the imagination and the sense of a pioneering spirit that make the magic. It’s the thought behind the shot, it’s the eye.”

—James Cameron, Director

District 9 (2009). Neill Blomkamp – and visual effects studio Image Engine – seemed to burst onto the scene with the alien refugee story of District 9 in 2009. One challenge for the visual effects crew was staying true to the hand-held documentary feel of the film, while also providing for believable, and emotional, alien characters. These were played on set by actors wearing gray tracking suits. (Photo copyright © 2009 Sony Pictures. All rights reserved.)

“As a filmmaker, I love working with the medium of computer animation; it’s like the whole movie is one big visual effect. What makes computer animation work is not the mere fact that it’s made with a computer, it’s what you do with it and how you entertain the audience. It’s the story and the characters. We are in the business of entertaining audiences. It’s about making films that keep the audience on the edge of their seats, wondering what’s going to happen next. We all love the technology, but more importantly we love hearing the audience laugh.”

—John Lasseter, Director

Independence Day (1996). A 12-foot City Destroyer model for Independence Day is positioned over a mountain landscape in front of a painted sky backdrop, and shot with a mini-motion control system called Jet Rail. This enabled the camera to fit between the landscape and model for dog-fighting scenes, and represented the practical side of the film’s visual effects, which also made significant headway into CG imagery and digital compositing. (Photo courtesy of Volker Engel)

“It’s been a pleasure to watch the Visual Effects Society grow. Those of us who have been around a long time – Dennis Muren and some of the others at ILM – 40 years ago when we started in this business there was no visual effects industry. There were ‘special effects,’ and that was a whole bunch of things, but there really was no ‘industry.’ Now, to watch all these great movies and television shows full of magnificent effects, it’s stunning. So much has changed. In those days, effects were created with a camera that sometimes worked and sometimes didn’t, whereas now, of course, everything works perfectly.” —George Lucas, Director

“I’m as dependent on visual effects as any filmmaker out there, and I’m very grateful to and appreciative of all the people I’ve worked with who help bring my visions to life. For me, it’s a process of starting with a physical reality – the science or process of how

Total Recall (1990). Visual Effects Supervisor Eric Brevig (left) and Director of Miniature Photography Alex Funke stand in one of the model Mars environments created for Total Recall. The Paul Verhoeven film featured significant make-up and creature effects, miniatures, optical compositing, early motion capture and some nascent CG animation. (Photo courtesy of Eric Brevig)

FALL 2017

VFXVOICE.COM • 77

9/14/17 3:51 PM


VES 20TH ANNIVERSARY

Life of Pi (2012). Rhythm & Hues crafted a stunningly convincing photoreal CG tiger, among other animals, for Ang Lee’s Life of Pi. The effect was made even more complicated because the tiger had to co-exist with a young boy on a lifeboat in the open ocean. The visual effects teams also had to solve water simulations and the tracking of characters – shot in water tanks – on the moving waves. (Photo copyright © 2012 Twentieth Century Fox. All rights reserved.)

“My films have gone from having no visual effects to being completely visual effects. When I started off in independent film, ‘visual effects’ wasn’t even a line item. You were lucky to raise barely enough money to film the script, and visual effects were new and expensive. … The Jungle Book was truly a creative partnership between film and visual effects. The Lion King takes that partnership a step further, as the production and characters are all completely virtual. By including the effects artists in every step of the process, with meaningful collaboration, I have found that these new capabilities open up vast storytelling opportunities. Innovation in film has always been the dance between building new tools to tell a particular story and then allowing these new tools to inspire new stories that could never be told before.” —Jon Favreau, Director

“There’s no way to quantify the importance of visual effects in the films I’ve been involved in. VFX have become as crucial and ubiquitous as any element. Just as one relies on a fine actor to deliver a moving performance, or world-class DP to shoot a film beautifully, one depends on their visual effects supervisor to provide anything that’s necessary to help believably tell the story. I’ve been especially lucky in that Roger Guyett is a true storyteller. That Venn diagram is critical in a VFX supervisor. A film needs someone who is a technical wizard, certainly, but also someone who understands the inside-out intention of a sequence, scene or shot. But VFX have become as critical as any element of the making of a film.” —J.J. Abrams, Director

Back to the Future (1985). Industrial Light & Magic model shop supervisor Steve Gawley works on a miniature flying DeLorean for the final scenes of Back to the Future. The film featured extensive miniature and optical effects from ILM, which progressed into complex motioncontrol shots, split screens, early digital wire removal and paint effects for sequels to the Robert Zemeckis movie. (Photo copyright © 1985 Universal Studios. All rights reserved.)

“Moving image, in all its formats, has been with us no more than 130 years. Its evolution has been exponential, and along the way there have been two great milestones – the advent of sound in the ‘20s and, since the late ‘80s, digital visual effects. These are the ‘tools of enchantment’ which allow movies to be utterly persuasive, and to be working in a time when this digital dispensation is flourishing is one huge privilege. ... It had been 30 years since the last Mad Max movie and everything had changed. Although we shot old-school live action, not one frame (in 2015’s Mad Max: Fury Road) was left untouched by VFX. Apart from epic dust storms, landscapes and such, we kept continuity of skies, erased the safety harnesses of our cast and stunt performers, plus removed the wheel tracks of previous takes. The plasticity of the image was impossible to imagine all those years ago. Making the movie felt, in some ways, nostalgic, yet most of it could not have been accomplished before this era. For me it was a kind of time traveling.” —George Miller, Director

The Curious Case of Benjamin Button (2008). Turning Brad Pitt into an old man, and then reversing his aging, required CG characters and de-aging effects that were unparalleled on the screen at the time. This image shows a completely synthetic character made by Digital Domain based on life casts of an older actor and facial motion-capture of Pitt. (Photo copyright © 2008 Paramount Pictures. All rights reserved.)

76 • VFXVOICE.COM FALL 2017

PG 74-79 VES 70.indd 76-77

“Alien was the first I got involved with visual effects, which included a little matte painting, and all the universes done with skilled hands and a bristle brush randomly sprinkling stars onto black color board, and then we just photographed it. Prometheus was my first long-range planning to capture ambitious events in other universes, and I marveled at the artistic skill set, and the importance of digital animation in helping to guide this story.

Alien: Covenant took it to yet another level. Over the years, I’ve watched, listened, learned and admired the ingenuity of the artistic science of visual effects. Now anything is possible, providing you use the tools properly.” —Sir Ridley Scott, Director

“I feel I’m one of you, even though I haven’t been a practitioner for many years. I love visual effects. Arthur C. Clarke said, ‘Any sufficiently advanced technology is indistinguishable from magic.’ And that’s what we do. I’ve been asked a lot about my inspiration for Avatar, and I think back to being seven years old, sitting in a movie theater seeing films like Mysterious Island, Jason and the Argonauts and The Seventh Voyage of Sinbad. Then came 2001: A Space Odyssey, and the shock of the incomprehensible, that feeling of how did they do that? I started building models and trying to figure it out. The Avatar experience made me realize that the stuff we were doing back in the day – stop motion, glass painting, models – now it’s all done with supercomputers. When I wrote Avatar we couldn’t make the film because the CG wasn’t quite there yet, so I shelved it because I knew that every year that went by the visual effects community would create new tools, write new code, and the reality level would increase. CG characters and worlds pushed the envelope, but computers don’t make visual effects, people do. It’s still the artist, the imagination and the sense of a pioneering spirit that make the magic. It’s the thought behind the shot, it’s the eye.” —James Cameron, Director

District 9 (2009). Neill Blomkamp – and visual effects studio Image Engine – seemed to burst onto the scene with the alien refugee story of District 9 in 2009. One challenge for the visual effects crew was staying true to the hand-held documentary feel of the film, while also providing for believable, and emotional, alien characters. These were played on set by actors wearing gray tracking suits. (Photo copyright © 2009 Sony Pictures. All rights reserved.)

“As a filmmaker, I love working with the medium of computer animation; it’s like the whole movie is one big visual effect. What makes computer animation work is not the mere fact that it’s made with a computer, it’s what you do with it and how you entertain the audience. It’s the story and the characters. We are in the business of entertaining audiences. It’s about making films that keep the audience on the edge of their seats, wondering what’s going to happen next. We all love the technology, but more importantly we love hearing the audience laugh.” —John Lasseter, Director

Independence Day (1996). A 12-foot City Destroyer model for Independence Day is positioned over a mountain landscape in front of a painted sky backdrop, and shot with a mini-motion control system called Jet Rail. This enabled the camera to fit between the landscape and model for dog-fighting scenes, and represented the practical side of the film’s visual effects, which also made a large headway into CG imagery and digital compositing. (Photo courtesy of Volker Engel)

“It’s been a pleasure to watch the Visual Effects Society grow. Those of us who have been around a long time – Dennis Muren and some of the others at ILM – 40 years ago when we started in this business there was no visual effects industry. There were ‘special effects,’ and that was a whole bunch of things, but there really was no ‘industry.’ Now, to watch all these great movies and television shows full of magnificent effects, it’s stunning. So much has changed. In those days, effects were created with a camera that sometimes worked and sometimes didn’t, whereas now, of course, everything works perfectly.” —George Lucas, Director

“I’m as dependent on visual effects as any filmmaker out there, and I’m very grateful to and appreciative of all the people I’ve worked with who help bring my visions to life. For me, it’s a process of starting with a physical reality – the science or process of how things actually work. With that as the basis, you can create incredible images and make an audience believe they are real – whether it’s fantastic, like the folding cities of Inception, or the interior of a black hole in Interstellar, or something of this Earth that most people have never experienced, as with the heart of battle in Dunkirk. The exotic nature of what you can achieve with visual effects creates a hyper-reality. Truth can be stranger than fiction.” —Christopher Nolan, Director

Total Recall (1990). Visual Effects Supervisor Eric Brevig (left) and Director of Miniature Photography Alex Funke stand in one of the model Mars environments created for Total Recall. The Paul Verhoeven film featured significant make-up and creature effects, miniatures, optical compositing, early motion capture and some nascent CG animation. (Photo courtesy of Eric Brevig)

FALL 2017

VFXVOICE.COM • 77

VES 20TH ANNIVERSARY

Ex Machina (2015). For Ex Machina, actress Alicia Vikander performed the role of the robot Ava on set, and her performance was then seamlessly augmented by visual effects artists at Double Negative to reveal cyborg details in different parts of the body. Meticulous tracking of Vikander and replacement with CG sections won much acclaim for its ‘invisible’ role in realizing the character. (Photo copyright © 2015 A24. All rights reserved.)

300 (2007). Zack Snyder’s stylistic rendition of this comic book-sourced story took full advantage of the manipulation of live-action photography with visual effects for key ‘frames’, including for this cliff sequence completed by Animal Logic. Actors were commonly filmed against bluescreen, but the final imagery was intentionally pushed and pulled for dramatic effect. (Photo copyright © 2007 Warner Bros. Pictures. All rights reserved.)

Mad Max: Fury Road (2015). George Miller’s Mad Max: Fury Road, his return to the franchise after many years, saw the worlds of stunts, special effects, visual effects and – in a particularly prominent way – color grading combine to give the film its hard-hitting result. These images sum up the general approach to the film’s visual effects, in which plate photography was often augmented with additional environments, crowds, effects and color grading. (Photo copyright © 2015 Warner Bros. Pictures. All rights reserved.)

The Mask (1994). Jim Carrey’s already ‘rubbery’ performance was taken even further by Industrial Light & Magic in The Mask, with the studio capitalizing on new techniques developed in CG animation and 3D morphing to give a cartoony, yet realistic, result. (Photo copyright © 1994 New Line Cinema. All rights reserved.)

Young Sherlock Holmes (1985). Delivering the first 3D character in a feature film with the Stained Glass Man for Young Sherlock Holmes, the Computer Division at Lucasfilm – working hand-in-hand with other visual effects created by Industrial Light & Magic for the film – brought CG to the forefront of modern day movies. (Photo copyright © 1985 Paramount Pictures. All rights reserved.)


Transformers (2007). The Decepticon Bonecrusher, a CG creation by Industrial Light & Magic, causes havoc during a freeway chase scene in Michael Bay’s Transformers. ILM solved major animation and physical interaction challenges to help bring the film to life, seamlessly integrating highly reflective metallic characters into what has been described as the director’s ‘Bayhem’ action. (Photo copyright © 2007 Paramount Pictures. All rights reserved.)

Starship Troopers (1997). Phil Tippett’s Tippett Studio orchestrated a raft of CG bugs for Paul Verhoeven’s Starship Troopers, relying on keyframe animation and the use of a ‘digital input device’ armature, first devised for Jurassic Park, that replicated the idea of a stop-motion animation feel for the creatures. The film was also marked by impressive spaceship miniatures, matte paintings and other digital effects. (Photo copyright © 1997 Sony Pictures. All rights reserved.)

“Visual effects have merged into the whole process of the cinematic experience. It’s not about shots, it’s about the integration of universes, the integration of sets, the integration of life and actors. We’re living in a historic moment in cinema.” —Alfonso Cuarón, Director

“I originally studied as a fine artist, so to see so many truly gifted VFX artists and crew working around the clock to bring their incredible skills to Wonder Woman was truly stunning. Bill Westenhofer, our VFX Supervisor, was by my side every day, and did such an amazing job guiding and overseeing all of the effects, which were so critical to the storytelling. At the moment there is still a limit to what you can do with a digi double versus a real actor, because if it gets too close or detailed you can still start to feel that something isn’t quite right. The next big breakthrough will be to cross that divide and really be able to manipulate a digi double to absolutely appear as fluid and real as a person, in closer, more detailed shots. We’re getting close!” —Patty Jenkins, Director



VES 20TH ANNIVERSARY

THE FUTURE OF VFX: INDUSTRY LEADERS LOOK AHEAD By MICHAEL GOLDMAN

TOP: While working on Air Force One in 1997, Visual Effects Supervisor Richard Edlund, VES, learned that digital techniques would eventually push past generations of practical effects work and mechanical engineering breakthroughs like motion control. Today, he views the work he and others did on Star Wars and what followed in the 1970s onward as “steppingstones” on the way to a revolution. (Photo copyright © 1997 Columbia Pictures. All Rights Reserved.) BOTTOM: Richard Edlund, VES.


As the visual effects community ponders what’s next for the industry, an ironic dichotomy colors the discussion. On the one hand, we are in “a Golden Age of visual effects, and we should take time to recognize that and celebrate it,” suggests Jim Morris, VES, General Manager and President of Pixar Animation Studios, a veteran VFX producer and Founding Chair of the Visual Effects Society. Morris, of course, is referring to revolutionary breakthroughs currently evident in a wide range of recent motion pictures and, increasingly, on television and in other media.

Indeed, Visual Effects Supervisor Rob Legato, ASC, who shared last year’s visual effects Academy Award with Adam Valdez, Andrew R. Jones and Dan Lemmon for stunning work on The Jungle Book using sophisticated virtual production techniques, suggests those techniques are being advanced further at this very moment with work currently underway on 2019’s The Lion King and other projects. He points out that, “On any given year, something like eight of the top 10 grossing movies are most likely to be visual effects-oriented films. That is a very healthy assemblage of films that greatly depend on the visual effects industry for the core of their existence.”

On the other hand, “the ability to do absolutely anything and the propensity for studios to spend tens of millions of dollars on visual effects for [tentpole] style comic-book movies has basically turned visual effects into a commodity,” says pioneering Visual Effects Supervisor Richard Edlund, VES, ASC. “On any given day, every studio has at least a thousand shots in the loop.” Thus, there is concern about issues like saturation of the industry, a glut of artists competing for lower compensation, tightening timelines and shrinking budgets that impact quality, and the potential devaluing of artistry and craftsmanship by studios.
For instance, “I’m constantly amazed by how much pressure we get from studios and clients to make repeated creative changes without additional pay, meet shorter post schedules, and to chase tax breaks around the world,” says ILM Visual Effects Supervisor Lindy DeQuattro. “If you were remodeling your kitchen and told your contractor you wanted granite, then saw the granite and said, ‘no, actually, I’d like to see some marble, or let’s try quartz, and then never mind, let’s put the granite back in,’ you would certainly expect to pay for all those materials and the additional labor.”

In other words, as Morris puts it, “Careful what you wish for. We used to hope for a day when visual effects would be pivotal to the filmmaking process at every level, where every movie would require them. And then, it came true. The downside, of course, is that work shifted and lots of changes came with that. And yet, from 30,000 feet, it looks like a vibrant industry.”

In contextualizing the great changes in the visual effects industry in just the 20 years since the VES was founded – and in looking ahead to what might be coming next – it is clear that major technical, creative and business shifts are inextricably intertwined. The precise direction and impact of these shifts, however, cannot be fully prognosticated, and so Dr. Ed Catmull, VES, Pixar’s Co-founder and now President of Pixar and Walt Disney Animation Studios, suggests facilities need to both take bold risks with new technologies and simultaneously plan for outcomes ranging from wild success to abysmal failure.

“One recurrent theme is that when a new technology arrives, it initially does not work very well, but may show potential,” Catmull explains. “Sometimes, you will try it, or even buy it, and you will find out it actually does not live up to the hype. And sometimes, having tried it early will give you a great heads-up on things. But if a new thing appears and does not work well initially, some people will conclude that makes it a bad idea, and they might be premature in concluding that. You still have to evaluate something by what its potential might be and move forward. Related to that, most of these things will have a horizon of about four or five years of going right or going wrong. You don’t know, if something goes wrong with a [technology], if a software or computer vendor is going to correct it or be blind to it, or if something will go wrong with their company as a result. So you have to constantly pay attention to the symptoms, and that should give you four or five years to prepare if something does go wrong.”

Clearly, virtual production techniques, historically powerful software tools and rendering systems, motion-capture and facial-capture breakthroughs, Cloud computing, the rise of sophisticated previs techniques, deep machine-learning/AI tools, and much more are at the forefront of the industry’s forward-looking conversation. But, at the same time, so is the restructuring of the industry landscape – how facilities are built and managed, and what their infrastructures and missions should look like. Industry veterans remind us that major companies have closed during this revolutionary period, amid other forms of consolidation and automation, while others have evolved successfully. Sony Pictures Imageworks, for example, remains a key player in producing not only high-end CG feature films, but also cutting-edge VFX for major live-action films.

To get this done, Imageworks “migrated our headquarters in recent years, and the majority of our workforce, to Vancouver,” explains Randy Lake, President of Sony Studio Operations and Sony Imageworks. “This has been successful in sustaining the quality of our work, while leveraging tax incentives to keep our costs competitive and innovating in technologies that enable our artists to create fantastic imagery.”

VFX Voice recently conducted a wide-ranging conversation with several leading industry professionals to get a sense of the industry’s future direction, and several key themes emerged.

TOP: The Jungle Book showcases the virtual environments and liveaction integration developed by Rob Legato, ASC, for director Jon Favreau. (Photo copyright © 2016 Walt Disney Enterprises Inc. All Rights Reserved.) LEFT: Rob Legato, ASC, on set of Hugo. (Photo credit: Ben Grossman. Courtesy of Rob Legato.)

LEFT: Lindy DeQuattro BOTTOM: The United States’ Gipsy Danger moves a crab fishing boat out of danger in a scene from Pacific Rim. (Photo copyright © 2013 Warner Bros. Entertainment and Legendary Pictures Funding, LLC. Courtesy of Warner Bros. Pictures.)

VIRTUAL PRODUCTION TECHNIQUES

Legato argues that the combination of the latest digital filmmaking tools and software with VR-related previsualization technology and game rendering engines, along with lots of innovative thinking, can now help filmmakers achieve “a movie rooted totally in reality, even though it was artificially created – we are going through great pains to make sure the artificial part is removed from the audience purview. Animals and environments can look absolutely realistic, and I was also proud how, in The Jungle Book, we had 150 digital double shots where I doubt the audience, or even most professionals, could detect the use of a digital human within the illusion.

“So that, to me,” he continues, “is the future of using this technology for filmmaking – not to do a special effects genre movie only, but to make regular films that you can increase in scope or size, meaning bigger themed films, because we can put actors believably into [situations] where we couldn’t before. This trend can apply to any real story about real people. The computer power is outrageous, the software is incredible, and we have new tools that allow us to previsualize much more efficiently. On my next film, we are now extending the virtual reality ability to preview the entire film using VR tools. You put on VR goggles and walk into fully realized, three-dimensional sets, move your head around and see things react as they would in the real environment. All of a sudden, it becomes perfectly natural to invent shots and camera moves, and decide where characters or buildings go. Once [the technique] is figured out, the wheel is created. Now, other people can pick up on it for all sorts of applications.”

TOP: Moana (Photo copyright © 2016 Walt Disney Pictures. All Rights Reserved. Photo courtesy of Walt Disney Animation.) RIGHT: Ed Catmull, VES (Photo credit: Deborah Coleman, Pixar)

Industry veteran Tim Sarnoff, Deputy CEO and President of Production Services for Technicolor’s portfolio of visual effects companies (MPC, The Mill, Mikros Image, Mr. X and Technicolor), suggests the rapid adaptation of these tools to the needs of a filmmaking workflow has been almost as stunning as the tools themselves. “While the fundamentals of artistry certainly have not changed, technologically, things have advanced tremendously,” Sarnoff says. “A decade ago, you could not have created The Jungle Book. That project applied dozens of new innovative technologies, including Cloud [rendering] technology, which was used to handle the hundreds of thousands of processing hours needed to render an entire rain forest in photographic detail. That’s an example of the convergence of immersive technologies with today’s established technologies, leaving an indelible mark on how this industry is evolving.”

The Mill, Sarnoff says, recently focused on bringing these two worlds together for the production of commercials and feature films.

“The Mill developed Blackbird, a virtual car rig that uses VR and AR [augmented reality] technology to digitally re-create any vehicle that needs to be featured in an advertisement,” he relates. “It also interacts dynamically with its environment. That means that when the photorealistic representation of an actual car is broadcast on your screen, it casts natural shadows and reflects light as it twists and turns around corners and climbs hills during a shoot. They have combined that with another just-announced technology called Cyclops, a solution that applies AR and game-engine technology to render on-set action for filmmakers in real time. Directors do not have to wait to see what the shoot looks like at a dailies session or in post-production. This will create immense opportunities for talent in the VFX community.”

Imageworks’ Randy Lake says the synergy percolating between the emerging virtual reality realm and the traditional visual effects business has caused his company “to become involved in initiatives developed within Sony Pictures Entertainment to explore and develop techniques that bridge the workflow used in our feature film projects into the VR space. We see the opportunity for a bi-directional exchange of technology and workflow as the VR medium and business model evolve. As an example, this can take the form of creative visualization technology within our traditional film pipelines, or R&D from the VFX space helping to improve the fidelity of virtual worlds. In addition, we believe AR will play a big role in the filmmaking process. We can build real-time visualization tools to make better decisions on set.”

TOP: Spider-Man and Iron Man in Spider-Man: Homecoming. (Photo copyright © 2017 CTMG, Inc. All Rights Reserved. Courtesy of Columbia Pictures.) LEFT: Randy Lake

IMMERSIVE CONTENT

Lake’s point illustrates why the arrival of virtual reality tools and concepts has opened the door to an entirely new form of immersive media content that, by its nature, will rely heavily on visual effects artists, tools and techniques. Indeed, at major facilities, there is currently a heavy “cross-pollination of techniques and capabilities and talent that flows between feature

FALL 2017 VFXVOICE.COM • 83



VES 20TH ANNIVERSARY

TOP: Moana (Photo copyright © 2016 Walt Disney Pictures. All Rights Reserved. Photo courtesy of Walt Disney Animation.) RIGHT: Ed Catmull, VES (Photo credit: Deborah Coleman, Pixar)





TOP: Blackbird was created by The Mill specific to their work for advertisers with a focus on automotive motion. (Photo courtesy of The Mill) RIGHT: Tim Sarnoff (Photo credit: Alexis Dickey)


films, broadcasting, advertising, games, and now into the immersive arena,” explains Technicolor’s Sarnoff. Studios and artists are already hard at work exploring this “immersive arena,” causing new creative laboratories to spring out of major facilities to research techniques, tools and concepts along the way. ILM has launched a new division called ILMxLAB to spearhead such work and, likewise, Technicolor has opened a new facility called the Technicolor Experience Center (TEC), while Framestore has launched Framestore VR Studio, just to name a few.

In the short term, some facilities are producing VR experiences designed to promote major feature films. Damien Fangou, CTO at MPC Film, a Technicolor company, for instance, points out that MPC recently worked on the Passengers Awakening VR experience project for Sony, linked to the release of the feature film, Passengers; and also on the Alien: Covenant in Utero VR experience for Fox, tied to the Alien: Covenant release.

Beyond such applications, a major focus of such facilities is to figure out how the VFX industry can “take advantage of the unique opportunities that VR and AR offer in their various [platforms] – location-based, at-home, mobile, and so on,” suggests Mohen Leo, Director of Content and Platform Strategy for ILMxLAB. While the foundations of such facilities are clearly being built on top of infrastructures that visual effects facilities already have for feature film work, Leo suggests experimentation across the industry is focusing on various new forms of interactive entertainment. “For many years, ILM has made increasing use of real-time computer graphics to support virtual production for films,” Leo explains.
“Through on-set, real-time previsualization, real-time previewing of motion-capture re-targeted onto digital characters and, eventually, even scouting digital sets using VR headsets, we tried to let film directors feel like they can intuitively plan and shoot a movie in the digital worlds we create. With VR and AR devices suddenly becoming consumer products in recent years, we realized that if we could let directors step into our virtual worlds and interact with fantastic characters, the next logical step would be to let our audience do it, too. Every fan of Star Wars has wondered what it would be like to be in the Star Wars universe and meet its iconic characters. We wanted to make that possible, so in 2015 we started ILMxLAB with the specific goal of exploring opportunities in immersive storytelling. Since then we’ve released a variety of experiences and experiments, and are continuing to work on increasingly ambitious ideas.”

People in this sector “don’t think of these immersive story experiences as a new form of film, nor are they a new form of video game,” Leo adds. “AR, VR, and other emerging digital content platforms are new media that will define their own new language for storytelling and find their own places alongside film, TV, video games, and other traditional media. In creating these experiences, we draw on the talents of VFX artists, along with people from the games industry, sound effects, engineering, music, storytelling, cinematography, and other disciplines.”

Some of the resulting content will be radically different from what is on the market today, he suggests, but, simultaneously, the resulting tools for making VR and AR content “will flow back into filmmaking and television,” Leo adds. “These tools will probably make high-end, user-generated content increasingly streamlined, but linear films and video content are a mature medium with over a century of history. The tools may change, but the final content will likely still follow established rules. On the other hand, the rules for what makes a compelling VR or AR storytelling experience haven’t been written yet.”

Aron Hjartarson, Framestore’s Executive Creative Director, calls content coming out of such institutions, including Framestore VR Studio, “non-traditional work.” “[In our Los Angeles office], when it comes to non-traditional work, we have been very busy working on anything from 360 video content to full-room scale experiences,” Hjartarson explains. “Distribution has been via several avenues – anything from YouTube to the App Store, and everything in between. One of the interesting things about these projects is that the clients are diverse and come from different channels than our traditional business. We are finding that our core competencies have a wider application, from development to storytelling.”

Hjartarson emphasizes these new avenues increase the need for VFX artists. “As far as how VR impacts the visual effects industry, there is greater demand for [visual effects artists] than before,” he says. “[For the VR Studio], the majority of our artists still come from our existing talent pool, which is deep and wide. But on the CG and compositing side, we have expanded the skill set of many of our artists to deal with the challenges involved, and even on the development side, we have taken some of our best developers from the traditional side, as well as added some new talent to complement them.

“Also, 360 video is very challenging technically and needs supervisors that have a deep understanding of camera systems, optics, and how to reverse-engineer them.
This is leading to a lot of R&D work in computer vision/computational photography, which is useful in traditional work. Thinking back to traditional VFX projects I’ve done in the past, we have developed tools that would have made those shows a lot easier. I think computational photography will develop at a fast rate, providing viable solutions for volumetric capture, which has a potential to influence everything we do, VR and traditional.”
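Much of the 360-video work Hjartarson mentions rests on one piece of math: mapping a 3D view direction into an equirectangular (lat-long) frame, the projection used when stitching and compositing 360° footage. A minimal sketch (the axis convention here, -Z forward and +Y up, is an assumption; real rigs and tools differ):

```python
import math

# Core math behind 360-degree stitching/compositing: project a 3D view
# direction onto an equirectangular (lat-long) image. Convention assumed
# here: -Z is forward, +Y is up.

def direction_to_equirect(x, y, z, width, height):
    norm = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, -z)            # -pi..pi, rotation about the up axis
    lat = math.asin(y / norm)          # -pi/2..pi/2, elevation angle
    u = (lon / (2 * math.pi) + 0.5) * width   # wrap longitude across width
    v = (0.5 - lat / math.pi) * height        # latitude top-to-bottom
    return u, v

# Looking straight ahead lands in the middle of the frame:
u, v = direction_to_equirect(0.0, 0.0, -1.0, 3840, 1920)  # -> (1920.0, 960.0)
```

Inverting this mapping per output pixel, for each lens in the rig, is essentially what a stitcher does; the hard part in practice is reconciling parallax between the lenses.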

TOP: Rogue One: A Star Wars Story (Photo credit: Film Frame. Copyright © 2016 Lucasfilm Ltd. All Rights Reserved.) LEFT: Mohen Leo

BEST TOOLS EVER

While working on Air Force One in 1997, Richard Edlund, VES, realized that digital techniques would eventually push past generations of practical effects work and mechanical engineering breakthroughs like motion control. Today, he views the work he and others did on Star Wars and what followed in the 1970s onward as “steppingstones” on the way to a revolution that continues to pick up steam. “On Air Force One, we had a 22-foot wingspan 747 model [built in the model shop of Edlund’s former company, Boss Films],” Edlund recalls. “It was perfected – a magnificent model. But halfway through production, our guys created a digital version







TOP: Framestore is an official creative and technology developer for the new additions to Facebook’s Surround 360 family of 360 cameras: x24 and x6. Framestore used its expertise to develop new workflows for the technology and experiment with extracting point clouds from the camera to be viewed in real-time for creative executions. (Photo courtesy of Framestore) RIGHT: Aron Hjartarson (Photo credit: Chris Eckardt/Framestore)

RIGHT: Damien Fangou BOTTOM: From the Alien: Covenant in Utero VR Experience for Fox, tied to the release of Alien: Covenant. (Photo courtesy of MPC)


of the 747. The CG one was more useful than the huge [practical] model – so realistic [that it was used for some close-up shots of the plane]. The huge model required a day to shoot the model and a day to shoot the matte pass – a complicated nine-wire system built just for that show and never used again. [In that era] we would invent ourselves out of mechanical corners that we constructed, but motion control and all those things were steppingstones toward what we have today with CG.”

Radically improved simulation software tools have greatly accelerated this phenomenon beyond anything envisioned back then. ILM’s Lindy DeQuattro points out that “our water pipeline for The Perfect Storm [2000] was considered cutting edge at the time, but it was really just a cloth simulation for the water surface with particles generated based on motion vectors and surface normals to create splashes and sprays. These days, we have a true physics-based fluid simulator, which automatically generates accurate bubbles, splashes, mist, and so on, based on fluid motion. If you compare the water in The Perfect Storm to the work we more recently did on Pacific Rim (2013), it’s night and day. And we’ve had similar sweeping breakthroughs over the years in our creature pipeline, where we incorporated muscle and flesh simulation into the animation; in our rendering/shading pipeline, when we moved from a ray caster to a ray tracer; and in our lighting pipeline, when we switched to HDRI data and environment lighting.”

Those improvements, in turn, are powered by the incorporation of increasingly powerful game engines. Indeed, DeQuattro insists “the use of real-time tools will continue to be at the forefront of the industry.” “They are the key to better on-set visualization, as well as rapid turnaround and more iterations in post work, which leads to higher quality and lower costs,” she adds.
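The older splash technique DeQuattro describes, seeding particles from the surface's motion vectors rather than solving real fluid dynamics, can be caricatured in a few lines (purely illustrative toy code, not ILM's pipeline):

```python
# Toy version of the "particles from motion vectors" approach: spawn a
# spray particle wherever the water surface moves fast enough, then
# integrate it ballistically under gravity. Threshold and units invented.

G = -9.8  # gravity, m/s^2

def spawn_spray(surface_points, speed_threshold=4.0):
    """surface_points: list of (position, velocity) tuples."""
    def speed(v):
        return (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return [(p, v) for p, v in surface_points if speed(v) > speed_threshold]

def step(particles, dt):
    # Simple Euler integration: gravity acts on velocity, velocity on position.
    out = []
    for (x, y, z), (vx, vy, vz) in particles:
        vy += G * dt
        out.append(((x + vx * dt, y + vy * dt, z + vz * dt), (vx, vy, vz)))
    return out

surface = [((0, 0, 0), (0, 6.0, 0)),   # fast point: becomes a spray particle
           ((1, 0, 0), (0, 1.0, 0))]   # slow point: stays with the surface
spray = spawn_spray(surface)           # one particle spawned
spray = step(spray, dt=0.1)
```

A true fluid solver instead evolves the whole velocity field and derives spray, bubbles and mist from it automatically, which is the difference she is pointing at.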
“Improving the technology used in game engines to be able to produce film-quality VFX will be a huge area of development and will dramatically change the industry.”

According to MPC’s Damien Fangou, major advancements in The Jungle Book and other recent films were “all made possible around rendering technology. To be able to render such a large photoreal world was not a small feat. It required a tremendous amount of computing power to deliver those shots. We were able to leverage both our large in-house render farm at MPC, but also our secure Cloud rendering workflow.”

Indeed, Cloud computing, industry professionals argue, is about to become a more routine and powerful weapon in the VFX artist’s arsenal. Fangou expects “the Cloud to trigger a huge transformation in the VFX industry, as it will enable new workflows and allow for a greater complexity of work and photorealism in a shorter amount of time.” And it’s Sarnoff’s view that Cloud storage will become important to the economics of running a visual effects facility because, “Cloud storage is having an interesting impact on how (our industry) thinks about where data belongs and how we should pay for it.” By that, he means, “it’s not just a question of managing storage costs. It’s more about shifting storage line items from an inflexible CAPEX [capital expenditure] center, avoiding the need to buy expensive IT equipment, to a more flexible and scalable OPEX [operational expenditure] center, where storage service openings increase and decrease dynamically, depending on how you use it.”

Fangou adds, “There are also big opportunities around [artificial intelligence] technology. I expect the visual effects industry will be seeing some big changes with the automation and machine intelligence that AI is bringing.” Catmull agrees. In particular, he says the Pixar, Disney Animation and ILM research units are all investigating deep-learning computer algorithms and the wider AI paradigm to see how systems might be able to make certain visual effects processes more efficient. The problem, he suggests, is that while he is optimistic “deep learning is likely going to have an impact on our industry,” it still remains “such a broad concept, so it is not crystal clear what that impact is going to be yet, beyond the fact that the rate of growth and use of these chips puts us in a place to apply them in some unforeseen ways.”

Most industry representatives say, at the end of the day, most of these issues are about how the visual effects industry does things, and what tools and workflows it employs to do them. Tim Sarnoff suggests the verdict regarding these advancements, as always, will be rendered by the public. “The experience and the storytelling is what matters most when evaluating success in the creative technology arena, not the technical innovation itself,” Sarnoff emphasizes. “This has been the pattern in entertainment and communication throughout the ages, and I see no reason to suspect this pattern will not repeat itself again.”

RIGHT: Ron Frankel BOTTOM: Previs by Proof Inc. for Wonder Woman. (Image copyright © 2015 Warner Bros. Entertainment and courtesy of Proof Inc.)

TOP: Foundry’s Cara VR is an advanced toolkit that streamlines the process of VR content creation by dramatically speeding up the challenging, often tedious process of stitching and compositing 360° video footage. It also makes it easy for artists to use Nuke’s full suite of compositing tools on 360° footage. (Photo courtesy of Foundry) LEFT: Alex Mahon (Photo credit: Martin Beddall/Foundry)

LEFT: Jim Morris, VES (Photo credit: Deborah Coleman/Pixar) BOTTOM: Cars 3 (Photo copyright © 2016 Disney/Pixar Animation Studio. All Rights Reserved.)






INDUSTRY ROUNDTABLE

VR THOUGHT-LEADERS ON TECHNOLOGY’S NEW SILK ROAD

Is VR the Holy Grail of new technology? Companies like Google, Microsoft, Samsung, Facebook and others are investing billions of dollars in VR hardware and software technology. Motion picture studios, A-list directors, production companies, VFX houses, videogame firms, theme parks, museums, business corporations and related tech companies are also moving forward into this new, uncharted territory. That has triggered an avalanche of content creation by established companies as well as many startups investing their creative time and fortunes in development. VFX Voice Publisher Jim McCullaugh gathered a group of VR trailblazers from around the globe and asked them where we are in VR/AR and where we go from here.

JON PEDDIE, JON PEDDIE RESEARCH, TIBURON, CALIFORNIA

“Content developers are just learning how to manage the viewer’s attention, while giving him or her the excitement and discovery of a full, open, imaginary world, or a view of the real world. Also, the content developers are just now getting the software tools to develop these experiences and worlds. It’s a trial and error process.” —Jon Peddie, Jon Peddie Research



“We need more … creativity to help mainstream audiences appreciate the capacity VR has to blow our minds. To say this another way – turn me around! I want to be surrounded by what you’ve created. Otherwise I might as well be sitting on a couch watching it in 2D.” —Tony Parisi, Head of VR, Unity

When VR was thrust into the consciousness of the consumer in 2014, the biggest weakness was, and pretty much still is, content designed specifically for VR. That’s been somewhat ameliorated by passive 360 VR, which is gentler on the viewer. Content developers are just learning how to manage the viewer’s attention, while giving him or her the excitement and discovery of a full, open, imaginary world, or a view of the real world. Also, the content developers are just now getting the software tools to develop these experiences and worlds. It’s a trial-and-error process. The first thing most people think of when they hear VR is gaming. That’s because of the sensationally large investment by Facebook in Oculus. The other 40 headset makers that jumped into the market, plus the 70 smartphone-based Samsung Gear clones, and Sony, only had some retrofitted games to use as demonstrations. The result was that consumers came to associate VR with gaming machines, but it is ‘oh so much more than that’. Prior to the Facebook intrusion into VR, the market was based on scientific, engineering, medical and military applications. Now the applications that show great promise, in various stages of development, are: automotive, construction and real estate, events and conferences, film and entertainment, gaming, health care and medicine, journalism and media dissemination, law enforcement, manufacturing and logistics, marketing and advertising, military/defense, recruiting, talent management and HR, retail, and space exploration. Probably one of the biggest opportunities for clever use of VR is education. AR offers a bigger opportunity for marketing than VR does.

TONY PARISI, HEAD OF VR, UNITY, SAN FRANCISCO

As an immersive technology that represents the next computing platform, VR is not limited to making games; it can also be used to tell stories and create worlds. But what is most important to note about VR is how it creates empathy, escape, embodiment and engagement. It also needs to maintain high production value to fully immerse people, to the point where they may even question what ‘reality’ is. This requires thoughtful and thorough storytelling, beautiful and compelling graphics, and increasing levels of interactivity, whether through conversations, physical interaction or social constructs that promote engagement. Finally, VR needs to promote social interactions. This will be the difference between ‘immersing’ yourself and ‘losing’ yourself. One of the most crucial design considerations is how to tell a story that actually makes use of a 360-degree space. We need more of this creativity to help mainstream audiences appreciate the capacity VR has to blow our minds. To say this another way – turn me around! I want to be surrounded by what you’ve created. Otherwise I might as well be sitting on a couch watching it in 2D.

SOL ROGERS, CEO/FOUNDER, REWIND, UNITED KINGDOM

Goldman Sachs Research expects virtual and augmented reality to become an $80 billion market by 2025, roughly the size of the desktop PC market today. To get to this point we need mass adoption to occur. With the release of ARKit, Apple has accelerated Augmented Reality development, and when ARKit becomes available on Apple devices running iOS 11, AR will become accessible to a huge user base. Mobile is also going to be extremely important for VR. As mobiles become more powerful and capable of tracking and offering high-end interactive experiences, VR will reach the tipping point. Mobile is the bedrock for the progression of the immersive industries. A common criticism of VR is that it is isolating. While VR will always separate you from the real world, it can and will bring people together to communicate, experience and play in a truly memorable way. This is why Facebook sees VR as one of the world’s biggest social enablers. The all-immersive elements of VR make the experience much more enriching than a ‘like’ or ‘share’. The creation of content that connects individuals through a shared experience is incredibly important to push the industry forward.

“A common criticism of VR is that it is isolating. While VR will always separate you from the real world, it can and will bring people together to communicate, experience and play in a truly memorable way.” —Sol Rogers, CEO/Founder, Rewind

JON WADELTON, CHIEF TECHNOLOGY OFFICER, FOUNDRY, UNITED KINGDOM

For those of us developing VR software, the main focus over the next couple of years is on providing tools that allow directors to build riveting, immersive content. In order for VR technology to fully break into mainstream production and cement its position there, the content itself needs to live up to its name and truly replicate reality. VR content is still in its early stages, but steps are underway to take us to the next stage. In the coming years we can expect to see the next iteration of immersive content as it becomes ‘hyper-real’. The blending of physical and virtual reality together – as hyper reality does – is certainly a trend we expect to see a lot more of in content creation in the future.

“VR will naturally become a more accessible technology as it goes through a couple of life cycles, just as our phones and computers have done. But one thing we can aim to change more quickly is the price point of VR software and hardware.” —Jon Wadelton, Chief Technology Officer, Foundry



CHRISTOPHER GOMEZ, AR/VR INDUSTRY EVANGELIST & ANGEL INVESTOR, SINGAPORE

“In the filmmaking space the real question about content is: ‘Are we asking the right questions? Will 360 film change storytelling?’ No. How it’s told will be the same; how it’s experienced will be different.” —Christopher Gomez, AR/VR Industry Evangelist & Angel Investor



“VFX will play a large role in guiding the [VR] user in overt ways, and in suggesting action to the user in more subtle, even subliminal ways. VFX will continue to play the same role it has in 2D storytelling today in enhancing and creating worlds, scenarios and action that is difficult, costly, or impossible to do in real life.” —David Schleifer, COO, Primestream

VR and AR are alive and well in marketing, especially in activation campaigns and events. Storytelling is still disjointed and pretty incoherent. But I will say it’s not the big marketing names that are getting it right. It’s the small studios. If you heard about a project that was done by a big ad company, it was very likely outsourced. I am glad to be proven wrong, but being proven wrong once or thrice doesn’t make it the industry norm. So keep a lookout for the small-tech VR companies, not the marketing VR companies. There aren’t many, but they are out there. In the filmmaking space the real question about content is: ‘Are we asking the right questions? Will 360 film change storytelling?’ No. How it’s told will be the same; how it’s experienced will be different. There are many other issues that come with filmmaking and ‘transitioning’ film into VR film. Filmmaking is a linear medium. People may argue that storytelling isn’t linear because there is so much going on around you in the real world. But in a social, day-to-day setting, we process things sequentially, so stories are told in ways that help you understand. Stories on film are passive narratives, not experiential ones. When we go watch a movie, we are telling ourselves we are going to be an audience. Audiences are passive. In fact, audiences like and want to be passive. A cinematic experience is one that is social and relatively safe. There are those who are looking to get you to be a participant, and I believe that’s still too far a leap. It’s like cramming a video game into a movie. I believe that if there is going to be a serious breakthrough in ‘VR film’ it will come from the animation field. The film language of animation today fits a 2D viewing experience. But because animation is essentially made of bits, it is pliable. New storytelling elements or tools can be invented, introduced into the film language we know today, and further expanded on. In the coming year, expect VR theme parks. Not VR arcades, or the gigantic VR arcades you see popping up in parts of China, but actual theme parks where you can ride through a story.

DAVID SCHLEIFER, COO, PRIMESTREAM, MIAMI, THE NETHERLANDS AND INDIA

As is true in any media delivery ecosystem, content is king, but in the VR space there is also a focus on how to create compelling content, which has implications throughout the delivery chain. Cameras are improving at the same time that storytellers are working out the details of marrying spatial sound with high-resolution images. Distributors are working to fine-tune methods of delivering more pixels in the direction you are looking, without a need to download the entire sphere, cube or pyramid of images that the viewer is positioned in. The challenge in the VR or 360 space is how to tell a story – a guided narrative – in an environment that invites user redirection. Just like in real life, people take cues from their environment to make decisions about where to look, where to move, and how to interact. VFX will play a large role in guiding the user in overt ways, and in suggesting action to the user in more subtle, even subliminal ways. VFX will also continue to play the same role it has in 2D storytelling today, enhancing and creating worlds, scenarios and action that are difficult, costly, or impossible to do in real life.

“On the content side, we need to think about what that mainstream audience is going to love in VR. The answer is not hardcore video games. What will a typical Netflix viewer enjoy? We believe the answer is narrative content running in a real-time 3D game engine that delivers the right balance of immersion, presence and responsiveness.” —John Scott Tynes, Executive Producer for Virtual Reality, Holospark

“There is more and more fantastic work being produced all the time. It really feels like we’re not far away from having not just one, but also many ‘killer apps’ for VR.” —Karl Woolley, Head of VR, Framestore

JOHN SCOTT TYNES, EXECUTIVE PRODUCER FOR VIRTUAL REALITY, HOLOSPARK, SEATTLE

For VR to reach the mainstream, we need to see Rift/Vive-quality experiences running on smartphones, 6DOF (six degrees of freedom) head tracking, tracked motion controllers, and mobile GPUs that can run Unreal/Unity VR at much better quality than phones do today. Samsung Gear and Google Cardboard have shown there is a real appetite in the market for VR content, but mainly what they’re getting is 360 videos, while the really immersive stuff is behind a wall of high-end technology only used by hardcore gamers. On the content side, we need to think about what that mainstream audience is going to love in VR. The answer is not hardcore video games. What will a typical Netflix viewer enjoy? We believe the answer is narrative content running in a real-time 3D game engine that delivers the right balance of immersion, presence and responsiveness.

KARL WOOLLEY, HEAD OF VR, FRAMESTORE, UNITED KINGDOM

“We believe there is a real opportunity to take a more holistic approach on VR and build out the ecosystem by combining the best of VR in a destination-based application that is more social and interactive, and most importantly, more cost-accessible to consumers.” —Rob Lister, Chief Development Officer, IMAX

“This year I demand a high-budget Netflix VR series, at least five AAA game titles, and less spin-off film/TV/game experiences, and more things that are allowed to stand on their own. I want to see the first film spin-off of a VR experience!” —Mike Woods, Director of Immersive Content, m ss ng p eces

Like any medium, new or old, VR’s success is usually best determined by the content it produces, so for me the most important consideration for VR, right now and for the future, is content. I’ve no doubt we will get there, perhaps this year or next, but once we do, not only will there be demand for quality, I also believe there will be a demand for varied price points and genres, much like film/TV/games right now. There is more and more fantastic work being produced all the time. It really feels like we’re not far away from having not just one but many ‘killer apps’ for VR. The technology side of the VR industry is both its biggest upside and perhaps its most challenging side, too. VFX studios and the general community are used to changes and adaptations to their workflows, be it a new version of their favorite 3D software necessitating plugin rewrites, or the introduction of cloud-based or GPU rendering solutions. The VFX industry, at least in my personal experience, has always been good at adapting to new technologies – because it has to. VR is similar right now.

ROB LISTER, CHIEF DEVELOPMENT OFFICER, IMAX, LOS ANGELES

Historically, VR has been slow to go mainstream for several reasons, including costs, access to the technology and quality of content. While we expect to continue to see improvements on both the hardware and content side, which is always the case and certainly necessary in any burgeoning entertainment technology industry, at IMAX we believe there is a real opportunity to take a more holistic approach on VR and build out the ecosystem by combining the best of VR in a destination-based application that is more social and interactive, and most importantly, more cost-accessible to consumers. If you look at the cinema industry, there are thousands of players and components – from camera manufacturers, studios and filmmakers to editors, exhibitors, etc. – that all work together in an intricate ecosystem that has evolved over the last century. It will be some time before such a robust ecosystem is created in VR, but we believe we are in a unique position to leverage our brand, relationships and decades of experience to excite consumers about the potential of VR in these early days.

MIKE WOODS, DIRECTOR OF IMMERSIVE CONTENT, M SS NG P ECES, LOS ANGELES

We’ve had some groundbreaking experimentation over the past four years, but now is the time to connect to a wider public. There’s an inherent danger in getting too insular in our focus on design/execution/technical challenges. These are really exciting things for bright minds in our space, but we need to stop just communicating with each other and focus on some epic content that makes the wider public reach for their wallets. Content-wise, I hope to see less experimentation and more investment bravery in pure content plays. I want to be greedy. This year I demand a high-budget Netflix VR series, at least five AAA game titles, and less spin-off film/TV/game experiences, and more things that are allowed to stand on their own. I want to see the first film spin-off of a VR experience!

“Whoever creates a headset with the portability and price of mobile VR and the quality and interactivity of desktop VR will see iPhone-level success. The VFX community can contribute to that vision by designing worlds optimized for mobile and bringing their expertise to VR and AR startups.” —Cosmo Scharf, Co-founder VRLA and Mindshow

“Thus far, we have not seen the ‘virtual’ of virtual reality. All we have done is recreate the environment that we are familiar with. Everything looks like something that already exists. Nobody is taking advantage yet of the ‘virtual’ part. It doesn’t have to be an experience that even has a floor. There are a lot of experiences that have not been tried yet.” —Ryan Moore, CEO, Experience 360

COSMO SCHARF, CO-FOUNDER VRLA AND MINDSHOW, LOS ANGELES

The biggest impediment that makes widespread consumer adoption of VR challenging is the chasm between desktop and mobile VR in quality, immersion and interactivity. Headsets like the Vive and Rift are very high quality, but require expensive gaming PCs that sit in your living room. Devices like the Gear and Daydream are extremely portable, but lack great 6DOF (six degrees of freedom) controllers and positional tracking. Whoever creates a headset with the portability and price of mobile VR and the quality and interactivity of desktop VR will see iPhone-level success.

FALL 2017 VFXVOICE.COM • 93

9/14/17 4:02 PM


INDUSTRY ROUNDTABLE

about what that mainstream audience is going to love in VR. The answer is not hardcore video games. What will a typical Netflix viewer enjoy? We believe the answer is narrative content running in a real-time 3D game engine that delivers the right balance of immersion, presence and responsiveness. KARL WOOLLEY, HEAD OF VR, FRAMESTORE, UNITED KINGDOM

“We believe there is a real opportunity to take a more holistic approach on VR and build out the ecosystem by combining the best of VR in a destination-based application that is more social and interactive, and most importantly, more cost-accessible to consumers.” —Rob Lister, Chief Development Officer, IMAX

“This year I demand a high-budget Netflix VR series, at least five AAA game titles, and less spin-off film/TV/ game experiences, and more things that are allowed to stand on their own. I want to see the first film spin-off of a VR experience!” —Mike Woods, Director of Immersive Content, m ss ng p eces

Like any medium, new or old, the success of itself is usually best determined by the content it produces, so for me, the most important consideration for VR right now and for the future is content. I’ve no doubt we will get there, perhaps this year or next, but once we do, not only will there be demand for quality, I also believe there will be a demand for varied price points and genres, much like film/ TV/games right now. There is more and more fantastic work being produced all the time. It really feels like we’re not far away from having not just one, but also many ‘killer apps’ for VR. The technology side of the VR industry is both its biggest upside and also perhaps its most challenging side, too. VFX studios and the general community are used to changes and adaptations to their workflows, be it a new version of their favorite 3D software necessitating plugin rewrites, or introduction of cloud-based or GPU rendering solutions. The VFX industry, at least in my personal experience, has always been good at adapting to new technologies – because it has to. VR is similar right now. ROB LISTER, CHIEF DEVELOPMENT OFFICER, IMAX, LOS ANGELES

Historically, VR has been slow to go mainstream for several reasons, including costs, access to the technology and quality of content. While we expect to continue to see improvements on both the hardware and content side, which is always the case and certainly necessary in any burgeoning entertainment technology industry, at IMAX we believe there is a real opportunity to take a more holistic approach on VR and build out the ecosystem by combining the best of VR in a destination-based application that is more social and interactive, and most importantly, more cost-accessible to consumers. If you look at the cinema industry, there are thousands of players and components – from camera manufacturers, studios and filmmakers to editors, exhibitors, etc. – that all work together in an intricate ecosystem that has evolved over the last century. It will be some time before such a robust ecosystem is created in VR, but we believe we are in a unique position to leverage our brand, relationships and decades of experience to excite consumers about the potential of VR in these early days.

MIKE WOODS, DIRECTOR OF IMMERSIVE CONTENT, M SS NG P ECES, LOS ANGELES

We’ve had some groundbreaking experimentation over the past four years, but now is the time to connect to a wider public. There’s an inherent danger in getting too insular in our focus on design/execution/technical challenges. These are really exciting things for bright minds in our space, but we need to stop just communicating with each other and focus on some epic content that makes the wider public reach for their wallets. Content-wise, I hope to see less experimentation and more investment bravery in pure content plays. I want to be greedy. This year I demand a high-budget Netflix VR series, at least five AAA game titles, and less spin-off film/TV/game experiences, and more things that are allowed to stand on their own. I want to see the first film spin-off of a VR experience!

“Whoever creates a headset with the portability and price of mobile VR and the quality and interactivity of desktop VR will see iPhone-level success. The VFX community can contribute to that vision by designing worlds optimized for mobile and bringing their expertise to VR and AR startups.” —Cosmo Scharf, Co-founder VRLA and Mindshow

“Thus far, we have not seen the ‘virtual’ of virtual reality. All we have done is recreate the environment that we are familiar with. Everything looks like something that already exists. Nobody is taking advantage yet of the ‘virtual’ part. It doesn’t have to be an experience that even has a floor. There are a lot of experiences that have not been tried yet.” —Ryan Moore, CEO, Experience 360

COSMO SCHARF, CO-FOUNDER VRLA AND MINDSHOW, LOS ANGELES

The biggest impediment that makes widespread consumer adoption of VR challenging is the chasm between desktop and mobile VR in quality, immersion and interactivity. Headsets like the Vive and Rift are very high quality, but require expensive gaming PCs that sit in your living room. Devices like the Gear and Daydream are extremely portable, but lack great 6DOF (six degrees of freedom) controllers and positional tracking. Whoever creates a headset with the portability and price of mobile VR and the quality and interactivity of desktop VR will see iPhone-level success. The VFX community can contribute to that vision by designing worlds optimized for mobile and bringing their expertise to VR and AR startups.

INDUSTRY ROUNDTABLE

“In 20 years I guarantee you will be spending a lot of time with sims that will be doing some sort of various activity. Like the Internet you won’t be allowed to go back to an earlier time. You will have your eyewear on and be holding up your ultra smartphone. You will have insight into everything.” —Anthony Batt, Co-founder/Executive VP, Wevr

“VR will go far, but I think AR and MR will happen a lot sooner than VR and may even trump VR at least in the next five years. VR is a trickier proposition since you have to get consumers to acquire hardware. Of course, the devices will get lighter and portable. Augmented reality can get the masses via phone or tablet or TV screen.” —Øystein Larsen, VP Virtual, The Future Group

RYAN MOORE, CEO, EXPERIENCE 360, LOS ANGELES

Thus far, we have not seen the ‘virtual’ of virtual reality. All we have done is recreate the environment that we are familiar with. Everything looks like something that already exists. Nobody is taking advantage yet of the ‘virtual’ part. It doesn’t have to be an experience that even has a floor. There are a lot of experiences that have not been tried yet. We are only about two to three years into this and it will take some time for people to understand how to build for it. That is something that I would like to see spoken to. Thus, it’s more of a content issue than a tech issue. One other big issue is imagination. There are a lot of people who don’t get why what they have done, and the way they have done things for the last 50 years, should be disrupted. There is an ongoing education about VR now to the powers that be and people with the money. There has to be an education of ‘why VR’ and new ways to reach your customers. That takes time. You don’t really know until you put the headset on for the first time.

ANTHONY BATT, CO-FOUNDER AND EXECUTIVE VP, WEVR, VENICE, CALIFORNIA

Generally, 99% of all people have not experienced VR yet, including the press. So everyone is experiencing it from afar. If one were immersed in it for 40 months, then one would have a substantial opinion on it. It’s like the early days of the Internet when it took six hours to download something. People didn’t think it was going to happen. Now you could not imagine running any part of your life without the Internet. In 20 years I guarantee you will be spending a lot of time with simulations that will be doing some sort of various activity. Like the Internet you won’t be allowed to go back to an earlier time. You will have your eyewear on and be holding up your ultra smartphone. You will have insight into everything. It’s a wedge-based technology and it will affect all things. The Internet was a wedge-based technology. We are moving towards augmented simulations in 3D.

ØYSTEIN LARSEN, VP VIRTUAL, THE FUTURE GROUP, OSLO, NORWAY


Currently, with boundaries being pushed hardware-wise by companies like Varjo, as well as further development of lightfield technologies, opportunities extend for content creation. With the breadth of companies already doing extensive R&D in the field of VR, the road will be well paved when the hardware catches up. A big part of the VFX community has already embraced VR and has the skills necessary to create amazing experiences in this space. I think once a lot of the main issues with VR for most people are ironed out [bulky hardware, motion sickness, eye strain, resolution issues, etc.], the opportunities should be amazing. Visiting museums, live venues, meetings, socializing, playing games, e-shopping, e-learning – all from the comfort of your home. VR will go far, but I think AR and MR will happen a lot sooner than VR and may even trump VR at least in the next five years. VR is a trickier proposition since you have to get consumers to acquire hardware.

“The assets are pretty standard. It’s the integration of the assets into a 360 environment that is a challenge where you need more specific skill sets.” —Will Maurer, VP of VR and Animation, Legend 3D

“Content creators need to understand the medium: To avoid negative side effects, creators need to be aware of – and abide by – the emerging fundamentals of making VR.” —Aidan Sarsfield, Head of Production Technology, Animal Logic

WILL MAURER, VP OF VR AND ANIMATION, LEGEND 3D, LOS ANGELES

The assets are pretty standard. It’s the integration of the assets into a 360 environment that is a challenge where you need more specific skill sets. Finding artists in the 360 space is a challenge. Our competitive advantage is that we have been in it for a long time. Content is always king, no matter what the medium. I think it’s important for writers to understand the power and limitations of immersive/interactive 360 storytelling. And it’s equally important for producers to understand budget and execution. I see all too often ambitious projects being squeezed on an unrealistic budget, and in the end, you ultimately get what you pay for. The same can be said for the opposite as well. There are some lofty budgets being spent on ideas and treatments that aren’t fully fleshed out ahead of production. A big challenge for VR at the moment is the lack of quality content. I hark back to the early days of 3D, when a lot of bad content flooded the market and word spread quickly that ‘3D makes you sick’, or gives you a headache, or leaves you ‘cross-eyed’. I still talk to people who say they will not see a 3D movie for these reasons, some of whom have still never even attempted to view a 3D movie because of the negative feedback.

“The most important thing for the industry is to increase the quality and amount of content no matter what the VR usage – education, training, entertainment, etc.” —Tom Sanocki, CEO, Limitless Ltd.

AIDAN SARSFIELD, HEAD OF PRODUCTION TECHNOLOGY, ANIMAL LOGIC, AUSTRALIA/LOS ANGELES

Anyone who’s involved with VR would have to concede that, currently, there is an incredibly broad spectrum of content and hardware that makes up the media’s definition of VR. Because of that, I think we’re pretty close to the peak of the hype cycle. Personally, I’m never going to tell anyone that anything other than “six degrees of freedom” is VR, but sadly the media is not so specific. Audiences are having their expectations inflated, and often a first experience can leave the viewer underwhelmed. At best it can be a novelty and at worst it can turn them off to the medium. Based on that, I would suggest there are a few important considerations for the immediate future. Educating audiences: Mono panoramas are not VR, stereo panoramas are not VR, premium mobile is great, but HMD is best. Accept no substitutes – or prepare yourself to feel queasy. Content creators need to understand the medium: To avoid negative side effects, creators need to be aware of – and abide by – the emerging fundamentals of making quality VR.

TOM SANOCKI, CEO, LIMITLESS LTD., LOS ANGELES

This year, next year and, honestly, every year, what matters most is content. Cheaper and better VR hardware is important, but consumers won’t buy VR hardware if they don’t have amazing content for it. Often, one or two killer apps can drive hardware sales. The most important thing for the industry is to increase the quality and amount of content no matter what the VR usage – education, training, entertainment, etc. Obstacles to VR content creation are largely twofold: technology and time. If we can build content faster and cheaper, then we, the VR industry, could try out more ideas. We have three big challenges in VR. One: Uneven investment. VR investment leveled off this year due to 2016 consumer headset sales not meeting VCs’ expectations. By itself this isn’t a bad problem. There was a lot of investment last year, but it continues to be unevenly distributed. Two: Uneven talent distribution. VR content is overwhelmingly driven by game developers, and so most VR content is games. VR needs to go beyond games to reach a wider audience. Three: Lack of risk-taking. Because VR content is so hard and expensive to build, it is too tempting to lean on what is safe and known. This is a big limitation for VR.

PETER SCHLUEER, CO-FOUNDER AND PRESIDENT, WORLDVIZ, SANTA BARBARA

From our viewpoint, there are huge opportunities in the B2B marketing and sales arena. We’re addressing this opportunity with a new product that brings VR to sales teams looking for new, cost-effective and more immersive methods for communicating complex ideas to prospects and clients. It operates much like a ‘GoToMeeting’ for VR and takes advantage of VR’s unique ability to make someone feel ‘present’ in a virtual location, drastically diminishing the need for sales teams to travel to a physical location to present a product idea or a physical mockup. We see this offering as being immensely useful across a range of industries, including the entertainment space.

MIGUEL ANGEL DONCEL, CEO, SGO, MADRID, SPAIN

VR opens the door to a whole new realm of communication with endless possibilities. Making experiences more immersive completely changes the audience’s perception and experience of content. VR, to me, is not just a new form of media; it’s a new language, a new way to tell stories, arming storytellers with an amazing and powerful tool to experiment and play with. In my opinion, it will not only add value to current media content, but it will create unique categories of experiences. New ways to sell products, new ways to provide news, new ways to engage the audience with a story and move them to feel things they have never felt before. Over the next few years we will all learn this new language, where and how to apply it, and artists will find amazing ways to engage us as they always do.

“From our viewpoint, there are huge opportunities in the B2B marketing and sales arena. We’re addressing this opportunity with a new product that brings VR to sales teams looking for new, cost-effective and more immersive methods for communicating complex ideas to prospects and clients.” —Peter Schlueer, Co-founder/ President, WorldViz

“[VR] will not only add value to current media content, but it will create unique categories of experiences. New ways to sell products, new ways to provide news, new ways to engage the audience with a story and move them to feel things they have never felt before.” —Miguel Angel Doncel, CEO, SGO





VFX VAULT

In 1982, Ridley Scott’s Blade Runner set a distinctive tone for the look and feel of many sci-fi future film noirs to come, taking advantage of stylized production design, art direction and visual effects work. Supervisors Douglas Trumbull, VES, Richard Yuricich and David Dryer – via Entertainment Effects Group – oversaw Blade Runner’s Oscar®-nominated visual effects; work that was completed at a time when practical effects, miniatures, optical compositing and real film (in this case, 65mm film) were the norm. On the eve of the release of Denis Villeneuve’s Blade Runner 2049 sequel, VFX Voice revisits the miniatures of the original film with chief model maker Mark Stetson, VES. He and a crew of distinguished artists helped to craft many of the film’s iconic settings and vehicles, including the opening Hades landscape, Tyrell Corporation pyramids, the Spinner and other flying vehicles, the advertising blimp and the Los Angeles city landscape of 2019.

THE MINIATURE MODELS OF BLADE RUNNER
By IAN FAILES

TOP: Christopher S. Ross checks the scale convergence of the Hades Landscape. The artist drew the black silhouettes, in progressively reduced scales, over Virgil Mirano’s reference photography of the El Segundo refinery. They were arranged on 12” x 24” artboard panels from which lithographic films were made. The acid-etching supplier used the films to print stencil masks onto 12” x 24” brass sheet stock for etching the silhouettes (similar to the process used to make printed circuit boards). BOTTOM: The Hades Landscape model on the smoke stage set up on tables with under-lighting. Fans helped control the heat generated.
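The ‘progressively reduced scales’ in that layout follow simple perspective arithmetic: a flat cut-out reads as a full-size structure when its scale equals its physical distance from the camera divided by the distance it is meant to represent. A minimal sketch of that relationship, using hypothetical distances rather than the production’s actual figures:

```python
# Forced-perspective row scales, illustrative only: a silhouette built at
# scale d/D and placed d feet from the camera subtends the same angle as a
# full-size structure D feet away.

def row_scale(physical_dist_ft, virtual_dist_ft):
    """Scale factor for a cut-out at physical_dist_ft that should read
    as a full-size structure virtual_dist_ft from the camera."""
    return physical_dist_ft / virtual_dist_ft

# Hypothetical layout: a 20-ft tabletop standing in for miles of refinery.
virtual_ft = [500, 1000, 2000, 4000, 8000]   # distance each row should "read" as
physical_ft = [4, 8, 12, 16, 20]             # actual distance from the camera

for d, D in zip(physical_ft, virtual_ft):
    print(f"row at {d:2d} ft -> build at 1/{D / d:.0f} scale")
```

At a fixed table spacing, a row meant to read twice as far away is simply built at half the scale, which is why the silhouettes shrink so quickly toward the horizon.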

AN INDUSTRIAL BEGINNING

Blade Runner begins with a slow push-in over a heavily industrialized section of Los Angeles. Many were surprised when it became apparent that the endless refinery imagery – known as the Hades landscape – was largely achieved with rows of acid-etched brass silhouette cut-outs in a forced-perspective layout. “The original thought,” says Stetson, “was that Hades would be shot in a smoke room, it would all be backlit, and we would build it on a transparent tabletop so we could angle the lights with these rows of silhouettes. It didn’t really work out that well in that respect – there wasn’t enough lighting control or depth in the fine-scale miniature to separate the layers. So we decided to add a ground plane of cast-detailed parts to the foreground.” The ground plane structures were painted quite roughly to make the buildings look ‘aged and crappy’ – instant coffee was even used for that effect. Then, after making an evening flight into Los Angeles, Stetson was inspired to replicate in the Hades landscape the look of thousands of city lights. A myriad of fiber optic strands – seven miles’ worth – was added underneath the tables holding the silhouettes and other model pieces. The lights included a mix of different bulbs, too, all filmed in different passes, as were the gas flares captured ‘in-model’ with specially placed projection screens and a synchronized 35mm film projector.

PYRAMIDS OF THE FUTURE

The Tyrell Corporation, responsible for genetically engineered replicants in the future-verse of Blade Runner, has its headquarters in a pair of enormous pyramid-shaped buildings constructed by Stetson’s team as miniatures. Their trademark look involved intricate side panels made to appear as if they consisted of thousands of lit windows. The core pyramid structure was built essentially as a clear plastic shell. Flat panel patterns were accurately cut to make up the basic pyramid shape. The patterns’ surfaces were intricately detailed with plastic strip stock. “We laid down one quarter-inch plastic angle stock on its edges to create a stair-step surface,” says Stetson.


“On what would become the vertical faces of the steps we added tiny plastic strips of raised detail to indicate window casements, which eased the process of scraping off the window openings later.” Rubber molds were poured over the completed patterns. Then a new set of pyramid panels was cut from clear Plexiglas, exactly matching the pattern panels. The inner surface of the molds was spray-painted with automotive primer and then opaquing fluid, then poured full of clear polyester casting resin and ‘squishmolded’ with the pre-cut Plexiglas sheets pressed into the mold. When assembled, this made up the clear plastic shell for the pyramid structure. Once put together, the paint covering the raised plastic window areas was hand-scraped away to make it appear like windows and lights all across the pyramid. “The whole thing was ultimately just lit from within by a 10K bare bulb on the smoke stage,” says Stetson. “This required a ton of fans just to keep cool.” Concurrently, the model shop was producing double layers of acid-etched brass sheets to face the flying buttresses on each side of the pyramid. The buttress design posed a different lighting problem, which Kris Gregg solved with U-shaped fluorescent bulbs. “Despite heavy filter correction,” notes Stetson, “the buttress lighting was notably cooler in color than the main pyramid. We decided that was just part of the look of the building.”

SPINNING OUT SPINNERS

Equally iconic in Blade Runner lore are the flying police vehicles known as Spinners. In visual futurist Syd Mead’s design explorations for Blade Runner, he called the flying vehicles ‘aerodynes’ – Spinner was a brand name. Modelmakers crafted several scales of Spinner models (there were also full-sized versions built by legendary hot-rodder Gene Winfield), ranging from one inch to 50 inches. The models were laid up in molds made from original wooden patterns, with plastic detailing and a vacuum-formed canopy. The vehicles were particularly recognizable for their flaring and spinning police lights. In fact, the larger-scale Spinner models were a significant feat of engineering. They were made to include room for cabling, stepper motors, lighting, and even nitrogen plumbing for exhaust. “Late in the development of the models, Ridley asked for a rack of gumball-style police lights to be mounted on top of the car,” says Stetson. “Getting the lights to spin on the model required a new lighting rig that replaced the rear bodywork on the model and was shot on a repeat pass using motion control. We made little brass cans for each halogen light, with lensed snoots driven through speedo cables by a rack of stepper motors on the back of the car. Shooting in a heavily smoked stage, you’d see these beams of light rotating around, done on a second pass on the model. It was really effective and it really looked beautiful.”

BLADE RUNNER’S BROADCASTING BLIMP

A feature of future Los Angeles is a blimp that hovers over the city with video-screen advertising on its sides. It was realized as a miniature craft, wired with a mass of fiber optic cables for lighting,

TOP to BOTTOM: Foam-cast ground plane and additional towers near the foreground of the industrial wasteland being painted in a rough style. Fiber-optic strands are added to the Hades Landscape miniature. At left, Illyanna Lowry uses a caulking gun to secure the cables with silicon. Fellow artists Leslie Ekker, George Trimmer and Jon Roennau also work with the fiber optics. “We learned a hard lesson on Star Trek: The Motion Picture,” says Stetson, “that if you stuff those plastic fiber optics through urethane foam castings that aren’t fully cured, the strands would melt and you would lose everything you did. We didn’t make that mistake again.” Bill George details one of the pattern panels, adding thousands of plastic strips on the pyramid surface to aid the window scraping step.

FALL 2017 VFXVOICE.COM • 99

9/14/17 4:04 PM


VFX VAULT

In 1982, Ridley Scott’s Blade Runner set a distinctive tone for the look and feel of many sci-fi future film noirs to come, taking advantage of stylized production design, art direction and visual effects work. Supervisors Douglas Trumbull, VES, Richard Yuricich and David Dryer – via Entertainment Effects Group – oversaw Blade Runner’s Oscar®-nominated visual effects; work that was completed at a time when practical effects, miniatures, optical compositing and real film (in this case, 65mm film) were the norm. On the eve of the release of Denis Villeneuve’s Blade Runner 2049 sequel, VFX Voice revisits the miniatures of the original film with chief model maker Mark Stetson, VES. He and a crew of distinguished artists helped to craft many of the film’s iconic settings and vehicles, including the opening Hades landscape, Tyrell Corporation pyramids, the Spinner and other flying vehicles, the advertising blimp and the Los Angeles city landscape of 2019.

THE MINIATURE MODELS OF BLADE RUNNER By IAN FAILES

TOP: Christopher S. Ross checks the scale convergence of the Hades Landscape. The artist drew the black silhouettes, in progressively reduced scales, over Virgil Mirano’s reference photography of the El Segundo refinery. They were arranged on 12” x 24” artboard panels from which lithographic films were made. The acid-etching supplier used the films to print stencil masks onto 12” x 24” brass sheet stock for etching the silhouettes (similar to the process used to make printed circuit boards). BOTTOM: The Hades Landscape model on the smoke stage set up on tables with under-lighting. Fans helped control the heat generated.
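The “scale convergence” Ross was checking — identical refinery silhouettes drawn at progressively reduced scales so that rows compressed onto a tabletop read as structures receding for miles — can be sketched numerically. A minimal illustration under a simple pinhole-camera assumption; the function name, distances and the 60m tower height below are hypothetical, not production figures:

```python
# Forced-perspective sizing sketch (illustrative assumptions only):
# under a pinhole camera, apparent size scales as height / distance,
# so a row placed close to the lens must be built smaller to read as
# a full-sized structure far away.

def forced_perspective_height(real_distance_m, stage_distance_m,
                              real_height_m=60.0):
    """Model height (m) so a silhouette at stage_distance_m from the
    lens subtends the same angle as a real_height_m structure at
    real_distance_m."""
    # Matching angular size: h_model / stage = h_real / real
    return real_height_m * stage_distance_m / real_distance_m

# Rows meant to read as 200 m to 1,600 m away, squeezed onto a
# tabletop one to two-and-a-half meters from the camera:
for row, (real, stage) in enumerate(
        zip([200.0, 400.0, 800.0, 1600.0], [1.0, 1.5, 2.0, 2.5]), 1):
    h_cm = forced_perspective_height(real, stage) * 100
    print(f"row {row}: at {stage:.1f} m on stage, build {h_cm:.1f} cm tall")
```

Each deeper row shrinks faster than its stage placement advances — which is why the artboards carried progressively smaller silhouettes of the same refinery shapes.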

AN INDUSTRIAL BEGINNING

Blade Runner begins with a slow push-in over a heavily industrialized section of Los Angeles. Many were surprised when it became apparent that the endless refinery imagery – known as the Hades landscape – was largely achieved with rows of acid-etched brass silhouette cut-outs in a forced-perspective layout. “The original thought,” says Stetson, “was that Hades would be shot in a smoke room, it would all be backlit, and we would build it on a transparent tabletop so we could angle the lights with these rows of silhouettes. It didn’t really work out that well in that respect – there wasn’t enough lighting control or depth in the fine-scale miniature to separate the layers. So we decided to add a ground plane of cast-detailed parts to the foreground.” The ground plane structures were painted quite roughly to make the buildings look ‘aged and crappy’ – instant coffee was even used for that effect. Then, after making an evening flight into Los Angeles, Stetson was inspired to replicate the look of thousands of city lights in the Hades landscape. A myriad of fiber-optic strands – seven miles’ worth – was added underneath the tables holding the silhouettes and other model pieces. The lights included a mix of different bulbs, all filmed in different passes, as were the gas flares, captured ‘in-model’ with specially placed projection screens and a synchronized 35mm film projector.

PYRAMIDS OF THE FUTURE

The Tyrell Corporation, responsible for the genetically engineered replicants in the future-verse of Blade Runner, has its headquarters in a pair of enormous pyramid-shaped buildings, constructed by Stetson’s team as miniatures. Their trademark look involved intricate side panels made to appear as if they consisted of thousands of lit windows. The core pyramid structure was built essentially as a clear plastic shell. Flat panel patterns were accurately cut to make up the basic pyramid shape, and the patterns’ surfaces were intricately detailed with plastic strip stock. “We laid down one quarter-inch plastic angle stock on its edges to create a stair-step surface,” says Stetson.

98 • VFXVOICE.COM FALL 2017


“On what would become the vertical faces of the steps we added tiny plastic strips of raised detail to indicate window casements, which eased the process of scraping off the window openings later.” Rubber molds were poured over the completed patterns. Then a new set of pyramid panels was cut from clear Plexiglas, exactly matching the pattern panels. The inner surface of the molds was spray-painted with automotive primer and then opaquing fluid, then poured full of clear polyester casting resin and ‘squish-molded’ with the pre-cut Plexiglas sheets pressed into the mold. When assembled, this made up the clear plastic shell for the pyramid structure. Once it was put together, the paint covering the raised plastic window areas was hand-scraped away to make it appear like windows and lights all across the pyramid. “The whole thing was ultimately just lit from within by a 10K bare bulb on the smoke stage,” says Stetson. “This required a ton of fans just to keep cool.” Concurrently, the model shop was producing double layers of acid-etched brass sheets to face the flying buttresses on each side of the pyramid. The buttress design posed a different lighting problem, which Kris Gregg solved with U-shaped fluorescent bulbs. “Despite heavy filter correction,” notes Stetson, “the buttress lighting was notably cooler in color than the main pyramid. We decided that was just part of the look of the building.”

SPINNING OUT SPINNERS

Equally iconic in Blade Runner lore are the flying police vehicles known as Spinners. In visual futurist Syd Mead’s design explorations for Blade Runner, he called the flying vehicles ‘aerodynes’ – Spinner was a brand name. Modelmakers crafted Spinner models in several scales, ranging from one inch to 50 inches (full-sized versions were also built by legendary hot-rodder Gene Winfield). The models were laid up in molds made from original wooden patterns, with plastic detailing and a vacuum-formed canopy. The vehicles were particularly recognizable for their flaring and spinning police lights, and the larger-scale Spinner models were a significant feat of engineering, made to include room for cabling, stepper motors, lighting, and even nitrogen plumbing for exhaust. “Late in the development of the models, Ridley asked for a rack of gumball-style police lights to be mounted on top of the car,” says Stetson. “Getting the lights to spin on the model required a new lighting rig that replaced the rear bodywork on the model and was shot on a repeat pass using motion control. We made little brass cans for each halogen light, with lensed snoots driven through speedo cables by a rack of stepper motors on the back of the car. Shooting in a heavily smoked stage, you’d see these beams of light rotating around, done on a second pass on the model. It was really effective and it really looked beautiful.”

BLADE RUNNER’S BROADCASTING BLIMP

A feature of future Los Angeles is a blimp that hovers over the city with video-screen advertising on its sides. It was realized as a miniature craft, wired with a mass of fiber optic cables for lighting,

TOP to BOTTOM: Foam-cast ground plane and additional towers near the foreground of the industrial wasteland being painted in a rough style. Fiber-optic strands are added to the Hades Landscape miniature. At left, Illyanna Lowry uses a caulking gun to secure the cables with silicone. Fellow artists Leslie Ekker, George Trimmer and Jon Roennau also work with the fiber optics. “We learned a hard lesson on Star Trek: The Motion Picture,” says Stetson, “that if you stuff those plastic fiber optics through urethane foam castings that aren’t fully cured, the strands would melt and you would lose everything you did. We didn’t make that mistake again.” Bill George details one of the pattern panels, adding thousands of plastic strips to the pyramid surface to aid the window-scraping step.




and had footage thrown onto the screens via a synchronized 35mm projector during a second pass. Stetson imagined making the blimp by cutting a template of the perimeter into a thick piece of plywood, then stretching a piece of surgical rubber sheet across it. “I thought we could just fill it with plaster and let the weight of it sag the rubber sheet and we’d get a real nice pneumatic feel to our mold,” he says. Although the plaster proved too heavy for that concept, the modelmakers were able to generate an ‘organic’-looking shape around which they retrofitted framework and many intricate model parts. The blimp had three projection screens, which were actually re-purposed trays from a children’s magnetic ball game (these were also used for a particularly iconic shot of a Geisha appearing on the side of an LA skyscraper). “We painted the trays opaque black and then dry-brushed silver paint over the top of them,” outlines Stetson. “That became our somewhat pixelated-looking projection screen.”

CITY BUILDING

TOP to BOTTOM: John Vidor spent weeks with a chisel cutter and an X-acto knife scraping away opaque paint from the raised plastic areas on the pyramid to reveal thousands of windows, illuminated from within the model by a bare 10K stage light bulb. Thomas R. Field (in background) surveys the pyramid, now close to final assembly and with additional buttress pieces, on the shooting stage. Kris Gregg in the foreground is setting up the lighting in the buttresses.

Miniature city buildings and environments, along with matte paintings, made up views of a 2019 Los Angeles. The model buildings were a purposely mixed bag of structures and recycled constructs, as well as some secret inserts included by Stetson’s team. Look out for a building with a suspiciously similar shape to the Millennium Falcon, based on Bill George’s 5-foot replica, and for Jon Roennau’s 4-foot replica of the Dark Star ship from the John Carpenter film. “Basically,” says Stetson, “it was like, turn everything you could possibly find into a building and throw it in the city.”

An initial set of miniature buildings was constructed in a traditional manner, with the buildings sitting on the floor, fulfilling storyboards that were conceptually more like vertical set extensions of the city street set at the Burbank Studios. Ridley Scott soon wanted shots achieved as aerials, with a complicated mix of city structures below. “What Ridley really wanted was more of Syd Mead’s visualization of the future for the film, with soaring megastructures planted among the decaying remains of the old city below,” recalls Stetson. “So the emphasis in the miniature build shifted to huge towers and high-altitude aerial views.” Since the motion-control camera rig was such a large apparatus, obtaining high-angle shots presented a rigging problem. The solution: tilting the buildings and allowing for a lower angle of attack. “These pieces of motion-control equipment were big and heavy because they were designed to carry 65mm cameras,” notes Stetson. “We just didn’t have the height in the studio and we couldn’t get the camera high enough to make these moving shots work unless we tilted the models over.”
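The tilt trick Stetson describes trades camera height for model rotation: the film only records the relative orientation of camera and model, so tilting the model toward the lens adds directly to the apparent look-down angle. A back-of-envelope sketch — the angles, distances and function names here are illustrative assumptions, not production figures:

```python
import math

def apparent_depression_deg(camera_angle_deg, model_tilt_deg):
    """Look-down angle the shot reads as: the camera's true depression
    plus however far the model is tilted toward the lens."""
    return camera_angle_deg + model_tilt_deg

def camera_height_needed(distance_m, depression_deg):
    """Stage height an untilted model would demand for the same angle."""
    return distance_m * math.tan(math.radians(depression_deg))

# A 60-degree aerial on a model 6 m from the lens would need the camera
# roughly 10.4 m up -- likely beyond the stage ceiling:
print(round(camera_height_needed(6.0, 60.0), 1))

# Tilting the model 45 degrees lets a modest 15-degree camera rig
# read as the same 60-degree aerial:
print(apparent_depression_deg(15.0, 45.0))
```

The geometry also hints at the trick’s limits: anything gravity-dependent, like smoke or hanging cables, would betray the tilt, which rigid architectural miniatures avoid.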

TOP to BOTTOM, LEFT to RIGHT: Tom Pahk works on the 1/12th-scale Spinner model, fitting a glass roof panel to the top of the vehicle. Bob Wilcox (left) and Bill George wire up the quarter-scale Spinner model with cabling to enable the lighting for filming. Bill George sculpts the miniature puppet of Deckard. “Bill sculpted Deckard and Gaff from scratch,” says Stetson. “We didn’t spend much time on their costumes because what you’d see were pretty much silhouettes. But because this was a big bubble-topped design, we thought that we had to have something in there – it couldn’t be empty seats.” A composite photograph showing the quarter-scale Spinner model lit up. The re-purposed trays that made up the projection screens. The blimp is in an earlier state of finish, without the bristling antennae around the perimeter. Michael McMillen makes adjustments to the blimp model. This image shows the mass of fiber-optic cabling going into the miniature. Michael McMillen continues to wire up small lights around the borders of the advertising screens on the blimp. Says Stetson: “We had fiber-optic bundles that sort of chased light around the edges of the screens, then we had all these incandescent lights.” Patrick Van Auken, on the ladder, rigs the blimp model for filming, attached to a neon-lit (for mattes) motion-control model mover. Mark Stetson, VES, adjusts elements for a city miniature shot.

Christopher S. Ross airbrushes an insert model for a landing platform at the top of the pyramid using a black opaquing fluid.





COMPANY PROFILE

GLOBAL SPOTLIGHT: INDIA’S RED CHILLIES VFX

Photos courtesy of Red Chillies VFX.

TOP to BOTTOM: Keitan Yadav, Chief Operating Officer/VFX Producer; Haresh “Harry” Hingorani, Chief Creative Officer/VFX Supervisor. For the 2016 movie Fan, in which Shah Rukh Khan portrayed both a superstar and the obsessed fan, Red Chillies VFX transformed Khan to make him look 20 years younger. In addition to making him younger, leaner and smaller in build, they changed his facial features, like the size of his nose and the texture of his skin.

Founded in 2002, Red Chillies Entertainment, based in the heart of Mumbai, India, has blossomed into a major motion picture production and distribution company, and in 2006 it added a premier visual effects house, Red Chillies VFX. Shah Rukh Khan, the noted ‘Bollywood’ actor, created the parent company. Ra.One and Krrish 3 were two early movies in which the visual effects side of the company made a name for itself. Red Chillies VFX is now also a post-production studio specializing in visual effects for feature films and commercials, and it has established itself as one of the top visual effects studios in that part of the world, with an eye on homegrown projects as well as international activities.

Says Keitan Yadav, Chief Operating Officer and VFX Producer, about the growth of the industry and Red Chillies VFX: “I think if I compare the last 10 years of filmmaking, we have progressed very significantly, in terms of genre, complexity, turnover, usage and awareness of VFX in the industry. It has boomed!” Adds Haresh “Harry” Hingorani, Chief Creative Officer and VFX Supervisor: “There is a big lineup of VFX films like Robot 2.0 which will open the road for a lot of production houses, directors and producers. I think we have a bright future. It will be an era of biopics, mythological, historical films and, with the likes of Netflix and Amazon, the scale and requirements of VFX is going up. It will be the golden years of the VFX industry.”

Current VFX film work includes such titles as Khaidi No. 150, Raees, Phillauri, Tubelight and Judwaa 2. Earlier, Red Chillies VFX was involved with such major movies as Chak De! India, Krrish 3, Don: The Chase Begins Again, Dostana and De Dana Dan. Current projects include an in-house and much-awaited feature, Dwarf, starring owner Shah Rukh Khan. Red Chillies is also looking to set up a VR facility and is currently assembling a VR team.



[ VES SECTION SPOTLIGHT: GERMANY ]

Newest VES Affiliate Reflects Growing Impact of German VFX By ANDREW HORN

TOP: VES Germany members watching the War for the Planet of the Apes Q&A live in Stuttgart. BOTTOM: VES Germany members attending the War for the Planet of the Apes Q&A in Stuttgart on a lobby break. All Stuttgart photos by Nina Pries and all Berlin photos by Florian Gellinger.


Somebody once called Los Angeles “72 suburbs in search of a city,” and perhaps something similar could be said for the VFX industry in Germany. While most VES sections are concentrated in centers like Vancouver, Toronto or London, the principal German companies are spread out over the country in Munich, Stuttgart and Berlin, with a lesser presence in Cologne and Frankfurt, as well as smaller pockets scattered around. This is because of independent regional subsidy systems that require companies to locate in various states, usually more than one, to follow the money. Germany introduced a new nationwide 25% VFX rebate on August 1, 2017 to catch up with competing countries – but until now, the lack of a strong national subsidy like those of the UK or Canada, which cover international films shot outside the country, has limited German companies’ market advantage.

However, this has not stopped the Germans from contributing to a slew of Hollywood films, from Harry Potter and the Deathly Hallows to Star Trek Into Darkness, Rogue One: A Star Wars Story, Rush, practically all the Marvel films and Spider-Man: Homecoming, including major work on HBO’s Game of Thrones and now the first German series for Netflix (Dark) and Amazon (You Are Wanted) – not to mention a steady output of local features, TV, animation and commercial work. Section board member Michael Landgrebe, partner at Celluloid in Berlin, says, “Looking at the scene 10 years ago it seemed absolutely extraordinary to think that a German effects house would be working on Hollywood tentpole movies.”

If one had to name the point when German VFX entered the wider world, it would be 2009, when Chris Townsend was working out of Trixter in Munich on Ninja Assassin, which was shot in Berlin. He wanted to spread the work out to other local companies and brought in RISE, which had just formed in Berlin, as well as Pixomondo in Stuttgart.
Back then, Section board member and compositor Nina Pries had just started as an intern at Pixomondo. “I came directly from university, and that project was really awesome for everyone on the team. It was a Hollywood project, and we felt we had to do our best. There was really great team spirit.”

“The VES puts us on the world map. But the next problem is also how do we get the people, the artists, who had this great German training, who originally went to Wellington or London or Vancouver to work on the bigger jobs, to come back home?”
—Florian Gellinger, VES Section Chair, VFX Supervisor/Partner, RISE

More work followed, and 2012 was the year the world took notice, after Pixomondo’s Ben Grossmann and Alex Henning won the Oscar for Hugo; Pixomondo’s Sven Martin went on to win a 2013 Emmy for Game of Thrones Season 3. That same year Cloud Atlas showcased significant work by RISE, Trixter and Scanline.

Before the thought of a VES Section, people in the VFX scene were already expressing a need to get together, if only on a local level. Semi-regular hangouts and meet-ups were being independently organized in Stuttgart, Berlin, Munich and Cologne, but there was no interaction between the cities. FMX was a major nexus for the industry, but that’s only once a year. Germany already had some VES members, but the point of it seemed abstract. Landgrebe, who’d already been a member for two years, says, “It was cool to get the newsletter and know what was going on and receiving the screeners. But we were paying our dues and weren’t able to benefit from things like having screenings and organizing events because it was all happening somewhere else.”

In March 2016, Florian Gellinger, VFX Supervisor and partner at RISE in Berlin, was thinking that the German VFX scene needed some kind of organization to represent it, and he realized that rather than start something new it would make more sense to have a Germany Section of the VES and benefit from the connections and infrastructure already in place. “At that point I think we had over 40 members in Germany,” Gellinger explains, “so we could build on that. But,” he discovered, “Munich and Berlin at that point were completely neglected, with almost no members at all.”

At FMX he met with Tim McGovern, head of outreach at the VES, who came on board to help expedite the membership process and endorsement letters. “That was pretty much my hobby last year,” Gellinger continues, “to get people to sign up to become a member, and then helping them organize paperwork. By November 2016, when the new members came on board, we had the biggest push of any country, getting 27 new members in one go.” After the May 2017 meeting of the Board of Directors there were a total of 91 VES members in Germany. Plus, now anybody who lives or has their permanent address in Germany automatically becomes a member of the Germany Section.

The next step was to elect a board. With Gellinger taking the Chair, the idea was to spread the board members across Germany. Comments Gellinger: “You’ve got Benedikt Niemann, Animation Supervisor at Ambient Entertainment in Hannover, who’s our secretary, and we’ve got Urs Franzen, who’s a visual effects supervisor from Munich, as treasurer, which is great because Munich – surprisingly, being a major center – still does not have many members, but we’re getting there. And then we’ve got three board members from Stuttgart, five from Berlin and one from Cologne. So, all in all, it’s representing the membership from each part of Germany.”

The section’s first official get-together took place this May at FMX, where a gathering of about 20 people, including principal board members, met with VES Executive Director Eric Roth and FMX organizer Andreas Hykade. Board member and Celluloid partner Holger Hummel remembers: “We all met for the first time in one room and got to know each other. And

TOP to BOTTOM: Jan Fiedler from Pixomondo welcomes Stuttgart’s members to the VES Germany’s first-ever screening event for War for the Planet of the Apes. Jonathan Weber and Oliver Hohn from RISE enjoy drinks at the Eiszeit theater’s bar in Berlin. Michael Wortmann, VFX Supervisor on Atomic Blonde, enjoys sharing experiences beyond company walls. VES Germany screened Atomic Blonde with a live Q&A with Wortmann in August.



[ VES SECTION SPOTLIGHT: GERMANY ]

Newest VES Affiliate Reflects Growing Impact of German VFX By ANDREW HORN

TOP: VES Germany members watching the War for the Planet of the Apes Q&A live in Stuttgart. BOTTOM: VES Germany members attending the War for the Planet of the Apes Q&A in Stuttgart on a lobby break. All Stuttgart photos by Nina Pries and all Berlin photos by Florian Gellinger.

104 • VFXVOICE.COM FALL 2017

PG 104-107 VES GERMANY.indd 104-105

Somebody once called Los Angeles “72 suburbs in search of a city” and perhaps something similar could be said for the VFX industry in Germany. While most VES sections are concentrated in centers like Vancouver, Toronto or London, the principle German companies are spread out over the country in Munich, Stuttgart and Berlin, with a lesser presence in Cologne and Frankfurt, as well as smaller pockets scattered around. This is because of independent regional subsidy systems that require companies to locate in various states, usually more than one, to follow the money. Germany also just introduced a new nationwide 25% VFX rebate on August 1, 2017 to catch up with competing countries – but until now the lack of a strong national subsidy like the UK or Canada that would cover international films shot outside the country has limited German companies’ market advantage. However, this has not stopped the Germans from contributing to a slew of Hollywood films from Harry Potter and the Deathly Hallows to Star Trek Into Darkness, to Rogue One: A Star Wars Story, Rush, practically all the Marvel films, as well as Spider-Man: Homecoming, and including major work on HBO’s Game of Thrones and now the first German series for both Netflix’s Dark and Amazon’s You Are Wanted. Not to mention a steady output of local features, TV, animation and commercial work. Section board member Michael Landgrebe, partner at Celluloid in Berlin, says, “Looking at the scene 10 years ago it seemed absolutely extraordinary to think that a German effects house would be working on Hollywood tentpole movies.” To name a point when German VFX entered the outside world, it would be in 2009 when Chris Townsend was working out of Trixter in Munich doing Ninja Assassin, which was shot in Berlin. He wanted to spread the work out to other local companies and brought in RISE, which had just formed in Berlin, as well as Pixomondo in Stuttgart. 
Back then, Section board member and compositor Nina Pries, had just started as an intern at Pixomondo, “I came directly from university, and that project was really

“The VES puts us on the world map. But the next problem is also how do we get the people, the artists, who had this great German training, who originally went to Wellington or London or Vancouver to work on the bigger jobs, to come back home?” —Florian Gellinger, VES Section Chair, VFX Supervisor/Partner, RISE awesome for everyone on the team. It was a Hollywood project, and we felt we had to do our best. There was really great team spirit.” More work followed, and 2012 was the year when the world took notice after Pixomondo’s Ben Grossmann and Alex Henning won the Oscar for Hugo and Pixomondo’s Sven Martin won a 2013 Emmy for Game of Thrones Season 3. That same year Cloud Atlas showcased significant work by RISE, Trixter and Scanline. Before the thought of a VES Section, people in the VFX scene were already expressing a need to get together, if only on a local level. Semi-regular hangouts and meet-ups were being independently organized in Stuttgart, Berlin, Munich and Cologne, but there was no interaction between the cities. FMX was a major nexus for the industry, but that’s only once a year. Germany already had some VES members, but the point of it seemed abstract. Landgrebe, who’d already been a member for two years, says, “It was cool to get the newsletter and know what was going on and receiving the screeners. But we were paying our dues and weren’t able to benefit from things like having screenings and organizing events because it was all happening somewhere else.” In March 2016, Florian Gellinger, VFX supervisor and partner at RISE in Berlin, was thinking that the German VFX scene needed some kind of organization to represent them and realized that rather than start something new it would make more sense to have a Germany Section of VES and benefit from the connections and infrastructure already in place. “At that point I think we had over 40 members in Germany,” Gellinger explains, “so we could build on that. 
But,” he discovered, “Munich and Berlin at that point were completely

neglected, with almost no members at all.” At FMX he met with Tim McGovern, head of outreach at VES, who came on board to help expedite the membership process and endorsement letters. “That was pretty much my hobby last year,” Gellinger continues, “to get people to sign up to become a member, and then helping them organize paperwork. By November 2016, when the new members came on board, we had the biggest push of any country, getting 27 new members in one go.” After the May 2017 meeting of the Board of Directors there were a total of 91 VES members in Germany. Plus, now anybody who lives or has their permanent address in Germany automatically becomes a member of the Germany Section. The next step was to elect a board. With Gellinger taking the Chair, the idea was to spread out the board members across Germany. Comments Gellinger: “You’ve got Benedikt Niemann, Animation Supervisor at Ambient Entertainment in Hannover, who’s our secretary, and we’ve got Urs Franzen, who’s a visual effects supervisor from Munich, as treasurer, which is great because Munich – surprisingly, being a major center – still does not have many members, but we’re getting there. And then we’ve got three board members from Stuttgart, five from Berlin and one from Cologne. So, all in all, it’s representing the membership from each part of Germany.” The section’s first official get-together took place this May at FMX where a gathering of about 20 people, including principal board members, met with VES Executive Director Eric Roth and FMX organizer Andreas Hykade. Board member and partner in Celluloid, Holger Hummel, remembers: “We all met for the first time in one room and got to know each other. And

TOP to BOTTOM: Jan Fiedler from Pixomondo welcomes Stuttgart’s members to VES Germany’s first-ever screening event for War for the Planet of the Apes. Jonathan Weber and Oliver Hohn from RISE enjoy drinks at the Eiszeit theater’s bar in Berlin. Michael Wortmann, VFX Supervisor on Atomic Blonde, enjoys sharing experiences beyond company walls. VES Germany screened Atomic Blonde with a live Q&A with Wortmann in August.

FALL 2017 VFXVOICE.COM • 105



[ VES SECTION SPOTLIGHT: GERMANY ]

“The industry is still quite young in Germany, we’re still taking our baby steps, but I think we have a great potential in the VES because we are starting out new.”
—Urs Franzen, VFX Supervisor

it was Eric and Florian welcoming us, and Eric pointing out that as a long-term attendee of FMX he was very aware of the quality work coming out of Germany, despite its comparative niche status, and was really happy to finally kick it off.”

The first members’ event took place in July, in both Berlin and Stuttgart, for the screening of War for the Planet of the Apes, with an international hookup for a live Q&A with director Matt Reeves and Joe Letteri of Weta. Both Berlin and Stuttgart started off the evening with a pub night at the bars of their respective theaters, Eiszeit Kino in Berlin and the Metropol in Stuttgart. While the event itself was a good first step, the logistics of the live-stream hookup were problematic due to the time difference. The Q&A, beamed from L.A., went on at 11 p.m. German time, which created problems with booking a cinema in a prime Saturday night slot, which in turn meant the German group had to watch the Q&A before they saw the film. Plus, the live stream had to be interrupted at midnight so that the film could be screened. Despite the difficulties, this first official get-together was a success on a personal level in both Berlin and Stuttgart, with Berlin having a full house for the event.

Directly afterwards, Berlin began setting up a model for similar local events at a more logistically convenient time – Tuesday evenings after work – using the Eiszeit Kino, which is centrally located for most of the companies. As a response to Germany’s regional fragmentation, there is a general desire to develop ways to bring people together – with screenings, VES pub nights and BBQs – and even talk of national meet-ups.

But it’s also about sharing information, and people are already discussing the possibility of an active exchange of ideas through organized lectures and discussions with local and visiting professionals. To this end, Atomic Blonde was screened in Berlin in August with a Q&A by VFX Supervisor Michael Wortmann



from Chimney in Berlin. (Valerian and the City of a Thousand Planets screened in July.) Nina Pries also reports that discussions are underway in Stuttgart to develop a lecture series along the lines of TED Talks. Board treasurer and freelance VFX Supervisor Urs Franzen adds, “It was always in my mind that the entire VFX community should come together somehow. Some people say it’s just a community, you meet each other for a beer or something, but for me it’s something more far-reaching – making a future. And with people working in Germany and abroad in the UK or New Zealand or Canada, it’s a global community. The industry is still quite young in Germany,” he continues, “we’re still taking our baby steps, but I think we have a great potential in the VES because we are starting out new.”

Gellinger concludes, “The VES puts us on the world map. Up to now, with all the great work coming out of Germany, now the clients know us, but the next problem is also how do we get the people, the artists, who had this great German training, who originally went to Wellington or London or Vancouver to work on the bigger jobs, to come back home?

“Just having the newsletter going out and seeing that we’re part of what’s happening with all these major VFX cities that all have amazing subsidies, and suddenly there’s Germany among them. So it just pokes people like, hey look, Germany is on the map. And that makes a huge difference.”

TOP to BOTTOM: Jan Adamczyk from Trixter (left) discussing the state of the industry with supervisors and artists at Berlin’s first pub night. Marlies Schacherl from Automatik and Jonathan Weber and Nicolas Leu from RISE use the “Berlin Pub Night” to share industry gossip. Florian Gellinger, Chair of the VES Germany Board of Directors and Co-founder of RISE, taking a selfie with a DSLR. Michael Landgrebe of Celluloid VFX and Berlin members catching some fresh air in front of the theater on a warm summer night.

TOP: The Eiszeit theater in Berlin exactly fit all 49 VES members and their guests at VES Germany’s first-ever screening event.

MIDDLE LEFT: The first gathering of the German Section took place on May 4 at FMX 2017 in Stuttgart.

MIDDLE RIGHT: VES Executive Director Eric Roth welcomes the newest Section to the global family.

BOTTOM, LEFT to RIGHT: Florian Gellinger, Chair; Benedikt Niemann, Secretary; Urs Franzen, Treasurer.





[ V-ART ]

Lorin Wood

V-Art showcases the talents of worldwide VFX professionals as they create original illustrations, creatures, models and worlds. If you would like to contribute to V-Art, send images and captions to publisher@vfxvoice.com.

Lorin Wood has been a conceptual designer for nearly 20 years, his imagination first sparked by the work of Syd Mead, Joe Johnston and Ralph McQuarrie. He has worked in visual effects, previsualization and animation, but his focus has always been on art direction and conceptual design. His most recent work has been at game development studio Gearbox Software in Dallas, Texas. He is the curator of the Nuthin’ But Mech blog and editor of its companion book series from Design Studio Press. Wood doesn’t favor one particular toolset, so many of his projects jump between analog and digital tools. Here’s a sampling of his work.

TOP LEFT: This was a fun commercial. Christine Schneider at Method asked me to design the Loch Ness monster for this Toyota Tacoma spot. Due to the very short turnaround, I quickly created some sketches showing the monster’s scale relative to the pickup and some of the body shapes that might be seen above water, but in the end the main focus was on its head. If you pause it right as the camera zooms in on the scientists, you can just make out a heavily motion-blurred head that looks similar to my illustration. The spot still makes me laugh.

BOTTOM LEFT: A massive moon in the shape of an “H” (Hyperion Corporation, one of the major weapons manufacturers in the game universe) that orbits the planet’s moon and is ever present in the sky.

TOP: A massive drone that the player does battle with.

RIGHT: VES member Lorin Wood






[ VES NEWS ]

VES Annual Summit, Inaugural Hall of Fame, and Honors
By NAOMI GOLDMAN

VES TO HOST 9TH ANNUAL INDUSTRY SUMMIT

The 9th Annual VES Summit, coming up on October 28 in Los Angeles, is where experts and provocateurs will once again come together to explore the dynamic evolution of visual imagery and the VFX industry landscape. Enlightened storytellers. Industry thought-leaders. VR pioneers. VFX visionaries. Celebrating the Society’s milestone 20th Anniversary, the TED Talks-like forum is centered on “Inspiring Change: Building on 20 Years of VES Innovation.” Thanks to the exemplary leadership of VES Summit Chair Rita Cahill and the Summit Committee, this year’s lineup is not to be missed.

Headlining this year’s Summit are two stellar keynote speakers. VES is pleased to present celebrated director/producer/writer Ava DuVernay, critically acclaimed for films including Selma and 13th and now helming Queen Sugar and the much-anticipated A Wrinkle in Time. And we are honored to host master visual futurist and conceptual artist Syd Mead, recognized for his legendary design aesthetic in films including Star Trek: The Motion Picture, Blade Runner and Tron.

VES will host an impressive lineup of featured speakers offering thought-provoking insights into today’s hottest issues. They include: President of IMAX Home Entertainment Jason Brenek on “Evolution in Entertainment: VR, Cinema and Beyond”; online security expert and CEO of SSP Blue Hemanshu Nigam on “When Hackers



Attack: How Can Hollywood Fight Back?”; Head of Adobe Research Gavin Miller on “Will the Future Look More Like Harry Potter or Star Trek?”; Senior Research Engineer at Autodesk Evan Atherton on “The Age of Imagination”; and Founder/CEO of the Emblematic Group Nonny de la Peña on “Creating for Virtual, Augmented & Mixed Realities.”

Among the most popular elements of the Summit are the roundtable discussions led by industry experts. This year will feature more than a dozen topics, from Digital Humans to Cloud-based Computing to the Future of Virtual Production and many more. Excellent networking, a special recognition program and celebratory cocktails in a beautiful setting will round out this year’s unparalleled event. The 2017 VES Summit takes place on Saturday, October 28, at the Sofitel Hotel Beverly Hills. Limited tickets are still available at https://www.visualeffectssociety.com/news-events/summit.

VES INDUCTS INAUGURAL CLASS INTO NEW HALL OF FAME

In concert with the Society’s milestone 20th Anniversary, the VES Board of Directors created the VES Hall of Fame – a new honor to recognize exemplary individuals who came before us and upon whose shoulders we proudly stand. This distinction is being bestowed upon a select group of professionals and pioneers who have played a significant role in advancing the field of visual effects by

invention, science, contribution or avocation of the art, science, technology and/or communications. The Board poll was intended to yield 20 inductees, but a tie for the final slot brought the total to 21. The roster includes both living legends and those being honored posthumously.

The inaugural class of the VES Hall of Fame consists of: Robert Abel, Ed Catmull, VES, Roger Corman, Linwood Dunn, Peter Ellenshaw, Jim Henson, Ub Iwerks, John Knoll, Grant McCune, Syd Mead, Georges Méliès, Dennis Muren, VES, Willis O’Brien, Carlo Rambaldi, Phil Tippett, VES, Doug Trumbull, VES, Joe Viskocil, Petro Vlahos, Albert Whitlock, Stan Winston and Matthew Yuricich. The Hall of Fame inductees, or family members on their behalf, will be recognized at the upcoming VES Summit during a special evening program and reception.

VES CONVEYS SPECIAL HONORS AT THE VES SUMMIT

The recognition program at the VES Summit also carries on the tradition of conveying honors to distinguished individuals with the Founders Award, Fellows distinction, Honorary Membership and Lifetime Membership designations. This year, the VES Board of Directors is privileged to bestow the Founders Award upon Toni Pace Carstensen for her sustained contributions to the art, science and business of visual effects and meritorious service to the Society.

Toni Pace Carstensen has a passion for new worlds – one that has taken her from the Amazon to Pandora. Her feature work as VFX producer/digital production manager includes Avatar, Minority Report and Fantasia 2000. For theme parks she contributed to Epcot’s Mission: Space and Star Trek Voyager: Borg Encounter 4D. While Executive Producer at View Studio, Carstensen produced TV FX work for The X-Files and The Outer Limits, and then remastered VFX for the original Star Trek series HD release at CBS Digital. Venturing out on her own, Carstensen started

CyberGraphix, one of the first Mac-based motion graphics studios. She is the recipient of a Broadcast Design Gold Award for Francis Ford Coppola’s White Dwarf, a VES Award nomination for Best Supporting Visual FX – Broadcast for Pushing Daisies, and three Emmy Award nominations for her early work as a radio and TV news producer.

In 1997, Carstensen helped create the VES as member 0004 and its first Treasurer. She was a Founding Member of the Executive Committee and served on the Global Board of Directors for many years, as well as being Co-Chair of the Education Committee. She co-edited the first edition of The VES Handbook and, with the Education Committee, generated the idea for the VESAGE book, showcasing the personal artwork of VES members. Always thinking about the future, Carstensen created the Vision Committee in 2011, which she continues to Chair. She is currently the Treasurer on the Los Angeles Section Board of Managers. Carstensen teaches and mentors the next generation of visual effects explorers at the Gnomon School of Visual Effects, while working as a development/production executive for animation studio Neko Productions. She is also a member of the PGA, TEA, Women in Animation, the Motion Picture Editors Guild and the Art Directors Guild.

The Board of Directors is proud to designate Chuck Finance with a Lifetime Membership for meritorious service to the Society and the global industry. Chuck Finance has been a member of the VES for nearly 20 years, and a member of the VES Awards Committee for 14 years. He was Chair of the Committee in

2005 and has been one of its dedicated Co-chairs ever since. With an unwavering commitment to the VES, Finance dedicates thousands of hours each year, in his retirement, to advancing the organization. He has overseen the interaction between the Awards Committee and hundreds of queries about the Awards process, ensuring the integrity, protocols and high standards of the VES Awards.

Finance began his career in film by producing and directing educational and informational films for organizations like Encyclopaedia Britannica Films, the Jet Propulsion Laboratory and the National Science Foundation. He made the leap into visual effects when producer Raffaella De Laurentiis hired him as Visual Effects Coordinator on Dune, one of the most VFX-heavy films up to that time. That was back in the analog, or optical, era, before computers revolutionized the art of visual effects. In the late 1980s he co-founded a full-service visual effects company, Perpetual Motion Pictures, where he was General Manager as well as the company’s Effects Producer. Among the films he worked on were Bill and Ted’s Excellent Adventure, Leviathan, Mom and Dad Save the World, The Firm, Honey, I Shrunk the Kids and many others. Following that he worked for a number of years as a freelance VFX producer and supervisor on films like George of the Jungle and The Arrival and the TV series 24.

The Board of Directors is proud to designate Bob Burns with an Honorary Membership in the VES for his exemplary contributions to the entertainment industry and for furthering the interests and values of visual effects practitioners.

Bob Burns was a VFX practitioner in the 1950s and ’60s, doing miniature work and makeup prosthetics. His famed Halloween shows, staged in Burbank in the ’70s, inspired hundreds of now-current practitioners in the VFX and makeup fields. He was a mentor and friend to many who have gone on to become legends in their own right, including Rick Baker, Dennis Muren, VES, and Robert and Dennis Skotak. More notably, he became a historian and archivist of the effects arena by amassing an extraordinary collection of film and television VFX props, miniatures, paintings and costumes dating from the ’50s to the digital age. Filmmakers have visited his collection and had reunions with items long thought lost, while many of the younger generation have been inspired by seeing (and sometimes handling) original items from such seminal films as The Time Machine, Destination Moon, Forbidden Planet, Alien, Aliens, Terminator 2 and many others. He is recognized by many for his encyclopedic knowledge and preservation of these artifacts, so often thrown away after production, and for inspiring new generations with his wit and ability to bring VFX history alive.

Recipients of the VES Fellows distinction will also be recognized at the Summit. Names of this year’s VES Fellows had not been announced as of the publication date of this issue.

LEFT to RIGHT: Ava DuVernay, Syd Mead (Photo courtesy of Jenny Risher), Toni Pace Carstensen, Jason Brenek, Hemanshu Nigam (Photo courtesy of Bobby Quillard)





[ FINAL FRAME ]

A Trip to the Moon

Photo courtesy of Technicolor

Many consider French director Georges Méliès’s 1902 short Le Voyage dans la Lune (A Trip to the Moon) to be the first film with visual and special effects. The milestone science fiction extravaganza was an international sensation when it was first released, astonishing the moviegoers of its day. Thanks to a historic restoration, the film can now be seen in color. Méliès pioneered the special effects techniques of his era, including stop-motion animation; double, triple and quadruple exposures; cross-dissolves; and jump cuts. Scores of his films overflowed with stunts, tricks, dancing girls and elaborate sets.

