3D World Pro (Sampler)


CONCEPT GAME CHARACTERS Design and sculpt a hero model for video games

TEXTURE WITH SUBSTANCE Pro training with Kojima

SCULPT IN ZBRUSH Insomniac Games’ modelling secrets

INSPIRING CG ARTISTS Productions’ artists

FREE! ANIMATION SOFTWARE VIDEO TUTORIALS SCENE SETUP FILES ZBRUSH MODELS 3DS MAX SCRIPTS

3dworld.creativebloq.com December 2015 #201

MASTER PROFESSIONAL GAME ART TECHNIQUES • 343 Industries & Axis: The art of Halo 5 • Pro advice to get your indie game made • Model, texture and animate for games

FREE SOFTWARE! iClone 5 plus video training worth £120!

FREE! GAME-READY MODEL


DEVELOP Nuke techniques

NUKE TECHNIQUES

The perfect patch

In the fourth part of our Nuke tips and tricks series, Josh Parks talks through painting and prepping out an object in the plate

AUTHOR PROFILE Josh Parks Josh is a compositor at MPC as well as a part-time lecturer at the University of Hertfordshire. www.joshparks.co.uk

Removing an object from a plate is an egoless job: your work as an artist is only complete when the audience doesn’t realise it’s there. Despite this, I have always found achieving it incredibly satisfying. The workflow for creating the perfect patch is something that I really wanted to know when I was at university. Despite extensive searching, I was unable to find a tutorial which demonstrated how it was done to the industry standard.

Core skills

It wasn’t until I was in the industry that I realised this is because those who have this knowledge don’t necessarily have the time to document it. Alternatively, they think it’s general knowledge as they’ve been doing it like that for years.

In this tutorial I will give you the details behind an industry-standard patch and why to do it this way. Creating a patch like this is important for a couple of reasons. Firstly, you’ll never touch the original plate, which is crucial for keeping your client happy. Secondly, you don’t end up with double grain. Double grain can happen when you have created a patch, added grain that matches your original backplate over it, then premultiplied with a blurred alpha, causing you to bring through some of the original backplate’s grain as well as the grain of your patch. It’s this attention to detail that will get you the job at a big VFX house. This is a core skill of a good compositor and one that you will rely on a lot in your career. So it’s worth spending some time going over the following tutorial a couple of times, as well as working through some practice shots, to ensure that you understand each step that is being carried out and why it’s being done this way.

A foot in the door

Generally, in order to become a compositor you will start out in the roto prep department, so grasping this could allow you to get your foot on the ladder. In this tutorial I’m going to go over how you can create the perfect patch. For all the assets you need, go to creativebloq.com/vault/3dw201

The dark paint marks are removed from the left of the image, as highlighted by the blue squares




STEP-BY-STEP


PROCESS: PAINT AND PREP AN OBJECT Josh explains how to create an industry standard patch


ONE THE DENOISE PROCESS


Play with the settings, checking all colour channels to ensure no details are lost in the denoise process. Whack the intensity of the grain up, get the size right, then match the intensity. The Denoise node is expensive to use, so it’s worth precomping so we don’t have to compute it while painting: write out the denoise output and read it back in. Once you’re happy with the detail, you’re done with the Denoise node.
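If you like to build this precomp from the Script Editor, the Python sketch below shows the idea. It assumes an already-tuned Denoise node named Denoise1 and uses a made-up precomp path; both are assumptions to adapt to your own script.

import nuke

# The Denoise node you have already dialled in (name is an assumption)
denoise = nuke.toNode('Denoise1')

first = int(nuke.root()['first_frame'].value())
last = int(nuke.root()['last_frame'].value())

# Write the denoised plate out once, so the heavy Denoise node
# doesn't have to be recomputed every time you paint
write = nuke.nodes.Write(inputs=[denoise], name='Write_denoise_precomp')
write['file'].setValue('precomp/denoised_plate.%04d.exr')  # hypothetical path
write['file_type'].setValue('exr')
nuke.execute(write, first, last)

# Read the cached result back in and work on this from now on
read = nuke.nodes.Read(name='Read_denoise_precomp')
read['file'].setValue('precomp/denoised_plate.%04d.exr')
read['first'].setValue(first)
read['last'].setValue(last)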

TWO FRAME HOLD AND PAINT YOUR FRAME

As a general rule, pick the sharpest frame when deciding which one to frame hold. However, this can change if your patch is going to be getting bigger, as you’ll then want to take this into consideration too. Next, do your paintwork on this held frame using a RotoPaint node. You’ll generally be using the Clone tool, hence why it’s useful to pick the sharpest frame to do your paint work on.
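As a script sketch, the held-frame setup looks something like this; the node names carry on from the snippet above, and frame 1042 is simply a stand-in for whichever frame you judge to be the sharpest.

import nuke

src = nuke.toNode('Read_denoise_precomp')  # precomped, denoised plate

# Hold the sharpest frame of the plate...
hold = nuke.nodes.FrameHold(inputs=[src], name='FrameHold_patch')
hold['first_frame'].setValue(1042)  # example frame number

# ...and do your clone/paint strokes on top of it
paint = nuke.nodes.RotoPaint(inputs=[hold], name='RotoPaint_patch')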

THREE CREATE ROTO AND PREMULT NODES


As we’ll be generating an alpha with a hard edge later, we can blur our roto shape as much as we see fit. When creating the roto shape, avoid a generic shape the eye can easily recognise; you want more of a cartoon-explosion look. This can change, as sometimes it’s better to roto to the shape of the thing you’re patching. Merge your patch onto your background.
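Below is a rough script equivalent of this step. The roto shape itself is still drawn by hand in the viewer, the blur size is an arbitrary example, and the node names are assumptions carried over from the earlier snippets.

import nuke

paint = nuke.toNode('RotoPaint_patch')        # held, painted frame
plate = nuke.toNode('Read_denoise_precomp')   # moving, denoised backplate

# Draw a loose, non-obvious shape around the paintwork in this Roto node
roto = nuke.nodes.Roto(inputs=[paint], name='Roto_patch_shape')

# Blur the roto alpha as much as you like - the hard edge comes later
# from the Difference node
blur = nuke.nodes.Blur(inputs=[roto], name='Blur_roto_edge')
blur['channels'].setValue('alpha')
blur['size'].setValue(30)  # example value

# Premult by the soft alpha and merge the patch over the moving plate
premult = nuke.nodes.Premult(inputs=[blur], name='Premult_roto')
merge = nuke.nodes.Merge2(inputs=[plate, premult], name='Merge_patch')  # B = plate, A = patch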

FOUR COVER YOUR PAINTED AREA

Generate an alpha to cover only the area painted on. This can be done using a Difference node connected to our paint work on our backplate and the original backplate. Connect the B pipe to your Merge node and the A pipe to your original, with-grain backplate. Set the gain to 999999999 – it‘ll give a solid alpha on all the areas that have been changed/are different to the original plate.
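In script form the Difference setup might look like the sketch below. Node names are carried over from the earlier snippets (Read_original standing in for your untouched, grainy plate), and it’s worth double-checking which input is A and which is B in your version of Nuke.

import nuke

merged = nuke.toNode('Merge_patch')      # painted patch merged over the backplate
original = nuke.toNode('Read_original')  # untouched plate, grain intact (assumed name)

diff = nuke.nodes.Difference(name='Difference_patch_alpha')
diff.setInput(0, merged)    # B pipe (confirm labels in the node graph)
diff.setInput(1, original)  # A pipe
diff['gain'].setValue(999999999)  # any changed pixel becomes a solid alpha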


FOLLOW THE VIDEO www.creativebloq.com/vault/3dw201

FIVE MATCH THE GRAIN


We have a perfect patch painted on our denoised backplate and an alpha of the area we’ve changed. Now match the grain of the original plate so our patch sits. Bring in a Nuke Noise node and look at each colour channel of the original plate. It’s easier to whack the intensity up ridiculously high and get the size right first, then adjust to match the intensity.

SIX PREMULT AND MERGE YOUR PATCH

Now that we’ve fixed and matched the grain of our patch to our backplate, we need to premult our perfectly grained patch by our alpha – created from our Difference node – and then merge it over the original backplate. This technique ensures that our original plate avoids the subtle filtering that every node in the patch branch would otherwise apply to the image.
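Pulling the last two steps together, a final-assembly sketch might read as follows. Again the node names are assumptions: Noise_grain_match stands in for the output of your grain-matching setup from step five, and the Copy node’s input order is worth confirming in your own script.

import nuke

grained = nuke.toNode('Noise_grain_match')     # patch with matched grain (assumed name)
alpha = nuke.toNode('Difference_patch_alpha')  # hard alpha from step four
original = nuke.toNode('Read_original')        # untouched, grainy plate

# Copy the hard alpha onto the grained patch...
copy = nuke.nodes.Copy(name='Copy_patch_alpha')
copy.setInput(0, grained)  # stream receiving the alpha (confirm labels)
copy.setInput(1, alpha)    # stream the alpha is copied from
copy['from0'].setValue('rgba.alpha')
copy['to0'].setValue('rgba.alpha')

# ...premultiply it...
premult = nuke.nodes.Premult(inputs=[copy], name='Premult_patch')

# ...and merge the patch over the original plate, which never passes
# through the denoise, paint or grain nodes itself
out = nuke.nodes.Merge2(inputs=[original, premult], name='Merge_patch_over_plate')
out['operation'].setValue('over')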

CHECK YOUR PATCH

How to rigorously test that you have created the perfect patch

To ensure that our patch sits perfectly in the plate we need to check it thoroughly. There are a few ways to do this and artists generally have a favourite, but here are some I’ve picked up. Flop the image to refresh your eye, adjust your gamma both up and down, and convert your image to log using a Log2Lin node. Check each channel all the way through, play the shot backwards and get someone else to have a look – and don’t tell them what you have done. If nothing comes up after these rigorous tests, then congratulations, you’ve created a perfect patch!



200TH CELEBRATION ISSUE! 132 PAGES OF TUTORIALS, ADVICE AND INSPIRATIONAL ART!

DISCOVER THE GREATEST VFX FILMS OF ALL TIME!

The films that changed the industry, as voted for by you!

3dworld.creativebloq.com November 2015 #200

FREE GIFTS!

15GB OF RESOURCES • 3 Digital-Tutors video courses • 132 page Maya tutorial ebook • Models, meshes and more

TUTORIALS!

ZBRUSH, 3DS MAX, MAYA

Master hard surface modelling, fire sims, video game vehicles and more!

WIN A WORKSTATION! WORTH £800

Fujitsu’s new Celsius workstation could be yours!

LIMITED EDITION COVER

TERMINATOR GENISYS

MAKE THIS ARNIE T-800 MODEL


FEATURE The 200 greatest VFX films

Discover the VFX films that changed the industry, as voted for by you!

What’s the greatest VFX film of all time? Which films changed the course of your life or impressed, defined or changed the industry? These were the questions we posed the readers of 3D World and our website community at Creative Bloq. The results can be found on these pages. The criteria to be included in the top 200 were simple: the movie must have a running time of 60 minutes or longer; have received a full cinematic release; and have a significant or innovative element of CG VFX work. We’ve intentionally omitted CG animations and only included films that blend CG and live action. The final list is eye-opening and nostalgic; old classics rub shoulders with new releases… Has your favourite VFX film made the final cut?



200 INDIANA JONES AND THE KINGDOM OF THE CRYSTAL SKULL ILM, Rodeo FX (2008) A marmite movie on so many levels, hence why it’s found itself at 200. Despite a promise by Lucas and Spielberg to go back to physical and in-camera effects for the fourth Indy movie, it soon became apparent that just the opposite was the case. It’s riddled with some 560 (often surprisingly poor) VFX shots – including some dreadful CG monkeys in that cheesy jungle sequence – but makes our list because you voted for it, and it has some saving graces: the epic saucer take-off at the end and dramatic nuke explosion that introduces us to an Atomic Age Indy.




199 ROAD TO PERDITION Cinesite (2002) Great CG is about selling a scene or a moment, and few have done it as well as Cinesite’s introduction to 1930s Chicago in a 40-second sequence that segues from countryside to an accurately created city scene. As the protagonists drive along, the camera pans across the windows, glimpses of period cars and reflections of skyscrapers offering a tease of what’s to come. The camera then sweeps behind the car to reveal an utterly convincing recreation of period Chicago.

198 KRRISH 3

Pixion, Red Chillies VFX, various (2013) Third in the Bollywood series, Krrish tells the tale of reluctant superhero, Krishnan. Red Chillies and Pixion delivered more than 50 VFX sequences, with an array of set extensions, digital doubles, CG creatures and character transformations. One shot in which the hero dives from a high bridge took a team of 35 four months to complete.

195 LOOPER

Hydraulx, Atomic Fiction, Scanline VFX (2012) This clever time-travel sci-fi thriller mixes sweeping cityscapes with subtle CG enhancements to great effect. The final sequence, created by Scanline using 3ds Max, features a scene in which the protagonists face off in a corn field. It employed practical and CG particle simulations to create the effect of a shockwave that rips the corn field apart, lifting characters into the air and flattening the vegetation.

197 BLADE II

Tippett Studio, Framestore CFC, C.O.R.E. Digital Pictures, R!ot Pictures (2002) For the return of the Daywalker, Tippett Studio provided digital doubles for the more athletic acrobatics, plus digital prosthetics for the Reapers – mutated vampires with extendible jaws. Framestore handled 180 shots, including the vampire deaths in which the creatures combust and disintegrate.

194 PRINCE OF PERSIA THE SANDS OF TIME Framestore, MPC, Cinesite, Double Negative (2010) Films based on video games are often cursed but Prince of Persia at least had a number of CG scenes worth the entrance fee – the pick of which is Framestore’s dramatic Sand Room in which an underground chamber turns to sand and replicates the swirling fall of grains in an hour glass.

VFX PIPELINE ON A SHOESTRING

Blender (free) Modelling, sculpting, layout and lighting
Fusion 7 (free) Node-based GPU-accelerated compositing
DaVinci Resolve 12 (free) Pro-level non-linear video editing and colour grading
Voodoo (free) Camera-matching software
Krita (free) Fully-featured bitmap painting program
123D Catch (free) Smartphone-based 3D scanning

196 THE LOVELY BONES

Weta Digital (2009) Peter Jackson’s adaptation of the Alice Sebold novel featured visions of the lead character’s personal heaven. With more than 660 VFX shots, Weta Digital had its work cut out in producing a range of ethereal imagery – from sailing ships in giant bottles to a scene in which it’s both night and day at the same time.

193 ALIEN RESURRECTION

Blue Sky (1997) This entry in the series embraced CG to bring the creatures to life – with mixed results. While seeing too much of the Aliens can always lessen their fear factor, an underwater sequence modelled on nature footage of crocodiles reminded us all what expert killers Giger’s creations are.


192 SAVING PRIVATE RYAN

ILM (1998) The bulk of Steven Spielberg’s WWII epic owes a debt to good old special effects, with explosions, prosthetics, fire and more than 17,000 squibs, so ILM’s task was to take the principal photography and enhance it. Of the 40 such shots, most notable are the sweeping vistas of Omaha beach with a vast fleet of ships, landing craft and blimps. Other, more subtle VFX include bullet hits and augmented blood splats.


191 POSEIDON

ILM, MPC, Giant Killer Robots, CIS Hollywood, various (2006) The movie marked a breakthrough in the use of off-the-shelf software in Hollywood fluid simulation. The software in question, RealFlow, was first released back in 1998. Initially a pure particle-based system, it had begun to make inroads into commercials work with the release of version 3.0 in 2004. However, there is a world of difference between commercials and movie effects – as Mark Stasiuk, who worked on Poseidon’s ‘Broadway’ shot with CIS Hollywood, explains: “Instead of asking RealFlow to do 100,000 particles, we were now asking it to do five million. That jump pushed us out of 32-bit space – and [version 3] wasn’t 64-bit.” During production, Stasiuk worked with RealFlow’s developers at Next Limit to implement 64-bit support, multi-threading and Python scripting: all features that were to find their way into version 4.0.

190 SPLICE

Mac Guff, BUF, C.O.R.E. Digital Pictures (2009) Vincenzo Natali’s body-horror impressed. Paris-based Mac Guff was brought on to redo a lot of shots that were originally achieved using animatronics; the organic ‘slugs’ Fred and Ginger were sculpted in ZBrush and animated in Maya as well as proprietary software Symbor. BUF crafted the bird-like infant creature, while C.O.R.E. handled the fully-grown Dren, featuring wings, triple-jointed legs and a vicious barbed tail.

186 REIGN OF FIRE

The Secret Lab (2002) Dragons come alive and destroy London, via the use of CG environments and characters. Co-VFX supervisor Dan DeLeeuw says that the experience of working on Armageddon and Mighty Joe helped. “The dragon is part snake, leopard, lion, eagle and hawk, vulture and crocodile.” With such a beast, care needed to be taken to ensure it looked credible, particularly in flight. The end result was a CG dragon the size of a 747.

188 BACKDRAFT ILM (1991) The tale of two Chicago firefighter brothers was the first film to use photorealistic computergenerated fire, which earned ILM another Oscar nomination for Visual Effects.

“They held a screening with people dressed as characters from the movie; they’d put more effort into it than we did for the actor’s costumes! It’s like firing something into space: not enough velocity and you crash, but too much and it goes on forever” Gareth Edwards, Director, Monsters

185 ANONYMOUS

Uncharted Territory (2011) Anonymous’ backers gave the filmmakers strict criteria: it had to be shot in Germany, the post work had to be done in Germany, yet it had to be historically accurate in its depiction of Elizabethan London. And so Uncharted Territory embarked on more than 300 VFX shots, recreating the array of CG cityscapes that adorn the movie. To do so, the team developed OGEL (LEGO backwards), a building generator which produced a variety of period houses, and populated scenes using Massive and a bespoke crowd system.

189 MONSTERS

Gareth Edwards (2011) Made on a reported £14,000, Gareth Edwards’ Monsters is the Holy Grail of frustrated filmmakers. The titular creatures, who are used very sparingly in the film, were initially modelled in ZBrush and worked up in 3ds Max. Edwards says the design required a good amount of back and forth. “I’d model and rig it, position it in a shot and then realise that what looked cool as a sketch didn’t come through, so I’d go back and remodel. That happened five or six times.”

184 COWBOYS & ALIENS

ILM, Fuel VFX, The Embassy, Ghost VFX, Shade VFX (2011) Wild west and sci-fi has never been a great mix but ILM gave it a good shot. Mixing practical and CG effects, the aliens are the true stars of the movie – Legacy Effects created a nine-foot replica to use on set and ILM handled animation and modelling. Texture was key, with soft-shell crabs offering inspiration to the artists. To make things more challenging the movie was shot on anamorphic format, on film, to deliver the look of classic westerns. This left little room to re-frame shots to include aliens and humans, meaning ILM had to tile each scene.

187 X-MEN Digital Domain, Kleiser-Walczak, Hammerhead Productions, various (2000) The work of Kleiser-Walczak’s digital effects artists on Mystique’s morphing scenes still impress. The procedural transformations from a live performer to a photorealistic, fully CG character took about five months of work.


183 300

Scanline VFX, Animal Logic, Hybride, various (2006) This retelling of Frank Miller’s graphic version of the Battle of Thermopylae isn’t a groundbreaking film technologically, but deserves a mention because of the artistry of its visuals. Taking its cue from Sin City (2005) and Sky Captain (2004), its actors were filmed against blue screen on sparse sets. The effects team then used CG to create highly stylised, deeply atmospheric backdrops. But whereas Sin City was clearly aiming to be a comic strip brought to life, 300 has a unique and haunting visual quality.

182 GREEN LANTERN


Sony Pictures Imageworks (2011) While Green Lantern failed to light up the box office it did at least have ambition. Bringing to life DC Comic’s galactic superhero cast proved a tall order, with Sony Pictures Imageworks delivering over 1,000 shots for the film. The standout CG was Green Lantern’s costume, created in CG: “We see the muscle movement and the outer layer sliding against this inner layer. It gave us this ability to have this energy that not only crawled on the outside of the surface but also coming from within,” said Imageworks visual effects supervisor Jim Berney.

181 FLIGHT OF THE NAVIGATOR Omnibus Computer Graphics, various (1986) A teenage boy becomes the unwitting co-pilot of a sleek, silver 3D morphing spaceship in this Disney favourite. The movie features the first use of reflection mapping, used to tie the alien spaceship into its environment. Graphics researcher C. Robert Hoffman’s techniques are still used to good effect today.



180 THE AVIATOR Sony Pictures Imageworks, CafeFX, New Deal Studios, various (2004) This biopic of aviator and aerospace tycoon Howard Hughes follows his life as a filmmaker and flying enthusiast. A lot of the VFX entail the seamless integration of model planes with digital landscapes and clouds, as well as full-CG scenes. The climactic launch of the H-4 Hercules – or ‘Spruce Goose’ – is a detailed model shot on dry land and comped with CG water and spray against a period backdrop.

177 TROLL HUNTER Storm Studios, Gimpville (2010) Two of Norway’s most prominent VFX studios, Storm Studios and Gimpville, collaborated on Troll Hunter to create Norway’s first ever CG creature film. Houdini was used by both teams to create the effect of the trolls turning to stone as well as integrating the creatures into the environment; Gimpville used Houdini to create the lead troll’s fur, which is covered in snow and vegetation – there are 40,000 dynamic instanced objects on the creature.

179 SUNSHINE MPC (2007) With some of the most sophisticated effects created for a sci-fi movie, MPC’s solar outing truly shines with around 750 VFX shots. Maya fluids were used to recreate the broiling surface of the sun. With 130,000 individual gold-plated tiles, 432 lights and weighing in at 1.92GB of geometry, plus 200GB of textures, the giant spaceship proved too large to load into Maya. MPC’s scripting language, Giggle, was used to manipulate the data.

178 X2: X-MEN UNITED Cinesite, Kleiser-Walczak, Rhythm & Hues, various (2003) Adding new digital mutant heroes and villains to X-Men’s roster, this superior superpowered sequel continues to wow audiences. Cinesite made Nightcrawler teleport and dematerialise in stages using Maya and Houdini, and used a proprietary tool called PartMan to render the tens of millions of smoke particles he left behind.

176 ARMAGEDDON

Dream Quest Images, Digital Domain, various (1998) Most of Hollywood worked on Michael Bay’s blatant blockbuster, but Armageddon’s $140 million budget was to prove well spent. Dream Quest Images got an Oscar nomination for its effects, including the villain of the piece: the asteroid, which was created using a model covered in particles representing its emitted gases.

173 TWISTER

ILM (1996)

Audiences watched in awe as CG cattle and tankers were spat from vicious cyclones in Jan de Bont’s tornado chaser movie. The real star is Mother Nature, as realised by ILM’s proprietary particle software. It lost to Independence Day at the Oscars, but ILM’s hyper-realistic storm simulations still manage to thrill viewers today.

175 RIDDICK

Mokko Studio, Method Studios, Entity FX, Modus FX (2013) Even though the third installment in the chronicles of Riddick was a more low-key affair than its predecessor, it’s still jam-packed with CG: spaceships, landscapes, future weaponry and a horde of alien predators (not least an evil, two-legged ‘mud demon’). Riddick may only have had a $38 million budget, but it’s all up there on screen.

172 KON-TIKI Important Looking Pirates (2011) Swedish outfit Important Looking Pirates has made a name for itself creating realistically moving water for film and TV, and it all started with Kon-Tiki. The standout scene is the shark attack, in which the film’s protagonists face off against a pack of Great Whites in the middle of the ocean. ILP created 58 shots in total, blending CG water and animals with live footage to a tight budget.

174 CONTACT

Sony Pictures Imageworks, ILM, various (1997) At the time, the opening sequence of Contact was the longest ever digital effects shot in a movie, and the powers-of-ten sequence still holds its own today. Sequence supervisor and amateur astronomer Jay Redd handled the 199-second, 4,710-frame shot, taking viewers through the known universe into the eye of a small girl.



171 47 RONIN

MPC, Framestore, Digital Domain, Mill Film, various (2013) Amidst the epic sweeps across grand fantasy landscapes, MPC’s cherished achievement in samurai epic 47 Ronin was the twitching muscle animation of its Oni, a demonic character of gigantic proportions. The initial concept had been modelled but the MPC team felt the Oni needed a new direction. The animators wanted to make the skin and muscle movement as accurate as possible, which meant adapting existing tools, and creating new workflows to ensure the muscle had weight when it moved and that the pectorals were affected by gravity. As the demon took shape, the team at MPC wanted to recreate the effects of moving tendons across the muscle to create a more believable simulation. After hiring a model to act out the role, the team would add more detail on top of the animation as they spotted quirks in the performer’s muscles. The end result was a combination of automatic rig-based simulation with hands-on modelling, stop-motion style, to get as much detail as they possibly could.

164 WHITE HOUSE DOWN

Prime Focus, Hybride, Scanline VFX, Method Studios, various (2013) Slow-motion car crashes, downed helicopters, smoke, fire, digital doubles, an exploding White House… just another day on your typical Roland Emmerich movie. This popcorn actioner is elevated by some stellar VFX work, not least the thrilling destruction of Airforce One.


167 TED

Iloura, Tippett Studio (2012) While a lead CG character wasn’t new in 2012, creating a small, furry creature that interacts on a physical level in every scene made Ted an impressive achievement. The innovative use of mocap helped sell the realism of the play between actors and Ted. In one key scene of John (Mark Wahlberg) and his girlfriend Lori having dinner with Ted and his partner, the actors performed with an empty seat for Ted, while Seth MacFarlane (playing Ted) sat nearby on an oversized table with oversized cutlery and crockery, wearing an MVN motion tracking suit delivering the lines. A fight sequence between Ted and Wahlberg only served to enhance the MVN suit’s reputation.

170 SKY CAPTAIN AND THE WORLD OF TOMORROW ILM, Hybride, The Orphanage, Gray Matter FX, CafeFX, R!ot, various (2004) French sci-fi fantasy Immortel (Ad Vitam) was released a few months earlier, but writer-director Kerry Conran’s retro-futuristic fantasy was the first movie shot entirely on bluescreen that most US audiences saw. The stylised, vintage-looking CG backdrops, ranging from Manhattan to Shangri-La, were developed by dozens of VFX facilities, including ILM, The Orphanage and CafeFX. To bring his personal vision to the big screen, Conran decided to make the film independently with his own money and funding from Aurelio De Laurentiis. To keep the budget under control it was largely shot on bluescreen stages with the sets, backdrops and action sequences being added digitally. The film features some 2,000 VFX shots which were completed in just one year – including a posthumous performance by Laurence Olivier, created digitally using BBC archival footage. Live action was shot in just 26 days using a Sony HD video camera, and then composited/edited on standard Apple Macs using Adobe After Effects and Final Cut Pro.

169 SUPER 8

ILM, Scanline (2011) J. J. Abrams’ sci-fi homage to Spielberg movies and home video offered some pressing problems for ILM, not least as the movie was shot using anamorphic lenses for a vintage feel. It’s the train crash sequence that stands out the most, created using rigid dynamics simulations for the carriages, plus additional hand-animated elements in Maya. The impact of the sequence lingers on screen for dramatic effect.

166 CHILDREN OF MEN

Framestore CFC, Double Negative, MPC (2006) Children of Men’s virtual childbirth scene packed a visceral punch back in 2006. Exquisite and palpable touches, such as the steam rising off the newborn CG baby’s body as it emerges from the womb, made the scene both hyper-real and emotionally compelling.

165 DEATH BECOMES HER

ILM (1992) ILM bagged another Best Visual Effects Oscar for its work on Robert Zemeckis’ black comedy. The film featured the first photorealistic CG human skin, which was used to link Meryl Streep’s body and head (facing the wrong way) together with a stretched, digital neck. More CG-trickery involved a huge shotgun wound in Goldie Hawn’s midriff through which we can see the set behind her. Although only scripted for 40 VFX shots, the film ended up with 170 – 80 of which were accomplished using ILM’s nascent CG technology for a variety of digital retouching and shots of a stormy 1978 New York.

168 TOTAL RECALL MetroLight Studios, various (1990) Paul Verhoeven’s sci-fi thriller enlisted the help of MetroLight Studios for a handful of digital effects. Tim McGovern’s Oscar-winning X-ray scanner sequence, in which Arnold Schwarzenegger’s character is seen in CG skeleton form, was also one of the earliest uses of motion capture in the cinema.


163 HULK

ILM (2003) While we’d take the Hulk from Avengers Assemble over Ang Lee’s version, there’s still a lot to admire in ILM’s ambition to bring the green giant to the big screen for the first time, not least the number of daylight shots. Where many hide their CG characters in gloom, ILM put its Hulk in the bright, sun-lit desert. It’s almost enough to make us forget the unfathomable ending…

162 WHAT DREAMS MAY COME Mass.Illusions, Digital Domain, various (1998) Lush and vibrant in appearance, Vincent Ward’s Oscar-winning vision of heaven looks unlike almost any other modern 3D-enhanced movie. Scenes of a shifting, painterly afterlife were the result of a collaboration between a number of VFX houses, including Mass.Illusions, Digital Domain, Pacific Ocean Post, Giant Killer Robots, Mobility and Shadowcaster. The beautifully colourful backdrops, which resemble moving impressionist paintings, were created by using LIDAR (Light Detection and Ranging) to scan an entire location and recreate it as a point cloud in the computer. A process called optical flow was then used to generate the moving scenery. This groundbreaking work earned the movie an Oscar for best visual effects.


154 DEEP IMPACT


ILM, Pacific Title, CIS Hollywood (1998) For Mimi Leder’s slushy extinction-level event flick, ILM was drafted in to visualise a giant meteor wreaking devastation on the US eastern seaboard. When a mission to destroy the meteor fails, we’re treated to the impact itself, which results in a kilometre-high wall of CG water that destroys New York. Although these early water sims are somewhat ‘fluffy’, the sequence still stands up well. Shots of the tsunami moving inland required careful matchmoving of live action crowds.

161 WAR HORSE

Framestore (2012) A great example of CG supporting the story, War Horse features 250 shots that mostly enhance the scale of war scenes and period detail. However, the one CG horse shot, in which the horse and rider defeat a tank, is a triumph. Framestore modelled the horse entirely from photogrammetry reference and details such as mud were painstakingly added by hand.

160 R.I.P.D.

Rhythm & Hues, Image Engine (2013) This comic-book movie featured 500 CG shots, of which 300 were creatures. Image Engine created ‘deado’ CG characters for the film, notably Pulaski and Nawicki – Pulaski is a giant and Nawicki’s anatomical deformations include growing several feet, detaching a limb and suffering a disintegrated jaw.

153 MIGHTY JOE YOUNG

ILM, Dream Quest, Matte World Digital (1998) A remake of the 1949 movie of the same name featuring a giant ape at large crushing cars and scaling buildings may not have been the most original movie but it ushered in a new era of creature work. At the time ILM and Dream Quest’s work on the CG Joe – the giant ape – was some of the most advanced hair simulation put to screen. Coupled with traditional robotics and forced perspective shots, the film sold the idea of an out-of-control ape to a whole new audience.

157 THE THING

Image Engine (2011) Keeping the mantra ‘horrifically beautiful’ in their minds the VFX team, led by Jesper Kjolsrud, set about creating some classic scares for this remake of the John Carpenter classic. One scene in which the character of Juliette ‘turns’ was devised to resemble “a ball of snakes trying to push through skin,” says Jesper.

156 THE SPIDERWICK CHRONICLES

Tippett Studio, ILM (2008) This fantasy adventure required such a large cast of CG creatures that work had to be split between Tippett and ILM. Tippett’s new lighting pipeline enabled shots to be lit using global illumination, and other innovations included Riot Control, a bespoke system for handling the dozens of goblins attacking the Spiderwick Estate.

152 THE SECRET LIFE OF WALTER MITTY

MPC (2014) Walter sees the world through the lens of classic Time magazine photography, which meant bending reality to create a subtle world view that’s not quite real, but not too fantastical. Packed with neat VFX and practical effects, Walter Mitty surprises at every turn and features a creative use of most tricks in the VFX artist’s arsenal, from swimming with – and punching – sharks to super-heroics in a burning building.

155 I, ROBOT

Digital Domain, Weta Digital, various (2004) I, Robot featured superb effects work from a clutch of major shops. One scene called for Will Smith to walk among 1,600 NS-5 robots. Parallax issues made it impossible to accomplish by texture projection, so full 3D models were used – even the hangar itself is CG. Digital Domain’s workload was so intense that Weta Digital was brought on board to help out, delivering almost 300 shots in just 10 weeks. Its biggest scene is the tunnel car chase sequence in which Smith’s Audi is pounced on by 160 rampaging robots.

159 THE MASK

ILM (1994) When bank clerk Stanley Ipkiss dons the mask of Loki, god of mischief, he’s transformed into his wild-man alter-ego. Inspired by Tex Avery cartoons of the 1940s, ILM had to painstakingly blend Jim Carrey’s live-action antics with CG caricatures, with some shots taking as long as four weeks to animate by hand.

158 TROY

MPC, Framestore (2004) MPC delivered 425 VFX shots, including digital environments and weapon replacements. To create the massive battle scenes, elements were filmed in Mexico with hundreds of costumed extras, then 150,000 digital soldiers were added using proprietary crowd simulation software called ALICE and the then largest-ever capture session.

TOP 10 DEFUNCT VFX HOUSES

Boss Film Studios (1983-1997) Started by Richard Edlund after his stint at ILM, closed due to competition
The Orphanage (1999-2009) Co-founded by ex-ILM staffers Stu Maschwitz, Jonathan Rothbart and Scott Stewart
Rhythm & Hues (1987-2013) Famously went bankrupt after receiving the Oscar for Life of Pi
Illusion Arts (1985-2009) Owned by Bill Taylor and Syd Dutton, it specialised in digital environments and matte painting
Pacific Title (1919-2009) Finally shut up shop after 90 years in Hollywood
Image Movers Digital (2007-2011) Robert Zemeckis’ animation studio, killed by the gigantic flop, Mars Needs Moms
C.O.R.E. Digital Pictures (1994-2010) Founded by William Shatner, buried by animated movie The Wild
Apogee, Inc. (1977-1992) Started by John ‘Star Wars’ Dykstra, who still works in VFX
CafeFX (1993-2010) Founded by Jeff Barnes and David Ebner. Expanded into new offices in 2006, closed four years later
Dream Quest Images (1980-2001) Founded by Hoyt Yeatman and Abdi Sami, absorbed into Disney as The Secret Lab then disbanded
Pacific Data Images (1980-2015) Early pioneer of CG, sold to DreamWorks in 2000, shut down as part of a company-wide restructure

151 A.I. ARTIFICIAL INTELLIGENCE

ILM, Pacific Data Images (2001) A tour de force of CG and digital VFX, Spielberg’s movie is crammed with beautiful effects, such as the skeletal mecha with realistic facial facades, a neon-lit Rouge City and scenes of a flooded New York. It also features an unexpected coda when we’re swept through a future ice-bound Manhattan, which is being excavated from a vast glacier. The opening 60 seconds of the flight across the ice field is footage from the movie Firefox, matchmoved by hand with simple CG buildings, but the rest of the trench flight is fully CG.




149 INTO THE STORM Method Studios, Digital Domain, MPC, Cinesite, various (2014) This spiritual successor to Twister features a mile-wide tornado wreaking havoc on a US town. There’s little here that isn’t CG, with the need to add stormy skies to footage shot in broad daylight, swirling debris, twisters and fire. Also, Method Studios had to deliver in just four months after the collapse of Rhythm & Hues.


150 TOTAL RECALL Double Negative, MPC, The Senate VFX, BUF, various (2012) Total Recall is full of wonderful CG scenery and smart set-pieces, but the car chase sequence is one of the most visually and technically impressive. While previz provided a solid starting point, the sequence evolved and grew once shooting began. “Even though all of the backgrounds and some of the cars were to be replaced, shooting something real was an immense help,” says Double Negative visual effects supervisor Adrian de Wet. “Firstly, it meant that we didn’t have to come up with all the camera moves, aside from the full CG shots. Also, there were a whole host of subtle nuances that you got with a real camera and from the way the hover cars moved with real actors inside; things you’re not really aware of until you have the shot. The idea was to treat it as if it were a fully live-action car chase, and not fall into the trap of thinking ‘we’ll do that in post’, even though the post task was immense.” “We worked predominantly in 3D for the geography, as it enabled quick layout changes,” explains CG supervisor Vanessa Boyce. “As soon as we had 3D layouts that worked, we split each shot up into layers. With the far-distance city, we matte painted clusters of buildings on cards (derived from simple geometry), which we were able to reposition in Nuke. We then looked at every shot and, if there was no noticeable parallax shift in the mid-ground, we just rendered one high-quality overscan frame and projected in Nuke. This enabled us to reduce rendering data and gave the matte painters a frame on which they could paint in extra detail.”


148 CASPER

ILM (1995) Armed only with mid-90s processing power, Casper and his entourage of spooks took a team of 150 people at ILM 15 months to produce, averaging just seven shots a week. Billed as the ‘first digital performer’ Casper takes the lead in a movie that boasts 40 minutes of CG over more than 350 shots – six times the amount featured in Jurassic Park just two years earlier.

147 THE DAY THE EARTH STOOD STILL Weta Digital, various (2008) This dreary remake of the Robert Wise original is much improved by the work of Weta Digital in visualising the alien technology; of the film’s 507 shots, Weta did 227 of them. Highlights include the giant, otherworldly robot, Gort, and the swarm of nanites that consume everything in their path. The shot in which a moving truck is eaten by particles took over 1,150 versions to get right.


146 DRACULA UNTOLD

Framestore, Milk VFX (2014)

As the principal vendor on the movie, Framestore joined the project early on, ramping up to a crew of 200 for the 300 effects-heavy shots. CG supervisor Ben Lambert says that early discussions pinpointed several key challenges. “The most obvious was that we’d need to deal with Vlad, who goes through several stages of transformation, or ‘vamping out’ as we called it, with a host of different effects that range from very subtle to dialled all the way up to 10. We also needed to develop ways for vampires to disintegrate and die, then there were the bats, including scenes featuring Vlad’s ‘hand of bats’. We also needed to deal with a number of crowd sequences. We have obviously done those before, but never on this scale.” Indeed, for the climactic battle scenes there are around 100,000 bats raking right through a whole army of CG soldiers.

145 THE HUNGER GAMES: CATCHING FIRE

Double Negative, Weta Digital, Method Studios, RodeoFX, various (2013) There are plenty of great shots to admire in the second Hunger Games, from the new capitol city to the dense jungle arena. But we can’t resist the baboon attack created by Weta Digital, using techniques developed on Planet of the Apes to bring the evil monkeys to life.

143 BATTLE: LOS ANGELES

Cinesite, Hydraulx, SPIN VFX, The Embassy, various (2011) With over 1,000 VFX shots, Battle: Los Angeles mixed alien invasion with epic disaster flick. While box office performance was muted, there’s much to admire in the CG work. The alien ships (inspired by Empire Strikes Back) were scruffy, detailed and hodge-podge – lending these aliens a worn, authentic feel. More so, the animation brought the alien adversaries to life; taking cues from war footage, the creatures can be seen dragging comrades to cover.

141 WORLD WAR Z

MPC, Cinesite, Framestore (2013)

MPC’s team completed more than 450 shots for World War Z, which included the iconic zombie invasion in Jerusalem. Shots were a mix of live-action plates with CG environments, CG humans, CG zombies, CG helicopters and FX passes for dust and helicopter wash. To create the hordes of zombies, MPC had to tweak its proprietary crowd system ALICE, which is normally used to keep crowd agents apart, but was used here to simulate streams of close-knit zombies. Many of the shots also required MPC to leverage the power of PAPI, its in-house rigid body dynamic solver. The two systems were bridged together, enabling MPC to create behaviours and animations in the crowd engine, which could then be used to drive physics simulations at the same time.

144 BATTLESHIP

ILM, Scanline (2012) A movie based on a board game starring Rihanna was never going to set the Oscars alight – however with ILM on board fresh from Transformers success and Scanline’s expertise with fluid simulation, Battleship was always going to look good. ILM had already spent three years working on a water sim pipeline but the scale of the production was so big it required Scanline’s input. Using its proprietary Flowline fluid simulation software, Scanline created the shots of the aliens crashing to Earth and the destruction of Hong Kong.

142 POMPEII Mr X Inc., Scanline VFX, Soho VFX, Spin VFX (2014) The cataclysmic destruction of Pompeii is brought vividly to life thanks largely to the efforts of Mr X Inc., which delivered CG versions of Pompeii itself (including a harbour shot that required a 95-million-particle RealFlow ocean), plus of course the lava bombs and pyroclastic clouds of Mount Vesuvius.



140 DREDD The Mill, Prime Focus, Baseblack (2012) The criminally overlooked version of 2000 AD’s law enforcer features a raft of excellent VFX work, which belies its trifling $45 million budget. Realistic Mega City skyscrapers, slow-motion hallucinogenic effects, and an array of grisly deaths all help bring the comic strip to bloody, visceral life.

TOP 5 VFX EPICS COMING IN 2016

Batman vs Superman: Dawn of Justice Dneg, MPC, Scanline, Shade and Weta Digital combine for this beefy super epic
Star Wars: Episode VII: Force Awakens ILM’s return to Star Wars will be the event of the year, maybe the decade
X-Men Apocalypse Digital Domain are on board to destroy the 80s, what could be better?
Captain America: Civil War More from ILM as every Marvel hero joins in the action
Warcraft ILM is on board for one of the most ambitious VFX movies since Avatar

138 BATMAN BEGINS

Double Negative, MPC, BUF, various (2005) Bringing Christopher Nolan’s vision to life required some 600 effects shots. MPC handled all the digital bat work, BUF took on several hallucination sequences, and both The Senate Visual Effects and Rising Sun Pictures handled several scenes. But the bulk of the work – about 300 shots including all digital work for Gotham City itself – went to Double Negative. To create a suitably detailed CG double, Christian Bale was scanned and the model refined until it was indistinguishable from the source material, even at full screen.

136 MEN IN BLACK 3

Sony Pictures Imageworks, Method Studios, Prime Focus VFX (2012) Arguably it’s one Men in Black film too many, but the third outing does at least have some spectacular sequences. Most noteworthy of these are Will Smith’s dive from the Chrysler Building (a mixture of real footage, virtual cityscape, digital Will Smith and CG clothing), plus the climactic showdown atop the gantry in Cape Canaveral and Apollo 11 lift-off.


135 THE WOLVERINE

Weta Digital, Rising Sun Pictures, Iloura (2013) The second Wolverine movie featured a climactic fight with the Silver Samurai; Weta Digital animated the duel. To replicate the chrome effect, shader writers milled pieces of metal to look at the materials’ different responses.



139 JUMANJI ILM, Amalgamated Dynamics (1995) This represented another significant leap in the development of realistic CG fur – this time on some spooky monkeys and a pretty decent-looking lion. The film contains a large number of supposedly photoreal animals of varying quality, including a CG stampede that, oddly, looks less convincing than the dinosaur effects in Jurassic Park two years earlier.


137 CLOVERFIELD

Double Negative, Tippett Studio (2008) Cloverfield is a fine example of how to successfully mix live action with quality CG, such as when the Statue of Liberty’s head is catapulted down the road by an unknown and unseen force. Visible for several seconds in full frame, the head itself had to be built as a highly detailed 3D model with precise texturing. The use of camera tracking software enabled the FX technicians to track CG objects into even the most complex, hand-held scenes.


134 G.I. JOE: RISE OF COBRA

Digital Domain, MPC, Framestore, CafeFX, CIS Hollywood, various (2009) The movie’s no Oscar-winner, but some of the CG work – such as the sequence in which the Eiffel Tower is dissolved by Cobra’s nanomites – is phenomenal. Digital Domain handled the Joes’ all-CG suits, driven by motion capture, and delivered a totally CG street for the missile-evading sequence. Animation was handled by Maya, with effects added in Houdini and rendering done using a mix of RenderMan, Mantra and DD’s bespoke volumetric system, Storm. To get the Eiffel Tower to buckle, a Houdini cloth sim was applied to the supporting girders, while a dynamics system handled debris. The nanomites’ metal-eating effect was achieved using a particle system and the Storm renderer – which had to handle some 400,000 girders.



133 300: RISE OF AN EMPIRE

MPC, Cinesite, Scanline VFX, various (2014) “When Rhythm & Hues went bust, Warner Bros. had about 250-300 shots they needed to send somewhere, and still try and hit the same deadline,” explains VFX supervisor Richard Clarke. The team found themselves facing the arduous task of delivering a mammoth 270 shots in just 12 weeks.

132 PEARL HARBOR

ILM, Asylum Visual Effects, Cinesite (2001) With a laboured script, leaden acting and insensitive factual inaccuracies, the only reason Pearl Harbor is worth seeing is for the VFX. Unbelievably, there are only four shots that are totally CG in the movie, including the two shots of the USS Arizona exploding. ILM used a combination of software for the attack sequence, including AliasStudio, Maya and Softimage for basic modelling, and employed its proprietary software, Zeno, for the many rigid body simulations in which CG fighter planes were required to break into between 300 and 1,000 pieces. To comply with environmental rules, VFX supervisor Eric Brevig wrote a new piece of software to create the huge number of smoke plumes needed.

129 JACK THE GIANT SLAYER Digital Domain, Giant Studios, The Third Floor, MPC, Soho VFX, Rodeo FX (2013) Giants naturally take centre stage in Bryan Singer’s retelling of the fairytale, which also meant creating new digital worlds. The use of virtualisation made the production unique – taking digital models and assets, a kinematic model for each actor was created to track and solve a virtual version of each actor’s skeleton in real time. This was then re-targeted to the digital giant mocap puppet. Animation and virtual camera moves were streamed to MotionBuilder to visualise the giants in CG backgrounds, all in real time.

131 NIGHT AT THE MUSEUM 3

MPC, Cinesite, Zoic Studios, various (2014) The tension between the need to be authentic and accurate, but create a fantastic, dream-like feel, threw up a lot of interesting challenges for this threequel, says MPC’s VFX supervisor Seth Maury, whose team completed 250 shots for the movie. “There were an endless number of characters to create, from elephants, to Balinese dancers, to painted stone hippos,” he says.

128 ENDER’S GAME

Digital Domain, The Embassy, various (2013) As the main vendor on the movie, Digital Domain undertook 700 of the 941 VFX shots, with six other studios helping with the remainder. DD’s key challenge was generating and rendering the flocks of fighter ships, which often numbered in the tens of millions. Only one model was needed for each style of ship – one alien fighter and one International Fleet drone. This meant the team could use instancing to minimise RAM usage while increasing both the number of ships simulated and rendered on screen at the same time. The instancing was particularly helpful for around ten wide shots, dubbed ‘superflocks’, numbering between 20 and 100 million ships.

126 MEN IN BLACK

ILM (1997) A range of weird and wonderful creatures feature in the 250-odd VFX shots for this extraterrestrial action blockbuster. Standout moments include the hilarious ‘worm guys’, the lip-synched pug dog, which features an entirely CG-ed lower jaw, and the scene in which Jeebs gets his head blown off, only for another to seamlessly grow back. ILM’s work in adding the entirely-CG alien Edgar to the final sequence really pushed the art of matchmoving, enabling directors to shoot any scene as if it were live and not due for a CG makeover.

125 SKYFALL

Cinesite (2012)

Bond fights a Komodo dragon in Daniel Craig’s third outing as the suave super spy, and Cinesite brought the creature to life convincingly. The dragons’ basic shapes were initially modelled in Maya and then the finer elements – with all the details of their ribs, wrinkles and folds in the skin – were added using Mudbox. “We ended up modelling the shape based on the more rugged wild Komodo dragons online, and used the textures from our zoo shoot to get the resolution required,” explains Cinesite’s VFX supervisor Jon Neill.

130 WATCHMEN

Sony Pictures Imageworks, MPC, CIS Hollywood, various (2009) Zack Snyder’s impressive take on the Alan Moore/Dave Gibbons comic book was rammed with CG, hence the need for several VFX vendors. MPC was responsible for the 3D environments and cityscapes, the Nite Owl’s aircraft and digital doubles, while SPI took on the major task of bringing Doctor Manhattan to the screen. As an indicator of this challenge, a quarter of the film’s $100 million budget went on visual effects, with $17 million spent on Manhattan’s scenes alone.

127 WRATH OF THE TITANS

MPC, Framestore, Method Studios (2012) This VFX-laden romp through Greek mythology required the combined efforts of three London effects houses to deliver its epic action and titanic monsters. Key among these is the 1,200-foot-high lava-spewing volcano creature, Kronos. To create the moving textures of Kronos’ rock- and lava-covered body, they covered texture artist Stephen Molyneaux with clay from head to toe. When the clay dried, they asked him to mimic poses from post-viz. Photos of his cracked skin became the basis for a displacement map in ZBrush.



124 MAD MAX: FURY ROAD

Iloura, BlackGinger, Method Studios (2015) The makers of the 2015 Mad Max reboot didn’t want the action to be overly CG-dominated, explains Tom Wood, VFX supervisor at Australian FX house Iloura. With more than 300 stunts in the film, VFX was used exclusively to achieve the unachievable. Standout VFX elements include an epic sandstorm and citadel environments.

123 REAL STEEL

Digital Domain, Giant Studios (2011) For the robot boxing matches, a virtual production system was built by Digital Domain and Giant Studios, enabling director Shawn Levy to film the CG characters as if they were on set. Fight scenes were motion captured prior to principal photography, then the actors’ CG counterparts placed in virtual environments matching locations shot in Detroit. While shooting what appeared to be an empty boxing ring, Levy could see the CG robots and frame the action accordingly.

122 THE MUMMY

ILM, Cinesite, Pacific Title (1999) Stephen Sommers’ horror yarn is full of supernatural effects, from dust storms and swarms of locusts to the eponymous star himself. ILM created an underlying skeleton and muscle system in Maya, with dozens of animatable controls. For scenes where you see his bones and chest cavity, the creature was layered in texture maps with areas of transparency and chunks of displaced geometry. In later scenes, VFX supervisor John Berton added markers to Arnold Vosloo’s face, so he could painstakingly track in digital prosthetics, showing his jaw and teeth behind the shredded skin.

120 TERMINATOR SALVATION

ILM, Asylum, Rising Sun Pictures, various (2009) Among its 1,500 VFX shots, Terminator Salvation features an impressive 60ft Harvester robot, which has one of the film’s most intricate rigs. ILM also integrated an energy-conserving shader set in RenderMan to achieve more accurate lighting and cope with the extreme contrasts of desert conditions. The ensuing segment with the truck, Moto-terminators and a giant Transporter is pretty impressive, too.

119 SPIDER-MAN

Sony Pictures Imageworks, Pixel Magic (2002) Sony’s digital Tobey Maguire helped launch one of VFX’s most enduring modern franchises. After bodyscanning Maguire, SPI artists sculpted individual muscles onto the CG model to match the actor’s performance. This physical vocabulary was updated throughout the movie as Spidey became more confident in his abilities.

117 SUCKER PUNCH

Animal Logic, MPC (2011) Zack Snyder’s alternate-reality movie features giant samurai, ferocious dragons and sci-fi vistas, and is loved and hated in equal measure – however the CG is well worth seeing. Amidst all the fantasy bombast it’s actually the low-key tricksy shots that catch the eye, including one sequence in which the camera flies through a keyhole and into the reflection in the eye of the female lead, Baby Doll.

BIGGEST ‘BEST VISUAL EFFECTS’ OSCAR WINNERS

ILM 15
Digital Domain 7
Weta Digital 5
Rhythm & Hues 3
Double Negative 2
Framestore 2
Pixomondo 1
The Mill 1
ESC Entertainment 1
Dream Quest Images 1

121 TRANSFORMERS: REVENGE OF THE FALLEN

ILM, Digital Domain, Asylum FX, (2009) While Transformers brought the Hasbro toys whirring to life, its sequel involves some 60 robots across seven states, three countries, and one alien planet. Add in another 40 CG vehicles, and the total count is more than 100 major 3D assets. “The locations, the effects, the style are all huge,” says VFX supervisor Scott Farrar. Rendering shots for Transformers took between 16 and 20 terabytes of disk space in the studio’s renderfarm; Revenge of the Fallen took 140 terabytes. “The amount of render time is colossal,” Farrar says. “The whole movie is that way.”

Death Becomes Her, ILM (1992)

118 10,000 BC

Double Negative, MPC (2008) If you can ignore the script, the acting, the historical inaccuracies and the bizarre pseudo sci-fi ending, 10,000 BC is a pretty cool film, with some excellent FX work from MPC and Double Negative. The sweeping vistas over the Giza pyramid site are largely models built at 1:24 scale by Joachim Grueninger, constructed near the actual film set in Namibia, but they’re enhanced with digital doubles, dust, props and so on. The best sequence, however, is the stampede, where a pack of mammoths are unleashed to wreak havoc among a building site with 50,000 digital slaves. Fully CG sets integrate seamlessly with live action and model shots and, all in all, it’s a suitably epic climax for a fantastically overblown movie.






116 THE MAZE RUNNER

Method Studios, Gentle Giant Studios, Factory VFX, various (2014) Unsurprisingly, the biggest challenge in this adaptation of the James Dashner novel is the gigantic maze itself; Method Studios completed some 530 shots, providing the towering maze walls, complete with digital foliage, and marrying CG with physical sets. It also delivered the Grievers, the part-organic, part-mechanical creatures that patrol the maze. The Method team studied macro photos of bed bugs and fleas for inspiration, along with slugs and warty frogs for the soft areas of the creatures’ bodies. For the Grievers’ unique movement, ants and even the ‘swarming’ effect of crabs crawling over each other became an influence.

115 ROBOCOP

Framestore, Cinesite, Method Studios, various (2014) Framestore’s key task was to embellish the practical suit from Legacy Effects and bring the iconic main character to life, explains CG supervisor Andy Walker. Live-action photography featured Joel Kinnaman and stunt doubles in two variants of the physical suit, which needed various enhancements and a CG mechanical abdomen. “It became apparent that rather than track the whole body to replace the abdomen and limb linkages, it would be quicker to track just the head and spend our time creating a convincing CG body, even in close-up shots. It was a revelation that the uncanny valley is a lot less deep when you only see a chin.”

Marking Tim Burton’s first major dalliance with CG, Mars Attacks! features over 300 shots from Industrial Light & Magic, then at the height of its dominance over the Hollywood effects industry. From 1950s trading cards to the big screen, the titular Martians represented a leap forward in CG character work. Putting even 15 on screen at once nearly brought ILM’s systems grinding to a halt.

112 DRAGONHEART

ILM, Tippett Studio, Illusion Arts (1996) ILM employed its bespoke Caricature lip-synching software to breathe life into Draco, the talking dragon. The big leap forward here was the range of Draco’s facial animation: the voice was provided by Sean Connery and over 200 reference photos were taken of him with different expressions. These were then used to help animate Draco, give him character and enhance his virtual ‘performance’. With lengthy screen times (up to 30 seconds in some scenes), and Draco being the main focal point in the shot, the CG had to stand up to audience scrutiny, which makes ILM’s work on the film all the more impressive. The facility duly received a Technical Academy Award for its efforts.

113 SPIDER-MAN 3 Sony Pictures Imageworks, BUF, Giant Killer Robots, various (2007) FX supervisor Scott Stokdyk led a team of 200 programmers at SPI to produce 900 visual effects shots for this film. Three years of research went into the origin of the Sandman. To visualise his varying states, they used a mixture of particle and fluid/gas simulations, plus SphereSim – a custom simulation engine that helped generate natural-looking clumps of sand. The 2,700-frame, three-minute effects sequence (including one continuous shot that lasts over a minute-and-a-half) is an astonishing piece of visual effects work.


111 HARRY POTTER AND THE PRISONER OF AZKABAN

Double Negative, Framestore CFC, MPC, ILM, Cinesite (2004) All of the Potter films were VFX-heavy, but the third movie’s all-digital werewolf provided a new high point. MPC’s menacing mythical hound raised the bar for photorealistic creatures through its in-house muscle simulation software and proprietary tools like Delilah, its fur and hair plug-in for Maya.



“I think that juxtaposition of something known placed into a landscape that’s totally unknown really makes it work. It immediately lets the audience know that this story is rooted in a place that was once our world.” Eric Barba, VFX supervisor on Oblivion, Digital Domain

102 TITANIC Digital Domain, ILM (1997)

110 JUPITER ASCENDING Double Negative, Framestore, Method Studios, various (2015) The ancient galactic empire at the heart of Jupiter Ascending looked nothing short of stunning. Method Studios’ Philippe Gaulier teamed up with art director Olivier Pron to create the early visuals, working in-house with Photoshop, Sketch, Maya and Vue 3D. They then brought in in-house VFX supervisors Stephane Naze and Simon Carr to create the concepts for the final shots themselves.

Digital Domain (1997) It was overlooked at the Oscars, but this vivid space romp was named one of the Visual Effects Society’s top 50 movies. Director Luc Besson recruited Digital Domain’s 55-man crew to handle some 200 shots, including sequences on 3D sets on board a model of a 2,000-foot interstellar ocean liner, hovering above an all-digital water planet.

107 THE PERFECT STORM

ILM (2000) George Clooney plays Frank ‘Billy’ Tyne, a fishing boat captain who ignores weather warnings, in a tale based on the true story of the Andrea Gail from 1991. The end sequence is a CG stonker, featuring a huge 100ft wave that finally capsizes the ship. In total, the film featured 90 completely CG shots, all of which include water elements. A further 220 shots required CG seas to be composited with live-action footage shot on a huge, moveable fishing boat set. A custom fluid dynamics system was developed to create a realistic ocean, and more than 30 plug-ins for Maya were written to achieve the intricate effects. In addition to this, standalone applications for shaders and particle systems were also written in-house.

ILM, Animal Logic, Rising Sun Pictures, various (2013) VFX isn’t all spaceships, explosions and dinosaurs: it’s great for recreating period pieces, too. In The Great Gatsby the audience were treated to impossible sweeping camera moves over New York, seamless set extensions, time lapse sequences… in fact, it’s amazing how little of New York and its environs were actually real.

103 LABYRINTH Digital Productions, ILM (1986)

106 THE DAY AFTER TOMORROW

The owl in the opening sequence, included because the real owl flew away, is considered the first ever photorealistic CG animal in a film. Larry Yaeger and Bill Kroyer at Digital Productions animated and technically-directed the award-winning sequence, featuring the CG owl reflected in ripples of water.

ILM, Digital Domain, Zoic Studios, The Orphanage, various (2004) 50,000 scanned photos of a 13-block area of New York were used to create a 3D, photorealistic model of the Big Apple so it could be destroyed by a digital tsunami and then frozen. The film features the longest ever CG flyover shot for the opening ice-shelf scene.

101 THE GREAT GATSBY

Mill Film (2000) Renowned as Britain’s first Oscar win for visual effects, thanks to the sterling work done by The Mill’s now-defunct film division on digital crowds in the Colosseum scenes. The CG stadium, created from blueprints and live-action reference images, was rendered in three stages for easier lighting. Around 50 real actors were filmed, then varied and multiplied for the final shot. The film is also famous for having a digital double of Oliver Reed, who died before the production had wrapped.

ESC Entertainment, Sony Pictures Imageworks, BUF, various (2003)

108 THE FIFTH ELEMENT

104 GLADIATOR

105 OBLIVION

109 THE MATRIX RELOADED

Nothing in it had quite the impact of bullet time, but the Wachowskis’ Matrix sequel marked the first outing for Universal Capture, a high-definition image-based facial performance capture system. The richness of Keanu Reeves’ and Laurence Fishburne’s facial movements was simulated by producing 3D recordings of their performances. 100-micron scans were used, with the detail extracted into bump and displacement maps.

Digital Domain, Pixomondo (2013) With Iceland acting as a futuristic, war-scarred America, Digital Domain was tasked with adding the wrecked buildings and ships needed to sell the illusion. The drone attack inside the Scav base was also generated entirely digitally when the unfilmed scene was reintroduced at the last minute.

Crowd simulation comes of age in James Cameron’s weepy disaster flick. Digital Domain’s hundreds of CG actors helped propel this titanic movie to success. For the crowds of digital background characters, the DD team motion-captured a small number of actors going through a variety of interactive movements, forming a library of digital shots for use in any number and scale throughout the film. The movie also features fully CG-rendered ships and photoreal digital seascapes.




99 SIN CITY CafeFX, Hybride Technologies, The Orphanage (2005) Sin City’s digital backdrops propelled it to the Visual Effects Society’s top 50 films of all time. The fidelity with which the VFX artists recreated the style of Frank Miller’s graphic novel is commendable. Work involved use of black and white and adding false perspectives in post.

98 PAN’S LABYRINTH CafeFX (2006) A surprise worldwide hit, the virtual creatures of Pan’s Labyrinth charm and unnerve with equal frequency. Key effects include Pan himself, the messenger insect, and the nightmare-inducing Pale Man. Away from the creature work, subtle CG wounds enhanced key scenes. CafeFX used software, including XSI and Maya.

97 PIRATES OF THE CARIBBEAN: AT WORLD’S END ILM, CIS Hollywood, Digital Domain, various (2007) Even more effects-heavy than its precursors, the third Pirates movie boasted an impressive $300 million budget, making it the most expensive film ever made at the time of its release. Water simulation reached new heights in the maelstrom sequence. Using a tweaked fluid simulator based on Stanford University technology, ILM created 15 billion gallons of digital water with stunning clarity and controllability.

96 FORREST GUMP

100 KING KONG Weta Digital (2005)

ILM (1994)

We’re not sure why one, let alone three, T. rex would be interested in eating Naomi Watts: she’d hardly make a filling meal. Nevertheless, Kong has to stop his new size-8 friend from becoming dinosaur fodder in this thrilling, 10-minute-long, CG-heavy sequence. Weta Digital doubled its capacity in terms of render farm and disc space, and took on roughly 25 per cent more people to create King Kong. The team used a Maya, RenderMan and Shake pipeline, and created custom software for the ape’s fur. And since Naomi Watts gets thrown about, Weta also had to use a digital double for the actress in these scenes. Ultimately, this remake of the 1933 classic proves that good things also come in big packages.


The 3D elements that establish the eponymous hero’s place as an unwitting participant in the key events of the 20th century are discreet, but nevertheless they are there. From the napalm explosions to the frantic CG ping pong, ILM’s epic journey through the ages has an effects shot for every decade. This film is certainly a real masterclass in using CG to sell a story.


95 SNOW WHITE AND THE HUNTSMAN Pixomondo, Rhythm & Hues, Double Negative, various (2012) A key VFX element in Rupert Sanders’ superlative Snow White reboot was the sinister ‘Man in the Mirror’ character, designed by The Mill and using techniques developed on Voyage of the Dawn Treader. The sequence was keyframed in Maya, with some subtle cloth simulation added, then rendered with mental ray.

94 FIGHT CLUB

Digital Domain, BUF, Pixel Liberation Front (1999) David Fincher’s film revolutionised the use of photogrammetry and 3D pre-vis in cinema. BUF’s impossible camera sequences created moving 3D animations based on 2D reference photos, with some requiring three-day shoots in order to collect enough material. One of the film’s earliest sequences is made up of over 100 individual still photographs composited together. It also boasts a full CG zoom from deep within Ed Norton’s brain, past firing synapses and out through a pore in his forehead, following a bead of sweat as it drips down and flows along the barrel of a Smith & Wesson 4506 in his mouth. Digital Domain created the shot at a cost of $800,000.

90 TRANSFORMERS ILM, various (2007) Michael Bay’s $150 million robotic battle royale featured some incredibly complex character rigs, courtesy of painstaking work by ILM. Autobot leader Optimus Prime alone has 10,108 separate parts. The animators used multisegmented parts for the lips and a turbine system for the eyes to simulate pupil dilation.

92 STAR TREK ILM, Digital Domain (2009)

89 APOLLO 13

Massive ships, exploding planets, wormholes through space… J. J. Abrams’ reboot may not have made any freakin’ sense, but the new Star Trek looked the part. Among ILM’s many challenges was the brilliant space jump on to the Romulan mining ship’s drill platform. It wasn’t even in the original script, yet represented a third of the shots that ILM had to accomplish.

Ron Howard’s tense space drama bagged an Oscar nomination for Digital Domain’s slick CG artistry and detailed large-scale modelling techniques. DD’s proprietary tracking technology helped create a series of stunning deep space scenes, plus a Saturn V launch sequence that even NASA astronauts mistook for original televised archive footage.

93 EX MACHINA

91 STAR TREK II: THE WRATH OF KHAN

Double Negative, Milk VFX, Utopia (2015) Alex Garland’s smart sci-fi movie brought Ava to the screen: an intelligent android with a realistic appearance and translucent limbs that reveal the machine beneath the skin. Double Negative VFX supervisor Andrew Whitehurst was charged with overseeing her creation, ignoring ‘robots’ in the research and focusing on Formula 1 cars to design her muscles. Milk VFX worked with Dneg to create a CG brain that referenced both jellyfish and high technology, as well as POV shots.

Digital Domain (1995)

88 TRON: LEGACY Digital Domain, various (2010)

ILM, Lucasfilm CG (1982) Creation of the Genesis planet was Hollywood’s first fully CG sequence and lasted sixty seconds. It also features the first use of a fractal-generated landscape, the brainchild of ex-Boeing engineer Loren Carpenter, who discovered that fractals could be used to produce realistic-looking terrain.
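For the curious, the principle Carpenter hit upon is easy to sketch. The Python snippet below is purely illustrative – it is not ILM’s Genesis code, and the function name is ours – but it shows the midpoint-displacement idea behind fractal terrain: keep subdividing a line and nudge each new midpoint by a random offset that shrinks at every level, and a convincingly rugged profile emerges from nothing.

import random

def fractal_profile(levels=8, roughness=0.5, seed=42):
    # Start from a flat line and repeatedly insert displaced midpoints
    random.seed(seed)
    heights = [0.0, 0.0]
    amplitude = 1.0
    for _ in range(levels):
        refined = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + random.uniform(-amplitude, amplitude)
            refined += [a, mid]
        refined.append(heights[-1])
        heights = refined
        amplitude *= roughness    # finer levels get smaller bumps
    return heights

if __name__ == "__main__":
    terrain = fractal_profile()
    print(len(terrain), "samples, peak height", round(max(terrain), 3))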


TRON’s return after three decades showed how much the industry had moved on. Digital Domain previsualised the film in stereo before it was shot under the guidance of previz and layout supervisor Scott Meadows. Roughly 85 per cent of the film is CG, with live-action photography complemented by fully digital environments, characters and vehicles.



83 THE GOLDEN COMPASS Framestore, Rhythm & Hues, Digital Domain, Cinesite, various (2007) Described by VFX supervisor Michael Fink as the most complex film he’d worked on, The Golden Compass was one of the most ambitious VFX movies ever attempted. Ten visual effects houses and countless artists across the globe integrated live-action elements with CG backgrounds, talking bears, flying witches, daemons and magical dust.

87 SPIDER-MAN 2

85 THE INCREDIBLE HULK

Sony Pictures Imageworks, Zoic Studios, Entity FX, Furious FX (2004) Spidey’s second outing featured better digital cloth and, of course, the brilliant Doctor Octopus. For the train fight sequence SPI tweaked its existing approach to cloth simulation with new Maya code, so the character’s complex colliding layers of clothing could be simulated together. The film would go on to win a second Oscar for John Dykstra.

82 NARNIA: VOYAGE OF THE DAWN TREADER

Rhythm & Hues, ILM, Hydraulx, Soho VFX, various (2008)

Framestore, MPC, Cinesite, The Senate, The Mill (2010)

Helping us to forget the mixed 2003 Ang Lee version, Rhythm & Hues gave the Hulk the CG he deserved, using its proprietary pipeline alongside modelling in Maya and effects work in Houdini. They also did some sterling work creating arch-villain The Abomination.

86 YOUNG SHERLOCK HOLMES ILM (1985) The first photorealistic CG character arrived in the form of a stained glass knight who grabs a whole ten seconds of screen time. When we say photorealistic, you have to remember he was supposed to look like a stained glass knight, not a real human being. This was one of the final jobs done by Lucasfilm’s Graphics Group. The technology and a team of 45 staff were spun off in 1986, in a deal co-funded by Steve Jobs for $5 million. The sale was prompted when George Lucas’s cash flow had stalled, following his 1983 divorce and the appalling box office flop, Howard The Duck. The group called itself Pixar and 20 years later was sold to Disney with an estimated value of $7.4 billion. Not one of George’s smartest business decisions…

Amidst all the incredible creature work, one shot still stands out for its subtle brilliance. In the final scene the characters find Aslan on a beach; in the background a CG wave stands frozen for minutes of screen time to be picked apart by viewers. Framestore created the wave using Maya particles, its proprietary deep shadow Occvox (occlusion voxels) software, and ray tracing in RenderMan.

84 THE LAST AIRBENDER ILM (2010) The film was a mis-step in its translation from TV show to live-action epic, but the elemental simulations created by ILM are amazing nonetheless. The sequence in which Aang battles Prince Zuko used ILM’s airbending pipeline and shared a core simulation with the fire pipeline. Filmed using practical fire at Zuko’s feet, ILM blocked in the animation of the fire and air with simple objects to judge the timing. “Because the fire and air share the base fluid simulation we were able to really see how they integrate together and see all the nice flowlines as one starts to impact the other,” said VFX supervisor Pablo Helman.

81 THE MATRIX REVOLUTIONS ESC Entertainment, CIS Hollywood, Tippett Studios, BUF, various (2003) Notable for the ‘superpunch’ shot, the first close-up representation of facial deformation on a computer generated human character. The effect of having a full-frame photorealistic impact on a known character’s face was achieved by taking a 100-micron scan of a facial plaster cast of actor Hugo Weaving, which was converted into displacement and bump maps.
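As a rough illustration of that last step – and emphatically not ESC Entertainment’s pipeline – here is a small Python sketch of how a dense height scan can be turned into a displacement map and a bump-style normal map. The function name and parameters are invented for the example; only the idea (heights become displacement, height gradients become bump detail) is taken from the text above.

import numpy as np

def maps_from_scan(height, strength=1.0):
    """height: 2D array of scanned surface heights (e.g. from a facial cast)."""
    h = np.asarray(height, dtype=float)
    # Displacement map: the raw heights remapped into the 0-1 range
    disp = (h - h.min()) / max(np.ptp(h), 1e-8)
    # Bump/normal map: finite-difference gradients give the local slope,
    # packed into RGB the way most renderers expect
    gy, gx = np.gradient(h * strength)
    nz = np.ones_like(h)
    length = np.sqrt(gx**2 + gy**2 + nz**2)
    normals = np.stack([-gx, -gy, nz], axis=-1) / length[..., None]
    return disp, normals * 0.5 + 0.5          # normals remapped to 0-1 RGB

if __name__ == "__main__":
    scan = np.random.rand(64, 64)              # stand-in for a 100-micron scan
    disp, bump = maps_from_scan(scan)
    print(disp.shape, bump.shape)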

79 PADDINGTON

Double Negative, Framestore (2014) “In Paddington I think we found a sweet spot between stylisation and photorealism,” says Pablo Grillo, animation supervisor at Framestore. “That permitted the character to be extremely expressive through a simplicity of performance that honours the original material while being completely believable.”

78 THE LOST WORLD: JURASSIC PARK

ILM (1997) Jurassic Park’s sequel upped the ante in tracking and match-lighting, and almost matched its predecessor’s Oscar win. The extensive use of traditional filming movements in the plate photography, such as handheld shots and dolly moves, really showed how to integrate CG elements into dynamic camera work.

77 HEREAFTER

Scanline VFX (2010) Scanline’s reputation for water effects was firmly established by its Oscar nomination for the tsunami scene in Clint Eastwood’s hit movie, on which it was sole vendor. To achieve the spectacular scenes, Scanline leveraged previous R&D efforts on its proprietary Flowline simulation software, which is built into 3ds Max and V-Ray for rendering.

80 STUART LITTLE Sony Pictures Imageworks (1999) This blend of CG animation and live action missed out on an Oscar for Best Visual Effects to The Matrix. There’s still plenty to admire in this charming tale of a mouse raised by humans in New York. From large-scale water simulations of Stuart’s boat race to a small tear flowing down his furry cheek, this movie pushed CG techniques. A favourite shot is of Stuart cornered inside a water-logged washing machine; details include his wet handprints on the glass.



76 THE DARK KNIGHT RISES Double Negative, New Deal Studios (2012) Knowing Chris Nolan’s dislike for CG, any VFX in the third Batman movie had to be nigh-on photoreal. Double Negative rose to the occasion with some stunning work, including the key sequence in which Bane destroys Gotham Stadium. A fake field was built, enabling stunt players to fall through holes, then the entire field and crowd-filled stadium was replicated in CG.

75 FAST & FURIOUS 7

“What’s great is that there was real purpose to the effects work in the sequence. It’s such an iconic moment in the Snow White story… the iconic moment, really. It felt like a massive privilege to work on it.”

Weta Digital, Scanline VFX, MPC, various (2015) While only ten per cent of the shots are CG, those that are certainly impress, such as the moment Vin Diesel drives a Lykan Hypersport between Abu Dhabi’s Etihad towers. But it’s Weta’s work on re-developing Paul Walker as a digital double that leaves a poignant note – and points to a future where digital actors are a reality. An emotional entry.

Nicolas Hernandez, VFX supervisor, The Mill

71 HUGO

Pixomondo, Lola VFX, Uncharted Territory, various (2011) Martin Scorsese’s hyper-real vision of 1930s Paris relied on the work of Pixomondo, which delivered over 800 shots. The opening sequence swoops down from on high, through the Montparnasse train station to a tight close-up on Hugo himself – 1,230 frames in (nearly) one shot. The film is full of similarly impossible camera moves and threaded with homages to movie effects techniques of the past.

74 INDEPENDENCE DAY VisionArt, various (1996) Independence Day’s massed 3D ships and planes still hold up to scrutiny today. VisionArt and its proprietary particle system/compositor, Sparky, were brought in for the dogfight scenes, but ended up taking on around 85 shots, including the destruction of the alien mothership.

73 IRON MAN ILM, The Embassy, The Orphanage, various (2008) The hero may be just a man in a suit… but what a suit. Iron Man’s cybernetic costumes actually started life as real props, but for the most part were replaced with digital versions. While ILM did the bulk of the heavy lifting, The Embassy was responsible for the digital version of the Mark I suit, and The Orphanage also contributed key shots.

72 HOLLOW MAN

Sony Pictures Imageworks, Tippett Studio (2000) The hokey plot follows Kevin Bacon’s character, Sebastian Caine, as he goes on a rampage when the effects of tampering with his transparency fail to wear off. However, the film boasts some 400 effects, ranging from simple tracking and bluescreen shots when Bacon’s latex mask is seen to be empty, through to sophisticated effects where the invisible man’s body is highlighted only by water or smoke. The standout VFX sequence, in which Bacon gradually becomes invisible, is a marvel too. Sony’s custom volume rendering system enabled the VFX crew to replicate an entire human body in detail, where all the veins and organs move and react properly to the movement of the character. Overall, Hollow Man isn’t Verhoeven’s best effort, but the effects are sumptuous.

70 DISTRICT 9

The Embassy, Weta Digital, Image Engine, Zoic Studios (2009) Neill Blomkamp’s brilliantly gritty tale of inter-species conflict is notable for the quality of its CG aliens, which are beautifully tracked and composited into the documentary-style ‘shakycam’ footage. Image Engine was responsible for the refugee aliens, while The Embassy handled the climactic exoskeleton shoot-out, with Weta on shuttle take-off and crash-landing duties.




66 THOR: THE DARK WORLD Double Negative, Method Studios, Luma Pictures, various (2013)

69 STARSHIP TROOPERS Sony Pictures Imageworks, ILM, Tippett Studio (1997)

Method unleashed a world of destruction and chaos for Thor: The Dark World, using rigid body dynamics, plus particle and gas simulation work. Meanwhile Luma delivered a key VFX scene in the explosive demise of a mighty stone giant, using Maya’s built-in particle and instancing tools, Bullet with the FractureFX plug-in, and FumeFX for dust and smoke.

The first movie to feature a large-scale CG-enhanced military battle – futuristic soldiers and giant alien insects in this case. By now stop-motion animator Phil Tippett had embraced the art of CG creatures, bringing his knowledge of form and movement to the digital realm. To create diversity in the Whiskey Outpost swarm scenes, Tippett’s artists varied the bugs’ hue, colour and saturation parameters, and mixed between a library of different walk cycles.

65 THE HOBBIT: AN UNEXPECTED JOURNEY Weta Digital (2012) It was quite the journey to bring The Hobbit to the big screen. Weta created numerous creatures and environments and relied on cutting edge technologies that allowed actors to work together on sets even where the final characters would be different scales – all in stereo 3D and in high (48fps) frame rates.

68 INSURGENT Method Studios, Animal Logic, Milk VFX, The Third Floor (2015) This young adult sci-fi adventure features a stunning shattered mirror shot in which heroine Beatrice ‘Tris’ Prior crashes through a glass window and fights herself. Other impressive shots include massive crowds created by Milk in Golaem Crowd and a rich holographic room rendered by Animal Logic.

63 THE LONE RANGER

ILM, MPC (2013) The Lone Ranger was a huge flop, but features some top quality work by ILM and MPC. There are chase scenes galore, sweetened by completely flawless CG environments, and a thrilling train crash. For the classic ‘train plummeting off a destroyed bridge’ gag, ILM even went old school, employing miniature work for close-ups, with CG for wide shots of the bridge.

62 CLOUD ATLAS

Method Studios, Trixter Film, Rise FX, Scanline VFX, various (2012) With a budget of $100 million, the Wachowskis’ retelling of the David Mitchell novel is one of the most expensive independent movies of all time. The biggest task for lead vendor Method Studios was the creation of the futuristic cityscape of Neo Seoul in 2144 – a mixture of 6K matte paintings and geometry rendered using V-Ray, with traffic and crowds handled by Massive and Houdini. It was all comped with 3D camera moves in Nuke.

67 ELYSIUM

Image Engine, The Embassy, Whiskytree, Method Studios, MPC (2013) For his socio-political sci-fi epic, Neill Blomkamp asked Image Engine to tap into its creature expertise and create a variety of gruesome droids. As with the ‘prawn’ aliens in his previous movie, District 9, these bots were created using a complicated hybrid capture/animation approach. Stunt performers wore marker-covered grey suits on set, with witness cameras in addition to the main one on hand to cover all the angles. In post, Image Engine artists performed a first pass by manually matching model animation to the original performances. These roughs would either be approved or notes would be given on how to finesse or deviate from the original performances. The aim with the droid designs was to give them functionality and plausibility. Car panelling and military hardware provided some reference, while the springs at the base of the droids’ backs directly mirrored those used on motorcycles.


64 CINDERELLA MPC, Rodeo FX (2015) With a tale as well known as Cinderella, you have to create some pretty special visuals to get people’s attention. MPC’s twist on the famous pumpkin transformation saw a greenhouse exploding into particles of dust, which then collected together to forge the carriage, using their proprietary destruction tool, Kali.


61 TOMORROWLAND

ILM, Rodeo FX (2015) Rodeo FX created 50 shots for Brad Bird’s visually stunning sci-fi adventure, including a key shot in which the heroes escape an exploding farm house in a bath tub. “We were given live plates showing the escape pod landing in the lake and bobbing up to the surface. We had to extend the camera in CG at the head of the shots to create an environment that didn’t exist. We started with a 2D concept, elevating the camera and adding a farmhouse and land around it,” reveals Rodeo FX’s Ara Khanikian.


60 CHAPPIE Image Engine, The Embassy, Ollin VFX (2015) Neill Blomkamp’s tale of the little man standing up to a repressive futuristic regime thematically fits right in with District 9 and Elysium. Armed with Weta Workshop’s concepts, Image Engine created 3D designs, ensuring that the rigging would result in a range of true-to-life motion. Then Weta started 3D printing around 20 life-size Chappies in various stages of wear, receiving the robot’s final design from the legs up. This process allowed them to iron out any kinks in the motion limitations imposed by the design, well before the practical versions were actually built. It also gave Neill a chance to see the characters in a 3D space and make some design changes.

59 JOHN CARTER OF MARS

Cinesite, Double Negative, MPC (2012) Andrew Stanton’s action movie had it tough at the box office but with over 1,850 VFX shots there’s much to admire. It took a year of research to bring Zodanga, the mile-long crawling structure that mines the surface of Mars, to life. The city’s legs were procedurally animated, with secondary animation and dust added using Houdini. The population was then made in Massive.

58 THE CHRONICLES OF NARNIA: THE LION, THE WITCH AND THE WARDROBE

ILM, Sony Pictures Imageworks, Rhythm & Hues, Studio C, various (2005) Notable for its portrayal of half-human, half-animal creatures, and all-CG lion, R&H used the Massive software to create up to 20,000 creatures, taking six to eight weeks to shoot with each frame taking nearly ten hours to render. ILM utilised Maya for the character rigs and Hair & Fur to create horse tails and Minotaurs.

56 GODZILLA

“Because Godzilla is a character and not just an animal, we also had to introduce more humanised elements to his behaviour. We had to find the right balance to make sure Godzilla was able to emote without falling into the realm of fantasy”

Weta Digital, Double Negative, MPC, various (2014) For the 2014 return of Godzilla, director Gareth Edwards “wanted to keep everything designed in a way that could have been shot for real, so the effects would never feel fake,” says Guillaume Rocheron, VFX supervisor for MPC. “We had to make sure Godzilla was a believable creature.” In all, designing the monster took his artists seven months.

Guillaume Rocheron, VFX supervisor, MPC

52 EDGE OF TOMORROW

55 THE AMAZING SPIDER-MAN 2 Sony Pictures Imageworks, MPC, various (2014)

53 X-MEN: FIRST CLASS Weta Digital, MPC, Cinesite, Rhythm & Hues, various (2011)

In the second movie of the rebooted franchise, Peter Parker takes on Electro, The Green Goblin and Rhino. SPI did some incredible work, building an entirely digital version of New York’s Times Square, and creating a totally realistic CG Spider-Man with flexing muscles and rippling cloth, which featured even in tight close-ups.

The fourth X-Men outing was a VFX tour-de-force, with over 1,000 effects shots split across multiple vendors. As well as a selection of mutant powers, we also got a full-blown nuclear impasse off the shores of Cuba, a boat that splinters in two parts and Magneto dragging Sebastian Shaw’s submarine out of the ocean – an awe-inspiring scene with some impressive fluid simulation work.

54 HELLBOY II: THE GOLDEN ARMY

57 TERMINATOR 3: RISE OF THE MACHINES

Double Negative, Cube Effects, Baseblack, various (2008)

ILM, various (2003) Unfairly maligned, Jonathan Mostow’s entry in the Terminator canon has a pretty solid story and a number of great set-pieces. ILM reprised the role it had on T2, creating digital doubles, more metallic Terminator transformations, and a standout chase sequence in which a 140-tonne crane is flipped end-over-end – all done digitally, of course.

51 MINORITY REPORT

ILM, PDI/Dreamworks (2002) Spielberg’s high-concept whodunnit features 481 effects shots, which range from vast cityscapes and futuristic transportation to creepy Spyder robots (courtesy of PDI/ Dreamworks) and a genre-defining virtual user interface. The VFX is pretty flawless throughout, and also ties in beautifully with the overall look of the production design.

The sequel to Hellboy is wonderfully rich in ideas, with gorgeous production design and some stunning set pieces. Double Negative delivered over 1,000 VFX shots, including a swarm of ravenous tooth fairies, a rock creature and the eponymous Golden Army. But the sequence with a rampaging forest god is pretty freakin’ awesome.


Sony Pictures Imageworks, Framestore, MPC, Cinesite, various (2014) Starship Troopers meets Groundhog Day in this rip-roaring sci-fi blockbuster, fuelled by top-notch VFX. The alien Mimics required a bespoke Maya plugin to handle their multifaceted limbs, and the dynamic battle sequences are all the more impressive when you consider the entire background – beach, vehicles, soldiers and explosions – were all added in post.



50 STAR WARS: THE PHANTOM MENACE ILM (1999)

In November 1996, visual effects supervisor John Knoll – later to receive an Oscar nomination for Star Wars: Episode I – walked into director George Lucas’s offices at his ranch, and came face to face with the storyboards for the film. “There were 3,600 of them, laid out on pieces of 4x8-foot foamcore, in 36 groups of ten columns,” he says. “Prior to Episode I, our biggest show was 400 or 500 shots, Episode 1 had 2,000.” Although many modern blockbusters have upwards of 1,500 shots, when ILM started on Episode I, no-one had attempted anything of that scale. The Phantom Menace was, at the time, the biggest visual effects project ever undertaken. 95 per cent of the frames have digital elements, and it features 66 different digital characters, including the first all-CG lead character (Jar-Jar Binks). The 320-shot Pod race, which takes place on the barren landscape of Tatooine, required that ILM spend a year in R&D, working on physics systems for the destruction of the Podracers, plus an Adaptive Terrain Generator, employing a level-of-detail system, just so that its computers could hold the mesh data in memory. Whether you like it or not, Episode I fundamentally changed the way movies were made – pretty much as Episode IV did back in 1977.

49 IRON MAN 2 ILM, Double Negative, various (2010) As you’d expect from a Marvel movie, there’s plenty of CG action on offer here, including a multi-vehicle race car pile-up in Monaco, and an impressive segment in which Iron Man and War Machine take on a group of Hammer Drones – touted as ILM’s toughest assignment on the movie.

48 THE GRAND BUDAPEST HOTEL Look Effects, various (2014) Wes Anderson’s stylish comedy romp called for a blend of matte paintings, model shots, CG set extension and greenscreen action sequences to bring its distinctive look to the big screen. There’s a surprising amount of CG work needed to make multi-layered composites look like old-school miniature effects work!

47 HARRY POTTER AND THE DEATHLY HALLOWS: PART 2

46 OZ THE GREAT AND THE POWERFUL

Sony Pictures Imageworks, various (2013) Sam Raimi returned to his roots on Oz by ditching greenscreen in favour of physical sets – but that still left plenty for the VFX artists to do. Sony did over 1,100 of the 1,500 VFX shots, a favourite being China Girl – animation supervisor Troy Saliba recalls how he had to tell a surprised animator that the animation was too articulated, and that, contrary to normal practice, it should actually be made less expressive.

MPC, Double Negative, Framestore, Cinesite, various (2011) Using the combined talents of most of London’s effects houses, the final chapter of the Potter franchise was also one of the best visually. Digital doubles, CG creatures, a dragon made entirely of fire, a burning Quidditch stadium… and, of course, the final demise of Voldemort.

“One of the things I love about working with Sam Raimi is that he keeps you on your toes. [On Oz the Great and Powerful] he was constantly challenging us to incorporate everything he wanted, and to create a consistent tone” Scott Stokdyk, visual effects supervisor, EuropaCorp


45 THE HOBBIT: THE BATTLE OF THE FIVE ARMIES

Weta Digital (2014) For the culmination of J. R. R. Tolkien’s novel, Weta Digital had to employ all of its skills: fluid dynamics for Smaug’s fiery breath; physical simulations for the destruction of Lake-town; new army management tools for the huge CG battles; plus the roll-out of Manuka, Weta’s own physically-based renderer.


38 TEENAGE MUTANT NINJA TURTLES

ILM, Image Engine, various (2014) While the ‘realistic’ design of the turtles left some bemused, Teenage Mutant Ninja Turtles packs a punch: not least the dramatic, 100 per cent CG mountain chase created by Image Engine in which the turtles slide, flip and kick their way down a mountain slope flipping trucks and fending off enemies, enhanced by a swirling 360-degree camera move.

44 NOAH ILM, Look Effects, various (2014) Darren Aronofsky’s Noah offered an inventive, VFX-driven update of one of the best known Biblical stories. ILM was the main vendor, working in Maya, SpeedTree and Katana to create the ark, animals and foliage, and deliver stunning sequences such as the birth of a forest and the animals boarding the iconic vessel.

37 JURASSIC WORLD

ILM, Image Engine, various (2015) Although the latest Jurassic Park movie features a new dinosaur, the Indominus rex, it’s the returning velociraptors that are the real stars. Image Engine was tasked with bringing them back to the screen – this time interacting with humans. The studio used mocap data from Industrial Light & Magic to get the initial staging and performance cues, refining the animation manually to give each creature an individual personality, before adding a final pass of detail to the face, muscles and eyes.

43 WILLOW ILM (1988) Fantasy favourite Willow includes a groundbreaking sequence when the title character turns a goat into an ostrich, the ostrich into a turtle, the turtle into a tiger, and then the tiger into the human sorceress Fin Raziel. ILM won a Sci-Tech Academy Award for Morf, the morphing software developed for the movie.

HOW MUCH CG DO EARLY FILMS CONTAIN?
Futureworld (1976) – 40 seconds
Star Wars (1977) – 28 seconds
The Black Hole (1979) – 90 seconds
Looker (1981) – 32 seconds
Star Trek II (1982) – 78 seconds
TRON (1982) – 15 minutes
The Last Starfighter (1984) – 27 minutes
The Abyss (1989) – 75 seconds (photoreal)
The Lawnmower Man (1992) – 9 minutes 45 seconds
Jurassic Park (1993) – 4 minutes

41 LORD OF THE RINGS: THE TWO TOWERS

Weta Digital, Sony Pictures Imageworks, various (2002) Gollum featured in the previous movie, but the world’s best-known virtual character truly turned heads here, becoming more physically realistic thanks to Bay Raitt’s facial animation system. New subsurface scattering shaders were also used for the skin.

40 2012

42 MALEFICENT MPC, Digital Domain, various (2014) Digital Domain’s digital doubles work had taken great strides since The Curious Case of Benjamin Button and TRON: Legacy, but Maleficent brought things to a whole new level. The three pixies were particularly challenging, being seen both as normal-sized live actors and as tiny CG characters, the latter versions shown in close-up delivering dialogue. Digital Domain’s in-house hair tool, Samson, was brought in to create the complex hair grooms, with the digital hair also having to interact with the characters’ complex costumes.

Digital Domain, Double Negative, Scanline VFX, Sony Pictures Imageworks, various (2009) For 2012, director Roland Emmerich turned to 18 separate effects facilities to deliver unforgettable scenes of a supervolcano erupting in Yellowstone, earthquakes annihilating LA and Las Vegas, and a wall of water engulfing the Himalayas: a fiendishly complex set of physics simulations.

36 ANT-MAN ILM, Double Negative, various (2015) Whenever Ant-Man shrinks, the effects are always digital: the CG water in a CG bathtub for one iconic early scene was recreated from photography shot on ‘macro’ sets: special six-foot-tall replica sets designed to give greater realism to the work. “The water sim was massively detailed and pushed the limits of resolution we could simulate. It’s fully raytraced water refracting everything,” commented Double Negative VFX supervisor Alex Wuttke.

39 ALICE IN WONDERLAND

35 THE ABYSS

Sony Pictures Imageworks, various (2010) Tim Burton’s retelling of the Lewis Carroll story is a tour-de-force of CG: almost every scene was shot against greenscreen, so backdrops, props and characters could be inserted digitally. Indeed, in some shots the only real things on show are the actors’ faces; everything else is computer-generated.

ILM, various (1989) ILM’s aquatic ‘pseudopod’ took CG to a new level. To create the effect of the water-filled creature mirroring the actors, the team digitised their faces and morphed them in software originally developed for the film Willow.


34 SUPERMAN RETURNS

Sony Pictures Imageworks, various (2006) Several studios helped Sony supply the visual effects to set the man of steel soaring. Sony Pictures Imageworks’ super-realistic digital double of actor Brandon Routh used Paul Debevec’s LightStage 2 system, capturing reflectance data of the actor to apply to a spandex-clad CG model.


33 RISE OF THE PLANET OF THE APES Weta Digital, various (2011)


Without ever uttering a word, Caesar, the star of Rise of the Planet of the Apes, had to be believable, understandable and utterly sympathetic in a story essentially told from his perspective as he grows up, rebels, and ultimately leads his species in a revolution against mankind… And he needed to look good in fur while doing it. With visual effects supervisor Joe Letteri once again at the helm, around 730 Weta crew members ultimately contributed to the film’s multitude of creature and environment-based effects shots. The technique used to produce the fur was completely different to Weta’s previous ape movie. “With Kong, the approach was very much a procedural one, manipulating the fur with deformers,” says Code Department supervisor Alasdair Coull. “But the tools were, in a sense, quite limiting. With deformers, you’re always one step removed. Plus, there was a very big learning curve for the artists. With the new software, we’ve gone for something that’s way more intuitive.” Rather than generating the hairs at render time, the new system, dubbed Barbershop, enables the user to groom the fur in real time. Artists work with guide hairs, controlling the hair at a very reduced density, and able to direct a single strand if necessary. “You name it: hair length, curl, thickness at base, thickness at the tip, level of clumping… With Barbershop we can control it,” says fur lead Nick Gaul. “Previously if I hired people, they had to spend three months learning how to groom fur. Now if they can shape a curve, they can groom.”
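The guide-hair principle itself is simple enough to sketch. The toy Python below is not Weta’s Barbershop – the function and data layout are invented for illustration – but it shows the core idea the artists describe: a handful of hand-shaped guide curves are blended, weighted by distance, so a single groomed strand can steer thousands of rendered hairs.

import math, random

def blend_guides(follicle, guides, power=2.0):
    """Blend guide curves, weighting each by inverse distance to the follicle."""
    weighted = []
    for root, curve in guides:
        dist = math.dist(follicle, root) + 1e-6   # avoid divide-by-zero
        weighted.append((1.0 / dist**power, curve))
    total = sum(w for w, _ in weighted)
    points = len(guides[0][1])
    # Blend the guides point by point to make this follicle's own hair curve
    return [tuple(sum(w * curve[i][axis] for w, curve in weighted) / total
                  for axis in range(3))
            for i in range(points)]

if __name__ == "__main__":
    # Two hand-groomed guides: one standing straight, one curling to the side
    guides = [((0.0, 0.0), [(0.0, 0.0, z * 0.2) for z in range(5)]),
              ((1.0, 0.0), [(z * 0.05, 0.0, z * 0.2) for z in range(5)])]
    random.seed(1)
    follicle = (random.random(), random.random())   # a point on the skin
    print(blend_guides(follicle, guides))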

31 THE LAST STARFIGHTER

Digital Productions (1984) Spaceships go 3D in Nick Castle’s classic 1980s sci-fi fantasy. The movie is widely held to be the first use of CG to represent real-world objects in a movie. Digital Productions used innovative software and the Cray Research X-MP supercomputer to create CG for 300 shots, occupying a then-incredible 27 minutes of screen time.

30 IRON MAN 3

Weta Digital, Digital Domain, Framestore, Scanline VFX, various (2013) One of the standout VFX sequences in the Marvel threequel occurs when an army of soldiers is given a biotech boost called Extremis, which generates some jaw-dropping under-the-flesh fireworks. Using reference material from Digital Domain, Framestore created the effect using its proprietary fluid solver fLush, rendering the shots using Solid Angle’s Arnold renderer.

32 WAR OF THE WORLDS

ILM, Stan Winston Studio (2005) Spielberg’s re-imagining of the HG Wells story is notable for the photoreal quality of the CG tripods and the grim efficiency of the human vapourisation effect. ILM used the Inferno compositing solution and Maya to create the tripods, but most challenging was creating the weapons the aliens use. The ‘heat rays’ were made in Inferno’s action module using a particle generator that the artists used to spray particles along a pre-determined path. With a ten-month production schedule and just 12 weeks in post, Spielberg utilised real-time pre-viz to ensure ILM had enough time to get the 400 shots completed. Despite being its tightest ever job, ILM delivered – but in doing so, started a trend in Hollywood known as the ‘War of the Worlds effect’ where VFX studios were pressured to deliver better visuals on smaller budgets and tighter schedules.

“I remember as a teenager being blown away by what I was seeing [in the original TRON]. You knew this was something you’d never seen before. But now that technology has been around for 28 years and is much more sophisticated”

29 EXODUS: GODS AND KINGS

Double Negative, MPC, Scanline VFX, various (2014) With showpiece scenes like the parting of the Red Sea, Ridley Scott’s biblical epic was always going to demand a lot of visual effects work. MPC supplied the shots in question, developing a 160-foot wave in conjunction with Scanline VFX and its Flowline fluid simulator. Both MPC and Double Negative contributed crowd scenes comprising tens of thousands of digital extras, with MPC working with UK mocap specialist Audiomotion to record motion data of horses and chariots, the latter requiring the facility to capture eight horses and multiple stunt performers simultaneously.

Eric Barba, VFX supervisor, TRON: Legacy, Digital Domain



28 THE HOBBIT: THE DESOLATION OF SMAUG Weta Digital (2013) Tortoise shells, snake skin and crocodiles were referenced in making Smaug, but it was Benedict Cumberbatch’s performance that counted. Weta used lipsync studies to animate the curl of the lips and snarl of the nose. To sell the scene, Weta created 200 billion pieces of treasure and 20 million coins.

“The best fantasy creature animation I’ve ever seen. Smaug is an awe-inspiring achievement in design and performance and his presence dominates every scene” Shelley Page, head of international outreach, DreamWorks Animation

27 LORD OF THE RINGS: RETURN OF THE KING

24 CAPTAIN AMERICA: THE WINTER SOLDIER

Weta Digital, various (2003) The conclusion to Peter Jackson’s epic trilogy epitomised many techniques that are now commonplace in largescale effects work, such as crowd simulation. Another highlight was the all-digital Gollum, animated from Andy Serkis’s reference performance: then a rare example of a virtual character both emotionally engaging and technically impressive.

22 PIRATES OF THE CARIBBEAN: DEAD MAN’S CHEST ILM, various (2006) One of the key VFX movies of the decade, thanks to its performance-captured scurvy sea dogs, the crew of the Flying Dutchman. Armed with motion data of the actors captured during principal photography using its pioneering Imocap system, ILM created 18 hero characters and 32 variations to crew the ship. This gang of half-human misfits – all of them entirely digital, with the exception of Stellan Skarsgård’s Bootstrap Bill – account for 600 of the film’s total of 1,000 animation and visual effects shots.

ILM, Scanline VFX, various (2014) There are plenty of great moments in the Captain America sequel to match the espionage action, but the highlight has to be the huge showdown between three of S.H.I.E.L.D.’s awesome Helicarriers. Each ship features texture detail four times greater than the single ship in Avengers Assemble.

21 TERMINATOR GENISYS

26 TRANSFORMERS: DARK OF THE MOON

ILM, Digital Domain, various (2011) The third Transformers movie features more unbelievable VFX work from ILM, notably Driller, a giant mechanical worm comprising over 70,000 parts. The scene where it smashes through a skyscraper and demolishes downtown Chicago took over 120 hours per frame to render using ILM’s entire farm, proving that if you want big, bombastic VFX there’s no better place to go than ILM.

25 STAR TREK: INTO DARKNESS

ILM, Pixomondo, Atomic Fiction (2013) The second Star Trek reboot is, understandably, pretty VFX-heavy: Spock stranded in an active volcano, the attack on Starfleet HQ in San Francisco, the chase scene on planet Kronos… but our favourite sequence is the fight on the garbage barge. The actors are (occasionally) real, but the barge and the sun-drenched SF cityscape are utterly convincing CG. ILM provided the digital effects, based on mocap by Giant Studios.

Double Negative, ILM, MPC, various (2015) It took MPC a year to recreate Arnold Schwarzenegger’s youthful likeness for the latest movie in the Terminator franchise. Three models were built, including a high-res mesh of around 1.3 million polygons, and the work was refined until the last days of the project.

23 PROMETHEUS MPC, Weta Digital, Fuel VFX, various (2012) The prequel to Alien has some stunning creature work. The snake-like alien that attacks the crew was created using a mixture of CG and practical effects: MPC created a CG ‘Hammerpede’ to mix with the practical shots and digitally removed wires from the physical model, puppeteered by director Ridley Scott himself.


20 TRANSFORMERS: AGE OF EXTINCTION ILM, various (2014) In 2014, Age of Extinction was the data-heaviest project ILM had ever handled – Optimus Prime alone is made of 6,732 individual pieces of geometry. From shape-shifting robots to ocean liners falling from the sky, to colossal alien spaceships hoovering up Hong Kong, it’s an astonishing feat.


19 THE DARK KNIGHT

Double Negative, Framestore, BUF (2008) Of The Dark Knight’s 700 VFX shots, most remained invisible, but the 120 that feature the disfigured face of Harvey ‘Two-Face’ Dent stole the show. Framestore supplied the digital makeup effects, modelling the gruesome 3D face in Maya and Mudbox, and tracking it to live footage of actor Aaron Eckhart.

18 MAN OF STEEL

Weta Digital, MPC, Double Negative, various (2013) Superman’s return saw Smallville and Metropolis torn apart in seriously epic destruction sequences. MPC used its proprietary Kali solver to simulate elements including concrete, dirt, glass, ice and brick – with additional tools for particles and volumetrics.



17 STAR WARS: EPISODE III REVENGE OF THE SITH

ILM, Gentle Giant Studios (2005) The opening space battle of the third Star Wars prequel set the tone for a VFX blockbuster, while General Grievous was the icing on the cake. The killer cyborg was designed so that during a fight, a rigid body simulation could animate any cut-off parts. The model comprised 4,591 NURBS surfaces for a total of 435,186 CVs. While a flawed film, much of the VFX stands up and ushered in a new era of blockbuster CG.

14 LIFE OF PI Rhythm & Hues, MPC, various (2012)

16 GUARDIANS OF THE GALAXY

Framestore, MPC, various (2014) Marvel’s most out-there caper to date brought a number of visual effects triumphs, not least the fully-CG lead character Rocket Raccoon. Rather than use guide hairs, Framestore simulated the entire groom to achieve more naturalistic results for his fur, rendering in Arnold using a shader based on the disneyISHair model. Multiple colour maps and mix masks controlled the variations in colour along the length of each strand of hair.
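To picture how maps and masks can drive colour along a strand, here is a deliberately tiny Python sketch – not Framestore’s shader, and the function and values are invented for the example. Two sampled map colours are blended from root to tip, with a mask value scaling how far the shift goes.

def strand_colour(t, root_rgb, tip_rgb, mask):
    """t runs from 0 at the root to 1 at the tip; mask is a 0-1 mix-mask value."""
    blend = t * mask                   # the mask scales the root-to-tip shift
    return tuple(r + (p - r) * blend for r, p in zip(root_rgb, tip_rgb))

if __name__ == "__main__":
    root, tip = (0.35, 0.22, 0.12), (0.65, 0.55, 0.40)   # brown root, blond tip
    for t in (0.0, 0.5, 1.0):
        print(t, [round(c, 3) for c in strand_colour(t, root, tip, mask=0.8)])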

Any lingering doubts that we’ve yet to tackle the animal analogue of the Uncanny Valley were buried at sea when millions of moviegoers came face to face with Richard Parker, the Bengal tiger in Life of Pi. After watching the pivotal performance from a stranded tiger, the public were left wondering how the filmmakers had coaxed a real animal into doing all those things, while the Indian film board refused to show the film until they were sent reference shots proving the emaciated animal was a digital creation. Rhythm & Hues had a year to create the film’s 158 CG tiger shots, with a team of 40 animators and 120 other artists working in the US and India. The studio used a pipeline built entirely around proprietary tools, with Voodoo used for Parker’s animation, Wren for rendering, and Icy for final compositing.

MAGI, Triple-I, various (1982) The film that brought CG to the attention of the masses, and made MAGI’s Light Cycles a cultural icon. TRON’s 15 minutes of CG remain a landmark, despite the crude technology of the time: even the largest machine used in the effects work possessed a meagre 2MB of memory.

Iñaki González, co-founder, Leonard Blum

13 AVENGERS ASSEMBLE

MPC, Digital Domain, Rising Sun Pictures, various (2014) With 1,311 VFX shots, the latest X-Men film required a dozen different vendors. The coolest sequence occurs during the rescue of Magneto from the Pentagon: time slows almost to a stop as we follow Quicksilver, disarming soldiers and redirecting bullets, achieved using high-speed photography, wirework, and CG elements from Rising Sun Pictures.

ILM, Weta Digital, various (2012) Marvel’s original superhero team-up took the combined efforts of over ten visual effects houses and hundreds of artists to bring to the screen. It’s hard to single out an individual highlight. “There were tons of effects: night-time organic effects, daytime mechanical effects,” says Weta Digital visual effects supervisor Guy Williams. “It’s the kind of thing that effects guys like to work on – it was so over the top in a fun way.”


The Mummy Returns – The plastic-faced Scorpion King
Indiana Jones and the Crystal Skull – The fake tree-swinging CG monkeys in the fake CG jungle
I Am Legend – Those not terribly realistic vampire-zombie things
Die Another Day – Laughable James Bond iceberg tsunami surfing scene
Escape From LA – More surfing: equally improbable, equally dreadful
Jumanji – What’s up with those freaky-looking mess-making monkeys?
Star Wars Episode II – There’s plenty of bad CG here, but we’ll pick the bit where Anakin rides a space cow. Ugh.

12 TRON

“Creating a completely believable CG human being and keeping the performance real was considered the ‘Holy Grail’ of VFX. On The Curious Case of Benjamin Button, Digital Domain developed new techniques for hair, skin, eyes, animation and compositing to bring us the best possible results”

15 X-MEN: DAYS OF FUTURE PAST

10 BIG BUDGET BLOCKBUSTERS WITH BAD CG

Lost in Space – Bloody awful Blarp creature.
Air Force One – The ‘climactic’ crash of the Jumbo Jet into the ocean.

11 THE CURIOUS CASE OF BENJAMIN BUTTON Digital Domain, various (2008) When director David Fincher met Digital Domain VFX supervisor Eric Barba on an Adidas commercial, he had a thought: could CG be used to make the ‘unfilmable’ story of the reverse-ageing Benjamin Button work? The answer was ‘yes’; the film’s a milestone in photorealistic VFX.


Lockout – A hoverbike sequence that looks like pre-viz was used instead of the actual CG.


“The advances we made [on Dawn] are less about explosions and spectacle and more about what we’ve done to allow the characters to connect with the audience. What makes these films unique is how much acting our digital characters do and how well they integrate into the rest of the movie” Dan Lemmon, VFX supervisor, Weta Digital,

10 DAWN OF THE PLANET OF THE APES Weta Digital (2014) Building on the success of 2011’s Rise of the Planet of the Apes, this tale of humans and super-intelligent apes battling for survival in a post-apocalyptic near-future is a technological masterpiece. Weta Digital’s CG creatures are very much the stars of the show, and have minutes of screen time with no human interaction. It feels unfair to single out individual scenes, as the effects are incredible throughout. But it’s the intimate moments that really stand out. Those shots in which human actors react to computer-generated apes carry genuine emotional depth. For once, CG characters are not simply there to advance the action but to perform: to carry a narrative arc. The digital apes – 12 hero characters, plus around 20 ‘extras’ – were created using a mixture of hand animation and motion-capture footage of Andy Serkis and the other actors portraying the creatures. Much of the mocap was shot on location, which meant an overhaul of the technology. This kit needed to be hauled up mountains, rained on, and generally abused, as Weta Digital sought to capture the most authentic performances possible. When the gloves did come off and the apes fought, it created new problems – artistically, the actors had to discover how intelligent apes would fight (no biting!), while director Matt Reeves had to get a grip on new workflows, filming the actors in a virtual space and setting beats and camera angles afterward, using virtual cameras. The hard work resulted in three VES Awards, plus an Oscar nomination for Best Visual Effects. Maybe by the time of the third movie in the series, Hollywood will have accepted motion-captured performances enough to nominate Andy Serkis for Best Actor.

Weta’s CG apes were created using a mixture of hand animation and on-set mocap footage of the live actors




“ILM is full of fan boys exactly like Guillermo. People can sense his love for the material, the fun of making monsters which is part of what we do here. He loves that and we love him” Hal Hickel, animation supervisor, ILM

9 PACIFIC RIM ILM, Rodeo FX, Hybride, Ghost VFX, Base FX, Mr. X Inc. (2013) Director Guillermo del Toro’s epic blockbuster tells the tale of badass robots saving humankind from monstrous sea creatures. For the work, del Toro assembled a ‘dream team’ of concept designers, including veteran sci-fi artist Wayne Barlowe, also commissioning maquettes of all the major Kaiju (the sea monsters) and Jaeger (the humanoid war machines) from practical effects firm Spectral Motion. Industrial Light & Magic led the CG work, along with supporting facilities including Ghost FX, Hybride, Rodeo FX and Base FX. “We put a lot of time into the [Jaegers] Gipsy Danger and Striker Eureka, because they have the most screen time and the most actions,” says ILM animation supervisor Hal Hickel. “We focused on how the shoulders and hips would work and how all the mechanisms would fit together. Along the way we’d adjust proportions and other things [based on] Guillermo’s input, and we’d then start to move them and see what they looked like in various poses. “Once we’d built them in the computer, we were able to start animating them and figuring out things like whether arms needed to be longer or see if the legs looked a little stumpy on a given lens and from a certain angle. There were tons of decisions like this.” In total, ILM spent months working on Gipsy Danger. “We put a lot of detail into the model, but carefully planned where it was needed and when,” says Hickel. “We’d look ahead at a sequence and would dress the amount of detail accordingly.” They spent their time wisely – the Pacific Rim creatures instantly joined the canon of beloved movie monsters that audiences love to hate and fear in equal measure, while the visual effects landed six VES Award nominations, one of them for Hickel himself.

ILM handled principal VFX duties on Pacific Rim, basing its designs for the robots on Spectral Motion’s maquettes



“There are bits [of Gravity] that people just assume have been filmed, for instance a midrange shot when [Sandra Bullock] is working on the Hubble. Lots of people have seen it and asked us what we did there – they had no idea that it’s basically all CG apart from her face” Tim Webber, VFX supervisor, Framestore

8 GRAVITY Framestore, various (2013) Some films are bogged down with effects; others are light fare. But Alfonso Cuarón’s Gravity, set in the weightlessness of space, is a rare union of ethereal CG work and a heavyweight emotional storyline. Anchored by Sandra Bullock’s central performance – or facial performance, at least: for much of the movie, her spacesuit-clad body is animated digitally – as stranded astronaut Dr. Ryan Stone, Framestore’s artists created an entire digital world. “Gravity is a hybrid: it’s partly live-action, but is similar to [an animated feature] in many respects,” says animation supervisor Max Solomon. “Very large parts of it are fully CG.” Those ‘parts’ include the space shuttles, the Hubble Telescope, the International Space Station, and the Earth itself. All are on screen for long periods of time: the opening tracking shot alone lasts 13 minutes. At the start of the pitch process, it wasn’t clear just how much of Gravity would have to be computer-generated. Although early tests were done using more traditional methods such as shooting actors on wire rigs and building physical sets, it soon became clear that the challenges of simulating weightlessness and characters spinning off into the darkness could not be overcome practically. As in so many previous cases, digital effects became the silent star of the film. Very few directors are as good as Cuarón at making visual effects feel like an organic part of the action. “I think that is what is special about Gravity,” says Solomon. “As a space film, it doesn’t feel like a VFX film: it feels more like a documentary. That was always something Alfonso stipulated from the beginning, that we were a fly-on-the-wall camera crew following these astronauts through their ordeal.” And that’s what’s so seductive about Gravity: it makes space seem exciting and scary, but also tricks you into thinking that it’s real.

Over three years in the making, Gravity landed Framestore the Oscar, BAFTA and VES Awards for its VFX




“Ultron is probably the most elaborate rig we have ever done. It was 10 times the complexity of [Transformers’] Optimus Prime” Ben Snow, visual effects supervisor, ILM

7 AVENGERS: AGE OF ULTRON ILM, Double Negative, Framestore, Method Studios, Animal Logic, various (2015) The sequel to 2012 hit Avengers Assemble, Age of Ultron was Marvel Studios’ biggest cinematic thrill ride to date. Joss Whedon returned to write and direct, along with visual effects supervisor Chris Townsend, who managed over 10 studios for the last Avengers film and approximately 20 studios this time around. Industrial Light & Magic handled 800 of the VFX shots, including the opening and closing battles, and the three main CG characters – Hulk, Iron Man in all his forms, and new anti-hero Ultron – dividing the work between teams in San Francisco, Singapore, Vancouver and London. As in the previous movie, Hulk is a highlight of the visual effects. With the digital character appearing in 50 per cent more shots than in the previous film, the studio took a new approach to the green giant’s muscles and flesh. Typically, ILM’s artists model the final form for a character’s body, then put muscles inside and skin simulations on top. For Age of Ultron, character TDs Sean Comer and Abs Jahromi worked with a professor of medicine to devise a new multi-layer muscle system. “We built the rig, put the muscle system on top, and the skin on top of that,” explains animation director Marc Chu. “So when we animated Hulk, the rig would reverse engineer what the muscles were doing and realistically drive the flesh simulation. In the previous film, we had to do corrections to keep Hulk on model, but the new system [did that semi-automatically]. We could still correct anything that penetrated, and adjust the jiggle to taste, but it made us six times more efficient.” With the movie’s other key effects expanding on those of its predecessor in similar ways, the result is a spectacle of superheroic proportions, each scene delivering a range of new visual treats.

Age of Ultron was created by an Avengers-like team of VFX superheroes. Around 20 studios worked on the movie



“I remember the huge buzz about the movie before it came out. Everyone in the industry was raving about early tests John Gaeta had posted of his bullet time footage. When the first trailer dropped, the world went insane! It changed the way VFX was perceived” Adrian Wyer, Owner, Fluid Pictures

6 THE MATRIX Manex Visual Effects, Animal Logic, DFilm Services, various (1999) Released in spring 1999, The Matrix was the surprise science fiction hit that beat The Phantom Menace to the finish line in the race for the Academy Award for Visual Effects. It launched the career of first-time visual effects supervisor John Gaeta, then just 34, and goes down in history as the film that raised the bar for the choreography of fight sequences and reinvented cinematography. Its most iconic scene is a frozen moment that has become known as ‘bullet time’, in which Neo (Keanu Reeves) dodges bullets fired at him by an agent, while the camera circles around. The sequence still captivates today. Gaeta’s team trained a circular array of 122 still cameras on Keanu Reeves, then triggered them in sequence. Because cameras located on one side of Reeves were visible to those on the other, Gaeta needed a way of generating photorealistic sets so the cameras could be removed from frame. Gaeta and Manex VFX supervisor Kim Libreri found the answer at Siggraph 1997 in ‘The Campanile Movie’, a short film by Paul Debevec, George Borshukov and Yizhou Yu. Photographs of buildings were reprojected onto their CG models using new, best-fit algorithms. The result was the birth of virtual cinematography. “All my friends who worked on the film were enablers who allowed us to take risks,” remembers Gaeta. “That was very important to me at that time, and gave me confidence in everything else I’ve done since. The Wachowskis give their designers quite a bit of creative freedom, and they really engaged, encouraged and inspired.” Reflecting on the film in 2007, Gaeta noted that there were many shots he’d still like to tweak. “There are a lot of shots that, in the light of what you can do today, are pretty crude. But they still represent ideas the filmmakers were trying to represent.” To judge from The Matrix’s placement in the poll, they’re ideas that endure to this day.

One of the oldest movies in our list, The Matrix’s bullet time shots helped pioneer virtual cinematography




DIRECTORS WITH THE MOST ENTRIES IN THE TOP 200
Steven Spielberg: 8
Michael Bay: 6
The Wachowskis: 6
Christopher Nolan: 5
Peter Jackson: 5
Bryan Singer: 4
Guillermo del Toro: 4
James Cameron: 4
Alfonso Cuarón: 3
David Fincher: 3
Gore Verbinski: 3
Jon Favreau: 3
Neill Blomkamp: 3
Paul Verhoeven: 3
Ridley Scott: 3
Robert Zemeckis: 3
Roland Emmerich: 3
Sam Raimi: 3
Stephen Sommers: 3
Tim Burton: 3
Zack Snyder: 3
Barry Sonnenfeld: 2
George Lucas: 2
J. J. Abrams: 2
Joseph Kosinski: 2
Joss Whedon: 2
Rob Cohen: 2
Ron Howard: 2
Wolfgang Petersen: 2

5 INTERSTELLAR Double Negative (2014) Collaborating once more with director Christopher Nolan, Interstellar challenged Double Negative to visualise the un-visualisable: realistic alien worlds, a mathematically accurate black hole, and the Tesseract, a four-dimensional space with time as a physical dimension. Theoretical physicist Kip Thorne provided the maths, the studio’s artists delivered the visuals, and Hollywood awarded them an Oscar. For many viewers, the most memorable shot from the movie is the depiction of the black hole Gargantua, for which Double Negative needed to show the realistic behaviour of the black hole and a wormhole, right down to the lighting – or lack thereof. For this, Double Negative was lucky to have, in Oliver James, a chief scientist with a first-class degree in physics from the University of Oxford. To process the equations Kip Thorne – who also acted as executive producer on the movie – had written to describe light paths around a black hole, James and his team wrote a new physically based renderer: DnGR (Double Negative General Relativity). It enabled artists to generate realistic images of the hole and its gravitational lens by setting three key parameters: rate of spin, mass and diameter. The work broke new ground, both artistically and scientifically: a paper on the research written by James, Thorne, CG supervisor Eugenie von Tunzelmann and VFX supervisor Paul Franklin was recently published in the American Journal of Physics – which promptly called for the movie to be shown in school science lessons to help explain general relativity. In an interview with BBC News, Christopher Nolan commented that reactions like these were the “ultimate goal” of the movie. “We hoped that by dramatising science and making it... entertaining for kids we might inspire some of the astronauts of tomorrow,” he said.
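DnGR itself is Double Negative’s proprietary renderer, but the scales it had to cope with can be sanity-checked with textbook formulas. The short sketch below is illustrative only, not DnGR: it assumes a simplified, non-spinning black hole and takes Gargantua’s mass as roughly 100 million solar masses (a figure from popular accounts of the film, used here as an assumption), then derives the radii that dominate what ends up on screen.

# Rough, illustrative calculation only - not Double Negative's DnGR.
# Assumes a non-spinning (Schwarzschild) black hole; a mass of ~1e8
# solar masses for Gargantua is an assumption, not a studio figure.
G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
C = 2.998e8        # speed of light (m/s)
M_SUN = 1.989e30   # solar mass (kg)
AU = 1.496e11      # astronomical unit (m)

def black_hole_scales(mass_solar):
    """Return the radii (in AU) that shape a rendered black hole image."""
    m = mass_solar * M_SUN
    r_horizon = 2 * G * m / C**2             # Schwarzschild radius (event horizon)
    r_photon = 1.5 * r_horizon               # photon sphere, where light can orbit
    r_shadow = (27 ** 0.5) / 2 * r_horizon   # apparent edge of the 'shadow'
    return {"horizon_au": r_horizon / AU,
            "photon_sphere_au": r_photon / AU,
            "shadow_au": r_shadow / AU}

print(black_hole_scales(1e8))  # horizon comes out at roughly 2 AU

At that mass the lens spans several astronomical units, which is why Gargantua reads as a vast structure filling the frame rather than a small optical distortion.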

Interstellar’s recreation of a worm hole and black hole led not only to an Oscar, but a paper in the American Journal of Physics



4 INCEPTION Double Negative (2010) With Inception, director Christopher Nolan did the unthinkable: produced a blockbuster movie with an arthouse aesthetic. While the visuals are on a scale equal to his Batman movies, at its heart, the story is as tricksy as his celebrated low-budget debut Memento. Rather than rely on multiple vendors to furnish the 500-odd digital effects shots (accounting for some 25 per cent of the movie’s running time), Nolan was keen to let Double Negative handle all the CG work, says VFX supervisor Paul Franklin. “Usually you’ll have an independent VFX supervisor who divides the work across several studios, sometimes all over the world. Chris wanted to simplify the relationship. He described it more like a 1970s model, where the VFX department would operate within a film studio.” Whatever the reasoning, it worked. The team beautifully realised multiple-level dream worlds in which the laws of nature cease to apply. One scene in particular pushed audiences’ perceptions of what VFX could achieve. When architect Ariadne starts to “mess with the physics of it all” within her own dreamscape, she casually folds up Paris in front of Leonardo DiCaprio’s unbelieving eyes. To achieve the intricate effect, the Double Negative team spent two weeks taking thousands of stills and working from millimetre-accurate LIDAR scans to replicate a photorealistic model of four Parisian apartment blocks, populating them with digital cars and people. The team had to devise a further series of cheats to fully achieve the shots needed, including hiding intersecting buildings behind other geometry and a set of careful camera moves. The work won Double Negative its first Academy Award for Visual Effects. Four years later, the studio would win again – this time for Interstellar, featured on the page opposite.

Double Negative’s VFX for Inception’s physics-defying dream worlds won the studio its first Oscar



As well as introducing the iconic T-1000, Terminator 2 marked ILM’s first use of a fully digital effects pipeline


KEY MOMENTS IN ILM’S HISTORY
1975 ILM is formed with the express purpose of creating special effects for Star Wars
1981 Works on first non-Lucasfilm project, Dragonslayer
1982 Lucasfilm’s computer graphics division creates the Genesis Effect for Star Trek II
1988 The team win a Sci-Tech Academy Award for the ‘Morf’ software developed for Willow
1989 The Abyss’ CG watery pseudopod wows audiences the world over
1991 Terminator 2’s liquid metal T-1000 cements CG as a bona fide VFX tool
1993 Staff convince Spielberg to let them create dinosaurs in CG, bagging ILM its 11th Oscar
1995 Casper the friendly ghost becomes the world’s first CG lead character
1999 Star Wars: Episode I is the most VFX-heavy film then made, with around 1,950 shots
2006 ILM develops the iMocap system for Pirates of the Caribbean: Dead Man’s Chest and bags another Oscar
2007 With Transformers, ILM does for robots what it did for dinosaurs, 14 years earlier
2011 ILM creates its first fully animated feature film, Rango
2015 ILM celebrates 40 years of creating the impossible

3 TERMINATOR 2: JUDGMENT DAY ILM, Video Image, Pacific Data Images (1991) “I think Terminator 2 was more groundbreaking than Jurassic Park,” says Dennis Muren, visual effects supervisor on – and VFX Oscar winner for – the classic sci-fi movie. “We had to put a lot of things in place for Terminator 2: complex rendering, compositing, and so on. But no one saw it until Jurassic Park.” After finishing work on The Abyss in 1989, Muren took a year off. During that time, he read a 1,200-page book on CG. “I couldn’t figure out how it worked, but I wasn’t afraid of it,” he says. “I could tell Jim [director James Cameron], ‘Yep, we can do this.’ ” The ‘this’ in question was the T-1000, Terminator 2’s iconic liquid metal cyborg. Compositing it with correct reflection maps to anchor it in the frame was difficult. “We had background plates and that environment needed to reflect in the character,” says Muren. “The distortions had to remain consistent without creating big tears in the maps.” Another challenge was animation: when the T-1000 adopted actor Robert Patrick’s form, animators moved it like a person. When it was liquid, animators moved the fluid. “We figured out as much as we could to keep the geometry from tearing,” says Muren. “But we had an early copy of Photoshop, so if something didn’t work [art director] Doug Chiang would go in and paint the frame.” T2’s biggest innovation, however, may have been that for the first time, the effects process was fully digital. “We had a film recorder that actually worked,” Muren says. “We had a scanner that we had worked on with Kodak. So with the scanner, film recorder and our copy of Photoshop, we completed the triangle. It was the first time we could do digital in, digital manipulation and digital out.”



Jurassic Park marked the first use of photorealistic organic creatures in a movie. They still look good today

“The original Jurassic Park is my favorite movie of all time. It holds up beautifully because of the limitations of the technology: you only see glimpses of the dinosaurs so they’re scary and mysterious instead of being constantly in your face”

2 JURASSIC PARK ILM, Tippett Studio (1993) Why is Jurassic Park remembered so fondly for its visual effects? No-one knows the answer to that question better than Dennis Muren at Industrial Light & Magic, who won one of his eight visual effects Oscars for the film. “It was the first time we had been able to put living, breathing synthetic animals in a live-action movie,” says Muren. “No-one had seen anything like it. The reality hadn’t been done before; the naturalism.” Muren credits dinosaur supervisor Phil Tippett and, of course, director Steven Spielberg for pushing the unsafe documentary film style. “We wanted the animals to create the feeling that we wouldn’t know what was going to happen next,” Muren says. Because creating CG animals was so new, Muren set up two systems: stop motion and CG. “The animators hadn’t worked on real animals,” he says. “No-one had.” Even though the CG animals soon proved themselves, Stan Winston’s puppets starred in closeups in most of the film. “When we started, I didn’t think we could do anything closer than a full-length dinosaur in CG. But we pushed closer and closer. Near the end of the film, in the rotunda sequence when the T. rex walks in and the raptor jumps on its back, I was confident enough to try close-ups.” ILM’s 56 CG shots and 6.5 minutes of screen time also included a digi-double for the lawyer (actor Martin Ferrero) as the T. rex snags him out of a bathroom, and a face replacement for a character who falls through the floorboards during a raptor attack. Muren recalls: “George [Lucas] came by occasionally, and one time said, ‘This looks pretty good.’ I said: ‘Yeah, I’m hoping we can do something like 2001, something brand new.’ He said: ‘You guys are doing it and you don’t know it.’ It wasn’t until it was over that I realised he was probably right.”

Grant Miller, creative director, Ingenuity Engine




1 AVATAR Weta Digital, ILM, Framestore, Giant Studios, Hydraulx, BUF, various (2009) Back in 1996, James Cameron announced that he would be creating a science-fiction film called Avatar that would feature photorealistic computer-generated characters. Soon after, it had to be shelved as the technology of the time could not satisfy the creative desires of the director. But by 2009, things had caught up and Cameron, with help from a range of VFX studios, was about to make movie history. Avatar wasn’t just a film, but a whole new, fantastical CG world – and the level of detail was astonishing. “James Cameron and his team spent a lot of time designing the horticulture of the environment,” says Dan Lemmon, VFX supervisor at Weta Digital, which created over 1,800 effects shots for the movie. “There are very detailed and very exotic plants, many modelled by hand. Most of them were executed at fairly high detail. Larger trees had up to 1.2 million polygons.” As any given frame might have hundreds of thousands of plants to be rendered, efficiency was crucial. “We had to use proxy versions that would act as stand-ins in Maya and render procedurally in RenderMan,” Lemmon says. “We put together level-of-detail strategies so that we could have more detail up close, and less geometry and detail as the camera gets further away.” The rendering time was further increased by the fact that Avatar was released in stereoscopic 3D: then still a relatively untested medium. On working with live-action stereo footage, Lemmon notes, “Before, we just worried about mattes, green spills, that sort of thing. Those issues become more complicated once you get into stereo. For example, things will look slightly shinier in one eye than the other.” The mammoth effort that Weta put in to resolving these technical problems paid off: Avatar became the highest-grossing film of all time and changed the industry’s view of stereoscopic 3D overnight.
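Weta’s production tools are proprietary, but the level-of-detail idea Lemmon describes – full-resolution plants near camera, cheaper stand-ins further away – can be sketched in a few lines. Everything below (the asset name, thresholds and polygon counts) is invented for illustration and is not Weta’s pipeline.

# A minimal, hypothetical sketch of distance-based level-of-detail selection,
# in the spirit of the proxy strategy Lemmon describes. All numbers are made up.
from dataclasses import dataclass

@dataclass
class PlantAsset:
    name: str
    lods: dict  # polygon budget keyed by LOD tier

def pick_lod(distance_to_camera: float) -> str:
    """Choose a LOD tier from the distance to camera, in metres."""
    if distance_to_camera < 10.0:
        return "hero"    # full-detail build near camera
    if distance_to_camera < 100.0:
        return "mid"     # reduced mesh, baked detail maps
    return "proxy"       # lightweight stand-in for distant fills

fern = PlantAsset("jungle_fern", {"hero": 1_200_000, "mid": 80_000, "proxy": 500})

for dist in (4.0, 45.0, 600.0):
    tier = pick_lod(dist)
    print(f"{fern.name} at {dist:>5.0f} m -> {tier} ({fern.lods[tier]:,} polys)")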

The destruction of the Home Tree: a technically impressive effect, and a pivotal emotional moment in the movie



TEN THINGS CG CAN NOW DO FLAWLESSLY

James Cameron planned a sci-fi movie with realistic digital characters in 1996. It took CG 20 years to catch up

1 WATER EFFECTS Realistic oceans, giant water splashes, tidal waves – all completely nailed.
2 DE-AGEING Benjamin Button did it first, but Michael Douglas looks amazing in Ant-Man.
3 CG CHARACTERS Gollum, Rocket and Groot, Caesar the ape… we’ve come a long way since Jar-Jar Binks.
4 EPIC VISTAS Look at the scenery in Oblivion, or the cityscapes in Total Recall and Cloud Atlas.
5 GIANT ROBOTS Can the visual effects in the Transformers movies get any more epic?
6 SMOKE AND FIRE Remember that bit where Godzilla appears and the smoke flows around him? Lovely.
7 ANYTHING IN SPACE Gravity and Interstellar set new benchmarks for extra-terrestrial action.
8 CROWD SCENES Whether a (literal) Massive attack, or World War Z’s army of zombies, crowds are a snap.
9 HOLOGRAPHIC USER INTERFACES Since Minority Report, everything has a glowing, see-through display.
10 PHOTOREAL HUMANS Okay, so we still have yet to cross the Uncanny Valley. But nine out of ten ain’t bad, right?

“We didn’t cut anything out [of Avatar] because we couldn’t do it – but one scene took us two years to figure out how to shoot” James Cameron, director




Will VR Change CG art? Explore the potential of virtual reality

3dworld.creativebloq.com October 2015 #199

Matte painting Star Wars Pro training to recreate an Episode VII scene

free!

5gb of resources

• Video walkthroughs • Textures and meshes • Models and setup files

the rise of


How TV’s greatest VFX are created… • Game of Thrones • the flash • constantine • walking dead


FEATURE The rise of TV VFX

AUTHOR PROFILE Tom May Having worked in magazine journalism for 22 years, Tom is currently the content manager for 3D World and our website CreativeBloq. twitter.com/tom_may



VFX With movie talent increasingly flocking to TV land, Tom May examines what effect this seismic shift is having on the VFX industry

When was the last time you went to the cinema? And when was the last time you stayed in and binge-watched a TV drama? If the latter was more recent, you’re not alone… More and more of us are shifting our downtime from the multiplex to the front room, thanks to a combination of larger, better quality TV sets, and the accessibility of content. Where the demand goes, the supply is following: top Hollywood actors, directors and producers are all flocking from the movies to television, and the financiers have followed in their wake, investing big money in blockbuster shows. Will Cohen of visual effects studio Milk, the main vendor on Doctor Who and Jonathan Strange & Mr Norrell, believes there’s a good reason for all this. “The movies find it very difficult in 2015 to tell sophisticated stories,” he argues. “They have to appeal globally in order to make their money back. Ergo, there is a dumbing down of the sophistication level, or the characterisation. “So I think what’s attracted actors, directors and producers to television is the ability to tell the kind of stories movie studios like United Artists were making in the late 1970s. If you want to make stuff like that now, the format is television.” Nigel Hunt of Glowfrog, a London-based studio creating high-end effects for clients including BBC, Channel 4 and HBO, agrees. “The driving factor is probably the attractiveness of TV for high end feature directors wanting a little more control and freedom,” he suggests. “The U.S. cable networks such as Showtime and HBO have traditionally been the powerhouses of high-end TV, attracting film talent. Now, with the storm surge of online networks, Netflix and Amazon have emerged as major content producers attracting even more film talents, and larger production budgets.”




DIGITAL DOUBLES Fully CG characters – like Grodd here – were once unheard of in mainstream TV shows. Now they’re cropping up everywhere…

THE FLASH

So how do 3D artists and visual effects studios get a slice of this action – and should they even want to? In this article, we assess how television is transforming the 3D industry before our very eyes…

Encore VFX’s Armen V. Kevorkian shares insights into bringing DC comic’s fastest man alive to the small screen

Rising expectations

The first, and perhaps biggest change to take place in recent years when it comes to visual effects lies in audience expectations. Essentially, in 2015, viewers want their TV shows to look as good as their movies. “As VFX in film advances, it stands to reason that it will follow in TV,” says Tanvir Hanif, visual effects supervisor and 3D animator at 3sixtymedia, a production company serving ITV, BBC and independent clients throughout the UK. “The audience is so much more savvy and critical now, so in TV we have to work harder to match their expectations. As shows like Game of Thrones, Vikings and Battlestar Galactica have met that challenge, viewers have come to expect the same on other shows, further increasing the pressure to raise standards.” There’s just one drawback: film-like effects need to be achieved without film-like budgets. Sam Nicholson, CEO of Stargate Studios – the VFX facility behind The Walking Dead – puts it bluntly: “The only difference between TV and film these days is time and money.” So more needs to be done by fewer people in less time. Improvements in technology help, but it still leaves a lot of pressure on artists. “Working in television is quite different from working on feature films,” continues Sam. “To survive in TV you must be very good but also very fast. Particularly in series television, you’re generally prepping two shows, shooting one and posting two.” The actual business of creating 3D effects doesn’t really change. As Sam puts it: “The

ew TV show’s exhibit spectacular VFX in quite the same way as WB’s The Flash, the ambitious superhero show based on DC’s classic comic hero. The fact he’s blazing a trail on the small screen first and not in cinemas isn’t an issue for Encore VFX supervisor Armen V. Kevorkian: “Superheroes are prevalent across entertainment mediums. Fans don’t care where the content is playing; they just want it to look good, and we want to make sure we’re doing their favourite characters justice.” Encore VFX was recently nominated for an Emmy Award for Outstanding Special Visual Effects, part of which is for creating and animating the giant talking gorilla, Grodd – a classic villain from the DC comic. A task not usually seen in TV shows with tight deadlines. “Each episode requires prep, shooting and post, so we’re often working on


VIDEO SCOPE The rise of affordable CG technology means the scale of TV VFX has grown dramatically


five or more episodes at any given time,” says Armen. “New villains are introduced weekly so there’s also a high volume of complex shots being done constantly. Grodd was a special one; he’s one of the most well-known nemeses in The Flash universe. Time is always a challenge in TV and, fortunately, we had a pretty substantial heads-up that Grodd would be making an appearance this season so we were able to prepare accordingly.” Encore VFX use 3ds Max for 3D work and Nuke for 2D work, as well as ZBrush. Scans for the digital doubles were done at Light Stage and Gentle Giant Studios. All were essential to create the sense of super speed seen in every episode. “Super speed is iconic to the character and focal to the show, so we spent a lot of time ahead of the pilot researching what would look best. The comic books provided a good base, and from there we played around with movements that were grounded in reality,” says Armen, “then we had to cheat it a little bit to make it look cool. Once we nailed the right balance, there was definitely a sweet ‘eureka moment’.” When conversation turns to the wider growth in VFX in TV, Armen reflects: “As both artists and an audience, we’ve set the bar high in what we’ve come to expect from TV VFX. The work being created for television (and beyond) is incredible and quality shows are held to new standards.” See Encore VFX’s scene breakdown FYI here: www.bit.ly/the-flash-vfx


CHANGING PLACES The man in the green leotard is none other than Jonathan Strange, played by Bertie Carvel

JONATHAN STRANGE & MR NORRELL Bringing statues to life in fantasy series Jonathan Strange & Mr Norrell wasn’t as easy as you’d think, reveals Will Cohen of Milk. If you haven’t seen it yet, Jonathan Strange & Mr Norrell is a difficult show to explain in a single sentence. Part costume drama, part fantasy, part historical epic, it’s about a man during the time of the Napoleonic wars who sets out to bring back pagan magic to England. And when it came to the look and feel of the visual effects, director Toby Haynes was explicit about one thing – this series should not look like a Harry Potter movie. “It’s magic of the ancient earth,” explains Will Cohen of Milk, the lead VFX vendor on the show. “It’s not Expelliarmus and wand glows and stuff like that – it’s grounded in reality. In fact I remember reading some criticism of the opening episodes that there wasn’t much magic in it. But I think that was sort of missing the point, because when you do want to show some magic done, you want to make sure it really stands out.” There’s an example of just that in episode one: after a long episode centred around, it has to be said, a lot of people talking, there’s a thrilling finale in which the statues of York Minster come menacingly alive. It’s a big set piece that was to set the tone for the whole series, so it had to be done right. But it also posed a very specific challenge. “You start moving stone around and it just looks like it’s made of rubber unless you’re very careful,” Will says. “Because as we all


know, stone statues don’t come alive. So if they’re moving around, they no longer look like they’re made of stone. “So we carried out a lot of R&D. And that research phase told us that in order to sell stone, we needed to see cracks and dusts. If they’re struggling to move, it’s making them crack. “After that, Toby and Milk’s VFX supervisor on the show, Jean-Claude Deguara, realised that we could use the cracks to enhance the emotion of what’s going on, or the performance of the stone statue. So the effects turned out to be very multilayered.” Milk went on to deliver 30 visual effect shots for the York Minster sequence, creating a number of animated statues including a line of seven stone kings, musicians and a Latin-speaking bishop (played by lead actor Bertie Carvel). “All in all, I think the team succeeded in making the effects seem part of the world of Jonathan Strange, which was one of the challenges set by the production team at the beginning,” says Will. “One of the things you need to get quality work is time. And everyone had the luxury of time on Jonathan Strange, which is why I think it came out so well. “We were allowed to stop and start, park things, think about things for a few weeks, have a chat over coffee, come back to them. And that was really helpful and beneficial to the end product.” To see more of Milk’s work, visit FYI its site at www.milk-vfx.com 3D WORLD October 2015

43

artists are the same. The software and the computers are the same.” The gap, then, is being met by innovative approaches to organisation, time management and productivity. “So now on our big shows like Heroes Reborn and The Ten Commandments, there are multiple simultaneous first units, multiple directors and split location,” explains Sam. “We have specifically engineered Stargate Studios to excel in this challenging production and post production environment

by networking and synchronising our ten international facilities. Today, amazing visual effects are possible in a fraction of the time and a fraction of the cost of previous years, which truly closes the gap.”

To survive in TV you must be very good but also very fast. You’re prepping two shows, shooting one and posting two Sam Nicholson, CEO and founder, Stargate Studios

It’s a similar story across the industry. “In television, studios have tighter schedules, due to the episodic nature, and budgets that don’t quite stretch to reiterating a shot for the 40th or 50th time,” explains ftrack’s Ben Minall. “These constraints lead to approaches that have to be ingenious, as the production doesn’t have the grunt power or luxury of time behind it.” Project management tool ftrack aims to help stressed-out VFX studios meet these required levels of ingenuity, he says. “ftrack does away with the need for huge Excel spreadsheets that need to constantly be shared and reams of emails cluttering up your inbox. It gives everyone on the team one centralised location



WORLDS APART The sheer scale of Marco Polo’s epic environments posed a huge technical challenge for Pixomondo

where they can see what needs to be done that day, how it needs to be done, and when it needs to be done by.” Other tools are available, such as Shotgun. But whatever productivity software studios use, it’s this kind of streamlined approach to production that’s crucial in the world of TV VFX, as tough schedules mean there’s scant room for mistakes or multiple iterations.
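That centralised task view is exposed through ftrack’s Python API. The snippet below is only a hedged illustration of the idea using the publicly documented ftrack_api client; the project name is a placeholder, and entity or attribute names can vary between studio setups.

# Illustrative only: list a show's in-progress tasks from ftrack.
# Assumes the ftrack_api Python client, with credentials supplied via the
# FTRACK_SERVER / FTRACK_API_USER / FTRACK_API_KEY environment variables.
import ftrack_api

session = ftrack_api.Session()  # picks up server and credentials from the environment

# 'my_show' is a placeholder project name, not a real production.
tasks = session.query(
    'Task where project.name is "my_show" and status.name is "In Progress"'
)

for task in tasks:
    # One shared list of what needs doing, instead of emailed spreadsheets.
    print(task['name'], '-', task['status']['name'])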

Variety and diversity

But if this all sounds like no fun at all, then here’s the good news: there are some definite upsides to working on TV shows too. First of all, there’s a much greater level of variety. For instance, working at Milk, says Will Cohen, “you’ll find yourself animating a snake in Hercules one minute, for a month, for a

handful of shots. And then you’ll be on Doctor Who animating 20 shots in a handful of weeks. I think it’s fun for artists to mix it up and have that diversity. And to be able to move on to another project very quickly, or to spend time on really applying the final detail and polish.”

Since budgets are restricted, the atmosphere becomes more inviting for collaboration and creative solutions Niklas Jacobson, VFX supervisor, ILP

Just as importantly, working on TV shows can also involve much higher levels of creative collaboration between directors and the artists themselves. “The ambitions on a TV show may be very high,” says Niklas Jacobson of Swedish VFX house ILP, which has worked on hit shows Crossbones and Constantine. “But since budgets are still

MARCO POLO

Bringing Marco Polo to the screen was an epic challenge, Pixomondo’s Christian Hermann explains. Pixomondo was founded in 2001 as a design studio based in Pfungstadt, Germany. It broke into visual effects in 2005 and has since worked on dozens of big-budget movies and TV shows, including 2012, Wrath of the Titans, Oblivion and Game of Thrones. 3D World speaks to Christian Hermann…


Did you have to do any exhaustive historical research? In most cases when approaching environment asset builds, Pixomondo artists were following detailed historical reference provided by the production team. A lot of time was spent on perfecting accurate architectural detailing and textural authenticity for the buildings found in our city builds. They not only needed to be accurate and varied in order to show class stratification at the time, but also had to reflect the special character of the city they were a part of, so there was no confusion between views of Xiang-Yang and The City of Cambulac. Pixomondo gave special attention to the build and layout of the Cambulac wall because of its prominence in the shots. For example, we studied the way its stone blocks would taper in size from bottom to top and we researched existing ancient Chinese walls to get the sense of colour and makeup of the bricks. Layouts for both Cambulac (Beijing) and Xiang Yang cities were very specific


since both cities have a long history and pictorial documentation throughout the ages. Great care was taken in conveying accuracy in layout of the Imperial city and its relation to the rest of the city. Through colour palette and building variations Pixomondo made sure each zone was distinct and identifiable. How were the very realistic CG snakes created for the show? We had to create two hero snakes for the first season. One was an Asian Pit Viper and the other one was a Chinese Cobra. We sculpted both snakes in ZBrush with incredible amounts of detail, due to the need for close-up shots with live action characters interacting with the snakes. Early in the project we gathered a lot of reference material of snakes from the Far East and studied them closely. They also had a cobra on set which we used for colour, texture and lighting reference. The modelling, rigging and animation was done in Maya and everything was rendered in V-Ray and comped in Nuke. What was your approach to the environments? Pixomondo’s first steps in environment creation, long before background plate delivery, was a detailed build of the assets for each city, ensuring that models and textures would hold up regardless of camera angles and moves. The layout of the background environment for Cambulac consisted of a


SNAKES ALIVE The CG snakes were sculpted in ZBrush in great detail, with modelling and rigging in Maya

360-degree matte painting as a base. The matte painting displayed the surrounding landscape and identifiable story-driven landmarks to allow the audience to orient themselves in space. This 2.5D setup was created with re-lighting options in mind so it could be adapted for different time-of-day scenarios. The Cambulac city layout involved both manual building placement plus Pixomondo scatter scripts that generated building zones based on street grids and density maps. Once in shot production, the lighting department used a combination of custom lighting rigs and on-set HDRI maps to create an exact lighting match to the background plates. The matte painting department then augmented the rendered passes, providing the next level of integration and detail to the environments, including textural variation, vegetation, clutter and, of course, the background elements depending on the shot location. Detailed matte paintings were divided into foreground, mid-ground and background elements. These were then projected onto low- or high-res geometry in Maya and rendered for compositing, or exported into Nuke for projections depending on the camera moves. The compositing department then finalised the integration between the plates and other elements, and breathed life into shots through the addition of moving elements, such as smoke.
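Pixomondo’s scatter scripts are in-house Maya tools, but the underlying idea described above – drop building instances only where a painted density map allows, and keep them off the street grid – can be shown generically. The sketch below is a stand-in for illustration, not Pixomondo’s code; the map values, grid and threshold are invented.

# Generic density-map scatter, illustrating the idea described above.
# density[y][x] is a 0-1 painted map; streets[y][x] marks cells to keep clear.
import random

def scatter_buildings(density, streets, cell_size=10.0, seed=1):
    """Return (x, y) placements where a random draw falls under the map value."""
    rng = random.Random(seed)
    placements = []
    for gy, row in enumerate(density):
        for gx, d in enumerate(row):
            if streets[gy][gx]:        # never build on the street grid
                continue
            if rng.random() < d:       # denser zones receive more buildings
                # jitter inside the cell so the layout doesn't look machine-made
                x = (gx + rng.random()) * cell_size
                y = (gy + rng.random()) * cell_size
                placements.append((x, y))
    return placements

density = [[0.2, 0.8, 0.8], [0.1, 0.9, 0.6], [0.0, 0.4, 0.3]]
streets = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]  # the middle row is a road
print(len(scatter_buildings(density, streets)), "buildings placed")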

It’s been reported that Marco Polo cost $9 million per episode – was there a bigger budget for this than normal at your end? A lot of money was spent on the practical sets. For Pixomondo the budget was comparable with other TV shows we’ve worked on. What was the biggest technical challenge you faced? Scale and population of the environments proved to be the biggest challenge on this show. As with any production of this scale, the logistics of handling hundreds of assets and terabytes of texture data required the development of custom Maya tools to provide artist-friendly layouts that would produce predictable and compositor-friendly render passes. Models and textures were optimised to reduce render times and special tools were utilised to populate the city layouts with vegetation and low-resolution objects to add visual clutter. We used a crowd simulation plugin for Maya called Miarmy (www.basefount.com) which was employed in the creation of CG extras and armies within the cities, and which enabled quick swapping of walk/ run cycles and accessories to breathe life into wide establishing shots with slow camera moves. Using Nuke’s 3D capabilities the matte painting department was able to set up a 360-degree landscape template, which enabled quick swaps of the sky-dome elements depending on the shot’s needs, as well as ensuring consistency in the layout of the background landscape features across sequences. For more on Pixomondo, visit its FYI site at www.pixomondo.com 3D WORLD October 2015


restricted and everyone is aware that it’s not a feature film budget, the atmosphere becomes more inviting for collaboration and open for creative solutions. That makes you feel more like a part of the production where you all work towards the same goals.” Milk’s Nicolas Hernandez, CG supervisor on the BBC historical fantasy series Jonathan Strange & Mr Norrell, has had similar experiences. “On Jonathan Strange, and on TV generally in fact, you have direct access to the director,” he says. “And usually you need to have a proper partnership with the production company to make it work, because of the challenge of time and money.” His boss, Will Cohen, concurs. “On Jonathan Strange we had great collaboration – a creative partnership with the producer, the director and two editors – and that makes it all very economic. The lines of communication are very small.” If working on TV productions is good for artists, it’s equally good for studios too. For some it’s a good way to fill their downtime between movie projects, while others have chosen to specialise in TV completely. All

PAST GLORIES Pixomondo conducted detailed historical research to ensure Marco Polo’s environments were accurate




HIT AND MYTH Although a fantasy, Game of Thrones’ look and feel is grounded in reality

those we spoke to are expecting the demand for TV VFX to grow in the future. But while it’s the huge explosions and epic set-pieces of shows that get the attention, that’s only the icing on the cake as far as VFX work goes. Most of the work available is in providing ‘invisible effects’ – things that the viewer would never guess were done digitally and not actually real. “Invisible effects are essential to our business model at Stargate,” says Sam Nicholson. “They’re generally overlooked by the viewing audience, but producers and

directors absolutely realise the essential nature of these non-spectacular effects to their shows.” A good example of this is Stargate Studios’ work on Grey's Anatomy. “For the past nine years using our Virtual Backlot process, Grey's Anatomy has not had to travel to Seattle – the supposed location of the show,” Nicholson reveals. “In fact, they’ve been able to stay in Los Angeles and we’ve brought Seattle to Los Angeles for them. This greatly enhances the creative possibilities for the writers of the show while containing the cost for the producers.”

Our work is to support the narrative in an invisible way. If you haven’t spotted what we’re doing, we’ve done our job! Tanvir Hanif, VFX supervisor, 3sixtymedia

Can’t see the join?

It’s a similar scenario for Tanvir Hanif at 3sixtymedia, where invisible effects form a large chunk of day-to-day work. “Our work is often to support the narrative in a more invisible way,” he says. “If you haven't spotted

GAME OF THRONES

Lead vendors discuss the challenges of bringing George RR Martin’s fantasy classic to life. It’s one of the biggest shows in the world right now. So 3D World spoke to Joe Bauer and Steve Kullback, the VFX supervisor and producer of Game of Thrones, as well as vendors Pixomondo and Mackevision, to find out how the visual effects are put together. First they explain the set-up: “We have a two-person in-house concept team,” begins VFX producer Steve Kullback. “We have five people who are working on pre-vis and they’re all in house with us, generating pre-vis all under Joe’s guidance.” When Joe and Steve are on set, they then employ a team made up of a visual effects key grip and data wranglers, who are responsible for making sure that all the on-set requirements are in place and that all the data is gathered. “Once we go into shot production, once the episodes are edited, Joe will sit down with our visual effects editor and work through temps of how the scenes should be. Then those temps, in concert with the pre-visualisations and concepts, will be turned over to our vendors and they will constitute the work at various stages with constant check-ins from us.” With no visual effects facilities in-house (only two compositors who do some light composites, green/blue screen composites and paint fixes), Joe and Steve have a significant team of vendors all over the world to whom they turn over shots. They’re



then in constant touch with them from their headquarters in LA when in post production and shot production. “We’re always on the look-out for talent and it tends to be an as-needed requirement,” says Steve. “We have a team of very hard working folks that we have a shorthand with, but that doesn’t mean we aren’t on the look-out. We introduce new folks as we have new and exotic needs.” One of the regular vendors on the show is Pixomondo, the international VFX company with a network of studios across Germany, China, Canada and the US. Its CMO Christian Hermann explains what’s unique about working on it. “The biggest challenge of Game of Thrones is that it’s a fantasy show without looking like fantasy,” he tells us. “Meaning everything is very much grounded in reality, from castles to landscapes and mystical cities. But especially when it comes to mythological creatures like giants and dragons, it becomes very challenging.” The dragons are an integral part of the show, and Pixomondo tries to push the limits with every new season along with its growth. “This season we had to deal with ten times more detail on the dragons as they’re much bigger now and filling the screen in tight close-ups,” Christian reveals. “For example our texture amount went up from 70 to about 700 image files per dragon.”


GROWTH SPURT The dragons have got bigger and more detailed with each new series

The shots with Dany (Daenerys Targaryen – Emilia Clarke) and her dragon were particularly tricky. “We had a lot of interaction between the two and the dragons appear as a full, second digital actor on screen who must retain the dragon’s animalistic qualities.” “There’s one shot with over 1,000 frames with the dragon full screen in bright daylight. This was very challenging for all of our departments, starting with animation, simulation, lighting and compositing. For direct interactions, Emilia Clarke had to act with a green dummy to give her an eyeline and be able to touch the digital creation.” Game of Thrones is as much about landscapes as creatures of course, and much of that lies in the hands of fellow German outfit Mackevision. VFX Supervisor Jörn Großhans fills us in… “The most elaborate sequence we’ve worked on would be in the second episode when Arya Stark arrives in Braavos and

finds her way to the House of Black and White,” Jörn tells us. “We’ve also worked on the Titan of Braavos shots – we created a similar establishing shot as in season 4, for which we gained widespread praise. We worked on all shots of the exterior of the House of Black and White where Arya has to wait for her entry and I think they worked beautifully.” For season 5 the studio has developed its pipeline with new tools so it can work quicker and hit tight deadlines. Max, Nuke and V-Ray are the 3D tools of choice, while Shotgun handles production data. As is so often in the current TV landscape, it’s a tale of working smarter to meet increasing challenges. “All the shots on the show have become bigger,” Jörn explains. “In earlier seasons we often worked with still cameras but now we have a lot more moving camera shots and the environments have become more complex too.” For more on Mackevision visit its FYI site at www.mackevision.com

what we’re doing then we've done our job!” Tanvir’s role is to go on set to advise and supervise on how to shoot a particular scene where an additional enhancement in post production might be needed. It could be to repair or paint out something that’s not needed, or a larger scale alteration to help realise a key story moment in a programme. “These bigger scenarios are often undertaken with a lot of pre-planning with resources put in place to help realise the effect, but sometimes you have very little to play with and have to create most of the final image yourself.” One of their biggest recent productions, for example, was ITV Studio’s Cilla, following singer Cilla Black‘s rise to fame in the 1960s. “One noteworthy sequence was the first scene of the first episode,” says Tanvir. “This introduces Cilla, who is queuing outside the Cavern Club with supposedly hundreds of people behind her. We used crowd replication effects to seamlessly give the impression of hundreds of extras rather than just the 70 we had on the day. We devised various in-camera effects that we would later use in

ANIMAL MAGIC The interactions between Daenerys and her dragon have been particularly tricky




TAKEOVER BID This breathtaking pull-up shot reveals the scale of the zombie infestation

post production to create the final effect.” This emphasis on not making the CG elements obvious is even the case on a show like Game of Thrones, says Jörn Großhans, VFX supervisor at Mackevision. “Our main goal is always to create invisible effects. The imagination of the audience should be triggered, but everything should be reliable in the respective story world. Visual effects are great as long as they support the story.”

So where are we now?

For some industry veterans, working in television VFX feels like coming full circle. “Go back to 1999, 2000,” says Will Cohen. “I think there’s 40 animated shots in Jurassic Park, the first movie. 90 digital shots in Gladiator… and that was a big number. “Now movies are coming out with 1,500, 2,000, 3,000 shots. Maleficent had 3,000 shots; every frame is a digital effects shot. So these big VFX companies have evolved to have a big machine and a big hierarchy. Decisions are made, and they’re passed down through that hierarchy. They’ve created these large machines capable of the full pipeline solution. And that works for them… but TV is very, very

THE WALKING DEAD

More and more digital effects are being used on hit zombie show The Walking Dead, reveals Sam Nicholson of Stargate Studios. Based on the cult comic by Robert Kirkman, AMC’s show has smashed all cable TV ratings records, and a big part of the show’s popularity is its recreation of the zombie hordes – and their grisly demises – courtesy of Sam Nicholson’s Stargate Studios…


How much pressure is it working on a TV phenomenon like Walking Dead? The Walking Dead has a specific style, character, budget and workflow, which must be maintained year after year. This continuity is essential to keep the show a success. Some things stay the same, like shooting on 16mm film which is a real challenge for high-quality VFX work. We always push to bring new technologies to the table, especially in the area of seamless digital prosthetics, which must complement Greg Nicotero’s incredible prosthetics. We’re also using many more virtual zombies and digital set extensions than in the early years. What’s been the biggest challenge in creating VFX for Walking Dead? The greatest challenge has been to maintain the organic nature of the VFX so they never appear digital or artificial. They must always maintain an organic nature and be totally believable. ‘No digital fingerprints’ is our motto on Walking Dead, right down to the film grain and spattered black zombie blood.

LONG WALK AHEAD This iconic shot became synonymous with AMC show The Walking Dead


Which shot(s) stand out in your mind as the best example of your work? Over the years, the shots which still stick out in my mind are the ones we created on the pilot with Gale Anne Hurd and Frank Darabont. Specifically, ‘torso girl’ and the final pull up shot which reveals the amazing scope of the zombie infestation. Of course the iconic shot of Rick riding down the deserted highway which became the poster for Walking Dead is a classic. These shots were beautifully designed and executed which resulted in the best possible result – they became icons for the show. What kind of pipeline do you use? Much of our software is off-the-shelf but with lots of custom modifications, transcoding and automation. After Effects, Nuke, Maya and Premiere. We’ve developed a proprietary data management system we call our Virtual Operating System (VOS). VOS enables us to seamlessly distribute, process, render and deliver shots throughout our international network of VFX studios. What’s the best thing about working on The Walking Dead? The overall body of work is amazing. Maintaining the quality of the original, inspired production year after year is a real challenge, so we’re always looking for new ways to keep it fresh. For more on Stargate Studios visit FYI www.stargatestudios.net


SKY TV Even TV comedy now has movie-style CG, like this space station scene for Last Man on Earth

LAST MAN ON EARTH Oliver Taylor explains how Ingenuity Studios recreated the ISS for a space comedy. The huge advances in television VFX over the past few years aren’t just a matter for big-budget dramas: even the makers of comedy shows are getting in on the act. The Last Man on Earth is an American post-apocalyptic comedy about a man who believes he is the only survivor of a deadly virus that has wiped out the world’s population. Ingenuity Studios is the main vendor on the show, and one of the most challenging sequences it was asked to create involved a reveal shot, which starts on the surface of the Earth and rises up all the way to the ISS’s altitude in space. As VFX producer Oliver Taylor explains: “We knew right away we couldn’t get away with a matte painting of the Earth, that to some extent we had to do it for real. Because the camera moves through the clouds we needed them to be volumetric. We also needed them to cast correct shadows on the Earth, interrupt reflections on the surface of the Earth's terrain and water, etc. All of this necessitated a detailed build of the Earth, its textures and displacement, and proper interaction with the clouds.” An additional challenge was the removal of the zero-gravity rig from the actor on the ISS. “The two complicating factors were that it was shot on a moving Steadicam and the stunts team used a crane rig, which entered the shot from just behind camera. To remove the crane,


from a shot which drifted around, we had to build geometry for the interior of the ISS to match the set, get an accurate match-move for the camera, and reproject clean-plate textures back onto the geo. It’s a tricky process that requires a lot of tweaking, but it allows production to be very flexible with how they shoot the scene.” Ingenuity employs Nuke for compositing and Hiero for all its I/O. “This is a great integrated pipeline that makes things a little faster and easier,” says Oliver. “We also used Houdini to create the clouds around the Earth. The primary benefit for us was that we could place rough geo in the scene, so that the placement of the clouds made sense from an artistic viewpoint, and use that geo to generate VDB volumes of the clouds. It’s a process we nailed down working on commercials and have used again and again doing cloud work.” Modo made its debut for modelling, lighting, shading, and rendering. “In addition to the render preview, a big benefit for us is that the render licenses are free, so it’s very easy to add Modo to our render farm. In addition Modo really does come with great support.” Finally, Mari was used for texturing the exterior of the ISS. “Mari was very useful in this situation because it saved us a lot of time in model-prep and skipping a lot of the grunt work in laying out UVs.” To see more of Ingenuity visit its FYI site at www.ingenuitystudios.com


different. In fact, it’s a bit like it was ten to fifteen years ago when you were making 90 digital shots in Gladiator.” Having said that, even the lines between television VFX and film VFX are now blurring. Will gives an example: “During the making of the Battle of Waterloo shots for Jonathan Strange, we asked the producer permission to show Lionsgate, who were talking to us about

some crowd work in Insurgent. And Lionsgate didn’t realise it was TV not a film – they asked ‘What movie was that from?’. You won’t be surprised to hear they then hired us to do the work on Insurgent.”

Now movies are coming out with 1,500, 2,000, 3,000 shots. Maleficent had 3,000; every frame is a digital effects shot Will Cohen, CEO & executive producer, Milk

To the ordinary consumer, that fine line is even more invisible. “We talk about film, we talk about TV,” adds Will, “but for most people nowadays, it’s just visual media. Whether it’s

REAL DEAL Ingenuity Studios rejected a matte painted Earth in favour of the real thing




WET AND WILD Shoot-outs with cannons involved a lot of complex fluid simulation work

CROSSBONES

Swedish studio ILP explains how it created some astonishing naval battles for hit pirate adventure Crossbones. Important Looking Pirates (ILP) is a visual effects and digital animation studio located in central Stockholm. And with a name like that it’s not surprising that it got to work on a show like Crossbones, the hit US adventure series about the life of pirate Edward ‘Blackbeard’ Teach. “Our main area was creating digital ships and water,” explains CEO/VFX supervisor Niklas Jacobson. “There were two real ships on set during principal photography, but the show required a multitude of variations of ships as well as scenes of fleets of ships. There were also shots of a stormy ocean, burning ships and other complex scenes that required extensive visual effects in order to be realised.” One of the biggest challenges was to efficiently tackle the workflow of handling complex scenes containing high numbers of different ships, full of digital crews, cloth-simulated sails, banners, and water

I

on your mobile phone, in your living room or at the cinema, people don’t really differentiate over the quality. People don’t think: ‘I’ll forgive that, because it’s only a TV show from the BBC.’ It’s got to stand up to be successful. “So as the world gets quicker, faster, in terms of data, in terms of size, you’re working on the same camera, on the same resolution – we’re recording shots on Doctor Who in 6K sometimes – the techniques you learn on both TV and movies you try and apply where possible.” And here’s something else that’s changing: just as the last few years have seen TV become more like film, movies are now becoming more like television. “If you look at Avengers, it’s just big TV in content terms,” argues Will. “It’s basically a giant TV series; you just have to wait a year in between episodes. And for movies, just like TV, schedules are shortening. Since the last big financial crisis the challenge of the entire world – whether it’s visual effects or any business – has been to deliver more for less. That’s what everyone wants in every business in every industry, and the creative industries are no different.” Consequently, in the future even VFX studios specialising in TV will need to up

SHIP HAPPENS The show required the creation of multiple CG vessels


simulations. “The pipeline and workflow for keeping track of all these different assets – and how to quickly assemble shots and cost-effectively turn over lit and rendered versions – required some creative thinking, planning and pipeline work,” recalls Niklas. A tricky shot he’s particularly proud of is the scene where The Reaver (pirate ship) sails alongside The Petrel (British ship) and they have a shoot-out with cannons. “We made a full CG shot with boats filled with a digital crew that needed to cut seamlessly between two live-action shots. The challenge was capturing not just the look of the scene but also the acting of the crew on board without pulling the audience out of the story. “This is not an ‘effects shot’ but rather a great example of virtual cinematography,” he stresses. “It could very well have been shot on set, but it was during editing that production discovered that a shot like this would really tie the sequence nicely together.” Other challenging shots were those with heavy simulation work like water, fire and smoke. “They are tricky in a more technical way, with complex simulation setups and plenty of elements to tie together nicely. Those shots are more resource-demanding in terms of computing power and time. They also add another layer of complexity in artistry to get it looking right.” To see more of ILP’s work visit ilpvfx.com


A BUG’S LIFE ILP needed to create swarms of bugs in a short period

CONSTANTINE

ILP explains how it created a swarm of bugs for hit horror show Constantine

Constantine is a US television series based on the character John Constantine, star of the DC Comics’ horror series Hellblazer. As you’d expect, the television adaptation of such a fantastical story brought a wide spread of challenges for VFX studio Important Looking Pirates. “Every episode brought something new and interesting to the table,” says CEO/VFX supervisor Niklas Jacobson. “Over the course of the ten episodes, we worked on magic, various creatures, digi-doubles, angel wings, effects work, simulation and destruction. Very few things carried over from one episode to another. Compared to other TV shows, Constantine had more similarities with how we are used to working on high-end commercials. “We had a great relationship with overall visual effects supervisor and co-producer Kevin Blank. Kevin was our point of contact and it was his job to realise the vision of the production and make the most of the post-production in order to tell the story. The briefing varied from specific to very open for creative interpretation. “Kevin has a fantastic ability to give great back-story and purpose to a certain effect or scene from a story perspective. He often deliberately avoids talking specifics in order to leave room for us to explore creativity and to see what we come up with. He is very open to talk specifics as well, but I loved the creative freedom he trusted us with. I see him as much as a director as a VFX supervisor. He made us feel like an important part of the production. “One particular challenge that comes to mind was for the episode A Feast of Friends. Our mission was to create swarms of bugs for approximately 40 shots in a very short time period. We needed a workflow that was swift and flexible and allowed great creative control, yet did not require our FX/simulation artists to lay out the shots. Hence we created a custom Maya particle rig that allowed our Lighting TDs to easily control shape, path, speed and noise of the particles. We would playblast little coloured spheres for Kevin, which he would give feedback on. Once the animation was blocked and approved we switched the particles to V-Ray proxies containing high-res animation cycles.” To see more of ILP’s work visit ilpvfx.com

INFEST WISELY ILP created a custom Maya particle rig to control shape and speed


their game. Despite 15 years in the business, Sam Nicholson recognises that applies to him as much as anyone else. “To stay competitive in the ever-more-crowded field of global VFX, we’ve continually re-invented Stargate Studios, opened new markets and challenged our previous assumptions of what’s possible,” he stresses. “We bring the latest technologies, creative innovation, cost savings and real-world problem-solving to all our projects, large or small. Visual effects applied in this way should save a production money, not cost more.” Some people working in the industry won’t want to hear any of this, of course. “There’s still an enormous amount of snobbery from people who only work in film to the idea of TV,” says Milk’s Will Cohen. “I think it’s a generational thing. But look around you; look at shows like Game of Thrones, what they’ve done in terms of production values. It’s just about being economic with your storytelling. Realising how to make your budget and what’s being asked of you work is the future of our industry. And you shouldn’t resist it, you should be excited about it!” For more VFX case studies and interviews visit www.creativebloq.com




FREE! 10GB OF RESOURCES: VIDEOS, MODELS, BRUSHES, SUBSTANCES! WHICH GRAPHICS CARD IS FOR YOU? Don’t upgrade until you read our group test!

3dworld.creativebloq.com September 2015 #198

GET THIS MODEL FREE!

25 CINEMA 4D TECHNIQUES
Pro tips to become a better CG artist!

HOW TO CREATE AMAZING

ZBRUSH ROBOTS Master new hard surface modelling skills!

WIN! KEYSHOT WORTH $995


DEVELOP Group test

GROUP TEST

What’s the best graphics card for 3D content creation? We pit the six most popular professional GPUs – three each from AMD and Nvidia – against each other


AUTHOR PROFILE James Morris James has been writing about technology for two decades, focusing on content creation hardware and software. He was the editor of PC Pro magazine for five years. webmediology.com

The professional graphics card market used to be awash with alternative brands. But for over a decade the choice has been exclusively between AMD and Nvidia. Even more recently, the latter has grabbed the lion’s share of the sales, shipping 79.4 per cent of units in Q1 2015 according to Jon Peddie Research. But AMD’s professional graphics are still very competitive, and it’s far from a foregone conclusion that a Quadro will always be the way to go. This issue, we pit three of the most popular Nvidia Quadros against their AMD FirePro alternatives, to find out which is the best, and which is the best value. Whilst they are ostensibly based on the same overall GPU

designs as consumer graphics cards, professional cards have a number of significant differences that make them worth the (considerable) extra money. One of the key features is optimisation for professional applications. The drivers will be coded for stable performance with the key software used in 3D content creation, rather than the fastest frame rates possible. The cards are also usually guaranteed for three years instead of two or one, and this can even be extended to five years in some cases. Another point worth noting is that whilst consumer-grade graphics cards come from a number of brands, Nvidia Quadros are all sold under the PNY brand in the UK, while AMD FirePros

are all Sapphire-branded. So there is only one card model per professional GPU to choose from. Although we have three products each from AMD and Nvidia on test, they don’t precisely parallel each other in price or ability, either. The Nvidia Quadro K5200 is around twice the price of the AMD FirePro W8100, while the Quadro K4200 sits somewhere between the W8100 and W7100 FirePros.

Pushing performance

However, both the AMD cards sport 8GB GDDR5 memory where the K4200 only offers 4GB. So where frame buffer is required, they have a distinct advantage. As our tests show, though, performance depends considerably on optimisation, and

When specifying a GPU for pro 3D usage, the choice between AMD and Nvidia’s products will partly depend on the software you usually run

EXPERT TIP

GPU rendering? The idea of harnessing the huge power of the GPU for final 3D rendering has been hovering in the wings for some years now, but it has yet to hit the mainstream. All of this issue’s cards will only accelerate modelling unless you enlist a specialist GPU renderer, so a multi-core CPU remains the better spend when it comes to final rendered output.

the best card for your workflow will be determined by whether the software you run works best with Nvidia or AMD drivers. All of this issue’s tests were performed on a RENDA PW-E7F workstation from Overclockers. This system is based around an Intel Core i7-5960X processor permanently clocked to 4.2GHz and backed by 32GB of 2,133MHz DDR4 SDRAM. Since 3D modelling benefits from clock speed rather than cores, this is an ideal test platform. To compare the cards, we ran the industry-standard SPECviewperf 12.02 and the OpenGL test from Maxon Cinebench R15. Read our review of the Overclockers RENDA PW-E7F in issue 199.



AMD FIREPRO W5100

AMD FIREPRO W7100

DEVELOPER/MANUFACTURER AMD WEBSITE www.amd.com COST £222.51 ex VAT / $340*

DEVELOPER/MANUFACTURER AMD WEBSITE www.amd.com COST £479.71 ex VAT / $733*

The AMD FirePro W5100 is the cheapest card on test this issue. But it’s still a capable piece of hardware. It sports 768 Stream Processors and a healthy 4GB of GDDR5 memory. So whilst this won’t be the fastest option, it should have no trouble running any mainstream application.

For around twice the price of the FirePro W5100, the W7100 is essentially twice the card. It sports double the GDDR5 memory and more than double the Stream Processors, with 8GB and 1,792 respectively. So it’s a potent piece of hardware for the money.

Usability and design

However, the memory uses a 128-bit interface, so the bandwidth available is just 96GB/sec. Another beneficiary of the healthy frame buffer complement will be the four DisplayPorts, each of which supports a 4K monitor when used with a DisplayPort 1.2a connection. This card doesn’t require any extra PSU power connection, either, as it draws less than 75W.
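As a rough back-of-the-envelope check (our own arithmetic, not an AMD specification), peak memory bandwidth is simply the bus width in bytes multiplied by the effective memory data rate, so the quoted 96GB/sec on a 128-bit bus implies a 6GT/s effective GDDR5 rate:

def peak_bandwidth_gbs(bus_width_bits, effective_rate_gtps):
    # Theoretical peak bandwidth in GB/s: bytes per transfer x transfers per second.
    return (bus_width_bits / 8) * effective_rate_gtps

print(peak_bandwidth_gbs(128, 6.0))  # 96.0 GB/s, matching the W5100 figure quoted above
print(peak_bandwidth_gbs(256, 5.0))  # 160.0 GB/s, the same sum behind the W7100's figure

The same formula explains every bandwidth figure in this group test once you know each card's bus width and memory clock.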

Performance

As the cheapest card on test, the FirePro W5100 is unsurprisingly the slowest, but it still posts some respectable results. The Cinebench R15 OpenGL result of 102.06 is way behind the rest on test here, but 34.13 in maya-04 in SPECviewperf 12.02 is only a little less than the more expensive Quadro K2200, whilst the showcase-01 result of 25.73 and snx-02 score of 45.73 are ahead of it.

MAIN FEATURES

Usability and design

4GB GDDR5 memory

The memory bandwidth is also almost twice that of the W5100, thanks to the 256-bit interface. So the GDDR5 runs at 160GB/sec. This card will be even more capable of utilising its four DisplayPort 1.2a connections with a quartet of 4K monitors for a massive 3D-accelerated video wall. However, the W7100 also draws around 150W, which is also twice that of the W5100, so a separate power connector is required.

128-bit memory interface
Up to 96GB/s memory bandwidth

Performance

Direct graphics memory access support

The W7100 is a much more accomplished performer than the entry-level W5100, with 161.88 in the OpenGL portion of Maxon Cinebench R15, although this is behind the Quadro K4200. But it fares well against the latter in SPECviewperf 12.02, with comparable scores in maya-04 of 55.96, in snx-02 of 62.24, and creo-01 of 51.57. Its 24.77 in medical-01 and 44.73 in showcase-01 are ahead, showing the benefits of the large frame buffer. But the results in sw-03 of 86.48 and 56.6 in catia-04 are notably behind the Quadro.

Four standard DisplayPort outputs

Conclusion

Overall, the AMD FirePro W5100 acquits itself well for its £222/$340* price tag. Considering that it is two thirds the cost of the Nvidia Quadro K2200, it’s definitely better value, and a good choice if you are on a very tight budget or need only light modelling abilities. The 4GB frame buffer means few applications will be impossible to use, although they might be a little slow.

Conclusion

The W7100 provides a mixed bag of performance results compared to Nvidia’s Quadro K4200, beating the latter in some areas, equalling it in others, and falling behind elsewhere. However, at over £150 cheaper, AMD wins out on value. This is a capable all-round professional 3D accelerator for the money. (*Currency conversion)


MAIN FEATURES

8GB GDDR5 memory
256-bit memory interface
Up to 160GB/s memory bandwidth
Direct graphics memory access support
Support for DisplayPort 1.2a and Adaptive-Sync


DEVELOP Group test

AMD FIREPRO W8100

NVIDIA QUADRO K2200

DEVELOPER/MANUFACTURER AMD WEBSITE www.amd.com COST £707.18 ex VAT / $1,080*

DEVELOPER/MANUFACTURER Nvidia WEBSITE www.nvidia.com COST £324.95 ex VAT / $497*

The W8100 is the second most powerful card in AMD’s FirePro range, yet it’s not even 50 per cent more costly than the W7100. It has the same 8GB GDDR5 frame buffer, but a larger complement of 2,560 Stream Processors, so it should offer significantly greater performance.

Although the Quadro K2200 is our entry-level option from Nvidia in this test, it’s actually the only one based on the most recent Maxwell architecture, rather than Kepler. The main benefit of this is a faster 1GHz core clock than Nvidia’s other offerings. It also sports a capable 4GB of GDDR5 frame buffer, although otherwise this is a budget card.

Usability and design

Although the quantity of memory is the same, the W8100 uses an even wider 512-bit interface than the W7100. So the bandwidth available is doubled to 320GB/sec, the fastest of any card on test. As with the other FirePros in this test, there are four DisplayPort 1.2 connections, each capable of driving 4K monitors. However, this is the most power-hungry card in this group, requiring up to 220W and two power inputs.

Performance

Despite extra GPU grunt over the W7100, the W8100 still can’t beat the Quadro K4200 in Maxon Cinebench R15, scoring 183.21 in the OpenGL test. Its SPECviewperf 12.02 sw-03 result of 87.7 is also notably behind. However, in most other areas, the W8100 equals or beats the K4200, with some scores ahead of the K5200. The results of 3.7 in energy-01 and 55.83 in showcase-01 are the fastest on test, whilst 70.84 in maya-04 is a whisker away from the K5200, as are 27.28 in medical-01 and 79.12 in snx-02.

MAIN FEATURES

8 GB GDDR5 memory

Usability and design

512-bit memory interface

The K2200 sports 640 CUDA cores, less than half that of the K4200, and its 128-bit memory path means it offers an even lower 80GB/sec bandwidth than the cheaper AMD FirePro W5100, although the 68W power consumption is a little lower too. However, only three connections are available – two DisplayPorts and a single DVI – with only the DisplayPorts supporting 4K monitors.

320GB/s of memory bandwidth
Direct graphics memory access support

Performance

The Maxon Cinebench R15 OpenGL result of 119.48 is notably ahead of AMD’s FirePro W5100, and the K2200’s score of 86.37 in SPECviewperf 12.02’s sw-03 viewset is on par with the more expensive FirePro W7100. However, elsewhere results are variable. The maya-04 score of 37.82 isn’t far ahead of the W5100, nor is 38.87 in creo-01, whilst 22.08 in showcase-01 and 33.41 in snx-02 are actually behind.

Four DisplayPort 1.2 outputs

Conclusion

Conclusion

At less than £100 more than the Nvidia Quadro K4200, the AMD FirePro W8100 provides a lot of professional 3D acceleration for the money. If you’re running Maya, for example, this is a much more cost-effective option than the Nvidia Quadro K5200.

Costing around 50 per cent more than the AMD FirePro W5100, the Nvidia Quadro K2200 doesn’t beat it in enough areas to justify the extra money. Whilst this is still good value for a 4GB professional 3D accelerator, the AMD FirePro W5100 offers only a little bit less performance for a lot less money.


MAIN FEATURES

640 CUDA cores
4GB GDDR5 memory
128-bit memory interface
80GB/s of memory bandwidth
DisplayPort 1.2 outputs


NVIDIA QUADRO K4200

NVIDIA QUADRO K5200

DEVELOPER/MANUFACTURER Nvidia WEBSITE www.nvidia.com COST £634.42 ex VAT / $969*

DEVELOPER/MANUFACTURER Nvidia WEBSITE www.nvidia.com COST £1,444.35 ex VAT / $2,207*

The Nvidia Quadro K4200 and its predecessors have been some of the most popular pro 3D accelerators in the workstations we have reviewed, and it’s not hard to see why. The K4200 is still in the affordable price bracket, yet it packs decidedly high-end performance.

The Nvidia Quadro K5200 is by far the most expensive card in this issue’s test, costing more than twice as much as AMD’s top-end offering the AMD FirePro W8100, although both companies also offer rangetopping models costing around the £3,000 mark.

Usability and design

The K4200 provides 1,344 CUDA cores. Although the frame buffer is the same 4GB of GDDR5 as the K2200, the bus is 256-bit so bandwidth is more than double at 172.8GB/sec. It still offers 4K-supporting DisplayPorts and a single DVI connection that won’t drive a resolution this high. Despite these specs, this isn’t a particularly power-hungry card, requiring just 105W.

Performance

With more than twice the CUDA cores, the K4200 is notably quicker than the K2200 across the board. The OpenGL result in Maxon Cinebench R15 of 200.85 is faster than anything AMD has to offer, as is the SPECviewperf 12.02 score of 103.63 in sw-03. The K4200 nudges ahead of the AMD cards in catia-04 with 67.69 and in creo-01 with 52.78. However, its maya-04 score of 56.87 is behind the FirePro W8100, as is 63.54 in snx-02, whilst 21.14 in medical-01 and 37.17 in showcase-01 are behind the FirePro W7100.

MAIN FEATURES

1,344 CUDA cores
4GB GDDR5 memory
256-bit memory interface
172.8GB/s of memory bandwidth

Usability and design

The K5200 is a significant improvement over the K4200, as you would expect for the price. It sports 2,304 CUDA cores and twice the frame buffer at 8GB of GDDR5. However, the bus is still 256-bit, so the bandwidth is only slightly improved to 192GB/sec – notably behind the much cheaper AMD FirePro W8100. But this is still only a 150W card, and there are two DVI connections alongside two DisplayPorts, although only the latter can drive up to 4K monitors.

Performance

DisplayPort 1.2 outputs

Thankfully, you do get what you pay for in performance with the K5200 – mostly. The Maxon Cinebench R15 OpenGL result of 217.66 is the fastest we have seen, as are the SPECviewperf 12.02 scores of 94.81 in catia-04, 71.69 in creo-01, 3.63 in energy-01, 72.61 in maya-04, 29.54 in medical-01, 82.05 in snx-02, and 128.45 in sw-03. But the W8100 still manages to run the K5200 closely in some of these, and beats its score of 48.9 in showcase-01.

Conclusion

Conclusion

The Nvidia Quadro K4200 doesn’t beat the AMD competition across the board, but it is consistently capable with enough different applications to make it the best high-end all rounder. If you’re running Maya or need an 8GB frame buffer, the FirePro W8100 would be a better choice. But otherwise, the K4200 provides the most consistent performance in its price range.

The Nvidia Quadro K5200 is the fastest professional graphics card you can buy unless you spend around £3,000. If you do want the best performance and are willing to pay for it, the K5200 certainly delivers. However, the AMD FirePro W8100 comes close in a number of areas, making it much better value overall.


MAIN FEATURES

2,304 CUDA cores
8GB GDDR5 memory
256-bit memory interface
192GB/s of memory bandwidth
DisplayPort 1.2 outputs



FREE! 12 HOURS OF VIDEO TUTORIALS! LEARN ZBRUSH, UNITY & MORE

25 UNREAL 4 TECHNIQUES
Make this pro game environment
FREE Videos • Models • Textures
8GB OF PROJECT FILES
3dworld.creativebloq.com August 2015 #197
PRO ADVICE TO BECOME A GAMES ARTIST
GAMES ANIMATION Create flawless creature animation cycles now!
CHARACTER DETAILING Render realistic cloth!
REAL-TIME RENDERING Start using Physically Based Rendering for your models!

ASSASSIN'S CREED Insider insights to video game world building

FREE! ICLONE 5 Worth £69

MAKE THIS IMAGE

ZBRUSH MECHA Concept and model a warrior android for video games


Alex Dracott shares tips and tricks for lighting, texturing and rendering in Unreal Engine 4

Since public release early last year, Unreal 4 has set and raised the bar for third-party engines. As of March it’s now free to use and there’s no better time to get creating on your own. Built from the ground up, Epic’s newest engine is capable of producing truly incredible visuals. Its deferred rendering, custom materials and advanced lighting techniques are perfect for pushing the engine – and your art – to the next level. I’ve worked professionally in Unreal 4 since its public release and have discovered some fantastic techniques for creating and presenting high-quality art in-engine. Here I share some of the personal tips and tricks I use on a day-to-day basis to help you light, texture, and render your own beautiful scenes within Unreal 4. In this issue’s online Vault you can find video tutorials as well as models and textures to get started in Unreal 4 today! For all the video and digital assets you need visit www.creativebloq.com/vault/3dw197

AUTHOR PROFILE Alex Dracott Alex is a lighting, effects and environment artist working in the gaming industry. He has been working in the field for the past four years. digitaldracott.com



1

IMPORTING TEXTURES INTO UNREAL 4

3

PHYSICALLY-BASED RENDERING

You can import textures via the Import button in the Content Browser. Unreal 4 supports a large variety of texture formats, from .tgas and .pngs to .psds and .jpgs. One important tip is to make sure normal maps are compressed as TC Normalmap to prevent visual errors in-engine. Also be aware that if your texture dimensions are not powers of two, they won’t stream or have mipmaps.
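If you want to catch offending textures before they ever reach the Import button, a dimension is a power of two exactly when its binary form has a single set bit. A quick generic check (our own snippet, nothing Unreal-specific) looks like this:

def is_power_of_two(n):
    # True for 1, 2, 4, 8, 16... - numbers with a single set bit.
    return n > 0 and (n & (n - 1)) == 0

def will_stream_and_mip(width, height):
    # Unreal only streams and mipmaps textures whose sides are powers of two.
    return is_power_of_two(width) and is_power_of_two(height)

print(will_stream_and_mip(2048, 2048))  # True
print(will_stream_and_mip(1920, 1080))  # False - imports fine, but no streaming or mipmaps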

2

SAVE MEMORY: CHANNEL-PACK TEXTURES

One of the fantastic things about Unreal is the large amount of control you get by creating your own materials. When you’re creating multiple black and white masks for textures like roughness or transmission, you can save memory by packing each mask into an individual channel of a single texture image and then accessing each channel of that texture separately in your material.
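The packing itself can be done in Photoshop's Channels panel, or scripted. As a rough illustration, the sketch below uses the Pillow imaging library and made-up file names to drop three greyscale masks into the red, green and blue channels of one texture:

from PIL import Image

# Hypothetical greyscale masks exported from your texturing package (all the same size).
roughness = Image.open("rock_roughness.png").convert("L")
metalness = Image.open("rock_metalness.png").convert("L")
occlusion = Image.open("rock_ao.png").convert("L")

# One RGB texture, one mask per channel - read each channel back separately in the material.
packed = Image.merge("RGB", (roughness, metalness, occlusion))
packed.save("rock_packed_rmo.png")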

With the dawn of new rendering capabilities in engines like Unreal 4 has come the widely praised adoption of physically-based rendering. This should definitely be worked with rather than against. Learning how to accurately represent the physical properties of materials with roughness and metalness masks can seem like a change from the way game engines worked last generation, but it helps keep materials consistent and believable across multiple lighting environments.

FOLLOW THE VIDEO www.creativebloq.com/vault/3dw197



UNREAL INSIGHT

Epic’s content examples Looking to learn more? Check out the great content examples from Epic in the Learn tab on the Unreal launcher. There are also some great free community examples on the marketplace itself.

4

TEXTURE REUSE

Another fantastic element of Unreal 4’s Material Editor is that it allows for very intelligent texture reuse. This can not only save you memory, but also time. Sometimes a red channel from a rock albedo texture can make a great overlay for a roughness texture. A tiling cloud render texture from Photoshop could be useful for adding variation to a brick, but also to blend in a detail texture for some concrete. The possibilities are endless.

5

DON’T MAKE UNNECESSARY TEXTURES

Occasionally certain textures are not needed and can be left out to save memory. For 100 per cent non-metal materials like wood or dirt, a metalness texture can be substituted in the Material Editor for a simple float constant with a value of 0. The same idea can be applied to multiple versions of the same material. Three slightly different coloured bricks don’t all need different normal maps, but could share one.

6

GENERATING MATERIALS Large collections of materials can be made by instancing a smaller set of base materials

Vertex material blending

USING MESH VERTEX COLOUR FOR BLENDED MATERIALS

BUILDING A CORE MATERIAL SET

One way to save considerable time and work is by creating a basic set of materials that can be instanced out for different objects. When I start projects I create a base material for each type of object I’ll need. For example, if I was making a nature scene I would want base materials for terrain, props and foliage. There will always be outliers but it helps with the bulk of the process.

ONE ADD IN TEXTURE SAMPLES

TWO BLEND TEXTURE PAIRS TOGETHER

For this example we want to blend two different texture sets. Once I have created my blank material I import the texture sets as samples by dragging them from the Content Browser into my material. I also add in a Vertex Color node.

Use HeightLerp functions to blend the first texture set. HeightLerps blend two textures based on an input and a heightmap. Connect the Vertex Color red channel to the Transition Phase. Lerp nodes can be substituted for extra sets by connecting the alpha of the HeightLerp to the Lerp’s alpha. (A rough code sketch of this kind of height-biased blend follows step four.)



Online resources:

EMBRACING THE COMMUNITY

MATERIAL INSTANCES These can be used to create a wide variation all from one base material

The online Unreal 4 community is active and getting bigger and better by the day. Online sites like Epic’s Unreal forums and Polycount have incredibly large amounts of information and helpful members looking to share techniques. On top of the official documentation for UE4, large collections of tutorials can also be found on YouTube. Epic has also set up the incredibly useful UE4 AnswerHub site for those with specific technical problems to seek help from others as well as those looking to offer assistance. (https://answers.unrealengine.com)

7

ITERATING THROUGH MATERIAL INSTANCES

8

MATERIAL COMMENTS AND ORGANISATION

9

MATERIAL FUNCTIONS

A great feature of a Material Instance of a base material is its ability to parameterise values that can be changed in real time. You can use these changes to rapidly test out many different values without having to recompile a material. Whenever I have a complicated material I always have a test material instance on the side. I use the test material instance to lock in more realistic base values for the final material.

For very complex materials Unreal 4 brings some very welcome organisational tools to help. Selecting a group of nodes and pressing [C] puts those nodes into a comment, which can then be moved as a group and colour coded. The comments (and individual nodes) can have basic text explanations added to improve readability.

CREATING GREENERY The forest was made mostly from five base materials, instanced out for foliage and key environment elements

THREE HOOKING EVERYTHING UP

FOUR TESTING OUT THE MATERIAL

For each pair of textures (diffuse, normal map) connect one texture to input A of a Lerp node and the other to input B. Ensure you’re consistent, so input A for each Lerp is always the same texture set. The output from each Lerp can be connected to the appropriate input for the base material.

Click the Apply button and drag the material from the Content Browser to a tessellated mesh. Switch to the Paint mode in the editor by moving to the Paint Brush tab in the Mode window. After flooding the mesh black, paint red on the mesh and see the material change appropriately.
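If it helps to see the blend as maths rather than nodes, here is a rough NumPy sketch of a height-biased lerp driven by a painted vertex mask. It is only an approximation of the idea, not Epic's exact HeightLerp network, and the contrast value is a made-up parameter:

import numpy as np

def height_biased_lerp(tex_a, tex_b, height_b, vertex_mask, contrast=4.0):
    # tex_a, tex_b: (H, W, 3) float arrays in 0-1; height_b, vertex_mask: (H, W) floats in 0-1.
    # The heightmap biases the transition so the raised parts of tex_b appear first
    # as the painted mask value increases.
    t = np.clip((vertex_mask - (1.0 - height_b)) * contrast + 0.5, 0.0, 1.0)
    return tex_a * (1.0 - t[..., None]) + tex_b * t[..., None]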


Material functions can be thought of in the same way real functions work in code – repeatable operations that can be called multiple times to perform a specific set of instructions. They are made outside of a material in the Content Browser, but can then be called on their own to help simplify materials. They can contain their own set of inputs and can be a fantastic way to save time when repeatable operations need to be called.



Natural lighting

SETTING UP BASIC OUTDOOR LIGHTING CONDITIONS

UNREAL INSIGHT

Saving camera positions Unreal 4’s viewport camera positions can be saved by pressing [Ctrl] and any of the number keys. You can then recall that position by pressing the number again.

ONE REFERENCE While it often seems like common sense, I always try to find references for my work. Lighting is no different. It could be times of day or just colours and clouds, but it always helps. I personally love the program PureRef for gathering and viewing large image collections.

TWO ADDING LIGHTS AND SKY For outdoor scenes I always start with a Directional light and a Sky light. I also drop the Sky light to a .5 intensity to start. Unreal 4 offers the option of starting scenes with the BP_Sky_Sphere blueprint which can be a great start/placeholder for a skysphere.

THREE ENVIRONMENT FOG The last step I take is usually adding fog, as almost all times of day have some kind of fog present. I almost always start with the Exponential Height Fog actor and tweak it over time to match the time of day I am working on. If the scene includes long view distances I will include the Atmospheric Fog actor. This is where I usually add a post-process volume as well.

10

FOLIAGE MATERIALS

Foliage can be one of the trickiest things to ensure looks correct in any game engine. As of UE4 version 4.7, a Foliage Shading Model exists to help make that task easier. It is highly recommended as it supports sub-surface transmission, which most leaves benefit from. I also recommend adding sky light to your scene to help balance out some of the darker areas of a foliage mesh that could be in shadow.



14 LET THERE BE LIGHT Lighting is absolutely key in setting mood and tone for your artwork and should never be undervalued

11

VERTEX COLOURS

Having access to vertex colours in materials is one of my favourite features in Unreal 4. They can be incredibly powerful when used creatively. From ambient occlusion to masking out wind and world offset for foliage, their versatility is incredible. They’re particularly useful in blending tiling textures together. Vertex colours can be imported from outside 3D software or imported and painted in editor.

12

DETAIL DIFFUSE AND NORMAL OVERLAYS

13

TEXTURE BLENDING IN MATERIALS

14

KNOWING YOUR LIGHT TYPES

Because you can customise texture UV tiling rates you can increase the details of a material by blending in a secondary set of textures, usually diffuse or normal maps, then tiling them at a higher frequency on top of base textures. Diffuse detail can be applied with various techniques, such as the Overlay Blend Function, while detailed normal maps can be applied by adding the red and green channels to the base as normal.
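Expressed as image maths, the detail-overlay trick described above amounts to offsetting the base normal map's red and green channels by the signed detail values. A rough NumPy sketch, with a made-up strength parameter (an illustration of the idea, not Unreal's exact blend):

import numpy as np

def overlay_detail_normal(base, detail, strength=1.0):
    # base, detail: (H, W, 3) tangent-space normal maps stored in 0-1 texture space.
    out = base.copy()
    out[..., :2] += (detail[..., :2] - 0.5) * strength  # add the detail offset to red/green
    out[..., :2] = np.clip(out[..., :2], 0.0, 1.0)
    return out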

Want to combine textures in the material editor but only familiar with Photoshop’s blend modes? Epic has it covered. Along with many useful material functions, they included the majority of blend modes that all Photoshop users are familiar with. From Overlay to Linear Dodge, they can be found in the Palette window inside the Material Editor. They can be particularly great for adding detail and variation to your materials.

DEVELOPING SKILLS This forest was one of Alex’s first big personal projects in Unreal 4 and really helped him push his skill set and learn the engine


Unreal offers four different types of lights to use in the environment: Directional, Point, Spot, and Sky light. Directional lights are great for outdoor areas or any kind of extreme singular light source. Point lights are omnidirectional and Spot lights are similar but have their limits defined by a cone. Sky lights can be used to add ambient light to your environment by capturing distant parts of your map. They also support custom Cubemaps.




15

ADDING ENVIRONMENT FOG TO YOUR SCENE

While close-up fog can always be created with particle effects, Unreal 4 offers two other ways to add fog to your scene. Atmospheric Fog reacts to directional lighting angles and intensity to create fog based on actual scattering of light in the atmosphere. Exponential Height Fog gives a bit more colour control and allows you to add a simpler fog effect that becomes less dense in higher parts of the map and denser in lower parts.
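The 'exponential' part of the name is literal: density falls away exponentially with height. A minimal sketch of that falloff (our own illustration with made-up parameter names, not UE4's internal implementation) looks like this:

import math

def fog_density(height, base_density=0.02, falloff=0.2, base_height=0.0):
    # Densest at and below the base height, thinning exponentially as you climb.
    return base_density * math.exp(-falloff * max(height - base_height, 0.0))

print(fog_density(0.0))   # 0.02 at the fog's base height
print(fog_density(50.0))  # far thinner 50 units higher up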

16

CREATING CLEVER LIGHT SHAFTS

Light Shafts or ‘god rays’ can be a powerful visual tool and are created by particles in the air being lit by specific light sources. In Unreal 4 they can be created in a few ways. The most common way is by enabling them from the properties of a Directional light. They can also be made using geometry and clever materials. Epic’s blueprint example project contains a good example of how someone could go about doing this.

17

TAKING HIGH-RESOLUTION SCREENSHOTS

While custom resolution videos can be rendered out of Matinee, there is a quick and easy way to take high-resolution screenshots straight from the editor. By clicking the little downward arrow in the top left of your Viewport you can reveal a little drop-down menu. At the bottom of that you can open up the High Resolution Screenshot window. From there high-resolution shots can be captured and sent to your Project/Saved/Screenshots folder.

GET INSPIRED Alex’s process for making this environment can be found in the accompanying video, including all assets


16 CREATING GOD RAYS These can be cheesy if overdone, so add them subtly and with an artistic touch

18

COLOUR CORRECTION AND LOOK-UP TABLES

Using post-process volumes, final render colours can be tweaked and adjusted based on artistic preference. While options exist for basic settings like contrast and colour tinting, custom colour correction can be done using colour look-up tables. These tables allow for complex colour transformations and can be made with a base file available on Epic’s Unreal 4 documentation site plus Photoshop – or another image-adjusting package.
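A neutral look-up table is just an unwrapped colour cube. Epic's base file is a 256x16 strip made of sixteen 16x16 slices, and, assuming that layout, you can generate your own neutral strip with a few lines of Pillow, grade it in Photoshop, then import the result:

from PIL import Image

SIZE = 16  # 16 levels per channel -> a 256x16 strip of sixteen 16x16 slices

def level(i):
    # Map a 0-15 step to an 8-bit colour value.
    return int(round(i * 255 / (SIZE - 1)))

lut = Image.new("RGB", (SIZE * SIZE, SIZE))
for b in range(SIZE):          # one 16x16 slice per blue level
    for g in range(SIZE):
        for r in range(SIZE):
            lut.putpixel((b * SIZE + r, g), (level(r), level(g), level(b)))
lut.save("neutral_lut.png")    # colour-grade this file, then import it as your LUT texture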


19

EDITING BLOOM AND LENS FLARES

Image bloom and lens flare post-processing has become popular in games and 3D, and can be enabled and customised in UE4 using post-process volumes. Bloom can be highly customised in almost every sense. Size, colour, intensity, and threshold can all be tweaked and even used to mask in dirt textures to mimic dirty lenses. Similarly, image-based lens flares can also be enabled and their shapes and intensity adjusted.



22 LIGHT FUNCTION A tool used for everything from flickering candlelight patterns to stained-glass window colouring on the ground

22

LIGHT FUNCTIONS

23

SAVE YOURSELF TIME BY COPYING AND PASTING

24

VIEW MODES AND BUFFER VISUALISATIONS

25

PERFORMANCE TIPS

One fun feature when lighting in Unreal 4 is the support of light function materials. These materials act as masks for the light and can be used to make anything from custom colour variation in a light to cloud shadows on the ground. They are made by setting the Material Domain to Light Function in the Material Editor and can be used on spot, point, and directional lights.

One fantastic trick to know about Unreal 4 is that any object in a level can be copied and pasted directly into another level within the same project. It will appear with the same properties and in the same location. What’s even better is that anything copied from Unreal can be pasted into a text document. That text can then be copied and re-pasted into another Unreal 4 level.

20

21

CREATING DEPTH OF FIELD

Unreal 4 supports both Gaussian and custom-shaped bokeh depth of field. Both of these options exist within the settings of Post Process Volumes. It should also be noted that while bokeh sprite weighting exists to help with blurring thin objects in front of distant objects and vice versa, problems can sometimes still occur. Care should be taken when applying depth of field to foliage and other similar shots.

21

Knowing what makes up your image is an integral part of working in any 3D engine and working in a deferred renderer like UE4 allows for some useful view modes. Pressing [Alt] and [1-8] switches between various view modes like Unlit or lighting only but if you click on the View Mode button in the Viewport you can view individual buffers. This can be useful for seeing level wide ranges of material inputs, like roughness.

AUTO EXPOSURE AND EYE ADAPTATION

Auto exposure control is on by default and simulates eye adjustment to bright or dark areas. The effect is awesome but can create constantly changing visual variables that are hard to stay consistent within. Adjustments to the exposure range can be made in the settings of post-process volumes and can be disabled by setting the minimum brightness equal to the maximum. Exposure bias can be used to adjust base exposure settings.


While Unreal is incredibly powerful, not every workstation is created equal. If you are running into performance problems in-engine, the first place to turn is the Engine Scalability Settings within the Settings button on the Editor Toolbar. Turning down some options like anti-aliasing can really speed things up when working. Another trick is to organise large groups of assets into folders in the World Outliner. You can then toggle their visibility to help with performance.




FREE! 3 HOURS OF VIDEO TUTORIALS + MODELS + SCENE FILES INSPIRING CG ARTISTS

FREE!

8GB OF RESOURCES

3dworld.creativebloq.com July 2015 #196

MAYA 2016 TRAINING Master the new tools to model amazing CG

ULTIMATE VFX SPECIAL
AVENGERS: AGE OF ULTRON Exclusive #1 The CG behind Hulk vs Iron Man!
40 YEARS OF ILM MAGIC Exclusive #2 Behind the scenes at the Ranch!
STAR WARS MODELLING Exclusive #3 Model a detailed starship!

INSIDE VUE 2015 REVIEWED • 3DS MAX SIMS • SUBSTANCE 5 TIPS


HITS
The venerable studio moves into its 40th year with a slew of past honors, a slate of blockbuster films on the roster, and a new, global presence to celebrate. Barbara Robertson visits ILM…



ILM has worked on some of the most iconic effects movies of the past 40 years, and it all started with Star Wars

ILM is responsible for several movies that changed the VFX industry, such as Pirates of the Caribbean: Dead Man’s Chest, Terminator 2 and Jurassic Park

This year, Industrial Light & Magic celebrates its 40th anniversary. The powerhouse visual effects studio began with a handful of people hired by George Lucas to create visual effects for the first Star Wars, which was released in 1977. Today, ILM is a thriving global studio with 1,200 people in San Francisco, Singapore, Vancouver and London. Many of those artists are creating the seventh Star Wars film, scheduled for release by Lucasfilm/Disney in December. Directed by JJ Abrams and with VFX supervised by Roger Guyett, Star Wars: Episode VII: The Force Awakens is destined to score a box office bonanza. But the studio has much more to its credit than Star Wars. Films with effects created at ILM have won 16 Oscars, and received 33 Oscar nominations for best VFX. And, of the top 10 all-time worldwide box office hits, ILM has worked on half of them: Avatar, Titanic, The Avengers, Iron Man 3, and Transformers: Dark of the Moon.

ILM wizardry

Equally important, the artistry in VFX for those films and many others can be traced back to inventions and innovations created by people at ILM during the past 40 years, and to the early application of CG techniques for production filmmaking. “I think there is really something valid about the influence we’ve had on films,” says senior visual effects supervisor Dennis Muren. “The techniques we’ve brought to filmmaking have opened up the medium to telling stories that couldn’t have been told before. I think you see it all the way through.”

16

THE ILM WAY

John Knoll, Scott Farrar and Dennis Muren discuss the forces at work at ILM

Academy Awards Films include Star Wars, E.T., Indiana Jones and the Raiders of the Lost Ark, Who Framed Roger Rabbit, The Abyss, Terminator 2: Judgment Day, and Pirates of the Caribbean: Dead Man’s Chest

AUTHOR PROFILE Barbara Robertson Barbara is an awardwinning freelance writer based near San Francisco. www.bit.ly/196-barbara

PHOTOGRAPHER PROFILE Bill Zarchy Bill is a freelance director of photography, writer, and teacher based in San Francisco. www.billzarchy.com


Staff at ILM are constantly improving their art and being more efficient

“When I came to ILM, I knew there was something special,” says John Knoll, chief creative officer. “They were doing work way better than what I was seeing from any other place, so there had to be something special. I was determined to be a sponge, to find out what allowed them to do such great work. It became apparent to me that there was an ILM method even if not everyone was aware they were doing it. It’s transferred through osmosis through a million different discussions in meetings and dailies about not only what should be tweaked in a shot, but why. Dailies were like a masterclass in how to manipulate people’s perception of a shot. These were amazing brainstorming sessions.” “That’s how I learned,” says senior visual effects supervisor Scott Farrar. “Dailies are a theatre of education. We’re always re-examining what we do, thinking there’s got to be a better way. I’m doing it myself right now. ILM is like a think tank. A lot of work is done with a cup of coffee in hand, talking to someone in the hall who might have bumped into the same problem or thought about it.”


“We don’t have, as far as I’m aware, any list of the ILM way of doing things,” says senior visual effects supervisor Dennis Muren. “Throughout, there’s always been an awareness that you’re not trapped in any one way of working. I think our approach has always been, the image is what counts. It’s what’s appropriate for a film. Maybe it’s that way for others. But, having been founded by a filmmaker, George Lucas, that’s always been our goal. How does the shot fit in the movie. The filmic things. No matte lines. Clear performances and storytelling. Lighting fits the characters into the shot. Looks like it was shot by the same camera crew that shot the actors. We make everything match seamlessly. The R&D and technology is still a huge part of our success. And we are coming up with smart ways to do more shots in less time. But without the artists making the decisions on how to use the technology, the work can look very mundane and very familiar. It’s always been about having both of them.” “Arts and Sciences,” Dennis says. “That’s why it’s called that.”


ILM People

FEATURE The history of ILM

DENNIS MUREN: 40 YEARS AT ILM

Renowned for his work on the original Star Wars trilogy, Dennis Muren is a legend in the VFX industry

Senior visual effects supervisor Dennis joined ILM in 1976 and worked on Star Wars in 1977. As VFX supervisor for Star Wars: The Phantom Menace, the first film with more than 1,000 VFX shots, he propelled digital VFX into the realm of the super VFX blockbuster. Dennis now consults for ILM. “I’m working on upcoming technologies and the aesthetics we want to bring into new shows,” he says. “I kinda miss having my head the entire way into doing a show, but it’s better for me to pass on what I know and step back.”

SCOTT FARRAR: 34 YEARS AT ILM

Senior visual effects supervisor Scott joined ILM in 1981 as an effects cameraman and photographed the famous Genesis sequence in Star Trek II: The Wrath of Khan, the first CG sequence presented in a motion picture. He was senior VFX supervisor and second unit director for Transformers: Age of Extinction, his fourth Transformers film. Scott is currently working on making technical improvements to ILM’s tools. “I don’t think our tools light correctly based on real world lighting, and the tools are still stupefyingly difficult to work with,” he says. “My goal is to make it easy and quick for an artist to achieve the lack of perfection in real world motion picture lighting.”

JOHN KNOLL: 29 YEARS AT ILM

1970s

Chief creative officer In May 1978, while Dennis Muren was at ILM working on Close Encounters of the Third Kind, John Knoll, inspired by having seen Star Wars, picked up a phone book, found a listing for Industrial Light & Magic, and called the studio. “Star Wars was such an amazing experience. I thought maybe visual effects was a viable career. It was a life-changing experience. The next day my dad drove me to Van Nuys at 8:30 in the morning, and I spent the whole day at ILM. Lorne Peterson [model maker] toured me around. I went to dailies. I saw them building models. I saw them shooting on a motion control stage. I walked out of there and said, ‘This is what I’m going to do. I’m going to work at ILM some day.’” He reached that goal eight years later. John was employee 105.

The Beginning

When Dennis Muren joined ILM in April 1976, he witnessed the studio’s first innovation in visual effects. “It was before any shooting had been done [on Star Wars],” he says. “They were still building the shop. I’d never seen anything like it. Everyone was so young – my age and younger – and everything was under one roof. The approach John [Dykstra] had taken with motion control was to break everything into separate elements, each shot separately programmed with motion control gear, so they all had their own flight path. Then we’d composite them together. It was so complicated that it was great to have everything in one place. It was very unusual for a Hollywood film.” The main challenge back then was ensuring everything was precise. “The film had to go through perfectly without any shaking and only a couple places could fix gear to be that precise,” Dennis says. “Models had to be made to precision. Explosions had to break apart realistically. It was a very linear process and if something didn’t work

we had to do that part again, often from scratch. We needed a lot of physical space and some equipment was literally dangerous.” In addition to bringing the VFX process all under one roof, Dennis exalts the in-house art department.

Design matters

“We’ve always had an art department,” he says. “Most studios would bring someone in, or expected the director or studios to provide shot design. To us, the art department seemed as necessary as a camera department. We have ideas a director or producer might not know are possible. So it gave us a way to get in early and that has been great. Now, not all shows that come in need [our shot designs], but every show still goes through our art department for the studio to consider our ideas before we start the show.” ILM displays that art – the models, the maquettes, the concept art created during the past 40 years – throughout the San Francisco studio, giving an art-gallery feeling. For instance, an Iron Man maquette greets you at the reception desk.


JOHN GOODSON: 26 YEARS AT ILM

Model maker and supervisor John joined ILM’s model shop in 1988 and stayed there for 16 years; by 1991 he was working on models for Star Trek VI. In 2005, John began using CG to create concept art for Star Wars: Episode III. He left the model shop and became a digital artist. It’s rare for Goodson to step away from the computer these days, but occasionally he still gets his hands dirty. “Paul Huston and I needed to create aeroplanes for [the television show] ‘Agent Carter,’” he says. “So we bought aeroplane model kits. We built one with the landing gear down, put it on Masonite on a table, dusted the table with baking soda for snow and ground foam for vegetation, and shot it. We started laughing. We were actually shooting a model again. It was especially fun doing that with Paul who has been here the entire 40 years.”

GRADY COFER: 16 YEARS AT ILM

Visual effects supervisor “The one thing that brought me to ILM was art,” says Grady, whose first love was Star Wars. “My childhood was punctuated by striking visual moments in film. E.T.’s ship taking off. The Ark of the Covenant opening at the end of Raiders. The link was ILM. How could I not be compelled to work there ultimately? Artistic greatness has a gravitational pull.” Grady’s first movie was Wild Wild West, and he recently worked on Noah. One of the strengths of ILM is “the creative brain trust,” says Grady. “I talk to Dennis [Muren] all the time. He looks at everything. When I get back from dailies, I’ll have a message from Dennis on my phone. He always comes up with really interesting ideas.”



Eight months of work resulted in The Abyss’s 75-second pseudopod shot: the first digitally animated CG water effect to appear on the big screen

1980s Enter Computer Graphics

33

Academy Award nominations

15

BAFTA Awards This includes ILM's first animated film, Rango

21

BAFTA nominations

2

Emmy Awards

1

Best Animated Feature Academy Award Won for Rango

“The advent of CG as a serious production tool transformed the whole industry, and that happened largely at ILM,” says CCO John Knoll. “The big turning points were the Stained Glass Man [in Young Sherlock Holmes], but that was Lucasfilm’s computer division; The Abyss, the first film with ILM’s computer graphics; T2, which had a central character executed in CG; and Jurassic Park, which shocked the industry and marked the end of stop motion as a technique for doing creature work.” ILM hired Scott Farrar to shoot models at ILM for the 1982 film Star Trek II: The Wrath of Khan. Little did Scott know that he’d also shoot the first entirely CG sequence in a film.

The avant-garde

“The only way to photograph the CG effect was to set up a camera with an intervalometer to control the camera shutter,” says Scott, who is now a senior VFX supervisor. “We put the camera on a tripod in front of the monitor and shot all night.” To shoot the star field, effects supervisor Jim Veilleux and Scott photographed a camera flying through a 3D array of stars at Evans & Sutherland, again using a camera on a tripod in front of the monitor. “I had a crew of about 60,” Scott says. “On the last Transformers, I had between 300 and 400. The difference is staggering.” In 1985, the first photorealistic CG character, a knight made from stained glass, fell from a church window in Young Sherlock Holmes and made film history. Dennis Muren was the VFX supervisor and received an Oscar nomination for that film. In 1988, Doug Smythe and ILM’s CG department worked with Tom Brigham to create the first morphing system for Willow, resulting in a technical achievement award and another VFX Oscar nomination for ILM.

The first photorealistic CG character: the stained glass knight from Young Sherlock Holmes in 1985

ILM worked with Steven Spielberg on E.T. and continues to collaborate on many of the films he directs

The big turning points were The Abyss, T2 and Jurassic Park, which shocked the industry and marked the end of stop motion as a technique for doing creature work

In 1989, artists first applied a system developed at ILM for wire removal and for removing dirt and damage artifacts from film to help create Back to the Future II. “Bit by bit we tried new things in movies,” Scott says. “That’s what ILM has always done – experimented. You can point to so many movies, so many little steps and advances that moved us forward.”

Deep dive

1989 also brought to the screen James Cameron’s The Abyss, in which a transparent CG pseudopod made faces at Mary Elizabeth Mastrantonio. To help make that possible, John Knoll did what was probably the first set survey with CG in mind. “It was all so new,” reflects John. “I went along for the plate shoot; I figured that someone would need to measure where the camera was relative to landmarks on the set so we could match our CG cameras. I measured the set with a tape measure and drew where the camera was on Xeroxed floor plans. Then, as the pseudopod was reflective, I brought in a still camera and photographed full 360s around the set to stitch into reflection environments. We were making it up as we went along. Many of the procedures’ basics are still the same.” The Abyss was a turning point for Dennis Muren: “After The Abyss, I thought ‘I have to know this because I have to control it.’ I had a million things to learn and I took a year to study at my own pace. I wanted to get CG into production. Not just five shots.” Then came the 1991 Terminator 2: Judgment Day, and it all changed…


FEATURE The history of ILM

The terrifying liquid metal T-1000 in Judgment Day used realistic human movements: a movie first

1990s CG Gets Serious

Perfect Storm got an Academy Award nomination for best VFX

The T-1000 character from Terminator 2 that shape-shifts from human form into liquid metal was the first CG character on film to have natural human motion, a widely acknowledged VFX breakthrough. But the film was also the first to use camera-projected plates for matte paintings. “We developed it for the shots when the T2 [the T-1000] pushes through the bars of the jail,” John says. “The idea is to take a live-action plate, project it onto CG geometry, and then when you warp the geometry, the image goes with it. We used it for matte paintings for the first time for Hook [in 1991]. We created Neverland as traditional paintings on glass, but then matched the camera and the basic shape of the island in CG with simple representative geometry, projected onto that geometry, and flew a camera toward it. The whole painting moved in perspective in a way we had never seen with 2D matte paintings. It was revolutionary. We used it for Mission: Impossible [1996], and pretty soon it became an industry standard.” The 1993 film Jurassic Park followed shortly after T2, and signalled the real beginning of CG characters in films. “Those of us in the model shop were aware that there was only one miniature set used in Jurassic Park,” says John Goodson, digital artist and model maker. “The Ford Explorer pushes over the wall. Usually, we did a significant amount of stuff. We thought, ’Wow. Something is changing.’” With T2 and Jurassic Park, ILM had made CG an essential part of the films and showed the industry that CG was production ready. “We were no longer limited by materials, models, wires, gravity, physics,” Dennis says. “We were making shots we could imagine. It was incredibly liberating. No one knew… that is, until they saw T2 and Jurassic. We could have gone ten years longer without knowing. But that nut was cracked here because we had filmmakers and

ILM developed new systems to deal with the CG creatures in Jurassic Park


scientists. Then the trick came in making it look real, not artificial.” The 1995 film Jumanji put CG hair and fur to the test, and resulted in an Academy technical achievement award. Casper, a second film that year, put a CG character into a starring role alongside actors.

CG water and fire

“Suddenly, between Jurassic Park and Casper we had to quadruple the size of the animation department,” Dennis says. “We couldn’t find another 50 people who knew CG. It was one of several huge spurts where we had to get people in and train

28

Scientific and Engineering and Technical Achievement Awards In 1981, the first award recognised the development of a motion picture figure mover for animation photography. It went to ILM’s Dennis Muren and Stuart Ziff

We were no longer limited by materials, models, wires, gravity, physics. We were making shots we could imagine. It was incredibly liberating them in esoteric areas and give them schooling in the tools we have and the quality we expect.” In the 1990s, artists at ILM won Oscars for Terminator 2, Jurassic Park, and Forrest Gump, and received nine Academy Award nominations. One of those nominations was for Backdraft, which marked the first use of the ILM Trilinear High Resolution CCD Digital Input Scanning system developed in partnership with Kodak. It scanned its last film in 1996: Mission Impossible. Backdraft was also the first film with ILM’s CG fire. CG water flowed into the production line for the 1998 film Deep Impact. “We learned how to make a wave with white caps that left little trails,” Scott says. “We improved on that with Perfect Storm [in 2000] and those two films moved CG water forward. Now, we have the relationship with Stanford researchers for Physbam to do physical representations of natural things – dust, fire, water. But water is still hard.” 3dworld.creativebloq.com

15

Films that have won a Best Visual Effects Oscar

338

People have been with ILM for over 10 years… and two have been at ILM for 40 years, Dennis Muren and Paul Huston


2000s

ILM’s astounding innovations will continue in the new Star Wars movie

The Digital Age

According to Dennis, “The industry exploded in 2000. It started with Phantom Menace [1999]. With that and Lord of the Rings [created at Weta Digital], CG visual effects became a way to make movies. Greenscreen or bluescreen, directors could imagine anything and get what they wanted. Audiences had never seen stories like that outside stop motion or cartoons. So, we needed to rise up to that scale. We had 1,200 people working on those prequels. It was massive.”

Quality is key

With Episode I: The Phantom Menace, filmmaking became digital. “We’ve made great, big, important developments over the years, but there were also significant smaller innovations,” says John Knoll, who was a VFX supervisor on EPI along with Dennis, Scott Squires, Scott Farrar and Barry Armour. “For example, up until EPI, our matchmoving tools were largely manual for the most part. People were lining up things by eye, setting keyframes, and seeing where they

slipped. Tracking features and using the camera to back solve happened with EPI. We hardly do a shot now without that being the first step.” Today the biggest challenge is differentiation and budgets. “There’s so much on the screen now compared to 30 or 20 years ago, that a lot of the work looks the same,” Dennis says. “The biggest problem is how to make it look different. Make it fit within the film and not stand out. That’s the feeling I have. The work is overused. Studios think if you put in more effects, you’ll make more money and in some cases, they’re right. The problem is how to keep the quality of the imagery up, the price down and meet the deadlines. We have to compete with the financial realities of competing with overseas companies that pay less and get refunds and tax incentives. That’s almost unsolvable by staying in California. So, we have the four studios. We need that.” Having a global presence and a capacity in tax-incented countries has allowed the studio to stay competitive. So, too, have the efforts

At the forefront of the digital revolution, ILM continues to break new ground in VFX

of the R&D teams and artists who look for ways to make the process more efficient. “The world of rebates is really hard,” Scott says. “That’s one reason why I’m trying to refine the tools, make them easier, quicker and more efficient for artists to use." “We’re the last of the big American effects companies left,” John says. “That’s partly because we’ve gone international. And because we’ve always put a high premium on hiring the most talented people we can. Extremely experienced, talented people do things right the first time. That’s how we get price competitive.” We can’t wait to see the amazing film sequences the ILM artists will come up with over the next 40 years. Find out more information about FYI ILM's work at www.ilm.com

ILM continues to oversee VFX on cinema’s biggest new releases, including Disney's Tomorrowland and Marvel’s Ant-Man (right)


1200

Number of employees: ILM San Francisco (500), ILM Vancouver (200), ILM Singapore (300), ILM London (200)

310

Total number of films, including those currently in production



FREE VIDEO TUTORIALS! GET STARTED IN UNREAL ENGINE 4 START YOUR CG CAREER BETTER PORTFOLIOS COLLEGE PROFILES EXPERT ADVICE

3dworld.creativebloq.com June 2015 #195

ZBRUSH MODELLING Add detail to your armoured characters with NoiseMaker

FREE BOOK!

ROBOT DESIGN

The secret behind Chappie's success

WORTH £14.99

228 pages of pro tutorials

MASTER THE ART OF

PHOTOREAL

HAIR & SKIN

Nick Gaul shares his complete process for ZBrush and V-Ray


FEATURE Chappie VFX

Neill Blomkamp’s Chappie follows District 9 and Elysium, with Image Engine once again serving as the director’s lead VFX studio. Starting from Weta Workshop’s 2D concepts and designs for the film’s police-robot star, the Image Engine team, with VFX supervisor Chris Harvey and asset supervisor Barry Poon, built, textured and animated the 3D Chappie seen on screen.



NUMBER CRUNCH: PIPELINEFX'S QUBE! RENDER STATS
HOW LONG? It took 95 million CPU minutes to make 1,000 VFX shots for Chappie.
THAT MEANS… It's the equivalent of one computer working nonstop for 2,000+ years.
MACHINE POWER The 500-machine render farm was used to create a mix of fight sequences, hard-surface character creation, live-action animation, comps and lighting tasks.

[ Film title ] CHAPPIE
[ Distributor ] Sony Pictures and MRC
[ Main VFX studios ] Image Engine
[ Production budget ] $49,000,000
[ Project duration ] 1.5 years



FEATURE Chappie VFX

[ VFX insight ]

FLAMBOYANT COCKTAIL

A stunt performer actually did get a molotov cocktail thrown at him and it was lit, but (because of the tight integration needed with Chappie and his physical space) in the end we could only use it as reference to match.

CHRIS HARVEY Chris is a VFX supervisor at Image Engine. His credits include Battleship, R.I.P.D. and Fast & Furious 6. www.bit.ly/195-harvey

“One significant difference to working on the previous films was the addition of a ‘post-viz’ aspect,” says VFX supervisor Chris Harvey. “Basically, Neill did not want to screen the film with a shot that contained the Chappie actor, Sharlto Copley. He wanted the audience, even in the earliest test screenings, to see Chappie as Chappie.” What this meant was a lot of work in a ridiculously short deadline. Filming, with Sharlto acting in a grey suit, ended in January

FEELING THE HEAT High frame rates and close proximity made this fire simulation particularly challenging. HIGH IMPACT A custom “magic” curl noise function was written to build upon the Houdini Pyro tools in order to achieve the results seen here. EMOTIVE SCENE The FX simulations together with some skilled compositing gave us a spectacular climax to an emotional sequence.

and Chris and co had to have Chappie in place for the first audience viewing in mid-March. Chris describes the work as “quick and dirty” animation with a “dumbed down” version of the final Chappie asset in all the shots. “While a tall order, it was extremely beneficial for the film on both sides,” he says. “For Neill and the editors it gave a much more representative look of what would be seen. For us, we got very early access to everything we would later be dealing with, and it allowed us to, very early


Weta’s 2D designs of Chappie were used as a starting point for Image Engine’s 3D builds


on, anticipate issues we were going to have to deal with, so we were able to be far more proactive rather than the typical reactive nature you often have to deal with. It also gave us a literal boot camp that we ran every single animator through as they ramped onto the show.” That was just the start of it. Image Engine put in the hours to research and sort out the minutiae of Chappie’s mechanics and locomotion before sending Weta their final designs. “It is a great process to have this back and forth creative development with Neill so early in the project,” says Barry. “By the time we were done Chappie had turned into by far the most complex digital asset myself or Image Engine has ever had to deal with.” Image Engine worked primarily in Maya, Houdini, 3Delight, Nuke, ZBrush, Mari and Shotgun. The latter, Chris says, “might seem strange but it’s heavily utilised and integrated in our artist workflow.” In addition to this, Image Engine used their staple of proprietary software, including Gaffer – their app for scene generation, shading, rendering and compositing. There were also a number of new tools that they created. “We added an advanced shader layering system that allows for realistic soft transitions between material properties,” says Chris, “and Photoshop-like adjustment layers allowing for globally affecting the various shading



EVEN MORE RENDER STATS

RELIABLE RESULTS “Some of the simulations for our final sequence took machines with 192 GB of memory over 24 hours to render a frame,” says Gino del Rosario, Head of Technology at Image Engine.

components from underneath layers.” As Chappie’s face design consisted of eyebrow bars and two rabbit-like ears, the animators relied on Sharlto’s every movement to convey character. So why not use motion capture? “The short answer is after lots of testing it simply did not make sense,” explains Chris. “The thing about motion capture is it's great for multiple characters, and it’s great for highly articulated facial performance. However, while it might get you a first pass very quickly, there is always this process where you need to spend a decent amount of time cleaning up, manipulating and adjusting that data.” For the Image Engine folk, working mainly on one character, that time was better used matching Sharlto’s performance through good old keyframe animation. “The fidelity the artists were able to achieve was just as true to the original performance,” Chris continues. “When we weighed the costs, manpower, time and disruption for filming, using motion capture simply did not add up to a worthwhile use of our resources. It’s sort of funny because to be honest, it’s effectively the same as what really goes on with motion capture, we just cut out the automated data acquisition portion of the process.”

DAY RATE Qube! ran an average of 1,479 render jobs a day for Chappie. JOB TIME The whole render process took Qube! 498 days to complete.

In total there are eight main stages of look Chappie has to go through in the film, with another 12 sub-stages. Each of these required adjustments, tweaks and totally new textures, shaders and models. “Technically this meant we had roughly 30 different versions/assets of Chappie/the Scout. It would have been a nightmare to try and handle and track each change and then make sure every other version got the same update,” says Chris. “So instead we treated Chappie as an uber asset.” Every single texture, shader, model, and rig adjustment for every single version rode along with this single uber asset. “We then used Shotgun and our internal database Jabuka to automatically flip the correct switches to toggle on and off the various configuration files that would put this uber asset into the appropriate state for any given shot.”

Continuity is king

Though it sounds like a complex system, the result was a streamlined process. “Based on the script we knew Chappie would require several damage states and other wardrobe changes,” says Barry. “Once the edit was close to being locked we made a breakdown of every state Chappie would require. Then, every Chappie shot was tagged with the necessary state. We had some proprietary tools that helped ensure the correct state of Chappie was used in animation and lighting, but a lot of it came down to someone tracking each shot and watching the edits repeatedly to make sure the correct state was used.”

But judging the film purely on the visuals, you would have to say doing things slightly differently paid off for the Image Engine team. “It was the first time I’ve seen a full 3D print of any of the hero assets we’ve worked on,” says Barry, “and getting a chance to see Chappie in person, to see all of the fine detail that was put into the 3D model, was incredibly rewarding for the team. The amount of detail that was put into Chappie is amazing. It’s something myself and the Image Engine team are very proud of.” To see the official trailer for Chappie visit FYI www.bit.ly/195-chappie

[ VFX insight ]

DAMAGE CONTROL

By the final sequence of the movie, Chappie has accumulated significant damage – a total of 16 different variations of destruction. A lot of thought was put into how each part would be destroyed, when it would happen in the film, and if that damage would be executed in the model or texture. In this image, various parts of Chappie’s armour were ripped off and mangled, using a combination of Maya and ZBrush.
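The shot-state bookkeeping Barry and Chris describe was handled with Shotgun and Image Engine’s internal database Jabuka, neither of which is documented here. Purely as a hedged sketch of the general idea – hypothetical shot names, state names and switches, not their actual pipeline – a per-shot tag resolving to a set of asset configuration switches might look like this:

```python
# Hypothetical shot-to-state lookup, sketched in Python; not Image Engine's Shotgun/Jabuka tooling.
SHOT_STATES = {
    "sq120_sh0450": "damage_03_graffiti",
    "sq180_sh0110": "damage_07_final_battle",
}

STATE_CONFIGS = {
    "damage_03_graffiti":     {"armour_dents": True, "missing_ear": False, "graffiti_decals": True},
    "damage_07_final_battle": {"armour_dents": True, "missing_ear": True,  "graffiti_decals": True},
}

def resolve_state(shot):
    """Return the state tag and the configuration switches the 'uber asset' should enable for a shot."""
    state = SHOT_STATES[shot]
    return state, STATE_CONFIGS[state]

print(resolve_state("sq120_sh0450"))
```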



SCI-FI SPECIAL MECHS • SCENES • FIGURES INSPIRING CG ARTISTS

3dworld.creativebloq.com May 2015 #194

FREE INSIDE! 6GB OF RESOURCES PLANET TEXTURES VIDEO TRAINING 3D PRINT MODEL

FREE!

ZBRUSH GUIDE Pose and render a sci-fi character

MECH MODELLING Harness Cinema 4D and ZBrush for hard surfaces

MAKE YOUR OWN

STAR WARS VFX MOVIE How to become the next J.J. Abrams!


FEATURE ARTIST PORTFOLIO Meet the artist, Pierre Drolet

ARTIST PORTFOLIO

PIERRE DROLET

Developing a virtual tour of his 3D models is the latest challenge this sci-fi veteran is setting himself



Pierre Drolet built the original Star Trek Enterprise, for the prequel series of the same name, based on Doug Drexler’s design

SCI-FI SPECIAL ISSUE

ARTIST PROFILE Born in Canada, Pierre Drolet has worked as the lead modeller for Eden FX, Universal Studios and Pixomondo. He is now freelance. www.pierre-drolet-scifi-museum.com

VITAL STATISTICS

JOB TITLE Lead CGI modeller, graphic designer and author NOTABLE CREDITS Da Vinci’s Demons, Mission: Impossible Ghost Protocol, Snow White and the Huntsman, Fringe, Hawaii Five-0, Terra Nova COUNTRY Canada (now based in USA)

When it comes to showing off our 3D models, most of us are happy to post some 2D images on our website and have done with it. But not Pierre Drolet. Having spent 15 years designing CG spaceships for shows like Star Trek, Firefly and Battlestar Galactica, he wants people to see more of his work than that. “I want people to feel like they’re really there, that they’re walking around a spaceship,” Pierre explains. “So I’m using Unreal Engine 4 to rebuild the ships and retexture them to make it look almost real. And then I want to make it so that people can take a virtual tour, like they would in a real-life aircraft museum.” Perhaps unsurprisingly, it’s proving a tough task. “It almost made me cry, the amount of work,” he sighs. “For one thing, when you build a spaceship for a movie, the inside and the outside are almost like two different things. Most of the time they don’t even match! But now I need to redesign them so people can accurately walk around the ship and then go into it. I don’t want people to feel like it’s a set, but that you’re really in the ship.”

I want people to take a virtual tour of my spaceships, like they would in a real-life aircraft museum

Many strings to his bow

One thing Pierre isn’t short of is spaceship models to develop, after a long career creating them for popular sci-fi shows. Having worked as a video game artist from 1995 onwards, he was hired as a CGI modeller at Foundation Imaging in 1999, and his first independent assignment was a probe called Friendship 1 for Star Trek: Voyager. Pierre recalls it being a real make-or-break moment for his career. “We received a design from the art department and I really didn’t like it,” he says. “So I asked my supervisor if I could come up with something different, and he said go for it. I was taking a gamble because if he’d hated it, that would have been the last time I’d work for Star Trek! But thankfully he loved it, and things snowballed from there.”

KLINGON BIRD OF PREY

Pierre ended up working on the remainder of Voyager, its follow-up Star Trek: Enterprise, and a number of the movies. Noted contributions included his CGI build of the Star Trek Enterprise, NX-01, based on designs by Doug Drexler (see right), and the CGI Romulan capital city he built for Star Trek Nemesis. He won a VES Award in 2003 for his Star Trek work and was nominated for Emmy Awards in 2003 and 2004. This was followed by a series of high-profile movies and TV shows, including Battlestar Galactica and 24. He also redesigned Serenity, the main ship from cult series Firefly. “It looked like a cross the way the original one was built,” he says. “I tried to extend the neck and bring the wing back, make it look more aggressive.” Pierre’s not just a spaceship nerd, though. “I can do a lot of things,” he explains. “I’ve done a lot of characters, a lot of monsters, a lot of set extensions. On Da Vinci’s Demons, for example, I did a lot of 16th century sets from Italy. But I do prefer to do mechanics.” And he’s taken full advantage of the opportunities to do so. “I’ve done around 150 spaceships for different shows,” he smiles. “I have a whole collection, as well as military vehicles, planes, boats… I have a whole virtual army for myself!” A stunning career, then – but CG isn’t the only string to Pierre’s bow: he’s just released his first sci-fi novel, The Gateway, on sale in print and for Kindle on Amazon. A time travelling thriller about a top-secret science project, it’s full of suspenseful twists and turns and is getting rave reviews. It seems like hanging out on some of the most popular sci-fi shows of modern times must pay off. Find out more about Pierre's work at FYI www.pierre-drolet-sci-fi-museum.com

USS JEFFERIES


ARTIST INSIGHT

Enterprising task Working on the final series of Star Trek, Pierre got to build the original version of the iconic starship in CG Over the summer of 2001 Pierre created what at the time was the most detailed computer generated model in TV history. The Enterprise NX-01 was produced for the fifth series of Star Trek, a prequel to the original 60s show. It was a dream job for a CG artist, and Pierre admits he was lucky to get it. “Rob Bonchune, who’d worked on the show since Deep Space Nine, wanted to build the NX-01. But as the visual effects supervisor for Enterprise, he reluctantly decided it was too big a job, so I got the privilege – one I was very grateful to take on!” Pierre took the designs of veteran visual effects artist Doug Drexler, and modelled the starship using LightWave 3D. Taking up 200 megabytes of RAM merely to load, the digital model consisted of 500,000 polygons with 156 separate images painted on it. Two versions were created: a HD version for the TV series and a low-res model for promotional purposes.

VRIL GENERATION FIGHTER



FREE! 17GB OF RESOURCES VIDEOS • MODELS • BOOK • BRUSHES INSPIRING CG ARTISTS

MATTE PAINTING FOR VIDEO GAMES Learn 3ds Max & World Machine

FREE MODEL! 3D print-ready

dinosaur

10 PHOTOREAL

3dworld.creativebloq.com March 2015 #192

MAKE THIS IMAGE

RENDER TIPS Master Maxwell Render

FREE BOOK!

Worth $45 Teach yourself Maya in 24 hours!

MODEL MAGICAL

ZBRUSH CREATURES

MAKING BIG HERO 6 Disney’s brave new world

Master the complete character design process! 11 hour video • ZBrush model • Easy to follow steps


FEATURE Photoreal tips

10 principles of

PHOTOREALISTIC RENDERING

Creating excellent photorealistic renders is both an artistic and a technical challenge. The key to becoming a great CG artist is developing both aspects of your work, and here we will examine 10 principles of creating photorealistic images: five of them technical and five of them artistic. The key lesson in technical principles is that there are no large flat surfaces in nature. Every surface has variations in tone and colour. If there’s a key artistic-principles message, it’s that photorealistic rendering should be seeking to emulate the very best of photography, not the dull snaps of an amateur. At Cityscape, we love the technical side. We are always experimenting with new software, new scripts and new ways of doing things. But when interviewing new staff, we put more emphasis on the artistic. It’s easier to teach a working method or good-practice settings than it is to teach people visual judgement, or a real feeling for and love of images. We also encourage lighting artists to work with more than one render engine. As a studio, we use Modo, Maxwell Render, V-Ray and Corona Renderer. Each is wonderful in its own way, and each has its strengths and weaknesses – but each is only ever as good as the vision, judgement and ambition of the user. For all the assets you need go to www.creativebloq.com/vault/3dw192

NIKOS NIKOLOPOULOS Nikos is CGI director at leading architectural visualisation firm Cityscape. He previously worked at studios including Glowfrog, Crystal CG and Ixor. www.cityscape3d.com

Visualisation expert Nikos Nikolopoulos reveals key artistic and technical principles for recreating the real world with photorealistic accuracy

REFINE MATERIALS Textures and materials are crucial for creating photorealistic renders. All materials should have accurate reflective properties. The key is to understand how materials work in real life, then add extra depth to them. You can do this by using specular maps to tell your render engine which parts of your model should have sharper specular highlights (the smoother parts of the surface) and which should be more diffuse (the rougher parts). You can also use bump or normal maps to simulate grooves and imperfections in a surface. Keep materials and textures with the same properties instanced to save memory and keep your scene file clean.
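As a rough illustration of why a specular (glossiness) map changes how sharp highlights look, here is a minimal Blinn-Phong-style sketch in Python. The 0–1 gloss values and the exponent mapping are illustrative assumptions rather than any particular render engine’s settings.

```python
import numpy as np

def blinn_phong_specular(normal, light_dir, view_dir, gloss):
    """Blinn-Phong highlight: higher gloss -> larger exponent -> tighter, sharper highlight."""
    n = normal / np.linalg.norm(normal)
    h = light_dir + view_dir
    h = h / np.linalg.norm(h)                # half vector between light and view directions
    exponent = 2.0 + gloss * 510.0           # map a 0-1 gloss map value to a shininess exponent (illustrative)
    return max(np.dot(n, h), 0.0) ** exponent

n = np.array([0.0, 1.0, 0.0])
l = np.array([0.3, 1.0, 0.2]); l = l / np.linalg.norm(l)
v = np.array([-0.2, 1.0, 0.1]); v = v / np.linalg.norm(v)

for g in (0.1, 0.5, 0.9):                    # rough, medium and polished areas of a hypothetical specular map
    print(g, round(blinn_phong_specular(n, l, v, g), 4))
```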



SMART LIGHTING Use your lights to tell your story. Direct lights can help you to develop the mood/atmosphere of your image, and in many cases reduce render times. In interiors you can add extra lights to your windows (light portals) to intensify the colours in your scene.

CHAMFER EDGES

There are almost no razor-sharp edges in nature. Even most man-made objects have a slight roundness where two opposing surfaces meet. Chamfering helps bring out detail and adds realism to your models by allowing edges to properly catch highlights from your lighting setup. The quality and roundness of edges depends on the material and how far away an object is from the camera. You can either chamfer edges in your 3D modelling software, or in your renderer.
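Chamfers can also be added procedurally. As one hedged example – assuming a Maya scene, since Maya is used elsewhere in this issue, and using a placeholder object name – a small bevel could be applied from the Script Editor:

```python
import maya.cmds as cmds  # only available inside Maya's Python interpreter

# 'building_wall' is a placeholder object name; offset and segments are starting values to tune per asset
cmds.polyBevel('building_wall', offset=0.02, segments=2, worldSpace=True)
```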

TEST LIGHTS

Everyday testing is important. Start with your lights, then move on to materials and small details. It’s important to test all your lights one by one so you know how much influence they have in your scene, then focus on how they work in relation to one another by choosing the right intensity and colour for each one. Replacing the materials in your scene with a standard white clay material is an easy way to see how your lights and shadows are working.
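If you want to automate the one-light-at-a-time habit, a small script can solo each light for you. This sketch assumes a Maya scene (other packages have their own equivalents) and simply toggles light visibility; do a test render after each call:

```python
import maya.cmds as cmds  # only available inside Maya's Python interpreter

lights = cmds.ls(type="light")  # all light shape nodes in the scene
transforms = [cmds.listRelatives(s, parent=True)[0] for s in lights]

def solo_light(active):
    """Hide every light except `active` so its contribution can be judged on its own."""
    for t in transforms:
        cmds.setAttr(t + ".visibility", 1 if t == active else 0)

# Example: solo the first light, test render, then move on to the next
# solo_light(transforms[0])
```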



TEST MATERIALS

Always work from reference images, and try to analyse the properties of a material before you start to recreate it digitally. Certain materials, particularly those with many layers, SSS, translucency or glossy reflections, are responsible for most of the rendering calculations, so understanding and optimising materials is very important to keep your render times short. Creating a standard virtual studio setup can be a good way to test materials before adding them to your scene. It’s also worth checking your materials based on your camera: objects in the distance will need less complex materials than those close to the camera require. Also, don’t be tempted to use pure white or fully saturated materials in your scene.

CREATIVE LIGHTING

“You don’t take a photograph: you make it,” said the great American photographer Ansel Adams. The idea that photography is ‘truth’ is a powerful and widely held misconception that you need to break free from. Take a look at the number of lights professional photographers use for an interior, car, or portrait shoot. Try to find images of the lighting setups for scenes from films that you admire. Seeing photography as something constructed is the first step to really understanding the creative potential of photorealistic rendering. Use light as a painter would. Put it where you want your viewer to look, and remove it where it’s not helping to clarify what you want to say. Experiment with the balance of your lights, test their temperature and hierarchy. Think about light, message and composition as one thing. Good CG artists avoid dull and dutiful realism, just as good photographers have always done.

DIVERSE INSPIRATION Creating images is not always the easiest way to make a living, although it can be a deeply rewarding one. Staying inspired has to be worked at. Think of inspiration as a jug. What goes into that jug is very personal. It might be Danish furniture design, circuit bending, modernist literature, fashion photography, or 1970s conceptual art. As an artist it’s vital that you keep filling this jug. Look outside your immediate CG rendering world for your sources of inspiration. You will only truly find your artistic voice by chasing down the things that really move you. When you come to create your own work, you simply need to pour out a little from the jug. Even on what may seem like a dull job, there’s always space for trying something new, pushing a boundary or finding beauty.



CRITICAL OBSERVATION Look critically at photography. Like all visual art forms, it has its languages, subtleties and discourses. Watch films to learn about creative lighting. Look at the early pioneers of black-and-white photography to learn about composition. Take your own photographs. The wider your awareness of what photography has achieved, the greater the potential there is for your own CG work. If you can’t imagine it then you can’t make it. Look at light. Hopefully, you have had the experience of being literally stopped in your tracks by some unexpected and uncommonly beautiful behaviour of light, only to realise the people you are with have not seen it. If not, you are not doing this right. Look harder. Look at your work. It’s amazing how many artists don’t check theirs carefully enough. You may have used the setting you always use for glass, but if it doesn’t look like glass it’s wrong. Visual judgement is as important as technical skill.

CLEAR DIRECTION

The temptation with any image is to just start. But be careful: this can lead to wasted hours and weak images. Before you start, you need to be clear about the direction of the image, and this vision will guide every creative choice. Always start with a clear brief, and you may have to push a client to articulate what it is they want to say. For personal work, clarify your thinking. Strong art says one or two things clearly; immature art tries to talk about everything. You should also gather visual reference. Mood boards are the first step in the transition from words to pictures. They help define a quality of light, a colour palette or mood. They can also help you explore the context and visual history of your ideas. You have the whole of the world’s history of design and image-making to draw upon. Inspiration can be found in unlikely places.

STRONG COMPOSITION How many photographs are taken every day in, say, London? Very many. How many are truly great? Very few. What elevates those few is clarity of message and refinement of composition. Composition is the hidden language: the patterns of shape, light and dark, line and colour. To learn about composition study the work of great artists critically. How have they used restless diagonals to evoke the breathless confusion of battle, tonal pattern to force your eye to a focal point, colours to tell you about the emotional state of the subject? Some find composition easy. Others have to really work at it.



TUTORIALS Character design

FOLLOW THE VIDEO www.creativebloq.com/vault/3dw192

ZBRUSH | KEYSHOT | PHOTOSHOP

DESIGN A STORY-DRIVEN FANTASY CHARACTER Discover how to suggest stories for your characters as Raul Tavares reveals how he created this weary traveller, from initial concepts to finished image

ARTIST PROFILE Raul Tavares Freelance concept artist Raul likes to think of himself as a problem solver who uses a variety of tools, both 2D and 3D. With a focus on character and creature art, he likes to communicate his ideas in a fast, engaging and appealing way. www.artstation.com/artist/art

STRONG CHARACTER SHEET My notes for Gorgua include references to fictional characters (Mr. Bean, Kronk from The Emperor’s New Groove) and real creatures (turtles and insects)

AMPHIBIAN QUALITIES

LAZY/OUT OF SHAPE

INSECURE

Like many of my personal projects, Gorgua: The Weary Traveller came about from my own life experience. I had just come back home from the Trojan Horse was a Unicorn festival with the feeling that I’m on a journey, like so many others out there. I wanted to communicate that emotion by creating a character that people who feel the same way would relate to. In this tutorial, I will explain how I created Gorgua, setting out my workflow for conceptualising ideas to a high level of detail relatively quickly: from rapid sketches to a final presentation maquette and finished art.

Every character presents a completely new challenge. No matter how many years of experience one might have, designing and conceptualising something from nothing is a pretty hard task. But imaginary characters are no different to real people in many respects. They have a certain physique, personality and form. They inhabit a certain location – and, most importantly, they all have a story to tell. The workflow I will outline here is designed to establish those qualities quickly and efficiently. Beginning by gathering reference material and generating quick thumbnail sketches, then moving on to ZBrush for sculpting and texture painting, KeyShot for rendering and Photoshop for compositing the final image, I will explore how to produce an effective 3D character. In this printed walkthrough, I will provide an overview of the process and explain my design decisions. If you need more technical information, the download accompanying the tutorial includes over 11 hours of video, which shows the entire process in more detail. Plus my model is included for reference. For all the assets you need go to creativebloq.com/vault/3dw192

TOPICS COVERED Concepting, Sculpting, Texturing, Rendering, Compositing

1 STORYTELLING

I start the design process by writing down the character’s physical traits, personality and environment. Thinking of a certain actor or film character also tends to help solidify my thoughts and the character’s role in the story I’m trying to tell. Storytelling plays a huge role in my art, so I like to write down as much as I can to keep things on track moving forwards.

2 FINDING REFERENCES

Before I start, the first thing I do is gather photo reference and create a collage. This helps me ground my design in reality when I start sketching in search of the concept. I collect images that embody qualities I want my final design to have: either the physical traits of the character, or simply a mood, feeling or composition. I’m looking for anything that drives the imagination.



MAKING PLANS A good character has good foundations that are based on his or her part in a wider story or world



TUTORIALS Character design

QUICK SKETCHES Scribbly sketches are a good way to explore ideas quickly. Set down as many possibilities as you can, then pick the most promising ones

3 THUMBNAIL RESEARCH

4 DEVELOPING THE CONCEPT

5 REFINING THE CONCEPT

6 EXPLORING COLOUR

7 DESIGN SHEET

8 THE ZSPHERE BASE MESH

I have two monitors, so I like to have the references on the other monitor while I sketch. I either sketch on paper with a Copic marker and then scan the pages into Photoshop, or mimic the same process in Photoshop. Using a simple Round brush, I doodle around trying to draw through the forms to define volume and get a good sense of the character.

Next, I dive into the initial (and pretty awful!) lines I drew and start looking for the personality of the character, the details of his anatomy, and props that will help to communicate his story. This is where the concept finally starts to take shape. My drawings are still a bit loose, but I’m starting to tighten things up in order to get to know him better. Shape language, form and design consistency are what I look for here.

I don’t waste too much time at this stage, keeping myself loose. I don’t care about quality: only about letting the ideas flow as freely as I can. I look for lines of action, combining elements that will enhance my design and take it where I envisioned. The best advice I can give here is to let go. Forward momentum is important, and there’s plenty of time to refine the design later.

Colour plays a big part in determining the mood of an image. I want a whimsical feel, and to suggest a long life full of quests and adventures, so I aim for a warm colour palette with vivid pastel tones. I use a simple cel-shading technique to colour in my tonal sketches, since this gives the images depth and appeal while keeping rendering times to a minimum. You can see the layers and blending modes I used in Photoshop in the full-sized image provided in the download.

EXPERT TIP

Controlling volume When working with ZSpheres, I use Classic Skinning mode, and work with the Move and ClayBuildup brushes to control the initial volume of the sculpt before jumping into DynaMesh.

Knowing how to present your work is part of being able to communicate your ideas properly. Whether you’re presenting a simple sketch or a more realistic photobashed image, showing how the design has been resolved will help the client visualise how you intend to develop the character in future. Putting everything on a single clean design sheet will also keep you on track once you start iterating in 3D.


Next, I begin modelling the character in ZBrush. I like to start with a fairly low-resolution mesh before I move into DynaMesh for detailing. I use a combination of ZSpheres and Insert Multi Mesh (IMM) primitives to begin blocking out the model. Even at this stage, there is still room for exploration. Allow the design to evolve; don’t just blindly follow the concept.



EXPERT TIP

Use layers for more control Sculpting on layers gives you more control over details. This way you can work boldly at first and then use the slider for that layer to tone down its effect. Most times, a subtle change can make a big difference.

9 BUILDING FORM

I like to build form as I move the basic shapes into place. I do so using simple brushes (Clay, ClayBuildup and Move), increasing the subdivision level as necessary. I use the Dam Standard brush (www.bit.ly/damstandard) to carve into the surface of the model and the Clay brush to build it outwards. A bit of surface detail helps the basic forms to ‘pop’ more, but don’t go overboard with it at this stage.

10 PLACING THE LANDMARKS

I get the bony landmarks of the character in place first, then gradually add muscle and fat. I create the clavicles, elbows, wrists, ankles and knees, then focus on the major muscle masses. Building form in layers, smoothing then recreating the landmarks, helps to add realism, so having sculpted the landmarks, I smooth them with the Smooth Valence brush, then carve the crevices back in with the Dam Standard brush.

BETTER UVS

USING UVMASTER TO IMPROVE YOUR UVS

ONE USE CONTROL PAINTING

11 BALANCING THE DESIGN

At this stage, I like to take some time away from the sculpt, then look at it in 2D. I draw lines over screengrabs of the model to highlight the relationship of each form to the others. I like to play a lot with contrast: curves against straights, visual interest against visual rest, big versus small. This process is subjective, so you need to trust your instincts to get the model to a point at which you feel it is balanced before you move on.

12 BUILDING PROPS

I build the props out of basic primitive shapes, using the insert brushes (InsertSphere, InsertCube, InsertCylinder) or appending the regular 3D meshes (Sphere3D, Cube3D, Cylinder3D); then using the Move brush. When working with DynaMesh, I normally don’t go over a Resolution of 64 to keep the poly count manageable. When everything is in place, I refine the props using the Clay, ClayBuildup and Dam Standard brushes as before.

UVMaster’s Control Painting mode lets you tell ZBrush where and where not to put the seams. It isn’t exact, but it can produce really good results.

TWO ITERATE AS NECESSARY Even using Control Painting, the model sometimes doesn’t unwrap in the way you expect it to. If that’s the case, try going through a couple of iterations.

13 RETOPOLOGY

ZRemesher can be a quick way of getting decent topology from a high-resolution sculpt. I duplicate the initial SubTool, then use the ZRemesher Guide brush to paint lines along which I want edge loops to appear. Having pressed the ZRemesher button to generate a simplified version of the mesh with cleaner topology, I project the detail from the original SubTool onto the new mesh. You can find a guide to the workflow at www.bit.ly/transferdetail.

14 QUICK UVS

I use UV Master to unwrap the model. You can find it in the Zplugin palette. Having selected Work On Clone, I divide the model into polygroups, separating the arms, legs, tail, body and head. I make sure the Symmetry and Polygroups buttons are enabled, then press Unwrap. I copy the UVs, return to the original model, select the equivalent SubTool, and paste the new UVs. You can read tips for getting more control over your UVs on the right of this page.


THREE ADJUST THE UVS UV Master always calculates the best use of UV space. The result isn’t always perfect, but you can always adjust the UVs later, and it speeds up work enormously.


TUTORIALS Character design

15 REFINING AND DETAILING

Now it’s time to go back to sculpting, finishing the secondary forms and adding the tertiary detail. Surface details like the skin wrinkles and the imperfections of the exoskeleton can be created with some of the Alphas that come with ZBrush, using the Standard brush. To break up the regularity of the pattern, I then go over with Dam Standard and Danny Williams’ Lucky brush (no longer available online, but provided in the files to download).

16 PAINTING BASE COLOURS

Gorgua’s colour scheme is based on a combination of a cricket and a ladybird. I wanted to give the character a whimsical feel, so I also picked up an image of an environment for inspiration. Using the SkinShader4 material as a base, I start by filling the model with the reference texture and painting in the detail with the Standard brush, with the stroke type set to Spray and Alpha 23. Blending the textures then just takes time and patience.

PROJECTING DETAIL How to get your detail back ZBrush’s ZRemesher toolset (see Step 13) will create clean, workable topology, but remeshing your model will cause you to lose surface detail and some of the volume. Reprojecting the original model onto the retopologised version will recreate the original detail and keep volume consistent. To do so, make sure only the new mesh and the original DynaMesh version of the model are visible and then press Project All in the SubTool palette. A good rule of thumb is to have Dist set to 0.1.

17 REFINING THE COLOURS

To ensure consistency, I build up colours in layers, working from reference images. To create a feeling of depth, I use Mask By Cavity from the Masking palette, then invert the mask to paint the crevices and make the details pop out. Nature is also about variation, so splashes of vivid bright colours (either complementary colours or contrasting ones) will help give the character a more realistic look.

18 POSING THE FIGURE

I pose the character using the Transpose Master plug-in. I activate the Grps and Layer options, then press the TPoseMesh button to combine all the model’s SubTools into one mesh. I then use the Transpose line tool and the MaskLasso option to move each part of the body into position. Pressing TPose>SubT transfers the new pose back to the original model. The Layer option selected earlier creates a new layer for the transferred pose.



19 RENDERING MAPS

Since I’ll be creating a cavity map inside KeyShot, the only things I need to export from ZBrush are the model itself and the diffuse map (the surface colour). First, I set the size of the map I want to create (Tool>UV Map>4096). Then in the Texture Map sub-palette, I press New from Polypaint to generate the map. I repeat this process for all of the SubTools.

20 EXPORTING THE MODEL

For the model itself, I clone all the SubTools (just in case), then use the Decimation Master plugin to reduce the poly count of each one to a level that KeyShot can handle: a decimation value of 5-10 per cent usually works. I make sure the Keep UVs option is selected. Next, I use the SubTool Master plugin to export each SubTool as an .obj file. You can see both processes in more detail in the annotated image in the download files accompanying this tutorial.

21 WHY USE KEYSHOT?

KeyShot produces amazing results with very little tweaking, meaning that it takes much less effort to generate the render passes you need for compositing than V-Ray or mental ray. You can use it in the same way as a photographer would use studio lights, helping you to present the model appealingly.

FIX POSING ERRORS
Save your poses often!

Sometimes, when you transfer the new pose back onto the original model (see Step 18), you end up with an error. This often happens if you have multiple SubTools open, and ZBrush doesn’t know which one to transfer the pose to. Assuming you’ve been saving your work regularly, all you need to do to fix this is reopen your ZBrush file, select the correct SubTool, then load the pose file via the TPose>SubT button in the Transpose Master sub-palette in the Zplugin menu.

22 IMPORTING INTO KEYSHOT

I start by importing every .obj file into KeyShot, naming each imported object appropriately to keep things organised. KeyShot recognises the BMP texture maps exported from ZBrush and loads them automatically. By double-clicking on the material in the viewport or Project library, I can edit its properties. I start by changing the material Type to Plastic and adjusting the specular settings: these vary from one type of material to another.

23 RENDERING IN KEYSHOT

To light the scene, I use one of KeyShot’s preset Studio environments with two area lights opposite one another (Light Tent Black Open 3 2k: you can see the process in more detail in the annotated image in the files to download). From the Output section of the Render options, I set render Resolution to 3,840x2,160 and export as a .tif with Include alpha (transparency) checked to make the render easier to edit in Photoshop.



SKIN SSS

DEEP SKIN SSS

SPEC

BACK RIM LIGHT

RIGHT RIM LIGHT

LEFT RIM LIGHT

FLAT COLOUR

COLOUR/SHADED RIGHT LIGHT

COLOUR/SHADED LEFT LIGHT

FRONT SIMPLE LIGHT

FRONT COLOURED

DEPTH

MASK

CAVITY

24 RENDER PASSES

The image above shows the render passes I exported from KeyShot. By combining these in Photoshop, I can achieve a realistic-looking result. Adjusting the layer settings in Photoshop lets me adjust the look of the image more easily than adjusting settings and re-rendering in KeyShot. Even on a relatively stylised character, this is the key to success, enabling you to enhance details and increase the impact of the finished image.

To generate the passes for yourself, you will need to drag and drop appropriate materials onto the mesh (a black material for the specular pass, a translucent material for the SSS, and so on) and adjust the lighting (you can find more information about this in the annotated images in the download). Through a bit of experimentation, you should be able to generate the passes shown above. The video in the download also shows the process in more detail.

25 BLENDING IN PHOTOSHOP

Once I’ve imported my render passes onto separate layers within Photoshop, I experiment with different blending modes to try out different looks for the image. The ones that I use the most are Multiply, Overlay, Soft Light and Screen. You can see the settings I used in the annotated image in the download. I also use Levels, Curves and Color Balance adjustments to fine-tune the colour and shadow balance.
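For readers curious about what those blending modes actually compute, here is a minimal NumPy sketch of Multiply, Screen and Overlay applied to two render passes normalised to the 0–1 range. Photoshop layers add colour management and other handling on top, so treat this as the underlying maths only, with stand-in data.

```python
import numpy as np

def multiply(base, blend):
    return base * blend                                   # darkens: white leaves the base unchanged

def screen(base, blend):
    return 1.0 - (1.0 - base) * (1.0 - blend)             # brightens: black leaves the base unchanged

def overlay(base, blend):
    # Multiply where the base is dark, screen where it is bright
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

# Two stand-in 'passes': a base/beauty layer and a specular pass, normalised to 0-1
base = np.random.rand(4, 4, 3)
spec = np.random.rand(4, 4, 3)

comp = overlay(base, screen(base, spec * 0.5))            # stack operations as you would stack layers
print(comp.shape, comp.min() >= 0.0, comp.max() <= 1.0)
```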



26 FINAL COMPOSITION

Finally, I get to a point where I feel the look of the composited character matches the story I originally wrote about him. All that’s left is to add a background, to ground the character and give him a sense of scale, and apply a few final effects to enhance the image.

27 BACKGROUND IMAGE

I start off by adding a background that reflects the places through which the character would normally travel. So that it doesn’t have more visual interest than the character himself, I manipulate the background image using the Gaussian Blur filter and a combination of Levels, Curves and Color Balance adjustments, then add some shadows.

MULTIPLE FUNCTIONS As well as reflecting the character’s back story, the background image helps to give him a sense of scale, and to unify the composition

28 GLOW AND DOF

I generate lens glow and depth of field using the Gaussian Blur filter (you can see the process in the video) and apply an Unsharp Mask filter to bring back detail. I also paint in some of the shadows. To focus attention on the character’s face, I apply a vignette (from the Lens Correction filter settings). Finally, I add texture details to make the final result look more like a picture, and apply a warm Photo Filter adjustment layer to unify the colours in the image.
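Raul builds his glow inside Photoshop with Gaussian Blur; if you would rather prototype the same “blur the brights and screen them back” idea outside Photoshop, a rough NumPy/SciPy sketch (with arbitrary threshold and strength values) looks like this:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_glow(img, sigma=6.0, threshold=0.7, strength=0.8):
    """Blur the bright parts of an (H, W, 3) float image in 0-1 and screen them back over the original."""
    bright = np.clip(img - threshold, 0.0, None) / max(1.0 - threshold, 1e-6)
    halo = np.stack([gaussian_filter(bright[..., c], sigma) for c in range(3)], axis=-1)
    return 1.0 - (1.0 - img) * (1.0 - strength * halo)    # screen blend keeps values in 0-1

img = np.random.rand(64, 64, 3)                           # stand-in render; load your own image here
print(add_glow(img).shape)
```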

29 FINAL ILLUSTRATION

NEED MORE HELP? If you need help following this tutorial, find me on Facebook. Knowledge is meant to be shared, and I’m always glad to help fellow artists

I continue experimenting, changing colours, and duplicating the image and adding it back over itself, changing the blending mode and Opacity of the duplicate layer. I don’t like my illustrations or even 3D characters to look like just flat renders. Coming from a comic-book background, I feel the image becomes much more interesting if little photographic elements are added. The result is still clearly a 3D render, but at least it feels more interesting.


