ACQUISITION & LIGHTING
The Studio is the New Location
Weta Digital, Streamliner, and Avalon Studios Launch LED-Stage Virtual Production Service
NEW ZEALAND’S WETA DIGITAL, Avalon Studios and live event company Streamliner Productions have announced a new LED-stage virtual production service based in Wellington. A modern take on using front or rear projections of landscapes or rolling streets, Weta Digital Executive Producer David Conley describes LED stages as “… the latest technique to take advantage of game engine technology to provide virtual production workflows that can greatly expand what is possible on set. Being able to shoot final VFX imagery at the same time as principal photography adds another level of creative control for producers and filmmakers.”

According to Richard Lander, Studio Manager at Avalon, the service can be scaled to cater for smaller productions and TVCs right up to a 270-degree screen scenario of the type used by ILM and Disney on The Mandalorian.

“Each partner’s a different part of the puzzle,” says Lander. “We’ve provided the studios. Our Studio 8 measures 8.4 metres in height to its grid, and it’s 23 metres wide and 33 metres deep. And then our Studio 11, which is going to be used in the same way, is 17 metres wide and 30 metres deep, and it has extra height of 10 metres.

“We’ve done experiments where you can potentially put screens on the grids, because our grids are enabled to do a clip-on type scenario where you can clip the screens to the roof of the grid. But we’ve also worked out that you don’t always need a screen on the grid, or necessarily a screen on the floor; it’s more the walls, because obviously on the floor the Art Department is laying something together to give you that illusion, a bit of sand, for example, that people have used regularly, or a bit of a prop in the foreground, and then the screens take over and give you that 3D imagery out the back. And on the grid, you can do that.”

The studios can also employ J-arm cranes with LED screens attached that can “float” or be positioned above the acting talent.
“For example, if they’ve got a helmet on and you want that reflection to appear truly in the top of the helmet, or if you’ve got someone with glasses and they’re looking up, the reflection from the screen is appearing in that,” says Lander.

Provided by Streamliner, the facility’s large, configurable LED panels are able to display imagery beyond 8K as a way to augment practical sets or replace greenscreen shots. With screens for indoor and outdoor use, options include a variety of small pixel-pitch arrays (the smaller the pixel pitch, the higher the pixel density and resolution).

“We’ve got some which are 3.9mm pitch and some which are 2.5mm pitch,” says Richard Lander. “Those are the indoor ones, and then the outdoor ones we’re carrying are 5.9mm and 6.9mm pitch.

“We know in the next few months there’ll be one that will become more the standard, which is going to be around the realms of 1.9mm pitch. I think the industry standard will become 1.9mm near the end of the year, so that’s what we’ll gear up to as well.”

Another piece of the puzzle provided by Streamliner is strong relationships with augmented reality technology provider Disguise, as well as Epic Games, creator of the Unreal Engine. In addition to delivering high-quality VFX and graphics to the LED screens, the systems can also be used to trigger studio lighting.

“The lighting board is also being driven by Disguise and the Unreal Engine,” says Lander. “So, the lighting board is getting triggered by what the screens are doing to actually enable the lights to flash to a particular timer or to a certain colour temperature or whatever you wanted to use.

“We’ve done tests with a car scenario where someone’s gone out and shot a plate scenario.
A car was mounted with five cameras, inside and on the roof, and it recorded the environment as it drove for ten minutes down a road. Then we come back, and that imagery from all those five cameras is played onto the five screens surrounding the car, and you’re getting the true reflections of that. And not only are they acting as the reflection, but the panels, the LED screens, are so strong that they become the lighting in a sense as well.

“The car was travelling through trees, so obviously you’ve got the light coming through the trees, which is exactly like a shaft of light, so you were getting an effect of that from the screens themselves. But to augment that, the DOP put a light through one of the panels, so he gave that extra flash to give the illusion, to the person in the audience watching it, that there was a shaft of light striking the car every three or four seconds.

“The beauty of that now, because of all of this technology, is that the guys are timing the screens, they’re using the screens, which is driven by the
Disguise system and the Unreal Engine, but they’re tying it to the lighting guys or the lighting board operators.

“The beauty in using the LED screens, of course, is you’re getting true reflections all in time, in sync with the driver who is driving. And then the director, of course, is just able to back the thing up and repeat it, as opposed to driving the car down the road or closing off a road.”

According to the Avalon Studio Manager, test scenarios have involved different models of camera to ascertain compatibility with the LED set-up.

“That proved really successful,” says Richard Lander. “We’ve tested with RED cameras to know their capabilities and their lens variations on a screen. We’ve tested with ARRI cameras and Sony cameras, and Panavision cameras.

“To be fair, some cameras performed better than others in this environment, which would be true anywhere. [For example] the ARRI camera and the Sony camera have a subtle shutter adjustment within them that enables you to clear up what we would term ‘a moiré effect’ when you’re in close to a screen.”

In addition to enabling more control and reducing production costs, Richard Lander says LED stages suit the requirements of a COVID-safe workplace.

“In a COVID sense,” he says, “the programmers, or the people prepping the production, can be upstairs using that space in a comfortable environment to do all their design elements, and then it’s controlled where we can patch it to the floor straight away and to the screen. That means there’s actually fewer people on the studio floor, which is good for long form or TV projects because then you’re keeping that social distancing factor on the floor.”

For more information, contact sales@wetafx.co.nz
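The pitch-versus-resolution relationship Lander describes is simple arithmetic: a panel’s pixel count is its physical dimension divided by the pixel pitch. A minimal sketch, using the pitches quoted above and a hypothetical 10 m x 5 m wall (the wall size is an assumption, not a figure from the article):

```python
def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple[int, int]:
    """Pixels across an LED wall: physical size (in mm) divided by pixel pitch."""
    return int(width_m * 1000 / pitch_mm), int(height_m * 1000 / pitch_mm)

# Hypothetical 10 m x 5 m wall at the pitches Streamliner quotes:
for pitch in (6.9, 5.9, 3.9, 2.5, 1.9):
    w, h = wall_resolution(10, 5, pitch)
    print(f"{pitch} mm pitch -> {w} x {h} pixels")
```

At 2.5mm pitch the hypothetical wall is already 4000 pixels wide (beyond UHD), which is why the finer 1.9mm product Lander expects would push a wall of the same size well past that.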
AVE Powers ‘On-Air’ Studio with Panasonic
ALTHOUGH RESTRICTIONS ARE LIFTING in some states, the imperative to isolate through COVID-19 has thrown calendars into disarray as conferences, plays and festivals are forced to postpone, or abandon events altogether. In a bid to build industry resilience and help local business spark virtual creativity, Sydney Masonic Centre (SMC), Create Engage and Audio Visual Events (AVE) collaborated to deliver ON AIR, a state-of-the-art semi-permanent studio with audio visual and virtual event capabilities.

Located in the heart of the Sydney CBD, the studio uses three Panasonic AW-UE150 ProPTZ cameras, and the space provides users with access to leading technology to stream their message to their audiences during a time of travel restrictions and physical distancing.

Until now, production studios of this kind have been rare, with most organisations opting for a more traditional and familiar style of venue or event space for production. In the current operating climate, there is a clear appetite for these alternative spaces. Since March, the studio has been booked out by clients opting for half-day or full-day sessions. With the support of Panasonic solutions, the ON AIR studio has turned an unused conference room into a fully-booked livestream and recording studio.

AVE made the decision to incorporate three Panasonic ProPTZ (AW-UE150) cameras, operated by one central AW-RP150 controller. The AVE operations team can quickly organise each camera angle during rehearsal, improving efficiency at the operations desk as the event requires just one operator for all three 4K cameras.

General Manager at AVE, Paul Keating, emphasised that the construction and design strategy for the ON AIR studio was centred around reliability and flexibility due to the global pandemic. “The use of the Panasonic ProPTZ camera system within the studio has enabled us to deliver our clients live streams with very high production quality,” said Mr Keating.
“We have stocked a range of Panasonic ProPTZ cameras and controllers for years, and the use of pre-sets within the AW-RP150 enables perfect repeatability and control over each camera’s settings. Our technical operators enjoy working with them during shows, and clients love the results, flexibility and cost savings these products deliver.”

The technology utilised within the studio also enables operators to adhere to physical distancing measures without compromising video and image quality. “The health and safety of our team and clients has remained of utmost importance during the COVID-19 pandemic. The use of three AW-UE150 cameras with one central controller allows us to reduce the number of people required in studio, and respect physical distancing needs. This also allowed us to carefully manage client budgets at a time when many organisations are looking for efficiencies across their operations,” said Mr Keating.

Scott Cooper, Director of Sales at SMC Conference & Function Centre, also said: “The refitting of this space at the Sydney Masonic Centre has allowed us to continue supporting local artists and businesses pursuing their creative passion and utilising streaming services. With uncertainty around ongoing restrictions of business travel, some organisations have also considered hosting national and international conferences with us too. It really is an incredibly unique and versatile space which will fulfil a range of needs in the months – and maybe years – to come.”

With an impressive pipeline of upcoming events and conferences, the ON AIR studio has been receiving great reviews from its early clients, including Blackmores Group, Salesforce, Vinnies CEO Sleepout and Palliative Care Australia.

Visit: www.onairstudio.com.au and https://business.panasonic.com.au
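The one-operator workflow above hinges on preset recall from the AW-RP150. For context, Panasonic’s AW-series PTZ cameras also expose an HTTP CGI control interface, so stored pre-sets can in principle be recalled from a script as well as from the hardware panel. A rough sketch only: the IP addresses are placeholders, and the `#R` preset-recall command format should be verified against Panasonic’s interface specification for this model before use:

```python
import urllib.parse
import urllib.request

def preset_recall_url(camera_ip: str, preset: int) -> str:
    """Build the AW-series CGI request that recalls a stored preset (#R00-#R99)."""
    cmd = urllib.parse.quote(f"#R{preset:02d}")  # '#' must be percent-encoded in the URL
    return f"http://{camera_ip}/cgi-bin/aw_ptz?cmd={cmd}&res=1"

# Placeholder addresses standing in for the three studio cameras:
CAMERAS = ["192.168.0.10", "192.168.0.11", "192.168.0.12"]

def recall_everywhere(preset: int) -> None:
    """Recall the same preset number on every camera in the list."""
    for ip in CAMERAS:
        urllib.request.urlopen(preset_recall_url(ip, preset), timeout=2)
```

The point of the sketch is the shape of the workflow: a single control point (panel or script) addressing all three cameras, which is what lets one operator run the studio.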
Canon Looks to Make Another Mark with EOS R5 and R6
IT WAS SOME 12 YEARS AGO, in 2008, that Canon released its game-changing 5D Mark II. Combining video and stills in the one device, it was the first full frame DSLR on the market with a full HD movie mode built into it. Now Canon is looking to repeat history with the release of the Canon EOS R5 and EOS R6, two new full frame mirrorless additions to Canon’s EOS R System, built on the RF Mount.

The pro-level EOS R5 delivers 45 megapixel stills at up to 20fps and is the first full frame mirrorless camera to record 8K RAW internally at up to 29.97fps, plus 4K at up to 120p. The EOS R6, meanwhile, captures 20.1 megapixel stills at up to 20fps, 4K video up to 60p and Full HD at up to 120p.

According to Brendan Maher, Senior Product Manager, Consumer Imaging, Canon Australia, the EOS R6’s target market is “… a bit more focused on the stills photographer, definitely in that low light space, and event photographers, who, if they’re taking thousands of images every day, having less megapixels actually helps them because of their workflow; with smaller file sizes it makes it a little bit easier for them.

“For the R5, really seeing it obviously with those impressive video specs, it’s a bit more focused on the videographer; definitely a great camera for wedding photographers that might be picking it up as hybrid shooters, doing equal amounts video and stills. This is the camera for them because it excels in both areas. And in terms of travel or landscape shooters, 45 megapixel stills is fantastic for them because they can get a bit more flexibility to crop or to print bigger.”

Both cameras feature in-body image stabilisation which enables up to eight stops of shutter speed advantage. Using a “Co-ordinated Control Image Stabiliser”, in which the Optical IS system of RF lenses talks to the five-axis in-camera stabilising unit, the system is designed to deliver a better result than either could achieve on its own.
DIGIC X processor technology at the core of the EOS R5 and EOS R6 – the same technology as in the EOS-1D X Mark III – supports next-generation Dual Pixel CMOS AF II. Claimed to be the world’s fastest AF among interchangeable-lens digital mirrorless cameras, the EOS R5 focuses in as little as 0.05 seconds and can focus in light levels as low as -6EV. The EOS R6, meanwhile, is the first EOS camera to offer a minimum EV for AF of -6.5EV. The high-precision AF is effective even in poorly lit or low-contrast shooting conditions. The iTR AF X system has been programmed using algorithms, and its face/eye detection mode ensures subjects are kept sharp even when moving unpredictably with a shallow depth of field. Even if a person turns away for a moment, their head and body continue to be tracked.

“We’ve got eye detect auto focus and face detect auto focus,” says Brendan Maher. “Both work in stills and movie modes, and both work in servo as well, so it will track your subject around the frame as they move. We’ve got a huge amount of auto focus zones. We’ve also got the multi-controller joystick on the back of the camera for moving your AF point around the screen.” There is also an Animal Detect feature which will track dogs, cats and birds.

With built-in Bluetooth and Wi-Fi, the EOS R5 (5GHz Wi-Fi) and EOS R6 (2.4GHz Wi-Fi) can be easily connected to a smartphone and networks, allowing high-speed file sharing and FTP/FTPS transfer. This functionality also allows the cameras to be remotely controlled using the Camera Connect and EOS Utility apps, tethered to a PC or Mac via Wi-Fi or high-speed USB 3.1 Gen 2. With a camera body weight of 730g, they can be carried by a mid to large-sized drone. The EOS R5 and EOS R6 also support automatic transfer of image files from the device to the image.canon cloud platform, or integration with Google Photos or Adobe Cloud workflows.
While the Wi-Fi capability can’t be used for streaming, Canon USA is Beta-trialling a USB platform for camera connection to a PC where the Canon camera takes over from the PC’s internal webcam. “That will launch later this year,” says Maher. “So, that’s across a range of platforms you’ll be able to use, not just these cameras but pretty much any camera that we’ve launched in the last three or four years, will be able to run via USB and stream on a range of platforms. And it’ll be a free service when it does launch. It’s free to download now, the Beta version itself.”
For on-board storage, the EOS R6 uses two SD UHS-II card slots, while the EOS R5 uses one SD UHS-II slot and one CFexpress slot. “The reason behind that,” says Brendan Maher, “is if you want to shoot 8K, or if you want to shoot 4K 120p, it’s a huge amount of information, a huge amount of data, and basically you need the fastest card on the market for that. So the CFexpress card is needed if you’re shooting 8K or 4K 120p.

“8K is a pretty exciting spec. What we’ve seen over the last couple of years is display manufacturers, your TV manufacturers, are out there pushing panels into the market which are 8K, but there’s not a whole heap of content out there that you can display on those screens. So the opportunity for Canon photographers and videographers now to pick up an R5, for what is a very reasonable sort of price point compared to other 8K cameras out there on the market, and be first to market with real 8K content shot on our devices, is pretty exciting. 8K’s four times the resolution of 4K, so it’s not just about outputting 8K at the maximum resolution; you’ve also got the flexibility there to crop into your frame by up to four times and still deliver a beautiful 4K video file.”

Visit www.canon.com.au
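Maher’s “four times” claim is about pixel count: an 8K frame holds four 4K frames’ worth of pixels, so a quarter-area crop (half the width, half the height) still yields full-resolution 4K. A quick sketch of the arithmetic, using the consumer UHD rasters:

```python
# Pixel counts behind "8K is four times the resolution of 4K".
UHD_8K = (7680, 4320)
UHD_4K = (3840, 2160)

pixels_8k = UHD_8K[0] * UHD_8K[1]   # 33,177,600 pixels
pixels_4k = UHD_4K[0] * UHD_4K[1]   #  8,294,400 pixels

area_factor = pixels_8k / pixels_4k      # 4.0: four 4K frames fit in one 8K frame
linear_factor = UHD_8K[0] / UHD_4K[0]    # 2.0: twice the width and twice the height

# A crop to half the width and half the height of an 8K frame
# still contains a full-resolution 4K image.
print(area_factor, linear_factor)  # 4.0 2.0
```

So “crop by up to four times” is a frame-area figure; in linear terms you can punch in 2x and still deliver native 4K.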
IP Image Capture with Grass Valley’s LDX 100
GRASS VALLEY CONTINUES TO DRIVE the evolution of live production with the launch of the LDX 100 camera platform. This latest addition to the company’s camera portfolio is a high-speed, native UHD camera that takes a revolutionary approach to camera design. Built from the ground up with native IP connectivity, the camera delivers signals directly into the network, conveniently enabling access to them wherever they are needed – even, for example, as return feed monitoring from another camera position. The LDX 100 unlocks greater creativity and flexibility for content creators, freeing them from the physical constraints of traditional live production environments.

The LDX 100 leverages the full power and agility of IP by delivering easy, rapid set-up and configuration and the ability to add features as needed. Its unique integrated design frees up multiple rack units per camera that were previously dedicated to camera base stations, and greatly simplifies the logistics for mobile productions when having to pair the right number of cameras to CCUs at each event.

The camera is a self-contained IP device that connects directly to the network at up to 100 Gbps, allowing simple connection and discovery of audio, video and control, as well as superior PTP timing using open SMPTE IP standards. Implementation of the NMOS IS-04 and IS-05 protocols means the camera is instantly identifiable to a network control system such as Grass Valley’s GV Orbit.

At the heart of the camera is Grass Valley’s all-new Titan imager. Designed to answer the specific demands of live sports production, this 2/3-inch imager provides superior UHD resolution and HDR/WCG color reproduction at up to 3X high speed. The camera easily pairs with industry-standard B4 lenses to provide the zoom range and depth of field needed in sports without the optical gymnastics of other competitive solutions. This new solution allows production teams to create rich, breathtaking images in full raster native UHD without compromising on the storytelling that viewers demand.

Further matching the needs of live event production, the LDX 100 has a new mechanical design that provides ergonomic support for operators lifting and carrying equipment, at-a-glance state indicators for operational status, and “Champagne shower proof” protection against moisture and dust. A flexible software option – available on a temporary or permanent basis – allows customers to leverage extensive upgrade capabilities and only pay for features when they are required.

Visit https://www.grassvalley.com/

Telemetrics Adds AI and Facial Tracking to Automatic Shot Correction

TELEMETRICS INTRODUCED its reFrame Automatic Shot Correction technology as a key feature of its RCCP series camera control panel in 2017, allowing operators to lock cameras onto the talent and automatically trim the shot without having to touch the controls. With its latest RCCP-2A Robotics and Camera Control Panel, Telemetrics has continued to improve its control systems with new artificial intelligence (AI) and facial recognition algorithms that allow the RCCP to track several people with multiple cameras simultaneously.

“We’re layering object-tracking on top of our facial tracking, so reFrame becomes more robust,” said Michael Cuomo, Vice President of Telemetrics. “When a person turns around, we can still track them because the system intuitively knows that that face and body was there before. Other systems can’t do this.”

New rules for how the system behaves have also been added, enabling the operator to determine when they want reFrame to track an object or person and when they don’t. The new improvements have also facilitated better pan/tilt/zoom and focus capabilities. Telemetrics offers reFrame on its RCCP-2A control panel with either the optional Studio (STS) or Legislative (LGS) software packages. The technology ensures that on-screen talent remains in frame despite their moving about or slouching in their seat. With its AI and facial recognition algorithms, the software can run a production in a fully automated mode without operator intervention. Also, the functionality of the control panel’s top row of eight TeleKeys is dynamic, adapting to the operator’s commands as they define predetermined actions to run a production in fully automated mode, with the control panel automatically switching and adjusting cameras as required.

Meanwhile, RCCP STS Studio Software, predominantly used in TV production studios, includes a Studio View feature that enables users to work with a 3D display of their studio environment and quickly see the location of Telemetrics robotics and set pieces within that space. A 3D model of the user’s studio can be imported into the panel to be processed and integrated with the software, providing complete positional awareness of all studio robotics. The STS control panel also features a selectable Robotics Status display to show all major robotics parameters – such as pan/tilt angles, elevation height, rotational status and lens information – for any selected camera, at a glance. Quick access to custom overlays, stored pre-sets and built-in intelligence for all robotic cameras within a system allows the user to easily select the best camera angles, and add graphics, with one button press. The system also includes the ability to internally store snapshot thumbnails for easy shot identification, as well as a live video preview output.

Telemetrics’ LGS Legislative Software bundle is now used by dozens of governmental agencies and departments around the world to cover regular proceedings and keep the public informed. An operator with limited production skills can pick any subject within a large chamber they want to feature on camera and store their facial profile into the system. The operator then simply pushes an icon for that person on the control panel’s touchscreen interface, and the system automatically moves the corresponding camera to that person, focuses and trims the shot, and inserts the required title graphic. The LGS software also includes the Robotics Status display.

Visit www.telemetrics.com.
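The NMOS IS-04 registration described for the LDX 100 above is what lets a control system find the camera on the network: an IS-04 registry exposes an HTTP Query API that any client can interrogate for registered nodes and devices. A minimal sketch under stated assumptions: the registry address is a placeholder (real deployments typically discover the registry via DNS-SD), and `labels` just pulls the human-readable label field defined by the IS-04 resource schema:

```python
import json
import urllib.request

# Placeholder registry address; real systems discover this via DNS-SD.
REGISTRY = "http://registry.example.local:8080"

def list_registered_devices(api_version: str = "v1.3") -> list[dict]:
    """Query an IS-04 registry's Query API for every registered device."""
    url = f"{REGISTRY}/x-nmos/query/{api_version}/devices"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

def labels(devices: list[dict]) -> list[str]:
    """Extract the human-readable label from each IS-04 device resource."""
    return [d.get("label", "") for d in devices]
```

A control system like GV Orbit performs essentially this discovery step (plus IS-05 connection management) behind the scenes, which is why a natively registered camera appears in the system without manual configuration.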