VIRTUAL PRODUCTION
MEDIA IN MOTION: UNDERSTANDING TRACKING SYSTEMS
In the fourth part of this series, Matthew Collu looks in-depth at two of the more prominently used types of motion capture in a virtual production environment
In recent decades of digital storytelling, whether in gaming or filmmaking, the once-incredible technological feat of performance capture has become commonplace in our virtual ecosystems. The subtlety and nuance of a completely digital, fantastical avatar once had to be added meticulously by an artist, or existed only as a retrofuturistic concept belonging to the far reaches of a fictional future. Nowadays, it’s not only a viable means of content creation but an accessible one too, with no better example than the new virtual production workflow.
Over the years, as this technology has developed to serve various purposes in various industries, different products and pipelines have hit the market. All have attractive benefits and intriguing use cases, but perhaps a quick definition of the practice would prove helpful.
Motion capture is a way of tracking and recording the movement of an object – sometimes a prop or part, other times a living creature like you or me. This is done in a couple of ways. The most common for in-studio production uses infrared light and small reflective markers that bounce that light back to specialised cameras collecting the data. The second is a much more recent solution: inertial sensors (accelerometers and gyroscopes) are placed across the surface of an object, and their readings are translated back into rigid-body data. This allows wireless motion capture, but the capture can be far less detailed.
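Why can the inertial approach be less precise? Because it reconstructs position by integrating sensor readings, so tiny measurement errors compound over time. A toy sketch of that effect (all numbers are invented for illustration, not taken from any real sensor):

```python
import numpy as np

# A stationary sensor sampled at 120 Hz for 10 seconds: the true
# acceleration is zero, but each measurement carries noise and a
# small constant bias (toy values, not a real IMU specification).
rng = np.random.default_rng(0)
dt, n = 1.0 / 120, 1200
measured = rng.normal(0.0, 0.02, n) + 0.005   # m/s^2

# Double integration turns tiny acceleration errors into a position
# error that compounds with every sample.
velocity = np.cumsum(measured) * dt           # m/s
position = np.cumsum(velocity) * dt           # m
drift = abs(position[-1])                     # non-zero despite no motion
```

Optical systems avoid this compounding error because every frame is an absolute measurement of marker positions rather than an integral of past samples.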
Motion capture is thus a workflow with a simple definition, but with numerous use cases, types and utilisations across the creative spectrum. Simply being able to capture the movement of something through a volume of space and to transfer that data in real time to a digital counterpart presents us with an entire universe of potential.
“While there are a number of existing options for camera tracking, the one to be selected is one that can be deployed without limiting production,” says The Other End’s Head of Virtual Production, Kevin McGeagh. “Crews need to be comfortable performing their duties without interference while still ensuring a stable, accurate output.”
Without some kind of motion capture solution, there is no reliable way to track the camera’s position, which means no reprojection of its perspective on an LED wall. In practice, you will rarely be left with no tracking at all; the real dilemma is one of choice. Of the many pipelines forged during the advent of this technology, a couple have proven themselves the winners for reliable use in studio scenarios. So instead of explaining every motion capture solution, I’ll focus on the two most prominently used types in a virtual production environment: outside-in and inside-out.
Outside-in
The most widely understood and used solution type, outside-in motion capture is the workflow we’ve all come to associate with the topic. Motion capture cameras are scattered around a volume, all pointed inward to track whatever is marked and moving through a cloud of infrared light. Companies like OptiTrack and PhaseSpace are the largest providers of this kind of workflow, which is used well beyond virtual production – from gaming to visual effects, it is found in some of the largest motion capture studios worldwide. It is precisely what the name suggests: motion is captured by outside sources looking in on a volume.
However, this means you’re at the mercy of something every good director always wants – coverage. Looking inwards at a select group of markers, the solution can only track what it sees. If anything occludes those markers, tracking consistency erodes quickly, and you’ll bounce around like a ping-pong ball in a paint shaker if you’re not careful.
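The occlusion problem can be made concrete with a little geometry: an outside-in system can only triangulate a marker from the cameras that currently see it, and with fewer than two clear views there is nothing to intersect. A minimal sketch, with an invented camera layout (this is illustrative, not any vendor’s API or solver):

```python
import numpy as np

def triangulate(rays):
    """Least-squares intersection of rays given as (origin, direction) pairs."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for origin, direction in rays:
        d = direction / np.linalg.norm(direction)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ origin
    return np.linalg.solve(A, b)         # singular with fewer than 2 distinct rays

marker = np.array([0.0, 0.0, 1.5])       # true marker position (metres)
cameras = [np.array([3.0, 0.0, 2.5]),
           np.array([-3.0, 0.0, 2.5]),
           np.array([0.0, 3.0, 2.5])]

# Only unoccluded cameras contribute rays; here all three see the marker.
visible = cameras
rays = [(cam, marker - cam) for cam in visible]
estimate = triangulate(rays) if len(rays) >= 2 else None
```

Shrink the `visible` list to a single camera and the system of equations becomes singular – that is the mathematical face of the coverage problem.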
Inside-out
More recently developed, inside-out motion capture is the same type of solution found in new-generation VR headsets. Rather than infrared cameras pointing in towards an allocated space with markers placed within it, cameras face outwards and track infrared light bouncing around the space; the points of contrast that make up a room can be tracked by how and where that light is bounced.
Another, more advanced and consistent system that leverages this kind of solution is Mo-Sys’ StarTracker. It places markers on the ceiling above the tracked object and computes the object’s position relative to those points, mapping them out in real time. Both approaches follow the same principle but differ in how they capture the space.
Despite seeming to be the opposite of outside-in, inside-out systems have the same fundamental issue: they need enough of something to look at. If there aren’t easily identifiable points around them, they can’t determine their own location. This becomes a real concern with an LED wall accompanied by a ceiling LED rig. I know the sky is usually where you find the stars, but in this situation, not so much. Again, though, this is an easy fix once you understand the inner workings of each kind of solution.
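A system like StarTracker can be thought of as solving the reverse problem: given a surveyed map of fixed reference points (the ceiling stickers) and where the camera currently observes them, recover the camera’s own pose. A minimal sketch of that idea using the Kabsch algorithm – purely illustrative, not Mo-Sys’ actual solver, with all coordinates invented:

```python
import numpy as np

def estimate_pose(map_pts, cam_pts):
    """Rigid transform (R, t) with cam ~= R @ map + t, via the Kabsch
    algorithm. Needs at least three non-collinear reference points:
    with too few points to look at, the pose is ambiguous."""
    mc, cc = map_pts.mean(axis=0), cam_pts.mean(axis=0)
    H = (map_pts - mc).T @ (cam_pts - cc)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cc - R @ mc

# Four surveyed ceiling stickers (map frame, metres) -- toy values.
ceiling = np.array([[0, 0, 4], [1, 0, 4], [0, 1, 4], [1, 1, 4.2]], float)

# Ground-truth pose used to synthesise the camera's "observations".
a = np.radians(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 2.5])
observed = ceiling @ R_true.T + t_true

R_est, t_est = estimate_pose(ceiling, observed)
```

Because every solve is measured against fixed, known points, the pose is absolute each frame – but only as long as enough of those points remain visible.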
Much as virtual production is a product of astute reconfiguration and inventive, creative problem-solving, so too are the many motion capture solutions. This broad term covers a multitude of ways to attack a creative endeavour, and even more ways to bring something from synapses to screen. The real challenge is choosing the solution that works best for you.
Matthew Collu is Studio Coordinator at The Other End, Canada. He is a visual effects artist and cinematographer with experience working in virtual production pipelines and helping production teams leverage the power of Unreal Engine and real-time software applications.
NEW LAUNCHES AT IBC2022
Villrich Broadcast brings PTZ workflows into focus
Villrich Broadcast will demonstrate several solutions that can be combined with a PTZ workflow. One of the systems displayed will be from Canon and Seervision. This setup integrates a Canon CR-N500 PTZ with the Seervision Suite. The Suite integrates directly with a wide variety of pan/tilt heads and PTZ cameras, and its API allows users to keep their existing systems and carry on their established video production workflow.
Seervision focuses on automating camera movements through AI-assisted camera operation. The software developed for this camera automation is designed to make live video production effortless. Powered by computer vision and machine learning, the Seervision Suite enables existing robotic cameras to autonomously track the subject and keep it in optimal framing.
The Suite also helps integrate full subject tracking and framing. With Seervision's Adaptive Presets, the resulting shots are dynamic and fluid. By expanding the reach of autonomous cameras, the system helps production companies create more compelling content in a safer work environment with fewer people, at an affordable price.
Stand 12.A22

New Kiloview products to boost IP transition
Kiloview will debut N50, the latest in its range of high-quality, low-latency NDI converters. A 12G-SDI bi-directional converter for both SDI to NDI and NDI to SDI, the N50 supports the direct transmission of USB to NDI signals by integrating both NDI and NDI|HX (including NDI|HX3) into one converter.
This all-encompassing solution is intended to streamline and simplify processes in any professional IP-based video transmission environment, including broadcast, news gathering, sports and live events, as well as facilitate the transition from SDI environments to IP-based ecosystems.
Also on show will be the CUBE X1, an embedded device focusing on NDI multiplexed distribution.
Stand 7.B60
Ateliere solutions help modernise OTT workflows and supercharge cloud content libraries
Ateliere Creative Technologies, a major player in media supply chain solutions, is inviting attendees to experience Ateliere Connect and Discover.
The cloud-native, multitenant SaaS solution enables customers to manage media assets and the media supply chain from concept to consumer at faster speeds. New advancements in the Ateliere core technology tenets, Deep Analysis and FrameDNA, allow customers to automate deduplication of large content libraries and archives, reducing overall storage by up to 70% as well as reducing cloud costs and carbon footprint.
Ateliere solutions run on Amazon Web Services (AWS) and connect directly to cloud services. Leveraging powerful cloud parallel processing, customers can onboard massive content libraries in a matter of days as opposed to months, as well as scale to support significant bursts in workload at a moment’s notice.
Ateliere will demonstrate Ateliere Connect, which promises a better way to manage content libraries and connect a media supply chain, and Ateliere Discover, where content owners can power up a fully branded streaming service and orchestrate delivery across their own branded video app or major OTT marketplace in a short period of time.
Ateliere partner Synamedia, one of the world’s largest independent video software providers, plans to showcase the latest Ateliere solutions on stand 5.A69.
Stand 5.G12
Telstra offers superlative broadcast delivery with next-gen IDN
Telstra Broadcast Services (TBS) will launch the next generation of its cutting-edge IDN (internet delivery network) at IBC, to cater to the specific needs of international broadcasters for major event delivery.
It expands on the capabilities of this network, based on internet standards with a web-first approach to delivery for broadcasters, and features more automation for efficiency, greater network flexibility, higher bandwidth and lower latency.
Telstra’s IDN is a software-defined, cloud-based platform enabling the transport of high-quality video content and live broadcast streams to any registered endpoint across shared networks like the public internet. It features over 40 sites spread over EMEA, APAC and North America.
The IDN upgrade follows TBS joining the SRT Alliance, a collaborative community of industry leaders and developers striving to achieve lower-latency internet video transport, last December. This collaboration means TBS is able to more effectively provide broadcasters of any size with the most flexible, cost-effective and robust ways to manage their content.
The IDN rounds out TBS’ stable of network connectivity solutions, giving customers the flexibility to choose the delivery method that works best for them – satellite, fibre or the internet.
Stand PD-17
More portability with the new Phabrix QxP
Phabrix will showcase the QxP, which inherits the flexible architecture and extensive workflow support of the QxL rasteriser. It incorporates a 3RU multi-touch 1920x1200 7” LCD screen and is equally at home on-set in SDR/HDR, grading and shading applications, as well as in QC, MCR, engineering and R&D environments.
The latest addition to the Qx series, the QxP’s extensive feature set is headlined by its capacity for 25GbE UHD IP workflows and its suitability for portable operation due to an optional V-mount battery plate. Incorporating touchscreen control and external HDMI for a rasteriser output with up to 16 instruments, the QxP has been designed to provide all of the capabilities required for flexibility and ease of configuration when working with both existing and emerging production infrastructures. It is thus suitable for workflows involving HD, UHD, SDR, HDR, SDI and IP, as well as conventional and remote productions.
The QxP comes with extensive IP support including SMPTE ST 2110 and ST 2022-6, along with compatibility with a wide range of 444 and 422, YUV and RGB video formats. For real-time IP production, it provides support for the generation and analysis of HD/3G/UHD/EUHD ST 2110 payloads on generic SFP28/25GbE interfaces.
Stand 10.C01
GatesAir to launch FM analogue transmitter at IBC
GatesAir has introduced Flexiva GX, a new FM analogue transmitter line offering radio customers an optimised power-to-size ratio at power levels up to 10kW. Using the latest LDMOS technology, GatesAir has packed power density into a compact 5RU chassis, providing broadcasters with 5kW and 10kW FM solutions that deliver an overall efficiency rating of up to 76%.
The engineering breakthroughs in power density, efficiency and footprint are made possible by the Flexiva GX’s design, enhanced by GatesAir’s third-generation PowerSmart high-efficiency transmitter architecture. While GatesAir continues to support HD Radio and DRM+ digital radio within its Flexiva FM transmitter range, the Flexiva GX series is built for customers fully focused on the benefits of modern high-efficiency solid-state technology.
Flexiva GX transmitters support N+1 configurations, enabling large national network operators to build flexible, consolidated transmission sites that meet stringent uptime requirements.
Stand 8.C73
EditShare to reveal power of hybrid workflows
EditShare will showcase how its latest technologies boost quality and efficiency for production and post-production at IBC 2022. The latest suite of EditShare FLEX software solutions for cloud and hybrid workflows will have its first showing in Amsterdam.
FLEX reflects business trends, including the migration to a ‘work anywhere’ environment, with ready access to content wherever creative staff need to be. In adopting cloud storage and processing, it also meets the move toward an OpEx financial model, with cloud hosting and storage fees flexing to reflect the level of business.
EditShare’s EFS Multi-Site will also be on the booth for the first time. This allows users with multiple locations to leverage built-in file acceleration to synchronise project storage between EFS clusters in different facilities, ensuring ready access to content wherever they choose to work.
Stand 7.A35
5G transmission and decoding with Dejero at IBC
Dejero will introduce its new EnGo 3 and EnGo 3x 5G mobile transmitters with an integrated internet gateway at IBC 2022, along with its new WayPoint 3 receiver. The EnGo 3 and EnGo 3x combine 5G performance with video quality for broadcast and media production markets, with EnGo 3x supporting 4K UHD transmission and multi-camera production. WayPoint 3 reconstructs video transported over multiple IP connections from select Dejero transmitters, decodes HEVC or AVC, and outputs resolutions of up to 4K UHD.
Both transmitters are designed with brand-new RF and 4×4 MIMO antenna architecture to unlock the full potential of 5G connections and ensure optimal antenna isolation. Field crews can use them to transmit broadcast-quality video, quickly transfer large files and instantly set up high-bandwidth access points for multiple devices. In newsgathering and remote production scenarios, GateWay mode can be used to transfer files, connect to newsrooms and MAM systems, and publish content on social media, among other things.
Stand 2.B51
Digital Nirvana in a SaaS Trance 4.0
Digital Nirvana has announced an upgrade to its Trance self-service SaaS application, to be released at IBC. Trance 4.0 works within a media company’s own workflow to automatically generate transcripts, closed captions/subtitles and translations for content localisation. Both enterprises and individual users will benefit from new features and capabilities, including elaborate account management, modular rights for users and an advanced pro-captioning window. It can be used as a combined or individual tool for transcription, captioning, text localisation or conformance of existing captions.
The improved transcription window can now be used as a stand-alone app instead of as part of a three-step transcription, captioning and subtitling workflow that only generates output after all steps are complete. In addition to media and entertainment content, users will be able to get interviews, meetings, podcasts and more quickly transcribed, reviewed and exported in different formats.
The pro-captioning window has undergone a major change to make it both user-friendly and efficient. Machine-generated timecodes can now be adjusted using spectrogram and manual inputs. The stand-alone pro-captioning window allows users to upload media, generate automatic captions and display them in the pro-captioning window using presets defined by the user or the organisation.
An extension of the pro-captioning window, text localisation will now be offered as a Trance module.
Stand 7.A28