Content+Technology ASIA September-October 2020


SEPTEMBER/OCTOBER 2020

PP: 255003/06831

MEDIA + PRODUCTION + MANAGEMENT + DELIVERY www.content-technology.com

MALAYSIAN FOOTBALL

Back on the Pitch with Ideal Live

Scan the QR or visit www.candt.asia

Print & Online Publication / Weekly Newsletter / Podcasts / Product Updates / Event Discounts

YOUR FREE APAC INDUSTRY MAGAZINE ISSN 1448-9554




SUBSCRIBE FREE

Sportscasting ... page 18.

REGULARS
02 EDITOR'S WELCOME
04 NEWS – ASTRO Reports Improved Results; KBS Upgrades TV Production at Seoul HQ with Lawo; ATEME Powers Converged Headend for India's GTPL KCBPL; Globecast Distributes Euronews English HD in Asia, Oceania; Sony Brings New Vision to Taiwan's CTS.

FEATURES
10 CONNECTECHASIA/BROADCASTASIA Preview.
14 ACQUISITION – The Studio is the New Location: WETA, Avalon and Streamliner Launch Virtual Production Service; New Fujifilm Premista Lens; Sony Introduces 'Cinema Line'; Canon Makes Another 'Mark' with EOS R5 and R6.
18 SPORTSCASTING – Malaysian Football Back on the Pitch with Ideal Live; Tuning Into the 5G Future for Broadcast; Dream Chip Super Slow Motion Camera; Tedial Boosts Automated Sports Production.
22 NEWS OPERATIONS – AFP Captures Chengdu Consulate Closure with AVIWEST; IBC Recognises News Organisations of the World; Ideal Systems Launches APAC Rental Service for Zoom Rooms; Sony Expands Remote, Virtual and Distributed Production Solutions.
26 MEDIA IN THE CLOUD – OVSN – the Power of Remote Workflow; Cantemo Portal 4.3 Delivers More Control; Silver Trak Expands MediaTrak Workflow Solution; Arvato Relaunches Vidispine; NAGRA Watermarking Leverages AWS; and more …
30 POST-PRODUCTION – Tokyo's NiTRo Streamlines 8K Editing Workflows with AJA; Intel oneAPI Rendering Toolkit; OTOY NVIDIA A100 GPU Nodes on Google Cloud for RNDR; ETC Publishes Specs for Naming VFX Image Sequences.
34 AUDIO – Stage Tec AURUS platinum is Backbone of KBS Hall; Indonesia's 86 INC Upgrades with Clear-Com; DirectOut PRODIGY Evolves to Next Level; Pro Tools "Configurator" from Qvest Media; Logitek Adds Dante.
38 RADIO – Stagetec Asia Transforms Taiwan FM Radio; Spokenlayer Partners with SCA; Euro Deadline for Digital Radio in Cars; GatesAir Integrates AOIP Transport within Radio TX; Digigram's IQOYA CONNECT.
42 CONTENT DELIVERY – Telstra Extends Global Media Network to China and India; Village Island Brings JPEG-XS to VICO Converters; China Mobile's MIGU Picks VisualOn for Streaming; Embrionix and the Riedel Connection; BirdDog Flex NDI Processors; Blackmagic's ATEM Mini Pro ISO.

CONTENT+TECHNOLOGY ISSN 1448-9554 PP: 255003/06831
Broadcastpapers Pty Ltd (ABN 34 095 653 277), PO Box 165, Surry Hills, NSW 2010, Australia www.broadcastpapers.com
PUBLISHER: Phil Sandberg Tel +61 (0)2 4368 4569 Mob +61 (0)414 671 811 papers@broadcastpapers.com
ADVERTISING MANAGER: Adam Buick Mob +61 (0)413 007 144 adam@broadcastpapers.com
PRODUCTION MANAGER: Lucy Salmon Mob +61 (0)412 479 662 production@broadcastpapers.com
DESIGN & LAYOUT: Wide Open Media Mob +61 (0)419 225 348 www.wideopenmedia.com.au
PRINTING: DAI Rubicon, Singapore

COPYRIGHT NOTICE: All material in Broadcastpapers’ Content+Technology Magazine is protected under Australian Commonwealth Copyright Laws. No material may be reproduced in part or whole in any manner without prior consent of the Publisher and/or copyright holder. DISCLAIMER: Broadcastpapers Pty Ltd accepts no responsibility for omissions or mistakes made within, claims made or information provided by advertisers.


EDITOR’S WELCOME

Going the (Social) Distance
By Phil Sandberg

AS YOU MAY OR MAY not be aware (because it's been a hell of a year), the U.S. National Association of Broadcasters recently announced that its 2021 NAB Show, previously scheduled for April 11-14 in Las Vegas, is now scheduled for October 9-13, 2021, making it yet another event on the industry calendar trying to stay ahead of the COVID-19 curve and hoping that large-scale tradeshows can make a return in the not too distant future.

"We have witnessed growing concern and uncertainty over what the next six months will bring," said NAB President and CEO, Gordon Smith, "enough that there appears to be a good deal of reluctance around participating in large events in the first half of next year. The pandemic remains a significant threat and the evidence suggests it will be well into next year before it could be under control in the U.S. We also have our own concerns around being able to deliver the type of event in April that will not only drive results, but one that can be produced safely for all involved and without significant limitations on the experience."

According to the NAB, the organisation has also extensively surveyed its "show community" to collect feedback on its digital initiatives, ascertain how the COVID-19 pandemic is affecting exhibitor and delegate attendance plans, and canvass alternative dates for the event.

Of course, its "show community" doesn't seem to have included the NAB's transatlantic colleagues over at IBC HQ, whose Amsterdam-based show is usually held in September. It also creates logistical issues for the NAB's own New York event. Then there are the other industry events which, given the state of the COVID-19 pandemic globally, will be looking, if they are not already, to push their schedules out from what was always an optimistic/naïve first half of 2021.

The GSMA recently announced a rescheduling of its Mobile World Congress series for 2021. You might recall the Barcelona MWC being the first large trade event this year to cancel as a result of COVID-19. MWC21 Shanghai (where the pandemic situation is better than the world average) will now take place from 23-25 February 2021, and MWC21 Barcelona will now take place between 28 June and 1 July 2021.

Of course, there will be another round of virtual trade shows amongst this traffic jam, and these will likely bump into each other again as event organisers wait, again, until the last moment to pull the pin on their physical events.

Hastily cobbled together in the initial period of the pandemic, virtual trade show platforms are growing in sophistication - and usefulness. While NAB Express and the IBC Showcase were pretty basic - and came with a questionable ROI for exhibitors (especially those who could reproduce the experience in-house) - the ConnecTechAsia/BroadcastAsia platform, which will be online throughout the year, displays a user experience more akin to the physical tradeshow. The same goes for the IP Oktoberfest from the Alliance for IP Media Solutions (AIMS), which has opted to leverage existing online videoconferencing technologies. These types of platforms do serve as a viable alternative to physical events and should remain part of the mix in the longer term in a, hopefully, post-COVID world.

I understand the optimism of the trade show organisers, but I don't share it. The large, international shows are going to be diminished for some time, especially without a widely available vaccine. More likely is a revival of smaller, local events in those territories where the virus is under some form of control. Bubbles will "pop up" where neighbouring territories have confidence in each other's health regimes - e.g., Singapore and Malaysia, Australia and New Zealand - and even between bubbles. For the foreseeable future, the global events won't be returning to their glory days. But, hey, I'm happy to be proved wrong.

THE REAL ISSUE THAT EVERYONE IS MISSING
Of course, in all the hand-wringing over tradeshow cancellations and rescheduling, one point has been largely overlooked - it is the financial health of the end users of technology we should be worried about. They are the people and companies who ultimately finance tradeshows, exhibitors and even this publication. We should be helping them to adapt to this new environment with solutions, advice and support (and that also means supporting, not undermining, existing customer relationships with distributors. Like the tradeshows, the future of customer relationships is local).

In the end, however, the only thing that will get the industry back on track is if we all do our bit to eradicate this virus - wear a mask, practise social distancing. The virus is real. Once that situation has stabilised, the economy will follow.

Thanks for reading,
Phil Sandberg - Editor/Publisher
papers@broadcastpapers.com
+61 (0)414 671 811

2020 C+T DEADLINES

ASIA EDITION
NOVEMBER-DECEMBER 2020 – Editorial Submissions: 16-10-20; Ad Bookings: 23-08-20; Ad Artwork: 26-08-20

AUSTRALIA/NEW ZEALAND EDITION
NOVEMBER-DECEMBER 2020 – Editorial Submissions: 23-10-20; Ad Bookings: 26-10-20; Ad Artwork: 02-10-20

For more information: www.content-technology.com +61-(0)414 671 811 Email: papers@broadcastpapers.com



NEWS + PEOPLE

Malaysia's Astro Records Q-o-Q Improvement

PAY OPERATOR Astro Malaysia Holdings Berhad has announced result highlights for the second quarter of the financial year ending 31 January 2021, showing improvements on Q1FY21 despite the wider economic effects of the COVID-19 pandemic. The results included:
• Revenue +4% q-o-q to RM1.09bn (approx. USD$264,402,717)
• EBITDA +13% q-o-q to RM372mn
• PATAMI +81% q-o-q to RM134mn; Normalised PATAMI +13% q-o-q to RM121mn
• Higher dividend of 1.5 sen per share (Q1FY21: 1 sen per share)
• Pay-TV ARPU moderated to RM98, primarily due to a one-off Sports Pack rebate
• Go Shop posted record quarterly revenue: +52% q-o-q to RM145mn

Tun Zaki Azmi, Chairman of Astro, said: "With the gradual restarting of economic activities, we saw encouraging signs of recovery in the second quarter. Astro's balance sheet remained strong as it continued to be cash generative, cost disciplined and proactive in its capital management. Despite this, we remain prudent in view of the uncertainties arising from the pandemic. Nevertheless, the Board has declared a higher dividend of 1.5 sen in Q2FY21 (Q1FY21: 1 sen per share)."

Henry Tan, Group Chief Executive Officer of Astro, said: "Playing our role as a responsible corporate citizen, we acted quickly amid the challenging operating environment to serve our customers and nation as well as to ensure the well-being of our employees. We delayed several initiatives including upselling, installations and production of live reality shows. Go Shop pivoted to offer fresh and frozen food and saw record performance. The Q2FY21 results included the full effect of the one-off Sports Pack rebate given to our loyal Sports customers. We believed this was the right thing to do – providing access to other content and compensating for the absence of live sports."

"Operations are now close to full capacity. Installations have resumed and we see the Astro Ultra Box continuing its growth trajectory, with over 100k boxes installed, up 60% over three months. Customers' payment trend is also encouraging, and local productions, global live sports and on-ground events are resuming."

Home Cinema on Astro First, the company's home cinema proposition, saw great interest with a 40% y-o-y jump in Pay-Per-View buys in the first half of the year, and the positive trend is expected to continue. However, with advertisers holding back spending and a pause in the production of Astro's live reality signature content during the MCO (Movement Control Order), Adex in Q2FY21 declined by 11% q-o-q to RM80mn. Radex (radio), TV Adex and Digital Adex share stood at 81%, 38% and 2% respectively. Adex started to improve from June with the resumption of signature shows, live productions and on-ground events.

On the outlook, Henry Tan says, "Astro remains cautious in the second half of FY21 due to prevailing uncertainties amid the pandemic, structural changes in the media industry and the ongoing acts of piracy. We are mindful of the potential impact on consumers' disposable income and sentiments when the loan moratorium ends. We will continue to support our commercial customers, who are still affected by the ongoing RMCO (Recovery Movement Control Order)."

Visit www.astro.com.my



IN BRIEF

AUSTRALIA: MediaHub Australia is providing key playout services for the PacificAus TV initiative, launched to make available 1000 hours of Australian content per year for three years to Pacific broadcast partners in Papua New Guinea, Fiji, Vanuatu, Solomon Islands, Kiribati, Tuvalu and Nauru. Visit www.mediahubaustralia.com.au
-----------------------------------------------------------------
CHINA: Nokia recently completed its first 5G standalone (SA) call on a live network with Chinese state-owned telecommunications operator China Unicom. By bypassing non-standalone (NSA) 5G core deployments, Nokia was able to deliver a 5G SA core for China Unicom and enable functions such as network slicing capabilities. Nokia deployed its Unified Data Management, Shared Data Layer (5G Unified Data Repository), and Cloud Mobile Gateway (from its Cloud Packet Core portfolio) for the 5G Session Management Function and User Plane Function. This was complemented by Data Refinery charging and NetAct network management, with all products deployed in the cloud using the Nokia CloudBand cloud infrastructure and management solution. Visit www.nokia.com
-----------------------------------------------------------------
HONG KONG: HK's Office of the Communications Authority (OFCA) has launched the Subsidy Scheme for Encouraging Early Deployment of 5G. The Scheme is open until 30 November 2020 on a first-come-first-served basis and aims to encourage early deployment of 5G technology to foster innovation and smart city applications. The Government will subsidise 50% of the cost directly relevant to the deployment of 5G technology in an approved project. Visit www.ofca.gov.hk
-----------------------------------------------------------------
INDIA: NAGRA has exceeded 1.5 billion multi-DRM licenses served by the cloud-based NAGRA Security Services Platform (NAGRA cloud.SSP) for the Airtel Xstream app. Airtel Xstream is the digital entertainment platform of Bharti Airtel, India's largest integrated telecom company. Airtel (through its subsidiary Wynk) selected the NAGRA platform and its multi-DRM solution in September 2019 to secure video streaming content on the Airtel Xstream app. Visit https://dtv.nagra.com

MALAYSIA: Skyline Communications has announced that Malaysian satellite operator MEASAT Satellite Systems has recently deployed its DataMiner platform for end-to-end multi-vendor network management of its customised VSAT satellite solutions. MEASAT can now deliver high-quality, real-time data sharing to support its regional customers. Visit www.skyline.be
-----------------------------------------------------------------
NEW ZEALAND: Singtel-Optus has announced a contract with Airbus Defence and Space for a new OneSat software-defined satellite, Optus 11, to be deployed for Australia and New Zealand in 2023 at the current Optus D1 orbital location of 160° East. The satellite will be fully configurable in space, meaning its location, coverage, bandwidth and capacity can be changed in orbit as customer demands evolve. Optus has also entered into a revised agreement with Sky New Zealand, which will be the cornerstone customer leveraging the new satellite. Visit www.optus.com.au
-----------------------------------------------------------------
TAIWAN: WarnerMedia Entertainment Networks & Sales has secured another four new partners in Taiwan for the distribution of HBO GO. Its regional streaming service is now available to subscribers of Kbro, TWM Broadband, CABLE GIANT CATV and Pingnan CATV. This follows an initial launch in April with Taiwan Broadband Communications (TBC). Visit http://www.hbogoasia.com
-----------------------------------------------------------------
THAILAND: PlayBox Neo has announced completion of an HD parliamentary broadcast playout system for the National Assembly of Thailand in Bangkok. The installation includes ProductionAirBox Neo, TitleBox Neo and CaptureBox Neo. Partners in the project were Thai system integrator Broadcast Audio Service Company Limited and Bangkok-based distributor Mahajak Development Company Limited.



ATEME Powers Converged Headend for India's GTPL KCBPL

ATEME, THE PROVIDER of video delivery solutions for broadcast, cable TV, DTH, IPTV and OTT, has announced that GTPL KCBPL (GTPL KCBPL Digital Cable TV & Broadband), the Indian entertainment, education and interactive services company, has chosen to implement ATEME's TITAN Live solution for its cable TV and OTT platform. A leading multi-system operator in West Bengal and Odisha, GTPL KCBPL has recently upgraded its headend platform to complement its existing cable service offering, which already includes distribution of the popular Channel One group. ATEME's TITAN Live is being used to encode and transcode channels at the video headend to deliver a high-quality cable TV offering to GTPL KCBPL customers. The solution provides GTPL KCBPL with major benefits, such as:

• High video quality: TITAN Live maximizes workflow efficiencies and ensures exceptional video quality even at very low bitrates.
• Operational flexibility and scalability: TITAN Live's full software approach simplifies deployments and operations such as profile reconfigurations, automatic switchovers, or fast updates. It easily integrates with any ecosystem and eases on-field operations.
• Reduced headend CAPEX: as a pure software-based solution with optimized storage resources, TITAN Live can run on any COTS or virtualized server with the same benefits and value proposition.

Bijoy Agarwal, MD, GTPL KCBPL, commented: "After thorough evaluation, we have selected ATEME's TITAN Live solution as it offers the best video quality with more efficient bandwidth saving than any other vendor. ATEME also provides us with unparalleled support, a flexible system architecture and operational simplicity."

Visit www.ateme.com

KBS Upgrades TV Production at Seoul HQ with Lawo

KOREAN BROADCASTING SYSTEM (KBS), South Korea's public broadcaster, has upgraded TV production facilities at its headquarters in Seoul with IP technology from Lawo as part of an extensive renovation project. The installation, which includes two new Lawo consoles – an mc²96 Grand Production Console and an mc²36 "all-in-one" console – is part of KBS' move to a completely new IP-based production workflow.

The mc²96 and mc²36 mixers are native IP consoles, with RAVENNA / AES67 built in for industry-standard networking, flexibility, scalability and efficiency. The new setup also includes Lawo DALLIS stagebox interfaces and a Nova routing system. System integration was provided by Lawo's Korean partner, Dong Yang Digital.

The 88-fader mc²96, Lawo's flagship production console, is now installed as KBS' main mixing console in the completely renovated production complex. Optimised for IP-based video production, it boasts more than 300 DSP channels plus SoundGrid integration, and is the first large-scale digital mixing console deployed to mix concerts at a KBS site. The music programs and concerts, as well as KBS TV drama, entertainment and film award ceremonies, are produced in the "TS-15" SRC with live audiences in an open-studio concept.

Its companion, the 40-fader Lawo mc²36, takes over the tasks of mixing live studio performances and also serves as submixer and backup mixer for the mc²96. In addition to the new Lawo consoles, the facility's renovation encompassed a full replacement of the stage and audio infrastructure of the KBS TV concert hall. Even the main speaker system was replaced, with a Lawo Nova73 router providing mic/gain control for each console, a MADI connection for FOH and the monitor mix consoles, and comprehensive RAVENNA integration with full operational redundancy. Two modular Lawo DALLIS I/O units are configured as stageboxes, the first providing 112 mic/line inputs while the second DALLIS handles 44 AES3 digital inputs.

"We are positive that we have made the right choice in embracing IP as the basis for our technical upgrade," says Mr. Choi HyoSub, Head of KBS' TS-15 audio production team. "The new IP infrastructure gives us full access to all audio resources on demand, without having to perform changes physically like we used to. It definitely makes our lives easier."

The global health crisis prevented in-person commissioning of the new equipment, so installation, configuration and FAT (Factory Acceptance Testing) were conducted using the remote procedures recently developed by Lawo. According to Myoungho Seo, the remote configuration and testing went smoothly, with all phases of the project running according to schedule.

In line with the technical requirements of the TS-15 audio production team, the system configuration was done by Lawo engineers. KBS' audio production team reviewed the captured video clips created by Lawo's team, and then oversaw the hardware status and configuration test online during a live, remote joint technical session. The remote tools Lawo offers, like mxGUI remote control, in combination with computers and webcams, proved the solution for the installation and configuration as well as training sessions during the course of the project.

Visit www.lawo.com

Globecast Distributes Euronews English HD in Asia, Oceania

GLOBECAST, the global solutions provider for media, has announced that Euronews has selected the company to provide distribution services for the launch of the new HD version of Euronews English, covering Asia and Oceania.

Euronews is an international news organisation that has been providing news with a European perspective to a worldwide audience since 1993, in 12 languages, through linear channels as well as nonlinear, online, mobile and digital services in over 160 countries.

The launch of the HD channel follows several years of Globecast delivering an SD feed across the regions. Globecast is acquiring the signal from Euronews' HQ in Lyon, France, as IP over fibre. It is then fed into the company's international Globecast BN fibre network to reach the Jordan Media Center in Amman, where it is uplinked at the teleport to AsiaSat 5. During the course of the contract, Globecast will also add Viaccess PC6.0 encryption to the signal, at a time of Euronews' choosing, with cards also supplied.

Visit www.globecast.com




AVIA Event Sees Optimism in Thailand

THE ASIA VIDEO INDUSTRY Association (AVIA) recently held its first country-focused seminar of the year with a deep dive into the media and video industry of Thailand and the pursuit of growth in the market across both traditional and online platforms. Thailand's media industry is undergoing seismic transformation driven by technological and commercial change. With the first commercial 5G networks becoming available to consumers as early as next year, there is no doubt the market will see continued acceleration in the consumption of content across multiple devices and platforms.

For TrueVisions, Thailand's leading cable and satellite TV operator, Ongard Prapakamol, Chief Media Officer of True Corporation, commented: "This is a time where we need to reinvent our pay TV to see a new S-curve." Prapakamol shared that their strategy is to have different platforms serving different consumer needs, offering multiple services as a "super-app" across free-to-air, pay TV and streaming via their digital service, TrueID. This has enabled True Corporation to move from 99% of its revenue coming from pay-TV subscription when TrueVisions first launched in Thailand, to only 60% now, with 30% from advertising across TV and digital, and the remaining 10% coming from a growing events arm which also includes licensing of True's original content.

This growth in multi-platform consumption has also led to a plethora of regional and international OTT services launching in the Thai market, with iQiyi, China's largest streaming video service, launching earlier this year. Kelvin Yau, VP of International Business Department and GM Thailand, iQIYI International, discussed their strategy for growth in Thailand. "It's AVOD plus SVOD plus more … for iQiyi, SVOD is more than just a streaming service," commented Yau. iQiyi will continue to focus on content and the technology they have, "but there is still so much more we haven't shown to … users about what the app can do."

With the growth of OTT comes greater diversification of ad spend in video, and greater importance of the AVOD model. However, not all video is created equally, and there needs to be a better understanding of why OTT warrants the premium that is charged when media planners look for the best platform for advertisers to invest in. "OTT is the perfect love child between TV and programmatic," added Nigel Kwan, VP of Marketing, APAC, SpotX. "You've got the impact and quality of the TV ad … [with] all the measurement capabilities and data capabilities of programmatic."

However, challenges still remain and there is much work to be done in the area of measurement. But Greg Armshaw, Head of Media, Brightcove, ended the day's sessions on a positive note: "There's definitely opportunities for growth … clearly we are not at the reach situation yet … but it is growing very quickly." And to sum it up in the words of AVIA's Chief Policy Officer, John Medeiros, who, in light of the COVID-19 pandemic, reversed the words of economist John Maynard Keynes: "In the long run there is a lot of life to live, but we must get there first."

Visit https://avia.org

Sony Brings New Vision to Taiwan's Chinese Television System

AS SALES OF 4K TVs continue in Taiwan, TV stations, especially terrestrial stations, have recognised the need to increase production of original 4K content more suited for local consumers. One such station is free-to-air Chinese Television System (CTS), which is part of the Taiwan Broadcasting System. "In line with the nation's 'Forward-looking Infrastructure Plan,' CTS has identified 4K as a key component of our goal to bring high-quality local programmes to our viewers," said Chao-I Lee, Manager, Engineering Department at CTS.

Founded in 1971, CTS is renowned for its wide range of quality programmes including dramas, variety shows and sports. CTS is well remembered for its 350-episode Chinese drama classic Justice Bao, which was launched in 1974. It was the longest-running TV series produced entirely by CTS and aired in over 80 countries, and the series still enjoys a cult following. Most recently, CTS was the exclusive TV station to broadcast the 2017 Summer Universiade, an international multi-sport event that was hosted by Taiwan.

CTS assessed several 4K partners and, after a lengthy evaluation, selected Sony. It was the first time that CTS has acquired Sony equipment for its studios. The Sony 4K equipment selected by CTS includes the BRC-X1000 – Sony's first 4K remote camera. CTS identified this compact and powerful robotic camera for its picture quality as well as its versatility to complement other broadcast cameras for capturing images in hard-to-reach positions. The remote camera can be easily deployed in TV newsrooms, remote studios and sports stadiums. It is also particularly suited to efficient multi-camera set-ups under the control of a single operator.

Another piece of equipment that impressed Chao-I Lee and his team was Sony's PXW-Z450, the world's first 4K HDR shoulder camcorder; CTS acquired three of them. "The BRC-X1000's low-light sensitivity and silent pan-tilt zoom functions were stand-out features. Together with the PXW-Z450 shoulder camcorder, it will give the CTS crew more creative options," Lee continued.

To complement the two Sony cameras, CTS also invested in the BVM-HX310 TRIMASTER HX Professional Master Monitor for its studio and post-production suites. Selected for its excellent High Dynamic Range (HDR) capabilities and wide colour gamut, the BVM-HX310 also offers easy system integration with CTS' current production workflows.

Visit https://pro.sony

HK FILMART Online Reports 7000 International Buyers

ORGANISED BY the Hong Kong Trade Development Council (HKTDC), the Hong Kong International Film & TV Market (FILMART Online) concluded successfully on 29 August. The four-day virtual content marketplace, where some 2100 film and television productions were released and promoted, attracted nearly 7000 international buyers from 73 countries and regions. More than 2000 online business matching meetings were arranged, illustrating strong demand for entertainment content globally and confidence in the prospects for the industry. A total of 22 online conferences and fringe events took place as part of FILMART Online, drawing more than 35,000 views in total. Among these events, six online conferences addressed the latest industry developments in areas such as streaming platforms and technologies for developing interactive entertainment content. Various functions of the FILMART Online platform will remain available until 30 September. Until then, exhibitors can continue to promote their productions and connect with buyers through the multifunctional online platform, while buyers can search for projects and enjoy online screenings, as well as viewing footage from the online conferences and seminars held as part of the event. Visit http://www.hktdc.com/hkfilmart




29 September - 1 October 2020

PREVIEW

connectechasia.com/broadcast-asia

ConnecTechAsia/BroadcastAsia to Deliver 'Best-In-Class' Virtual Experience

ConnecTechAsia2020, the first virtual Infocomm, Media and Technology event held in partnership with Singapore's Infocomm Media Development Authority (IMDA), will burst online from 29 September to 1 October 2020.

COMPRISING LONG-TIME FAVOURITES BroadcastAsia and CommunicAsia, ConnecTechAsia2020 will feature new entrants SatelliteAsia and TechXLR8 Asia staging their respective exhibitions and conferences. Inaugural Asian editions of accelerateHER Asia and Elevating Founders will also be key highlights of this year's event, bringing with them the big names from the women-in-tech and start-up worlds.

The event will feature some 200 exhibitors, several of which are taking part for the first time, showcasing the latest 5G technologies and enterprise solutions. This group includes AT&T, SPTel, Qualcomm, StarHub, ServiceNow and Verizon. Other bellwethers will also enrich the virtual experience; they are Grass Valley, Huawei, KTSat, Roland, Sony and SingTel Satellite, to name a few. Six group pavilions from the European Union, France, Great Britain and Northern Ireland, Israel, Korea and Ontario are also taking part.

The event's four virtual exhibition halls will allow companies to set up customisable virtual booths, with live presentations and real-time interaction with buyers. Furthermore, the exhibitor metrics dashboard will give them valuable real-time data and insight into their customers' behaviour by capturing multiple touchpoints and providing a multi-dimensional view of a visitor's journey, allowing for more effective outreach and follow-up.

Networking opportunities are an integral part of any event experience and have also been enhanced through a powerful AI business matching platform. Attendees will receive personalised meeting suggestions based on their interests and profile. On a consolidated dashboard, visitors can view connections, find out who is interested in setting up a meeting with them and arrange for one-to-one video meets via virtual meeting rooms.

In addition, more than 220 conference sessions addressing the latest industry trends and challenges will feature top industry names, including:
• Anne Chow, CEO, AT&T Business
• Brenda Harvey, GM, IBM Asia Pacific
• Bicky Bhangu, President, Southeast Asia, Pacific and South Korea, Rolls Royce
• Martin Huang, MD Southeast Asia, SenseTime
• Euan Smith, CEO of TV & Group COO, Astro

Conference delegates will have the full benefits and experience of a live conference, available right at their fingertips using their device of choice. This includes live keynotes, Q&A sessions and breakout rooms. Content will also be available on-demand 24/7 for delegates based in different time zones.

After the live event, ConnecTechAsia2020 will turn into a 24/7, 365-day interactive marketplace, allowing exhibitors and buyers to continue to connect and communicate on the same platform.

Visit https://www.connectechasia.com/home/virtual-event-experience/


Interra Systems Boosts Video Quality


DURING THIS YEAR'S VIRTUAL BroadcastAsia 2020 show, Interra Systems will demonstrate its content quality control, monitoring, analysis, and classification solutions for delivering quality of experience (QoE) on every screen for cloud and on-premises deployments. For VOD and live-event streaming, Interra Systems will demo ORION-OTT, a leading-edge monitoring solution. The rich feature set offers support for the latest standards in closed captions, ad insertion monitoring, ABR manifest file validation, audio-video checks, real-time alerts, DRM integration, new format support, enhanced reporting and much more.

This year, Interra Systems’ BATON suite has been expanded to further simplify and automate quality control workflows for broadcasters and service providers. The company will showcase BATON Captions, a new solution that brings simplicity and cost savings to the creation, management, and delivery of captions for traditional TV, DVDs, and video streaming via machine learning and automatic speech recognition technology.

For IP-based delivery infrastructures, Interra Systems offers ORION, a real-time content monitoring solution that provides video analysis of linear channels. The new features include support for monitoring of ST2110 SDIoIP streams; custom alerts for Slack, HTTP etc.; enhancements in SCTE35 monitoring and more. ORION perfectly complements the company’s OTT offering, looking at all aspects of video streams including closed captions, ad-insertion verification, and quality of service (QoS) and QoE.

BATON LipSync is a new automated tool for lip sync detection and verification. The solution leverages machine learning technology and deep neural networks to automatically detect audio and video sync errors.

For end-to-end monitoring of channels across the delivery infrastructure, the newly revamped ORION Central Manager (OCM) provides an aggregated view of linear and OTT services based on monitoring data collected by ORION and ORION-OTT, with enhanced, comprehensive, centralized probe management and reports for channel availability and quality.

Other major updates to BATON include enhancements in the Interoperable Master Format (IMF), 4K and HDR checks, support for the latest PSE guidelines, enterprise enhancements for cloud support and scaling, new additions to audio language detection, as well as support for the latest formats. Additionally, Interra Systems will demo a vast range of new audio-video checks, more powerful BATON Media Player and Content Corrector. Also being demonstrated will be Interra Cloud Services (ICS) – the company’s subscriptionbased service for automated quality control of file-based content that simplifies and expedites content verification for quality and compliance.

Interra Systems will also showcase an updated version of WINNOW, the company's award-winning solution for content classification and identification. Interra Systems' Vega Media Analyzer now includes support for all popular video compression and container standards and ABR streams, and also provides command-line analysis of streams over cloud. Its frame-by-frame analysis of problematic or erroneous streams enables encoder developers and video service providers to identify and fix errors efficiently. Interra Systems' Live Showcase will debut Sept. 22 - Oct. 15; more information about webinar topics is available online. In addition, Interra Systems' Business Development Manager Deepanshu Rustagi will discuss the latest ML/AI innovations for the media industry during the BroadcastAsia 2020 virtual conference on Sept. 30 from 13:30-14:00 SGT. Visit https://www.interrasystems.com



BroadcastAsia Conference Schedule

Day 1: Transforming Broadcast Technology – Tuesday, 29 September 2020
-------------------------------------------------------------

10:30am - 10:45am: GV AMPP - Taking Live into the Cloud
GV Media Universe and GV AMPP – GV AMPP is the core enabling technology of the GV Media Universe, a concept that encapsulates the Grass Valley vision for the software- and cloud-based future of media.

10:45am - 11:00am: The Future of Live Streaming Over IP in APAC
Zixi customers and partners will discuss the state of streaming in the APAC region, the latest challenges and opportunities facing media companies as the industry shifts towards virtualised IP-based distribution models for live video, and how they've relied on the Zixi SDVP to orchestrate, monitor and manage live streams over IP.

10:45am - 11:00am: Unveiling VSN New Product Launches
See the new dashboards for daily operation in VSNExplorer Media Asset Management (MAM), the new module for non-linear scheduling in VSNCrea Broadcast Management System (BMS) and the remote editing workflows recently launched for VSN's Media Stories solution.

11:00am - 11:30am: VoCaption, Live Automated Subtitling - presented by BroadStream Solutions

11:00am - 12:15pm: Building the Future Media Facility
Cloud, virtualisation, 12G and IP are revolutionising broadcast facilities, studios and OB trucks, powering more efficient, flexible operations and enabling upgrades to 4K and beyond. Explore the options for new infrastructure with case studies of the most exciting new facilities, and discover what projects are around the corner as CTOs reveal their investment roadmaps.
11:00 - 11:15am: The Health of the Media Technology Industry - New IABM market intelligence on the state of the media technology business in Asia Pacific.
11:15 - 11:35am: Case Study by Rohde & Schwarz
11:35 - 11:55am: Why Build 4K Facilities – and Is IP or 12G SDI the Way to Go?
11:55 - 12:15pm: Software-Defined Networks for Automated Video Distribution
Speakers include: Dennis Breckenridge - CEO, Elevate Broadcast; Lorenzo Zanni - Head of Insight & Analysis, IABM; Soo Teck Goh - Sales Director, SEA, LiveU; Ashutosh Patel - Asst Vice President, Media Services, PCCW Global.

11:30am - 12:00pm: Unify Storage - presented by Gb Labs

12:15pm - 1:00pm: Networking Lunch Break

12:30pm - 1:30pm: Video Mixer Products Demo - presented by Roland Corp

1:00pm - 2:15pm: Enabling AI Across the Content Chain
This session reveals how real-world applications of AI and ML are working today.
13:00 – 13:25pm: [Keynote] The Future of AI in Broadcast - One of Asia Pacific's leading broadcast AI research scientists reveals what's coming out of the labs, and how AI is changing (and recognising) the face of broadcast.
13:25 – 13:50pm: Streamlining Subtitles with Neural Machine Translation
13:50 – 14:15pm: [Keynote] Automating Broadcast Content Production with AI Innovation
Speakers include: Alphie Larrieu - SAVP Content and Localisation Engineering, Astro; Masakazu Iwaki - Head of Human Interface Research Division, Science and Technology Research Laboratories, NHK; Dion Wiggins - Chief Technology Officer & Co-Founder, Omniscien Technologies.

1:30pm - 2:00pm: SONY presentation

2:00pm - 2:30pm: Perfect Timing - presented by Hitomi Broadcast

2:15pm - 3:05pm: Afternoon Networking Break

2:30pm - 3:00pm: StarTracker Studio: Simplifying Virtual Studio Production - presented by Mo-Sys

3:05pm - 4:30pm: Transforming Live Sports Production
15:05 – 15:25pm: Delivering Best-in-Class Sports OTT Experiences
15:25 – 15:45pm: Innovating Low-Cost Live Sports Production
15:45 – 16:05pm: Achieving Ultra-Low Latency for Live Sports Distribution in the Cloud
16:05 – 16:30pm: [Panel] Technology Roadmap – Plans and Projects for the Year Ahead - Broadcast and digital media technology leaders reveal their plans and investment priorities.
Speakers include: Tom Lithgow - Product Manager, Bluefish444; Gautier Demond - Director Content Delivery Services (CDN) APAC, Centurylink; Regie Bautista - SVP for Corporate Strategic Planning and Business Development, GMA Network; Hendy Lim - Vice President of Content Business, Indonesia Entertainment Group (IEG); Jay Ganesan - SVP and Region Head, APAC, MediaKind; Ren Egawa - CEO, Rexcel Nippon; Robert Ambrose - Managing Consultant, High Green Media.

3:30pm - 4:00pm: WINNOW, an Advanced Platform for Content Classification - presented by Interra Systems

4:00pm - 4:30pm: Angenieux Optimo Prime Lens Product Launch - presented by Jebsen Industrial Technology Co Ltd.

Day 2: Delivering the Consumer Media Revolution – Wednesday, 30 September 2020
-------------------------------------------------------------

9:30am - 10:45am: Enterprise Outlook - Tech Reality Check
This session will discuss what other technologies enterprises should be focusing on – the development of advancements in analytics through automation, distributed cloud systems bridging the gap between data storage and computation, or data-driven policing?
Speakers include: Euan Smith - CEO of TV & Group COO, Astro; Anne H. Chow - CEO, AT&T Business; Min Chen - VP & CTO, APAC, LexisNexis Legal & Professional; ST Liew - VP & President, Taiwan & South East Asia, Qualcomm; Robert Le Busque - Regional VP APAC, Verizon Business Group; Claude Achcar - Managing Partner, Actel Consulting; Adam Etherington - Principal Analyst, Digital Enterprise Services, Omdia.

11:00am - 11:30am: Stream Live Video Over IP with the Zixi Software-Defined Video Platform

11:00am – 12:15pm: Shaping the Future of TV Broadcasting
11:00 – 11:15am: Taking Live into the Cloud - Learn how to leverage elastic compute technologies to take your cloud-based production to the next level.
11:15 – 11:35am: Caching in a 5G Network and Its Implications for Video Streaming
11:35 – 11:55am: [Fireside Chat] Launching a Modern Satellite TV Platform - Going inside a TV platform operation in the Philippines that enables content owners to launch new channels rapidly, reaching an audience of 4.5 million households while enhancing the bouquets of DTH operators.
11:55am – 12:15pm: All in With IP – Engineering a Broadcast Infrastructure Transformation - First-hand experience of designing and planning every step of a project to transition to an all-IP infrastructure.
Speakers include: Dennis Breckenridge - CEO, Elevate Broadcast; Stefan Mayr - AMPP Solutions Consultant, Grass Valley; Tushar Gohad - Principal Engineer | Storage | Networking | Performance, Intel Corporation; Ralph Siebenaler - Managing Director, Magistan Media; Devesh Gautam - Head of Section/Director - Edge Platforms-MEC, Media Services and CDN, Rakuten; John Huddle - Director, Market Development, APAC, SES Video; Reza Mazhari - Senior Broadcast Architect, Tabcorp Holdings; Arianna Aondio - Head of Support Operation, Varnish Software.

12:15pm - 1:00pm: Networking Lunch Break

1:00pm - 2:25pm: Growing Direct-to-Consumer Services


13:00 – 13:20pm: [Keynote] Competing with the Global Giants – a Recipe for OTT Success in Asia
13:20 – 13:40pm: Building Scalability and Elasticity Into OTT Services
13:40 – 14:00pm: [Keynote] Engineering an OTT Platform at Scale - Behind the scenes of one of the biggest new OTT platform launches.
14:00 – 14:25pm: [Fireside Chat] Competing in the OTT Business: Using Data to Get the Business Model Right
Panel includes: Fabien Astruc - Sales Engineer, Anevia; Kristiono Setyadi - CTO, Vision+; Manish Verma - Head of Technology, SonyLIV, Sony Pictures Networks India; Aki Tsuchiya - Founder & Managing Director, Streamhub; Hirosuke Usui - Senior Expert/Deputy President, Media Strategy and Planning Office, Tokyo Broadcasting System Holdings, Inc.; Tarun Katial - CEO, ZEE5; Robert Ambrose - Managing Consultant, High Green Media.

1:30pm - 2:00pm: ML/AI Innovations for Media Industry - presented by Interra Systems

2:00pm - 2:30pm: Brainstorm Suite Demos - presented by Brainstorm
Brainstorm will showcase online demos and presentations of the products included in the Brainstorm Suite: InfinitySet, its virtual set and AR solution, and Aston, its graphics creation, CG and playout system.

2:25pm - 3:05pm: Afternoon Networking Break

2:30pm - 3:30pm: Bandwidth Limitations: Delivering Quality Video
Speakers: Chris Fellows - Solutions Engineer, UK/EMEA, Zixi / The RIST Forum; David Griggs - Senior Product Manager, Media Services, AWS Elemental, MediaConnect.

3:05pm - 4:20pm: The Future of Entertainment Technology
15:05 – 15:30pm: [Fireside Chat] Building a Short-Form OTT Platform to Beat the World - Short-form content is making waves in the streaming market with the launch of Quibi on the global stage and Singapore-based Viddsee. Explore what's behind their success.
15:30 – 15:50pm: Combining OTT and Free to Air Broadcast - An amazing case study involving the Taiwan National Opera House and how broadcasters enabled viewers to select their preferred choice of camera angle and coverage.
15:50 – 16:20pm: [Panel] On the Screen and on the Field: What Broadcasters and Esports Producers Can Learn From Each Other
Speakers: Andy Blondin - Senior Product Manager, Epic Games; Nick Vanzetti - SVP, Managing Director, ESL Asia Pacific Japan; Ren Egawa - CEO, Rexcel Nippon Co., Ltd.; Debbie Lee - CEO & Founder, TechStorm; Derek Tan - CCO & Co-Founder, Viddsee; Robert Ambrose - Managing Consultant, High Green Media.

3:30pm - 4:00pm: GV Orbit - Dynamic System Orchestrator for SDI, Hybrid and IP Networks - presented by Grass Valley

Day 3: Thursday, 1 October 2020
-------------------------------------------------------------

11:00am - 12:15pm: Colour and 8K and HDR! Oh, My!
In this session you will learn about some of the science of enhanced imaging, practical applications, and where the implementation of these enhancements stands around the world.
+ Expanded Colour Volume | Why do we care and how do we visualise it?
+ Cognitive Colour Phenomena in HDR | A colourist's perspective on cognitive limiting factors while dealing with HDR, and how to use them for creative purposes to make the best images.
+ Ambient Effects in HDR Imaging | Explore what effect the viewing environment has on perception of images, how this effect is quantified and what can be done to compensate for it.
Speakers: Aurora Gordon - Senior Colourist, Arsenal FX; Robert Wanat - Senior Colour Scientist, Dolby Laboratories; Anastasia Shepherd - Freelance Colourist.

11:00am - 11:30am: The Affordable Compliance Monitoring and Logging Solution Voted #1 by Broadcasters - presented by Vela

11:30am - 12:00pm: The Art and Science of Effectively Integrating Media Management & Scheduling - presented by VSN

12:15pm - 1:00pm: Networking Lunch Break

1:00pm - 2:25pm: SMPTE Standards Paving the Way to the Future
+ Attacking the Microservices Interop Challenge | The Open Services Alliance and SMPTE have been hard at work in recent months, jointly focused on one of the key interoperability challenges the media industry faces today: making microservices from multiple vendors work together. Join us for a look into what's been done to date, and what's next in the game plan.
+ Machine Learning, Artificial Intelligence, and Interoperability
+ Cloudy with a Chance of Standards | How SMPTE is responding to the changing industry landscape to get the best from Open Source and Standards.
Speakers: Yves Bergquist - Director of the AI & Neuroscience in Media Project, Entertainment Technology Center (ETC); Christopher Lennon - President & CEO, MediAnswers; Bruce Devlin - Standards Vice President, SMPTE; Thomas Bause Mason - Standards Director, SMPTE.

2:00pm - 3:30pm: Broadcasting E-Sports: Challenges & Opportunities

2:25pm - 3:05pm: Afternoon Networking Break

3:05pm - 4:30pm: The Industry Transformation Through Software-Defined Workflows
+ The Evolution of Live Production | From remote production to cloud-based, software-defined workflows.
+ MovieLabs 2030 Vision | Explore the principles of MovieLabs' 2030 vision, and delve into software-defined workflows and ML's work towards common ontologies and APIs.
Speakers: Boromy Ung - VP of Product Marketing, Grass Valley; Jim Helman - CTO, MovieLabs; Paul Briscoe - Chief Architect, TAG Video Systems.

3:30pm - 4:00pm: LDX 100 - What the Industry Wants in Camera Technology - presented by Grass Valley

Mo-Sys to Demonstrate Virtual Studio Technologies

MO-SYS, A PROVIDER OF precision camera tracking solutions for virtual studios and augmented reality, will be showcasing StarTracker Studio at BroadcastAsia 2020's all-new virtual event, running from September 29 to October 1, 2020. StarTracker uses dots on the studio ceiling ("stars") which are placed at random and tracked to plot camera positions with a high degree of accuracy.

StarTracker Studio combines StarTracker camera tracking with an all-in-one virtual production system capable of working with green screen studios or LED volumes. The system uses Mo-Sys VP Pro software to connect camera tracking data from up to 16 cameras to the Unreal Engine graphics. StarTracker Studio uses a smart system design to reduce the typical hardware required for multi-camera virtual production, and the whole system comes pre-configured in a flight-cased rack. Mo-Sys will also be detailing how its VP Pro and StarTracker technologies operate with LED volumes for virtual productions that want to use on-set finishing techniques.

"LED-wall technology now offers a viable alternative to the traditional green screen / post-production workflow for visual effects (VFX) shooting. Specifically, LED walls enable a composited shot to be captured on-set, rather than just previewed on-set, thereby removing the need for downstream post-production," said Michael Geissler, CEO of Mo-Sys. "LED walls won't replace green screen; both will co-exist going forward as each is suited to a different type of VFX shot. The benefit of StarTracker Studio is that it handles both workflows."

Visit https://bit.ly/33P5K6B
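As a general illustration of the principle behind ceiling-marker camera tracking – recovering a camera's position and orientation from known reference points seen in its image – the short sketch below runs a standard perspective-n-point (PnP) solve in OpenCV. The marker layout, intrinsics and simulated pose are all invented for the example; this is only the textbook technique that underlies this class of tracker, not Mo-Sys's StarTracker algorithm.

```python
# Minimal sketch: recover camera pose from known 3D markers ("stars") and
# their 2D detections. Purely illustrative values, not StarTracker's method.
import numpy as np
import cv2

# Known 3D marker positions on the ceiling, in the studio frame (metres).
stars = np.array([[0.0, 0.0, 4.0], [1.2, 0.3, 4.0], [0.4, 1.5, 4.0],
                  [1.8, 1.1, 4.0], [2.5, 0.6, 4.0], [2.1, 2.0, 4.0]])

# Simple pinhole camera intrinsics, no lens distortion.
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Simulate a "true" camera pose and project the markers into the image,
# standing in for the marker detections a real tracker would measure.
rvec_true = np.array([0.05, -0.10, 0.02])
tvec_true = np.array([-1.0, -0.5, 3.0])
detections, _ = cv2.projectPoints(stars, rvec_true, tvec_true, K, dist)

# Solve PnP: estimate the camera pose from the 2D/3D correspondences.
ok, rvec, tvec = cv2.solvePnP(stars, detections, K, dist)
R, _ = cv2.Rodrigues(rvec)                   # world-to-camera rotation
camera_position = -R.T @ tvec.reshape(3, 1)  # camera position in the studio frame
print("Recovered camera position (m):", camera_position.ravel())
```

A production tracker repeats a solve like this for every frame and layers lens calibration, filtering and redundancy on top of it.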



VSN Boosts Remote Productivity

ALONGSIDE ITS PARTICIPATION in BroadcastAsia's virtual edition, VSN will present new developments and functionalities for its media, traffic and scheduling management systems, along with new developments in remote production and editing for news production environments. These developments are all focused on easing the daily operations of broadcast and media professionals and improving their productivity in remote work environments and in content planning for non-linear channels.

The latest version of VSNExplorer MAM has new functionality for users to create dashboards from which to operate all the tools and functionalities of the system in a centralised way. Each user can now create their own control and operation panels from the top menu of VSNExplorer and easily customise them by combining different widgets, allowing them to use and operate all the tools and functions of the MAM system from the panel itself without having to open each one of them.

The VSNCrea traffic and scheduling system presents its content scheduling module for non-linear channels, which allows for easy integration with any non-linear environment (VoD, OTT, Web TV, etc.) thanks to its open architecture based on webhooks. In essence, the new module allows users to plan the publication and "unpublication" of the content catalogue for one or several non-linear media outlets and to create block programming in a fully customised way for each channel. It also includes new integrations with video distribution and broadcast services, such as Vimeo or YouTube, which will continue to be expanded in the coming months. At a business level, VSNCrea now also allows the system to be contracted under a SaaS model, installed in public, private or VSN-owned cloud services at the user's choice.

Among the main new features included in VSN's Media Stories solution, it is worth mentioning its support for growing files in remote work environments. This development allows users to view a low-resolution version of a recording and start editing and working with it instantly, while it is still being ingested into the system. That is, they can simultaneously select video fragments of interest, add them to the editing timeline, make quick video edits and consolidate the final piece in low resolution to send it to FTP or any destination for broadcast or publication. In addition, in order to further streamline the remote editing process, new controls have been introduced in the VSNExplorer player to allow more precise and selective video navigation, including options such as zoom, a VU meter and new keyboard shortcuts.

VSN has spent many years developing a full suite of solutions to cover the entire media lifecycle, always bearing in mind that those solutions should not only complement each other but, more importantly – as remote working gained prominence – that all of the elements could be accessed remotely, not just the ones the market was starting to expect. The entire suite of VSN solutions can be configured to access data that is stored on-site, in hybrid deployments or in the cloud, ensuring that 'remote workflow' does not equate to 'cloud-based computing'. The company has worked hard to make sure that all of its systems can talk directly with data stored on physical premises or in cloud-based storage. VSN is fully agnostic between the two and can even provide clients with a hybrid setup combining cloud-based and on-premises data.

Visit www.vsn.es
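As a rough, hypothetical sketch of what the receiving end of a webhook-based integration like this can look like – the endpoint path, payload fields and action names below are invented for illustration and are not VSNCrea's actual API – a non-linear platform could expose a small HTTP endpoint that the scheduling system calls each time a catalogue item is published or unpublished:

```python
# Hypothetical webhook receiver for catalogue publish/unpublish notifications.
# Route and field names are invented for this sketch.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhooks/catalogue", methods=["POST"])
def catalogue_event():
    event = request.get_json(force=True)
    action = event.get("action")        # e.g. "publish" or "unpublish"
    asset_id = event.get("asset_id")    # identifier of the catalogue item
    window = event.get("window", {})    # e.g. {"start": "...", "end": "..."}

    if action == "publish":
        # Hand the asset to the target platform's ingest/publish pipeline here.
        print(f"Publish {asset_id}: {window.get('start')} -> {window.get('end')}")
    elif action == "unpublish":
        # Remove the asset from the platform's catalogue here.
        print(f"Unpublish {asset_id}")
    else:
        return jsonify({"error": f"unknown action {action!r}"}), 400

    return jsonify({"status": "accepted"}), 202

if __name__ == "__main__":
    app.run(port=8080)
```

In practice a receiver like this would also verify a shared-secret signature on each request and queue the work rather than acting inline.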

Satcoms Innovation Group Tackles LEO and 5G

INDUSTRY FORUM the Satcoms Innovation Group (SIG) has announced a range of activity alongside the virtual ConnecTechAsia event, taking place from 29 September to 1 October. SIG will host two panel sessions as part of the main conference. The first, on Tuesday 29 September, 14:30 – 16:30 SGT, will tackle technical questions regarding LEO and flat panel antennas. Over-crowding of LEO and its impact on surrounding orbits will be assessed, before going on to question whether high-performance antennas will ever be small enough to be classed as flat panels. The second panel session will discuss 5G and spectrum on 1 October, 14:30 – 16:30 SGT. Understanding 5G’s potential impact is important in pre-empting future challenges, both within spectrum and within the wider broadcast ecosystem. SIG will also run technology tours, but in a virtual environment. The tours, which normally take visitors around various members’ booths, will highlight the latest developments and innovation for satellite communications.

SIG will be exhibiting as part of SatelliteAsia and showcasing its members on its booth throughout the event.

Visit http://satig.space










ACQUISITION & LIGHTING www.content-technology.com/acquisition

Weta Digital, Streamliner, and Avalon Studios Launch LED-Stage Virtual Production Service NEW ZEALAND’S WETA DIGITAL, Avalon Studios and live event company Streamliner Productions, have announced a new LED-stage virtual production service, based in Wellington. A modern take on using front or rear projections of landscapes or rolling streets, Weta Digital Executive Producer David Conley describes LED stages as “… the latest technique to take advantage of game engine technology to provide virtual production workflows that can greatly expand what is possible on set. Being able to shoot final VFX imagery at the same time as principle photography adds another level of creative control for producers and filmmakers.” According to Richard Lander, Studio Manager at Avalon, the service can be scaled to cater for smaller productions and TVCs right up to a 270-degree screen scenario of the type used by ILM and Disney on The Mandalorian. “Each partner’s a different part of the puzzle,” says Lander. “We’ve provided the studios. Our Studio 8 measures in height to its grid 8.4 metres, and it’s got 23 metres wide and 33 metres deep. And then our Studio 11, which is going to be used in the same way, has the 17 metres wide and it’s 30 metres deep and it has an extra height of 10 metres. “We’ve done experiments where you can potentially put screens on the grids because our grids are enabled to do a clip on type scenario where you can clip the screens to the roof of the grid, but we’ve also worked out that you don’t always need a screen on the grid or a screen necessarily on the floor, it’s more the walls, because obviously on the floor the Art Department is laying something together to give you that illusion of there’s a bit of sand, for example, that people have used regularly, or a bit of a prop in the foreground that then the background so the screens takeover and give you that 3D imagery out the back. And on the grid, you can do that.” The studios can also employ J-arm cranes with LED screens attached that can “float” or be positioned above the acting talent. “For example, if they’ve got a helmet on and you want that reflection to appear truly in the top of the helmet, or if you’ve someone with glasses and they’re looking up, the reflection from the screen is appearing in that,” says Lander.


Provided by Streamliner, the facility’s large, configurable LED panels are able to display imagery beyond 8K as a way to augment practical sets or replace greenscreen shots. With screens for indoor and outdoor use, options include a variety of small pixel-pitch arrays (the smaller the pixel pitch, the higher the pixel density, and resolution).
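As a rough back-of-envelope illustration of that pitch-to-resolution relationship, the short calculation below works out how many pixels across an LED wall holds at a few of the pitches discussed in this article, assuming (purely as an example) a wall as wide as Studio 8’s 23-metre stage.

# Pixels across = wall width / pixel pitch. The 23 m width is Studio 8's
# quoted stage width, used here only as an illustrative wall size.
WALL_WIDTH_M = 23.0
PITCHES_MM = [1.9, 2.5, 3.9, 5.9, 6.9]

for pitch in PITCHES_MM:
    pixels_across = int(WALL_WIDTH_M * 1000 / pitch)
    print(f"{pitch} mm pitch -> about {pixels_across:,} pixels across")

# At the finer pitches (2.5 mm and below) a wall this wide holds more
# pixels across than an 8K image (7,680 pixels), consistent with the
# "beyond 8K" imagery described above.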


“We’ve got some which are 3.9mm pitch and some which are 2.5mm pitch,” says Richard Lander. “Those are the indoor ones and then the outdoor ones we’re carrying are 5.9mm and 6.9mm pitch.

“We know in the next few months there’ll be one that will become more the standard, which is going to be around the realms of 1.9mm pitch. I think the industry standard will become 1.9mm near the end of the year, so that’s what we’ll gear up to as well.” Another piece of the puzzle provided by Streamliner are strong relationships with augmented reality technology provider Disguise, as well as Epic Games, creator of the Unreal Engine. In addition to delivering high quality VFX/ graphics to the LED screens, the systems can also be used to trigger studio lighting. “The lighting board is also being driven by Disguise and the Unreal Engine,” says Lander. “So, the lighting board is getting triggered by what the screens are doing to actually enable the lights to flash to a particular timer or to a certain colour temperature or whatever you wanted to use. “We’ve done tests with a car scenario where someone’s gone out and shot a plate scenario. A car was mounted with five cameras on it inside and on the roof, it recorded the environment where it drove for ten minutes down a road and then we come back and that imagery from all those five cameras is then played onto the five screens surrounding the car and then you’re getting the true reflections of that. And not only are they acting as the reflection, but the panels or the LED screens are so strong that they become the lighting in a sense as well. “The car was travelling through trees, so obviously you’ve got the light coming through the trees, which is exactly like a shaft of light, so you were getting an effect of that by the screens themselves but to augment that, the DOP put a light through one of the panels so he gave that extra flash to give the illusion to the person in the audience watching it, there was a shaft of light striking the car every three or four seconds. The beauty of that now, because of all of this technology, the guys are timing the screens, they’re using the screens, which is driven by the

Disguise system, and the Unreal Engine, but they’re tying it to the lighting guys or the lighting board operators. “The beauty in using the LED screens of course, is you’re getting true reflections all in time in sync with the driver who is driving, and then the director of course is just able to back up the thing, repeats the link, as opposed to driving down the car down the road or closing off a road.” According to the Avalon Studio Manager, test scenarios have involved different models of camera to ascertain compatibility with the LED set-up. “That proved really successful,” says Richard Lander. “We’ve tested with RED cameras to know their capabilities and their lens variations on a screen. We’ve tested with ARRI cameras and Sony cameras, and Panavision cameras. “To be fair, some cameras performed better than others in this environment, which would be true anywhere. [For example] the ARRI camera and the Sony camera have a subtle shutter thing within them that enables you to clear up any of the what we would term, ‘a Moire effect’, when you’re in close to a screen.” In addition to enabling more control, and reducing production and costs, Richard Lander says LED stages suit the requirements of a COVID-safe workplace. “In a COVID sense,” he says, “the programmers, or the people prepping the production, can be upstairs using that space in a comfortable environment to do all their design elements, and then it’s controlled where we can patch it to the floor straight away and to the screen, so that means there’s actually less people on the studio floor, which is good for long form or TV projects because then you’re keeping that social distancing factor on the floor.” For more information, contact sales@wetafx.co.nz



SmallHD Unveils Reference-Grade 4K OLED

SMALLHD HAS ANNOUNCED a new entry into the reference-grade monitor space: the OLED 22 4K Production Monitor. Setting what the company says is “a new standard for colour purists”, the OLED 22’s hardware design is built around the Small4K Video Processing Architecture, which provides a range of input/output options with eight 12G-SDI and two HDMI 2.0 ports, all of which enable 4K signal processing. Housed in a rugged, unibody, milled-aluminium chassis that includes 36 individual ¼-20” mounting points along the top and sides, the OLED 22 weighs in at 9.3lbs/4.2kg. A removable handle and feet allow for convenient portability.

SmallHD says the OLED 22 display has virtually no image degradation at any viewing angle, providing a real-time image as vivid and pristine as a final output, whether on set or in post. Monitor features include: 21.6in/55cm screen size, >1,000,000:1 contrast ratio with an absolute black point, 10-bit colour depth for nuanced colour fidelity, 3840×2160 resolution, 350nit brightness

level, and 100% P3/135% Rec 709 colour gamuts that generate true-to-life colour reproduction. Power options include two hot-swappable power inputs: one 3-pin XLR and a slide-on Dual Battery Plate (Gold Mount & V-Mount options, sold separately), which can be attached via the built-in Smart Rail on the rear side of the monitor. Two 2-pin locking accessory outputs add power flexibility for additional devices. For untethered freedom, OLED 22 is Teradek Bolt 4K compatible: an ideal combination for cable-free, zero-delay video monitoring in 4K 10-bit HDR. Coinciding with the release of OLED 22, SmallHD is also introducing what it says is its most sophisticated internal toolset yet: PageOS 4. With this new software, users gain easier access to a diverse and customisable array of curated exposure tools and workflows for enhanced production functionality. Beyond a host of improvements for simplicity and speed, PageOS 4 supports a streamlined colour-calibration experience with Colour Pipe, an intuitive rendering tool that accurately converts log formats into SDR and HDR. PageOS 4 also includes significant upgrades in user page presets, 4K HDR (PQ) waveforms, improved false colour, and dual/quad viewing options – as well as retaining the same dependable features known to users of previous PageOS-equipped monitors. Visit www.smallhd.com/4K

Firmware Update for JVC’s CONNECTED CAM


JVC PROFESSIONAL VIDEO has announced it is furthering its commitment to providing the video production industry with high-quality streaming solutions in a complete, plug-and-play package with its latest CONNECTED CAM firmware update. Using the open-source Secure Reliable Transport (SRT) technology, JVC has added Forward Error Correction (FEC) and Stream Identifier (ID) to its 500 Series and 900 Series CONNECTED CAM cameras. Adding FEC to JVC’s CONNECTED CAM allows for redundant data stream packet loss recovery, meaning that data loss can be corrected before the video must buffer, while Stream ID makes it possible for multiple cameras to stream directly to one device. The company has also incorporated VITC (vertical interval time code) and LTC (longitudinal time code) functions with this latest update, which are essential for live production and synchronised streaming of live events, such as concerts, sports, ceremonies and conferences. The JVC CONNECTED CAM system, which also includes its SRT-enabled BR-DE900 decoder, BR-EN900 encoder and the KM-IP6000 switcher, is an easy-to-use streaming solution that meets a wide range of budgets. The system’s cross-compatibility between brands also means that users can incorporate the latest streaming video capabilities with their existing streaming studio equipment. The new firmware will be available as a free download from the JVC Professional Video website from the end of September. Visit http://pro.jvc.com/prof/support/productupdate.jsp
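To show how the two SRT features mentioned above are commonly exposed, the sketch below composes SRT sender URIs using the query-option names of the open-source SRT library (streamid for the Stream Identifier, packetfilter for the built-in FEC filter). These option names and values follow public libsrt/srt-live-transmit conventions rather than JVC’s camera menus, and the host, port and FEC grid sizes are arbitrary examples; exact option support depends on the SRT version at each end.

# Sketch: SRT URIs with a Stream ID and FEC, using open-source SRT
# query-option syntax. Host, port and FEC grid values are placeholders.
def srt_uri(host: str, port: int, stream_id: str, fec: bool = True) -> str:
    options = {"streamid": stream_id}
    if fec:
        # Packet-filter FEC (libsrt 1.4+): a 10x5 block of data packets
        # protected by row/column parity packets.
        options["packetfilter"] = "fec,cols:10,rows:5"
    query = "&".join(f"{k}={v}" for k, v in options.items())
    return f"srt://{host}:{port}?{query}"

# Several cameras can feed one listener and be told apart by Stream ID.
for camera in ("cam1", "cam2", "cam3"):
    print(srt_uri("decoder.example.net", 9000, camera))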

Roland, Canon Unveil Facial Recognition Workflow

ROLAND, IN CONJUNCTION WITH CANON, recently showcased the Ver.2.0 update for its V-600UHD 4K Multi-format Video Switcher, which now adds “Automatic ROI (Region Of Interest)”, a new technology-assisted camera workflow that blends robust facial recognition technology with Roland’s ROI tool. LAN-based PTZ camera control is also included as part of the Ver.2.0 update. “Automatic ROI” marks the beginning of a new technical collaboration between Roland and Canon, lending technology-assisted workflow enhancements to several of Canon’s leading digital cameras and camcorders, including the Canon XF405/XF400 Professional Camcorders. The V-600UHD’s Region Of Interest windows automatically find and follow two people within the scene, aiding single-operator productions. The Ver.2.0 update allows the V-600UHD’s Automatic ROI to work seamlessly with the face-tracking autofocus of these Canon camcorders when using specific firmware, settings, and HDMI and CAT LAN cabling. The Roland V-600UHD adds value to many different production scenarios by letting users transition to 4K workflows as demand and budgets allow, one input at a time. Scaling is also provided on every input with Roland’s Ultra Scaler technology, making it possible to use Full HD and 4K sources simultaneously while outputting at multiple resolutions. The high pixel density of 4K camera sources can be leveraged in Full HD workflows as well, for problem-free, visually-impressive productions. Visit https://proav.roland.com/global
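To make the idea of an automatic region of interest concrete, here is a generic sketch of face-driven ROI cropping using OpenCV’s stock Haar-cascade detector. It is not the Roland/Canon implementation – their processing runs inside the switcher and camcorder – but it shows the basic loop the feature automates: find up to two faces, build a padded bounding box around them, and crop to that window.

# Generic face-driven ROI crop (illustrative only; not the Roland/Canon
# implementation). Requires the opencv-python package.
import cv2

cap = cv2.VideoCapture(0)  # webcam, or pass a video file path instead
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    if len(faces) > 0:
        # Keep the two largest faces, mirroring the "finds and follows
        # two people" behaviour described above.
        faces = sorted(faces, key=lambda f: f[2] * f[3], reverse=True)[:2]
        xs = [x for x, y, w, h in faces] + [x + w for x, y, w, h in faces]
        ys = [y for x, y, w, h in faces] + [y + h for x, y, w, h in faces]
        pad = 80  # generous margin around the subjects
        h_img, w_img = frame.shape[:2]
        x1, y1 = max(min(xs) - pad, 0), max(min(ys) - pad, 0)
        x2, y2 = min(max(xs) + pad, w_img), min(max(ys) + pad, h_img)
        roi = cv2.resize(frame[y1:y2, x1:x2], (1280, 720))
    else:
        roi = cv2.resize(frame, (1280, 720))

    cv2.imshow("auto-roi", roi)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()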




Micro Pan/Tilt Mount Head for Mini Cameras MARSHALL ELECTRONICS has joined forces with high-end pan/tilt head designer BR Remote, to design and build a micro P/T head, the CV-PT-HEAD, for Marshall miniature cameras. The new P/T head is compatible with all Marshall 500 series cameras including the CV503, CV503-WP, CV506, CV506-H12, CV565 and CV566. The CV-PT-HEAD body is less than 3.5-inches tall and less than 2-inches wide creating one of the smallest P/T footprints available. The CV-PT-HEAD comes with an IP65 weatherproof rating, which means it is dust-tight and protected against water under pressure from any angle. When paired with Marshall’s weatherproof CV503-WP camera, the CV-PT-HEAD creates an excellent movable camera position that can be left out in wet and dirty environments. There is a built-in camera power supply delivering a constant 12V to the camera socket as well as camera control data. The input voltage is from 12V to 35V allowing operators to run up to 3000 feet (914.4 metres) with a common 4-pin XLR cable. Unique for a small head, the input socket for power and control data is on the fixed base. This allows for better freedom of movement, with the camera socket on the moving part with the camera. For seamless plug-and-play, operators can use Marshall’s CV-MICRO-JYSTK controller (sold separately) with thumb joystick control. Visit www.marshall-usa.com






Canon Looks to Make Another Mark with EOS R5 and R6 IT WAS SOME 12 YEARS AGO, in 2008, that Canon released its game-changing 5D Mark II. Combining video and stills in the one device, it was the first full-frame DSLR on the market with a full HD movie mode built into it. Now Canon is looking to repeat history with the release of the Canon EOS R5 and EOS R6, two new full-frame mirrorless additions to Canon’s EOS R System built on the RF Mount. The pro-level EOS R5 delivers 45 megapixel stills at up to 20fps and is the first full-frame mirrorless camera ever to record 8K RAW at up to 29.97fps internally and 4K at up to 120p. The EOS R6, meanwhile, captures 20.1 megapixel stills at up to 20fps, 4K video up to 60p and Full HD at up to 120p. According to Brendan Maher, Senior Product Manager, Consumer Imaging, Canon Australia, the EOS R6’s target market is “… a bit more focused on the stills photographer, definitely in that low light space, and event photographers, who, if they’re taking thousands of images every day, having less megapixels actually helps them because of their workflow, with smaller file sizes it makes it a little bit easier for them. “For the R5, really seeing it obviously with those impressive video specs is a bit more focused on the videographer, definitely a great camera for wedding photographers that might be picking it up as hybrid shooters, doing equal amounts video and stills. This is the camera for them because it excels in both areas. And in terms of travel or landscape shooters, 45 megapixels stills is fantastic for them because they can get a bit more flexibility to crop or to print bigger.”


Both cameras feature in-body image stabilisation rated at up to eight stops of shutter-speed correction. Using a “Co-ordinated Control Image Stabiliser”, in which the Optical IS system of RF lenses talks to the 5-axis in-camera stabilising unit, the system is designed to deliver a better result than either could achieve on its own.
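As a quick worked example of what an eight-stop rating implies (a rule-of-thumb sketch only; real-world gains vary with focal length, subject movement and technique): each stop doubles the usable handheld exposure time, so eight stops is a factor of 2^8 = 256.

# Each stop of stabilisation roughly doubles the usable handheld shutter time.
stops = 8
factor = 2 ** stops          # 256x
base_shutter = 1 / 250       # an example unstabilised handheld limit, in seconds
print(f"Gain factor: {factor}x")
print(f"1/250 s becomes roughly {base_shutter * factor:.2f} s")  # about 1 s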


DIGIC X processor technology at the core of the EOS R5 and EOS R6 – the same technology found in the EOS-1D X Mark III – supports next-generation Dual Pixel CMOS AF II. Claimed to offer the world’s fastest AF speed among interchangeable-lens digital mirrorless cameras, the EOS R5 focuses in as little as 0.05 seconds and can focus in light levels as low as -6EV. The EOS R6, meanwhile, is the first EOS camera to offer a minimum EV for AF of -6.5EV. The high-precision AF is effective even in poorly lit or low-contrast shooting conditions. The iTR AF X system has been programmed using algorithms and a face/eye detection mode that ensures subjects are kept sharp even when moving unpredictably with a shallow depth of field. Even if a person turns away for a moment, their head and body continue to be tracked. “We’ve got eye detect auto focus and all detect auto focus,” says Brendan Maher. “Both working in stills or movie modes, both work in servo as well, so it will track your subject around the frame as they move. We’ve got a huge amount of auto focus zones. We’ve also got the multicontroller joystick on the back of the camera for moving your AF point around the screen.” There is also an Animal Detect feature which will track dogs, cats and birds. With built-in Bluetooth and Wi-Fi, the EOS R5 (5GHz Wi-Fi) and EOS R6 (2.4GHz Wi-Fi) can be easily connected to a smartphone and networks, allowing high-speed file sharing and FTP/FTPS transfer. This functionality also allows the cameras to be remotely controlled using the Camera Connect and EOS Utility apps, or tethered to a PC or Mac via Wi-Fi or high-speed USB 3.1 Gen 2. With a camera body weight of 730g, they can be carried by a mid to large-sized drone. The EOS R5 and EOS R6 also support automatic transfer of image files from the device to the image.canon cloud platform, or integration with Google Photos or Adobe Cloud workflows. While the Wi-Fi capability can’t be used for streaming, Canon USA is Beta-trialling a USB platform for camera connection to a PC where the Canon camera takes over from the PC’s internal webcam. “That will launch later this year,” says Maher. “So, that’s across a range of platforms you’ll be able to use, not just these cameras but pretty much any camera that we’ve launched in the last three or four years, will be able to run via USB and stream on a range of platforms. And it’ll be a free service when it does launch. It’s free to download now, the Beta version itself.”

For on-board storage, the EOS R6 uses two SD UHS II card slots, while the EOS R5 uses one SD UHS II slot and one CFexpress slot. “The reason behind that,” says Brendan Maher, “is if you want to shoot 8K or if you want to show 4K 120P, it’s a huge amount of information, a huge amount of data, and basically you need the fastest card on the market for that, so the CF express card is needed if you’re shooting 8K or 4K or 120P.” “8K is a pretty exciting spec. What we’ve seen over the last couple of years is display manufacturers, your TV manufacturers are out there pushing panels out into market which are 8K, but there’s not a whole heap of content out there that you can display on those screens, so the opportunity for Canon photographers and videographers now to pick up an R5 for what is a very reasonable sort of price point, compared to other 8K cameras out there on the market, and be first to market with really 8K content, shooting on our devices, is pretty exciting. 8K’s four times the resolution of 4K, so it’s not just about outputting 8K either at the maximum resolution, you’ve also got the flexibility there to crop into your frame by up to four times and still deliver a beautiful 4K video file.” Visit https://sg.canon/




New Fujifilm Premista Lens, PL Mount Box Lens, and MK Lens Mount

FUJIFILM HAS ANNOUNCED the development of the FUJINON Premista 19-45mm T2.9 lightweight wide cinema zoom lens (“Premista 19-45mm”) for large format sensor cameras, as well as previewed the SK35-700mm telephoto PL mount box lens. In addition to these items, an MK lens mount developed by Duclos Lenses for the highly anticipated RED Komodo cinema camera has been introduced.

The short, lightweight wide-angle Premista 19-45mm T2.9 expands the Premista family of zooms to three lenses. Joining the 28-100mm T2.9 and the 80-250mm T2.9-3.5, the Premista 19-45mm produces images with natural and beautiful bokeh, outstanding high resolution, accurate colour rendition, and controllable flare with minimal ghosting for capturing high dynamic range. The lens shows very little distortion throughout the entire zoom range, lightening the burden of correcting footage after shooting, and allowing high quality cinematic images to be created more efficiently. “The response we’ve seen to the Premista lenses since their 2019 launch has been tremendous both in terms of excitement and usage across feature film and high-end TV productions,” said Thomas Fletcher, Director of Marketing, Optical

Devices Division, FUJIFILM North America Corporation. “Now, with stricter safety and efficiency needs on set, there is a growing demand for high quality zoom lenses that match the quality and ‘look’ of prime lenses, and efficiently capture images without the hassle of having to frequently change lenses. The Premista family checks off all the boxes, with no compromise.” The Premista 19-45mm is scheduled for release in early 2021.

SK35-700mm Telephoto PL Mount Box Lens Fujifilm has also developed the FUJINON SK35-700mm PL Mount Telephoto Box Lens (SK35-700) for 8K television applications, but the company will now be doing extensive market research, exploring the possibility of repurposing the lens in response to the emerging needs of the multi-camera cinema style production market. The lens features a 20x high magnification zoom, covering a focal range of 35mm-700mm at F2.8 (35-315mm) and F4.8 (at 700mm). The SK35-700 also features a 1.4x extender, which brings the range to 49mm-980mm on S35 cameras while also offering significant coverage on many large format cameras. It is 28” long and weighs 69 lbs. The SK35-700 is currently the only telephoto PL mount box lens on the market. Its design provides for unparalleled cinematic imaging in various multi-cam productions.

Sony Introduces ‘Cinema Line’ SONY HAS ANNOUNCED the launch of Cinema Line, a series of new camera products for content creators which will bring together the company’s expertise in image quality, attention to detail, technology and passion in digital cinema. According to Sony, Cinema Line will deliver not only the coveted cinematographic look cultivated through extensive experience in digital cinema production, but also the enhanced operability and reliability that meet discerning creators’ various needs. The new series will extend beyond basic cinema camera and professional camcorder form factors. In 2000, Sony released the HDW-F900, the world’s first 24p digital cinema camera. VENICE and other products followed in response to countless dialogues with cinematographers and image creators. Existing products in the Sony range forming part of the Cinema Line include VENICE and FX9. VENICE has become a first choice for digital movie production, and the FX9 has an outstanding track record in documentary production. The next step is a new model that will appeal to a wide spectrum of visual creators. Sony will be releasing and shipping this next addition to the range, the FX6 camera by the end of 2020. Each of the Cinema Line cameras will evolve with user feedback: The FX9 Version 3 upgrade, available in 2021, will see the addition of the S700PTP allowing remote control of Sony’s camera, a Centre Scan mode for Super 16mm lens and B4 Lens support with its adaptor as well as other features. In parallel, as of November 2020, the VENICE camera will see a couple of additional features in its V6.0 version which will improve its operability in broadcast and live environments. Cinema Line website – https://www.sony.net/cinema-line Cinema Line video – https://pro.sony/rewrite/cinematic-look Visit https://pro.sony

The FUJINON Premista 19-45mm T2.9 lightweight wide cinema zoom lens.


Duclos Lenses MK-R Mount U.S. company Duclos Lenses has developed the MK-R Lens Mount, an RF mount conversion that makes the FUJINON MK 18-55mm and 50-135mm zoom lenses compatible with a variety of RF mount camera bodies – most notably, the highly anticipated Super 35 format KOMODO 6K camera from RED. Paired together, the setup is extremely small, lightweight, and relatively affordable. For more information on the MK-R Mount, visit www.ducloslenses.com Visit https://bit.ly/35Y46Cv

Sony Remote Camera Range to Support Free-D Protocol AS PART OF ONGOING EFFORTS to help productions work in remote and safe ways, as well as create engaging broadcast experiences, Sony is upgrading its BRC camera range with a new set of features. Designed for remote production and efficient operations, the v2.1 firmware upgrade to BRC-X1000 and BRC-H800 will allow producers and operators to simplify their VR/AR production workflows. Through this update, the BRC cameras will output tracking data over IP, using the industry standard Free-D protocol. This enables the cameras to directly feed the pan, tilt, zoom, focus and iris, as well as the position of the BRC cameras in real time, making VR/AR production simple and cost effective without the addition of other tracking devices or systems. Further, this new feature will allow productions to easily incorporate VR/AR into their live content, such as expanded sets or scenery, live animations, e-sports and graphic overlays, thus enriching their production. Free-D protocol is an industry standard protocol, supported by major AR/ VR solutions providers. BRC-X1000 and BRC-H800 are currently under verification with The Future Group (Pixotope), Reckeen, Vizrt and Zero Density, and plan to support integration with other partners supporting Free-D data in the near future. In the context of social distancing and reduced operational crews allowed, the update will also improve the pan/tilt/zoom operations of the BRC-X1000 and BRC-H800. A reduced minimum speed allows the camera to track more accurately an object on the set and facilitate shot framing. The output will therefore be more realistic and smoother, even with non-professional operators. What’s more, the cameras will now start to focus as soon as the pre-set recall is done, at the same time as the PTZ function, so that the camera movements look more natural. Control when using a physical remote controller such as a pan-bar one will also be supported. The firmware V2.1 upgrade is available now via Sony’s product website. Visit https://pro.sony
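For integrators curious what “tracking data over IP using Free-D” looks like on the wire, below is a rough Python sketch of a UDP listener for the commonly documented 29-byte Free-D type-D1 message (pan/tilt/roll, XYZ position, zoom and focus). The byte offsets, scale factors, port number and checksum rule shown here are assumptions based on publicly circulated descriptions of the protocol and should be verified against the Free-D specification and the camera’s actual output.

# Rough Free-D type-D1 (0xD1) UDP listener. Offsets, scaling and the
# checksum rule below are assumptions from public descriptions of the
# protocol; verify against the spec and the camera's documentation.
import socket

def s24(b: bytes) -> int:
    """24-bit big-endian two's-complement integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_d1(p: bytes):
    if len(p) != 29 or p[0] != 0xD1:
        return None
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256 (assumed).
    if (0x40 - sum(p[:28])) & 0xFF != p[28]:
        return None
    return {
        "camera_id": p[1],
        "pan_deg":  s24(p[2:5])  / 32768.0,   # assumed 1/32768-degree units
        "tilt_deg": s24(p[5:8])  / 32768.0,
        "roll_deg": s24(p[8:11]) / 32768.0,
        "x_mm": s24(p[11:14]) / 64.0,          # assumed 1/64 mm units
        "y_mm": s24(p[14:17]) / 64.0,
        "z_mm": s24(p[17:20]) / 64.0,
        "zoom_raw":  int.from_bytes(p[20:23], "big"),
        "focus_raw": int.from_bytes(p[23:26], "big"),
    }

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 40000))   # port is whatever the camera is configured to send to
    while True:
        data, _addr = sock.recvfrom(64)
        fields = parse_d1(data)
        if fields:
            print(fields)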





SPORTSCASTING Sport coverage worldwide

www.content-technology.com/sportscasting

Malaysian Football – Back on the Pitch with Ideal Live AS THE COUNTRY’S MOVEMENT CONTROL ORDERS and other restrictions begin to ease, the Malaysian Football League is back, in a truncated form, without crowds, but back all the same. Mainly broadcast via Telekom Malaysia, with selected matches shown on other regional OTT and free-to-air operators, coverage of the MFL has, since 2018, been in the hands of Ideal Systems’ content and production division, Ideal Live. According to Updesh Singh, Director of Technology, South East Asia, with the Ideal Systems Group, a “normal”, non-pandemic year would have seen Ideal Live covering around 200-300 matches across all the Malaysian football competitions. “We act as a technology partner to the Malaysian Football League, rather than a supplier,” he says. “The first thing is we do on-rotation production, which means we have camera crews, vision mixers, interviewers, on-field staff. The second thing we do is the content distribution and processing. We run sort of a production control room for the Malaysian Football League. It’s a central hub where there are Malaysian Football League people who are able to instantly access the content, edit it and publish to social media. We also distribute the content to any affiliates. All the content processing and distribution, 100 per cent is done by us, but on-field production we do a majority of it. Some of it they do outsource to a couple of third parties because of rights issues, but we cover the majority of the League.”

Could you talk me through the production process now, as opposed to pre-COVID? Sofiyant Neo: “We have three teams. Their details are given to MFL, the organiser of the League. Every stadium, before we go, we have to provide the list of the crew that is coming. And we start with the crew coming here to our office. We know the crews who’s coming. They have to come in. They have to do their temperature. They have to log in. They come into the building. After that, they pick up their gear. They go to the venue. Upon arriving at the venue, same process. You have to register yourself. And the official will basically check each name. If they see changes, they will verify with MFL, any basic changes. “We don’t use an OB truck, we use flyaway kits so we used to have slightly smaller room, but it’s become slightly bigger. Because, even sitting, we have to have a big gap. Like a metre at least apart. And same goes with the players. The players come from the bus. They come, basically, one by one. Scan. Register. Go inside. And even when they come up from the tunnels. Same thing. They’re not really close until they start playing. There is no handshake. There’s no national anthem. There’s no club anthem anymore. Because then you’d have to queue up and you can’t have a long queue now, and so on.

Sofiyant Neo, Director of Media and Creative Content with Ideal Systems, and Head of Ideal Live.

Updesh Singh, Director of Technology, South East Asia, with the Ideal Systems Group.


The Ideal Live MFL master control room.




According to Sofiyant Neo, Director of Media and Creative Content with Ideal Systems and head of Ideal Live, “Our setup is totally different from a traditional setup. Our setup is very versatile. I’ve been a content guy for my entire life. And when I joined IDL and started learning and getting exposed to all this technology, I found myself in a position where I have this crazy technology, this technology that can make the content, the same content, more simply.”

“They used to go on air about 8.15pm. Now they’re going on air by 8.45. The pre-show is already filled in so 9 o’clock is the kick-off. Personally, I go to most of the stadiums to ensure that SOP (standard operating protocol) is being followed. And before the match starts, and after the match ends, we always have a briefing session. Again, we will highlight the key protocols, there’s a critical one which involves the crew getting involved with the players, or going to grounds. We just keep on reminding them, these are the protocols. After the show, again when we sit down and do our meetings, again we highlight that, okay, this is something that went wrong, this should not go wrong again. We need to rectify this. Or this we need to improve. It can be a production work. It can be SOP. It can be anything.”

Updesh Singh: “And some other operational issues. For example, some very interesting ones like in the past when they do interviews there would be mic sharing, so you move the mics. Now you are not allowed to share mics. Everyone gets their own mic. With talkback intercom, it’s difficult to talk when you have a mask on because the noise gets in and tries to cancel your voice out. We have many smart, innovative ways to work around some of those issues. And even our flyaway kit, we can build in 90 minutes. We can break in one hour or less. Every system or solution in the kit must be able to be built within 90 minutes. Anything that cannot fit the time will not be put into the system. 90-minute build, including cable, everything. 90 minutes build. One-hour break. The on-location infrastructure is built with that type of mindset.” At the venues are there certain no-go zones for the crew? Sofiyant Neo: “It’s pretty standard what basically we do. When the players arrive. We still have the shots. We shoot them how they arrive. How they register themself. How they follow the SOP. And after that they go through the tunnel – we don’t go through the tunnel anymore. Besides that, it’s just standard production. You have your five-six cameras fixed. And you do your productions accordingly. There’s no fans in the stadium. It’s still empty stadiums. But we have background sound. We still put mics, and still have the stadium sound because the coaches, the players are screaming and shouting as well. And, of course, we mix with recorded crowd sounds so on TV it still sounds like it’s a stadium with crowds.”



Updesh Singh: “Other things that have come up because of the COVID situation is sometimes they need to do two matches at the same time. At any one time we’re doing four, six, multiple matches on match day, so there’ll be three matches on Friday. Three matches on Saturday. Three matches all over the country. That is a challenge in itself. “What we are doing is we are re-using equipment on location, and we are ‘popping up’ cloud infrastructure. We use a mix of technologies. Which means we can easily pop up a third match. A fourth match. A sixth match. Or seven matches in the cloud. We can have people working from on location or somewhere else all accessing the cloud infrastructure. We were thinking of going 100 per cent cloud in the 2021 season, but this has been the early trigger. MFL is only one of our clients, Malaysian Football League. We do many other live events. And now we have the ability to really quickly pop them up in the cloud. All we need are a bunch of cameras and I will say we can do about 80/90 per cent in the cloud.” Could you describe the gear you’re using from the cameras right through the chain? Updesh Singh: “In terms of cameras I would say we are not technology fixed. We are very agnostic. However, we use a lot of JVC cameras for the Malaysian football league, which no other broadcaster or provider in Malaysia has got. That gives us that slight edge in terms of the slightly different colour, and the sound and all. “Then we also use some other tools. We use a product called Streamstar. From early days, we decided we will not use traditional old school gear. We have turned into a very IT-based type of infrastructure which very soon we are virtualising into the cloud. We will be able to do slow motion, instant replay type of things in the cloud. Because the core of the system was software from day one, we also try to avoid using hardware. We trying to use more touch screen types of interfaces. There are some panels, but we are trying to not have as many as possible. Obviously, we also have an older lot of switchers. We use a lot of standard kit for contribution. “We use a lot of AWS, Amazon, Elemental, so we use a lot of AWS kit because we use a lot of the cloud. We have a department within Ideal called Ideal Cloud. What we do is we build integrations that are not readily available. For example, if there’s a third-party CG we want to use, which is cloud-based, we will build the hooks for it. What we have is a cloud platform that allows us to hook many third-party sources. For example, if I wanted to virtualise instant replays in the cloud the Ideal Cloud team will virtualise it, and offer me a user interface where I can do more than just that. “There’s a lot of Ideal work going into the cloud function and, obviously, we’re still working with a

lot of third parties because we don’t make instant replay systems, we don’t make CGs. We work with many of them to get it all hooked in. “For a lot of our contribution we used dual encoders. We have now started - rather than using the encoders, we started using the Dejero EnGo which basically offers a mobile network. Now we are more interested in networks rather than encoding. We do most of our encoding in the cloud. So, these are software defined network boxes that allow you to have multiple networks, bonding, and it gives you a really reliable link to the cloud, so we are moving into that direction. “In terms of multi-viewers, we used cloud-based infrastructure from Sienna as well. We can get a 20 x 20 multi-viewer in the cloud for anyone anywhere in the world to have a look at, for example. We are not stuck by buying five different multi-viewers. Shipping them. We don’t have the traditional problems with broadcasters’ fixed infrastructure. They are limited by what they own. By utilising cloud, and obviously the shift to having softwarebased technology, we are able to construct infrastructure on the run. We can build an entire chain in 15, 20 minutes. We are doing things in minutes, rather than this old traditional way. And I think that has sort of changed the dynamic. And Malaysian Football is our case study for that.” With the cloud technology you use, how much has been developed in-house? Do you leverage the technology that Ideal has developed for the DR platform, for example? Updesh Singh: We do most of our work with AWS, so we have an AWS Ideal platform. But that’s not the only cloud provider we work with. We also have relationships with Alibaba, Google, the others. We are mostly driven by customer demand usually. However, for in-house usage we usually fall back on Amazon web Services. Because of the technology stack that they offer, I wouldn’t say the others don’t have it, but more people have better knowledge of how to use it, and there are more trained people at it. In terms of Ideal versus third party, I would say we’re currently at about 50/50. And I think this is how it’s going to stay. Obviously, we are not just a reseller in that we bring in some graphic system and give it to a client. We try to improve upon that and, as I told you what we do, we are gap fillers. If you find a problem, or obviously if the system is not suitable or something, we will fill the gap. And that’s how we get the nice end polished product specific to a client. Because everyone is unique in their own way.”

What can you tell us that’s coming down the track with the MFL coverage? Updesh Singh: “We are now in negotiations with MFL and a few other sporting bodies. Currently, we do a lot of work in Malaysia/Singapore - and to a certain degree in Indonesia. We will probably start rolling it out across the Ideal countries, which is in I’ll say about 80 per cent of Asian countries where we have a presence. We will, obviously, start rolling it out as a sort of group-based roll out, rather than just focusing on one or two clients. That’s what’s going to happen.” Is there anything we’ve missed about the football coverage? Sofiyant Neo: “One is the simplified production with the technology. Simplified also means economical. In the traditional way where you have the OB truck with the 30 people in the production, and you have set-up like half a day or one day, for us it’s just 10 people and setting up in one hour and doing the same thing. “Two, I know the fact that I’m using technology and using much more simplified technology that does the same thing, I think for me, as a content guy, one key point that I would like to share to the content players, that you can use the technology to enhance your creativity. Because it can make it magic, it can make it so beautiful. You don’t have to be traditional. I just hired a guy who had been in film and TV for 17 years, and I told him this is what you need to prepare yourself. If you can’t prepare yourself, I don’t think so we’re going to see each other again. I make him understand. If you can accept the technology, you can use it, utilise it, you will see the magic and you can survive. You can do anything. It could change your life.” Visit https://www.idealsys.com/ideal-live






Tuning Into the 5G Future for Broadcast
By Karen Clark, Telstra Broadcast Services Head of Sales

FOR MANY, THE FUTURE OF 5G is still the great unknown. But what is widely known is that it will fundamentally change everything we know about technology and how we interact with it. Each mobile generation, from 1G to 5G, has brought with it a brighter future, and no other industry appears more poised to reap the rewards of the latest digital transformation than the broadcast sector. 5G will make good on the promise of a truly connected entertainment experience, delivering speeds and network capacity we’d only previously dreamed of.

It’s Showtime

Among the many 5G demonstrations at Telstra Vantage in 2019, I was particularly blown away by Australia’s first live 5G broadcast. In partnership with Igloovision, Summit Tech Multimedia Communications, and Magna Systems & Engineering, we streamed 360° video live over 5G for the very first time, showcasing Kurrawa beach, Queensland, onto the Vantage show floor in breathtaking 8K resolution. For many, it was an awesome demonstration of 5G’s future power, but for me it was a captivating glimpse into its potential to redefine the broadcast and entertainment industry.




Endless possibility

To understand what broadcasters are expecting with 5G, we went straight to the source. We carried out internal research to understand what media professionals are anticipating and how they want emerging mobile technologies, such as 5G, to power their broadcasting workflows.


Having spoken with television and radio broadcasters, sports broadcasters, e-sports bodies, production teams and freelancers, the sentiment is both fascinating and unanimous. Our customers believe 5G will initially be highly complementary to existing technology, but its increased capacity offers the capability to deliver more immersive, higher resolution experiences.


5G is more than just making Ultra High Definition content the new standard in this country. It’s about empowering 360° filmmaking, virtual reality and augmented reality applications and the promise of new tools for broadcasters and content creators to work with. It’s likely 5G’s first impact in the broadcast industry will be felt mostly in the live sport sector. We’ve already seen a number of high-profile international trials by our industry peers across the US, Europe and Asia and the results have been eye-popping. NBC Sports successfully trialled the use of 5G during a live broadcast for the National Football League (NFL) in December. Beyond the visual quality, the technology deployed by the NFL seeks to improve audience interactivity and opens the door to VR and AR home broadcasts in real time. The need for this type of content has only grown in the face of COVID-19. In 2020, almost all sports experiences are virtual and

5G technology is positioned to disrupt and democratise the way we consume sports and other broadcasts. 5G will have the capability to place anyone, from pretty much anywhere in the middle of the game, with the addition of rich immersive content and without the expensive first row tickets. When looking at a successful deployment like this, it’s easy to envisage a world where the NRL and AFL follow suit - offering users unprecedented ways to enjoy the game they like, being closer to the action than ever before.

A bigger bandwidth slice

Cellular solutions for live broadcasting have been around for some time, but haven’t been embraced in live sport due to network congestion at stadiums, where tens of thousands of people all compete for the available bandwidth. The difference with 5G and its superior capacity is that we’ll be able to offer network slices. For those unfamiliar with the term, 5G network slicing allows a network provider, such as Telstra, to create specific broadcast-grade networks for customers with the guaranteed latency, bandwidth and quality required for live broadcast, without competing against other network use. 5G network slices can be turned on and off when needed, so they aren’t permanently consuming bandwidth on the network. Broadcasters could soon be offered ‘slices-as-a-service’, which can be booked on a per hour/per minute basis the same way fibre and satellite network resources currently are. 5G network slicing will make wireless contribution a viable option for live sport.

It’s exciting to think what this will mean not just for the modern fan experience and engagement but also for their own contribution – harnessing live user-generated content. With 5G equipped fans sitting in the stadium or at an off-site fan zone, our customers can incorporate their unique experiences into their production on any platform in near real time.


Telstra has the largest, fastest, smartest and most reliable next generation network, and is continuing to roll out 5G coverage to more parts of Australia. The potential for 5G to expand into the broadcasting space is exponential. If we can inspire the next generation of content creators, who knows what the future of entertainment can be – a new world where a VR headset can transport you to the MasterChef kitchen? Or maybe the AFL Grand Final broadcast in 360 from the centre circle? Visit http://www.telstra.com/broadcastservices



Digital Nirvana and Adobe Automate Closed-Captioning for US Tennis

DIGITAL NIRVANA, a provider of leading-edge media-monitoring and metadata generation services, has announced that its Trance automated postproduction captioning solution with advanced AI is powering the closed captioning workflow for the United States Tennis Association (USTA), the US governing body for tennis. Integrated with Adobe Premiere Pro, Trance improves the speed and efficiency of the USTA’s caption generation process while freeing technical personnel to focus on the creative aspects of their jobs.

Dream Chip Super Slow Motion Camera

BUILDING UPON its existing range of mini cameras, Dream Chip has now added the ATOM one SSM500 to its range. A Super Slow Motion camera capable of recording 500fps in full HD resolution (1920×1080), the ATOM one SSM500 is claimed to be the smallest high-speed camera on the market, measuring only 190mm x 60mm x 60mm. This combination of small size and crisp, detailed slow-motion output makes the ATOM one SSM500 perfect for applications in a range of contexts – and particularly that of sports broadcast. Compatible with both B4-Mount and C-Mount lenses, the ATOM one SSM500 is remarkable for the level of flexibility it provides in both outputs and control. With soon-to-be implemented IP control along with standard RS 485 connection, the camera also uses open protocol so that it remains compatible with a range of remote-control options from vendors such as CyanView, Skaarhoj and Antelope. The intuitive GUI shipped with the ATOM one SSM500 also enables the connection of up to 99 other Dream Chip cameras, allowing for comprehensive but clear control of all the expected image output options; from elements such as framerate and resolution, to memory and recording buffer status, to white balance and exposure. Indeed, the Atom one SSM500 has multi-matrix colour correction to ensure perfect match with other cameras being used for the broadcast – a feature that is unique in cameras of this size and FPS capability. The workflow associated with the camera is also simple, whilst still providing user flexibility. There are two different operational workflows: the first allows for Super Slow Motion capture through 2, 3 or 4 phase connection to any server, facilitating capture at up to 240fps. Alternatively, a ‘trigger’ mode can be used, giving the ability to capture up to 60 seconds of 500fps footage, which is stored in the internal memory. Switching between modes is easy, and the camera can be setup to start in the preferred workflow setting. This flexibility in application allows for broadcast professionals to make use of the camera in a way that best suits the needs of their existing practices and workflows.

“The USTA aims to give tennis enthusiasts access to its content with minimal delay, especially when it pertains to live events or other timely content – something it couldn’t do with its old captioning process. Now with Trance seamlessly integrated into its Adobe postproduction workflow, the league gets a rapid turnaround time that is virtually unheard of in the captioning industry,” said Russell Wise, senior vice president of sales and marketing for Digital Nirvana. “Not only that, but the USTA has the industry’s highest caption-quality standards, with captions that are virtually 100% accurate and display with minimal delay, as the words are spoken.” The USTA’s previous captioning process was time-consuming and inefficient, relying on manual tasks and transcripts generated by a third-party speech-to-text service. Previously, video personnel were required to copy and paste text from the transcripts, line by line, onto the video timeline and manually enter timecodes. Depending on the duration of the video program, captioning was taking several hours – an unsatisfactory turnaround. Now, with Digital Nirvana’s new AI-powered workflow Trance combined with the power of Adobe Premiere Pro, most of the manual work has been eliminated and captioning is provided in a fraction of the time. Since moving to Digital Nirvana, the USTA production team has been able to cut the turnaround time for captioning an average video title from hours to only 30 minutes, with the captioning task completely offloaded from the technical team. In a typical captioning workflow, a video technician simply drags the clip to a “hot folder” in the Digital Nirvana media service portal. The clip is then automatically uploaded to the processing centre, where the Digital Nirvana team uses AI-based speech-to-text algorithms to generate a transcript, validate its accuracy, and then burn captions into the video. Once complete, the captioned video is automatically uploaded to the Premiere Pro timeline.
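The “hot folder” hand-off described above is a common automation pattern; the sketch below shows a generic polling watcher that picks up newly arrived clips and hands them to an upload routine. The folder paths and the upload function are placeholders for illustration only and do not represent Digital Nirvana’s portal API.

# Generic hot-folder watcher (illustrative pattern only; the upload call
# is a placeholder, not Digital Nirvana's actual portal API).
import shutil
import time
from pathlib import Path

HOT_FOLDER = Path("hot_folder")        # where editors drop clips
PROCESSED = Path("processed")          # moved here once handed off
VIDEO_EXTS = {".mov", ".mp4", ".mxf"}

def upload_for_captioning(clip: Path) -> None:
    # Placeholder: a real system would push the clip to the captioning
    # service and register a callback for the finished captions.
    print(f"Uploading {clip.name} for transcription and captioning...")

def file_is_settled(clip: Path, wait: float = 2.0) -> bool:
    """Crude check that the file has finished copying: size stops changing."""
    size = clip.stat().st_size
    time.sleep(wait)
    return clip.stat().st_size == size

def watch() -> None:
    HOT_FOLDER.mkdir(exist_ok=True)
    PROCESSED.mkdir(exist_ok=True)
    while True:
        for clip in HOT_FOLDER.iterdir():
            if clip.suffix.lower() in VIDEO_EXTS and file_is_settled(clip):
                upload_for_captioning(clip)
                shutil.move(str(clip), PROCESSED / clip.name)
        time.sleep(5)

if __name__ == "__main__":
    watch()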

At the size which Dream Chip has achieved with the ATOM one SSM500, it can capture crystal clear slow-motion sports footage from unusual and innovative angles; embedded within the pitch, on goalposts or anywhere close to the action.

Digital Nirvana’s Trance application unites cutting-edge STT technology and other AI-driven processes with cloud-based architecture to drive efficient broadcast and media workflows. Implementing cloud-based metadata generation and closed captioning as part of their existing operations, media companies can radically reduce the time and cost of delivering accurate, compliant content for publishing worldwide. They also can enrich and classify content, enabling more effective repurposing of media libraries and facilitating more intelligent targeting of advertising spots. The latest version, Trance 3.0, has a text translation engine that simplifies and speeds captioning in additional languages and automated caption conformance to accelerate delivery of content to new platforms and geographic regions.

Visit https://www.atom-one.de

Visit www.digital-nirvana.com

TEDIAL HAS ANNOUNCED a number of new features for SMARTLIVE, its automated live sports production solution. Consistently updated, SMARTLIVE has several features and functionalities incorporated for increased fan engagement. Broadcasters can now deliver improved and more vivid graphics thanks to a new integration between SMARTLIVE and Singular.live’s cloud-based technology for live graphic overlays. Graphics can be inserted into the SMARTLIVE auto-generated EDL, reviewed in the player and then burned in when the EDL is flattened. This gives broadcasters the capability to enhance storytelling by layering

graphics on top of highlights. In addition, SMARTLIVE can now create transitions between EDL segments, such as fade in/fade out, adding a dynamic element to highlights. Providing an additional advantage for remote production Tedial’s player allows users to review content in different low-res quality with an automatic adaptive bit rate depending on the quality of the connection. SMARTLIVE is also now connected to a video AI engine to detect replays into the video live stream or file to provide even more advanced storytelling, making highlights more dynamic.

Utilising AI tools, SMARTLIVE leverages its metadata engine to automate clip creation through to distribution – including tight integration with social media. It also automates the event metadata ingest process; automatically provides a production environment for the production team; provides multiple, instantaneous live searches to bring historical media and informational archives to live events; and automatically generates highlights, delivering them to digital platforms and social networks as well as traditional broadcast distribution channels, as it’s directly connected to the production environment.


Visit https://www.tedial.com


Tedial Boosts Automated Sports Production



NEWS OPERATIONS www.content-technology.com/newsoperations

AFP Captures Chengdu U.S. Consulate Closure with AVIWEST AVIWEST, A GLOBAL PROVIDER of video contribution systems, has announced that leading global news agency Agence France-Presse (AFP) provided live and exclusive video coverage of the recent closure of the U.S. Consulate in Chengdu, China, thanks to AVIWEST’s PRO3 Series bonded cellular system and smart-designed backpack. China ordered the consulate to be shut down on July 24 in retaliation for the U.S. closing China’s consulate in Houston, amid a general deterioration in U.S.-China relations. “Providing live images from anywhere in the world with an easy-to-deploy solution has been absolutely key to our video and photo operations,” said Thomas Pam, IT manager for Asia-Pacific at AFP. “In China, the AVIWEST PRO3 mobile field unit provided a fast and reliable means of sending live footage from the front line

as the event unfolded.” AVIWEST’s system features a unique, compact design that fits into a backpack. The PRO3 Series leverages up to eight 3G/4G-LTE modems simultaneously to enable automatic transmission of real-time images and streaming of broadcast-grade videos. On top of the eight cellular modems, the transmitter features a high-efficiency custom antenna array and a built-in Wi-Fi modem that ensure quick network acquisition and improved resilience in the field. Through the PRO380’s user-friendly, intuitive interface, AFP can instantly transmit live videos and photos without any action required by remote journalists. Leveraging AVIWEST’s Safe Stream Transport (SST) technology, the PRO3 field unit applies special stream-grooming techniques to improve link quality and reliability. “The AVIWEST PRO3 Series offers press agencies and broadcasters greater mobility with its compact design,” said Tommy Qiu, mainland China sales manager at AVIWEST. “It helps AFP to expand their live news coverage capabilities, share crucial real-time information, and to remain a major news source in the industry.” Visit www.aviwest.com and www.afp.com

IBC Recognises News Organisations of the World

ON A LIVESTREAM BROADCAST, IBC recently presented its highest award to the news organisations of the world. This is the first time in its history that it has recognised a collective rather than a single individual or organisation. The IBC2020 International Honour for Excellence acknowledged the critical role news broadcasters have played in continuing to inform and educate audiences during the COVID-19 pandemic. Every year, IBC, one of the world’s most influential media, entertainment and technology shows, recognises game-changing technical and creative innovation for social change and the exchange of knowledge. “The need for investigative journalism to separate fact from rumour has never been greater and the news organisations of the world have been unfailing in their drive to ask the difficult questions,” Michael Crimp, Chief Executive of IBC, commented. “As the global forum for the broadcast and electronic media industry, it is important that we recognise the excellence that the world’s news organisations have consistently delivered during these challenging times and thank them for their dedication and service.” The award was presented during a dedicated livestream hosted by BBC World News’ Kasia Madera. A specially produced video featured contributions from leading news organisations including the BBC; CNN; ITV News on behalf of ITN; NBC News International; TV Globo; and Zee Media.

During the half-hour video, the news organisations shared insight into how they are operating in the time of a pandemic, shedding light on their challenges and how they have overcome them through new workflows and remote production solutions.

“If there is a positive side to this pandemic, it is realising how collaboration between journalists and engineers has allowed us to adopt innovative production processes that are here to stay after the pandemic,” said Raymundo Barros, CTO, TV Globo. “We migrated traditional operations in our studios to journalists’ homes in New York and London and cities like Rio de Janeiro and São Paulo. We implemented remote post-production for hundreds of professionals through cloud technology [and] some shows were produced live entirely in the public cloud.”

The presentation is available on-demand via the IBC Showcase at https://www.ibc.org/ibcshowcase/programmes/ibc-international-honour-for-excellence

LiveU Tops the Polls for Singapore Election Coverage

LIVEU TECHNOLOGY PROVIDED key live streaming support during Singapore’s June 2020 General Election, enabling Mediacorp, Singapore’s largest content creator and national media network, to provide comprehensive live coverage from 31 constituencies across the country. Mediacorp used multiple field units to stream high-quality live video from Nomination Day events and the polling stations, coverage of candidates at their constituencies, and results on Polling Day itself. Leveraging LiveU cloud technology, Mediacorp also used LiveU’s platform for multi-destination distribution and disaster recovery. On the ground, management and support for the complex project was provided by LiveU’s partner in Singapore, Elevate Broadcast. Elevate’s role was crucial, both in the run-up preparations and on Polling Day itself.

James Hollis, Lead, Production Services, News and Current Affairs, Mediacorp, said, “This election was like no other, with social distancing paramount and additional measures to take into account when planning and providing coverage. LiveU’s high availability solutions guaranteed flawless transmission around the clock from multiple locations, enabling us to cover the election extensively, while keeping to the strict safety measures. Using LiveU’s cloud technology, we could also multistream the live broadcast footage to our YouTube and digital platforms, bringing the live feeds to wider audiences, without needing large crews onsite.”

LiveU is now working with Mediacorp to rev up its operations for coverage of Singapore’s National Day on August 9, in which LiveU’s technology will be used to share live feeds from across the island, allowing Singapore citizens to enjoy the celebrations while maintaining social distancing and safety guidelines. Yaal Eshel, General Manager – LiveU Asia, said, “Mediacorp provided exemplary election coverage around the country under challenging conditions. It was great to see our technology in action, delivering reliable live feeds and IP distribution as part of a seamless content workflow.” Visit http://elevatebroadcast.com and www.liveu.tv



Ideal Systems Launches APAC Rental Service for Zoom Rooms

ZOOM PRO-AV INTEGRATION partner Ideal Systems has announced the launch of its new Zoom Room rental service using the DTEN all-in-one video conference system.

Ideal Systems is now offering DTEN’s all-in-one Zoom Room video conferencing device as a rental service in Singapore, Malaysia and Indonesia, with more countries coming online soon. The new service is targeted at organisations that want to roll out and easily manage Zoom Rooms in the current times of travel restrictions and social distancing. The offering includes free delivery of the DTEN device, plus installation and set-up of the Zoom Room, including integration with the user’s Outlook or Google Calendar for Zoom call scheduling and room booking.

The DTEN device comes preinstalled with Zoom and is ready to go as soon as it is connected to the internet. With no separate speakers, microphones or cameras to worry about, the DTEN is quick and easy to wall or trolley mount. The DTEN has a 4K camera that can track presenters and a 16-element microphone array that can focus on the person speaking anywhere in the room, with a powerful on-board computer and a DSP developed with Zoom to ensure clear audio and minimise distracting background noise. The operating system has been locked down to ensure security and ease of use, with updates managed via the Zoom Portal.

Zoom and DTEN have streamlined and simplified the remote conference process, from booking meetings to instantly sharing content with participants, allowing businesses to maximise efficiency in communication. The RED DOT award-winning device also comes with a touch screen, enabling it to act as a digital whiteboard with functionality for annotating documents and graphics within Zoom calls. Copies of the whiteboard can be emailed directly from the screen. The DTEN also supports wireless screen sharing for users who want to present content directly from their laptop, so there is no more fiddling with VGA or HDMI cables – just a seamless, easy presentation.

Cavan Ho, Regional Manager at DTEN for South East Asia, said, “We are delighted to partner with Ideal Systems in the region. Their ability to provide the DTEN as a rental service, backed up by their installation, integration and support capabilities, makes them an ideal partner for us.”

“Covid-19 has changed the global business landscape and how business is done by catalysing the adoption of video conferencing, in particular with Zoom. DTEN and Zoom have made it easy for companies to deploy Zoom Rooms, and now our new rental model makes Zoom Rooms easy to afford, manage and support,” said Fintan Mc Kiernan, CEO of Ideal Systems for South East Asia.

Pictured: Cavan Ho, Regional Manager at DTEN for South East Asia [left], with Fintan Mc Kiernan, CEO of Ideal Systems for South East Asia.

Visit www.idealsys.com/dten-free-trial



Sony Expands Remote, Virtual and Distributed Production Solutions

SONY CONTINUES TO BRING new offerings to market for every type of broadcast production. These allow producers and media companies to work effectively and safely in tight spaces or to work remotely, leveraging the best of a team’s expertise and creativity.

Sony’s ELC automated production control system is designed specifically for streamlined news production and, amongst others, four major US broadcast groups representing over 60 ELC installations have been operating remotely with it during the crisis. Production control operations can be fully distributed or, in a now-typical scenario, a single home-based operator can direct the production remotely. In parallel, commentators, anchors and floor directors can work from different locations using ELC, which means news can be produced by a team of experts located anywhere in the world. The content resulting from these optimised production workflows benefits from the freedom of location.

As part of Sony’s efforts to support the media industry during the current health crisis, existing customers could remotely operate their XVS switchers, leveraging the virtual toolsets available such as Virtual Menu, Virtual Panel and Virtual Shotbox. During the pandemic, Sony offered these capabilities for free, on a trial basis. Using this remote operations set-up, operators and technical staff were able to work from home or a remote studio. The software has been used by major broadcasters in Europe and North America, as well as for eSports programme production in South Korea.

The recently announced BRC camera update will allow the cameras to output tracking data over IP, using the industry-standard Free-D protocol. These cameras, with their enhanced PTZ capabilities, require a minimal number of operators, who can work on several cameras remotely at the same time.

IP Live solutions from Sony have already been adopted by more than 100 customers worldwide. Launched in July, the HDCE-TX30 enables the HDC-3500 and previous generations of HDC cameras to become IP native, again opening up new possibilities for remote production. It allows for greater workflow stability and resource sharing, which is key for any customer investing in remote and fully distributed production. New licences for the HDC-3500 and HDCE-TX30 will enable IP HFR and IP 4K remote solutions from March 2021.

After remote production, virtualisation is the next natural step. As part of a strategic partnership, Nevion and Sony have been contributing to the 5G Virtuosa project since July. This project, in part funded by the European Union, aims to harness the rapidity of 5G connectivity and scalable software-defined network architectures for cooperative live media production, by exploiting virtualised production resources. The first part of the project was successfully delivered even during the lockdown period, thanks to the native IP connectivity of such systems.

For broadcasters already looking at the next step for their set-up, a move to a fully distributed and cloud-native infrastructure will ensure yet more safety and savings on resources. Between its virtual production service Hive, Ci Media Cloud Services, NavigatorX and a set of media workflow microservices that help tailor modern workflow architectures to specific needs, Sony has a full portfolio of solutions and services for future-proofing production infrastructures. News companies have been using Sony’s cloud-based Hive solution for the last few years: with 2,500 journalists in 200 locations across the world, Reuters has partnered with Hive, Sony’s centralised production platform that allows worldwide bureaus to collaborate on stories across different time zones. Geographically scattered teams can access and share content via the cloud, reducing production bottlenecks and driving down operational costs.

The latest addition to Sony’s cloud-based offerings, the Media Analytics Portal, will be available in October 2020. As programme makers rethink their production workflows, they need tools to monitor content flows, provide rich metadata and analyse outputs. Media Analytics Portal will bring these capabilities to their fingertips and will automate processes, freeing up time and creating new efficiencies. It complements the existing portfolio of cloud-based solutions from Sony, which empowers broadcasters and content creators to move from non-linear to fully distributed and cloud-based solutions.

For streaming concerts, festivals and corporate events live, Virtual Production, Sony’s cloud-native live production switcher, enables multi-camera production without the need for any dedicated infrastructure. It can be used as required, on a pay-as-you-go model. The next release, coming in October 2020, will add new capabilities such as Screen Sharing input, Webcam input, Clip Playlist and many more. Visit https://pro.sony/rewrite

Live Streaming with Video-Call Management

MULTICAM SYSTEMS has unveiled AIRBRIDGE+, a hardware-based video controller, call-in manager, CG and streaming engine – all in one package. Suitable for radio, podcast, small Web TV or event productions, AIRBRIDGE+ includes video call queue management and is compatible with standard A/V interfaces (AoIP in/out, NDI/SDI outputs).

It is one of the few professional video calling solutions that lets users control the video settings (scale, position, picture) of remote guests; automate mix-minus audio for all participants; adjust audio delay for each guest’s connection; and send the video programme to guests. With AIRBRIDGE+, four guests can be live simultaneously and up to 12 guests can be on the waiting list. Operators simply send a link to their guests, and with that link they can join the show from anywhere, using their smartphone, tablet or computer – no app needed.

AIRBRIDGE+ also provides access to full production and live streaming capabilities, whether you want to record the live stream and edit the recording later, or take snippets for use on social media. The system’s PTZ camera integration allows you to create compelling video footage that entices viewers, while increasing productivity. With AIRBRIDGE+ you can efficiently create interactive and engaging content and even repurpose it later to drive people to your website, increasing traffic and revenue. Visit www.multicam-systems.com
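The mix-minus handling mentioned above is easy to picture: each remote guest receives the full programme mix minus their own contribution, so they never hear themselves echoed back. The short Python sketch below is a generic illustration of that principle only – it is not Multicam’s implementation, and all names are assumptions.

import numpy as np

def mix_minus(sources: dict) -> dict:
    # Return one return feed per guest: the sum of every source
    # except that guest's own audio (a classic N-1 / mix-minus feed).
    full_mix = sum(sources.values())
    return {name: full_mix - audio for name, audio in sources.items()}

# Example: three participants with one second of 48 kHz audio each.
rate = 48000
guests = {name: np.random.uniform(-0.1, 0.1, rate).astype(np.float32)
          for name in ("host", "guest_1", "guest_2")}
returns = mix_minus(guests)
# The host's return feed contains no host audio:
assert np.allclose(returns["host"], guests["guest_1"] + guests["guest_2"])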



Linear Keyer for Both IP And SDI

CRYSTAL VISION HAS RELEASED a linear keyer that works with IP (SMPTE ST 2022 and ST 2110 video), with SDI, or with both IP and SDI at the same time. The M-KEY can key one externally-generated graphic over a video stream or a matte, and is a traditional linear keyer designed for those who use a character generator or graphics PC to provide their key and fill sources. The M-KEY is ideal for a wide range of channel branding and graphic overlay applications, while the mixed IP/SDI I/O means it can provide a cost-effective solution for those using SDI caption generators to key into an IP feed, with the SDI to IP conversion included in the cost of the keyer. The M-KEY is a software app that runs on the MARBLE-V1 media processor hardware, a card housed in the Vision frame which features a powerful CPU/GPU processor and both SDI and 10GbE IP network interface connections.

The M-KEY offers the user a choice of keying modes – with the key generated using either External Key mode or Self Key mode, and with both additive and multiplicative keying available to suit different types of graphics. The keyed graphic can be faded in and out, either manually or as a timed transition, while crops can be used to prevent keying in particular areas of the picture by forcing areas of background or foreground. The Min Clip and Max Clip controls can be used to change the key gain, with adjustable thresholds to force or remove areas of key – ideal for compensating for a key signal which does not have enough amplitude to force full keying, or for creating a semi-transparent effect for the graphic.

The M-KEY’s support for multiple signal formats eases any SDI to IP upgrade, while also making it suitable for mixed SDI and IP installations as well as fully IP or fully SDI environments. The M-KEY’s gateway functionality can be used to integrate SDI into an IP environment or IP into an SDI environment. Its IP to IP translation functionality can be used for network address translation, protocol conversion between any of the input and output formats, unicast to multicast address conversion and the creation of media firewalls. The IP flows can be separated and protected across up to four bi-directional 10GbE SFP+ network interfaces. It includes output traffic shaping and is tolerant of any input packet distribution. The M-KEY also includes a number of timing features to help with system integration. These include a framestore synchroniser timed to an external Black and Burst or tri-level sync analogue reference or PTP, with user-configurable options for timing source priority and redundancy. Additional video delay – adjustable in one-frame steps – is available on both the input and output. The ten frames of flow input video delay allow delay compensation between the background source and two foreground sources – useful as the fill and key coming from a graphics machine may arrive several frames later than the background programme video. The ten frames of flow output video delay allow compensation for any large system delays. Other features include a quad split which allows the output video, output key, keyed foreground and keyed background to be viewed simultaneously, with zoom available for fine-detail checking during setup. Comprehensive SDI, IP and PTP monitoring information is available and can be used to generate SNMP traps, while 13 built-in test patterns can help with fault finding. The flexible choice of control options includes the VisionPanel touchscreen control panel, VisionWeb Control web browser software and complimentary SNMP – while 16 time-saving presets can be assigned and recalled. Visit http://www.crystalvision.tv
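The linear keying arithmetic described above can be summarised in a few lines. The Python sketch below shows a multiplicative key plus the effect of min/max clip thresholds on key gain; it is a generic illustration of the maths, not Crystal Vision’s firmware, and all names are assumptions.

import numpy as np

def apply_clips(key, min_clip=0.0, max_clip=1.0):
    # Re-scale the key so values at or below min_clip become fully
    # transparent and values at or above max_clip become fully opaque -
    # useful when a key lacks the amplitude to force full keying.
    key = (key - min_clip) / max(max_clip - min_clip, 1e-6)
    return np.clip(key, 0.0, 1.0)

def linear_key(background, fill, key, min_clip=0.0, max_clip=1.0, fade=1.0):
    # Multiplicative linear key: background is suppressed by the key,
    # fill is added on top. 'fade' scales the whole graphic in or out.
    k = apply_clips(key, min_clip, max_clip) * fade
    k = k[..., np.newaxis]                 # broadcast over RGB channels
    return background * (1.0 - k) + fill * k

# 1080p test images: mid-grey background, white fill, soft-edged key.
bg   = np.full((1080, 1920, 3), 0.4, dtype=np.float32)
fill = np.ones((1080, 1920, 3), dtype=np.float32)
key  = np.tile(np.linspace(0.0, 0.8, 1920, dtype=np.float32), (1080, 1))
out = linear_key(bg, fill, key, max_clip=0.8, fade=0.5)
print(out.shape, out.min(), out.max())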


MEDIA IN THE CLOUD, STORAGE & MAM

VSN and the MAMifications of Workflow

THERE SHOULD BE SOME CAUSE for celebration in 2020, and VSN is celebrating 30 years in business. Launched as an automation company, it has developed a range of systems and solutions for various markets, with a core competency around media asset management and automation.

With points of presence throughout the region, Nick Morgan, VSN’s Asia-Pacific Manager, says the company is well positioned to solve some of 2020’s industry challenges. “Obviously, the big trend at the moment is around remote workflow,” he says. “This has been going on for quite some time. In fact, VSN has been focused on the remote part of the workflow for actually such a long time. And we developed our current media asset management platform around I think 2011. We’ve been building that ever since. But, the primary drivers of that was access anywhere, content anywhere. That was one of the key philosophies in the design of the system. We built it all on HTML 5. It’s all completely web-based. There are no applications to install anywhere.” And VSNExplorer is the MAM platform? “Absolutely. And we have several components or modules to VSNExplorer. We have the core MAM itself. We also have a very powerful PAM module integrated within the same system, using the same database. We can actually track contents that are based on an NLE sequence. We have that tight integration with NLE systems. We know and are aware of every single clip and every frame boundary within a sequence. It allows us to save and store those sequences as you would in other typical PAM systems. And also allows us to do things like storage management based on what’s on the sequence, or what you want to protect in somebody else’s NLE editing sequence.



“Recently, we introduced a production planning and stage tracking module. Now you can pre-determine who gets involved from a functional user perspective - the various competencies you have like NLE editing, video editing, audio editing, graphics generation, or any other production type - and you can establish who does what and when. We actually track the production through the various stages. When people have actually finished their particular job they can say, well that’s done, automatically flag the next person in the production chain to pick up that project, and then do their particular part on it; or they can run simultaneously. But, we set these up as automated workflows with checkpoints that flag various users, and even manages where a production is in a particular stage.


“The third module we have is our BPM. Most people would know that as a workflow orchestration system. We have a very powerful visual workflow orchestration system that allows users to basically connect the dots – like boxes, if you’ve ever built a Visio diagram or something like that. Each box contains an event or a piece of script that determines a particular action or process that happens either within the MAM itself, or through a third-party system. You can actually drag and drop and connect all of these, and you can have parallel processes running simultaneously and all merging to a final end point.

“There’s no real limitation on how you architect a workflow in our BPM solution. And we give this great toolset which allows users to actually create and design their own workflows. This system not only works with our own MAM, or even a manual task – so it can break out to allow a review and approval stage, or some human intervention task – but we also allow it to connect with third-party systems like transcoding farms or what have you. That can actually perform a task that the BPM has set outside of the MAM and maybe bring it back.

“The last module of the MAM is our Business Intelligence module, which is designed ultimately to gather up data on activity and operations that happen within the MAM. An example would be the number of users that have picked up a particular asset and created a production from it. The number of times a particular program might have been transcoded into different formats. The number of users that have actually been involved in creating a production. The time it took them to create that production. Whatever the metric happens to be, we can gather up all of those metrics and we can then generate logical reports in a visual fashion – your typical bar charts, pie charts, this sort of thing – so you can actually see how things break down. We can break the data down and present it to you in a visual way.

“The other side of that is we go another step further: we can actually integrate with third-party systems as well and gather up their information, if they allow us an API to gather up their data. And the common trend we’re seeing around BI is customers asking us, can you get OTT data from various VOD platforms, or OTT platforms, or even social platforms? The various kinds of user metrics that we’re seeing in the digital world are now coming back into a MAM environment. The MAM not only becomes a central production hub for the production of content, it’s now being used to collate data from those different end publishing points back into that world, to allow management teams and decision makers to review that data and make logical business decisions based on the data they’re seeing.”

So, what are your observations on the evolution of remote workflows during this pandemic?

“I guess the thing is that – and I don’t think it’s just VSN – quite a few vendors have had the capability of offering remote workflows as part of the core functionality of their solution; however, the pandemic has triggered a very serious look at that. The media industry is starting to recognise that remote workflows are possible. They don’t impact productivity. They allow people potentially to be a lot more creative in their own environment. They’re not restricted by an environmental space. And they’re seeing some pretty impressive results.

“From a technology standpoint that will only increase. I mean, if you’re not fully into remote workflows now, you’re certainly being rushed into facilitating it. Because I think it’s definitely going to be a huge part of our future.” Visit https://www.vsn-tv.com/en/
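Workflow orchestration of the kind described above – boxes of actions connected into parallel branches that merge at an end point – can be modelled very simply. The Python sketch below is a generic illustration of the idea; it is not VSN’s BPM engine or its workflow format, and the step names are invented for the example.

from concurrent.futures import ThreadPoolExecutor

# Each "box" is a named step; edges list the steps that must finish first.
steps = {
    "ingest":    [],
    "transcode": ["ingest"],                 # runs in parallel with QC
    "qc_review": ["ingest"],                 # could be a manual/human task
    "package":   ["transcode", "qc_review"], # branches merge here
    "publish":   ["package"],
}

def run(step):
    print(f"running {step}")
    return step

def execute(graph):
    done = set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(graph):
            ready = [s for s, deps in graph.items()
                     if s not in done and all(d in done for d in deps)]
            # All ready steps run in parallel, then we merge and continue.
            for finished in pool.map(run, ready):
                done.add(finished)

execute(steps)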



AFT Unveils AJA Pak Media Reader

ATECH FLASH TECHNOLOGY (AFT), a manufacturer of industrial and professional media card readers and storage solutions, has announced the Blackjet VX-1P AJA Pak Media Reader, a new media reader compatible with AJA’s durable, high capacity Pak Media. Encased in a protective metal housing, AJA Pak Media Solid State Drives (SSDs) offer video professionals flexible storage options with capacities ranging from 256GB to 2TB and are built to withstand repeated use in the field. Blackjet VX-1P utilizes the bandwidth of USB 3.1 Gen 2 (10Gbps), allowing users to ingest, edit and archive creative content from AJA Pak Media at maximum speeds. The new VX-1P utilizes the USB-C interface, making it compatible with existing Mac and Windows USB 3.1 and Thunderbolt 3 computers.

Key VX-1P features include:
• AJA Video Systems certification
• USB 3.1 Gen 2 for transfer speeds up to 10Gbps
• Reversible USB Type-C connector for quick and easy connection
• Ingest speeds of up to 525 MB/s
• Rugged metal enclosure for durability and longevity
• Vent holes to increase heat dissipation
• Thunderbolt 3 compatibility

Visit www.blackjetusa.com/product-vx1p and www.aja.com/products/pak

Cantemo Portal 4.3 Delivers More Control

CANTEMO HAS ANNOUNCED an update to its media asset management system, Cantemo Portal. Portal 4.3 launches new features aimed at giving users more control over their media while saving time and resources. The audio player within Portal has been redesigned to deliver more detail and added functionality, including better playback control. Using the JKL keys, users can choose various playback speeds. The audio signature is now included as part of the display, making it easy to see what is happening with the audio at a glance.

Portal 4.3 also delivers an update to the Portal Final Cut Pro Workflow Extension. Users are already able to continue working in Final Cut Pro while Portal keeps the content organised in the background. With this update, editors have more ways to find the assets they need, with new Portal collections and saved searches within the workflow extension. Cantemo has also introduced updates to its collections, making it much simpler and quicker to create and search by collections. Users can create a collection with metadata, with every asset within that collection inheriting that metadata.

Parham Azimi, CEO, Cantemo, commented: “We are continually updating Cantemo Portal to ensure that our users can be as efficient as possible. These updates will help them further reduce manual effort and save valuable time.”

The latest version of Portal also features a new tasks page to give an easy overview of open tasks, as well as a Database Monitor page. Admins can monitor and purge databases, see database changes over time, and time and configure purge intervals and purge directories. Third-party integrations can add more efficiency to the Portal workflow: the REST API reference page now offers more ways to set up creative integrations, and this collection of APIs makes it easy to find what you need when preparing your own solutions. Visit https://www.cantemo.com

Hitachi Vantara Reaches New Speeds for Unstructured Data

HITACHI VANTARA, the digital infrastructure and solutions subsidiary of Hitachi, Ltd. (TSE: 6501), has announced a new distributed file system and management solution that will help customers gain faster access to, and insights from, unstructured data such as emails, documents, health records, audio, video and images. The new solution will be delivered through a partnership with WekaIO (Weka), the developer of scalable file storage for data-intensive applications.

The new OEM relationship with Weka will enhance Hitachi Vantara’s portfolio with a high-performance, NVMe-native, parallel file system that the company will deliver tightly coupled to an HCP datastore. This performant network-attached storage (NAS) solution will be well suited for use with artificial intelligence, machine learning and analytics applications across a broad array of industries.

Hitachi Vantara has also announced an expansion of the Hitachi Content Platform (HCP), its cloud object storage software solution for connecting data producers, users, applications and devices. The expansion of HCP better supports next-generation unstructured workloads with performance-optimised all-flash HCP nodes. These new capabilities will deliver almost 3.4 times more throughput over Amazon’s Simple Storage Service (S3) protocol, resulting in lowered costs of up to 34%. Updated storage nodes also deliver three times the read and write performance, while simultaneously enabling three times more capacity in the same rack space as the previous generation. This is especially significant as traditional NAS, primary workloads and cloud-native workloads are transitioning to object storage to meet high-performance requirements.

The new HCP expansion helps customers translate data into business insights faster, increase the revenue generated from unstructured data, and improve application performance to drive a better digital experience for end users. Visit www.hitachivantara.com

Silver Trak Expands MediaTrak Workflow Solution

IN ADDITION TO ACQUIRING new businesses in Australia and the Asia-Pacific region and launching new offices in Malaysia and New Zealand, Silver Trak Digital has also been busy expanding its MediaTrak workflow management solution. MediaTrak is a cloud-based web interface that updates in real time: as a media technician or supplier updates a status against a service, the client can see its progress. It collects a range of information for Silver Trak and turns it into a day-to-day manager for staff and equipment resources, a client portal for complete visibility, as well as a database of IP for the services that Silver Trak has to offer. Users can now place an order for any number of services, track the order’s progress and watch it be delivered to a variety of broadcast and distribution outputs, all via this effective and efficient tool.

MediaTrak works in parallel with Silver Trak’s Media Room, an advanced asset management solution available as a SaaS platform, to ensure every piece of content – including film, TV and music masters, trailers, graphics, photos, metadata, captioning, subtitles and audio streams – is digitally archived in one place, keeping tracking and access simple and secure.

According to Silver Trak Digital Cinema Manager Kylie Longworth, “MediaTrak gives users immediate access to a full-service production facility – from complex UHD, Blu-ray and DVD design and authoring to QC, transfers, conversions, edits and broadcast delivery, all under one roof. With all of our platforms we ensure our clients get complete transparency, what they want when they want it, and great service from an expert team. MediaTrak has an easy-to-use web interface that makes ordering and delivering content a breeze, saving time and eliminating margin for error. Then, by storing content in Media Room, users have completely future-proofed the content solution for their library.” Visit www.silvertrak.com.au



Arvato Systems Re-launches Vidispine

ARVATO SYSTEMS HAS ANNOUNCED that it will combine the synergistic product portfolios of its media & entertainment-related Arvato Systems and Vidispine businesses, presenting them under a relaunched Vidispine brand. The move will bring the strengths of the two brands together in a more cohesive way and give customers a clearer understanding of the product offerings.

Arvato Systems, the IT specialist within the Bertelsmann Group, acquired media supply chain experts Vidispine in 2017 in order to strengthen the existing link between their complementary solutions and expand into new markets with a combined offering, working together to address the explosion in video content and the growth of cloud-based solutions for media. Behind the newly combined Vidispine brand is a complete ‘content ecosystem’ of the company’s own product portfolio and professional services, together with a strong community of partner vendors, consultants, service providers and developers creating solutions for organisations.

The restructured portfolio includes solutions for the broadcast and media & entertainment industry, including enterprise Media Asset Management, Content Planning & Rights Management and Ad Tech, as well as the cross-industry content platform, VidiNet.

Visit www.vidispine.com/vidispine-arvato-systems-unite-strengths

Samsung 8TB Consumer SSD

SAMSUNG ELECTRONICS CO., LTD. has introduced its second-generation quad-level cell (QLC) flash drive, the 870 QVO SATA SSD, which the company says sets a new standard for high-capacity consumer storage. Featuring up to eight terabytes (TB), the new SSD delivers an uncompromising mix of speed, storage capacity and reliability for mainstream and professional PC users.

In the past, consumers have had to choose between SSDs – which provide superior performance – and HDDs, which traditionally offer greater capacity. Samsung’s 870 QVO SSD, however, is able to reliably offer the best of both worlds, making it an optimal choice for mainstream PC users who prioritise performance and value, as well as for professional users who require high levels of capacity. The 870 QVO offers sequential read and write speeds of up to 560 MB/s and 530 MB/s respectively, with the drive’s Intelligent TurboWrite technology allowing it to maintain peak performance levels using a large variable SLC buffer. The 870 QVO also delivers a 13% improvement in random read speed compared to the 860 QVO, making it ideally suited for everyday computing needs such as multitasking, gaming and web browsing. The renewed Data Migration and Magician 6 software tools provide a host of improved and added features, enabling users to upgrade, manage and optimise their SSDs with greater ease.

In addition to the industry-leading capacity and performance, the 870 QVO provides an exceptional endurance rating of up to 2,880 terabytes written (TBW), or a three-year limited warranty. The 870 QVO comes in 1TB, 2TB, 4TB and 8TB models.

Visit www.samsung.com/ssd

NAGRA NexGuard Forensic Watermarking Leverages AWS to Secure Content

NAGRA, THE PROVIDER OF content protection and multiscreen television solutions, has announced API integration of NexGuard forensic watermarking technologies into AWS Elemental MediaConvert, a file-based video transcoding service with broadcast-grade features from Amazon Web Services (AWS). The API integration provides content owners, and any third party working on premium content, with an added layer of security and traceability for their valuable pre-release and early-release content workflows, and simplifies the watermarking process for file transcoding and OTT content preparation.

NexGuard forensic watermarking for pre-release content, including the recently announced NexGuard ClipMark for short-form content, as well as NexGuard Streaming for on-demand OTT content, is available in AWS Elemental MediaConvert, enabling full automation of the watermarking process from the cloud. No extra steps are required to apply forensic watermarking in the transcoding of pre-release content, such as full features, TV series or short clips, or in the preparation of content in the cloud, such as e-screeners or direct-to-consumer OTT streaming. The API integration of NexGuard Streaming forensic watermarking with AWS Elemental MediaConvert enables OTT, direct-to-consumer, and video-on-demand platforms to apply a unique and imperceptible watermark per streaming session, leveraging caching in the content delivery network (CDN) and efficient stream control at the CDN edge. The use cases include tackling theft of premium content such as the early release of movies in some territories, original video-on-demand content, and business-to-business online screening.

AWS Elemental MediaConvert allows content owners, post houses and video platforms to easily create video-on-demand (VOD) content for broadcast and multiscreen delivery at scale, combining advanced video and audio capabilities with a simple web services interface. Visit https://dtv.nagra.com
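As a rough illustration of what “no extra steps” looks like in practice, the Python (boto3) sketch below submits a MediaConvert job with the partner watermarking pre-processor enabled. The role ARN, endpoint, bucket paths, licence string and preset are placeholders, and the NexGuard-specific setting names shown are assumptions to be checked against the current AWS Elemental MediaConvert documentation and your NexGuard licence; a real job would also need codec and container settings, omitted here for brevity.

import boto3

# The account-specific endpoint is normally discovered with describe_endpoints().
mc = boto3.client(
    "mediaconvert",
    region_name="us-east-1",
    endpoint_url="https://abcd1234.mediaconvert.us-east-1.amazonaws.com",
)

job = mc.create_job(
    Role="arn:aws:iam::123456789012:role/MediaConvertRole",   # placeholder
    Settings={
        "Inputs": [{"FileInput": "s3://my-bucket/prerelease/episode01.mxf"}],
        "OutputGroups": [{
            "OutputGroupSettings": {
                "Type": "FILE_GROUP_SETTINGS",
                "FileGroupSettings": {"Destination": "s3://my-bucket/screener/"},
            },
            "Outputs": [{
                # Codec/container settings omitted; a valid job requires them.
                "VideoDescription": {
                    "VideoPreprocessors": {
                        "PartnerWatermarking": {
                            "NexguardFileMarkerSettings": {
                                "License": "<nexguard-licence-string>",  # placeholder
                                "Preset": "<nexguard-preset>",           # placeholder
                                "Payload": 42,        # unique per recipient/session
                                "Strength": "DEFAULT",
                            }
                        }
                    }
                }
            }],
        }],
    },
)
print(job["Job"]["Id"])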

iconik Adds Transcription and Annotation


ICONIK HAS ADDED TRANSCRIPTION and draw-on annotation features to the latest version of its cloud-based media management hub. iconik allows organisations to securely manage, edit and share their media remotely, using a single interface for both local and cloud-based assets.


The transcription feature uses AI to convert audio dialogue to text and attributes it to a speaker. The transcriptions are displayed as closed captions in the iconik player, but are also visible in formats which separate the text by speaker or display individual lines of text against a timecode. With the transcription function, users can search through all the spoken dialogue stored in the asset’s metadata. This enables them to easily find the best clips of an athlete, actor or location, just from audio mentions.

The iconik review and approve function, which enables teams to give feedback on projects remotely, has been enhanced. The draw-on annotation feature includes free-line and shape-based drawing tools, a colour chooser and a select-and-move tool. This enables reviewers to visually manipulate and draw directly onto single frames of video and images. These annotations help the reviewer to quickly clarify any amendments they are requesting, and can be turned on and off when the user previews the content. This helps teams to communicate changes visually, in addition to time-based comments.

The iconik team has also implemented some additional enhancements. Folder uploads allow many files to be uploaded to iconik at once. Azure Blob Storage support has been added for uploads, downloads and the iconik Storage Gateway (ISG). The cloud scanner in iconik can now support partial scans, offering more control over the management of cloud resources. Collections can be transferred while maintaining their directory structure, ensuring consistent asset organisation. Finally, iconik links have been shortened to streamline the sharing process. Visit https://iconik.io
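To picture how time-coded transcript segments stored as metadata can be searched to jump straight to a clip, consider the minimal Python sketch below. It is a generic illustration of the idea only – it does not use iconik’s API, and all names and data are invented for the example.

from dataclasses import dataclass

@dataclass
class TranscriptSegment:
    speaker: str
    start: float      # seconds
    end: float
    text: str

def find_mentions(segments, term, pad=2.0):
    # Return rough clip in/out points wherever 'term' is spoken,
    # padded so the edit breathes a little either side of the mention.
    term = term.lower()
    return [(max(seg.start - pad, 0.0), seg.end + pad, seg.speaker, seg.text)
            for seg in segments if term in seg.text.lower()]

transcript = [
    TranscriptSegment("commentator", 12.4, 15.0, "A superb save by the keeper"),
    TranscriptSegment("analyst",     61.2, 66.8, "The keeper read that penalty early"),
    TranscriptSegment("commentator", 90.1, 93.5, "Full time here at the stadium"),
]
for tc_in, tc_out, speaker, text in find_mentions(transcript, "keeper"):
    print(f"{tc_in:7.1f}-{tc_out:7.1f}  {speaker}: {text}")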



EditShare Straps on MoovIT’s Helmut to Supercharge Adobe Workflows

EDITSHARE AND MOOVIT GMBH have formed a global partnership to transform Adobe enterprise workflows. MoovIT’s Helmut solutions are used by leading broadcasters, newsrooms, sports leagues, corporations and new media organisations for their robust project management and administration, enhancing creative workflows for large distributed Adobe workgroups. The two companies have worked to integrate EditShare’s EFS shared storage solution and FLOW media management with the Helmut suite of products to form a comprehensive project management and remote editing workflow solution for enterprise Adobe workgroups. Scalable to the enterprise level, the EFS, FLOW and Helmut integrated workflow offers a media technology foundation that combines storage, media management, workflow automation, and workgroup management for a proven all-in-one solution. “To succeed in the evolving world of video production, our customers are looking to EditShare to provide open solutions that allow them to rewire their workflows and adapt,” states Tom Rosenstein, Vice President of Business Development, EditShare. “EditShare’s expanded partnership with MoovIT allows us to bring to our Adobe customers a powerful, adaptable set of tools to address those challenges head-on. The Helmut platform coupled with EFS and FLOW, embodies the workflow structure users require. Powerful, yet simple, collaborative Adobe Premiere Pro project management alongside adaptive media production management. It’s the complete package professional editors have been waiting for.” Visit https://editshare.com and https://www.moovit.de

IMT Streamlines Cloud Data Transfers

INTEGRATED MEDIA TECHNOLOGIES, INC. (IMT) has announced the general availability of SoDA, an enterprise software application that streamlines the process of intelligent data transfers to and from the public cloud. Designed for media and entertainment workflows and other unstructured data environments, SoDA provides predictive, actionable cost and data transfer metrics for optimising on-premise and cloud storage. “SoDA predicts the cost and speed of data movement between on-prem and cloud solutions, giving customers the tools they need to control their spend and understand the time it takes for data movement,” said Greg Holick, vice president of product management for software at IMT. “SoDA data management software gives users unprecedented insight into their total cloud storage spend.” Technology partners contributing to the platform include Dalet, reachENGINE, IPV and CatDV. SoDA delivers simplicity, insight and control, enabling users to manage the operating costs of their storage and cloud services more efficiently.

Key SoDA benefits include:
• Simplicity. Install and configure in minutes
• Unlimited Data Movement. On-prem, hybrid, or cloud movement
• Fixed Monthly Rate. SaaS model

The launch of SoDA comes as IMT also announces the expansion of its advanced software development division to bring to market innovative software tools and applications that automate and simplify business operations, workflows and hybrid cloud data management. Visit https://cloudsoda.io
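The kind of prediction described above – how long a transfer will take and what it will cost – ultimately comes down to arithmetic over link speed and provider pricing. The Python sketch below illustrates the idea only; it is not IMT’s model, and the egress rate and efficiency factor are placeholder assumptions.

def estimate_transfer(size_gb, link_mbps,
                      egress_usd_per_gb=0.09,   # placeholder rate
                      efficiency=0.8):          # protocol overhead / contention
    # Rough time and cost estimate for moving data to or from the cloud.
    effective_mbps = link_mbps * efficiency
    hours = (size_gb * 8 * 1000) / effective_mbps / 3600   # GB -> megabits
    return {"hours": round(hours, 2),
            "egress_cost_usd": round(size_gb * egress_usd_per_gb, 2)}

# Pulling a 2 TB camera-card archive back from cloud storage over a 1 Gbps link.
print(estimate_transfer(size_gb=2000, link_mbps=1000))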

GB Labs Strengthens Security for CORE.4 and CORE.4 Lite

THE DEVELOPER OF INTELLIGENT storage solutions, GB Labs, has announced that its proprietary CORE.4 and CORE.4 Lite operating systems have been upgraded with additional security benefits designed to augment existing performance, analytic, and real-time data monitoring and display features.

GB Labs’ Chief Product Officer, Howard Twine, said, “We have been anything but idle in recent months and have used our time to further improve our already market-leading storage system OS with a range of security enhancements. Some of what we have done will mean little to the casual observer, but to everyday users, we have again practically reinvented what is possible to do with a well-designed storage system.”

He continued, “Security has always been crucial for protecting valuable assets, and with increasing numbers of people moving files around from a variety of remote locations – including home – and between various platforms, we knew it was important to revisit our operating systems to exploit or improve every aspect of their protective environment.”

The CORE.4 UI has been improved with a range of auditing features, both historical and real-time. Data columns can be sorted to indicate how users have interacted with files, as well as the type of protocol; login confirmation; shutdown and restart; file transfer status and location details; in addition to report creation and delivery tools. The real-time data is displayed in a timeline along with a pie chart that includes protocol data, all of which can be viewed for detailed analysis of precisely what’s taking place with the data at any time.

“It is, of course, common to monitor a system and resolve security problems after the fact, using accumulated data, but what isn’t common is our ability to automatically monitor a system’s data – live – and flag up anything extraordinary. This is highly valuable in showing how people interact with a system and in identifying where any bottlenecks may be, where unnecessary downtime is being wasted, or where further efficiencies may be found. All of these tools are useful in helping an organisation achieve its aspirations.” Twine added, “The in-built System Security Analyser in CORE.4 and CORE.4 Lite is a tool that actively looks for potential security issues and, if detected, makes appropriate recommendations.”

The upgraded CORE.4 OS with advanced security is currently being rolled out to existing GB Labs storage system users and will be available in all new systems from August 2020. Visit www.gblabs.com

EcoDigital Launches ECOPro Managed Services for Advanced Media Support

ECODIGITAL HAS ANNOUNCED the release of ECOPro Managed Services, a new offering that delivers digital ecosystem management, storage infrastructure optimisation, and workflow improvement for users of the company’s DIVA Software Suite for digital asset management. Through ECOPro Managed Services, a team of highly trained, qualified, and experienced EcoDigital engineers conducts proactive monitoring, reporting, automation, and technology maintenance so that users of DIVA (originally developed by Front Porch Digital) can focus more resources on their core business.

“Knowing how to build workflows efficiently, properly maintain infrastructure, and refine digital asset management strategy can be a challenge,” said Geoff Tognetti, Chief Technology Officer at EcoDigital. “ECOPro Managed Services are designed to protect users’ digital assets, mitigate risks, reduce operational costs, and increase efficiencies, ultimately simplifying and streamlining digital content management so that users can spend more time on the activities that build their business.”

Allowing users to strengthen their technology strategy, increase their operational efficiency, and extend their team’s capabilities, ECOPro Managed Services enable users to get the most out of their DIVA Software Suite. The service also provides cost-effective access to engineers and other subject matter experts who not only assist in optimising storage and workflow, but also help the business grow and evolve its DIVA system. The combination of 24/7 monitoring with proactive ticket generation and patch installation minimises the risk of operational downtime, giving users confidence that the right people always have access to the right assets at the right time.

“As they expand their operations, most businesses can’t afford to get bogged down by the details of adding, managing, and training internal IT infrastructure,” added David Gonce, Chief Revenue Officer at EcoDigital. “ECOPro Managed Services allow these businesses to shift from a reactive model to a proactive IT approach that better supports efficient content creation workflows and frees up resources to grow and take advantage of new business opportunities.”

Visit www.goecodigital.com


POST PRODUCTION www.content-technology.com/postproduction

Tokyo’s NiTRo Streamlines 8K Editing Workflows with AJA

TOKYO-BASED EXPERT in motion picture and multimedia broadcasting production, NTV Technical Resources Inc (NiTRo), is a major technical provider to the Japanese broadcasting industry with a range of media services, including production engineering, post-production and broadcast operations and management. Following the launch of 8K broadcast channels in Japan in December 2018, NiTRo broke ground on a new state-of-the-art facility for 8K/UltraHD2 post-production in response to a surge in client requests for editing and delivery of 8K content. During development of the studio, NiTRo examined robust and flexible solutions that could manage both 8K/UltraHD2 and 4K/UltraHD post-production without requiring separate tools for each workflow. In pursuit of the most streamlined workflow possible, NiTRo selected the AJA KUMO 3232-12G for its versatile signal control capabilities to route 8K or 4K SDI signals throughout the new video editing facility, with the added benefit of 12G-SDI connectivity for simplified cabling and transport of high-bandwidth content.

In the new NiTRo SHIBUYA post-production facility, the high-density KUMO 3232-12G is capable of routing and distributing 12G-SDI inputs to multiple destinations in the studio, with 32x 12G-SDI inputs and 32x 12G-SDI outputs. Simple configuration of ganged dual- or quad-link grouping enables users to group multiple inputs and outputs to route 8K or 4K signals as needed. KUMO 3232-12G is housed in the studio’s machine room, and three companion KUMO CP control panels connected to the network and located in editing suites allow colourists and editors to easily control the input and output signals remotely. Additionally, KUMO 3232-12G salvo support further simplifies workflows by enabling post-production staff to configure and save up to eight salvos in the device to automate routing of sources to specific destinations.

During post-production, all 8K sources are fed into the KUMO 3232-12G, including 12G-SDI quad-link outputs from NLE and mastering applications and 12G-SDI quad-link I/O from an 8K Panasonic AJ-ZS0580 recorder and player. All colour grading and editing are viewed on HDR monitors, including Sony 4K and Sharp 8K displays. Once grading is complete, content is transcoded and exported to P2 cards with Colorfront Transkoder and played back on the AJ-ZS0580 for final review, prior to delivery to the client. In addition to KUMO, NiTRo uses a range of additional AJA solutions throughout its production facilities, including the KONA 5 PCIe card for 8K I/O, and 4K2HD and 3G-AM Mini-Converters for reliable AES embedding/disembedding of digital audio signals. Visit http://www.nitro.co.jp and www.aja.com
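Quad-link grouping and salvos are easiest to picture as a small routing model: four 12G-SDI ports are ganged into one logical 8K source or destination, and a salvo is a saved set of routes recalled in one action. The Python sketch below is a conceptual model of that workflow only; it is not AJA’s KUMO control protocol or API, and all port assignments are invented for the example.

class Router:
    def __init__(self, size=32):
        self.routes = {d: None for d in range(1, size + 1)}  # destination -> source
        self.salvos = {}

    def gang(self, first_port):
        # Four consecutive 12G-SDI ports carry one 8K quad-link signal.
        return [first_port + i for i in range(4)]

    def route_8k(self, src_first, dst_first):
        for s, d in zip(self.gang(src_first), self.gang(dst_first)):
            self.routes[d] = s

    def save_salvo(self, name):
        self.salvos[name] = dict(self.routes)

    def fire_salvo(self, name):
        self.routes.update(self.salvos[name])

r = Router()
r.route_8k(src_first=1, dst_first=5)    # 8K recorder (ports 1-4) to grading suite (5-8)
r.save_salvo("grade_review")
r.route_8k(src_first=9, dst_first=5)    # NLE output (9-12) to the same suite
r.fire_salvo("grade_review")            # one action restores the saved routing
print({d: s for d, s in r.routes.items() if s})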

SIGGRAPH Asia 2020 Goes Virtual

WITH ONGOING WORLDWIDE travel restrictions due to the COVID-19 pandemic making it impossible to host an in-person SIGGRAPH Asia 2020 in Daegu, South Korea, in November, organisers of the event have announced the move to a virtual conference to be held online from December 4-13, 2020.

“This has not been an easy decision, but we came to a common consensus that it was necessary in this current climate. The safety and well-being of all our participants remains our top priority. We appreciate your understanding and patience as we adjust our plans and refocus our efforts to put together the very first SIGGRAPH Asia virtual event for the community,” shared Jinny HyeJin Choo, SIGGRAPH Asia 2020 Conference Chair.

According to the organisers, the theme for this year’s SIGGRAPH Asia, ‘Driving Diversity’, will take on new meaning as the event gives its diverse group of worldwide technical and artistic contributors the opportunity to connect with and inspire new communities. Choo and her team of program chairs are committed to delivering a strong SIGGRAPH Asia 2020 that celebrates this year’s innovation, advances and achievements in computer graphics, interactive techniques and beyond, and the organisers are optimistic that the virtual format will allow the global community to come together and participate in new and innovative ways, driving forward the forefront of the field. Visit https://sa2020.siggraph.org/en/

Mediacorp, Infinite Studios Launch 3D Animation that’s a ‘Lil Wild’


SINGAPORE’S MEDIACORP has launched Lil Wild, an original 3D animation kids series featuring an unlikely crew of “tweenage” animals that can be found in the Singapore Zoo. Targeting viewers aged four and up, this comedy-adventure series is Singapore’s first locally-produced animal-based animation, and looks to offer lively edutainment for young audiences through the exciting stories of the animal characters and the showcase of interesting wildlife fun facts peppered throughout the show.


Conceptualised and created by Mediacorp in collaboration with integrated media entertainment and creative services company Infinite Studios, the main protagonists of Lil Wild are five animal characters with distinct personalities who are best friends living together in a zoo. In this 16-part series, they will go on big adventures and, in the process, learn to overcome the various challenges life throws their way. Sapna Angural, Head, English Audience, Mediacorp, said: “As Singapore’s largest content creator, we are very excited to present our latest edutainment animation series Lil Wild as part of Mediacorp’s range of quality, engaging content for our audiences. Inspired by animals found in our Singapore Zoo and specially developed for our younger fans, the show will showcase our lovable animal protagonists going on wild adventures while dealing with real-life issues faced by tweenagers in Singapore today.” Visit www.mediacorp.sg and www.infinitestudios.com.sg



Intel oneAPI Rendering Toolkit

THE RECENT VIRTUAL SIGGRAPH 2020 saw Intel announce the latest additions to the Intel oneAPI Rendering Toolkit. Part of Intel’s oneAPI family of products, the toolkit brings premier high-performance, high-fidelity capabilities to the graphics and rendering industry. It is designed to accelerate workloads with large data sets and high complexity that require built-in artificial intelligence (AI), through a set of open-source rendering and ray-tracing libraries used to create high-performance, high-fidelity visual experiences. The new additions are Intel OSPRay Studio and Intel OSPRay for Hydra, available later in 2020, and visualisation capabilities for the Intel DevCloud for oneAPI, with sign-up available now on the Intel Developer Zone website.

Intel OSPRay Studio is a scene graph application that demonstrates high-fidelity, ray-traced, interactive, real-time rendering, and provides capabilities to visualise multiple formats of 3D models and time series. It is used for robust scientific visualisation and photoreal rendering, and consists of Intel OSPRay in conjunction with other Intel rendering libraries (Intel Embree, Intel Open Image Denoise, etc.). Intel OSPRay for Hydra is a Universal Scene Description (USD) Hydra API-compliant renderer that provides high-fidelity, scalable ray-tracing performance and real-time rendering with a viewport-focused interface for film animation and 3D CAD/CAM modelling.

New Intel DevCloud for oneAPI capabilities enable users to visualise and iterate rendering, and to create applications with real-time interactivity via remote desktop. Users can use the Intel oneAPI Rendering Toolkit to optimise visualisation performance and evaluate workloads across a variety of the latest Intel hardware (CPU, GPU, FPGA). There is free access, with no installation, setup or configuration required.

By taking advantage of Intel’s XPU hardware, Intel Optane persistent memory, networking solutions and oneAPI software, content creators and developers can bring their ideas to photorealistic reality with performance, efficiency and flexibility across today’s and future generations of systems and accelerators.

Advantages of using Intel’s platform with an open-development environment for ray tracing and rendering include:
• An open-platform approach addresses single-vendor lock-in concerns with cross-architecture support for a variety of platform choices in performance and cost.
• The toolkit’s open-source libraries drive innovation through powerful ray-tracing and rendering features that extend beyond the capabilities of GPUs, such as model complexity beyond triangles, path tracing, combined volume and geometry rendering, and addressing the data explosion in today’s workloads.
• Simplified AI integration is included via Intel Open Image Denoise, the Intel Distribution of OpenVINO toolkit, and acceleration via 3rd Gen Intel Xeon Scalable processors with Intel Deep Learning Boost and bfloat16.
• Intel tools provide readiness for next-generation hardware innovations, ensuring visualisation applications automatically scale to support future Intel CPUs, GPUs and other accelerators.

Visit https://devmesh.intel.com/groups/2969

OTOY NVIDIA A100 GPU Nodes on Google Cloud for RNDR

OTOY HAS LAUNCHED the RNDR Enterprise Tier featuring next generation NVIDIA A100 Tensor Core GPUs on Google Cloud with performance it says surpasses 8000 OctaneBench. This milestone was reached on a Google Cloud Accelerator-Optimized VM (A2) instance with 16 NVIDIA A100 GPUs. Each A100 GPU offers up to 20x the compute performance compared to the previous generation processor. With Google Cloud’s NVIDIA A100 instances on OTOY’s RNDR Enterprise Tier, artists can leverage OctaneRender’s industry-leading, unbiased, spectrally correct, GPU-accelerated rendering for advanced visual effects, ultra-high resolution rendering, and immersive locationbased entertainment formats. Benchmarked using OctaneRender 2020.1.4, Google Cloud instances bring thousands of secure, production-ready NVIDIA high-performance GPU clusters to the OctaneRender ecosystem – featuring both NVIDIA V100 Tensor Core GPUs (with up to 3000 OctaneBench) and NVIDIA A100 Tensor Core GPUs (with up to 8000 OctaneBench). In 3Q 2020, OctaneRender users will get early access to Google Cloud’s new NVIDIA A100 GPU instances with 40GB VRAM, 8-way NVIDIA NVLink support and 1.6 TB/s memory bandwidth, delivering remarkable memory capacity for blazing-fast GPU render times for the most demanding memory-intensive scenes – which was previously only available with slower out-of-core or CPU rendering. According to OTOY Founder and CEO Jules Urbach, “For nearly a decade we have been pushing the boundary of GPU rendering and

cloud computing to get to the point where there are no longer constraints on artistic creativity. With Google Cloud’s NVIDIA A100 instances featuring massive VRAM and the highest OctaneBench ever recorded, we have reached a first for GPU rendering – where artists no longer have to worry about scene complexity when realizing their creative visions.” Urbach added, “OctaneRender GPU-accelerated rendering democratised visual effects enabling anyone with an NVIDIA GPU to create high-end visual effects on par with a Hollywood studio. Google Cloud’s NVIDIA A100 instances are a major step in further democratizing advanced visual effects, giving any OctaneRender users on-demand access to state of the art NVIDIA GPUs previously only available in the biggest Hollywood studios.” “Built on the NVIDIA Ampere architecture, the NVIDIA A100 Tensor Core GPU has come to the cloud faster than any NVIDIA GPU in history,” says Bob Pette, vice president of professional visualization at NVIDIA. “Its record-breaking performance will open creative possibilities for artists and designers who use OctaneRender to produce their leading-edge work.” “Google Cloud Accelerator-Optimized VM (A2) instances are one of the only places with 16 GPUs connected via NVLink topology, enabling artists to push creative boundaries without sacrificing speed or performance. Bringing A2 instances to RNDR is a game changer for ultra-high resolution production, machine learning augmented rendering workflows, and next generation immersive media formats,” said Manish Sainani, Director, Product Management,


ML Infrastructure, Google Cloud.

Introduced earlier this year along with OctaneRender 2020, the RNDR Network allows artists to choose between rendering jobs on secure Enterprise Tier GPUs, like Google Cloud's NVIDIA instances, or leveraging the massive processing power available on a network of decentralized GPUs. Artists processing renders on the RNDR Enterprise Tier can also use decentralized GPUs for overflow capacity, providing the flexibility to scale renders across thousands of peer-to-peer nodes when on a deadline or for ultra-high resolution formats.

The RNDR Enterprise Tier on Google Cloud is available for users of all OctaneRender integrated plugins across 20 of the industry's leading content creation tools – including Maxon Cinema 4D, SideFX Houdini and Autodesk Maya. All current OctaneRender 2020 Studio and Enterprise subscribers, as well as OctaneRender Box License holders with an Enterprise maintenance plan, are able to access the RNDR Enterprise Tier.

Current OctaneRender users can try RNDR today at https://rndr.otoy.com, or subscribe, buy, or upgrade to OctaneRender 2020 to access industry-leading 2-3x NVIDIA OptiX 7 RTX GPU hardware acceleration, Fast-Spectral Random Walk SSS, C4D native GPU noises, OSL Volume Shaders, and many more new features. OctaneRender 2020 also includes access to EmberGenFX – one of the industry's first real-time GPU simulation toolsets for ultra-realistic fire, volumetrics, smoke and particles – and OTOY Sculptron, a GPU-based mesh sculpting toolset.

Visit www.otoy.com




ETC Publishes Specs for Naming VFX Image Sequences

THE ENTERTAINMENT TECHNOLOGY CENTER (ETC@USC) VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organising the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organisation had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication. ETC's specification, which aims to standardise the process for media production, is available online for anyone to use. The new specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realised that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
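The published white paper defines the actual fields and separators, but as an illustration of how a structured naming scheme can be enforced in a pipeline, the short Python sketch below parses a hypothetical plate-style name of the form show_shot_element_vNNN.FFFF.ext. The pattern and field names here are assumptions for illustration only, not the ETC specification itself.

import re

# Hypothetical pattern for illustration only -- the actual field names and
# separators are defined in the ETC "VFX Image Sequence Naming" white paper.
SEQ_NAME = re.compile(
    r"^(?P<show>[A-Za-z0-9]+)_"
    r"(?P<shot>[A-Za-z0-9]+)_"
    r"(?P<element>[A-Za-z0-9]+)_"
    r"v(?P<version>\d{3})\."
    r"(?P<frame>\d{4,})\."
    r"(?P<ext>exr|dpx)$"
)

def parse_frame_name(filename: str) -> dict:
    """Return the name's fields, or raise ValueError if it does not conform."""
    match = SEQ_NAME.match(filename)
    if not match:
        raise ValueError(f"non-conforming sequence name: {filename}")
    fields = match.groupdict()
    fields["version"] = int(fields["version"])
    fields["frame"] = int(fields["frame"])
    return fields

if __name__ == "__main__":
    print(parse_frame_name("showA_sh0100_plate_v002.1001.exr"))
    # {'show': 'showA', 'shot': 'sh0100', 'element': 'plate',
    #  'version': 2, 'frame': 1001, 'ext': 'exr'}

Validating names at ingest in this way is precisely the kind of automated, tool-friendly handling that a shared convention makes possible.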

To ensure all requirements are represented, the Working Group included over two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision for the future of media creation technology. A key design criterion for this specification is compatibility with existing practices.

Chair of the VFX Working Group, Horst Sarubin of Universal Pictures, said, "Our studio is committed to being at the forefront of designing best industry practices to modernise and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner."

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. "We wanted to make it as seamless as possible for everyone to adopt this specification," said ETC@USC's Erik Weaver, co-chair of the VFX Working Group. "To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups."

"Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision," said Craig Seidel, SVP of MovieLabs. "This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision."

Download the white paper, VFX Image Sequence Naming, via https://drive.google.com/file/d/173MM9VRVXKZ64dEPixTajnTv04f2o5SN/view

Visit https://www.etcentric.org

RE:Vision Effects Unveils DEFlicker v2

RE:VISION EFFECTS, INC., the effects plug-in developer, has introduced DEFlicker v2, a major upgrade to its solution for problematic high frame rate and timelapse footage. DEFlicker takes control of all things that flicker, whether you shoot with a very high-speed shutter or at very low video frame rates (timelapses). Version 2 provides additional and enhanced tools for correcting high-speed shutter artifacts and timelapse flicker, and now comes with 4 plug-ins:

HighSpeed: High-speed shutter and high frame rate elements tend to exhibit strobing caused by lighting systems. New in DEFlicker v2 are tools for managing the alpha channel for green screen VFX shots, along with improved noise reduction and a simplified method for removing the flicker quickly: users now enter the frames-per-second of the shot and the system frequency for their region (50 or 60 Hz), and DEFlicker takes care of the rest.

Timelapse: The solution for handling flicker in image sequences with a lot of motion discontinuities. New features include improved render speed for sequences over 4K and better handling of large colour and lighting shifts. It now also properly

deals with over-range values.

Auto-Levels: Extended timelapse scenes often suffer from fluctuations in colour and levels when shot using automatic exposure. DEFlicker already allows you to display the variation over time graphically and replace missing frames in the sequence. In v2, new methods of fixing or replacing bad or damaged frames have been implemented.

Rolling Bands: New in v2, the Rolling Bands plug-in can be used in conjunction with the HighSpeed plug-in or by itself. It allows you to model and attenuate those annoying dark bands primarily caused by timing mismatches between lighting and rolling shutter speed. Rolling Bands provides interactive ways to model the bands' height, the distance between bands, and the speed of roll. It also provides fine band-feathering options to match the source video's transition from lighter to darker, and its temporal processing takes the rolling bands' speed into account.

All plug-ins gain better handling of linear versus gamma-encoded sources (now automatic in AE, matching project settings), and v2 should load v1-based projects without issues. Visit https://revisionfx.com/
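As a rough illustration of why the frame rate and mains frequency are enough to characterise high-speed flicker: discharge and LED lighting on a 50 Hz or 60 Hz supply modulates at twice the mains frequency, and sampling that modulation at the camera's frame rate aliases it down to a low "beat" frequency that appears as pulsing brightness. The Python sketch below computes that aliased frequency; it is a simplified model for intuition only, not a description of DEFlicker's algorithm.

def flicker_beat_hz(frame_rate_fps: float, mains_hz: float) -> float:
    """Aliased (apparent) frequency of mains-driven light modulation.

    Light output on AC power modulates at 2 x mains frequency; sampling it
    at the camera frame rate folds that modulation into [0, frame_rate / 2],
    which is the pulsing visible in the footage.
    """
    light_hz = 2.0 * mains_hz              # 100 Hz (50 Hz mains) or 120 Hz (60 Hz mains)
    aliased = light_hz % frame_rate_fps    # wrap into one sampling period
    if aliased > frame_rate_fps / 2.0:     # fold into the first Nyquist zone
        aliased = frame_rate_fps - aliased
    return aliased

if __name__ == "__main__":
    print(flicker_beat_hz(120, 50))   # 120 fps under 50 Hz mains -> 20 Hz beat
    print(flicker_beat_hz(25, 60))    # 25 fps under 60 Hz lighting -> 5 Hz beat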





Remote, Cloud-based QC

HAVING SEEN A GREAT INCREASE in the need for those in the media processing industry (post houses, editors, colourists and the like) to QC their content remotely, Venera Technologies has taken the usage-based pricing idea introduced with its Pulsar Pay-Per-Use (PPU) on-premise QC software and applied it to Quasar, the company's native-cloud QC service.

According to Fereidoon Khosravi, SVP Business Development at Venera, "Until recently, Quasar had been used by those media companies and organisations that had already moved their content workflow to the cloud, and typically used one of Venera's tiered subscription plans. However, given the current circumstances, there are a large number of media professionals looking for a remote way to process (including QC) their content, but they have been hesitant because they are unfamiliar with the cloud and what can be done in the cloud and, equally important, are not interested in committing to a monthly subscription plan due to the uneven and uncertain volume of work.

"Given our experience of working closely with a wide range of media professionals, we have come up with what we think is a winning practical solution for remote QC – the cloud-based Quasar Ad-hoc plan."

Features include:
• File storage;
• An easy-to-use tool to upload your content to the cloud;
• Instruction on how to use Quasar and its pre-made QC templates (which you can modify as you wish);
• A browser-based interface; and
• Pay-as-you-go pricing – users only pay for what they QC: simply load credits into your Quasar account using a credit card, and the account is debited as you QC your content.

Visit www.veneratech.com



Avid Reimagines Workflows with Media Composer 2020

AVID HAS ANNOUNCED a major new release of its flagship Avid Media Composer 2020 video editing software. Designed to give storytellers at all levels the most powerful solution for more creative freedom and workflow flexibility, Media Composer 2020 includes a redesigned customizable user interface, a new Universal Media Engine, finishing and delivery tools, and support for Apple ProRes for Windows and Catalina, among many other enhancements.

More customizable user experience – With Media Composer 2020, users can tailor their workspace to exactly how they want to work. Improvements to the panelled UI dramatically increase ease of use and enable faster editing and mastering. A new Timeline Sequence Map increases efficiency by letting creators navigate their entire sequence without taking up the whole screen, while the Blank Panel unclutters the UI and stops panels from resizing.

Finish and deliver with precision – Expanding on the editing and finishing capabilities introduced a year ago, Media Composer 2020 enables users to fine-tune colour with greater precision and make more granular gain value adjustments when working in ACES (Academy Colour Encoding System) spaces. Users can finish high-resolution and HDR projects with total colour precision and interoperability, ensuring pristine picture quality throughout their workflow.

Next Generation Avid Media Engine – Media Composer's powerful Universal Media Engine enables users to accelerate their workflows by reducing the

reliance on QuickTime, delivering better media importing, playback, editing and export performance. The media engine increases processing speed for hi-res HDR media and provides native support for a wider range of formats, including direct media access and OpenEXR for over-the-top services such as Netflix. Media Composer 2020 also enhances a user's ability to easily create content for mobile video platforms and social media by providing 9×16 and 1:1 Mask Margins and FrameFlex framing pre-sets.

Apple ProRes for Windows and Catalina Support – Like Mac users, Windows users can now create, edit, collaborate on, and export ProRes media natively, with encoding supported on Windows machines for media creation and export to .MOV, MXF OP1a, and MXF OP-Atom workflows. Media creators can also use Media Composer on Apple's latest macOS Catalina, a 64-bit OS that provides superior performance while leveraging the power of the new Mac Pro.

Media Composer | Enterprise – Additionally, Media Composer | Enterprise expands its role-based customization capabilities, enabling users to deploy or update site settings across an organization and deploy user settings independently to individuals or groups quickly, without impacting any existing site settings. With more studios managing remote teams, Media Composer | Enterprise gives users more control over their productions.

Visit www.avid.com/media-composer

Bluefish444 Supports Foundry Nuke & Nuke Studio 12

BLUEFISH444, the manufacturer of uncompressed 4K SDI, ASI, Video Over IP and HDMI I/O cards and mini converters, has announced support for Foundry Nuke and Nuke Studio 12 in its latest 2020.14 Windows Installer for KRONOS K8 and Epoch hardware. Bluefish's K8 video I/O card and the entire Epoch range now support 2K/HD/SD-SDI playback within Nuke and Nuke Studio 12, giving post-production 2D/3D compositing and visual effects professionals access to the high quality associated with Bluefish video cards. With the K8 and Epoch video cards, Nuke and Nuke Studio 12 now have access to the highest quality SDI playback with proprietary 12-bit processing, supporting both RGB and YUV colour spaces, currently in 2K/HD/SD video modes. Bluefish will continue to work with Foundry to update support for the Nuke and Nuke Studio products and will be integrating 4K/UHD support in a forthcoming installer update. Visit https://www.bluefish444.com

LYNX Technik’s HDR Suite of Processing Solutions (HDR Evie+, HDR Evie, HDR Static) for the greenMachine platform addresses the challenge that broadcasters and content creators are facing when there is a need to broadcast or archive content in both SDR and HDR. The simple solution is to have two independent production routes for SDR and HDR, but this is costly and falls short of delivering full HDR content to the customer and ultimately the subscriber / viewer. The greenMachine suite of HDR processing tools addresses the issue of the simultaneous workflows by combining optimized HDR with up-converted SDR sources into a single HDR production process, thus eliminating the expensive and time-consuming dual SDR & HDR production. HDR STATIC, HDR Evie (Enhanced Video Image Engine) and HDR Evie+ processing applications run on the now familiar and award-winning LYNX Technik greenMachine hardware platform. These industry leading format conversion solutions ensure facilities can, for example, use a single greenMachine titan hardware module to up-convert four independent 3G SDR sources (e.g. SDR-only cameras, graphics, replays, external feeds, archives, etc.) to HDR in a variety of formats, and feed directly into the HDR

production workflow. This conversion ensures that HDR content is delivered direct from the optimised HDR cameras without compromise. Once the content is ready for delivery, broadcast or streaming to clients and subscribers/viewers, greenMachine can down-convert one of the HDR programme output feeds to SDR, ensuring media facilities can deliver content both to HDR-capable screens and to viewers who are still watching in SDR.

HDR Evie+, the most recent addition to LYNX Technik's HDR line-up for the greenMachine platform, takes things to a new level. It makes use of patented, industry-leading dynamic, segmented, frame-by-frame algorithms that apply sectional tone mapping, allowing each of 144 segments per frame of 3G or 4K HDR content to be adjusted in real time. This segmented dynamic conversion is especially suited to demanding and unpredictable content with fast-moving subjects and high-contrast conditions, as typically found in live sports and news broadcasts.

The entire range of LYNX Technik's greenMachine HDR<>SDR processing solutions supports a range of open standards for conversion, tone mapping and colour gamut – including HLG, PQ, SDR and S-Log3; Rec 709 and Rec 2020; ACES and DCI-P3; and camera formats from Panasonic, Sony, ARRI, RED and Blackmagic Design.

Visit www.green-machine.com
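LYNX Technik's Evie+ algorithms are patented and proprietary, but the general idea of sectional dynamic tone mapping can be sketched simply: split each frame into a grid of segments, measure each segment's own brightness, and apply a tone curve whose strength adapts to that local measurement. The NumPy sketch below uses a 12 x 12 grid (144 segments, matching the figure quoted above) and a basic Reinhard-style curve; it is purely illustrative and omits the colour handling, temporal smoothing and de-blocking a real converter needs.

import numpy as np

def segmented_tone_map(hdr_frame: np.ndarray, grid: int = 12) -> np.ndarray:
    """Toy per-segment tone mapping of a linear-light HDR luminance frame.

    hdr_frame: 2-D array of linear luminance values (arbitrary HDR range).
    Returns values compressed towards a 0..1 SDR-like range.
    """
    h, w = hdr_frame.shape
    out = np.empty_like(hdr_frame, dtype=np.float64)
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    for i in range(grid):
        for j in range(grid):
            seg = hdr_frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            # Adapt the curve to the segment's own highlights:
            white = max(np.percentile(seg, 99), 1e-6)
            # Simple Reinhard-style curve scaled by the local white point.
            out[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = seg / (seg + white)
    return out

if __name__ == "__main__":
    frame = np.random.gamma(shape=2.0, scale=2.0, size=(1080, 1920))
    sdr = segmented_tone_map(frame)
    print(sdr.min(), sdr.max())   # values now confined to roughly 0..1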






AUDIO Digital soundwaves

www.content-technology.com/audio

Stage Tec AURUS platinum is Backbone of KBS Hall in South Korea

KBS, THE KOREAN BROADCASTING SYSTEM, has updated the mixdown room at its well-known event location, KBS Hall. Stage Tec's AURUS platinum was chosen as a main component of this transformation. KBS Hall is located in Yeouido, Seoul, and is best known for hosting a variety of music and concert performances by both Korean musicians and international guests. The concerts and shows are broadcast live and also recorded for later broadcast, with the audio managed from a control room at the rear of the hall.

In order to stay technically up to date and to integrate a Merging Pyramix multitrack recorder, a digital upgrade of the mixdown room was recently carried out. The control room was equipped with a new Stage Tec AURUS platinum mixing console with 48 faders – perfect for working with the existing Ravenna/AES67 audio network technology and the analogue effects units in the room. AES67 allows individual devices to be connected to a network that primarily uses another AoIP protocol.

AURUS large mixing consoles are Stage Tec's flagship products. In addition to the outstanding audio quality, the console is characterised by a generously equipped user interface: all important functions are available with a single touch. AURUS platinum delivers the highest audio quality, the most extensive editing functions, the fastest workflows and tight integration with the NEXUS audio network. All common audio

processing functions can be accessed directly from each channel strip. The clarity is extremely high, the distances are very short and the operating speed is unrivalled. Especially important for the broadcast sector is the unrestrictedly flexible bus structure of AURUS with intelligent on-air/off-air solutions. In addition, compliance with the existing AoIP was a particular requirement for this project, and as the Ravenna/AES67 protocol must match the digital audio workstation Merging Pyramix, AURUS was the ideal choice.

The Stage Tec console was supplied by Syncfish, the manufacturer's distributor for South Korea. Since 2002 Syncfish has provided many digital audio solutions for KBS.

Music events at KBS Hall often involve up to 100 people on the stage, and for large classical events there may be as many as 200 artists – a special challenge for system and mix engineers. The new AURUS console helps ease these challenges. The whole system is supported with a minimum of wiring and a simple clock sync method (PTPv2). For mixing processes, impressive automation buttons are integrated in the interface. Further to the many helpful, console-internal functions, the interaction with the NEXUS system and all other connected devices is one of the conveniences that make working with an AURUS console so comfortable, especially in large, complex and time-critical productions. 40 user keys serve to meet all requirements, from the simplest routing changes in the audio network to PTZ control of connected cameras. All operations can also be stored in scenes and run automatically.

KBS Mixdown Maintenance Engineer Kim Jae Min with the newly installed Stage Tec AURUS platinum.

Visit http://www.stagetec.com

Indonesia’s 86 INC Upgrades with Clear-Com HelixNet and FreeSpeak II CLEAR-COM PARTNER 86 INC, an Indonesiabased rental company specialising in communications equipment for broadcast, event productions and live shows, has decided to expand its inventory in response to growing demand. Company owner Edwino “Dino” Gonggalang has been using Clear-Com’s solutions since 2010 and recently invested in the addition of HelixNet digital network partyline and FreeSpeak II digital wireless intercom systems to meet the growing complexities of communications networks in the company’s customer base.


Dino established 86 INC in 2006 while pursuing his law degree at Atma Jaya Catholic University Jakarta, where he began his production work supporting various campus activities, from seminars to sporting events and live music. Just a year into its tenure, the company was landing major projects such as the Indonesia National Christmas Celebration, Dreamfields Festival, the IMF-WB Annual Meeting and the Asian Games, as well as shows for artists like Metallica and BTS.


With increasingly sophisticated productions demanding more from intercom systems, 86 INC made the decision to expand its analog systems with the latest digital intercom solutions from Clear-Com. "Clear-Com offers a variety of flexible solutions for the communications needs of our clients, which are only becoming more complex," explains Dino. "The addition of HelixNet and FreeSpeak II empowers us to address any application with complete confidence in the result."

HelixNet enhances familiar analog partyline system concepts with additional operational and infrastructural flexibility. "We decided to expand to HelixNet as the product offers great flexibility without being difficult to use or set up," said Dino. "One of our favorite features is that HelixNet completely eliminates ground loop hum, which has been a longtime request of our customers."

FreeSpeak II is a five-channel, full-duplex wireless intercom solution that is the obvious choice for large-scale, complex designs or specialized applications. 86 INC provides support for some of the most technologically demanding live events in Indonesia, thus requiring a robust intercom solution with the advanced capabilities of FreeSpeak II. Both HelixNet and FreeSpeak II can also be easily integrated with existing products and are backward compatible with Clear-Com legacy products, which is a cost-effective advantage for customers. "Due to the current global situation, we haven't been able to deploy our latest equipment, but we are sure they will be in high demand when live events are able to return," concludes Dino. Visit https://clearcom.com and https://www.86inc.biz



Sth Korea’s Seongnam Arts Centre Completes Upgrade with Clear-Com SEONGNAM ARTS CENTRE is a cultural hub in South Korea that prides itself on being a stateof-the-art, multi-purpose performance facility. The Center is one of the first venues in Korea to upgrade to a fully digital intercom system. Having recently installed a new audio system, the 1808seat capacity focused on an intercom systems upgrade comprised of Clear-Com’s HelixNet Digital Network Partyline and FreeSpeak II Digital Wireless Intercom. The new system would be implemented across three performance halls: the Concert Hall, the Ensemble Theatre, and the largest of the three, the Opera House. Seongnam Arts Center partnered with One-Up Solutions for the integration of the intercom upgrade, but there was some trepidation over transitioning from the existing analogue system infrastructure, to a fully digital intercom system. One-Up referred them to Clear-Com’s work with the 2018 PyeongChang Winter Olympics, which had designated Clear-Com equipment as a standard by the International Olympic Committee (IOC), helping to put their minds at ease. The venue had been using an analogue intercom system that was already operating at max capacity, and demand for a bigger system, with more channels and more flexibility, was only continuing to grow. Andy Jae Hyung Ryu, Chief Technical Director of

One-Up Solutions explained, “Based on the requirements of international and big performance teams we can predict that performance venues in Korea will need to provide larger systems which will mean that digital intercom systems will have higher demand … so they decided to upgrade.” In the Opera House, the digital intercom system includes a HelixNet system and a FreeSpeak II system. The design for the new digital IP system took into consideration the existing cable infrastructure, and they were able to upgrade their analogue system to a digital system with 12 channels (64 endpoints) while still using the traditional three-pin XLR cable infrastructure that was already in place. A HelixNet HMS-4X main station is installed on the Stage Manager desk, backstage, and the desk station unit and belt pack are configured in various channels like sound,

lighting, video desk, and broadcasting. A FreeSpeak II FSII-Base-II wireless base station, linked with HelixNet, is also installed on the Stage Manager’s desk backstage. The FreeSpeak II wireless intercom system connects 25 wireless belt packs via several 2.4GHz antennas distributed throughout the venue for seamless roaming. Visit https://www.clearcom.com/

New System T Broadcast Platform from SSL

THE LATEST SYSTEM T broadcast platform release from Solid State Logic addresses many of the ongoing IP and security challenges within modern audio production. With expanded AoIP integration providing direct console routing control for ST 2110-30 and AES67 streams, and a major embedded operating system upgrade, broadcasters can embrace remote production and increased IP expansion and connectivity with confidence.

This latest release provides new functionality across the whole System T console range, including the S500, S500m, S300 and TCR, and builds on previous enhancements including NGA and immersive audio, and DAW and dynamic automation aimed at entertainment programming and events.

Tom Knowles, SSL Broadcast Product Manager, comments, "System T provides the most advanced AoIP integration of any audio console, supporting Dante, AES67 and ST 2110 on the same hardware interfaces, with full discovery and dynamic routing of any Dante-enabled devices directly from the console GUI. Our latest V3.0 release further enhances true IP integration with visibility and routing of ST 2110-30 or AES67 RTP (Real-time Transport Protocol) streams from the console GUI."

SSL has updated the embedded operating system on System T, future-proofing ongoing investment and keeping production facilities secure. With the migration to Windows 10 Embedded, broadcasters get the significant advantages of reduced development cycles and advanced feature deployment. With additional support for TeamViewer, consoles can also now be remotely accessed, enabling remote support, configuration, and 'at home' control with a standard internet connection. Further client-requested feature updates include enhancements to System T's onboard Effects Rack with a new reverb with degenerative noise controls, a 10-band parametric EQ with FFT overlay, plus a complete overhaul of channel and bus path presets.

Visit www.solidstatelogic.com

Neumann Ships V 402 Preamplifier

GERMAN STUDIO SPECIALIST Neumann.Berlin is shipping the V 402, its new microphone preamplifier with integrated headphone monitoring. The V 402 is Neumann's first-ever stand-alone microphone preamp – although the company has created several generations of top-quality preamp modules for its mixing consoles, such as the now legendary V 476 B of the 1980s. Like its predecessors, the V 402 preamplifier is conceived as the perfect complement to all Neumann and other high-class studio microphones.

The V 402 is a dual-channel microphone preamplifier carefully designed to maintain the sonic integrity of the original signal. Its unique transformerless circuitry amplifies the microphone signal without unwanted coloration or sonic artifacts, such as noise and distortion. While this is also often claimed for simple preamps such as those in audio interfaces, the V 402 is built to much higher standards.

The V 402 is characterised by exacting attention to every detail. For example, the DI input is also designed for sonic purity: its novel high-impedance input stage captures the sound of electric guitars and basses, as well as other instruments, with no loss of detail and free from audible noise. Its switchable high pass removes rumble and pops very effectively without signal degradation, and a 20 dB pad allows the V 402 to be used with high-level sources up to 28 dBu without distortion.

The V 402 is also equipped with a studio-grade headphone amplifier ensuring uncompromised monitoring quality at the recording stage. Independent volume controls for each channel allow you to dial in a latency-free monitoring mix without affecting the recorded signal.

The V 402 comes in a 2U 19" rack enclosure measuring 89 mm x 483 mm x 242 mm.

Visit https://en-de.neumann.com/




DirectOut PRODIGY Evolves to Next Level

DIRECTOUT HAS EXPANDED the capabilities of its PRODIGY Series with the release of new firmware for both PRODIGY.MC (Modular Audio Converter) and PRODIGY.MP (Multifunction Audio Processor). Originally planned solely for PRODIGY.MP, the update expands the potential application scenarios of PRODIGY.MC by adding FastSRC and EARS (Enhanced Automatic Redundancy Switching). FastSRC is a bidirectional, low-latency asynchronous sample-rate converter for MADI and network I/Os that allows two digital interfaces of a device to work in different clock domains. It combines good sound quality with very low latency of less than 0.15 milliseconds, and is invaluable in live sound applications and a "life-saver" in critical situations.
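FastSRC itself is DirectOut's own low-latency design, but the basic job of any asynchronous sample-rate converter – producing output samples on one clock from input captured on another – can be illustrated with a simple interpolating resampler. The Python sketch below uses plain linear interpolation; this is a generic teaching example only, and production converters such as FastSRC use far higher-quality filtering to keep distortion and latency low.

import math

def resample_linear(samples, in_rate: float, out_rate: float):
    """Resample a mono block from in_rate to out_rate by linear interpolation.

    Toy illustration of asynchronous sample-rate conversion; not an
    implementation of DirectOut's FastSRC algorithm.
    """
    if not samples:
        return []
    ratio = in_rate / out_rate              # input samples advanced per output sample
    out_len = int(math.floor((len(samples) - 1) / ratio)) + 1
    out = []
    for n in range(out_len):
        pos = n * ratio                     # fractional read position in the input
        i = int(pos)
        frac = pos - i
        nxt = samples[min(i + 1, len(samples) - 1)]
        out.append((1.0 - frac) * samples[i] + frac * nxt)
    return out

if __name__ == "__main__":
    # 1 kHz tone sampled at 48 kHz, resampled to 44.1 kHz.
    tone = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(480)]
    print(len(resample_linear(tone, 48000.0, 44100.0)))   # ~441 samples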

Shure SLX-D Digital Wireless System

THE SLX-D DIGITAL Wireless System is the newest addition to Shure’s digital wireless portfolio. Replacing Shure’s popular SLX system, SLX-D comes complete with new mechanical designs, exceptional audio quality, more reliable RF performance and simplified setup. The multi-faceted SLX-D Digital Wireless System provides end users with greater channel count than SLX, smart rechargeable options, and single and dual channel options. Transmitters run on standard AA batteries or an optional lithium-ion rechargeable battery solution with a dual-docking charging station. SLX-D is a state-of-the-art system with several notable features and user-friendly benefits, including: • Reliable RF - The system enables operation of up to 32 channels per frequency band without worrying about dropouts or signal fades. • Excellent Audio Quality – the SLX-D can handle a variety of inputs while preventing distortion – ultimately enabling clean, natural instrument and vocal sound.


• Ease of Use - the SLX-D is equipped with Guided Frequency Setup and a Group Scan feature that lets users set up multiple channels more efficiently by assigning frequencies to all receivers automatically via ethernet connections. Even for a 30+ channel system, the entire Group Scan can be completed within a few seconds.


Visit https://www.shure.com/en-ASIA

Other new functions include Levelmeter speed configuration and an LTC-to-MTC converter.

Updated Control Software

A new version of the globcon control software is also available for free download, and is required in order to manage all of the new functions brought to the PRODIGY Series by the major firmware updates. globcon is a global control software platform for the management of professional entertainment production equipment, allowing standardised control of multiple types of equipment from different manufacturers. DirectOut GmbH was the first adopter of globcon, and the complete DirectOut product portfolio is fully supported. In 2020, an increasing number of differing devices

Firmware Update for Pliant CrewCom

PLIANT TECHNOLOGIES HAS released the version 1.8 firmware update for all CrewCom wireless intercom systems and devices. The new V1.8 firmware offers several improvements, including refined communications for network devices, and provides an overall enhanced user experience for Pliant's CrewCom customers. The 1.8 firmware update includes a new group management feature in CrewWare and a new two-level user rights feature. It also adds a roaming bias option for CrewCom Radio Packs, the ability to connect and disconnect configured CrewCom devices without power-cycling the system, and improved battery monitoring for the system's devices. Additionally, the 1.8 firmware does not require users to power-cycle remote devices when power-cycling the system's Control Unit, and there is full compatibility with the new Pliant 6+6 Drop-In Charger. The firmware update also adds mixed-band options on 900MHz transceivers and enables the use of GPO relays. The V1.8 firmware and software update, as well as installation instructions and release notes, are available to all CrewCom users free of charge and can be downloaded via https://plianttechnologies.com/downloads

from multiple manufacturers are being added progressively. A public version of the software is available for download at no cost, providing integrated control and management of any number of DirectOut devices via an Ethernet network, USB and MIDI ports.

According to Luca Giaroli, Product Manager at DirectOut and at globcon, "Thanks to the great co-operation between DirectOut and the globcon team, we have been able to increase the usability and versatility of the DSP functions inside PRODIGY.MP still further by adding parallel compression and stereo link to the dynamics, and introducing a new set of IIR EQs including all-pass filters. The processor combined with its control software is absolutely ready to face any audio challenge."

Visit www.globcon.pro and www.directout.eu/en/support/updates

Avid Pro Tools S4 and S6 Configurator

USING A DAW (Digital Audio Workstation) CONTROLLER, musicians and sound engineers get the most out of professional audio editing software. Qvest Media has developed an online "Configurator" for both of the modular Pro Tools | S4 and Pro Tools | S6 systems from Avid. With the help of haptic and visual controls such as knobs, joysticks, faders and displays, audio professionals can simplify and accelerate their workflows and finish audio projects faster and more comfortably.

Thanks to its ergonomics and flexible application and design options, the S6 has become a leading controller and is used in numerous recording studios worldwide. In 2019, Avid added the compact S4 to the portfolio. With Qvest Media's Avid S4 and S6 Configurator, different layouts for both DAW controllers can be conveniently configured in an online application. Upon completing the configuration, the created design can be shared with one of Qvest Media's Avid audio experts for in-depth product and system consultation.

Both DAW controllers support the open EUCON protocol, which allows DAWs from other manufacturers to be operated with an S4 or S6. Furthermore, it is possible to control multiple DAWs simultaneously with a single controller.

Visit www.qvestmedia.com/avid-configurator



SLR Camera/Mobile Device Microphone

SENNHEISER HAS ADDED the new MKE 200 to its portfolio of audio-for-video microphones. The mini-microphone is designed for easy on-camera use with DSLRs and mirrorless cameras as well as mobile devices, where it ensures clean and crisp audio and gives that professional touch to video clips. "With the MKE 200, we are offering creators the first step to upgrading their sound," said Tobias von Allwörden, Head of Portfolio Management – Audio for Video at Sennheiser. "Improved audio significantly increases the overall quality of your content. The MKE 200 makes this possible with its unique design, which minimizes handling and wind noise. Simply attach it to the shoe mount, select the appropriate cable for your device and you're good to go!"

The MKE 200 features a compact, sleek design with a stylish finish thanks to a fully integrated shock-mount and built-in windscreen. Battery-free operation and a lightweight design allow for optimal gimbal performance.

CEDAR for Pyramix 64 v8

CEDAR AUDIO HAS ANNOUNCED the availability of CEDAR for Pyramix 64 v8 which, as well as offering improvements in other modules, brings all of the benefits of the latest machine learning and AI capabilities of Retouch 8 to the Pyramix platform. CEDAR claims it invented spectral editing but, not content to leave things as they were, has continued to research ways to improve it – making it faster and easier to use, able to cure a more extensive range of problems, and even more productive in a wider range of settings.

In the case of similar instances of unwanted sounds in a track – elements such as hi-hat spill, over-excited sibilants and plosives, or even the repetitive noises caused by machinery – one would traditionally remove these by identifying each event individually and then defining it manually. Now, users can mark one of the offending sounds and ask the machine learning algorithm in Retouch 8 to find all of the other instances within the recording. They can then be eliminated individually or as a group using the appropriate Retouch tool.
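CEDAR's machine-learning implementation is proprietary, but the underlying "mark one event, find the rest" idea can be illustrated with straightforward spectrogram template matching: take the time-frequency patch around the marked sound and correlate it against the rest of the spectrogram to flag similar events. The NumPy sketch below is a deliberately simple stand-in for what Retouch 8 does with far more sophistication.

import numpy as np

def find_similar_events(spectrogram: np.ndarray, t0: int, t1: int,
                        threshold: float = 0.8):
    """Locate spectrogram positions resembling a marked time span [t0, t1).

    spectrogram: 2-D magnitude array, shape (freq_bins, time_frames).
    Returns frame indices whose normalised correlation with the marked
    template exceeds the threshold.
    """
    template = spectrogram[:, t0:t1]
    width = template.shape[1]
    tvec = template.flatten()
    tvec = (tvec - tvec.mean()) / (tvec.std() + 1e-9)
    hits = []
    for start in range(spectrogram.shape[1] - width + 1):
        win = spectrogram[:, start:start + width].flatten()
        win = (win - win.mean()) / (win.std() + 1e-9)
        score = float(np.dot(tvec, win)) / tvec.size   # normalised correlation
        if score >= threshold and not (t0 <= start < t1):
            hits.append(start)
    return hits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spec = rng.random((128, 500)) * 0.1
    click = rng.random((128, 4)) + 1.0        # a repeated unwanted event
    for pos in (50, 200, 350):
        spec[:, pos:pos + 4] += click
    print(find_similar_events(spec, 50, 54))  # expect hits near 200 and 350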

The MKE 200 comes complete with a furry windshield, two locking connection cables for DSLRs or mirrorless cameras (3.5 mm TRS cable) and mobile devices (3.5 mm TRRS cable) plus a draw-string pouch for storage. Visit https://en-de.sennheiser.com

Logitek Adds Dante Capability to Helix TV Audio Console

LOGITEK’S HELIX TV console package, a touchscreen audio console available in physical and “virtual” formats, now provides 64 x 64 channels of Dante audio along with AES67 networking. Powered by the JetStream Plus audio router, Helix TV integrates easily with popular automation/workflow systems such as Ross OverDrive, Sony ELC and Vizrt’s Viz Opus/ Viz Mosart as well as various networking platforms such as Ravenna and Livewire. 24 mix-minus buses are available, as are frame delay, EQ, and dynamics at every fader. Faders on the physical surface are motorised for remote operation. “The JetStream Plus router at the heart of our Helix TV system makes it easier for television stations to manage audio workflow, even from remote locations,” said Tag Borland, Logitek president. “TV stations are continuing to look at ways to more efficiently integrate their studio equipment, and many of our customers have requested Dante capability along with automation system integration. We’re pleased to announce that it’s now available as a standard feature in the Helix TV.”

Etere Loudness Control System

AUTOMATION, PLAYOUT and MAM specialist Etere has introduced an advanced loudness control system, ETX-L Loudness Control, that enables greater flexibility in adjusting the thresholds for loudness normalisation parameters according to requirements. The feature uses EBU R128 and ITU BS.1770 algorithms to measure true-peak audio levels and adjust content to the dynamic loudness range required by the station. Typically, loudness measurements uniformly integrate all programme material. However, loudness levels can be inconsistent across different audio equipment, and broadcast audio can possess a greater dynamic range due to the use of digital media. As a result, the measurements may not be an accurate representation of the true measured value, losing consistency of audio levels between programmes and audio delivery sources.


All new JetStream Plus/Helix TV systems will include AES67 and Dante; current users can contact Logitek to purchase an upgrade.

Etere ETX-L Loudness Control unveils new capabilities to overcome this challenge by enabling loudness levels and dynamic range to be optimised according to the requirements of the station. The algorithm pre-reads audio data to determine the audio normalisation levels, and loudness normalisation thresholds can be adjusted and modified as needed. When reading audio data, everything below the threshold is not normalised, to avoid incorrect transitions. If audio data cannot be pre-read, as with live sources, the measured data is saved and used to regulate the audio loudness range dynamically.
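Whatever tool performs the R128/BS.1770 measurement, the normalisation step itself reduces to a simple gain offset between the measured integrated loudness and the station's target (commonly -23 LUFS under EBU R128), capped so that true peaks stay legal. The Python sketch below assumes the integrated loudness and true peak have already been measured; it is a generic illustration of that arithmetic, not Etere's algorithm.

def normalisation_gain_db(measured_lufs: float,
                          measured_true_peak_dbtp: float,
                          target_lufs: float = -23.0,
                          max_true_peak_dbtp: float = -1.0) -> float:
    """Gain (in dB) to bring a programme to the target loudness.

    The gain is limited so that the known true peak does not exceed the
    permitted maximum after the offset is applied.
    """
    gain = target_lufs - measured_lufs
    headroom = max_true_peak_dbtp - measured_true_peak_dbtp
    return min(gain, headroom)

if __name__ == "__main__":
    # A programme measured at -18.2 LUFS with a -3.0 dBTP true peak:
    # the raw offset is -4.8 dB, well within the 2.0 dB of peak headroom.
    print(round(normalisation_gain_db(-18.2, -3.0), 1))   # -4.8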

Visit www.cedaraudio.com

Visit https://www.helixconsole.com

Visit https://www.etere.com

In addition, the AI in the new Repair tool can identify, suppress, or reveal sounds while leaving the background in a marked region untouched. Unlike other spectral editing tools, only the significant signal within the region is processed; all low-level signals as well as the ambience are left unaffected. The latest additions include:
• Retouch 8
• Auto declick
• Auto decrackle
• Auto dehiss
• Manual declick
• Dethump

Up to 240 channels of local I/O may be directly connected to a JetStream Plus; total networking capability including Dante and AES67 is 96 channels in / 64 channels out.


The MKE 200 features a directional design which captures the sound of your subject while rejecting unwanted background noise. To minimise any handling noise, the microphone is fitted with a clever internal shockmount which acoustically decouples the capsule from the housing. To protect from wind noise, Sennheiser engineers designed the MKE 200 with an integrated layer of protective mesh inside the housing. This protection is further enhanced by using the included furry windshield when filming outdoors.



RADIO The original broadcast media

www.content-technology.com/radio

Stagetec Asia Transforms Taiwan FM Radio

STAGETEC ASIA WAS RECENTLY entrusted to commission an international radio project in the North Asian market, further strengthening a brand name it has built in the South East Asian region since its establishment in 2004. Professional sales support and technical expertise are Stagetec Asia's commitments and deliverables in business, and 2020 marked its 16th anniversary with a significant milestone: the company's first project in North Asia, awarded by Transformation Radio Station in Taiwan.

Transformation Radio recently invested in a DHD Audio system for its new radio studios in Taipei. The decision was made in favour of DHD Audio because of its leadership in audio mixer technology and the reliability of its products, which meet Transformation Radio's long-term investment goals and expectations. Besides the DHD Audio system, Transformation Radio also chose the Yellowtec mounting system for its modernised studio ensemble, on the strength of its product quality and its intuitive, award-winning and sleek designs. As a result, the Transformation Radio studios are equipped with state-of-the-art broadcast radio equipment that supports the latest AoIP formats and interoperates seamlessly.

On-site testing and commissioning were scheduled for January 2020 but were slightly delayed amid the COVID-19 travel bans and lockdowns. The system installation was coordinated and supervised by Mr. Yap Wei Keong, Stagetec Asia's Project Manager, together with his team, covering pre-testing, configuration, commissioning and training for the radio station. Credit also goes to local partner VTek Engineering Ltd, the main system integrator in Taiwan, which worked virtually and remotely with the Stagetec Asia team for a smooth commissioning – underlining Stagetec Asia's capability and business continuity despite the unexpected and unprecedented global pandemic.

The project consisted of six units of the DHD 52/SX2, a complete modular and modern broadcast audio mixer, installed across six on-air studios. The 52/SX2 broadcast mixer is a user-friendly system that networks with other Series 52 products, including DSP control software, view apps and production remote-control software. Meanwhile, the MCR room was equipped with main and backup DHD XD2 cores – a high-capacity radio router that accepts numerous connectivity options for AES67, Dante, MADI, Gigabit Audio, AES and analogue audio feeds, and stays in control

via Ember+, NMOS, DHD-ECP, Ross-Protocol, SNMP, Pro-Bel, GPIOs or serial connections.

Mr. Advon Tan, Managing Director at Stagetec Asia, commented, "For Transformation Radio's completely new studios, Stagetec Asia has provided the latest radio broadcast solution for a unique and modernised broadcast station – modernised in terms of the latest technology, to cater for new broadcast formats, and unique in terms of workflow customisation and tailored design, which gives DJs the creative freedom in their radio shows to impress the audience with a powerful, optimised performance. The user-friendly system is also designed for cost effectiveness, with easy maintenance and low power consumption. We believe that both Stagetec Asia and DHD are not only worldwide-recognised brands but are also able to understand and meet the expectations of Transformation Radio, and so we have gained a strong footing in this new market. We anticipate more demand for radio studios in new markets and we are ready to serve, providing unique solutions tailor-made to each radio studio's budget and requirements."

Mr. Junan, Head of Engineering at VTek, added, "We were impressed by our engagement with the Stagetec Asia team, who fully understood our needs and concerns. Our client specifically requested a customised radio workflow and a tailor-made studio design with a modernised theme, and the end results are impeccable. Stagetec Asia managed to deliver the project successfully and professionally in spite of some hiccups amid the COVID-19 outbreak. Congratulations to Transformation Radio – all of its radio studios have recently begun airing and are now fully operational."

Visit www.stagetecasia.com

Spokenlayer Partners with SCA to Launch Short Form Audio Network

US COMPANY SPOKENLAYER has announced its expansion Down Under with the launch of SpokenLayer Australia in partnership with TV and radio broadcaster SCA. SpokenLayer is the US's leading provider of short form audio content for voice assistants, smart speakers and podcast platforms. The company specialises in the creation, distribution and monetisation of short form audio content in partnership with publishers and brands. The end-to-end solution turns text into human-voiced audio which is then distributed to Google Home, Amazon Alexa, Apple Podcasts, Spotify and a number of other platforms. The partnership with SCA will enable SpokenLayer to access SCA's vast network of professional voices and enables advertisers to access SpokenLayer's growing network.


SpokenLayer CEO, Andy Lipset, said: “We are thrilled to be partnering with SCA as we bring SpokenLayer to Australia and also New Zealand. With digital audio consumption showing massive listenership and smart speaker penetration increasing every quarter, Australasia is a natural place to extend SpokenLayer’s presence to. With SCA’s infrastructure, relationships, and native understanding of audio, coupled with SpokenLayer’s technology and ownership of short form, we know advertisers and publishers are going to find incredible success working with our team there!”


SCA CEO, Grant Blackley, said: “We’re excited to partner with SpokenLayer as

it enters the Australian market, and SCA has a depth of professional voices across Australia. Audio is an incredibly innovative medium, and with smart speaker ownership growing and Australians spending more time at home, we're seeing demand for text-to-voice grow dramatically. We see these text-to-voice briefings as complementary to our existing assets in podcasting and radio streaming, giving advertisers access to a highly engaged and growing digital audience."

There are more than 5.7 million smart speakers in Australia, which has the greatest concentration per capita globally. Smart speaker consumption has grown by 50% month on month since March this year, with 3.2 million Australians having at least one smart speaker in their home, according to SCA.

SpokenLayer recently launched daily news briefings for five newspapers as part of a partnership with Australian Community Media (ACM). Led by The Canberra Times, the ACM partnership will also see SpokenLayer producing and distributing daily briefings for The Ballarat Courier, The Launceston Examiner, The Border Mail and The Illawarra Mercury. SpokenLayer already partners with leading US publishers such as Hearst, Time and Condé Nast, producing content for more than 200 titles including TechCrunch, Time's The Brief, The Economist and The Los Angeles Times. The ACM titles will add to its growing Australian roster, which includes a daily gaming news update from leading gaming publisher Press Start and the Coronavirus Australia Daily Update.

Visit www.sca.com.au and https://spokenlayer.com



Euro Deadline for Digital Radio in Cars

DIGITAL RADIO ROLLOUT in automobiles will receive further impetus throughout Europe with an impending adoption deadline for the European Electronic Communications Code (EECC), which must be transposed into national legislation by EU Member States by 21 December 2020. The regulation, which applies to all EU member states – regardless of the status of DAB in each country – also gives countries the opportunity to introduce new legislation regarding consumer receivers.

Several EU countries, including Germany, the UK and Italy, have already introduced regulations to implement the EECC directive into national legislation. In Germany, all radio receivers in new cars will be required to include digital radio capabilities from 21 December 2020. In the UK, all radios fitted in new passenger cars will come with digital radio as standard from 2021, following new regulations passed through Parliament. In Italy, all new (consumer and automotive) radio receivers sold from January 2020 onwards are required to include DAB+. In France, a proposal requiring all new car radios to include digital radio capabilities – in line with the EECC deadline – is being reviewed by parliament. Spain recently published a draft of its Telecoms Regulation, which also complies with the EECC. Other countries, including The Netherlands, Belgium, Denmark, Sweden, Austria, Greece, Czech Republic, Poland and Malta, have all initiated procedures to implement the EECC into national legislation.

Patrick Hannon, President of WorldDAB, said: "The EU decision to mandate that all new car radios should be able to receive digital terrestrial broadcasts has transformed the prospects for DAB+ radio in Europe. A growing number of countries are transposing the directive into national law. We urge countries that have yet to implement the EECC to act imminently and help ensure that motorists in all EU Member States benefit from the advantages of digital radio – greater choice, clearer audio and enhanced data services."

Visit https://www.worlddab.org

Logitek’s JET67 AoIP Engine Now with Dante Option

DESIGNED FOR RADIO APPLICATIONS, Logitek's JET67 provides an inclusive option for broadcasters who need access to various forms of audio networking. Its onboard AES67, Ravenna, Livewire and Logitek JetNet networking enables easy interconnectivity with most other equipment in the studio; stations that also operate Dante-enabled equipment can now access this capability via an extra-cost module. The power behind Logitek's new mixIT touchscreen consoles, the 1RU JET67 provides multiple analog and digital inputs and outputs, mic inputs with phantom power, multiple mix-minus buses, and EQ/dynamics control along with routing functions. Logitek's advanced touchscreen console, Helix, may also be operated with JET67, although some features, such as profanity delays, are not available. "We designed JET67 to take care of everything a small to medium market radio broadcaster will need, with no hidden surprises in the equipment costs," said Tag Borland, Logitek President. "Other budget-minded mixing and routing engines require the purchase of external microphone processors, dynamics processing or even networking options. JET67 has all of these and, when paired with the mixIT surface, brings a highly versatile touchscreen console to even the smallest operation." Visit https://logitekaudio.com




WorldDAB Urges Priority on the Visual for Car Radio

WORLDDAB HAS LAUNCHED a campaign with the support of its Automotive User Experience (UX) Group members to encourage broadcasters to use their visual assets to keep digital radio prominent in car dashboards.

The campaign underlines the important role visual information now plays in providing a positive digital radio experience for drivers – and it offers guidance to broadcasters on how to use information they already have, in the form of metadata, to provide a richer experience for the driver. Metadata enables visual information, text and graphics (e.g. station name and logo, presenter, song title and album artwork) to be displayed on the dashboard while a specific station is playing, as well as the development of a good hybrid radio user experience.

"Car manufacturers need the confidence that broadcasters are going to provide metadata, and that in turn will ensure that they prioritise the radio user experience in their cars," said Laurence Harrison, Chairman of the WorldDAB Automotive Working Group. "As car dashboard screens get even bigger, radio station metadata will be even more important to power a rich user experience."

To support the campaign, WorldDAB has produced a number of resources to help broadcasters, including:

• An animated video explaining why it's important for broadcasters' stations to have a visual presence in the car, and how this can significantly impact the digital radio experience for drivers;
• An information sheet aimed at senior radio managers and those working at a more technical level, covering: the exact type of metadata the car industry requires; why metadata is important for broadcasters and drivers today; how broadcasters can effectively provide metadata; and metadata requirements and other technical information; and

• A video presentation to promote the use of metadata more widely within the broadcasting industry and raise awareness of the importance of metadata.

The campaign is part of WorldDAB's ongoing work to improve the user experience of digital radio in the car. To find out more, contact metadata@worlddab.org
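The exact formats broadcasters must supply are set out in WorldDAB's information sheet and the relevant DAB and hybrid-radio specifications; purely as a hypothetical illustration of the kinds of fields the article lists – station name and logo, presenter, song title and album artwork – a "now playing" record for a dashboard might be assembled like this (all names and URLs below are invented for illustration):

# Hypothetical "now playing" metadata record for a dashboard display.
# Field names are illustrative only; the actual formats are defined in the
# specifications referenced by WorldDAB's information sheet.
now_playing = {
    "station": {
        "name": "Example FM",
        "logo_url": "https://example.invalid/logos/example-fm-600x600.png",
    },
    "programme": {
        "title": "Drive Time",
        "presenter": "A. Presenter",
    },
    "item": {
        "song_title": "Example Song",
        "artist": "Example Artist",
        "artwork_url": "https://example.invalid/art/example-song.jpg",
    },
}

def dashboard_line(meta: dict) -> str:
    """Compose the single line of text a car head unit might scroll."""
    item = meta["item"]
    return f'{meta["station"]["name"]}: {item["artist"]} - {item["song_title"]}'

if __name__ == "__main__":
    print(dashboard_line(now_playing))   # Example FM: Example Artist - Example Song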

GatesAir Integrates AOIP Transport within Radio TX

GATESAIR HAS ANNOUNCED two new Intraplex IP Link Audio over IP innovations. The IP Link 100e is the Intraplex family's first modular plug-in card built for integration within radio transmitters, while the IP Link 100c is a new compact hardware codec built for remote contribution and standard STL IP connections.

The IP Link 100e is purpose-built to receive FM and digital radio content directly within GatesAir Flexiva transmitters. The module is added to Flexiva FAX exciters to reliably receive and feed AES67 and other Audio over IP formats direct to the exciter. The smaller, integrated form factor reduces the cost of using Intraplex Audio over IP transport at the transmitter site, since no separate hardware codec is required, and frees a 1RU equipment rack slot for auxiliary equipment. The module retains all functionality associated with IP Link codecs. In addition to GatesAir's traditional robust connectivity, the IP Link 100e builds GatesAir's industry-first Dynamic Stream Splicing (DSS) software into the module.

DSS software sends multiple identical streams over the same network or two separate paths, and each stream borrows data from companion streams to avoid service interruptions from packet loss. The IP Link 100e also supports the SRT (secure reliable transport) protocol and provides failover service to Icecast or locally stored audio for optimal reliability. The module also provides storage for program content and full duplex capability, allowing engineers to monitor signals off-air. The IP Link 100c also adopts a cost-reducing strategy but within a standard hardware codec. GatesAir has halved the form factor to better serve portable applications, providing a half-rack-unit footprint ideal for remote broadcast and studio-to-studio applications. The codec design includes a DC power supply, allowing broadcasters to quickly plug in and stream

program audio for sports contribution, live remotes and news coverage, for example. The IP Link 100c is also suitable for STL service, particularly as an affordable backup for primary STL connections, or for delivery to Icecast streaming servers. As with the IP Link 100e, this codec integrates standard Intraplex features such as Dynamic Stream Splicing software, SRT protocol support, and three separate network ports. Visit www.gatesair.com
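GatesAir does not publish the internals of Dynamic Stream Splicing, but the dual-path idea it describes can be illustrated with a minimal sketch: packets carry sequence numbers, and the receiver keeps whichever copy of each packet arrives first, so a packet lost on one path is “borrowed” from the companion path. This is an assumption-laden simplification in the spirit of schemes such as DSS or SMPTE ST 2022-7, not the actual implementation.

```python
# Minimal illustration of dual-path packet redundancy -- not GatesAir's DSS code.
# Each packet is (sequence_number, payload); the first arrival for a given
# sequence number wins, so gaps on one path are filled from the other.

def merge_redundant_streams(path_a, path_b):
    received = {}
    for seq, payload in list(path_a) + list(path_b):
        received.setdefault(seq, payload)          # first arrival wins
    return [received[seq] for seq in sorted(received)]

# Example: packet 2 is lost on path A and packet 4 on path B,
# yet the merged output is complete.
path_a = [(1, "a1"), (3, "a3"), (4, "a4")]
path_b = [(1, "b1"), (2, "b2"), (3, "b3")]
print(merge_redundant_streams(path_a, path_b))     # ['a1', 'b2', 'a3', 'a4']
```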

News Production and Playout from CGI


CGI’S NEWSBOARD enables journalists and editorial teams to organise story-centric production processes directly from anywhere in the world they are working. It uses a customisable and open widget architecture that allows newsrooms to stay flexible to workflow adaptations, and lets journalists and editorial teams organise the story creation process from research to cross-media planning and distribution of news. All departments gain complete visibility into topics, while information from integrated widgets, such as Google Maps, social media, web CMS and more, helps ensure journalistic teams are on top of the story.


Meanwhile, with StudioDirector 2.0, MOS integration is extended out from the NRCS to studio automation software, enabling a whole range of time-saving tasks to be completed directly from the running order. Each studio layout for a specific show is defined within separate models that contain all mandatory graphics elements, video clips, camera, live, microphone and other directives, and the essential MOS commands for running studio automation. Journalists and directors simply select the desired studio layout for their stories in a show from a predefined list of models. Repetitive tasks, such as insertions and coding the story with the proper automation commands (graphics, camera directives etc.), are all managed by StudioDirector.

Recognising there are more self-operated studios in existence than ever before as the media landscape shifts, the dira OnAir Player is effectively an all-in-one studio command centre. Its modular interface and features have been optimised to give solo radio DJs unmatched control over the content and presentation of their broadcasts, even in extremely demanding environments such as OB. Tasks such as adding an audio or live feed to a schedule, creating a mix between upcoming songs, or editing a text have all been designed to take as little time as possible, with most everyday jobs contained within the single platform.

Visit http://www.cgi.com/mediasolutions



Making the Remote Broadcasting Connection with Digigram

THE LATEST ADDITION to Digigram’s IQOYA range of IP-based solutions, IQOYA Connect is a cloud-based service designed for broadcasters to manage and monitor their fleet of codecs and comms, whether they are located in the studio or at a remote location. The service works with Digigram’s IQOYA Talk portable IP codec, IQOYA Mobile+Q-Mic smartphone codec app, IQOYA X/Link remote studio codec and IQOYA Guest browser-based remote interview solution, as well as a number of other Digigram and third-party solutions.

What distinguishes the platform is that it enables the codec equipment to be accessed via two different user interfaces - one for journalists/production staff and the other for engineering/technical personnel - according to user account permissions.

According to Christian Bartl, Presales Engineer at Digigram in Singapore, “It’s a SIP infrastructure with a session portal controller and measurements in the background, all available as a Cloud service running in different locations around the world. We are hosting these services, so we are basically a huge SIP infrastructure that we use to connect all our hardware codecs or software codecs.”

IQOYA Connect’s technical or fleet management view.

C+T: Where are the points of presence? “At the moment, it’s in France and it’s in Australia.”

What does the workflow involve? “Basically, it’s a Cloud service, so you just log in to the devices with an account, the devices automatically connect to the Cloud service, and once they are connected you can control and manage all the devices from this Cloud service.

“Once you are logged in, you manage everything from the web interface for IQOYA Connect; you can manage the device settings directly. There are two different philosophies. One is the studio codecs, where you can prepare the setup and the audio configuration within the Cloud service and then push these settings to the stationary codecs. The other is for the remote codecs, which is more user-centric than technical-centric. Every user can prepare a kind of pre-set for themselves, and once that user takes, let’s say, the IQOYA Talk and logs in to the device with their own user account, it automatically pulls the configuration made within the Cloud service down to the device. All the settings for your microphones and all your audio monitoring are tied to your user account and are pulled to the device directly once you log in.”

Is latency an issue? “Of course there is latency, because we’re talking about Internet connections, so it always depends on the Internet connection and the service provider - how are you connected, are you connected directly? There are latencies that we cannot influence, but it’s not comparable with video codecs, where we talk about three to ten seconds of latency; here it’s more like under a second. It can be optimised to under a quarter of a second or so, so you really have the chance to optimise the connection or the parameters.

“If we talk about the IQOYA Talk, you basically have two codecs there. One is dedicated to the program and one to the talkback. For the program, the intention is to have the most reliable connection. If we say, ‘Okay, we have a bigger buffer so we can compensate for any network congestion’, then we intentionally increase the latency. But for the talkback channel we say, ‘Okay, we don’t need this reliability; we don’t care if maybe one millisecond of the audio is not transmitted’, so we can optimise that connection more for latency.”

If you have multiple contributors, or you’re in a multiple interview situation, how does it work in that sort of case? “Well, basically, you are in your studio and, let’s say, you have eight or more studio codecs there, or the SERV/LINK, which is our multichannel codec, where you can connect up to 32 contributors. It means that if you have one of the big SERV/LINK devices, it can connect up to 16 Talks, each with a program and a talkback channel, and then you use Audio-over-IP to connect to your audio mixer. Your audio mixer can be the central point to connect and coordinate all the sources, and can also connect to a talkback system that takes over the coordination of all the talkback channels. This is how you handle all the sources: your mixer is still your central operation tool in the studio, and you have all the sources available there in the same way as you would have your microphones if you were doing it locally.”

The view for production and on-air staff within IQOYA Connect.

And you can vary the bit rate to compensate for poor connections? “Exactly. At the moment you can adjust the connection according to the quality. The next step, which we are bringing now, is to have a kind of intelligence behind this to adjust the connection automatically - to always have the optimum quality while still keeping a reliable connection.”

And what about security? “When we talk about how we did remote broadcasting in the past, it was really about making direct connections - opening firewalls, setting port forwardings, and all those kinds of things that compromise security. This is what we tackled with IQOYA Connect with this session portal control infrastructure, to optimise security for our clients so that we can provide the most secure connection without compromising their networks. Especially in broadcast it is very tricky - this kind of onion principle in the networks - so we really accommodated for this to provide the most reliable security that we can provide with Cloud services.”

So, more secure than a Zoom meeting? “I would definitely expect so.”
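The buffer-versus-latency trade-off described above can be sketched in a few lines. This is illustrative only, not Digigram’s implementation, and the buffer depths shown are hypothetical: a deeper receive (jitter) buffer rides out more network jitter but adds its full depth to end-to-end delay, which is why a programme channel is typically buffered more generously than a talkback channel.

```python
# Illustrative jitter-buffer trade-off -- not Digigram's IQOYA code.

def added_latency_ms(buffer_ms):
    """The receive buffer contributes roughly its full depth to end-to-end latency."""
    return buffer_ms

def survives_jitter(buffer_ms, worst_case_jitter_ms):
    """Audio stays uninterrupted only while jitter stays inside the buffer."""
    return worst_case_jitter_ms <= buffer_ms

profiles = {"programme": 250, "talkback": 40}   # hypothetical buffer depths, ms
for name, depth in profiles.items():
    print(f"{name}: +{added_latency_ms(depth)} ms latency, "
          f"rides out 120 ms of jitter: {survives_jitter(depth, 120)}")
```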


CONTENT DELIVERY Terrestrial, Mobile, Broadband

www.content-technology.com/transmission

Telstra Extends Global Media Network to China and India, Partners with Zixi

TELSTRA HAS ANNOUNCED the expansion of its Telstra Global Media Network (GMN) into China and India, adding to the company’s capabilities in two of the world’s largest media markets. “The points of presence in India and China have been strategically selected to provide connectivity to media broadcasters in the region,” comments Andreas Eriksson, Head of Telstra Broadcast Services. “These PoPs have been designed to provide exactly the degree of high availability and low latency required to meet professional media standards for live events, sports, esports, gaming and more.

“The partnership with Zixi will also assist us to deliver high quality content to internet-connected media buyers using Telstra’s high-capacity internet peering arrangements and cloud infrastructure.”

Telstra has been operating and delivering telecommunication services in India since 1995. The Telstra GMN builds on the company’s existing connectivity in the region and is connected to new points of presence in Mumbai and Chennai. These are positioned to transport sports, esports, gaming and entertainment content both in and out of what is the world’s second-largest telecommunications market in terms of subscribers.

Telstra has operated in China since 1989. Through its joint venture Telstra PBS, it was the first foreign company licensed to provide connectivity and network services on the mainland. It already operates data networks in 38 key cities and now has five data centres in China, the expansion adding to its Hong Kong data centres, Stanley teleport and master control room. The Telstra GMN is connected to points of presence in Beijing and Shanghai.

The PoPs in each city — Beijing, Shanghai, Mumbai and Chennai — are sited in two separate data centres to provide both A and B paths for full redundancy.

In addition, Telstra has announced a strategic partnership with Zixi to provide stable, secure and cost-effective distribution of content over IP to Telstra’s customers and partners worldwide as part of the Telstra GMN. The Telstra Global Media Network distributes high-quality, high-value, live and linear video on a consumption-based business model and is an expansion of Telstra’s international fibre, satellite and partner networks, providing a gateway, including data between remote locations and company headquarters, to improve operations and workflows.

The addition of the Software Defined Video Platform, with the Zixi Protocol and 16 other supported protocols and the ZEN Master Control Plane, to the GMN connects on-network media rights holders to off-network media buyers by transporting high-quality linear video using cloud infrastructure in a highly secure manner via content networks. This new integrated service expands the GMN to broadcasters in locations where fibre or satellite is not available for news and entertainment broadcasters, sports leagues and esports organisations.

“These are hugely exciting expansions for us globally and a significant boost to our goal of connecting Europe to Asia and Asia to Europe,” says Mr Eriksson. “India and China are both vast, increasingly sophisticated markets with a growing desire for international content, and by strengthening our presence in each of them we enable media businesses to dramatically expand their reach into both countries.

“We are looking to expand into more regions with a variety of partner alliances and pending points of presence, to ensure our customers thrive on connected networks.” Visit www.telstra.com

China Mobile’s MIGU Picks VisualOn for Video Streaming Solutions

VISUALON, A STREAMING solutions provider and pioneer of VR technology, has announced a partnership with China Mobile Communications Corporation’s subsidiary MIGU Video Co., Ltd. to transform the mobile video experience using technologies including the VisualOn MultiStream Sync feature, multi-angle view, augmented and virtual reality (VR), and low latency live streaming.

China Mobile is the world’s largest telecom operator, with more than 940 million customers, including over 55 million 5G customers. MIGU has the rights to the largest digital content library in China and has partnerships with many leading global content providers and sports leagues, such as the National Basketball Association. Enhancing the user experience through innovative playback features will help China Mobile continue to grow its 5G subscriber base.

VisualOn is providing custom development services as an extension of China Mobile’s R&D team to bring specialised playback expertise to the project. Together, they will be deploying MultiStream Sync on the MIGU video platform, which allows users to view multiple camera angles on the same screen and seamlessly switch between the streams to choose the optimal, customised viewing experience. Additionally, they are pioneering the real-world applications of VR through five key areas: live broadcasting, education, gaming, sports, and movies. VisualOn and VLA VR will enable China Mobile to stream high-quality, low latency, ultra-reliable VR content for MIGU users on VR devices, mobile phones, tablets, and Android TV. The significant improvements in latency and bandwidth offered by 5G make these types of advanced features a reality and provide a clear value proposition for subscribers to upgrade.

“China Mobile has very ambitious plans for MIGU as a way to revolutionise the consumption of streaming content,” said Michael Jones, Head of Product and Business Development at VisualOn. “Deploying MultiStream Sync, as well as augmented and virtual reality support, offers something that’s really ahead of the curve in the industry. Furthermore, we are in a unique position to help China Mobile develop the next generation of multi-angle and multi-view technology that goes beyond anything seen so far. We are thrilled to work with a leader in the wireless space to build and deploy these solutions.” Visit www.visualon.com

Village Island Brings JPEG-XS to its VICO Converters

INTOPIX AND VILLAGE ISLAND have announced an agreement to integrate JPEG-XS technology into Village Island’s VICO products. The VICO converters deployed in Japan and worldwide for UHD visually lossless compression over SDI, fibre and IP were already powered by intoPIX’s TICO RDD35 technology and will now include the TICO-XS technology option. The upgrade will enable a higher compression ratio and better video quality while keeping zero latency.

The upgrade will allow more throughput on current SDI-based installations and provide an avenue to switch to full IP transport using the ST 2110 standard in combination with JPEG-XS compression.

The new VICO options will power higher compression and ultra-low delay with JPEG-XS while preserving visually lossless quality, enabling even multiple UHD 50/60p video streams to be transported over single 3G-SDI, SFP+, ST 2110 and ST 2022-6 networks. Based on the fully standardised High Profile of JPEG-XS, it will further decrease latency down to 20 video lines, and further increase the compression ratio compared to the TICO-RDD35 version. Visit https://www.intopix.com and www.village-island.com
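As a rough back-of-the-envelope check on why mezzanine compression such as JPEG-XS is needed to carry multiple UHD streams over a 3G-class link (the figures below are generic arithmetic assumptions, not intoPIX or Village Island numbers):

```python
# Back-of-the-envelope: compression ratio needed to move UHD video over a
# single 3G-SDI-class link. Illustrative only; ignores blanking and overhead.

def uncompressed_bitrate_gbps(width, height, fps, bits_per_pixel):
    """Active-video bitrate, ignoring blanking and transport overhead."""
    return width * height * fps * bits_per_pixel / 1e9

uhd_p60 = uncompressed_bitrate_gbps(3840, 2160, 60, 20)  # 10-bit 4:2:2 ~= 20 bpp
link_capacity_gbps = 2.97                                # approx. 3G-SDI payload

for num_streams in (1, 2, 4):
    required_ratio = (uhd_p60 * num_streams) / link_capacity_gbps
    print(f"{num_streams} x UHDp60 (~{uhd_p60:.1f} Gb/s each): "
          f"needs roughly {required_ratio:.1f}:1 compression")
```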





Etere Provides RTMP Input and Integrated Time Delay

THE 4K-READY Channel-in-a-Box solution Etere ETX is designed to deliver audio and video content via the Internet, giving users greater freedom when it comes to their live streaming setup with the option to select an RTMP stream as an input source.

An integrated time delay feature allows the flexibility to configure a delay between preview and real-time. The time delay technology of CensorMX is now integrated with both Etere ETX and Etere CensorMX. Time delay can be used in live broadcast to prevent unacceptable content such as profanity and bloopers from making it to air, and it also enables multi-language live commentary. Etere allows users to customise the details of their time delay, including:

• Set the number of frames to delay the stream for frame accuracy.
• Adjust compression quality.
• Set a time interval for the synchronisation of delayed video.

Etere ETX provides synergy between playout, ingest, automation, master control and graphics environments. It supports SDI, IP and NDI transmission. Etere ETX integrates all the features needed to bring a channel on-air, including closed caption and live subtitling insertion, graphics insertion on up to eight layers, integration with the Etere Master Control touch screen panel for the simultaneous management of multiple channels, virtual machine capabilities and cloud support. It can also be integrated with Etere’s ETX-M Multiviewer, which lets you monitor unlimited video sources on a single display.

Real-Time Messaging Protocol (RTMP) is an Adobe proprietary protocol for the high-speed streaming of audio, video and data over the internet. Based on Transmission Control Protocol (TCP), RTMP provides optimisation of video and audio data transmission. In RTMP streaming, the server divides the information into segments to enable a smooth display with as much data delivery as possible. While Etere is not a streaming server, users can send RTMP streaming from Etere to an RTMP streaming server. Additionally, it is also possible to receive RTMP streaming through a remote URL and play it out in Etere. Visit https://www.etere.com

Monitor and Diagnose ST 2110 IP Broadcast Production Video Networks

THE NEW INSPECT 2110 Probe from Telestream monitors ST 2110 video streams across a production or contribution network to make sure they are present and working as expected. A notification will be triggered if the format, video, audio or data has errors or differs from the SDP file. Thumbnails provide visual indication of program status, and a penalty box provides immediate visibility of error conditions for diagnostics.

Inspect 2110 features:

• Up to 100Gbps of monitoring capacity for up to 100 ST 2110 essence streams
• Confirm video streams are present and correct, including format, video, audio and data
• Ensure redundant video streams are the same and healthy
• Check that PTP synchronisation is correct and operational
• View network traffic and program flow through the network
• Link to PRISM to enable detailed analysis
• IP video monitoring

PTP timing and synchronisation is critical for IP video networking, and Inspect 2110 verifies that synchronisation is operating and correctly synced across video, audio and data streams. Inspect 2110 provides PTP status monitoring and PTP metrics reporting, and simplifies automated detection of PTP issues and diagnostics, saving time and reducing errors.

Users can easily view program details including format, video and audio, as well as A/B redundant stream comparison and health checks. Inspect 2110 provides support for IP standards including ST 2110-10, ST 2110-20, ST 2110-21, ST 2110-30, ST 2110-40 and ST 2022-7. A direct link to the PRISM Media Analyzer provides deep video waveform analysis, with multiple PRISM units able to be linked to any Inspect 2110 Probe.

Inspect 2110 software uses a high-performance container-based architecture supporting up to dual 100G Ethernet in a server configuration now, plus virtual and cloud configurations near term. Visit http://www.telestream.net
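The core check a probe of this kind performs, comparing a received essence against what its SDP advertises, can be sketched as follows. This is purely illustrative and is not Telestream’s Inspect 2110 logic or a real SDP parser; the field names and values are hypothetical.

```python
# Minimal sketch of comparing measured ST 2110 stream parameters against the
# values declared in its SDP. Illustrative only; not a vendor implementation.

EXPECTED_FROM_SDP = {          # hypothetical values read from an SDP file
    "sampling": "YCbCr-4:2:2",
    "depth": 10,
    "width": 1920,
    "height": 1080,
    "exactframerate": "60000/1001",
}

def check_stream(measured, expected=EXPECTED_FROM_SDP):
    """Return a list of mismatches; an empty list means the stream matches."""
    return [f"{key}: expected {expected[key]}, got {measured.get(key)}"
            for key in expected if measured.get(key) != expected[key]]

measured = {"sampling": "YCbCr-4:2:2", "depth": 10,
            "width": 1920, "height": 1080, "exactframerate": "50"}
for line in check_stream(measured) or ["stream matches SDP"]:
    print(line)                # flags the frame-rate mismatch
```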


Magewell Adds NDI|HX Support


MAGEWELL HAS ANNOUNCED new updates for the company’s Pro Convert for NDI to HDMI and Pro Convert for NDI to HDMI 4K standalone IP decoders that further expand the devices’ support for NewTek’s evolving NDI media-over-IP technology. The free upgrades add compatibility with the high-efficiency, lower-bitrate NDI|HX mode in NDI 4, complementing the decoders’ existing support for full-bandwidth NDI streams.

Magewell’s Pro Convert NDI encoders and decoders let users reliably bring traditional video signals into and out of IP-based production and distribution workflows, enabling existing sources and displays to work seamlessly in next-generation media infrastructures. The Pro Convert for NDI to HDMI and Pro Convert for NDI to HDMI 4K decode NDI input streams for output to HDMI monitors, projectors, and production or distribution equipment. In addition to NDI technology, the decoders also support SRT, RTSP, RTMP, UDP, RTP and HTTP (HLS) streams with H.264 or H.265 compression.

While full-bandwidth NDI offers the highest quality and lowest latency, the bitrate-efficient NDI|HX mode supports full-resolution, full frame-rate video delivery over wireless and limited-bandwidth networks. In addition to supporting the NDI|HX technology in version 4 of NewTek’s NDI Embedded SDK, the Pro Convert decoders can be configured for compatibility with earlier NDI|HX variants implemented in some of the first NDI-compatible PTZ cameras and accessories. The update also incorporates other new NDI 4 features, including Discovery Server support, enabling the use of NDI devices across multiple network segments.

The Pro Convert for NDI to HDMI 4K supports NDI inputs up to 4096 × 2160 at full 60 frames per second, while the Pro Convert for NDI to HDMI decodes 1080p60 and 2K sources. Both models automatically optimise output parameters to match the capabilities of the connected HDMI display, using FPGA-based video processing to perform high-quality up/down-conversion between HD and 4K. Visit http://www.magewell.com
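The output-matching behaviour described above can be approximated in a few lines. This is a generic illustration under assumed display capability lists, not Magewell’s firmware logic: the decoder down-converts a 4K stream for an HD-only monitor and passes through or up-converts as needed for a 4K display.

```python
# Generic sketch of matching decoder output to display capability -- not
# Magewell's implementation; capability lists are hypothetical EDID data.

DISPLAY_MODES = {
    "hd_monitor": [(1920, 1080)],
    "uhd_monitor": [(3840, 2160), (1920, 1080)],
}

def pick_output(stream_resolution, display):
    """Prefer the stream's native resolution, otherwise the display's best mode."""
    modes = DISPLAY_MODES[display]
    return stream_resolution if stream_resolution in modes else modes[0]

print(pick_output((3840, 2160), "hd_monitor"))   # -> (1920, 1080): down-convert
print(pick_output((1920, 1080), "uhd_monitor"))  # -> (1920, 1080): pass through
```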



Signiant Issued Patent for Transport Architecture, Recruits Aspera CTO

SIGNIANT HAS ANNOUNCED it has been issued patent number 10,735,516 by the U.S. Patent and Trademark Office for its invention titled “Cloud-Based Authority To Enhance Point-To-Point Data Transfer With Machine Learning.” The patent describes methods and systems that incorporate machine learning to evaluate anonymised historical transfer information collected by its SaaS products. This history is used as training data to build a model that predicts the fastest possible way to move or access media.

Signiant’s intelligent transport chooses the optimal number of parallel transport streams and selects either standard TCP or Signiant’s proprietary UDP-based acceleration protocol based on a variety of inputs including current and historical network conditions, available compute, storage type and the characteristics of the data set. The new architecture is capable of multiple Gbps transfer speeds and is already deployed in Signiant’s SDCX (Software-Defined Content Exchange) SaaS platform for transfers between public and/or private cloud services and/or storage. The platform is capable of working with any size file, data sets with an unlimited number of files, as well as growing files and streams.

“When early transfer tests showed speeds greater than 10 Gbps using settings derived through machine learning that weren’t intuitive or obvious, we knew we were onto something disruptive,” said Signiant CTO Ian Hamilton. “As networks evolve and increase in bandwidth, Signiant’s transport becomes even more valuable and is now capable of speeds of 10s of Gbps. Standard transfer tools simply aren’t designed to take advantage of available bandwidth, and so with today’s fatter pipes this new architecture becomes even more critical as datasets get bigger and workflows are increasingly globally distributed.”

New Signiant Chief Solutions Officer Mike Flathers

In order to take full advantage of available bandwidth, Signiant’s machine learning algorithm examines past history and optimally configures application-level and transport-level parameters for both file and live media transfers. Not only does this ensure an optimum result without expensive and error prone manual tuning and tooling, but results improve over time as the system learns. In addition to Signiant’s patent-pending intelligent transport, Signiant’s core UDP-acceleration protocol offers distinct advantages. Signiant isolates different sources of congestion by looking at latency and packet loss, and also by constantly examining the rate of change in these observations. As such it can differentiate between edge and core network congestion and react accordingly. Signiant Inc. has also announced that Mike Flathers is joining the company in the newly created role of Chief Solutions Officer. He will be a member of Signiant’s senior leadership team and report directly to CEO Margaret Craig. Flathers joins Signiant from IBM Aspera, where he was most recently Aspera CTO and an IBM Distinguished Engineer. An expert in IP networking technology, he spent 8 years at Aspera and remained with the company after its acquisition by IBM. Earlier in his career he held positions at Novell and Sorenson Media. Visit https://www.signiant.com
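Signiant does not disclose its learned models, so the following is a deliberately naive sketch of the general idea of selecting transport settings from observed network conditions. The thresholds and the stream-count heuristic are invented for illustration and are not Signiant’s patented, machine-learning-based method.

```python
# Simplified, illustrative transport selection -- NOT Signiant's algorithm.
# High latency or packet loss pushes the choice toward a UDP-based
# acceleration protocol; parallel streams help fill long, fat pipes.

def choose_transport(rtt_ms, loss_pct, bandwidth_mbps):
    """Pick a protocol and parallel-stream count from coarse heuristics."""
    protocol = "accelerated-udp" if (rtt_ms > 50 or loss_pct > 0.5) else "tcp"
    streams = max(1, min(16, int(bandwidth_mbps * rtt_ms / 10_000)))
    return {"protocol": protocol, "parallel_streams": streams}

print(choose_transport(rtt_ms=5, loss_pct=0.0, bandwidth_mbps=1_000))
print(choose_transport(rtt_ms=180, loss_pct=1.2, bandwidth_mbps=10_000))
```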

Cinegy Multiviewer 15.2

CINEGY GMBH HAS RELEASED the latest version of Cinegy Multiviewer, 15.2, designed for on-premise and remote monitoring applications. Streams from satellites, camera feeds, playout devices and other local or remote sources all need to be monitored. Cinegy Multiviewer displays and analyses these signals, raising alerts for any detected signal problems. Running as a service on commodity IT equipment, video streams can be received over IP (e.g. UDP/RTP, NDI, SRT) via Ethernet or using standard SDI cards.

Features introduced include support for the latest version of NDI, the use of Cinegy Encode as an SDI output option, SRT-encapsulated IP input and output streams, and support for encrypted SRT streaming on input. Multiviewer version 15.2 also offers further enhancements to its SRT support, with additional SRT output encryption along with DNS resolution for SRT URL addresses.

Cinegy Multiviewer 15.2 also fully supports 8K workflows via IP or SDI. Cinegy’s 8K capabilities were first deployed in 2015 with the release of its Daniel2 codec, which at the time was capable of decoding 16K video at 280 fps with an Nvidia Quadro M6000. Cinegy has further optimised and integrated Daniel2, making it ubiquitous throughout the software product range. Cinegy Multiviewer joined Cinegy Capture PRO and Cinegy Air PRO as 8K-capable software broadcast solutions, via IP (SRT/RTP/UDP), SDI and NDI (e.g. BMD DeckLink 8K Pro), when the major v15 update was released earlier this year. Visit https://www.cinegy.com

V-Nova and NETINT Enhance Transcoding

V-NOVA AND NETINT Technologies have announced a collaborative roadmap that accelerates V-Nova’s MPEG-5 LCEVC (Low Complexity Enhancement Video Coding) market deployment while further improving the performance of NETINT’s encoding technology.

NETINT Technologies launched the Codensity T408 video transcoder platform featuring real-time, high-quality, scalable H.264/H.265 ASIC-based transcoding for live video streaming at up to 8K. Packaged in a 2.5-inch U.2 module form factor for easy field upgrades, the T408 features sub-frame encoding latency that ensures an immersive experience for interactive video services, plus ultra-low-power, high-density encoding for edge-based deployments.

MPEG-5 LCEVC, the latest MPEG standard, is a codec enhancement technology designed to enhance the video streaming performance of existing and future codecs. V-Nova LCEVC is a multi-platform software library for MPEG-5 LCEVC that improves the compression quality of any base video codec by up to 50 percent whilst simultaneously accelerating encoding by up to 4x. Any playback device supporting the underlying codec can be quickly upgraded in software to gain the benefits.

Leveraging V-Nova’s LCEVC, the T408 transcoder module’s performance will be enhanced with increased encoding quality and throughput. These improvements will provide access to V-Nova’s LCEVC technology for a wider market, increasing the viability of new emerging services including cloud mobile gaming, social mobile video, and live streaming of UHDTV content. Visit https://www.v-nova.com and http://www.netint.ca

TAG Video Systems Supports CMAF Streaming

TAG VIDEO SYSTEMS, the developer of monitoring and multiviewing solutions, has announced support for the Common Media Application Format (CMAF). The move will allow OTT customers to take advantage of more diverse platforms, simplified workflows, lower latency and reduced cost, while adding flexibility and versatility to TAG’s software platform. CMAF, the result of a collaboration between Microsoft and Apple, is an open, extensible standard that enables efficient streaming over the HLS and MPEG-DASH protocols. CMAF cuts costs, minimises workflow complexity associated with delivering video online, and reduces latency – all crucial issues that content owners and broadcasters face while streaming live or on-demand content. TAG also added support for Microsoft Smooth Streaming, a legacy hybrid media delivery method that acts like streaming but is based on HTTP progressive download. Visit https://www.tagvs.com and www.magnasys.tv




Embrionix and the Riedel Connection

C+T talks to I/O pioneer Renaud Lavoie, founder of Embrionix, now part of Riedel.

LAUNCHED JUST OVER TEN YEARS AGO, Canadian company Embrionix was formed with the goal of making the lives of manufacturers and users easier by allowing them to use different types of I/O - be it co-ax, fibre or HDMI - with the same product, using the concept of SFP (small form-factor pluggable) devices from the world of datacoms. Over a ten-year period, Embrionix shipped around 700,000 SFPs in the SDI world alone, to a mix of OEM and end-user customers.

As the broadcast world started to embrace IP standards, the company found its use of the SFP format perfect for the world of gateways and switches. According to Embrionix Founder Renaud Lavoie, “We were really ahead of the curve for that and we wanted to really differentiate ourselves by doing a miniature gateway, if you want. Instead of having a big card and frame, we created a gateway that was the size of your thumb, and this thumb can fit directly in a switch, so you avoid all the power supplies and failure blocks. We were really successful.”

The company was so successful, in fact, that in November 2019 it was acquired by comms and networking innovator Riedel, a longtime Embrionix OEM customer, to form a North American development hub for the company.

What was the compelling reason for the acquisition? What made sense? Renaud Lavoie: “We wanted to bring more value to, let’s say, a ‘big brother’, and I’m still convinced the best fit was with Riedel. They had this MediorNet, super reliable transport, and on the other end we had this IP advance and innovation, and they really enjoyed that innovation as well. It was really a perfect fit combining the two forces.”

What’s the plan going forward with the integration? “Other than bringing them into the Riedel identity - the SFPs are now called MuoN and FusioN - they go in line with the next generation of MediorNet. We are now called the MediorNet division. We combined the strength of Montreal and Vienna, and now we have a bigger R&D force to deliver, again, top-of-the-line products that are also really full-featured. In fact, the three new products we will introduce in August will cover J2K and JPEG-XS, and will do SDI to JPEG-XS or J2K or 2110. You can see that it’s an IP-to-IP product.

“One cool thing we also did is fit a 16-image multiviewer in an SFP - that’s 1 centimetre by 3 centimetres - and now you have a 2110-to-2110 UHD output multiviewer. You can take 16 images into a new UHD output in IP. So, you can basically create a mosaic with a really, really small footprint and low power.”


Given they are relatively new, what are some of the challenges in designing products to match the SMPTE IP standards, and to talk to other manufacturers’ systems?

“Embrionix was the fifth member of the AMWA (Advanced Media Workflow Association), so for discovery and registration we were one of the first to jump in. Then we joined AIMS (Alliance for IP Media Solutions) and also 2110 and Interop. Embrionix has taken part in all the Interops over the past five years - we did not miss a single one - and the last one was at Riedel, which was really interesting as well.

“I think to stay ahead of the curve, first of all, every product is software defined, so we just need a software update and, boom, we follow the new standard. But it was also a commitment from us.

“We wanted to ease the life of users and, honestly, IP, even today, is not an easy task. It’s not a walk in the park like taking an SDI feed or SDI signal, where you take, let’s say, test equipment, plug it in and you’re good to go.

“With IP, where is my signal? I cannot just remove the fibre connection or active cable and plug it into test equipment. This is why we participated in all those Interops, and today, with all the small ‘roadblocks’ we had to overcome, we have more diagnostics inside our parts that help us in those Interops, help us in the lab, but also help users.”

What’s your sense of 12G SDI versus IP and customer adoption? “It really all comes down to scale. Depending on the scale you want to achieve - and of course the budget is an important point - take CBC [Canadian Broadcasting Corporation] as an example. They have thousands of signals, and you don’t see huge 12Gbit routers. They said, ‘One day we want to be UHD, and we don’t want to keep changing the gear.’ 12Gbit also reduces the cable reach, and they have more and more UHD signals. So, for large installations, I will say it makes sense to go to IP.

“For a small island, I can see 12Gbit, but the corona crisis really showed us that with IP you can work from home. I think the corona crisis also made us rethink how we want to do our job, and is it good to have remote people? I think so.”

You mentioned the AIMS group, which has started to focus on the AV sector. Where do you see your products fitting into that? “At the beginning of Embrionix, when we started, we did these SFPs and some were HDMI, and we did a lot of patents around that, so we can see ourselves being an HDMI-to-IP port. The complexity we had when we started was HDCP (High-bandwidth Digital Content Protection), and I’m really happy to see that the AIMS group is talking with the DCP group [Digital Content Protection LLC] to find a way to transport protected content over IP, because this will be key for all this AV over IP. There are standards that provide a way to encrypt the data so no one can listen to it, and HDCP was always a little bit of a pain, but if we figure that out it’s really interesting, and it will fit in both ends of the spectrum.”

What else is on the radar? “We want to look at different types of compression. One aspect is to go more cloud-oriented, so H.264, H.265, maybe NDI - that’s one aspect we are looking at. The other aspect is we did an up/down/cross, but now we want to see whether we can scale the up/down/cross. I can see one SFP supporting, let’s say, two UHD, which would be the best bandwidth optimisation, given everything we’re doing is 25G.

“I can see other functions such as SDR/HDR. Our boxes that you put behind monitors for 2110 are HDR-ready, but more processing - not huge, bulk processing, probably more cuts-oriented, at the granularity we’re working at - I think doing processing in the venues still makes sense with the SFP.

“The processing is certainly one aspect we’re looking at, and a recent part we released was without any I/Os, so it’s kind of an SFP you stick in a switch and there’s no I/O whatsoever - it’s IP-to-IP pure processing, and there’s no gateway in front of an SDI system. Embrionix started, really, from scratch, so there’s no legacy inside our parts; it’s really IP-to-IP. If we say to a customer we go IP-to-IP in Riedel, it’s a real IP-to-IP, it’s not a bridge - there’s no bridge in there.

“So far, since the acquisition - it’s seven months now - it’s going really, really well. The developments are now more and more with Vienna and Montreal and in Wuppertal with the HQ, so we’re seeing the integration taking place. There’s good motivation for us, it’s super interesting, and we have more people in the field to sell the technology, to sell our products, MediorNet IP. I used to say the coronavirus just put some oil on the fire - it accelerated this IP transition. Honestly, that’s the way I see it.” Visit www.riedel.net
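The quick arithmetic behind the “two UHD streams per 25G SFP” remark above is straightforward. The figures below are generic assumptions for uncompressed ST 2110-20-style video (active picture only, ignoring RTP/UDP/IP overhead), not Riedel or Embrionix specifications.

```python
# Rough bandwidth check: can two uncompressed UHD p60 streams share a 25GbE SFP?
# Illustrative arithmetic only; ignores packet headers and ancillary data.

def active_video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

uhd_p60 = active_video_gbps(3840, 2160, 60, 20)   # 10-bit 4:2:2 ~= 20 bpp
link = 25.0                                        # 25GbE line rate, Gb/s

print(f"One UHDp60 stream ~= {uhd_p60:.1f} Gb/s")
print(f"Two streams ~= {2 * uhd_p60:.1f} Gb/s of a {link:.0f} Gb/s link")
```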



BirdDog Introduces Flex Family of NDI Processors

BIRDDOG HAS ANNOUNCED the launch of BirdDog Flex, which the company says are the smallest NDI encoders and decoders on the planet. This new addition to the BirdDog range has a footprint only slightly larger than a credit card, creating the world’s smallest 4K NDI encoders and decoders. With three products available, Flex is suitable for most workflow needs, and carries Tally, Audio, Video, PTZ control, Audio Intercom and Power along a single Ethernet cable. The Flex range consists of the 4K NDI Encoder, 4K NDI Decoder and 4K Backpack, with key features including:

• Adaptive bit rate
• Full PTZ camera control via optional control cable for non-native NDI cameras
• Dante input and output supported
• RESTful API for third-party control and automation
• Camera mounts available as an optional extra
• Central Lite NDI routing software
• Comms Lite audio intercom software
• Cool touch thermals for operation in the hottest environments
• Halo Tally built into the Flex family, with zero configuration required with any NDI-enabled software-based production system

The Flex 4K BACKPACK is designed as an upgrade for a camera-top monitor recorder. Featuring an NP-style battery connection and 15W power output, it allows the user to encode NDI, power a monitor, and record, all at the same time. Visit www.ambertech.com.au

Blackmagic Announces ATEM Mini Pro ISO

BLACKMAGIC DESIGN has announced the ATEM Mini Pro ISO, a new low-cost live production switcher with a 5-stream recording engine that records all video inputs, so a live production can be edited after the event. This allows users to get a clean feed of all inputs and use edit software multi-cam features for later editing. ATEM Mini Pro ISO also records all audio files, media pool graphics and a DaVinci Resolve project file, so a live production can be opened and edited with a single click! Features include four standards-converted HDMI inputs, USB webcam out, an audio mixer with EQ and dynamics, 2D DVE, transitions, green screen chroma key, and 20 stills for titles.

ATEM Mini Pro also includes recording to USB disks in H.264 format, a built-in hardware streaming engine for YouTube Live, Facebook, and Twitch, plus multi-view to see all cameras on a single monitor. The ATEM Mini Pro ISO’s 5-stream recording includes all inputs as clean feeds for editing, plus a DaVinci project file for fast edit turnaround and Blackmagic RAW file relinking for finishing in Ultra HD.

Connections include four HDMI inputs, one HDMI output, two microphone inputs (with power available), an Ethernet port, and a USB-C output. HD Video Input Standards include: 720p50, 720p59.94, 720p60, 1080p23.98, 1080p24, 1080p25, 1080p29.97, 1080p30, 1080p50, 1080p59.94, 1080p60, 1080i50, 1080i59.94, and 1080i60. HD Video Output Standards include: 1080p23.98, 1080p24, 1080p25, 1080p29.97, 1080p30, 1080p50, 1080p59.94, and 1080p60. Video Streaming Standards include: 1080p23.98, 1080p24, 1080p25, 1080p29.97, 1080p30, 1080p50, 1080p59.94, and 1080p60.

HDMI input resolutions from computers handled by the ATEM Mini Pro ISO include: 1280 x 720p at 50Hz, 59.94Hz and 60Hz; 1920 x 1080p at 23.98, 24, 25, 29.97, 30, 50, 59.94 and 60Hz; and 1920 x 1080i at 50, 59.94 and 60Hz. Video sampling is 4:2:2 YUV. Colour precision is 10-bit with Rec 709 colour space. Colourspace conversion is hardware-based and real time.

The ATEM Mini Pro ISO’s 6-input x 2-channel audio mixing capabilities include selectable On/Off/Audio-Follow-Video per channel plus separate gain control per channel; level and peak metering; and Fairlight audio enhancements such as Compressor, Gate, Limiter and 6 bands of parametric EQ, as well as Master gain control.

The ATEM Mini Pro ISO supports direct live streaming over Ethernet using Real Time Messaging Protocol (RTMP). Video recorded via the 4 x HDMI ISO inputs is recorded as H.264 .mp4 files at up to 70Mb/s quality at the ATEM video standard with AAC audio. Programme output is an H.264 .mp4 file at the streaming quality setting and at the ATEM video standard with AAC audio. On the audio recording side, the 6 x 2-channel audio inputs are recorded as separate 24-bit 48kHz .wav files. These inputs include the 2 x analogue stereo audio inputs and 4 x HDMI 2-channel embedded audio inputs. HD multi-view monitoring consists of 1 x 10 views including left/right configurable Program/Preview, the 4 HDMI inputs, Media Player, Streaming Status, Recording Status and Audio Meters.

ATEM Software Control Panel is included free for Mac 10.14 Mojave, Mac 10.15 Catalina or later and Windows 10 64-bit only.

Blackmagic Design has also announced the ATEM Streaming Bridge, a new converter that decodes the live stream from any ATEM Mini Pro model switcher and converts it back to SDI and HDMI video. The advantage of ATEM Streaming Bridge is that broadcasters can use it to connect high quality video links direct from any ATEM Mini Pro studio. ATEM Mini Pro’s video stream is much higher quality than that of conferencing software, so customers get broadcast quality, clean of any streaming software vendor logo burn-in. Visit www.blackmagicdesign.com
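As a rough storage-planning guide based on the bitrate quoted above (up to 70Mb/s per H.264 stream), the arithmetic for an hour of ISO recording looks roughly like this; it is an approximation that ignores the comparatively small audio .wav and project files.

```python
# Rough storage estimate for 5-stream ISO recording at the quoted 70 Mb/s.
# Approximation only; real file sizes depend on content and settings.

def gb_per_hour(bitrate_mbps):
    """Convert a megabit-per-second bitrate into gigabytes per hour."""
    return bitrate_mbps / 8 * 3600 / 1000

iso_streams = 4          # clean feeds of the four HDMI inputs
program_streams = 1      # the switched programme output
per_stream_gb = gb_per_hour(70)

total = (iso_streams + program_streams) * per_stream_gb
print(f"~{per_stream_gb:.1f} GB per stream per hour, "
      f"~{total:.0f} GB per hour for all {iso_streams + program_streams} streams")
```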



CLASSIFIED + EVENTS

C+T AUSTRALIA/NEW ZEALAND FEATURES + DEADLINES 2020

Sept-Oct issue
Editorial Submissions: 21-08-20 | Ad Bookings: 24-08-20 | Ad Artwork: 27-08-20
Publication Date: 2nd Week September
Product Round-Up: Live Streaming; ENG/Newsroom Systems; Focus on Startups; OTT/IPTV/VOD Solutions
Show Coverage: Preview: BroadcastAsia (Sept 29-Oct 1, Online); Remote Production Tools

Nov-Dec issue
Editorial Submissions: 23-10-20 | Ad Bookings: 26-10-20 | Ad Artwork: 02-11-20
Publication Date: 2nd Week November
Product Round-Up: Content Delivery; DAB+ Digital Radio; Channel-in-a-Box Solutions; Podcasting; HDR in Post & Broadcast
Show Coverage: Preview: Siggraph Asia (Online); Remote Production Tools

C+T ASIA FEATURES + DEADLINES 2020

Nov-Dec issue
Editorial Submissions: 01-10-20 | Ad Bookings: 08-10-20 | Ad Artwork: 15-10-20
Publication Date: 3rd Week October
Product Round-Up: Cable & Satellite Delivery; Focus on Startups; Digital Asset Management Solutions; OTT/IPTV/HbbTV Solutions; DAB+ Digital Radio
Show Coverage: Preview: AVIA (CASBAA) Asia Video Summit; Preview: Siggraph Asia (Online); Remote Production Tools

PHIL SANDBERG - PUBLISHER/EDITOR
Phil Sandberg has spent 28 years reporting on technology issues across the Asia-Pacific. His credits include launching Content+Technology, Broadcastpapers.com, the SMPTE Australia Exhibition, Directory, and Cinema Technology Asia-Pacific. He has also produced the ABE2016, Sportscasting (2014-2016), and 3D-Day (2011) industry conferences. Contact Phil via +61(0)414 671 811 or papers@broadcastpapers.com

ADAM BUICK - ADVERTISING SALES MANAGER
Adam’s previous role as Manager of the SMPTE Australia Exhibition, along with strong family ties in Australia and SE Asia, has given him a unique perspective on the broadcast and content creation sector throughout the region. You can contact Adam on +61(0)413 007 144 or adam@broadcastpapers.com

LUCY SALMON - CO-FOUNDER
Co-founder and Production Co-ordinator Lucy Salmon has worked with Broadcastpapers.com since its inception. She also holds a PhD in the field of sports medicine. You can reach her via +61(0)412 479 662 or production@broadcastpapers.com


ADVERTISER INDEX
AVIA - 39
Clear-Com - 4
ConnectechAsia/BroadcastAsia - IBC
Dejero - 23
MediaKind - 5
Mediaproxy - 3
Riedel - OBC
Ross Video - IFC
Telstra - 17
VSN - 25
WEBINARCHANNEL.TV - 43
Zixi - 19


BROADCASTASIA GOES VIRTUAL 29 SEP - 1 OCT 2020 Experience the all-new virtual event, get engaged in topics like Delivering the Consumer Media Revolution, and start conversations with broadcasting industry leaders—powered by AI.

SCAN HERE TO REGISTER YOUR INTEREST or visit bit.ly/CTsepreg


SMALL FORM FACTOR, BIG IMPACT
Software-defined platform with up to 3 app spaces per SFP: IP/SDI IP Gateway; 4x1 / 9x1 / 16x1 Multiviewer; JPEG-2000 / JPEG-XS (J2K/JXS) En-/Decoder; UHD/HD Up/Down/Cross Converter; Audio Router.
THE NEW MEDIORNET MUON - SOFTWARE-DEFINED, DISTRIBUTED, FLEXIBLE, SCALABLE. MEDIORNET VIRTU 32.
www.riedel.net /VIDEO

