Content+Technology ANZ February 2024


PP: 255003/06831

MEDIA + PRODUCTION + MANAGEMENT + DELIVERY www.content-technology.com

OK COMPUTER ... MEDIA & AI

ISSN 1448-9554

VOLUME 21/ISSUE 1

FEBRUARY 2024

CELEBRATING 21 YEARS OF PUBLICATION




Cover Story... page 3.

REGULARS
02 EDITOR'S WELCOME
03 COVER STORY: Jonathan Watson explores the integration of GenAI into the broadcast industry and what organisations need to know to start a Generative AI project.
31 INDUSTRY FOCUS


FEATURES
07 CAMERAS & LIGHTING

Sony Enters the Arena with Russell Crowe’s Indoor Garden Party. Plus ARRI’s 360 EVO Stabilised Remote Head; the SmallHD Ultra 7 Ultra-Bright Monitor; Marshall’s VS-PTC-300 PTZ Camera IP/NDI Controller.

10 SPORTSCASTING


NRL Sets Sights on Vegas; LiveU Delivers Pacific Games Action; Chyron Streamlines Data Management; Low Latency Video for Sportsbooks; Control System for MEDIAEDGE QDCAM.

13 SHOW PREVIEW: SMPTE METEXPO 2024
19 NEWS OPERATIONS

Sky News’ “The Jury” Finds in Favour of Gravity Media; AI Turns news.com.au Articles into Podcast Ads; Etere MAM Connects with Ross Newsroom Inception; Quicklink Brings Skype and Teams into SMPTE ST 2110.

24 AUDIO, RADIO, PODCASTING

Leading Broadcaster Taps Ferncast for In-House DAB+; AES Welcomes Leslie Gaston-Bird as President; Dante Pro S1 System-on-a-Chip; Tieline Dante Card for Gateway and Gateway 4 codecs; PRODIGY.MX Multiformat Audio Matrix; Shure Expands SLX-D Digital Wireless Family; Neumann's MT 48 Now an Immersive Audio Interface; Shure ADX3 Plug-On Transmitter; Dual TX for RØDE Wireless ME Microphone; Voice Enhancement for Podcasters.

28 CONTENT DELIVERY

Ross Video to Adopt 'NDI Advanced'; Cinegy-powered Appliance for Capture or Playout; DASH-IF Conformance Tool Update; Norigin Media Launches CTV White-label FAST Apps; Venera CapMate Caption/Subtitle QC Now on AWS; Avid | Stream IO Production Ingest & Playout Solution Updates.

22 POST PRODUCTION
'Next Goal Wins' Makes the Grade with DaVinci Resolve Studio; RSP Launches REVIZE Machine Learning for VFX; Baselight Harnesses Machine Learning; Animal Logic Opens Up with OMX and ALab.

CONTENT+TECHNOLOGY
ISSN 1448-9554 PP: 255003/06831
Broadcastpapers Pty Ltd (ABN 34 095 653 277)
PO Box 165, Surry Hills, NSW 2010, Australia
www.broadcastpapers.com
PUBLISHER: Phil Sandberg Tel +61 (0)2 4368 4569 Mob +61 (0)414 671 811 papers@broadcastpapers.com


ADVERTISING MANAGER: Adam Buick Mob +61 (0)413 007 144 adam@broadcastpapers.com
PRODUCTION MANAGER: Lucy Salmon Mob +61 (0)412 479 662 production@broadcastpapers.com
DESIGN & LAYOUT: Wide Open Media Mob +61 (0)419 225 348 https://wideopenmedia.com.au
PRINTING: SOS Printing, Sydney

COPYRIGHT NOTICE: All material in Broadcastpapers’ Content+Technology Magazine is protected under Australian Commonwealth Copyright Laws. No material may be reproduced in part or whole in any manner without prior consent of the Publisher and/or copyright holder. DISCLAIMER: Broadcastpapers Pty Ltd accepts no responsibility for omissions or mistakes made within, claims made or information provided by advertisers.


EDITOR’S WELCOME

Prominence or Providence?
By Phil Sandberg

FOR A NUMBER of years now, Australian industry group Free TV has asserted that "Global deals between TV manufacturers and international streaming giants have given global subscription services the most prominent positions on home screens and remote controls, while local TV services are difficult or sometimes impossible to find."

Following through on an election commitment to local broadcasters, the Albanese Government has released exposure draft regulations for a "prominence" framework aimed at supporting access to local and free TV services in the streaming era. The draft legislation covers both streamed linear and on-demand services provided by Australia's free-to-air broadcasters. These include the ABC, the SBS, the commercial free-to-air broadcasters and over-the-air community TV broadcasters. In short, if it's broadcast free over-the-air, it's eligible for prominence. If it's a subscription service, it is not. Therefore, Foxtel and Kayo, as well as Netflix, Prime, Paramount+, Disney+, etc., are ineligible. Also not eligible are any free, ad-supported versions of those subscription services.

Applied to "all regulated television devices", the minimum prominence requirements would include: (a) applications (apps) for Australian free-to-air linear and on-demand services are either installed on the device before the device is supplied or installed when the device connects to the internet for the first time after the device is supplied; (b) the application must be able to be updated when an update is made available by or on behalf of the provider of the regulated television service; (c) the application must be visible on the primary user interface of the device; (d) the application must be of a similar size and shape to other applications that: (i) are displayed on the primary user interface of the device; and (ii) are designed for the purposes of providing access to a service (other than a regulated television service) that makes audiovisual content available using a listed carriage service; (e) the application must be located in the same area of the primary user interface as those other applications.

There are also additional minimum prominence requirements for certain devices. If the device is capable of receiving a television broadcasting service that uses the broadcasting services bands, a user of the device must be able to access the ABC, the SBS, and broadcasting services transmitted in the licence area in which the device is located. Each of those services must be identified and accessible on the device using the service's logical channel number. The user must be able to access each of those services on the device by selecting a single icon or visual representation which must be visible on the primary user interface of the device. Prominence requirements would also apply to electronic program guides (on devices where they are used).

In response to the draft legislation, Free TV has expressed its qualified approval, especially of the prohibition on device manufacturers charging FTA Broadcasters for compliance with the minimum prominence requirements, and on the insertion of advertising not authorised by FTA Broadcasters. It has also welcomed an extension of the Anti-Siphoning scheme to prevent not only subscription broadcast services but also subscription streaming services from acquiring rights ahead of FTA Broadcasters, as well as an expansion of the Anti-Siphoning List to include women's and diverse events.

However, the industry group is seeking amendments to the legislation, including:

• A requirement for content provided through FTA Services, including through Free BVOD Services and delivered online by FTA Broadcasters, to be included in the content search function on Regulated TV Devices.

• A requirement for any electronic program guide (EPG) provided within a Regulated TV Device to present FTA Services, including all the primary channels and multi-channels of each FTA Broadcaster and the versions of those channels streamed live over the internet, and for each EPG to place these FTA Services prominently and ahead of other channels.

• A reduction of the timeframe by which compliance with the requirements commences, from 18 months connected with the date of manufacture and supply to no longer than six months from Royal Assent, with shorter periods to be specified in the regulations for a number of requirements.

• An extension of the requirements to cover not only new Regulated TV Devices, but also existing Regulated TV Devices, where those devices continue to receive software updates.

In terms of anti-siphoning, Free TV has asked that the list be extended so that both the free over-the-air broadcast rights and the free digital streaming rights must be acquired by the relevant broadcaster before an event can be shown by a pay TV or subscription streaming provider.

Certainly, global streamers will see the legislation as protectionism, and compliance always equals cost for device manufacturers, even if it is just a software update. Global streamers will also wave the flag for their locally produced productions and, while these are an essential development for their presence in this market, they will not devote the resources needed for local sport and news coverage. That is why this legislation is vital.

All the best for 2024.

Phil Sandberg – Editor/Publisher
papers@broadcastpapers.com
+61 (0)414 671 811

2024 C+T DEADLINES
AUSTRALIA/NEW ZEALAND EDITION – MARCH/APRIL 2024
PREVIEW: NAB SHOW, LAS VEGAS
Editorial Submissions: 21-02-24 | Ad Bookings: 21-01-24 | Ad Artwork: 27-02-24
For more information: www.content-technology.com | +61 (0)414 671 811 | Email: papers@broadcastpapers.com

[Graphic: The C+T (Time) Zone – world time zone map]


COVER STORY
Generative AI's Impact on the Broadcast Industry

By Jonathan Watson*

GENERATIVE AI will revolutionise multiple industries worldwide, contributing to an estimated USD$7 trillion increase in global GDP over the next decade. In this three-part series, we will explore the integration of GenAI into the broadcast industry and what organisations need to know to start a Generative AI project. Adopting Generative Artificial Intelligence (Gen AI) will transform how content is created, produced, and distributed, and I will give my impressions and examples of how it will impact operations and investment in technology.

This first instalment introduces Gen AI and provides relevant context for broadcasters, exploring its potential benefits and outcomes for the industry. The second article will explore the Gen AI services available to broadcasters and what your business needs to do to prepare. The third article will present examples of broadcasters who have deployed Gen AI projects and their outcomes, then look to the future and imagine what comes next. Decision-makers must plan strategically to embrace Generative AI and harness its opportunities, making crucial decisions about data infrastructure, foundation model ownership, workforce structure, and AI governance.

What Generative AI is and how it works

Generative AI captured the public's imagination with Midjourney and ChatGPT. Its roots lie in machine learning, language models, deep learning, and neural network training. The leap from machine learning to Generative AI came from a new way of analysing massive amounts of unstructured data using what are called 'Transformers.' An example of unstructured data in broadcast would be a video content repository with no metadata, where the Transformer must extract context via image recognition and training. Another example would be to collect all the news scripts in the archive and train a model to write a script.

Generative AI foundation models (FMs) take inputs such as text, image, audio, video, and code and generate new content in any format. The key to developing content is 'prompt engineering', a new role that uses carefully structured conversational prompts to extract valuable outputs from the models. Large Language Models, one kind of foundation model, are trained on vast amounts of textual data and learn the context of sentences and documents. ChatGPT is a Generative AI application; the Large Language Model underneath it is GPT. Stable Diffusion is the FM behind Midjourney, making those saturated, out-of-this-world images; Stable Diffusion is an open-source model, while OpenAI owns ChatGPT. The Rabbit R1 is an AI-powered gadget that can navigate your apps. It uses a Large Action Model, which can interface with and click through web interfaces via voice prompts.

Choosing between closed-source and open-source models depends on the specific requirements of a project, the level of customisation needed, and the question of governance. Closed-source models may offer ease of use through APIs, while open-source models provide greater transparency and adaptability but require more technical expertise. Foundation models are published daily; there are thousands to choose from. When you speak with a vendor or project manager about your Gen AI project, the conversation should be about suitable foundation models and data transparency. If AI integration is on the roadmap, I would start with the data scientists in your organisation to establish what data is being collected.

Hybrid models connect a smaller, proprietary data set to a general-purpose foundation model, which is then 'tuned' to a specific task. Broadcasters will tend to take the hybrid approach because the industry has very particular workflows and technology, and so must build proprietary models. An example of an applied data set is PDF documentation: an FM is connected to a repository of PDFs whose contents are retrieved and folded into queries and analysis; this is called RAG (Retrieval-Augmented Generation).

Broadcasters have teams collecting data to create proprietary models that augment the larger FMs. However, the data collected so far rarely includes operational and system data. Broadcasters must extract data from vendors' systems to build operational user models. Automation systems hold excellent data from which an AI engineer could build live production foundation models, so start identifying and collecting operational data from systems and workflows now. Broadcast executives planning for AI in the workflow will also use event-driven AI, where data streams into the models and the AI reacts in real time. This is beneficial because live production captures dynamic events, like a live cross.

AI governance is an important part of the decision-making process. Managing biases, or hacker-inserted malicious results that tilt the model, must be a factor in the AI-powered newsroom. ESG scrutiny is all over Generative AI because its energy consumption and carbon footprint are immense. Part of governance will be security. ChatGPT and Midjourney are directly connected to the internet, while broadcasters maintain strict firewalls to prevent outside attacks, which raises questions about how a Gen AI project will be kept secure.
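To make the hybrid and RAG approach described above concrete, here is a minimal sketch, in Python, of the basic flow: retrieve a handful of relevant in-house documents for a query, then fold them into the prompt that would be sent to a general-purpose foundation model. The document snippets, the retrieve() helper and its keyword-overlap scoring are all invented for illustration; a real project would use an embedding index and whichever model API it has chosen.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG):
# retrieve the most relevant in-house documents for a query,
# then fold them into the prompt sent to a foundation model.

from typing import List

# Stand-in for a broadcaster's proprietary document store
# (in practice an embedding index over PDFs, scripts, runbooks, etc.).
DOCUMENTS = [
    "Studio 2 playout failover procedure: switch to the backup chain via router salvo 14.",
    "News rundown template: headlines, live cross, sport, weather, closer.",
    "Archive policy: all finished packages are registered in the MAM with rights metadata.",
]

def retrieve(query: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Crude keyword-overlap retrieval; a real system would use vector embeddings."""
    query_terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(query_terms & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def build_prompt(query: str, context: List[str]) -> str:
    """Assemble the augmented prompt the foundation model would actually receive."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}\n"
    )

if __name__ == "__main__":
    question = "What is the playout failover procedure?"
    prompt = build_prompt(question, retrieve(question, DOCUMENTS))
    print(prompt)  # In production, this prompt is sent to the chosen FM's API.
```

In a broadcast setting, the document store would be the proprietary data set the article describes connecting to a general-purpose FM.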

Image: DALL-E

Cloud vendors are one possible answer. Broadcasters may seek a vendor with foundation models specifically tuned for broadcast and media; FMs that assist with writing scripts, software code, marketing, and audience analysis are already common. Perhaps the future broadcaster is just one foundation model covering all of a broadcaster's needs. Could the next-generation NRCS be a foundation model?

Generative AI will impact these areas of broadcast: content creation, workflow and production, commercial strategy, and operations.

Content Creation and Personalisation

Generative AI's most noticeable impact on the broadcast industry is the evolution of how content gets created. AI systems can analyse vast amounts of data, understand viewer preferences, and generate personalised content tailored to individuals or groups. The FM will tailor news, sport, and opinion to the viewer's profile. This personalisation does raise ethical governance questions about singular echo chambers.

Script writing is the first and easiest point of impact a newsroom will feel from Gen AI. Journalists will have personalised Gen AI editors that handle fact-checking, copy editing, librarian duties, and source confirmation. AI will build video packages; the prompt engineering would be the news story or script in >> continues p4


Media executives must find innovative efficiencies with technology, operations, and content. By integrating AI, media businesses can work better and faster. Content will be created and published with fewer operational assets.



COVER STORY >> continued from p3

the NRCS. As the journalist writes the script, the video package gets assembled in real time, with video clips pulled from sources such as a MAM. MAMs must have the data hooks that enable a newsroom FM to connect via API, extract the clips, and drop them into a timeline; the MAMs of the future are foundation video models. Where footage is unavailable, such as an interview for the story, the newsroom AI assistant will coordinate the resources and production assets to capture it. The journalist does not need to initiate anything; this happens as 'event-driven AI.' I would even go so far as to suggest that if it's an online interview, the Gen AI would book the guest, write and ask the questions, record the interview, ingest it into the MAM, and insert it into the package or rundown.
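As a rough illustration of the script-to-timeline hand-off described above, the sketch below takes a story script, queries a clip store for matching footage, and returns a draft clip order. The in-memory clip list, the search_clips() helper and the keyword matching are hypothetical stand-ins for a real MAM's search API; the point is the shape of the flow, not any vendor's interface.

```python
# Sketch: as a script is written in the NRCS, pull matching clips
# from a MAM-like store and assemble a draft timeline.

from dataclasses import dataclass
from typing import List

@dataclass
class Clip:
    clip_id: str
    keywords: set

# Hypothetical stand-in for a MAM search index.
CLIP_STORE = [
    Clip("clip_001", {"harbour", "ferry", "storm"}),
    Clip("clip_002", {"parliament", "press", "conference"}),
    Clip("clip_003", {"storm", "damage", "roof"}),
]

def search_clips(sentence: str, store: List[Clip]) -> List[Clip]:
    """Return clips whose keywords overlap the sentence (a real MAM would rank by relevance)."""
    words = set(sentence.lower().split())
    return [c for c in store if c.keywords & words]

def assemble_timeline(script: str) -> List[str]:
    """Build an ordered list of clip IDs, one pass per script sentence."""
    timeline: List[str] = []
    for sentence in script.split("."):
        for clip in search_clips(sentence, CLIP_STORE):
            if clip.clip_id not in timeline:
                timeline.append(clip.clip_id)
    return timeline

if __name__ == "__main__":
    story = "Storm damage across the harbour. Ferry services suspended. Press conference at parliament."
    print(assemble_timeline(story))  # ['clip_001', 'clip_003', 'clip_002']
```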

Will personal AI assistants create an ethical difficulty in what constitutes a source? Was that quote from the actual person or from their virtual AI assistant? It goes back to governance as something to consider.

News-, opinion-, and sports-specific foundation models will increase the variety of the automated content, and broadcasters will have families of foundation models, one for each sport. Today, with AWS Gen AI services, Fox Sports in California is auto-generating sports highlights using the massive datasets Fox Sports has already been collecting. After sports highlight clips, news packages will be the next to be automated. GenAI will place the correct CG and branding, adjust colour correction and transitions, drop in the VO, create multiple versions, and publish.


An FM that captures the technical skills of an experienced live sports or news director will be an asset. An NHL producer will be different from an NFL producer. I predict the potential for the most skilled and expert live production producers and TDs to license their 'abilities' into a foundation model, which would then be available to buy for integration.


News and sport are derived from live events; I don't see any requirement to create a video package from the pixel level, though it could be possible. As a live production business, I would not get hung up on the fancy AI tools that create content from nothing, as broadcast content must remain authentic. Virtual sets and other 3D models will eventually be constructed with Generative AI, with some fine-tuning by a human technical artist. Virtual sets will be designed, assembled, and deployed in near real time. OpenAI has unveiled Point-E, a new text-to-3D model that can generate 3D point clouds from natural language inputs, or prompt engineering.

Image: DALL-E

These Gen AI-deployed models will be virtual studios or AR studio graphics rendered on VR/AR devices in your living room or office space. Virtual sets will rebuild and assemble automatically, live on air, depending on the script and prompts from the producers, scripts, or talent. Imagine watching a live news program where the 3D elements in the virtual set are dynamic, reacting to the content of the piece.

Sports broadcasters can leverage Generative AI to enhance live sports production by automatically producing AR virtual overlays of player stats and CG without an operator. CG could be an example of event-driven AI: the type of CG and the information populating it are triggered by what happens on the preview. Gen AI would build a broadcast map from the GIS data in a live feed. A trimmed-down sports OB van becomes possible, as many of the tasks an operator performs today can be handled by AI. A sports producer will create highlights not by spinning an edit wheel to find the right clip but by entering textual prompts.

News graphics are going to become spectacular. GenAI will create entire infographic storylines to explain complex pieces; news infographics of this complexity are built from the script and prompts from the creative team. The future of news stories will become very infographic-heavy, and existing CG rendering platforms that are HTML-based have an advantage.

Digital rights management will also be AI-driven. Smart contracts on the blockchain will be the mechanism that triggers event-driven AI when it comes to monetisation and content rights. I will write another series on how blockchain tech can benefit broadcasters.

Enhanced Live Production Workflows

Generative AI streamlines workflows across all global industries and will do the same for broadcast.

Accounting, marketing, scheduling, research, and other traditional administrative tasks become more efficient and cost-effective, which means a reduced headcount or increased output, most likely a mix of both. Real-time feedback into the production pipeline will speed up the news cycle. The circular news story (go live, clip, publish, gather feedback, repeat) is not new, but an organisation can manage many more stories in circulation with Gen AI. Gen AI will have a more significant impact at the regional and hyper-local end of broadcast networks, lowering and automating the cost of production for stories relevant to those smaller local markets.

Content will be data-driven, meaning that when content is published, consumer metrics will flow back in real time, feeding the foundation model and influencing content creators' decisions. The AI will incorporate this data, automatically providing insights or adjusting the stories.

Broadcasters have been virtualising control rooms since COVID-19, with switching, audio, and CG platforms bundled into a single virtualised solution available as an on-demand resource. The next evolution is an AI control room, which could become a foundation model, or at least the TD and producers will become AI agents. But for this to happen, broadcasters need workflow data from live production: log and user data must be recorded from the vendor platforms that comprise a control room.

The broadcast journalist could become a content strategist. Instead of writing a couple of stories, the journo builds a show through the prompts and queries that go into the newsroom AI. What comes out the other end is a series of scripts, video packages, infographics, social media posts, and playlists that get handed off to AI publishing. Because the cost and volume of live production will drop, Gen AI broadcast systems may start at the hyper-local regional markets. What would a national broadcaster look like if it had thousands of neighbourhood-level productions, all virtualised and automated? >> continues p6
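As a loose sketch of the real-time feedback loop described above, the snippet below shows audience-metrics events arriving for published stories and nudging a priority score that a planning layer, or a foundation model's context, could read. The event fields and weights are invented for illustration only.

```python
# Sketch: published-content metrics flow back and adjust story priority in real time.

from collections import defaultdict
from typing import Dict

# Hypothetical running priority per story slug.
priority: Dict[str, float] = defaultdict(float)

def on_metric_event(event: dict) -> None:
    """Handle one audience-metrics event; the weights here are invented for illustration."""
    weights = {"view": 1.0, "completion": 3.0, "share": 5.0}
    priority[event["story"]] += weights.get(event["type"], 0.0)

if __name__ == "__main__":
    stream = [
        {"story": "harbour-storm", "type": "view"},
        {"story": "harbour-storm", "type": "share"},
        {"story": "council-budget", "type": "view"},
        {"story": "harbour-storm", "type": "completion"},
    ]
    for event in stream:
        on_metric_event(event)
    # A planning tool (or an FM's context window) reads the updated priorities.
    print(sorted(priority.items(), key=lambda kv: kv[1], reverse=True))
```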


Route any combination of SD, HD and Ultra HD at the same time! Blackmagic Videohub 12G-SDI video routers let you connect all your equipment without creating a complex cable mess! Blackmagic Videohub routers support 12G-SDI so they allow you to connect and route any combination of SD, HD and Ultra HD on the same router at the same time. Plus they have zero latency so are perfect for live production and broadcast!

Advanced 12G-SDI for SD, HD and Ultra HD Routing
Blackmagic Videohub 12G features advanced 12G-SDI connections which are multi rate so they support any SD, HD and Ultra HD video format up to 2160p60. Plus Blackmagic Videohub 12G supports routing any video standard on the same router at the same time. 12G-SDI gives you high frame rate Ultra HD via a single BNC connection that also plugs into all of your regular HD equipment!

Elegantly Designed Front Panel
The built in front control panel lets you route video, so you get the perfect solution for live production racks that don't have the space for extra hardware panels. The front panel features a spin knob for browsing, direct entry buttons for speed and an LCD for displaying labels. As the panel is built into the router, you can also see live video on the LCD! The LCD also has menus so you can change router settings.

Features Full SDI Re-Clocking
Videohub includes built in SDI re-clocking on every 12G-SDI input. SDI re-clocking regenerates the video signal for maximum video quality. This is extremely important because longer video cables degrade the signal. With SDI re-clocking all of the SDI devices in your studio receive a regenerated signal with improved jitter performance. That means longer cable lengths, and no drop outs in your video.

Change Router Crosspoints Visually!
With a built in LCD, Blackmagic Videohub 12G can show you live video of all your router inputs! This means you can see all router inputs as live video before you change a route. There are also labels displayed below the live video to make it easy to identify the input you are viewing. What this means is you can scroll up and down the router sources in alphabetical order and see each input as live video.

Blackmagic Videohub 12G From $2,285
Learn More! www.blackmagicdesign.com/au


COVER STORY >> continued from p4

Operations

I believe that Gen AI will have the most significant impact on the operations of a broadcaster: how a vendor is selected, the way engineers manage technology stacks, and the management of the systems used to put a production together. Operations will also include HR, talent acquisition and training. McKinsey Global Institute projects that between 2030 and 2050, Gen AI will automate half of all knowledge tasks, and what Microsoft is about to do to your Office 365 will blow you away. Gen AI is not replacing roles but making existing roles more time-effective and removing tasks within those roles. Did I mention a four-day work week?

Writing code will be faster and more accurate, as LLMs can now offer real-time feedback and assistance to programmers. That will lift the productivity of service and in-house teams. Vendors that write code for your business will also gain efficiencies, shrinking project deadlines and delivery costs. The daily operational logistics of getting to locations, booking interviews, and getting the right people in place, along with scheduling OB vans and technical production crews, will be automated; it may already be if your business is using a third-party vendor. Cloud-based project management tools already use Gen AI to help project managers.

Control rooms are increasingly just software instances spun up in the cloud. These virtualised operational nodes will be part of the same processes that assist the journalist in creating the content. The future technical director will be a Gen AI, ensuring the right graphic is on air and the right source is used with the correct audio levels. Once the show is complete, the control room spins down until the next live show. But think about what that means: broadcasters will be able to produce one or 1,000 live shows simultaneously, with different branded graphics packages and target audiences. Broadcast scheduling will be a monumental, multi-dimensional task that no human can keep on top of, because the scheduling is across thousands of targeted personality profiles and pop-up channels. The entire playout infrastructure is best handed over to a playout foundation model.


Vendors supplying the technology that underpins a complex large broadcast organisation will open up data and metering capabilities. A Broadcast Engineer FM will manage all operations, including systems, SLA, licences, and commercials.


The above evolutions will create new data sets so that Gen AI can output a very accurate ROI on operations and output. Having end-to-end data, from edit tool to media throughput to live production, deployment, feedback, and click rate, will be the ultimate broadcast media foundation model; it will give the C-suite a macro ROI view of the entire business, be it a global brand or a regional broadcaster.

Conclusion

While the flashy output of content creation may get the focus, foundation models will significantly impact operations in terms of savings, efficiencies, and automation. Vendors and suppliers to broadcasters should be ready to expose system data as an additional product or feature. Service providers specialising in integrating Generative AI into operations, along with cloud service providers, will be knocking on the doors of the CTO, Engineering and Operations offices, if they are not already.

The interface with this level of AI is conversational, but it must also be thoughtful. Your existing workforce will need to learn 'prompt engineering': the process of structuring text so that it can be interpreted and understood by a Generative AI model. Imagine that AI is a genie in a bottle; how you phrase your wish will be very important. The sooner you can get these training sessions on the go, the better.

Gen AI can leverage existing MAM systems to source and build a video package from the NRCS story script. Generating a video from nothing is not a requirement, because of the live video nature of broadcasting. However, developing compelling infographics will be a relevant new feature in news. In 2024/2025, HTML5 CG vendors will offer automated content creation with data storytelling and AI-generated infographics. The design team in your organisation will need to understand that it will not be about the tools, like After Effects or CG systems, but about how news stories will be infused with infographics created from prompts.

There will not be one foundation model that does it all, but a platform of different foundation models for parts of the broadcast production. This article mentioned ChatGPT and Stable Diffusion, but over 300,000 open-source foundation models are already available on GitHub, with more added daily. The following article in this series will look at the existing services and vendors offering solutions to broadcasters, how a broadcaster should prepare for a project, how to define the desired outcomes, and which parts of the business to start connecting to a Gen AI foundation model.
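Picking up the prompt engineering point above, the sketch below contrasts a vague request with the kind of structured, reusable prompt template a newsroom might standardise. The wording and the build_package_prompt() helper are invented for illustration; they are not any vendor's API.

```python
# Sketch: prompt engineering as structured, templated text rather than a vague request.

VAGUE_PROMPT = "Write something about the storm for tonight's bulletin."

def build_package_prompt(topic: str, duration_sec: int, tone: str, must_include: list) -> str:
    """Assemble a structured prompt a newsroom might standardise and reuse."""
    checklist = "\n".join(f"- {item}" for item in must_include)
    return (
        f"You are drafting a {duration_sec}-second news package script.\n"
        f"Topic: {topic}\n"
        f"Tone: {tone}\n"
        "The script must include:\n"
        f"{checklist}\n"
        "Return the script as presenter VO lines plus suggested picture notes."
    )

if __name__ == "__main__":
    print(build_package_prompt(
        topic="Storm damage across the harbour overnight",
        duration_sec=90,
        tone="factual, no speculation",
        must_include=["confirmed figures only", "a line for the live cross", "attribution for every quote"],
    ))
```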

Images: DALL-E

*Jonathan Watson has spent over 20 years as a vendor to broadcasters globally, working across production, graphics, and the commercial side of the business. With a focus on how blockchain and Generative AI will impact broadcast and digital media, he brings an understanding of what's on the horizon and of how the broadcast industry can adapt and work more efficiently in the future. You can find him on LinkedIn at https://www.linkedin.com/in/watsonjon


CAMERAS & LIGHTING

Russell Crowe's 'Indoor Garden Party' Shot with Sony Cameras

A LONG-TIME MUSICIAN as well as an actor, Russell Crowe took his Indoor Garden Party tour through 17 shows around the east coast of Australia, as well as multiple shows in Malta, Italy, and the Czech Republic. The venues ranged from small pubs and clubs to large theatres and festivals, and included iconic locations such as the Sydney Opera House and Australia Zoo, and events such as the Karlovy Vary International Film Festival. The entire tour was captured exclusively with Sony cameras.

Director of photography on the tour, Joe Machart, explained, "Russell wanted six or more cameras on every show, as well as the ability for us to record rehearsals, interviews and behind the scenes footage. Over the years my go-to camera kit has gone from the Sony HDR-FX1 to the NEX-FS100 to the PXW-FS5 to the ILME-FX6V. When Sony announced the ILME-FR7, I was very excited to see what possibilities it would unlock for the tour. Sony and their partner distributor, Lemac, kindly organised a demo of one of the first units in Australia."

Before the tour began, Crowe tasked DoP Jules O'Loughlin ACS ASC and Machart with organising the filming of two earlier concerts he was planning at The Hoey Moey in Coffs Harbour. Machart added, "This venue provided a fantastic experience for live acts and Russell wanted to capture that in the footage. On our recce we discovered that our wide and mid shots at front of house would be obstructed by a low ceiling. The FR7 immediately came to mind as an elegant solution as there was a lighting bar over the mosh pit which we could undersling this camera on. This gave us a clear angle of the stage and worked flawlessly."

As Machart mentioned, there were specific creative requirements that had to be met by the Sony cameras on the tour. For example, all cameras needed to have a similar look so that colour correcting would be easy, as content had to be edited quickly on the road. To help him on the tour, Machart worked with two old friends and "amazing cinematographers", Nick Hodgskin and Scott Wood.

Machart added, "The Sony Cinema Line was the only choice that met all of our requirements. I chose to utilise an FX6, four FX3s, three FR7s and an A7RV. They all have S-Gamut3.Cine/S-Log3 profiles and can be effortlessly matched in the grade. The versatility of the whole Sony Cinema Line meant that we could use each camera's strengths to our advantage. Russell saw the potential of the FR7s from the early gigs as a way that one person could operate three cameras simultaneously. The FX3s are extremely compact, allowing for comfortable and prolonged shooting. They also have IBIS for smooth movements. The FX6 has ergonomic controls as well as built in variable ND. The speed and durability of Sony's CF Express Type A cards were essential. We had quick turnaround times in between shows and these cards enabled us to data wrangle footage onto mirrored SSDs in minimal time."

On the tour, the FR7 operator desk typically had three iPads to monitor and control settings. Machart described the set-up as "very intuitive" with the crew able to use the touch screen to focus, move the camera, and change settings. He continued, "The Sony RM-IP500 allowed for finer tuned controls with the joystick and dials. The ability to recall position presets and determine the speed of the movements created dynamic, replayable shots at the touch of a button.

"One person could control one camera with the RM-IP500, whilst also controlling a second or third camera simultaneously with presets or the iPad interface. It was a treat to use autofocus on a slow speed setting to rack focus between subjects on stage with just a touch on the iPad."

According to Machart, another great feature was the live monitoring page, which showed a full-screen live feed without the ability to control the camera. He explained, "This feature became invaluable for us as we could use our phones to wirelessly connect to the FR7 router, allowing us to remotely view a shot whilst setting up. We placed an iPad in Russell's green room so that he could direct our shots, as well as view what was happening on stage."

The tour always had a mega deck riser at the back of the room with an FX6 and an FX3 on tripods. The FX3 was a locked-off wide with a 24-70mm SEL2470GM2 and the FX6 was a mid shot with a 100-400mm SEL100400GM (sometimes with a doubler SEL20TC), operated by one of the crew. The FX6 also took in left and right audio channels of the line mix from the sound desk. Machart said, "It was very helpful that the FX6 could also record scratch audio tracks on channels 3 and 4 just in case anything unexpected happened with the sound desk feed.

"We used an FX3 on an Easyrig or DJI RS3 Pro gimbal to get dynamic shots from within the crowd and around the stage. Russell likes movement in his shots and a feeling of being part of the audience, which these setups achieved. We rigged other FX3s in interesting positions using magic arms, usually on stage somewhere or at high vantage points. We could confidently put these cameras in places which aren't possible to access during a show, thanks to the remote control function of the Sony Creators' App. >> continues p9





CAMERAS & LIGHTING

ARRI 360 EVO Stabilised Remote Head

ARRI HAS ANNOUNCED a new stabilised remote head, the 360 EVO. Building on the SRH-360, it features 360-degree rotation on the roll axis as well as the pan axis, a more robust design, and remote system integration. A new GUI and multiple accessories are shared with the TRINITY 2, creating a uniquely integrated lineup of hard and soft-mounted stabiliser options.

Since the launch of TRINITY 2, operators have been exploring the creative opportunities provided by its 360-degree rotation on the roll axis, revolving around the optical centre of the lens. Now, the same technology has been scaled up for the ARRI 360 EVO three-axis stabilised remote head, opening up new shot possibilities for dramas, commercials, and music videos. Even more dynamic roll-axis shots can be achieved by mounting the head to a crane, for example, or a cable cam above a live broadcast event.

The 360 EVO runs on the same software platform as TRINITY 2, with the same intuitive GUI displayed on its touchscreen remote control panel. The two products also share cables, brackets, and SAM plates for mounting different cameras, and can be controlled by the same tools, such as ARRI's Digital Remote Wheels DRW-1 and new Digital Encoder Head DEH-2. TRINITY 2 customers can therefore invest in the 360 EVO without having to double up on accessories or learn new workflows. By offering both systems, professionals can extend their services to a production, increase their workdays, and reduce setup times, which is also of great benefit to the production itself.

LBUS connectivity enables efficient digital and metadata workflows, while the new software and GUI are focused on long-term Unreal Engine integration to facilitate virtual production. Plug-and-play control of the 360 EVO over the internet will be possible in future when using an authorised repeater cloud service, providing exceptional flexibility of control options.

With its remote system integration, 360-degree roll-axis rotation, payloads up to 30 kg, compact size, versatile connectivity, high-capacity 12/24 V camera power supply, and rock-steady stabilisation even at long focal lengths, the 360 EVO offers an unparalleled price-performance ratio that makes it a budget-friendly product for the cine and broadcast markets. First customer shipments of the 360 EVO, as well as upgrades for SRH-360 owners, will begin in Q1 2024. Visit https://arri.com/360-evo

SmallHD Ultra 7 Ultra-Bright Monitor

SMALLHD HAS ANNOUNCED Ultra 7, a new ultra-durable and ultra-bright flagship monitor in its Smart 7 Series. Ultra 7 features a next-generation platform powered by the same technology that drives SmallHD's elite 4K production monitors, and integrates with Teradek's new Bolt 6 wireless platform. Ultra 7's colour-accurate touch screen can display up to 2300 nits full-screen luminance, enabling visibility in any environment, and SmallHD has provided the powerful I/O required by the dynamic needs of cinematic productions. 6G-SDI inputs allow ingest and passthrough of up to 4Kp30 video signals, enabling critically accurate focus and detail when zoomed in pixel-to-pixel. Dual 2-pin power connectors enable users to output power for accessories.

SmallHD's PageOS software can be controlled via touchscreen or joystick, and large tactile buttons offer customisation and functionality. This is SmallHD's first IP-Certified monitor at IP53, which is defined by protection against ingress of liquids and fine particles, and a sealed heat management system allows the Ultra 7 to operate in variable temperatures ranging from 0°C to 40°C. The machined chassis is strengthened by a new design that includes raised edges protecting the front glass of the display and shock-absorbing silicone bumpers. Also available is a quick-release sunhood (sold separately). This new design clips securely into the front of the monitor without tools or screws, using hidden magnets for quick conversion to a 3- or 4-sided hood.

Ultra 7 can be purchased with a fully-integrated Teradek Bolt 6 transmitter or receiver inside the same-size chassis as the standard model. These integrated wireless monitors will feature a new rugged antenna cap to reduce antenna damage and will be available in Bolt 6 750 and 1500 range models. Ultra 7 RX kits will ship with handles, a padded strap, and a Wooden Camera Micro Battery Plate (GM, VM, or B-mount). An integrated Ethernet and 5-Pin USB port enable flexible and intuitive PageOS-integrated camera control options–anchored to every page–for ARRI, RED, and Sony VENICE cameras, while TX and RX models will support wireless camera control over the clear airwaves of the 6GHz spectrum. Visit https://smallhd.com/pages/ultra-7

Marshall VS-PTC-300 PTZ Camera IP/NDI Controller


Marshall Electronics has released the VS-PTC-300 PTZ camera controller, providing the ability to manage multiple protocols, including NDI, RS232/RS422/RS485, IP, VISCA over IP and ONVIF.


The VS-PTC-300 IP/PoE PTZ controller allows users to control multiple cameras using the same or different protocols, which is beneficial while migrating to a larger set-up. It can control up to seven RS232 cameras, fourteen RS422 cameras and up to 255 NDI/IP cameras. It also allows easy NDI/IP set-up utilising NDI/IP discovery.

"The VS-PTC-300's layout provides 10 'instant access' camera control buttons, which reduces fumbling when quick action is required," says Greg Boren, Product Marketing Engineer at Marshall Electronics. "In addition, the information display screen doubles as a video monitor for NDI and RTSP streams. Also, there is no need to type in the camera number before operating the camera; just select the camera button and you're in control."

The VS-PTC-300 is designed to operate seamlessly across multiple camera protocols on a single network. Used in live event production and content creation, the VS-PTC-300 offers detailed operation through a high-quality PTZ joystick, professional zoom rocker and individual fine-tune adjustment knobs. Settings include iris, white balance, exposure, red/blue, shutter speed, focus, pan/tilt speed and zoom speed.

The VS-PTC-300 can store up to six selectable ASSIGN function keys and save up to 256 camera presets with memory of image setting parameters. Dedicated knobs and control buttons simplify direct access to frequently used functions without needing to use on-camera menus. The VS-PTC-300's joystick allows one-handed PTZ/focus control, or operators can use the convenient rocker for zoom. Its solid construction ensures that the controller will not slide while operating the joystick. The built-in display also allows the crew to easily check framing and focus, store presets, and practice camera moves. Visit https://www.marshall-usa.com/


CAMERAS & LIGHTING >> continued from p7

This app enabled us to remotely access and view an FX3."

Sometimes, Crowe would have interviews to do for live broadcast, radio or Zoom, and the crew would then use their FX6 with an iPad teleprompter rigged to it. The iPad was a mirrored display for Machart's MacBook Pro via Sidecar so that Crowe could see the live studio feed on the teleprompter while he maintained his eyeline down the lens of the camera. Machart added, "We either used a LiveU, cellular hotspot or Starlink dish to send and receive data with low latency, even in remote areas. The S-Cinetone picture profile was perfect for these situations as the colours and skin tones look accurate and cinematic."

Crowe also wanted to film rehearsals and the recording of new songs in his music studio. After musicians and instruments, there wasn't much room left for camera operators. Again, Machart saw the FR7s as the solution. He explained, "We placed three FR7s around the studio and ran Ethernet cables to an operator's desk in an adjacent room. Not only did this reduce our footprint in the room, but it also allowed the musicians to not feel pressured by the presence of cameras. The FR7's full frame sensor allowed us to achieve cinematic, shallow depth of field shots with amazing low light performance and 4K resolution."

In terms of workflow on the tour, Machart and the crew started by using the FX6 to jam timecode into all cameras. They then chose positions for the three FR7 cameras. With the Progl+Gerlach PGX FR7 mounting plates, they had a variety of mounting points and an easy 4mm safety chain solution. They also travelled with a selection of grip gear, which gave them the flexibility to mount the FR7s in different ways.

He continued, "We decided that the most versatile way to mount the FR7s at venues would be with half clamps because they gave us the option to mount the camera upright or underslung. In this way, we could rig FR7s onto existing lighting bars in venues, or onto our own scaff posts and steel stands with T speed rails. We plugged our FR7s, iPads and RM-IP500 into a network switch and router. All cameras then appeared on the network and could be controlled without much setup. Over the duration of every four-hour show we did not have a single issue with an FR7 not responding. The dual slots on the FR7 allowed for ample storage space. After the show concluded, a roaming FX3 followed the band back to the green room to capture behind the scenes.

"Sony is a market leader in its colour science, autofocus and low light performance. It was invaluable to have a family of cameras, each with different strengths to meet every situation. When we filmed at large venues we sometimes attracted a small crowd at the back of the room trying to get a glimpse of our FR7 iPad screens to watch the performance. The FR7s truly gave unique and impressive angles of the stage. The low light performance, dual base ISOs and the ECS shutter mode of the FX6 and FR7 were crucial with the dynamic and challenging stage lighting.


"The high dynamic range of these cameras in CineEI mode also gave us the confidence of knowing that the shadows and highlights were protected. Typically, there were areas on stage with bright follow spots and other areas of deep shadows. Preserving all these details was essential in producing the best image possible. The autofocus on these cameras is truly remarkable. The facial recognition continued to lock focus to a subject's eye in the constantly changing lighting and movement of the musicians. The autofocus allowed us to concentrate on controlling the PTZ functions and composing the shots. Likewise, the facial recognition autofocus became an essential part of a one-person gimbal setup for these concerts. Without worrying about focus, the operator was free to compose the shot. The FR7 was also small enough that it could offer angles and perspectives which would previously have been impossible to get with a large sensor cinema camera."

The high frame rate options of the Sony cameras additionally enabled Machart and the crew to capture emotion and energetic movements in engaging slow motion. The ability to take all camera angles into the edit and have minimal colour correcting before the grade was also a great time saver for the tour. The tour finished with a performance at the Karlovy Vary International Film Festival in the Czech Republic. With the help of local production companies and crew, the final show was filmed with 19 cameras in front of 15,000 people. To see a full behind-the-scenes look at the Sony cameras in use on Russell Crowe's Indoor Garden Party tour, go to: https://www.youtube.com/watch?v=PsHA6I5J9dU Visit https://pro.sony/en_AU/

V-RAPTOR LARGE FORMAT

THE FRAME RATES, LIGHT SENSITIVITY, AND RESOLUTION OF THE V-RAPTOR LINE COMBINED WITH THE GLOBAL SHUTTER OF KOMODO.

Proudly distributed in Australia by Blonde Robot Pty Ltd


8K 120FPS



SPORTSCASTING

NRL Poised to Take on US Audiences

FOR THE FIRST time ever, the NRL Telstra Premiership rugby league season will kick off on American soil (Saturday 2 March 2024 local time) at Allegiant Stadium, Las Vegas. Round 1 in Las Vegas will see two matches take place: the Sea Eagles vs the Rabbitohs and the Roosters vs the Broncos. The first match will be televised in Australia via subscription services Kayo and Foxtel, while the second tussle will be available via the Nine Network, Nine Now, Kayo and Foxtel.

As the Las Vegas venue would suggest, however, the target audience for this year's opening NRL round is the American sports fan. In terms of stadium audience, the NRL is aiming for some 60,000 attendees. US television viewers, meanwhile, will be able to catch the NRL doubleheader live on Fox Sports' FS1 channel and the FOX Sports app. The matches are also available in other foreign territories via www.watchnrl.com

Speaking at a recent Business Sydney function, chairman of the Australian Rugby League Commission, Peter V'landys, outlined the NRL's strategy for overcoming the limitations of scale of its current audiences. "I went overseas to try to sell the NRL broadcast," said V'landys.

"We wanted to get competitive attention, naturally, to maximise the return on the broadcast. And we went to Amazon, we went to Facebook, we went to Twitter, we went to all of them. And they all said, 'what's the population of Australia?' And, I'd say 26 million. They said, 'see you later'. I said, what do you mean? They said, 'You're so small-scale, there's 40 million in California. Why would we be bothered with somebody with 26 million that's just not in our market? We'll do a revenue share, you pay for production.'

"But, they made me think they're right. We've got no scale in Australia. And then I looked at the population in America, it's 340 million people. And we have an app in America that we sell, which is an NRL app that you can watch every game. It's a $160 subscription. And I worked out that if we could get 0.1% every year of the population in America, we would generate $650 million in additional broadcast revenue. Even if we got half, that's 300 million. Now, in order to do that, you need a good partner, which we do with Fox. And if we can get Fox to show us on Fox One in America every week after we've promoted it at Vegas, I think it's got an unlimited potential, not just for rugby league but for all sports in Australia, to sell to a scale that's much bigger than what Australia will ever be.

"And, if it works, then you can try other big populations like India or China or those places where there is scale. Australia is a very small population, so if you want a big market, you need to look elsewhere, and that's why we've looked at the US. And I'm very confident that if we do it right, and it all comes back to implementation, if we do it right in America, we can be talking billions of dollars in 10 or 20 years."

[L-R] NRL players Campbell Graham, Billy Walters, Spencer Leniu and Aaron Woods in Las Vegas to promote the historic 2024 Telstra Premiership doubleheader at Allegiant Stadium. Photo: NRL Photos.

Visit https://www.nrl.com/

Sportscasting Giants to Launch Joint Streaming Service


ESPN, A SUBSIDIARY of The Walt Disney Company, Fox and Warner Bros. Discovery have reached an understanding on principal terms to form a new joint venture (JV) to build an innovative new platform housing a compelling streaming sports service. The platform would bring together the companies' portfolios of sports networks and certain direct-to-consumer (DTC) sports services, including content from all the major professional sports leagues and college sports. The formation of the pay service is subject to the negotiation of definitive agreements amongst the parties. The offering, scheduled to launch in Q3 2024, would be made available directly to consumers via a new app. Subscribers would also have the ability to bundle the product, including with Disney+, Hulu and/or Max.


Each entity would own one-third of the JV, have equal board representation and license their sports content to the joint venture on a non-exclusive basis.

The service would have a new brand with an independent management team. The platform would aggregate content to offer fans an extensive, dynamic lineup of sports content, aiming to provide a new and differentiated experience to serve sports fans, particularly those outside of the traditional pay TV bundle. By subscribing to this focused, all-in-one premier sports service, fans would have access to the linear sports networks including ESPN, ESPN2, ESPNU, SECN, ACCN, ESPNEWS, ABC, Fox, FS1, FS2, BTN, TNT, TBS, truTV, as well as ESPN+.

Bob Iger, Chief Executive Officer of The Walt Disney Company, said, "The launch of this new streaming sports service is a significant moment for Disney and ESPN, a major win for sports fans, and an important step forward for the media business. This means the full suite of ESPN channels will be available to consumers alongside the sports programming of other industry leaders as part of a differentiated sports-centric service."

Lachlan Murdoch, Executive Chair and Chief Executive Officer of Fox, said, "We believe the service will provide passionate fans outside of the traditional bundle an array of amazing sports content all in one place."

David Zaslav, Chief Executive Officer of Warner Bros. Discovery, said, "At WBD, our ambition is always to connect our leading content and brands with as many viewers as possible, and this exciting joint venture and the unparalleled combination of marquee sports rights and access to the greatest sporting events in the world allows us to do just that."

More details, including pricing, will be announced at a later date. Visit https://thewaltdisneycompany.com/


SPORTSCASTING

LiveU Delivers Pacific Games Action

THE 2023 PACIFIC GAMES in the Solomon Islands were successfully broadcast live to viewers in numerous participating countries and beyond using LiveU's IP-video EcoSystem. LiveU was selected for its cost-effective and flexible technology, underpinned by LRT (LiveU Reliable Transport), enabling live broadcasts to be distributed with the highest resiliency to a disparate array of broadcasters and other takers across the Pacific Islands as well as Australia and New Zealand. The LiveU team was present on-site throughout the games, providing engineering and operational support.

Historically, the world feed for the Games was transmitted via satellite and produced centrally with selective content. The organisers turned to LiveU for cost-effective and reliable IP distribution over the open internet using LiveU Matrix's fully managed service with multiple feeds, encompassing all the content produced at the different venues.

In addition, LiveU's compact 5G LU300S encoders were deployed for point-to-point transmission, LiveU Ingest for cloud recording on-site, and the rackmount LiveU Transceivers for uplinks and point-to-point transmission.

Paul Vunituraga, Broadcast Manager, Pacific Games 2023, said, "We were looking for a cost-effective and efficient alternative to satellite and were excited by the opportunities enabled using LiveU's IP technology. The results speak for themselves! Despite the challenging environments on the island, the quality and scope of the live production remained at the highest level throughout the games. We could focus on covering all the different sports with engaging live content, while expanding our reach to new takers with full peace of mind."

Chris Dredge, Country and Sales Manager – LiveU Pacific, said, "It was an honour for us to be involved in these historic Pacific Games. This major production demonstrates how LiveU can replace traditional broadcast methods for major sports events across the whole spectrum, from live contribution on the ground and cloud ingest to a variety of distribution methods." Visit https://www.liveu.tv

Control System for MEDIAEDGE QDCAM

VIDEOSYS BROADCAST has launched a new control system for MEDIAEDGE's QDCAM high-speed box camera that allows broadcasters to incorporate the high-quality camera into their production workflows. The fibre-based plug-and-play system, which fits seamlessly into a standard equipment rack, uses intuitive software that automatically configures all the camera and CCU features for the user's chosen format. By adopting this simple-to-use solution, broadcasters can now deploy the QDCAM box camera for 4K productions or high-speed (x4) SloMo shots.

Colin Tomlin, Managing Director of Videosys Broadcast, says the company's new Remote Control Panel (RCP) for MEDIAEDGE's QDCAM will increase the camera's appeal to broadcast users, particularly those producing sports programming where there is a need for high-resolution 4K shooting and high-speed shooting for slow-motion playback. Features offered by QDCAM include excellent image quality and remote camera/lens control using micro 4/3 lenses, allowing access to the vast range of off-the-shelf lens options. Also, a Global Shutter CMOS image sensor allows for precise imaging without the distortion associated with rolling shutter devices.

an Optical Transmission system that utilises standard SMPTE fibre optic camera cables installed in many stadiums and other facilities frequently used for sports broadcasting. According to Colin Tomlin, “Its one drawback was its complicated set-up system and awkwardly shaped boxes that were not weatherproof and didn’t fit standard equipment racks. By working closely with MEDIAEDGE, we have succeeded in creating a new RCP system and fibre implementation that resolves all these issues and brings the camera firmly into the broadcast domain.” Visit https://www.videosys.tv/


Chyron Streamlines Data Management for Sports Users CHYRON HAS INTRODUCED PRIME 4.9, the newest release of the company’s live production engine. Designed to drive the world’s most demanding data-driven sports and news production workflows, Chyron PRIME 4.9 boasts an array of updates, both large and small, that significantly enhance the operator experience — and improve operator efficiency — in designing and playing out complex graphics that pull information from real-time data sources. Along with control panel and focus improvements, the platform offers even more extensive data integration and management capabilities, faster and more convenient workflows for creating and modifying large tables, and a series of features that combine to streamline and accelerate workflows built on SMT real-time sports data. “Version 4.9 is a great representation of the two-pronged approach the PRIME development team is following to deliver new innovations and address feedback from close conversations with customers,” said Nikole McStanley, Product Portfolio Director of PRIME at Chyron. “Our new features — such as the Duplicate Effect and expansion of the Table Resource to integrate with the full range of industry-preferred data sources — provide uniquely efficient tools for tackling time-consuming, complex graphics. Meanwhile, the fine-tune adjustments we’ve made to the design and playout interfaces yield major time-savings and quality-of-life improvements for day-to-day users.”

Another significant feature addition is PRIME 4.9’s new Duplicate Effect, which eliminates the process of arduous cell-by-cell data-binding when building table graphics with multiple rows and columns linked to data. The new feature recognizes links between a graphic element and an entry in a data set, allowing designers to duplicate that graphic element multiple times with links to the subsequent data values in the connected source. The Duplicate Effect is a huge time-saver for designers who need to create leaderboards and other data-driven list graphics.

With PRIME 4.9, the table resource can integrate with the full range of industry-standard data sources — such as text files, standard Excel files, web-based JSON data, online databases, and much more — enabling designers to leverage and customise their preferred data all within the PRIME application.

For designers, new adaptability features enable zoom-in on keyframes, easily organised keyframe control actions, and rapid renaming of objects throughout the scene tree — making it far easier to manage scenes with multiple, precisely timed animations. Additionally, new frame quality settings for PRIME’s preview channels enable designers working with media-heavy graphics or demanding 4K monitor wall environments to prioritise system performance on the primary program output.

Meanwhile, PRIME playout operators will find that new control panel keyboard improvements enable efficient control of multiple graphic channels without having to use a mouse. To further accelerate operations during fast-paced productions, PRIME can import/export playout configuration settings to quickly shift to a new show. Beyond the PRIME user experience, Version 4.9 also brings new opportunities to streamline production workflows. New GPI IN functionality enables control of PRIME playout from third-party devices — such as a video switcher — via a general-purpose interface. Meanwhile, hardware bypass workflows no longer require external hardware, with support for direct bypass through supported Matrox I/O cards. Finally, for top-tier broadcasters leveraging real-time SMT data, PRIME 4.9 delivers new features to provide reliable, direct connections to SMT data applications, ensure accurate collection, and filter out unwanted values before integration with graphics. Visit https://chyron.com
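Chyron has not published PRIME’s internal APIs, so the following minimal Python sketch is purely an illustration of the data-binding pattern the Duplicate Effect automates: one designed row template cloned and bound per entry in a connected data source. The JSON shape and the LeaderboardRow and duplicate_rows names are assumptions for illustration, not PRIME calls.

```python
import json
from dataclasses import dataclass

@dataclass
class LeaderboardRow:
    """Stand-in for one duplicated instance of a designed row template."""
    rank: int
    name: str
    score: int

def duplicate_rows(entries):
    """Clone the row design once per data entry, binding each clone
    to the next value in the connected source (highest score first)."""
    ranked = sorted(entries, key=lambda e: e["score"], reverse=True)
    return [LeaderboardRow(rank=i + 1, name=e["name"], score=e["score"])
            for i, e in enumerate(ranked)]

# A web-based JSON table source might deliver something like this:
standings_json = '[{"name": "Sharks", "score": 42}, {"name": "Storm", "score": 48}, {"name": "Raiders", "score": 36}]'

for row in duplicate_rows(json.loads(standings_json)):
    print(f"{row.rank:>2}  {row.name:<10} {row.score}")
```

In PRIME itself this binding is configured inside the application rather than written as code; the sketch only shows the repetitive per-row work the Duplicate Effect removes.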

DRM-Protected, Ultra-Low Latency Video for Sportsbooks and iGaming VIDEO PLAYBACK and real-time streaming technology provider, THEO Technologies, and EZDRM, the provider of Digital Rights Management as a Service (DRMaaS), have announced a strategic partnership which aims to provide sportsbooks and iGaming providers with secure ultra-low latency video streaming.


Both iGaming providers and sportsbooks benefit from ultra-low latency video streaming. For the iGaming sector, real-time streaming is pivotal in elevating player-dealer engagement, maximising the number of hands that can be played. Sportsbooks leverage ultra-low latency video streaming to maximise the betting window for micro-betting experiences, ultimately boosting revenues.


Robust DRM is critical to enforcing the streaming business model, and proves additionally valuable for legal and compliance teams in live dealer casinos, streamlining the onboarding process

for new countries. In the case of sportsbooks, comprehensive DRM ensures adherence to contractual obligations with rights holders, particularly pertinent for large-screen streaming in venues like betting shops. THEO’s real-time video API at scale, THEOlive, already delivers advanced security features such as geoblocking and token-based security. Now, with the seamless integration of the EZDRM multi-DRM solution into THEOlive, iGaming providers and sportsbooks can effortlessly initiate DRM-protected, ultra-low latency streaming cross-platform. This integration means they do not need to compromise on latency, video quality, or scalability while maintaining top-notch security for their content. “Through THEOlive, we simplify real-time streaming for iGaming providers and sportsbooks, ensuring a high viewer quality

of experience for audiences of any size — be it ten, hundreds, thousands, or even millions of viewers,” said Steven Telemans, CEO of THEO Technologies. “Our collaboration with EZDRM not only ensures content protection across platforms but also fortifies our commitment to delivering exceptional ultra-low latency video experiences worldwide.” “The growing commercial applications for low-latency streaming services are an exciting backdrop to our technical collaboration with THEO,” stated Olga Kornienko, COO and Co-Founder of EZDRM. “Service providers need a robust, seamless approach to secure delivery to realise their commercial goals. THEOlive is a significant advance in providing a complete solution to meet the new demands of this market.” Visit www.theoplayer.com and https://hs.ezdrm.com
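Neither THEO nor EZDRM documents its token format here, so the snippet below is only a generic sketch of how token-based playback security is commonly built: a short-lived, HMAC-signed token the player must present before a stream is served. The secret, field layout and function names are all assumptions, not THEOlive or EZDRM APIs.

```python
import base64
import binascii
import hashlib
import hmac
import time

SECRET = b"replace-with-a-provisioned-shared-secret"  # assumption: agreed out of band

def issue_token(channel_id: str, ttl_seconds: int = 60) -> str:
    """Sign 'channel:expiry' so the token cannot be altered or reused indefinitely."""
    payload = f"{channel_id}:{int(time.time()) + ttl_seconds}".encode()
    signature = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + signature

def verify_token(token: str, channel_id: str) -> bool:
    """Reject tokens that are malformed, tampered with, expired, or issued for another channel."""
    try:
        encoded_payload, signature = token.split(".")
        payload = base64.urlsafe_b64decode(encoded_payload.encode())
    except (ValueError, binascii.Error):
        return False
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False
    cid, _, expiry = payload.decode().partition(":")
    return cid == channel_id and int(expiry) >= time.time()

token = issue_token("live-dealer-table-7")
print(verify_token(token, "live-dealer-table-7"))  # True while the token is fresh
```

DRM then adds content encryption and licence delivery on top of this kind of access token; the two mechanisms are complementary rather than interchangeable.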


SHOW PREVIEW: SMPTE METexpo 2024 (metexpo.com.au)

ATEME https://www.ateme.com/ At SMPTE METexpo 2024, Ateme is set to unveil a comprehensive array of solutions aimed at enhancing audience engagement, maximizing content monetization, and revolutionizing operational efficiency. Audience Engagement: Sports events are the litmus test of a content or service provider’s ability to engage its audience. When millions of viewers watch the same event simultaneously, Ateme’s cutting-edge solutions deliver the best Quality of Experience, efficiently. At METexpo, content and service providers can discover Ateme’s TITAN range of encoders and transcoders. Ateme’s pure software solutions for video distribution enable the future with huge scalability, so you can integrate new codecs and optimize your bandwidth usage. They provide a unified, future-proof, innovative, and scalable solution to manage secure transmission across video-distribution platforms. Content Monetization: In the fiercely competitive media landscape, Ateme offers solutions that empower content and service providers to boost monetization. This is achieved by leveraging targeted offerings, including FAST channels and personal channels to monetize existing content libraries while engaging audiences and creating new revenue streams; and personalized or regional advertising through Server-Side Ad Insertion. Operational Efficiency: Ateme will highlight its software-based, cloud-native solutions that enable content and service providers to transform their video-delivery operations, bringing a new level of flexibility and efficiency with full-IP operations. These efficiencies in both quality and density of processing reduce power

consumption and carbon footprint, enabling sustainable operations. Ateme’s solutions collectively promise engaged audiences, increased revenues, and streamlined operations, ultimately translating into higher profits for content and service providers. ––––––––––––––––––––––––––– D2N - TECHNOLOGY SOLUTIONS https://www.d2n.com.au/ D2N - Technology Solutions will be exhibiting at the SMPTE METExpo show in March 2024 at Randwick Racecourse. We will be exhibiting kit from the following partners: Calrec, Matrox, AIDA, RF Optic, and Raytalk. D2N specializes in solutions across the Audio, Video, and Communications platforms for several industries, including Broadcast and Event Production, Sporting Federation & Motorsports, Construction, Government & Education, and House of Worship. Our team will include our Managing Director Jason Owen, BDM Phil Goulden and Marketing Assistant Mackenzie Owen. We will also be joined by two representatives from Calrec to support the Argo S console we will have on display. If you would like a personal demonstration of any of the kit we will have on the stand, please contact our office to secure a time. ––––––––––––––––––––––––––– EDITSHARE EditShare, in collaboration with local reseller Digistor, introduces groundbreaking solutions at METExpo to

optimize collaborative media workflows. The focus is on enhancing production and post-production efficiency, particularly for remote work and seamless cloud integration. The challenge addressed is the need for storage solutions that connect multiple facilities and integrate with the cloud, catering to users working remotely or from home.

Swift Link significantly improves remote client speeds over VPNs or high-latency connections, boosting throughput up to 10 times. This technology allows users to preview and edit proxies and high-res media, optimising connections for remote work without altering existing equipment or workflows. EditShare One, a new user experience, streamlines creative processes, with initial applications like Producer View facilitating task assignments and feedback delivery across dispersed teams. Producers can mark key points in transcriptions for editors through the FLOW panel, enhancing collaboration. AI-driven services, such as speech-to-text transcriptions, integrate seamlessly into FLOW, expediting work and enabling automatic creation of rough cuts or integration of selected clips into Adobe Premiere sequences. This can be extended even further with the new integration between FLOW and MediaSilo, allowing users to share links, create presentations and collaborate throughout the production team. EditShare’s booth will also host Adobe, showcasing the seamless workflow integration between EditShare’s media management and Adobe Premiere Pro. ––––––––––––––––––––––––––– EVS https://evs.com/ Celebrating 30 years of innovation at METexpo

For its 30th anniversary, EVS will be showcasing its cutting-edge solutions at METexpo in Sydney, including LiveCeption for live production replays and highlights, MediaCeption for content management, MediaInfra for broadcast control and processing, and PowerVision for video assistance enabling swift decision-making.

At the show visitors will be able to experience live demonstrations including:
• LSM-VIA, for advanced live replays and highlights and controlling both XT-VIA and XT-GO live production servers
• Neuron for all audio and video processing needs in IP infrastructures
• Cerebrum, EVS’ broadcast control and monitoring system
• Xeebra, our FIFA-certified multi-camera review system and AI-based Offside Technology for video officiating in a wide range of sports

Nestor Amaya, SVP Solutions Architecture at EVS, will take the stage for two insightful presentations on the transformative impact of AI in live sports production and the critical importance of a unified control and monitoring system in both IP and hybrid facilities. ––––––––––––––––––––––––––– LIVEU https://www.liveu.tv 2024 is set to be a major year

for the broadcast industry with the Summer of Sports and over 70 elections worldwide. At SMPTE METexpo, LiveU will be presenting its complete IP-Video EcoSystem for agile live remote and on-site production across news, sports, and other events. The LiveU EcoSystem – spanning contribution, production and distribution – enables broadcasters, sports and other organizations to add efficiency and shorten their workflows, while keeping their costs and carbon footprint to a minimum. By removing complexity with easy-to-use, high-quality solutions, producers can fully focus on creating dynamic, engaging

content.

LiveU will share tangible ways that customers can meet the increased demand for


quality content and optimise monetisation opportunities – leveraging its 5G bonding solutions and cloud-based workflows. LiveU will showcase its latest IP-video solutions, including its LU4000 ST 2110 receiver, supporting the transition towards end-to-end IP workflows. All LiveU solutions are built on its field-proven LRT (LiveU Reliable Transport) protocol for rock-solid resiliency. Other highlights include: 1. Dynamic content contribution solutions with LiveU’s reliable and flexible field units, including the 5G multi-cam LU800 and the powerful-yet-compact HD/4K LU300S with 5G capabilities. 2. Our award-winning cloud solutions, including: LiveU Ingest, an ingest cloud solution for automatic recording and story metadata tagging of live video; LiveU Studio, the first cloud IP live video production service to natively support LRT; and LiveU Matrix, an IP cloud video management and distribution service. Our experienced team will be ready to meet you, so please schedule a meeting or guided EcoSystem tour here: https://www.liveu.tv/company/contact-us ––––––––––––––––––––––––––– MAGNA SYSTEMS & ENGINEERING

https://magnasys.tv/ sales.au@magnasys.tv Get ready for NAB 2024 by visiting the Magna stand at METexpo 2024!


Whether you are going to NAB or viewing it from afar, you can get a heads-up on the latest trends, products and what to expect at the Las Vegas show in April by visiting the Magna stand in Sydney at METexpo 2024 in March.


At METexpo 2024 Magna will be demonstrating Actus Digital’s Intelligent Monitoring Platform, manifold CLOUD and BLADE//runner live broadcast production software and COTS FPGA hardware from Arkona Technologies, the Chyron LIVE end-to-end, cloud-native live video production

platform, Enensys digital media distribution chain solutions, the unique IPconneX contribution, distribution and delivery solution, Panasonic 4K and IP PTZ cameras and switchers, Pebble playout automation, RTS intercoms, Synamedia video experience solutions and TAG IP monitoring. The Magna Systems team, along with product specialists from Actus Digital, Arkona Technologies, Chyron, Enensys, IPconneX, Panasonic Connect, Pebble, Synamedia and TAG, will be on METexpo stands 46-49 in the Kensington Room at Royal Randwick Racecourse from 5-7 March 2024. ––––––––––––––––––––––––––– QVEST

https://www.qvest.com/au-en/ More choice, changing needs, faster decisions: the enormously accelerated processes driven by digitalisation require a strategy that transforms people and technology in harmony. We help you harness the enormous potential of media change, take your organisation to the next level, and give you the competitive edge. Visit our stand and find out more about our industry-leading practices, services, and products, including: Broadcast Transformation; Cloud; Digital Product Development; IP & Rights Management; OTT; Systems Integration; makalu cloud playout automation; and Remote Editing. Product partners: Stage Tec, DirectOut, Merging, Glensound, Voice Interaction, DHD Audio, Bolin, DataMiner, and many more! ––––––––––––––––––––––––––– ROSS VIDEO

https://www.rossvideo.com/ Ross Video is poised to leave a lasting impression at the SMPTE MET Expo 2024 by offering attendees a unique opportunity to interact with its expert team and explore the latest advancements in broadcast production technology. A key highlight for attendees is the opportunity to personally meet the Ross team and participate in discussions regarding its end-to-end solutions. Attendees can engage in conversations covering hyperconverged solutions, production switchers, and graphics technologies, gaining valuable insights and in-depth knowledge directly from the Ross experts. Discover how these ground-breaking solutions have the potential to transform broadcast production workflows. Be sure to visit the Ross Video booth to connect, learn, and stay updated on the latest trends in the industry! ––––––––––––––––––––––––––– SYNCHRONISED TECHNOLOGY

sales@syntec.com.au www.syntec.com.au This year Synchronised Technology (Syntec) will showcase a wide range of products across its portfolio of represented brands at SMPTE. Each of the showcased brands has a range of products offering unique solutions for a wide range of applications, including, but not limited to, headphones, headsets and microphones for broadcast, pre- and post-production applications. Several of the headphones on display offer plug-and-play connectivity and flexibility via either a traditional stereo jack connection or USB-C. A range of monitor speaker solutions for broadcast and studio applications will also be on display, offering analogue, digital and Dante capability. The monitor options range from traditional freestanding speakers to small, compact rack-mounted monitors. Rounding off the products on display will be a wide range of industry-leading innovations in microphone shock mounting and wind protection for both live location recording and studio applications. –––––––––––––––––––––––––––

WASABI TECHNOLOGIES

https://wasabi.com/ Wasabi Technologies will be exhibiting at the SMPTE METexpo 2024 where media and entertainment organisations can learn about Wasabi’s recent acquisition of Curio AI, an AI-based intelligent data platform. With Curio AI, Wasabi will soon introduce the industry’s first AI-powered intelligent cloud storage designed specifically for the M&E industry. With hundreds of exabytes of archives existing on aging tape systems, and many more exabytes generated every year from film, television, sporting events, news, and advertising, including countless hours of valuable raw footage and outtakes, the growing demand for content means M&E organisations are among the largest generators of data. However, a video archive without detailed metadata is like a library without a card catalog. Finding exact segments requires tedious and time-consuming manual effort. Searching and tagging video content is now easier with Curio AI from Wasabi, using metadata to manage troves of digital assets with lightning-fast speed. With Curio AI, Wasabi users will be able to create a second-by-second index of video stored in Wasabi’s affordable, high-performance cloud storage using rich, searchable metadata for media libraries. Curio AI enables editors and producers to instantly search and retrieve specific media segments based on people, places, events, emotions, logos, landmarks, background audio and more. Curio AI can also detect and transcribe speech in over 50 spoken languages. Visit Wasabi at METexpo to learn more about how Wasabi is revolutionising media storage by enabling the M&E industry to quickly deliver relevant content to market and unleash the value of their archives.
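Curio AI’s index format is not public; purely as a rough, hypothetical sketch of what a second-by-second, tag-based index enables (every class, method and field name below is invented for illustration), a search across indexed seconds might look like this:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One indexed second of an asset and the tags detected in it."""
    asset: str
    second: int
    tags: frozenset

class ArchiveIndex:
    """Toy second-by-second index: add tagged seconds, then search by tag."""

    def __init__(self):
        self._segments = []

    def add(self, asset, second, *tags):
        self._segments.append(Segment(asset, second, frozenset(t.lower() for t in tags)))

    def search(self, *wanted):
        """Return (asset, second) for every second whose tags include all wanted terms."""
        needed = {t.lower() for t in wanted}
        return [(s.asset, s.second) for s in self._segments if needed <= s.tags]

index = ArchiveIndex()
index.add("grand_final_2023.mxf", 3605, "crowd", "stadium", "goal")
index.add("grand_final_2023.mxf", 3606, "goal", "celebration", "logo:sponsor")
index.add("interview_raw_041.mov", 12, "speech", "press room")

print(index.search("goal"))                 # both goal seconds
print(index.search("goal", "celebration"))  # only second 3606
```

A production system would of course persist such an index in a database and populate the tags with automated detectors rather than by hand; the sketch only shows why per-second metadata makes segment retrieval a simple query.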


Welcome to the SMPTE Conference 2024 Paul Whybrow, Chair SMPTE, Australia, New Zealand, and South Pacific I am so excited and proud to invite you to be part of the amazing conference we have planned for Tuesday March the 5th till Thursday March 7th. As our industry keeps evolving, the need to stay in touch with the continuing changes in technology and business is becoming more rapid and difficult to do. SMPTE is a 100+ year old organisation that was born out of the need for standards in the movie business and has evolved to become the global standard maker across media, broadcast, production and digital, and so evolving technology is at the heart of what we do. Our SMPTE conference is a three-day opportunity for you to listen to the best in our industry, gather insights that will make a practical difference to your professional role, mix with your peers and engage in meaningful conversations. All our speakers and panellists are giving their time to this event as they are passionate to share their knowledge and support our industry community.

We aim to bring alive themes as diverse as cyber security, media sustainability, automation and AI, broadcast tech, 5G, unleashing the power of cloud and the remote revolution. The Third Floor are live from Los Angeles with a sneak peek at the design magic that makes them the world's leading visualisation studio. We will link live with the Hollywood SMPTE section, IBC and IABM will give their takes on the state of the industry, and you can ask anything you like of David Grindle, our SMPTE Global Executive Director. The ever-popular EmpowerHER breakfast is back, along with our 'Rising Stars' sessions aimed at those about to enter, or early into, the industry. I am thrilled that we can offer this at a fantastic value price. SMPTE members can come to all three days for just $150. For non-members (or as I prefer to say, future members!) it is great value at $850 and includes the first year of SMPTE membership. On a personal note, a huge thanks to the speakers, volunteers, the SMPTE programming team and the organisations supporting the METexpo experience. Looking forward to seeing you there! Paul Whybrow, Chair SMPTE Australia, New Zealand and South Pacific

Conference snapshot

Tuesday - Day One
Themes: Keynotes, Security, Sustainability, Automation, and AI.
• Opening and keynote from Barbara Lange, CEO, Kibo121
• Keynote - Marc Zorn, Head of Security at Marvel
• Cyber shields for media and entertainment
• Sustainability in media - when will it arrive?
• Keynote - Ian Taylor, Founder, ARL (Animation Research Ltd) - The America's Cup and beyond
• Industry Insights - Michael Crimp, CEO, IBC Exhibition, Amsterdam
Technology Papers include Media Assurance APIs, media supply chains, sports streaming, and accessibility.

Thursday - Day Three
Themes: EmpowerHER breakfast, Rising Stars new and emerging professionals, Hollywood visualisation and the remote revolution.
• EmpowerHER - engaging stories and advice from inspiring women in media technology
• The Third Floor - CEO and Founder Chris Edwards sharing the secrets of awe-inspiring and entertaining visualisation
• Virtual branding and the role of graphics in production
• Remote Revolution - insights into how remote is rocking the sports world
• Rising Stars - sessions with prospective and emerging professionals in mind, and opportunities to meet young professionals and ask how they got their job
Technology Papers include video over IP and VQA for ABR streams.


Day One program

Main stage (9.30-10.30, 11.00-12.30, 13.30-14.30 and 15.00-16.30 sessions):
• Cyber Shields - "Ensuring Security in Media Technology"
• Media & Sustainability (13.30-14.30) - panel discussing media sustainability in our region, with AWS
• Rise of The Machines - "Automation and AI in Media Production"
• Our Industry Today - the IBC perspective: Michael Crimp, CEO, IBC
• Beyond Our Shores - a discussion on how to go beyond the Australian marketplace, and the challenges
• Evolving with SMPTE - fireside chat with local Chair Paul Whybrow and SMPTE Global Executive Director David Grindle

Technology Papers - Broadcast and Media Technology Part 2, sharing the latest knowledge and trends:
• 1100-1130, Papers 1 (Keynote) - Scott Favelle, "Scaling a streaming platform to breaking levels - Lessons from FIFA WC & Tokyo Olympics"
• 1130-1200, Papers 2 (Presentation) - Bill Admans, "The Agility of Cloud-native Media Supply Chains"
• 1330-1400, Papers 4 (Presentation) - Ankit Gautam, TBS, "Media Assurance APIs"
• 1400-1430, Papers 5 (30 mins) - Fiona Habben, "AI Media"
• 1500-1600, Keynote - Ian Taylor, Animation Research Limited

Afternoon Tea Break. Evening events: Social Drinks on the show floor; Gala Dinner.




NEWS OPERATIONS Sky News’ “The Jury” Finds in Favour of Gravity Media AUSTRALIAN NEWS CHANNEL (operator of Sky News Australia) and Gravity Media Australia have confirmed Gravity Media’s Production Centre in Sydney as the hub for the complete television production for Sky News Australia’s new weekly programme: The Jury. The Jury, hosted by Sky News Australia Presenter Danica De Giorgio, is filmed with a studio audience. The ten-week series, which premiered on Sunday 4 February, sees De Giorgio moderate a discussion between guest debaters on the biggest hot topics of the week. After each debater pleads their case, a 12-person jury joins in on the action and votes for who they believe has won the argument. Mark Calvert, Australian News Channel, Head of Programs, said: “We couldn’t be more excited to work with the professionals at Gravity Media. Their creativity and commitment have made a real difference to this new format. We can’t wait to share the program with our viewers.” Mike Purcell, Head of Production, Gravity Media Australia, said: “We are delighted to be working with Mark Calvert and the team from Australian News Channel on the production of this new program. We appreciate their decision to select Gravity Media Australia to work with them

in delivering the complete studio television production of The Jury.” According to Gravity Media, Australian News Channel’s decision to create and deliver the production of The Jury at Gravity Media Australia confirms the company’s increasing presence in broadcast technology and facilities provision for major television production companies and complements Gravity Media’s complete “turn-key” production for major projects across broadcast and subscription television and streaming services. Gravity Media’s Production Centre in Sydney is an integrated full-service broadcast and television and film production centre comprising two large studios and complete in-house production services and facilities. To oversee every shoot, there are two control rooms, taking care of audio mixing, graphics, EVS playback, postproduction, encoding, and distribution. And with Telstra fibre connectivity in and out of the building, content can be sent to or received from anywhere in the world at a moment’s notice. The Production Centre Sydney also has 11 editing suites, and an architecturally designed and acoustically engineered audio post studio. Visit www.gravitymedia.com/au

Host of “The Jury”, Sky News Australia Presenter Danica De Giorgio.

AI Turns news.com.au Articles into Podcast Ads IN AN AUSTRALIAN FIRST, Artificial Intelligence (AI) will transform content from news website news.com.au into content marketing placements for inclusion in Acast’s content catalogue. Acast, one of the world’s leading independent podcast companies, hosts and distributes some of Australia’s most popular podcasts across categories including Sport, Business and Lifestyle. Listeners to the Acast network will receive real-time news headlines advertising news.com.au.

Aaron Matthews, Head of Creative at Creative Fix, said the technology will not only deliver news content to listeners in real time, but that news categories can be paired with relevant podcast genres. “This is the next step towards a truly integrated and tailored content experience where listeners receive relevant news updates, in real time, about topics of most interest to them,” said Matthews. “It also overcomes the traditional barriers faced by audio producers, with most audio ad production being time-consuming, making this ideal for productions with tight timelines.”

The initiative is a collaboration between Acast, news.com.au, audio agency Creative Fix, and tech company AudioStack, and will use AudioStack’s scalable AI audio production to convert news articles into audio while also augmenting the voiceover with music and sound design.

Gina McGrath, Head of Marketing for news.com.au, said the relationship allows news.com.au readers to be on top of the latest news, wherever they are. “This activity embodies our brand promise to ‘be on it’, delivering the news that matters to our audience in the way they want to consume it.”

Dr Timo P Kunz, Co-founder and CEO of AudioStack.ai, said: “We’re really excited about the massive innovation Creative Fix has brought to the audio advertising sector. Using news headlines is exactly the type of use case we built AudioStack for: creating audio fast and at scale.” The news audio ads will be delivered as 30-second news bulletins across Acast’s full network, which started on 13 November, 2023. Visit https://www.acast.com and https://audiostack.ai/

LeadStory Teams with Volvo on In-Car Video News AUSTRALIA-BASED personalised news service LeadStory is collaborating with Sweden’s Volvo Car Group on bringing video-based news to the screens of the company’s connected cars.

“The future of content consumption in the car isn’t just radio. It’s video,” said Cam Price, Co-Founder and CEO of LeadStory. “That’s why we’re excited to be working with Volvo Cars to create an industry-defining product for the automotive screen in 2024.”

Price recently visited Volvo’s Göteborg headquarters to discuss the delivery of personalised video news feeds to Volvo drivers, in any language, meeting all road safety guidelines, of course. “Myself, (Co-Founder and CTO) Cheyne Wallace and our team are excited about what comes next,” said Price.

Employing a personalisation algorithm controlled by the viewer, content featured on LeadStory appears with permission from the publisher, or under licence from content aggregators such as Reuters (Thomson Reuters (Australia) Pty Ltd) and Video Elephant Limited.

Visit http://www.leadstory.com/



Etere MAM Connects with Ross Newsroom Inception ETERE HAS ROLLED OUT a new integration that connects its Etere MAM with Ross Newsroom Inception, resulting in an all-encompassing solution that the company says revolutionises news production. Etere Media Asset Management (MAM) empowers users to send assets directly from the Etere database to the Ross Newsroom as MOS objects. This streamlined process eliminates the need to constantly switch between screens, allowing users to preview files and search segments and virtual assets seamlessly. This collaborative synergy ensures the smooth data flow between both systems and significantly reduces content delivery delays, all made possible through Etere’s partnership with Ross Video. Etere Media Asset Management is a media content management software designed to streamline content handling: ingest, index, storage, and retrieval of digital assets. Etere MAM simplifies the

management of digital media files, production workflows, and distribution channels. The heart of this system lies in its central database, which accelerates content delivery, streamlines content administration, and ensures fast, frame-accurate content distribution. Key features of Etere Media Asset Management include:
• Modular Solutions: Etere manages the entire media lifecycle, from ingest and transcoding to content retrieval, metadata association, and asset distribution.
• Automated Content Processing: Etere MAM can detect new content assets automatically.
• Efficient Ingest with Integrated QC Workflows: Etere offers automated ingest processes coupled with effective quality control workflows and MD5 checksums to deliver the highest content quality (a generic checksum sketch follows this list).
• Multiformat, Multipurpose File Generation: Create versatile files compatible with multiple platforms, enhancing content accessibility.
• High Availability and Fault Tolerance: Etere MAM is built for reliability and continuity, minimising downtime.
• Multi-Language Content Management: Easily manage content in various languages, expanding your reach.
• NLE Editing Integration: Seamlessly integrate with NLE editing software such as Adobe Premiere and Blackmagic DaVinci.
• User Rights Integration: Etere MAM offers seamless integration with Etere User Rights, enhancing security and access control.
Visit https://www.etere.com
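Checksum verification itself is a standard technique; the sketch below is a generic example of the kind of integrity check an ingest QC step performs, not Etere’s actual implementation, and the function names and paths are illustrative only.

```python
import hashlib

def md5_checksum(path: str, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Stream the file in chunks so even very large media files
    never need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as media:
        for chunk in iter(lambda: media.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_ingest(path: str, expected_checksum: str) -> bool:
    """Flag an ingest as failed when the delivered file's checksum does not
    match the one recorded at the source (an integrity check, not a security one)."""
    return md5_checksum(path) == expected_checksum.lower()

# Hypothetical usage:
# ok = verify_ingest("/ingest/incoming/program_0142.mxf", "9e107d9d372bb6826bd81d3542a419d6")
```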

Quicklink Brings Skype and Teams into SMPTE ST 2110 Workflows QUICKLINK, the provider of remote production solutions, has announced the launch of the Quicklink TX (Skype TX) 2110, bringing the integration of Skype and Microsoft Teams callers into ST 2110 media workflows. This new support enables the Quicklink TX solution to integrate seamlessly with uncompressed IP video workflows conforming to the SMPTE ST 2110 specification. The SMPTE ST 2110 suite defines standards for transporting video, audio, and metadata essences as separate streams over professional IP networks. Quicklink says its support for ST 2110 empowers broadcasters to unlock new

efficiencies in live production. “We have seen a rapid growth of demand for IP video and adoption of the ST 2110 standard,” said Richard Rees, CEO of Quicklink. “By introducing ST 2110 support to the Quicklink TX, our products now align perfectly with customers transitioning their facilities to IP.” The Quicklink TX is offered in a number of channel configurations for complete workflow flexibility and is relied upon by leading broadcasters worldwide to integrate remote Skype and Microsoft Teams guests into live productions with exceptional full HD quality. Since its launch in 2015, the Skype TX platform

has been adopted by the world’s largest brands, including Aljazeera, BBC, ITV, NBC Universal, CNN, COMCAST, CBS, and many more. The announcement of SMPTE ST 2110 support bolsters Quicklink’s reputation for supporting the latest IP standards including NDI, SRT and Dante. The Quicklink TX IP-3 and Quicklink TX IP-6 are offered in addition to the established HD-SDI options. The interoperability with ST 2110 infrastructure simplifies workflows and expands possibilities for remote contribution over modern IP networks. Visit www.quicklink.tv

INTRODUCING ANOVA PRO 3 - THE ALL-WEATHER POWERHOUSE
• Class-leading power, >22,000 lux at 1m
• IP65 water-resistant, perfect for outdoor use
• “Magic Eye” optical light sensor
• Full colour touchscreen display
• LumenRadio CRMX & iOS/Android app

ABE2024 - Register now: https://abeshow.com
October 22-24, Royal Randwick Racecourse, Sydney.
The 2024 ABE Exhibition, Conference & Workshops are being held at Sydney’s raciest venue, the Royal Randwick Racecourse, from October 22-24. Hear from industry experts on hot topics such as IP production and delivery, 4K, HDR, HEVC, OTT, VR, cloud broadcasting, AI in Media, and more. See the latest solutions to empower production and delivery of your content. Register now for updates & notifications on exhibitors & speakers.
2024 ABESHOW exhibitors include: Amber Technology, Arri, Blonde Robot, Dalet, FujiFilm, Magna Systems & Engineering, Onair Solutions, Pacific Live Media, Panasonic, Professional Audio & Television (PAT), Rohde & Schwarz, Ross Video, Silvertrak, Sony Australia, Studiotech, Techtel, TVU Networks, Videocraft, VizRT.


POST PRODUCTION ‘Next Goal Wins’ Makes the Grade with DaVinci Resolve Studio BLACKMAGIC DESIGN has announced that the feature film “Next Goal Wins” was graded by FotoKem’s David Cole using DaVinci Resolve Studio editing, grading, visual effects (VFX) and audio post-production software. Directed by Taika Waititi (“Jojo Rabbit,” “Thor: Ragnarok”), “Next Goal Wins” follows the infamously terrible American Samoa soccer team, known for a brutal 2001 FIFA match they lost 31-0. With the 2014 FIFA World Cup qualifiers approaching, the team hires down-on-his-luck, maverick coach Thomas Rongen (Michael Fassbender), hoping he will turn the world’s worst soccer team around in this humorous and heartfelt underdog story. Australian Cinematographer Lachlan Milne, who worked with Waititi on “Hunt for the Wilderpeople,” had also worked with Cole on two back-to-back films, “Love and Monsters” and “Minari.” Working with Milne and Cole, Waititi focused on creating a unique feel for the film’s two distinct sequences. “One of Taika’s main directives was that American Samoa needed to lean into the world of ‘hyper-real’ or ‘postcard idealised,’” said Cole. “In comparison, at soccer HQ, we were attempting to have a feeling of the more ‘normal/formal’ world of organised sports, and in particular, to enhance the ‘interrogation’ of Fassbender’s character Rongen.” With extensive work on LUT development for production, Cole found the greatest challenge was

managing the diverse lighting after the shoot. “Wrestling with the constantly changing weather conditions, which happens when you film open exteriors in Hawaii with seemingly four seasons in four minutes, was very challenging,” he continued. Cole leaned into DaVinci Resolve Studio’s toolset throughout the project. “We used the full arsenal of Resolve tools, including keys, animated Power Windows and curves to have a consistent feel over the course of the movie,” he added. “Custom DCTLs were used to handle gamut mapping of police car lights so that

they didn’t tear and look too digital. A custom film emulation was also created within Resolve and applied to the entire movie. The film’s LUTs were also developed within the grading package.” With a wide variety of beautiful imagery, Cole particularly enjoyed grading the team mountain run. “I really loved the scene and following montage of the team running up a mountain where a delirious Rongen delivers a speech. Lush vegetation, beautiful skies, low sun and enticing beaches all added up to a beautiful couple of scenes,” he noted. While challenging schedule-wise,

Cole enjoyed working with such a visionary filmmaker as Waititi. “Taika is a filmmaker who is always busy and has many plates spinning at once,” said Cole. “We began grading the film, then stopped for a year or more while he went off and shot ‘Thor: Love and Thunder.’ We then came back to an editorially refined version of the film and completed the movie. Judicious use of Taika’s time during grading sessions and meetings allowed us to get inside his head and realise his vision on the screen.” “Next Goal Wins” is now in theatres. Visit https://www.blackmagicdesign.com
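Cole’s actual DCTLs are not published, and DaVinci’s DCTL language is C-like rather than Python, so the following is only a much-simplified, per-channel illustration of the general idea behind gamut mapping saturated sources: rolling hot values off smoothly instead of letting them clip hard, which is what produces the “torn”, digital look on emissive subjects such as police lights. The function names and threshold values are assumptions.

```python
def soft_clip(x: float, threshold: float = 0.8, limit: float = 1.0) -> float:
    """Pass values below `threshold` through untouched; above it, roll them
    off asymptotically toward `limit` instead of clipping hard."""
    if x <= threshold:
        return x
    headroom = limit - threshold
    overshoot = x - threshold
    return threshold + headroom * (1.0 - 1.0 / (1.0 + overshoot / headroom))

def map_pixel(r: float, g: float, b: float) -> tuple:
    """Per-pixel transform in the spirit of a gamut-mapping DCTL: tame each
    channel's overshoot so saturated emissive sources keep some gradation
    instead of flattening into clipped colour."""
    return tuple(soft_clip(c) for c in (r, g, b))

# A hot red beacon: the red channel rolls off smoothly rather than pinning at 1.0.
print(map_pixel(1.4, 0.10, 0.05))
```

Real gamut mapping usually compresses chroma relative to a target gamut rather than treating channels independently; the sketch only illustrates the soft-compression principle that avoids a hard clip.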


RSP Launches REVIZE Machine Learning for VFX


OFF THE BACK OF a plethora of successful projects, Rising Sun Pictures (RSP) has launched REVIZE, its toolkit for Machine Learning.

Leveraging cutting-edge Machine Learning (ML) technologies and advanced VFX workflows, REVIZE produces the highest quality VFX output for feature film, series, and entertainment. REVIZE has been used for a variety of digital ML augmentation, most notably face replacement, facial performance modification, de-aging, body replacements, and other likeness adaptations. The REVIZE toolkit now includes REVIZE Face and REVIZE Body, with further additions already in development.

“We recognised the potential of ML early on and invested in talent and resources to develop a premium toolset. Our studio clients report that, with respect to the rest of the industry, we are miles ahead of the curve, the best in the world,” said Tony Clark, RSP Co-Founder & Managing Director. Clark is no newcomer to innovation. A 30-year veteran, Tony has helped put the Australian VFX industry on the map and has inspired a generation of creative artists and technologists. An Emmy award winning cinematographer, Tony co-founded RSP in 1995 and has innovated throughout his career, including the creation of a cutting-edge motion control camera system, the Australian broadband network Cinenet and the colour management tool cineSpace. His most standout creation is the world-renowned cineSync, for which he won an Academy SciTech Award in 2010.

“RSP’s differentiation is evident with our experience in problem solving for the variety of creative and technical challenges that are faced in VFX production. It is our mission to be the Trusted Creative Partner of studios and filmmakers worldwide. REVIZE was created to do just this, to solve problems and put the tools into the hands of our artists,” said RSP President Jennie Zeiher. Visit https://revize.ai/



Baselight Harnesses Machine Learning FILMLIGHT’S LATEST version of its Baselight grading software, Baselight 6.0, benefits from several years of machine learning (ML) development and new features designed to bring gains in productivity and creativity to today’s increasingly demanding and competitive post-production environment. Many of its beta testers are already relying on the new tools, which include an improved and modernised timeline, a new primary grading tool, X Grade, a unique new look development tool, Chromogen, plus a new ML-based tool, Face Track. Using an underlying ML model, Face Track finds and tracks faces in a scene, adjusting smoothly as each face moves and turns. It attaches a polygon mesh to each face, which allows perspective-aware tools such as Paint and Shapes to distort with the mesh. The colourist’s productivity is further boosted by the ability to copy corrections and enhancements made in Face Track to the entire timeline – with a simple

copy and paste, these previously repetitive corrections can be applied to the same face across an entire sequence or episode. FilmLight’s Face Track technology has been developed from scratch internally, but the company has also ensured Baselight 6.0 is fully ML-ready. By developing a framework, which it calls Flexi, FilmLight is now in a position to integrate future ML-based tools into Baselight quickly and easily – allowing FilmLight and its customers to access the best grading tools and stay on top of cutting-edge technologies now and in the future. In Baselight 6.0, colourists will also find the RIFE ML-based retimer, a reworked Curve Grade, integrated alpha for easy compositing, a new sophisticated Gallery for improved searching and sorting, plus a wealth of new and

enhanced colour tools such as Sharpen Luma, a built-in Lens Flare tool, Bokeh for out-of-focus camera effects, Loupe magnifying for fast and accurate adjustments, an upgraded Hue Angle, and much more. Visit https://www.filmlight.ltd.uk/baselight6

Animal Logic Opens Up with OMX and ALab AS PART OF ITS ONGOING commitment to the open-source community, Animal Logic has announced the open-source release of OMX, a user-friendly and performant extension to Autodesk Maya Python API. OMX is a thin Python wrapper around the Maya Python API 2.0, which aims to make the power and performance of the API more accessible to Artists, Technical Directors, and Developers. OMX makes it easy to work with Maya nodes and attributes, giving direct access to API objects for more advanced use cases while making writing undoable code a breeze. “OMX has been a fantastic library to work with and I know it will be useful to the rest of our industry,” said Chiara Licandro, Rigging Technical Lead, Animal Logic. “OMX provides a more user-friendly introduction to OpenMaya, especially for anyone who hasn’t had much

experience with the API.” “We wrote OMX to encourage our rigging TDs to use the Maya API in their Python scripts and avoid the heavy performance cost incurred by existing alternatives,” explains Valerie Bernard, Animation and Rigging R&D Engineering Manager, Animal Logic. “It quickly became popular among all our Maya developers, including API experts from R&D and Production Technology. We’re very excited to be able to share it with the whole Maya community.”
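OMX’s own call signatures aren’t reproduced here; for context, this is the sort of maya.api.OpenMaya (Maya Python API 2.0) boilerplate that a thin wrapper like OMX exists to hide. It runs only inside a Maya session, and the node name in the comment is hypothetical.

```python
# Runs only inside Maya; maya.api.OpenMaya is Maya's Python API 2.0.
from maya.api import OpenMaya as om

def get_translate_x(node_name: str) -> float:
    """Read a node's translateX the raw-API way: selection list ->
    MObject -> function set -> plug -> value."""
    selection = om.MSelectionList()
    selection.add(node_name)                     # raises if the node doesn't exist
    node = selection.getDependNode(0)            # MObject handle for the node
    dep_fn = om.MFnDependencyNode(node)
    plug = dep_fn.findPlug("translateX", False)  # False = non-networked plug
    return plug.asDouble()

def set_translate_x(node_name: str, value: float) -> None:
    """Write the same attribute; unlike maya.cmds, a plain plug.setDouble()
    is not registered with Maya's undo queue - one of the gaps a wrapper can fill."""
    selection = om.MSelectionList()
    selection.add(node_name)
    plug = om.MFnDependencyNode(selection.getDependNode(0)).findPlug("translateX", False)
    plug.setDouble(value)

# Hypothetical node name:
# print(get_translate_x("pSphere1"))
```

OMX’s stated goal, per Animal Logic, is to collapse this kind of pattern while keeping undo support and API-level performance.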

Animal Logic has also announced the release of an update to its open-source project, ALab, with fixes and enhancements that will improve the wider community’s overall experience when using the dataset.

ALab, the first open-sourced complete USD production scene, has over 300 production-quality assets, two animated characters, and baked procedural fur and fabric. Since its release in 2021, ALab has provided the wider community with the tools to foster further exploration and adoption of the open-source USD ecosystem, with over 2600 downloads in 2023 by vendors and studios.

The idea behind the ongoing open-source project, says Animal Logic, was to grow and develop the dataset with regular updates to the suite of assets. This release includes improved, interactive viewing components, fixes for several bugs reported by the growing ALab community, and the addition of the shot cameras used to produce the previously open-sourced ALab trailer.

Visit https://github.com/AnimalLogic/AL_omx and https://animallogic.com/alab

Next Generation Video I/O with Streaming DMA

Cutting Edge Connectivity: Designed from the ground up to support next-generation AI and AR workflows, KONA X is a 4-lane, PCIe 3.0 video I/O card with two full-size, bidirectional 12G-SDI BNC connections and two full-size HDMI 2.0 connections, one for input and one for output. KONA X lets users achieve as low as sub-frame latencies with Streaming DMA (Direct Memory Access), which enables powerful real-time workflows with no perceptible delay on I/O and processing, crucial for implementations such as live sports and virtual production*. KONA Xpand is an optional PCIe card that expands the connectivity on KONA X and connects to KONA X over an internal ribbon cable.

*Application dependent

Video I/O for AI/AR Medical Devices: KONA XM represents a new generation of I/O technology from AJA that allows medical OEM developers to ensure ultra-low latency for the most mission-critical video applications, including surgical and medical video hardware devices. KONA XM is designed and certified to be the heart of next-generation medical video technology solutions. Available in Active (with fan) and Passive (fanless) versions. Questions? Shoot us an email.


AUDIO, RADIO, PODCAST

Leading Broadcaster Taps Ferncast for In-House DAB+ A LEADING AUSTRALIAN BROADCASTER has been collaborating with AVW Group to incorporate Ferncast’s aixtream - a software solution tailored for diverse live audio applications - to elevate in-house DAB+ radio service monitoring and distribution capabilities. aixtream enables broadcasters to design a customised main control hub. Its Smart Control Applets allow for the creation of trigger-based scripts, automating various system behaviours. With advanced user role management, the software ensures a streamlined and efficient workflow. Prioritising security, the software boasts features such as VPN for secure audio streaming and file encryption for safeguarding sensitive audio recordings. In pursuit of distributing DAB+ radio services

within its in-house IP-based monitoring network, the Australian broadcaster sought expertise from AVW Group Australia. The recommendation to integrate Ferncast’s aixtream is part of a broader in-house IPTV network upgrade, catering to hundreds of radio and television services. Ferncast’s aixtream solution is designed for scalability, functionality, and redundancy. This integration not only addresses the broadcaster’s immediate needs but also ensures adaptability to the ever-evolving broadcasting landscape – as the broadcaster’s needs change it’s just a matter of dragging and dropping some icons. Stefan Hunt, AVW Group’s Australian Business Development Manager, remarked: “Ferncast’s ability to provide a bespoke software solution,

working closely with AVW, within a constrained delivery timeframe, was commendable. Navigating the intricacies of DAB+ signal delivery, especially when juxtaposed with traditional FM/AM broadcast services, was challenging. However, the collaboration’s outcome was a testament to the expertise of all parties involved.” Detlef Wiese, Ferncast CEO, commented: “Based on our long-term relationship with AVW, we are glad to now service the Australian broadcast market with our aixtream solution and we are honoured to have received such positive feedback from our valued partners in broadcasting.” Visit https://www.ferncast.com/ and https://www.avw.com.au/

AES Welcomes Leslie Gaston-Bird as President THE AUDIO ENGINEERING SOCIETY (AES) has welcomed Leslie Gaston-Bird CAS, MPSE, as president, effective January 1, 2024. Gaston-Bird will serve as the first African-American president in the organisation’s 76-year history. An AES Fellow, Gaston-Bird currently serves on the AES Board of Directors and has served in the past on the Board of Governors: first as Vice President Western Region & Canada and then as a Governor-at-large. She co-founded the Diversity & Inclusion Committee alongside fellow AES member Piper Payne, and provided valuable expertise as part of the AES Technical Council’s Technical Committee for Broadcast

and Online Delivery’s “Guidelines for Over-The-Top Television and Video Streaming.” Trained in classical piano, Gaston-Bird is founder and director of Immersive and Inclusive Audio, CIC, which provides training in Avid Pro Tools and Dolby Atmos certification for underrepresented groups, and she performs freelance work as a sound editor and re-recording mixer through her production company, Mix Messiah Productions. She is the author of Women in Audio (Routledge) and Math Fundamentals for Audio (A-R Editions). She is a Pro Tools | Dolby Professional Avid Certified Instructor and Dante Level 3 Certified audio engineer. She is also a member of the

Cinema Audio Society (CAS), Motion Picture Sound Editors (MPSE), and a voting member of the Recording Academy (the GRAMMYs). Gaston-Bird became a tenured professor at the University of Colorado, Denver in 2012 where she also served as Chair of the Music and Entertainment Industry Studies Department. Early in 2024, she will be awarded with a Doctorate from the University of Surrey’s Institute of Sound Recording (IoSR). She is currently a senior lecturer at City, University of London. Visit https://www.aes.org/

Dante Pro S1 System-on-a-Chip AUDINATE GROUP LIMITED, developer of the AV industry’s Dante AV-over-IP solution, has announced the Dante Pro S1, which the company says is a compact and cost-effective system-on-a-chip (SoC) for professional audio and AV equipment manufacturers to integrate industry-leading Dante network functionality in low channel count devices. The chip solution will be a focus of Audinate’s new innovations and provides a platform for new features such as media encryption. Dante Pro S1 offers an alternative to the Ultimo chip for OEMs building low channel count devices. New designs are encouraged to use Dante Pro S1 as it will provide future features and capabilities that may not be supported by Ultimo-based products.


Dante Pro S1 is designed for compact and power-efficient audio equipment solutions requiring Dante network interoperability. The chip is a versatile TFBGA100 package (8mm x 8mm) and requires minimal external components to complete system integration. The underlying microcontroller is more power efficient than previous Dante solutions, resulting in low heat dissipation for space-constrained equipment designs.


Dante Pro S1 is offered in multiple package configurations to meet the exacting requirements of OEMs. The configurations announced support up to 2×2 channels at 96kHz or up to 4×4 channels at 48kHz audio encoding. An evaluation kit is available for equipment manufacturers to prototype system designs using Dante Pro S1. The kit is based on the Dante Pro S1 reference design and includes XLR connectors for analogue I/O and

swappable network interface modules for a range of PHY and switch ICs. A full suite of technical documentation is also available. Visit https://www.audinate.com/pros1





Tieline Dante Card for Gateway and Gateway 4 codecs CODEC MANUFACTURER Tieline has announced that customers can now order its Gateway or Gateway 4 codec with optional Dante card fitted, delivering compatibility with Dante devices. Dante Controller software facilitates simple stream management, as well as discovery of devices and streams. Gateway and Gateway 4 codecs include native support for AES67, ST 2110-30, ST2022-7, AMWA NMOS IS-04 and IS-05, Ember+, RAVENNA and Livewire+. An optional WheatNet-IP card, or the new Dante card, can also be installed. Tieline codecs specialise in streaming low latency, high quality audio over the internet using a range of wired and wireless IP transports. They are suitable for STLs, network audio distribution, and multiple remote broadcasts. Interoperability between multiple AoIP protocols delivers greater flexibility when integrating IP audio streams into the broadcast plant from a range of sources. Tieline MPX II Codecs Shipping Tieline has also announced that its new MPX II codec will ship from January 2024. Tieline

MPX codecs deliver composite FM multiplex (MPX) codec solutions for real-time network distribution of FM-MPX or compressed MicroMPX (µMPX) signals to transmitter sites. The MPX II can effortlessly transport two discrete composite FM-MPX signals from the studio to transmitters with return monitoring in real-time. The codec supports analog MPX on BNC, MPX over AES192, and multipoint signal distribution, to deliver a wide range of flexible composite encoder and decoder solutions for unique applications. “The MPX II streamlines FM radio broadcasting by allowing broadcasters to maintain baseband audio processing and RDS data insertion at the studio,” said Charlie Gawley, VP Sales APAC & EMEA. “This slashes capital and operational costs by eliminating equipment from transmitter sites. And support for both

analogue and digital composite MPX signals allows broadcasters to transition from analog to digital exciters over time and maximise the value of their MPX investment.” The MPX II supports sending the full uncompressed FM signal, or high quality compressed µMPX at significantly lower bit-rates. An optional satellite tuner card with MPEG-TS and MPE support can receive DVB-S or DVB-S2 signals. Composite MPX over IP signals can be easily replicated and distributed using multicast and multi-unicast technologies, and take advantage of rock-solid redundancy features like redundant streaming, RIST, FEC, and automated SD card file failover. A shipping date for the MPX I codec will be announced in the near future. Visit https://www.tieline.com/

PRODIGY.MX Multiformat Audio Matrix
DIRECTOUT, the provider of high-quality audio networking, converting, and processing solutions, is shipping the PRODIGY.MX Multiformat Audio Matrix.

PRODIGY.MX offers up to 1664 inputs and 1668 outputs, granularly handled by a single-channel routing matrix. This compact 2RU device converts, connects, or even isolates various digital audio formats, including MADI, Dante, RAVENNA/AES67, SoundGrid, and AVB/MILAN. Optional HD SRC technology bridges different clock domains seamlessly. With this versatile internal structure, PRODIGY.MX excels as a powerful infrastructure device that enables any signal flow while acting as an "audio firewall" between independent audio networks, if needed.

The introduction of PRODIGY.MX has also brought a complementary globcon update. Enhancements include new Factory Templates designed for PRODIGY.MP, streamlining and easing deployment in Festival and Touring applications. These templates, when loaded via globcon, automate the setup of internal DSP routing, Layouts, and Snapshots configurations on the PRODIGY.MP device. This time-saving feature allows users to focus solely on physical input and output patching and PA tuning.

In addition, the entire PRODIGY series has received a maintenance update. The new system build includes customer requests such as zeroConf (link-local address support) for the management interface, AES output dark to disable the AES output signal for downstream failover detection, a MIDI tunnel for SG.IO, and more. Visit https://www.directout.eu/ and https://qvest.com/au-en/

Shure Expands SLX-D Digital Wireless Family
SHURE, IN PARTNERSHIP with Jands, the exclusive distributor of Shure products in Australia, has announced the expansion of the SLX-D Digital Wireless family with the introduction of SLX-D Portable Systems, including the new SLXD5 Portable Digital Wireless Receiver and SLXD3 Plug-On Digital Wireless Transmitter.

SLX-D Portables are designed to deliver the scalability, high-performance wireless, superior digital audio, reliable RF performance, and convenient power management of SLX-D in new, durable form factors designed specifically for the film, electronic newsgathering (ENG), broadcast, and video industries.

The SLXD5 Portable Digital Wireless Receiver is an SLX-D wireless receiver in a flexible, miniaturised form factor and can be installed on-camera through the provided cold shoe mount as well as in an audio bag, enabling full-featured use in any location. With IR Sync, users can easily pair receivers to transmitters for instant single-channel configuration. SLXD5 offers Multi-Mic Mode, which facilitates management and monitoring of multiple sound sources from a single receiver.

The SLXD3 Plug-On Digital Wireless Transmitter transforms any XLR source into a wireless one, including dynamic and condenser microphones. The SLXD3 provides phantom power and is ideal for wireless transmission from boom-mounted shotgun mics. Paired with the SM63L or SM58 (available separately), SLXD3 is suitable as an interview microphone.

With SLX-D Portables, users will not be required to change the gain setting on their transmitter, saving time and simplifying the path to crystal-clear audio. The portable systems achieve this with 24-bit digital audio and >118 dB dynamic range, providing the award-winning sound experience of the SLX-D Wireless platform to mobile productions. SLX-D Portables provide assurance with high spectral efficiency and UHF-band RF performance - 32 channels per 44 MHz band and a 330-foot (100-metre) transmission range.

With quick, out-of-the-box installation and intuitive setup features, SLX-D Portables enable users to start recording immediately. The SLXD5 Portable Digital Receiver automatically scans for the best frequency and pairs it to the transmitter with IR Sync in seconds. Both SLXD3 and SLXD5 feature a high-luminosity OLED screen where users can monitor battery life, audio and RF levels, and frequency tuning. The SLX-D Setup Guide offers easy access to additional education and instructional content from any mobile device.

SLX-D Portables are also compatible with the full ecosystem of rechargeable Shure accessories, including the SB903 Lithium-Ion (Li-ion) battery and accompanying charging bays. Both SLXD5 and SLXD3 can be powered and charged by USB-C. Available separately, the SBC-DC-903 battery eliminator integrates with SLXD5 for use with mobile power distribution setups. Visit https://www.jands.com.au/



Neumann’s MT 48 Now an Immersive Audio Interface
NEUMANN.BERLIN has released a feature upgrade for the MT 48 audio interface. The Monitor Mission transforms the MT 48 into a freely configurable monitoring controller. In addition to mono and stereo, the Monitor Mission can also handle surround formats such as 5.1 as well as highly sought-after immersive audio formats such as Dolby Atmos 7.1.4. The Monitor Mission includes flexible bass management and complex alignment functions to adjust the frequency and time domain characteristics of loudspeakers to the listening position. The downmix feature allows multichannel audio to be monitored in either mono or stereo.

The range of connectivity options makes it possible to connect studio monitors in a variety of ways, whether analogue or digital via S/PDIF or ADAT. It is even possible to combine several connection types for one speaker group. Using the four analogue monitor outs and an external ADAT converter, systems of up to 7.1.4 can be realised. A built-in AES67 interface also allows the connection of professional multi-channel converters such as the Merging Hapi MKII and/or Neumann studio monitors in the AES67 version.

The Monitor Mission does not require a download but is part of the MT 48 firmware from version 1.6.x, which will be released in February. The Monitor Mission is activated via an individual software key. Users who register their MT 48 with the manufacturer before July 1, 2024, will receive their activation key for the Monitor Mission free of charge. Visit https://bit.ly/3Uij7rN

Shure ADX3 Plug-On Transmitter with ShowLink Technology
SHURE, IN PARTNERSHIP with Jands, has introduced the Axient Digital ADX3 plug-on transmitter, the latest addition to the Shure Axient Digital Wireless System. ADX3 is suitable for audio professionals in broadcast television and location sound who are seeking an industry-standard transmitter that enables real-time remote control of key parameters.

ADX3 transforms any XLR-terminated microphone into an advanced, portable Axient Digital ADX Series wireless microphone, delivering the same transparent audio quality, RF performance and wide tuning of the AD3, with the addition of Shure’s proprietary ShowLink technology. ShowLink allows comprehensive, real-time control of all transmitter parameters, including interference avoidance, over a robust 2.4 GHz diversity wireless connection - directly from the convenience of the receiver.

The ADX3’s compatibility with Wireless Workbench supports efficient control and configuration, optimal spectrum management, and frequency coordination. With real-time monitoring via the AD600 Spectrum Manager, sound engineers can neutralise RF interference by manually switching the signal to a clear backup frequency, or by programming the system to do it automatically. Additionally, the ADX3 seamlessly integrates with Axient Digital AD4D and AD4Q rack receivers as well as the ADX5D Dual-Channel Portable Receiver.

The ADX3’s selectable modulation modes optimise performance for spectral efficiency. Users can select High Density mode to dramatically increase their maximum system channel count or run ADX3 in standard mode for optimal, low-latency coverage. Moreover, the ADX3 features automatic input staging and equips users with AES 256-bit encryption for secure transmission over a line-of-sight operating range of 300 feet (100 metres).

The ADX3 ships with two SB900 lithium-ion rechargeable batteries, with each battery providing up to six and a half hours of continuous use, precision metering, and zero memory effect. The SB900 battery can be charged over a USB-C port. Alternatively, the transmitter can be powered externally via USB-C or by two AA batteries. Visit https://www.jands.com.au/

Dual TX for RØDE Wireless ME Microphone
RØDE HAS ANNOUNCED a dual transmitter version of the Wireless ME “grab-and-go” wearable wireless mic for content creators. The Wireless ME will be released in a limited-edition dual set, featuring two transmitters and one receiver, available in either black or white.

The Wireless ME combines the same sound quality and features that RØDE wireless microphones are known for with ease of use. In addition to its Series IV 2.4 GHz digital transmission and universal compatibility with cameras, smartphones and computers, the Wireless ME features RØDE’s GainAssist technology, which uses intelligent algorithms to automatically control audio levels on the fly, ensuring pristine audio in almost any recording application. This delivers professional sound quality with no setup or tweaking required. It is also equipped with a mic built into the receiver for recording audio from in front of and behind the camera simultaneously, making it suitable for capturing interviews or voiceovers for video content.

The dual Wireless ME allows seamless recording of two subjects simultaneously, or even three when the built-in receiver mic is activated. The option to get the Wireless ME in white instead of the original black also makes it suitable for creators who are looking for a discreet solution when working with talent who typically wear lighter coloured clothing. Visit https://rode.com/en

Voice Enhancement for Podcasters
NATIVE INSTRUMENTS has announced iZotope VEA, a new AI-powered Voice Enhancement Assistant for podcasters and content creators. VEA features AI technology that listens first, then enhances audio so creators can feel more confident with their voice and deliver better sounding content. VEA increases clarity, sets more consistent levels, and reduces background noise on any voice recording. VEA features three simple controls that are intelligently set by iZotope’s AI technology. Those that are more familiar with editing vocal recordings will find a new way to finish productions quickly by consolidating their effects chains and saving on CPU.

A Shape control ensures audio sounds professional and audience-ready, without having to worry about an EQ. Shape is tailored to each voice, and matches the sound of top creators or podcasts with the free iZotope Audiolens tool.

A Clean control takes background noise out of the spotlight so every voice can shine. VEA learns the noise in the room automatically and preserves speech for light, transparent noise reduction.

A Boost control adds loudness and compression as it’s turned up. Easily boost the presence and power of voice recordings without spending time struggling with settings. Boost delivers a smooth and even sound to speech for a more engaging listening experience.

VEA works as a plugin within major digital audio workstations (DAWs) and non-linear editors (NLEs). Visit https://www.izotope.com/vea/ and https://www.native-instruments.com/en/
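
For readers curious what a level-boosting control of this kind is doing in broad terms, the sketch below is a minimal, hypothetical Python illustration of the general idea: measure a voice clip’s RMS level and apply make-up gain toward a target. It is not iZotope’s algorithm; the -20 dBFS target, the test tone and the hard clip standing in for a real limiter are assumptions made purely for illustration.

```python
import numpy as np

def boost_to_target(samples: np.ndarray, target_rms_db: float = -20.0) -> np.ndarray:
    """Lift a mono float clip (values in [-1, 1]) toward a target RMS level in dBFS."""
    rms = np.sqrt(np.mean(samples ** 2))
    current_db = 20 * np.log10(max(rms, 1e-9))          # avoid log of zero on silence
    gain = 10 ** ((target_rms_db - current_db) / 20)    # linear make-up gain
    # A product-grade Boost would use compression/limiting; hard clipping stands in here.
    return np.clip(samples * gain, -1.0, 1.0)

# Quiet 1 kHz test tone at 48 kHz, lifted to roughly -20 dBFS RMS
t = np.linspace(0, 1, 48000, endpoint=False)
quiet = 0.02 * np.sin(2 * np.pi * 1000 * t)
boosted = boost_to_target(quiet)
print(round(20 * np.log10(np.sqrt(np.mean(boosted ** 2))), 1))  # prints about -20.0
```

The point of the sketch is simply that a single “amount” control can hide the gain-staging arithmetic from the user, which is the convenience VEA is selling.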


CONTENT DELIVERY Terrestrial, Mobile, Broadband

Ross Video to Adopt ‘NDI Advanced’
ROSS VIDEO AND NDI, the connectivity tech company, have announced a strategic partnership for Ross to broadly licence and adopt the most advanced NDI technology across its portfolio, extending and enhancing existing support for the NDI standard. This collaboration will allow Ross to incorporate the most up-to-date AV networking features of NDI across its portfolio through NDI Advanced, ensuring enhanced video connectivity, flexibility, and workflow efficiency across the ecosystem.

“Ross Video has supported the NDI standard for many years and is excited to be taking this next step in supporting enhanced versions of NDI, NDI Advanced, and the evolution of NDI connectivity,” commented Jeff Moore, Executive Vice President and Chief Marketing Officer at Ross Video. “We’ve found NDI incredibly useful and look forward to being on the forefront as things advance.”

“NDI is excited to have Ross Video on board as a flagship adopter of NDI Advanced,” said Nick Mariette, Director of Product Management at NDI. “A cornerstone company in the broadcasting industry adopting our most advanced technology and formats is the best demonstration of how NDI has grown to become a pervasive industry standard, capable of performing to the benchmark set by leaders in the segment.”

NDI, a proprietary connectivity standard, supports a range of video codecs and enables seamless interoperability for devices and software across standard IP network infrastructures. NDI Advanced offers product developers all the benefits of NDI technology, like seamless device discoverability, bi-directional remote control, and embedded metadata streaming, but also unlocks an additional set of features for enhanced video connectivity, including:
• Sending and receiving all NDI formats, from NDI High Bandwidth to NDI HX3, the most advanced format yet, which allows for visually lossless, low-latency video with minimal bitrates.
• Access to the NDI Certified program, enabling products to be certified by the NDI team for guaranteed interoperability, performance, and reliability.
• Custom-packaged SDKs for hardware or software, including reference designs for major FPGA models from Intel or AMD.
• Extended personalisation of connection settings for each sender, finder, and receiver.
• AV Sync, Genlock APIs, and KVM support.

Ross Video will incorporate NDI Advanced features broadly across its portfolio, including the following product lines: XPression graphics solution; Ultrix hyperconverged platform; PTZ broadcast video cameras; Media I/O capture and playout solution; softGear Streaming Gateway; Vision[Ai]ry Ft facial tracking and recognition solution; PIERO sports graphics analysis system; and the DashBoard production and facility control solution. Visit https://www.rossvideo.com/

Cinegy-powered Appliance for Capture or Playout
CINEGY GMBH, the provider of software technology for digital video processing, asset management, video compression, automation, and playout, has worked with its French channel partner Broadcast Architech to create XGESTORE, a configurable, all-in-one, ready-to-roll hub for broadcast and streaming operations. Depending upon configuration, XGESTORE can provide a powerful capture server or a broadcast playout device.

The device combines powerful and compact hardware, including a customisable local SSD RAID in a 3U rack, with industry-leading software including Cinegy Capture for ingest and management, Cinegy Air for automated playout control, and Cinegy Multiviewer for monitoring. It supports resolutions up to 8K Ultra HD, including HDR; all the popular production and delivery codecs; and I/O as SDI, SMPTE ST 2110, NDI and SRT.

Broadcast Architech designed the system to be flexible, so it can be configured for multiple applications. As an ingest server, for example, it can capture an unlimited number of channels simultaneously, generating high-quality proxies on ingest for immediate distribution. The on-board SSD provides high-capacity storage, and XGESTORE can be connected directly to additional NAS storage such as XSTREAMSTORE, over ethernet to external capacity, or even to USB disks and sticks.

The OBCast version of XGESTORE plays out during ingest for live and delayed broadcasting and incorporates multiple dynamic graphics layers for applications like sport as well as branding. The Cinegy Air software includes complete playlist management, including remote scheduling and operation, and the system is designed for non-stop operation where required. Visit https://broadcast-architech.com/xgestore and https://www.cinegy.com/

DASH-IF Conformance Tool Update
AN UPDATE to the DASH-IF Conformance Tool is now available. The tool validates the conformance of DASH content to relevant media specifications. It is available as either an online service or open-source software.

DASH is a streaming technology that allows receivers to automatically choose the best quality video stream based on the device’s capabilities and the speed of the internet connection, allowing viewers to enjoy the best possible video quality without buffering or interruptions. While the MPEG DASH (Dynamic Adaptive Streaming over HTTP) specifications describe client behaviour, conformance of DASH content is important for interoperability.

The Conformance Tool was launched by the DASH Industry Forum over a decade ago and has been continuously updated and extended to test against relevant specifications from other bodies. A joint project undertaken over the past two years has improved the tool, making it more reliable and accessible. Along with DASH (including DVB-DASH) content conformance, the new tool has options to check for CMAF and WAVE requirements, and for HLS (HTTP Live Streaming) requirements as well.

The updates to the tool were funded by the JCCP (Joint Content Conformance Partners), a group of five organisations including the DVB Project, DASH-IF, the Consumer Technology Association WAVE (Web Application Video Ecosystem) Project, the HbbTV Association, and ATSC. Visit https://conformance.dashif.org/
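
As a rough illustration of the kind of structural check a conformance tool automates (and emphatically not the DASH-IF tool’s own code or API), the Python sketch below fetches an MPD manifest and applies a few basic sanity checks drawn from the DASH specification, such as the mandatory MPD@profiles attribute and the presence of at least one Period. The manifest URL is a placeholder assumption.

```python
import urllib.request
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"  # MPD namespace defined by ISO/IEC 23009-1

def basic_mpd_checks(url):
    """Return a list of human-readable issues for a few structural MPD rules."""
    issues = []
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    if root.tag != "{%s}MPD" % MPD_NS:
        issues.append("root element is not MPD in the DASH namespace")
    if "profiles" not in root.attrib:
        issues.append("mandatory MPD@profiles attribute is missing")
    if not root.findall("{%s}Period" % MPD_NS):
        issues.append("manifest contains no Period elements")
    return issues

# Placeholder URL - point this at a real manifest to try it out.
for problem in basic_mpd_checks("https://example.com/stream/manifest.mpd"):
    print("ISSUE:", problem)
```

The real tool goes far deeper, validating segments, timing and profile-specific rules across DASH, DVB-DASH, CMAF, WAVE and HLS; the sketch is only meant to show why automated manifest checking matters for interoperability.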




Venera CapMate Caption/Subtitle QC Now on AWS
VENERA TECHNOLOGIES, a provider of audio, video, and caption QC solutions for the media and entertainment industry, has announced the availability of CapMate, its advanced cloud-native caption and subtitle QC and correction solution, on the AWS Marketplace.

CapMate, with its machine-learning engine and advanced verification algorithms, offers a long list of caption QC capabilities, including detection of audio/caption sync issues, caption overlap with burnt-in text, missing captions, overlapping captions, spelling errors, and profanity. These capabilities are provided for a large number of frequently used languages such as English, German, French, Spanish, Italian, Portuguese, and many more.

CapMate not only detects a variety of caption/subtitle issues but also provides an intuitive interface to review the results, automatically or manually correct the issues, and generate a new caption file. Offering CapMate on the AWS Marketplace provides other benefits to users, including simplified procurement, unified billing, and built-in security. Visit https://www.veneratech.com/
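
To illustrate just one class of check described above (and not Venera’s engine or API), here is a minimal Python sketch that parses the cue timings in an SRT subtitle file and flags captions whose display windows overlap. The file name is a placeholder, and a production QC tool would of course cover sync, spelling, profanity and far more.

```python
import re
from datetime import timedelta

# Matches SRT timing lines such as "00:01:02,500 --> 00:01:04,000"
TIME_RE = re.compile(r"(\d+):(\d+):(\d+),(\d+)\s*-->\s*(\d+):(\d+):(\d+),(\d+)")

def parse_srt_times(path):
    """Return (start, end) timedeltas for every cue in an SRT file."""
    cues = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            m = TIME_RE.match(line.strip())
            if m:
                h1, m1, s1, ms1, h2, m2, s2, ms2 = map(int, m.groups())
                cues.append((timedelta(hours=h1, minutes=m1, seconds=s1, milliseconds=ms1),
                             timedelta(hours=h2, minutes=m2, seconds=s2, milliseconds=ms2)))
    return cues

def find_overlaps(cues):
    """Index pairs of consecutive cues whose display windows overlap."""
    return [(i, i + 1) for i in range(len(cues) - 1) if cues[i][1] > cues[i + 1][0]]

# "episode_01.srt" is a placeholder file name for illustration.
overlaps = find_overlaps(parse_srt_times("episode_01.srt"))
print(f"{len(overlaps)} overlapping cue pair(s):", overlaps)
```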

Avid | Stream IO Production Ingest & Playout Solution Updates
AVID HAS ANNOUNCED enhancements to its Avid | Stream IO software subscription platform for broadcast production ingest and playout, including additional channel count, playlist building, closed captioning and new codec support.

To improve performance, Avid | Stream IO’s channel count has doubled, with a single system now supporting up to eight channels and the flexibility to configure inputs and outputs freely. This is in addition to the existing four-channel configuration available since launch. Enabling eight channels in a single solution delivers greater value to customers by reducing the overall cost per channel. Alongside the increased channel count, content teams are now able to build media playlists ready for playback during production directly from the Avid | Stream IO web remote console.

Ideal for manual operation, this also offers a solution for business continuity by providing a back-up should studio playback automation be unavailable for any reason, meaning shows can stay on air in the event of such a failure. With this release, Avid | Stream IO now supports closed captioning on playout, giving teams the option to preserve and include closed captions on both ingest and playout - an essential content accessibility requirement. Increased codec support is also now available, with AVC LongG 12 and 25 formats added.

Avid | Stream IO runs on COTS hardware and is available through subscription. Together with the Avid FastServe family of video server solutions, Avid | Stream IO provides a range of options for media content producers seeking to upgrade from their existing Avid AirSpeed systems.

With a flexible architecture that can be configured to ingest or play out SDI streams today and, in the future, various IP-based formats, Avid | Stream IO allows production teams to migrate from legacy workflows and on-premises deployments to cloud and IP workflows at their own pace. It also allows them to increase efficiency by combining different ingest sources in a single configuration. Ideal for live entertainment and multi-camera productions, Avid | Stream IO supports all common production formats, including SDI. Support for compressed IP streams (SRT/RTMP) is coming soon, while NDI and then SMPTE ST 2110 will follow later next year. Visit https://www.avid.com/products/avidstream-io

Amagi and Nevion Simplify Pop-Up Channel Creation
AMAGI, THE PROVIDER of cloud-based SaaS technology for broadcast and connected TV (CTV), has announced a partnership with Nevion, a Sony Group company and provider of virtualised media production solutions, to enable broadcasters to be more flexible, swift, and cost-effective in creating and delivering live content to their viewers, especially premium, high-quality, and time-limited events like sports.

This Amagi-Nevion partnership brings together the cloud-based master control room (MCR) functionality of Amagi CLOUDPORT and the orchestration capabilities of Nevion VideoIPath. It makes it easy for production teams to integrate cloud-based MCR capabilities into ground-based production workflows without relying on technical specialists.

The combined solution allows production teams, wherever they are located, to spin up and tear down pop-up channels in the cloud and use them seamlessly in combination with their ground production, overcoming the latency issues typically associated with such hybrid workflows.

Amagi CLOUDPORT’s newest support for VSF TR-07 (JPEG XS over TS) enables high-quality, ultra-low latency video transportation between the ground and the cloud as if they were in the same truck or facility, a key requirement for premium live production and a superior end-viewer experience. The integration with VideoIPath, a key component of Sony’s Networked Live offering, further simplifies the workflow (including connectivity) for production and master control operators, who can now seamlessly synchronise on-screen graphics, digital video effects, and commercial breaks with minimal retraining effort.

By putting the production team in control of hybrid workflows, the solution optimises operational efficiency and significantly reduces costs. Visit http://www.nevion.com/ and http://www.amagi.com/

Norigin Media Launches CTV White-label FAST Apps
STREAMING TV UI/UX specialist company Norigin Media has announced a brandable Connected TV product for Linear or FAST TV channels. The White-label solution is offered to content owners who want to effortlessly launch direct-to-consumer (D2C) applications across platforms including Samsung Tizen, LG webOS, and other Android TV OS based Smart TVs.

The newly designed White-label CTV Apps will enable FAST or Linear TV channel providers to launch their D2C apps directly on the App Stores of leading device manufacturers. Consumers will be able to access content within a single brandable and turnkey application across a range of Smart TVs. The solution’s ready-made plug-ins also support integrations towards leading ad-servers to maximise ad revenue. Traditional Linear TV providers will also benefit from using this product from Norigin Media to launch services specifically targeting growing CTV audiences.

The React.js-based (a modern web technology) TV app will feature a new and privately researched UI/UX that is intended to enhance consumer viewing experiences. The Live TV Player UI will showcase an innovative metadata-rich channel zapping feature, and the Electronic Program Guide (EPG) will facilitate smooth navigation for consumers looking to discover content across channels. The entire application is brandable, with the ability to change colours, logos, and icons, while offering custom integrations for authentication and video playback formats, as well as a multitude of ad-server integrations including Google Ad Manager, Freewheel, Invidi and others. Visit https://noriginmedia.com/


INDUSTRY FOCUS

[Photo galleries: CONTENT LIVE 2023, DIGISTOR XMAS and SMPTE XMAS functions. See more photos at www.facebook.com/phil.sandberg]

Get ready for NAB 2024 by visiting the Magna stand at METexpo24

Whether you are going to NAB or viewing it from afar, you can get a heads-up on the latest trends, products and what to expect at the Las Vegas show in April by visiting the Magna stand in Sydney at METexpo 2024 in March. At METexpo 2024, Magna will be demonstrating Actus Digital’s Intelligent Monitoring Platform, manifold CLOUD and BLADE//runner live broadcast production software and COTS FPGA hardware from Arkona Technologies, the Chyron LIVE® end-to-end, cloud-native live video production platform, Enensys digital media distribution chain solutions, the unique IPconneX contribution, distribution and delivery solution, Panasonic 4K and IP PTZ cameras and switchers, Pebble playout automation, RTS intercoms, Synamedia video experience solutions and TAG IP monitoring.

The Magna Systems team, along with product specialists from Actus Digital, Arkona Technologies, Chyron, Enensys, IPconneX, Panasonic Connect, Pebble, Synamedia and TAG, will be on METexpo stands 46-49 in the Kensington Room at Royal Randwick Racecourse from 5-7 March 2024. To book a demo for any of the above, email sales.au@magnasys.tv or call +61 2 9417 1111 today.


