SMPTE Committee Tackles the Circle of Confusion; Vizrt PTZ3 PLUS Cameras for BroadcastAV; Blackmagic Camera for Android App; Canon’s EOS R1 and EOS R5 Mark II Mirrorless Cameras.
Paris 2024: The Rise of Remote Production in Games Broadcasts; EMG/Gravity Media Deliver Advanced Workflows for EURO 2024.
BBC and TVU Redefine Mobile Coverage for UK Election; Reuters Selects Sony Cameras for Worldwide Video Group; Quicklink StudioPro 4K Production Platform; Vizrt Beefs Up Newsroom Security.
20
Colour Grading Mad Max: Furiosa.
Run Rabbit, Run - ex-FM Morning Host Goes on the Road with Podcast Van; Pathfinder from Telos; ALIVE RADIO Goes Outdoors with AEQ. 28
Nine Network Deploys Black Box’s Emerald IP KVM; World-First Demo of STL Using IP/PTP Over Microwave; Viz Connect Tetra Native NDI Bridge Multichannel Video Conversion Tool; Bridge Technologies Enhances Probe MicroBursting Analysis; Mediaproxy LogServer Integrates AIMedia’s LEXI; Veset AdWise Launched in AWS Marketplace.
POSITIONS VACANT 30
Things are Speeding Up for FAST
By Phil Sandberg
THE CONNECTED TV has expanded content delivery options for traditional and new media players alike. Whether this takes the form of extending an established brand, or introducing specialised content to a wider audience, the connected TV is the conduit that opens up audiences beyond those possible with free-to-air radio frequency transmission.
Enter FAST (Free Ad-Supported Streaming Television). Unlike its IP-delivered siblings, VOD and catch-up TV, FAST can enable a linear viewing experience AND targeted ad delivery.
It is no surprise then that traditional broadcasters, those with the most experience in delivering linear programming, have embraced FAST to broaden their reach.
In its recently released 12th Global FAST Report, Amagi, the provider of cloud-based SaaS technology for broadcast and connected TV (CTV), found that FAST delivers a significant impact to the business of TV networks and D2C streamers. The report found that TV networks account for 30% of the top 100 FAST channels, driving a 40% share in total Hours of Viewing (HOV) across these channels. Additionally, among Amagi-delivered FAST channels, global HOV (31%) and ad impressions (26%) continued to show double-digit growth during the comparison period (Q2 2024 versus Q2 2023). The company says these findings demonstrate the significant role that FAST channels play in the rapidly evolving streaming landscape, and FAST’s ability to adapt and thrive in the face of new technologies and viewing habits.
“The latest edition of our report highlights the substantial role FAST plays in the streaming ecosystem,” says Srinivasan KA, Co-founder and Chief Revenue Officer of Amagi. “With TV Networks accounting for a substantial portion of the top channels and viewing hours, it’s clear that FAST provides a great avenue for traditional media companies to draw additional revenues in the digital age. This report also underscores the ongoing double-digit growth in global HOV and ad impressions, demonstrating the robust health of the FAST market.”
Key takeaways from the 12th Amagi Global FAST Report (Data based on Amagi ANALYTICS and FAST channels that run on Amagi’s platform) include:
Consumers Increasingly Comfortable With Exploring FAST Offerings: 75% of Amagi’s Consumer Survey respondents indicated they would create a free profile on a streaming service to sample FAST channels, and more than half would enter their credit card information.
FAST Sees Continued Growth Among Key Metrics Worldwide: Global HOV (31%) and ad impressions (26%) continue to show double-digit growth.
Growth of Broadcaster-Owned Channels: The total number of broadcaster-owned channels within FAST increased by approximately 2.5 times.
Increase of FAST Channels Within O&O Apps: The total number of FAST channels within O&O apps increased by almost 50%.
Significance of Single IP Channels in FAST: More than 25% of entertainment channels are Single IP channels, driving more than 33% of HOV within the genre.
Visit https://www.amagi.com/resources/fastreport
Meanwhile in Australia, a December 2023 study undertaken by the Australian Communications and Media Authority, “How we watch and listen to content”, found that among free-to-air catch-up TV and streaming services, ABC iview dominated the free-to-air catch-up TV market with 58% viewership. 7plus and SBS On Demand were the second most popular services at 41% each.
It also found that Smart/Connected TVs have continued to grow in popularity in Australian households, with 78% of adults owning a smart TV (up from 73% in 2022) and that as smart TV adoption has increased, standard TV ownership has fallen to 30% (from 35% in 2022).
A Matter of Survival - the Philippines’ ABS-CBN
While delivery to connected TVs is mostly a business decision, there are some media companies for whom it has been a lifeline. Following months of licence renewal hold-ups, legislative roadblocks and an alleged run-in with then President Rodrigo Duterte, May 4, 2020, saw Philippines’ media company ABS-CBN hit with an immediate shutdown order by the country’s National Telecommunications Commission (NTC).
With 42 television stations, 10 digital broadcast channels, 18 FM stations, 5 AM stations, and 11,000 jobs at stake, ABS-CBN turned to Internet delivery, launching the “Kapamilya Online Live” service via the ABS-CBN Entertainment YouTube channel. The service offered livestreaming and on-demand viewing of the latest episodes of various ABS-CBN shows for viewers inside and outside the Philippines.
Now, in 2024, the ABS-CBN Entertainment YouTube channel boasts 48 million subscribers and is the number one YouTube channel in Southeast Asia in the media and entertainment category.
Interestingly, the company says, “More and more Filipinos are glued to their TV screens to watch the YouTube livestream of ABS-CBN’s shows where watch time hours have increased by an average of 85% yearly since 2020.”
Viewing time of Filipinos consuming ABS-CBN shows on ABS-CBN Entertainment’s YouTube channel using their smart TVs recently hit 554.9 million hours, versus 549.6 million watch time hours for mobile phones, the dominant device platform for video delivery over recent years.
ABS-CBN says Kapamilya Online Live, which livestreams ABS-CBN’s programs 24/7 in Asia, Australia, Europe, and New Zealand, accounts for 61% (or 539.8 million live viewers) of those watching ABS-CBN content on YouTube using Smart TVs.
“While regional differences remain in the types of content found on FASTs, the concept of free linear channels as a popular delivery mechanism is a universal phenomenon,” said Alan Wolk, Co-Founder/Lead Analyst at TVREV, who authored the introductory note of the Amagi 12th Global FAST Report.
And while mobile devices dominate snackable, short-form delivery, the Connected TV, for now at least, is a fertile platform for those whose expertise is steeped in delivering curated, linear, long-form content.
Thanks for reading,
Phil Sandberg – Editor/Publisher papers@broadcastpapers.com
+61 (0)414 671 811
SEPTEMBER/OCTOBER 2024
Editorial Submissions: 26-08-24
Ad Bookings: 28-08-24
Ad Artwork: 03-09-24
For more information www.content-technology.com +61-(0)414 671 811
Email: papers@broadcastpapers.com
The 2024 ABE Exhibition, Conference & Workshops are being held at Sydney’s raciest venue, the Royal Randwick Racecourse from October 22-24. Hear from industry experts on hot topics such as
NEWS + PEOPLE
NEP Boosts Remote Production with Sony
NEP GROUP has acquired multiple Sony NXL-ME80 media edge processors, the first product to include Sony’s new HEVC Ultra Low Latency encoding technology. The technology combines a high compression ratio and low latency while maintaining high picture quality for remote and distributed live productions. The processors were recently deployed on remote production projects across New Zealand.
As NEP New Zealand Technology Manager Sam Scally explains, “The NXL-ME80 media edge processors are high-quality, multi-channel encoders utilising Sony’s ultra-low latency compression which makes them ideal for live production applications. They work with both ST-2110 and SDI interfaces and we are successfully using them on a current remote hub project.”
For the project, NEP is transferring 16 streams from its main live cameras using two NXL-ME80 units on diverse 5Gb links, utilising ST 2022-7 seamless protection at 45Mb per stream. The NXL-ME80 units are managed from the NEP hub at both ends.
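Those figures can be sanity-checked with some quick arithmetic. A minimal sketch follows; the per-stream rate and link capacity come from the description above, while the duplication behaviour (every packet sent on both diverse links) is a general property of ST 2022-7, not a Sony specification:

```python
# Back-of-envelope bandwidth check for the remote hub workflow described above.
# Figures from the article: 16 streams at 45 Mb/s each, on diverse 5 Gb/s links.

STREAMS = 16
RATE_MBPS = 45             # per-stream HEVC encode rate
LINK_CAPACITY_MBPS = 5000  # each diverse link

total_mbps = STREAMS * RATE_MBPS  # aggregate contribution bandwidth

# With ST 2022-7 seamless protection, every packet is carried on BOTH
# links, so each link bears the full aggregate rather than half of it.
per_link_mbps = total_mbps
utilisation = per_link_mbps / LINK_CAPACITY_MBPS

print(f"Aggregate: {total_mbps} Mb/s per link, utilisation: {utilisation:.0%}")
```

The headroom this leaves on each 5Gb link is what makes room for comms, control and return feeds alongside the programme video.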
Scally added, “The NXL-ME80s form part of a remote production system where we have 13 cricket venues around the country connected back to our Auckland CBD hub facility. We operate on links as low as 1Gb/s.”
The Sony NXL-ME80 media edge processor is part of the Sony Networked Live ecosystem which optimises live production by flexibly utilising on-premise, cloud, hardware or software resources to meet specific requirements. Acting as a gateway between LAN and WAN, the NXL-ME80’s advanced technology achieves ultra-low latency, high picture quality and low bit rate, resulting in efficient network bandwidth usage for remote production and video contribution, reducing network costs for transmission across multiple locations.
The NXL-ME80 enables NEP to reduce network costs for remote production and transmission across multiple locations while enhancing its live production capabilities.
Scally added, “The NXL-ME80 also supports multiple streams to cater to individual live production requirements efficiently. With the capability to convert and compress multiple video channels, content transmission with high picture quality is achievable, even with limited network bandwidth.”
The NXL-ME80 can handle two channels in 4K mode and up to eight channels in HD mode. The selectable SDI or ST 2110 interface on the LAN side also simplifies installation.
The NXL-ME80 is ideal for remote production, enabling the connection of a camera to a venue’s control room. Multiple cameras situated at a venue can also transmit high-quality, ultra-low-latency and low bit rate video to a remote station, thus making remote production more accessible.
When selecting the NXL-ME80, Scally explained that NEP’s requirements were essentially for a high-quality encoder and decoder that could send signals across a small network pipe while working with a traditional broadcast environment using both SDI and 2110.
He said, “We have been very impressed with the quality of the NXL-ME80, as they enable us to transmit high quality images from remote locations with relatively low bandwidth.”
NEP is using its new NXL-ME80 units in a number of different environments, one of which is covering international and national cricket matches remotely.
Sam Scally concluded, “We are sending signals from cameras and third-party providers capturing cricket around the country with SDI out of our existing trucks. They are sent to an NXL-ME80 in our Auckland hub where they are decoded to SDI, then encapsulated to 2110. We are using the ME80s in conjunction with a Sony MLS-X1 switcher. By using this workflow, the NXL-ME80 gives us excellent picture quality and they have been very reliable.”
NEP Group Expands Sony MLS-X1 Live Switcher Inventory
Meanwhile, NEP Group has significantly expanded its inventory of Sony MLS-X1 Live Switchers. The expansion comes in the form of an additional 10 MLS-X1 units, bringing NEP ANZ’s total number of MLS-X1 Live Switchers to 31.
NEP’s SVP Technology, Marc Segar, commented, “NEP initially purchased 21 MLS-X1 units to upgrade and expand our current Sydney facility and deploy them in time for last year’s large international sporting competition held across Australia and New Zealand.”
The MLS-X1 is a key element of the Sony Networked Live ecosystem, which helps productions take full advantage of resources in high-quality, mission-critical live situations through connected hybrid on-premises and cloud capabilities.
Segar added, “The new MLS-X1 Live Switchers are currently being used in Sydney, Melbourne, and Auckland on a variety of projects, productions and events. This really highlights the versatility and efficiency of the MLS-X1 Live Switchers within NEP’s production ecosystem. By being able to allocate switcher resources between virtual configurations and seamlessly integrate additional hardware as needed, the MLS-X1 enhances NEP’s ability to handle a diverse range of projects and events across multiple locations. Moreover, the native ST-2110 support ensures easy integration into NEP’s existing facilities, further streamlining our production workflows.
As NEP continues to lead in producing events that demand more resources due to the rising prevalence of 4K content and larger switcher configurations, the MLS-X1 proves to be a valuable asset in meeting these evolving demands effectively.”
NEP has been one of the leaders in producing events that increasingly require additional resources as the use of 4K increases and larger switcher configurations are needed.
Segar continued, “The MLS-X1 is easily expanded and allows for efficient use of resources. It can be configured for large global 4K productions or divided into smaller HD productions as required. NEP typically uses an MLS-X1 or multiple MLS-X1s in our Andrews production hubs to manage all the major sports coverage that we produce on a weekly basis.
“We love the way we can share resources efficiently across various projects, whether it’s a large-scale international sporting event or a smaller, localised production. The scalability and flexibility of the MLS-X1 Live Switcher align perfectly with NEP’s needs, allowing us to adapt to the demands of different projects seamlessly. NEP’s investment in the Sony MLS-X1 Live Switchers underscores our commitment to staying at the forefront of technology in live event production. With the increasing demand for high-quality, mission-critical live productions, having reliable and versatile equipment like the MLS-X1 is essential for delivering top-notch results consistently.”
Visit https://pro.sony/en_AU/
NEP SVP Technology, Marc Segar.
Blackmagic
Now you can build affordable live production and broadcast systems with SMPTE-2110 video! Blackmagic Design has a wide range of 2110 IP products, including converters, video monitors, audio monitors and even cameras! You get the perfect solution for integrating SDI and IP based systems. Plus all models conform to the SMPTE ST-2110 standard, including PTP clocks and even NMOS support for routing.
Build Professional SMPTE-2110 Broadcast Systems
The Blackmagic 2110 IP Converters have been designed to integrate SDI equipment into 2110 IP broadcast systems. The rack mount models can be installed in equipment racks right next to the equipment you’re converting. Simply add a Blackmagic 2110 IP Converter to live production switchers, disk recorders, streaming processors, cameras, TVs and more.
Conforms to the SMPTE-2110 IP Video Standard
Blackmagic 2110 IP products conform to the SMPTE ST-2110 standard for IP video, which specifies the transport, synchronization and description of 10 bit video, audio and ancillary data over managed IP networks for broadcast. Blackmagic 2110 IP products support SMPTE-2110-20 video, SMPTE-2110-21 traffic shaping/ timing, SMPTE-2110-30 audio and SMPTE-2110-40 for ancillary data.
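For readers unfamiliar with how these essence streams are advertised on the network, a 2110-20 video sender is conventionally described by an SDP record (per RFC 4566) that receivers use to join the flow. The multicast address, port and format values below are purely illustrative and are not taken from any Blackmagic documentation:

```
v=0
o=- 123456 123456 IN IP4 192.168.10.1
s=Example ST 2110-20 HD video sender
t=0 0
m=video 50000 RTP/AVP 96
c=IN IP4 239.100.1.1/32
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; exactframerate=50; depth=10; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017
a=ts-refclk:ptp=IEEE1588-2008:traceable
a=mediaclk:direct=0
```

The `ts-refclk` line is where the PTP clock relationship mentioned above is declared; NMOS registration and routing layers typically exchange SDP records like this one on the operator’s behalf.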
Uses Simple 10G Ethernet for Low Cost
Blackmagic 2110 IP Converters are available in models with RJ-45 connectors for simple Cat6 copper cables or SFP sockets for optical fiber modules and cables. Using simple Cat6 copper cables means you can build SMPTE-2110 systems at a dramatically lower cost. Plus copper cables can remote power devices such as converters and cameras. There are also models for optical fiber Ethernet.
Incredibly Easy to Install
One of the biggest problems with SMPTE-2110 is needing an IT tech on standby to keep video systems running. Blackmagic 2110 IP Converters solve this problem because they can connect point to point, so you don’t need to use a complex Ethernet switch if you don’t want to. That means you get the advantage of SMPTE-2110 IP video with simple Ethernet cables, remote power and bidirectional video.
COVER STORY
Gold Standard for Nine’s Olympic Coverage
The Paris 2024 Olympic Games have been historic for Australia, not only in terms of medal performance, but in the scope of coverage undertaken by rights holder Nine Entertainment.
WITH OUTLETS RANGING from free broadcast and streaming channels to specialist subscription video channels, radio, audio, print and online, Nine’s cross-platform approach has set the bar for Games coverage and set the company up for the 2024 Paralympics, and Summer and Winter Games leading up to Brisbane in 2032.
And just as Nine itself has evolved its approach to media, so, too, has the International Olympic Committee and its provider Olympic Broadcasting Services (OBS).
Launched just prior to the Games by Alibaba Cloud and Olympic Broadcasting Services (OBS), OBS Cloud 3.0 is designed to deliver cloud-based broadcasting in real time via the Live Cloud service, while the Content+ platform provides a wide range of content produced by OBS, including live sessions, athlete interviews, features, behind-the-scenes footage, and social media content.
For the first time in Olympic history, OBS Live Cloud became the main method of remote distribution to broadcasters, transmitting a total of 379 video and 100 audio feeds.
The OBS Live Cloud service, introduced at the Olympic Games Tokyo 2020, had 22 subscribers during the Olympic Winter Games two years ago. At Paris 2024, it catered to two-thirds of booked remote services across 54 broadcasters (up 279 per cent).
This has led to a significantly reduced broadcast footprint:
The space in the International Broadcast Centre (IBC) has been reduced by 13 per cent from Tokyo 2020 and 23 per cent compared to Rio 2016.
The power provided by the Organising Committee for the IBC has been reduced by 44 per cent compared to Tokyo 2020 and 72 per cent compared to Rio 2016.
The venue compound space has been reduced by 11 per cent from Tokyo 2020 and 20 per cent from Rio 2016.
The venue broadcast power has been reduced by 29 per cent from Tokyo 2020 and 46 per cent from Rio 2016.
In line with these trends, Nine’s footprint in Paris has been smaller than Olympics of the past with many production functions being carried out remotely from its headquarters in North Sydney.
Geoff Sparke, Nine Network’s Director of Broadcast Operations, explains that in the past, covering the Olympics meant essentially setting up a full television station at the International Broadcast Centre (IBC). This was necessary to bring in all the feeds and curate a channel or program for the duration of the event.
“Life has moved on in such a way that the expectation now includes a wide variety of streaming channels. While we still focus heavily on our premium curated channel, we’ve expanded our offerings to include more channels, including GEM, which will run 24 hours a day. Additionally, we have numerous streaming channels available on 9Now, and our Stan SVOD platform is also very active, with significant work being done there as well.”
With its pioneering SMPTE 2110 infrastructure and connectivity, 1 Denison Street acts as a nervous system for remote broadcast operations, taking in feeds for creation of linear programming for broadcast channels Nine and GEM, as well as 9Now linear streaming. It also provides a conduit to Telstra Broadcast Services, allowing both 9Now and Stan to receive other overseas-produced content, and provides secure feeds to Nine radio stations, as well as ABC Radio for broadcast to remote areas of Australia.
“We decided to go with remote production for several reasons,” says Geoff Sparke. “The main reason is that it’s the way the industry operates today, and we have the experience to do it. We already handle remote production for our news at NBN and various sports events too.”
For the Paris Olympics, the International Broadcast Centre has been located at the Exhibition Centre in Le Bourget, near the Media Village, some 10km north of the capital. While past Olympic IBCs would have included the technical facilities of global rights holders, nowadays functions are condensed to racks in a rack room. This is where feeds, including Nine’s, come before international distribution.
For Nine, the main action occurs in Paris’ Trocadero district, situated across the Seine River from the Eiffel Tower. Not unlike the Skytower at the Sydney 2000 Olympics, a tower has been constructed there to provide studio space for rights holders.
Nine’s studio space features a glass backdrop and is enhanced with Unreal Engine-powered extended and augmented reality technology provided by Alston Elliot (AE).
A Mark Roberts Motion Control 9-axis robotic arm handles camera operation, triggered remotely with preset commands from Sydney.
Feeds from the Trocadero studio come back to Nine in North Sydney via Telstra Broadcast Services’ facilities in Pitt Street in the Sydney CBD and Oxford Falls in the city’s northern suburbs. Feeds for Stan and 9Now are also delivered via TBS. A further 86 feeds are also delivered to Nine.
Outside of the Trocadero studio space, Nine has booked unilateral camera positions for events such as swimming and athletics to focus on Australian competitors. These feeds are sent separately to 1 Denison Street for integration into the broadcast.
Meanwhile, teams of Nine reporters deployed across France to file stories on events such as football (soccer) - as well as reporting from the streets of Paris - use a mix of 1Gbit internet and 5G bonded cellular connections via platforms from LiveU and Dejero.
“Whereby you would traditionally have a video circuit coming from the mixed zone into your IBC, we’ll have them coming direct to Australia via public internet,” explains Geoff Sparke, “but also all those live devices have SIMs in them, so we can use 5G as well if there’s issues with the internet, and that’s how we do our news crosses as well. So we’ve ensured that we’ve got both Dejero and LiveU receivers in each of our five city locations for doing the live news crosses, so those news crews can independently hit those particular cities without having to port through the IBC or through here at our Master Control. Our Master Control has been embellished with another Master Control upstream that’s just looking after the Olympics. So, we’ve got our normal Master Control, and then we’ve got another one that’s mainly looking at all of those multiview feeds coming in from Paris, just to ensure that there’s someone who can focus purely on those particular signals.”
Provision for redundancy and Disaster Recovery has been made through GTV Melbourne linking back to Sydney-based National Playout Centre (NPC Media), giving Nine the ability to keep broadcasting Olympic coverage and inserting commercials.
“The big kicker for us,” says Geoff Sparke, “was the utilisation of 1 Denison Street. We don’t have to go off-grid or off-station and build a facility. We’re using our 2110 backbone to actually do all of this.”
PLANNING
For Nine, Paris 2024 had been some 18 months in the planning. According to Geoff Sparke, one of the biggest considerations was games audio.
“One of the major challenges in this area is managing the commentary,” says Sparke. “When you purchase feeds, they often come with English-speaking commentary. However, when we send our commentators over, they aren’t always commentating, which creates a mix of commentary and silence. We have to carefully manage this by ensuring that any feed being used live, recorded, or edited has the appropriate commentary.”
“Not all of our commentary comes from Paris; some of it is done locally,” Sparke explains. “When our commentators are speaking, we need to replace the English commentary that comes with the feed. However, if our commentators aren’t speaking and we want to record the original feed, we preserve the original commentary, which we refer to as CX. To manage this, we have a couple of [Calrec] audio mixers with audio specialists handling it constantly.”
Again, due to the flexibility of the 1 Denison Street network, Nine has been able to “spin up” these audio facilities within repurposed space in the building, as well as five temporary “commentary booths” with commentary information systems (CIS). In addition, Nine’s existing comms network has been extended from Sydney to Paris with additional on-site Riedel technology.
EVS STORAGE
With the sheer volume of footage coming out of Paris, Nine has augmented its storage and media management capabilities.
“We’re an EVS house,” says Geoff Sparke. “We use EVS for all our recording and playout needs, including news, post-production, sports, the Today Show, and A Current Affair. To manage all of that, we have two separate AVID PAM systems—one dedicated to post-production and the other to news. Overseeing both is a MAM system that handles management and archiving.”
“We can use our existing system, but it’s not typically designed to handle something as large as the Olympics, with so many live feeds and moving parts. So, we’ve made subtle expansions to each component. We didn’t need to reinvent the wheel—the workflows remain the same. The only change is that the tech stack has been scaled up, allowing us to manage everything smoothly.”
“Additional EVS units and rented storage have been brought in for the Olympics. This setup was chosen to avoid overloading the MAM or PAM with Olympic footage once the event concludes. Since much of that material may not be necessary afterward, the footage is stored separately. During the Olympics, the content will be condensed into a more manageable size, which can then be offloaded, allowing the rented storage to be released.
“We’re definitely evolving. Just look at these Olympics: what we’re doing now wouldn’t have been possible a few Olympics ago, not even in London. This shows how much technology has advanced in our favour. Since then, we’ve adopted new standards like SMPTE 2110, embraced COTS devices, and seen virtualisation take off. We’ve transitioned from outdated systems to something much more adaptable and flexible, making the most of these advancements.
“Twelve years ago, we wouldn’t have even considered it, but this time remote production wasn’t even a question. It was simply the way to go.”
STREAMING ACTION
Also a no-brainer for Nine was to extend coverage of the Games beyond that of terrestrial broadcast to wider linear and on-demand viewing via its 9Now streaming service.
According to Lewis Evans, Product Director for Streaming for Nine, this also meant extensive pre-planning and testing.
“It’s been almost a year of product delivery ahead of what is essentially a new streaming experience for us, particularly on 9Now, but also across our various radio products as well,” says Evans. “We drew a line in the sand pretty early just to make sure that we could complete all of our delivery and get all of our major app releases out the door, whilst leaving some time for the teams to kind of reset, refocus, and prep operationally. We also need to make sure that we’re operating that event well, because for 9Now, we’re not just taking carriage of the live streams, but we’re bringing over 40 additional streams directly from Paris. We’re producing exclusive 9Now video on demand content. This is a whole kind of operational workforce that also has had to spin up and kind of get set up and ready that we’ll be running 24-7 through the 16 days of the Olympics and a little bit either side, but also then for the Paralympics straight after.

“For us, we’ve focused really on one thing, which is providing a really great core live streaming offering. So that’s come in a couple of key parts. Obviously, that’s ensuring that we can deliver Channel 9 and 9GEM, which will be covering the Olympics 24-7 on 9Now, in the best quality possible. So we’re bringing that out in full HD with 50 frames a second and 5.1 surround sound. But we’ve had that on some devices for a little while. The big lift was actually bringing that to the entire 9Now platform base. So no matter what device you’re choosing to use, whether that’s a Fetch TV or a Samsung TV, they all have that best-in-class streaming experience.”
As part of its extended coverage, 9Now has been utilising around 50 additional sports feeds from Olympic Broadcasting Services. According to Evans, “It’s almost 50 individual live sports feeds from every arena across Paris. So, we’ll be bringing them in and making sure that you can watch every minute live if you want to move away from the curated broadcast feel and just watch the handball, for example.
“The additional channels that we take from the IOC directly, they will have English commentary where it’s available. It’s not available in all sports. And we’ll be taking their international talent commentary for that.”
The result is 9Now’s streaming viewers have access to both the Wide World of Sports’ curated Australia-focused coverage on Nine and GEM, as well as specialised OBS-produced content on the other channels.
“The only caveat to that is in video on demand,” says Lewis Evans. “In our video on demand space, we will be creating bespoke recaps for each day, as well as top 10 moment highlight packages. Those will be topped-and-tailed with Wide World of Sports talent. So they’ll lead into a summary of the events that have taken place overnight and package that up in a program, which is kind of supported with our talent rather than just being like an automated highlights reel of that IOC content.”
Given the tight turnaround, packages will be put together by a combination of human teams and automation.
“We have teams working all night on shifts to produce that content for people when they wake up in the morning,” says Evans. “We will be leveraging automation for some of the smaller sports where maybe it didn’t happen to be broadcast on Nine, but it was a massive moment that just kind of came out of nowhere. That’s been clipped up and automated and available in our systems to bring into our highlights if we need to.
“But, obviously, we want those packages to include as much of Nine’s broadcasted content with the commentary. We don’t want to grab a swimming event that’s from a separate feed and mix that in. We want to use the feed that has Ian Thorpe commentating, so there will have to be a little bit of us bringing human editorial to that. So, we’re kind of taking a bit of both. And we know that it needs that Wide World of Sports editorial feel. So, it will always be checked off and finished by an editor in our team.”
In terms of user interface, Evans says the 9Now team has spent a great deal of time developing its “channel switching” experience to allow for smoother transitions between different content options.
“So, when you’re watching a live stream, it’s being able to browse through what else is on other channels and actually flick between those,” says the Nine Product Director for Streaming. “We’ve invested a lot in that. And we’ve also rolled out our new homepage design, which we did towards the back end of last year, across all platforms now as well.
“And the thing that’s really powerful about that is we can really mix the way in which we promote or recommend content to users, whether that’s video on demand content, the live broadcasts, individual events. And so we’re hoping that the way we’ve approached curation, channel switching, branding, metadata, that it’s going to be easy to find something to watch because it’s a lot of content.
“On a connected TV device, basically you can press any button as you’re watching the live stream and all of the little channel cards appear at the bottom and you can cycle through everything that’s on 9Now.
“On web and mobile devices, we have a few extra pieces of functionality where we have a bit more interactivity and space to play with. On the web, we also show events that are upcoming and we can group those. We also support what we call channel groups. So, as you’re browsing through the website or on the mobile app, you’ll actually see live channels grouped by sport. So, you’ll see archery and then under, you can hit that and it’s almost like a collapsed folder and it’ll expand and you can see all of the streams. So, sometimes for an individual sport, there could be up
to five or six streams happening concurrently.”
Another challenge for the 9Now team has been addressing that variety of platforms, and the variety of devices used by viewers to stream video.
“A lot of people don’t necessarily appreciate that to deliver an application like 9Now across every manufacturer is an incredibly tough job,” says Evans. “We support what we refer to internally as 26 separate platforms, which is essentially building and releasing through app stores 26 different versions of 9Now. And our commitment was that when you tune in for Paris this year, no matter what piece of hardware you chose to buy, you’re getting the one experience from us.
“I think the key thing for me is that single platform story. That’s really the thing that I’m most excited by and has been, I think, the most impressive technology job. We’ve done close to 70 app releases, major app releases, in the last two months to be ready for the Olympic Games. It’s an incredible volume of activity that’s gone through, so I am super proud of the team and the uplift that’s gone out.
“We look at testing from so many different places. We’re testing an individual feature on one website, and then that gets tested by that team and deployed into production, where we run PVT, or post-verification testing. We then complete deep, end-to-end testing, which validates performance all the way from an upstream video stream through to the end platform or end user.
“We then do significant load testing, penetration testing for security purposes which tests the whole system and the resiliency
of the team. We’ve done almost 10 simulated ‘game-day’ experiences that rehearse particular nights of the Olympics and what they will feel like, including dry runs of a particular outage on a particular service and how we would respond to it. So, everything from that micro feature being tested, all the way through to the whole team at scale, tested as if it was the Olympics.
“Resilience is the word of the day when it comes to building streaming services, and that’s been our core focus. We’ve put a ton of things in place. We’ve been focused, number one, on managing peak traffic through the period. Fortunately, we had three really big peak moments in the run-up, the three NRL State of Origin games, which broke our streaming record with each game, so we’ve got a really nice testbed for practising, whether that’s our operational processes for escalation and incident management, or working with key vendors to get solutions in place quickly. As much preparation as you can do, things do go wrong, and so now we’ve got really good rehearsals and practices for that. But equally, we’ve been able to work with our key vendors to make sure we’re all working in the office at the same time, we’re all scaling up our infrastructure together, and we’re working as one team through those events and through that Olympics period.
“So, I think the big story for us is we were given 12 months to get 9Now ready, and I think we’ve hit that challenge. We feel really confident going in, as confident as you can be with live sport, and I just hope everyone really enjoys the experience.”
Visit https://www.9now.com.au/
CAMERAS & LIGHTING
SMPTE Committee Tackles the Circle of Confusion
INDUSTRY BODY SMPTE and members of its Rapid Industry Solutions, On-Set Virtual Production (RIS-OSVP) initiative recently orchestrated a camera and lens measurement study to test and validate the optical model for accurately determining the circle of confusion and the near and far focus planes.
“After months of planning, the Camera/Lens Metadata committee under RIS-OSVP conducted a Circle of Confusion test with the assistance of the Academy of Motion Picture Arts and Sciences,” said Snehal Patel, Director/Producer and committee leader for the Camera and Lens Metadata committee. “Our belief is that modern digital cinema cameras are not accurately representing the usable depth of field achieved with various lenses. The calculations for depth-of-field have traditionally assumed a certain value for Circle of Confusion, which was based on experience with film technology. A more modern approach has to be achieved.”
Spearheaded by Patel and Jim Helman, Co-Leads of the Camera and Lens Metadata committee — one of three Interoperability Groups within RIS-OSVP — this study was done with both monochrome and colour cameras with the goal of finding the correct levels of focus or blur for computer-generated images. The committee will use the results to propose improved ways of encoding and using this information for virtual production and visual effects.
“Our camera tests examined multiple camera
types from two different manufacturers,” said Patel. “We also examined multiple versions of a camera with and without demosaicing and with and without glass filtration in front of the sensor. The cameras were mated with a set of prime lenses and measured in the centre of the frame using a calibrated depth-of-field test chart. The results of the tests are currently being evaluated and the committee will publish them when the analysis is complete.”
The committee is also looking at how this test will integrate into virtual twins of cinema lenses and cameras in computer-generated and ICVFX-specific workflows.
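For context, the depth-of-field arithmetic the committee is revisiting can be sketched with the conventional thin-lens formulas. This is a hedged illustration only; the function names and the film-era 0.030 mm circle-of-confusion value for a full-frame sensor are textbook assumptions, not figures from the study:

```python
# Conventional depth-of-field approximations (thin-lens model).
# c_mm is the assumed circle of confusion; the film-era convention
# for full frame is ~0.030 mm -- the very value the committee questions.

def hyperfocal(f_mm: float, n: float, c_mm: float) -> float:
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f."""
    return f_mm * f_mm / (n * c_mm) + f_mm

def focus_planes(f_mm: float, n: float, c_mm: float, s_mm: float):
    """Near and far acceptable-focus distances (mm) for focus distance s."""
    h = hyperfocal(f_mm, n, c_mm)
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# Example: 50mm lens at f/2.8 focused at 5m, assuming c = 0.030mm.
near, far = focus_planes(50.0, 2.8, 0.03, 5000.0)
print(f"near {near/1000:.2f} m, far {far/1000:.2f} m")
```

Note that a smaller assumed circle of confusion (sharper modern sensors and displays) shrinks the computed depth of field, which is precisely why the committee argues the film-era value needs re-examination.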
The circle of confusion is the blur circle formed when a point source of light (like a star or a distant point) is out of focus. Instead of a sharp point, it appears as a small circle on the image plane. Filmmakers can manipulate this phenomenon to create stunning visuals. Understanding and accurately modelling the circle of confusion can help those in post-production to create visual effects that match the visual quality and tone of live-action footage.
Visit https://bit.ly/3zRq8b3
Vizrt PTZ3 Plus Cameras for BroadcastAV
VIZRT HAS THE BroadcastAV market in focus with the launch of the PTZ3 PLUS and PTZ3 UHD PLUS cameras.
Bridging the gap between ProAV and Broadcast, the PTZ3 PLUS range delivers high-quality video for any live production or presentation.
It offers intelligent AI-driven talent tracking, greater microphone choice with phantom power, and effortless integration for augmented reality (AR), all in a discreet design that blends into any space. With the additional advancements in the PTZ3 PLUS line, it’s never been easier to create broadcast-quality content.
A single Ethernet cable also offers an efficient and straightforward workflow, providing video, audio, power, control, tally, and FreeD camera tracking data over NDI|HX.
AI presenter tracking keeps talent in shot automatically based on face and posture detection. Tracking continues even when talent turns away from the shot, with an auto 0.5-10s timeout/reset if the presenter leaves the presentation area, while edge limitation ensures the camera never tracks off set or breaks a green screen effect. Blackboard detection locks the camera in place when it identifies a presentation area, for a more enjoyable viewer experience.
To simplify setup and use of camera tracking data for Augmented Reality (AR) or Extended Reality (XR) system setups, the cameras embed the open FreeD tracking data protocol in the NDI|HX stream itself, adding camera position, rotation, and lens metadata to every frame. This world-first development removes the need for additional hardware and further reduces production complexity, making it easier than ever for content creators to produce broadcast-quality content with high-end graphics systems like Viz Virtual Studio Go.
Owners of the original NewTek PTZ3 and PTZ3 UHD models can enjoy FreeD-over-NDI|HX with a free firmware upgrade available on https://www.vizrt.com
With software-controlled phantom power to the mini XLR port, any microphone or audio source can be connected to ingest audio in the same NDI|HX stream, powering +48V microphones with no amplifier needed. Two-step confirmation prevents accidentally powering non-powered microphones. This reduces setup complexity for AV deployments and remote production needs, and offers easier integration with various workflows with less equipment required.
The PTZ3 PLUS (1080p60 with 20x optical zoom) and PTZ3 UHD PLUS (4Kp60 with 30x optical zoom) are available to order now in black and white variants directly from Vizrt or through a range of certified partners.
Visit https://bit.ly/3YdaZed
Left to Right: Valentin P. Alt, AC/DIT; Danna Kinsky, Camera Assistant; Snehal Patel, Director/Producer at FearlessProductions.tv and Committee Leader for the Camera and Lens Metadata Committee; Dave Stump, ASC BVK; Rob Hummel, Photographer; Phil Holland, Cinematographer/Image Scientist; and Joe di Gennaro, Stage.
Blackmagic Camera for Android App
JAPANESE ROCK GROUP Hitsujibungaku’s recent music video, “Addiction”, was shot entirely with the Blackmagic Camera app on iPhones.
Following on from its iPhone version, Blackmagic Design has announced the Blackmagic Camera for Android app, which adds digital film features and controls to Samsung Galaxy and Google Pixel phones. Based on the same operating system as Blackmagic Design’s digital film cameras, these professional features give Android content creators the same tools used in feature films, television and documentaries. Support for Blackmagic Cloud allows creators to collaborate and share media with multiple editors and colourists around the world instantly.
Blackmagic Camera unlocks the power of the phone by adding digital film camera controls and operating systems. Users get the same intuitive and user-friendly interface as Blackmagic Design’s dedicated cameras. Users can adjust settings such as frame rate, shutter angle, white balance and ISO all with a single tap. Or record directly to Blackmagic Cloud (with Blackmagic ID) in industry standard files up to 8K. Recording to Blackmagic Cloud Storage lets users collaborate on DaVinci Resolve projects with editors anywhere in the world, all at the same time.
Blackmagic Camera has all the controls users need to quickly set up and start shooting. Everything is interactive, so users can tap any item and instantly change settings without searching through confusing menus. The heads-up display shows status and record parameters, histogram, focus peaking, levels, frame guides and more. Show or hide the HUD by swiping up or down. Auto focus by tapping the screen in the area users want to focus. Users can shoot in 16:9 or vertical aspect ratios, plus users can shoot 16:9 while holding the phone vertically if they want to shoot unobtrusively. There are also tabs for media management including uploading to Blackmagic Cloud, chat and access to advanced menus.
A settings tab unlocks the full power of the
phone’s camera, with quick access to advanced settings such as monitoring, audio, camera setup, recording and more. The record tab allows control over video resolution and recording format including space efficient H.264 and H.265. Professional audio options include VU or PPM audio meters. Blackmagic Camera also includes professional monitoring tools such as zebra settings for checking exposure, focus assist, frame guides and more. Blackmagic Camera features a built-in chat workspace so Blackmagic Cloud project members can talk about shots and quickly share creative ideas, all without leaving the app.
The Blackmagic Camera media tab has all the controls needed to browse or scrub clips for quick review, search and sort and view the upload status of their media. Users can also link to their DCIM folder and select clips to upload to the Blackmagic Cloud.
There are three options for media management with Blackmagic Camera, depending on location and mobile data coverage. Users can record to their phone and transfer clips to a
computer, log into Blackmagic Cloud and select a DaVinci Resolve project before recording, or record footage to their phone and then select which clips they want to upload via Blackmagic Cloud when they have a network connection. When shooting with Blackmagic Camera, the video users capture can be instantly uploaded as a proxy file, followed by the camera originals, and saved to Blackmagic Cloud Storage. These will then automatically sync to all members of the project anywhere in the world. This means users can start editing quickly using their proxies, speeding up their workflow. Media can be constantly added to the project by multiple cameras in different locations which is automatically synced to other project members via Blackmagic Cloud. Everyone can use the proxy media and the colourist or finisher can download the original high res camera originals and render. It’s a fast, seamless and automatic way to collaborate.
Blackmagic Camera for Android is available now as a free download from Google Play.
Visit https://www.blackmagicdesign.com
Canon Launches EOS R1 and EOS R5 Mark II Mirrorless Cameras
CANON AUSTRALIA has announced the launch of new flag bearers for its EOS R System – the EOS R1 and EOS R5 Mark II, with streamlined workflow processes and intuitive user experience, and assistive technologies powered by a new Accelerated Capture imaging platform and Deep Learning.
The EOS R1 boasts advanced features that enable news and sports professionals to capture the headline shot effortlessly whilst the EOS R5 Mark II is built for a wide range of hybrid professionals to tackle any creative project.
The EOS R1 is the first flagship camera in the EOS R range, while the EOS R5 Mark II succeeds the EOS R5, which combined high resolution, speed and 8K movie performance. For the first time, both cameras are introduced with the new ‘Accelerated Capture’ imaging platform, comprising a new processor – the DIGIC Accelerator – alongside the well-known high-performance DIGIC X image processor and newly developed high-speed image sensors featured on both new camera models.
The DIGIC Accelerator is added to support the processing of large volumes of data, alongside Deep Learning technologies. This combination unlocks higher performance and new features in several areas including autofocus, continuous shooting and image quality.
The EOS R1 and EOS R5 Mark II both feature the latest version of Dual Pixel CMOS – Dual Pixel Intelligent AF. This offers multiple enhancements including the ability to more accurately track subjects, by identifying the face and upper bodies of players and avoiding obstacles or other players. This is further enhanced by the ability for users to register specific faces and track and prioritise them consistently over other players.
A newly added ‘Action Priority’ mode automatically identifies common action poses in basketball, soccer and volleyball, identifying and tracking the main subject in fast and dynamic situations and capturing the headline-grabbing moment of action.
Eye-control AF is offered in both models with improvements over the system found in EOS R3 that include a higher pixel count sensor, enhanced LEDs, a larger eye detection area, and an
updated detection algorithm allowing a unique and instinctive way to select a subject to track in a complex scene.
The EOS R1 and EOS R5 Mark II offer new high-speed back-illuminated stacked image sensors, resulting in faster shooting speeds and faster sensor readout, with a 40% reduction of rolling shutter in the EOS R1 compared to the EOS R3 – putting it on the same level as the mechanical shutter in the EOS-1D X Mark III. With a similar 60% reduction in the EOS R5 Mark II compared to the EOS R5, both new models are highly capable of capturing action with high image quality and wide dynamic range.
The cameras feature a pre-continuous shooting function, offering up to 20 frames (for EOS R1) and 15 frames (for EOS R5 Mark II) to be captured in HEIF/JPEG or RAW format at any frame rate before the shutter is pressed, allowing the key moment to be captured even if it was missed. Both feature large, high-brightness, high-resolution blackout-free viewfinders, with the EOS R1 featuring the highest resolution at 9.44M dots and the EOS R5 Mark II offering twice the brightness of the EOS R5.
Thanks to the new imaging platform, the EOS R1 and EOS R5 Mark II benefit from enhanced image quality with Deep Learning, in-camera image upscaling and noise reduction, providing four times the resolution or reduced noise in-camera when using JPEG or HEIF formats. Users can also crop and upscale in-camera, making it that much easier to send out photos without editing via separate applications.
Both cameras feature up to 8.5 stops of image stabilisation performance in the centre and 7.5 stops at the periphery, with effective shooting capabilities particularly in low light or other difficult conditions.
For videographers, the EOS R1 and EOS R5 Mark II offer video in 12 bit RAW recording internally to the memory card as well as using Cinema EOS Movie Recording formats alongside Canon Log 2 and 3 with proxy video recording that is now fully supported between two cards, and four channel audio. The cameras are also able to record high resolution stills and Full HD video
simultaneously, with the option to record externally via the HDMI Type A ports on both cameras. To support the workflow of professionals, several features ensure fast and stable connectivity and failsafe options, with multiple routes to image/video destinations for filing news, with support for the C2PA content authenticity format, or reviewing the first rushes of a film. Industry-standard file naming, separate photo/video folders, and advanced tagging in the NewsML-G2 standard are also supported. Both cameras support Wi-Fi 6E/11ax 6GHz in-body, making them the first EOS series cameras to offer new levels of transfer speed, with the EOS R1 additionally supporting 2.5Gbps Ethernet connectivity and dual-threaded FTP in the body and the EOS R5 Mark II providing 2.5Gbps
Ethernet via an optional grip. The EOS R1 and EOS R5 Mark II join a roster of recently announced EOS R System cameras and RF lenses.
The EOS R1 is scheduled to be available in Australia from November 2024 and the EOS R5 Mark II from August 2024. An EOS R5 Mark II Mirrorless camera kit with an RF 24-105mm F/4L IS USM lens will also be available from August.
Visit https://www.canon.com.au/
Canon EOS R1 with RF 24-70mm F2.8 L IS USM lens.
Canon’s EOS R5 Mark II.
SPORTSCASTING
Paris 2024: The Rise of Remote Production in Games Broadcasts
By John Mailhot, Senior Vice President, Product Management at Imagine Communications
AS THE PARIS 2024 Olympics continue I can’t help looking back on the 2010 Winter Games in Vancouver, where I had the privilege of working with one of Imagine Communications’ broadcast customers. In all the excitement, one thing that struck me the most was the sheer size and complexity of the International Broadcast Centre (IBC). Broadcasters from all around the world had packed the space with a massive amount of equipment — including multiple control rooms and dozens of full editing suites — and thousands of people were working tirelessly to produce their programs. It was a spectacle on par with the games themselves.
As in all televised Games before it, show production in Vancouver was performed on-site, with the finished product being sent back to each broadcaster’s home country. The reason for this was simple: Telecom (or even Satellite) links at the time were more expensive than sending personnel and equipment to the host city. Over the last 14 years, however, there’s been a gradual shift towards a mix of local and remote production, with camera operators and commentators working on-site to cover the games, and some editing, graphics, and other finishing touches being applied in the studio at home.
This trend has become more pronounced with each biennial event and was accelerated dramatically by the pandemic. During the 2021 Summer Games in Tokyo, there was a significant push to increase remote production, with far more staff and equipment than ever before remaining at home. The following year, the Winter Games in Beijing provided a bellwether of how much production could be performed remotely and which aspects must be conducted on-site.
Today, as the world watches the 2024 Summer Games, most broadcasters have applied these remote production technologies to all manner of sport content and figured out how to achieve the right immediacy and event feel by balancing local and remote production. So, while the IBC in Paris will no doubt remain an impressive, bustling hub of activity, it will be more about handing off signals from the pool to each broadcaster’s on-site team, and less of a site production compound than in years past.
Higher Bandwidth at a Lower Cost
From a technological standpoint, there are three key factors that have brought the broadcast industry to this point where remote production is suitable even for such high-profile events. The first is the availability of bandwidth for transmitting signals. Compared to years ago, the amount of carrier bandwidth accessible to broadcasters has increased significantly, while the associated costs have become far more reasonable. Both of these factors are vital, because the bandwidth demands for remote/split production are significantly higher than simply forwarding home a finished show.
With on-site production, broadcasters really only need one link back to their home country for the finished channel — at most, two or three paths for redundancy. With a remote or split production process, however, they might be sending back 20 or 30 signals from various cameras, at production quality levels, which requires dozens of links between the sites. The increase in availability and the reduction in the price of bandwidth are key enablers of remote/split production.
Well-Documented HDR Workflows
The 2024 Paris Summer Games mark a significant milestone in live HDR broadcasting, with several world broadcasters planning to incorporate HDR productions to varying extents. This brings the second factor — well-documented HDR workflows — into play. Until recently, broadcasters were limited to locally producing live events in HDR, as workflows required careful visual coordination between camera shaders and producers looking at the same monitor. That has changed over the last few years as broadcasters have developed and documented their HDR workflows across major events, including standardized LUTs for conversion and shader checks of the Standard Dynamic Range (SDR) output. Today, these standardized workflows are capable of supporting local and mixed/remote production, including creating SDR work products of very high quality — a requirement for the all-important legacy distributions.
Reducing Latency
Finally, the relatively new JPEG XS codec tackles the issue of latency, which has traditionally
been a stumbling block in remote production, especially when it comes to communication between on-site camera operators and technical directors in the studio. With traditional codecs, it may take the director a few seconds or longer to see the result after they’ve asked the camera operator to adjust something, such as zooming in or panning left. This can lead to a frustrating and disjointed process that hinders cohesive team interaction.
By reducing the latency of the signals being transmitted between the sites, the entire team feels like they are working more naturally together. JPEG XS dramatically reduces latency to the bare minimum while maintaining production picture quality.
At Imagine Communications, many of our customers have found that the JPEG XS codec offers the ideal combination of high picture quality and ultra-low latency, with 8:1 bandwidth savings over uncompressed — allowing them to achieve the look they want while enjoying the benefits of remote/split production. So, with its support for JPEG XS alongside its complement of UHD and HDR conversion capabilities, our Selenio Network Processor (SNP) has become an integral part of their remote production workflows.
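As a rough illustration of why that compression ratio matters for multi-camera contribution, the arithmetic below assumes 1080p50 10-bit 4:2:2 feeds and a straight 8:1 ratio; real contribution rates vary by profile and ancillary data:

```python
# Back-of-the-envelope bandwidth arithmetic for remote production feeds.
# Assumptions: 1080p50, 10-bit 4:2:2 (20 bits per pixel of active video),
# and the 8:1 JPEG XS compression ratio cited in the article.

BITS_PER_PIXEL = 20          # 10-bit luma + 10-bit chroma (4:2:2)
W, H, FPS = 1920, 1080, 50

uncompressed_gbps = W * H * FPS * BITS_PER_PIXEL / 1e9   # active video only
jpeg_xs_mbps = uncompressed_gbps * 1000 / 8              # 8:1 ratio

feeds = 25   # e.g. the 20-30 camera signals of a split production
total_gbps = feeds * jpeg_xs_mbps / 1000

print(f"per feed: {jpeg_xs_mbps:.0f} Mbps; {feeds} feeds: {total_gbps:.1f} Gbps")
```

The takeaway: two dozen compressed contribution feeds fit in a handful of 10Gb circuits, whereas the same feeds uncompressed would need roughly 50 Gbps — the cost difference that makes remote/split production viable.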
There are more than 3500 SNP units actively deployed around the globe — for more than 100,000 video channels worth of video processing — and many of them are on the ground in Paris. It’s a watershed event for remote production, and we are thrilled to be a part of it.
Visit https://imaginecommunications.com/
The Olympic Rings are added to Paris’ Eiffel Tower. Credit: Instagram @odieuxboby
EMG/Gravity Media Deliver Advanced Workflows for EURO 2024
WITH THE UEFA EURO 2024 football championship completed, EMG/Gravity Media, working on behalf of UEFA, is highlighting the advanced broadcast infrastructure the combined group put together on a global level, both for UEFA and for a number of international broadcasters including UK commercial broadcaster ITV, Dutch public broadcaster NOS and Australia’s Optus Sport.
For UEFA, EMG/Gravity Media leveraged its extensive experience gained from years of working at the EUROs to deliver the Technical Operations Centres (TOCs) and the FANTV solution at each of the ten venues across Germany, the latter in collaboration with the Group’s team in Italy, whilst its French crew members on-site delivered the big-screen action for the fixtures held in both Munich and Stuttgart on behalf of UEFA. Besides delivering the technical assets for the giant-screen experience in the stadiums before, during and after the matches, EMG/Gravity Media also provided four OB trucks and RF equipment (host Steadicam and handheld RF camera coverage) for the ten matches held at the Cologne Stadium and Düsseldorf Arena.
The Group also provided live helicopter 1080p HDR aerial coverage for all matches, from the opening to the Final in Berlin, through its specialist RF and camera divisions, EMG Connectivity and ACS. The six helicopters, all fitted with the latest UHD HDR stabilised camera gimbals, were regionally sourced to keep flying time to a minimum, and flights followed EMG/Gravity Media’s carbon-offsetting initiative for helicopter filming across the flying hours required to deliver the tournament’s coverage plan.
The approach to sustainability was also evident in its extensive work with the UK’s ITV, which has helped the broadcaster to drop headcount for covering a major tournament from 300-400 down to just 90.
The workflow for the 1080p50 production was designed around a ‘reverse remote’ production model, with a remote gallery and mini MCR set up in Berlin along with a studio and commentary positions. The two presentation areas were fitted out with nine cameras
between them, while on-site were a vision mixer, director, PA, producer, autocue op, and a small tech and lighting team. Crucially, most of the kit underpinning this setup, including EVS replay, sound, and graphics, was based back in London, with the locations linked together by low-latency 10Gb diverse paths.
This allowed ITV to maintain a 30/70 split in personnel between the two locations, Berlin and the UK, saving costs and reducing the environmental impact of moving crew and kit across borders.
Continuing the sustainability theme, EMG/ Gravity Media also provided new trucks to ITV for venue injects. These feature solar power battery backup, allowing them to sustain connectivity overnight while minimising the amount of power used.
Meanwhile, the EMG/Gravity Media team from the Netherlands successfully followed the Dutch football team for public broadcaster Nederlandse Omroep Stichting (NOS), from the start of the tournament to the semi-finals. This crew of fifteen covered the team from stadium to stadium with an OB truck carrying six cameras, three for the studio and three for
the analysis shots.
Finally, the team based in Australia provided technical support and operation for the coverage of the UEFA Euros at Optus Sport. The group had a full crew of production, cameras, technical and engineering support to integrate world feeds with live crosses to Germany and local communities. All of this while providing pre-match hosting and a cross into the CONMEBOL 2024 COPA America tournament also for the Optus Sport channels.
Eamonn Curtin, Interim Chief Commercial Officer at EMG/Gravity Media says: “Following a successful EUROs, we are proud to announce that this event ranks among the largest live broadcasts we have managed since the merger of our two companies. The collaboration of our diverse employees from various backgrounds and countries epitomises the purpose of our union; to leverage our collective expertise and experiences. Together, we have created innovative and sustainable workflows to support UEFA and partners like ITV and Optus, ensuring the best possible Euros experience for their audiences.”
Visit https://www.gravitymedia.com/au/
Ki Pro GO2 is a portable multi-channel H.265 (HEVC)* and H.264 (AVC) recorder offering up to 4 channels of simultaneous HD or SD recording to qualified USB drives and/or network storage with redundant recording capabilities.
4x 3G-SDI and 4x HDMI digital video inputs with flexible channel assignments provide connections to a wide range of video sources, including any 3G-SDI signal from a router for example, professional and prosumer cameras and DSLRs.
Designed to be either portable or rackmountable with half-rack-wide, 2RU-high dimensions, Ki Pro GO2 is well suited for use in any production environment, be it a global concert tour or your local house of worship. *(High Efficiency Video Coding)
Next Generation Digital Recorder featuring HEVC
NEWS OPERATIONS
BBC and TVU Redefine Mobile Coverage for UK Election
A POSTAL TUBE arrives in the mail. Inside is a custom tripod and smartphone kit. Throw in TVU’s Anywhere app and you become part of a network of hundreds of correspondents, broadcasting on-the-ground updates. No OB vans. No satellites.
This was the BBC’s approach to the recent UK General Election, and it allowed the UK national broadcaster to manage 369 live feeds from vote-counting locations across the UK, all captured using the TVU Anywhere mobile app. The feeds were streamed on BBC iPlayer and the BBC News website, and transmitted to a giant virtual mosaic screen at the BBC’s London headquarters, Broadcasting House, providing audiences across the UK and around the world with an unprecedented, immersive experience. The broadcast reached a peak audience of 4.6 million viewers in the UK.
A custom BBC-created tripod and smartphone kit running the TVU Anywhere app for cellular-bonded live transmissions was sent to every count, allowing BBC News to capture real-time video feeds regardless of local pressures on cellular networks. The user-friendly remote production setup was easily configurable, even by staff without a technical background. The compact design of the kit, assembled by BBC representatives already attending the count, combined with the lack of reliance on broadcast vehicles, allowed the corporation to generate more live content in an environmentally friendly way.
“This approach not only upheld the BBC’s highest quality standards
but also pioneered a new method of election broadcasting that would have been near impossible using traditional methods,” said BBC News’ Geraint Thomas, who led the project. “The TVU cloud platform allowed us to scale up a vast number of feeds and handle peak traffic seamlessly without investing in additional hardware. This was both an editorial and technological innovation that transformed the viewing experience, bringing the vote counting process closer to our audience and setting a new benchmark in election coverage.”
Hundreds of live feeds, which would typically require extensive server setups, were ingested and recorded in real-time on the TVU Search cloud platform. These feeds were displayed in a large mosaic view using TVU Partyline, allowing viewers to see all voting locations across the UK simultaneously and access custom mosaics for their local regions.
The BBC managed 21 regional workflows through TVU Partyline, providing tailored coverage for different areas of England, Scotland, Wales, and Northern Ireland. “We delivered the most personalised election coverage ever in live video. Across the UK, wherever you looked you could see democracy in action, live,” said Jonny McGuigan, BBC News’ Streaming Editor. “The ability to choose from any one of 369 counts on the night meant we could always
be where the story was. When augmented with our traditional broadcast live SNG and bonded connectivity, we could guarantee we’d be where audiences needed us, on digital platforms, TV and radio all night long”.
Viewers could follow what was happening in their own electoral areas by tuning into live feeds via BBC News’ website constituency pages, creating a personal connection to the electoral process. The BBC’s London studio featured a massive virtual LED wall connected to all feeds, while the national programmes in Scotland, Wales and Northern Ireland also had access to the live content. The live streams were also used as political commentators provided real-time updates and expert analysis in a split screen, making the coverage more interactive and visually engaging for the audience.
The TVU platform was able to output IP for OTT, as well as SDI for the TV and radio programmes, seamlessly integrating into the existing broadcast infrastructure.
“When we started planning this project, many believed the election would be held later in the year,” added Thomas. “In a matter of weeks, we managed to achieve what we thought would take months to develop, and integrated the TVU cloud platform with our on-premise workflows in a cost-effective manner.”
“It’s about innovating together and redefining the boundaries of broadcast technology,” says Paul Shen, CEO of TVU Networks, praising the BBC for leading the way in blending linear and digital broadcasting across its media supply chain to offer a wider variety of content. “This collaboration is a testament to what’s possible when we combine our strengths. The future of broadcast is now, offering an accessible, sustainable model that opens up endless possibilities for storytelling to audiences everywhere.”
Visit https://www.tvunetworks.com
Reuters Selects Sony Cameras for Worldwide Video Group
SONY HAS ANNOUNCED that Reuters, one of the largest news agencies in the world, has selected Alpha and XDCAM cameras, G Master lenses and audio equipment to equip its video journalists across the world. The fleet will use the Alpha 7S III and PXW-Z280 as its main shooting kit, paired with UWP-D wireless audio.
The PXW-Z280 has become an industry standard thanks to its ease of use, electronic variable ND filter and Sony’s Instant HDR workflow, eliminating the need for colour grading. Its advanced network features and multiple format support mean it can easily, securely and wirelessly fit into a cloud-based news gathering workflow.
Since its launch, the Alpha 7S III has established itself as a go-to camera for on-location content creators needing a compact form factor and exceptional picture quality. In this instance, the advanced autofocus performance for hybrid use, together with its outstanding low-light performance, were key features in Reuters’ decision making. Sony’s E-Mount system also gives journalists the flexibility to pair the cameras with a vast choice of Sony lenses, including the SEL2470GM2.
Both cameras benefit from the connectivity features needed to easily connect to the cloud for seamless and reliable content transfer from the capture location back to the remote broadcast unit, particularly critical for news gathering, with its emphasis on speed.
The cameras and associated equipment have been shipped to 23 Reuters locations across the world. The investment is a testament to the long-standing relationship between the two companies. Sony’s Media Backbone Hive is already in use at Reuters as its main multi-platform news production system.
Visit https://pro.sony/en_AU/
QuickLink StudioPro 4K Video Production Platform
QUICKLINK, the provider of multi-camera video production and remote contribution solutions, will present QuickLink StudioPro at IBC 2024 (Stand 7.A55). According to the company, QuickLink StudioPro is an easy-to-use 4K video production platform which delivers a true-to-life, ultra-low latency production experience without the complexity of legacy systems.
With a one-frame delay, QuickLink StudioPro ensures flawless synchronisation between stage speakers and large projected screens or video walls for live presentations. StudioPro functions similarly to a video-based Microsoft PowerPoint: each scene is like a slide, allowing users to add multiple layers and effortlessly switch between scenes to create amazing productions. Without StudioPro, creators face the inconvenience of extensive training on outdated systems, the financial burden of outsourcing to external companies, and lip synchronisation issues.
Utilising built-in, industry-best QuickLink StudioCall technology, remote guests are seamlessly integrated with Teams, Zoom, and Skype calls, other camera feeds, audio sources, images, videos, and graphics. Designed for both large-scale and intimate productions, StudioPro ensures that content remains on-brand and unforgettable to audiences, broadcasting to multiple physical, digital, and social destinations while supporting multi-aspect-ratio output simultaneously.
QuickLink StudioPro also supports ASIO audio devices over IP and offers advanced audio mixing capabilities, allowing for precise audio tuning. The platform’s AI-driven advanced noise reduction, echo cancellation, and Studio Audio features ensure crystal-clear audio quality.
Visit https://www.quicklink.tv
Vizrt Beefs Up Newsroom Security
VIZRT HAS ANNOUNCED the launch of Pilot Sign-On for its Viz Pilot Edge HTML-based newsroom graphics platform.
According to the company, security authentication is critical to a modern newsroom’s operational integrity. With a new cyber-attack happening every 39 seconds, the threat of rogue agents targeting media organisations is very real, making it essential to implement advanced authentication mechanisms.
Designed specifically for newsroom environments, Pilot Sign-On offers robust security measures to safeguard sensitive journalist workflows. By ensuring only authorised personnel can access sensitive information, crucial editorial tools, or investigative reports, Pilot Sign-On protects against unauthorised access and reduces the risk of a data breach.
“Journalists routinely handle highly confidential information and need reliable and secure systems. Pilot Sign-On addresses these concerns by providing a simple, yet secure authentication process that enhances the integrity of newsroom activities. This system not only protects sensitive information but also streamlines access for journalists and designers, ensuring they continue to work in a safe and secure manner,” explains Ionut ‘Johnny’ Pogacean, Senior Product Manager, Vizrt.
By integrating Pilot Sign-On, newsrooms can add another layer of security to confidently protect their digital assets and maintain operational efficiency. Additionally, it simplifies any audit process for internal IT teams.
Template Builder, which is used to build Viz Pilot Edge templates, has also been completely overhauled and rewritten in modern web technology for not just added security, but also speed. The rewrite improves startup time, makes the UI faster, and ensures a consistent look and behaviour once the template is loaded in Pilot Edge, along with many other quality-of-life improvements that will make newsroom workflows more efficient and future-proof.
Visit: https://bit.ly/4c8oejO
Colour Grading George Miller’s Furiosa: A Mad Max Saga
RESPONSIBLE FOR THE COLOUR on the post-apocalyptic Furiosa: A Mad Max Saga, Eric Whipp of alter ego employed FilmLight’s Baselight 6.0 across three locations: alter ego in LA and Toronto, as well as Spectrum Films in Sydney, which handled the workflow supervision, conform, home entertainment mastering, and delivery in Baselight.
Whipp, Miller and the Spectrum team worked together previously on Three Thousand Years of Longing (2022) and brought a similar workflow to Furiosa.
The work on the movie was spread across three facilities and cities, starting with Spectrum Films in Sydney, Australia, which handled the conform in Baselight in the same space as the editorial department.
“We started on the workflow for Furiosa while we were completing Three Thousand Years of Longing,” comments Michael Messih, technical operations manager at Spectrum. “The same on-set VFX supervisor worked on both films, so we were able to bring a lot of what we had developed in Baselight directly over to the Furiosa pipeline.”
“The Rebel Fleet managed the dailies and all the VFX pulls, and because there were so few shots which didn’t require visual effects we decided to do everything via EXR sequences,” explains Catherine Armstrong, GM and head of post-production at Spectrum. “At no point did we actually see any of the original camera raw files.”
Andrew Cucé, colourist and finishing editor at Spectrum, commented: “The best feature we had was Baselight’s shot versioning tool. We would typically get up to 100 shots a day, so
having the version refresh feature made things so much quicker. Also being able to switch back and compare versions during VFX reviews was amazing.”
Spectrum helped develop a workflow for colour-critical 4K graded streams between LA and Sydney, and maintained a calibration check procedure – including colour comparisons and render tests – so that everyone was seeing the same images.
“George likes to refer to the term ‘rolling DI’, implying a tight integration between departments which means that the grading process happens alongside all the VFX work,” explains Messih. “Having the remote working ability between Spectrum and alter ego meant that Eric didn’t have to spend weeks in Sydney – and George could review everything from the 4K DI theatre at Spectrum.”
The Spectrum team worked entirely in ACES and made use of Baselight’s ability to switch between colour spaces to compare colour in different viewing conditions. It also supported quick switching of formats, which they were doing all the time to compare 4K, HD and UHD against various references.
Supported by a team of finishing editors (Affrica Handley and James Cowie) and production experts (Basia A’hern and Sofia Costa), the Spectrum team used the Baselight reporting tool, which allowed them to provide daily data reports and enable the VFX supervisor to cross-check versions.
“This was essential in Furiosa, as we were doing most of the VFX changes in Baselight rather than waiting for an updated EDL,” comments Armstrong. “DI was effectively
leading the edit in the end, so this reporting tool was critical.”
“Due to the nature of Eric’s stacks, I ended up doing quite a lot of the editorial changes manually in the timeline,” adds Cucé. “The new Baselight 6.0 timeline editing tools will make this much easier – the addition of tracks and the ability to compress large stacks means managing a complex timeline will be a lot simpler for future projects like this.”
Colour: LA and Canada
Alter ego opened its LA office in September 2023 and knew one of the first projects would be Furiosa.
“We ensured we had everything we needed to grade the film while building out the new office,” says Whipp. “A big part of this was to ensure that our six Baselight suites in Toronto could easily sync with the two new Baselight suites in LA.
“We also needed to find a way to sync up the Baselight suites at Spectrum, so we set up a project server here at alter ego in LA, which everyone was connected to. This way, once a reel was conformed in Sydney, it was as easy as opening it in LA and everything was linked up.”
There was also a dedicated colour support team at alter ego in Toronto and LA to assist with the heavy roto requirements on the movie. This meant Whipp could spend more time on the look and some of the other complicated issues that came with this film.
“We had experts such as Andrew Ross, Lily Henry and James Graham working furiously on roto and shapes, with Jonah Venneri, Daniel Saavedra, Ben Otten and Corey Martinez all handling the terabytes of data moving across the three facilities on a daily basis,” says Whipp.
Furiosa: A Mad Max Saga, directed by George Miller and shot by cinematographer Simon Duggan, is the fifth instalment in the Mad Max series and a prequel to Mad Max: Fury Road (2015).
“Colourist Andrew Ross was amazing to work with, helping me prepare scenes and some of the balancing work,” adds Whipp. “Some of the looks required layers of shapes and keyframing just to get started, so it really was a team effort.”
Colour producer Genna McAuliffe managed and tracked all the shots and scenes and kept track of the Baselight work across the three facilities. In LA, lead colour assistant Corey Martinez helped manage the needs for each session and also supported with roto. Ben Otten was also on hand to help make AI mattes, etc., as they came up.
“Towards the last couple of months on the film, it really did become a bit of a rush,” recalls Whipp. “I was grading until 3am (LA time – to sync with the time zone in Sydney), and back at it at 8am. There was just so much work to be done in Baselight, that having the three facilities on three different time zones actually worked out amazingly for us.”
Whipp graded the film almost entirely remotely.
“George was in Sydney, and I was in the alter ego LA office,” says Whipp. “We streamed the 4K image from LA to Spectrum Films, and the image looked flawless.”
Graphic Novel Look
As the film is a prequel, Whipp began by rewatching Fury Road to remind himself where they ended up.
“We knew it had to be similar, but not exactly the same,” says Whipp. “In Furiosa, we visit locations that are only talked about in Fury Road (like Gastown and the Bullet Farm). So the door was wide open for us to explore.”
For Gastown, Whipp developed two looks. The first, a typical industrial site, and the second, which comes years later after Dementus has run it into the ground, a much darker and “dirtier” look.
“We also graded some shots to resemble a Renaissance painting, and overall the film is leaning toward a graphic novel look,” adds Whipp. “We didn’t use any specific references, but it was definitely inspired by paintings and graphic novels.”
Whipp used the Chromogen tool in Baselight 6.0 to support the development of the look for Furiosa.
“I jumped into Baselight 6.0 specifically to work with Chromogen,” explains Whipp. “I knew I needed rich-coloured sand and deep-blue skies in a lot of cases, so I made look strips in Chromogen that pushed the sand colour richer and deepened blue skies which twisted toward a nice cyan-blue colour. In some scenes we needed the greenery to pop, so I created look strips to open up the greens.”
In all, Whipp ended up making approximately 20 look strips that he used for different scenes.
“All of these helped with the overall look while not being too destructive to the overall image,” says Whipp. “I converted them all to look strips so I could slide the amount of ‘effect’ shot by shot. We graded this film in ACES, so there was no need for a show LUT; working with look
strips essentially does the same thing anyway, but with the advantage that everything is colour space aware, which allowed for easy transitions to HDR and rec.709.
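Whipp’s “slide the amount of effect” idea can be pictured as a simple blend between the ungraded image and a full-strength look. The sketch below is purely illustrative: the function names and the toy “look” are invented for this example, and Baselight’s colour-space-aware look strips are far more sophisticated.

```python
# Toy model of a look strip's "amount" slider: blend each pixel between
# the ungraded value (amount = 0.0) and the full look (amount = 1.0).
# Hypothetical names; this is not Baselight/Chromogen code.

def apply_look(rgb, look, amount):
    """Linearly interpolate between identity and the look, per channel."""
    return tuple((1.0 - amount) * c + amount * look(c) for c in rgb)

def warm_sand(c):
    # invented look: a gentle gamma lift that enriches midtones
    return c ** 0.9

pixel = (0.25, 0.40, 0.60)
print(apply_look(pixel, warm_sand, 0.0))  # amount 0: shot untouched
print(apply_look(pixel, warm_sand, 0.5))  # look dialled in halfway
```

Because the blend is parametric, the same look can be trimmed per shot without rebuilding the grade, which is the appeal Whipp describes.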
“I’d take a scene, play with it in [Baselight’s] Chromogen and find a look that I felt was helping. But a day or two later, I’d find myself jumping back into Chromogen to refine it. Then another scene would come along with a slightly different look, and I’d find that the Chromogen look I had just created two days ago was not going to work, so I’d jump back in and adjust it specifically for the new scene.
“Traditionally, I have never had that much control over a LUT. I’d have to grade around it. For example, I used this one LUT for years that I loved, but green colours were not very strong and in fact got twisted to yellow. So I’d often have to key green colours and bring them back over the LUT.
“With Chromogen, I can literally adjust something like that and save off a LUT that won’t require all that keying shot by shot.”
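Saving an adjustment off as a LUT, as Whipp describes, amounts to baking a colour transform into a lookup-table file. Below is a hypothetical sketch that writes a minimal 3D LUT in the widely used .cube format, with an invented green boost standing in for the kind of fix he mentions; this is not Chromogen output.

```python
# Hypothetical sketch: write a tiny 3D LUT in .cube format that bakes in
# a gentle green boost, so the fix no longer needs keying shot by shot.
# (Illustrative only; not FilmLight's Chromogen.)

def write_cube(path, size=17, green_gain=1.1):
    lines = [f"LUT_3D_SIZE {size}"]
    step = 1.0 / (size - 1)
    # .cube entry order: red varies fastest, then green, then blue
    for b in range(size):
        for g in range(size):
            for r in range(size):
                rr, gg, bb = r * step, g * step, b * step
                gg = min(gg * green_gain, 1.0)  # the adjustment, baked in
                lines.append(f"{rr:.6f} {gg:.6f} {bb:.6f}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_cube("green_fix.cube")
```

A grading or playback tool that loads the file then applies the transform to every pixel via interpolation between the table entries.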
Whipp used a variety of Baselight tools on the grade – from shapes, keys, working with embedded mattes from EXR sequences, optical flow retimes, compositing new skies and adding new elements.
“There was a shot or two in the film where one of the actors’ eyelines was slightly off,” explains Whipp. “It ended up making them look a touch cross-eyed, so we used the gridwarp tool to shift the eyeball position over and correct the issue.
“On other shots we added a lot of additional dust, smoke and fire in Baselight. We used flicker, and heat haze effects, lens flares, etc. I even photographed a series of lens dirt elements that we could then use to composite over certain shots to add some grunge and realism.”
Heavy VFX
“As a lot of the backgrounds in this film are full CG, we would start with a basic lighting pass and comp of a shot from VFX, jump into the Baselight suite and play with the look,” explains Whipp. “Often we would try different skies and looks and sculpt the lighting to get it right for the scene.”
When Whipp had a look that George liked, he would send that reference to the VFX teams to match the lighting and look. Then the shots would come back with the amended sky and lighting.
“We had a great collaboration with VFX on this film,” adds Whipp. “It’s imperative that we work together closely and support each other on huge projects like this.”
Andrew Jackson was the VFX supervisor, who Whipp had worked with on Fury Road.
“This is why colour was brought in so early,” explains Whipp. “Between colour and VFX, we could work out where the best place was to achieve a look. In many cases, we opted to do the effect or look in Baselight so that it could be adjusted to George’s preference, rather than bake it into the VFX comp.”
Whipp also found this was a great place to explore looks as they could take a shot and very quickly composite a new sky or add layers of dust or light rays to see what that did for the image.
“We did this for every scene to establish what
the feel would be,” explains Whipp. “It makes it easier for VFX to move forward, as there’s a reference of where we need to end up. This is ultimately the way George loves to work. Traditionally, VFX would try out lighting looks and send them through for him to look at. He would give notes, wait a couple of days, and repeat the process until the VFX is sitting in a good place. But if we get these early VFX passes into the Baselight suite, he can adjust and control the look in real-time and find the desired look much faster.”
Unlike Fury Road, which was shot on location in Namibia, Africa, Furiosa was shot in Australia. As the film needed to look like the desert from Fury Road, a lot of the background environments in Furiosa are CG.
“The weather wasn’t playing nicely once the shoot began, so there was a huge mix of sunny and cloudy days,” recalls Whipp. “At some points the wind was so strong on set, they had to move into a studio to shoot outdoor action scenes. Once the edit was done, we ended up with a mix of sunny shots, overcast cloudy shots and studio-lit shots. Obviously, they don’t all match seamlessly, so there was a lot of work in the Baselight suite to even these out and relight as best we could.”
The CG backgrounds required specific work, too, to prevent them from looking too perfect.
“We did a lot of depth hazing in the grade, to reduce some of the contrast and help draw your eye to the main action taking place, and lots of other tricks in the grade to take away some of the CG edge,” explains Whipp.
One of the most challenging sequences for Whipp was a section they called the “war montage”.
“It involved a series of shots that dissolved and overlaid on top of each other,” explains Whipp. “That alone would be quite a task, to bring through certain parts of the image via a dissolve and get the timing perfect. But we took it up a notch and also composited new backgrounds into each shot and added layers and layers of fire, smoke, embers and ash flying across the frame. A huge amount of compositing work and roto work was required in Baselight to get that little sequence working.”
When asked about his highlight on this project, Whipp struggled to single out a scene, sequence or look.
“There’s certainly scenes that I prefer the look of over others,” he said. “But George once told me that people often ask him what his favourite shot is. He said it’s like asking a composer to tell you their favourite note. It’s the combination of all of the notes that makes the music. I feel very similar about colour.”
Visit https://www.filmlight.ltd.uk
Rabbiting On - from FM Mornings to Podcasting’s Vanguard
It was the old “change in direction” at Nova Entertainment’s Star 104.5 FM on the NSW Central Coast that led morning host Dave “Rabbit” Rabbetts down a literal new road with his career and his audience.
A RADIO VETERAN of some 30 years, Rabbit began his career at the age of 15 with work experience in his native New Zealand. After graduating to on-air hosting, he moved to Australia, working behind the mic in the NSW town of Griffith, in Cairns in Queensland, and in Sydney and Adelaide, before eight years of mornings on the NSW Central Coast with co-host Julie Goodwin of MasterChef fame.
When a late-2023 reshuffle at Star FM saw the breakfast team’s services no longer required, rather than sign off, Rabbit took to the road with two new vehicles for his talents - podcasting and a caravan.
“I’ve renovated caravans before, and I love doing it,” says Rabbit, “and I don’t know when the idea happened, at what point, where I went, ‘I could build a podcast studio inside of a caravan, and record in here’, but that’s what happened.”
The caravan in question is a 1967 “Baravan”, manufactured by the Barrett family of Western Australia, and now rechristened the ‘PodVan’.
“It’s about 10 or 11 feet [long], so it’s very small,” says Rabbit, “and I wanted something of this size so that I could put it into a car space at the beach.”
After gutting the Baravan, Rabbit set about equipping its interior with host and guest seating along the side walls with adjacent headphone outputs, LED lighting, and boom arms housing Audio-Technica microphones. A bench with sink and cupboard at the front of the PodVan’s interior provides space for a RØDECaster Pro II and an M1-powered MacBook Pro, which records audio from the RØDECaster and video from four Logitech BRIO 4K webcams using OBS Studio, the free, open-source, cross-platform screencasting and streaming app.
Giving the PodVan an on-location feel is the rear of the van, which has been modified with a hinged, horizontal window opening upwards, and a horizontal panel which folds down, on legs, to form a mini-stage, allowing passing members of the public to watch as podcast episodes are being recorded. Powering the PodVan are two solar-charged 150 amp-hour batteries and a 2500 watt inverter providing 240 volts to the studio equipment. When in transit, an Anderson-style connector (designed for high-current 12V circuits) charges the batteries from the car as it tows the caravan. As a final redundancy, the PodVan features an external power point, allowing it to be connected to a domestic or campground outlet.
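As a back-of-envelope check on that power budget, assume the two batteries are 12 V, 150 Ah units (the article quotes "150 amp"); the usable depth-of-discharge and inverter efficiency figures below are assumptions, and the load figure is illustrative.

```python
# Rough runtime estimate for the PodVan's battery/inverter setup.
# Assumed figures: 12 V batteries, 50% usable depth of discharge,
# 85% inverter efficiency. Only the 2 x 150 Ah and 2500 W inverter
# come from the article.

def runtime_hours(load_watts, batteries=2, capacity_ah=150.0, volts=12.0,
                  usable_fraction=0.5, inverter_efficiency=0.85):
    """Hours of runtime for a given 240 V load drawn through the inverter."""
    stored_wh = batteries * capacity_ah * volts   # total energy: 3600 Wh here
    usable_wh = stored_wh * usable_fraction       # protect battery lifespan
    return usable_wh * inverter_efficiency / load_watts

# A modest studio load (laptop, mixer, lighting) of ~200 W:
print(round(runtime_hours(200), 2))  # hours of recording per charge
```

Under these assumptions a 200 W studio draw runs well past a full recording session, with the 2500 W inverter rating limiting peak draw rather than endurance.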
The exterior of the PodVan also functions as a mobile billboard with website and social media addresses, QR codes and branding for local businesses who pay monthly sponsorships.
These include hardware outlet Mitre 10 and Power Creative, a local production company which is providing free video branding services. Also on board is Central Coast graphic design firm Jetstream Graphics.
Mitre 10 has also provided retail space at its store for PodVan merchandise, including caps, hoodies, T-shirts and coffee mugs (also available via the PodVan website).
CONTENT SIDE
When it came to content for the PodVan Podcast, Rabbit’s initial plan was to place signage inscribed with a topic of the day outside the van to draw in potential guests, but this was superseded almost immediately as curious passers-by continually stuck their heads in and consented to being interviewed. According to Rabbit, “I find that I’ll pull up somewhere and I’ll just get chatting to someone, and it could be that they’ll ask about the podcast and I’ll say, ‘I used to be in radio, and I’m doing this’, and they’ll say, ‘Oh, that’s like me, I used to be a High Court judge. Now, I’ve set up this little cafe’, and so on. It’s just become the
Making truly mobile media: PodVan Podcast founder Dave “Rabbit” Rabbetts.
The PodVan on location.
whole thing of ‘everyone’s got a story’, and there doesn’t need to be a theme to the episode. You never know what you’re going to get from one episode to the next, and people are liking that.”
DISTRIBUTION
Once recorded, a PodVan Podcast episode is edited using Adobe Premiere Pro. Video shorts are produced for Rabbit’s social media pages, and a full-length MP3 audio file is produced for upload to the Acast.com podcast publishing platform. Acast’s technology inserts advertising into the programme and pushes it out to a variety of public-facing platforms, including Spotify, Apple Podcasts and iHeartRadio. Acast also provides a streaming player which is embedded on Rabbit’s podvan.com.au website. Episode lengths vary depending on the interview subject.
“I think 22 minutes is the shortest one that I’ve done,” says Rabbit, “and then there was one that was over an hour and a half. It’s just whatever I get for that particular episode. Sometimes I’ll lump them together, so I might get
someone jump in for 15 minutes, and then someone else tells a story for about 10 minutes, and then another one for another 10 minutes, I’ll put those three into one episode, so it’s not just like a 20 minute episode. It can get up to 40 minutes. That seems to be about a sweet spot, about 40 to 60 minutes.”
ANALYTICS
With a publishing schedule currently comprising three episodes per week, Rabbit reserves Mondays for analytics. While it is still early days for the podcast, and listening patterns are still emerging, episodes such as an interview with local influencer Sarah Kearns have resulted in audience crossover and new subscribers.
“She’s a ‘coastie’ who has hundreds of thousands of followers,” says Rabbit. “She jumped in the van and shared some posts about it on her pages. And that episode has gone huge. Two things on that. One, fans of hers that know she’s on a podcast, so they want to listen to it. And then the other one is so
many people that used to listen on the radio have rediscovered me. They’re looking at her Instagram and she’s like, ‘here’s Rabbit, and he’s doing a podcast’ and they’re saying ‘Rabbit? He’s still alive. Oh, my God’. So I’ve got new listeners out of that.
“And on a Monday, I also go back through and look at Facebook, Instagram, YouTube, TikTok, etc. Each one of them works differently as far as how important is the caption or the hook. On some, it matters. On some, it doesn’t matter in the slightest. It’s more about the first three seconds on different platforms. And the whole reason of putting all those videos and the shorts and the reels and things up there is to let people know about the podcast.
“I had a notification that I’d hit a million impressions, so a million sets of eyeballs had seen my stuff on social media since starting the podcast, and it was only March 1st. So, it was three months and a million people, and that’s across Facebook, Instagram, YouTube and TikTok. That number’s ridiculous.”
Visit https://podvan.com.au
Australia’s ALIVE RADIO Relies on AEQ for Outdoor Events
RADIO ALIVE IS A community radio station based in Baulkham Hills, from where it serves the Hills District and parts of Greater Western Sydney. Founded in 1992, it has been broadcasting since its inception on 90.5 MHz FM and currently also on its website. The station’s programming features a predominantly contemporary music and community information format, including public assistance sections.
Radio Alive 90.5FM is a long-time customer of AEQ technology. Its main broadcast studio is equipped with an AEQ CAPITOL IP digital console. As a further endorsement of the brand’s technology, the station has deployed an outdoor work kit manufactured by AEQ’s local partner in the country, Broadcast Components, which has the ALIO audio codec as the centrepiece of the system.
Radio Alive aims to stay close to the events of its community, and this new outdoor kit lets it take its programmes physically to the audience, offering the possibility of producing much of its content from anywhere in the city: parks, hotels or restaurants, with professional audio quality and a negligible investment in communication lines.
AEQ’s ALIO is a portable IP audio codec with dual stereo channels. It was designed specifically for sports broadcasts, but has been optimised for easy use in the most varied scenarios, including music events. Its compact design, resistant to pressure, shock and even liquids, is suited to outdoor use, where equipment cannot always be handled carefully.
AEQ ALIO can connect to most manufacturers’ base equipment via the SIP communication protocol, according to EBU standard N/ACIP Tech 3326. If you connect with another AEQ codec, however, you can use a unique set of tools to help you communicate with and control the unit. In addition, the AEQ ControlPhoenix audiocodec management software gives you full control of the unit and all its functions.
Chantelle Pritchard, National Operations and Broadcast Sales Manager with Broadcast Components, undertook commissioning of the system as well as training the staff at Radio Alive. The equipment has already been used on numerous occasions, including Run for the Hill, the Mateship Fair and the Hills Winter Sleep Out.
Visit: https://www.alive905.com.au/, https://broadcastcomponents.com.au/, and https://www.aeq.eu/
Shure Lavalier Condenser Mics
SHURE HAS ANNOUNCED its updated, newly designed low-profile WL18Xm professional lavalier condenser microphones, including the cardioid WL185m, supercardioid WL184m, and omnidirectional WL183m, for use with wireless bodypack transmitters.
The refreshed portfolio builds on the legacy and performance of existing Shure WL lavalier microphone technology, offering full and accurate sound quality with an improved presence response. The WL18Xm line delivers enhanced specifications for optimal compatibility with modern wireless systems and is specifically designed for speech and presentation applications in corporate, education, and worship environments.
“Our WL series has been the go-to lavalier solution for decades, known for reliability, durability, and full, accurate sound. As wireless technology continues to improve, we saw an opportunity to innovate and evolve the industry-standard WL line into a product that meets the requirements of professionals using today’s modern, digital systems,” said Nick Wood, Senior Director, Professional Audio Products, at Shure. “We’re introducing a product that users already know and love, now with lower self-noise, improved RF immunity, higher dynamic range, and improved max SPL.”
The WL18Xm line is 8 mm shorter than its predecessor, making it lighter, lower-profile, and easier to use. For flexible placement on speakers, presenters, and performers, each lavalier is available in either black or white, with LEMO or TA4F connector options. The microphone’s innovative tie clip can be rotated in 90° intervals for accurate positioning.
Visit https://www.jands.com.au
Shotgun Microphones
Axient® Digital ADX5D Dual-Channel Wireless Receiver
Axient® Digital Wireless Systems
Twinplex Lavalier Mics
Broadcast Microphones and Plug-on Transmitters
Q5X Wireless Transmitters
Radio on the Right Track with Telos Alliance’s Pathfinder
THE AXIA PATHFINDER Core PRO Broadcast Controller from Telos Alliance is designed to simplify routing in radio facilities. Available as a 1RU appliance or as a virtual machine, it allows users to customise and command their Axia network by streamlining functionality via a web interface and Logic Flows - a flow-chart-style events system that makes events easier to create, adjust, and monitor in real time.
Represented in Australia and New Zealand by AVC Group, Telos Axia technology has been rolled out in radio networks across the region. These include ARN, SCA, SBS, ABC, Nine, NZME, and Radio New Zealand. To the surprise of Telos, these broadcasters, with development help from AVC, have been pushing the limits of its technology, and Pathfinder in particular, in ways not seen in its other markets across the world.
According to Cam Eicher, Senior Vice President of Worldwide Sales with Telos, “Australia is probably one of the most technically advanced markets in the world in terms of how they use their equipment to try to get the most out of it that they can. They see possibilities in a product, and their goal is to be able to utilise everything a product can offer, whereas many commercial broadcasters in the U.S. have developed really simple workflows. They don’t have the size and scope of what we’re trying to do here in Australia. So, we’ve been really impressed about what they’ve been able to do with these products.”
“Pathfinder is a control application,” says Dan Bays, Pathfinder inventor and product manager. “So, what it does is it reaches out to each device - you’ve got these sources and these destinations - and pulls that all into a central router so that when you make a route change, it may be reaching into this device to tell that one to pick up a new stream or that device to pick up that new stream. So, it is constantly reaching into all of this and then tries to bring that together to look like a contiguous system.
“What the customer wants to see is just a user panel where they hit a button and 37 things happen underneath and away it goes. It’s very graphical. So, the idea is instead of writing a number of scripts, all of the logic is represented in a graphical flowchart style that will actually show you the state of the logic as it’s passing through the system.”
But, the system also does more than just direct the routing of audio streams.
Being IP-based, it can control a variety of studio elements, from lighting to branding to PTZ cameras.
“You can think of it like an omniscient system monitoring and system control tool, so it has visibility on your entire ecosystem as much as you’d like to have it visible to it,” says Dan Bays. “So you could even have your lighting controls, you could have your air conditioning controls, your security, opening a door control, all these different logic devices can be attached to Pathfinder as well as the audio sources, the console sources, whatever you’re trying to do from an audio standpoint.
“And this allows you to tailor the visual aspect and the use case aspect to your talent or to your engineering staff. So, you can make complex actions really simple. You can boil it down to a button for the announcer which will change a log, it’ll change a studio delegation, it’ll change a routing system because you’re going from the breakfast show to the news show and being able to automate functionality as well as providing monitoring and logging.”
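The “one button, many actions” pattern Bays describes can be sketched as a simple event flow. This is a hypothetical Python illustration only; the device names and actions are invented, and Pathfinder itself expresses this logic graphically rather than in code:

```python
# Hypothetical sketch of a Logic Flow: one trigger fans out to many
# downstream actions (routing, delegation, lighting). Names invented.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class LogicFlow:
    name: str
    actions: list[Callable[[], str]] = field(default_factory=list)

    def fire(self) -> list[str]:
        # A single trigger runs every attached action in order,
        # like one panel button driving many studio changes at once.
        return [action() for action in self.actions]

flow = LogicFlow("breakfast-to-news")
flow.actions += [
    lambda: "route: studio 2 mic bus -> network feed",
    lambda: "delegate: studio 2 on air",
    lambda: "lighting: studio 1 ON AIR lamp off",
]

for result in flow.fire():
    print(result)
```

In the real product the equivalent flowchart also displays live state as signals pass through, which a linear script like this cannot show.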
This control can even extend to remote workflows.
“When we think about control software,” says Bays, “anything we can see, and by see we mean access it on the network, we can control. So it can be nationwide, it can be within the walls of a studio, you get to restrict the visibility of what the system can see and how you want to control it.”
User interfaces for the system can also vary, depending on the end user and the action being executed. For example, screen-based “panels” can be configured as producer stations, giving producers the ability to quickly control speakers and monitors, guest microphones, or change what’s being fed back to hosts and guests.
According to AVC Director of Strategic Accounts, Ian Cambell, “You can design any number of panels and, because we’re moving into a very visual radio phase where there’s cameras in the studio and lots of branding, well, the marketing department might get involved in the look of this if it’s going to appear in a camera shot, so you want the right colour scheme, the right branding, and so allowing a user to have some predefined templates that this is Station 1’s branding and then be able to more quickly design a panel that matches the corporate branding. And then if you move over to Station 2, everything changes colours.
“And there’s dozens of them. They can all look very, very different. Different routing controls, different screens outside the studio, clocks and delegation systems, so whatever you can visualise when you talk about custom masks, what Pathfinder really allows is the engineering department of a radio station to be able to deliver whatever custom needs that the users and talent and production people ask for.
“Dan said to me, ‘I’ve got the hammers and nails and I’ve provided a bit of timber. You guys have been building the house.’ I think that was actually a really good way to describe it and we can build whatever we want. Dan will attest that the AVC developers pushed the product to its limit. We just want to keep pushing the limits. The Australian industry has got a good reputation for pushing beyond what a product is originally built for.”
“I see it as a creative tool, not a cost-saving tool,” says Dan Bays. “This is a way of improving how your station works, a creative tool to help an engineer get involved. We provide very, very cool gear, and it works out of the box in a certain fashion. But if there’s something that you need to adjust because it doesn’t quite work exactly the way your studio requires it, this Pathfinder tool allows you to make those adjustments. It’s almost the tailoring of a suit where you can buy it and it looks reasonably good, and this will let you do the fine-tuning and the customisation to get you exactly what the users want.”
Visit https://www.telosalliance.com/ and https://www.avc-group.net/
[L-R] Cam Eicher, Senior Vice President of Worldwide Sales with Telos and Dan Bays, Pathfinder inventor and product manager.
Pro Tools ARA Support for Leading Plugins
AVID’S RECENT update for its Pro Tools audio production software introduces ARA 2 support for solutions from iZotope, Synchro Arts and Sound Radix. The update also adds two new included MIDI effect plugins from Mixed In Key, session export for Media Composer, Dolby Atmos enhancements, improved searching and importing of session data, memory locations enhancements, MIDI pitch labels, and more.
Pro Tools 2024.6 adds ARA 2 support for six industry-leading plugins, including iZotope RX Spectral Editor, Synchro Arts RePitch, VocAlign, and Revoice Pro, as well as Sound Radix Auto-Align 2 and Auto-Align Post 2. Together with existing Celemony Melodyne integration, ARA 2 support enables users to work with these tools significantly more quickly and easily in Pro Tools without having to round-trip audio. Additionally, customers with active Pro Tools subscriptions and perpetual upgrade plans will automatically receive the ARA 2 plugins for iZotope RX Spectral Editor and Synchro Arts RePitch Elements as part of this release.
MIDI Effect Plugins
The 2024.6 update gives Pro Tools customers two new MIDI plugins by Mixed In Key: Captain Chords Lite and Human Lite. Captain Chords Lite is a perfect songwriting
companion that allows users to pick any key and instantly create chords to try out, with complete control over inversions, substitutions, and passing tones. Captain Chords Lite also includes a library of chord progressions and rhythms to experiment and create something amazing. The Human Lite plugin adds the organic feel of live musicians to quantized MIDI performances, loops, and even entire tracks. The Captain Chords Lite and Human Lite plugins are exclusively available to all Pro Tools customers with active subscriptions and perpetual upgrade plans.
Pro Tools 2024.6 also includes:
Session Export for Avid Media Composer
Building on recent interoperability improvements, Pro Tools Ultimate users can now save a copy of a session as a Media Composer compatible file, greatly simplifying sound-for-picture workflows. This new file format includes sample-accurate volume and pan automation and markers that can be imported into Media Composer to streamline workflows between teams.
• Enhancements to the internal Pro Tools Dolby Atmos renderer, including 9.1.4 and 7.1.2 monitoring and metering directly in the renderer window, Stereo Direct output format support, and dedicated controls for soloing Atmos groups and setting binaural and group settings in the I/O Setup.
• New comprehensive filtering options, more visual information, and the ability to import entire folders with member tracks, speeding up the importing of tracks from different sessions.
• New Memory Location improvements with separate modes for managing/multiselecting memory locations and interacting/recalling them, plus new key commands that enable users to more easily navigate between timeline marker locations.
• MIDI pitch labels on the piano roll and MIDI notes, making it easier for creators not familiar with music theory to recognize which notes they are playing, recording, or need to modify in chords.
Pro Tools 2024.6 is now available to all Pro Tools customers on an active subscription or perpetual licence with a current Upgrade Plan, as well as all Pro Tools Intro users.
Visit https://www.avid.com/
Avid Upgrades NEXIS for Demanding Audio Workflows
AVID HAS ANNOUNCED enhancements to its industry-leading software-defined storage solution, Avid NEXIS, to meet the performance needs of audio and video professionals – from small teams to the largest media enterprises.
Optimised for real-time media production, Avid NEXIS is the industry standard collaborative shared storage solution, designed to provide the performance, scalability, and availability that powers professional audio and video content creation workflows. Ideal for post-production facilities, dubbing studios, and multi-room studios using multiple Avid Pro Tools® workstations, NEXIS offers a range of storage ‘engines’, both on premises and in the cloud.
For the most demanding audio and video workflows, Avid recommends customers deploy the Avid NEXIS F2 SSD engine. With Avid NEXIS | VFS software version 2023.12 or later, SSD-based NEXIS systems are now performance-verified to support up to 8,000 tracks per NEXIS F2 SSD across hundreds of connected Pro Tools workstations accessing audio clips, session files, video sequences and other media simultaneously, while enabling 6x faster performance for audio punch-in.
Seamlessly integrated within Avid Media Composer® and Avid Pro Tools, the Avid NEXIS F2 SSD offers ultimate performance, scalability, and real-time media delivery to track and mix sound-for-picture and other high-track-count, immersive audio projects.
For smaller-scale projects, production teams can deploy the Avid NEXIS F-series and Avid NEXIS | PRO+ engines. With Avid NEXIS | VFS software 2023.12 or later, they also offer verified multi-user performance for Pro Tools audio workflows, now enabling 2x faster performance for audio punch-in and 20% faster save times for Pro Tools sessions. The NEXIS F-series supports up to 175 million files and 500 voices per media pack, while the NEXIS | PRO+ supports up to 5 million files and 500 voices per media pack.
In any NEXIS configuration, production teams no longer need to spend time duplicating files or moving them around. Their media can be securely, centrally stored, providing fast simultaneous access to all authorised team members.
Visit https://www.avid.com/
Nine Network Deploys Black Box’s Emerald IP KVM
BLACK BOX, a leading IT solutions and consulting services provider to businesses worldwide, has announced that Australian media company Nine Network is using a Black Box Emerald IP KVM solution to ensure reliable, flexible signal extension across offices and studios at its 1 Denison Street location in Sydney. Black Box Emerald 2K and 4K transmitters and receivers, along with the Boxilla KVM Manager, are deployed in a redundant KVM architecture over an existing copper-based IP network to guarantee users continuous remote access to centrally located systems supporting critical tasks such as graphics creation and video editing.
“Reliable connectivity is a must within a media operation, especially one such as Nine Network, where teams collaborate to create a high volume of live content, day in and day out,” said Tom Fitzgerald, Black Box KVM product manager. “With management centralised and simplified by our Boxilla system, Black Box Emerald KVM devices integrate readily onto existing IP networks and operate seamlessly with key broadcast systems to support vital operations.”
The Black Box Emerald IP KVM system forms an integral part of the Nine Network broadcast production and studio environment used for creating 12 to 14 hours of live television each day, as well as promotional content. Capable of transporting data over standard IP networks, with redundant connections using physically different paths/networks and equipped with redundant power, the Emerald KVM solution addresses Nine Network’s top requirement: reliability. With this multi-level redundancy, the KVM system can tolerate various types of connectivity failure between transmitters and receivers, in turn preventing interruption of operators’ work.
An API integration of Boxilla with Nine Network’s LAWO VSM IP broadcast control and workflow solution allows the company to control the active connections for its new Emerald KVM receivers. Operators can switch between multiple systems for day-to-day operations, or quickly swap to backup systems in the event of a failure. Support for custom USB HID/control peripherals enables studio operators to switch between different host devices and operate them using the same custom USB devices they use to deliver live news bulletins.
“The willingness and ability of the Black Box development teams to work with us throughout deployment to resolve issues and include required features in the lifecycle was very reassuring,” said Matt Benson, group enterprise architect, Information Technology at Nine Network. “This collaboration helped to ensure that the Emerald KVM system could meet all of our needs. As we expand our Sydney site and redevelop other news production facilities around Australia to replace ageing systems, the Emerald IP KVM system is a proven solution that we can deploy consistently to ensure essential performance and reliability.”
Visit https://www.blackbox.com
World-First Demo of STL Using IP/PTP Over Microwave
MEDIA LINKS, a manufacturer of Media over IP transport technology, has succeeded in a world-first demonstration of PTP and STL (studio-to-transmitter link) transmission using IP over dedicated microwave lines. Dedicated microwave lines are widely used for STLs at many broadcasting stations both in Japan and overseas. Based on the success of this demonstration, Media Links will now offer STL transmission using the IP/PTP transmission method to broadcast stations both in Japan and across the globe.
Broadcasting systems are rapidly changing from networks that utilise traditional coaxial cables to systems based on new IP technologies. For STLs, however, which are the backbone of many broadcast networks, the locations of transmitting stations (mountaintops, etc.) are often unsuitable for wired networks, and microwave transmission remains essential. It has therefore become important to develop IP transmission technology using PTP over microwave.
Media Links was the first in the world to develop and deploy STL using PTP over a wired IP network (STL over IP/PTP). Building on this work, the company has now developed a new technology solution (STL over IP/PTP Microwave) that again realises STL over IP/PTP, but now over dedicated microwave lines.
Together with its partner Oscilloquartz, Media Links conducted a demonstration of the technology on an existing microwave installation. The demonstration was carried out at a broadcasting station in the Philippines from May 13 to 17, 2024. It used Media Links’ STL over IP/PTP device, the MDP3020 SFN, to transmit both broadcast content and PTP timing, and leveraged the existing equipment installed by the broadcast station for normal microwave transmit and receive operations. The team confirmed normal transmission of signals and accurate transfer of timing by PTP, thereby realising the world’s first STL using IP and PTP over microwave lines.
Tsukasa Sugawara, CEO of Media Links, commented: “The success of this demonstration has made it possible to realise STL transmission by IP/PTP over microwave lines, where previously this was a difficult step in the IP migration of broadcaster systems. Going forward, we will not only propose IP migration within broadcast stations, but also the IP migration of STL transmission, thereby delivering a complete IP migration of broadcast systems.”
Anil K Reddy, Associate Vice President of Oscilloquartz, Adtran Networks, commented: “It is exciting to see the success of the trial of PTP over IP microwave networks with OSA synchronisation technology. Adtran’s Oscilloquartz innovative and versatile solutions bring the highest timing accuracy required in packet broadcast networks, both on wired and wireless networks, for a better quality of experience.”
Visit https://www.medialinks.com/
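For readers unfamiliar with how PTP transfers time, the standard IEEE 1588 delay request-response exchange computes clock offset and path delay from four timestamps. Its key assumption is a symmetric path delay, which is part of why carrying PTP accurately over a microwave hop is a meaningful engineering result. A minimal sketch with illustrative numbers (not figures from the demonstration above):

```python
# IEEE 1588 (PTP) offset/delay calculation from the four timestamps of
# a Sync / Delay_Req exchange. Timestamps below are illustrative only.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: Sync sent by master, t2: Sync received by slave,
    t3: Delay_Req sent by slave, t4: Delay_Req received by master.
    Assumes symmetric path delay, the core PTP assumption."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay estimate
    return offset, delay

# Example: slave clock runs 5 us ahead, one-way path delay is 40 us.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=45e-6, t3=100e-6, t4=135e-6)
print(offset, delay)  # 5e-06, 4e-05
```

Any asymmetry between the forward and return paths appears directly as offset error, which is why link characterisation matters on microwave as much as on fibre.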
Native NDI Bridge Multi-channel Video Conversion Tool
VIZRT, THE PROVIDER OF real-time graphics and live production technology for content creators, has announced Viz Connect Tetra, an ultra-compact live production workstation that enables multi-channel 4K video and audio connectivity with a simple internet or network connection, anywhere in the world.
“Live production has historically required multiple video converters and software platforms for differing workflow needs. Remote production requirements typically add additional complexity and conversion, especially with cloud integration. Viz Connect Tetra solves this by rolling all you need for remote live contribution into one device, revolutionising workflow, scalability, and stability for live productions,”
said Liam Hayter, Product Manager, Vizrt.
Enabling Easier Video and Audio Conversion, Anywhere
Viz Connect Tetra helps meet the ever-changing demands of multi-camera, on-location, live productions with four flexible channels supporting up to 4K with no compromise. Tetra’s I/O channels can be used as up to 12G NDI to SDI, 12G SDI to NDI, or even NDI to NDI, and the unit is as at home in a server room rack as it is on a desk or on location, wherever connection is needed.
Tetra’s in-built white balance and colour correction tools mean these all-important quality adjustments can be made directly at the point of conversion – saving time.
Audio has never been more flexible than with Viz Connect Tetra, whether via NDI, SDI, or ASIO and WDM virtual sound cards. For the very first time in a Vizrt product, each of the four I/O channels supports 16×16 audio routing for patching on the go, making it easy to respond to any live production requirement.
Access to the Cloud Made Simpler with NDI 6
Featuring access to NDI Bridge join-mode in the device’s user interface, this compact converter can send and receive NDI for remote production anywhere in the world. Using NDI Bridge on Viz Connect Tetra, users can seamlessly enable production capabilities with cloud and remote productions without additional hardware. Effortlessly connecting to TriCaster, Vizrt graphics solutions, and other NDI and SDI live production systems, Viz Connect Tetra can be used either on premises or in the cloud, from anywhere in the world. NDI Bridge’s built-in encoding capabilities empower users to choose from NDI High Bandwidth or NDI HX with bandwidth control – suiting almost any connectivity constraints.
HTML 5 Graphics Support
Viz Connect Tetra bridges the gap between HTML5 graphics tools, traditional SDI infrastructure, and the world of NDI IP technology. Furthermore, Viz Connect Tetra is the first multi-channel conversion solution to support Viz Flowics and HTML5 graphics independently of TriCaster, converting HTML5 outputs into not only NDI, but also up to two key and fill pairs for baseband SDI workflows.
Remote Post-Production Pipelines
With NDI Bridge Join mode, Viz Connect Tetra systems can be securely linked to NDI Bridge Hosts anywhere in the world, enabling bidirectional sharing of content and contribution between any location with internet or WAN access.
“Video productions rely on collaboration. Viz Connect Tetra helps remote content creators to securely connect, expand their sources, and create a private peer-to-peer content network. It integrates video and audio feeds from different locations into a single NDI or SDI live production environment via an NDI Bridge Host in the data centre,” adds Hayter.
With seamless connectivity to a multitude of post-production tools, Viz Connect Tetra enables access to flexible, cloud-based, and remote production environments. With NDI-enabled post-production systems, editors, and even producers and directors no longer need to be in the same room.
Viz Connect Tetra’s compact footprint, 12G 4K UHD output, and the secure encryption workflow provided by NDI Bridge (with the data centre host in control of connections) make it the perfect desktop companion for post-production professionals to land their programme output on broadcast-grade displays wherever they are, even when operating via remote desktop systems many miles away.
Bridge Technologies Enhances Probe MicroBursting Analysis
BRIDGE TECHNOLOGIES has announced further improvements to its Microbitrate analytics, integrated into the VB330, VB220, VB120 and NOMAD monitoring solutions.
According to the company, this proprietary feature is unique amongst IP-based broadcast monitoring probes and provides unparalleled insight into network performance for broadcasters wanting superior Quality of Service (QoS) and Quality of Experience (QoE).
Traditional network traffic measurement instruments often rely on average metrics over a second to assess network performance. In reality though, a network is subject to ‘microbursts’, a phenomenon in which data packets are transmitted in rapid, oversized bursts. Although these constitute only momentary fluctuations, they have the potential to overflow the buffers of the network stack, undermining broadcast quality. Thus, a per-second readout of network performance may indicate that – on average – everything is operating within parameters, when in reality the broadcast is subject to jitter and dropped packets. Indeed, a network showing an average rate of four gigabits per second may experience microsecond-level peaks reaching as high as 25 gigabits per second.
By addressing this issue at the microsecond level, Bridge Technologies says its Microbitrate feature helps engineers to pinpoint the exact moments and causes of microbursts, facilitating immediate and effective troubleshooting. While the core functionality of Bridge Technologies’ Microbitrate analytics has been integrated into the IP probes for over six years, it undergoes continuous evolution with each new version release, providing increasingly intuitive and in-depth visualisations of network traffic peaks and troughs. With the ability to customise the time increments at which measurement is undertaken, no other broadcast probe on the market, the company says, offers this level of detail and user-friendly presentation.
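As a rough illustration of the averaging problem described above (a toy sketch, not Bridge Technologies' implementation), consider the same one second of traffic measured two ways:

```python
# Why per-second averages hide microbursts: the same second of traffic
# measured as a one-second average vs. per-millisecond peak.

def peak_rate_bps(byte_counts, window_s):
    """Peak rate in bits/s across fixed-size measurement windows."""
    return max(byte_counts) * 8 / window_s

# One second of traffic split into 1000 windows of 1 ms each:
# mostly quiet, with a single 1 ms burst of 1,250,000 bytes.
windows = [500] * 1000       # 500 bytes per ms of background traffic
windows[300] = 1_250_000     # one microburst

avg_bps = sum(windows) * 8 / 1.0        # per-second average
peak_bps = peak_rate_bps(windows, 0.001)  # per-millisecond peak

print(f"average: {avg_bps / 1e6:.1f} Mbit/s")  # ~14.0 Mbit/s
print(f"peak:    {peak_bps / 1e6:.1f} Mbit/s")  # 10000.0 Mbit/s
```

A per-second readout reports roughly 14 Mbit/s, while the millisecond-resolution view exposes a 10 Gbit/s spike, exactly the kind of event that can overflow receive buffers.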
“The ability to monitor and analyse microbursts is indispensable for providers dealing with IP transmissions,” said Simen Frostad, Chairman
of Bridge Technologies. “With our version 6.3 release, we have further improved our MicroBursting analytics, empowering users to maintain the highest standards of distribution quality, and ensuring seamless delivery and optimal viewer experiences.”
Visit https://www.bridgetech.tv
Mediaproxy LogServer Integrates AI-Media’s LEXI
MEDIAPROXY, the provider of software-based IP compliance solutions, has further expanded the capabilities and scope of its LogServer compliance monitoring system through integration with AI-Media Technologies’ recently launched LEXI Recorded self-service captioning and transcription platform.
This is the latest development in the ongoing partnership between Mediaproxy and fellow Australian broadcast technology developer AI-Media. The closed caption specialist was originally a Mediaproxy customer, but the two companies have since been collaborating more closely, beginning by integrating LogServer with AI-Media’s iCap cloud network.
Founded in Sydney in 2003 with the aim of making television programmes more accessible to Deaf and Hard of Hearing viewers, AI-Media has developed a full range of captioning technologies to generate, distribute and display subtitles for a variety of broadcasting and streaming platforms. These include viewing systems and encoders, with its core offering being the LEXI AI-powered toolkit. Among its features are automatic captioning, translation and disaster recovery capabilities.
LEXI Recorded is the latest addition to this suite of tools and was launched in February this year. It is a cloud service designed to deliver fast turn-around captions for Video On Demand along with transcriptions and translations of broadcast output. Like LogServer, LEXI Recorded is file-based, which has allowed for easy integration of the two systems.
Mediaproxy announced at the 2024 NAB Show that LogServer is now able to work with LEXI Recorded, providing outputs from its logger/recorder that are taken into the AI-Media system for captioning, translation and transcription purposes. “Our technology is also fully integrated into LogServer, enabling broadcasters to create a transcription on demand of video content by simply selecting an in point and an out point on the clip they want captioned and sending it to LEXI Recorded from within the LogServer software,” explains Declan Gallagher, general manager of sales for the Asia Pacific region at AI-Media. “The use case is primarily for compliance reasons. If a broadcaster was asked what went to air and didn’t currently caption that channel, they would be able to quickly turn around a transcript of the content in question. Integrating further with LogServer has given our new service – and our users – a high degree of functionality and flexibility for an increasingly important aspect of broadcasting.”
Mediaproxy’s chief executive, Erik Otto, comments: “The integration of LEXI Recorded is an important addition to the features of our LogServer product and gives a great deal of flexibility in the crucial area of captioning and transcription. The partnership with AI-Media is a long-standing one and is only getting stronger. We are pleased to be working with such dedicated experts in this specialist field and helping to provide better accessibility for a wider TV audience.”
Visit https://www.ai-media.tv and https://www.mediaproxy.com
Veset AdWise Launched in AWS Marketplace
VESET HAS ANNOUNCED that Veset AdWise is now available in AWS Marketplace.
Veset AdWise is the company’s cloud-based solution for real-time, in-video advertising, allowing for elevated viewer experience and ad effectiveness in live streams for broadcasters and content providers. AdWise utilises advanced picture-in-picture technology, seamlessly integrating contextual advertising into live feeds.
With AdWise, dynamic advertising is taken up a notch with a smooth, engaging user experience attainable without disrupting the live feed. AdWise allows broadcasters, service and content providers to implement seamlessly integrated advertisements, instantly update advertising content in response to dynamic live events, and gain the flexibility to choose from a wide variety of advertising formats.
Veset AdWise supports the rise of dynamic, contextual advertising, allowing media providers to create advanced, user-friendly in-video adverts without disrupting the live feed. Broadcasters and media providers can further enhance their broadcasting workflows by prioritising user experience and monetisation in a way that is non-intrusive and beneficial for both consumer and provider.
Bringing Veset AdWise to AWS Marketplace gives the wider industry access to in-video, contextual advertising, letting content providers and broadcasters continue to create, manage and leverage content with advertising while also taking the consumer experience into consideration.
Gatis Gailis, CEO, Veset, commented: “The launch of AdWise in AWS Marketplace is a step forward in accessible, non-intrusive advertising solutions for live content delivery. By introducing in-video, contextual advertising to broadcasters and content providers in the industry, user experience can be prioritised, and a new era of ad-tech is being realised.”
Visit https://www.veset.tv
We are looking for a Technical Sales Executive to join our team and drive revenue growth in the ANZ region. You will focus on selling Broadcast and Professional video and audio solutions, including SMPTE 2110, NDI, IP, and baseband audio and video.
You will work with brands such as Calrec, Shure, Sennheiser, Sound Devices, AIDA, Kiloview, Science Image, Sienna NDI, TVU Networks, and Matrox. $120K plus uncapped commission.
To apply, email sales.aus@d2n.com.au
GatesAir Addresses Industry’s RF Engineering Gaps
GATESAIR, a Thomson Broadcast subsidiary dedicated to wireless content delivery, has announced the launch of GatesAir Care, an enhanced managed services program that addresses the urgent need for skilled RF resources and services in the global broadcast industry.
Available immediately, the GatesAir Care program offers defined service level agreements across three-tiered support plans (Elite, Signature, Standard) that collectively aim to fill widening gaps in engineering skillsets as longtime RF engineers approach retirement. The number of services included escalates with each tier, beginning with simple extended warranties, discounted spare parts, and preventative maintenance visits to keep transmitters and associated RF systems in excellent operating condition. Additional services are available with each tier, which can include installation and commissioning, 24/7 remote maintenance and monitoring, and live, onsite technical support for major broadcast events.
“Most young engineers coming into the business today possess a strong IT skillset but have limited to no RF expertise,” said Raymond Miklius, Vice President of Technology, GatesAir. “We developed GatesAir Care to alleviate concerns about finding skilled engineers to take care of RF systems and plants as veteran TV and radio engineers reduce hours and ultimately retire.”
All three program tiers can also scale to the size of the broadcaster, including dedicated services in smaller markets that are especially light on engineering resources.
GatesAir Care will also offer options for crisis team deployment and disaster recovery services in the event of severe weather or other unanticipated events that require immediate attention.
Mark Goins, Vice President, Global Sales for GatesAir, adds that the
Ki Pro GO2 Multi-channel HEVC/AVC Recorder
AJA VIDEO SYSTEMS has announced Ki Pro GO2, the next generation of its Ki Pro GO multichannel HD/SD recorder. Sporting an upgraded feature set, the portable, 2RU device facilitates up to four channels of simultaneous H.265 (HEVC) or H.264 (AVC) recording to cost-efficient USB 3.0 drives or network storage with redundant recording and singlechannel playback. Ki Pro GO2 helps production professionals produce higher quality images at lower bit rates, allowing for longer recording times. It also provides flexible connectivity, including four 3G-SDI and four HDMI digital video inputs, so they can connect to a wide range of video sources.
Ki Pro GO2 boasts an extensive feature set for reliable, highperformance recording
and playback in a range of environments, from concert tours and sporting events to houses of worship, universities, and beyond. Setup and operation are simple with an HD display for precise monitoring; video monitoring and menu/status overlays with an on-screen keyboard; dedicated record, play, stop, rewind, and fastforward buttons; and a standard Ethernet LAN connection for remote configuration and control of the device over a network from any web browser using Ki Pro GO2’s intuitive web user interface (UI).
Integrated input frame synchronisers ensure users don’t have to genlock incoming video signals, while high-quality de-interlacers on each input allow progressive recordings to be made from interlaced inputs.
Multi-channel matrix monitoring makes it convenient to monitor multiple camera feeds on a single HDMI or SDI display. Flexible audio functionality supports the capture of up to two channels of embedded audio from 3G-SDI or HDMI inputs, or two channels of Analogue Audio from XLR ports, each offering a selection of Line, Mic, or +48V Phantom Power for level control. One of the device’s analogue audio inputs can also act as a source for Longitudinal Time Code (LTC) insert.
Recording is convenient and cost-effective with Ki Pro GO2, as the device supports a host of off-the-shelf USB media. Ki Pro GO2 also records in real-time to network storage via its Ethernet connection and FAT or exFAT file systems are supported for cross-
Language Detection for QA of Captions & Subtitles
TAG VIDEO SYSTEMS has developed a new Language Detection feature which it says will transform how operators ensure quality and compliance across large scale operations with multiple closed captions and language subtitles.
The core of the technology lies in its ability to automatically pinpoint the language of subtitles within video content. Powered by advanced algorithms, TAG performs a meticulous quality analysis informed by languagespecific dictionaries, which in turn provides the data within the multiviewer output in the form
of two parameters: identified language, and quality percentage against the identified language’s dictionary. The data can also be aggregated and visualised with data visualisation tools to identify trends or centralise large scale operations. This significantly streamlines the caption monitoring process, offering these key benefits:
• Frees Up Operator Time: By eliminating manual monitoring of closed captions and subtitles, operators gain valuable time to focus on more strategic and nuanced tasks.
• Proactive Problem Solving:
Real-time monitoring paired with helpful, informative alerts enables operators to quickly address any potential caption quality problems.
• Pinpoint Accuracy: Reliable language identification and quality measurement against a dictionary helps avoid human error and assure accuracy and compliance.
• Quality That Meets the Mark: Language-specific analysis ensures captions uphold quality standards and regulations, ultimately enhancing the viewing experience.
GatesAir Care program ultimately aligns with the broadcast model’s increasing emphasis on operational efficiency. “Changes in the global RF engineering fleet are unfolding at a very swift pace over the past 12-18 months,” said Goins. “The GatesAir Care program will help broadcasters maintain their focus on operational efficiency with the peace of mind that our experienced, dedicated support staff will provide consistent, proactive, and responsive service tailored to each customer’s needs.
GatesAir will also offer beginning and advanced RF training courses in alignment with the GatesAir Care program to help younger broadcast engineers strengthen their RF skillsets.
platform use when transferring files to workstations, servers, and more. Multi-channel recording functionality ensures individual recordings for each input, so users can drag and drop files directly from a USB drive into their preferred creative software, and “group recording” is also supported. Clip names are laid out logically, and channel and backup are automatically appended for simple data management. Enhanced Super Out provides a timecode overlay, per channel audio VU meters, and media remaining percentages over SDI and HDMI monitor outputs. The device also facilitates playback of HEVC/AVC files created in supported thirdparty creative apps.
Visit: https://www.aja.com/kipro-go2
• Informed Decision-Making: Data visualisation tools provide insightful overviews of caption/ subtitles quality trends, empowering media operations to make strategic improvements.
Out provides a timecode overlay, per channel audio VU meters, and media remaining percentages over SDI and HDMI monitor outputs. The device also facilitates playback of HEVC/AVC files created in supported third-party creative apps.
Visit https://tagvs.com/
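TAG has not published its algorithm, but the two reported parameters (identified language and quality percentage against that language's dictionary) suggest a dictionary-scoring approach. A minimal, purely illustrative Python sketch of that idea, with tiny hypothetical word lists standing in for real language dictionaries:

```python
# Hypothetical sketch of dictionary-based language identification.
# Each subtitle's words are scored against per-language word lists;
# the best match is reported along with a quality percentage
# (share of words found in that language's dictionary).
# The word lists below are tiny illustrations, not real dictionaries.
DICTIONARIES = {
    "en": {"the", "and", "is", "to", "of", "you", "that"},
    "es": {"el", "la", "y", "es", "de", "que", "los"},
    "fr": {"le", "et", "est", "sur", "de", "que", "les"},
}

def identify_language(subtitle_text: str) -> tuple[str, float]:
    """Return (language, quality %) for the best-matching dictionary."""
    words = subtitle_text.lower().split()
    if not words:
        return ("unknown", 0.0)
    best_lang, best_score = "unknown", 0.0
    for lang, vocab in DICTIONARIES.items():
        hits = sum(1 for w in words if w in vocab)
        score = 100.0 * hits / len(words)
        if score > best_score:
            best_lang, best_score = lang, score
    return (best_lang, round(best_score, 1))
```

A low quality percentage for the identified language would then trigger the kind of multiviewer alert the article describes, flagging garbled or mismatched subtitle tracks for operator attention.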
HIGH-END QUALITY FOR EXCEPTIONAL VALUE
A flexible studio camera for today’s live productions, the HXC-FZ90 combines high image quality with seamless integration and smoother live workflows, making it ideal for producing coverage at sporting events, house of worship services, and OTT content like esports. Safeguard future productions with a simple upgrade to 4K via a software option.
• 4K and HDR production with wide colour BT.2020 support
• Compatibility with Sony standard viewfinders and accessories
• Wide range of operational functions inherited from the industry-leading HDC series
• Network trunk expands configuration options for prompters or PTZ camera control