TM Broadcast International #122, October 2023


EDITORIAL

Well, once again, here we are, beginning the autumn season with an industry buzz… Can our readers hear it? It is the sound of the industry getting ready to kick off a new round of exciting times for the broadcast sector.

As the world of sports broadcasting continually evolves, we find ourselves standing at the cusp of a new era in live sports production. The post-IBC 2023 landscape unveils exciting innovations, and our readers will find our conclusions from the main European meeting for industry players within this issue of TM Broadcast International.

At the start of this magazine, TM Broadcast brings its readers a conversation with Óscar Lago (LALIGA) on how it feels to be shaping the future of football coverage. The sports broadcast director from Grup Mediapro discusses, in these pages, the main changes in coverage for the upcoming football season as well as his vision for redefining football coverage. LaLiga has been at the forefront of leveraging technology to engage fans globally, and Lago provides insights into how they plan to elevate the viewing experience for football enthusiasts.

The importance of lighting in broadcast and all kinds of productions cannot be overstated. Lighting sets the mood, highlights the key moments, and ensures that every aspect of the game is crystal clear; with this in mind, this issue will be a light in the dark for broadcasters and creators interested in the main state-of-the-art developments in lighting consoles.

In addition, TM Broadcast talks with the OCA (Open Control Architecture) Alliance, which has been at the forefront of driving interoperability in the broadcast industry. The alliance's influence is set to expand further and, with an eye on simplifying complex production workflows, it continues to play a pivotal role in unifying this world.

To close, we keep a close watch on the next big thing in the industry with a Lab focused on Teradek, an important operator in this technological ocean that is broadcasting.

How does it sound? Stay tuned to TM Broadcast International as we continue to bring you the latest developments shaping the future of broadcasting and filming productions.

Editor in chief: Javier de Martín (editor@tmbroadcast.com)
Key account manager: Patricia Pérez (ppt@tmbroadcast.com)
Editorial staff: press@tmbroadcast.com
Creative Direction: Mercedes González (mercedes.gonzalez@tmbroadcast.com)
Administration: Laura de Diego (administration@tmbroadcast.com)
Published in Spain. ISSN: 2659-5966
TM Broadcast International #122, October 2023
TM Broadcast International is a magazine published by Daró Media Group SL, Centro Empresarial Tartessos, Calle Pollensa 2, oficina 14, 28290 Las Rozas (Madrid), Spain. Phone: +34 91 640 46 43

SUMMARY

New developments in the production of LaLiga

For this edition, LaLiga is not only presenting a new graphic identity but, as readers will have already had the opportunity to see, Grup Mediapro has integrated new technologies, cameras and points of view to turn the viewer’s experience into a much more immersive one, with greater access to complementary information, and aesthetics increasingly closer to the look of video games, specifically, of EA Sports FC.

IBC 2023: Back to “real”

When all sectors, not just broadcast, are being immersed in virtual reality, generative AI, and other technologies that intermingle the real and the virtual, IBC 2023 has been the first “real” IBC after the pandemic.

OCA Alliance on how power is nothing without control

The OCA Alliance is a consortium of industry-leading companies dedicated to advancing open standards in the field of professional audiovisual (AV) and media networking. Established in 2011, the alliance seeks to foster interoperability and innovation within the AV industry by promoting the adoption of the AES70 standard.

News

DMX512 lighting consoles

TM Broadcast talks with A.J. Wedding, Orbital Studios' CEO, about how to achieve success in so little time as well as the advantages of virtual production, the video industry and the main challenges in audiovisual production nowadays.

The 4K UHD/HDR in Live Sports Streaming, by MediaKind

Test Zone: Teradek ART

Viz Pilot Edge 3 offers faster and more responsive interface with new control features

Vizrt has announced the release of Viz Pilot Edge 3.0, featuring a completely new user interface. As the company points out, the core of the redesign is improved navigation for users and enhanced functionality written using modern web technology (Vue, TypeScript), making Viz Pilot Edge more user-friendly, more responsive, and with a start-up time that is now up to 3x faster*.

*Internal test based on a database from a large network. Opening a template was routinely 3x faster than with the previous version of Viz Pilot Edge

A brand-new streamlined interface

The layout has been streamlined and is optimized for different screen sizes so users can better focus on creating graphics. Navigation is also smoother, and the search function now supports multiple-tag filtering to further narrow down search results. Additionally, the preview window can be undocked.

The interface update also includes the ability to drag and drop multiple elements into a story slug, radio buttons in templates, improved crop functionality and faster editing of tables. Live data input gets a boost with the addition of RSS feed support in the feed browser.

Mayam Order Management

The new Order Management integration aims to make requesting and managing images easy. Journalists can prep their templates with the relevant text and/or data and send an order directly to the Creative Department via Viz Pilot Edge.

User-friendly augmented reality and virtual reality workflows

Incorporating AR/VR into an automated rundown has been simplified with the ability to host the Viz Arc MOS plugin, the company points out. Users can control AR/VR graphics in one unified interface and play them out with a studio automation application such as Viz Mosart. Furthermore, the company says, there is no disruption to workflows: despite the extensive interface change, any template created in previous versions will continue to work seamlessly.

Viz Pilot Edge 3 introduces centralized order management, offering improved usability, intelligent media search and Viz Arc control to speed up graphics creation and playout.

Grass Valley launches Kayenne production switcher control surfaces

Now, Kayenne modular surfaces can be combined with K-Frame XP video processing engines thanks to new bundles

Grass Valley®, manufacturer of solutions for live production, has recently announced new production switcher bundles for sports, entertainment, news and other premium events, combining redesigned Kayenne® modular Video Production Center surfaces with the K-Frame™ XP video processing engines.

While the Kayenne K-Frame XP bundles are specifically designed for larger productions, Grass Valley explains that, in keeping with its “Any Surface, Any Engine” philosophy, panels such as Korona™ and Karrera® can be paired with any K-Frame, including the CXP, SXP, low-cost V-series, or software-based switcher offerings powered by AMPP.

Additionally, K-Frame engines can now be fitted with a JPEG XS board for advanced IP inputs and outputs, based on the user's requirements, complementing the native JPEG XS support found in Grass Valley LDX 100 series cameras and the latest C135 compact camera.

KAYENNE AT THE SUPER BOWL

QTV allies with Ross Video and ES Broadcast to improve sports broadcasting in Scotland

QTV, the provider of production services and outside broadcast facilities, has partnered with Ross Video and ES Broadcast to boost its sports broadcasting capabilities and improve the service offered to its clients, among them the Scottish Professional Football League (SPFL), the BBC, the EBU, World Archery, and The New York Times.

QTV produced over 500 events and outside broadcasts across all sectors in 2022, and it's the production partner of the SPFL, delivering coverage of 180 Cinch Premiership matches a season and managing the league's post-production, world feeds and streaming distribution. In October 2022, QTV facilitated the introduction of, and now hosts, the Video Assistant Referee (VAR) operations centre for the Scottish FA.

QTV adopted technology from Ross Video, facilitated by ES Broadcast, its principal technology partner. Key milestones in this partnership include:

- In 2017, QTV initiated services for the SPFL with new broadcasting equipment.

- Adoption of a remote production model by 2020, securing a multi-year contract with the SPFL.

- Launch of custom remote engineering vehicles (ENG) for efficient broadcasting.

- In 2022, QTV relocated to Clydesdale House, a facility purpose-built for remote production.

- By 2023, QTV further expanded its facilities to meet growing demands.

As the company explains, QTV’s collaboration with Ross Video and ES Broadcast has led to a range of improvements in its business metrics and operations. The company experienced a significant increase in its turnover, rising from £1.7 million in 2018 to £5.3 million in 2023. Alongside this financial growth, QTV has also expanded its workforce, growing from a team of nine to forty-five employees.


RTL+ upgrades software for streaming platform with Qvest

The German TV broadcaster RTL runs the largest streaming service in Germany, RTL+, with more than 3.4 million users. Facing the transition to modernized software, it chose Qvest Digital AG, part of the Qvest Group, to overhaul the existing software architecture.

As part of the modernization, the existing software architecture was converted from a backend-for-frontend (BFF) pattern to GraphQL with Federation. In the process, numerous separate APIs, each of which was connected to a different frontend of the streaming service (web app, mobile app, TV app, etc.), were replaced by a harmonized API. The newly created, central interface for all software teams enables easier further development of the platform as well as time savings of up to 30 percent in the workflow.

Key factors in the success of the GraphQL Federation implementation included troubleshooting and community building, in addition to tooling and schema management. 
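To make the pattern concrete, here is a minimal sketch of one federated subgraph written in TypeScript with Apollo's subgraph tooling. It is purely illustrative: the "catalog" service, the Title type and the port are hypothetical and not RTL+'s actual schema; the point is that each domain team publishes its own subgraph and a gateway composes them into the single harmonized API that all frontends query.

```typescript
// Hypothetical "catalog" subgraph: one of several domain services that,
// composed by a federation gateway, replace the per-frontend BFF APIs.
import { ApolloServer } from "@apollo/server";
import { startStandaloneServer } from "@apollo/server/standalone";
import { buildSubgraphSchema } from "@apollo/subgraph";
import gql from "graphql-tag";

const typeDefs = gql`
  # @key lets other subgraphs (e.g. a hypothetical "watchlist" service)
  # reference and extend Title without a bespoke API per frontend.
  type Title @key(fields: "id") {
    id: ID!
    name: String!
  }

  type Query {
    title(id: ID!): Title
  }
`;

const resolvers = {
  Query: {
    // Toy resolver; a real service would read from its own data source.
    title: (_: unknown, { id }: { id: string }) => ({ id, name: "Demo title" }),
  },
  Title: {
    // Entity resolver: allows the gateway to resolve Title by its key.
    __resolveReference: (ref: { id: string }) => ({ id: ref.id, name: "Demo title" }),
  },
};

const server = new ApolloServer({
  schema: buildSubgraphSchema({ typeDefs, resolvers }),
});

startStandaloneServer(server, { listen: { port: 4001 } }).then(({ url }) => {
  console.log(`catalog subgraph ready at ${url}`);
});
```

A gateway or federation router then composes this subgraph with the others into one schema, which is where the single interface for the web, mobile and TV apps described above comes from.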


HbbTV Association opens entries for HbbTV 2023 Awards

The global initiative dedicated to providing open standards for the delivery of interactive TV services through broadcast and broadband networks, HbbTV Association, has just opened entries for this year’s HbbTV Awards. As part of the 11th HbbTV Symposium and Awards, the contest will take place on November 28-29, 2023 in Naples, jointly hosted by the HbbTV Association and Comunicare Digitale, an Italian association for the development of digital television. It will be the sixth time the HbbTV Awards are held, showcasing best practices in the HbbTV community.

Providers of HbbTV applications and services are welcome to submit their entries through an online form on the association's website, where companies will also find details of the terms and conditions of the free competition.

The categories for the HbbTV Awards 2023 are:

- Best use of HbbTV for advertising-based solutions
- Best tool or product for HbbTV service development or delivery
- Best technology innovation in an HbbTV product or service
- Best marketing or promotion of an HbbTV-based service
- Judges' Award: “HbbTV newcomer of the year”

A company may enter as many of the categories as it likes. Each submission requires a separate form to be completed. The awards are free to enter. Entries will be judged on their execution, impact and innovation.

The closing date for submissions is September 30, 2023. A shortlist of finalists will be put forward to a panel of industry experts who will select the winners. The finalists will be announced by October 31, 2023 and prizes will be awarded at a ceremony on November 28 as part of the HbbTV Symposium and Awards 2023. All finalists are invited to attend the awards celebration.

The winners of the fifth HbbTV Awards, held in 2022 in Prague, Czech Republic, include TV Nova, Fincons Group, ZDF, Kineton, Cellnex/LOVEStv and Klaus Merkel (rbb/ARD).

The annual key summit of the connected TV industry will take place at the Naples Maritime Station – Angevin Pier, a congress and exhibition centre located in the port area of Naples at the Mediterranean Sea. 


Gravity Media Australia partners with Supercars Media to produce Repco Bathurst 1000

According to both companies, the upcoming production will be the biggest ever for this Supercars competition

Supercars Media and Gravity Media recently confirmed details of the all-encompassing, all-screens coverage of the 2023 Repco Bathurst 1000. Both companies have worked together to capture every moment on Fox Sports and the Seven Network from October 5 to 8. The deployment consists of more than 175 television cameras on-track, in-car, in the pits, around the track, embedded in concrete kerbs and walls, in the air and across sections of the track on a wire, as well as four state-of-the-art high definition outside broadcast trucks, speciality in-car camera technology designed and developed by Gravity Media Australia, 52 kilometres of broadcast cable and a total production crew of more than 250.

In addition, coverage of this year's Repco Bathurst 1000 will be broadcast around the world via Supercars' international broadcast partners. This is the eleventh year of Gravity Media Australia's broadcast facilities and technology collaboration with Supercars Media, which produces the global and Australian television coverage of The Great Race, which reaches its 60th anniversary this year.

Coverage of the Repco Bathurst 1000 has expanded dramatically over the years. The first race, in 1963, was covered by three television cameras. The first live in-car camera coverage was delivered in 1979.

Key production details for television coverage of the Repco Bathurst 1000:

- Ten production trucks, including four high definition outside broadcast trucks driving the overall television production

- More than 175 cameras across the track, in the pits, in and across cars, on driver helmets, mounted in race walls and track kerbs, including portable and speciality extreme super slo-mo cameras, live helicopter coverage and a “DACTYL CAM” camera on a wire covering 700 metres across Conrod Straight and The Chase

- 52+ kilometres of television production cable and fibre, enough to lap the circuit more than eight times

- A broadcast and production team of 250+ across Gravity Media and Supercars Media

In addition to this partnership with Supercars Media, Gravity Media also provides the technology and systems for Supercars' team radios, along with isolated camera coverage for the bespoke Gravity Review System developed by Gravity Media Australia, which is used by the motorsports judiciary and race control in the management of every on-track moment in the Repco Bathurst 1000 and across every race in the Repco Supercars Championship.

Broadcast Solutions sets up SNG fleet for rt1.tv

German production company rt1.tv has received two of four new reporter vans as part of an expansion of its fleet, delivered by Broadcast Solutions, the European media systems integration group.

The two vans delivered so far replace life-expired vehicles in the fleet. Based on VW Transporter vans, they are designed to be self-sufficient and as environmentally friendly as possible, the manufacturer claims. While each van includes a generator, the primary source of power is a lithium-ion battery pack capable of supporting all the technical equipment for as much as four hours. As Broadcast Solutions points out, even with the large battery and the technical equipment, the gross vehicle weight remains below 3.5 tonnes, so the van can be driven without the need for a truck licence.

According to Broadcast Solutions, the production systems are designed to be as simple as possible and are generally controlled by a single operator, although the reverse bench seat provides for a second operator and there is an additional pull-out laptop workspace in the front passenger area. The large monitor provides access to the NewTek TriCaster for routing and production control; when in edit mode, the screen provides access to the HP workstation.

Camera connectivity uses the Ereca Cam Racer system, which allows any professional camera to be docked and connected over standard SMPTE fibre to a 1U base station in the truck and controlled over a web interface. Each vehicle carries Litepanels LED lighting and a number of Sennheiser microphones. For major events, the company explains, two vehicles can simply be linked together using NDI and Dante.

As the manufacturer points out, the intention is to use IP connectivity from the vehicle back to the broadcaster's base, using either LTE or broadband via the LiveU transmission system. This also carries the Laon intercom for talkback between the camera position, the van and the central studio. Another feature the company stressed is that, in case there is no strong LTE signal, there is a roof-mounted 1.2 m dish, transferred from the vehicles being replaced, providing Ku-band links.

The first two vehicles are already in routine use. An extended delivery time from vehicle manufacturer VW means that the other two new vehicles will not be on air until 2024.

Ateme powers the Viacom18 UHD experience for Indian Premier League 2023

Ateme, a company that specializes in video compression, delivery and streaming solutions, has made public that its TITAN encoders and decoders were behind the high-quality video and immersive audio experiences delivered by Viacom18 Media Pvt. Ltd. during the Indian Premier League (IPL) 2023. Throughout the tournament, fans were able to enjoy an immersive UHD TV viewing experience for the first time in India.

The audio-visual display of one of the world's most-watched and highest-value sports events was enabled by Ateme's TITAN encoders and decoders. Supporting both linear and OTT services, the encoders and decoders empowered Viacom18 to reach audiences on any screen, the company claims. In addition, the encoders also lay the foundation for Dynamic Ad Insertion (DAI), equipping Viacom18 for targeted advertising, says Ateme.


Inverto discloses the acquisition of Quadrille Ingénierie SAS

Inverto, supplier of broadcast reception equipment and video streaming solutions, has acquired 100% of the shares of Quadrille, a French independent software and service provider, and the two companies are about to merge their activities.

Commenting on the acquisition, Christophe Perini, CEO at Inverto, said: “The Quadrille technology complements and advances our solutions portfolio in the content delivery segment. Our shared vision is to be our customers' preferred connectivity partner through cutting-edge technological leadership. We will combine our complementary solutions, strong R&D foundation and above all customer-focused values to connect our clients to the future”.

Quadrille will continue operating from its Paris-based headquarters. 

Lionsgate updates its media supply chain with Ateliere

Ateliere Creative Technologies has just shared key insights on the media supply chain transformation of legendary studio Lionsgate; the goal was to help manage and monetize its iconic 17,000-title library.

For Technical Operations & Delivery at Lionsgate, the positive impact of migrating to a digital media supply chain includes:

- Automating and streamlining the supply chain to maximize ROI on existing titles.

- By centralizing content and then migrating it to the cloud, Lionsgate partners are now able to access “clean” titles.

- Operational efficiency: the entire system was up and running in a matter of weeks. Previously, this required a proactive and manual process that resulted in duplications and lost time.


Riedel to partner with Studio Automated to develop AI video production solution

Riedel Communications and Studio Automated recently announced that they have established a partnership and kicked it off with a letter of intent (LOI) in which both companies signed declarations on the principles and rules of the negotiation process. The ultimate goal of this collaboration is the development of an artificial intelligence (AI)-assisted video production solution.

As both companies claim, by joining forces they aim to create an optimized video production solution that will allow sports productions and leagues to remotely produce their live broadcasts with minimal personnel and operating costs.

FROM LEFT TO RIGHT, PAUL VAN DEN HAAK, FOUNDER, STUDIO AUTOMATED; THOMAS RIEDEL, GROUP CEO, RIEDEL COMMUNICATIONS; PAUL VALK, FOUNDER AND DIRECTOR, STUDIO AUTOMATED; LUC DONEUX, DIRECTOR, LIVE PRODUCTION, RIEDEL COMMUNICATIONS.

New developments in the production of LaLiga

For this edition, LaLiga is not only presenting a new graphic identity but, as readers will have already had the opportunity to see, Grup Mediapro has integrated new technologies, cameras and points of view to turn the viewer’s experience into a much more immersive one, with greater access to complementary information, and aesthetics increasingly closer to the look of video games, specifically, of EA SPORTS FC.

Innovation is the drive that runs through the veins of Grup Mediapro's technical team for LaLiga, so the novelties are not limited to the new aesthetics, nor to the use of cinematographic cameras, a technique introduced by the Catalan media group which, so far, no other broadcaster has been able to beat. To tell us about all the novelties that LaLiga fans will have access to this season, the difficulties of their implementation and the new technological developments we can expect in the near future, we spoke with Óscar Lago, one of the most reputable Spanish filmmakers and the first to take full responsibility for the broadcasts of UEFA matches and World Cup matches, after UEFA and FIFA decided to select their own teams of professionals to produce and deliver the signal of their major competitions.

With more than a thousand matches behind him, Óscar Lago explains what the starting point has been for changing the way fans approach the football experience. From microphones at hydration breaks and the face-to-face between coaches before the start of the match, to the integration of game statistics and greater attention paid to the benches of both teams, LaLiga has come up with a bombshell for spectators this season. Let's find out what it is.


This new edition of LaLiga will provide better coverage by using more cameras, new graphics, new points of view and unprecedented customization. What kind of cameras? What kind of new technical equipment or what differences will we be able to see in the production of this new edition of LaLiga?

We have made a firm commitment to increasing the use of cinematographic cameras in the broadcasting of football matches. This is one of the most important distinctive traits that Grup Mediapro and LaLiga have had over these last two and a half years, during which we have been totally innovative and pioneering. And we're really pleased with the outcomes of using film cameras in a different way than everyone else.

Different from the rest?

Yes, because we use the cinematographic cameras in a dual way: they are cameras that at one point in the match have the same texture, the same depth of field as the other cameras in the broadcast. They are cameras comparable to the rest, mounted on a Steadicam. But when we spot an epic moment, a moment where the player has to be in the spotlight because he has scored a goal or because there has been some highly relevant situation, the image is fluidly converted into a cinematographic image through a filter that we apply; thus, it gives that feeling of a hero standing apart from the background, with the background out of focus, and so on... In this, we are pioneers.

We are really pleased, because we are the only major league in the world that uses cameras in this way. We know this, among other things, because of an international award we were given, such as Sportel two years ago, and because of the comments from the broadcast community, which is something that is highly valued and difficult to achieve. And this is demonstrated by the fact that other competitions have tried but have not yet hit the right button.

On the other hand, although during these years we have worked with different companies and camera types, we are now very satisfied to be working with the California company Red, with which we maintain a very close relationship. They have provided us with cameras that not only have this cinematographic effect, but also allow us to get super slow-motion replays of these cinematographic images, something that until now hadn't been possible.

Grup Mediapro and Red work together to develop this camera, taking into account our needs. Originally, these cameras were designed for filming movies or series, not for live TV broadcasting. But Red has really made an effort for us, and we are very satisfied with this relationship and its fruits, such as this improvement that we have included this year, and we hope this will continue in the coming years.

That is in terms of the cameras. What about new points of view? Because after the use of drones or cameras on the pitch, one wonders what POVs remain to be covered or used in the production of football.

Well, we are working on bringing in a type of camera that we used for the first time in the FC Barcelona v. Real Madrid Clásico in the second half of last season; they are point-of-view cameras inside the bench areas of both teams. These are small, robotic cameras featuring zoom and panoramic viewing. They are managed remotely and allow us to see very closely how all the players from both teams react. We are working to be able to have them more frequently in important matches during the season.

What is the unprecedented customization announced by LaLiga about?

It refers to all these new inside images, agreed between LaLiga and the clubs, which have meant a true revolution. Yes, it was done in other sports that are not as mainstream as football, but in major football competitions, having access to certain places such as the inside of the changing rooms before the matches or the technical talks when there is a hydration break had always been a bit of a taboo. As we have seen in these first days of the season, this is new additional content that generates a lot of impact, not only during broadcasts but also later on social media. We are in the same context, but we have access to places and sounds that until now we were not allowed to access.

As for the graphics, what’s new in your approach this year?

At Grup Mediapro we have developed the new LaLiga brand. As you know, LaLiga has made a complete rebranding, not a simple change: it started from scratch. The colors were changed, the logo was changed, the typography was changed. And at Grup Mediapro we have transferred this new graphic universe to the TV graphics; thus, we have more dynamic and lively graphics: the usual 2D graphics, but also the online scoreboard, which will offer a broader and more detailed vision, and the lower thirds, in which we have incorporated new elements such as images of the players made through stop-motion, photographs chained together that give a sense of movement, with the players doing different poses, different gestures.

This is featured not only when the line-ups are announced: these graphics will be interspersed during the game, for instance, when a goal is scored or when a substitution takes place... In key situations we can see these stop-motion images of the players, a tribute to EA SPORTS FC. The new look, added to the graphics, aims to generate a change in broadcasting that will provide a renewed, fresher and much more dynamic image.

In addition, we will continue to integrate the use of live augmented reality graphics into the production, with live tracking data and performance metrics powered by Artificial Intelligence and Microsoft Azure Machine Learning. That is, we will include live information presented with augmented reality.

Since last year, LaLiga has been editing or packaging, as it were, the productions that are then delivered to the station, but what is the production process like? Do you always work with the same human team?


Yes, at Grup Mediapro we form a compact on-site team: the technical team, the production team and the operations team. We have been working as a team for many years and, from our point of view, this is one of the keys to success. We do not work with occasional collaborators, as the team is made up of professionals who work every week on this specific activity and produce a large number of matches. We are all on the same page, in all departments, from pre-production, preparing the programs and broadcasts, to the technical part that does an incredible production and operation job.

Doing 1, 2, 3... or up to 4 games every week gives a lot of working experience, provides a smooth workflow, and keeps confidence very high to improve every week.

What technology has become a must in a sports production?


At Grup Mediapro we have been keen on promoting the use of HDR (High Dynamic Range) since 2017. Although at first there were doubts about whether it would stick around, it is now clear that it is the standard of choice, since it is the one used by all major competitions: the World Cup, the Euro, any major competition today is produced in HDR.

Unlike the move from HD to 4K, which was also an important step, but one more complicated for viewers to notice since it only concerns the number of pixels, the change from SDR to HDR is noticed all over the world because it is not limited to increasing the number of pixels: it improves pixel quality, color quality, lighting contrast, highlights and blacks.

And, therefore, we are committed to increasing the number of matches produced in HDR.

Regarding viewing modes, the inclusion of broadcasts in portrait format is striking, such as the broadcast that Grup Mediapro made of the Belgian Super League, which has recently been produced to be broadcast directly on TikTok.

What seems obvious is that portrait formats are very important nowadays, especially for young people. They are very much used to seeing everything in a portrait format, and this is already the third production we have made in this format for TikTok: a couple of years ago we produced a LaLiga match and now we have made two productions in Belgium. It has had a very good reception because the mobile is the device of choice for young people. Young people clearly have their phone in their hand all day and consume a lot of video content in a vertical format; it is another way of watching football.

It is one more option that does not exclude the rest. But I’m sure it’s still going to be around.

Although it involves a lot of change.

Graphics change, as they have to be special; camera framing changes, shot sizes change... This type of production greatly favors close shots of the players, but it does not make it as easy to follow the game properly, because football is more horizontal than vertical. But hey, in the end it's all about adaptation: adapt or die.


Speaking of changes in consumers, as compared to linear broadcasting with commentators, the younger audience opts more for the Twitch model in which a streamer talks about what is happening on the screen with a more direct language. Is Grup Mediapro working on any way to merge both worlds or attract new targets?

We are working on various innovation projects; we have been producing another type of multi-cam LaLiga match broadcast for more than two years, a linear channel broadcast by Movistar and DAZN through which the same match that you can see in the standard broadcast can be seen from different angles, in linear broadcast. A lot of information is included in graphics, and we'll be adding Twitter and Twitch comments soon.

We are developing projects aimed at opening new windows in the way of consuming football for new target audiences, where viewers will be much more the owners of broadcasts and will be able to choose to a greater extent what they want to see, and even produce different types of signals for different types of audiences. Although at the moment they are only projects, very ambitious ones, true, and still in the embryonic phase, at Grup Mediapro we are convinced that this is the way forward to attract new segments of the public.

What do you think of the integration of AI in the broadcast world?

I think artificial intelligence will definitely help in the world of sports broadcasts. I don't think it will replace anyone, but I do think it will open up new possibilities. In fact, at Grup Mediapro we are already working with artificial intelligence tools, as in some graphics or in the expected goals (xG) metric, where artificial intelligence and machine learning tell us, when a goal is scored, what the chances were (as a percentage) that whoever shot would score. This indicator is already working and has not taken anyone's job; all it has done is enhance the experience for viewers. We also produce the ambient audio with the help of artificial intelligence, knowing where the ball is and improving the mix of the atmosphere on the field. And there are more things to come.

I think we are still at an early stage, but I have high expectations regarding AI. It will help, it will turn complex tasks into simple ones; above all, it will simplify processes by taking care of the most bureaucratic or repetitive tasks, and it will open up other worlds that we cannot even fathom right now.

If you could ask for a solution to a technical problem that until now the industry has not been able to solve, what would it be?

Well, all filmmakers and those of us who work in the mobile unit environment would agree on this: matches with a mixed sun and shade situation. With HDR the cameras have improved a lot, but we still have difficulties adjusting the camera’s diaphragm in situations of very intense sun, for example in matches at 15:00 in the afternoon.

Sometimes we have received criticism for having matches scheduled at noon in the Spanish LaLiga, but it is impossible to run a month-long tournament with all matches played in the evening; the best competitions in the world (the Club World Cup, the Euro Cup, the national teams' World Cup...) have matches in daylight hours for this very reason.

So if we could find the key to adjusting the diaphragm so that, in sun-and-shade matches, both areas can be viewed at the same time with the same quality, we would all sign up for it. Yes, there has been a lot of improvement with HDR, but there is still room for improvement.

As for the training of new professionals, is technology ahead of the professional development of technicians?


It's complex; in this profession it is the day-to-day that makes you better. You have to come to work with some prior knowledge, but where you really learn is on the job, and if we look at the history of most producers, filmmakers or technicians, they all started from the bottom. And I think it will continue like this.

Technology is not always ahead of people. For example, I didn't know anything about HDR five years ago and, when they explained it to me the first day, my head was about to burst. In the broadcast world you have to stay abreast, read a lot, ask people who know more than you and never stop learning. And this is what all of us in this world do.

Other times this is not the case, and this is where I feel most comfortable and most satisfied: when you take the initiative, search, and stay ahead of technology. For example, with the film cameras. Those cameras were not made for this; no one had thought they would be good for broadcasting. But one day something lights up in your head, and another day you see something on YouTube and you say "wow, we could try this". So then it's the other way around: it is people who are ahead of technology, and it is technology that has to adapt to what we want to achieve.

So, I think that sometimes technology is ahead and you have to adapt and other times it’s us who propose things and in the end that’s when you feel most satisfied. Many times you are wrong, but you must always look beyond. When, for example, the pandemic happened and, in April 2020, we decided that LaLiga was going to resume without an audience at stadiums, Grup Mediapro proposed creating a virtual audience so that the experience would be as close to normal as possible.

Then we started looking, and there was nothing in the world to do what we wanted to do. We started talking to brands and adapted products that existed for something else to this end. These times are really when you feel most satisfied; sometimes, yes, technology is ahead and you have to adapt. And some other times the technology has to adapt to what you request. 


When all sectors, not just broadcast, are being immersed in virtual reality, generative AI, and other technologies that intermingle the real and the virtual, IBC 2023 has been the first “real” IBC after the pandemic. Not that 2022 didn’t exist, of course, but this edition has been where the business, the launches and the atmosphere at the fair have been more real than ever.


Last year, 2022, marked the return of IBC after the dark ages of 2020 and 2021, and it is true that the fair returned just the way it used to be. Perhaps a little less space was taken up and a couple of small halls remained unopened, but the number of visitors and exhibitors that attended reminded us of the record year of 2019. And, though comparisons are odious, it was not until the current year, 2023, that we realized that now it is for real: IBC is back.

If you have to, then go

All of us who belong to this industry (perhaps in others it is a similar issue), and above all those of us in it out of vocation, cannot fail to appreciate the fairs, however exhausting they are, no matter whether we like them, and no matter if they amount to a weekend of hard work just to be back on Tuesday to our relentless work routine. They are a meeting point that entails hugging old colleagues in the sector, seeing novelties and gadgets first-hand, and recalling anecdotes and stories that would otherwise fall into oblivion. And that was IBC 2022: the reunion.

And it is not a trifle, after two years during which, for obvious reasons, we had only been able to see each other on the other side of the glass. It was a joy, full of facemasks, to be able to touch each other and return to reality. But that was all at IBC 2022; let me explain: fairs are attended to do business, to learn and to meet people again, in that order of precedence. The first two things were mostly missing last year.

Of course there was some business done, it would be foolish to ignore it; and certainly interesting training sessions, panels of experts and conferences were held, awards were given and recognition went to many professionals, but not as much as on other occasions. The atmosphere was festive, fun and relaxed, but more than one of us was surprised to return home and see that the business generated was not all that much. That's behind us now.

Back to the real world

And IBC 2023 has truly been a return to reality. And not only because once again the RAI has been full, with all its halls open (such labyrinthine premises, let's not forget), or because some of those who missed it last year have returned to the floor, which is always good, but also because the vast majority of us have gone there to learn, to meet again but, above all, to do business.

A good part of last year's meetings were greetings, coffees and beers from five in the afternoon onwards, but this year there was talk of projects, solutions and purchase orders, and even some juicy contracts were announced during the fair, something anecdotal but a hint of the importance of this edition. And it is vital that this happens, because the need for face-to-face fairs has long been called into question: some ask why wait to present products and services at these events when nowadays we interact more and more in virtual environments, through software and the cloud, than in the real one, namely hardware and equipment. Well, because business is done by people, not by things or services; by us, people.

Okay, but what has been discussed at IBC?

Although necessary from time to time, let's leave philosophy and rhetoric aside and focus on what has been discussed on the floor. For months, as we were able to feel at NAB before the summer, generative AI (and the "generative" part is the important one) has been implemented and spread wherever it is tried. In our business, from the generation of metadata to complete our catalog, our EPG or our OTT, through virtual stages and sets, automatic subtitling (and not just voice recognition) and augmented reality, to the always controversial replacement of real images or automatic dubbing, these are concepts that are becoming more real every day. The good thing is that they are maturing.

And there has not been any great novelty or "wow effect", as some like to call it, at this IBC, and that is good; let me explain again. When something is as disruptive as this type of technology, which many forecast will have a greater impact than the Internet in the long term, it needs to mature. Just as the cod pil-pil recipe is not just about cod and oil: it needs to be cooked over low heat, and it is important that all the ingredients come together properly, little by little, in order to achieve that mellow sauce that is so tasty and that we all love. Because it's not cod with oil, it's cod pil-pil.

Well, it is pretty much the same with AI: it is not just computer-generated images, it is not just scripts written by machines in the same way that it is not just artificial graphics or special effects. These are tools, many new tools that we must learn, explore and find a use for, tools that need to mature and be linked together so that everything results in a new way of creating and enjoying what we do: entertain.

They have come to stay

And beyond AI there are concepts that I think are important to highlight and that are familiar to us. In addition to the blending of broadcast and IT with IP technology, which took place not so long ago, which no one remembers now and which is still spreading, let's keep our feet on the ground: there are tectonic movements that must be perceived. Hardware exists because it's necessary, not because it's convenient. This leads us to an environment where more and more things are being done through software and fewer by hardware, and that's a good thing. I am not advocating the death of hardware or saying that software is the new god of the industry, hackneyed phrases that are very popular on TV and radio, but I speak of the evidence: for multiple reasons, software can do, and does, certain functions better than hardware.

Software is more flexible: we can adapt it to our needs better than something designed and manufactured for only one function. It is cheaper (here we could argue about the short-term versus long-term dilemma, licensing models and so on, but in general, and for our business model, it is cheaper), and it is less risky as an investment. The mistake of buying software that does not suit us is a minor one by comparison.

Another concept that I think has come to stay over time is "shared". From shared investments between suppliers and customers to production environments in which the same resources are used for more than one task, this is the main notion from which cloud services emerged. And this "shared" concept is the logic that leads many investments to be made today in a different way than a while ago, and suppliers to offer their products in a different way.

Why have it if I can use it?

By combining the concepts of software and shared, something that has been natural in the IT world for decades, I would dare to say, the concept of service comes naturally. More and more companies are labelling themselves as service companies rather than product companies, and this is an important paradigm shift.

"Everything as a service" is the mantra. Let's not get too radical, though, and take this mantra as an origin rather than as an end. And if we do not buy bikes, scooters and soon even cars because we do not use them even 10% of the time, the same can happen (and it is happening) with tools that are necessary during our productions but which, in essence, remain unused 90% of the time.

SaaS, Software as a Service: that is the actual meaning of the concept, and it makes a lot of sense. Deep down, what is the purpose? To produce more with less, and better. That is why avoiding risks and investments in systems and equipment that are not used intensively and constantly makes a lot of sense in this business. Hence the mantra has a real meaning.

Outside of this are the systems for permanent use or equipment that we need exclusively due to the nature of our business, and we learned this with the explosion of the cloud. At that point we realized that moving everything to the cloud didn’t make much sense, especially for permanent systems. If you are going to use it all the time, make it your own, so let’s not use as a service something that we are going to need all the time. Let’s not make the same mistake.

Ecosystems and extended services

Many niche suppliers who until not long ago were dedicated to making intercom systems, cameras, repeater systems, graphics or teleprompters now do everything, and that is good for everyone. Competition is good, in the right measure, and exemplifies the technological democratization of tools, which means, globally, that we can do more with less.

The spreading of the notion of everyone doing everything leads to the creation of ecosystems that make available tools and flexibility that until now had been so hard to achieve, since it forced us to bring together systems that were not made to be united. And this is based on technological concepts such as APIs, but the important thing is that we can find general solutions for almost everything we want to do from many suppliers, with a wide array of options and costs, which is always important.

These integrated production ecosystems, in which we have multiple tools available, provide us with increased flexibility and innovation capacity that we must be able to manage. As I said before, everything must mature, and by going all-in on a cloud ecosystem with our systems our problems will not be solved, nor will we be aligning our costs with our productions or business immediately, but they are a good tool.

So, what then?

Well, first of all, AI is gaining strength and maturing. And we understand AI as generative AI at all times: algorithms that can "create" information or content from previous references and with a certain complexity.

On the other hand, software is going to spread outside its obvious applications in a natural and positive way in order to offer us flexibility and take on fewer risks. It is no longer a risk to "virtualize" systems; in fact, the concept even sounds outdated, so let's embrace it and think about how to make the most of this tool.

Another important concept is ecosystems and what they imply. The fact that more and more suppliers are becoming transversal means that there are very well-integrated ecosystems and that they provide us with systems offering advanced capabilities and much more flexibility.

And all this with no need to buy anything, just use it. Because the concept of SaaS, shared services and systems makes a lot of sense in our business, where we need a lot of capacity at specific times and not so much at others.

Conclusions

Generative AI, software, ecosystems and services are four concepts that are not new at all but are becoming solidly established and are giving us a lot in many aspects.

Sometimes it is good that no novelties come up, as long as what exists continues to evolve and advance as it is doing on so many fronts and in such a solid way.

Let's hope this very real IBC stays for a long time and that all the news, the "wow effect" and the tools that arrive are solid, become established, and let us people do what we do best: create and entertain our audiences.

OCA Alliance on how they found the balance between standardization and flexibility for AES70

The OCA Alliance, short for the Open Control Architecture Alliance, is a consortium of industry-leading companies dedicated to advancing open standards in the field of professional audiovisual (AV) and media networking. Established in 2011, the alliance seeks to foster interoperability and innovation within the AV industry by promoting the adoption of the AES70 standard.

This standard provides a clear way to establish seamless communication between various AV devices, providing a common language for AV professionals and simplifying system integration and management. Over the years, AES70 has steadily gained traction as the industry's go-to standard; its robustness and versatility have enabled seamless integration and management of diverse AV devices. With this interview, we wanted to know what the main obstacles were that the Alliance met along its journey and what to expect in the coming years.

Join us as we discuss how the OCA Alliance and the AES70 standard have navigated technological shifts, embraced innovation and fostered collaboration, with Platform Strategist Ethan Wetzell. Together with him, we will understand how the OCA Alliance and this open standard, AES70, have paved the way for a more interconnected, flexible and user-friendly AV landscape, as well as the challenges they face nowadays.


When did the OCA Alliance start? What was the initial proposition and what did you want to achieve by setting up this non-profit corporation?

The OCA Alliance was formed in 2012 by a collection of 12 pro audio manufacturers. This was around the time that a lot of the IP revolution in media transport was happening in the industry, and there was a group of us who were involved with these activities but saw that the topic of control was not being addressed directly. Our question at the time (something that has become a bit of a quick pitch for the technology now) was: "You can send your content over the network, but how are you going to do anything with it when it gets there?"

And so we gathered this group of like-minded manufacturers from different areas of the industry around a table at a trade show and decided to join together to create a group that would develop a technology and open public standard that would answer that question.

How does AES70 work, and what are its uses and benefits for the industry?

AES70 is an open public standard that provides a complete professional solution to device configuration and control. It is based on three fundamental elements: AES70-1, the framework; AES70-2, the class structure; and AES70-3, the binary protocol. New elements of the standard, including a JSON format and media networking connection management adaptations, are coming, as is the AES70-2023 revision that is soon to be made available.

These elements cover the fundamental areas of system control, such as describing and enumerating device parameters, establishing connections to them for sending and receiving control and update messages, how to interface with devices and controllers, and numerous other functions. Moreover, it does this in a manner that meets the needs of professional systems and is capable of scaling upwards to very large networks of thousands of devices as well as scaling down to very small and compact solutions and systems.

Equally important, though, is what AES70 does not do. It does not dictate how a device parameter is implemented; for example, it does not care about a specific algorithm implementation, it only describes a way to interface with it. Likewise, it does not dictate what a user interface looks like, it simply provides the tools to create one efficiently.

When we talk about benefits for the industry, we like to focus on the value of open public standards and what this means for developers and users. Developers gain the benefit of quicker time to market and access to a fully engineered control solution that is freely available yet maintained by professional standards bodies and engineers. For users, this means more reliability, interoperability opportunities, and more flexible integration options.

What were the key challenges and industry needs that prompted the development of AES70 and the Open Control Architecture (OCA)? How has AES70 addressed these challenges over the years since its inception?

Control is an extremely complex topic, particularly when considered in the context of a device's design. While an audio networking solution is something that can be attached to the output of a device and leave the inner workings and design largely untouched, control is different. It gets to the heart of the design, and everything about the device's design, use and operational philosophy has an impact on the technology used to control it.

To that end, the design of AES70 has had to balance rigidity and standardization against flexibility and freedom. One example of this is how to describe a parameter: AES70 provides a number of strongly-typed classes (OCA.Gain, for example) but also offers the option for a developer to use a weakly-typed class or even develop their own subclasses that can be discovered and controlled by external controllers.
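To make that strongly-typed versus weakly-typed distinction more concrete, here is a conceptual sketch in TypeScript. The interface, the class names and the property handling are loosely modelled on the idea described above (a gain worker with a typed dB property versus a generic, vendor-defined parameter object discovered at runtime); they are illustrative assumptions, not the normative AES70 class tree or the AES70-3 wire format.

```typescript
// Conceptual sketch only: strongly-typed vs. weakly-typed control objects
// in the spirit of AES70, not the standard's normative classes or protocol.

// Every controllable object carries a class identifier and an object number,
// so an external controller can discover it and address its properties.
interface ControlObject {
  classId: string;       // e.g. a position in the device's class tree
  objectNumber: number;  // unique within the device
  getProperty(name: string): unknown;
  setProperty(name: string, value: unknown): void;
}

// Strongly typed: the controller knows this is a gain worker and that
// "gain" is a number in dB.
class GainWorker implements ControlObject {
  classId = "OCA.Gain"; // name borrowed from the interview's example
  constructor(public objectNumber: number, private gainDb = 0) {}
  getProperty(name: string): unknown {
    return name === "gain" ? this.gainDb : undefined;
  }
  setProperty(name: string, value: unknown): void {
    if (name === "gain" && typeof value === "number") this.gainDb = value;
  }
}

// Weakly typed: a hypothetical vendor-specific subclass exposing generic
// key/value parameters that a generic controller can still discover and drive.
class GenericParameter implements ControlObject {
  classId = "Vendor.CustomParameter";
  private values = new Map<string, unknown>();
  constructor(public objectNumber: number) {}
  getProperty(name: string): unknown {
    return this.values.get(name);
  }
  setProperty(name: string, value: unknown): void {
    this.values.set(name, value);
  }
}

// A minimal "controller": enumerate a device's objects and drive them all
// through the same interface, however strongly or weakly typed they are.
const deviceObjects: ControlObject[] = [new GainWorker(1001), new GenericParameter(1002)];
for (const obj of deviceObjects) {
  obj.setProperty("gain", -6);
  console.log(obj.classId, obj.objectNumber, obj.getProperty("gain"));
}
```

In the actual standard, this kind of addressing and property access is defined by the class structure (AES70-2) and carried by the binary protocol (AES70-3) mentioned above; the sketch only mirrors the design trade-off the interview describes.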

Another challenge was scalability. The needs of a massive, multi-site system made up of thousands of devices are radically different from those of a simple embedded wall controller.

However, we needed to find a way to make the technology useful for both, and we have achieved and demonstrated this.

Can you provide examples of real-world applications and use cases where AES70 has made a significant impact in streamlining control and configuration of professional AV equipment? What benefits have users experienced in these scenarios?

In the field, we have heard from some AES70 adopters that they are able to build complex control systems that they simply would not have been able to build with other technologies. We also have examples of adopters who have been able to implement AES70 in their devices alongside their legacy control protocols to maintain compatibility while migrating to a newer control technology. There are also places where AES70 can complement other technologies, such as providing connection management for AES67. This allows the creation of more complete media networking solutions for vendors, and better-functioning and more easily integrated products for customers.


Interoperability and compatibility are crucial in the professional AV industry. How does AES70 ensure interoperability among devices from different manufacturers, and what efforts has the OCA Alliance undertaken to promote AES70 adoption within the industry?

There are three important ways this has been addressed. First, AES70 leverages other open public standards, such as mDNS. Second, AES70 is agnostic as to which other media networking technologies it is used with: Dante, MILAN, AES67, or no media network at all, AES70 will happily operate alongside any of them. Third, the OCA Alliance makes available several documents and tools to ensure proper implementation of the standard. An example of this is the AES70 Compliancy Test Tool, available for free at ocaalliance.github.io, which implementers can use to validate that their implementation is compliant with the standard.
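As an aside on the mDNS point, the sketch below shows how a controller might browse the local network for AES70 devices using the python-zeroconf library. The "_oca._tcp.local." service type is an assumption used for illustration; the normative service names are defined in AES70-3.

```python
# A minimal discovery sketch using python-zeroconf (pip install zeroconf).
# AES70 devices advertise themselves via mDNS/DNS-SD; the "_oca._tcp.local."
# service type below is an assumption for illustration only.
from zeroconf import Zeroconf, ServiceBrowser, ServiceListener

class OcaListener(ServiceListener):
    def add_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        info = zc.get_service_info(type_, name)
        if info:
            print(f"Found device: {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc, type_, name):
        pass

    def update_service(self, zc, type_, name):
        pass

zc = Zeroconf()
browser = ServiceBrowser(zc, "_oca._tcp.local.", OcaListener())
input("Browsing for AES70 devices, press Enter to stop...\n")
zc.close()
```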

AES70’s flexibility in supporting various network protocols is an essential feature. Could you elaborate on how this flexibility allows designers and implementers to tailor their product solutions to specific requirements, and how it contributes to the success of AES70 in different AV environments?

There are strengths to all the media networking technologies available today, and our view is that AES70 exists to make devices and systems easier to use, not to place restrictions or barriers on how systems are designed and deployed. This is particularly critical in situations where different technologies may come together in one place. As an example, you could have something like a stadium where audio is moved over MILAN for the main PA, Dante for the systems in the concourses and suites, and ST 2110 for the broadcast facilities. AES70 can happily exist with all of these and cross the system boundaries as needed to support the users in what they need to get done. It’s this type of flexibility and scalability that is aimed at allowing users to spend less time worrying about the technology and more time focusing on the production.

As AES70 continues to gain traction in the market, what are the OCA Alliance’s future plans and objectives for further developing the standard and expanding the ecosystem of AES70-compliant products and solutions?

For the standard itself, the goal is to continue to enhance the core standard and extend the functionality AES70 brings to the table. In the upcoming AES70-2023 revision, we have made numerous enhancements to the core standards that make it easier to use and develop with, and made additions to provide new or improved functionality. At the same time, we are introducing several new standards that we call adaptations, which will bring new, but optional, functionality, such as connection management solutions, to the standard.

The OCA Alliance will continue to support this technical work through promotion, education, demonstration, and outreach to the industry, encouraging adoption and growing the OCA ecosystem.

In your view, and aside from standards such as AES70, what developments does the AV industry need at this very moment?

From my personal perspective, I think the most important thing is not so much more technology, but rather a mindset shift in how the technology is developed. We have reached a high level of maturity in the technologies that came about with the IP revolution over the last decades. Our next step should be to enhance the usability and user experience of working with these technologies. To get there, we need to focus on the users and workflows, and on making these technologies as frictionless as possible for people to adopt and work with. Mature technology is critical, but as it arrives, the focus needs to shift to making it ever more user friendly and accommodating, not only to the task at hand but to how the user wants to operate and carry out what they are trying to do. Technology for technology’s sake isn’t going to make anybody’s life easier; we need to focus on next-level problem solving.

AES70 is a freely available open standard, so what do you consider to be the ROI for the OCA Alliance?

Our one and only goal is adoption, full stop. Our return on investment is when this technology makes it into products and begins to make the lives of developers, designers, and users easier. The individuals who make up the working groups in both the AES and the OCA Alliance are, in many cases, competitors in our day jobs. However, we all believe very strongly in the common good that this standard brings to the industry and our customers. Our goal is, and always has been, to try to solve problems for people through open public standards that are freely available to all. After all, if a designer doesn’t need to worry about a control solution anymore, they can spend more time developing killer features into their products.

TECHNOLOGY| DMX512

From the invention of fire to the latest advances in laser and LED technologies, by way of Thomas Alva Edison’s manufacture of the incandescent lamp in 1879 (capable of remaining lit for 48 uninterrupted hours), the first film shot entirely with tungsten light, Broadway (1927, Universal), or the inauguration of TVE’s (the Spanish public broadcaster’s) Prado del Rey studios in Madrid in 1964, many technological contributions have added to everything related to lighting in the audiovisual, leisure and entertainment industries.

Lighting is vital when it comes to constructing narratives, designing events and making shows. But, from the point of view of equipment and technologies, not all areas of the audiovisual and live industries approach lighting in the same way when it comes to operation and control.

In this article we will take a closer look at one of the most important pieces of equipment for controlling lighting remotely: lighting consoles, also known as lighting controllers or lighting tables.

First of all, to better understand the reason behind this equipment, we should point out that it originates in the needs of different professional fields, such as live television, concerts, the performing arts, compact events (DJ sessions, party venues, congresses, prize-giving ceremonies, etc.) and large-scale events (macro-festivals, inaugurations, world sporting ceremonies, etc.).

In each of these areas, the person responsible for the lighting of the show must be able to adjust, move, modify, add color, apply effects, recall memories and work with all the possibilities a lighting console offers, both for the proper running of the event and for the creation of light as an artistic element.

A lighting console is a piece of equipment for controlling and monitoring lighting fixtures/projectors remotely, regardless of the number of devices to be used, the type of light projector (fixed, mobile, scanner, wash, spot, etc.) or the manufacturer of the light point. It therefore allows us to lay out each item of equipment used in the design and in the lighting plot. In addition, it can handle other devices such as smoke and/or fog machines.

But how do the different lighting devices/equipment/fixtures communicate with each other and with the lighting controller? The answer is very simple: through DMX512, a communications protocol for lighting design and operation. It was released in 1986 by USITT (the United States Institute for Theatre Technology) as a solution enabling lighting effects offered by different lamp brands and manufacturers to be synchronized and programmed from a lighting console.

DMX stands for Digital Multiplex, and the 512 refers to the protocol’s 512 channels. Each channel carries an 8-bit DMX value between 0 and 255 (or, in percentages, from 0% to 100%). The 512 channels form what is called a DMX universe.

That is, each of the 512 channels carries a command/instruction sent from the control table to one or more light projectors (provided they are the same equipment model and are grouped). All of them work within that universe. It must be emphasized that the DMX512 protocol has a limit of 512 channels per universe, but several universes can be managed as long as the table/controller/lighting console allows it (Universe 1, Universe 2, Universe 3, also called DMX OUT A, DMX OUT B, DMX OUT C…).

For instance, from channel 1 of the console we can send a command governing the light intensity of a light projector, which can range from off (value 0, or 0%) to full on, “FULL” (value 255, or 100%), covering all the intermediate light levels or percentages (10%, 25%, 86%...), only in universe 1 (DMX OUT A).
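As a minimal illustration of the channel/value model described above, the following Python sketch treats a DMX universe as an array of 512 eight-bit values; the channel numbers used are examples.

```python
# A minimal sketch of a DMX universe as the console sees it: 512 channels,
# each holding an 8-bit value (0-255). Channel numbers below are illustrative.
universe_a = bytearray(512)          # "DMX OUT A": all channels start at 0 (off)

def set_channel(universe: bytearray, channel: int, percent: float) -> None:
    """Set a 1-based DMX channel from a 0-100% level."""
    universe[channel - 1] = round(max(0.0, min(100.0, percent)) * 255 / 100)

set_channel(universe_a, 1, 100)      # channel 1 to FULL (value 255)
set_channel(universe_a, 2, 86)       # channel 2 to 86%  (value 219)
print(universe_a[0], universe_a[1])  # -> 255 219
```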

But what must be very clear is that the command, i.e. what the light fixture is capable of doing, is predetermined by the manufacturer or brand of the light projector itself. This is what is known as the modes, or number of channels/commands, that the light device has. Accordingly, the same projector may be configured to obey different modes, for example 4CH, 9CH, 16CH or 32CH. This is what is known among lighting technicians as appliance libraries, which are necessary for a correct configuration between the lighting table and the large number of different manufacturers and models.

Another very necessary step is to configure the DMX address, or Start Address, between the controller and the light fixture (the DMX path).


That is, for the command to arrive correctly we have to set a destination address, which always ranges from 1 to 512 and is set on the light projector itself. In this sense, two or more light fixtures (of the same model and brand) will be jointly controlled if they are assigned the same DMX address (and are therefore grouped), thus obeying the same command going out from the controller, for example, so that they all change to red at once.
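The sketch below illustrates this patching logic: each fixture occupies a block of channels starting at its DMX address, with the block size determined by its mode. The fixture names and modes are invented for the example.

```python
# A small illustrative sketch of patching fixtures: each fixture occupies a
# block of channels starting at its DMX start address, with the block size
# set by its mode (4CH, 9CH, 16CH...). Fixture names and modes are made up.
from dataclasses import dataclass

@dataclass
class Fixture:
    name: str
    start_address: int   # 1..512, set on the fixture itself
    mode_channels: int   # channel footprint of the selected mode

    def footprint(self) -> range:
        return range(self.start_address, self.start_address + self.mode_channels)

patch = [
    Fixture("Wash L", start_address=1,  mode_channels=9),
    Fixture("Wash R", start_address=1,  mode_channels=9),   # same address: grouped
    Fixture("Spot 1", start_address=10, mode_channels=16),
]

# Fixtures sharing a start address (and model) obey the same commands, while
# others must not overlap, or they will respond to the wrong channels.
for fx in patch:
    print(fx.name, "uses channels", fx.footprint().start, "-", fx.footprint().stop - 1)
```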

DMX512 is based, like many other protocols, on the specifications of an RS-485 serial bus. It is a data signal transmitted asynchronously over a pair of wires at voltage levels between 0 and 5 V, plus a ground conductor, running at a speed of 250 kbit/s.
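That 250 kbit/s figure sets the refresh rate of a full universe. The back-of-the-envelope calculation below, assuming the minimum break and mark-after-break times from the specification, shows why a complete 512-channel frame takes roughly 23 ms, i.e. about 44 updates per second.

```python
# Back-of-the-envelope timing for a full 512-channel DMX frame at 250 kbit/s.
# Each slot is 11 bits (1 start + 8 data + 2 stop); a frame carries a start
# code plus 512 channel slots, preceded by a break and mark-after-break
# (minimum durations are used here).
BIT_TIME_US  = 1e6 / 250_000        # 4 us per bit
SLOT_TIME_US = 11 * BIT_TIME_US     # 44 us per slot
BREAK_US     = 88                   # minimum break
MAB_US       = 8                    # minimum mark after break

frame_us = BREAK_US + MAB_US + (1 + 512) * SLOT_TIME_US
print(f"Frame time: {frame_us/1000:.1f} ms, max refresh ~{1e6/frame_us:.0f} Hz")
# -> roughly 22.7 ms per frame, i.e. about 44 full-universe updates per second
```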

Some very useful figures within the DMX512 protocol: it supports up to 1,200 meters of cable and a maximum of 32 devices on a single DMX line; if there are more than 32 devices, the signal can be extended through DMX splitters (DMX opto-splitters or DMX repeaters).

A splitter is used to branch the signal in parallel. Its main use is when the installation involves a large number of light projectors connected to a controller. With a DMX signal distributor, each DMX OUT line is considered a new line in every respect.

Regarding the installation and the communication cable, it is necessary to use shielded twisted-pair cable with an impedance of 120 ohms, terminated with XLR3 or XLR5 connectors. The minimum thickness of the conductors depends on the distance: 24 AWG up to 300 meters, 22 AWG up to 500 m. It is called DMX cable and must never be confused or mixed with audio cabling.

A DMX512 installation assumes that commands exit from the lighting control table and, through a daisy-chain connection scheme, pass along a succession of links in sequence, so that one lighting device is connected to a second, this to a third and so on up to the last one: a chain connection between all the light devices (DMX IN / DMX OUT).


In order to ensure stability and integrity across the entire communication and wiring chain of the installation, it is advisable to close the circuit with a DMX terminator (typically a 120-ohm resistor) at the last light fixture or connected device.

Currently, wireless communication (Wireless DMX 2.4 GHz) is possible, thus reaching a range of about 700m, more than enough for many events.

But what is increasingly popular in the professional field is the transmission of DMX data over IP (Internet Protocol, over Cat5 cabling), that is, over a network infrastructure using Art-Net, which has become a de facto standard. This is due to three reasons: stability regardless of wiring length; the ability to work with light fixtures that offer ever more channel modes (some obey more than 60 channels in a single projector); and the growing number of very ambitious shows with a large multitude of light fixtures. Art-Net runs over 100 Mb/s Ethernet, is bidirectional and allows working with more than one DMX universe (expandable to 64 DMX universes).
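To ground the Art-Net idea, here is a minimal Python sketch that packs one DMX universe into an ArtDmx packet and sends it over UDP port 6454, following the published Art-Net field layout; the target IP address and universe number are example values.

```python
# A minimal sketch of sending one DMX universe as an Art-Net "ArtDmx" packet
# over UDP (port 6454). The field layout follows the published Art-Net
# specification; the target IP and universe numbers are example values.
import socket, struct

def artdmx_packet(universe: int, dmx_data: bytes, sequence: int = 0) -> bytes:
    assert len(dmx_data) <= 512
    header = b"Art-Net\x00"                       # protocol ID
    header += struct.pack("<H", 0x5000)           # OpCode ArtDmx (low byte first)
    header += struct.pack(">H", 14)               # protocol version (hi, lo)
    header += bytes([sequence, 0])                # sequence, physical port
    header += struct.pack("<H", universe)         # 15-bit port address (SubUni, Net)
    header += struct.pack(">H", len(dmx_data))    # data length (hi, lo)
    return header + dmx_data

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
universe_data = bytes([255] + [0] * 511)          # channel 1 to FULL
sock.sendto(artdmx_packet(universe=0, dmx_data=universe_data),
            ("192.168.1.50", 6454))               # example node address
```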

As we have already noted when introducing the DMX512 protocol, it is essential to distinguish the DMX universe, the DMX channel and its values, the DMX address, the DMX modes (inside the lighting fixture) and the DMX cabling.

Before closing the topic of communication between devices in the lighting environment, we come to sACN, Streaming ACN (2013, ESTA, the Entertainment Services and Technology Association). This is really a set of applications formed by several protocols: DMP (Device Management Protocol) reports the status between devices without limitations such as those of DMX; DDL (Device Description Language) delivers the information about the parameters and components of the devices; SDT (Session Data Transport) defines how the communication will be carried out, which can be “reliable” or “unreliable”; ACN Discovery generates the search and communication processes; and ACN Packet Format defines the communication frame, which is two-way at all times.

Every lighting technician must be very clear about the operation of the protocols described above, but also handle a very specific terminology and know the entire set of possibilities offered by manufacturers in the design and functionality of the devices/light projectors.

Regarding the terminology in the use of a lighting controller, we refer to a common theoretical basis, regardless of the professional field involved:

• FIXTURE: It is the name used for any type of light projector/device, scanner and/or lamp that is controlled by a console/ controller/lighting table.

• FADER: A slider (sliding potentiometer) that lighting consoles (and other types of devices, such as audio mixers, MIDI controllers…) are equipped with, allowing adjustment and control of command values (from 0% to 100%). There are channel faders, submaster faders, master faders…

• CHANNEL: The most basic unit of a controller. We use it to change the percentage of a value in a command that will be sent to a fixture through a channel fader.

• GROUP: It is the grouped storage of several channels that may have different command parameters and belong to different light fixtures.

• FIELD or PRESET: It is an assignment that we can make in a lighting controller so that individual channels can work under differentiating work zones. Thus, a single channel can be operated separately if the A field or the B field of the controller is activated.

Also, the PRESET designates a way to record a set of values according to some brands of lighting consoles.

• STEP: It is a specific action/ command in the state of the lighting fixtures, which will be part of a whole set according to a more complex narrative underlying the show or event. It is the minimum unit of light design that can make up a CUE, SCENE or MEMORY.

• CUES, SCENES, MEMORIES: All three terms refer to the possibility of storing an action or different actions (light designs) previously programmed with the lighting console and then showing them live.

• CHASES, SEQUENCE: Both terms mean that we have the possibility to have an output of the stored designs/light states (CUE, SCENE or MEMORY) according to the content of the event or list of scenes regardless of the order in which they have been saved/ programmed. Therefore, it allows us to set IN, OUT times or live durations, make transitions and even jumps between states.

• SUBMASTER: A sliding potentiometer where we output (0% to 100%) multichannel clusters to send a command/instruction. Field, effects and/or channel group submasters are normally found.

• MASTER – GRAND MASTER: A sliding potentiometer that gives us control at the final output (OUT) over all values/commands on a lighting control table (see the sketch after this list).

• BANK, PAGES: An internal space in the lighting console that gives us greater storage capacity for programmed commands and scene states for output, in order to achieve a more orderly live event.

• BLACK OUT: A switch or button that entails a total and absolute absence of command output from the lighting controller. Fade to black is normally another widely used expression.
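As a quick illustration of how a grand master (and, in the extreme, a blackout) acts on the console’s output, the Python sketch below simply scales every channel value by the master level; all values are examples.

```python
# An illustrative sketch of how a grand master scales the console's final
# output: every channel value is multiplied by the master level, and a
# blackout is simply the master forced to 0%. Values are examples.
def apply_grand_master(universe: bytearray, master_percent: float) -> bytes:
    scale = max(0.0, min(100.0, master_percent)) / 100.0
    return bytes(round(v * scale) for v in universe)

universe = bytearray([255, 128, 64] + [0] * 509)      # a few channels with levels
print(list(apply_grand_master(universe, 50)[:3]))     # -> [128, 64, 32]
print(list(apply_grand_master(universe, 0)[:3]))      # blackout -> [0, 0, 0]
```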

Finally, the manufacturers of light projectors, especially of the mobile/robotic type, offer us a wide variety of possibilities when it comes to configuring, handling, operating and creating with the light beam. It is worth understanding that every fixture can have channels for different sorts of commands, but in most cases they are parameters common to all, such as:

• DIMMER: Fixture light source intensity regulator.

• PAN: Horizontal movement of fixtures, such as moving heads.

• TILT: Vertical movement of fixtures, such as moving heads.

• SPEED: A value that we can modify to accelerate or decelerate pulses of continuous or strobe light.

• COLOR EFFECTS: Changes in the tone, saturation and/or luminosity parameters of the primary colors (RGB) or possible combinations with the secondary colors.

• BEAM EFFECTS: Variation of parameters corresponding to beam type, focus and light coverage.

• SHAPE EFFECTS or GOBO: Variation of the parameters that allow generating shapes and/or patterns with light.

As there are multiple types of TV programs and a wide variety of types for LIVE events, we can find different solutions when choosing a lighting table. Therefore, we can choose between:

• Double scene controllers, very common in theaters and/or for simple TV programs. They feature a small number of DMX channel faders (most commonly 24 or 48). They are designed for Field A and Field B operation within a single DMX universe.

• DMX192 controllers are, par excellence, the ones most used for compact events, DJ sessions, parties, awards ceremonies… Manufacturers that offer good equipment include EUROLITE, SHOWTEC, BEAMZ, STAIRVILLE, ADJ, AUDIBAX and JB SYSTEMS, among others. Their main feature is that they are compact units working within a single DMX universe, using field A and field B to provide 16 channels for each of the 12 fixtures that can be addressed separately: a total of 192 channels, hence the name.

• DMX interfaces: an external device that lets a computer equipped with lighting software communicate with the different projectors/light fixtures, for example ENTTEC, EUROLITE, SHOWTEC, AVOLITES, ADJ MyDMX, CAMEO, SUNLITE… Most of them allow working with 4 DMX universes.

• Branded surfaces or consoles, typical of music concerts, large events and spectacular lighting setups. They are known by the brand name of the console manufacturer. Some of the best known are MA, CHAMSYS, AVOLITES, MIDAS, INFINITY and LT LIGHT. Most allow you to work with multiple DMX universes and over IP or sACN.

The advent of LED lighting technology has led to the rise of a large number of manufacturers: ADJ, AFX LIGHT, AMERICAN DJ, AUDIBAX, BEAMZ, E-PRO, CAMEO, EUROLITE, IBIZA LIGHT, ACME, ALTMAN, APOLLO, LASERWORLD, JB SYSTEMS, JB-LIGHTING, VISION led, MARK, SHOWTEC, WORK PRO, STAIRVILLE, VARYTEC, DOUGHTY, MARTIN, ETC, TRITON, BRITEQ, ELUMEN8, PRO LIGHT, ARRI, HQPOWER, STRAND, DESISTI, HIGH END SYSTEMS, DTS, SKYTEC, E-LITE, INFINITY, CHAUVET PRO, VARI-LITE, ROBE, MOVITEC, COEMAR, CLAY PAKY and AYRTON, for example.

The use of controllers/consoles/lighting tables is inevitable when we want to carry out programmed, fully remote work that will enhance the show and bring the chaos of the lighting environment under control.


4K UHD/HDR in Live Sports Streaming

The evolution of media content is undergoing a rapid transformation, as higher resolution formats such as 4K and UHD are quickly becoming the new industry standard. This follows a steep rise in consumer video consumption driven by the proliferation of viewing options and unprecedented flexibility to consume any content, in any place, on any screen, replacing the ubiquitous stationary TV. However, the modern viewing audience demands nothing less than pristine quality when it comes to live streaming, regardless of the medium they choose. In recent years, major sporting events such as the FIFA World Cup, Formula 1 races and Wimbledon have graced our screens in glorious native 4K UHD/HDR. Not too long ago, the Beijing 2022 Winter Olympics was the first Winter Games broadcast in UHD, following the successful 4K HDR deployment at the previous Tokyo Summer Olympics, as well as regular broadcasts during the 2021/22 Premier League season, all in higher resolution formats.

Turning live UHD production and delivery into a reality

It’s evident that 4K UHD/HDR is swiftly becoming the new industry standard for resolution, largely because consumers continue to show a willingness to pay for flexible and unrestricted access to video content. Content providers, from traditional video distributors to emerging on-demand services such as Netflix and Prime, are aggressively monetising this consumer trend. The only question amidst this growing trend is whether they can keep pace with this insatiable appetite for high-quality video content on a massive scale.

According to research, the share of consumers saying that 4K/UHD content is an important feature of their video services has risen faster than any other factor across the market over the last two years. Productions and contribution links in 4K/UHD are quickly becoming standard practice for broadcasters and studios. But, to ensure viewers can savour the full splendour of 4K UHD/HDR, a range of solutions can be deployed ahead of delivery. Broadcasters have the ability to create content in both native 2160p and 1080p50/60 HDR, with 4K displays adept at upconverting the latter images to 2160p, albeit with some imperfections. However, the allure of UHD extends beyond mere resolution and dynamic range. UHD requires progressive scanning at 50 Hz or higher, a feature ideally suited to the fast-paced nature of sports, greatly enhancing the viewer experience. Moreover, UHD offers superior conversion to other frame rates, with progressive 2160p50 content transitioning to 2160p59.94 with fewer artifacts. This versatility caters to international feeds that necessitate different frame rates. Single-production UHD also proves beneficial, enabling better conventional HD outputs, including the conversion of 2160p50 to 1080i29.97. Delivering these top-resolution quality experiences comes down to a handful of core technology innovations that are reshaping the playing field.

Taking streaming to the next level with 5G

The paramount challenge for broadcasters is to efficiently deliver UHD/HDR content across various platforms and devices. With its capacity to support data-intensive resolutions like 4K UHD, 5G addresses the unique demands of immersive technologies such as 360-degree live video and virtual/augmented reality experiences. 5G guarantees ample bandwidth, high data speeds, low latency, and delivery reliability, resulting in superior user experiences, even in remote locations.

Beyond enhancing the viewer’s experience, 5G has ushered in a transformative era for remote productions. By reducing the need for on-site personnel and equipment, it contributes to sustainability efforts: fewer people travelling cuts carbon emissions, and less reliance on on-premises hardware lowers energy and power consumption. Additionally, 5G-enabled mobile devices typically support HEVC (High Efficiency Video Coding), offering superior efficiency and quality compared to H.264. HEVC, known for its visually lossless compression, achieves bandwidth savings of 120:1 to 150:1 or more, albeit at a slight increase in latency.
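To put those compression ratios in perspective, the rough calculation below, assuming a 2160p50, 10-bit 4:2:2 contribution signal, shows the order of magnitude of bitrate such ratios imply; the figures are illustrative approximations, not vendor specifications.

```python
# A rough, illustrative calculation of what a ~120:1 to 150:1 visually
# lossless compression ratio means for a 2160p50, 10-bit 4:2:2 signal.
# Figures are approximate and for orientation only.
width, height, fps = 3840, 2160, 50
bits_per_pixel = 10 * 2            # 10-bit, 4:2:2 sampling ~ 20 bits/pixel
uncompressed_bps = width * height * fps * bits_per_pixel
for ratio in (120, 150):
    print(f"{ratio}:1 -> ~{uncompressed_bps / ratio / 1e6:.0f} Mb/s "
          f"(from ~{uncompressed_bps / 1e9:.1f} Gb/s uncompressed)")
```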

Looking towards the next streaming frontier

While strides have been made, broadcasters still face the challenge of selecting formats that optimise efficiency and cost-effectiveness for producing and delivering UHD/HDR content. Enter VVC (Versatile Video Coding), the current standard-bearer in video coding efficiency. VVC holds the promise of substantially more efficient encoding, an essential consideration for higher resolutions and frame rates, particularly in the 4K-and-above range, where HEVC may not yet be fully established.

VVC paves the way for enhanced quality and efficiency in the future. As immersive 4K HDR services continue to proliferate, VVC positions itself as a catalyst for widespread adoption of 4K content. This promises viewers a deeper, richer viewing experience and even opens the door to potential 8K services down the line. While the question of 8K adoption among consumers remains open, these technological advancements herald an exhilarating era for the media industry, empowering broadcasters, service providers, and content owners to deliver unparalleled media experiences to audiences across the globe. 


TEST ZONE| TERADEK ART

Teradek ART: redefining video streaming by means of ultra-low latency

In the fast-paced world of real-time video streaming, the quest for ultra-low latency has brought the industry to a technical crossroads. The ability to stream content with minimal latency is a critical factor in creating immersive user experiences and in the viability of applications, ranging from live production to robotic teleoperation. While technological progress has enabled higher video quality, latency remains a persistent obstacle.

In this context, Teradek ART emerges as a revolutionary solution, promising to completely redefine the video streaming experience. Its ability to achieve ultra-low latency while optimizing video quality and network features is a game-changing technological leap. This technical analysis will delve into the intricate details of this equipment’s protocol, exploring how it addresses latency challenges and how it compares to existing technologies.

Traditionally, streaming solutions have had to deal with a dilemma: sacrifice video quality to reduce latency or tolerate higher latency to maintain quality. Teradek ART challenges this dichotomy and presents an elegant solution: providing ultra-low latency without compromising video quality.


The video streaming revolution

At the heart of Teradek ART is its dual focus on optimizing video content and network features. This ability to “think” about both content and the network in real time is what defines the essence of this revolutionary technology. The process begins with constant evaluation of video quality, allowing Teradek ART to adapt compression to optimize visual quality without compromising speed. But this is where Teradek ART takes a unique approach.

The joint coding of source and channel is the beating heart of the Teradek ART protocol. This approach combines real-time error correction with video compression, thus ensuring reliable, smooth delivery of content even under harsh network conditions. By using Forward Error Correction (FEC) techniques, the device is able to detect and correct errors before they impact transmission quality. In contrast to other protocols that rely on retransmissions to correct errors, this equipment operates with exceptional efficiency, ensuring the integrity of the content without any interruptions.
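To illustrate the general FEC principle, not Teradek ART’s actual, proprietary coding scheme, the toy Python sketch below adds one XOR parity packet per group of media packets, so that a single lost packet can be rebuilt at the receiver without any retransmission.

```python
# A toy illustration of the forward-error-correction idea: send one XOR parity
# packet per group so a single lost packet can be rebuilt without asking the
# sender to retransmit. This is a generic sketch of the principle, not
# Teradek ART's actual (proprietary) coding scheme.
from functools import reduce

def xor_parity(packets: list[bytes]) -> bytes:
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

def recover(received: list[bytes | None], parity: bytes) -> list[bytes]:
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) == 1:                      # one loss per group is recoverable
        present = [p for p in received if p is not None] + [parity]
        received[missing[0]] = xor_parity(present)
    return received

group = [b"pkt1", b"pkt2", b"pkt3", b"pkt4"]   # equal-sized media packets
parity = xor_parity(group)
damaged = [b"pkt1", None, b"pkt3", b"pkt4"]    # packet 2 lost in transit
print(recover(damaged, parity))                # -> packet 2 rebuilt, no retransmit
```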

The real magic of Teradek ART lies in its ability to avoid retransmission delays. This feature is especially significant in real-time applications where even a small fraction of a second can make a difference.

Robotic teleoperation and remote driving are clear examples of areas where ultra-low latency is essential for safety and performance. By eliminating the need for retransmissions, Teradek ART becomes a crucial ally for these critical applications.

The result of this combination of technology is a video stream that exceeds conventional expectations. Ultra-low latency and exceptional quality are intertwined for an unprecedented user experience. Images are displayed with crystal clarity, movements are smooth, and details are sharp, all while maintaining a latency that meets the most challenging demands.


Comparison of Teradek ART with SRT: Revolution in context

One of the most revealing comparisons comes from setting this device against the SRT (Secure Reliable Transport) protocol. Both protocols address latency and transmission quality but differ fundamentally in their approaches. SRT uses retransmissions to deal with packet loss and improve transmission quality, while Teradek ART uses forward error correction (FEC) techniques to maintain content integrity without the need for retransmissions.

This approach is a turning point in the quest for ultra-low latency. By eliminating retransmission delays, the equipment ensures that content is delivered with minimal latency, even under challenging network conditions. This is particularly important in applications such as robotic teleoperation, where latency can have a direct impact on safety and performance.

Another benefit it offers over other technologies is its ability to adapt to changing network conditions without sacrificing video quality. While some solutions may suffer quality degradation on poor networks, Teradek ART maintains its focus on delivering high-quality content. The combination of joint source and channel coding with FEC techniques allows this equipment to overcome the challenges of transmission in adverse conditions without compromising the user experience.

(Chart: Transport latency comparison)

In addition, it sets a new standard in terms of latency. With the ability to deliver less than 100 ms of latency across WAN networks, it exceeds expectations for applications that require instant response. This advantage translates into greater immersion in real-time experiences and greater effectiveness in critical applications where synchronization is crucial.

Applications and benefits

The applications of this device are vast and diverse, ranging from live productions to robotic teleoperation and real-time collaboration. For remote live productions, where instant interaction is essential, it enables the creation of immersive experiences without noticeable delays. Hosts can interact with their audience naturally and seamlessly, without the latency barrier.

Robotic teleoperation, an ever-growing area, benefits greatly from Teradek ART’s ultra-low latency. In medical, safety and other applications, instant video streaming enables operators to control robots remotely and effectively, ensuring real-time response and accurate decision-making.

In environments where precise synchronization is vital, such as collaborative video editing and creative production, the device allows teams to work together regardless of geographic distance. Eliminating perceptible latency creates a smoother workflow and increases creative efficiency.

The advantages of the equipment go beyond its applications. Teradek’s Core cloud platform complements its technical capabilities by providing an intuitive interface for configuration and monitoring.

One of the biggest advantages of Teradek ART is its adaptability to changing network conditions. Forward error correction (FEC) and the ability to adjust to packet loss ensure a seamless user experience, even on poor networks. Resilience in challenging environments offers a significant advantage as compared to other streaming solutions.

The evolution of video streaming

The continuous growth of real-time applications points to a future where instantaneity and immersion are the norm. As demands for more immersive user experiences continue to increase, ultra-low latency will become a must in a variety of industries.

The convergence of real-time video streaming with virtual and augmented reality technologies is another area that is set to revolutionize the way we interact with content. Minimal latency is essential to create immersive and seamless experiences with these technologies.

In addition, the rise of the Internet of Things (IoT) and machine-to-machine (M2M) applications presents additional opportunities for this equipment. As more devices and systems connect in real time, streaming data and video with low latency becomes a critical component.

Conclusions

Teradek ART’s dual focus on simultaneous optimization of video content and network features has proven to be an innovative breakthrough. Joint source and channel coding with forward error correction (FEC) techniques eliminates the need for retransmissions, thus enabling ultra-low latency with no noticeable interruptions.

Its ability to avoid retransmission delays, in combination with its adaptability to changing network conditions, offers significant advantages over solutions that resort to retransmissions to correct errors. Ultra-low latency and high quality are attributes that align with the ever-increasing demands of real-time applications, from live productions to robotic teleoperation and creative collaboration.

Finally, its integration with the Core cloud platform provides an additional layer of control and customization for the user experience.
