Photo Credit: © 2021 Olympic Broadcasting Services
EDITORIAL This edition of the Olympic Games has undoubtedly had many peculiarities, and not all of them relate to sports. The fact that the Games were held without an audience makes them the first to be designed almost exclusively for broadcasting. Such a singularity seemed to us worth highlighting, which is why in this issue you will find extensive coverage of it from two different perspectives. On the one hand, an exclusive interview with Sotiris Salamouris, CTO of Olympic Broadcasting Services. On the other, we wanted to get closer to one of the largest European broadcasters, Spain's RTVE, so that their expert staff could give us detailed information regarding the technical
Editor in chief Javier de Martín editor@tmbroadcast.com
operations involved in the Tokyo Olympic Games. The information in this issue is completed by a special PTZ report, which explores the development of this type of camera and the boom it has enjoyed in the broadcast sector in recent years. There is also an interesting article on the transformation to HD, an unavoidable topic that nowadays occupies the offices of hundreds of TV stations all over the world. Finally, closing this issue is the first installment (there will be several, given its breadth) of "The Colorist", a detailed analysis of what this trade is about.
Creative Direction Mercedes González mercedes.gonzalez@tmbroadcast.com
Key account manager
Administration Laura de Diego
Susana Sampedro
administration@tmbroadcast.com
ssa@tmbroadcast.com
TM Broadcast International #96 August 2021
TM Broadcast International is a magazine published by Daró Media Group SL Centro Empresarial Tartessos Calle Pollensa 2, oficina 14 28290 Las Rozas (Madrid), Spain Phone +34 91 640 46 43
Editorial staff press@tmbroadcast.com
Published in Spain ISSN: 2659-5966
SUMMARY
6 News
20 The First Olympics Only for Broadcast
Interview with Sotiris Salamouris, Olympic Broadcasting Services CTO. Sotiris Salamouris has offered us a compilation of the technological improvements implemented at these Tokyo 2020 Olympic Games. But he also shared with this magazine his concerns about the future of broadcast technology and the challenges that have arisen in OBS's race toward innovation. And, of course, the solutions they have created from scratch to circumvent the problems arising from the COVID-19 pandemic.
38 RTVE's technical operation for TOKYO 2020
The technical team of RTVE, one of the largest broadcasters in Europe, tells us how they have prepared for these Olympic Games.
58 The boom in the broadcast and live environments: PTZ CAMERAS
62 5G: Nothing like what you've ever experienced before
68 Transformation to HD
74 The colorist
NEWS - PRODUCTS
Clear-Com launches Arcadia Central Station: digital, analog and AoIP combined
Clear-Com has recently launched Arcadia Central Station, a scalable IP platform that integrates all wired and wireless partyline systems, including the FreeSpeak family of digital wireless solutions and HelixNet Digital Partyline. Arcadia combines digital, analog and AoIP intercom technologies into a single integrated system and can license up to 96 IP ports in a single 1RU device. Arcadia provides connectivity to a wide range of Clear-Com endpoints through a mixture of 2-wire and 4-wire audio ports, together with third-party Dante and AES67 AoIP devices. Additionally, it supports the full range of FreeSpeak digital wireless products, encompassing the 1.9 GHz, 2.4 GHz and 5 GHz bands, as well as the upcoming integration with HelixNet. Arcadia also has two large touch screens on the front panel for quick and easy adjustments. The system is configured and monitored through a new, re-imagined version of Clear-Com's browser-based CCM software, featuring an interface that guides users through all steps of the process.
"Arcadia Central Station is essentially the universal translator, integrating all our partyline systems together in a single, compact 1RU package. With the flexibility and power of a licensing-based model, functionality will increase over time and eventually this central station will become the ultimate group communications hub", says Bob Boster, president of Clear-Com.
NEWS - SUCCESS STORIES
Quicklink Studio allows oXyFire Media to conduct a remote interview through a virtual studio
Quicklink Studio has assisted a remote interview between a football manager based in London and a journalist based in Dublin. For the production, Quicklink Studio (ST55) was used to provide a conversation between London, Dublin and the Sheffield-based Master Control Room (MCR). Using Scarlett audio interfaces connected to laptops in London and Dublin, Quicklink Studio (ST55) facilitated a bidirectional conference call between both parties as well as the MCR. To achieve a realistic virtual studio production, both participants were carefully positioned in a green-screened environment so that a virtual backdrop could be applied.
To allow for natural conversation, monitors were strategically placed at each other's eye-line, to make it appear as though they were making eye contact throughout the interview. "Quicklink was a vital part of the virtual production workflow. I needed a system I had faith in and could trust to deliver. I've used Quicklink Studio many times over the pandemic and it's proved rock solid. I wouldn't want to use anything else, especially on such high-profile projects", said Jay Rozanski, Creative Director at oXyFire Media. Quicklink Studio (ST55) enables the gathering of audio-visual contributions from a laptop or mobile device with no apps or software installations required. Contributors can be invited by an operator to contribute via SMS, WhatsApp, email or by generating a shareable URL link. From the Quicklink Manager, the operator is then able to remotely control and configure the contributor's device camera, microphone, speaker settings, resolution, data bandwidth and latency if required.
CP Communications provides solutions for ESPN, FOX and MLB Network during All-Star Game
ESPN, FOX and MLB Network selected RF coordination services and an extensive complement of equipment from CP Communications. Those solutions were deployed during Major League Baseball's All-Star Game and related events earlier this month at Coors Field in Denver. The RF coordination strategy for the All-Star Game included pre-planning and acquisition of 1108 frequencies, covering RF video systems, audio, return video, paint control, intercoms and two-way radios. CP's support included 16 RF handheld cameras, including two FlyCam systems, two "Megalodon" Sony cameras (with shallow depth of field), a Sony HDC-P50 compact POV on a MōVI camera rig, a SteadiCam, and batting cage POVs. CP added 60 wireless microphones that were used throughout the events by reporters and other on-air talent, plus umpires, coaches, and players. Some mics were even planted in the ground to capture game audio and effects.
Riedel supplies its solutions for Plazamedia's UEFA Euro2020 broadcast project
Riedel's MediorNet, Artist and Bolero provided decentralized communications and signal routing infrastructure for UEFA EURO 2020™.
Plazamedia GmbH has staged all UEFA Euro2020 matches for live UHD broadcast on Telekom's MagentaTV using solutions from Riedel. The German company has used Riedel's MediorNet real-time media network as a decentralized production video routing infrastructure to support management and processing of signals across its production facility. It has also deployed Riedel's Artist intercom system with 2300 Series SmartPanels and an integrated Bolero wireless intercom.
For UEFA Euro2020, Plazamedia created a production environment across 2,600 square meters with three studios, four playout centers, and a content center. The main studio featured a 32:9 LED wall as well as augmented reality and virtual graphics.
The Riedel MediorNet setup included four MetroN routers, 13 MicroNs, and almost 20 MicroN UHD frames. The MicroN UHDs formed a redundant ring, each with 200G bandwidth, and distributed 33 UHD signals, four of which originated from the IBC in Amsterdam. Furthermore, the system routed almost 70 3G video signals, and the audio layer handled 10 fully saturated MADI connections in addition to all the embedded tracks. For this Plazamedia project, the 13 MicroN modules were dedicated to running Riedel's MultiViewer App. With access to all distributed MediorNet signals, the MultiViewer App creates up to four monitoring heads that can be routed to any given output. Riedel also supplied an Artist intercom system with around three dozen RSP-2318 SmartPanel interfaces and 16 Bolero wireless intercom beltpacks to facilitate crew communications.
"PLAZAMEDIA delivered a remarkable production for Telekom and MagentaTV viewers, and we're proud to have supported this successful high-profile event with our MediorNet product family", said Marco Kraft, Head of Sales Germany at Riedel Communications. "The UEFA Euro2020 broadcasts were a win for everyone involved, and we look forward to partnering again with PLAZAMEDIA on future production projects".
Laguna Octal solution chosen to receive the signal from the Olympic Games
Sapec's versatile LAGUNA solution has once more been selected for a highly relevant project in sports broadcast. This time it is the OCTAL model, which has met the requirements established by the Spanish public broadcaster: the signals arriving from the Japanese capital had to be simultaneously encoded and distributed. For simultaneous processing of these 36 signals, SAPEC provided 9 LAGUNA OCTAL units (1RU each). These 36 signals are divided into 18 signals encoded in HEVC/H.264 4:2:2 that arrived in Madrid from the International Broadcast Center established in Tokyo, and 18 signals that were also distributed by RTVE. This means that, in these 9 rack units, LAGUNA OCTAL offers 18 encoders and 18 HEVC/H.264 decoders working simultaneously. For the management of the inputs and outputs of the 36 signals, LAGUNA OCTAL provides, per unit, 8x HD-BNC interfaces (configurable as ASI or SDI) and 4x 1Gb Ethernet interfaces, allowing path redundancy and concurrency in both ASI and IP. In addition, LAGUNA features hot-swappable redundant power supplies, an important advantage for easy maintenance: a power supply can be changed without interrupting the operation of the equipment. The solution provided supports 4x audio PID services per video signal (4x36), with the possibility of upgrading to 8x audio PIDs per video signal (8x36). These units are also prepared to support additional functionalities should the Spanish
broadcaster consider them necessary in the future, such as:
• SRT protocol: to guarantee secure transmissions over public IP networks.
• SCTE 104 <> SCTE 35 conversion: for local ad/content insertion management.
• Up/down conversion: to convert the received signal from HD to SD or from SD to HD.
Finally, it is worth highlighting the flexibility of LAGUNA in terms of adapting the layout (the internal architecture of the unit). These units, which today each have a layout of 2 HEVC encoders and 2 HEVC decoders (1RU), can adapt to future needs, so any of them can be reused after this event for other types of applications. What today are 2 encoders and 2 HEVC/H.264 decoders can easily be upgraded via software to, for example:
• (Up to) 8x H.264 encoders
• (Up to) 4x HEVC encoders
• Any of the above with an internal multiplexer (MPTS output)
• 8x HEVC/H.264/MPEG-2 decoders
In addition, the solution is based on a Pay-as-you-Grow software model, in which the client only invests in what they need, when they need it.
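The signal budget described above can be double-checked with simple arithmetic. The following sketch only models the counts stated in the article; the variable names are ours, not Sapec's:

```python
# Illustrative arithmetic for the RTVE receive setup described above.
# Only the numbers come from the article; this is not Sapec software.

UNITS = 9                  # LAGUNA OCTAL frames, 1RU each
ENCODERS_PER_UNIT = 2      # current layout: 2x HEVC/H.264 encoders per unit
DECODERS_PER_UNIT = 2      # and 2x HEVC/H.264 decoders per unit
AUDIO_PIDS_PER_SIGNAL = 4  # upgradeable to 8 per video signal

encoders = UNITS * ENCODERS_PER_UNIT              # 18 encoders
decoders = UNITS * DECODERS_PER_UNIT              # 18 decoders
signals = encoders + decoders                     # 36 simultaneous signals
audio_services = signals * AUDIO_PIDS_PER_SIGNAL  # 144 audio services (4x36)

print(signals, audio_services)
```

The same arithmetic explains the upgrade path: moving to 8 audio PIDs per signal doubles the audio count without changing the video budget.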
Lawo has covered the sound needs of the "Concert de Paris" for France Day 2021
To celebrate France's national day, on the past 14th of July the Champ de Mars, at the foot of the Eiffel Tower, hosted the traditional Concert de Paris. As in the last seven years, Lawo mixing consoles were responsible for the live and broadcast sound requirements. For the orchestra mix they used a 48-fader mc²56 production console (FOH), while another 48-fader mc²56 handled the monitoring tasks in a configuration in which a 16-fader mc²² extender was used to allow social distancing between the two operators. The whole system operated over the Optocore fiber-optic network deployed by Radio France.
The broadcast mix was managed in a Radio France OB vehicle. It allowed the use of the broadcast console's own stage boxes and DALLIS microphone preamps.
"It is a great honor for Lawo to be trusted, year after year, by Radio France for such a prestigious event as the provider of the audio mixing consoles for both the public address and broadcast signals", says Joffrey Heyraud, Lawo Sales Director France. "It highlights the excellence of our equipment with uncompromising audio processing and state-of-the-art reliability".
This year's Concert de Paris was followed on-site by an audience of 15,000 people, and was relayed to a record 3.5 million TV viewers in France and an estimated audience of 10 million in Europe.
Czech Republic's O2 acquires an OB vehicle designed by Broadcast Solutions
The mobile operator O2 has covered the top Czech soccer league for a couple of years now. The company has recently added a Broadcast Solutions Streamline OB van to its fleet. The vehicle is an S12T and supports 12 HD cameras.
Jan Chodora, CEO at IVC s.r.o., comments: "We have been in close contact with Broadcast Solutions for many years. Our extraordinary mutual rapport is based not only upon their engineering skills and experience but especially on a friendly
atmosphere and mutual trust. Based on that, we have already delivered several OB vans successfully, and therefore, when new projects arise, our decision to co-operate with Broadcast Solutions is natural".
Ateme transforms Telecom Armenia’s operations with complete OTT Video-Delivery Solution
ATEME has announced that Telecom Armenia, an IPTV/OTT operator running under the Beeline brand, has selected its integrated OTT video-delivery solution to enable a re-launch of its TV offering in the Armenian market. With a legacy system in place, Telecom Armenia identified the need for a scalable and effective video-delivery solution as part of a planned service
refresh that included a new front and back office. The operator selected ATEME as a single-source supplier, enabling the organization to benefit from a range of OTT technologies, from reception to CDN. ATEME's end-to-end offering gives service providers a single platform to leverage transport streams over both IPTV and OTT networks, leading to
optimized processes and better performance. Because of its shared workflow approach, it also enables organizations to centralize all media processing workflows and ensure high-quality delivery of any content to any screen. The chosen system also includes TITAN for compression, channel origination and reception, and NEA for packaging,
VOD, catchup and CDN delivery. Gevorg Gevorgyan, CTO, Telecom Armenia, said: “We look forward to powering our OTT platform with a solution that brings better operational efficiency, flexibility of converged IPTV/OTT and the ability to support new formats and functionalities in the future. Our partnership
with ATEME also enables optimum video quality, agility and bandwidth savings while ensuring low latency, giving us the opportunity to deliver high quality HD and UHD (4K) TV content to the end customer with minimal delays.” Boris Yurin, Sales Director, CIS, ATEME, said: “As the TV industry evolves and the line
between paid-for TV and OTT becomes increasingly blurred, creating a high-quality user experience is a key driver behind business decisions. Telecom Armenia is developing a winning strategy to attract and retain the most valuable viewers, and we are delighted to accompany them on this journey.”
NEWS - BUSINESS & PEOPLE
Tedial taps two of its own for leadership roles
Tedial, the independent MAM technology solutions specialist, has tapped two of its own to assume larger roles in the company's hierarchy as it continues to develop its next-generation Media Integration Platform. Emilio L. Zapata, Tedial founder, has announced that Manuel Martínez has been promoted to Tedial's Business Development Manager from his position as Regional Sales Director, Spain; and José Luis Montero has expanded his sales management responsibilities from Regional Sales Director of LATAM / Middle East to include Spain and Portugal as well. Both appointments are effective immediately. "We are fortunate to have such talented and experienced professionals on our team as the industry transitions to a
new era driven by Cloud, AI and SaaS, and the company moves toward a next-generation Media Integration Platform," says Zapata. "Manuel's comprehensive insight into the challenges the M&E sector faces in this time of unprecedented change, along with his deep knowledge of customers' expectations, will be invaluable when we roll out our groundbreaking platform. And José Luis's vast proficiency and deep understanding of the industry, combined with his proven track record, will be key assets for Tedial customers in Spain and Portugal, as they have been for those in LATAM and the Middle East." Martínez, who is based at company headquarters in Malaga, has assumed responsibility for developing business and marketing strategies for Tedial's next generation
of services. He will work with Tedial's senior management, sales and marketing teams to assess market reach, evaluate opportunities and put best practices in place to build new business channels. "New technologies and operational needs arising from the COVID-19 pandemic have caused a disruption in the M&E market. Tedial is at the forefront of this technology transition and is well prepared to offer new services that meet the M&E industry's evolving needs, enabling next-generation user experiences. I am proud that Tedial trusts me to lead this new challenge, aligning the different areas of the company involved in the business development of these next-generation services, including sales, marketing and partnerships," says Martínez.
Montero joined Tedial in 2017 as Regional Sales Director, LATAM. His proven track record in strategic planning and customer development led to the swift expansion of his regional responsibilities to include the Middle East. Montero will now add Spain and Portugal to his current role as Manuel Martínez moves into the position of Business Development Manager.
"I assume the challenge of driving the sales in Spain and Portugal with the same enthusiasm that I have for the Middle East and LATAM," says Montero. "The Spanish and Portuguese markets are extremely important for Tedial and I look forward to attending to the needs of our customers in those regions as well. I am proud to work with the high caliber of professionals at Tedial who consistently support customers around the world with the maximum level of innovation and quality."
TOKYO 2020
The Tokyo 2020 Olympic Games have been the most complicated in history. An event as important as the Olympics, not only at a sporting level but also in terms of innovation in the creation and efficiency of technical and human infrastructures, proved to be a real headache for the organizers. Of course, the pandemic has been very present in the global framework in which the Games took place. And this major handicap has made this latest edition of the Olympic Games the first in history to be told and shown to the world solely and exclusively through technology. This is both a problem and an opportunity to give broadcast technology the important role it deserves. Sotiris Salamouris is the Chief Technology Officer of Olympic Broadcasting Services (OBS). His role is to oversee all technical operations of the Olympic broadcasters. Among his responsibilities is to lead the
teams responsible for the design and control of the technical infrastructure and operations of OBS. He must also set the long-term technology roadmap for the Olympics. For the last five editions of the Olympic and Paralympic Games, Sotiris has been responsible for setting the technical and human resources guidelines for the broadcasting of the sports content generated by these major events. The CTO has offered us a compilation of the technological improvements implemented at these Tokyo 2020 Olympic Games. But he also shared with this magazine his concerns about the future of broadcast technology and the challenges that have arisen in OBS's race toward innovation. And, of course, the solutions they have created from scratch to circumvent the problems arising from the COVID-19 pandemic. Photos Credit: © 2021 Olympic Broadcasting Services
OBS
Regarding the Tokyo 2020 edition of the Olympic Games, what are the most important technological innovations?
We live in a technological age and, of course, each edition of the Olympic Games is accompanied by many technological innovations. But I have to say that several of these changes have been accelerated by the pandemic. However, as far as the Tokyo Olympics and specifically our operations as a sports broadcaster are concerned, the biggest technological improvement we have made has been the introduction of Ultra High Definition. The last time there was such a technological change was in 2008 with the Beijing Olympics, when all coverage went from Standard Definition to High Definition. This innovation also includes implementations of High Dynamic Range and Wide Color Gamut. That's something very
Sotiris Salamouris is the Chief Technology Officer of Olympic Broadcasting Services (OBS). His role is to oversee all technical operations of the Olympic broadcasters.
important because we are talking about all live content being produced that way. For the Tokyo Games, all our coverage is simultaneous in UHD and also in HD. We have more than 80 multilateral feeds coming from all the venues. Each sport is covered by one or more production lines, and that's the technological breakthrough: all the transmissions are going to be based on UHD standards along with HDR and WCG. This update also includes an important audio innovation. We are also
THE BIGGEST TECHNOLOGICAL IMPROVEMENT WE HAVE MADE HAS BEEN THE INTRODUCTION OF ULTRA HIGH DEFINITION
complementing this content with 5.1.4 immersive sound. We deliver 10 discrete channels in a 5.1.4 configuration. Subsequently, broadcasters can compress or repackage this into whichever immersive standard they want to follow. The second innovation, which is also linked to this resolution upgrade, is the conversion of a major part of our broadcast infrastructure to IP. All our UHD contribution and distribution is taking place over a live IP infrastructure. The third is, of course, the adoption of a predominantly public cloud to support a large part of our production workflows. What makes this interesting is that, rather than using the public cloud only for our post-production workflows, our cloud solutions also involve part of our live distribution workflow. Every UHD live stream is distributed to multiple
broadcasters around the world through a public cloud. This is a very important innovation, as a few years ago dealing with live HD signals within the cloud was considered a challenge.
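The 5.1.4 configuration mentioned above totals ten discrete channels: a 5.1 bed plus four height channels. A minimal sketch of such a channel map follows; the channel labels reflect common practice, not OBS's documented channel order:

```python
# Illustrative only: the 10 discrete channels of a 5.1.4 immersive bed.
# Labels (front L/R/C, LFE, surrounds, four "top" heights) are assumptions.

bed = ["L", "R", "C", "LFE", "Ls", "Rs"]  # the 5.1 base layer (5 + 1)
heights = ["Ltf", "Rtf", "Ltr", "Rtr"]    # the .4 height layer
channels = bed + heights

# 5 full-range bed + 1 LFE + 4 heights = 10 discrete channels
assert len(channels) == 5 + 1 + 4 == 10
print(channels)
```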
How has this UHD / HDR / WCG upgrade affected production teams? Does everyone have to learn new skills?
The UHD element, in the sense of its advanced resolution, does not necessarily change too much of how sports are covered. This is taken into account when thinking about how the story is going to be told. Of course, having more pixels and better picture quality helps operators be a little more creative in how they cover the picture. However, some elements of camera work are becoming a bit more sensitive. HDR and WCG can capture a much richer and broader set of colors and light levels, and camera operators should be aware of this.
Our coverage is unified. This means we use the same cameras that ultimately deliver UHD and HD content. The vast majority of the cameras are native UHD and most of them record in native HDR, HLG, which is our HDR format. We have more than 1,300 cameras from different brands.
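HLG, mentioned above as OBS's HDR format, is standardized in ITU-R BT.2100. As a rough illustration of what an HLG signal chain computes, here is a sketch of the HLG opto-electrical transfer function (OETF) using the constants published in BT.2100; it is illustrative only and in no way OBS's actual processing code:

```python
import math

# HLG OETF per ITU-R BT.2100: maps normalized linear scene light
# e in [0, 1] to a non-linear signal value in [0, 1].
# A, B, C are the constants published in the recommendation.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Square-root segment for dark scene light, log segment above 1/12."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C
```

The hybrid shape (gamma-like at the bottom, logarithmic at the top) is what lets an HLG signal remain watchable on SDR displays, which matters when the same cameras feed both UHD/HDR and HD outputs.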
What are your signal flows like?
Most of the cameras we use are UHD-ready, but not all. Some of the cameras may not be HDR-ready or may not be UHD-ready. In these cases, we do the conversion. However, the content must be delivered in HD
or UHD resolution. We have to be very careful in the conversion of this material. To achieve this, we have done a lot of internal development. We have explored how the output is visually controlled. We have introduced new layers of visual engineering and
control, plus layers of quality checking and adjustment to be sure that we are going to have excellent output in both UHD/HDR and, very importantly, HD/HDR. And this is significant because 90% of broadcasters are still distributing globally in this standard. However, we are very confident that
they are performing excellently. That has been a challenge.
What will be the future of OBS? How are you going to push the boundaries of the standards you mentioned earlier? OBS is always focused on improving and upgrading the infrastructure; how is it going to become even more important?
We are a sports production company with a mission to provide the best possible coverage for all the different sports that are part of the Olympic ecosystem. But
we are also the only host for many broadcasters who don't have the muscle to spend time and money planning for the upcoming games. That means they rely on us to provide a lot of technical services. In terms of innovation, we strive to innovate, both as a production company and as storytellers for the games. There is innovation that is purely technical, such as special cameras. There are all kinds of cameras developed for specific sports. These cameras are continually being developed because the way they are built ensures that they will capture that particular moment that is important for the sport they are going to cover. This part of the innovation will continue in the future. Improved technical services for broadcasters, such as UHD, and methods of sharing content, such as the cloud, will also continue. However, we have to be reasonably conservative. We are going to use this technology because we
want to be innovative, but on the other hand, we have a big responsibility as we do this to provide those very resilient and very solid services, technically speaking. We are always balancing these two areas because we can't afford to compromise really
important content to the global audience. Due to the pandemic situation, in just a few months we will have the Beijing Winter Games. In this next edition, we will not make a major change from what we have done in Tokyo. We will adjust our technology and learn
from this experience to adapt it so that it will be even safer when used in Beijing. Paris is three years later. We have a lot of ideas about Paris, but we are only beginning to consider what we should do next.
Is the IP infrastructure upgrade related to 5G coverage?
THE USE OF 5G FOR BROADCAST DEVELOPERS IS ALSO RELATED AND MADE POSSIBLE BY THE ADOPTION OF IP IN OUR INFRASTRUCTURE. THE FIRST THING WE DID IS THAT WE REPLACED OUR POST PRODUCTION WORKFLOWS GRADUALLY. WE HAVE ALREADY MADE THE TRANSITION.
The use of 5G for broadcast developers is also related to, and made possible by, the adoption of IP in our infrastructure. The first thing we did was to replace our post-production workflows gradually. We have already made that transition. What remains to be done is to migrate all the broadcast technology used in live coverage to IP. 5G comes in very handy because it is a much more powerful mobile technology than those we have been using in previous generations. It also brings two additional capabilities that are very good for broadcast: more
bandwidth and very low latency. This allows us to use 5G in additional workflows, especially now that we are already moving to IP, live workflows that were not possible before with 4G. At this stage, we have conducted a proof of concept ourselves with our partner Intel. For the Tokyo games, we have developed 5G coverage that has been used in the ceremonies predominantly and also in some other sports. We have used some specialized 5G wireless cameras and expect to use more of them in the future.
Will we see only wireless production equipment in the future?
I'm not sure it's going to happen. Wireless is a good solution because it gives freedom. But I don't think the day will come when we will stop using wires, because wires also have their own important advantages. Whatever you do, with 5G or anything like it, you're always going
to have less bandwidth available than if you use wired technology.
Why did you decide to make the OBS cloud public? Why not private? The main advantage of using this infrastructure is that it exists everywhere. Almost everyone can access it and it has a lot of capacity. We work with Alibaba and they also offer a wide variety of services. Alibaba has helped us a lot because it allows us to have direct access to their infrastructure and professional resources and services. On the other hand, if I do something in a private cloud, this is a solution just for me. That will just be restricted infrastructure, either within the transmission center here in Tokyo or in another center somewhere in the world.
In this particular case, we wanted to take advantage of the ubiquity of a public cloud, because one of the things we need to do as a host broadcaster is to create content and distribute it globally. And I can't replicate Alibaba's infrastructure, which provides cloud services to thirty regions around the world.
How has Artificial Intelligence been indispensable in these Olympic Games, and how will it be indispensable in the future?
AI is not new, but it has become quite a bit more interesting in recent years because it can greatly help broadcasting in various workflows. We have created over 9,000 hours of Olympic content, which is a combination of our live content along with additional post-produced content. The
biggest problem we have, once it has been produced, is knowing what has been produced. If this information is not available, nothing can be done with all this content. Because of this need, we have traditionally carried out a significant operation of logging this content during its creation. This operation required, and still requires, a lot of human effort. The technology comes with the hope that some of that will now be done by artificial intelligence. It can recognize what's going on in the video or audio and point it out: okay, that was a foul, for instance. On the one hand, that will free up some of our
need to use people to do it but, even more importantly, it can improve our labeling. We can't label everything, because we all have limited attention spans. However, if machines do it, the task is completed much faster and with much higher quality than if we had done it ourselves. There is another type of AI application that is starting to become a reality: we may come to no longer require the typical set-up with a director and a video operator to switch cameras in a production control room. Of course, we are far from doing it with the quality with which humans do it. But it could be done automatically thanks to this technology.
WE HAVE CREATED OVER 9,000 HOURS OF OLYMPIC CONTENT, WHICH IS A COMBINATION OF OUR LIVE CONTENT ALONG WITH ADDITIONAL POST-PRODUCED CONTENT.
For example, this AI-based automated production technology could be used in first-round tennis matches. We have been covering these matches on a delayed basis; we don't cover them live because we would need an additional amount of resources and that could be a problem. Potentially, we will be able to do that someday with AI-based automatic live switching. We have already tested it for the Youth Olympic Games.
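As a toy illustration of the idea (not OBS's actual system, whose internals are not public), automatic live switching can be reduced to cutting to the camera with the highest activity score, subject to a minimum shot duration and a margin so the output does not jitter between cameras. All names and thresholds below are invented:

```python
# Toy sketch of AI-assisted camera switching: every tick, an analysis
# stage scores each camera feed (here: synthetic numbers standing in
# for a model's "action intensity" output) and a director loop cuts to
# the best camera, but only if it clearly beats the one on air and a
# minimum shot duration has elapsed. Thresholds are illustrative.

MIN_SHOT_TICKS = 3   # never cut before a shot has lasted this long
MARGIN = 0.15        # new camera must be clearly better to justify a cut

def auto_switch(score_timeline):
    """score_timeline: list of per-tick dicts {camera_id: score}.
    Returns the list of camera_ids put on air, one per tick."""
    on_air = None
    shot_age = 0
    program = []
    for scores in score_timeline:
        best = max(scores, key=scores.get)
        if on_air is None:
            on_air, shot_age = best, 0
        elif (best != on_air
              and shot_age >= MIN_SHOT_TICKS
              and scores[best] >= scores[on_air] + MARGIN):
            on_air, shot_age = best, 0   # cut to the new camera
        else:
            shot_age += 1                # hold the current shot
        program.append(on_air)
    return program
```

The hold-off is what separates even this toy from naive argmax switching: a human director's main skill is knowing when *not* to cut.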
Has COVID-19 changed your workflows? In relation to the pandemic, we have had to be more efficient in bringing people in. There are more than 8,000 people working for the Olympics and the vast majority of them have been in Tokyo. Normally, we would need even more. We used technology to further improve the way we run our workflows. In this way, we have used even less staff than we would have needed locally. And this know-how has emerged because of COVID-19.
TOKYO 2020
WE HAVE MADE ADAPTATIONS IN OUR AUDIO PLANS KNOWING THAT WE ARE NOT GOING TO HAVE A CROWDED ENVIRONMENT. IN ADDITION, WE HAVE HAD TO PROVIDE TECHNICAL SOLUTIONS TO SOMEHOW BRING THE AUDIENCE INSIDE THE VENUE.
We also had to redesign several of our facilities. We had to change the layout of some of our studios, including the layout of some of our control areas, to remove some of the equipment that was inside OB vans, because we didn't want to have so many people inside those vehicles. We had to make a number of changes, partly operational and partly technical, to meet the constraints. In addition, of course, this edition of the Olympic Games has had a major challenge: we have not had any audience in attendance. This has meant that we have had to do several things to overcome this major handicap.
We have made adaptations in our audio plans knowing that we are not going to have a crowded environment. In addition, we have had to provide technical solutions to somehow bring the audience inside the venue. For example, we have helped the other broadcasters so that they can conduct remote interviews with the relevant athletes without sharing the same space. We have also launched a project called the "Fan Engagement Project," which allows audiences around the world to cheer remotely at matches, and for these cheers to appear at major events. Or even facilitate the interaction of athletes with their families
who have not been allowed to come to Japan. It is a pity but, precisely because of this situation, it has fallen on our shoulders to bring this content to all the local and international audience. Everyone has seen the games through television or the other digital media that have distributed this edition of the Olympic Games. There has been no other way to enjoy them.
Conclusion
Against all odds, an edition of the Olympic Games whose success was in doubt even a few hours before its official opening has become a complete triumph. Just as importantly, it has meant an injection of confidence and of capacity to overcome the challenges that the pandemic has posed to all of us. And, as we have been able to see, the broadcast technology has far exceeded expectations by bringing the best human qualities of elite sport to every home.
LITTLE BIG OLYMPIC GAMES
Doing more with less
The goal of the whole team concerned with the broadcast of the Olympic Games is to offer more content, in a better way, with more quality and covering many more participants. And, on top of this, to do it with the least amount of resources. The technology involved in the production and broadcast of these contents has contributed enormously to reducing the environmental impact that an event like this usually causes. An added incentive has been the Covid-19 pandemic, which we all had to go through. Times are changing, and the paradigm shift has shown how important it is to adapt to remote production and the technological capacity we have to do so. The pandemic has caused destruction, true. But it has also made us think outside the box and given us an opportunity to rethink and reshape our future. And, on this path, OBS is a pioneer.
OBS has long been trying to create the least impact for the planet and has endeavored to use the least amount of resources possible in each host city in which the Olympics have been organized. The plans to create the infrastructure and the means used for broadcasting are becoming less and less expensive in terms of the environmental burden they entail, while providing better broadcast quality.
The path that this organization has always followed requires optimization of the space and energy used -for which the large International Broadcasting Center (IBC) in Tokyo has been restructured- and, not least important, implementation of the benefits of the digital transformation, among which we have, for example, a new resolution standard: UHD/HDR. Therefore, we have prepared a comprehensive guide to all the innovations -both technological and structural- implemented by OBS in this edition of the Olympic Games, taking into account their improvements for the preservation of the planet.
UHD / HDR / 5.1.4
The innovations in the technological field that OBS has offered in this edition of Tokyo 2020 are many and very interesting. It is remarkable that we are about to experience another
transition like the one we went through a while ago with the change in resolution from SD to HD. OBS has proposed one more development, the next natural step for broadcasting: from HD to UHD. And it has not been necessary to dedicate an unprecedented infrastructure to this process; rather the opposite. Under the premise of offering more with fewer resources, OBS has allied itself with the technologies involved in communication, delivery and storage, such as 5G networks or cloud systems. Each UHD signal generates eight times more information than an HD signal. But, as if this amount of data were not enough, OBS has broadcast the first Olympic Games in high dynamic range (HDR) and immersive 5.1.4 audio. OBS has broadcast UHD signals combined with HDR and WCG (Wide Color Gamut) from 42 sports competitions, only
excluding the eight outdoor tennis courts, which have been broadcast through a conventional HD signal. To meet these needs, OBS has used a total of 31 mobile units (OB vans) and 22 fly-away systems. Regarding the capture of immersive sound, OBS has succeeded in letting viewers at home perceive the atmosphere of each sports event in a three-dimensional environment. To achieve this, OBS has expanded 5.1 sound by adding a third dimension through the creation of a technical ceiling featuring four hanging microphones whose height and direction could be adjusted for each occasion.
HD vs. UHD
Nearly all content that OBS has generated has been produced natively in UHD/HDR. But what happens with that ultra-definition signal when there is no infrastructure ready to broadcast it? As we have mentioned, we are facing a paradigm shift,
but we are still in an embryonic phase, and the infrastructure that would allow every willing broadcaster to carry such a signal has not yet been created. Likewise, OBS is also in the process of adapting: part of the cameras it has used to produce the Games are only capable of capturing an HD signal. For this reason, OBS has had to adapt to all standards. All signals have gone through a process of rescaling and duplication for broadcasting. The solution found by Olympic Broadcasting Services has been to create an adapted model so that each mobile unit (OB van) is capable of generating an HD 1080i SDR signal from the source UHD/HDR signals and from the other HD signals. In this way, the entire demand from the other broadcasters involved in the process has been met. This model is based on the creation of an entire Internet Protocol infrastructure.
Cloud solutions
The pandemic and new remote production processes are also changing the creation of content and its subsequent dissemination. Compared to Rio 2016, the connectivity of Tokyo 2020 has been ten times higher. This figure makes it clear that broadcasters are becoming more and more digital. OBS has not been left behind and, in order to facilitate remote operations for the broadcasters carrying Olympic content, it has created -in cooperation with Alibaba- a cloud specially designed for the heavy files generated by the production of the Games.
For associate broadcasters this is a turning point. Thanks to the implementation of cloud services, they have managed to reduce travel costs, the costs associated with the creation of an operations center in Tokyo and, even more importantly, the carbon burden associated with former ways of production.
5G Transmission
5G will offer wireless contribution with enough bandwidth to carry the signal in UHD. The 5G networks used in this edition of the Olympic Games handle user data rates of around 100 Mbps, with download speed peaks of up to 20 Gbps and latencies not exceeding one millisecond. These capabilities have allowed the signal to be transmitted with quality and efficiency thanks to such a low latency.
The initial 5G tests carried out by OBS were performed in PyeongChang 2018, and the plans of the broadcast service providers foresee the adoption of this technology again during the 2022 Winter Olympic Games to be held in Beijing, where it is estimated that an operational 5G network will already be in place.
Artificial Intelligence solutions
At some point in the
future, the effects of production on climate change will be reduced to practically zero through processes automated by means of Artificial Intelligence. For now, OBS has implemented in this edition of the Olympics solutions based on Artificial Intelligence for tracking and tagging tasks. These processes, especially tagging, have always required OBS to devote a huge human team to ensure that the images produced in each edition of the Olympics were correctly cataloged. This software solution has been named AMD (Automatic Media Description) and its mission is to automatically catalog the images by recognizing the elements that comprise them. In addition, OBS will introduce other systems, such as the BDF (Broadcast Data Feed), that will recognize the athletes contained in those images through an extensive library where their
physical and personal features are stored. Another piece of software taking part in these automation systems is the simultaneous transcription carried out once the signal is produced. But OBS is not content to leave it at that. In Beijing 2022, the broadcasting services will expand these processes to as many sports as possible. On the one hand, they will apply their Artificial Intelligence solutions to international workflows and, on the other, they will make this technology available to any broadcaster interested in acquiring broadcasting rights for the Olympic Games. Likewise, in its future plans, OBS will try to apply Artificial Intelligence to user customization processes: generating appropriate content for each viewer, creating automatic summaries through pattern detection and generating forecasts of the athletes' performances
through historical data, personal conditions or, simply, situational factors.
International Broadcast Center
OBS's technical facilities are located in the IBC, which, on this occasion, has been deployed on existing fairgrounds in Tokyo Bay: Tokyo Big Sight, one of the largest convention centers in the entire country. All the signals captured by the OBS cameras have been relayed to this central hub, and its workers have been operating 24/7 to make these signals go through the aforementioned processes and end up in the hands of the at least 80 broadcasters that have broadcast the Olympic Games. Time is of the essence, since all these processes must be executed in real time. For all these reasons, the Tokyo IBC has been, for 17 days, the largest content broadcast center in the world.
However, this large infrastructure deployment has not significantly increased the climate load associated with these means of production. A new model is sought for each edition of the Olympic Games: each site is unique, with different advantages and drawbacks. On this occasion, the Olympic Committee had a huge advantage: the city of Tokyo was able to provide the infrastructure. Existing facilities have been fitted out to house two large entities, OBS and the OCOG (the Organizing Committee for the Olympic Games). In this way, the environmental impact associated with constructing new facilities or housing these institutions in separate places is decreased.
Much more for much less
The Tokyo 2020 Olympic Games have brought the world more than 10,000 hours of content produced and distributed by OBS. This represents 30% more
than the content produced in the previous edition, the 2016 Olympic Games in Rio de Janeiro. More than 1,800 cameras and 3,600 microphones have generated around 290 video signals in HD and UHD resolutions and around 46 sound signals in stereo. 9,000 people have been part of all the processes in which OBS has been involved. The ongoing digital transformation of broadcast media has prompted OBS to multiply by 7 its international IP bandwidth, up to a total of 2.7 Tbps, used to send its signals over IP. In Tokyo 2020, OBS has used around 40,000 square meters of infrastructure at the IBC. However, when it comes to the IBC facilities, the carbon footprint generated by these Olympic Games has been 21% lower than in the previous edition. Regarding the footprint on the
environment created by the combination of production teams in each of the places where sports competitions have been held, it has been 24% lower than the environmental burden generated in Rio de Janeiro 2016. The digital transformation that OBS has carried out for this edition of the Games has been crucial in reducing the greenhouse gases generated by all the technical and human infrastructure necessary to produce a sports event of such magnitude. Process virtualization through remote access, improved connectivity, cloud solutions and updated technical equipment have brought Olympic Broadcasting Services closer to the goal of reducing the environmental footprint. A goal that will soon be achievable in the Olympic future and one that will mean offering much more for much less.
RTVE
By the RTVE technical team, in July 2021
New Olympic and Paralympic Games are about to begin: Tokyo 2020. And RTVE has the technical infrastructure ready to cover this world sporting event, with programming that will include 400 hours of first-run broadcasts on La 1, Teledeporte, RNE and the new digital platform, RTVE Play.
RTVE's technical deployment will be carried out thanks to the work of a small group of engineers and technicians who travelled to Tokyo and are already testing the equipment installed there, as well as to the work of many other technical professionals who, from the Barcelona (Sant Cugat) and Madrid (Torrespaña)
headquarters, will configure, operate and provide 24/7 support to the communications systems and all the broadcast and information technology equipment involved in the audiovisual production chain designed specifically for the Games. Part of the electronic equipment to be used was pre-installed and configured in Madrid
during the month of May, and in mid-June the equipment was sent to Japan by air. The remote set that has been set up in Tokyo, located opposite the Olympic Stadium, will be the main hub from which an important portion of La 1's programming will be presented. Remote production will be carried out from the A4 studio in Torrespaña with the cameras from the Tokyo set. Other Olympic programs broadcast on Teledeporte will be produced from the E3 studio at the Sant Cugat Production Center. Radio Nacional de España, from its studios at the Casa de la Radio in Prado del Rey, will focus on the Games in order to narrate live all the medals won by the Spanish athletes. And as usual, the rtve.es website will feature a multi-device offer for the Olympic Games. On this occasion, the Games will coincide with the launch of the new RTVE Play platform, an evolution building on RTVE's A la Carta platform. The transport of all the
signals and the multiple technical services between Tokyo and Madrid or Barcelona will be carried out through Nimbra nodes. These are linked by OBS on two separate 1G Ethernet (820 Mbps payload) bidirectional circuits up to their London and Frankfurt POPs. From those POPs to the RTVE center in Madrid, the 1G trunks are served by Eurovision Services (EBU) through their FINE network. They are not redundant circuits, since the services they carry are distributed between them. In the event of failure of one of the circuits, a redistribution of the priority services over the circuit that remains operational is pre-programmed.
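The pre-programmed redistribution can be pictured as a simple admission rule: each service carries a priority, and if one circuit fails, services are re-admitted onto the surviving circuit in priority order until its payload capacity is exhausted. The following sketch is illustrative only; the service names, priorities and bitrates are invented, not RTVE's real failover plan:

```python
# Hedged sketch of priority-based failover onto a single surviving
# circuit with roughly 820 Mbps of usable payload. Services are
# admitted in priority order (lower number = more important) until
# the capacity runs out; the rest are shed.

CIRCUIT_PAYLOAD_MBPS = 820

def failover_plan(services, capacity=CIRCUIT_PAYLOAD_MBPS):
    """services: list of (name, priority, mbps).
    Returns (kept, dropped) name lists for the surviving circuit."""
    kept, dropped, used = [], [], 0
    for name, prio, mbps in sorted(services, key=lambda s: s[1]):
        if used + mbps <= capacity:
            kept.append(name)
            used += mbps
        else:
            dropped.append(name)
    return kept, dropped

# Illustrative service mix (names and figures are assumptions):
services = [
    ("camera-1 (J2K)", 1, 150), ("camera-2 (J2K)", 1, 150),
    ("camera-3 (J2K)", 1, 150), ("PGM return (J2K)", 2, 150),
    ("intercom/data", 2, 40),   ("screens video (J2K)", 3, 150),
    ("broadcast return", 4, 100),
]
kept, dropped = failover_plan(services)
```

With these numbers, the lowest-priority broadcast return is the one shed when a single circuit must carry everything.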
The 15 unilateral MDS (Multi-channel Distribution Service) channels that OBS distributes encrypted via the Eutelsat 7 and Eutelsat 10 satellites will also be downlinked through the Torrespaña parabolic antennas. Once demodulated, all the signals necessary for production will be extended internally to Barcelona by RTVE's own contribution network.

As described below, RTVE's public service undertakes a complex technical operation through which the images, sounds and values of the Olympic spirit will be conveyed to the spectators. On top of this technical deployment, hopefully the presenters and editors of the programs will narrate many successes of our Olympians and Paralympians and the exciting ceremonies at which their medals are awarded.

Infrastructure installed at the Tokyo IBC
RTVE's MCR has been installed in a dedicated space at the IBC (International Broadcasting Center). From there, all the signals and technical exchange services involved in the operation of the Olympic Games are managed. The MCR will receive the OBS IP VandA 1.PK14 service with live signals from the ceremonies and competitions for all Olympic disciplines. Two 10 Gbps streams -main and redundant- will be received.

IP VandA is a stream of up to 48 multicast signals that meets the SMPTE 2022-2 and SMPTE 2022-7 video distribution standards. The HD 1080i/59.94 video signals are encoded in H.264 High@Level 4.1 at 22 Mbps with a GOP size of 32 frames; the international L/R sound is encoded in MPEG-1 Layer 2 (2/0) at 192 kbps, and the AAC-LC (5.1) audio at 512 kbps. All signals are multiplexed and encapsulated over RTP (Real-time Transport Protocol). The IP VandA distribution switches are managed by OBS through an SDN controller and the streams are pushed to the Appear X platform installed by RTVE in its own space at the IBC.
on both the main and reserve 10 Gbps fibers and reconstruct the transport stream in a totally seamless way under the SMPTE 2022-7 standard and, on the other, it allows the decoding -in an equally redundant manner, and dynamically chosen by the operator thanks to the intuitive graphical user interface offered by the X platform- of the 4 services corresponding to the competitions of interest at any given time. The decoded signals are delivered to the MAM card in the Nimbra node, which encodes them under J2K (JPEG 2000) and delivers them to the OBS circuits towards Spain.

RTVE's MCR at the Tokyo IBC

Although the Appear X platform has excellent redundancy features, the top priority is to ensure maximum availability of the images of the competitions. For this purpose, two Sapec Laguna octal processors have been connected at 10G, which in case of failure can take over the decoding of the IP VandA signals. The signals decoded by both Appear and Laguna
are delivered in HD-SDI baseband to a Utah Scientific HD-SDI matrix. This longstanding, proven matrix, which has already served in previous editions of the Games in Rio and London, routes all the bidirectional exchange signals between the MCR and Madrid and between the MCR and the remote set located opposite the
stadium to the Nimbra network. The matrix also interconnects a multiviewer for local monitoring of the signals and a measurement TOM for quality control. A 1G Ethernet link is provided, through which OBS connects the MCR with the so-called Announce Platform position contracted by RTVE inside
the Olympic stadium. Through this trunk, an ENG signal from the stadium will be received via IP and the PGM program return will be sent to it with the N-1 audio plus commands. For this, both the Announce Platform and the MCR have two Sapec Laguna processors that encode and decode the HD-SDI signals over IP bidirectionally, with the audio embedded, between both points. The MCR's Laguna is connected to the matrix for the routing of signals with Spain. An RTVE Nimbra node provides all the necessary communication services between Torrespaña, Sant Cugat and the MCR, and between the latter and the remote set opposite the stadium. The node features MAM cards for the Tx/Rx exchange of video signals encoded under J2K at 150 Mbps, cards dedicated to AES-EBU audio Tx/Rx and cards for internet services: linking of the Riedel Artist intercom nodes, IP telephony, corporate
services, digital newsroom, CIS (Commentator Information System), ODF (Olympic Data Feed) Management, etc. This node, through redundant 1G links, is connected to the remote set's Nimbra node and to the OBS Nimbra node for connections with the London and Frankfurt POPs.
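The SMPTE 2022-7 seamless protection used for the two 10 Gbps IP VandA streams boils down to receiving the same RTP packets over two paths and reconstructing a single stream from whichever copy arrives first. A simplified sketch of the merge (real receivers also bound it with a time-aligned buffer window, omitted here):

```python
# Simplified illustration of SMPTE 2022-7 seamless protection: the
# same RTP stream arrives over a main and a reserve path; the receiver
# merges them by RTP sequence number, keeping the first copy of each
# packet and discarding duplicates, so the loss of either path (or of
# individual packets on one path) is invisible downstream.

def seamless_merge(main_path, reserve_path):
    """Each path is a list of (rtp_seq, payload) tuples, possibly with
    gaps. Returns payloads in sequence order, deduplicated."""
    seen = {}
    for seq, payload in main_path + reserve_path:
        seen.setdefault(seq, payload)   # first copy of a sequence wins
    return [seen[seq] for seq in sorted(seen)]

# Path A lost packet 2 and path B lost packet 4 -- the merge is still
# complete, which is the whole point of 2022-7 protection:
path_a = [(1, "p1"), (3, "p3"), (4, "p4")]
path_b = [(1, "p1"), (2, "p2"), (3, "p3")]
```

(The sketch ignores RTP sequence-number wraparound at 65535, which a real implementation must handle.)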
Infrastructure installed at the remote set
For the presentation of broadcasts from Tokyo, RTVE has set up a set there with a large window offering a direct view of the Olympic stadium, and LED interior lighting. To control outdoor light exposure and deliver the best image quality, Rosco View polarizing filter systems have been fitted to the window and the set cameras. By rotating the filter on the camera, the degree of cross-polarization between the two is changed and the backlight is balanced with the lighting of the set. The use of these filters only
entails a reduction of one diaphragm stop. The programs in Tokyo will be carried out as a remote production from studio A4 in Torrespaña. Four Grass Valley LDX 82 Premiere cameras (one of them as a backup) have been arranged at the Tokyo set, with their XCU Worldcam base stations; they will be permanently installed in RTVE's Castilla-La Mancha territorial center after the Games. These cameras can work in HD 1080i/50 format, since the organizers of the Games provide a 230V/50Hz power supply for the set. A Calrec Brio audio mixer and an IP system for the Autoscript teleprompter have also been fitted. These resources belong to the Prado del Rey studios. On the set, the CUE is connected via Ethernet to a switch that in turn is connected to Nimbra with a bandwidth of 20 Mbps. The presenters' tablets are also connected to Nimbra since, although they run
via WiFi, the access point is directly connected to the Nimbra. In this way the need for a VPN is avoided. WiFi internet connectivity would be a backup, like the 4G router.
The installation is completed by a Riedel Artist intercom node for N-1 audio communications, commands and GPIs for camera tally, plus wired or Phonak WisyCom in-ear devices for the presenters. On the set, large-format LED monitors will show image content sent from Torrespaña. Through the Calrec console, a sound operator manages the signals received from the Sennheiser 6000 digital wireless microphones, the N-1 feeds and orders that arrive from Spain through Artist for the presenters' listening, the set's PA system, the sending of news and sports audio to the embedders that insert it into the cameras' HD-SDI signal, and the backup sends to Spain from the set microphones
RTVE's remote set in Tokyo located opposite the Olympic stadium and electronic equipment installed on the set
through the AES-EBU cards in the local Nimbra node. The Nimbra node in the set has MAM, AES-EBU and Ethernet cards; it supplies all the necessary services for the production and the network trunk of the Artist intercom node. Through
this Nimbra, the set's screen signals and the CIS are received; a video signal is exchanged with the IBC; and the signals from the cameras are sent directly to Madrid, along with the return of the PGM program from the Torrespaña studio and the broadcast returns of La 1
and Teledeporte. Camera signals are encoded under J2K at 150 Mbps and sent to the Madrid studio with very low latency. The program return and the videos for the set's screens reach the set also encoded in J2K at 150 Mbps. Broadcast and Announce Platform returns, which do not require as much quality, arrive encoded at 100 Mbps. Additionally, in order to guarantee the continuity of the program from Tokyo in the face of any issue in the OBS or EBU circuits, a Net Insight Nimbra VA 225 encoder has been installed as a backup to send the 3 camera signals, with the presenters' audio embedded, encoded in H.264 at 8 Mbps and channeled to Spain through a dedicated 20 Mbps FTTH circuit. This encoder is integrated into the Nimbra Vision system deployed on RTVE's contribution network, through which the device and the services assigned to it can be managed and remotely monitored.
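The N-1 (mix-minus) feeds mentioned above follow one simple rule: each remote point hears the full mix minus its own contribution, which is what prevents presenters from hearing themselves back with a round-trip delay. A minimal sketch of the principle, with invented source names and integer sample values standing in for audio:

```python
# Minimal sketch of N-1 (mix-minus) feed generation, the principle
# behind the returns sent to the Tokyo set and the Announce Platform:
# each destination receives the sum of every source except itself.
# Source names and sample values are illustrative only.

def n_minus_one(sources):
    """sources: dict {name: [sample values]}. Returns one mix-minus
    feed per source: the full mix with that source subtracted."""
    length = len(next(iter(sources.values())))
    full_mix = [sum(s[i] for s in sources.values()) for i in range(length)]
    return {name: [full_mix[i] - s[i] for i in range(length)]
            for name, s in sources.items()}

sources = {
    "tokyo_set":  [2, 2, 2],   # integers keep the example exact
    "madrid_a4":  [5, 0, 5],
    "commentary": [0, 3, 0],
}
feeds = n_minus_one(sources)
# tokyo_set's return contains Madrid and commentary, but not itself
```

Computing the full mix once and subtracting each source is the standard trick: it scales linearly with the number of destinations instead of re-summing N-1 sources per feed.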
For the presenters at the set, a small 3-computer newsroom has been set up and connected to the iNews systems in Madrid and Barcelona. A network printer is also available. And the teleprompter is connected to iNews, so scripts can be written into the rundowns as if the team were in TVE's newsroom. They will even be able to insert captions into the rundown's news items. The set contains
3 Surface Go 2 tablets connected in the same way as the computers, so that the presenters can follow the rundown of the program or the news broadcast. Set connectivity is carried out via the internet, with access to AVID's production network via VPN tunnels. There is also a 4G router as a backup connection in case of internet service failure.
Infrastructure dedicated to the Olympic Games in Torrespaña and Sant Cugat
As indicated in the introduction to this article, Torrespaña is responsible for downloading the OBS MDS transmissions from the satellites. These are demodulated by Novelsat NS2000 equipment featuring temporary DRM licenses for signal
decryption. Each demodulator delivers an MPEG transport stream with three services in ASI format. By means of 5 demodulators, as many ASI signals are obtained and shared internally by Nimbra with Sant Cugat. Both Madrid and Barcelona have H.264 MPEG-4 4.2.2 decoders to obtain the Olympic signals in baseband video.
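Each of those transport streams carries three services, which a receiver enumerates from the stream's PAT (Program Association Table) on PID 0, as defined by MPEG-2 Systems (ISO/IEC 13818-1). A self-contained sketch that synthesizes a minimal 188-byte TS packet carrying a PAT for three programs and parses it back (the CRC32 field is zeroed and not verified, which a real parser would do):

```python
# Hedged sketch: enumerate the services in an MPEG transport stream by
# parsing the PAT (PID 0), as a receiver does after demodulating one
# of the MDS carriers. We synthesize a single TS packet here; program
# numbers and PMT PIDs are illustrative.

def build_pat_packet(programs):
    """programs: list of (program_number, pmt_pid). Returns a 188-byte
    TS packet on PID 0 carrying the PAT."""
    body = bytearray()
    body += (1).to_bytes(2, "big")          # transport_stream_id
    body += bytes([0xC1, 0x00, 0x00])       # version 0/current, section 0 of 0
    for num, pid in programs:
        body += num.to_bytes(2, "big")
        body += bytes([0xE0 | (pid >> 8), pid & 0xFF])
    body += bytes(4)                        # CRC32 placeholder (not computed)
    sec = bytes([0x00,                      # table_id: PAT
                 0xB0 | (len(body) >> 8), len(body) & 0xFF]) + body
    pkt = bytes([0x47, 0x40, 0x00, 0x10, 0x00]) + sec  # PID 0, PUSI, pointer 0
    return pkt + b"\xff" * (188 - len(pkt))            # stuffing bytes

def parse_pat(pkt):
    """Returns {program_number: pmt_pid} from a PAT-bearing TS packet."""
    assert pkt[0] == 0x47 and ((pkt[1] & 0x1F) << 8 | pkt[2]) == 0
    payload = pkt[5 + pkt[4]:]              # skip TS header + pointer_field
    section_length = (payload[1] & 0x0F) << 8 | payload[2]
    loop = payload[8:3 + section_length - 4]    # program loop, minus CRC
    return {int.from_bytes(loop[i:i + 2], "big"):
            (loop[i + 2] & 0x1F) << 8 | loop[i + 3]
            for i in range(0, len(loop), 4)}
```

Running the round trip on three programs reproduces the "three services per stream" layout the demodulators deliver.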
On the other hand, the circuits of the FINE network reach the EBU's Nimbra node in Torrespaña. The two 1G circuits of this node are connected to RTVE's Nimbra node in International Control. From there, all the transported services are managed by the RTVE technical staff in charge. The IP VandA signals selected at the Tokyo IBC through the Appear platform arrive in their original HD 1080i/59.94 format. To convert the frame rate to the European 1080i/50 format, the signals go through high-quality converters running the Grass Valley Alchemist motion-compensation algorithm and are then sent to a matrix for distribution to the various internal destinations. The signals from the cameras at the set are also received; as they work natively in 1080i/50, they do not require any conversion and are sent directly to the mixer at the A4 studio.
Signals coming from Tokyo with international sound (either through the MDS or through Nimbra) pass through several commentary booths where the specialists comment on them, thus generating a video signal with commentary in Spanish. Three commentary booths are planned in Torrespaña and another four in Sant Cugat. The respective central controls in Madrid and Barcelona will be in charge of combining the international-sound signals with the voice-overs, generating signals with embedded audio ready to be delivered to the studios or for feed, depending on broadcasting needs. At the Torrespaña and Sant Cugat controls, signals will also be received from up to 6 Ikusnet IP transmitters from Prodys taken to
Japan for news coverage at the various Olympic venues. Each ENG transmitter has 6 3G/4G modems. The signal from an ENG camera is encoded in H.264 and sent live via IP or stored as a file and then sent to Torrespaña via FTP.
All the aforementioned signals will be available both for feed and for use at the radio studios in Prado del Rey, for production of rtve.es, for work at the digital newsrooms and for production of the relevant programs at the respective studios: A4 in Torrespaña and E3 in Sant Cugat. Specifically, in the A4 studio, which will carry out the remote production, the provision of audio de-embedders has been expanded, so that the set of signals it must handle is distributed as follows:

- 3 cameras from the Tokyo set, two of them with embedded audio
- 6 commented sports signals
- 2 signals from the A2 Studio (Daily News) in Torrespaña, to be sent** to the monitors at the Tokyo set
- 2 signals from the E3 Studio in Barcelona, to be sent* to the monitors at the Tokyo set
- 4 signals for booth feed in Torrespaña, to broadcast replays and recordings
- 3 signals from the Digital Newsroom, for broadcasting news edited by sports journalists
- 2 signals for 4G backpacks taken to Tokyo

* when the E3 studio in Sant Cugat carries out most of the broadcast
** during the broadcast of the news
Torrespaña's A4 studio has temporarily expanded its technical equipment to facilitate these tasks. In addition to the aforementioned de-embedders, there is an extension from 12 to 16 lines from the central control matrix, 8 additional inputs in the video mixer and multiscreen, a new character generator with features that will be described later, and the generation of a series of
different PGM output signals from the studio, depending on the destination: return PGMs (different for Tokyo, for Studio A2 and for Studio E3) that are added to the usual PGM signal, plus the usual spare PGM circuits. Additionally, since production will be carried out remotely, the studio also has computer platforms for bringing remote guests into programs through the Skype and vMix applications, which had already been widely used during the toughest stages of the pandemic. Due to the time difference with Japan, the bulk of the sporting activity will take place during the Spanish early morning and midday. Teledeporte and La 1 will allocate the events between them in order to offer the greatest number of live broadcasts. Teledeporte will have a 24-hour broadcast with Olympic content, whether live or recorded. And La 1 will simultaneously broadcast live competitions from 0 am to 3 pm, the time at which
Daily News 1 will begin, which will also include a significant amount of Olympic and Paralympic information. To manage the broadcast on both channels, cooperation among the A2 and A4 studios in Torrespaña and the E3 studio in Sant Cugat will be required, as well as that of the Tokyo set. In all of them, the
production of the Olympics will be made in 5.1 audio and will be broadcast in this format on La 1 HD and the TDP channel. Torrespaña's A4 studio will centralize most of these operations, switching between different work modes according to the time slot:
- From midnight to 9 am it will run the Teledeporte channel with live broadcasts.
- From 9 am to 3 pm it will run the broadcast of La1, in which it will include the Tokyo set, with all its resources and live presenters.
- From 3 pm to 4 pm the broadcast of La1 goes to studio A2 in Torrespaña, but studio A4 remains operational to carry out the production of the Tokyo set during the broadcast of the daily news. At this time, the Tokyo set screens will receive signal from studio A2, routed from studio A4.
- From 4 pm to 5 pm, a summary of the day will be broadcast on La1 from the E3 studio in Sant Cugat. For this broadcast, the A4 studio will continue to operate the remote production of the Tokyo set as during the news broadcast, in this case routing the signals from the E3 set's screens to the Tokyo set.

(Figure 1)

The E3 studio in Sant Cugat, recently renovated and equipped with modern production technology over IP based on the ST 2110 and ST 2022-7 standards, will feed the continuities of La 1 and TDP in different time slots. For the occasion, a new set is inaugurated, which includes large-format LED videowall screens that will display what is happening. It will have a supply of 5 cameras with tripod, 1 crane, 2 mini cameras and 1 robotic camera. The following signals will reach this studio:
- 7 signals of commented sports
- 1 signal from Torrespaña's A4 studio, with the signal produced from the Tokyo set
- 6 signals from 4G backpacks deployed in Tokyo
- An EVS IPDirector system for broadcasting replays and recordings
- 8 signals from the Digital Newsroom, for broadcasting news edited by sports journalists

To label the programs, new templates have been designed specifically for the Olympic Games. In the case of the broadcasts made from the A4 studio in Torrespaña, these templates are integrated with the iNews system and the Chyron CAMIO server. In the production stage, labels are inserted into iNews stories by using the LUCI plugin and the CAMIO system (integrated with iNews). The newsroom only has to select the Olympics context in CAMIO through the LUCI plugin and drag
the label to the relevant piece of news. The entire system is conveniently configured so that the loading of labels is carried out on a Lyric X device. At the broadcast stage, an ISQ label automation system has been deployed that will receive the list of labels from iNews and will control the Lyric X in order to air them. In the case of Barcelona, the labeling solution is similar, but based on the Avid Graphics (Orad) system; that is, the integration is with Maestro/Command. The labels are entered through iNews but using the Orad plugin, and at the broadcast stage they are launched from the Maestro client. Intercommunication of the entire operation is fast, since all RTVE facilities have the Riedel Artist intercom matrix system. The intercom installation is based on trunking, which enables communication between different venues through the nodes of the Nimbra contribution network. The link among venues is
made up of several IP audio channels. In this way, the three venues (Sant Cugat, Torrespaña and the Tokyo set) will be united. This makes it possible to properly manage coordination communications, orders and audio returns for all the intercom panels, cameras and in-ear devices installed in the multiple workstations involved in the operation of the Games: studios, central controls, MCR and Tokyo set, presenter booths, etc. All nodes are interconnected via VoIP through RTVE's Nimbra contribution network which, as already indicated, also carries Tally and GPI/O services. Connect IP devices allow VoIP transport from the commentator positions at the different Olympic venues. In the case of the IBC, as it only needs two intercom panels for coordination, the connection is made through a Riedel Connect Duo device that allows IP transport of a
communications position. Figure 1 summarizes the entire network set up to service the operation that RTVE's technical areas have designed for the Tokyo Olympic Games.
RTVE technical operation on the radio

Due to the restrictions put in place for these Games, the number of people who can travel to Japan is very limited, so the better part of the programs will be carried out from the Casa de la Radio. To achieve this, editors will have the live video signals of the various sporting events in their booths. These signals reach them internally from Torrespaña through the Nimbra contribution network, over which 8 HD-SDI video channels have been set up with the competitions of interest managed by Torrespaña. The signals will be distributed by means of a KROMA video matrix and by Ethernet cabling (in HDBaseT) to the main
studios and recording booths of the Casa de la Radio. During the broadcast of Olympic Games special programs, commentators will have, on the television sets in the studios, the video signals of all the events being held at that time, so both broadcast and comments will always be performed in real time. 6 editors and 2 sound technicians are in Japan for radio services. The technicians will be mainly in charge of the production of the programs, audio editing and technical support to the editors. However, due to the special circumstances surrounding these Games, it has been considered essential to give the editors greater freedom of action. For this reason, backpacks have been prepared so that they can easily transport the technical means necessary for conducting interviews and live interventions from the different venues. Each of them will have a laptop,
prepared for audio editing and recording with the Dalet production system, a Quantum Lite audio encoder for live interventions via 4G or WiFi, as well as a mobile phone, microphones and a pole for interviews. The aim is to ensure that, with a minimum number of people allocated to the venues and taking into account the mobility limitations they are going to face, information coverage as close as possible to that of previous Olympic Games can be achieved.
Technical operation on the web rtve.es
Access page for the recently launched RTVE Play VOD platform
The production of up to 400 hours of live broadcasts for these Games makes it possible to form a package that includes the broadcasts of La1, Teledeporte and up to four additional live channels, which will make it possible to offer up to six simultaneous live shows of sports events through rtve.es. The broadcast of thematic channels during the Games is also being considered. The RTVE Play video-on-demand platform has just released its trial version and will be improved based on user experience before its official launch in autumn. Digital coverage of the Olympic Games will reach almost all digital video consumer platforms, with special emphasis on
YouTube, where summaries will be offered with the highlights of each day's competition. Technical operation of the website can be split into four different processes:

- Contribution: from the Torrespaña control, signals are distributed in HD-SDI quality.
- Multi-quality transcoding and storage in segments, from 5 s and at up to 10 Mb/s in Full-HD quality.
- Packaging and broadcast of the live signal through CDNs (Content Delivery Networks).
- Cutting and publishing of the VOD recordings.

For live production and for the catch-up, the HD-SDI signals distributed from the Torrespaña control are used. These processes serve the real-time broadcast of these live recordings as well as subclipping (creating cuts directly from the live show) for the newsroom. The signals are transcoded into Transport Stream under the HLS protocol, in 5-second segments that reduce latency and allow the information to be stored with maximum quality. The packaging, which is mainly done in HLS format, contains four different quality standards: 1080p, 720p, 576p and 360p, all of them in perfectly aligned MP4 format. A DASH format is also generated, mainly to meet the needs of connected TVs and HbbTV.
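As an illustration of the packaging stage described above, this is roughly what generating the four-rendition HLS master playlist could look like. The bitrates and rendition URIs here are invented for illustration; they are not RTVE's actual configuration.

```python
# Illustrative sketch: build an HLS master playlist for the four-rendition
# ladder described in the text (1080p/720p/576p/360p). The bandwidth values
# and URIs are hypothetical examples.
RENDITIONS = [
    # (height, width, illustrative bandwidth in bits/s)
    (1080, 1920, 5_000_000),
    (720, 1280, 3_000_000),
    (576, 1024, 1_500_000),
    (360, 640, 800_000),
]

def master_playlist(renditions):
    lines = ["#EXTM3U", "#EXT-X-VERSION:4"]
    for height, width, bandwidth in renditions:
        lines.append(
            f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={width}x{height}"
        )
        lines.append(f"video_{height}p.m3u8")  # hypothetical media playlist URI
    return "\n".join(lines) + "\n"

print(master_playlist(RENDITIONS))
```

Each variant points to a media playlist of the 5-second segments mentioned above; the player picks a rendition according to available bandwidth.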
Two pairs of stereo audio are used to manage the audio signals: the distributed HD-SDI signals carry two audio pairs, the first with a mix of the ambient sound of the event plus the commentators' speech, and the second with ambient sound only. This allows a single recording with both types of sound, so that at a later stage an editor, through his own editing tools, can generate pieces with the required audio. For the online publication of complete events (full competitions), the announcers' commentary is needed, but for short videos only the ambient sound is used in all instances, since otherwise the end user could become confused. For longer summaries, specific commentary may be added. Finally, these pieces are uploaded to the rtve.es CMS to be made available to users. From a more technical perspective, production of the live show has the following work scheme:
The HD-SDI signals distributed from the Torrespaña control are injected into two systems: a Makito (Haivision) encoder farm and the Golumito platform from the company Golumi.
The Golumito system generates the 5-second segments with the required quality levels. It also generates the HLS and DASH playlists in four qualities, and publishes all the information to the CDN's NetStorage (Akamai).
Live images are generated every 10 seconds for distribution of live thumbnails.
The Golumito system generates three versions of encapsulation in real time:
- Five-segment encapsulation for low-capacity devices.
- DVR encapsulation (including segments from the last three hours).
- StartOver encapsulation.
For this, each live show taking place on each signal is uniquely identified, and its specific playlists are then generated. Regarding the VOD (catch-up), the process is performed as follows:
The editor connects to Golumi's Golumito system, selects the portion of the recording to use, and uploads it to a video environment. The pieces intended for export, as well as the audio band (voice-over or ambient), are chosen and finally exported to the CMS.
The Golumito system composes from the selected segments a package with MPEG4 media, the metadata and the audios. As Golumito actively listens to events coming from TVE's continuities (Harris), it includes the metadata that it sends plus any additional data chosen by the editors, as well as other stored audios such as audio description, original version, etc.
The packaging is fed into the CMS through RTVE's Ingester system.
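A minimal sketch of how the three live encapsulation modes and the VOD cut described above can be derived from a single rolling store of 5-second segments. Apart from the segment length and the three-hour DVR window stated in the text, every name and number here is invented for illustration.

```python
# Sketch (with invented names) of the three live encapsulation modes,
# plus a catch-up cut, over a rolling list of 5-second segments.
SEGMENT_SECONDS = 5
DVR_WINDOW_SECONDS = 3 * 3600  # the DVR playlist keeps the last three hours

def live_window(segments, count=5):
    """Short sliding window for low-capacity devices (last `count` segments)."""
    return segments[-count:]

def dvr_window(segments):
    """DVR window: segments from the last three hours."""
    keep = DVR_WINDOW_SECONDS // SEGMENT_SECONDS
    return segments[-keep:]

def start_over(segments, event_start_index):
    """StartOver: every segment since the identified live show began."""
    return segments[event_start_index:]

def vod_cut(segments, first, last):
    """Catch-up: the editor selects a portion of the recording to package."""
    return segments[first:last + 1]

# Hypothetical segment names for a running live channel.
segments = [f"seg_{i:06d}.mp4" for i in range(4000)]
assert len(live_window(segments)) == 5
assert len(dvr_window(segments)) == 2160  # 3 h at 5 s per segment
```

The key point is that the same segment store feeds all variants; only the playlist window differs.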
For the distribution in CDNs, an on-the-fly segmenter, Unified Streaming's mp4split, is used, and the segments are persisted in the distribution CDN (Fastly or Level3). In the specific case of HbbTV, for compatibility with TV receivers prior to the 1.5 standard, the Golumito system generates for each live show a stream in RTMP format that is fed into an external provider (Cinfo), which generates and distributes a Transport Stream from the source signal to end users. In addition to the above-described main management system, RTVE has a second backup ingest subsystem, the Cires21 LMS. It uses the same HD-SDI signals through the Makito encoder farm, whose multicast IP signals are captured by the LMS to broadcast to the backup endpoints. For this Olympic event, RNE is going to make a special effort to report from smart speakers, especially on Alexa. Two years ago, RNE launched the RNE skill on Alexa, in
which you can consume the content that RNE produces, this being the product from which the Games can be followed. A workflow has been created to guarantee the immediate publication of the content so produced, integrating the radio's Dalet production system with the CMS of rtve.es. In Dalet, an export procedure has been created by which the content selected by the editor passes to an Interactives FTP system. This integrates an active listening system that, when faced with new content, launches a technical ingest and normalization process. The rtve.es ingest system carries out an audio normalization by adjusting the maximum and minimum thresholds (the difference between maximum and minimum audio levels, measured in decibels), to later encode the normalized audio in mp3 at a 128 kbps bit rate. Then the media and its metadata, the title of the
audio and the category of the program, are uploaded to the interactive CMS. Finally, from the interactive CMS, the audio is published through a REST API and the mp3 is published through a CDN (Fastly). For its part, RNE's Alexa skill organizes the contents into three sections: agenda, Spain's summary and summary of the day. Technically, the skill has been implemented by using AWS Lambda functions that acquire the metadata using the REST API of rtve.es and, if the user so requires, content is played by accessing the media published on the CDN.
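The level-adjustment step in that ingest chain can be sketched as follows. The -1 dBFS target and the sample values are assumptions for illustration, and the actual mp3 encoding at 128 kbps would be handled by a separate encoder (e.g. LAME), outside this sketch.

```python
import math

# Sketch of peak-level normalization: measure the peak of the audio in
# dBFS and apply the gain that brings it to a target threshold.
# The -1 dBFS target is an assumed value for illustration.
TARGET_PEAK_DBFS = -1.0

def peak_dbfs(samples):
    """Peak level of float samples (range -1.0..1.0) in dBFS."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

def normalize(samples):
    """Apply the gain that moves the measured peak to the target level."""
    gain_db = TARGET_PEAK_DBFS - peak_dbfs(samples)
    gain = 10 ** (gain_db / 20)
    return [s * gain for s in samples]

quiet = [0.1, -0.05, 0.08, -0.1]  # peaks at -20 dBFS
loud = normalize(quiet)
assert abs(peak_dbfs(loud) - TARGET_PEAK_DBFS) < 1e-6
```

Real loudness workflows often normalize to an integrated loudness target rather than a simple peak, but the gain arithmetic is the same idea.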
TECHNOLOGY
The boom in the broadcast and live environments
PTZ CAMERAS
By CARLOS MEDINA Audiovisual Technology Expert and Advisor
TV programs such as Big Brother, Survivors or Love Island, theaters of all kinds, religious sites, conference rooms and numerous events/concerts with multi-camera recordings have something in common: the use of PTZ cameras. This type of camera has achieved the recognition of audiovisual professionals in the broadcast environment and in multi-camera coverage of live events thanks to the technology they feature and to the enormous benefits they offer nowadays.
But it hasn’t always been this way. This type of camera originated in the field of security and video surveillance: cameras for closed-circuit TV (CCTV), which only offered the ability to capture from a single point of view, determined by the physical location of the device with respect to the space to be displayed. The result of these initial cameras was frames on a fixed plane, without camera movements, lacking in image quality and with little visual angle. And yet they were key for video surveillance work due to their multiple and versatile placement options, given their compact size and their very simple operation and configuration. Not to forget the ability to have real-time images of what is happening and the option of having recorded material 24/7 at a low cost, replacing the expense of security personnel.
A curious fact: the first documented use of a closed-circuit TV camera system was carried out by the German army, thanks to Siemens, in 1942. These were very basic black-and-white systems and were used for missile test observation in preparation for long-distance military strikes. At present, after the different historical episodes that international terrorism has left us and the continuous dissemination of news on robberies and thefts, security and video surveillance cameras are all
around us in our daily lives: from those used in traffic, on streets and squares, to those in banks, stores and shopping centers (the retail sector), up to the most private use in houses and homes. At the present time, the technology that has been developed is very sophisticated in order to obtain excellent results in video surveillance matters: night vision, sound activation, thermal imaging cameras with automatic tracking and face detection, among other features.
The use of PTZ cameras in the broadcast TV environment and live events has been the result of two circumstances. In the first place, the inception of television content based on reality television, whose landmark audiovisual reference is Big Brother, born in September 1997 as an original idea of the John de Mol Produkties (Endemol) production company. This type of program saw the need to “watch” the contestants day and night through cameras that would go unnoticed in the contestants' home. Secondly, audiovisual productions are being increasingly enriched with points of view from numerous cameras, with the aim of creating spectacle at live events.
Canon CR-X500
Audiovisual content that can be enjoyed on various large-format displays by the attendees of the event. Live events have to make the most of the space within the stages and manage tight budgets, as well as hire the right human team to operate the different cameras. In this historical, social and economic context, PTZ (an acronym for pan-tilt-zoom) cameras became a major asset in television programming, entertainment and live shows. A PTZ is a remotely-controlled video camera that is compact in size (approximately 158.4 mm wide x 177.5 mm high x 200.2 mm deep), lightweight (1.5 kg to 5 kg) and has a wide range of framing possibilities thanks to its smooth and silent PAN (P), TILT (T) and ZOOM (Z) movements. PAN is a horizontal movement of the camera body (from left to right or vice versa) on its own central axis and without physical displacement of the camera. TILT shares the same characteristics but is a movement of the camera body in a vertical direction (from top to bottom or vice versa). And ZOOM is the internal movement of the lenses within the camera optics that allows us to have different focal lengths without changing the lens, thus being able to go from a wide angle to a normal focal length or to a telephoto view (various combinations that depend on the type and design made by the manufacturer of the broadcast optics). Therefore, PTZ cameras are not the same thing as IP cameras, POV cameras, bullet cameras, robotic cameras, PoE (Power over Ethernet) cameras, built-in webcams, standalone webcams, pencil cameras or action cameras, although they all share quite a few technical specifications and protocols. It is now time to learn a little more about PTZ cameras. Let's see their specifications in aspects such as the cameras themselves and lenses, video format, system requirements, interface,
add-ons and accessories required. High-end PTZ cameras in the audiovisual industry have the same features and operations as any other type of camera used in this sector (an ENG (Electronic News Gathering) camera, for instance), as follows: front and rear tally, iris control, focus control (including face detection and tracking), ND filter, color bars, scene files, optical image stabilizer (OIS), electronic image stabilization (EIS), changes in shutter speed, gain control, white balance, gamma, knee and detail settings, synchro scan and variable frame rate (VFR), among others. We can even find camera models that support HDR (Canon CR-X500) and both BT.709 and BT.2020 color spaces. PTZs are a great integrated solution, without complications when it comes to cabling and without added technical elements (such as deploying robotics for camera movements), with
Panasonic AW-UE100
excellent size and weight. They can be operated with ceiling/wall mount, tabletop or tripod, with an 'image rotation' function that automatically ensures
the correct output orientation in any installation environment. Depending on the construction materials and the protection index of the
base and camera body housings, we can find PTZ cameras for indoors and outdoors that are able to withstand different weather conditions (the Canon CR-X500, with an IP55 rating, works from -15°C to +40°C and at up to 90% humidity).

Control and camera setup operations are carried out remotely by an operator by means of an IR remote control and, above all, with the help of a joystick controller and/or specific software (LiveU's IP Pipe or Panasonic's PTZ Control Center, for instance) with which one or more PTZ cameras can be operated. The Sony BRC-H900 PTZ allows operating large-scale systems of up to 112 cameras controlled over standard IP networks by adding the BRBK-IP10 IP remote control card and the RM-IP10 remote controller.

Regarding the technical parameters of PTZs, it is worth mentioning that at present those with a single camera sensor, usually CMOS (type 1.0, type 1/2.5 or type 1/2.3), are the most common; they allow support and sharing of FHD-resolution video signals, although there are already PTZ cameras in UHD and 4K on the market. As we have already mentioned, every PTZ camera features a fixed lens with variable focal length (zoom), with its own optical movements and offering different focal lengths (some models providing 24x, like the Sony BRC-H800, or even 30x, like the Minrray UV950AS), and some with the possibility of digital zoom.

AVer PTZ330 30X
When choosing a PTZ, it is important to know its rotation and tilt angle range and its speed of movement. These ranges are measured in degrees and the speeds in degrees per second: for example, a panning range of ±175°, or a tilt speed of between 5° and 300° per second. In this sense, most manufacturers of this type of camera offer what is known as preset positions, allowing us to mark and recall the PAN, TILT and ZOOM positioning of the camera with regard to the event or content to be captured.
For example, the AVer PTZ330 model gives us the possibility of 255 different preset locations. One of the reasons that has driven the rise and success of PTZs in audiovisual and live production is their connectivity, both in the transmission of video and audio signals, even of several signals simultaneously (12G-SDI, 3G-SDI, HDMI, USB 3.0, IP streaming and CVBS), and with regard to control and communication protocols. At present, virtually every PTZ camera is suitable for inclusion in complex configurations under VISCA functionality, through RS-232 and RS-422 ports and over IP (RJ45), or supports NDI|HX, RTP/RTSP/RTMP/SRTP, TCP/IP, UDP/IP, HTTP, HTTPS, FTP, DHCPv6, DNS, NTP, ICMPv6 (MLD), RTSP over TCP, RTSP over HTTP, SSL (TLS), or Multicast/Unicast standards, like the Panasonic AW-UE100 model.
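The VISCA functionality just mentioned ultimately comes down to short byte sequences sent over a serial port or IP. As a hedged sketch, here are two commands as commonly documented for Sony-compatible cameras; exact speed ranges vary by model, so treat the values as illustrative rather than a definitive implementation.

```python
# Sketch of two VISCA commands as commonly documented for Sony-compatible
# PTZ cameras. Speed ranges and preset numbers vary by model.
VISCA_TERMINATOR = 0xFF

def visca_header(address):
    """First byte: 0x80 plus the camera address (1-7 on a serial daisy chain)."""
    return 0x80 | address

def pan_tilt_up(address, pan_speed, tilt_speed):
    """Pan-tilt drive 'up': 8x 01 06 01 VV WW 03 01 FF."""
    return bytes([visca_header(address), 0x01, 0x06, 0x01,
                  pan_speed, tilt_speed, 0x03, 0x01, VISCA_TERMINATOR])

def recall_preset(address, preset):
    """Recall a stored preset position: 8x 01 04 3F 02 pp FF."""
    return bytes([visca_header(address), 0x01, 0x04, 0x3F, 0x02,
                  preset, VISCA_TERMINATOR])

# Over RS-232 these bytes would be written to the port at 9600 bit/s, 8N1.
assert pan_tilt_up(1, 0x08, 0x08) == bytes.fromhex("8101060108080301ff")
```

The same command bytes are carried inside UDP or TCP payloads when VISCA runs over IP.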
VISCA is a professional PTZ camera control protocol. It was designed by Sony to be used in several of its block and surveillance OEM (Original Equipment Manufacturer) cameras. It is based on RS-232 serial communications at 9600 bit/s, 8N1, without flow control, normally through a DB-9 connector, but it can also run over the 8-pin DIN, RJ45 and RJ11 connectors used in daisy-chain configurations.

NDI®, named after Network Device Interface, is a network device interface technology that involves the exchange of video information through IP over Ethernet networks. This protocol was first presented by the NewTek company in 2015 at the International Broadcast Conference (IBC) in Amsterdam, combining SDI compatibility with IP flexibility. NDI supports integration with ASPEN, SMPTE 2022 and other emerging standards.

Marshall Electronics CV630-NDI UHD 4K30 NDI
It is a high-quality, low-latency IP video transmission standard that is popular for video production. It is an easy way to connect live video sources between computers that was initially adopted by the live video production industry for use in software including Wirecast, vMix, Livestream Studio, OBS (via plugin), XSplit and NewTek TriCaster. Nowadays, NDI® is used in a broad range of video applications, including broadcasting, distance learning and video communications. Over the years, NDI® has launched new connectivity options like NDI|HX®, which stands for ‘High Efficiency’ (2017) and provides additional flexibility for bandwidth control when sending video over a LAN. NDI|HX® has also made it possible to use NDI® video over WiFi and other limited-bandwidth networks. 2020 saw the announcement of NDI|HX® 2 with a series of new optimization improvements, such as NDI® HB, the ‘High Bandwidth’ option.

RTP (Real-time Transport Protocol) is the protocol for the transmission of information in real time, such as audio and video in a video conference. It was developed by the IETF Audio and Video Transport Working Group, first published as a standard in 1996 as RFC 1889, and later updated in 2003 in RFC 3550, which is Internet standard STD 64.

RTCP (RTP Control Protocol) is also defined in RFC 3550 and works hand in hand with RTP: RTP does the sending of the data, while RTCP is used to send control packets to the participants. Its primary function is to provide feedback on the quality of service offered by RTP.

RTSP (Real Time Streaming Protocol) is a real-time streaming protocol for controlling the transmission of audio/video between two endpoints and facilitating the transport of low-latency streaming content over the Internet. First developed by Netscape Communications, Progressive Networks (now RealNetworks) and Columbia University, the RTSP specification was published by the Internet Engineering Task Force in 1998. Version 2.0, released in 2016, modified the initial version in an effort to shorten round-trip communication with the media server.

SRTP, also known as Secure Real-time Transport Protocol, is an RTP extension profile that adds security features, such as message authentication, confidentiality and replay protection, mostly intended for VoIP communications. SRTP uses authentication and encryption in order to minimize the risk of attacks such as denial of service. It was published in 2004 by the IETF (Internet Engineering Task Force) as RFC 3711.

The COVID-19 crisis has
only accelerated the use of PTZ cameras. As they are controlled without the need for a camera operator next to them, PTZs have made it possible to work under health safety regulations, in accordance with the social distancing measures imposed and the reduction of human resources in TV programs and live events. PTZs have become a true lifesaver for shows and for live multi-camera production. A single operator can control multiple PTZ cameras remotely from a desktop, making it possible to record an event or broadcast live from different points of view without needing to be physically there.
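Returning briefly to the RTP protocol described earlier: its 12-byte fixed header, as defined in RFC 3550, can be unpacked in a few lines. The example packet below is hand-built for illustration.

```python
import struct

# Sketch: unpack the 12-byte fixed RTP header defined in RFC 3550
# (version, payload type, sequence number, timestamp and SSRC).
def parse_rtp_header(packet):
    if len(packet) < 12:
        raise ValueError("RTP fixed header is 12 bytes")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # 2 for RTP as deployed today
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,
        "sequence": seq,
        "timestamp": timestamp,
        "ssrc": ssrc,
    }

# A hand-built example packet: version 2, payload type 96 (dynamic range).
pkt = bytes([0x80, 0x60, 0x00, 0x01]) + struct.pack("!II", 90000, 0xDEADBEEF)
hdr = parse_rtp_header(pkt)
assert hdr["version"] == 2 and hdr["payload_type"] == 96
```

RTCP and SRTP build on exactly this packet layout, adding control reporting and authentication/encryption respectively.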
The manufacturers of PTZ cameras that dominate this market are Sony (with their BRC or SRG ranges, with successful models such as the BRC-X1000 or BRC-H800) and Panasonic (their AW-UE and AW-HE series, with the AW-UE100 or AW-HE130 as reference cameras), although other brands such as Canon (CR-X and CR-N ranges), JVC (KY-PZ series) and even Minrray, Marshall, AVer, Logitech or Digitex are trying to gain a foothold in the face of the high demand seen for these types of cameras.

Each of the aspects we have discussed is on its own sufficient reason behind the rise of PTZ cameras in the broadcast audiovisual environment and in live multi-camera content in the immediate present; and for the future, they are being announced with support for virtual studios, virtual reality (VR) and augmented reality (AR), as well as new protocols such as FREE-D, a standard that transmits camera tracking data.

Sony BRC-X1000
Transformation to HD

By Asier Anitua, Business Development Manager EMEA & LATAM at Telefónica Servicios Audiovisuales
With the advent of DTT, it was clear that SD was the standard for "tube resolution" (4:3). Very soon afterwards, HD and 16:9 broadcast tests began, and many stations started to deliver their TV offerings in HD, at least in the broadcast part, not so much in the production part. Regarding 4K broadcast, we should look towards the future with hope, since it is a format that is here to stay, but looking at the past will give us the keys as to when we will be able to enjoy general 4K stations natively. The key to accelerating the implementation of 4K technology lies with the large platforms and the use of the internet as a broadcast medium, with smart TV sets and apps providing viewers
with ultra-high definition TV, usually on demand. Several stations have already made experimental broadcasts in 8K resolution. This resolution is seen as the ceiling of what the human eye can perceive, as more resolution would be hardly noticeable; but subjective quality is not only about resolution. Thinking of HD (High Definition) means above all thinking of 16:9. Therefore, if the leap from SD to a higher definition is to be gradual, the first thing we must do is convert all the equipment to 16:9, starting with cameras and not leaving out any element that requires a change from 4:3 to 16:9 signal processing. Broadcasting in HD is not the same as having a fully native HD stream, from capture to broadcast. In the UK there are still many TV stations showing the HD logo in their broadcast that actually produce all content in SD and then scale it up at the last stage. For transforming a TV channel, short, medium and long-term plans are required.
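The 4:3-to-16:9 conversion mentioned above is, at its core, simple arithmetic: scale the SD picture to the full HD height and pillarbox the sides. A sketch, assuming square pixels for simplicity (real SD uses anamorphic 720x576 rasters, which this deliberately glosses over):

```python
from fractions import Fraction

# Sketch of the arithmetic behind fitting legacy 4:3 material into a 16:9
# HD raster: scale to full destination height and pillarbox the sides.
def pillarbox(src_w, src_h, dst_w, dst_h):
    """Return (scaled_w, scaled_h, side_bar_width) for a fit-to-height scale."""
    scale = Fraction(dst_h, src_h)
    scaled_w = int(src_w * scale)
    return scaled_w, dst_h, (dst_w - scaled_w) // 2

# Square-pixel 4:3 content (768x576) into a 1920x1080 frame:
w, h, bar = pillarbox(768, 576, 1920, 1080)
assert (w, h) == (1440, 1080) and bar == 240
```

The same function shows why native 16:9 capture matters: 16:9 SD material (1024x576 square-pixel) scales to the full 1920 width with no bars at all.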
On many occasions the day-to-day absorbs everything and immediate needs take precedence over the ideal: workarounds are put in place just to keep going, and contingencies force us to leave aside the proper actions that would be contemplated in a well-thought-out master plan. One of the first actions sought in this transformation is to convert everything to digital, that is, to streamline all workflows, and if there is any linear component, be it capture, recording or editing equipment, to transform it to non-linear. We seek efficiency through a tapeless solution. This naturally includes old magnetic video tape archives and their conversion to digital, so that they can be used optimally and efficiently. Sometimes we find linear editing workflows in news, mixed with the latest in non-linear technology in programs, along with disconnected interactive areas making it through as best they can: ENG cameras of multiple formats and brands for the same television program, some SD, others HD, some P2, others XDCAM and some ProRes. Faced with this, the first thing is to sort out, unify and simplify, avoid mixing formats and strive for standardization. This search for simplicity increases production volumes by making processes simpler. If we think of a TV station as a factory of audiovisual content, we must apply the same parameters that we would apply to any other factory, but also considering the very important creative part of talents, artists, operators and filmmakers. Does it make sense today to move from SD to HD, or should we already consider 4K, or at least HDR? The answer, as usual: it depends. Not all televisions are the same, nor do OTT platforms have the same requirements. Therefore, the values to be analyzed
are, for example, what budget we have, what use the channel has, where it is mainly broadcast, how it is broadcast, how often a technological renovation is carried out, etc. In some countries there is a tendency to amortize equipment over five years, which later becomes seven and ends up as ten or an even longer lifespan; this is
logically a factor to take into account. If our experience is that we usually change systems only every several years, we must think of a new system that can withstand the test of time and leave the door open to migration to new formats, as it would otherwise condemn us to work in lower qualities than those used by a potential competitor, which detracts from our company.

Manufacturers have seen this dilemma clearly and offer hybrid solutions and equipment capable of working in HD and, optionally, in 4K. The business model of upgrading under a license, even temporarily, is booming. Likewise, technology renewal as a service is appearing in more and more specification sheets and becoming increasingly popular, as this allows changing systems and improving quality at the end of the contract.
Consider a small channel, for instance a local television that broadcasts on DTT, although most viewers use an OTT app or even 4K YouTube. Here, Blackmagic equipment in 4K is a viable alternative, given that the budget is low, the highest possible quality in resolution is required, and the workflows and content are small and simple. Although this quality does not reach TV broadcast grade, nor that of certain platforms, it is one more option currently available on the market. A medium-sized channel, with a higher budget and more viewers, requires more professional broadcast equipment that ensures better quality, support and adequate maintenance, so the budget rises and 4K can be more inaccessible, even more so if the broadcast is in DTT and the content does not require quality higher than HD. In this case, we would stick with HD for the next few years. At this point, the recommendation would be, as far as possible, to acquire the most critical equipment with 4K compatibility. But on a television channel or platform producing very high-quality content, first-rate television series or sports that are watched all over the world, high definition and even HDR can fall short, and moving to 4K definition is a must. Let's say that producing in 4K, nowadays, can be
something acceptable when referring to cameras and editing systems, but things get more complicated when it comes to production systems for news programs, file systems, shared storage, etc. The big leap in quality from HD to 4K or even 8K in production requires transforming all traditionally HD-SDI signals into IP, under the current SMPTE ST 2110 standard. Today's digital transformation is no longer just going to HD over coaxial; it means making
the switch to IP technology under standards that unify and simplify interoperability between the different brands and manufacturers. Just as the beginnings of MXF were somewhat complicated and painstaking, we are on that same path with the SMPTE standards, with ST 2110 the clear winner nowadays. Regarding the technical area, how should a digital transformation to HD or higher-resolution formats be approached?
Critical points in a broadcast infrastructure

In a TV environment we have to check that all the components are compatible in format, form and codecs in order to have efficient workflows in place. Among the basic and critical components that must make up an HD workflow, we find:

- Capture systems.
- Transmission systems: backpacks, 4G, 5G.
- File ingestion systems and recording of internet sources.
- Monitoring and analysis of signals: multiviewers on LED monitors as well as frame and signal analyzers.
- Video and audio mixers.
- Video matrices, both 100% HD-SDI matrices and hybrid matrices. In the case of a leap to IP television with SMPTE ST 2110, traditional matrices are replaced by routers.
- Glue: converters, distributors, upconverters, downconverters, etc. These elements disappear if we migrate to IP television, leaving just a few converters for equipment that is not compatible with the standard.
- Digital production system: PAM with shared storage.
- Mobile production units, where the growing use of remote production means that these elements are increasingly reduced, minimizing operating costs through remote work.
- MAM and centralized archiving systems.

For TV stations, the recommendation I can make about formats nowadays would undoubtedly be to try to work in a single format 'from home', for example HD at 1080i, and to go for a standardized TV production format such as XDCAM HD 422 at 50 Mbps. With some exceptions, having a single working format at file level greatly speeds up and improves the workflows of the different areas of a TV station that need to share material and where immediacy matters. In the future, in the medium and long term, XAVC is positioning itself as the substitute for XDCAM 422 @ 50 for 4K formats, but this is something that right now looks a bit distant, since it has implications beyond
the capture of the format by the cameras. If we move toward fiction production, or content production companies, we would be considering heavier formats: RAW for 4K capture and subsequent compression for proxy work (DNX35 or a similar codec). In short, if the television environment is facing a technological renovation to migrate from SD to HD or 4K, the recommendation is to leap to HD, since 4K is still very expensive to implement across all stages. If we are dealing with a small- or medium-sized TV station, we could consider a leap to 4K based on an IP technology such as NDI. In the event of a renovation that seeks to update HD equipment, I would recommend looking for hybrid equipment and solutions that can live with 4K, or at least offer the option of IP in the future.
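The bit-rate arithmetic behind these recommendations is easy to sketch. The following Python snippet is a back-of-the-envelope illustration (not tied to any specific product): it estimates the active-video bandwidth of uncompressed 4:2:2 10-bit video, the kind SMPTE ST 2110-20 carries over IP, and the storage consumed per hour at a given file bit rate such as XDCAM HD's 50 Mbps. Blanking, audio and ancillary data are deliberately ignored.

```python
def uncompressed_gbps(width, height, fps, bits=10, samples_per_pixel=2):
    """Active-video bit rate in Gbit/s for uncompressed 4:2:2 video.

    4:2:2 sampling averages two samples per pixel (Y plus alternating
    Cb/Cr); blanking, audio and ancillary data are ignored.
    """
    return width * height * fps * bits * samples_per_pixel / 1e9

def storage_gb_per_hour(mbps):
    """Storage in GB consumed by one hour of material at `mbps` Mbit/s."""
    return mbps * 1e6 * 3600 / 8 / 1e9

print(uncompressed_gbps(1920, 1080, 50))   # ~2.07 Gbit/s: fits 3G-SDI
print(uncompressed_gbps(3840, 2160, 50))   # ~8.29 Gbit/s: IP (or 12G-SDI) territory
print(storage_gb_per_hour(50))             # 22.5 GB per hour of video essence
```

The jump from roughly 2 Gbit/s per HD signal to over 8 Gbit/s per 4K signal is why the move to 4K production is so closely tied to the move to IP.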
POST PRODUCTION
THE COLORIST
Much more than colors
By Carlos Medina, Audiovisual Technology Expert and Advisor
Film and television technicians have been modifying their professional profile and job skills depending on the historical moment in which they have lived. This has been evident from the very inception of the seventh art, throughout the expansion of the mass media, up to today's more customized and interactive communication in a digital environment. Who can imagine what Georges Méliès (December 8, 1861 - January 21, 1938), French filmmaker, pioneer of cinema and of the stop-trick technique, had to learn: multiple exposures, fast-motion photography, image solutions and/or color film. The same can be said of the effort made by the team of technicians that worked alongside Canadian director James Cameron (August 16, 1954) on the filming of Avatar (2009). They used new motion-capture techniques, and the film was marketed for conventional viewing, for 3D projections (using the RealD 3D, Dolby 3D, XpanD 3D and IMAX 3D formats) and even 4D projections in some South Korean movie theaters. These two examples are a token of what happened, but also of its results: a dynamic driven, on the one hand, by the curiosity of the authors and technicians involved and, on the other, by the changes brought about by the inclusion of new work tools and technologies applied to image and sound. A new camera, more advanced software, a different response level from a microphone, a change in processes and protocols, or even the introduction of innovations from other sectors such as, for instance, 5G technology. In the present case, the profile of the colorist is very evident: a new professional with very specific roles and responsibilities, whose origin stems from color grading. On the brink of the fourth industrial revolution, we are watching terms from a different time fade away, such as laboratory, film and final copy. So, what is a colorist then? What professional profile is currently required for the job? After getting acquainted first-hand with the process, consulting current documentation and having sought the opinion of many professionals who call themselves colorists, we can now attempt to define this very fashionable and almost essential profession in today's audiovisual sector. A colorist is an art technician who is involved
from the outset in the creation of an audiovisual work, concerning image and lighting parameters, in order to achieve an atmosphere, aesthetics and visual sensation that accompany the cinematographic/television narrative. He works in full synchronization with the decisions and criteria of the film director or head of the audiovisual work, the director of photography (DoP) and the camera team, as well as in keeping with the guidelines of the art department. It is therefore appropriate to update our view of the roles performed throughout the visual creation process of an audiovisual work, and not only in the processes comprising the post-production and/or finishing stages, as has been the case until now. In other words, we have to update the classic, traditional vision of the colorist, who is much more than a grader. In order to be ready to respond to a myriad of situations, audiovisual stories and production environments in this sector, we have to delve into what aspects have to be taken into account, the training and/or expertise required, and the social skills of a colorist; all this with a view to the present time and a permanence going into the future.
First, some basic multidisciplinary knowledge. We are referring to training background: understanding the nature of light, the properties of color, the meaning of colors, aspects of visual composition, lighting techniques, measurement and photographic exposure, and the importance of color perception and the psychology of color. And also everything having to do with human vision and perception and the formation of the image through optics. A colorist should have certain creative inclinations, aesthetic taste and a mind full of artistic influences from many fields, such as
photography, painting, graphic design or art, among others. Secondly, the audiovisual production environment. In this sense, we include knowledge of the different stages involved in making a feature film, a TV series or a video clip, for example. This environment implies knowing production times and deadlines and being aware of aesthetic trends (directors, publicists, DoPs, movie titles...), as well as processes such as telecine, scanning, the digital intermediate (DI), raw material, processed material, conforming, and video or DCP (digital cinema package) mastering, just to name a few. In this respect, it helps to know some history of each creative field (cinema, television, advertising, documentaries...): what aesthetics, narratives and artistic movements have developed in the past, and which authors have left an "aesthetic" footprint on the audiovisual area. The intentional use of black and white, tinting, sepia, color gamut and/or color dominance. The visual outcome of a clean, colorful image or, conversely, one with more noise (grain), in search of a specific "look".
Third, video technology and technique. Every colorist is, potentially, a technician: a huge number of concepts, terms and parameters, ranging from backups to deliveries, through workflows that depend on the work at hand. Values that affect both analog and digital video signals, such as amplitude and its measurement in IRE; the types of video signals (RGB, color component or composite); the creation of audiovisual content with HDR (high dynamic range), S3D or VR technologies; the huge number of formats, codecs and recording media; log and linear curves; or the various international standards on color space, among others. Fourth, every colorist has to acquire the social skills and capabilities needed to work within a team, know
how to communicate feelings and points of view, listen and read between the lines to grasp what the director of the audiovisual work wants to achieve and, of course, walk in the cinematographer's shoes when it comes to decision-making related to image, framing and lighting. In short, to step out of the color room, away from the equipment, the near-darkness and the software, in order to coordinate the entire image process and visual finishing. This entails speaking with technicians of very different types and backgrounds (camera operators, DITs, art directors, costume or production managers, among others). And in fifth and last place, the work tools. Undoubtedly, it is essential to be up to date with the equipment, software and facilities that make up the 'color room': viewing monitors or projection systems with their technical features and configurations; the software (one or several applications) with which to make changes to the client's audiovisual material; external playback and recording equipment; and computer equipment appropriate to the professional environment where the work is to be carried out. Also, deciding whether an external console (a basic or advanced panel) is required. Currently, we also have to be aware of the possibilities offered by collaborative work, where interaction is possible with any part of the world without leaving the comfort of your own color room. A colorist knows that he works alongside other professionals who have assumed color correction in their day-to-day work, such as image retouchers, visual effects technicians and, above all, video editing operators (cinematographic montage). Each of them has come to take on this process of creation and visual decision-making on the resulting image/video thanks to the
democratization of specific software (some of it free for amateur or professional users, such as DaVinci Resolve from Blackmagic Design) and/or the control and color correction tools that come with retouching, editing, compositing and visual effects software. There is even an application for performing simple color corrections on iPhone or iPad called ExpressColor, developed by color management specialists Gamma and Density. This tool features traditional color wheels, with settings for low, mid and high tones (lift/gamma/gain), plus hundreds of presets so that beginners and professionals alike can quickly create their looks. Therefore, by way of clarification, we can say that we have, on the one hand, technicians who take on color correction; others who are prepared for color grading; and, lastly, the figure of the colorist as the professional who brings together solutions,
responses and proposals on color and the visual aspect, from the origin of the audiovisual work up to the final material (the broadcast or exhibition master) that reaches users/consumers. Thus, color correction focuses on adjustments to exposure, contrast, color and dominant casts within a video clip. It is a basic modification in which technical criteria prevail, also called the primary adjustments. These are usually essential and are always applied once editing/montage is done, between the different takes/clips, in order to achieve the racord (visual continuity) necessary for the audiovisual work to function between the proposed narrative and the viewer. Color corrections are universal, generic and global, because they are applied to the entire image, using tools such as curves or the color wheels, for instance. The goal is to obtain a technically correct edit, ready to become the broadcast/exhibition master, free from visual continuity errors and looking as natural as possible. It is clearly identified with quality control (QC) in aspects as specific as white balance and color temperature, compliance with luminance and chrominance standards, and contrast balance. Color grading, more broadly, can be defined as a longer creative process, a combination of technical and aesthetic adjustments. It allows more meticulous work, shot by shot. Feedback, decisions and criteria from the director of photography are essential. The changes are the result of the aforementioned primary adjustments plus what we call secondary adjustments. Color grading strives for a specific look designed for the story being told, in keeping with the narrative set by the audiovisual director. These are customized, original, unparalleled processes, and they allow adjustments that modify parts of the image within the same frame, across groups of shots or sequences, and over the total/final cut. It is thus a more complex process in which shaping, compositing and deliveries are addressed by using masks, paths, keyframes, filters, LUTs, static and moving effects, scaling, reframing, powergrades, stills, color matches and more. A new audiovisual industry is beginning to demand that the image as a whole take full precedence; it is therefore essential to take care of the visual finish from the beginning until it reaches the audience, regardless of the production field: cinema, advertising, corporate video, independent filmmaking, television and/or the internet. The terms under which producers commission an audiovisual work determine when, how and by whom these processes involving the visual aspect will be handled, and whether a colorist will be present.
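The primary adjustments described above have a standardized numeric form: the ASC CDL (Color Decision List), which exchanges a slope/offset/power triplet per channel, roughly the gain/lift/gamma controls mentioned earlier, between editing, VFX and grading systems. A minimal Python sketch, illustrative only and not any vendor's implementation:

```python
def asc_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
           power=(1.0, 1.0, 1.0)):
    """Apply ASC CDL slope/offset/power to one pixel, channels in 0..1.

    slope scales the signal (gain-like), offset shifts it (lift-like),
    power applies a gamma-like curve; negative values are clamped to
    zero before the exponent, as the CDL transfer function requires.
    """
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o
        out.append(max(v, 0.0) ** p)
    return tuple(out)

# Lift the midtones slightly while leaving black at black:
graded = asc_cdl((0.18, 0.18, 0.18), power=(0.9, 0.9, 0.9))
```

Because the correction is just three numbers per channel, a look decided on set can travel with the footage through conform and VFX and be refined, rather than redone, in the final grade.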
At first this meant scanning the photogram (photosensitive film material) and creating the digital intermediate (DI); nowadays the processes are fully implemented in the digital electronic imaging environment (FHD, UHD, 4K, up to 8K) for film, video and television, from the choice of cameras and their configuration to the displays used for viewing (monitors, TV sets and video projectors), leaving behind an era of low definition, poor color response, low sensitivity and high noise levels.

So much required preparation, knowledge and experience can be summed up in a list of the main roles of a colorist:

- Analyze the literary script and interpret it from the standpoint of color viewing and grammar.
- Propose a color range, texture and visual aesthetics.
- Apply presets and audiovisual content standardization settings for cinema/TV.
- Activate the best workflow, from the technical decisions for recording (camera settings), through all the intermediate processes such as shaping, editing and compositing, to mastering (video format and/or DCP (digital cinema package) output).
- Coordinate criteria and protocols (EDL, AAF or XML) with the editor and with all post-production departments that generate visual content, such as VFX.
- Work as a team with the director of the audiovisual work, the DoP, the camera team and art departments such as scenery, costumes and makeup/hairdressing/characterization in order to decide the color chart or palette that will be chosen for the resulting image.
- Know all calibration and color management processes.
- Operate specific color correction and color grading software.
- Carry out quality control work with the necessary adjustments; on many occasions a job that entails fixing, repairing or correcting the material received.
- Promote visual racord (continuity).
- Make changes and modifications to exposure, contrast, color properties, dominant casts or color temperature to favor specific aesthetics or achieve a visual atmosphere.
- Achieve a look and a LUT that are different from the rest of the audiovisual content consumed by the viewer.
- Generate a master for broadcast, exhibition, viewing or distribution according to the environment for which it is intended: cinema, television, direct sales or the internet.
- Meet the budgets and deadlines set by the audiovisual work's production department.

It is worth noting that
the role of the colorist has been undertaken by professionals who come from different previous professional backgrounds: photographers, cinematographers, video technicians, photochemical cinematographic laboratory technicians, audiovisual restoration
technicians and even film/video editors. Currently, there are already training plans and internships in place at companies devoted solely to the profile of the colorist. Technology, equipment and software manufacturers have already enabled the changes needed to do this work from the very beginning, i.e. from the shooting itself, until the delivery of the master. They have facilitated the processes that shape the visual finish of an audiovisual work in a completely digital environment. The audiovisual industry has to face the fourth industrial revolution of this 21st century: changes that are coming, or are already taking place in our society, and that are by no means unrelated to those faced by a colorist, in line with what the audiovisual sector demands from this kind of professional, which is much more than just colors.
OPINION
5G: Nothing like what you've ever experienced before
By Ateme

When we talk about innovation, we can think of it in two ways. One is technological breakthroughs: cramming ever more pixels into ever smaller bandwidths, for example. The other is experiential breakthroughs: offering something that is nothing like what consumers experienced before. Over the past decade, evolutions in our industry have focused on experience: what we, as viewers, can enjoy. This is how streaming platforms created a breakthrough in the quite mature TV landscape. And a new technology is coming that promises to take that experience to yet another level: 5G. Beyond merely letting us watch video, 5G enables content and service providers to engage more and better with their audiences, while also increasing content monetization.
How does it do that? Through low latency and high bandwidth. These enable content providers to engage directly with viewers; they also enable viewers to engage with the content. With 5G, the viewing experience takes on another dimension. It is no longer just about watching content: it is about making video part of a wider experience, linking the content you watch with social media, retail and your community, and providing the interactivity and responsiveness that viewers are used to on those other platforms. This way, viewers can feel onboarded with a story that is bigger than the content itself. One big driver of this overdrive in direct-to-consumer engagement is the ability to personalize the experience. Currently, the offer is fragmented: there is too much content dispersed across too many services. One major frustration viewers face (and also one of the top four reasons for churn) is not finding what they want, or spending too much time looking for it. So imagine being able to reach viewers with the content they want, based on where they are (at
home or commuting), the time of day (morning or lunch break), their preferences (sports or news), and the type of entertainment they ask for (augmented reality or just a show). With 5G, content suppliers can adjust their offering, the content itself, and the surrounding experience – leading to a win-win situation with viewers who are fully engaged with the content, and suppliers who see increased loyalty, reduced churn, and increased revenue.
The picture sounds good, doesn't it? Let's now peek behind the scenes at what enables it. The main idea is Multi-access Edge Computing (MEC), based on two technology pillars of viewer engagement: efficient caching and just-in-time processing. Why do we need MEC? Because on the one hand, we have demand for better and deeper engagement with viewers, who look for a personalized experience – a unicast, as a streaming-media nerd like me would say. And on the other hand, we have a network which, even with all the means and investments we could ever imagine, is unlikely to scale to 8 billion different unicast (nerd!) consumptions. The way to reconcile this unbounded demand with a capped bandwidth is to shift the computing closer to the user or the device being used. Instead of processing everything in a central location (the multicast approach), we move the logic, intelligence and processing closer to the
end user. That way, popular content, in the right format at the right resolution, can be smartly cached, or produced 'on demand' at the edge, according to what a specific viewer is expected to watch. 5G is by default a meshed network, with towers located all over a country or continent. We can make each of them an edge location hosting smart caching and just-in-time processing. 5G is clearly a paradigm shift for the content experience, but its promises can become reality only if we change how we look at the workflow, processing and streaming-delivery side of it. Through its elastic CDN technology, as well as its renowned processing and TV personalization capabilities, ATEME offers a complete solution to enable content providers and service providers to embrace 5G network capabilities and offer their viewers an experience that is nothing like anything they have experienced before.
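The smart-caching half of that idea can be sketched in a few lines. The toy class below is a generic illustration (not ATEME's actual CDN logic): it keeps the most recently requested renditions at an edge node and evicts the least recently used one when capacity is reached; a cache miss is where just-in-time processing upstream would kick in.

```python
from collections import OrderedDict

class EdgeCache:
    """Least-recently-used (LRU) cache for content renditions at an edge node."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()      # key, e.g. "title@1080p" -> rendition

    def get(self, key):
        if key not in self._store:
            return None                  # miss: fetch or transcode just-in-time
        self._store.move_to_end(key)     # mark as most recently used
        return self._store[key]

    def put(self, key, rendition):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = rendition
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict least recently used

cache = EdgeCache(capacity=2)
cache.put("news@720p", "segment-a")
cache.put("sports@1080p", "segment-b")
cache.get("news@720p")                   # touched: news is now most recent
cache.put("film@2160p", "segment-c")     # evicts sports@1080p
```

Replicated at every tower, a policy like this keeps the popular unicast renditions one hop from the viewer while the long tail is produced on demand.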