![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/39623f4ae9f017add93bc3f2f7271bbb.jpeg?width=720&quality=85%2C50)
Gravity Media's Production Centre in London has a new General Manager: Beth Lowney
Gravity Media has recently announced that Beth Lowney is the new General Manager of its brand-new Production Centre in London (White City), which opened just nine months ago, in September 2022.
Beth Lowney joined Gravity Media in 2022 after working at the International Tennis Federation, arriving with a proven record in event servicing, broadcast production, sales and distribution, and the ability to deliver excellent client service. With over eight years of experience working across major global sporting events, she joined Gravity Media as Business Development Manager – Outside Broadcasts & Projects.
As Broadcast and Media Manager at the ITF, she managed the team delivering broadcast production at all major events, including the Davis Cup and Billie Jean King Cup Finals, alongside handling worldwide distribution. As Event Manager, she worked closely with National Federations and sponsors to achieve a professional, cohesive package for all stakeholders, also servicing Davis Cup and Fed Cup ties at an operational level to ensure compliance with ITF standards and official regulations.
In her new role as General Manager, Beth will oversee operations at the next-generation Production Centre, act as the main point of contact for partners including ATP Media and Formula E and set the tone for the client experience at
WestWorks. Beth will continue to be a key member of Gravity Media’s Business Development team and focus on delivering a strong pipeline of new opportunities at the facility.
Gravity Media detailed its latest expansion at IBC 2022. The 50,000-square-foot facility features best-in-class technology to support both on-premise and distributed remote production workflows. The state-of-the-art facility is built around a fully fledged SMPTE ST 2110 IP media fabric with dedicated master control rooms, six production control rooms with dedicated audio control rooms, seven flexi control rooms, multiple off-tube commentary booths, two studios, lighting and vision control facilities, fast-turnaround and craft edit, flexi-desk production spaces, media management and client desking.
Working closely with the Business Development, Engineering, Production and Front of House teams, Beth will help drive Gravity Media’s strategy for growth across the EMEA region, ensuring the facility is intrinsically linked to end-to-end projects and remote production offerings.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/157de7d6a812751ba6928f9686a59397.jpeg?width=720&quality=85%2C50)
Ed Tischler, Gravity Media's Managing Director, says: "We are delighted to announce Beth as the General Manager of our second Production Centre in London. Her extensive experience in the industry and valuable knowledge of Remote Production make her an invaluable asset to our business. We believe that her unique perspectives will further strengthen our ability to deliver exceptional results and maintain our position as a leader in the industry."
Beth Lowney commented on her new role: "I am thrilled to continue my career at Gravity Media, as the launch of Gravity Media's Production Centre in White City has marked a pivotal moment. I look forward to bringing my expertise and working alongside the talented team to drive the continued success and growth of the company."
How was Singular.live born?
Singular was built from a vision to create a new graphics platform that could revolutionise live graphic overlays. Traditional systems are very powerful but their underlying technology pre-dates the internet and smartphones. Content creators need different tools that take advantage of digital technologies to help them offer more flexible, scalable and cost-effective solutions for enhancing their content and viewer engagement.
What is Singular.live’s approach to remote production and how does it contribute to streamlining the broadcasting workflow?
Singular is a browser-based platform, so no dedicated graphics hardware or downloads are required. Since everything in Singular can be done via a web browser, including the creation, preparation and operation of graphics, the platform can be used by anyone, from anywhere, with a basic computer or tablet and an internet connection. As a result, we are able to support remote production natively. We also have fully documented APIs and SDKs so that clients can easily integrate Singular into their own control interfaces or systems, giving maximum flexibility. Additionally, we are already integrated with the leading production platforms, giving clients the widest choice of workflows without compromise.
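To make the integration point concrete, here is a minimal sketch of how an external system might push a lower-third update over plain HTTPS from any machine with a browser or a recent Node runtime. The endpoint, token and payload shape below are assumptions for illustration only, not Singular.live's documented Control API.

```typescript
// Hypothetical sketch: pushing data to a lower-third overlay over HTTP.
// The endpoint, token and payload shape are illustrative assumptions,
// not Singular.live's actual API.
const CONTROL_ENDPOINT = "https://api.example.com/control/MY_APP_TOKEN";

interface LowerThirdPayload {
  subCompositionName: string;      // which overlay element to address
  state: "In" | "Out";             // animate the element on or off air
  payload: Record<string, string>; // field values bound in the design
}

async function showLowerThird(name: string, title: string): Promise<void> {
  const body: LowerThirdPayload[] = [
    { subCompositionName: "LowerThird", state: "In", payload: { name, title } },
  ];

  const res = await fetch(CONTROL_ENDPOINT, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) {
    throw new Error(`Overlay update failed: ${res.status}`);
  }
}

// Any device with a browser (or Node 18+) and an internet connection can run this.
showLowerThird("Jane Doe", "Match Commentator").catch(console.error);
```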
How does Singular.live's platform facilitate real-time graphics and overlays in remote production scenarios?
Singular's web-based nature allows full operation from anywhere in the world, in real time. Not only does this enable graphic designers, operators and producers to work remotely from anywhere in the world, it also offers exceptional redundancy. If operators face issues with either their connectivity or their computer, anyone else on the team can log in and pick up operations from wherever they may be. It also makes it simple to share work and collaborate in the building phase and preparation of graphics. For example, a show producer can easily prepare their graphics from their home, hotel or even while travelling to the venue, making it an incredibly flexible solution.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/b1c045cb5c2cfb819c4d460bcdf16445.jpeg?width=720&quality=85%2C50)
Can you share some examples of successful remote production projects where Singular.live’s technology played a key role?
One of the most ambitious remote production projects that we have helped deliver was with our partners Reality Check Solutions for their client Red Bull Media. This was for a Red Bull surfing event taking place off the extremely remote southern coast of Tasmania. In order to minimise the environmental impact of their production, Red Bull Media wanted to send only a skeleton crew. As a result, the video signals were sent from Tasmania back to their production hub in Santa Monica, where the Singular graphics were added before distribution.
More recently, Singular has been used in remote productions for the SPFL with our partners QTV, who deliver the matchday productions, including Singular graphics, from their remote hub in Glasgow.

We have also just launched a new project with our partners Photron in Japan that delivers live Singular graphics, with data integration, for a major domestic sports league from a central production hub in Tokyo.
What are the key features of Singular.live’s platform that make it a preferred choice for remote production teams?
One of the main features is accessibility; the fact that Singular is entirely browser-based makes it quick and simple to access all features of the platform. Secondly, its SaaS and cloud-native nature means that clients can scale up and down as they require, ensuring they are not left with redundant hardware after upscaling for a specific event. For example, football leagues will often have a week or two (typically at the end of the season) when all matches are concurrent. With Singular, you can simply scale up for those match days and then down again for the regular schedule.
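As a rough illustration of that elasticity, the sketch below provisions one overlay output per concurrent fixture and releases them afterwards. The functions and URLs are assumptions standing in for whatever provisioning mechanism a cloud-based graphics platform actually exposes.

```typescript
// Hypothetical sketch of elastic scaling for a concurrent match day.
// provisionOutput/releaseOutput-style calls are assumptions, not a real SDK.
interface OverlayOutput {
  matchId: string;
  overlayUrl: string; // URL handed to the vision mixer / encoder as a browser source
}

async function provisionOutput(matchId: string): Promise<OverlayOutput> {
  // In practice this would call the platform's API; here we just fabricate a URL.
  return { matchId, overlayUrl: `https://overlays.example.com/output/${matchId}` };
}

async function scaleForMatchday(matchIds: string[]): Promise<OverlayOutput[]> {
  // Spin up one overlay output per concurrent fixture...
  return Promise.all(matchIds.map((id) => provisionOutput(id)));
}

async function scaleDown(outputs: OverlayOutput[]): Promise<void> {
  // ...and release them when the round is over, so nothing idles between rounds.
  for (const out of outputs) {
    console.log(`Releasing overlay output for ${out.matchId}: ${out.overlayUrl}`);
  }
}

// Final-round Saturday: ten concurrent kick-offs, then back to the regular schedule.
const matchday = ["m01", "m02", "m03", "m04", "m05", "m06", "m07", "m08", "m09", "m10"];
scaleForMatchday(matchday).then(scaleDown).catch(console.error);
```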
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/c7c1a8c8a94d8590ee5408a826535389.jpeg?width=720&quality=85%2C50)
Finally, Singular is the only live graphics platform that has Albert certification for sustainability. In very simple terms, using Singular can help any live production reduce its environmental impact by reducing both the required hardware and the shipping and transportation for the project. Unlike virtualised cloud graphics solutions, Singular is cloud native, meaning we use elastic compute resources rather than dedicated hardware, which is a far more environmentally friendly approach.
How does Singular.live ensure a seamless integration with existing production setups and equipment?
Our partner network has over 50 technology partners and continues to grow as we integrate with more partners. In addition, since the output from Singular is a URL, any production equipment that can take a browser as a source is automatically compatible with us. We also provide solutions for Singular to fit seamlessly into SDI and NDI workflows. With our decades of experience working in live production and specifically graphics we know the importance of frictionless compatibility which is why we continue to focus on our growing partner network. We have also released a new updated version of our API with full documentation and a dedicated developer portal to help anyone looking to integrate with Singular.
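Since the output is just a URL, compositing it amounts to stacking a transparent web layer over the programme video. The browser-side sketch below, using a placeholder overlay URL and feed, shows the idea; dedicated hardware or software that accepts a browser source does the equivalent internally.

```typescript
// Minimal sketch of why "the output is a URL" makes integration simple:
// any player that can render a web page can stack the overlay on top of video.
// The overlay and feed URLs below are placeholders, not real outputs.
const OVERLAY_URL = "https://overlays.example.com/output/demo";

function mountOverlay(videoSrc: string): void {
  const stage = document.createElement("div");
  stage.style.cssText = "position:relative;width:1280px;height:720px;";

  const video = document.createElement("video");
  video.src = videoSrc;
  video.autoplay = true;
  video.muted = true; // muted autoplay keeps browsers from blocking playback
  video.style.cssText = "position:absolute;inset:0;width:100%;height:100%;";

  const overlay = document.createElement("iframe");
  overlay.src = OVERLAY_URL; // transparent HTML graphics layer
  overlay.style.cssText =
    "position:absolute;inset:0;width:100%;height:100%;border:0;pointer-events:none;";

  stage.append(video, overlay); // graphics sit above the programme feed
  document.body.appendChild(stage);
}

mountOverlay("https://example.com/programme-feed.mp4");
```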
What are the benefits of using cloud-based solutions for remote production, and how does Singular.live leverage this technology?
Cloud-based solutions enable remote production that in turn helps reduce costs and the environmental impact of live productions. Cloud-native solutions go a significant step further by removing the need (and sometimes logistical challenge) of having to provision dedicated hardware.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/f13b9ddfc1d7c5d8bf4bfbb09409923b.jpeg?width=720&quality=85%2C50)
As a cloud native platform, Singular offers the best option for remote production whilst also delivering the best solution for scalability, accessibility and sustainability. It does this without any compromise and while offering next-gen features like localisation, personalisation and enhanced engagement through interactivity.
How does Singular.live address the challenges of latency and bandwidth limitations in remote production environments?
Singular graphics can be time stamped to ensure that they trigger at the desired time. This is essential in live production and sport in particular since nobody wants to see a score graphic update while their video is buffering and they are yet to see the goal. Embedding a timestamp for the graphics into the video makes sure that, if a viewer’s video is delayed or buffering, the graphic will not display until the correct point in the video.
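A minimal sketch of that idea, assuming the trigger times are carried alongside the stream; the interface and helper below are illustrative, not Singular's implementation.

```typescript
// Sketch (assumed design): each graphics trigger carries the programme time at
// which it may appear, and is held back while the viewer's player lags behind.
interface TimedTrigger {
  showAt: number;    // programme time in seconds when the graphic becomes valid
  apply: () => void; // e.g. update the on-screen score
}

function attachTimedGraphics(video: HTMLVideoElement, triggers: TimedTrigger[]): void {
  const pending = [...triggers].sort((a, b) => a.showAt - b.showAt);

  video.addEventListener("timeupdate", () => {
    // Fire every trigger whose timestamp the viewer has actually reached,
    // so a buffering viewer never sees the score change before the goal.
    while (pending.length > 0 && video.currentTime >= pending[0].showAt) {
      pending.shift()!.apply();
    }
  });
}

// Usage: the score graphic updates only once the goal is visible in the stream.
const player = document.querySelector("video")!;
attachTimedGraphics(player, [
  { showAt: 1843.2, apply: () => console.log("Score: 1-0") },
]);
```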
In addition, our output is in HTML, which has a very small footprint and so requires a stable but very small operational bandwidth. This helps ensure that we do not add any additional delays into the workflow. We have also recently completed a new integration with our partners Videon on their cloud encode stack, which means clients can now add Singular graphics at the point of encode, further reducing any potential delays in their production workflow.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/cbf290947b0877d8da3bca42596d710b.jpeg?width=720&quality=85%2C50)
Producing a Million Dollar Event: The 64 Matches of The Soccer Tournament (TST)
Tupelo Honey, a full-service production company with over 25 years of experience in sports, music and entertainment, recently faced an exciting challenge when tasked with creating broadcast graphics for The Soccer Tournament—a completely new ground-breaking 64-match event over five days, spread out across five fields.
With a tight six-week turnaround, Tupelo Honey sought to incorporate live game graphics, upcoming match details, live scores from multiple fields, and previous game results.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/051b44ca5c9d9081356abe59a54bb0f9.jpeg?width=720&quality=85%2C50)
The Soccer Tournament presented Tupelo Honey with a unique opportunity to create broadcast graphics for this first-of-its-kind event: the objective was to seamlessly integrate information from multiple fields, allowing operators to display up-to-the-minute scores from different locations.
Tupelo Honey sought a solution that could condense a vast amount of information into a visually captivating and easily digestible format for their audience. Singular's flexibility allowed them to leverage an existing package and tailor it to their specific requirements within the tight turnaround time. With numerous stakeholders involved, including Tupelo Honey, The Soccer Tournament, and NBC/Peacock, Singular delivered a final product that exceeded expectations and satisfied all parties involved.
What role does automation play in remote production, and how does Singular.live's platform support automated workflows?
Many remote productions harness some form of automation either through data or pre-prepared playlists. Singular has robust and varied data integration solutions, and our fully documented APIs can also integrate with automation solutions. We have several technology partners who provide automation services that are integrated with Singular making it easy to automate workflows.
TV2 in Norway has a great hybrid example where they created a bespoke workflow for their Ice Hockey production. The Singular graphics are all integrated with data and the operator can choose which graphics they want to automatically play out on air (goals and penalty graphics for example) and then which graphics they want their system to prompt them with so that the operator can then decide if they wish to take the graphic on air. It makes for a really efficient workflow for graphics operation on a very fast moving sport with a lot of possible graphics.
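That hybrid pattern can be pictured with a short sketch; the event types and routing rule below are invented for illustration and are not TV2's or Singular's actual configuration.

```typescript
// Illustrative sketch of a hybrid automation workflow: data-driven events are
// either played out automatically or queued as prompts for the operator.
type HockeyEvent =
  | { kind: "goal"; team: string; score: string }
  | { kind: "penalty"; team: string; minutes: number }
  | { kind: "shots"; home: number; away: number };

const AUTO_PLAYOUT: ReadonlySet<HockeyEvent["kind"]> = new Set(["goal", "penalty"]);

function takeToAir(event: HockeyEvent): void {
  console.log("ON AIR:", JSON.stringify(event));
}

function promptOperator(event: HockeyEvent): void {
  console.log("PROMPT (operator decides):", JSON.stringify(event));
}

function routeEvent(event: HockeyEvent): void {
  // Goals and penalties go straight to air; everything else waits for a human.
  if (AUTO_PLAYOUT.has(event.kind)) {
    takeToAir(event);
  } else {
    promptOperator(event);
  }
}

// Example feed from the data provider:
const feed: HockeyEvent[] = [
  { kind: "goal", team: "Home", score: "2-1" },
  { kind: "shots", home: 18, away: 12 },
];
feed.forEach(routeEvent);
```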
Can you discuss the scalability and flexibility of Singular.live’s platform when it comes to managing multiple remote productions simultaneously?
In 2022, we had 10 million hours of output with Singular graphics. We have clients and partners producing literally thousands of hours of live productions and outputs every month, and this is growing as we work with more FAST channel providers. Scalability is one of the key benefits of Singular being a cloud-native platform rather than hardware-based or virtualised. We use elastic compute resources, so there is never an issue finding resources, unlike systems that need to provision dedicated hardware in the cloud.
As a SaaS platform, clients can increase their outputs on demand and drop down when not needed. With Singular, when a production is finished, the operator simply shuts their browser and walks away. There’s no de-rigging or closing down server instances.
How does Singular.live ensure data security and protect against potential cyber threats in remote production workflows?
In addition to offering industry best-practice solutions like SSO, we also conduct regular external cyber threat reviews. The most recent of these found no vulnerabilities; however, our team continually works to manage and improve our security. We fully understand the importance of security for our clients, many of whom have made significant investments in the rights to the content that they are producing.
In terms of user experience, what kind of training and support does Singular.live provide to production teams adopting their platform for remote production?
For anyone who signs up to Singular, we have a comprehensive set of video tutorials and previous webinars all available on our YouTube channel. These cover basic guides on how to get started all the way through to more technical and expert videos. We also offer monthly live webinars on specific topics that are available to anyone to join, with each session including a Q&A section. We also have a dedicated support portal that is monitored 24/7 and accessible either from within Singular or directly from the support website. This provides a highly effective way for people to get answers to any questions they may have or help with any challenges they encounter.
For our Enterprise customers, we also offer dedicated support that includes free training workshops for both designers and developers and a dedicated Slack channel for any specific questions. We also have a customer Slack channel where people post questions and our community and support team answer. Our certified partners can get direct access to our development team to help them with their integrations. This has proven very helpful, especially when sharing some of the knowledge gained through our own experiences of integrating video players and updating CEF versions, for example.
What do you see as the future trends and advancements in remote production, and how is Singular.live prepared to adapt and innovate in this evolving landscape?
Singular was purpose-built for remote production by virtue of not requiring any dedicated hardware. As a platform we are built using standard web protocols, which benefits both us, as web technologies continue to evolve, and our customers. Finding developers and staff with experience in web technologies is much easier than trying to find and recruit specialist, experienced graphics systems developers and designers.
As technologies like 5G and NDI continue to mature and evolve, and more broadcast technology moves to cloud-native solutions, remote production will become even easier and more prevalent. This will also allow more customers to take advantage of some of the more advanced features of Singular, such as adaptive overlays and interactivity through the use of our Intelligent Overlays. Production technology has evolved very quickly, and a lot of what we can do fairly easily now would not have been possible even five years ago. As demand increases for personalisation and wider use of tools like AI and automation, so the opportunity for more remote production will grow. The technology to deliver robust, scalable and professional live productions remotely already exists. There are already tier 1 productions being produced remotely, so there is no technological obstacle to doing it. The delay in wider roll-out is down to operational decision making and people's readiness to adopt it rather than any issue with the technology itself.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/113e658e1b4b490dc37bae153426cadc.jpeg?width=720&quality=85%2C50)
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/826201e551d6b7f9ff38d8c2119f350f.jpeg?width=720&quality=85%2C50)
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/705d2ad1182c046acc67275004cc1927.jpeg?width=720&quality=85%2C50)
The Championships, Wimbledon is one of the largest annual sports Outside Broadcasts undertaken in the UK and a complicated project for all the companies involved. Last June we learnt that EMG Group companies (ACS/EMG-C) had signed a four-year extension with Wimbledon Broadcast Services (WBS) to be the specialist cameras and RF equipment supplier to The Championships, staged annually by the All England Club.
Over several years, EMG Connectivity has supplied RF services to various broadcasters, but it was when WBS took over the host broadcasting of The Championships from the BBC in 2018 that the group became the exclusive supplier of RF cameras. Meanwhile, further cementing the group's expertise in the area, Aerial Camera Systems (ACS), also part of the EMG Group, has been supplying specialist cameras for The Championships for over 20 years.
As the major tennis event takes place, TM Broadcast wanted to know more about the deployment and coverage by EMG Connectivity and ACS, both from EMG Group.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/79ca9fae2ba35c09c33083af90285254.jpeg?width=720&quality=85%2C50)
As one of the leading providers of sports production services, how does EMG approach the television production of Wimbledon?
Broadcasting an event of the calibre of Wimbledon requires a great deal of preparation and innovation each year. When we first started working with the BBC at Wimbledon we only supplied three or four camera systems. That provision has steadily grown, and when WBS took over as host broadcaster they placed additional emphasis on creating unique shots of The Championships by deploying more remotes around the Grounds to capture, for example, the crowd atmosphere and the players' practice areas.
As a result of these years of experience and our historic relationship with the event, EMG Connectivity is uniquely suited to meet these challenges. For example, the All England Club's historic venue in SW19 has a high standard of requirements to be fulfilled, and it often prefers broadcast-standard compact robotic cameras, as their unobtrusive nature minimises space requirements and line-of-sight issues. ACS' specialist cameras, including ACS SMARThead™ systems, are strategically positioned so that they achieve this goal whilst remaining imperceptible.
Can you describe the scale and complexity of the production setup for broadcasting Wimbledon matches?
Due to the scale and calibre of the event, the advance planning for The Championships was a vast undertaking involving a complex setup.
For example, as well as covering several angles of on-court play, numerous beauty cameras provided contextual coverage of the event, including remote crowd cams, coverage of the player arrivals area, the Aorangi practice courts and the Media Theatre, plus topographic venue shots from a hoist-mounted GSS stabilised camera gimbal sitting high above the venue for the iconic overhead shots of the local area and London skyline.
This year was our largest Championships delivery to date and we upgraded our equipment to fulfil all requirements; as a result, the vast majority of camera systems were either UHD or 1080p HDR for the first time, as WBS expands the high-quality format support from Centre Court out to all cameras at The Championships.
What are the key technological innovations and solutions that EMG employs to deliver high-quality coverage of Wimbledon?
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/e3afd0584ceff8b6129d0ed0f8d88f02.jpeg?width=720&quality=85%2C50)
ACS and EMG Connectivity employed a range of solutions throughout the Wimbledon site to ensure that the best footage could be obtained in a range of situations. Throughout the event, broadcast-standard compact robotic cameras were favoured as their unobtrusive nature minimises space requirements and line-of-sight issues. As mentioned before, this year the camera systems were either UHD or 1080p HDR for the first time as WBS raised the requirements.
EMG Connectivity even designed a special deployment plan for Wimbledon: Centre Court itself features some notable specialist units including an
ACS SMARThead™ mounted on a 10m railcam positioned along the baseline and housed within a purpose-built hide, whilst another four units are mounted on bespoke camera brackets designed specifically for Wimbledon, including two on the umpire’s chair dedicated to player coverage.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/092daa6f48b0e140189499d626c87f7d.jpeg?width=720&quality=85%2C50)
In addition, ACS' specialist cameras, including ACS SMARThead™ systems, were strategically positioned to capture the action, including baseline angles, player coverage from the umpires' chairs, remote crowd cams, and topographic venue shots from hoist-mounted cameras.
How did EMG ensure seamless integration between different production elements, such as cameras, graphics, and commentary, during the live broadcasts?
ACS and EMG Connectivity worked to the host broadcaster, Wimbledon Broadcast Services, who oversaw all production elements.
But I can say that one of the challenges for the EMG Connectivity team was to ensure smooth steadicam RF coverage of the Walk of Champions from the Dressing Rooms to the entrance of Centre Court pre-match.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/1c644ba27b789b4e93e69d8dfc2e9d07.jpeg?width=720&quality=85%2C50)
They also covered the champion from Centre Court up the stairs of the Clubhouse, through the corridor to the Dressing Rooms and onto the Members' balcony to be greeted by the cheering crowd. This required its own dedicated receive installation, with antennas secreted within the walkways and corridors providing seamless coverage within the building, in addition to the 40 antennas EMG Connectivity has around the Grounds.
EMG Connectivity, meanwhile, provided a wide range of kit and expertise to both the host and multiple unilateral broadcasters, supported by a crew of six for the fortnight and more for the rig and de-rig of The Championships.
What challenges did you face in terms of signal transmission and distribution during the Wimbledon production, and how did you overcome them?
As mentioned, EMG Connectivity supported both the host and multiple unilateral broadcasters throughout the event. There were 40 antennas site-wide which fed back to a central RF cabin in the Broadcast Compound, where they were switched and fed into their appropriate receiver units. This is a cost-effective way of covering multiple areas for events such as Wimbledon and golf events such as The Open and the Ryder Cup. A wide-area return video system for roving RF monitors was also provided to a number of broadcasters, allowing them to analyse footage from anywhere within the Grounds.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/1a34b7bf7a3d113cba6a2b062cafe4e5.jpeg?width=720&quality=85%2C50)
Looking ahead, what do you foresee as the future trends and advancements in television production for major sporting events like Wimbledon, and how is EMG prepared to embrace those changes?
One of the major trends growing year on year is the drive to ensure that broadcast events such as The Championships adopt sustainable methods. Our clients are requesting that we adopt sustainable practices. As part of the EMG Group, ACS and EMG Connectivity continue our efforts to make live broadcasting and remote production sustainable, with specific innovations including the Group's new remote production vehicles.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/3b4922cd2abcc915b0a30678d28e5f4d.jpeg?width=720&quality=85%2C50)
By Carlos Medina, Audiovisual Technology Expert and Advisor
The history of video cameras used in the professional audiovisual sector has always had one constant: innovation. It is a long path full of discoveries and technologies as important as the appearance of the first electronic cameras, the fitting of capture sensors, the capture and recording of video on the same equipment, the possibility of working with interchangeable optics, coaxial, optical fibre and/or IP connectivity, the development of the time code, digital cameras, new recording media and 4K&8K quality, among others.
This wide array of technologies and advances has been fitting together almost perfectly over time under a single denomination: Broadcast equipment, that is, everything that boasts a professional quality meeting specific standards and norms for and by the audiovisual sector. In this sense, we can highlight two international institutions as key players: SMPTE and the ITU.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/48de318021f0964155d4e9d8703d9d5c.jpeg?width=720&quality=85%2C50)
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/9ab410c7936da5fd6815fc6ff1a43e31.jpeg?width=720&quality=85%2C50)
SMPTE (Society of Motion Picture and Television Engineers; https://www.smpte.org/) was founded in 1916 in Washington as SMPE. It currently has around 8,000 members worldwide and has published more than 800 standards and protocols, playing a decisive role in the development and dissemination of television and telecommunications in the United States and also for the rest of the world.
The SMPTE ST 2048-1 standard defines the main characteristics of an image with a 4K resolution (4096×2160 pixels), while the SMPTE ST 2110, SMPTE ST 2082-12 and SMPTE ST 2036-1 standards cover 8K UHD (7680×4320 pixels).
ITU, founded in Paris in 1865, is the original acronym for International Telegraphic Union. In 1932 it adopted its current name, and in 1947 it became a specialized agency of the United Nations. Its first subject of expertise was the telegraph, but nowadays ITU covers the entire ICT sector, from digital broadcasting to the Internet, and from mobile technologies to 3D TV. ITU currently comprises 193 member countries and some 700 private sector entities.
Headquartered in Geneva, Switzerland, it has 12 regional and area offices worldwide.
More than 5,000 specialists from telecommunications and ICT organizations and agencies from around the world participate in the Radiocommunication Study Groups to prepare the technical basis for Radiocommunication Conferences, develop ITU-R (Radiocommunication Standards) Recommendations and Reports and compile radiocommunication manuals.
Recommendation ITU-R BT.2100 proposes three levels of detail or resolution: high-definition television (1,920×1,080), 4K UHDTV (3,840×2,160) and 8K (7,680×4,320) all using the progressive image system with wide color gamut and the frame-rate range included in the BT.2020 ITU-UHDTV recommendation.
In this article, we are going to delve in and update the situation around professional 4K & 8K Broadcast video cameras, which are used both in the field of television and in the production of audiovisual content.
A first nuance we have to understand is the difference between camscopes and camcorders. The first name is given when the video camera only performs capture and processing, resulting in a video/audio signal output. Conversely, when the video camera, in addition to capturing and processing the video/audio signal, can also record and store that signal, it is called a camcorder; that is, the camera body includes a recorder, jack and/or slot for recording and playing back the video/audio signal.
The next aspect to consider when choosing a 4K&8K video camera is the mounting of the optics with respect to the camera body. We can find solutions with fixed optics, that is, the camera manufacturer offers us a model where the body and optics cannot be separated. And, on the other hand, camera models with interchangeable optics, which allows you to remove and place optical lenses of different focal lengths -always compatible with the mount on the camera body- to adapt to the shot we want to get.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/cad13928cbcd55d4f8e7a88b52bca238.jpeg?width=720&quality=85%2C50)
Thirdly, professional video cameras are conceived and designed according to the production environment for 4K&8K content where they are intended for use: single-camera or multi-camera. This issue is very important because choosing one model or another will facilitate or complicate the work to be done. A single-camera production is one that uses just one camera, so this model has to have the necessary features for capture, shooting, operation and recording of both image and sound to perform the task autonomously and independently.
Instead, a multi-camera production/operation is defined as one that offers audiovisual content using a wide variety of video input sources, including more than one video camera. Therefore, the 4K&8K camera model has to be ready to integrate perfectly into the workflow behind a multi-camera setup, and the most suitable models for this are the camera chains/systems.
But why camera chains/ systems? The answer lies in the very way of producing, monitoring and offering visual content featuring the highest technical and artistic quality possible. A camera chain/system comprises the following items: camera body with the most suitable optical system based on placement of the camera in regard to what must be captured and the kind of take to offer, a camera cable through which the various signals and communication orders come and go –such as tally and intercom- (featuring optic fiber or triax cable nowadays) and a base station for controlling the camera from which the outgoing video signal that we will use as input source will exit. These elements must be used based on the number of cameras independently deployed.
In addition, each camera chain will normally have an OCP (operational control panel) to adjust the camera's technical settings.
At this point, taking into account the first three issues mentioned above, we can establish a typology of 4K&8K broadcast cameras within the audiovisual sector:
Studio camera chains/systems (TV set).
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/d8302fad4022126fca95f382bb022f1b.jpeg?width=720&quality=85%2C50)
This is a type of camera that stands out for the possibility of using interchangeable optics of a long focal length, that is, large telephoto lenses. Their main feature is that they have no section for recording within the camera body. In addition, they usually offer high levels of quality in the video/audio signal. They are used in multi-camera environments.
EFP (Electronic Field Production) or PCS (Portable Camera System) camera chains/systems. These are more compact and portable pieces of equipment, with a recording section in the camera body and interchangeable optics.
ENG (Electronic News Gathering) shoulder cameras. These are equipment items whose main feature is a great ergonomic design, fitting comfortably on the shoulder of the camera operator and allowing great ease in the handling of the buttons, the optics and the settings. They are lightweight and have a recording section in the camera body. They typically have interchangeable optics with a high focal length zoom.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/06c894c611f57cfab1d60a41b83008a5.jpeg?width=720&quality=85%2C50)
Modular cameras. This type of camera is characterized by the fact that the camera body is robust, simple and contains the minimum elements for operation. Therefore, for better handling and operation, accessories, complements, optics and other resources must be fitted into the camera body.
Compact/ultra-compact cameras (all-purpose). The really good thing about this type of camera is that, in a small size, it offers all the general and operational features needed to provide a great service for capture, recording and connectivity. They usually have fixed optics with zoom.
Handheld cameras. As the name suggests, size is what matters here, with the camera taking up the minimum space that a camera held in the hand can; they are also known as Handycams.
PTZ remote cameras. Currently, and after the COVID-19 healthcare crisis, this type of camera has really swarmed TV studios and is used in numerous events. A PTZ (Pan-Tilt-Zoom) camera is a remotely controlled video camera, compact in size, light in weight and offering great possibilities for framing shots in a fluid and silent way. It normally comes with fixed optics, but there are already models with interchangeable optics, such as the Sony FR7, for instance.
Multipurpose remote cameras. These have a small camera body together with interchangeable optics, offering great performance and high connectivity for the audiovisual sector.
What all 4K&8K Broadcast cameras have in common is that they feature sensors capable of generating images with a resolution of 4096 x 2160 pixels (4K) and/or 7680 x 4320 pixels (8K), and therefore higher than FHD (1920 x 1080 pixels) and UHD (3840 x 2160 pixels).
The landscape of 4K&8K cameras is expanding in each of the above camera types, and this is only expected to gather strength, as 8K televisions have been on the market since 2018, when Samsung launched the first QLED 8K model featuring 8K resolution with quantum dot technology.
Large manufacturers of Broadcast cameras already have models that can be purchased or rented to work in 4K&8K production environments:
SONY introduced the 8K EFP camera chain, which consists of a combination of the UHC-8300 camera head and the UHCU-8300 camera control unit, the new HDC series of FHD/UHD/4K EFP camera systems, and the versatile HXC series, which are ideal for applications such as live streaming and event production.
PANASONIC continues to develop its Broadcast & Pro-AV range of cameras, such as the compact AG-UX180; the shoulder-mount AJ-CX4000GJ camera; the EFP camera chains AK-UC4000 and AK-PLV100GSJ; the AW range of PTZ cameras; and the advanced 8K ROI multi-purpose camera system.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/ed8403aea9a3e39d1c7dd92862ed179c.jpeg?width=720&quality=85%2C50)
CANON with its LEGRIA HF G70 model, the Canon XF605 and its PTZ CR-N500.
JVC continues to raise the stakes with its light ENG model GY-HC500E 4K, its ultra-compact cameras GY-HM170E, GY-HM180E, GY-HM250E and GY-HM250ESB, and the PTZ KY-PZ510NW/NB and KY-PZ510W/B models.
BLACKMAGIC with its URSA Broadcast G2 model.
GRASSVALLEY with its LDX 150 model.
Finally, we cannot close this article without a brief reflection on other types of cameras, such as digital cinematography cameras or DSLR/mirrorless cameras for social media, which, even though they feature different production modes and standards compared to Broadcast cameras, all come very close in regard to 4K&8K: more resolution, more content.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/c1bee8e90109103da22027bb5cc0f1f2.jpeg?width=720&quality=85%2C50)
How was Windmill Lane born?
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/9dc89a3dc2f9e3ca18d75a0c54e86821.jpeg?width=720&quality=85%2C50)
In 1978, James Morris, Russ Russell, Brian Masterson and Miert Avis established Windmill Lane. At the time it was Ireland's first world-class recording studio and TV commercial post-production facility. Back then, Ireland was a country that people with ambition left (the late 1970s saw Ireland fall into sharp economic decline) and everyone thought they were crazy, but from a long-established culture of storytelling, creativity and determination, and a little bit of luck, came opportunity. The leading opportunity, which formed the basis of what we are today, was TV commercials. 2023 marks 45 years since we opened our doors as a music recording studio in the Dublin Docklands, famous for producing U2's first five albums. Now based on Herbert Street, Dublin 2, we provide post-production services for film, TV and commercials covering audio, colour grading, animation, VFX and editing.
Jason Gaffney - Marketing Manager for Windmill Lane
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/13e7d656bd0d8de6f50d6964745617df.jpeg?width=720&quality=85%2C50)
As a leading postproduction and visual effects company, can you tell us about Windmill Lane’s approach to delivering high-quality audiovisual content?
We believe that delivering high-quality audiovisual content is contingent on investing internally in talent and technology. For example, one of our fastest growing areas is VFX for global streamers (some of our current clients include Paramount, Netflix and Disney+) and therefore we need to ensure that our clients can rest assured their work is being processed to the highest spec and by people with experience. In February 2023 we appointed Stephen Pepper as VFX Supervisor, a multi-Emmy-nominated professional who has worked on District 9 and Iron Man. Around the same time we announced major upgrades to our world-class multi-room audio facility, including the installation of Dolby Atmos, a revolutionary spatial audio technology providing the most immersive sound experience. By May 2023 we had installed a new FLUX Store 360 and upgraded our existing pair of Baselight TWO systems, allowing us to keep up to date with new technologies and workflows while improving the colour team's connectivity, speed and performance. These are just some of the ways we stay on the edge of an ever-evolving global media landscape and continue to produce the highest quality content.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/e35ab31233cb82685e099d63f583dc58.jpeg?width=720&quality=85%2C50)
Dave Quinn - CEO of Windmill Lane
Can you share some notable projects where Windmill Lane’s expertise in post-production and visual effects made a significant impact?
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/0baedf3c88b24ad68b1a724dabf68262.jpeg?width=720&quality=85%2C50)
In 2022 we partnered with Oscar-nominated Henry Selick (The Nightmare Before Christmas, Coraline) and Oscar-winning Jordan Peele (Nope, Get Out) on Netflix's Wendell & Wild. This particular project was a mix of stop-motion, CG and VFX, which required a particular set of skills. This responsibility threw up some technical challenges that pushed Fred Burdy, Head of CG at Windmill Lane, and his team:
“Working on stop-motion shots revealed new challenges that were great fun to tackle. We discovered that the amount of plates to work with was huge, because even simple shots had multiple exposure with different lighting setups – and more complex ones had multiple passes, multiple exposures, and quarter-scale passes for the background environments. All that was shot on motion control which helped the consistency. The production also provided us with textured 3D scans of the sets we needed to put together in our set extension, mostly done through a 2.5D matte painting approach.”
This level of expertise was required more so than ever as Wendell & Wild was shot during Covid. Fred reflects:
“The team were amazing, despite the COVID implications that made us use a hybrid approach: some artists were on site, but most worked from home, and a good few were abroad. We were careful to have frequent catch-ups and made sure that people were as involved as possible even if they were not in the office – and to be fair it worked beautifully! It was a great experience overall to work with Mark Fattibene and Heather Abels, the super talented matte painting/digital/VFX supervisors we had collaborated with before. We’re very proud of having worked on this film and that we could make such an impact on the final piece”.
What are the key considerations when it comes to collaborating with clients and understanding their creative vision during the post-production process?
Understanding that today's content demand is aggressive and fast-paced, our internal strategy has allowed us to meet external expectations.
Deborah Doherty - Head of Production says “As a business, and for us as a client facing partnership, we have become more solution driven. Our domestic sector is hugely important to us but doesn’t negate our international focus and the two exist together perfectly well. Across the leadership team, and the company as a whole, we now work together in a more strategic ‘business oversight capacity’. This has not been easy – it requires new thinking. This is the first time we are bringing in projects together. Recent examples include HBO, Netflix, Paramount, AMC and SKY along with strengthening relationships in the growing local film, tv and advertising sectors”.
With the effort on growing international business, Deborah and John Kennedy, Head of VFX, knew that this would also require an evolution of Windmill Lane's service offering. John explains: “We realised we would have to apply a more holistic approach to how we engage with current and prospective clients. We are now more involved with clients from start to finish. In addition to discussing the creative, production and technical aspects of the project, we aim to add value by being more consultative. This can cover everything from our creative offering to helping clients avail of tax incentives”.
How does Windmill Lane approach the integration of visual effects seamlessly into live-action footage, ensuring a cohesive and realistic end result?
“We try to be involved in the creative process as early as possible, and work with the director and the DOP on set to help them shoot in the best way possible so that we can integrate our VFX down the line. We also make sure we capture all the needed information on set, such as camera and lighting references, panoramas of the set, 3D scans of the set and sometimes the actors if needed. This is really important if CG is involved, as we need to match the real set in CG. With all that, we have everything we need to properly start our work on a shot. Then we usually do rough passes of the VFX to show the intent and make sure everybody is on board. After that, the creative process is a lot of work on integration, refining the work as we go to get it really seamless. And eventually we have a finished shot!” Fred Burdy - VFX Supervisor with Windmill Lane.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/db9852792e8d41f1b2d329aa50b0572e.jpeg?width=720&quality=85%2C50)
In terms of data security and protection, what measures does Windmill Lane have in place to safeguard clients’ confidential and sensitive materials?
We take the security of any content that crosses our threshold extremely seriously. Over the last year at Windmill Lane we have completely overhauled both our physical and digital security infrastructure, culminating in our first TPN audit in January. Our TPN journey is ongoing and we’re preparing for our TPN Gold Shield audit later this year. Everybody at Windmill is keenly aware of how important security is across the entire media
& entertainment space and we’re committed to adhering to best practice and giving peace of mind to all of our clients. Ed Smith - Head of Operations at Windmill Lane.
How does Windmill Lane leverage technology and innovation to enhance the post-production process and create stunning work?
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/44fe400cb811980868cfbe3bb4d2132d.jpeg?width=720&quality=85%2C50)
To support this focus the company has recently unveiled major upgrades to its world class multi-room audio facility, including the installation of Dolby Atmos, a revolutionary spatial audio technology providing the most immersive sound experience. The new studios, featuring
Avid and Genelec equipment, ensure Windmill Lane will continue to operate at the highest standard of postproduction.
On the colour front Windmill Lane has installed a new FLUX Store 360 and upgraded its existing pair of Baselight TWO systems, allowing it to keep up to date with new technologies and workflows while improving the colour team’s connectivity, speed and performance. The Baselight TWO upgrade and new FLUX Store have provided immediate workflow benefits as well as the option to further expand in the future. Both investments serve to exemplify the importance of domestic projects to ensure
Irish content is produced to the highest standard. Having recently provided full post production on RTÉ’s massively successful crime drama KIN, Windmill Lane prides itself on making great stories about Irish life and showcasing them to a global audience.
On 5 July 2023, Disney+ will release Kizazi Moto: Generation Fire. This animated anthology, by Irish-based animation studio Triggerfish, brings together a new wave of animation stars to take you on a wildly entertaining ride into Africa's future, and it is a perfect example of how Windmill Lane is leveraging technology and innovation to evolve the company's output.
Inspired by the continent’s diverse histories and cultures, these action-packed sci-fi and fantasy stories present bold visions of advanced technology, aliens, spirits and monsters imagined from uniquely African perspectives. Windmill Lane provided colour, sound editing & mix services on this ambitious project.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/512832bb860a6374b104f8d1384335da.jpeg?width=720&quality=85%2C50)
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/a8b70d7cbca4a05ee1dfa08e9d2e14cc.jpeg?width=720&quality=85%2C50)
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/9a0efce5ddb938e1d3f8851f21fb1451.jpeg?width=720&quality=85%2C50)
Jeff Rosica is the Chief Executive Officer (CEO) & President, and a member of the board of directors, of Avid Technology. Prior to being appointed Avid’s CEO & President in February 2018, he had served in a number of senior executive roles with the company.
Mr. Rosica is a 35 year industry veteran in the broadcast and media technology segment, with extensive experience spanning production, post-production and distribution technology solutions.
Prior to joining Avid in 2013, Mr. Rosica served in various senior leadership capacities with leading industry brands including Grass Valley, Thomson/Technicolor, and Philips Electronics.
On June 15th, TM Broadcast attended the final stop, in Madrid, of The Future of Post, Avid's European tour in which companies, distributors and suppliers learned from some of the senior leaders in the Avid team how the future of post is shaping up.
The TM Broadcast team met Jeff Rosica, Avid Technology's CEO and President, and exchanged impressions about the future of the industry and the cardinal points we need to be aware of in order to navigate these exciting times.
The first question, a mandatory one, is about how you foresee the future of the industry; you have a privileged point of view of new developments in the content creation landscape… but when we arrived here, the first thing we found was a totem that says “The Future of Post”, so I'd like to change the subject: what is the future of post?
It's a lot. I think we are at an interesting time in the industry. I've been in the industry for 35 years and I've seen a lot of change (the introduction of digitalized archives… a great amount of change), but I've never seen anything like this. There is more change going on in our industry just because our world is changing, and I think we are living through a time of rapid change: viewers' habits are changing, and technology shifts are happening five times as fast as they were five years ago. The pandemic accelerated the whole idea of people working in a more distributed way, and even though our industry was already doing that, because working with remote locations or freelance workers has always been a part of our industry, it depended on travelling a lot. The pandemic allowed us to try and do new things: there was no choice, you really had to do it, because everybody was at home.
I think the future of post will be different than we first thought, but still quite good. The good thing is that consumption of content is rising because there are a lot of new devices that allow people to watch content anywhere, any time. There's a strong appetite for good-quality content on a global basis. In the old days, content used to be filmed in London or at studios in Hollywood or Los Angeles: all production was local… Today… well, we just need to look around us; here in Madrid the shows are getting big hits globally… and not only Madrid, but the whole of Spain.
I think the world is changing, appetites are changing; people want to see content that's more global. Usually you want to see storytelling related to your culture and in your language, but now there's a strong appetite for more global content: storytelling from around the world. And this is changing our industry. It's an exciting time and the future appears bright, but different: people are going to work in a very distributed way, the content will be in HD, and AI is, I believe, almost an industrial revolution. To summarize: it's going to be a significant change.
Now, you almost stepped on my next question, AI; what role is AI going to have in content production and development?
I know there's a lot of debate about this. I believe AI will be deployed widely across the whole industry; even generative AI will be deployed; but I don't think it is going to replace creativity.
AI is nothing more than a very complex computing model that is copying human invention; it's just repeating human works and serving that back, so humans still have to create. On the other hand, I believe we have to proceed carefully with AI, socially, legally, culturally…
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/ad259362822b2bb9b09277a1b2a2f77e.jpeg?width=720&quality=85%2C50)
But it can help?
It can help creativity and it can help efficiency; it really helps to create much more efficient workflows, and the industry needs that, as we are impelled to create more and more content and we need to find a more efficient way of creating it. Besides, AI is going to take routine tasks away, which is good for creativity.
I think that AI will help creativity too, making 3D animation and visual effects more accessible, since the costs of both have come down. AI is going to revolutionize the industry, and it can put these tools within reach of any creator at a lower cost.
If you are creating a program and you decide, “oh, I think I should insert an opening scene on a beach with golfing”, but you've never shot that, you can go and shoot the scene and spend a lot of money… or you can just tell the AI to create the scenery you're going to use.
![](https://assets.isu.pub/document-structure/230710132823-b6899686bc58fe68f07d0b30a8086c8f/v1/543ddb4e1f8d10fed32c06658f5ba476.jpeg?width=720&quality=85%2C50)
There are, though, elements to this that are important, as we're going to have to protect copyright, privacy, people's IP… And we have to do it properly. But I really do believe that AI is going to be similar to an industrial revolution.
Talking about efficiency and reducing costs, recently we read about Televisa-Univision and Avid developing production workflows on Google Cloud… What can you tell us about this new alliance and how is it going?
It's just started. Televisa and Univision, because it's a combination of Univision and Televisa blended together, have some really strong visions about the future, not just about how they want to integrate the companies but about how they want to work in the future. They had a vision about how to virtualize all their environments and how to create workflows that distribute people and work, so people can share content and share ideas. And they looked at the cloud and saw an opportunity there; they also see the cloud as a good way of generating efficiencies in their organization…
Avid is a major supplier of technology for them and they wanted us to look at how they could standardize on a way to operate in the cloud. So that work has started, and Google is involved, and it is going to be operating on Google Cloud. Google is helping with some of the research and design work that we have to do together. So, right now we are in the development stage, because there is research that still has to be done to get to Televisa and Univision's real vision. I cannot tell you the exact schedule, but we expect to see the first results pretty soon, and we will be starting transitions next year.
It's exciting because Televisa and Univision are huge producers of content and obviously the main broadcasters in Mexico and the USA; they are leaders in sports, in news… To have somebody that large and sophisticated, who has really thought about how to run all those operations and all that content flow, is pretty exciting, and I think it is going to help lead the industry in that direction.
Collaboration with Amazon and AWS
And another interesting one is the work that we are doing with Amazon. Amazon Studios, which is Prime Video, is developing a studio-in-the-cloud initiative, which is building a cloud-deployed studio; and we are building it right now with AWS and with Amazon. That is going to be a very interesting initiative; if a production, once it is started up, needs to move, they can literally deploy the technology wherever they are and open direct connections. That's coming and it will change a lot.
Talking about solutions for production workflows… How can virtual production help?
Virtual production mainly reduces costs. The fact that you don't have to take the crew, twenty people, a hundred people, to a location brings significant cost savings, but we also have to understand the possibilities of virtual production: being able to try new and brave things and to actually see the outcome live or near-live… This is an important impact of virtual production on the industry. I think it is changing workflows in a really good way.
During these times in the audiovisual industry, and specifically in virtual production, there are claims about a lack of talent to recruit… Is this a temporary issue?
I think this is a real issue. I spoke at the Hollywood press association meeting last February, where I gave a speech and talked about seven principles that are going to shake our industry. One of them is 'We Are Running Out Of People', because we are literally running out of people. If you look at the growth of content creation happening across the world, and you look at the number of people who are in the industry today, the number who are coming into the industry and the number who are retiring out of it, you realize that we are quickly going to run out of people to keep up with the demand for content creation.
And I'm talking about creative roles, technical roles… almost every role that exists around broadcast and media production. We are coming dangerously close to a point where we are not going to have people for these roles. And you can already see it in certain areas of the market: it's happening.
We're going to have to think about this and about how we are going to get more people into the industry faster. We want young talent to want to be in our industry, not in gaming or other industries…
We will need to look for ways to be more efficient. I think AI is going to be helpful just at the right time, because the demand for content is growing much faster than the talent available in the industry to produce it.
Also, I think that people will work in a more distributed way, and that will help: it could be that somebody here in Madrid doesn't find a project to work on, and they could be hired in Germany or Los Angeles and work remotely. AI will help to boost our efficiency, and this will allow content creators to focus on creativity.
So the short answer is yes, there is going to be a growing lack of talent and a lack of skills to produce and to do what we need to do in the broadcast and media content industry.
And finally, last but not least, what can we expect to discover from Avid at the upcoming IBC 2023? Is Avid planning on presenting new tools or technical developments?
We have disconnected trade shows from product launches. We don't do that anymore. The only time we do it, it's coincidental: if there happens to be a trade show in the same month that we are going to launch a product, we can adjust the timing so the launch lands one week before, for example. But we don't tie ourselves to the trade show calendar. Whenever a product is ready, it gets launched. COVID helped with this, because people got used to learning about new things through virtual content, watching content on screens and displays, without having to go somewhere.