
GUEST COLUMN

WHAT LIES AHEAD?

If we take for granted the rapid moves towards 4K, HDR Ultra HD and IP infrastructures, then the major challenge facing sports production today is the need to create more content, with less time to get it in front of an audience.

Viewers are constantly looking for more: more angles, more replays, more graphics, more insight. At the same time, production companies are looking to automate and to reduce their environmental impact through remote production and hybrid environments. Even if primary switching continues to be done in a truck at the venue, highlights packages, expert reports, opinion pieces and features are increasingly being cut remotely.

Those remote edits may happen back at the producer's or broadcaster's headquarters, or – thanks to the cloud – at a completely different location. With proxy-based workflows and cloud rendering, the editor could be working from home or anywhere else. This is a very practical proposition once time pressures are eased: perhaps not for the half-time highlights reel, but certainly for a late-night round-up.

While of course we concentrate on bandwidth to move content and ensure timely delivery, what makes all these workflows possible and efficient is metadata and automation. Metadata is always critical in sport; it helps everyone identify the key players and the critical plays. What we need is to make sure that the metadata is available to all with minimal manual intervention, performing much of the heavy lifting as we seek to create ever more engaging productions.

The generation of metadata is already well established, using automated and human logging. The focus now needs to be on how we can use that metadata for maximum efficiency. That calls for a single brain, but with distributed, continual, real-time synchronisation between that metadata and all the other devices that need to use it.

Clearly, for maximum speed you want to give each editor just the content needed for the package they are cutting. In an ideal world, that material would arrive at the edit workstation already organised in the format required by that software and that operator, with the content organised and the timelines aligned. The cleanest way to organise this is to have metadata, content flow and workflow management software working in conjunction with the storage platforms. In turn, the storage platforms need to be capable of supporting multiple sources (the venue and the production base) and the cloud. Projects are then defined in this central workflow space and assigned to individual editors as required.
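As a minimal sketch of that idea – not any vendor's API, and with all class and clip names here purely hypothetical – a central workflow "brain" might own the logged metadata and hand each editor a project containing only the clips tagged for their package:

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    clip_id: str
    tags: set  # metadata labels from automated or human logging

@dataclass
class Project:
    name: str
    editor: str
    clips: list = field(default_factory=list)

class WorkflowBrain:
    """Hypothetical central 'single brain': owns metadata, assigns projects."""
    def __init__(self, library):
        self.library = library  # all logged clips, wherever they are stored

    def assign(self, name, editor, wanted_tags):
        # Give the editor only the material needed for their package,
        # filtered by metadata rather than by manual searching.
        clips = [c for c in self.library if c.tags & wanted_tags]
        return Project(name, editor, clips)

library = [
    Clip("cam1_goal_0231", {"goal", "player:9"}),
    Clip("cam4_crowd_0233", {"crowd"}),
    Clip("cam2_save_0245", {"save", "player:1"}),
]
brain = WorkflowBrain(library)
package = brain.assign("half-time highlights", "editor_a", {"goal", "save"})
print([c.clip_id for c in package.clips])  # only the goal and save clips
```

In a real deployment the library would span venue storage, base storage and the cloud, but the principle is the same: the filtering happens centrally, before the material ever reaches the edit workstation.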

This abstracts the workflow management from the editing. It follows that the storage network and asset management system must be completely agnostic to editing software (and other post tools), yet capable of delivering material in precisely the right format for each. It may even be that a single event requires the use of more than one editing package: a specialist fast-turnaround editor for half-time highlights, the house standard software back at base, and maybe a third application for the editor working from home.
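One common way to stay agnostic is a registry of per-package exporters, so the same central project description can be rendered into whatever shape each edit application expects. A hedged sketch, with the format names "nle_a" and "nle_b" entirely made up:

```python
# Hypothetical exporter registry: the asset layer holds one neutral
# project description and renders it per editing package on demand.
EXPORTERS = {}

def exporter(fmt):
    def register(fn):
        EXPORTERS[fmt] = fn
        return fn
    return register

@exporter("nle_a")
def export_nle_a(project):
    # Imagined package A wants a structured bin description.
    return {"bin": project["name"], "media": project["clips"]}

@exporter("nle_b")
def export_nle_b(project):
    # Imagined package B wants a flat text edit list.
    lines = [f"PROJECT {project['name']}"]
    lines += [f"CLIP {c}" for c in project["clips"]]
    return "\n".join(lines)

project = {"name": "late-night round-up", "clips": ["goal_0231", "save_0245"]}
for fmt, export in EXPORTERS.items():
    print(fmt, export(project))
```

The point of the registry is that supporting a new edit package – open or closed – means adding one exporter, without touching the central workflow or the other integrations.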

That is not a trivial requirement. Some software vendors are open and actively encourage plug-ins and integration; others have much more closed systems, thanks to their legacy design. But all must be accommodated if the concept is to work. This calls for imagination on the part of the workflow designer, to find ways to achieve that integration while keeping within the look and feel of the third-party edit software. These challenges must be addressed to provide a universal approach without performance limitations.

The goal is to seamlessly move complex projects between different locations and editing environments, regardless of which client software is chosen. That means full synchronisation including bins and project locking: the content and the structure must be delivered complete, without time lost in dragging and dropping. Synchronisation should occur server-side, with no extra steps required of the user. That synchronisation must be bi-directional, so decisions made by one editor are not only captured but available to other editors if required.
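One simple way to make such synchronisation bi-directional is last-writer-wins merging per field, resolved server-side. The sketch below is an illustration of that idea only – the class, field names and timestamps are all hypothetical, and real systems add locking and conflict handling on top:

```python
class SyncedBin:
    """Hypothetical server-side bin: merges edits from multiple editors
    bi-directionally, newest change winning per field."""
    def __init__(self):
        self.fields = {}  # field name -> (value, timestamp)

    def local_update(self, name, value, ts):
        self.fields[name] = (value, ts)
        return name, value, ts  # the change, ready to propagate

    def merge_remote(self, name, value, ts):
        # Accept the remote change only if it is newer than what we hold,
        # so one editor's decision propagates without clobbering another
        # editor's more recent work.
        current = self.fields.get(name)
        if current is None or ts > current[1]:
            self.fields[name] = (value, ts)

bin_a = SyncedBin()  # editor at the venue
bin_b = SyncedBin()  # editor working from home

change = bin_a.local_update("in_point", "00:12:04", ts=100)
bin_b.merge_remote(*change)             # A's decision reaches B
newer = bin_b.local_update("in_point", "00:12:06", ts=101)
bin_a.merge_remote(*newer)              # and B's later trim flows back
print(bin_a.fields["in_point"][0])      # both bins now agree: 00:12:06
```

Because merging happens in the shared bin rather than in any one client, no editor has to take an extra step to publish or fetch changes.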

The result will be much greater productivity and efficiency. That translates into a less stressful experience for the post-production team and of course more engaging, entertaining and informative content for the viewer.

“Viewers are constantly looking for more: more angles, more replays, more graphics, more insight”


Sunil Mudholkar is Vice President of Product Management at EditShare.

