Darryl Jefferson and Jim Miles on NBC Olympics' File-Based Workflows, Storytelling
TOKYO OLYMPIC GAMES: NBC
Every NBC Olympic effort sees massive changes and advances with respect to file-based workflows, editing, and more. Toss in UHD, HDR, and immersive audio, and those advances are even more challenging and, ultimately, impressive. Darryl Jefferson, NBC Olympics, VP, broadcast operations and technology, and Jim Miles, NBC Olympics, director, digital workflow systems, discussed the multi-continent, multi-time-zone, multi-facility effort with SVG.
Can you describe the ecosystem here?
Miles: It’s Avid Media Composer for craft editing, and Avid Interplay MAM is the record apparatus for highlight-shot selection as well as our archive. The entire Olympic archive is Interplay and MediaCentral, and our playback turnaround is EVS. We do a lot with Telestream for transcode, flipping, and orchestration, and we use Signiant as our file mover and for transfers from the venues and to Stamford.
Jefferson: We also have a new ingest device from Telestream called Live Capture, which can capture 1080p HDR content as well as the older flavors of content. And our big monster storage is Dell EMC Isilon.
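None of the following is NBC's actual tooling, but a minimal sketch helps picture how the pieces Miles and Jefferson list hang together: a capture is registered in a MAM-style index, checked against the primetime format, and queued for a file transfer. All names (Asset, MAM_INDEX, plan_steps) and format labels are hypothetical placeholders, not Avid, Telestream, or Signiant APIs.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    venue: str
    fmt: str                      # e.g. "1080i SDR", "1080p HDR"
    destinations: list = field(default_factory=list)

# Hypothetical stand-in for the Interplay/MediaCentral index.
MAM_INDEX = {}

def ingest(asset):
    """Register a newly captured asset (think: an ingest device writing to central storage)."""
    MAM_INDEX[asset.asset_id] = asset
    print(f"indexed {asset.asset_id} from {asset.venue} ({asset.fmt})")

def plan_steps(asset, primetime_fmt="1080p HDR"):
    """List the downstream steps an asset needs before primetime can use it."""
    steps = []
    if asset.fmt != primetime_fmt:
        steps.append(f"transcode {asset.fmt} -> {primetime_fmt}")   # flip/transcode leg
    steps.append("transfer venue -> Stamford")                      # file-mover leg
    return steps

clip = Asset("AQ-0042", "Aquatics Centre", "1080i SDR", ["primetime", "highlights"])
ingest(clip)
for step in plan_steps(clip):
    print("queued:", step)
```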
Miles: The interesting story on the editors is that we are still using hard workstations for the primary craft edits, but all our auxiliary edits are virtual machines. We used to have to bring 30 Avids to the IBC, but, this year, we had to bring only a dozen, and we have the VMs for the producers and those lighter-weight tasks. That has been huge for us in terms of the complexity of what we must build.
Darryl Jefferson (left) and Jim Miles and the team worked hard to allow creatives to focus on storytelling during the Olympic Games.
You have teams around the world diving into your file-based workflows. Are the workflows the same everywhere?
Miles: More or less. We try to put the high-resolution recording where it’s needed. If somebody is doing a turnaround at a venue, we’ll put it right there next to them in their local storage. If something is needed for primetime, we can move the content here to the IBC.
But our main record apparatus has moved back to Stamford, and the hundred feeds coming in from the different venues, from the host, and from our own all go back to Stamford and are recorded there, where the bulk of our ancillary users are. Then there are other business units, in Miami and in the news organization at 30 Rock, that are pulling from that recording wall in Stamford.
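To make the placement logic concrete, here is a rough sketch, with made-up site names and counts, of the rule Miles describes: record each feed where the bulk of its consumers sit and keep a local copy wherever a venue team is turning it around itself.

```python
def recording_sites(consumers):
    """Pick where to record a feed, given how many users need it at each site.

    Hypothetical rule of thumb, loosely following the article: record at the
    site with the most consumers (Stamford for most feeds), and add a local
    copy at any venue whose team is doing its own turnaround.
    """
    primary = max(consumers, key=consumers.get)
    extras = [site for site, count in consumers.items()
              if site.startswith("venue:") and count > 0 and site != primary]
    return [primary] + extras

print(recording_sites({"Stamford": 40, "IBC Tokyo": 6, "venue:Aquatics": 2}))
# ['Stamford', 'venue:Aquatics']
```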
Jefferson: The other thing is the added wrinkle of HDR for the primetime show and for our venues. That adds a layer: you need to know which version of a recording you have, whether we’re doing parallel recordings, and whether it’s an SDR sport contributing into an HDR primetime show or vice versa. We try to normalize the content for the end user so that complexity is obscured and people have to think about it less.
Miles: We have well over 100 different paths, many of which have file conversion or interlace-to-progressive or progressive-to-interlace conversion in the middle. It has been an interesting challenge to build it all ahead of time and then get it running in a matter of days.
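As a hedged illustration of what building one of those paths can involve: given a source flavor and a destination flavor, work out which conversions sit in the middle. The format labels and rules below are invented for the example, not NBC's actual conversion matrix.

```python
def conversion_chain(src, dst):
    """Return the processing steps between two format labels.

    Labels are "<scan> <dynamic range>", e.g. "1080i SDR", "1080p HDR".
    The rules are a simplified, illustrative stand-in for a real conversion matrix.
    """
    src_scan, src_dr = src.split()
    dst_scan, dst_dr = dst.split()
    steps = []
    if src_scan.endswith("i") and dst_scan.endswith("p"):
        steps.append("interlace -> progressive conversion")
    elif src_scan.endswith("p") and dst_scan.endswith("i"):
        steps.append("progressive -> interlace conversion")
    if (src_dr, dst_dr) == ("SDR", "HDR"):
        steps.append("SDR -> HDR up-map")
    elif (src_dr, dst_dr) == ("HDR", "SDR"):
        steps.append("HDR -> SDR down-map")
    return steps or ["pass through"]

print(conversion_chain("1080i SDR", "1080p HDR"))
# ['interlace -> progressive conversion', 'SDR -> HDR up-map']
```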
Jefferson: It has been an education for our legion of freelance editors and freelance operators in general. They need to wrap their heads around which recording is closest to the output of their show. Or, if someone’s delivering into a show that’s in a different format from the one they’re normally cutting in, it takes a little while to get up to speed.
But we do have islands of 1080p HDR, like at the venues, and, once they’re at the venue, they have to worry much less about their environment. They only need to worry about the outliers, like an ENG camera coming in, or an interview or reaction shot from another broadcaster that won’t be in 1080p HDR.
Miles: Content+ is a great example where we have that fire hose of content coming in from OBS and, depending on who’s pulling it, some of them will just take it natively and some are taking it with a LUT [look-up table], or some are taking it with a transcode depending on where it’s going.
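A small sketch of the per-consumer dispatch Miles describes for Content+ pulls; the format labels and the rule for when a LUT suffices versus a full transcode are assumptions for illustration, not the actual routing.

```python
def contentplus_treatment(consumer_fmt, source_fmt="1080p HDR"):
    """Decide how a Content+ pull is handed to a consumer (illustrative rules only)."""
    if consumer_fmt == source_fmt:
        return "take natively"
    if consumer_fmt.endswith("SDR") and source_fmt.endswith("HDR"):
        return "apply an HDR-to-SDR LUT"
    return f"transcode {source_fmt} -> {consumer_fmt}"

for fmt in ("1080p HDR", "1080p SDR", "720p HDR"):
    print(fmt, "=>", contentplus_treatment(fmt))
```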
Over the past year, there has been a lot of talk in the industry about how editors, for example, don’t need to be onsite. What do you see as the benefit of having editors at the A venues — swimming, gymnastics, athletics — and at the primetime set?
Jefferson: Having folks in the city and venue where the event is happening lets them feel the pulse of the Games, and they can interact with the players or the coaches. In some cases now, we bring those athletes and families together with our technology, and that’s a bonus.
But, for every editor that’s here, we have probably five or six in Stamford. Our onsite complement is down substantially from Games past, but you do need a handful of editors that can do the fast turnarounds, keep up with the Games, and cut up stories and inbound elements that are happening now.
Miles: To achieve that level of efficiency for the tight turnarounds, and when things are in higher resolution or more complex, you need people onsite. But the more casual editing, like if you’re just clipping a shot for a highlight, that has been [done at] home for a decade.
I wanted to ask about OBS Content+, which is where OBS makes everything available: the live events, highlights, features, B roll, and more. How does your team use that, and what does it mean for your content-creation efforts?
Jefferson: It does give us raw material for interviews, profiles, their aerial shots, course, animations, all those types of things. But, at the end of the day, the deeper dive on U.S. athletes still ultimately comes from us, following athletes all the way back to their hometowns and that type of thing. We have a zeroed-in approach on the U.S. team.
Beach volleyball continued to be popular with viewers, especially here in the U.S.
Content+ helps when we do deeper dives on non-American athletes. Normally, we send crews all over the world, but this has been an odd year for travel and for capturing people in their homes or training facilities. Leaning on OBS for that type of thing comes at a really good time.
Miles: And it gives us content that we didn’t plan for. We plan for the U.S. athletes, but, if there’s the breakout star from Argentina or somewhere else, we can find the deeper dive there.
I am assuming it is also great because it means you can’t miss anything.
Jefferson: Sometimes you have a mechanical failure on something and realize that was the only place anything was rolling, but now there are many other opportunities.
Miles: It’s a delicate balance, too. For example, we have splits at the venues and tons of our own cameras at each venue, and it’s not really feasible for us to save every frame of video from every camera for the duration. So we do go through our own process of melting things down to the best shots. Sometimes, you do miss something, but, generally, we do an incredible job of capturing not only what OBS is doing but our cameras as well.
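The "melting down" Miles mentions is, in spirit, consolidating the flagged shots and letting the rest of the record go; here is a minimal sketch of that cull, with invented clip names and timecodes rather than anything from NBC's system.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    camera: str
    tc_in: str
    tc_out: str
    keep: bool          # flagged by a logger / shot selector

def melt(record):
    """Illustrative 'melt': keep only the flagged clips, let the rest expire."""
    return [clip for clip in record if clip.keep]

day_record = [
    Clip("ISO 3", "10:01:12:00", "10:01:30:00", keep=True),   # key reaction shot
    Clip("ISO 3", "10:05:00:00", "10:20:00:00", keep=False),  # dead time between heats
    Clip("ISO 7", "10:06:40:00", "10:06:55:00", keep=True),   # alternate angle
]
print(melt(day_record))
```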
Jefferson: Now it’s less about missing coverage of an event and more about an iso shot that zeroes in on a footfall or a nudge on the track. Our team is looking for a specific angle, and maybe, in that moment, we have 50 other angles but not the one they’re looking for. There is a bit more nuance now as to what people are looking for.
We discussed HDR, but you are also working in 5.1.4 and Dolby Atmos. How are you handling that for editors?
Jefferson: If you’re doing a tight turnaround, you want to carry those height mikes as an additional element on the timeline, and that has been an interesting process. What does the submix look like before it’s delivered for final mixing and Atmos encoding on the backend? That has been an additional kind of learning experience for all of us, figuring out how to mix in the environment we have. Having a mixed audio environment is odd, but it has been interesting teaching folks about it. – Ken Kerschbaumer
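As a hedged sketch of what carrying the height mikes as an additional element implies downstream: 5.1.4 is the familiar 5.1 bed plus four height channels, and any stage that only takes 5.1 has to either drop those heights or fold them into the bed. The channel names and the 0.5 fold-down gain below are illustrative assumptions, not the show's actual mix.

```python
BED_5_1   = ["L", "R", "C", "LFE", "Ls", "Rs"]
HEIGHTS_4 = ["Ltf", "Rtf", "Ltb", "Rtb"]     # top-front and top-back pairs

def layout_5_1_4():
    """The 5.1.4 channel set: the 5.1 bed plus four height channels."""
    return BED_5_1 + HEIGHTS_4

def fold_down_to_5_1(levels, height_gain=0.5):
    """Illustrative fold-down: mix each height channel into the nearest bed channel."""
    out = {ch: levels.get(ch, 0.0) for ch in BED_5_1}
    out["L"]  += height_gain * levels.get("Ltf", 0.0)
    out["R"]  += height_gain * levels.get("Rtf", 0.0)
    out["Ls"] += height_gain * levels.get("Ltb", 0.0)
    out["Rs"] += height_gain * levels.get("Rtb", 0.0)
    return out

print(layout_5_1_4())
print(fold_down_to_5_1({ch: 1.0 for ch in layout_5_1_4()}))
```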
This interview has been condensed and edited. To read the full interview, visit the SVG SportsTechLive Blog.