Disney in The Woodlands
Where does this project come from and what was it about?
This goes back to the 21st Century Fox acquisition by Disney. As part of that effort, we had been on standby, looking for an opportunity for the sports and entertainment networks to consolidate. Prior to this we had a lot of disparate sites. If you go back three to five years, we had very site-specific broadcast facilities. I am standing right now in the Burbank center in California, which is one of them. This is where the Disney channels were originating; I have been here for 20 years now. We started in the building back in 1997.
This effort overall was to consolidate a lot of our entertainment and sports networks into a new facility. Part of the 21st Century Fox acquisition included a broadcast facility in The Woodlands, Texas. What is great about this one is that it was built to withstand a lot of the weather events that happen there, and it is a world-class, first-rate broadcast facility in general.
When the leadership team got to see the site, they said, "This is perfect." When the opportunity and the timing presented themselves, we revived a project that predated the acquisition itself and were activated to consolidate all of these broadcast networks into a single site in Texas.
How many phases did it take to complete it?
We had very aggressive timelines, including what is called a transfer of service agreement with Fox. We had to vacate the Pico Lot facility on the Westside of Los Angeles within two years. The countdown pretty much started with that effort.

Media Operations QC Room Pre-Roll-Out. Credit Jon Edwards.

Luckily, on the Disney cable side, we had a concurrent project here in which we put all of our assets in a Disney data center. We combined the FX and National Geographic networks that were at the Pico Lot with the Disney cable networks in Burbank. We had about 12 to 14 months to do it, which was not a very long time. As we made this migration, we had to make sure that we were sensitive to the timing and handling of broadcast networks. A lot of the work was done in the background but then brought forward as time permitted.

Our first effort was to move the FX and National Geographic networks from Pico on an accelerated timeline. Instead of 24 months, we actually did it within about nine. It was definitely a team effort across organizations, and we were able to get them live in September of last year.
What are the advantages, and also the risk of this integration?
The company already had plans to consolidate its entertainment linear broadcast networks. There were several different variations of it prior to the acquisition, but it had always been on the back burner. The reason is that it is always advantageous to have all of your networks in a single location. That way, they can share infrastructure, connectivity, networking, and the technology that sits underneath it all, in one location.
What makes it a primary place to be is that you have all your media assets now running into a single location. That is very advantageous for sharing content among different initiatives. For example, The Simpsons and Family Guy were very popular shows. Not only did we get them in-house; now we can share that same Family Guy episode not only on the FX networks, where it ran before, but also on Freeform, our other network on the Disney cable side.
I will give you another technological advantage. Because you have all of your feeds on a single site, you can now share them from that one site, as opposed to having to point at eight different areas of service that each pay for a link from one location to the next. Now they are all in a single location, so I can share all my feeds from one location out to vendor partners, and that is really advantageous.
Another point is that we can now share our linear streams with our internal partners, like Hulu, who get the benefit of having these feeds from a single location. They can call a single number and have a single support structure, a single engineering and transmission team, and so on.
The other part of your question resonates with me because there is actually an industry trend here. Our competitors and partners in the field have been doing the same thing: looking at sites where they can aggregate a lot of their linear broadcast networks into a single location.
What were the main technological challenges of this project?
That is a hard question to answer, because I lived it along with the teams. We were built on a lot of legacy infrastructure that was nearing the end of its life.

ESPN Disaster Recovery Unified Space. Credit Jon Edwards.

ABC Room - Pre-Build - March 10 2021. Credit Sam Exley.

Space Build. Credit Jon Edwards.

When you were asking about how quickly we did this, there was also a time clock on several systems. One of the biggest challenges was trying to beat that clock in order to come off of those platforms very quickly. I am still surprised today that we did it. I speak a lot to the play-out system side for linear broadcast, but there was also a very massive media network that needed to be pulled into the site as well. You had an aging 21st Century Fox infrastructure with thousands of assets that needed to be moved, not only to us but, of course, to Disney Plus and other apps like Hulu, and they also had to come to the linear side as well.

We had a very massive effort that took months, weekends, and holidays of the team's time, because there was a lot of work to be done to make sure that those assets were moved from the Pico Lot into the new systems. Meanwhile, that clock was on the wall: "Hey, these systems are probably going to fall over, and if they do, they might not come back up." It was a point of no return. We had to do it. Everybody chipped in and made sure that it happened.
What infrastructure are you using right now in your center?
On our side of the fence, we have a vendor deal with Imagine Communications, whose software runs our linear broadcast platform. For example, if you are watching Freeform or Disney cable at home, or FX, National Geographic, or ABC, it's running through that software. It takes all of the assets or live events that are happening and puts them together.
Control Room Phase I Wide - September 2020. Credit Sam Exley.
Cables Control Room - Phase II - March 10 2021. Credit Jon Edwards.
It's very much like a playlist that you would create on your own, but that playlist needs to run on enterprise-grade technology. We have a lot of backup systems for it as well, so that is one of our primary vendors for that technology.
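As a rough illustration of that playlist idea, here is a minimal Python sketch of a linear playout schedule and the kind of gap/overlap check such a system has to run continuously. The names and structure are hypothetical, not Imagine Communications' actual data model:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative only: a toy model of a linear playout schedule entry,
# not Imagine Communications' actual data model or API.
@dataclass
class PlaylistEvent:
    asset_id: str          # house ID of the program, promo, or commercial
    start: datetime        # scheduled on-air time
    duration: timedelta    # slated run length
    event_type: str        # "program", "promo", "commercial", or "live"

def validate_playlist(events: list[PlaylistEvent]) -> list[str]:
    """Flag gaps or overlaps between consecutive events -- the kind of
    check a playout system runs so the channel never goes to black."""
    problems = []
    for prev, nxt in zip(events, events[1:]):
        expected = prev.start + prev.duration
        if nxt.start > expected:
            problems.append(f"gap before {nxt.asset_id} at {expected}")
        elif nxt.start < expected:
            problems.append(f"overlap at {nxt.asset_id}")
    return problems
```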
ABC Primary Control Room - Pre-Production - October 2020. Credit Jon Edwards.
On the media asset side, we use Evertz Mediator-X, or Mediator 10, which makes sure that we get all of those assets. All of the commercials that you see on air, all of the promos, and all of the programs, like Family Guy, The Simpsons, or the Disney Channel shows, need to run through that system before they make it to air. It is basically the central point of a lot of our media workflows for linear broadcast.
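To make that gating idea concrete, here is a toy Python sketch of an asset that must clear every workflow step before it is eligible for air. The step names are illustrative assumptions, not Evertz Mediator's actual states:

```python
# Illustrative only: a toy gate modeling the idea that every asset --
# commercial, promo, or program -- must clear a set of checks before
# it is eligible for air. Not Evertz Mediator's actual workflow engine.
REQUIRED_STEPS = ("ingested", "qc_passed", "transcoded", "registered")

def ready_for_air(asset_status: dict[str, bool]) -> bool:
    """An asset is playable only once every workflow step has completed."""
    return all(asset_status.get(step, False) for step in REQUIRED_STEPS)

family_guy_ep = {"ingested": True, "qc_passed": True,
                 "transcoded": True, "registered": True}
print(ready_for_air(family_guy_ep))  # True -- cleared for linear broadcast
```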
How many signals and content pass through this facility right now?
That's a good one. On the FX networks and National Geographic we have six East Coast and two West Coast networks. On the ABC side, we have four time zone-related feeds with East, West, Mountain, and Pacific. For Disney Channels and Freeform, we have four East and four West networks.
After that, there are a lot of derivative feeds that come out of those. ABC is not just ABC: it feeds 240 affiliate stations across the US. Our feed is the primary for stations all across the United States, so we have to make sure that whatever we feed goes out to them correctly. We are basically 240 networks if you really calculate it. Those are just the primary feeds, so your question actually breaks out even further.
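Summing just the primary feeds named above gives a sense of the fan-out before affiliates and partners are counted; a quick back-of-the-envelope in Python:

```python
# The primary network feeds named in the answer, summed up.
primary_feeds = {
    "FX/National Geographic": 6 + 2,    # six East Coast, two West Coast
    "ABC": 4,                           # East, West, Mountain, Pacific
    "Disney Channels/Freeform": 4 + 4,  # four East, four West
}
print(sum(primary_feeds.values()))  # 20 primary feeds
# ...and the ABC feed alone fans out to roughly 240 affiliate stations.
```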
We actually have partner deals with folks like Hulu Live TV, YouTube TV, Sling, and Dish Network as well, for their apps. The feeds actually start to disperse out even further. It is a hard question to answer, as there are actually thousands of endpoints at the end of the day.

ABC Primary Control Room - In Production Milestone - March 10 2021. Credit Jon Edwards.
What are the technological challenges of distributing all the signals and all this content across a vast country like the United States?
A lot of it currently depends on the satellite systems, in order to make sure that everybody gets the feed at the same time. For ABC in particular, all of our affiliate stations around the US want the feed at the same time, with minimal latency.
We are always looking at the technology and the distribution to make sure that everybody is getting the same feed. From day to day, our challenges are making sure that everything performance-wise is good; that all the media assets come in the right way and play in the right format. There are also audio challenges, and there is what we call ancillary data in the feeds. All of that needs to run 24/7.

We work on all of that to prevent anything from impacting what's on air. On the engineering and operations side, we are in a constant state of making sure things run correctly so that the viewer doesn't see those problems. So that is the day-to-day challenge to your question.
The strategic challenge is just getting those feeds to everybody without problems, because it is a vast network, and we also feed networking systems through what we call terrestrial distribution, which is point-to-point or internet connectivity. We don't control a lot of that. We are leveraging the same internet you are using right now to do this Zoom call.
For example, we keep a close line of sight with Hulu as an internal partner, but they are constantly calling out, "Hey, can you check this?" Because there are hundreds of feeds, literally hundreds of feeds. Almost every day there are at least a handful of issues that we have to follow up on and track down. The good news is they are not typically Disney failures, but we make sure that everything is working correctly.
This whole project was a really big migration. Is this process 100% completed? Are you planning to move everything to the cloud? Is it profitable for Disney?
Yes, this process is 100% complete and yes, it is profitable. We are leveraging a hybrid model of both on-site resources and public cloud. For example, some of our assets come through Amazon Web Services. We leverage some of their infrastructure to help support us for media, mostly media file movement.

Media Bullpen - Pre-Roll-Out - March 10 2021. Credit Jon Edwards.

Building Facade - Pre-Branding - October 2020. Credit Jon Edwards.
To your question, one of the areas we are still exploring is doing everything that I just described in the cloud. There are some technological hurdles that we have to overcome. We continue to work on issues like lag and latency. We can't get the same speed through the public cloud quite yet, but it is gaining momentum. There is some newer technology on the horizon that seems really strong for us. We are still looking at that and we keep that door open.
Could you explain a bit more about this hybrid model?
I will keep it to the media side. For example, we have to get some content from a vendor. Let's say it is a Freeform show in production with a completed file. They will leverage the public cloud, with Amazon, to send the file up for some processing. Once it is in the system, we have oversight of it and we can step it through different processes.
We will send that file back down from Amazon to us locally, so that we can play the file out for linear broadcast. For example, tonight on any of our networks, a lot of the files that you are seeing leverage that model of pulling from a public cloud source. We call them Amazon S3 buckets.
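That "pull from a public cloud source" step can be pictured with the standard AWS SDK for Python. A minimal sketch, with hypothetical bucket and key names, not Disney's actual tooling:

```python
import boto3

# A minimal sketch of the "send the file back down" step described above,
# using the standard AWS SDK. Bucket, key, and path names are hypothetical.
s3 = boto3.client("s3")

def fetch_for_playout(bucket: str, key: str, local_path: str) -> None:
    """Download a finished media file from an S3 bucket so the local
    playout system can air it."""
    s3.download_file(bucket, key, local_path)

fetch_for_playout("example-media-bucket",
                  "freeform/episode_0123.mxf",
                  "/playout/incoming/episode_0123.mxf")
```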
There is also transcoding. Amazon has put a lot of media into their service with us. For our consumers, we transcode high-resolution files that you can't play on your phone because they consume too much bandwidth. For that, we do a process in the cloud to transform them into a format that serves us better on the broadcast side.
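The general shape of that transcode step can be illustrated with the open-source ffmpeg tool, here driven from Python. This is a generic example of converting a high-bitrate master into a lighter delivery format, not Disney's actual cloud pipeline:

```python
import subprocess

# Illustrative only: a generic transcode of a high-resolution master into
# a lighter H.264/AAC delivery file, the kind of format conversion
# described above. File names are hypothetical.
def transcode_proxy(master: str, proxy: str) -> None:
    subprocess.run(
        ["ffmpeg", "-i", master,
         "-c:v", "libx264", "-b:v", "5M",  # re-encode video at a lower bitrate
         "-c:a", "aac", "-b:a", "192k",    # compress audio
         proxy],
        check=True,
    )

transcode_proxy("master_highres.mxf", "proxy_delivery.mp4")
```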
Another big piece is what we could do remotely. Going back to the beginning of the conversation: throughout the pandemic, my team and the vendors could not get to the site. Everything we did was over the cloud. We leveraged our networks and certain tools and applications to get into the site and do everything remotely without ever being on site.
We've had vendors building massive systems in The Woodlands in Texas without ever setting foot in the facility. We were leveraging all of this over the cloud, the infrastructure, and the public internet, through our Disney global networks and our Disney Media and Entertainment Distribution production network. All of those networks combined allowed us to move forward.
As an example, this weekend we had a person in Idaho that I was working with on Saturday and Sunday because of an issue. He was able to gather logs and video clips remotely so we could work with Evertz, who is a key partner, remotely as well.
What's next? What's coming into the future?
We have eight owned ABC stations that are already starting to migrate over into The Woodlands. By the end of this fiscal year, we will be done with that project. After that, we are looking at internal reviews to add more to this facility. The regional sports networks, which are now owned by Sinclair, are still in The Woodlands. Once they move out of our facility, the floodgates are open. We can start moving other processes within the company in there, and there is a lot we can do in the facility. We are looking to add more and more over time to make this a center of excellence for the company. That's our next step.