FOLLOW US www.tvtech.com twitter.com/tvtechnology
CONTENT
Content Director
Tom Butts, tom.butts@futurenet.com
Content Manager Michael Demenchuk
michael.demenchuk@futurenet.com
Senior Content Producer
George Winslow, george.winslow@futurenet.com
Contributors Gary Arlen, James Careless, Fred Dawson, Kevin Hilton, James O'Neal and Mark R. Smith
Production Manager Heather Tatrow
Art Editor Olivia Thomson
ADVERTISING SALES
Managing VP of Sales, B2B Tech
Adam Goldstein, adam.goldstein@futurenet.com
Publisher, TV Tech/TVBEurope Joe Palombo, joseph.palombo@futurenet.com
MANAGEMENT
SVP, MD, B2B Amanda Darman-Allen
VP, Global Head of Content, B2B Carmel King
MD, Content, Broadcast Tech Paul McLane
VP, Head of US Sales, B2B Tom Sikes
VP, Global Head of Strategy & Ops, B2B Allison Markert
VP, Product & Marketing, B2B Andrew Buchholz
Head of Production US & UK Mark Constance
Head of Design, B2B Nicole Cobban
FUTURE US, INC.
130 West 42nd Street, 7th Floor, New York, NY 10036
All contents © 2024 Future US, Inc. or published under licence. All rights reserved. No part of this magazine may be used, stored, transmitted or reproduced in any way without the prior written permission of the publisher. Future Publishing Limited (company number 2008885) is registered in England and Wales. Registered office: Quay House, The Ambury, Bath BA1 1UA. All information contained in this publication is for information only and is, as far as we are aware, correct at the time of going to press. Future cannot accept any responsibility for errors or inaccuracies in such information. You are advised to contact manufacturers and retailers directly with regard to the price of products/services referred to in this publication. Apps and websites mentioned in this publication are not under our control. We are not responsible for their contents or any other changes or updates to them. This magazine is fully independent and not affiliated in any way with the companies mentioned herein.
If you submit material to us, you warrant that you own the material and/or have the necessary rights/permissions to supply the material and you automatically grant Future and its licensees a licence to publish your submission in whole or in part in any/all issues and/or editions of publications, in any format published worldwide and on associated websites, social media channels and associated products. Any material you submit is sent at your own risk and, although every care is taken, neither Future nor its employees, agents, subcontractors or licensees shall be liable for loss or damage. We assume all unsolicited material is for publication unless otherwise stated, and reserve the right to edit, amend or adapt all submissions. November 2024
Channeling Success

It all started with a simple yet compelling concept: take all the software running on the variety of black boxes needed to manage and distribute a broadcast channel and put it into a software suite that handled the processing and ran on commercial off-the-shelf (COTS) hardware.
Voila: the “channel in a box” (CiAB) concept, conceived more than 20 years ago. Little did we know at the time how much of an impact it would have on our industry. Back then, most broadcasters were focused on optimizing their main channel and cable networks while also slowly rolling out new over-the-air channels via ATSC 1.0: the so-called diginets, which currently number around 50.
Since then, the rise of OTT, and in particular FAST, has reshaped the media landscape in which broadcasters compete, making way for a range of new competitors that have brought more opportunities than challenges. CiAB gave way to virtualized playout, which has become commonplace in our industry. With more streaming services available, faster processing and bandwidth, and lower latency, broadcasters have far more options to spin channels up or down at will, focusing on particular themes, events or niches.
Two decades ago, the cloud and IP were still fairly nascent technologies in the broadcast space. Since then, their adoption has provided far more flexibility, scalability and efficiency in the creation, management and distribution of content. In our latest Guide to Virtualized Playout, we take a look at the evolution of the technology, where it’s going, what broadcasters are looking for and how these two advances have revolutionized the way TV is created and distributed.
Tom Butts Content Director tom.butts@futurenet.com
industry trends
Broadcasters Seek Reliability, Lower Costs as More Channels Go Virtual
Improved connectivity and less-pricey storage spur new methodologies, workflows
By Tom Butts
Channel playout has been an evolving technology for decades, but as broadcasters have moved from analog to digital, the advantages of digital have perhaps been best demonstrated by the flexibility, scalability and cost savings that come with deploying channels virtually—whether on premises, in the cloud or in a hybrid set-up.
Since the concept of the “channel in a box” was launched more than 20 years ago, virtualized playout (or what some prefer to refer to as "cloud playout") has gone from an option to a must-have for broadcasters needing to compete in today’s multichannel environment.
Steve Hassan, senior director of playout for Grass Valley, has been involved in playout since 1991, starting with the BBC. “I originally started in the BBC control rooms before any automation systems, and spent probably just under 20 years at the BBC and in the playout environment and so I’ve seen a lot of things in my time,” he said. “In the ’90s, a
BBC One control room had about seven people to run one channel—the director, vision mixer, engineer, two people in VTR running the programs in, as well as a continuity announcer and everything else.
“But now, some of our customers have single operators looking after 30, 40 if not more channels at any one time,” Hassan added.
“So the paradigm has completely shifted for customers to be able to 'do more with less,' as it were.”
NON-MONOLITHIC
Flexibility has been among the most important benefits of virtualized playout, according to Hassan, who pointed to Grass Valley’s Playout X as more adaptable than traditional virtualized playout systems.
“A true virtualized system is just taking what you’ve got in a monolithic architecture and running it in virtual compute, but actually there’s no real benefits there for our customers today because it’s running the same footprint, it’s still monolithic,” Hassan said. “Playout X, which is part of our AMPP platform ecosystem, is completely cloud-native, so it’s based on a series of microservices, rather than being a monolithic application. It’s based around an edge-compute architecture, so that allows for hybrid deployments both in terms of geography as well as platforms.”
Playout virtualization has also evolved in tandem with the “software as a service” concept, which allows broadcasters to expand services and save money by tapping into massive data banks.
“Business models can change quickly, so broadcasters need flexibility and that can manifest in a bunch of different ways, including things like more flexible ways to pay,” said Andy Warman, chief technology officer, video at Imagine Communications. “People don’t just want to make capex purchases anymore. They want things like term licensing and other options, so they don’t have to take a hit up front on what they’re buying. But it also covers things like redistribution of channels, the ability to be more eco-friendly and the ability to sort of bounce their business across on-prem and cloud.”
For broadcasters who want to move their playout operations to the cloud, there are a number of cost, reliability and security considerations, not to mention whether the cloud should be public or private.
“We’re seeing some customers deploy virtualized playout via private cloud networks, whereas others are preferring public cloud solutions,” James Gilbert, senior vice president, sales and marketing for Pixel Power, said. “Pixel Power has an agnostic position on this—our role is to support our customers regardless of the operational model they prefer.”
VENDOR-CLIENT RELATIONSHIP
Gilbert pointed out another consequence of moving towards a more automated playout scenario that has impacted the customer-vendor relationship.
“Where once a broadcaster maintained a large in-house engineering team with very specialized skills, they now need much broader experience including IT maintenance and network design,” he said. “The solution is a much closer bond between vendor and user, with new contractual agreements focused on service, delivery and performance levels rather than specific hardware.”
The learning curve for new playout customers can also be a hurdle, said Van Duke, director of U.S. operations at PlayBox Neo, a provider of playout systems to mid-tier and Tier 1 enterprises.
“We just did a big move with a client of ours that has 35 channels on servers in a data center, and we had to install those 35 channels on AWS,” Duke said. “And that all went well, but it’s not easy. It has to be set up and the client has to understand how to operate it, and it’s a little tedious. It’s not cheap either.”
Marco Branzanti, chief technology officer for AxelTech, a provider of automation tech for radio and TV, advised broadcasters to shop around when considering a cloud playout provider.
“In the very beginning, AWS was actually very cheap, but then they kind of specialized and introduced a lot more types of services, more types of architectures and differentiated their price list a lot more and the prices went up a little bit,” Branzanti said. “But normally, other providers that are not so big are usually web hosts where you buy a virtual machine someplace, or a physical server. Those prices have been going down all the time, and that’s what I suggest.”
AxelTech’s XTV Suite for TV broadcast automation manages video playout, capture, trimming, scheduling and CG. Providing automated, unattended operation around the clock, the XTV channel in a box offers linear scalability of TV playout facilities, supports all major broadcast production standards and, with the latest release, is ready for the cloud.
THE IMPORTANCE OF ORCHESTRATION
The workflow—which has allowed the industry to take a more holistic approach to channel launches—has also greatly evolved, according to Neil Maycock, chief commercial officer for Pebble.
“Playout is no longer a standalone process at the end of the chain, but completely integrated into every aspect of the broadcaster’s
business,” he said. “To serve multiple versions and outputs, and allow for pop-ups and additional services, users expect the playout platform to be providing intelligent orchestration across the whole media estate.”
Many broadcasters concerned over rising costs, security and flexibility in their playout operations adopt a hybrid approach, using public or private cloud with on-prem storage. This is especially true given today’s more complex channel environment, Maycock said.
“For 24/7 playout, which is a nonstop, mission-critical process that has high utilization of infrastructure, broadcasters are still tending towards keeping the primary technology on-site, using virtualized systems like Pebble Integrated Channel,” he said. “But to address the increasing trend of channels with variable use patterns, the economics of cloud are very attractive and this is leading to many broadcasters adopting true hybrid playout environments.”
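Maycock’s utilization point can be checked with back-of-the-envelope arithmetic. The Python sketch below compares a hypothetical fixed annual on-prem cost against a hypothetical per-hour cloud rate; every figure is an illustrative assumption, not vendor pricing, but it shows why a 24/7 channel amortizes fixed infrastructure well while an occasional pop-up channel favors pay-per-use.

```python
# Illustrative only: hypothetical per-channel figures, not vendor pricing.
HOURS_PER_YEAR = 365 * 24

ONPREM_ANNUAL_FIXED = 20_000.0  # assumed amortized capex + support, per channel
CLOUD_RATE_PER_HOUR = 4.0       # assumed compute/storage/egress per channel-hour

def annual_cost_onprem(hours_on_air: float) -> float:
    # On-prem cost is essentially fixed: the hardware is paid for
    # whether the channel runs 24/7 or sits idle.
    return ONPREM_ANNUAL_FIXED

def annual_cost_cloud(hours_on_air: float) -> float:
    # Cloud cost scales with actual use: you pay per channel-hour.
    return CLOUD_RATE_PER_HOUR * hours_on_air

for label, hours in [("24/7 channel", HOURS_PER_YEAR), ("pop-up channel", 300)]:
    onprem, cloud = annual_cost_onprem(hours), annual_cost_cloud(hours)
    cheaper = "on-prem" if onprem < cloud else "cloud"
    print(f"{label}: on-prem ${onprem:,.0f} vs. cloud ${cloud:,.0f} -> {cheaper}")
```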
AxelTech’s Branzanti noted that playout must work well whether storage is on-prem, in the cloud or hybrid. He described some of the challenges involved in on-prem, for example.
“In video, we really need a lot of GPU to do H.264 or H.265 encoding, and usually when you rent cloud servers the GPU is one of the expensive options, which really can go for hundreds and hundreds of dollars per month,” he said. “So we try to keep everything in CPU, rent more CPU and use the CPU to do those kinds of processes which are usually on-premise [and] would be GPU. We also have dedicated resources for development and optimization of the solutions which can be run in virtualized environments, in the cloud or on rented environments.”
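Branzanti’s CPU-over-GPU tactic is easy to illustrate at the command line. The following is a minimal sketch, assuming ffmpeg is installed and using placeholder file names: libx264 keeps an H.264 encode on ordinary CPU cores, which are cheap to rent, while h264_nvenc offloads the same job to an NVIDIA GPU, which typically requires a far pricier cloud instance.

```python
# Minimal sketch: the same H.264 encode on CPU vs. GPU with ffmpeg.
# Assumes ffmpeg is installed; file names here are placeholders.
import subprocess

def encode_cpu(src: str, dst: str) -> None:
    # libx264 runs entirely on CPU cores, avoiding GPU-instance pricing.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "libx264", "-preset", "veryfast", "-b:v", "5M",
         "-c:a", "aac", dst],
        check=True)

def encode_gpu(src: str, dst: str) -> None:
    # h264_nvenc offloads encoding to an NVIDIA GPU: faster per stream,
    # but the GPU-equipped instance usually costs far more per month.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "h264_nvenc", "-preset", "p4", "-b:v", "5M",
         "-c:a", "aac", dst],
        check=True)

encode_cpu("input.ts", "out_cpu.mp4")
```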
Despite the processing costs, storage prices continue to decline, offering a more compelling case for some on-prem setups for broadcasters just starting out, Branzanti added.
“I usually propose a simple physical machine which is in the cloud and doesn’t cost so much, and those costs are going down,” he said. “That is the trend I see.”
Imagine’s Warman offered his assessment of the costs: “I would have thought they’d have declined more, but, you know, we actually map this stuff out periodically and it’s really not declining much at all these days," Warman said. "I’m not going to call it ‘plateaued,’ exactly, but it has stabilized significantly. I would have thought it would have tumbled significantly over the last decade, but really not that much change. I think a lot of that is driven by particularly integrated channel-type solutions that are able to pack more functionality into them.”
DISASTER RECOVERY
For those broadcasters who want to emphasize security and reliability, disaster recovery is top of mind.
“That’s one of the major pros of the cloud,” Branzanti said. “Your TV station can go down at any moment and you need backup technology. We talk about the 3-2-1 method, which means you need three copies with two different technologies. One has to be delocalized in another place, or else you’re going to lose your data.”
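A minimal sketch of that 3-2-1 idea: three copies of the data, on two different technologies, with one copy delocalized off-site. The NAS path, bucket name and media file below are hypothetical, and boto3/Amazon S3 is just one way to hold the off-site copy.

```python
# Sketch of 3-2-1: three copies, two technologies, one off-site.
# Paths, bucket and file names are hypothetical.
import shutil
import boto3  # assumes AWS S3 for the off-site copy; any remote store works

def backup_321(source: str) -> None:
    name = source.rsplit("/", 1)[-1]

    # Copy 1 is the live file itself on primary playout storage.
    # Copy 2: a second technology on-site (e.g., NAS instead of local SSD).
    shutil.copy2(source, f"/mnt/nas-backup/{name}")

    # Copy 3: delocalized in another place, so a site outage
    # cannot take out every copy at once.
    boto3.client("s3").upload_file(source, "example-dr-bucket", name)

backup_321("/media/playout/evening-news.mxf")
```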
Grass Valley’s Hassan outlined the variety of security implementations within its Playout X platform in AMPP.
“Our orchestration or control platform is completely redundant in that we have
multiple platforms around the globe and each platform is deployed across three physical AZs [availability zones] with multiple Kubernetes clusters, so the platform itself is resilient,” he said. “The customer gets to choose where they deploy what we call the ‘data plane’—this is where the video processing is actually carried out—and everything else. And they can choose to deploy that in multiple regions or multiple ways.
“We also have the ability to deploy an on-premise orchestration, which replicates our cloud platform to provide an almost standalone system for regions of the world where maybe you can’t rely on the internet,” Hassan added. “So in places like India where they often suffer internet outages, they can utilize what we call ‘AMPP Local’ so you deploy the orchestration layer on-premise.”
ELEMENTS IN THE PLAYOUT CHAIN
As channel playout has become more virtualized over the past several decades, the move to all-software, cloud or on-prem environments has been propelled by lower storage and processing costs and faster connectivity, as well as the ability to manipulate, insert and integrate multiple
elements of the video, whether it’s inserting ads, generating lower thirds or processing audio.
This is where companies like Telos Alliance come in. While Telos is not traditionally a channel playout provider, Costa Nikols, the company’s strategy adviser for M&E, has started to think about its role in the technology’s future.
The trend toward remote production was accelerated by COVID-19, Nikols noted, and the demand for improved connectivity during the pandemic has impacted the various elements of the playout chain.
“When COVID hit, things got really interesting, because the need for remote production really kicked off and what really was the big enabler of all of this was that the internet got fast enough and reliable enough, and with the least amount of latency,” Nikols said. “And then where it’s becoming really interesting is that we’re enabling the audio to be manipulated and corrected within the playout chain. But now, as we move forward, we’re actually enabling the user, the viewer, to select what parts of the audio spectrum they actually are interested in, such as next-generation audio, which offers dialogue intelligibility [and] immersive audio as well as personalization.”
Enterprises entering and expanding their playout operations should ask themselves some crucial questions, Nikols suggested.
“For the next 15 to 20 years, you’re going to see more and more cloud-enabled production and definitely cloud-enabled playout,” he said. “And of course, with that, you still have certain needs: Where’s this content going? What platforms is it designed to be consumed on? Is it going to be live-streamed or is it going to be file-based VOD? And what do you need to do in order to create the best user experience for those platforms you’re delivering?”
In the end, the requirements for deploying a virtualized playout setup haven’t changed since the early days of the technology, but as complexities grow, those demands become ever more important.
“The primary requirement has always been seamless and accurate delivery of programs, commercials, promos and graphics,” Pixel Power’s Gilbert said. “In today’s environment, this also means serving multiple platforms with complex quality and regulatory requirements. Customers want the security of knowing quality control standards will be met and, given that today’s broadcast delivery is a part of the wider connected world, reassurance that the playout platform has been hardened against all forms of cyber-threats.” ●
Powering Digital Linear Channel Creation in the FAST Era
Today’s audiences are looking for fresh, live and relevant content experiences on new platforms and devices
By Rick Young
A major challenge for media companies is discovering ways to efficiently and affordably deliver new channels or versions of existing channels without significantly increasing head count or investing in new technology.
Major broadcasters, sports networks and streaming platforms understand the need to scale their live channel offerings to engage new audiences amid intense competition and growing consumer choice. The difficulty is trying to rightsize the cost of creating a new digital linear channel against the expected revenue returns from highly segmented audiences.
Simply experimenting in the digital arena doesn’t cut it anymore. Ambitious players need to see genuine revenue generation across all traditional linear and digital products to meet profitability targets and achieve sustainable business growth.
Media companies know that live is the most compelling and valuable flavor of content. Live content draws viewers onto new platforms—recent Kantar research showed that soccer drove 51% of subscription video-on-demand (SVOD) sign-ups in Europe in third-quarter 2023. Traditional media organizations and digital-native players seek ways to make digital and FAST platforms the home for their live events and linear channel experiences.
One reality remains clear: Media organizations need their new digital-first channels to engage the widest audience possible with fresh, live content. Robust IP-based technology, global reach and the power of automation enable them to do that cost-efficiently and at scale.
STAYING AHEAD TO ENSURE SUCCESS ON FAST PLATFORMS
Viewer expectations are constantly shifting. Today’s audiences, particularly
digital-native viewers, are looking for fresh, live and relevant content experiences on new platforms and devices, including free, ad-supported streaming television (FAST) platforms.
We need to move past the long-held perception of FAST as a secondary viewing option best suited to archived content and niche programming—times have changed. Media companies need to create more compelling content propositions to attract the scale of audiences required on FAST platforms to increase revenue.
Beyond integrating more live and high-value programming to increase viewer engagement, content owners also need more cost-effective ways of iterating, experimenting and doubling down on winning formulas. That could mean launching a new digital linear channel tomorrow and replacing it with several different iterations in a matter of weeks.
Meanwhile, keeping up with a complex and rapidly expanding digital distribution landscape takes a lot of work. While facilitating seamless distribution to all of the most well-established FAST platforms, such as Samsung TV+, Pluto TV and Tubi, is vital, content providers also need the toolkit to deliver their digital linear channels to various downstream traditional platforms, including multichannel video programming distributors (MVPDs), vMVPDs and broadcast stations, where significant viewership remains.
Media businesses and platform operators are looking for connective tissue to help bring channels originally destined for FAST to broader horizons and more traditional end points. We’re also seeing channels initially designed for more traditional platforms evolving their business model toward a targeted FAST approach. Using automated playout and IP-based distribution technologies simplifies these processes and makes them more cost-efficient, helping media companies manage ecosystem complexity and scale without high costs.
TAKING YOUR LIVE LINEAR CHANNEL TO NEW LEVELS
Broadcasters can champion intelligent, reliable, digital-first playout systems with built-in automation to seamlessly create new
digital linear channels, increasing speed to market, integrating high-value live sports and news programming and supporting custom ad triggering for multiplatform monetization.
We all know that live sports, news, and events draw in viewers at scale and attract advertisers. Localized content experiences are also fundamental ingredients for building a compelling FAST offering. Regionally tailored news coverage, local sports, weather and entertainment programming deepen audience engagement and reduce platform churn.
The good news is that cost efficiency in bringing high-value live and local content to FAST platforms isn’t as complicated as you may think. Powered by advances in automation and intelligent IP distribution, broadcasters can take a primary linear channel and seamlessly spin up secondary and tertiary tracks for FAST distribution, engaging audiences with live and local news, sports or other high-value live content while sidestepping the typical costs and headcount increases associated with scaling your channel offering.
AUTOMATION DRIVES DIGITAL REVENUE GROWTH
Automation is essential to driving operational efficiencies and achieving cost optimization for media businesses. Reducing the time and human effort spent on non-value-added tasks opens up resources to invest in creativity and audience experiences. Incorporating an integrated IP ecosystem approach that harnesses automation to fuel powerful channel creation, customization and monetization capabilities gives broadcasters a robust foundation to deliver unique and targeted content that delivers for consumers and ad buyers alike.
Media companies looking for more intelligent, effective ways of creating and managing a new digital linear channel are on to a winning strategy. The next step is to find the right technology partner with IP-native foundations and proven playout expertise. ●
Rick Young is senior vice president, head of global products for LTN.
Study: Broadcasters Continue to Expand Presence on FAST
A recent report from Amagi indicates that the FAST channel business continues to grow and play an increasingly important role in the streaming sector, particularly among broadcasters who now account for 30% of the top 100 channels.
The “12th Amagi Global FAST Report” found that TV networks now account for 30% of the top 100 FAST channels, driving a 40% share in total hours of viewing (HOV) across these channels. The research also indicated that among Amagi-delivered FAST channels, global HOV was up 31% and ad impressions jumped by 26% during second-quarter 2024 compared to Q2 2023.
These findings underscore the significant role that FAST channels play in the rapidly evolving streaming landscape, demonstrating FAST's ability to adapt and thrive in the face of new technologies and viewing habits, Amagi said. Key takeaways from the report include:
• Consumers Increasingly Comfortable With Exploring FAST Offerings: 75% of Amagi's Consumer Survey respondents indicated they would create a free profile on a streaming service to sample FAST channels, and more than half would enter their credit card information.
• Growth of Broadcaster-Owned Channels: The total number of broadcaster-owned channels within FAST increased by about 2.5x.
• Increase of FAST Channels Within O&O Apps: The total number of FAST channels within O&O apps increased by almost 50%.
• Significance of Single IP Channels in FAST: More than 25% of entertainment channels are single IP channels, driving more than 33% of HOV within the genre.
Data in the report was based on Amagi Analytics and FAST channels that run on Amagi's platform.
❚ George Winslow
Mastering the Live Sports Video-Delivery Game
There is no one-size-fits-all approach
By Jonathan Smith
Live sports presents huge business opportunities for linear TV, as demonstrated by the numbers. The English Premier League is the most-watched league in the world, and NFL games accounted for 96% of the most-watched TV broadcasts in the U.S. in 2023. What’s more, this past February’s Super Bowl alone brought in a record 123.7 million average viewers. This has also bled into streaming, with the NFL’s first streaming-exclusive playoff game reaching a record-breaking 23 million viewers.
While the media landscape continues to develop quickly, live sports keep linear viewership rates high. Sports fans worldwide consume live Tier-1 sports events alongside an increasing volume of lower-tier niche sports, bringing media companies valuable, loyal viewership and monetization opportunities.
However, there is no one-size-fits-all approach to live-sports video delivery. With hardware, cloud-based and hybrid distribution models available, media companies must make sure the strategy they opt for aligns with the value of their content and the expected ROI for each destination. The model needs to fit the content and overall company business needs to fully take advantage of lucrative live-sports opportunities.
ENSURING HIGHLY RELIABLE DELIVERY
The dramatic rise in global demand for live sports consumption is significantly driving up the value of sports rights. The influx of tech and streaming players into the market has turned the sports-rights arena into a competitive battlefield, continually pushing the value of these rights higher. According to Rethink Research, the global media rights for the top 16 sports leagues worldwide are projected to reach $68.8 billion by 2028 and an astounding $90.6 billion by 2033.
As media companies vie for a share of the fast-growing live sports market, the stakes are getting higher with the content becoming increasingly premium. However, the challenge extends beyond acquiring these rights. Media
companies must also ensure reliable and high-quality delivery of live feeds to provide the best viewing experiences and maximize monetization. Premium content demands robust delivery; fans will not tolerate missing crucial moments in sports history due to video-quality issues, delays or buffering.
This is where hardware-based video contribution and primary distribution workflows excel. Innovative hardware solutions offer ultra-reliable, broadcast-grade long-distance media networking across any infrastructure. They support high-density video compression formats, from lossless JPEG XS to H.264, enabling media companies to manage various events and monetize them using a single reconfigurable platform.
The key to meeting the business needs of media companies lies in flexibility and scalability. Modern hardware solutions are designed to adapt to multiple scenarios, accommodating the fast-paced changes and variable returns characteristic of the media landscape. These solutions allow companies to set up new services, adjust bandwidth on demand and expand their network with new locations and services as needed. Investing in a versatile hardware-distribution model ensures media companies can reliably deliver high-value
content, maintaining viewer satisfaction and capitalizing on monetization opportunities.
Overall, we can see how weighing business needs, in this case the reliable delivery of high-value live sports content, can show one distribution model to be more advantageous than the others.
CLOUD-BASED MODELS: UNBEATABLE SCALABILITY
Cloud and IP workflows have transformed consumer video delivery, allowing media organizations to reach vast audiences through existing connectivity. By leveraging the cloud, media companies can scale up or down per live event swiftly and seamlessly, consuming necessary services through a flexible commercial model. This approach eliminates the need for upfront tech infrastructure investment, thereby reducing risk and enabling experimentation with new markets, content, and destinations at lower initial costs.
While cloud and internet primary delivery models are gaining traction, they are primarily used as backup links or for second-screen content in Tier-1 live sports events.
However, for lower-tier events, these workflows are becoming the preferred choice. A cloud-based distribution model aligns
perfectly with the business needs of media companies distributing lower-tier live sports video content, providing the flexibility, scalability, and cost-efficiency required by those delivery circumstances.
HYBRID: ‘BEST OF BOTH WORLDS’?
Despite the immense popularity of premium live sports, not all global markets guarantee revenue growth, making new ventures risky. Entering unfamiliar markets requires a cautious approach regarding the investment needed to distribute video content to new audiences.
Media companies looking to explore new territories while maintaining cost-effective video distribution benefit from a hybrid approach. This method combines existing infrastructure investments with the flexibility, scalability, and cost-efficiency of internet primary distribution, enabling them to expand their live content and maximize their investment value.
For media companies acquiring rights to Tier 2 and 3 sports, a hybrid model can enhance the primary feed with additional pregame and postgame content and highlights for social media and other platforms. This
strategy allows them to engage with dedicated fan bases, drive monetization, and boost cross-platform engagement without significant infrastructure investments.
The dynamic nature of live sports demands that media organizations act swiftly and intelligently. Securing valuable sports rights is only the beginning. The real challenge lies in maximizing ROI while minimizing risk. Media companies must have a strategic live sports
distribution plan tailored to the unique requirements and potential risks of each market. By choosing the most appropriate approach—whether hardware, cloud or hybrid— on a case-by-case basis, media businesses can protect their content investments and future-proof their operations. In an everevolving media landscape, adaptability and strategic planning are essential for staying ahead of the competition. ●
Jonathan Smith is solution area expert at Net Insight.
Is a Hybrid Cloud a Better Choice?
Many organizations adopt a hybrid cloud platform to reduce costs, minimize risk and extend existing capabilities
Cloud and IT go hand in hand, so as IT leaders, you likely need comprehensive, clear insight into the technologies that feed both the enterprise and the cloud. Of paramount importance to IT leaders and cloud architects is keeping your technology assets secure, well-governed and cost-effective. This is a far cry from where IT was a decade or more ago, when IT was pictured more as back-office support: keeping “the network” functional and supporting the users and workplace.
Karl Paulsen
IT leaders need comprehensive, clear insights into their technology to fuel these ever-evolving, data-driven decision-making processes, which lead to the best results. The cloud, while convenient and less involved (compared to an enterprise-size data center), is not without its concerns. This is not to be “negative” about the cloud; quite the contrary, knowing the pitfalls of the cloud can only make your implementation(s) better and less risky.

By looking at known issues, we hope to broaden perspective and set the tone for understanding how, why or why not to be “in the cloud.”
LIFECYCLE
Most products sold to users come with some concept of how long they will last. This “lifecycle” is no different for automobiles, appliances and certainly electronics. Like hardware, software has its own version of how long it will last, sometimes based on the hardware it lives on and sometimes just on how long the original equipment manufacturer (OEM) wishes to support it or feels it is worth maintaining, given technology changes (usually advancements) or the cost of keeping it alive.
In IT (and cloud), there is also a software lifecycle. In this case, the software asset lifecycle is about having your IT resources accounted for, cost-effective and properly employed. Vulnerabilities of the products lead the list of concerns about IT assets, according to Flexera’s 2021 “State of IT Visibility Report.” This issue is a chief concern of enterprise information management and staff. The same issues extend into cloud practices and, like the “ground-based” enterprise, must be carefully understood, watched and protected. The sidebar below provides simple definitions as they apply to software systems and hardware components.

DEFINITIONS
End of life (EOL) indicates the last date on which “full support” for a product is provided by a supplier, vendor or manufacturer.
End of support (EOS) is the last date on which any support is provided by a supplier, vendor or manufacturer. From this point on, the user is at the mercy of “self-help support.”
Vulnerability refers to an error in software that can be exploited for a security impact and gain. All systems have some level of vulnerability, so it is imperative that the system’s administrator stays up to date, is thorough in support and maintenance practices and keeps current with technology as it applies to their systems and operations.
Exploitation is most evident when malicious code leaks into a system and takes advantage of any vulnerabilities to infect a computer or perform other harmful actions.
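Tracking those lifecycle dates across an estate can be automated. As a small sketch, assuming the public endoflife.date JSON API and the requests library, the function below reports whether a given product release is past its published EOL date; the product and cycle checked at the end are examples only.

```python
# Sketch: flag software past end-of-life via the public endoflife.date API
# (GET https://endoflife.date/api/<product>.json). Assumes the requests
# library is installed.
from datetime import date
import requests

def is_past_eol(product: str, cycle: str) -> bool:
    releases = requests.get(
        f"https://endoflife.date/api/{product}.json", timeout=10).json()
    for entry in releases:
        if entry["cycle"] == cycle:
            eol = entry.get("eol")
            if isinstance(eol, bool):   # some products report true/false
                return eol
            return date.fromisoformat(eol) < date.today()
    raise ValueError(f"unknown cycle {cycle!r} for {product!r}")

print(is_past_eol("python", "3.8"))  # Python 3.8 reached EOL in October 2024
```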
In considering cloud, many look at using a combination of ground-based (on-prem) services and cloud services. This is generally
referred to as a “hybrid” cloud service, and some feel this is not only important but may be essential to the operation. In this model there is a dual role: the enterprise must manage its own services (on-prem) and, in turn, manage its cloud services as well.
IS A HYBRID CLOUD THE RIGHT ANSWER?
According to Morpheus’ “Gartner Market Guide for Cloud Management Tooling,” “The requirement to support hybrid and/or multicloud deployments is stressing current enterprise operational processes and tooling
that had been designed for their on-premises environment. The main use cases continue to be around cloud governance and resource management as enterprises try to avoid overspending or falling prey to security breaches.”
By definition, a hybrid cloud is a mixed computing environment where applications run using a combination of computing, storage and services across different environments: public clouds and private clouds, including on-premises data centers and “edge” locations.

From a broader perspective, hybrid cloud architectures are widespread primarily because almost no one today relies entirely on a single public cloud. Figs. 1 and 2 show the values and benefits of employing a hybrid cloud in your solutions.

The value is that hybrid cloud solutions let you migrate and manage workloads between various cloud (and ground) environments. In turn, this allows users to create more versatile setups based on specific business needs. Many organizations choose to adopt hybrid cloud platforms to reduce costs, minimize risk and extend their existing capabilities to support digital transformation efforts.
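What that looks like in practice can be sketched as a simple placement policy. The workload attributes and rules below are hypothetical, but they capture the usual logic: regulated or latency-critical workloads stay on-prem or in a private cloud, bursty workloads go to the public cloud, and the rest come down to cost.

```python
# Hypothetical hybrid-cloud placement policy; real policies weigh
# regulation, cost, risk and existing capabilities in far more detail.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    regulated_data: bool     # e.g., subject to data-residency rules
    bursty: bool             # highly variable demand
    latency_critical: bool

def place(w: Workload) -> str:
    if w.regulated_data or w.latency_critical:
        return "on-prem / private cloud"  # keep control and locality
    if w.bursty:
        return "public cloud"             # pay only for the peaks
    return "either (decide on cost)"      # stable, unregulated workloads

for w in (Workload("playout-24x7", False, False, True),
          Workload("pop-up-channel", False, True, False),
          Workload("hr-records", True, False, False)):
    print(f"{w.name} -> {place(w)}")
```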
A hybrid cloud approach is one of the most common infrastructure architectures for modern computing applications in IT, media, health care and beyond. Today, most cloud migrations lead to hybrid cloud implementations, as organizations often have to transition applications and data slowly and systematically. Hybrid cloud environments allow you to continue using on-premises services while taking advantage of the flexible options for storing and accessing data and applications offered by public cloud providers, such as Google Cloud.
THE GOOD AND THE BAD
On the other side of the coin, there are plenty of workloads that will never go to the public cloud, for reasons that include regulatory requirements, life/safety concerns, subscription vs. permanent cost models, less control over your data security and, of course, the need for good internet connectivity.

The “if it’s not broke, don’t fix it” mindset and “one throat to choke” (your own), along with higher long-term costs for cloud computing, all stack up against the all-in-the-cloud model. Users also say some applications actually run better on a local server, and they point to not knowing “where” your data really is (with the risk that regulations could impact data retention or recovery), the demands on your time and the inability to control reliability.
Another not-so-pleasant concern is that every action leaves a trail (i.e., where’s your privacy?). Many of us grew up thinking that almost everything online was anonymous. Wrong. Connecting to the web generates an IP address. Every website we visit can see that IP address, and others can “see” that information too, everything from which operating system we use to our screen resolution.
Nothing you do is private any longer, especially when an interaction requires or expects you to “create an account.” The main point of creating an account is to retain certain data in order to display it again later. (Privacy is lost, even if the website “says” differently. If it weren’t important, why would the site need it?)
EVERYONE HAS A DATA PROFILE
Every detail about us can be, and often is, regularly bought and sold; cookies track us routinely. Even if you don’t accept the cookies, there are means to track and trace you. Artificial intelligence now “fills in the gaps” using other resources, such as Facebook, one of the largest repositories of personal information on the planet.
This issue isn’t limited to services with public data (as in Facebook and Twitter/X).
Amazon, Google and Dropbox each store different but very intimate details about each of us. Personally Identifiable Information (PII) is valuable to you, to businesses and to hackers. Personal data generates a “profile” that positions you into “classes” or groupings, which link to other connections and end up being “mined” by organizations that profit from knowing what they know and, at times, exploit that information for less-than-personal reasons.
And it’s not just cybercriminals who want your data; countless services and governmental agencies want and collect your personal details, too.
CONCERNS BY HR
So, think further about your own organization or enterprise data, much of which may indeed include “your” personal data. For good reason, HR departments take their employees’ personal information seriously, and you, the employee, entrust that data to your employer.
The same might go for corporate records, contracts and such. Hence the interest in and concern over security, whereby organizations may employ technologies like blockchain to protect transactions, contracts and other confidential information.
TYPES OF PII
Direct identifiers, also called “sensitive PII,” are data sets that can be used to pinpoint you and only you. Quasi-identifiers, also called “nonsensitive PII,” are details that can be combined with other quasi-identifiers to “label” you, classifying or grouping you for geographic, demographic or other reasons. Quasi-identifier designations are often used in statistical analysis to place you into the group(s) you “belong to.”

Direct identifiers, as mentioned, are about you and only you: data such as your full name, medical history, credit card details, personal identifiers such as Social Security or insurance numbers, or your passport number.

Europe took a hard stand less than a decade ago with its General Data Protection Regulation (GDPR), a European Union law that took effect in mid-2018. The GDPR aimed to protect you (the user) and provided a legal means to require any collector of your data to prove it had actually scrubbed and/or expunged that data from its files, should you request such action. Its predecessor, the 1995 EU Data Protection Directive, set general goals and requirements that EU member states were free to interpret as they saw fit when complying through their national laws. Essentially, this means that organizations must have a lawful reason for collecting personal data; the amount of personal data collected must be limited to the minimum necessary to fulfill the lawful purpose; and the data must be deleted once the lawful purpose has been completed.

The U.S. has been struggling to develop a similar countrywide protection but has essentially gotten nowhere; California, however, has implemented similar policies.
WHERE IS MY DATA, REALLY?
The depth of these kinds of actions applies not only to local servers and services but extends into the cloud, which leads to an interesting caveat: What happens when the data actually resides in a cloud that is in a non-EU country?

Hence, you can see the concerns with using “the cloud” when you don’t know where the data is or under which jurisdiction it might be controlled at any given moment. Furthermore, these concerns become more complicated as cloud providers expand and their resiliency models (data duplication and distribution) grow to deliver faster, improved services.
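One practical mitigation is to pin data to a known jurisdiction at creation time. As a sketch, assuming AWS S3 via boto3 and a hypothetical bucket name, creating a bucket with an explicit EU region constraint keeps its objects in that region unless you deliberately replicate them elsewhere, and the location can be verified afterward.

```python
# Sketch: pin cloud data to a known jurisdiction (bucket name hypothetical).
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# An explicit location constraint keeps the bucket's objects in the EU
# region unless you deliberately replicate them elsewhere.
s3.create_bucket(
    Bucket="example-eu-records",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# Verify where the bucket (and thus its data) actually lives.
loc = s3.get_bucket_location(Bucket="example-eu-records")
print(loc["LocationConstraint"])  # "eu-central-1"
```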
So, know your solution set thoroughly before jumping on the “all in the cloud” bandwagon. Keep informed or use a knowledgeable entity to help support and maintain your investments. ●
Karl Paulsen is a frequent TV Tech contributor who has been writing about storage and media solution technologies for the past three decades. He can be reached at karl@ivideoserver.tv