Intelligence for the media & entertainment industry
MAY 2018
WATCHING THE SECOND SCREEN
www.tvbeurope.com
FOLLOW US
Twitter.com/TVBEUROPE Facebook/TVBEUROPE
TWICE AS NICE
CONTENT Editor: Jenny Priestley jpriestley@nbmedia.com Senior Staff Writer: Colby Ramsey cramsey@nbmedia.com Designer: Sam Richwood srichwood@nbmedia.com Content Director: James McKeown jmckeown@nbmedia.com Contributor: George Jarrett
A couple of weeks ago, Facebook's head of content strategy and planning, Matthew Henick, said content creators need to stop viewing "smartphones as little televisions." He said we are entering a new era of audience-responsive "social entertainment" content, which will take traditional storytelling and gel it with engagement, conceived to be viewed mainly on smart devices. Henick suggested content creators could begin using new production processes, such as two scripts: one for shooting a traditional TV version of the content, the other covering how the audience could engage with it.

I think his point about social entertainment is really interesting. How many of us watch TV with one eye on our social media accounts? I find watching an episode of The Great British Bake Off, or even a football match, is enhanced greatly by being able to comment on Twitter about what I'm watching (mostly this season it's about what a great player Mohamed Salah is – but that's beside the point).

Why don't content creators, and the companies that deliver that content to viewers, think more about how the audience engages with their product? It's all very well promoting a Twitter hashtag, but isn't that taking viewers' eyes away from your product? This is even more pertinent when viewers are watching content via a smartphone. Wouldn't the clever idea be to create an app where viewers can watch content but also comment on it within the same app?
Digital Director: Diane Oliver doliver@nbmedia.com
Facebook is obviously looking at that idea with the launch of Facebook Watch, but that service is still not available to most of the global audience. One company that is embracing this idea is Second Screen. It has created an app that streams bite-size stories for the mobile culture, but it has also cleverly incorporated social media within the app, allowing users to follow friends and comment on what they are watching. It even allows users to follow "influencers" such as the actors or directors of the content they're watching. We speak to the company's founder Estella Gabriel this month to find out more about the concept behind Second Screen.

Also this month, we hear from Finnish company Camment, which is working with Spanish broadcaster RTVE to enhance coverage of the Eurovision Song Contest. Camment has developed in-screen technology that broadcasters can incorporate in their own platforms, essentially bringing second-screen technology to the TV screen.

Another growing area for content on the mobile screen is vertical video, so we've spoken to both the BBC, which uses vertical video in its News app, and Grabyo, which creates and shares vertical video across multiple social and digital platforms.

Finally, it's just a matter of weeks until MediaTech 360 returns for 2018. This year's chair, Christy King, gives us an idea of what attendees can expect at next month's event. n
ADVERTISING SALES Sales Manager: Peter McCarthy pmccarthy@nbmedia.com +44 207 354 6025 Japan and Korea Sales: Sho Harihara sho@yukarimedia.com +81 6 4790 2222 Production Executive: Warren Kelly wkelly@nbmedia.com
SUBSCRIBER CUSTOMER SERVICE
To subscribe, change your address, or check on your current account status, go to www.tvbeurope.com/page/faqs or email subs@tvbeurope.com
ARCHIVES
Digital editions of the magazine are available to view on ISSUU.com. Recent back issues of the printed edition may be available; please contact lwilkie@nbmedia.com for more information.
REPRINTS / PERMISSIONS
All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means electronic or mechanical, including photocopying, recording or any information storage or retrieval system without the express prior written consent of the publisher.
Managing Director Mark Burton Financial Controller Ranjit Dhadwal Events and Marketing Director Caroline Hicks Head of Operations Stuart Moody HR Director Lianne Davey Audience Development Lucy Wilkie Printed by Pensord Press Ltd, NP12 2YA © COPYRIGHT 2018, NewBay Media LLC. All Rights Reserved. NewBay is a member of the Periodical Publishers Association. NewBay Media, The Emerson Building, 4th Floor, 4-8 Emerson Street, London SE1 9DU
JENNY PRIESTLEY, EDITOR
TVBE MAY 2018 | 3
IN THIS ISSUE
08 The hybrid transition to IP production
NewTek’s Will Waters explains why it’s such an exciting time for broadcasters
10 A post-Brexit vision
Indrek Saar, Estonia's minister for culture, on why the country is a viable alternative to London for broadcasters
13 MediaTech 360
Christy King, chair of this year's event, tells us what to expect from the second annual conference
18 A song for Europe
CEO and founder of Finnish AI-powered engagement platform Camment, Tomi Kaukinen, explains the technology set to be deployed during this month’s Eurovision Song Contest
20 OTT in the Formula 1 driving seat
Jenny Priestley speaks to Tata Communications' Mehul Kapadia about the new live Grand Prix OTT service
23 Blackbird at NAB
George Jarrett meets the company’s CEO Ian McDonough
32 ‘Netflixing’ movies for mobile
Jenny Priestley talks to Second Screen founder Estella Gabriel to find out more about the mobile and social platform that features episodic content
40 Living in a vertical world
The BBC's Chris Lunn explains the broadcaster's use of vertical video in its News app

44 Sounds of the future
Raja Sehgal, director of sound engineering and Dolby Atmos specialist at London's Grand Central Recording Studios, talks to Colby Ramsey about their new immersive mixing capabilities
[Advertisement: Riedel MediorNet – real-time networks for video, audio, data and communication. www.riedel.net]
OPINION AND ANALYSIS
Rewarding security best practice By Mark Harrison, managing director, Digital Production Partnership
Since October 2017 more than 24 companies have been awarded the Committed To Security mark – the initiative from the Digital Production Partnership (DPP) designed to help suppliers demonstrate their commitment to security best practice. The rapid uptake of companies applying for the Committed To Security mark confirms the need for a security accreditation that works across the whole supply chain.

Back in 2015, we began looking at the fundamental question of how to build trust in connected services: with varying standards of security and no way of knowing who is doing what, the good security practices of one company can be undone by the poor practices of others. In early 2017 we assembled a group of security experts from the broadcast and supplier community with the explicit purpose of finding a way to build security – and, in the process, build trust.

We found that the industry needed a common, shared basic language to ensure that everyone was addressing the same fundamental requirements. If everyone could see that everyone else was doing the basics well, it would be much easier to then build greater, and wider, security sophistication. This was the starting point for the DPP to create standardised best practice for industry security, and a means of assessment that would be low cost and simple for the companies involved.

So, working with a range of security specialists, we devised the Committed To Security Programme, comprising self-assessment checklists for broadcast and production environments that lead companies through a common set of basic security controls. The checklists were developed with an awareness of existing security standards. There is plenty of overlap, which is a good thing, but there were also some gaps, particularly for the production checklist. For example, existing standards don't consider long-term storage
concerns or the risk of losing data through format obsolescence, data corruption, human error or business failure of a supplier.

In mid-October 2017 we announced the programme and invited all suppliers in the media sector to apply for either or both of the marks. The Committed To Security mark is awarded free of charge to qualifying DPP members, while non-members can apply at a license fee of £1,500 per checklist, payable annually on renewal. The development of the governance processes was supported by DPP member company Eurofins Digital Testing, which now provides audit and administrative support for the programme.

Telestream was one of the first companies to achieve the Committed To Security mark. Dominic Jackson, product manager, Enterprise Products at Telestream, says: "There is no doubt that security is one of the most critically important considerations for any organisation handling digital media. When Telestream customers see a DPP Committed To Security designation, they know they are adding technology to their infrastructure which will maintain security protocols as integrated with other program-compliant vendors."

Jaspal Jandu, head of cyber security at ITV, adds: "For most major broadcasters there is a heavy reliance on a handful of key suppliers around the delivery and transmission of our content. If these key suppliers are compromised in any way, it's our brand and viewers that are impacted. Having a DPP mark for our suppliers allows us to gain a level of assurance around the resilience of our supply chain so that in turn our viewers can continue to enjoy our content uninterrupted."

For our part, the DPP is delighted at the speed and enthusiasm with which the programme is being taken up, and it has certainly fulfilled our ambition of creating a security programme that works across the whole supply chain. n
[Advertisement: Rohde & Schwarz VENICE – a virtualised media platform for live, studio and channel playout applications. www.rohde-schwarz.com/ad/venice]
OPINION AND ANALYSIS
The Hybrid Transition to IP Production By Will Waters, Senior Director IP Workflow Strategy at NewTek
It is an exciting time for broadcasters. Around the globe, broadcasters and video production professionals are navigating the transition to IP, perhaps the biggest change the video production industry has seen in the last half-century. Where once a tiny number of producers generated video, now almost anyone can make a show and broadly distribute it across IP networks and the internet. As a consequence, consumers of video now expect more content, in more places, and they want it on their own schedules. To be successful, broadcasters today must be able to quickly and easily produce content curated for many smaller groups of viewers, and do so while reducing costs. After all, the true end goal of the transition to IP in broadcast is to gain the flexibility, efficiency, and cost savings that the larger IT industry can provide.

Recently, the ratification of major standards such as SMPTE 2110 and the continued mass adoption of proprietary protocols like NewTek's NDI have begun to paint a picture of how IP solutions will emerge. What this means for broadcasters is that video production over IP is no longer a future milestone, but a reality today.

Other areas of the modern production facility have long used networked IP infrastructure in their workflows. Software-driven applications within the live production space, like graphics generators and video playout servers, are already connected to the rest of the facility via the network for interface and control, leaving only the raw audio and video streams in an SDI format. The ratification and adoption of IP standards for moving high-quality, low-latency video suitable for live switching now brings the rest of the studio into the same fabric. However, the need to support existing SDI-based gear is clearly apparent for a successful transition to IP
for several reasons. Large investments in SDI core routers and many edge devices serve a legitimate function, and there is little reason to get rid of them quickly. The HD signal types commonly used are HD-SDI (1.5Gb/s) and 3G-SDI (2.97Gb/s in progressive format), and these uncompressed formats can be moved into 10Gb/s network routing easily. Higher resolution formats such as UHD-1 (12Gb/s) and UHD-2 (24Gb/s) require a different approach and are the topic of a different debate. Further complicating matters are various uncompressed IP formats introduced prior to the ratification of SMPTE 2110 and the introduction of 12G-SDI. As a result, some manufacturers are resisting putting IP-capable I/O on their equipment. This dictates that a proper transition to IP will require hybrid systems taking advantage of best-of-breed devices, whether they are SDI- or IP-based.

Conversion between SDI and IP is an important design consideration for engineers as they plan out the IP infrastructure. While it is a practical reality that conversion will be required in a hybrid facility, the amount of conversion should be weighed against the cost of a full infrastructure switchover. Depending on the type and capabilities of the conversion products, there are hidden costs when dealing with multiple resolutions and formats. Additionally, heavy use of conversion usually comes with increased power consumption, adding to cost.

With IP, all traffic in a facility can be on a single common infrastructure. However, transitioning to IP for the sake of using IP alone gains little benefit in either workflow or investment. If an IP implementation did no more than provide a simple, unidirectional connection between devices, perhaps reducing expense, it may not justify the effort. Once studios are connected on the IP network, however, the physical 'normal'
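The link-sizing arithmetic implied by those figures is easy to sketch. The stream rates below are the nominal figures quoted above; the 90 per cent usable-capacity headroom is an illustrative assumption, not a measured or standardised value.

```python
# Rough capacity check: how many uncompressed SDI-rate streams fit on one
# Ethernet link. Rates (Gb/s) are the nominal figures from the article.
STREAM_RATES_GBPS = {
    "HD-SDI": 1.485,
    "3G-SDI": 2.97,
    "UHD-1": 12.0,
    "UHD-2": 24.0,
}

def streams_per_link(link_gbps: float, usable_fraction: float = 0.9) -> dict:
    """Return how many whole streams of each format fit on a single link,
    reserving some headroom for protocol overhead (an assumed figure)."""
    usable = link_gbps * usable_fraction
    return {fmt: int(usable // rate) for fmt, rate in STREAM_RATES_GBPS.items()}

for link in (10, 25, 100):
    print(f"{link} GbE:", streams_per_link(link))
```

Running this shows why 10GbE comfortably carries several HD-SDI-rate streams while UHD formats demand 25GbE links or faster, matching the "different approach" the article describes.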
restrictions that traditionally impacted storytelling and content production go away. The studio itself is no longer limited to the distance between the video mixer, graphics and media playback systems, recording devices, camera controllers, audio desks, and other necessary gear. Likewise, video routing for monitoring outside of the studio is much easier: if someone on one side of the building wants to view a source on the network, they can easily do so without touching a single cable.

When each output required discrete physical hardware, this was an unavoidable constraint of system design and manufacture. For IP-connected devices, however, the number of channels a system handles is freed from such physical constraints. If there is no need to view a source, it does not have to transmit across the network. Consider a typical graphics system with 100 pages of prepared graphics ready to display. All 100 graphics pages can be offered in parallel over IP, allowing anyone who wishes to freely display any one or more of them at any time.

Critics may point out that this example ignores a key fact: even the most powerful current graphics systems are unlikely to be able to render 100 simultaneous channels of output at one time. However, this is where design considerations in the IP world come into play. A very important aspect of the IP world is the bi-directional connection between devices. Going back to our graphics system example, the source can be alerted, effectively instantly, that a request has been made for connection with one of the 100 channels.

Let's consider the benefits of the SMPTE 2110 and NDI protocols for a moment. SMPTE 2110 incorporates the concept of "essences", where elementary streams for synchronisation, audio, video, metadata, and control are all available to connected devices, which can work with only the part of the stream they require. Moving uncompressed video, audio, and data across the facility for specific needs is possible.
Similarly, NDI has the ability for receiving devices to make independent calls for audio, video, or metadata in a compressed form, taking advantage of Gigabit Ethernet environments. Not only is this interesting to engineers from a cost-savings standpoint; compressed formats are also required for fully virtualised, software-driven production systems.

Once signals are in the proper format, other design considerations exist. What kind and type of network should engineers invest in? This is yet another area of debate. If uncompressed video is required, then 10/40GbE or even 25/50/100GbE connectivity is necessary. Fully redundant systems double the infrastructure. The data bandwidth and networking protocols required for uncompressed transfer also usually involve separate logical networks for the elementary streams and timing protocol. These are not detrimental requirements; however, the heavy upfront investment arguably defies the market pressure of creating more content and delivering it to more places at no more cost.

Fortunately, the NDI option allows for a path that is digestible by many broadcasters concerned with costs, or those needing to expand operations in an incremental fashion. Using networks already in place, or reasonably available to purchase, live production workflows based on IP can be realised to train staff and refine business models. Investing today does not limit decisions in the future.

The advantage of the IP studio is that the infrastructure is agnostic to the protocols that travel across it. From the network perspective, it becomes a matter of adding bandwidth and processing to meet the needs of the protocols on top of it. This is fundamentally different for the broadcast industry, as previous advances in resolution and formats required wholesale change of infrastructure. Since the reality of the transition to IP is hybrid for the foreseeable future, there is a real economic reason to consider inexpensive and widely available network infrastructure. Operators may consider using gear that can operate on IP networks for new or smaller live production environments, which can then be attached to the traditional facility infrastructure with conversion. This allows operators and engineers to experiment with IP as the new signal type, and producers to create new and exciting shows with innovative workflows that were previously unavailable.

Ultimately, while broadcasters will feel pressure to invest exclusively in either NDI or SMPTE 2110 as the IP workflow solution to achieve their production objectives, it is simply not required.
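The pull-based behaviour behind the 100-graphics-pages example can be sketched in miniature. This is not the NDI or SMPTE 2110 API, just a toy model (all names hypothetical) of a source that only marks a channel active, and so only renders and transmits it, once a receiver requests a connection.

```python
# Toy model of a pull-based IP source: channels consume no render time or
# network bandwidth until a receiver subscribes to them.
class GraphicsSource:
    def __init__(self, num_channels: int):
        # False = channel is advertised but idle (not rendering/transmitting)
        self.channels = {n: False for n in range(num_channels)}

    def request(self, channel: int) -> None:
        """A receiver subscribes; only now does the channel start rendering."""
        self.channels[channel] = True

    def release(self, channel: int) -> None:
        """The last receiver disconnects; the channel goes idle again."""
        self.channels[channel] = False

    def active_count(self) -> int:
        return sum(self.channels.values())

gfx = GraphicsSource(100)   # 100 prepared graphics pages, none rendering yet
gfx.request(7)
gfx.request(42)
print(gfx.active_count())   # prints 2
```

The point of the sketch is the asymmetry: all 100 channels are advertised, but only the two that were requested cost anything, which is what the bi-directional connection makes possible.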
With NDI serving as both a standalone IP standard and a complementary technology to SMPTE 2110, clients at every level of production can outfit their environment to preference, transition at their own pace, and implement new business models with complete confidence. Live production is a real-time art form, spontaneous by its very nature. The more availability to sources and content, the greater the creative options. The transition to IP is no longer a concept that is based in the future, but rather one that can be built on today, which is why it is an exciting time for broadcasters. n
‘The true end goal of the transition to IP in broadcast is to gain the flexibility, efficiency, and cost savings that the larger IT industry can provide.’
A POST-BREXIT VISION
Estonia’s minister for culture Indrek Saar talks to TVBEurope about why the country is a viable alternative to London for broadcasters post-Brexit
Why should broadcasters move to Estonia?
Estonia's advantage, compared to other countries, is primarily our advanced digital business environment, including the possibility to establish companies quickly and the minimal bureaucracy involved. There is a transparent and simple taxation environment in Estonia, and business can be conducted quickly and transparently with the national tax board as well as with other state institutions and authorities. Plus, Estonia has a high level of English language proficiency among the population, and all relevant legal acts are available in English and retrievable online too.

Within the media services context, there are inexpensive licensing fees (from €510 to €960 per license, for a maximum period of five or ten years) and fast processing of licenses (one month for a conditional access license). There are also relatively few additional requirements applied to media services providers
operating in Estonia (e.g. in the context of quotas, the protection of minors etc.). We can offer educated, multilingual and skilled labour at a more reasonable cost than in many other countries.

What can you offer that London won't be able to post-Brexit?
Estonia's proposal enables media companies currently under UK jurisdiction to continue operating in the EU's digital single market following Brexit. If there is no special agreement for broadcasting services between the UK and the EU after Brexit, broadcasting companies operating with an Ofcom license will lose the right to provide cross-border television broadcasting in the EU. At the same time, UK-based broadcasting companies will have the possibility to direct their services to the EU market via another EU country.
Broadcasters who are currently using Ofcom broadcasting licenses for broadcasting into the EU will find flexible license terms, as well as facilities providing playout, asset management and distribution services to secure Country-of-Origin (COO) principles, in Estonia. In addition, even though Estonia has not so far been a traditional European broadcasting hub, we try to offer broadcasters a possibility to table their concerns and to be heard.

Broadcasters are moving closer to an all-IP workflow. How ready is Estonia to meet that requirement? And the same for AI, blockchain and 5G?
It is certainly true that broadcasters are growing warmer to IP-based solutions and workflows. For example, Estonian digital services provider Levira provides both IP- and SDI-based infrastructure and workflows, and is already successfully providing IP-based distribution and playout services to its current customers.

With regards to AI, the hype is also present in Estonia. There are companies developing intelligent automated solutions, but also those developing blockchain cyber security technologies, with over 150 cryptographers, developers and security architects; Guardtime, for example, has been working in this field for over a decade. In addition, LHV, a bank based on local capital, has launched blockchain into its offering for a fast and safe money transfer solution via its own wallet application.

With regards to new technologies, 5G is seen as something to liberate data users even more. Currently the Estonian population is fully covered by some form of data connection, and 99 per cent of the country is covered by 4G. Data consumption, as in most countries, is on the rise due to streaming and smartphones. This, of course, is not news, but rather a reason why all the telecom companies have been testing new 5G technologies in order to provide for the increased demand.
The standards for 5G were only recently accepted, so it is probably just a matter of time until we see the first 5G data bundles appear.

What guarantees can you offer content producers regarding online security?
Estonia ranks first in Europe for cybersecurity (Global Cybersecurity Index, ITU, 2017), is HQ for NATO's Cooperative Cyber Defence Centre of Excellence, and is home to the EU's IT Agency, which manages and develops mission-critical infrastructure. Online security is considered a very important topic in Estonia; we were the first country to suffer from a
true cyber-attack back in April 2007, and we managed to overcome it with very minor consequences. Since then, online security has been among the most important topics for us.

Estonia is the first country to use blockchain on a national level. KSI (Keyless Signature Infrastructure) is a blockchain technology designed in Estonia and used globally to make sure networks, systems and data are free of compromise, all while retaining 100 per cent data privacy. With the KSI blockchain deployed in Estonian government networks, history cannot be rewritten by anybody and the authenticity of electronic data can be mathematically proven. It means that no-one – not hackers, not system administrators, not even the government itself – can manipulate the data and get away with it.

How can broadcasters and content providers take advantage of your e-residency scheme?
Within e-residency, an EU-based company can be registered entirely online. An e-resident can digitally sign contracts and other documents, as well as choose from a variety of trusted service providers that offer turn-key solutions for business administration. E-residents can access Estonian government digital services such as the online tax authority to supply the required reports, converse online with the regulatory office, apply for licenses and take advantage of a fully online banking solution.

E-residency makes it possible to use digital services in Estonia, but it does not replace the requirement that a company must be located in Estonia when determining the jurisdiction that applies to a media services provider. Based on the EU Audiovisual Media Services Directive, and in order to comply with the requirements of the Estonian Media Services Act based thereon, it is necessary for editorial decisions to be made in Estonia, and these, in turn, are closely related to the criteria for editorial liability.
This means that the media service provider exercises control over the choice, content and structure of the programmes, and the placement of these programmes or the programme catalogue. It also assumes that personnel are located in Estonia, and this cannot be replaced by e-residency. How closely editorial liability is connected to the everyday work of a channel depends on the nature of the service it provides. So, a channel focused on news must make editorial decisions that entail editorial liability every day, because different news is broadcast every day. Other channels, such as those that only broadcast films or music videos, can make their editorial decisions much less frequently (e.g. once a month). n
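The tamper-evidence idea behind KSI, mentioned earlier in the interview, can be illustrated with a toy hash chain. This is only a sketch of the general principle, not Guardtime's actual design, which uses hash trees with periodically published roots rather than a simple linear chain.

```python
# Toy hash chain: each record's digest depends on every record before it,
# so altering history changes the final digest and is detectable.
import hashlib

def chain(records):
    """Return the running SHA-256 digest (hex) after each record."""
    digest = b""
    out = []
    for rec in records:
        digest = hashlib.sha256(digest + rec.encode()).digest()
        out.append(digest.hex())
    return out

log = ["grant licence 42", "renew licence 42", "revoke licence 7"]
original = chain(log)

# Rewriting an earlier record propagates through every later digest,
# so a verifier holding only the final digest detects the tampering.
tampered = chain(["grant licence 99"] + log[1:])
print(original[-1] != tampered[-1])  # prints True
```

A verifier who anchored the final digest externally (as KSI does with published hash-tree roots) can therefore prove mathematically that the log has, or has not, been rewritten.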
[Advertisement: Sony, in collaboration with Juniper Networks, offers IP Live Production training for broadcasters and live production companies. www.pro.sony.eu/ip]
FEATURE
HEAD IN THE CLOUD Christy King, industry expert – media technologies, change management, and chair of the upcoming MediaTech 360 Summit, tells us what to expect from the second annual conference
I am honoured to chair TVBEurope's successful return of MediaTech 360 in London. In helping to shape content for this event, I am drawing on my 30-year career in the media business, juxtaposed against the knowledge people need today to successfully take advantage of new revenue opportunities during an era of mass disruption.

This second annual conference will feature experts who will share valuable information on media distribution solutions, new advertising technologies and concepts, the emergence of esports and Generation Z, emerging cloud solutions, and innovative ideas for data, machine learning, and AI. The TVBEurope team have a great talent for finding hidden avenues to thought leadership that help us gain a better understanding of our industry's potential.

The future vision of the media industry expands upon our networked society, where globally connected, digital, always-on consumers are more sophisticated and demanding in responding to media consumption trends. This phenomenon has been a huge challenge for every mature industry to address, but I believe the media industry, in particular, has turned the corner from thinking "this is hard" to understanding and embracing the opportunities for new businesses and revenue streams.

All providers of media content, subscription services and connectivity have to rapidly accelerate digital propositions that leverage data insights, machine learning and analytics, and combine those technologies with cloud-based approaches in order to scale effectively into tomorrow. The media industry is somewhere in the middle of this massive content transformation, with much we can learn from other kinds of data-reliant industries. While media has the luxury of knowing that in an all-digital ecosystem the most compelling programming can reach global audiences, we have
PICTURED ABOVE: Christy King
PICTURED ABOVE: A snapshot of MediaTech 360 2017
also learned that the mass-distribution, one-to-many approach is slowly coming to an end. Advertising and content are expected to be relevant to each viewer's tastes and wants, which requires a plethora of technology to achieve smart new sales and distribution approaches. There are so many new technologies creating different ways of working and thinking for content creators, distributors, and advertisers that we must take opportunities like MediaTech 360 to think through how these new concepts could open more opportunities for our businesses.

With a view to addressing many of these ideas, the first day of MediaTech 360 will feature a keynote taking a good look at the holistic, human aspect of technology's influence. Following this, an industry panel of respected analysts will jump right in to discuss the next iteration of cloud technologies and issues of media distribution. These experts will address the balance of "hype vs reality" and of scale vs customisation: what is possible today, and what can we do to prepare to take advantage of new business opportunities coming down the road? We will expand our cloud discussions by looking at how content distributors can balance availability against the costs associated with cloud infrastructures. And finally, we will round out the morning with a practical look at today's best practices in cloud-based tools and services for a variety of media applications.

In the afternoon we will hear from experts discussing innovations in dynamic advertising and how augmented and virtual reality can change the nature of advertising, and then dive into sports, esports, and the great expansion of tech infrastructure that sports tends to push first. Finally, we will cap day
one with a fantastic discussion on OTT distribution considerations such as network infrastructures, rights management innovation, and interesting new business models.

As an author and leader of change management in media businesses, I help companies get out of their assumption ruts and look afresh at workflow and revenue opportunities. On the theme of letting go of assumptions and investigating possibilities openly, we expand into day two with a deep dive into data, connected devices and “data casting,” and examine the many pitfalls of treating data lightly. We will continue our discussion of the power and promise of data in media in the afternoon of day two with a fascinating panel on transformation through artificial intelligence concepts. Next, we will turn our attention to experts sharing their successes in harnessing AI in streaming workflows and operations. Finally, our day will round out with a keynote presentation looking into the future. Our industry expert will showcase the impact of technology on human resources, improvements in strategy with big data, and a review of which emerging technologies should be attracting the most attention from investors.

I am excited to hear what all of these esteemed panellists and presenters have to share with us about the future of media. The content of MediaTech 360 is a natural extension of the curiosity we all share in our industry as we drive forward toward positive change. I look forward to seeing you there – and please do not hesitate to contact me in advance with the questions and curiosities you would like to hear addressed at MediaTech 360. n
FEATURE
IT’S ALL ABOUT AGILITY TVBEurope talks to Philippe Bernard, CEO, Globecast
PICTURED ABOVE: Philippe Bernard
What’s your professional background?
As a young engineer working in digital TV R&D, I was one of the initial five key engineers who helped create digital TV in France. Six years later I was heading up the French digital TV project. Then I moved across to working on B2B data networks, and that was my route into the B2B world. I then joined Equant, which went on to become Orange Business Services, where I became CEO of the French affiliate. After that, I held a variety of roles with Orange, working across the mobile sector, culminating in taking charge of customer service for the entire Orange Group. In January 2016, I joined Globecast as CEO, in some ways coming full circle, but equipped with vital knowledge of both the B2C and B2B communications sectors. I found the opportunity to work across video and data – the whole content and media management proposition – with Globecast very exciting.

What is the role of the CEO of a modern technology services provider? What do you do?
My first key challenge was to improve Globecast’s profitability in order to fuel sustainable growth. This is now the case, while we’ve also put considerable fresh investment into the US and we’re expanding and reshaping our operations in Asia, with a new MD – Jimmy Kim – recently appointed. As CEO, an essential part of my role is also to anticipate genuine breakthroughs in the market, not only in terms of technology but in terms of market forces too, as the arrival of new players will change the dynamic. This is achieved in multiple ways: talking with customers, listening to staff, and networking across the industry. We all have to ensure that Globecast is agile enough to very quickly identify and adapt to market trends and test out new ideas. Our customers don’t always know what the effect of new entrants will be, and at
the moment that’s the FAANGs: Facebook, Amazon, Apple, Netflix and Google. Our move to cloud playout is a prime example of pushing the boundaries to create greater agility. So my role is a mix of finance, market understanding and engineering.

How much is that challenge enlarged by the fact that you’re a global player?
This is undoubtedly a challenge. Being global is central to what we do and is vital in today’s market. But the challenge for me has been – and still is, to a certain extent – that Globecast was not built from a single company. Rather, there have been quite a lot of acquisitions along the way in all our geographies. This means that we still have a little way to go in terms of completely unifying our processes. But the gradual move to IP infrastructure, virtualisation and the cloud is helping us to complete that process.

There’s a lot of market fragmentation at the moment – both in terms of how customers access content on devices and how that content is delivered. How do you service that?
Globecast’s history is in channel distribution and occasional-use services via satellite. But we’re very different now. At the core of what we do is media management, and that powers a range of services: for linear TV, VoD, OTT, and now short-form VoD content creation with Globecast liveSpotter. That last point might sound minor, but it isn’t. Let’s take the sports market as a prime example. Live linear TV is still very powerful and will remain so for a long time to come. Indeed, live is where linear TV still scores very highly. But, as an example, around a linear broadcast sports channel there’s now an increasing requirement for second screen content as well as across social media. Fans want highlights of a match when they can’t see the whole game. Motor racing fans want additional data – lap times, gaps between competitors and so on. They want interviews, behind-the-scenes footage – a 360-degree experience. And content providers have to keep fans engaged between events too, and promote those upcoming items. We are satisfying that with liveSpotter, which can create short-form VoD content very, very quickly – almost in real time – automatically or manually.

It’s the internet where things come together. We’ve seen a huge rise in providing streaming services – a prime example is across niche sports, but also for second screen activity. Once we have the feed, it’s easy to stream out to Facebook Live, YouTube Live or embedded web players. For our customers it’s one of the key benefits of a single-provider approach. There is no longer a one-size-fits-all way of working in anything that we do. Again, agility is the key word. We do strongly believe two things: that linear TV will maintain its strength in delivering live content, and that satellite is alive and growing. We’ve heard much talk recently about the slow demise of satellite and, while there are threats, its reach is clearly unrivalled, both for DTH and B2B services.

What about regional variations?
Yes, that’s an important question. What works in one region won’t necessarily work in another. There’s huge geographic variation. OTT is a prime example, with some markets requiring very high-end, complex solutions; others, very simple, quite basic services. OTT is clearly a necessity and we work with customers in all regions to identify the most suitable route to market and monetisation, and therefore what technology and services should be used. If we look
at the ongoing developments in the relationship between video content access and telcos, specifically the use of smartphones to access content, again there are significant regional variations. It’s common to see across emerging markets a significantly higher percentage of people using smartphones to access video on a daily basis than in markets with a more evolved overall digital ecology. But what remains absolutely consistent across every region is the need for a premium customer experience. That might sound obvious, but too often it’s neglected by companies. Both our MyGlobecast interactive customer portal and our appointment of Jacques Rivals as chief quality and change officer are testimony to the work we are doing in that regard.
PICTURED ABOVE: Globecast’s cloud playout in action
How important for Globecast is the move to cloud-based playout?
It’s very important. Using the cloud and virtualised technologies changes the dynamic. I’m going to use the same word again: agility. We can now launch channels quickly, and we can increase or decrease capacity without financial penalty. We – and the customer – pay only for what is used. Combined with the fact that we now access a lot of the content we play out from the cloud, the whole workflow makes sense. And it means that customers can trial new services – or existing services in new markets – without financial penalty. As we move forward, doubtless other advantages will emerge. Moving to cloud playout – though of course this is far from meaning that we throw out our existing physical playout solutions – is also important in providing greater global unity across our systems. Overall this will further increase the flexibility of the services that we provide. n
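The pay-for-what-you-use model Bernard describes can be illustrated with a back-of-the-envelope comparison. The figures and function names below are invented purely for illustration, not Globecast’s actual pricing:

```python
# Illustrative comparison: fixed playout infrastructure vs. elastic cloud
# playout. All figures are hypothetical, chosen only to show the shape of
# the trade-off a broadcaster faces when trialling a channel.

def fixed_cost(months: int, monthly_amortisation: int = 10_000) -> int:
    """A dedicated playout chain costs the same whether or not it is on air."""
    return months * monthly_amortisation

def cloud_cost(hours_on_air: int, rate_per_hour: int = 25) -> int:
    """Elastic cloud playout: pay only for the hours a channel is on air."""
    return hours_on_air * rate_per_hour

# Trialling a pop-up channel for one month, on air four hours a day:
trial_hours = 4 * 30
print(fixed_cost(1))            # a full month of capacity, used or not
print(cloud_cost(trial_hours))  # cost scales down with actual usage
```

The point of the sketch is the asymmetry: the cloud figure falls to zero when the trial ends, which is what makes trialling new services “without financial penalty” possible.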
FEATURE
A SONG FOR EUROPE CEO and founder of Finnish AI-powered engagement platform Camment, Tomi Kaukinen, talks to Colby Ramsey about the interactive technology set to be deployed during this year’s Eurovision Song Contest
PICTURED ABOVE: Tomi Kaukinen
Camment is an implementable plug-in for digital streaming platforms that lets viewers have private video chat groups on top of TV shows, request player statistics from Camment’s AI bot when watching sports, or otherwise engage with the show they are watching via artificial intelligence powered ‘bots’. A recent deal with RTVE will see the state-owned Spanish broadcaster employ a karaoke singing bot to allow viewers to sing and compete against one another during the live broadcast of this month’s Eurovision Song Contest. The technology, which is implemented into mobile, connected TV applications and web streaming platforms, uses a facial, emotion and speech recognition system to provide broadcasters with valuable viewing data.

“We have an SDK; it’s a plugin that the broadcaster can install very simply and uniquely into their OTT platforms,” says Camment CEO and founder Tomi Kaukinen. “The integration depends on how good the developer’s product is; it can take anywhere between two hours and a day. So it’s a very lean project in that sense.”

When did Kaukinen recognise that there was a demand for the technology? He explains that while his previous company worked in mobile apps with a lot of the Spanish media at the time, it was clear there was a bit of a reluctance to promote third party apps.

“They prefer asking people to download their own apps from their own ecosystem,” Kaukinen explains. “So we were basically trying to sell them solutions that were app-based, but we were told by all the media that we spoke to that they don’t like this app promotion stuff. Then we kind of redid our business model and scrapped our application, and started selling technology that enabled them to promote their own products. The pull after that started coming from the media themselves.

“We met RTVE around six months ago and just
wanted to see if, now we’ve changed the business model, we could work with them on something, which we started doing immediately.”

Kaukinen points out that second screen apps usually draw attention away from the screen, but with Camment’s in-screen technology, “the broadcasters can take ownership of their products and the data it collects.”

“We believe and we hope that the current traditional TV channels in media are being disrupted a fair bit by the likes of Netflix etc., so they need something to keep and retain their audience,” says Kaukinen. “We believe that we provide a solution that helps them stay relevant. We can see that the European and Asian markets are going to be especially good for us, and the timing is great at the moment.”

Kaukinen’s goal is quite simple: he wants to see Camment become the industry standard of engagement on VoD platforms. So why Eurovision?

“Now we have the technology to work with any music show, and the good thing with the broadcast business is that usually all the channels in the world have some kind of music show,” Kaukinen explains. “Our technology is so easy to scale and distribute, so we strongly believe that, especially for music shows that are watched by millennials on tablets and phones, it’s amazing.

“From a personal perspective, if I watch something on the small screen, I engage in chat groups with my friends, so I have to either use two devices or switch between two screens,” he continues. “We give users the opportunity to do it on the second screen, so for us, it’s about providing simplicity.”

And this is the main factor that Kaukinen is determined to achieve: simplicity for the consumer.

“For the broadcaster, it’s more about keeping the audience watching their ads, keeping them in-screen, extending the viewing time,” he concludes.
“Data shows that people watch shows for a longer time when they communicate with friends simultaneously, which makes it more interactive, more fun, and increases the value of the whole product for the media.” n
PICTURED: Camment’s technology in action
FEATURE
PUTTING FORMULA 1 IN THE OTT DRIVING SEAT In March, Formula 1 announced plans to launch a new live Grand Prix OTT service. The sport has partnered with Tata Communications to provide the technology. Jenny Priestley speaks to Tata’s Mehul Kapadia to find out more
In late 2015, Tata Communications completed a successful proof of concept with Formula 1 to deliver a live trackside feed from the Singapore track to Formula One Management’s technical HQ in Biggin Hill, UK – with no time lag between the live broadcast and the footage viewed on an app. Racing ahead two years, Formula 1 is launching the app to the consumer this month. The new Grand Prix OTT service marks the sport’s biggest investment in its digital transformation to date. Access will initially be available through desktop and web, with mobile and TV apps being phased in on Amazon, Apple and Android at a later date. CDN and connectivity services to distribute the F1 TV content globally are being provided by Tata Communications, which has a long history of working with the sport.

“This is the seventh year of our association with Formula One,” explains Mehul Kapadia, managing director of Tata Communications’ F1 Business. “When we began, we were delivering services to them and building on innovation, and we are now engaged with all parts of the Formula 1 ecosystem.”

“We work across the whole spectrum and it’s really
been a fantastic journey because we have been able to create a very bespoke sports platform,” continues Kapadia. “We are able to help all of the players within the broadcast side of F1 to leverage that platform, so for example, it allows broadcasters to unlock value within their own organisations as well as providing the stability and reliability they need on services.” Tata is well aware of the changes in the way many sports fans now consume content. A Formula 1 fan is just as likely to watch a race on their mobile whilst on a train as they are at home watching on a 40” screen. “On the video side, it’s all about ensuring the best video experience for fans whether it’s on mobile, television screen or whichever screen the viewer is using,” explains Kapadia. “Our focus has been on the fans and fan engagement and how technology can be driven there. We’ve tried to work hand-in-hand with the Formula One Group to make that come alive.” Since they began working with Formula 1, Tata has been focused on driving change in the way the sport is broadcast. “When we first began working with Formula 1 we tested remote production, so from Biggin Hill we were able to edit their radio feeds,” explains Kapadia. “Then in 2014, we tested 4K as a technology and in
2015 we moved on to live OTT video delivery, which can be delivered synced up with broadcast. Most recently we’ve been testing 360-degree video.” Asked who instigates the testing, Formula 1 or Tata, Kapadia says, “We always want to be ready as new requirements come up or new engagement models need to be built.” “When we were testing 4K for example, devices that could broadcast 4K weren’t really available, and broadcasters weren’t really doing it yet,” he continues. “So what tends to happen is we work in partnership with Formula 1’s technical teams and their CTO on technology that in the future can benefit them. There is no pressure on them that they must immediately deploy the technology. Most of what we do is led by
what is going to be relevant in the future and what we can test out now.” It was that forward thinking, and the testing of a live OTT video feed in 2015, that has led to the launch of F1 TV. “One of the fundamental challenges in a sport like Formula 1 is that in two seconds so much can happen: a crash can happen, or an overtaking manoeuvre, it all happens so quickly,” says Kapadia. “If you look at a typical mobile experience, and look at the second screen versus the television screen, one of them is coming through the internet, and one is probably coming through satellite, especially if you’re watching Sky. They can often be off sync. If you are a fan who is trying to look at a second screen experience or you’re looking at a data feed
PICTURED ABOVE: Tata Communications’ F1 Tech Pod
somewhere, you will be off sync. So we took up the challenge of how do we make it as close to real-time as possible? The laws of physics mean we will always have some latency!” he laughs. “Basically, what we were able to do was synchronise what is happening between the two screens. That then became a component which gave us more potential if ever an app was created. Liberty Media came in and they have had fantastic vision in terms of wanting to give fans even more access to the sport. Frank Arthofer, who is leading their entire initiative around digital, initiated the process of building the app, he put the whole thing together and we’ve been working very closely to launch this with them.” Tata is deploying a global content delivery network for the app – and even a special version for Formula 1, a video delivery network. “Delivering content is different when you’re trying to do a live feed which becomes
so much more challenging,” explains Kapadia. “So we deployed a CDN and VDN which ensures that as the viewer is connecting into the app, the content being delivered is seamless and reliable.”

Obviously, one of the biggest challenges of delivering live content on an app compared to traditional broadcast television is latency. Kapadia says that Tata’s ultra-light VDN minimises the latency of the OTT content as it aims to sync it to the broadcast: “Of course, we can’t control the end speed on the user’s device, so between 3G and WiFi you might still see some differences. But if the viewer is on a reasonably good speed, then the differences between the two would be something they would barely notice.

“There are of course some other factors,” he says. “But whatever we can do to make it sync up is part of what we are offering to Formula 1. Fundamentally that is what we have attempted to fix.” n
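The synchronisation Kapadia describes comes down to a simple principle: measure the end-to-end delay of each path and hold back the faster one so both screens show the same moment. A minimal sketch of that idea – the function name and delay figures are assumptions for illustration, not Tata’s implementation:

```python
def sync_hold(broadcast_delay_s: float, ott_delay_s: float) -> tuple[str, float]:
    """Return which feed to buffer, and by how many seconds, so that the
    second screen and the television show the same moment of the race.

    Each argument is that path's end-to-end latency measured against the
    live event (e.g. satellite chain vs. internet delivery chain).
    """
    if ott_delay_s < broadcast_delay_s:
        # The app stream arrives first: delay it to match the broadcast.
        return ("ott", broadcast_delay_s - ott_delay_s)
    # Otherwise the broadcast is ahead (or they already match).
    return ("broadcast", ott_delay_s - broadcast_delay_s)

# Satellite broadcast arriving 8 s behind live, OTT stream 5 s behind:
feed, hold = sync_hold(8.0, 5.0)
print(feed, hold)  # hold the OTT feed back so the two screens align
```

The residual variation Kapadia mentions (3G vs WiFi, device speed) is exactly the part this sketch cannot control: it adjusts only the delivery-side offset, not the last-metre conditions on the viewer’s device.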
SUPPLEMENT
WHY MEDIA FIRMS SHOULD EMBRACE THE CLOUD If media companies want to keep up with audience expectations, they should consider supercharging their content creation and distribution processes by adding the benefits of the cloud, writes Stuart Almond
The internet has fundamentally changed our lives – from the way we communicate to the way we pay our credit card bills. Importantly for the broadcast industry, as broadband speeds have increased, it has changed media consumption too. Viewers can now stream and download televisual content online to a myriad of devices rather than simply watch it ‘live’ via terrestrial, cable or satellite on their TV. And that content isn’t just the programmes produced for, or by, traditional broadcasters. The internet has democratised media distribution.

Video streaming and OTT services from the likes of Apple, Amazon and Netflix are increasingly popular. Netflix, for example, currently has over 109 million subscribers worldwide. Multiple-screen usage is also on the rise, with more than 87 per cent of viewers now using a second screen device when watching televised content. A 2017 Nielsen Audience Report shows that, of those viewers aged 28-36, one in four watches less television than the same age group did just five years ago. For those aged 18-27 this figure increases to 40 per cent.

While many broadcasters have embraced this change, some still face tough challenges. Creating engaging content quickly and distributing it effectively across a multitude of platforms requires greater operational efficiency – especially with the ongoing need to adhere to the traditions of ‘broadcast quality’. A fresh approach is therefore required.
And it would seem obvious that the cloud would be a key enabler for this. Yet media companies have been very slow to adopt the cloud. Broadcasters and media organisations typically operate in a siloed and fragmented fashion. Workflow is distributed amongst various departments, with little visibility across the board. Equally, content distribution processes often vary considerably, making a wide-scale roll-out challenging. For media companies, embracing the cloud often means retraining people and completely redefining workflows. In addition, there is often an inherent, institutional reluctance to move to the cloud. While all these concerns are valid, there is a way around them. The key to moving to the cloud is taking a phased approach, accompanied by the appropriate training and
staff development initiatives. Working this way, the benefits are clear. Most legacy equipment can be migrated to the cloud and, with the right expertise, managed relatively seamlessly. From small independent media organisations to huge broadcasters, the cloud can scale up and down to meet the needs of any business. Content can be uploaded in real time with visibility for all those who need it. This means that the various editing processes and production of the finished content are much faster and more collaborative.

Timely access to content also helps with audience engagement. The more content becomes readily available on as many platforms as possible, the better the overall user experience. Everyone expects a frictionless experience, and the cloud is the best technology for delivering this content quickly and efficiently to consumers. Additional revenue can also be found by transferring historic content to the cloud and monetising it. This could be incredibly lucrative for media companies and broadcasters working with partners such as Netflix or Amazon Prime, which have huge user bases.

Media companies are at a crossroads where they must adapt to a much more competitive landscape, driven by technological advancements that have led to huge changes in consumer consumption habits. Supercharging content creation and distribution with the cloud has become essential if media companies want to keep up with audience expectations. n
SUPPLEMENT
SONY TO LAUNCH VIRTUAL PRODUCTION: SHARE EVENTS. REACH FURTHER. BROADCAST LIVE. With the introduction of a pay-as-you-go cloud-based production service that negates the need for expensive on-site infrastructure, live production will never be the same again. By Barbara Rosseel, marketing and communications manager at Sony Professional Europe.
High quality and reliable live production has always required serious infrastructure, major upfront investment and an army of talented creatives and technicians. Whether it’s a sports tournament, a music festival or a fashion show, you need kit, and lots of it. And that kit needs to be in or near the venue, location or stadium, ready for the production team to work their magic. The requirement for a convoy of expensive OB trucks, mobile galleries and power generators to be on site usually puts live production out of the reach of almost anyone who isn’t a major broadcaster.

Yet, conversely, we now live in a world where our geographical location is less important during the working day. Just think of all those virtual tools we use that allow us to have face-to-face meetings without leaving our desks, or to work together on the same files even though we are continents apart. This ‘remote’ working saves us time and money. The good news is that some of these principles can now be applied to live production too, which will be
a boon for content creators that might not have the budget for a full outside broadcast set-up. And there are strong arguments for it. There will always be a need for talented people but with increasing bandwidth, web-based tools and subscription services, there is a suggestion that the traditional way of doing live production is a bit outmoded and too expensive for current media needs. This is especially true when there is an urgent need to reach more audiences, on more platforms, quicker than ever - whether that means Facebook and YouTube or a football club’s media channel. For these outlets, live production requires a more cost-effective and less kit-hungry approach. This is why the solutions experts at Sony have developed a new on-demand cloud-based service that achieves just that. Called Virtual Production, it provides content creators with a complete production toolset in the cloud that can be used for multi-platform streaming of live events. It requires no infrastructure, little investment and has limitless potential.
Virtual Production fuses multi-camera wireless acquisition and sophisticated QoS technology (for fewer dropouts over cellular LTE networks) with a range of graphics and production tools. This combination makes it possible to produce and deliver streamed HD content quickly and professionally. It is frame accurate, works with any internet protocol standard and adapts to different bit rates and frame rates to ensure consistent pictures and sound. How does it work in practice? Simple. The camera crew on location uses wireless transmitters to feed a virtual production switcher that is hosted in the cloud. At the same time, a vision mixer, based anywhere in the world and using an ordinary web browser, logs into the Virtual Production service. From there, he or she can switch the camera feeds, add graphics, logos and captions, audio and stream the output to YouTube, Facebook Live and more. It is even possible to cut pre-recorded packages into the stream. Captions can be integrated with social media feeds too, pulling comments from the web into the stream and the output can be recorded to allow for post-event highlights editing.
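The workflow described above – remote camera feeds in, a browser-based vision mixer cutting between them and adding overlays, one programme stream out – can be sketched as a toy switcher. The class and method names below are invented for illustration only; Sony’s actual service exposes a web interface, not this API:

```python
class ToySwitcher:
    """Minimal model of a cloud-hosted vision mixer: several incoming
    camera feeds, one programme (PGM) output, optional overlays such as
    logos, captions or social media comments."""

    def __init__(self, feeds: dict):
        self.feeds = dict(feeds)          # feed name -> latest frame
        self.live = next(iter(self.feeds))  # first feed starts on air
        self.overlays: list[str] = []

    def cut(self, feed_name: str) -> None:
        """Switch the programme output to another camera feed."""
        if feed_name not in self.feeds:
            raise KeyError(f"unknown feed: {feed_name}")
        self.live = feed_name

    def add_overlay(self, item: str) -> None:
        """Add a logo, caption or pulled-in social media comment."""
        self.overlays.append(item)

    def pgm_frame(self) -> str:
        """Compose the current programme frame (frames are strings here)."""
        return self.feeds[self.live] + "".join(f" +{o}" for o in self.overlays)

# A vision mixer anywhere in the world, driving the switcher remotely:
sw = ToySwitcher({"cam1": "[wide shot]", "cam2": "[close-up]"})
sw.cut("cam2")
sw.add_overlay("logo")
print(sw.pgm_frame())
```

The design point the sketch captures is that the switcher holds all state in the cloud: the operator only sends small control commands (`cut`, `add_overlay`), which is why an ordinary web browser is enough on the mixing side.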
There is no need for production infrastructure on site, just compatible cameras, microphones and transmitters (and any on-screen talent, of course). There is no need to invest upfront either. Once you have the acquisition kit, everything else is pay-as-you-go. The switcher is entirely cloud-based. There is no need to install or run any software.

If it sounds too good to be true, it isn’t. Sony has already completed a Virtual Production proof-of-concept with Swisscom for coverage of the Locarno Film Festival. A hugely successful production, it is said to have provided a 30 per cent cost saving compared to traditional production workflows.

A change is coming. With Virtual Production, it is time to say goodbye to on-site infrastructure and upfront investment. Thanks to the cloud and some Sony ingenuity, content creators can embark confidently upon high quality, reliable and cost-effective live production projects that help them to reach more audiences in new and exciting ways. To find out more about Virtual Production visit www.pro.sony.eu/virtualproduction.
PICTURED: Diagram of the Virtual Production workflow – operators in Zurich, Tokyo, London and Munich working from anywhere, with live streaming at 720p25. What it does: parallel streaming to three different destinations; add social media tweets to the PGM stream; mix the live stream with pre-produced video clips; record the PGM output and all camera streams; multiple live stream inputs (local/remote); add logos/pictures to the PGM stream; no complex set-up – just concentrate on the actual production. Find out more on www.pro.sony.eu/virtualproduction
Recapture lost audiences Break the news first across all platforms with Media Backbone Hive. It’s time to reconnect with your lost audience in a more cost effective way. As viewers look to the internet, mobile and social media for their news fix, broadcasters can now reclaim this ‘Lost Audience’ with Media Backbone Hive, a Unified Content Platform linking everyone within the organisation to enable unified news production. One production system to deliver stories faster across multiple platforms online, TV and radio. No more silos. No more duplication of effort. And lots more flexibility.
Media Backbone Hive
Find out more at pro.sony.eu/lost_audience
BLACKBIRD SINGS THROUGHOUT NAB George Jarrett speaks to Blackbird CEO Ian McDonough
Very much a MIP man in his 'content sales' years with BBC Worldwide, Turner, and Viacom, Ian McDonough enjoyed a sensational NAB debut thanks to the popularity of Blackbird Ascent and Forte, and some clever edge technology involving Mac computers. As CEO of Blackbird (formerly Forbidden Technologies), McDonough wanted to run ahead of consumer content viewing trends, and before he joined Forbidden to quickly re-launch it, he talked a lot about software that had been developed over 18 years.
"I found it absolutely astounding, mind-blowing, that what Stephen Streater had designed in the year
2000 is still today the only cloud-native codec for video editing,” he says. Pointing out that codecs derived from H.264 and H.265 are principally playback tools, and not suitable for manipulation, he adds: “They are not designed for fast turnaround content – editing very quickly, and then publishing very fast. My guiding light was that the whole industry will be much more telling with the content that is fast turnaround.” This involves anyone, and not just video cutting professionals. And consequently Forbidden needed a new ID. “I thought this was something extremely special
TVBE MAY 2018 | 23
FEATURE
PICTURED ABOVE: Ian McDonough
because I had not heard of such amazing technology before. I wanted to be the person that took it out to be commercially successful," says McDonough. "The codec is what differentiates us, and I felt we had not elevated that point enough. I really like the name Blackbird. It is cool, and I wanted to get behind a point of differentiation and to take it to another area."

Reappraisal of the company was important, because when technology companies have been around for a while the industry tends to think things have moved on. "We are very much of the time, in fact ahead of the curve," says McDonough.

He moved on to identifying the critical challenges facing the business formerly known as broadcasting. He says: "The industry is morphing very quickly. There are a number of critical issues, and the first is the massive fragmentation. The amount of video available is much larger in volume, but the audience for each piece of video is that much smaller because the audience has been fragmented, and cost becomes such a key area.

"The world has moved very much away from a fixed-cost, bespoke workstation model with software that is on premise, to a rental model, where people want to hire software from the cloud that can be played on any device or workstation. And the editing is done by professional people who don't necessarily have to be highly skilled craft editors," he adds. "Every second counts now in terms of getting content out to the consumer, and the other cost is rights."

We are living in a content gold rush, and Blackbird Media Proxy is quite a tool. "It can be created extremely quickly from content, only five seconds behind live. It brings the size down by about 90 per cent compared with other codecs, so we are very much part of the story: content can be edited anywhere in the world right behind live, and published at that rate as well. The costs are down and Blackbird Proxy is making turnaround incredibly fast," says McDonough.

AN AI INTERFACE FOR THE EDITING SOLUTION
At NAB visitors could assess Blackbird Ascent and Blackbird Forte. "The first by definition is the entry level product,
designed for non-editing professionals working in the media industry who come from other departments. Forte is a full editor, with 18 video tracks and 36 audio tracks. There are 18 different camera positions that you can edit from that one suite," says McDonough.

Explaining the 'five seconds behind live' values, he adds: "If you want the video to be one minute, it could be edited one minute and five seconds behind live. If you wanted it to be 10 minutes of video, it is that plus five seconds behind live."

Interoperability is a vital byword around the business. "We can operate with virtually any system out there, and we very much encourage any organisation that wants to move down the line of cloud editing to seriously look at taking on a package like Blackbird within their system," says McDonough. "For emerging publishers working direct to social media we could be an end-to-end solution, and for those more traditional players with legacy systems going back generations, we can be very much the cloud enablers. They can bring Blackbird Proxy and our other tools into their stack and have all the benefits of the cloud."

HDR and AI are two hot technology areas that play well with Blackbird. "We can edit and conform against 4K and HDR and we can proxy from that with no problems," says McDonough. "AI is something we have looked at in a lot of detail, in terms of how we create an AI interface for the editing solution. We have talked to one group in particular about how we integrate that, but it was not there for NAB. It is an absolute priority for the roadmap this year.

"One of the key things for AI as a use case for us is for looking at archive - building AI in modules," he continues. "In a sports archive for instance you could go and look at every Diego Maradona goal, and then use the Blackbird software to proxy these huge volumes, and scan through very quickly."

CREATED USING PSYCHOLOGY
Making archives searchable is a key element of the Blackbird roadmap.
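The 'five seconds behind live' rule and the quoted 90 per cent proxy reduction reduce to simple arithmetic. The sketch below is purely illustrative, using only the figures quoted in this feature; the function names are ours, not Blackbird's:

```python
# Illustrative arithmetic for the "five seconds behind live" model and the
# quoted ~90 per cent proxy size reduction. Function names are hypothetical.

PROXY_DELAY_S = 5          # "only five seconds behind live"
PROXY_REDUCTION = 0.90     # "brings the size down by about 90 per cent"

def publish_lag_s(clip_length_s: float) -> float:
    """Earliest time behind live at which a clip of this length can publish."""
    return clip_length_s + PROXY_DELAY_S

def proxy_size_mb(source_size_mb: float) -> float:
    """Approximate proxy size relative to a comparable codec's output."""
    return source_size_mb * (1 - PROXY_REDUCTION)

print(publish_lag_s(60))      # a one-minute clip: 65 seconds behind live
print(publish_lag_s(600))     # a ten-minute clip: 605 seconds behind live
print(proxy_size_mb(1000))    # roughly 100 MB of proxy from 1 GB of source
```

In other words, the lag is constant regardless of clip length: only the fixed five-second proxy delay is added on top of the content itself.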
And when AI is used to bring forward content, the Blackbird software allows the user to see where the cuts and clips reside within a large amount of content. "This is used a lot with reality programmes in post production, so it means there is a human involved even if the AI software lets an erroneous piece of content get through to the final edit," explains McDonough.

"Our Proxy has been honed and crafted rather than just programmed. It was created using psychology - using how people imagine they see colour. So it is a perception model, and it also uses elements of AI.

"We have created a very high quality, low bandwidth codec through understanding how the human eye and brain work together. The codec informs what the next colour along should be, based upon learning. There are elements of AI in there as well," he adds.

NAB was fertile ground for talking about partnerships with other vendors. "Yes," agrees McDonough. "We are looking at how Blackbird could fit into an OEM model, where we would be the original engineering model. So we would be right into the systems of other, larger companies that are rolling out across hundreds of news and sports networks.

"It is an area where we have had some fruitful conversations, and this is an opportunity that gets us quite excited," he continues.

Dr Stephen Streater, Blackbird R&D director and part owner of the company, started the codec development after he sold Eidos in 2000. Streater completed a PhD at Cambridge on the use of AI video in self-guided missiles, and then decided to create the world's best video compression codec. "Over 18 years he has honed it (many versions) to what we have today. I take the tool sets and our software for demo purposes in the US and the clients have seen nothing like it still," says McDonough.

"Our customer base is far and wide, but one of the key things I have done since joining the business is to really push it towards the US market. That market is just bigger, and also significantly more complex: there are hundreds of local and regional news and sports
PICTURED ABOVE: Blackbird Ascent
networks and then the national networks. "That means our software solution has a much larger addressable market, and those companies are much more willing to do something new with a new technology."

MORE THAN EIGHT MILLION HOURS
A customer like Madison Square Garden Networks is proof of the pudding. "They have used our software for two years, and have been nominated for an Emmy for the workflow that uses our software. It is a workflow that edits the New York Knicks and New York Rangers (basketball and ice hockey) feeds, and they can add closed captions to their digital output with incredible speed and accuracy," explains McDonough. "More than eight million hours of content has now been processed through our software."

This impressive volume is helped by credits from UEFA and the European Golf Tour. He adds: "It is used on the most valuable sports content in the world, and the reason the software is so incredibly fluid and frictionless is because we use edge technology. You have a computer that sits very close to the cloud, very close to the high bit rate: it is able to process the high
bit rate and upload it to the cloud very quickly."

Edge technology is being talked about as a major innovation and generated a great deal of chat at NAB. "We have been using it for several years and it is very much part of our key infrastructure. It is why our Proxy can be made so quickly," says McDonough.

"In addition to Blackbird Ascent - our entry level easy clipper - and the full tool set in Blackbird Forte, we have the innovation of Mac OS ingest. This is a very new thing, with a Mac computer acting as our edge server. You can ingest that high bit rate content and create the Blackbird Proxy from a Mac as opposed to using a Linux box."

This is a big deal because users do not need to buy specific and unique hardware, and it again opens up the addressable market for Blackbird, which has recently finished moving its model from Java to JavaScript. "The difference is night and day in terms of who can use it. There is no download required for JavaScript and you can effectively edit from any web browser, which again increases the addressable audience and adds to the number of people attracted by our software," McDonough concludes.
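Blackbird's codec internals are proprietary, but the prediction idea McDonough describes ("the codec informs what the next colour along should be") has a classic, generic illustration: differential coding, where each sample is predicted from its neighbour and only the usually small residual is stored. The sketch below is that generic technique, not Blackbird's actual algorithm:

```python
# Generic DPCM (differential pulse-code modulation) sketch: predict each
# sample from the previous one and store only the prediction error.
# This illustrates prediction-based coding in general, not Blackbird's codec.

def dpcm_encode(samples):
    prev = 0
    residuals = []
    for s in samples:
        residuals.append(s - prev)  # store only the prediction error
        prev = s
    return residuals

def dpcm_decode(residuals):
    prev = 0
    out = []
    for r in residuals:
        prev += r                   # add the error back onto the prediction
        out.append(prev)
    return out

row = [120, 121, 121, 124, 130, 129]   # e.g. one row of pixel luma values
enc = dpcm_encode(row)
assert dpcm_decode(enc) == row          # lossless round trip
print(enc)                              # [120, 1, 0, 3, 6, -1]
```

Because neighbouring pixels are usually similar, the residuals cluster near zero and compress far better than the raw values; a perceptual model like the one McDonough describes refines which errors matter to the eye.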
PICTURED ABOVE: Blackbird Forte
PREPARING FOR GDPR Jenny Priestley talks to Channel 4’s Sarah Rose about how the broadcaster is preparing for the new regulations
On May 25th, the EU's General Data Protection Regulation (GDPR) comes into effect. GDPR seeks to give people more control over how organisations use their data, and introduces hefty penalties for organisations that fail to comply with the rules. It also ensures data protection law is almost identical across the EU.

What it means for broadcasters is the same as it means for every business that deals with customer data: those found to be breaching the regulations will be hit with significant fines.

For broadcasters operating VoD services, GDPR is particularly significant, as Sarah Rose, chief consumer and strategy officer at Channel 4, explains: "Increasingly, broadcasters collate first party data from our customers because we have VoD services on which people have to register to watch content. Channel 4 has had that for five or six years now, ITV has had it for three or four years and even the BBC has it now. So that is one instance in which GDPR becomes pertinent for us, because we're handling our viewers' data on a daily basis.

"Also, just like any other business, it affects all the data that we have in the building, not just third party data from a VoD service; it's data that our suppliers use on our behalf. It's also data we hold about our staff, so it has HR implications. A lot of attention goes on how we handle the data of our external customers, if you like, but actually GDPR itself covers every single instance of data from whatever source."

For Channel 4, Rose says GDPR will
particularly affect how they deal with data around their VoD service, All4. "We serve targeted advertising on All4," she explains. "Targeted advertising is done on the basis of understanding who the viewer is when they're consuming that particular advert, because we know at the point of registration their name, their gender, their postcode, their email and their age (although it is optional for them to give us that). So, based on that information we can then target adverts that we deem to be more relevant for their demographic.

"We're very clear about that; we are already abiding by GDPR regulations anyway as a natural course of practice. But under the new regulations we have to be super clear about that," says Rose. "If viewers don't want us to serve them targeted advertising, it will be very evident on our website where they can opt out. They'll still get advertising, but they won't get targeted advertising on the strength of the data we have about them."

Channel 4 began preparing for the arrival of GDPR "about a year ago", says Rose. A lot of the planning involved understanding exactly what the regulation meant for the company, as well as implementing training for staff. Rose says that even though Brexit looks likely to happen next year, Channel 4 is still expecting to abide by the regulation, as the UK is expected to adopt it whether Brexit happens or not.

She believes GDPR is very good for consumers and viewers: "While it's a pain to implement, it's a very good practice and it will see a lot of rogue operators no longer able to operate in our marketplace, which has to be good in terms of the license and business online long term.

"One of the issues is that the Information Commissioner's Office still hasn't formally published its interpretation of some of the rules," explains Rose. "So, for example, do we need to re-seek consent from our entire existing registered database? Or do we just tell them that we're doing it and give them the option to opt out, as we already have this in place?

"We have taken a lot of external advice and had informal conversations with the ICO. But we believe that under what they call the legitimate business interest clause, we don't need to re-seek consent, meaning we don't need to get our multiple millions-strong database to re-register with
us. But we don’t formally know that yet, although we are pretty much confident we won’t need to do that. Part of our planning process has been to cater for all eventualities in the interpretation of the regulation.” Channel 4 hasn’t been working independently as it’s prepared for GDPR. In fact, it’s been working alongside all of the other UK free-to-air broadcasters: “We haven’t compared deep notes, because obviously your practice is private to each of us,” Rose smiles. “But we’re not being competitive in this field, and actually I’d go a step further than that and say if we as broadcasters adopt very similar approaches, particularly in the communications that we give to our users, that will help everybody understand what this is about. So yes we go to regular steering groups run by the BBC but represented with the majority of the UK broadcasters around the table. The lawyers have a separate forum, in fact we instigated a legal forum for other media lawyers to check and compare notes. “What’s going to happen is that all of us have registered databases and a lot of us have the same consumers registering with us,” Rose continues. “We need not further muddy the waters by sending conflicting
messages about what this is. Bearing in mind this is a nationwide, cross industry initiative, I think the broadcast space is in quite good shape because the British broadcasting culture - so BBC, ITV, Channel 4 and Channel 5 - have a long history of respecting their consumers and having high trust and loyalty to their brand. We are certainly not at all complacent about it, we have always trodden extremely carefully with anything to do with viewer data. “I’m not saying it’s easy for us to implement GDPR because we do have to do things differently, we have to update our privacy policy etc, but we have been on a path to this for quite a long time,” concludes Rose. “When we started data collection we put something out called the viewer promise, it’s five or six years old now, but it was a promise we made to the viewer that at all times it would be evident what we were doing with their data and if they didn’t like it they could seek to withdraw it. “Therein lies the fundamental principles of GDPR as everyone now has to adopt them. We feel like it’s a continuum and not a u-turn on how we’ve handled data before.” n
CONTENT LOCALISATION ADDS VALUE TO VIDEO CONTENT By Marleen Lasche
Every day content creators, production companies, broadcasters and telcos rely on NEP's glass-to-glass managed services. NEP provides the expertise, the people and the next-generation broadcast IT facilities and cloud-based infrastructure to help its clients develop and deliver the world's biggest and best live broadcast events. By working together, especially with the client, unique workflows for the creation, management and distribution of content are developed.

NEP recently introduced an entirely new Remote Commentary solution for localising content. Remote Commentary is a proprietary solution from NEP Media Solutions, developed specifically to enable broadcasters to increase the value of commentary content, reduce event costs and deliver the highest audio standards.

You may be asking: how can you localise your content to best reach your international audience? Discovery had this ambition for its record-breaking coverage of the Olympic Winter Games PyeongChang 2018, and teamed up with technical partner NEP.

OLYMPIC GAMES COMMITMENT
Discovery committed to deliver the Olympic Winter Games to more people, on more screens, than ever before. This included the first fully digital Games across Europe, offering every minute live on its digital services. Discovery wanted to deliver the best possible experience, including providing unique commentaries for every event - events which often took place simultaneously - in the more than 20 languages required for its coverage in Europe.

'NEP was working on 13 simultaneous event feeds with nine different languages per feed, adding up to 117 languages - of which 50 were basically unique.'

The scale and complexity of this Discovery and Eurosport Olympic Winter Games project, in combination with the volume of content, was a great technical and operational challenge. The number of countries, regions, cultures and languages involved made the Olympics project unique. Discovery was therefore looking for an efficient, cost-effective remote commentary solution from a flexible, innovative partner.

NEP has been Discovery's trusted technology partner for many years, discussing and co-creating new solutions for creating, managing and distributing content. Because of this close collaboration and intensive relationship, the parties were able to make one of the most complex and team-intensive production workflows simple and scalable.

A NEW REMOTE COMMENTARY SOLUTION
For all the feeds sent from Korea to Europe, NEP functioned as a hub, in both Oslo and the Netherlands, on the Discovery-built WAN in Europe to which all the locations were connected. A total of 26 locations around Europe were connected to it: a network set up for both contribution and distribution of live feeds as well as file transfers.

Of the up to 50 fibre feeds coming from PyeongChang and 14 backup feeds on satellite, NEP was sourcing 13 simultaneous event feeds for language-altering. These included commentary audio as a mono audio feed produced in the booths at the venue in different languages. These contributed feeds went to all the Eurosport markets, so if commentary was not produced in a language at the venue, Eurosport was capable of adding
commentary from their European markets. All those different feeds were then sourced into NEP, which did the automated syncing of all audio feeds with the video, as well as an automated audio mix. That meant NEP was working on 13 simultaneous event feeds with nine different languages per feed, adding up to 117 languages - of which 50 were basically unique.

NEP built a system in which 50 languages could be mixed; this complete system was controlled by only two operators in the dedicated Eurosport Olympics Commentary control room. Before this solution, Discovery would have needed an audio engineer per commentary feed, but by working in this automated way NEP was able to reduce the 50 people mixing all the audio feeds to two.

STREAMING HOURS OF COVERAGE AND EVENTS
Once this was completely mixed and synchronised, NEP encoded the video and audio feeds for distribution onto the Eurosport Player platform. The languages involved were Swedish, Norwegian, Finnish, Danish, Dutch, German, English, Polish and Italian. Although the Eurosport Player was not delivered by NEP, the company was the source of the live event feeds. The Player offers the functionality to select the language in which the viewer wishes to listen, so Discovery was able to localise its content.

Eurosport's digital services were the only place fans could watch every minute, every athlete and every sport - online, on mobile, tablet devices or connected TVs - in their own language. More than 4,000 hours of coverage and 100 events were available, including 900 hours of live action, more than ever before across Europe.

With NEP's Remote Commentary solution, sports fans across the continent were able to watch their favourite sport in their own language. Discovery changed the viewing experience of sports fans, delivering the first fully digital live and on-demand Games across all screens in 48 European markets.
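The scale figures quoted above can be sanity-checked with a couple of lines of arithmetic; the sketch below only restates numbers from the article:

```python
# Sanity check of the scale figures in the article: 13 simultaneous event
# feeds, each with nine language variants, give 117 commentary mixes, and
# automation let two operators replace roughly one engineer per feed (~50).

EVENT_FEEDS = 13
LANGUAGES_PER_FEED = 9

total_mixes = EVENT_FEEDS * LANGUAGES_PER_FEED
print(total_mixes)             # 117, "of which 50 were basically unique"

engineers_before = 50          # one audio engineer per commentary feed
operators_after = 2            # the dedicated Eurosport control room
print(engineers_before - operators_after)   # 48 roles replaced by automation
```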
PICTURED ABOVE: NEP’s Remote Commentary
‘NETFLIXING’ MOVIES
FOR MOBILE Second Screen is a mobile app aiming to be the Netflix of short-form content. The mobile and social platform features edgy, episodic content curated by millennials for millennials. Jenny Priestley talks to founder Estella Gabriel to find out more
In 2018, some 1.87 billion people - almost a quarter of the world's population - will watch digitally streamed content on their mobile phone. That's almost double 2014's figure, and it's expected to rise to 2.33 billion by 2021.

That's an awful lot of people consuming different kinds of content, whether it's a short video on social media, a sport livestream, or catching up with a film or TV series. A good proportion of those viewers are probably watching while on a train, tube or bus journey. And as great as watching a film during your daily commute is, surely there aren't many commuters who spend 90-120 minutes on every journey?

What if it was possible to watch bite-size 'episodes' of a film on your phone? And while you're watching, you could be interacting with fellow viewers, commenting on what you're watching or checking out the cast's social media pages. That is the premise behind Second Screen, an app that rebrands and repackages feature films into bite-size episodes, essentially 'Netflixing' a film: turning a film into a binge-watch series.

Second Screen is the brainchild of Estella Gabriel, an award-winning screenwriter who also has over 15 years of experience working in technology with the likes of Samsung and PayPal. "I've always had sort of an entrepreneurial side to me," explains Gabriel. "I started thinking about independent films that I love, and how there are so many out there that never get the light of day because of the way viewing behaviour has changed. A lot of people now are watching on their phones, and they're not going to sit through a two-hour movie on their device.

"I thought about one of my favourite indie films, Miss Bala, and how it's an incredible film yet no-one I knew had ever heard of it. That just blew my mind,"
continues Gabriel. "So, I decided to watch Miss Bala at home to see if I could divide it up into episodes, to see what it would look like and how easy it would be to do. As I understand story and story structure, it wasn't hard to do at all. There are obvious natural story breaks you can find around every 10 minutes. That's sort of where I started. Then I built a little curation team of film lovers and story-trained millennials who could break up films, and we titled each episode like a series. Now we've gone through over 500 films."

Second Screen's team of content editors currently use Final Cut Pro and work with filmmakers to choose where the cuts to their films begin and end. They also work together to create a title for each episode within the film. Most of the episodes are approximately 12-13 minutes long, depending on where the natural story breaks are.

Gabriel says the team has had some great feedback from filmmakers so far: "We have shown it to filmmakers, and there were several who were part of our beta testing; most of them understand that they have to evolve or their content will die."

One of Second Screen's key tenets is to work with independent talent, and the company is currently seeking 50 Founding Filmmakers to share their work with the platform in exchange for equity within the company. "That's how we plan to get the library going," explains Gabriel. "Then, over time, we would like to bring on brands so that filmmakers can distribute their films in a way that's consumable to this mobile generation and make money, because they're not currently making money on their films."

The first of these Founding Filmmakers is Fernando Lebrija, who has brought his film Amar a Morir to Second Screen. "Fernando understands it's a great new way for a filmmaker to have a second chance with their
“We would like to bring on brands so that filmmakers can distribute their films in a way that’s consumable to this mobile generation.” PICTURED LEFT: Second Screen app as seen during beta testing
films that did not necessarily do well when they were originally released," says Gabriel. "We want great films, and there are a lot of them out there, so most of the filmmakers are really excited about it. The majority understand we're not taking away the integrity of the film, just dividing it up, the same as commercial breaks, but we're being strategic about it."

The app itself currently uses JW Player as its video platform, and Gabriel says they will look to move to a different network as the company grows. Once the user downloads Second Screen onto their phone and sets up their account, they enter the app via the Discover page. At the top of this page is where Second Screen showcases the feature films available to watch. Below that sits the Continue Watching section, where users can find the episodes for a film and pick up where they left off last time they used the app. There's also a trending section, and Gabriel says the app will have more sections as the library grows.

Once the user clicks on a film, they're taken to the episodes page. This includes a 30-second clip of an interesting or compelling part of the film; Gabriel says they don't call them trailers, as the content is delivered as a series. Users can share that preview within their network. Second Screen has a large social aspect included within the app; it will even allow a user to view an actor's social media profile, be it on Instagram, Facebook or Twitter.

Each of the episodes can be played on the phone's full screen, but again there is an area where users can interact and comment. "You can click on a commenter's name and you can see their profile and what they're watching, and what's on their Watchlist. We're also going to be adding a section for favourites, and you can also see all the people they're following," says Gabriel. "That's what the millennial generation does. So,
“We’re not taking away the integrity of the film, just dividing it up, the same as commercial breaks, but we’re being strategic about it.”
instead of having an algorithm like Netflix might have, we use a referral system: you want to see what your friends are watching. We believe it has the viral aspect that keeps you in the app, and it's got the social angle that's sort of addictive. We noticed throughout our beta testing that whenever a user got a new follower, they would come right back into the app, even if they hadn't been on the app that day, and then they might go watch an episode or two."

As mentioned above, a lot of the Second Screen content is curated by millennials for millennials. Gabriel says she has first-hand experience of how that generation isn't aware of how much amazing content is available to them: "I have two young adult daughters, and I took the youngest to watch Moonlight a few years ago and she was blown away; she had no idea that these types of movies were out there," she says. "It's not marketed to them, but it's so compelling to them. So the millennial aspect is important, because not only do we think this is a great app for everybody who might want to watch a quick 10-minute clip of something really interesting and
compelling that isn’t YouTube and that has actual depth and story; but we also think that this offers millennials and even Gen Z-ers the opportunity to get their ‘veggies in their smoothie’ [ie, to get the key highlights of the film] without having to watch a whole two hour film on their phone, they can come back to it.” Having been through beta testing, and validated the customer side of the platform, Second Screen is preparing for its official launch. “Now we’re bringing on the content and we’re looking for partnerships with distribution companies that are forward thinking, that have really great content but that aren’t mainstream,” says Gabriel. “We don’t want huge films, you can go to the movies and watch those. This is for the really great indie films that haven’t had their time in the sun.” “We’re probably going to do a soft launch with the filmmakers and their networks to start,” concludes Gabriel. “That will probably be in the next three or four months. From there we’ll open it up and market it to the masses. We definitely want it to be international eventually, but it will probably start in the US.” n
A GREATER DYNAMIC RANGE Daniele Siragusano, image engineer at FilmLight, analyses the benefits of shooting and posting HDR for movies
HDR has rapidly moved to the forefront in discussions on how to shoot and post premium content. Films are increasingly choosing HDR as a second-order deliverable, and are searching for a suitable archive strategy to prepare their content for later exploitation. Prestige television projects like the BBC's recent Blue Planet II are also using HDR acquisition and producing HDR deliverables.

So what does HDR bring to the table? We tend to talk about HDR, high dynamic range, but in fact it brings several shifts in capabilities and creative aspects. First and most obvious, it increases the contrast ratio, meaning that you can have both bright highlights and deep shadows, simultaneously or between scenes over time.

Another benefit is that it extends the palette of bright, saturated colours available. With HDR displays you can express bright, vivid and fluorescent colours that are simply not achievable on traditional standard dynamic range displays. This enlarges the colour palette the colourist has at hand. But the standard dynamic range HD colour volume
(Rec.709) already covers most of the reflected colours in a scene. Having a greater dynamic range and colour gamut therefore also brings some challenges for the colourist, as pictures can start to look quite unnatural if the colours are pushed too much towards the edge of the colour volume. New colour grading tools are needed to give the colourist the right controls. Additionally, the image viewing pipeline needs to account for the implications of natural colours.

HDR not only influences the colour aspects of an image, it also introduces various other subjective effects. Because of the greater contrast and reduced highlight compression, images are perceived as sharper. Edges that might appear relatively soft in SDR appear harder. This is seen as a major benefit in television: in genres like sport, where pictures can never be too clear or too sharp, HDR is regarded as a significant step forward, even on today’s HD displays, without the need to go to 4K.

For films or premium drama, though, directors and cinematographers are used to the language of film. For
many films that means a degree of softness, allowing light to bloom rather than to cut like a knife. The question is: can we enjoy the benefits of detailed highlights and shadows, and rich, revealing colour palettes, without losing the ability to create cinematic moods and atmospheres?

The enhanced dynamic range also affects the perception of motion. Motion artefacts, like motion judder, are contrast sensitive. Images with higher contrast in higher frequencies will emphasise these artefacts. So it is possible that a slow panning shot that looks fine in SDR can start to judder when the image is translated to HDR.

But the most important issues relate to image composition and lighting. What looks attractive when shooting for SDR can look like a distraction in HDR. Similarly, a scene that looks interesting in HDR might look unexciting in SDR.

PRACTICAL EXPLORATION

FilmLight recently took part in a one-day masterclass, organised by ARRI, in which the cinematographer Karl Walter Lindenlaub ASC lit and shot specific scenes. FilmLight was on hand, first to provide immediate viewing and correction through FilmLight’s Prelight on-set software, and then to provide a final grade of the images using a full Baselight workstation. Andy Minuth, FilmLight’s in-house colourist and colour workflow expert, performed the grade.

Lindenlaub lit the same set in a number of different ways, including morning, afternoon and evening external light. The ARRI ALEXA SXT camera sent its scene-referred output to the Prelight workstation. There the signal was managed and instantly prepared for various viewing conditions. Multiple high-quality HDR monitors allowed everyone attending the event to get a close look at what was happening. One advantage of Prelight is that it allows you to connect multiple display devices with different characteristics and simultaneously produce the correct signal for all of them, whether SDR or HDR.
Additionally, Prelight can visualise an SDR image within an HDR container to be sent to an HDR display. This allows switching between SDR and HDR at the click of a button, so the DoP, gaffer and the workshop participants could instantly see the differences between SDR and HDR outputs on the very same HDR monitor. This, in itself, is an important reason for having a tool like Prelight on set; having the full postproduction pre-visualised live gives the creative team the ability to make meaningful decisions without any guesswork.
The exercise clearly showed that the extra dynamic range creates a perceived rise in the sharpness of the image, exaggerating the effect of lighting. A good example came when Lindenlaub lit an actor in the foreground of the scene. He set his lights using SDR on the monitor, then switched to HDR to see the difference. One obvious observation was the effect of the backlight or edge light, placed to lift the actor from the background. Setting up the lights while viewing the result in SDR achieved the required separation. But when switching to HDR it was visible as an effect; the edge was far too harsh and not really “cinematic”. Conversely, lighting the same scene to produce a pleasing result on the HDR monitor (by diffusing the light source) achieved a very satisfactory look for this viewing condition. But when switched to SDR, the attendees felt that the actor did not pop out of the background in the way they would expect.
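A quick numerical illustration of why HDR highlights behave so differently: HDR deliverables are commonly encoded with a perceptual transfer function such as SMPTE ST 2084 (‘PQ’), which maps code values to absolute luminance up to 10,000 nits. The article does not name a specific encoding, so treat this Python sketch as general background rather than FilmLight’s pipeline; the constants come from the ST 2084 specification.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF, one common HDR encoding.
# Maps a normalised code value in [0, 1] to absolute luminance in cd/m^2 (nits).
# Constants are taken from the ST 2084 specification.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code_value: float) -> float:
    """Decode a PQ code value (0..1) to luminance in nits (0..10000)."""
    v = code_value ** (1 / M2)
    y = (max(v - C1, 0.0) / (C2 - C3 * v)) ** (1 / M1)
    return 10000.0 * y

# Reference black and the 10,000-nit peak:
print(pq_eotf(0.0))   # 0.0
print(pq_eotf(1.0))   # 10000.0
```

A code value of roughly 0.508 already corresponds to about 100 nits, the traditional SDR reference white; everything above it is highlight range that SDR simply cannot represent, which is why an edge light that reads as separation in SDR can cut like a knife in HDR.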
PICTURED ABOVE: Daniele Siragusano
FIX IT IN POST?

To visualise these differences on a display in a realistic manner, a whole range of dependencies needs to be taken into account, such as the scene brightness and the viewing conditions. These dependencies can be described in a colour appearance model (CAM) and applied on the fly as a colour transform on the image.

Working for multiple deliverables in SDR and HDR also highlights the need for specific grading tools operating in a scene-referred context, making physically realistic adjustments that are independent of the camera or monitor. FilmLight has developed these grading tools, and integrated the complex colour science that allows the transition between SDR and HDR, but as was clearly demonstrated in the Lindenlaub masterclass, the differences between SDR and HDR are not limited to colour. The higher frequencies available make HDR very sharp, so the visible differences are spatial and textural. For instance, edges that appear too sharp in HDR can be visually diffused with a variety of newly developed tools for texture modification. At present this is a creative decision by the colourist, looking at the image with the director and cinematographer. But research is being carried out to include the spatial and temporal aspects in the colour management framework.

STRIKING A BALANCE

We are in a state of transition. Today maybe 95 per cent of audiences will view content in SDR, even if you shoot, finish and archive in a larger colour space
PICTURED ABOVE: Baselight
for future revenue streams. Budgets and deadlines, though, mean that working across the various formats cannot be allowed to increase either shooting time or post time. The immediate solution may be to devote time broadly in balance with the revenue stream. Today that might mean grading in SDR, then spending a short time checking and refining the HDR deliverable.

If we think back a few years to the last disruptive change in the industry, the move from film to digital cinema, the same principle applied. At first, in the finishing suite you were only concerned about the release print, allocating a short time at the end to create the new-fangled DCP. Today, the focus is entirely on the digital deliverables: many directors, cinematographers and colourists never even view a film print.

The chances are that the production team will agree that lighting, make-up and set design will be done for SDR, with a quick check in HDR to ensure that nothing too obvious pops out of the picture. As HDR becomes more important in audience (and revenue) terms, more effort on set as well as in post will be devoted to it. HDR monitors on set will become the norm.

For post production it is quite important to set up an image processing pipeline that accounts for the multitude of deliverables. It is very common for a blockbuster
grade to devote the majority of its time to the standard cinema release, which is very close to the SDR TV trim. That means the grade is already made and approved for SDR, and the HDR trim is produced later. For this to work successfully, the colour grading needs to be independent of the display it’s graded on. The decisions made by the colourist need to translate in a meaningful way to all other delivery formats.

To account for these situations, FilmLight has developed a robust and truly scene-referred grading workflow called Truelight CAM, which allows the user to start with a grade for any deliverable and later produce the other deliverables on demand, without the need to heavily regrade the images and with the best image quality possible. This is much more flexible than alternative workflows, which dictate the order of the grades and trim-passes.

HDR technology is evolving and these considerations provide a snapshot of where we are today. HDR is immensely exciting, but it can change the imagery in a fundamental way, and affects spatial, textural and temporal aspects as well as pure colour and contrast. Directors, cinematographers, on-set DITs and colourists need to work together to create a body of experience, generating a new language to use HDR in the most effective, creative and immersive way, in order to serve the story.
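The display-independence described here can be reduced to a toy model: a scene-referred grade adjusts linear scene light before any output transform, so a single decision feeds every deliverable. The transforms below are deliberately simplified gamma curves invented for illustration; they are not Truelight CAM or any real output transform.

```python
# Toy illustration (not FilmLight's actual colour science): a scene-referred
# grade adjusts linear scene light *before* any display transform, so the same
# decision renders through every deliverable's output transform unchanged.

def sdr_display_transform(linear: float) -> float:
    # Simplified stand-in for an SDR output transform: clip at 1.0, 2.4 gamma.
    return min(linear, 1.0) ** (1 / 2.4)

def hdr_display_transform(linear: float) -> float:
    # Simplified stand-in for an HDR output transform: two stops more headroom.
    return min(linear / 4.0, 1.0) ** (1 / 2.4)

def grade_scene_referred(linear: float, stops: float) -> float:
    # A one-stop exposure change is a simple 2x gain in scene-linear light.
    return linear * (2.0 ** stops)

scene = 0.18                                  # mid-grey reflectance
graded = grade_scene_referred(scene, +1.0)    # open up one stop

# The single scene-referred decision then renders through both deliverables:
sdr_out = sdr_display_transform(graded)
hdr_out = hdr_display_transform(graded)
```

The point of the sketch is the ordering: because the gain happens in scene-linear space, swapping the output transform (SDR, HDR, cinema) never invalidates the grade, which is what lets a single grade produce the other deliverables on demand.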
PRODUCTION AND POST
LIVING IN A VERTICAL WORLD

Jenny Priestley speaks to the BBC’s Chris Lunn about the broadcaster’s use of vertical video in its News app
In November 2016 the BBC announced the launch of its new daily vertical video news product. In the News app, the ‘videos of the day’ section hosts seven to ten vertical video stories chosen by BBC editors, updated throughout the day depending on the news cycle. Videos run for between 60 and 90 seconds with subtitles, so they can be watched without sound.

The decision to launch a vertical video offering was taken after the BBC began to see a shift in its audience towards mobile. According to Chris Lunn, senior product manager for news, sport and syndication at BBC Global News Ltd, the Corporation was looking at its video offering across the board and realised they weren’t offering “as good a video experience on smartphones as we could.”

“So we took it upon ourselves to go back to the drawing board and ask what was the experience like for a smartphone audience,” he explains. “One of the things that came out of that was around vertical video. So, we took that forward and looked at how our storytelling formats translate to the mobile screen.”

Lunn says the BBC didn’t make a conscious effort to aim its vertical videos at a particular demographic. “Obviously smartphone adoption leans towards younger audiences but we were really targeting the smartphone user. For us, it was more about people consuming news on a smartphone and how did that experience feel and what did it look like?”

Reaction to the vertical format has been extremely positive. In terms of interviews, Lunn says users have made comparisons to FaceTime, saying content felt more personal in a vertical format, which he says the team has found “really interesting.”

“Since we’ve gone to market, we’ve seen some really strong numbers, it’s been really encouraging,” Lunn continues. “It has certainly met some of our aspirations but there is more that we want to do, as you’d expect from any kind of news publisher.
As it stands, we’re really happy with how it’s been received and the general usage of the product itself.”

Asked if there’s a particular genre that users have gravitated towards, Lunn says views tend to be driven by the news agenda. “Obviously in the case of a breaking news story then we’ll see those videos do very well. At the end of the day, we are trying to produce high quality video for a particular story and we rely on that to drive the numbers.”

The app includes a curated playlist, compiled by the BBC’s editorial team. “It leans towards a heavier, more current news story towards the top of the playlist with more ‘featurey’ content towards the end,” explains Lunn. “We do see a natural drop-off
as audiences go through a playlist. People tend to not watch everything every single time. Nonetheless, if the story is strong enough to get audiences watching it then they typically do really well.”

Typically, videos aren’t shot in vertical. The digital video team reformat the video to work across different platforms. “They could start in landscape and translate down to vertical. That’s not always the case, some things are produced in vertical specifically, but that’s quite rare,” explains Lunn. “The editors use tools such as Final Cut with some extensions to try and make the workload easier when they’re creating all these different versions that are required these days. That’s been one of the interesting things for us, trying to produce video that works across different platforms.”

“If we want to provide an optimal experience for smartphones then we see vertical video as being one of the potential ways to do it,” continues Lunn. “Particularly working on the international side of the business, we are a commercial entity as opposed to being licence fee funded, so from that point of view it extends our offering to offer vertical advertising as well, which is a different market for us. We’re trying to
get an engaged audience on the smartphone and we see vertical as a way of doing that.”

Finally, does Lunn see a future where vertical video overtakes horizontal, or landscape, in popularity with viewers? “We’ve talked throughout this conversation about the BBC very much targeting the smartphone with vertical video, but will that translate to desktop? Does it translate to tablets? Those questions are really interesting for us.

“I think it’s hard to say what format is going to be the predominant one in the future. Square exists, vertical exists, you’ve got landscape, which is kind of our bread and butter.

“The direction of travel for us is how can we have one video that works across everything irrespective of format?” says Lunn. “From the point when we shoot the thing, how do we then go through the production process for the content to work well across all these different platforms without adding tonnes of editorial overhead? I think that’s the real key issue for us. I don’t really know what the future is going to be, but at the end of the day we want to provide the best possible experience for your device.”
PICTURED ABOVE: Chris Lunn
LET’S GET VERTICAL

Vertical video is gaining in popularity with both consumers and content creators. Jenny Priestley talks to social video production company Grabyo about the move away from horizontal
When you take a picture or video on your phone, how do you hold it? Do you shoot the image vertically, or turn the phone 90 degrees? Research has shown smartphone users hold their device vertically 94 per cent of the time. So if consumers are choosing vertical video to create their own content, should broadcasters and content creators continue to stick to the horizontal format?

In recent years, social networks such as Instagram, Facebook and Snapchat have certainly been at the forefront of vertical video. And consumers appear to be embracing the format, with Facebook reporting
vertical videos are now generating increased watch times and more engagement than the traditional 16:9 format. In April, Netflix launched a vertical preview feature for its mobile app. Social video production company Grabyo launched their first vertical video product for content creators around 18 months ago after watching the trend gain popularity. As Grabyo’s CEO Gareth Capon explains, watching a video on a phone horizontally is something of a jarring user experience: “You only have to look at photos people take of their families or videos that they shoot, they will generally shoot them vertically, they
do that naturally because that’s how they hold their phone,” he says. “Snapchat obviously was the catalyst for it. They created a camera-centric, vertically focused image and then video platform, and it was showing people that you can create these really great experiences on the phone, you can use all the real estate, it’s quite an intimate experience and it fits well with the way people hold their phones.”

Grabyo provides tools for premium content publishers, rights holders, broadcasters, sports teams, music companies and others to create different types of video, whether that’s live streaming to social platforms, creating clips or creating real-time highlights. The company saw a rise in demand from customers who wanted to create their videos in different formats. “If you’re working on a Champions League game for BT Sport and they want to publish that to Facebook and Twitter, then both vertical and square formats work really well in that environment,” explains Capon. “We built the opportunity for content owners to very easily take their 16:9 and create real-time edits both in vertical and square formats. Being able to rapidly convert that into square video or vertical video is very useful and ultimately it drives engagement.”

“In 2018 it’s all about meaningful human interaction, how do you create content that people really care about, that they engage with, and using vertical video is a mechanism to do that because so much of that video on Twitter, Facebook, Instagram is viewed on a phone,” adds Capon. “If you create video which is better optimised for mobile then it potentially should have a higher chance of success and a higher chance of engagement. It doesn’t work for every format and it isn’t perfect for every type of video, particularly premium 16:9 video.”

“Mobile video is still the fastest growing part of the video market globally.
There are still more smartphones coming into existence than any other consumer device on the planet right now and because of that we know that there is going to be this increased focus on video formats which work on that platform,” continues Capon. “Both those trends are important and they drive some of the product strategy at Grabyo. That’s combined with feedback we get from our customers; in fact very shortly we’ll be launching some updates specifically focused around branding and sponsorship that are explicitly focused on these new video formats.”

Grabyo is a cloud-based platform which allows users to ingest a live feed or upload any piece of on-demand content to create video. There are two ways
to create vertical video on the platform: one is to edit a live stream or piece of online content in a browser with Grabyo’s tools, where the user follows the action and creates a vertical video that can be published to any number of third party digital platforms and social platforms. Users can also create vertical video within Grabyo’s app, either shooting vertical video within the mobile app or creating multiple vertical videos that can be stitched and published back to the core platform.

“One of the updates we’re planning in Q2 is to have support for vertical live streaming as well,” says Capon. “It may be natively shot in the vertical or the user could repurpose a 16:9 stream for a vertical output. There are some formats where it might not work as well, for example with sport; if you want to be able to switch cameras, if it’s not set up for live it’s very difficult. But if you’re talking about a live studio show or even news that vertical output can work really well.

“What we’re saying is that if you’re doing a production set-up you may set it up specifically for vertical and we can just help with the distribution of that live stream. But also if somebody doesn’t have the right equipment to set up their stream for vertical, we can help them frame that and repurpose that to still get vertical or maybe even square video output. Square is a bit easier because you get more real estate than you do on a vertical.”

So as consumers spend more time viewing video on their phones and tablets, and as vertical video continues to grow in popularity, does Capon envisage a time when viewers will only watch vertical video? “I’m not a feast or famine person,” he laughs. “I think all these formats will exist and coexist and will continue to coexist.

“Because so much of the TV production workflow is in 16:9, our televisions are still 16:9. A lot of screens are still horizontal screens and our eyes are well suited for viewing in that way.
So I don’t think that’s necessarily going to change. But when you add in VR and AR on top of that, and you’re talking about 360-degree screens and screens that have no physical edge, I just think all of these things are important.

“I think it’s the rise of mobile and smartphones in particular, combined with great social networks, that’s the real catalyst that is driving interest in vertical video,” concludes Capon. “But I don’t believe 16:9 is going to disappear. People need to be thoughtful about who their audience is, where they’re watching, what type of experience they’re looking for, and then optimise that content accordingly.”
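The repurposing Capon describes ultimately rests on simple geometry: cut the largest centred window of the target aspect ratio out of the 16:9 master (in practice offset to follow the action). A sketch of that calculation; the function name is illustrative and not part of Grabyo’s actual API.

```python
# Sketch of the geometry behind reframing a 16:9 master for vertical or
# square output (illustrative code, not Grabyo's API).

def centred_crop(src_w: int, src_h: int, aspect_w: int, aspect_h: int):
    """Return (x, y, w, h) of the largest centred window with the target aspect."""
    if src_w * aspect_h > src_h * aspect_w:
        # Source is wider than the target: keep full height, trim the sides.
        w, h = round(src_h * aspect_w / aspect_h), src_h
    else:
        # Source is taller than the target: keep full width, trim top and bottom.
        w, h = src_w, round(src_w * aspect_h / aspect_w)
    return ((src_w - w) // 2, (src_h - h) // 2, w, h)

# A 1920x1080 frame repurposed as 9:16 vertical keeps only a narrow slice,
# while a square (1:1) crop keeps noticeably more of the frame:
print(centred_crop(1920, 1080, 9, 16))   # (656, 0, 608, 1080)
print(centred_crop(1920, 1080, 1, 1))    # (420, 0, 1080, 1080)
```

The numbers make Capon’s ‘real estate’ point concrete: a 9:16 crop of a 1920x1080 master retains a 608-pixel-wide slice, while a square crop retains a 1080-pixel-wide one.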
SOUNDS OF THE FUTURE

Dolby Atmos is leading a revolution in terms of the way sound is received and the way it enhances the visual. Raja Sehgal, director of sound engineering and Dolby Atmos specialist at London’s Grand Central Recording Studios, talks to Colby Ramsey about their new immersive mixing capabilities.
Upon entering Audio Lab 2 at Grand Central Recording Studios, it is not difficult to notice the room’s slightly different feel in terms of conventional studio design. Two layers of absorption panels adorn the walls to accommodate the gargantuan haul of Exigy custom monitors, 32 channels of which are used for Dolby Atmos projects, and 54 for the third-order Ambisonics (TOA) system.

The number of channels of course depends on the space. Dolby look at the size of a room or cinema and have a predetermined algorithm that calculates how many channels the room should have, with the spatial parameters of the Atmos mix moving correctly in accordance with how many speakers the system uses. The strategic placement of the speakers themselves is calculated by Dolby, using a set of angles that are required to match up with the mix position, as Dolby Atmos specialist Raja Sehgal explains: “They came in and took all the measurements, did lots of studio drawings and kept re-working it, working in tandem with our studio acousticians White Mark to get everything correct in accordance with the Grand Central ‘look’.”

Everything that creates noise – all the computers and back-end hardware – is located in a machine room to make the studio as quiet as possible. “It’s a feature-licensed studio so it also has to meet the decibel levels specified by Dolby,” says Sehgal.
“Predominantly we’ve got the two studios in the lofts, which do stereo TV work and commercials, an ADR room for dramas and features, and three more rooms that are stereo, 5.1 and 7.1, but this room is where we do a lot of trailer mixing.”

With regards to TV commercials, which make up the majority of GCRS’ business, a big recent development has seen DCM – who distribute commercials in cinema – starting to promote Dolby Atmos more. “Cinemas in the UK have been slower than those in other parts of the world to start implementing the technology, but it’s taking off a lot better,” observes Sehgal. “Cinemas are being converted and new multiplexes are getting the upgrades. There are enough Dolby Atmos screens just in central London now, so that’s made the option for commercials to be made in Dolby Atmos more viable, and the more screens that play it, the better.

“With regards to our TV work, doing the sound design in Dolby Atmos obviously gives you a lot more flexibility. There’s a lot more you can do with it, a lot more channels, a lot more places you can put things creatively,” Sehgal continues. “We can be a lot more adventurous and we’ve had some fun with it. 5.1 does its job great but this just takes it to another level, with the Exigy speakers allowing for ultra-smooth panning. Atmos just feels more immersive and louder because you’re in the middle of a much bigger sound field.”
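Atmos is object-based: a renderer turns each object’s position into per-speaker gains across the whole array, which is what makes panning feel smooth. As a much-simplified stand-in for that idea (Dolby’s actual renderer is proprietary and spans dozens of speakers, so treat this as a toy), a two-speaker constant-power pan shows the underlying principle: total radiated power stays constant wherever the object sits.

```python
import math

# Toy constant-power pan between two speakers. This illustrates the general
# principle behind object panning, not Dolby's actual Atmos renderer, which
# distributes gains across the full speaker array.

def constant_power_pan(position: float):
    """position: 0.0 = fully left, 1.0 = fully right. Returns (gain_l, gain_r)."""
    angle = position * math.pi / 2
    return math.cos(angle), math.sin(angle)

gl, gr = constant_power_pan(0.5)
# Summed power is constant wherever the object sits, so nothing
# gets louder or quieter as it moves across the speakers:
print(round(gl * gl + gr * gr, 6))   # 1.0
```

Sine/cosine gain laws are the classic way to avoid the level dip that a naive linear crossfade produces at the midpoint.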
“Around a year and a half ago it would’ve taken me twice as long to create the Atmos mix; now it only takes slightly longer due to improved technologies and workflow” RAJA SEHGAL
PICTURED ABOVE: Raja Sehgal
So what kinds of projects has Sehgal been tasked with recently? His first two Dolby Atmos trailers were for Aardman Studios, the first of which was for the recently released Early Man. “None of the big studios like Universal or Fox have really been doing any Dolby Atmos for film trailers so it was quite a big thing that Aardman wanted to go ahead with it,” says Sehgal. “I’ve been speaking to the big studios one by one and pushing the format quite a bit, knowing that this studio was going to come online soon. Certain studios are starting to think that if a trailer warrants it, they should do it, especially because a lot of the trailers I mix are international; China and some others in the Far East are already ahead with Dolby Atmos, and even in Europe it is growing fast.”

GCRS has recently produced two commercials – one for Audi and another for Land Rover – which have been received well in cinemas. Sehgal explains that one good thing about the format is that when it goes through the Dolby decoder, it automatically switches the signal to Atmos. “So if you do an Atmos mix and it goes to an Atmos cinema, it should play in Atmos, which has always been difficult to control with any other form of cinema,” he explains. “Inevitably it takes longer to mix because you’ve got a lot more to play with. However, around a year and a half ago it would’ve taken me twice as long to create the Atmos mix, whereas now it only takes slightly longer due to improved technologies and workflow.”

Sehgal says that if a creative director comes in with a commercial and lots of musical stems, he feels he owes it to the client to experiment and put the same
amount of effort into the mix to make the product as good as possible. He also believes that now the trailer companies are each doing the odd trailer, a lot more of them will follow suit. “They’re putting it into the Odeon in Leicester Square and that’s going to be a big one,” remarks Sehgal. “Without the biggest prime site in the UK being Atmos, it’s very difficult to justify it.

“There’s been a couple of projects where car manufacturers have come in to discuss the format with us; they’re all really excited about it because it really lends itself to these kinds of commercials,” says Sehgal. “I can see the majority of content going Atmos in the not-so-distant future. A lot of new TVs have got Dolby Vision, so I expect now that Dolby Atmos will be incorporated, in alignment with the fact that not everyone’s going to rig their house up with speakers.

“There’s Blu-ray and some streaming services like Netflix supporting Dolby Atmos now, and Sky Sports have also done it with football in 4K, but terrestrial TV channels have inherently still got a way to go.”

Mixing Atmos is essentially done in the same way as any other recording session, except here the software lets users place the individual elements in specific positions in a room, “which is great because it means you can adapt pre-existing stuff very easily into Atmos and there’s no special technique required to do it; it’s all done in the post production process,” says Sehgal.

“We’ve been experimenting with the format to show clients what we can do so that they get ideas, and write things with Dolby Atmos in mind because it makes things a bit more special. I think I’ve done 10 to 12 Atmos mixes in the six weeks the studio has been up and running, which is way more than we thought.”
TECHNOLOGY
THE BENEFITS AND CHALLENGES OF DEPLOYING AN OTT WORKFLOW IN THE CLOUD

By Anupama Anantharaman, VP, Product Marketing and Business Development, Interra Systems
The global video streaming market is expected to grow from $30.29 billion in 2016 to $70.05 billion by 2021. Millennials are watching content on their mobile devices and computers more and more, with 86 per cent of smartphone users reportedly viewing video on their phones. Given the changes in viewing styles and the growing popularity of over-the-top (OTT) content, offering an OTT service is no longer a choice but a must for broadcasters.

Many broadcasters have already started to invest in setting up OTT workflows alongside traditional linear delivery. Yet OTT technologies are fairly new and still maturing, which can make the transition complex. By deploying cloud-based OTT workflows, service providers can introduce streaming services more quickly and cost-effectively.

BENEFITS OF OTT CLOUD WORKFLOWS

The number of users accessing OTT services can be very dynamic, depending on the amount of programming available, the time of day, and whether a popular sports event is being streamed. With so many different variables involved, broadcasters need a scalable workflow. That’s one of the key benefits of the cloud: it enables users to scale storage, origin servers, CDN capacity, and other systems in the workflow up and down based on demand, without impacting performance. Having this level of scalability, flexibility, and elasticity allows broadcasters to launch new services very rapidly from anywhere in the world. Moreover, with a cloud workflow deployment,
broadcasters can provide their customers with global access. OTT viewers are not restricted to a particular geographical area, whereas with a linear TV workflow, viewers are restricted to watching the service in their home country.

Adopting a cloud-based OTT workflow also drives down costs. Cloud-based solutions are based on an opex model that allows broadcasters to pay per use. With the industry still figuring out monetisation for OTT services, the opex model is less risky for broadcasters that want to launch an OTT service, as it doesn’t require a high capex investment.

With a cloud-based OTT workflow, broadcasters are free to run their business, leaving IT issues in the hands of experts (i.e., SaaS providers). OTT is very IT-centric, and not all broadcasters possess the skills to manage IT infrastructure. By putting the OTT workflow in the cloud, broadcasters don’t have to spend money on infrastructure resources and managing IT policies. Rather, they can focus on more important tasks such as acquiring premium programming, which boosts viewer satisfaction and drives up revenue.

OTT CLOUD WORKFLOW CHALLENGES

OTT is much more complex than linear TV workflows. Unlike traditional delivery, where standardised formats are in place, the OTT world is fragmented. To cater to a wide gamut of devices with multiple screen sizes, OTT content needs to be transcoded to multiple incompatible delivery formats and encrypted with
multiple DRM protection schemes. With so many variations to maintain, the volume of content that broadcasters are dealing with becomes massive.

A cloud-based OTT workflow, in particular, brings additional challenges. Broadcasters need a large amount of bandwidth to support a cloud infrastructure. It is impossible to host the entire workflow in the cloud: the incoming transport streams for linear content are all based on-premises, and there will be frequent content exchanges between the cloud and on-premises equipment. In parts of the world where high-speed internet isn't available, this can be a major problem.

Moreover, with a cloud-based OTT workflow, broadcasters need a strategy for downtime. Occasionally, cloud providers will need to perform maintenance. While this doesn't happen very often, it's important to plan downtime during a period of the day when there aren't a large number of viewers.

Content protection becomes even more important with OTT offerings, as all data flows over a public network. However, with a cloud workflow, content providers lose some control over content security; they depend on the cloud vendor (e.g. Amazon Web Services) to provide it. By being aware of these challenges, and well-equipped with the right set of tools to manage them, broadcasters and other content providers are in a better position to deliver a high-quality OTT service.

GUARANTEEING HIGH QOE WITH A CLOUD OTT WORKFLOW

QoE is one of the most important criteria for successfully launching an OTT service and maintaining profitability. With the right set of QC and monitoring tools, broadcasters can stay competitive and meet the demand for high-quality video content on every screen. There are several key requirements that broadcasters should look for in a QC and monitoring solution if they are utilising cloud architecture for OTT distribution.
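Planning maintenance for the quietest part of the day can be as simple as picking the low point in a viewing-traffic profile. The sketch below is purely illustrative — the function name and the hourly figures are invented for this example, not drawn from any broadcaster's data:

```python
def best_maintenance_hour(hourly_viewers: dict) -> int:
    """Return the hour of day (0-23) with the smallest average
    concurrent audience -- the safest window for planned downtime."""
    return min(hourly_viewers, key=hourly_viewers.get)

# Invented traffic profile: prime-time peaks, an early-morning trough.
profile = {hour: 10000 for hour in range(24)}
profile.update({20: 90000, 21: 110000, 4: 150})

print(best_maintenance_hour(profile))  # 4
```

In practice the profile would come from historical analytics, and the window would also need to respect time zones when the service has a global audience.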
Choosing a solution that is software-based is a top priority, as this will ensure flexibility of pricing, deployment, and configuration.
With an all-software solution, broadcasters can run QC and monitoring on-premises, in the cloud, or in a hybrid deployment. Many broadcasters today need to deploy QC and monitoring probes both on-premises and in the cloud. This enables them to detect QoE issues for live and VoD content at each step of the workflow — catching problems such as network jitter that can occur during encoding on-premises, and assuring that video quality stays consistent throughout packaging and distribution via the cloud. With a software-based solution, everything can be parameterised and configured according to the content provider's workflow and requirements.

Moreover, software is easy to integrate with third-party systems in the ecosystem. Selecting a QC and monitoring solution that can seamlessly communicate with transcoders, DRMs, CMSs, CDNs, NMSs, and other integral workflow components is critical. Look for a solution that offers an API; this will dramatically simplify integration between the various components in a cloud OTT environment.

Comprehensive QC checks are also important. Broadcasters will want to check content at multiple points during the preparation phase, as well as during delivery, using active and passive monitoring techniques in a complementary manner. There are many different problems they'll want to check for — including content-related, encryption, server-related, and delivery issues — all of which can impact QoE. Having a consolidated, single-screen view of linear as well as OTT workflows also helps with quick identification and isolation of fault points in the content chain.

As OTT video consumption continues to grow, broadcasters and other content providers must embrace the change. By deploying cloud-based OTT workflows, they can introduce streaming services more quickly and cost-effectively. However, it's important that broadcasters not lose sight of the importance of delivering a high QoE on every screen.
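As an illustration of what a passive, manifest-level QC check can look like, the snippet below validates an HLS master playlist: every variant must declare a bandwidth and be followed by a URI, and the rendition count must match what the packager was asked to produce. This is a minimal sketch, not any vendor's QC product; the playlist tags follow the public HLS format (RFC 8216), and the sample manifest is invented:

```python
def check_master_playlist(manifest: str, expected_renditions: int) -> list:
    """Minimal manifest-level QC check on an HLS master playlist:
    each #EXT-X-STREAM-INF variant must declare a BANDWIDTH and be
    followed by a URI line, and the rendition count must match."""
    lines = [ln.strip() for ln in manifest.splitlines() if ln.strip()]
    issues, variants = [], 0
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            variants += 1
            if "BANDWIDTH=" not in line:
                issues.append("variant %d: missing BANDWIDTH" % variants)
            if i + 1 >= len(lines) or lines[i + 1].startswith("#"):
                issues.append("variant %d: missing URI" % variants)
    if variants != expected_renditions:
        issues.append("expected %d renditions, found %d"
                      % (expected_renditions, variants))
    return issues

# Invented two-rendition master playlist for illustration.
sample = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
"""

print(check_master_playlist(sample, expected_renditions=2))  # []
print(check_master_playlist(sample, expected_renditions=3))
# ['expected 3 renditions, found 2']
```

A production probe would go much further — fetching segments, decrypting, decoding, and measuring perceptual quality — but even this level of check catches the broken-ladder problems that surface as missing renditions on particular devices.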
Advanced QC and monitoring tools are needed in the OTT cloud environment to detect issues throughout the entire workflow and ensure flawless QoS and QoE for viewers around the world.
DATA CENTRE
An insight into streaming

Deloitte's Digital Media Trends survey provides insight into how five generations of US consumers are interacting with media, products and services, mobile technologies, and the internet. The 12th edition of the survey reveals that the growth of video streaming is causing consumers to reassess the value of their pay-TV subscriptions. The report pays particular attention to the streaming habits of Generation Z, millennial, and Generation X households, which it has collectively dubbed 'milleXZials'. Below are some of the survey's key findings…
[Figure 4: key findings from the Deloitte Digital Media Trends survey]