ISSUE 15 \ DECEMBER 2019
RAISING THE BAR
MIDIS GROUP’S HUSNI HAMMOUD ON DELIVERING INNOVATION AT THE LOCAL LEVEL BY HARNESSING THE POWER OF GLOBAL TECH POWERHOUSES
CONTENTS

6 NEWS
MICROSOFT ANNOUNCES AI CENTRE OF EXCELLENCE FOR ENERGY SECTOR / 5G SUBSCRIPTIONS TO TOP 2.6 BILLION BY END OF 2025: REPORT / AVAYA UNVEILS FLEXIBLE UC AND CC CONSUMPTION MODEL

10 A FINE BALANCE
12 RAISING THE BAR
HUSNI HAMMOUD, MANAGING DIRECTOR FOR MIDDLE EAST, CEE AND TURKEY AT MIDIS GROUP, TALKS ABOUT HOW HIS COMPANY IS TAPPING THE POWER OF GLOBAL POWERHOUSES TO ACCELERATE DIGITAL TRANSFORMATION EFFORTS OF ITS CLIENTS IN THE REGION WITH INNOVATIVE SECURITY SOLUTIONS.
16 THE RISE AND RISE OF NFV
20 THE NEXT BIG THING
22 SIX TRENDS TO SHAPE THE NEW YEAR
24 MAKING A MARK
26 PLUGGING THE SECURITY GAPS
28 WHEN EXPERIENCE MATTERS
30 ORCHESTRATING INNOVATION
32 HIDDEN TUNNELS
33 UNLOCKING VALUE
36 HOW TO COMBAT TECH SPRAWL
38 GETTING SMART ABOUT AI
45 PRODUCTS

PUBLISHED BY INSIGHT MEDIA & PUBLISHING LLC
DECEMBER 2019
CXO INSIGHT ME
3
EDITORIAL
TIME TO REFLECT AND RECHARGE
As 2019 draws to a close, it is time to ponder the year that was. Without a doubt, 2019 will go down in the annals of history as a year that tested the mettle of many IT leaders. The prevailing economic conditions in the region put the onus on CIOs to understand and adopt drivers of innovation to sustain business growth. With shrinking IT budgets, they had to strike a delicate balance between spending on innovation and keeping the lights on. It is estimated that only 20 percent of IT budgets is spent on innovation, while the rest goes to meeting core business needs. In the coming years, I expect CIOs to turn to AI and automation to get a handle on the maintenance part and free up resources. 2019 is probably the year the whole region awoke to the potential of AI, and the hype has now reached a crescendo. Contrary to the global trend, in the Middle East the private sector is trailing the public sector when it comes to AI adoption. We
Published by
Publication licensed by Sharjah Media City @Copyright 2019 Insight Media and Publishing
Managing Editor Jeevan Thankappan jeevant@insightmediame.com +97156 - 4156425
see many exciting use cases emerging, and this could change the way governments operate and help solve many of the challenges they face, especially in citizen-facing roles. This year will also be remembered as the tipping point for the public cloud in this part of the world, with biggies such as AWS and Microsoft opening cloud data centres in the region. This instils confidence in many users, especially in regulated sectors such as government, healthcare, and banking, to take advantage of the scale and elasticity public cloud environments offer. Globally, only around 20 percent of workloads have moved to the public cloud, and this statistic is likely lower in our part of the world. However, that might change next year as many organisations continue to tighten their budgets and get a handle on IT operations. Whatever happens, I think 2020 is going to be an interesting year for us in the tech industry.
Sales Director Merle Carrasco merlec@insightmediame.com +97155 - 1181730
Production Head James Tharian jamest@insightmediame.com +97156 - 4945966
Operations Director Rajeesh Nair rajeeshm@insightmediame.com +97155 - 9383094
Administration Manager Fahida Afaf Bangod fahidaa@insightmediame.com +97156 - 5741456
While the publisher has made all efforts to ensure the accuracy of information in this magazine, it will not be held responsible for any errors.
NEWS
MICROSOFT ANNOUNCES AI CENTRE OF EXCELLENCE FOR ENERGY SECTOR
Microsoft has announced that it will open an AI Centre of Excellence for Energy in the United Arab Emirates, a global first for the company, to help organisations in the industry accelerate digital transformation, equip the workforce with AI skills, and collaborate on coalitions to address sustainability and safety challenges. The company revealed its plans at the Abu Dhabi International Petroleum Exhibition and Conference (ADIPEC) 2019, held under the patronage of His Highness Sheikh Khalifa bin Zayed Al Nahyan, President of the United Arab Emirates and Ruler of Abu Dhabi. Supported by partners that include ABB, Accenture, AVEVA, Baker Hughes, C3.ai, Emerson, Honeywell, Maana, Rockwell Automation, Schlumberger, and Sensia, the Microsoft AI Centre of Excellence is expected to open in early 2020. The centre will help organisations accelerate their digital journeys and drive innovation through active engagements with leading technology and industry partners, while equipping the workforce with the necessary
AI readiness to close skills gaps and enhance employability. “Microsoft’s mission is to empower every individual and organisation on the planet to achieve more,” said Omar Saleh, Microsoft’s Head of Energy & Manufacturing for the Middle East & Africa region. “Our aim with the AI Centre of Excellence is to foster innovation, develop effective collaboration, and champion AI skills development for the energy industry. We believe in the power of AI to drive business transformation, and Microsoft has been a leader in building best-in-class platforms to deliver that.” The Microsoft AI Centre of
Excellence for Energy also brings together coalitions to drive effective collaboration on the industry’s top-of-mind challenges and aspirations, with a primary mandate to create societal impact through its focus on environmental sustainability, worker safety and energy efficiency. “At Microsoft we are working on building lasting, meaningful alliances with energy industry players, technology partners and academic institutions. Together, we will infuse the energy sector with the power of the intelligent cloud, enabling innovation to flourish as never before,” said Darryl Willis, Microsoft’s Global VP of Energy.
AWS GROUND STATION NOW AVAILABLE IN THE MIDDLE EAST
Amazon Web Services (AWS) has announced the expansion of AWS Ground Station to the AWS Middle East (Bahrain) Region. AWS Ground Station is a fully managed service that lets customers control satellite communications, process satellite data, and scale their satellite operations. AWS Ground Station is expanding to multiple geographic regions to ensure customers are served across the world. With the AWS Middle East (Bahrain) Ground Station deployment, budding startup communities and enterprises in the region can communicate with their satellites and run workloads in the
Middle East to serve end-users across the region. This will help businesses and organisations throughout the Middle East speed up their digital
transformation and innovate even more rapidly. The new AWS Middle East (Bahrain) Ground Station will bring opportunities for entrepreneurs leading start-ups to experiment, iterate, and fail fast as they develop new products and services. Customers can easily integrate satellite data with other AWS services, either in the same Region or in another AWS Region, using Amazon’s international, high-capacity backbone network. AWS Ground Station is available today in us-west-2 (Oregon), us-east-2 (Ohio), and me-south-1 (Bahrain), with more Regions coming soon.
5G SUBSCRIPTIONS TO TOP 2.6 BILLION BY END OF 2025: REPORT
Ericsson expects the global number of 5G subscriptions to top 2.6 billion within the next six years, driven by sustained momentum and a rapidly developing 5G ecosystem. The forecast is included in the November 2019 edition of the Ericsson Mobility Report, alongside a range of other forecasts with an end-of-2025 timeline and communications service provider insights. Average monthly data traffic per smartphone is forecast to increase from the current figure of 7.2 GB to 24 GB by the end of 2025, driven in part by new consumer behaviour, such as Virtual Reality (VR) streaming. With 7.2 GB per month, one can stream 21 minutes of HD video (1280 x 720) daily, while 24 GB would allow streaming 30 minutes of HD video with an additional six minutes of VR each day.
The report also projects that 5G will cover up to 65 percent of the global population by the end of 2025 and handle 45 percent of global mobile data traffic. Fredrik Jejdling, Executive Vice President and Head of Networks, Ericsson, said, “It is encouraging to see that 5G now has broad support from almost all device makers. In 2020, 5G-compatible devices will enter the
volume market, which will scale up 5G adoption. The question is no longer if, but how quickly we can convert use cases into relevant applications for consumers and enterprises. With 4G remaining a strong connectivity enabler in many parts of the world, modernising networks is also key to this technological change we’re going through.” Given its current momentum, 5G subscription uptake is expected to be significantly faster than that of LTE. The most rapid uptake is expected in North America, with 74 percent of mobile subscriptions in the region forecast to be 5G by the end of 2025. North-East Asia is expected to follow at 56 percent, with Europe at 55 percent. Other forecasts include: the total number of cellular IoT connections is now seen reaching five billion by the end of 2025, up from 1.3 billion at the end of 2019, a compound annual growth rate of 25 percent. NB-IoT and Cat-M technologies are estimated to account for 52 percent of these cellular IoT connections in 2025.
IBM STUDY SPOTLIGHTS WHY DATA IS KEY TO COMPETITIVE ADVANTAGE
IBM’s 20th edition of its bi-annual C-Suite Study, “Build Your Trust Advantage,” polled nearly 900 C-level executives across the Middle East and Africa (MEA) to examine how companies in the region are achieving market leadership by emphasising trust in their use and sharing of data. The study, conducted by the IBM Institute for Business Value (IBV) in cooperation with Oxford Economics, found that market leadership is most frequently attained when an organisation establishes a high level of trust in the data from its customers, its own business processes, and across its partner ecosystem. Through the quantitative and qualitative surveys issued, it became clear there was a set of leaders, dubbed “Torchbearers”, that stood out for understanding that transparency, reciprocity, and accountability are critical ingredients for earning trust among key stakeholders. These leaders have a deep understanding that building trust
in customer relationships is a strategic imperative and work hard to earn and maintain it. In fact, 91 percent of leaders in MEA, compared to 82 percent of their peers globally, strongly believe data helps create a strategic advantage in strengthening their level of customer trust as well as their bottom lines. Torchbearers in MEA also outpace their peers in the region by 48 percent in their
capacity to respect customers’ data privacy as a core competitive advantage. Globally, Torchbearers outpace their peers by 22 percent. The study also highlights the importance of trusting the data within an organisation. Leaders in MEA were found to take great pains to ensure that the data within their own walls is accurate and clean so they can leverage it to make the best-informed decisions on important business ventures, such as developing new business models and entering new or emerging markets. The study also revealed an emphasis on the importance of creating trustworthy ecosystems. Data that simply stays within the organisation is more likely to drift out of date than to grow in value. Leading organisations are liberating their data while simultaneously de-risking data exchanges in a shared ecosystem, allowing it to circulate widely without sacrificing their responsibility to secure permissions and safeguard it, said the report.
NEWS
AVAYA UNVEILS FLEXIBLE UC AND CC CONSUMPTION MODEL
Avaya has launched its new Avaya IX Subscription programme in the international market, making it easier than ever for EMEA and APAC customers to purchase and consume Avaya’s communications and collaboration solutions to drive their business growth. Showcased at Avaya ENGAGE Dubai, the subscription programme provides customers with a flexible new consumption-based alternative to traditional perpetual pricing models and can also facilitate their transition to the cloud. Avaya IX Subscription gives customers the flexibility to scale consumption of the company’s contact centre and UC solutions based on their unique needs. This comprehensive new programme, offering monthly or annual subscription payments, enables customers to avoid the complexity and cost of software licensing and contract renewals and instead focus on growing their businesses. Additional benefits to customers of the programme include
lowered business risk, increased operational agility, streamlined budgeting and purchasing processes, and maximum flexibility when adding new services and users. Avaya IX Subscription includes access to the latest software releases, the freedom to flex up to 20 percent over the number of subscribed users at no additional charge and support from the company’s services organisation. Yaser Al Zubaidi, Senior Director, Engagement Solutions, Avaya International, said, “We’re now able to extend a subscription model for on-premise communications infrastructures. We expect our customers to shift from an on-premise
Yaser Al Zubaidi, Avaya
deployment paradigm to a private or public cloud architecture over the next two to three years, and Avaya IX Subscription provides a convenient stepping stone on that journey towards the cloud.” According to Al Zubaidi, Avaya IX Subscription also highlights the flexibility inherent in the firm’s solutions, which can be consumed across a range of deployment models and are fully capable of fitting into any given organisation’s hybrid cloud strategy. As part of the Avaya IX Subscription programme, the company is providing trade-in and upgrade offers for existing customers to protect and extend the current investments in their Avaya communications infrastructure. Customers can trade in their existing perpetual licenses for credits to be applied towards their subscription payments. For customers not running the latest Avaya software releases, the firm is also offering an “Experience Avaya” programme to upgrade to Avaya OneCloud or Avaya IX on-premise software. Additionally, Avaya IX Spaces, the company’s powerful new cloud-based platform for team collaboration and meetings, is included as part of all Avaya IX Subscriptions.
HPE LAUNCHES KUBERNETES-BASED CONTAINER PLATFORM
Hewlett Packard Enterprise (HPE) has announced the HPE Container Platform, an enterprise-grade Kubernetes-based container platform designed for both cloud-native applications and monolithic applications with persistent storage. With the HPE Container Platform, enterprise customers can accelerate application development for new and existing apps running on bare-metal or virtualised infrastructure, on any public cloud, and at the edge. The HPE Container Platform is built on proven innovations from HPE’s acquisitions of BlueData and MapR, together with 100 percent open-source Kubernetes. This next-generation solution dramatically reduces cost and complexity by running containers
on bare-metal – while providing the flexibility to deploy on virtual machines and cloud instances. Customers benefit from greater efficiency, higher utilisation, and improved performance by “collapsing the stack” and eliminating the need for virtualisation. The new platform addresses the requirements for large-scale enterprise Kubernetes deployments across a wide range of use cases, from machine learning and edge analytics to CI/CD pipelines and application modernisation. IT teams can manage multiple Kubernetes clusters with multi-tenant container isolation and pre-integrated persistent storage. Developers have secure on-demand access to their environments so they can develop
apps and release code faster, with the portability of containers to build once and deploy anywhere. “Application development is migrating to containers, and Kubernetes is the de facto standard for container orchestration,” said Kumar Sreekanti, SVP and Chief Technology Officer of Hybrid IT at HPE. “We’re combining our expertise and intellectual property from recent acquisitions together with open-source Kubernetes to deliver an unmatched enterprise-class container platform. Our container-first approach will provide enterprises with a faster and lower-cost path to application modernisation, optimised for bare-metal and extensible to any infrastructure from edge to cloud.”
LUXURY DUBAI HOTEL UPGRADES ACCESS CONTROL SYSTEMS WITH TRAKA
Traka, a global specialist in intelligent asset management solutions, has announced the successful integration of its advanced key and asset management software at Atlantis, The Palm, allowing the Dubai hotel to offer its guests the highest level of hospitality and security. An intuitive, browser-based asset control software, Traka Web, together with Traka Touch technology, provides all the tools necessary to centrally manage all Traka key cabinets and lockers. With the ability to regionalise administrative control, different areas of an organisation can customise their security needs to make keys and assets more efficient and effective for the business. Additionally, it is designed for easy and simple integration into existing third-party systems, for a seamless deployment process while
greatly reducing administration overhead. Traka, a subsidiary of access control systems giant ASSA ABLOY, has been the key management solution of choice for Atlantis, The Palm since 2009. Currently, access control systems are not given as much importance as surveillance cameras or fire alarms in an organisation’s security strategy, according to Roshin Roy, Regional Business Development Manager, Middle East, Traka ASSA ABLOY. “Key and asset management are as crucial as CCTVs and fire alarm systems when it comes to best practices in security,” he explained. “With the integration of Traka Web and Traka Touch technology into their system, Atlantis’ security team is assured that their master and other critical keys
are accounted for and that operating processes are working efficiently while guest security is never compromised. The resort’s attention to key control through the use of our systems reflects its strong commitment to continuing evolution for operational efficiency in order to deliver a new level of guest experience,” Roy added. Traka Web and Traka Touch technology provides a distinct set of features that keep users in control while optimising business processes at a lower cost. In its three-tier architecture, the data, application and presentation layers work together to make Traka Web and Traka Touch efficient and highly scalable. It also brings users and items side by side, making it easy to grant access to multiple items quickly and efficiently. Furthermore, Traka Web and Traka Touch technology provides an auto-sync system for all changes, while a suite of reports is also provided to help better utilise the assets that need to be controlled.
VIEWPOINT
A FINE BALANCE ZAHID SYED, INFORMATION SECURITY OFFICER OF ABU DHABI PORTS, ON HOW TO INTEGRATE INFORMATION SECURITY WITH THE PROJECT MANAGEMENT LIFECYCLE.
Traditionally, information security and project management have been in a mutually exclusive relationship. Most commonly, projects have tight deadlines, and it is widely believed that information security slows down project delivery. So why not just ignore the information security requirements and focus on meeting the deadline? Let me run you through a few scenarios to understand the importance of information security, especially in an IT project. You (the project manager) are informed that one of your resources shared the source code of a critical
application with a competitor of your client. Or, just after go-live, the web server is brought down by adversaries who executed a successful SQL injection attack. Or, a day before going live, your project folders on a shared location are inaccessible due to ransomware that was sent to the project coordinator by email in the late afternoon. Each scenario demonstrates a lack of control over one element of the information security triad: confidentiality, integrity, and availability. To avoid or minimise the impact of such a catastrophe, information security should be taken into consideration from the very beginning of the project
management lifecycle. Considerable attention should be paid to incorporating information security best practices into each phase of the project management lifecycle. Project managers are not expected to be security experts, but considering information security in each phase of a project will enable the project team to deliver a more secure system, process, and infrastructure.
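The SQL injection scenario above is the easiest of the three to prevent at build time: use parameterised queries instead of string concatenation. The following is a minimal sketch, assuming a Python/SQLite stack; the table, user names and query are illustrative, not taken from any real project.

```python
import sqlite3

# Toy database standing in for the application's user store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is concatenated into the SQL text.
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Parameterised: the driver treats the input strictly as data, never as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
# The unsafe version leaks every row; the safe version matches nothing.
assert find_user_unsafe(payload) == [("admin",)]
assert find_user_safe(payload) == []
```

The same principle applies regardless of database or language: any query that mixes untrusted input into SQL text is a candidate for the go-live outage described above.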
Information security should be “baked in” to any project, not “bolted on”
Prior to initiating any project, a business case is developed, including high-level project objectives and requirements along with a cost-benefit analysis and project feasibility. This phase is usually called the pre-project phase, but it is not the focus of our discussion here.
Initiation
The initiation phase is the first phase in any project management lifecycle and usually begins with a project charter defining the project scope, timelines, stakeholders, and so on. Once the project gets a “green light” and the project initiation document or project charter is approved, the project manager should engage the security team to ensure the following: • Information security objectives are incorporated into the project objectives; for example, the delivery of a secured system could be listed as one of the objectives. • High-level information security requirements are defined. • A risk assessment is performed covering all project risks, including information security risks and their implications, to identify the necessary controls.
Planning
The planning phase is the most important phase and the key to a successful project; it is where the project plan is developed. In this phase, the information security team, in coordination with the project team, should: • List the authorised hardware, software, and systems to be used during the project. • Ensure that segregation of duties is clearly defined to prevent conflicts of interest and to detect control failures. • Identify potential threats and assess security risks, such as a malicious attack (internal or external) that may occur during the project or a key resource that may not be available during the project. • Allocate a budget to reduce the negative impact of such uncertain events. • Agree on the change management
practice to avoid unexpected data loss or system failure due to unrecorded changes. • Run backup and restore jobs regularly during the project to guard against data loss due to unexpected crashes or errors. • Develop a communication plan specifying the method, frequency, audience, guidelines and technical standards for communication. For example: a document must be encrypted before it is shared over the internet; meeting minutes shall be shared only with authorised personnel via a secure method such as SFTP, encrypted email or password-protected files; and only the instant messaging tools on an authorised list should be used during the project.
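The “run backup and restore jobs regularly” control above is only useful if restores are verified. Below is a minimal sketch, assuming a Python tooling environment; the folder layout, file names and manifest format are hypothetical, used purely to illustrate backing up a project folder and checking the restore against a checksum manifest.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(src: Path, dst: Path) -> dict:
    """Copy the tree and return a {relative_path: digest} manifest."""
    shutil.copytree(src, dst)
    return {str(p.relative_to(src)): sha256(p)
            for p in src.rglob("*") if p.is_file()}

def verify(restored: Path, manifest: dict) -> bool:
    """A restore is good only if every file matches its recorded digest."""
    return all(sha256(restored / rel) == digest
               for rel, digest in manifest.items())

work = Path(tempfile.mkdtemp())
project = work / "project"
project.mkdir()
(project / "plan.txt").write_text("phase 1: initiation")

manifest = backup(project, work / "backup")
assert verify(work / "backup", manifest)       # clean restore passes

(work / "backup" / "plan.txt").write_text("tampered")
assert not verify(work / "backup", manifest)   # corruption is caught
```

In practice the manifest would be stored separately from the backup itself, so that ransomware or a crash that damages the backup cannot also rewrite the evidence used to verify it.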
Execution and control
This phase is often considered the meat of the project. In IT projects, this is usually the development, integration and implementation phase of the system, software or infrastructure. The project team, with the support of the security team, should ensure that during the execution and control phase: • The physical security of the systems and development environment is controlled and monitored. • All IT assets used in developing the system are protected using various
endpoint protection tools, such as anti-malware, host-based intrusion prevention systems, sandboxing and host-based firewalls, to protect the systems from internal and external threats. • Access control is monitored and auditing is enabled at all times for all authentication and authorisation requests. • Secure configuration is taken into consideration during hardware and software deployment. • Traffic flow between resources is controlled and secured using various methods and technologies, such as encryption, microsegmentation and firewalls. • Application software passes all the relevant security checks before it goes into production. • An adequate backup strategy is followed to back up critical resources, even during implementation. • Systems are tested against their expected functionality, performance, availability, backup/recovery, maintenance, and security. • Segregation of duties is practised to avoid conflicts of interest and control failures.
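The “audit every authorisation request” bullet above can be sketched in a few lines. This is an illustrative toy, not a production design: the roles, resources and user names are made up, and a real system would persist the log to write-once storage. The idea shown is that every allow/deny decision is recorded, with each record chained to the previous one so that deletions are detectable.

```python
import hashlib
import json

# Hypothetical access control list: role -> resources it may touch.
ACL = {"developer": {"dev-server"}, "dba": {"dev-server", "database"}}
audit_log = []

def authorise(user: str, role: str, resource: str) -> bool:
    allowed = resource in ACL.get(role, set())
    entry = {"user": user, "role": role,
             "resource": resource, "allowed": allowed}
    # Chain each record to the previous digest: removing or editing an
    # earlier entry breaks every digest that follows it.
    prev = audit_log[-1]["digest"] if audit_log else ""
    entry["digest"] = hashlib.sha256(
        (prev + json.dumps(entry, sort_keys=True)).encode()).hexdigest()
    audit_log.append(entry)
    return allowed

assert authorise("maya", "developer", "dev-server")
assert not authorise("maya", "developer", "database")  # denied and logged
assert len(audit_log) == 2
```

The denied request is just as important as the granted one: during the post-implementation review, the pattern of denials is often the first sign of a misconfigured role or an attempted breach.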
Closeout
Closeout is the last phase in the project management lifecycle. In IT project management, it is the phase where the post-implementation review (PIR) exercise is conducted after go-live to evaluate whether the objectives of the project were met. If security was included in the initiation phase with agreed information security deliverables, the project handover should be organised around verifying those deliverables. In addition, any security experiences and lessons learned should be documented, and residual risks should be noted in the risk register for further improvements in controls. Lessons learned can also be one of the most effective ways to establish a security culture in your organisation over the long term.
COVER STORY
RAISING THE BAR
HUSNI HAMMOUD, MANAGING DIRECTOR FOR MIDDLE EAST, CEE AND TURKEY AT MIDIS GROUP, TALKS ABOUT HOW HIS COMPANY IS TAPPING THE POWER OF GLOBAL POWERHOUSES TO ACCELERATE DIGITAL TRANSFORMATION EFFORTS OF ITS CLIENTS IN THE REGION WITH INNOVATIVE SECURITY SOLUTIONS.
What excites you most about your current role?
As one of the Managing Directors of Midis Group’s local office operations, I am responsible for handling the business of three major brands: Ivanti, Barracuda Networks, and ESET. We manage sales operations and support channel partners for each vendor to ensure that the proper solutions are provided and deployed for end-users across the Middle East and Eastern Europe. As an extended local arm of these vendors, we have teams on the ground in most of the markets we cover, such as Poland, the UAE, Saudi Arabia, Qatar, and many other countries. We are always looking for the right solutions in line with the latest trends around the world. However, when it comes to choosing a vendor, a key criterion is whether their solutions address the unique challenges of our region, including security and regulatory requirements, and support the digital transformation initiatives of local companies. And this is reflected in our portfolio. For example, Ivanti provides AI- and ML-driven solutions around asset management, patch management, and identity management, available both on-prem and in the cloud. This meets one of the most essential requirements in our region. Likewise, Barracuda provides e-mail and network security and cost-effective cloud storage solutions. It is a well-established brand, which has been in the MEA market for more than eight years. ESET is an internet security company that provides a comprehensive endpoint security product. With these solutions, we can tackle the unified IT requirements of our clients and help improve their security postures. As a group, our real strength is that we position these brands and create the market for them. This is why new vendors always approach us to support or to establish a market
presence for them in the Middle East and the Emerging EMEA region, and why we are growing our business by almost double to triple digits every year.

What are the key challenges that your customers are facing, and how are you addressing them?
Every end-user has a different challenge. Some are more concerned about security, some are looking for more efficiency and agility, while others are grappling with challenges around mobility and endpoints. Each of our vendors brings at least one piece of the puzzle to the table. Moreover, we know the market, culture, and requirements. We support users with the right solutions that can be deployed with minimal effort and at very competitive pricing, which addresses the budgeting constraints most of them face today.

Is the cloud an important play for you?
We have different delivery mechanisms depending on the vendor and the solution offered. For example, Ivanti offers a cloud-based platform that unifies IT operations, while Barracuda provides cloud-managed security. Most vendors are now moving to a cloud model, and it will become the norm.

With public cloud services gaining traction, do you see demand picking up for cloud security solutions?
Public cloud providers such as Microsoft Azure and AWS are now present in the region, and Barracuda offers solutions born in the cloud. We have a special relationship with Microsoft as one of their preferred solution partners. We have announced many joint initiatives to integrate our solutions and protect end users on Azure cloud infrastructure. We also have cloud generation firewalls and a Guardian offering to automate security policy compliance in the cloud, which, I feel, will become an important solution for highly regulated industries.
Within your portfolio, which solutions do you see growing at a faster pace over the next 12-18 months?
Security is the number one priority; it has been growing at a faster clip in the past few years and will continue to do so in the coming years. Cybersecurity has many different layers, and there is also a human dimension: people are the weakest link. This is why you need to focus on the security awareness of your employees and make them the front line of defense against ever-increasing cyberattacks. Our Barracuda PhishLine provides top-class security awareness training services. When it comes to IT efficiency and management, automation is becoming an important element. Ivanti tackles this by automating infrastructure, cloud, and workspace processes, which helps minimise human intervention and reduces dependency on precious resources. At the same time, with AI and analytics also becoming critical in today’s digital age, we have embedded AI into our solutions.

Do you think AI will be the future of cybersecurity?
Cybersecurity is not just one thing; it is a combination of many IPs, solutions, and processes. AI and ML are helping us make accurate decisions faster by collecting information from vast data sets and automating security operations. In the future, I think AI will play a significant role in protecting our networks and systems.

Users are investing in all kinds of security tools and solutions, yet getting breached. What is your advice to security practitioners?
You do require all these tools and solutions. At the same time, you need to invest in proper management and analytical tools to protect your IT environments, which is where technologies like AI and deep learning will become necessary to identify and mitigate threats in no time. What is the point in buying a Rolls Royce if you don’t have the means to drive it?
VIEWPOINT
GREAT EXPECTATIONS CHADWICK KINLAY, DIRECTOR, MARKETING & COMMUNICATIONS AT EPSILON, ON OPTIMISING USER EXPERIENCE WITH A MULTI-IX STRATEGY
The Middle East is one of the fastest-moving markets in terms of digital transformation. Local service providers are beginning to deploy 5G technology, while many businesses and communities are already benefitting from smart city technologies and the internet of things (IoT). Enterprises in the region have an opportunity to ride this wave of innovation if they can optimise the way they connect and deliver exceptional end-user experiences. Building cloud-centric services is not the only priority. Being able to deliver them consistently to users across an increasingly diverse and global footprint is imperative. To support new applications and services locally and around the world, Middle Eastern enterprises need a robust, flexible and scalable way to ensure these services provide a consistent and high quality of experience (QoE) that meets end-user expectations. Growth in global cloud, content and digital communications is driving the need for efficient exchange of traffic at the world’s internet hubs. According to Grand View Research, the Middle East and Africa cloud infrastructure services market will be worth $18.07 billion by 2025. These numbers point to a growing reliance on a hybrid cloud model across the region and transformation happening across enterprises. How can organisations keep up with immediate demand while, in the long term, delivering a high-quality service?
Delivering a multi-IX strategy

One of the answers is internet peering. Peering is the arrangement of traffic exchange between networks. A large internet service provider (ISP) could allow traffic from other ISPs in exchange for
traffic on their backbone network, and also exchange traffic with smaller ISPs to reach regional end points. An internet exchange (IX) provides a neutral peering point between service providers and networks, facilitating the exchange of internet traffic within an enabled physical location known as an internet exchange point (IXP). To peer at an IX, members will need to have a point of presence (PoP) at the IXP. Enterprises that choose remote peering can seamlessly connect to the IXP without being physically present at the exchange point. By choosing to peer via a network service provider, the organisation can harness existing connections to the peering platform without having to establish their own PoPs. Remote peering is paving the way for more organisations to peer at IXs globally. The benefit of not having to buy a physical port or deploy equipment at every exchange point dramatically reduces overall network costs. Enterprises will see greater cost efficiency when compared to buying IP transit and arranging peering agreements with local ISPs. This reduction in supplier and logistic costs means enterprises now have a more cost-efficient method for accessing multiple IXPs.
Another benefit of remote peering is improved network resiliency. Enterprises that connect to a single ISP have a single point of failure. By leveraging multiple peering arrangements, they can gain access to hundreds of potential partners and exchange internet traffic on a settlement-free basis. The wide range of peering partners available means enterprises have multiple redundant routes to support increasing traffic growth.

Peering remotely at IXPs also helps to reduce the number of hops between service providers, which in turn helps to lower network latency. This significantly improves end users’ online experiences when using bandwidth-sensitive applications, such as video on demand or a real-time collaboration platform.

Many enterprises recognise the power of peering to bring their digital services closer to end users. However, it gets complex when they want to serve customer demand in many regions globally and peer across continents. Remote peering has emerged as a cost-efficient way for enterprises to peer at IXs anywhere around the world. A Middle Eastern content company with a growing audience in the US and UK can deploy a multi-IX strategy across these markets that enables it to deliver high-quality services to its end users. With a new, flexible and scalable networking model, it can rapidly turn up remote peering to meet new traffic demand and seamlessly connect to customers in new regions.
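To make the hop-count argument concrete, here is a toy model of the latency benefit. All figures below are our own assumptions for illustration, not measurements from any network:

```python
# Toy model: treat path latency as hops x an assumed average per-hop delay.
PER_HOP_MS = 8  # assumed average delay contributed by each network hop

def path_latency_ms(hops: int, per_hop_ms: float = PER_HOP_MS) -> float:
    """Rough latency estimate for a route with the given number of hops."""
    return hops * per_hop_ms

transit = path_latency_ms(12)  # traffic hauled across several transit ISPs
peered = path_latency_ms(5)    # shorter path via direct exchange at an IXP
print(f"saved {transit - peered:.0f} ms")  # saved 56 ms in this toy example
```

The point is not the specific numbers but the shape of the trade-off: every hop removed by peering directly at an exchange is latency the end user never sees.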
Streamlining Connectivity The best approach is to work with network service providers who have existing relationships with the IXs. This streamlines the process of connecting to the IXP and reduces the number of vendors and suppliers. With a software-defined networking (SDN) platform, remote peering is made even simpler with on-demand connectivity to global IXs. Middle Eastern enterprises can access the IXs they need today, with the ability to connect to a whole range of IXs tomorrow.
FEATURE
THE RISE AND RISE OF NFV NETWORK FUNCTIONS VIRTUALISATION (NFV) TECHNOLOGY GIVES TELECOM SERVICE PROVIDERS FREEDOM FROM PROPRIETARY HARDWARE TO MIGRATE TO OFF-THE-SHELF SERVERS TO RUN THEIR KEY SERVICES. INDUSTRY EXPERTS SHINE A LIGHT ON THE EXPECTED BENEFITS.
Hailed as the next stage in the evolution of telecom networks, NFV allows service providers to virtualize key network applications and devices with software running on commodity hardware. This promises to significantly reduce the cost of building and operating networks, and also makes provisioning and upgrading of systems easy. According to market research firm Ovum, the market for NFV is expected to reach $48 billion by 2023, and the technology is expected to become more critical as operators start to build 5G networks and move to cloud-native applications.
“NFV allows service providers and operators to lower their infrastructure costs while speeding up the configuration and deployment of new network services, such as virtual instances of routers, firewalls, and encryptors. This highly flexible, software-based network service environment allows for the faster and easier deployment of new services, shortening the deployment process from weeks and months to days or even minutes. This agility creates significant competitive advantage, as it allows service providers to pursue new business opportunities that are not economically viable using a traditional appliance-centric approach. Additionally, NFV helps to eliminate proprietary hardware appliances from network infrastructure,” says Azz-Eddine Mansouri, General Manager of Sales at Ciena Middle East.

Lucky La Riccia, Head of Digital Services at Ericsson Middle East & Africa, says one of the top use cases of NFV is automation of operations. This improves reliability and keeps total cost of ownership low. Within automated NFV operations, the focus is on auto healing and auto scaling and the benefits they provide.

“Auto healing can be used for recovering an application, for example when there is a hardware fault. Autoscaling is a process of defining thresholds in the system, so the assigned capacity for an application can instantly be increased automatically when the threshold has been reached, typically at busy hour or when there is an unexpected increase in traffic. Another benefit of automated NFV operations is that the time to resolution is much shorter compared to manual mode. Further, NFV enables operators to be agile, executing required customizations on demand and dramatically reducing time to market. Finally, virtualisation will eventually enable network slicing as a fundamental block for new use cases,” he adds.
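The threshold mechanism La Riccia describes can be sketched in a few lines. This is our own illustrative logic, not Ericsson's implementation; the threshold values, step size and bounds are arbitrary:

```python
def autoscale(current: int, utilisation: float,
              up: float = 0.8, down: float = 0.3,
              step: int = 1, lo: int = 1, hi: int = 10) -> int:
    """One tick of a threshold-based autoscaler: grow capacity when
    utilisation crosses the upper threshold (busy hour, unexpected traffic
    spike), shrink it when load falls back, and stay within lo..hi bounds."""
    if utilisation >= up:
        return min(current + step, hi)
    if utilisation <= down:
        return max(current - step, lo)
    return current

# Busy hour pushes utilisation past the 80% threshold, so capacity grows:
print(autoscale(4, 0.9))  # 5
```

Auto healing follows the same control-loop pattern: instead of comparing utilisation to a threshold, the loop compares the count of healthy instances to the target and replaces any that fail a health check.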
Can NFV be integrated with existing telecom network architectures? Mansouri from Ciena says virtual network services are highly complementary to traditional hardware-based networks, and must be in order to achieve commercial deployment success. “This is already taking place in many parts of the network, such as the ongoing virtualisation of the 4G Evolved Packet Core (EPC), where the mobile functions that were traditionally implemented in hardware-based appliances are virtualised and integrated into existing networks. The 5G Core network will be heavily virtualised from the start, to realize the promised benefits associated with software-based networks,” he says.
La Riccia from Ericsson adds that it makes sense to combine virtualised and legacy nodes, at least in initial deployments. In practice, this is how migration is typically done: operators start with virtualisation of a few nodes and progress until reaching the targeted virtualised network. “NFV is important for building future networks because current networks must cope with widely varying demands and a business landscape that will be significantly different from today. Networks therefore need to be programmable and highly automated to be able to respond quickly to various demands, provide network efficiency and QoE, and deliver shorter time to market for innovative services (new revenue streams),” he adds.

SDN vs. NFV
For users, it is very important to understand the difference between these two networking trends. While the central tenet of SDN is network automation, NFV is all about decoupling network functions from hardware, and industry experts say the two technologies are complementary rather than competitive.

“Both NFV and SDN are seamless ways of configuring, deploying, virtualizing, and maintaining networks, and they provide a compelling business case for service providers to fundamentally change the way they implement customer-facing network services, functions, and capabilities. While SDN decouples the data and control planes, NFV virtualizes network services and decouples software from closed, proprietary hardware. This allows operators to quickly configure and deploy new network services using a policy-based management paradigm running on readily available commodity hardware. Both technologies support highly agile networking, elastic provisioning, reduced time-to-revenue for new services, and reliable, frictionless customer turn-ups,” says Mansouri.

In agreement, La Riccia says NFV is complementary to SDN; the core similarity is that both use network virtualization to enable network design and infrastructure to be abstracted in software and then implemented across hardware platforms and devices.

The adoption of NFV in the Middle East has been gradual, and many expect the technology to gain steam on the heels of the 5G rollout in the region. “In the Middle East, as countries such as the UAE continue to invest in smart cities and ensure they become smarter and more connected, network operators need to make sure their networks are flexible and capable of slicing to meet the growing and increasingly varied needs of the smart city and its population. NFV is crucial to this, which means that 5G networks will have to be adaptable, dynamic, and programmable from end-to-end using virtualized constructs,” says Mansouri.
INTERVIEW
CHARTING A COURSE FOR SUCCESS ANAS E. JWAIED, MANAGING DIRECTOR, MICRO FOCUS, MIDDLE EAST AND AFRICA REGION, REFLECTS ON HIS FIRM’S BULLISH GROWTH STRATEGY IN THE REGION.
As a relatively new brand in the region, how well have you done in these last two years?
What we have achieved over the last two years has far exceeded our expectations. As a new brand in the region, we had to set up our legal entity and establish local offices, and all of that logistical groundwork went smoothly. In terms of executing our go-to-market strategy and engaging with partners, the biggest challenge was to educate the market about the brand itself. As you know, Micro Focus has been around for more than 40 years, but in the Middle East and Africa, not many people knew about us. We have been successful in creating awareness and positioning our brand by holding many customer-focused events and explaining the rationale behind the merger between Micro Focus and HPE’s software business. We have communicated what it means for the installed base while, at the same time, giving confidence to new customers, because brand name plays a big role in this part of the world. We are really pleased with our performance so far, and though the market may not be in the best shape these days, we have grown in double digits in all the countries in the region we operate in, and our outlook for 2020 is very positive.
Which are the fastest-growing markets in the region for Micro Focus?
When you look at the Middle East, there are some standout countries such as the UAE, Saudi Arabia, Oman, and Qatar. When it comes to Africa, we are experiencing high growth in countries such as Egypt,
Nigeria, Kenya, Morocco, and Tunisia, to name a few. We are also engaged with some key accounts in Jordan, Lebanon, and Iraq. We aren’t very different from other vendors when it comes to growth markets within the region.

What are you doing around partner enablement?
Our route to market is indirect, and though our salesforce on the ground engages directly with customers, they work through partners. So, the plan is to continue this model, and this is where value-added resellers and distributors come into play. Having said that, there is always room for improvement, and we continue to strive to better enable our partners by focusing more and more on certifications. We are revisiting the certification process to see how our partners can add more value to our customers.

You have four key pillars – DevOps, hybrid IT, security, risk & governance, and predictive analytics. Which one is growing rapidly?
We are growing across the whole portfolio, but if I have to pick one, it would be hybrid IT. This is where we are bringing innovations by cloudifying and containerising products. We are also seeing big demand within our installed base for upgrades and new functionalities, as well as for some of the new products we are bringing to the market, such as RPA.
Why RPA? Isn’t that already a crowded market?
I think Robotic Process Automation is one of the hot topics among our customers today. It might be a market that is getting commoditized, with many niche players, but we are not new to this game. We have been doing RPA in bits and pieces in different products, and now we are bringing it all together. The idea is not to compete with pure-play RPA vendors but to offer it as something complementary to our portfolio. If any of our customers want to use a different RPA solution with the rest of our portfolio, they will be able to do that. We are focusing our RPA efforts within the IT service management area, and automation is not new to us.
FEATURE
THE NEXT BIG THING WHAT IS DIFFERENT ABOUT NEWLY RATIFIED WI-FI 6 STANDARD AND WHY IT MATTERS TO BUSINESSES.
Wi-Fi has become an essential part of network access in almost all organisations today, and the next Wi-Fi standard, 802.11ax, better known as Wi-Fi 6, was ratified in September this year. This next-gen Wi-Fi is 40 percent faster than the previous generation and comes loaded with features that will enable many new use cases for enterprises.

“The four times improvement in performance, along with the ability to run parallel streams of data, is going to allow us to do so much more at the edge from an IoT perspective. This is a significant Wi-Fi game-changer in terms of the amount of data we can push over
Wi-Fi 6. The new technology will drive the adoption of breakthrough new IoT video analytics applications with facial recognition and intelligent video. Right now, we have the compute and storage at the edge, but we lacked the bandwidth required for these applications. This is a complete game-changer for video-based internet of things applications for airports, large public venues, schools, universities, and hotels,” says Jacob Chacko, regional business head – Middle East, Saudi & South Africa (MESA) at HPE Aruba.

Prem Rodrigues, director of sales and marketing for the Middle East, India and SAARC at Siemon, says Wi-Fi 6 will also support a far greater number of mobile devices in dense deployment environments, such as large public spaces like arenas and airports. This is because Wi-Fi 6 devices benefit from operation in both the 2.4 GHz and 5 GHz bands – unlike Wi-Fi 5 devices that transmit exclusively in the 5 GHz spectrum.

The benefits also extend to Wi-Fi 6 wireless access points (WAPs). Wi-Fi 6 WAPs connect to the Ethernet network with two ports (rather than the traditional one port) via high-bandwidth copper cabling, which delivers higher levels of remote power to these WAPs. To fully benefit from the advancements of new Wi-Fi 6 technology, designers,
consultants, and installers must note that Wi-Fi 6 requires the support of a solid wired cabling foundation, he adds.

Will Wi-Fi 6 face competition from 5G, which is on the horizon? Amanulla Khan, Managing Director, Linksys Middle East, Turkey, and Africa, says both are complementary technologies that support faster and broader connectivity, but with different use cases in homes, enterprises, and smart cities. “Wi-Fi 6 is ideally positioned to provide short-range internet connectivity indoors and is easy and cost-effective to maintain and scale up. Increasingly, governments are deploying Wi-Fi across entire cities. Connecting to Wi-Fi takes only minimal battery power, allowing it to connect with wearables and sensors.

“Meanwhile, 5G is a cellular service that is ideal for connecting industries and devices in the outdoors, from oil and gas rigs to the next wave of connected and autonomous vehicles. 5G rollout also depends on the deployment strategies of the various service providers, along with the ongoing rollout of 5G-enabled mobile devices,” he says.

Beyond simple communications, Wi-Fi is expected to fuel new use cases. “We see dramatic growth in Wi-Fi, a lot of meaningful IoT use cases and pent-up demand for greater bandwidth, greater security, and greater energy conservation. Those are all things our
customers are looking for, so we have a warm audience for these products. It’s perfect timing for what is a superset of products,” says Chacko from Aruba.

Rodrigues shares a similar opinion: “Wi-Fi can become the most popular choice for wireless IoT connectivity because organizations can leverage the infrastructure already in place to provision such applications as environmental control, energy management, and physical security, including automated video surveillance. High-density public environments such as airports, trains, auditoriums, shopping centres and stadiums will particularly benefit from Wi-Fi 6.”

Though Wi-Fi 6 promises to revolutionise connectivity and bring new applications in its wake, it has its share of drawbacks as well. “The main disadvantage of Wi-Fi 6 is that the OFDM subcarrier spacing is narrower at 78.125 kHz, making the use of low phase noise oscillators and highly linear RF front ends necessary. Moreover, since 1024-QAM is being used to boost speeds and achieve higher data rates, the EVM specification is going to be tight, requiring tight frequency synchronization and clock offset correction to improve and sustain performance. Also, Wi-Fi 6 stations maintain their frame timing based on their own clocks, while their transmissions must be accurate as per their trigger frames and scheduling,” says Lucas Jiang, GM at TP-Link MEA.

Rodrigues from Siemon adds: “Some might assume that the potential risks associated with higher levels of remote power are a drawback of Wi-Fi 6. When Wi-Fi 6 devices receive higher levels of remote power (e.g., IEEE 802.3 Type 3 (60W), Type 4 (90W) and PoH (100W)), and these devices are connected/disconnected under load, electrical arcing might occur that can inhibit performance and cause power and efficiency losses. It is, therefore, recommended to install connecting hardware that is compliant with IEC 60512-99-001.”
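The modulation figures Jiang cites can be turned into a back-of-the-envelope peak-rate calculation. The 160 MHz channel, 1960 data subcarriers and eight spatial streams below are standard 802.11ax maximums we are assuming for illustration, not numbers from the article:

```python
# 1024-QAM carries log2(1024) = 10 bits per subcarrier per symbol, and the
# 78.125 kHz subcarrier spacing gives a 1/78125 s = 12.8 us OFDM symbol, to
# which 802.11ax appends a guard interval (0.8 us at its shortest).
bits_per_subcarrier = 10
coding_rate = 5 / 6            # highest 802.11ax MCS coding rate
data_subcarriers = 1960        # 160 MHz channel width
symbol_s = 12.8e-6 + 0.8e-6    # symbol duration plus short guard interval
streams = 8                    # maximum spatial streams

peak_bps = bits_per_subcarrier * coding_rate * data_subcarriers / symbol_s * streams
print(f"{peak_bps / 1e9:.1f} Gbps")  # ~9.6 Gbps, the advertised Wi-Fi 6 peak
```

The same arithmetic also shows why the EVM budget tightens: squeezing 10 bits into each subcarrier means the constellation points sit much closer together than with Wi-Fi 5's 256-QAM, leaving far less room for oscillator phase noise.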
EVENT
SIX TRENDS TO SHAPE THE NEW YEAR
DAVE RUSSELL, VICE PRESIDENT OF ENTERPRISE STRATEGY, VEEAM, ELABORATES ON KEY TECHNOLOGY TRENDS THAT BUSINESSES CAN PREPARE FOR IN THE COMING YEAR.
Throughout 2019, technology has continued to have a transformative impact on businesses and communities. From the first deployments of 5G to businesses getting to grips with how they use artificial intelligence (AI), it’s been another year of rapid progress. From an IT perspective, we have seen two major trends that will continue in
2020. The first is that on-premises and public cloud will increasingly become equal citizens. Cloud is becoming the new normal model of deployment, with 85% of businesses self-identifying as being predominantly hybrid-cloud or multi-cloud today. Related to this are the issues of cybersecurity and data privacy, which remain the top cloud concerns of IT decision makers. In 2020, cyber threats will increase rather than
diminish, so businesses must ensure that 100% of their business-critical data can be recovered. Here are some of the key technology trends that businesses will look to take advantage of and prepare for in the year ahead.
1. Container adoption will become more mainstream.
In 2020, container adoption will lead to faster software production through more robust DevOps capabilities, and Kubernetes will consolidate its status as the de facto container orchestration platform. The popularity of container adoption, or ‘containerisation’, is driven by two things: speed and ease. Containers are lightweight, isolated runtime environments that decouple an application from the underlying operating system. With containers, microservices are packaged with their dependencies and configurations, which makes it faster and easier to develop, ship and deploy services. The trend towards multi-cloud means businesses need data to be portable across various clouds, especially the major providers: AWS, Microsoft Azure and Google Cloud. 451 Research projects the market size of application container technologies to reach $4.3 billion by 2022, and in 2020 more businesses will view containers as a fundamental part of their IT strategy.
2. Cloud Data Management will increase data mobility and portability.
Businesses will look to Cloud Data Management to guarantee the
availability of data across all storage environments in 2020. Data needs to be fluid in the hybrid and multi cloud landscape, and Cloud Data Management’s capacity to increase data mobility and portability is the reason it has become an industry in and of itself. The 2019 Veeam Cloud Data Management report revealed that organisations pledged to spend an average of $41 million on deploying Cloud Data Management technologies this year. To meet changing customer expectations, businesses are constantly looking for new methods of making data more portable within their organisation. The vision of ‘your data, when you need it, where you need it’ can only be achieved through a robust CDM strategy, so its importance will only grow over the course of next year.
3. Backup success and speed gives way to restore success and speed.
Data availability Service Level Agreements (SLAs) and expectations will rise in the next 12 months, whereas the threshold for downtime, or any discontinuity of service, will continue to decrease. Consequently, the emphasis of the backup and recovery process has shifted towards the recovery stage. Backup used to be challenging, labor- and cost-intensive. Faster networks, backup target devices, and improved data capture and automation capabilities have accelerated backup. According to our 2019 Cloud Data Management report, almost one-third (29%) of businesses now continuously back up and replicate high-priority applications. The main concern for businesses now is that 100% of their data is recoverable and that a full recovery is possible within minutes. As well as providing peace of mind when it comes to maintaining data availability, a full complement of backed-up data can be used for research, development and testing purposes. This leveraged data helps the business make the most informed decisions on digital transformation and business acceleration strategies.
4. Everything is becoming software-defined.
Businesses will continue to pick and choose the storage technologies and hardware that work best for their organisation, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past, and Infrastructure as Code (IaC) will continue its proliferation into the mainstream. By allowing businesses to create a blueprint of what infrastructure should do, then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites. Software-defined approaches such as IaC and cloud-native, a strategy which natively utilises services and infrastructure from cloud computing providers, are not all about cost though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability, enabling organisations to deploy applications with speed and ease. With over three-quarters (77%) of organisations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.
5. Organisations will replace, not refresh, when it comes to backup solutions.
In 2020, the trend towards replacement of backup technologies over augmentation will gather pace. Businesses will prioritise simplicity, flexibility and reliability in their business continuity solutions as the need to accelerate technology deployments becomes even more critical. In 2019, organisations said they had experienced an average of five unplanned outages in the last 12 months. Concerns over the ability of legacy vendors to guarantee data availability are driving businesses towards total replacement of backup and recovery solutions rather than augmentation with additional backup solutions used in conjunction with the legacy tool(s). The drivers away from patching and updating solutions to replacing them completely include
maintenance costs, lack of virtualisation and cloud capabilities, and shortcomings related to speed of data access and ease of management. Starting afresh gives businesses peace of mind that they have the right solution to meet user demands at all times.
6. All applications will become mission-critical.
The number of applications that businesses classify as mission-critical will rise during 2020, paving the way to a landscape in which every app is considered high-priority. Previously, organisations have been prepared to distinguish between mission-critical and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, making this distinction becomes very difficult. The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision makers say their business can tolerate a maximum of two hours’ downtime for mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organisations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.
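The arithmetic behind that conclusion is simple. Taking the per-hour figure from the report together with the two-hour tolerance cited above:

```python
# Figures from the 2019 Veeam Cloud Data Management report, as cited above.
cost_per_hour = 102_450   # average hourly cost of lost mission-critical data, USD
tolerated_hours = 2       # maximum downtime IT decision makers say they tolerate

exposure = cost_per_hour * tolerated_hours
print(f"${exposure:,}")   # $204,900 at risk even inside the 'tolerated' window
```

Over $200,000 of exposure inside the window a business claims it can tolerate is the strongest argument that the mission-critical/non-critical distinction no longer holds.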
EVENT
MAKING A MARK AT VMWORLD 2019 EUROPE HELD IN BARCELONA LAST MONTH, THE VIRTUALIZATION MAJOR ANNOUNCED ITS PLANS TO DOUBLE DOWN ON KUBERNETES TO MANAGE THE MULTI-CLOUD WORLD WITH INTRINSIC SECURITY.
Held under the theme of ‘tech in the age of any’, the event was kicked off by CEO Pat Gelsinger, who said the tremendous variety and choice we have in technology today had created challenges around complexity and management at the same time. “It’s been 40 years since I started my career, and it is easy to forget that not long ago, this was a small community of geeks. Our digital life was separate from our daily life. But in the last decade, these have become one, and digital innovation has permeated every aspect
of our daily lives. We are just beginning to understand the implications of that.” Referring to the explosion of the developer community, Gelsinger said there had been a 3x increase in the number of app developers over the last decade to reach 13.5 million and a 6x increase in the number of apps, which is expected to more than double in the next five years. “We are seeing this redefinition of what is possible. If you think about cloud, we have this ability to assemble the largest number of computing resources that have ever been brought together. We are breaking through
digital and physical boundaries with edge and IoT, and bringing intelligence to everything with AI. We are now able to connect every human on this planet with an immersive and personalized experience of 5G. These tech superpowers are accelerating the pace of innovation and technology,” he said. He went on to add that technology has a greater impact than ever before and shared some examples of work his company is doing to bring inclusive growth. For instance, VMware is collaborating with the Egyptian Ministry of Communications
and IT to bridge the skills gap and develop young talent to meet its Vision 2030 sustainable development strategy.

The CEO said VMware is each year refining and improving its vision of any cloud and any app, available on any device. “Cloud was supposed to bring simplicity, but in many cases, it has brought diversity. Are we creating chaos or opportunities? Our objective is to create technologists who can master this multi-cloud world. This will become the essential skillset in the next decade.”

Referring to its partnerships with AWS, Microsoft, IBM, Google, and, more recently, Oracle, Gelsinger said VMware is helping its customers build, run, manage and protect any cloud environment. “Operators are looking for consistent operations environments, and developers want access to modern services and apps. Kubernetes is emerging as the magic layer to bring these two worlds together. Since Java and VMs, we haven’t seen technology as critical as Kubernetes, which is much more than just container orchestration,” he added.

In line with its vision to democratize Kubernetes in the industry, at the show VMware laid out its strategy and announced the Tanzu portfolio of products and services to enable and transform the way enterprises build modern apps. “With all these new technologies we are bringing together, we now have five million developers across Bitnami, Pivotal, and Spring. Bitnami brings a catalogue of open source components, Pivotal offers best-in-class cloud-native apps and Spring is part of the framework to accelerate developer productivity,” said Joe Beda, principal engineer at VMware. In this context, VMware announced Project Galleon, which builds on the Bitnami catalogue of packaging and delivery technologies to enable enterprises to build apps faster and more reliably.

Gelsinger also shared more details on Project Pacific, which was announced earlier this year at the VMworld event held in the US. “We are
fusing our proven vSphere platform and Kubernetes together, establishing it as a platform for modern apps and delivering Kubernetes at scale. This establishes a bridge between developers and IT.”

Beda added that the goal of Project Pacific is to establish vSphere as an API-driven platform to manage both VMs and Kubernetes with one single toolchain. “This deep integration is not just at the control layer; we are also taking it to the virtualisation layer by enhancing ESXi to run Kubernetes workloads natively. Kubernetes pods can run 30 percent faster than traditional Linux VMs,” he said.

On the heels of its $2.1 billion acquisition of Carbon Black, which is expected to close by the end of this year, VMware also announced new security solutions that advance the company’s vision of intrinsic security. “We are going to layer Carbon Black into all of our major platforms, including vSphere and NSX, providing
for the first time an agent-less solution for workload security,” said Sanjay Poonen, COO of VMware. Other major announcements included Project Maestro, a telco cloud orchestrator built to help communications service providers (CSPs) accelerate the time-to-market of modern network functions and services across clouds, from the core to edge, and from private to public clouds. With this cloud-first orchestration solution, CSPs will be able to build and automate network services that span across a wider variety of network function formats, enabling interoperability and optimising their operations across every layer of the NFV and Telco Cloud architecture. VMware claims the telco cloud orchestrator will deliver operational efficiency at scale to ultimately help CSPs accelerate the time-to-market of new services, mitigate the cost of managing ever more complex networks, and improve customer experiences.
DECEMBER 2019
CXO INSIGHT ME
25
VIEWPOINT
PLUGGING THE
SECURITY GAPS
IF YOU’RE ONLY FOCUSED ON PATCHING, YOU’RE NOT DOING VULNERABILITY MANAGEMENT, SAYS ANTHONY PERRIDGE, VP OF INTERNATIONAL AT THREATQUOTIENT.
When I speak to security professionals about vulnerability management, I find that there is still a lot of confusion in the market. Most people immediately think I’m referring to getting rid of the vulnerabilities in the hardware and software within their network, but vulnerability management encompasses a much broader scope.
Vulnerability management is not just vulnerability scanning, the technical task of scanning the network to get a full inventory of all software and hardware and precise versions and current vulnerabilities associated with each. Nor is it vulnerability assessment, a project with a defined start and end that includes vulnerability scanning and a report on vulnerabilities identified and recommendations for remediation.
Vulnerability management is a holistic approach to vulnerabilities – an ongoing process to better manage your organization’s vulnerabilities for the long run. This practice includes vulnerability assessment which, by definition, includes vulnerability scanning, but also other steps as described in the SANS white paper, Implementing a Vulnerability Management Process.
Just as the process of vulnerability management is broader than you might think, the definition of a vulnerability is as well. A vulnerability is the state of being exposed to the possibility of an attack. The technical vulnerabilities in your network are one component, but there is another important aspect that is often overlooked – the vulnerabilities specific to your company, industry and geography. You can’t only look internally at the state of your assets. You must also look externally at threat actors and the campaigns they are currently launching to get a more complete picture of your vulnerabilities and strengthen your security posture more effectively. In The Art of War, Sun Tzu captured the value of this strategy well when he stated, “If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”
Prioritize patching based on the threat
As stated above, with respect to vulnerability management, most security organizations tend to focus on patching, but because they don’t have the resources to patch everything quickly, they need to figure out what to patch first. To do this, security teams typically take a rule-of-thumb approach – they start with critical assets, the servers where their crown jewels are located, and work down to less critical assets. While a good starting point, their prioritization decisions are based only on internal information. As Sun Tzu points out, knowing yourself but not the enemy will yield some victories but also defeats. Having a platform that serves as a central repository allows you to aggregate internal threat and event data with external threat feeds and normalize that data so that it is in a usable format. By augmenting and enriching information from inside your environment with external threat intelligence about indicators, adversaries and their methods, you can map current attacks targeting your company, industry and geography to vulnerabilities in your assets. Intelligence about a campaign that presents an immediate and actual threat to your organization leads to a more accurate assessment of priorities and may cause you to change your current patch plan to prioritize those systems that could be attacked at that moment. The result is intelligence-driven patch management that hardens your processes to thwart the attack.
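The aggregate-and-score idea described above can be sketched in a few lines. This is purely illustrative – the CVE IDs, asset names and weights below are invented for the example, not drawn from any feed or product:

```python
# Purely illustrative: combine internal asset criticality with external
# threat intelligence to decide patch order. All data here is invented.

from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve: str
    asset: str
    asset_criticality: int  # 1 (low) .. 5 (crown jewels)

# Hypothetical external feed: CVEs used in campaigns currently targeting
# our company, industry or geography.
ACTIVE_CAMPAIGN_CVES = {"CVE-2019-0001"}

def priority(vuln: Vulnerability) -> int:
    """Internal criticality, boosted when intel shows active exploitation."""
    score = vuln.asset_criticality
    if vuln.cve in ACTIVE_CAMPAIGN_CVES:
        score += 10  # an active campaign outranks purely internal rankings
    return score

backlog = [
    Vulnerability("CVE-2019-0001", "hr-portal", 2),
    Vulnerability("CVE-2018-9999", "core-banking-db", 5),
]
patch_order = sorted(backlog, key=priority, reverse=True)
print([v.asset for v in patch_order])  # → ['hr-portal', 'core-banking-db']
```

Note how the actively exploited vulnerability on a low-criticality asset jumps ahead of the crown jewels – exactly the reprioritisation the article describes. In a real deployment the boost would come from normalised feed data in the central repository, not a hard-coded set.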
A HOLISTIC APPROACH TO VULNERABILITY MANAGEMENT, THAT INCLUDES KNOWING YOURSELF AND YOUR ENEMY, ALLOWS YOU TO GO BEYOND PATCHING. IT PROVIDES AWARENESS AND INTELLIGENCE TO EFFECTIVELY AND EFFICIENTLY MITIGATE YOUR ORGANIZATION’S RISK AND POSITION YOUR TEAM TO ADDRESS OTHER HIGH-VALUE ACTIVITIES.
Bridge the visibility gap
Unfortunately, the reality is that not every company has 100% visibility into their assets and vulnerabilities, so mapping external threat data to internal indicators to hone a patch plan sometimes has limited value. However, there is still tremendous value in gathering information from global threat feeds and other external intelligence sources to determine if your business is under a specific attack. The MITRE ATT&CK framework is one such source. It dives deep into adversaries and their methodologies so security analysts can use that information to their advantage. Bringing MITRE ATT&CK data into your repository allows you to start from a higher vantage point with information on adversaries and associated tactics, techniques and procedures. You can take a proactive approach, beginning with your organization’s risk profile, mapping those risks to specific adversaries and their tactics, drilling down to techniques those adversaries are using, and then investigating whether these techniques could be successful or whether related data have been identified in the environment. For example, you may be concerned with APT28 and can quickly answer questions including: What techniques do they apply? Have I seen potential indicators of compromise or possible related system events in my organization? Are my endpoint technologies detecting those techniques? With answers to questions like these you can discover real threats, determine specific actions to harden your network and processes, and mitigate risk to your business. A holistic approach to vulnerability management, one that includes knowing yourself and your enemy, allows you to go beyond patching. It provides awareness and intelligence to effectively and efficiently mitigate your organization’s risk and position your team to address other high-value activities – like detecting, containing and remediating actual attacks, and even anticipating potential threats.
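The drill-down from adversary to techniques to local telemetry can be illustrated with a toy lookup. T1566 (Phishing) and T1071 (Application Layer Protocol) are genuine ATT&CK technique IDs, but the adversary mapping and the “observed” set below are hypothetical stand-ins for a real repository:

```python
# Illustrative only: map an adversary of concern to ATT&CK techniques and
# check them against locally observed telemetry. The technique IDs are real
# ATT&CK identifiers; the mapping and observed set are invented examples.

from typing import Dict, List

ADVERSARY_TECHNIQUES: Dict[str, Dict[str, str]] = {
    "APT28": {
        "T1566": "Phishing",
        "T1071": "Application Layer Protocol",
    },
}

# Technique IDs our own sensors have flagged (hypothetical telemetry).
observed_techniques = {"T1071"}

def exposure_report(group: str) -> List[str]:
    """List each technique the group uses and whether we have seen it locally."""
    report = []
    for tid, name in ADVERSARY_TECHNIQUES.get(group, {}).items():
        status = "OBSERVED locally" if tid in observed_techniques else "not seen"
        report.append(f"{group} uses {tid} ({name}): {status}")
    return report

for line in exposure_report("APT28"):
    print(line)
```

A real implementation would pull the mapping from MITRE’s published ATT&CK data rather than a hand-written dictionary, but the workflow – risk profile, adversary, techniques, local evidence – is the same one the article outlines.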
EVENT
WHEN EXPERIENCE
MATTERS
AT ITS ANNUAL CUSTOMER AND PARTNER EVENT, AVAYA ANNOUNCED PLANS TO GO ALL IN ON CLOUD AND AI TO TRANSFORM THE DIGITAL CUSTOMER EXPERIENCE.
The Avaya Engage event hosted more than 1,000 customers, partners, and alliance members from 35 countries, where the communications and collaboration major presented many use cases in the form of customer success stories. “Our customers are facing the challenge of identifying technology trends that will meet their business
needs. For instance, there is a lot of hype around AI, and they want to find out which particular solution under that umbrella will benefit them the most. Our strategy for 2020 is to accelerate investments in AI and cloud, and to focus on the banking sector, a vertical where AI is getting a lot of attention to improve customer experience,” said Fadi Hani, VP-MEA and Turkey, Avaya International.
Fadi Hani
Laying out its product roadmap for the year ahead, Savio Tovar Dias, senior director of sales engineering at Avaya, said: “We have one of the largest installed customer bases in the world, and we are collaborating with our customers and partners to define the next wave of innovation in real-time communications. As Avaya, we are innovating on our existing core platform to launch new services and channels. We are making the integration between collaboration and contact centres seamless.” He said Avaya has been using AI for many years. “We understand customer service very well, and we have been embedding AI into all elements of customer service strategy. Whether it is behavioural routing or chatbots, we are driving automation across all digital channels to help our customers. On the back end, we are focusing on asynchronous chat, and investing in conversational AI platforms and partnering with companies such as Google to drive our chatbot strategy.” Dias said Avaya is taking a different approach when it comes to the cloud. “We are giving our customers choices. We have many large customers who want to move rapidly and transform, and we are giving them a choice of either going into the private cloud or augmenting services through APIs and digital channels from the cloud, such as WhatsApp as a call centre. We are going to invest in creating a new cloud architecture from the ground up to deliver a full suite of contact centre capabilities.”
Savio Tovar Dias
At the event, Avaya launched its new Avaya IX Subscription programme in the international market, making it easier than ever for EMEA and APAC customers to purchase and consume Avaya’s communications and collaboration solutions to drive their business growth. The new programme, offering monthly or annual subscription payments, enables customers to avoid the complexity and cost of software licensing and contract renewals and instead focus on growing their businesses. Additional benefits to customers include lowered business risk, increased operational agility, streamlined budgeting and purchasing processes, and maximum flexibility when adding new services and users. Avaya IX Subscription includes access to the latest software releases, the freedom to flex up to 20 percent over the number of subscribed users at no additional charge, and support from the company’s services organisation. Yaser Al Zubaidi, senior director, engagement solutions, Avaya International, said that the company would continue to offer its customers the broadest possible choice of deployment options.
WE UNDERSTAND CUSTOMER SERVICE VERY WELL, AND WE HAVE BEEN EMBEDDING AI INTO ALL ELEMENTS OF CUSTOMER SERVICE STRATEGY. WHETHER IT IS BEHAVIOURAL ROUTING OR CHATBOTS, WE ARE DRIVING AUTOMATION ACROSS ALL DIGITAL CHANNELS TO HELP OUR CUSTOMERS. ON THE BACK END, WE ARE FOCUSING ON ASYNCHRONOUS CHAT, AND INVESTING IN CONVERSATIONAL AI PLATFORMS AND PARTNERING WITH COMPANIES SUCH AS GOOGLE TO DRIVE OUR CHATBOT STRATEGY.
He said, “As part of that, we’re now able to extend a subscription model to on-premise communications infrastructures. We expect our customers to shift from an on-premise deployment paradigm to a private or public cloud architecture over the next two to three years, and Avaya IX Subscription provides a convenient stepping stone on that journey towards the cloud.” According to Al Zubaidi, Avaya IX Subscription also highlights the flexibility inherent in the firm’s solutions, which can be consumed across a range of deployment models and are fully capable of fitting into any given organisation’s hybrid cloud strategy. As part of the Avaya IX Subscription programme, the company is providing trade-in and upgrade offers for existing customers to protect and extend the current investments in their Avaya communications infrastructure. Customers can trade in their existing perpetual licenses for credits to be applied towards their subscription payments. For customers not running on the latest Avaya software releases, Avaya is also offering an “Experience Avaya” programme to upgrade to Avaya OneCloud or Avaya IX on-premise software. Additionally, Avaya IX Spaces, the company’s new cloud-based platform for team collaboration and meetings, is included as part of all Avaya IX Subscriptions. Launched across the Middle East, Avaya IX Spaces seamlessly integrates voice, video, tasks, sharing and more into one solution that can be easily accessed anywhere, on any device – mobile devices, desktops, telephones and room systems. Avaya IX Spaces is designed for teams that need a simple and effective way to communicate and manage tasks, creating a more efficient environment for businesses and organisations. As an extension, the event also hosted the Dubai Police-supported international call centre awards for the public and private sectors, where winners included Abu Dhabi Digital Authority and Etisalat.
INTERVIEW
ORCHESTRATING INNOVATION WE SPOKE TO GREG LAVENDER, THE NEWLY APPOINTED CTO OF VMWARE, ABOUT THE VIRTUALISATION PLAYER’S STRATEGY AROUND KUBERNETES AND MULTI-CLOUDS.
Is VMware now looking beyond VMs to transition to a future dominated by Kubernetes and containers?
We have been talking about it for three years, but we had to wait for Kubernetes and other functions like Knative services to mature before building all that into our products, as we have announced with Project Pacific and Project Tanzu. This is a multiyear investment we are making to bring it back as enterprise software to our 500,000 enterprise customers worldwide, who obviously trust us to deliver reliable, secure and high quality
software. And as you know, with open source, you get what you get, and you have to do all the quality work yourself to deliver it as enterprise software. So even if half of our global customer base upgrades to vSphere 7 when we release it next year, they will have Kubernetes, and we will be the biggest Kubernetes platform in the world.
Do you think containers and VMs will co-exist?
Even if our customers are using OpenShift, they are running it on VMware because we do so many clever things on the hardware to get maximum utilization. I believe the hypervisor is the best foundation for running Kubernetes and containers. Still, as we announced in Project Pacific, we are building Kubernetes into the hypervisor layer, which means you can run containers natively without having to use hypervisors.
Does this mean if you know vSphere, you don’t need specialized training to use Kubernetes?
Basically, Kubernetes becomes a feature of vSphere, and the way you manage vSphere environments is through vCenter, and 500,000 users know how to use it. There are all kinds of online and free training for that. Kubernetes ends up as another control panel inside of that
control plane, and then you can provision Kubernetes clusters just like you provision VMs into the vSphere environment. But, more importantly, we also give DevOps teams what we call guest clusters so they can have a native Kubernetes experience around pods and namespaces, and set high availability and resiliency patterns that applications would run in. So, this means they don’t need separate IT stacks for cloud-native apps and virtualized environments. If you build standard containers with Docker images, in principle, you can deploy them in VMware Cloud on AWS, Azure and Google Cloud and even run them natively if Kubernetes environments in these cloud platforms don’t become proprietary. As long as you stay consistent with standard, open-source APIs, you can run your containers anywhere, and all you have to worry about is where your data is.
Is that where Tanzu Mission Control kicks in?
Yes, Mission Control then becomes a control plane that allows you to manage your workloads no matter where you run them – edge, on-prem, hybrid, or public cloud – with a single pane of glass.
Is it tough to deploy on-premise Kubernetes clusters?
You have to really understand how Kubernetes works. When I was in Citi Group, we deployed OpenShift on VMware, but it took us several months to integrate it and build security features in. Our job is to provide all that integrated as part of the runtime, so there is a huge saving in opex for our customers, who don’t have to look for specialists in Kubernetes. We stay close to upstream Kubernetes, and you have to decide whether you want to upgrade slowly or quickly.
You are hedging your bets on Kubernetes. Why not other alternatives such as Swarm and Mesos?
I think Kubernetes is essentially what the industry wants, and the momentum is there from IBM, Red Hat, Google, and all the cloud providers. That is not to say Docker Swarm or Mesos is not good technology, and I know customers are using these. But if you look at the open-source community, it is where all the efforts are going, and a whole ecosystem is developing around Kubernetes. Then again, Kubernetes is not our destination; it is an enabler – the goal is to have distributed, reliable, and horizontally scaling applications built on a Kubernetes foundation. It already has. I have talked to hundreds of our customers in the last year, and only a couple of them use vSphere for special cases like grid computing with Swarm. For anything else, with the skill sets available around Kubernetes, you will be hiring fewer and fewer Swarm people.
How are you addressing the security concerns around Kubernetes?
It is not as much about Kubernetes as it is about containers, which is where security vulnerabilities tend to live. When I deployed OpenShift in my previous job, I licensed a technology called Twistlock, which does behavioural monitoring of what is running inside containers, making sure it is not doing something nefarious. They were recently acquired by Palo Alto Networks, and there is also another company called Aqua, which does something similar for customers who want additional security within their containers. For us, by embedding Kubernetes into vSphere, we are bringing the security of vSphere to the platform and all the other things we do around NSX and AppDefense. So all the things we already provide in terms of security around hypervisors, you get free as part of the package when you buy vSphere with embedded Kubernetes.
Customers have to re-platform their workloads to move to the cloud. How are you helping your customers move to the cloud without modifying their apps?
I think this is where customers have a kind of conundrum – it is expensive to re-modify and retest apps. So what I see happening are two things. One, you have to build cloud-native apps right from the start as Docker images and API-designed apps and deploy them into containers running across Kubernetes clusters, and ultimately a service mesh to manage the inter-communication of APIs. If they build those as cloud-native apps, you can deploy them anywhere. If they consume native AWS or Azure services, it gets sticky, and they may not get the benefit of multi-cloud capabilities. But, if you build it in a way where you can move it between clouds, like many customers are doing now, you can benefit from cloud mobility, and then the only thing that holds them back would be where your data is. I think workloads are going to move to where data is processed. Number two, some customers are moving their web front-end and mobile apps to the public cloud even if the back end remains on traditional environments. They want to have horizontal scaling, and they want to take advantage of the scale and elastic nature of the public cloud. So they can just move the web tier, which is mostly stateless with a distributed cache, to the public cloud, while the systems of record remain on-prem, and then you can have some low-latency network connectivity between these.
VIEWPOINT
HIDDEN TUNNELS MATT WALMSLEY, EMEA DIRECTOR AT VECTRA, DIGS INTO HOW CYBER-ATTACKERS ARE OPERATING CONCEALED INSIDE FINANCIAL SERVICES ORGANISATIONS
Findings from the Vectra 2018 Security Spotlight Report on the financial services sector identified the risks posed to financial services organisations by attackers using hidden tunnels to surreptitiously access and steal data. These attacker techniques played a significant role in the widely reported breach at Equifax.
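As the article explains, covert tunnels betray themselves through subtle timing abnormalities in otherwise-legitimate traffic. As a purely illustrative sketch (the threshold and traffic traces are invented, and real detection models are far richer), a naive beaconing heuristic might flag sessions whose request intervals are suspiciously regular:

```python
# Toy illustration of one timing heuristic: command-and-control traffic
# hidden inside a legitimate protocol often "beacons" at near-constant
# intervals, while human-driven browsing is bursty. Hypothetical data.

import statistics

def looks_like_beaconing(request_times: list, max_jitter: float = 0.5) -> bool:
    """Flag a session whose inter-request intervals are nearly constant."""
    if len(request_times) < 4:
        return False  # too few requests to judge regularity
    gaps = [b - a for a, b in zip(request_times, request_times[1:])]
    return statistics.stdev(gaps) < max_jitter

human = [0.0, 1.2, 7.9, 8.3, 30.1]      # bursty browsing: irregular gaps
tunnel = [0.0, 10.0, 20.1, 29.9, 40.0]  # metronomic check-ins every ~10s
print(looks_like_beaconing(human), looks_like_beaconing(tunnel))  # False True
```

A production system combines many such weak signals – request sizes, response patterns, protocol semantics – since a single timing rule would also flag benign pollers like stock-ticker feeds, the very legitimate tunnels the article describes.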
Why financial services are more susceptible to hidden tunnel attacks
Given that financial services organisations have the largest non-government cybersecurity budgets in the world, if money alone could buy security, these would be the safest places in the world. This points to one painful truth – the largest enterprise organisations in the world remain lucrative targets for sophisticated cyber-attackers, and with resources, skill and persistence, the attackers can still win. While financial services firms do not experience the same volume of breaches as other industries, the ones that do happen have caused exponential damage along with far-reaching consequences and public scrutiny. All defences are imperfect, so despite monumental efforts to fortify security infrastructure, cyberattacks and breaches still occur. If we think back to Equifax, it had the budget, manpower and a sophisticated security operations centre. Nonetheless, 145.5 million Social Security numbers, around 17.6 million driver’s licence numbers, 20.3 million phone numbers, and 1.8 million email addresses were stolen. Hidden tunnels were a key tactic employed in the Equifax attack. Ironically, this is because most financial services organisations have robust cybersecurity defences in place, which forces attackers to utilise legitimate services and communications protocols in an attempt to hide in plain sight within hidden tunnels.
What are hidden tunnels?
Firstly, let’s define what is actually meant by hidden tunnels. Tunnels are simply communications that share data within networks or between applications by using pre-existing protocols or services – for example, the HTTP or HTTPS protocols that websites use. They often serve as an easy mode of communication that bypasses security controls for greater efficiency. For instance, it’s common that web access (HTTP, HTTPS) is available but other services may be blocked, so many applications “tunnel” their communications through these web protocols to ensure they can communicate with the outside world. In the financial world, examples of legitimate use of this tunnelling technique could be stock ticker feeds, internal financial management services, third-party financial analytics tools and other cloud-based financial applications. But while they do have their advantages, hidden tunnels are also used by cyber-attackers as a means to hide within these legitimate communications protocols and services. They use these tunnels as an access point to a network from where they are able to exert control and steal critical data and personal information. As they are technically inside legitimate protocols, these tunnels allow attackers to undetectably orchestrate attacks with “Command and Control” (C2) signals, but they also allow them to sneakily “Exfiltrate” data out too. These C2 and Exfiltration behaviours are part of a wider set of steps – including Reconnaissance and Lateral Movement – that advanced targeted attacks invariably exhibit. These steps combine to form links in the “Kill Chain” of an attack’s lifecycle.
An AI light at the end of the tunnel
As sophisticated cyber-attackers automate and increase the efficiencies of their own technology, there is an urgent need to automate information security detection and response tools to stop threats faster. Because hidden tunnels carry traffic from legitimate applications, simple anomaly detection systems struggle to discern normal traffic from attacker communications that are concealed among them. At the same time, there remains a global shortage of highly skilled cybersecurity professionals to handle detection and response to cyberattacks at a reasonable speed. This is where the application of AI can have a marked impact by augmenting existing technical and human capabilities. To find these advanced hidden threats, sophisticated machine learning algorithms have now been developed to identify hidden tunnels within legitimate communications. Although the traffic is normal, there are subtle abnormalities, such as slight delays or unusual patterns in requests and responses, that indicate the presence of covert communications. As a result, the use of AI is becoming essential to empowering existing cybersecurity teams, so they can detect and respond to threats faster and stay well ahead of attackers, especially when attackers use hidden tunnels.
UNLOCKING VALUE VIKRAM BHAT, CHIEF PRODUCT OFFICER, CAPILLARY TECHNOLOGIES, ON DRIVING THE OMNICHANNEL STRATEGY WITH ARTIFICIAL INTELLIGENCE
The ecommerce momentum is becoming unstoppable as brands cash in on a number of factors – the spike in mobile applications, a new generation of high-spending shoppers, and the availability of faster internet speeds – to offer their customers a shopping experience whenever and wherever they choose. In addition, an omnichannel approach has clearly shaped the retail industry in 2019, a sector largely driving ecommerce sales in the region. Going omnichannel is tempting for many retailers who have not yet embarked on their digital transformation journey. However, implementing an omnichannel strategy isn’t only about being present on all channels and platforms available. It is about providing a seamless and unified brand experience to customers across channels to enable them to connect with a brand and simplify their shopping experience. A Google report further proves this is the right approach, after the study found that 85 percent of shoppers start their shopping journey on one device, like a laptop for example, and end it on another, say a smartphone, or even in a physical store. While technology is the key enabler for brands wanting to enhance their omnichannel strategy, Artificial Intelligence is another crucial component driving its success. But AI is only a tool and not a standalone solution, so organisations need to understand that while it can be immensely beneficial in providing customer insights, it cannot compensate for a modest or nonexistent omnichannel strategy.
In short, AI needs to be a supporting element of a wider omnichannel strategy, not something implemented for the sake of being a hot technology. When organisations take this approach, the power of AI can truly be unleashed to boost sales and customer engagement. Let’s take a look at how AI can be applied online as well as in brick-and-mortar stores.
IT ALLOWS BRANDS TO COMMUNICATE WITH THEIR TARGET AUDIENCE AT THE RIGHT TIME, WITH THE RIGHT PRODUCT, THE RIGHT OFFER AND MESSAGE, THROUGH THE RIGHT CHANNEL.
Unlocking data potential: Imagine the amount of data brands have access to via multiple platforms. AI can help brands process this data to identify consumer spending patterns, buying preferences, customer demographics, personal preferences, and so on.
Personalization: The best way AI can help brands is with the power of personalization. It allows brands to communicate with their target audience at the right time, with the right product, the right offer and message, through the right channel. Brands are able to achieve higher response rates, increased customer loyalty, and lower marketing costs.
Image search: AI allows consumers to search for products based on images they’ve come across. Shoppers simply take a picture and get matched to similar items on ecommerce websites. A good example is Pinterest, which leverages this technology by allowing its users to select any item from any photograph online and then surfacing similar items through image recognition software.
Enhancing customer service: Chatbots are a popular and invaluable way for brands to offer 24/7 customer service support on their ecommerce websites. They simulate human-like conversations with customers and can execute tasks, automate order processing, and provide accurate answers to customers about product details, quantities and shipping terms.
Generating customer insights in-store: AI deployed in physical stores helps capture and correlate in-store customer behaviour data and shopping preferences with digital channels like social, email, and mobile apps. These insights can be passed on to sales associates for cross-selling, up-selling and strengthening customer engagement directly on the sales floor.
The use of AI becomes even more powerful when combined across all channels. Organizations that realize its potential will not only drive sales and improve efficiency across platforms, but will also build a strong and loyal clientele in the long run.
VIEWPOINT
AI TO STREAMLINE CYBERSECURITY PROCESSES IN 2020 GREG DAY, VP & CSO, EMEA, PALO ALTO NETWORKS, DETAILS CYBERSECURITY PREDICTIONS FOR THE EMEA REGION DURING THE COURSE OF THE NEW YEAR.
It’s time to step back, reflect on the year so far, and make New Year’s resolutions. We’ve taken a moment to consider the upcoming challenges and opportunities for 2020.
AI streamlines cybersecurity processes
There has been much talk of how AI offers new methods of threat detection and how the adversary is looking to subvert AI capabilities. While all the excitement no doubt continues, AI will make a real impact in 2020 in a different way: streamlining cybersecurity processes. SOAR (security orchestration, automation, and response) is just one example – using AI to gather the human knowledge held by cybersecurity staff through NLP and allowing it to be reusable across the rest of the team. This approach provides the building blocks for automating what are typically high-volume, simple, repetitive tasks that no security expert likes doing. It will also help ensure the right people with the right knowledge are engaged on any given project, to best navigate cybersecurity’s latest complex challenges.
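The kind of high-volume, repetitive task a SOAR playbook automates can be sketched as codified analyst knowledge applied to an alert queue. Everything here – the rule, the IP addresses, the field names – is a hypothetical example, not any vendor’s schema:

```python
# Hedged sketch of a SOAR-style playbook step: codify one piece of analyst
# knowledge ("alerts from the internal scanner are benign") so repetitive
# triage happens automatically. All names and rules are invented.

KNOWN_BENIGN_SOURCES = {"10.0.0.5"}  # e.g. the internal vulnerability scanner

def triage(alert: dict) -> dict:
    """Return a copy of the alert with a triage decision applied."""
    decided = dict(alert)
    if decided.get("source_ip") in KNOWN_BENIGN_SOURCES:
        decided["status"] = "auto-closed"
        decided["note"] = "internal scanner traffic (playbook rule)"
    else:
        decided["status"] = "escalated"
    return decided

queue = [
    {"id": 1, "source_ip": "10.0.0.5"},
    {"id": 2, "source_ip": "203.0.113.7"},
]
for alert in map(triage, queue):
    print(alert["id"], alert["status"])  # 1 auto-closed / 2 escalated
```

The point of SOAR is that such rules are captured once from the people who hold the knowledge and then reused by the whole team, leaving only the genuinely ambiguous alerts for human judgement.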
How deep does faking need to get?
The idea of a trusted digital contact or source is hitting an all-time low as faking continues to grow. For the last few years, we have seen an increase in business email compromise (BEC), using stolen trusted credentials to gain access to systems. As the concept of
faking continues to broaden into video, audio, and other digital formats, we are seeing faking move from simple spoofing into a complex web of lies that spans multiple platforms. We can only expect to see more complex, deeper fakes being created to trick and dupe users into doing things the adversary wants.
Cloud becomes specialist What was ‘cloud first’ became ‘cloud appropriate’, ‘hybrid cloud’ and ‘single cloud’, which is now ‘multi-cloud’. What comes next in the cloud journey? The likely answer seems to be more specialist clouds. Why? Particularly
across EMEA, it seems virtual boundaries for data are growing; many policy stakeholders encourage ‘cloud first’ more and more, but adding the caveat that the data must stay in the country or region. This is driven by the ever-increasing focus on privacy. At the same time, IoT and other big data generators are driving the need for more effective edge computing. Both requirements mean taking data and completing some form of processing – be that to hash out some personally identifiable information, convert it to metadata, or reduce the high volume of data into analytical summaries, which can then be processed at the next level. All of this means that while the cloud is connected, it will become more specialised and fragmented to cope with these requirements. Security experts have been getting used to shared responsibility models; they are quickly having to figure out how they normalise views across multi-clouds, and the approaching multiple specialist clouds.
CSOs go back to school to learn the DevOps way
Many CSOs grew up with scripts and GUI interfaces to drive cybersecurity. However, DevOps moves everything to code, breaking it down into the smallest reusable chunks that then require multiple levels of orchestration to function in container and serverless environments. Some CSOs are busy trying to understand how to make security function as code and how it fits
in this new digital world. Others will, in 2020 and beyond, have the challenge thrust upon them. The shift begins. Quite simply, old methods and tools don’t fit this space; CSOs are returning to education to learn the new languages, processes, and capabilities required to become part of the ecosystem.
Current 5G slowdown to lead to even bigger IoT wave
5G has already been rolled out in a few pilot cities across Europe. Yet at the same time, political news seems to be putting the brakes on the deployment of 5G, leading to potentially 12- to 24-month delays. However, this isn’t impacting the mass of IoT devices being developed to take advantage of existing 4G and the upcoming benefits of 5G. In reality, all it means is that whenever 5G is in full swing, there will be more internet-enabled 5G things ready to go. For security leaders, what may have been a small wave will likely now be much bigger. Health devices, connected homes, autonomous vehicles, and financial trading are just a few examples of industries preparing to take advantage. When 5G does go live, the delays in rollout simply mean the CSO and security team will have more things to grapple with, as the additional time means more solutions are ready for market, and many will be desperate to gain quick returns to get their own profitability plans back on track. Businesses should not put off 5G/IoT planning but instead use the additional time now to better define how they will identify the things in the wave when it happens, and what will be the right security process and capabilities to include. If we think having a shared model between cloud and business is complex in 2019, we must realise 5G/IoT has the ability to create far more complex technology chains and associated responsibility models.
More boards asking different and smarter questions of their security teams
Historically, most businesses want to understand the cyber risk and what impact it would have on them. The savvier the organisation, the more likely it is to discuss what the right level of cybersecurity investment is to balance against this. Typically, the CSO wants the platinum solution, as they want to reduce as much risk as possible; yet business leaders may often settle for less where they see the likelihood of business impact as low and a bronze or silver solution as good enough at a much lower cost. None of this is going away, although savvy business leaders are increasingly asking the 'what if' question. If things do happen, what is the response strategy, how long will it take to get the business back to normal, what is the backup strategy, and are processes in place to keep the business moving? As more processes are digitised, they are accepting that, for any number of reasons, things will happen; and the measure of a good security practice, and of the CSO leading it, is not just the ability to identify and manage risks. Increasingly, it is about a resilience strategy developed in conjunction with the business to minimise the commercial impact of the 'when it happens', especially in a 24/7, year-round, cloud-empowered world.
Edge computing gathers pace
An emerging honeypot for cybercriminals? Not so long ago, we noticed an attack that hit a payment service provider, which, in my mind, is the sweet spot. There are millions of PoS devices, so a criminal would have to be in lots of places at once. Attack the bank, however, and you're typically targeting the most secure spot, which means big risks and a lot of attention. So, a bit like the three bears' porridge, the criminal looks for the perfect balance, which is the aggregator in the middle – in that instance, the payment service provider. Today, we are seeing the growth of edge computing – the ability to do that first level of data processing and aggregation before sending to the cloud – the logic being to reduce the latency, lag, and costs of data processing. Edge computing is still in its relative infancy; the most common examples we all probably use are digital personal assistants like Alexa or Cortana. We have already seen how these processes can be compromised in a number of ways; new capabilities generate new opportunities for compromise, and where the opportunity is worth it, criminals will focus. Edge computing is an aggregation point which, like the porridge, is just the right temperature for the adversary. As such, expect to see it being tested by the adversary, and security strategies maturing quickly around this space.
Incident response capabilities evolve, as more fail due to legacy SOC capabilities
In just about every business, the scope of digital processes has at least doubled or tripled. Cloud is a part of daily life, yet for many, the incident response process is still as it was three, four, or more years ago. You may think GDPR forced change in this space, but typically, it only tested existing capabilities. With the volume of security events continuing to escalate, most simply don't have the staff or skills to keep pace. Many have already outsourced at least first-level triage for a number of years. Most are realising that their IR processes don't work effectively during a cloud incident, which can be more complex, often requiring input from both the cloud service and the organisation. These are some of the factors driving security leaders to reassess what the SOC of the future looks like and how to scale to match the relentless growth of alerts. Today, at one end of the spectrum, we've seen cloud providers claiming 100% automation; at the other end, security leaders claim nothing is actioned without human validation first. With such a broad spectrum of capabilities and ever-increasing demands, we can only expect to see more failures that, in turn, necessitate rethinking how a SOC functions as well as where the skills and resources should sit to enable it.
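To make the automation-versus-human-validation spectrum concrete, here is a minimal first-level triage sketch. The thresholds and alert fields are assumptions for illustration, not any vendor's schema:

```python
# Illustrative first-level triage: automate the obvious ends of the
# spectrum and reserve human analysts for the ambiguous middle.

def triage(alert: dict) -> str:
    """Route one alert based on a simple severity-times-criticality score."""
    score = alert["severity"] * alert.get("asset_criticality", 1)
    if score >= 20:
        return "escalate"        # page the on-call analyst
    if score <= 4:
        return "auto-close"      # known-benign noise
    return "human-review"        # queue for manual validation

queue = [
    {"id": 1, "severity": 9, "asset_criticality": 3},
    {"id": 2, "severity": 2, "asset_criticality": 1},
    {"id": 3, "severity": 5, "asset_criticality": 2},
]
decisions = {a["id"]: triage(a) for a in queue}
```

The design point is the middle bucket: full automation and full human validation are both extremes, and most SOCs land somewhere between.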
DECEMBER 2019
CXO INSIGHT ME
35
VIEWPOINT
HOW TO COMBAT TECH SPRAWL STEPHANE POMMEREAU, BUSINESS DEVELOPMENT DIRECTOR FOR MIDDLE EAST, AFRICA & INDIRECT, ENTERPRISE SERVICES, ORANGE BUSINESS SERVICES, ON HOW CIOS CAN REGAIN THE CONDUCTOR ROLE OF IT ORCHESTRATION WITH MULTISOURCING SERVICE INTEGRATION.
It's a complex world and getting more complicated all the time. You have to sympathise with CIOs. There's no doubt that Digital Transformation can be disruptive for IT departments, as technologies and vendors multiply and complexity increases, whatever the situation – from smart city developers to banks undergoing a merger or acquisition. And it's only going to move faster and become even more complex – it's the 'tech sprawl' syndrome. The Harvey Nash/KPMG CIO Survey 2019 UAE findings suggest that the key business issues the management board is looking for IT to
address include improving efficiencies and business processes and delivering consistent and stable IT performance, whilst saving costs and enhancing customer experience. Companies want more data and more technology; more technology means more services, more vendors, more contracts, more budget – but what about service delivery for customers and support for business objectives? Fifteen (or more) service providers is the norm for enterprises, and this will increase as cloud and digital services grow. Fifty percent of IT employee time is wasted on root-cause analysis. The average hourly cost of an infrastructure failure
is $100,000. Firefighting – rather than service alignment and technology integration – may become the order of the day (every day) for the CIO. But there's hope: Multisourcing Service Integration (MSI) can put the CIO back in control as the conductor of the IT orchestra. For more than a decade, the enterprise IT environment has been growing more complex with the rise of new hybrid technologies and services provided by a growing number of suppliers. The challenge for CIOs is to get their multiple suppliers to work together effectively and to overcome the complexity of governing them uniformly. Plus, IT and networks are changing rapidly. Hybrid networks – including software-defined networks (SDN) and SD-WAN – are increasingly being adopted to manage access to cloud services and deal with growing data traffic volumes. In most cases, hybrid infrastructures rely on many local internet service providers. For example, Cisco's recent research (a survey of 500 security practitioners at larger organisations in the US and Europe) found that SD-WAN adoption is growing exponentially: 76% of organisations are either starting to use SD-WAN or have fully onboarded it, driven by security concerns in an increasingly cloud-based IT environment. Meanwhile, 90 percent of enterprises were expected to adopt public cloud in 2019, and 45 percent of CIOs state that the complexities involved in managing the WAN are their biggest operational concern. The multiplication of connected devices, multiple suppliers, different geographic locations and user mobility adds to the growing volume of data being transferred and processed on the network. Trends like IoT, AI, automation and orchestration, and machine learning, along with the development of emerging markets, all contribute to the growing need for optimised and flexible IT governance. This tech sprawl comes at a big cost, and with it the risk to the IT department of losing control over performance and vendors – SLAs and costs – not to mention the accompanying governance and security risks. The challenges and risks can multiply even further when business units bypass the IT department and do their own thing – creating silos across the
organisation. This may allow them to meet immediate business challenges, but it also reduces overall enterprise governance and control, and creates more potential security issues. Change is needed if enterprises are to reap the benefits of digital transformation. So, where do you look for a solution in a multi-vendor environment that gives you control and flexibility and saves you money – while still delivering your digital transformation? Multisourcing Service Integration (MSI) addresses the key challenges facing the CIO: visibility at the network and IT level; the large number of suppliers making it more difficult to see components and interactions; troubleshooting and problem solving across different services; application performance control as more components are used to deliver response times from different locations; managing end-user requests for information and support; security and control across multiple and changing sources in infrastructure and applications; contract and SLA management; and the introduction of new technologies, such as software-defined networks. MSI offers CIOs a solution that allows them to achieve seamless governance, unification, standardisation and end-to-end management of their services. It also provides a single point of ownership, 24/7 proactive monitoring, a technical service desk, Level 2/3 and third-party coordination, on-site and consulting services, automation of incident reporting, in-depth performance and capacity trending reports, and a service catalogue to ease end-user effort. It comprises consulting services, governance services, infrastructure management, vendor management (contract, relationship, performance) and service brokerage, and can deliver average savings of 13 percent on service provider costs. MSI may not be a 'magic wand', but it is the conductor's baton that puts CIOs back in control of IT orchestration.
INTERVIEW
GETTING SMART ABOUT AI SINUHE ARROYO, FOUNDER AND CEO OF TAIGER, TALKS ABOUT HOW AI CAN DELIVER REAL BUSINESS RESULTS.
Isn’t AI still immature in terms of functions and capabilities? rtificial intelligence (AI) technology has made great strides in recent years and is being used in many business processes today. In fact, TAIGER has been using AI and cognitive technologies to exploit the meaning of structured and unstructured
A 38
CXO INSIGHT ME
DECEMBER 2019
information with accuracy to streamline key business processes for many of our customers. This has helped our customers achieve increased efficiency and an overall reduction of costs and human risks. As an example, TAIGER worked with Santander Bank in Spain to transform the way they onboard their SME clients. The 100% digital onboarding process
begins with an AI and OCR application preparing the powers of attorney and identifying the faculties of each SME director in real time. Next, a video call using biometric recognition is used for remote identification of the SME director. Finally, a single digital signature is used for all documents and queries and is stored securely in the cloud.
By automating and digitalizing the main steps of onboarding, Santander successfully reduced the time taken for new customers to open an account from an average of seven days with the traditional in-branch process to just 15 minutes. Through this project, SMEs that have never banked with Santander can now open an account 100% digitally, through the web or a mobile app, without the need to go to a branch, eliminating cumbersome paperwork and laborious human-based processing.

Will AI eventually develop into a general-purpose technology?
We see huge potential for AI to be applied across a wide range of business and everyday use cases. In the near future, we can certainly see it being developed into a general-purpose technology that small businesses or even households may harness and apply very easily to day-to-day mundane tasks. The entire AI industry needs to develop enough use cases, and organizations need to be aware of the benefits of AI technology, for it to gain mass adoption and be infused into everyday technologies and processes. We believe we will eventually achieve that as we continue to drive awareness of the technology through real-world adoption and research to improve AI technology.

What are some of the common barriers to AI implementation in enterprises today? Do you have any tips for CIOs to overcome these challenges?
While AI has become more tangible, measurable, and repeatable across industries, its value remains elusive to a wide range of business leaders. For example, 80% of businesses in the accounting industry are still holding back on AI adoption, as they are uncertain about the return on investment due to a lack of concrete business use cases. The value of AI is clear, but to encourage widespread adoption, we
need to demystify AI and provide a clear roadmap for organizations as they look to start implementing AI into their business processes. We are already seeing organizations establish AI Centers of Excellence and other sandboxes to pilot AI capabilities. This is reminiscent of the early days of cloud adoption – slow initially, followed by quick and widespread adoption once businesses started to see the clear benefits of cloud computing. Business leaders need to understand that those who can identify applications
for AI and implement it in a highly scalable way can get the most value from their structured data, unstructured data, and their employees.

How can we tackle some of the ethical issues related to AI?
In certain use cases, there are outstanding ethical issues when it comes to AI – be it racial biases amplified by facial recognition technology, autonomous vehicles getting into traffic accidents, or the problem of AI taking away human jobs. That being said, AI as a technology is still very much driven by human developers and scientists, and these ethical issues will have to be resolved and tackled at the human level before implementation. Already, many countries and jurisdictions have set up AI centers of excellence to determine ethical rules and regulations for AI implementation. In Singapore, the government released a framework for the ethical and responsible use of AI earlier this year, in addition to setting up the Advisory Council on the Ethical Use of AI and Data. Similarly, the UAE government has released an Ethical AI Toolkit to guide developers on these issues. We believe these issues will be resolved as we continue to evolve and fine-tune the ways in which AI is used and implemented across different industries and sectors.

What does the future hold for AI?
As AI achieves greater traction across different industries, we believe the future for AI as a technology is full of promise. At TAIGER, we will continue to focus our research and development efforts on technologies related to deep learning and self-learning. We hope to integrate these cutting-edge technologies into our portfolio of solutions to help enterprises further streamline key business processes, achieving a smarter, leaner, more agile organization that can win in the digital economy of tomorrow.
VIEWPOINT
KEEPING UP WITH DIGITAL TRANSFORMATION MARCO ROTTIGNI, CHIEF TECHNICAL SECURITY OFFICER EMEA, QUALYS, ON THE IMPACT OF DIGITAL TRANSFORMATION ON THE PRACTICALITIES OF IT.
Sometimes it is difficult to see everything that affects us. Getting philosophical for a moment, you cannot see what you do not know about, like colour blindness affecting people's ability to see red and green. For those of us in IT security, this inability to see everything can lead to unnecessary risks and challenges. In other words, you cannot defend what you can't see. Visibility across IT is a challenge today. New digital transformation
initiatives have delivered vital competitive advantages for the companies involved, but these new projects have made it difficult to track what is taking place across IT. Rather than being able to maintain accurate lists of assets over time, IT teams today can find it difficult to keep up with all the changing parts that make up applications. Digital transformation involves developing completely new business models based on technology and leads to a huge amount of change in how IT
teams work to support the scale, speed and ephemeral nature of underlying IT, particularly when cloud applications or third-party services are involved. Rather than being centralised and easier to manage, the range of IT assets to track has gone up considerably, and the number of different infrastructure locations or platforms used has risen too. This has a big impact on security, which relies on visibility of assets to manage and reduce risk. IT teams now need to have a constant stream
of updates around all the changes and fluctuations taking place, and to consolidate that information in one central location. The resulting single-pane visibility provides a foundation for other processes that can harmonise IT, security and compliance teams across the organisation. As digital transformation efforts take place, IT has to keep up with the basics as well.
Taking a practical approach to keeping up with change
To keep up with digital transformation, you have to maintain constant insight into what is changing across IT. This insight has to be accurate and up to date, and provide useful information on risk. Without this data, you will forever be in catch-up mode, making it extremely difficult, if not impossible, to manage security over time. This is particularly hard for ephemeral applications, such as those built on microservices or in containers, where demand levels lead to increased numbers of machines being deployed and then removed when no longer in use. To get this insight, you need a continuous stream of data so you can track what is taking place across these ephemeral assets, both in the moment and over time. To get that data, you must have sensors within each infrastructure component on every platform the IT team uses – from endpoints and devices, through internal applications deployed in data centres, to new applications based in the cloud. The ability to collect this data allows security teams to understand it in context – normalising and simplifying it so that it delivers the right level of visibility.
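The consolidation step described above — many sensors feeding one normalised inventory — might look like this in outline. The record fields and source names are invented for illustration:

```python
# Sketch of consolidating asset reports from different sensors into one
# normalised inventory keyed by a stable identifier. Field names are
# hypothetical; real platforms each have their own schemas.

def normalise(record: dict, source: str) -> dict:
    """Map one sensor record onto a common asset shape."""
    return {
        "asset_id": record.get("id") or record.get("hostname"),
        "platform": record.get("platform", "unknown"),
        "last_seen": record["timestamp"],
        "source": source,
    }

def merge(inventory: dict, records: list, source: str) -> dict:
    """Fold a batch of sensor records into the central inventory."""
    for r in records:
        n = normalise(r, source)
        existing = inventory.get(n["asset_id"])
        # keep the freshest sighting of each asset
        if existing is None or n["last_seen"] > existing["last_seen"]:
            inventory[n["asset_id"]] = n
    return inventory

inv = {}
merge(inv, [{"id": "web-01", "platform": "aws", "timestamp": 100}], "cloud-agent")
merge(inv, [{"hostname": "web-01", "timestamp": 250}], "endpoint-agent")
```

The point of the sketch is the keying: two sensors reporting the same asset collapse into one record, which is what makes single-pane visibility possible.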
Planning ahead on data
Now that you have this data, what can you use it for? It can power more proactive planning around security issues as they develop. This helps you deliver new processes and ways of ensuring security that can keep pace with digital service delivery.
For example, software vulnerabilities are discovered all the time. These can exist across IT, from endpoint devices and operating systems through to the new cloud and software platforms used to deliver digital transformation. Finding these vulnerabilities can be challenging without an up-to-date IT asset list and data coming in from each asset. Similarly, the sheer volume of vulnerabilities can make them difficult to manage. In this case, you have to weigh up the potential impact of any new vulnerabilities across different devices and device types so that you can prioritise those that represent the biggest risks. This approach – prioritisation, building the asset list in a centralised place, and connecting assets to vulnerabilities – can also help you spot other security issues, such as applications that have reached their end of life and won't receive new security patches, and potentially unwanted applications (PUAs). To build on this, you can also use this data to manage relationships with
stakeholders across the business, from other IT teams to senior business leaders. The role of IT as a facilitator has become more important as digital transformation work has grown. Firstly, the level of investment in digital has made these projects more valuable and more visible to the business; secondly, the amount of interest in security issues is higher than it has ever been, due to the number of data breaches and the increased compliance legislation that has been brought in. By getting data on issues early and communicating potential risks – or by flagging where issues in the news don't have an impact – you can help management teams understand what is going on and how risks are handled. The important thing is to make this visibility consumable and actionable, starting from a high-level dashboard and drilling down in a couple of clicks to the specific information needed to support actions within specific teams.
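The prioritisation approach described above — weighting each finding by asset criticality and raising the urgency of end-of-life software — can be sketched roughly as follows. The scoring scheme is illustrative, not a standard:

```python
# Prioritisation sketch: weight each finding by severity and by how
# critical the affected asset is, and surface end-of-life software.

def priority(vuln: dict, assets: dict) -> float:
    """Higher score = fix sooner. Scoring weights are invented."""
    asset = assets[vuln["asset_id"]]
    score = vuln["cvss"] * asset["criticality"]
    if asset.get("end_of_life"):      # no patches coming: raise urgency
        score *= 1.5
    return score

assets = {
    "db-01": {"criticality": 3, "end_of_life": False},
    "kiosk": {"criticality": 1, "end_of_life": True},
}
vulns = [
    {"id": "V-1", "asset_id": "db-01", "cvss": 9.8},
    {"id": "V-2", "asset_id": "kiosk", "cvss": 5.0},
]
ranked = sorted(vulns, key=lambda v: priority(v, assets), reverse=True)
```

A medium-severity finding on a critical, unpatchable asset can outrank a high-severity one elsewhere, which is the whole argument for connecting vulnerabilities to the asset inventory.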
Digital transformation requires security transformation
The investment in digital transformation projects is not slowing. If anything, traditional companies are spending more to get up to speed alongside new market entrants. This has led to new applications being developed and cloud-based infrastructure expanding rapidly. The move to digital requires a new approach to security that can keep up with these developments. It demands more visibility, greater automation, and more understanding. As digital transformation makes businesses more responsive to customer demands, so security has to follow the same approach, responding faster to changes and ensuring that the right steps are taken to fix issues. This involves more collaboration across teams and processes, and should be based on common data to allow for more objective decisions. Digital transformation involves meeting needs faster, and a continuous, data-driven approach to IT security that also embraces automation will help IT support this goal.
REPORT
WHAT DOES 2020 HOLD IN STORE FOR CYBERSECURITY?
FORTINET PREDICTS ADVANCED AI AND COUNTER THREAT INTELLIGENCE WILL EVOLVE, SHIFTING THE TRADITIONAL ADVANTAGE OF THE CYBERCRIMINAL

These predictions reveal methods that Fortinet anticipates cybercriminals will employ in the near future, along with important strategies that will help organisations protect against these oncoming attacks. "Much of the success of cyber adversaries has been due to the ability to take advantage of the expanding attack surface and the resulting security gaps due to digital transformation," said Derek Manky, Chief, Security Insights and Global Threat Alliances, Fortinet. "Most recently, their attack methodologies have become more sophisticated by integrating the precursors of AI and swarm technology. Luckily, this trajectory is about to shift, if more organizations use the same sorts of strategies to defend their networks that criminals are using to target them. This requires a unified approach that is broad, integrated, and automated to enable protection and visibility across network segments as well as various edges, from IoT to dynamic clouds."

Highlights of the predictions:

The Evolution of AI as a System
One of the objectives of developing security-focused artificial intelligence (AI) over time has been to create an adaptive immune system for the network, similar to the one in the human body. The first generation of AI was designed to use machine learning models to learn, correlate and then determine a specific course of action. The second generation of AI leverages its increasingly sophisticated ability to detect patterns to significantly enhance things like access control by distributing learning nodes across an environment. The third generation of AI is where, rather than relying on a central, monolithic processing centre, AI will interconnect its regional learner nodes so that locally collected data can be shared, correlated, and analysed in a more distributed manner. This will be a very important development as organisations look to secure their expanding edge environments.

Federated Machine Learning
In addition to leveraging traditional forms of threat intelligence pulled from feeds or derived from internal traffic and data analysis, machine learning will eventually rely on a flood of relevant information coming from new edge devices to local learning nodes. By tracking and correlating this real-time information, an AI system will not only be able to generate a more complete view of the threat landscape, but also refine how local systems can respond to local events. AI systems will be able to see, correlate, track, and prepare for threats by sharing information across the network. Eventually, a federated learning system will allow data sets to be interconnected so that learning models can adapt to changing environments and event trends and so that an event at one point improves the intelligence of the entire system.
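A toy sketch of the federated step described above — local nodes share only model parameters, and a coordinator averages them so every node benefits from events seen elsewhere. The numbers are invented for illustration:

```python
# Toy federated-averaging step: each "learning node" trains locally and
# shares only its parameter vector, never its raw local data.

def federated_average(node_weights: list) -> list:
    """Element-wise mean of each node's parameter vector."""
    n = len(node_weights)
    return [sum(ws) / n for ws in zip(*node_weights)]

# Three edge nodes, each with a 3-parameter local model
local = [
    [0.2, 0.5, 0.1],
    [0.4, 0.3, 0.3],
    [0.6, 0.1, 0.2],
]
global_model = federated_average(local)
```

In real federated learning, the averaged model is then pushed back to every node, so an event observed at one point improves detection at all of them — which is exactly the property the article describes.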
Combining AI and Playbooks to Predict Attacks
Investing in AI not only allows organisations to automate tasks; it can also enable an automated system that can look for and discover attacks after the fact – and, eventually, before they occur. Combining machine learning with statistical analysis will allow organisations to develop customised action plans tied to AI to enhance threat detection and response. These threat playbooks could uncover underlying patterns that enable the AI system to predict an attacker's next move, forecast where the next attack is likely to occur, and even determine which threat actors are the most likely culprits. If this information is added into an AI learning system, remote learning nodes will be able to provide advanced and proactive protection, where they not only detect a threat but also forecast movements, proactively intervene, and coordinate with other nodes to simultaneously shut down all avenues of attack.
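One way to picture such a playbook is as a simple transition model over observed attacker techniques. The incident data below is invented, and real playbooks are far richer than a first-order frequency count:

```python
# Sketch of a "threat playbook" as a first-order transition model:
# count observed technique sequences, then predict the attacker's most
# likely next move from the current one. Technique names are invented.

from collections import Counter, defaultdict

def build_playbook(incidents: list) -> dict:
    """incidents: list of observed technique sequences."""
    transitions = defaultdict(Counter)
    for seq in incidents:
        for cur, nxt in zip(seq, seq[1:]):
            transitions[cur][nxt] += 1
    return transitions

def predict_next(playbook: dict, current: str) -> str:
    """Most frequently observed follow-on technique."""
    return playbook[current].most_common(1)[0][0]

incidents = [
    ["phishing", "credential-dumping", "lateral-movement"],
    ["phishing", "credential-dumping", "exfiltration"],
    ["exploit-public-app", "credential-dumping", "lateral-movement"],
]
playbook = build_playbook(incidents)
next_move = predict_next(playbook, "credential-dumping")
```

Even this crude model captures the idea in the text: having seen credential dumping, a defender can pre-position controls on the most likely next step rather than reacting after it happens.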
The Opportunity in Counterintelligence and Deception
One of the most critical resources in the world of espionage is counterintelligence, and the same is true when attacking or defending an environment where moves are being carefully monitored. Defenders have a distinct advantage, with access to the sorts of threat intelligence that cybercriminals generally do not have, which can be augmented with machine learning and AI. The increased use of deception technologies could spark a counterintelligence response from cyber adversaries: attackers will need to learn to differentiate between legitimate and deceptive traffic without getting caught simply for spying on traffic patterns. Organisations will be able to counter this strategy effectively by adding playbooks and more pervasive AI to their deception strategies. This will not only detect criminals looking to identify legitimate traffic, but also improve the deceptive traffic so it becomes
impossible to differentiate from legitimate transactions. Eventually, organizations could respond to any counterintelligence efforts before they happen, enabling them to maintain a position of superior control.
Tighter Integration with Law Enforcement
Cybersecurity has unique requirements related to things like privacy and access, while cybercrime has no borders. As a result, law enforcement organisations are not only establishing global command centres but have also begun connecting them to the private sector, bringing them one step closer to seeing and responding to cybercriminals in real time. A fabric of law enforcement together with public- and private-sector relationships can help identify and respond to cybercriminals. Initiatives that foster a more unified approach – bridging the gaps between international and local law enforcement agencies, governments, businesses, and security experts – will help expedite the timely and secure exchange of information to protect critical infrastructure against cybercrime.
Advanced Evasion Techniques
A recent Fortinet Threat Landscape report shows a rise in the use of advanced evasion techniques designed to prevent detection, disable security functions and devices, and operate under the radar using living-off-the-land (LOTL) strategies – exploiting existing installed software and disguising malicious traffic as legitimate. Many modern malware tools already incorporate features for evading antivirus or other threat detection measures, but cyber adversaries are becoming more sophisticated in their obfuscation and anti-analysis practices to avoid detection. Such strategies maximise weaknesses in security resources and staffing.
Swarm Technology
Over the past few years, the rise of swarm technology – which can leverage things like machine learning and AI to attack networks and devices – has shown new potential. Advances in swarm technology have powerful implications in the fields of medicine, transportation, engineering, and automated problem solving. However, if used maliciously, it may also be a game changer for adversaries if organisations do not update their security strategies. When used by cybercriminals, bot swarms could infiltrate a network, overwhelm internal defences, and efficiently find and extract data. Eventually, specialised bots, armed with specific functions, will be able to share and correlate intelligence gathered in real time to accelerate a swarm's ability to select and modify attacks to compromise a target, or even multiple targets simultaneously.
Weaponizing 5G and Edge Computing
The advent of 5G may end up being the initial catalyst for the development of functional swarm-based attacks, enabled by the ability to create local, ad hoc networks that can quickly share and process information and applications. By weaponizing 5G and edge computing, individually exploited devices could become a conduit for malicious code, and groups of compromised devices could work in concert to target victims at 5G speeds. Given the speed, intelligence, and localised nature of such an attack, legacy security technologies could struggle to fight off such a persistent strategy.
A Change in How Cybercriminals Use Zero-day Attacks
Traditionally, finding and developing an exploit for a zero-day vulnerability was expensive, so criminals typically hoarded them until their existing portfolio of attacks was neutralised. With the expanding attack surface and the increasing ease of discovery, an increase in the volume of potentially exploitable zero-day vulnerabilities is on the horizon. AI fuzzing and zero-day mining have the potential to increase the volume of zero-day attacks exponentially as well. Security measures will need to be in place to counter this trend.
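As a rough illustration of why fuzzing lowers the cost of zero-day discovery, here is a plain random fuzzer against a toy target with a planted bug. AI fuzzing, as the report warns, replaces the blind mutation below with learned input generation, which is what makes discovery so much cheaper:

```python
import random

# Plain random fuzzing sketch: throw mutated inputs at a parser and
# record which ones crash it. Toy target, illustrative only.

def fragile_parser(data: bytes) -> int:
    """Toy target with a planted bug: fails on a specific byte."""
    if b"\xff" in data:
        raise ValueError("unhandled byte")
    return len(data)

def fuzz(target, rounds: int, seed: int = 0) -> list:
    """Feed random byte strings to the target; collect crashing inputs."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(rounds):
        data = bytes(rng.randrange(256) for _ in range(8))
        try:
            target(data)
        except Exception:
            crashes.append(data)
    return crashes

found = fuzz(fragile_parser, rounds=1000)
```

Each crashing input is a candidate vulnerability; guided or AI-driven fuzzers find the same bugs with far fewer attempts by learning which mutations make progress.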
PRODUCTS
Tecno Mobile’s CAMON 12 TECNO Mobile has launched its new smartphone CAMON 12. Designed specifically for selfie and photography enthusiasts, the launch event of the new device also saw the company highlight its regional market strategy, and expansion plans as it seeks to grow its footprint across the Middle East. Key features of CAMON 12 include a 6.52” Crystal Dot Notch Screen along with an outstanding 90% edge-to-edge ratio – users are ensured a new and holistic cinematic viewing experience with a wider horizon. Users can now take more natural and exquisite photos with the 16MP main lens focusing on AI scene detection and AI HDR. This also helps
in covering various common shooting scenarios and offering corresponding AI optimization. The remarkable bokeh
effect highlights the main portrait and blurs background sundry, making the portrait the best visual sense. The secondary lens brings 120° super-wideangle shots and 2cm extreme macro photography experience. Equipped with the latest version of the Android Pie operating system and user interface, its makers say CAMON 12 brings users a brand new user experience. The newly upgraded system focuses on playing a role of intelligent partner as it not only offers efficient working conditions, but also varied entertaining functions. Intelligent voice broadcast, Smart panel, AI read mode and other refreshing functions are inbuilt in the phone.
Lenovo ThinkBook laptops
In the UAE, ThinkBook laptops are available in 13, 14 and 15-inch variants. The flagship ThinkBook 14 and ThinkBook 15 devices are powered by Windows 10 Pro and up to 10th Gen Intel Core processing, combining high performance with intuitive, time-saving features. Options include Intel Optane memory, WiFi 6, and discrete graphics. The ThinkBook 15 comes in at just 18.9mm thin, while the ThinkBook 14 is a mere 17.9mm – each easy to slip into a backpack, briefcase or under one’s arm. With FHD displays and two Dolby Audio™ speakers, users have full visibility and clarity when working. Dual-array, Skype-certified microphones make conference calling easier than ever before, and a USB 3.1 (Gen2, Type-C) port ensures large data transfers and heavy media files are managed with speed. Designed for ultimate portability, Lenovo has also introduced the ThinkBook S series, including the elegant 13.3-inch ThinkBook 13s. The sleek and light device features a metallic finish on an all-aluminium chassis, alongside a narrow-bezel display.
Aruba CX switching series
The Aruba CX Switching Portfolio now includes the Aruba CX 6300 Series fixed configuration and CX 6400 Series modular access, aggregation and core switches, and delivers the latest advancements in the AOS-CX operating system. This gives network operators one simple, end-to-end switching platform to dramatically improve business outcomes today and into the future. The Aruba CX 6300 Series is a family of stackable switches that offers flexible growth via a 10-member virtual switching
framework (VSF) and provides built-in 10/25/50 gigabit uplinks to meet the bandwidth needs of today and the future. The Aruba CX 6400 Series modular switches offer both a 5-slot chassis and a 10-slot chassis with a non-blocking fabric that scales from Gigabit PoE access to 100G core, allowing customers to standardise on one platform across the
enterprise, including hybrid use cases. The CX Switching Portfolio, including the Aruba CX 6300 and CX 6400 Series switches, the new version of AOS-CX and Aruba NetEdit 2.0 will begin shipping in November 2019. List pricing for the Aruba CX 6300 and CX 6400 Series starts at $5,899 and $13,499 respectively.
BLOG
WHY COMPANIES CAN NO LONGER IGNORE ZERO TRUST WEAK PASSWORDS ARE A MAJOR SOURCE OF BREACHES AND BY ADOPTING A ZERO TRUST APPROACH WITH LEAST PRIVILEGED ACCESS, ORGANIZATIONS CAN INCREASE THEIR COMPLIANCE LEVELS, WRITES KAMEL HEUS, REGIONAL DIRECTOR, NORTHERN, SOUTHERN EUROPE, MIDDLE EAST AND AFRICA AT CENTRIFY.
The recent Gartner Security and Risk Management Summit held in Dubai, UAE revealed that the Middle East and North Africa region has the highest number of reported breaches in the world, with more than 36,000 incidents reported in 2018. Gartner presentations also revealed that the region has the highest mean time to identify a breach, at 260 days. What are the weaknesses in organisations that allow such a high number of incidents? Post-incident analysis usually reveals that the prevalence and use of weak passwords amongst end users, and especially privileged users such as administrators, is the root cause of such breaches. Most incidents are not necessarily advanced in nature; they mostly stem from threat actors cracking weak passwords and gaining entry into an organisation’s network using compromised credentials of end users and administrators. Gaining entry through the credentials of an actual end user or a privileged user such as an administrator remains the easiest entry strategy for threat actors. Forrester Research points out that 80% of security breaches result from privileged access abuse. In the past, it used to be assumed that access granted through a login
including a user name and password was sufficient to guarantee the authenticity of the user. With threat actors increasingly able to brute-force passwords, especially weak and reused ones, this assumption is no longer valid, and its collapse spawned the Zero Trust model. The Zero Trust model, first suggested by Forrester Research and the National Institute of Standards and Technology in 2010, reinforces the modern belief that login identities can no longer be trusted, inside or outside the organisation, especially with the expanding threat surface. The Zero Trust model today covers the following elements, with the objective of not implicitly trusting any access by any user without verification:
• Networks: Verify access to segment, isolate, and control the network.
• Data: Control access to secure and manage data, develop classification schemes, and encrypt data at rest and in transit.
• Workloads: Verify and control access to the application stack.
• Devices: Verify and control access of every device on the network.
• Identities: Limit the access of users and secure them.
By limiting and securing privileged access to the above, the organisation moves away from a perimeter-based approach to a Zero Trust approach. The Zero Trust approach boosts prevention, detection, response, and compliance
towards standards such as HIPAA, FISMA, PCI, and others. Moreover, it can be extended to the cloud, mobility, Big Data lakes, DevOps, containers, microservices, and more. Organisations can begin their Zero Trust journey with the following initiatives:
#1 Vault all privileged credentials
Access to the credentials of privileged users and privileged resources needs to be secured and controlled, raising the level of security management control. Rigorous multi-factor authentication also needs to be enabled around privileged users and privileged resources.
#2 Consolidate identities and introduce least privilege
All identities need to be consolidated to eliminate redundant ones, while at the same time limiting privileges to the minimum required to get the work done. Workflows need to be restricted in a similar manner to prevent lateral user movement.
#3 Harden the environment
Once the above two initiatives have been implemented, the organisation can move to the next level of compliance. This can include air gapping around hardware and resources, host-based intrusion detection systems, and advanced behavioural analytics.
By going through these steps, organisations can significantly reduce their exposure to security breaches and password theft.
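The first two initiatives can be sketched in a few lines of code. The toy vault below checks out a short-lived credential only when multi-factor authentication has been verified and the requested privilege is in the user's minimal granted set. This is a simplified model of the principle, not any vendor's actual API; all class, role and privilege names are hypothetical:

```python
import time

class CredentialVault:
    """Toy vault: short-lived credential checkout under least privilege + MFA."""

    def __init__(self, grants):
        # grants: user -> set of privileges that user minimally needs (initiative #2)
        self._grants = grants
        self._secrets = {}   # resource -> stored secret (initiative #1)

    def store(self, resource: str, secret: str):
        """Place a privileged credential under the vault's control."""
        self._secrets[resource] = secret

    def checkout(self, user: str, resource: str, privilege: str,
                 mfa_verified: bool, ttl_seconds: int = 300):
        """Return (secret, expiry) only if MFA passed and privilege is granted."""
        if not mfa_verified:
            raise PermissionError("MFA verification required")
        if privilege not in self._grants.get(user, set()):
            raise PermissionError("least privilege: access not granted")
        return self._secrets[resource], time.time() + ttl_seconds

# alice is granted read access only; any write checkout is refused.
vault = CredentialVault(grants={"alice": {"db:read"}})
vault.store("prod-db", "s3cr3t")
secret, expiry = vault.checkout("alice", "prod-db", "db:read", mfa_verified=True)
```

The short TTL reflects the Zero Trust idea that access is verified per request rather than granted once and trusted indefinitely.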