FORECASTS AND PREDICTIONS
Top executives share their industry and technology forecasts for 2022 and ahead.
The last two years were the warm-up years for digital transformation. 2022 looks all set to be the kick-off year, and enterprises need to get this right, without the overarching headache of pandemic disruptions. However, as in every year, wherever there is a silver lining for the IT industry there are also challenges.
The continuous attacks on IT infrastructure and the failure of enterprise software to protect its integrity have finally made a dent in enterprise sentiment. Alongside climate change, the 2021 Edelman Trust Barometer shows that trust in the technology sector is declining.
Through a supply chain exploit, rather than attacking 100 or 1,000 separate organisations, attackers can successfully compromise a complete ecosystem through one company alone.
The reputation and integrity of enterprise software vendors have dropped to such new lows that Lior Div, CEO and Cofounder of Cybereason, categorically stated that organisations depending on Microsoft for security will make headlines in 2022.
According to Div, Microsoft will continue to be the primary focus for cyberattacks in 2022 and defenders need to understand the risk of relying on Microsoft to protect them. Microsoft has a dominant role across operating systems, cloud platforms, and applications, which makes it fairly ubiquitous.
Div continues: what we need to be aware of as we go into 2022 is the increasing cooperation and collaboration between threat actors.
According to Ettiene van der Watt at Axis Communications, companies must pay closer attention to their processes from end to end. Authenticity is becoming the next big hurdle in the age of data manipulation.
With some businesses facing over 200,000 threats per year, it is impossible for humans alone to manage them, points out Gregg Petersen at Cohesity. Ransomware continues to be a powerful and potentially devastating type of cyberattack. Ransomware as a Service, RaaS, has seen continued evolution during 2021.
A distributed workforce means protecting a corporate network like a walled garden is no longer appropriate. The way forward will be provided by automation and artificial intelligence, and CISOs need to be aware of the potential of AI in their armoury, stresses Petersen.
The arrival of low-code, data analytics, and real-time insights will make 2022 an important year for data science. No-code and low-code will simplify and democratise AI. Alan Jacobson and David Sweenor from Alteryx point out that 2022 will be the year of the Chief Transformation Officer.
There is an existing skills gap between data scientists as practitioners and those as teachers. With the continued democratisation of analytics, data scientists need to evolve from problem solvers to teachers, and organisations will shift their mindsets to sharing rather than data hoarding.
Turn these pages to learn more about forecasts and predictions for 2022 and ahead. Also, read about the launch and build-up of Meta’s Research SuperCluster.
Wishing you a super busy, on-going events quarter, and positive returns in business.
MANAGING DIRECTOR
TUSHAR SAHOO
TUSHAR@GECMEDIAGROUP.COM
EDITOR
ARUN SHANKAR
ARUN@GECMEDIAGROUP.COM
CEO
RONAK SAMANTARAY
RONAK@GECMEDIAGROUP.COM
GLOBAL HEAD, CONTENT AND STRATEGIC ALLIANCES
ANUSHREE DIXIT
ANUSHREE@GECMEDIAGROUP.COM
GROUP SALES HEAD
RICHA S RICHA@GECMEDIAGROUP.COM
EVENTS EXECUTIVE GURLEEN ROOPRAI GURLEEN@GECMEDIAGROUP.COM
JENNEFER LORRAINE MENDOZA JENNEFER@GECMEDIAGROUP.COM
SALES AND ADVERTISING
RONAK SAMANTARAY
RONAK@GECMEDIAGROUP.COM
PH: + 971 555 120 490
DIGITAL TEAM
IT MANAGER
VIJAY BAKSHI
DIGITAL CONTENT LEAD DEEPIKA CHAUHAN
SEO & DIGITAL MARKETING ANALYST HEMANT BISHT
PRODUCTION, CIRCULATION, SUBSCRIPTIONS INFO@GECMEDIAGROUP.COM
CREATIVE LEAD AJAY ARYA
GRAPHIC DESIGNER RAHUL ARYA
DESIGNED BY
SUBSCRIPTIONS
INFO@GECMEDIAGROUP.COM
PRINTED BY
Al Ghurair Printing & Publishing LLC.
Masafi Compound, Satwa, P.O. Box 5613, Dubai, UAE
# 203, 2nd Floor, G2 Circular Building, Dubai Production City (IMPZ)
Phone : +971 4 564 8684
31 FOXTAIL LANE, MONMOUTH JUNCTION, NJ 08852, UNITED STATES OF AMERICA PHONE NO: +1 732 794 5918
A PUBLICATION LICENSED BY International Media Production Zone, Dubai, UAE. © Copyright 2013 Accent Infomedia. All rights reserved. While the publishers have made every effort to ensure the accuracy of all information in this magazine, they will not be held responsible for any errors therein.
Authenticity is becoming the next big hurdle
COVER STORY / 24-46 2022 and ahead: Forecasts and predictions
l ALAN JACOBSON, DAVID SWEENOR: 2022 will be the year of the Chief Transformation Officer
l AMEL GARDNER: Information and insights are required just in time
l ASHRAF YEHIA: Challenge for datacentres is not efficiency but sustainability
l DAVID BROWN: Attack surface management an important area in 2022
l DAVID HUGHES: Consumers are placing value on experiences over things
l DAVID NOËL: Appetite for digital services unlikely to reverse itself
l ETTIENE VAN DER WATT: Trust in the technology sector is declining globally
l FADI KANAFANI: In 2022 AI starts to permeate all industries
l FIRAS JADALLA: Spatial and video analytics to progress in 2022
l GREGG PETERSEN: CISOs need to be aware of potential of AI in their armoury
l ISSAM LACHGAR: Digital technologies can help meet 2030 and 2050 targets
l LIOR DIV: Organisations depending on Microsoft for security will make headlines
l MOUSSALAM DALATI: Retailers will have to create experiential retail
l OSAMA AL-ZOUBI: Data has ownership, sovereignty, privacy, compliance challenges
l RAJESH GANESAN: Hybrid work making network-based security obsolete
l SHERIFA HADY: Opportunities for channel in reducing network complexity
l SID BHATIA: Enterprise AI will be supplanted by everyday AI
l WOJTEK ŻYCZYŃSKI: Channel marketplaces driving away product catalogues
l And others
Here are some forward looking forecasts about the role played by 5G, costs of device connectivity, low earth orbit satellites, and factory area networks.
There is so much happening in the Internet of Things world. Here, we explain why demand for device connectivity is about to take off with these predictions.
You will see a steady increase in IoT connected device projects that require more than one method of connectivity, but costs are high, and standards are lacking. One way to accelerate this would be if LPWAN providers adhered to some standardisation, making it easier for IoT platform providers to support multiple vendors. However, this is not an attractive choice for the LPWAN providers for the obvious reason that cheaper competitors could replace them.
What will make it happen is simply price drops. These will make it possible to cheaply add multiple connectivity standards to a device. IoT platforms will be expected to abstract and normalise them, acting as the do-it-all Swiss Army knife.
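The abstraction the author expects from IoT platforms can be pictured as a thin adapter layer over vendor-specific transports. Here is a minimal sketch; the class names, backends, and message fields are invented for illustration, not a real platform API:

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Common interface the platform expects from every connectivity method."""
    @abstractmethod
    def send(self, device_id: str, payload: bytes) -> dict: ...

class LpwanTransport(Transport):
    # Hypothetical LPWAN backend with a vendor-specific envelope.
    def send(self, device_id, payload):
        return {"vendor": "lpwan-x", "dev": device_id, "data": payload.hex()}

class CellularTransport(Transport):
    # Hypothetical cellular backend with a different envelope again.
    def send(self, device_id, payload):
        return {"imsi": device_id, "body": payload}

class IotPlatform:
    """Normalises vendor-specific envelopes into one canonical message form."""
    def __init__(self):
        self.transports = {}

    def register(self, name: str, transport: Transport):
        self.transports[name] = transport

    def publish(self, name: str, device_id: str, payload: bytes) -> dict:
        raw = self.transports[name].send(device_id, payload)
        # Canonical form: the application never sees vendor-specific fields.
        return {"device": device_id, "via": name, "bytes": len(payload), "raw": raw}

platform = IotPlatform()
platform.register("lpwan", LpwanTransport())
platform.register("cellular", CellularTransport())
msg = platform.publish("lpwan", "sensor-42", b"\x01\x02")
```

The point of the sketch is the `publish` step: whichever transport carried the message, the application receives one normalised shape, which is what would let a device carry multiple connectivity standards cheaply.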
There will be huge hype around low earth orbit satellites for connectivity, but that may be short-lived. A low earth orbit satellite is an object that orbits the earth at lower altitudes than geosynchronous satellites, normally between 160 km and 1,000 km above the earth. They are commonly used for communications, military reconnaissance, spying, and other imaging applications.
In 2022 we will see many more launches, where companies are aiming for complete worldwide connectivity at fast speeds and with low latency.
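The low-latency claim is simple light-travel-time arithmetic: the signal path to a LEO satellite is a tiny fraction of the path to geosynchronous orbit. A quick sketch, using 550 km as an illustrative LEO altitude and ignoring processing and routing overhead:

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def one_way_delay_ms(altitude_km: float) -> float:
    """Minimum one-way signal delay to a satellite directly overhead."""
    return altitude_km / C_KM_PER_S * 1000

leo_ms = one_way_delay_ms(550)      # an illustrative LEO altitude
geo_ms = one_way_delay_ms(35_786)   # geosynchronous altitude

# LEO comes out around 1.8 ms one way versus roughly 119 ms for GEO,
# which is why LEO constellations can promise fibre-like latency.
```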
Although there will be a huge hype around it in 2022, we predict that it might not stick. Why?
The use cases are based on unlocking access to remote locations. And 95% of them are in agriculture. While it goes without saying that satellite connectivity technology for devices is relatively cheap, the lack of variety in use cases makes you wonder how to achieve full-scale adoption in the long run.
Once the hype settles, the technology will find a niche in other areas where cellular coverage is difficult, for example Australia or Brazil. As such, it will be at most a complementary technology.
In 2022, private 5G uptake for IIoT will be slower than the market expected. One of the reasons is that other technologies, like the new Wi-Fi 6 standard, which offers at least four times the speed of Wi-Fi 5, are catching up.
Therefore, the question is whether 5G’s most-hyped use case, factory area networks FAN, is really going to take off. Sure, very large areas like airports and harbours will benefit from 5G, but factories might be tempted to cover them with Wi-Fi. Most assets might already have Wi-Fi connectivity rather than 5G connectivity, so the business case will likely not be good enough to upgrade your brownfield. ë
IoT platforms will be expected to abstract and normalise them, acting as the do-it-all Swiss Army knife
BART SCHOUW, Chief Evangelist in CTO Office, Software AG.
Drawing a line to represent your organisation’s digital perimeter is problematic since such perimeters are not expanding outward like a balloon filling up with air.
JAAFARAWI, Managing Director, Middle East, Qualys.
A recent VMware report showed 80% of security professionals had experienced increases in attack levels in their organisation because of remote work. The pandemic did not cause a spike in cyber-incidents. It caused a spike in digital transformation, which expanded the opportunities for attackers to attack.
As we start to consider our new normal as just normal, there are a few challenges we still must overcome.
Drawing a line, even in an abstract sense, to represent your organisation’s digital perimeter is deeply problematic. Such perimeters are not expanding outward like a balloon filling up with air. Entirely new balloons, such as third-party networks and the private homes of employees, are joining the environment, as well as new factory- and field-based devices that make up the rapidly expanding Internet of Things.
Remote work, for example, is a necessary component of today’s world. Hybrid environments will remain, so CISOs must form a plan of action for managing them that retains the flexibility they have added while diluting the risk they pose.
And finally, security leaders must justify budget spends. They must target areas of improvement, balancing cost with value added. They must weigh issues such as talent shortages with the pressing concerns of discovering, auditing, and securing new digital assets, from field and factory machinery and traditional endpoints to cloud environments and containerised apps.
Automation is the standout quick win for today’s embattled CISO. Assuming regional technology stakeholders have been able to assemble a security team of any size, that team is likely to be overworked in the post-Cloud Rush era. Overwhelmed by false positives and preoccupied with firefighting, these professionals, recruited for their ability to add value, are instead succumbing to alert fatigue.
Automation can be applied to several areas that are traditionally labor-intensive. It can sift through telemetry in a fraction of the time it would take a human agent to do so. And it can be put to work in asset discovery, compiling a rich and accurate inventory that gives security teams a baseline from which to understand their new environment. Next, automation can get to work on auditing discovered assets and triaging them for action, whether that is further investigation by a human resource or immediate patching of a known vulnerability.
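The discover-audit-triage flow described above can be sketched as a simple routing function: known-vulnerable assets go straight to automated patching, everything else to a human queue. The inventory entries and advisory mapping below are illustrative, not the output of a real scanner:

```python
def triage(assets, known_vulns):
    """Split discovered assets into patch-now and investigate queues.

    assets: list of dicts like {"host": ..., "software": ..., "version": ...}
    known_vulns: {(software, version): advisory_id} for known-bad versions.
    """
    patch_now, investigate = [], []
    for asset in assets:
        key = (asset["software"], asset["version"])
        if key in known_vulns:
            # Known vulnerability: candidate for immediate automated patching.
            patch_now.append({**asset, "advisory": known_vulns[key]})
        else:
            # Unknown state: flag for further investigation by a human.
            investigate.append(asset)
    return patch_now, investigate

# Illustrative inventory; CVE-2014-0160 (Heartbleed) affected OpenSSL 1.0.1f.
inventory = [
    {"host": "web-01", "software": "openssl", "version": "1.0.1f"},
    {"host": "app-02", "software": "nginx", "version": "1.25.3"},
]
vulns = {("openssl", "1.0.1f"): "CVE-2014-0160"}
urgent, review = triage(inventory, vulns)
```

In a real deployment the inventory would come from automated asset discovery and the vulnerability map from a threat-intelligence feed; the value of the pattern is that humans only ever see the smaller `review` queue.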
Visibility alone, gained by automated asset discovery, is priceless. Remote devices, cloud workloads, the activity within containers — all this and more should be transparent to CISOs and their teams. Zero Trust network architectures are also becoming popular. Other challenges, such as how to match the speed and ferocity of the attack landscape, can be met through advanced AI. Technologies like machine learning have proved themselves capable of drastically shrinking response times. They comb through lakes of data and flag threats in real time, reducing the number of false positives and further making the case for automation.
Tools are improving. Security vendors are starkly aware of the growing need for forward-looking digital strategies among their customers and have responded by raising their game once again to outwit bad actors. Cloud-oriented, container-sensitive security platforms are now capable of advanced prevention, detection, and response, including automatic asset discovery and inventory management, machine-controlled patching, and more streamlined compliance management. ë
Formjacking is appealing to cybercriminals because it is a way to extract data and stands to get a boost from hacking-as-a-service and affiliate models.
Much of the discussion about the threat landscape in 2021 has been about either ransomware or high-consequence vulnerabilities in enterprise software. Amidst all this excitement, it has been easy to overlook the risk to e-commerce from formjacking attacks such as Magecart.
In brief, formjacking is an attack type where threat actors inject a malicious script into an e-commerce site, which then captures and extracts credit card information during the payment process.
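One lightweight control against this kind of injection is auditing the scripts a checkout page actually loads against an allowlist of expected hosts. A minimal sketch using only Python's standard library; the allowlist, hosts, and page content are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical allowlist of hosts permitted to serve scripts on checkout.
ALLOWED_HOSTS = {"shop.example.com", "js.stripe.com"}

class ScriptAuditor(HTMLParser):
    """Collects external <script src> URLs whose host is not allowlisted."""
    def __init__(self):
        super().__init__()
        self.unexpected = []

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        src = dict(attrs).get("src")
        if src:
            host = urlparse(src).netloc
            if host and host not in ALLOWED_HOSTS:
                self.unexpected.append(src)

# An injected script posing as an analytics tag, alongside a legitimate one.
checkout_page = """
<script src="https://js.stripe.com/v3/"></script>
<script src="https://cdn.attacker.example/ga.js"></script>
"""
auditor = ScriptAuditor()
auditor.feed(checkout_page)
# auditor.unexpected now lists the scripts worth investigating
```

This only catches externally sourced scripts, so in practice it would sit alongside controls such as a Content-Security-Policy header and Subresource Integrity hashes, but it illustrates the basic situational awareness the article calls for.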
While attackers have been using this technique against a progressively broader profile of targets than strictly e-commerce sites, including public utilities and professional organisations, e-commerce is still the prime target because that is where the easiest money is. In 2020, more than half of retail data breaches in the US were attributable to formjacking attacks.
One significant factor that makes them so popular is the potential to inject a script once against an outsourced payment platform provider, and have it served to all of the target’s customers, harvesting their credentials. This extracts a huge amount of value for little work, a variation on the supply chain attack approach that has caused so many headaches in the last few years.
Formjacking is appealing to cybercriminals because it is a way to quickly extract valuable and easily saleable data. Now, however, it stands to get an additional boost from the growing trend of hacking-as-a-service and affiliate models. Most of the attention on this model is currently focused on Ransomware-as-a-service. This approach lets ransomware developers focus on development and outsource every other part of the attack to affiliates, who often rent the malware. And this trend is not limited to ransomware.
A growing number of intelligence sources indicate that specialisation and division of labor in the attacker community is becoming more common and more intensive, meaning that there are now separate experts focusing on specific areas such as gaining an initial foothold, establishing persistence, evading detection, and so on, whose services are all for sale.
This is particularly important for formjacking because the greatest degree of variation between Magecart variants lies in the methods of initial entry and detection evasion. For now, each formjacking threat actor needs to figure out every aspect of their own vector.
This means that attack chains that are powerful at one stage might be hamstrung by limitations at another stage. For example, clever injection techniques might be undone by the noisy extraction of information or an unsuccessful attempt to masquerade as a Google Analytics or Recaptcha script.
As specialisation and division of labor accelerate in the attacker community, there is potential for each attack to feature the best practices in each phase of the attack chain.
Formjacking is not the single most prevalent attack technique around, nor is it the most devastating. However, in terms of a coherent pattern between attacker, technique, and target, it is one of the most clear and focused. Attackers prefer payment cards to other kinds of stolen data because they are easily monetised.
We probably cannot collectively reduce the risk of formjacking to zero, but with some situational awareness and a few extra controls, you can at least determine what happens on your own e-commerce site, which is, of course, a fine place to start. ë
Intelligence sources indicate specialisation and division of labor in the attacker community is becoming common
OPSWAT, ondeso and EMT in association with the Global CIO Forum organised the Protect Your Critical Infrastructure virtual summit on 26th January 2022. The conference focused on critical infrastructure that is vital to a company or country and whose incapacity would have a disastrous impact.
In the IT world, critical infrastructure is often referred to as operational technology, the practice of using hardware and software to control industrial equipment.
For the past couple of years, threat actors have targeted organisations in energy, oil and gas and utility sectors. Cyberattacks on critical infrastructure have become increasingly complex and disruptive, causing systems to shut down.
ICS environments can also serve as a gateway into enterprises and governments, which frequently maintain sensitive data as well as classified security information. Simply put, it is because of such high stakes that critical infrastructure organisations need an abundance of qualified, highly skilled cybersecurity professionals to help identify, mitigate, and remediate threats of all types.
Critical infrastructure security is highly important in protecting systems and services that are essential to society and the economy.
Nandini Sapru, Vice President of Sales, EMT welcomed all the speakers and attendees. “It is what makes a company’s data or network so critical, that in case of a calamity a lot of people and a lot of data and a lot of situations would be affected.”
Fawad Laiq, Senior Technical Manager, emt Distribution outlined the challenges faced in OT networks and said, “Many of the OT networks have a lot of legacy systems. Their updates, patching and management is not that easy because of the restrictions, including physical and even the logical restrictions, which we have in place.”
Oren Dvoskin, Vice-President of OT and Industry Marketing, OPSWAT talked about 2020-2021 OT ICS Security trends and OPSWAT OT, Industrial Cybersecurity, OPSWAT Netwall, and Unidirectional file transfers.
Vincent Turmel, Senior Director of OT Product Sales Engineering, OPSWAT highlighted a few essential points including Unidirectional file transfers, OT ICS data replication, Data Centre Security, a case study on electricity distribution, and OPC UA Replication.
Peter Lukesch, COO, ondeso also highlighted the hidden potential of strategic IT Management in industrial environments, avoiding vulnerabilities, improving reliability and quality, major errors of operational technology and information technology solutions.
Arun Shankar, Editor, GEC Media Group conducted a question-and-answer session with Oren Dvoskin, Vincent Turmel, Nandini Sapru, and Peter Lukesch.
The event concluded with a quiz for the attendees, in which winner Megha Arora took home an iPhone 13.
Proven Robotics announced the launch of its first robotics and technical service centre in Riyadh, Saudi Arabia. The new service centre will help customers to enhance strategic sales and achieve their technical goals, while benefiting from end-to-end local support and expertise in robotics and advanced technologies.
The new facility builds on Proven Robotics’ reputation of delivering efficient and innovative robotics solutions. It will offer a wide range of services including providing customers with original spare parts, onsite troubleshooting, in-house maintenance from qualified and technically certified teams, as well as the installation and configuration of robots.
Newly launched facility will enable shorter response times for customers and empower them to meet digital transformation goals, while deepening advanced technology expertise within the Kingdom
As a first-of-its-kind service centre for Proven Robotics, the new facility strengthens the company’s operations within the region at a time when demand for advanced technologies is on the rise worldwide.
Khazna Datacentres, the UAE’s largest data centre provider, achieved the Uptime Institute’s Tier III certification of Constructed Facility for its Apollo 3 and Apollo 4 Datacentres.
In meeting today’s data centre demands, innovative requirements come in the form of speed, higher density, modularity, energy efficiency, sustainability, and scalability, whilst remaining secure and highly available. The wholesale model adopted by Khazna Datacentres addresses enterprise data centre requirements by presenting highly secure, ultra-modern wholesale datacentres that are fully equipped with the latest technologies. These may be customised and scaled as customers grow, allowing for faster time to market.
With the increased digitalisation of processes, along with a growing demand for operational readiness, agility and availability of IT systems and infrastructure, datacentres are becoming more critical to accommodate these developments. The Apollo 3 and Apollo 4 datacentres, which form an integral part of the 108MW data centre, focus on improving efficiencies in design and eliminating anomalies, ensuring adherence to best practices in data centre built environments and implementing effective controls to protect critical infrastructure from environmental hazards and human errors.
Khazna Datacentres is setting up efficient design and delivery at a quicker pace, halving the time it takes to get a data centre operational compared to traditional construction methods. Khazna Datacentres continues to empower customers and partners to accelerate their digital transformation journeys, re-imagine new ways of working, and optimise operations.
Uptime Institute’s Tier III accreditation was achieved after a diligent assessment and evaluation by expert teams from Uptime Institute. This certification assesses data centre reliability, availability, maintainability, and overall performance needed to provide continuous and efficient operations.
StarLink announced a distribution partnership with Anomali, the leader in intelligence-driven extended detection and response cybersecurity solutions. By adding Anomali to its portfolio, joint customers’ security teams will gain in-depth visibility over all threats, automate threat blocking, and enable faster response times.
This new partnership will provide organisations across MEA with access to the award-winning Anomali portfolio, an innovative suite of products that leverages global intelligence to empower security teams with the precision attack detection and optimised response needed to stop immediate and future breaches and attackers.
Included in the offering are Anomali ThreatStream, a leading threat management platform; Lens, a Natural Language Processing NLP extension that identifies all threats in any web content to operationalise it across security infrastructures; and Match, an advanced XDR attack detection and response solution that quickly identifies and responds to threats in real-time by automatically correlating all security telemetry data against active threat intelligence.
Enova announced its expansion into Turkey with the signing of its latest contract with Munzur Su. The expansion will see Enova securing new contracts and opening a new office as a part of its growing regional operations.
Enova’s expansion follows a successful year, with new EPC, Solar, and FM contracts including a long-term commitment with Dubai Metro aligning with Road & Transport Authority RTA and public health regulations. Enova has also signed with TECOM Group – a member of Dubai Holding, and Tarshid in Saudi Arabia along with existing contract renewals. Enova’s regional growth has been accelerated by the increased adoption of the energy services company model in the GCC and beyond, with companies financing their energy management strategy through the savings that they achieve.
Enova specialises in providing its clients with comprehensive and performance-based energy and facilities management solutions, paving the way to achieve financial, operational, and environmental targets. The company functions as an O&M provider, acting as a key enabler of energy efficiency in the industrial sector.
StarLink, with its unique GTM strategy, regional on-ground and technical expertise, as well as its extensive partner network, will help build a robust market expansion plan for Anomali in the region to meet customers’ growing cybersecurity requirements as well as enhance its market presence.
Quantum is partnering with value-added distributor Mobius to deliver its security and video surveillance infrastructure portfolio across the Middle East and Africa regions.
The partnership allows Mobius, a distributor within the video surveillance market in MEA, to now offer customers Quantum’s entire security infrastructure portfolio, from high-performance NVRs, to hyperconverged infrastructure, to the largest shared storage and archive solutions, and analytics processing. Customers looking to build or upgrade their CCTV and physical security systems across the region can now access Quantum’s flagship video surveillance offerings, including:
l VS-NVR Series Network Video Recording Servers
l VS-HCI Series Surveillance Recording Appliances
l VS1110-A Application Server
l VS2108-A Analytics Server
Equinix announced its expansion into Africa through its intended acquisition of MainOne, a leading West African data centre and connectivity solutions provider with presence in Nigeria, Ghana and Côte d’Ivoire. The acquisition is expected to close in Q1 2022, subject to the satisfaction of customary closing conditions including the requisite regulatory approvals.
The transaction has an enterprise value of $320M and is expected to be AFFO accretive upon close, excluding integration costs, marking the first step in Equinix’s long-term strategy to become a leading African carrier neutral digital infrastructure company. With more than 200 million people, Nigeria is Africa’s largest economy and, along with Ghana, has become an established data centre hub. This makes the acquisition a pivotal entry point for Equinix into the continent.
Equinix believes MainOne to be one of the most exciting technology businesses to emerge from Africa. Founded by Funke Opeke in 2010, the company has enabled connectivity for the business community of Nigeria and now has digital infrastructure assets including three operational datacentres, with an additional facility under construction expected to open in Q1 2022. Upon closing, these facilities will add more than 64,000 gross square feet of space to Platform Equinix, with 570,000 square feet of land for future expansions.
MainOne owns and operates a subsea network from Nigeria to Portugal, as well as 1,200 kilometres of reliable terrestrial fibre network across southern Nigeria, all of which improve connectivity to and from Europe, West African countries and the major business communities in Nigeria. When completed, this acquisition will extend Platform Equinix into West Africa, giving organisations based inside and outside of Africa access to one of the world’s fastest growing markets.
Tata Consultancy Services launched a pioneering All-Women Innovation Lab in Riyadh, to collaborate with startups and universities to provide a platform for students to explore and innovate with new technologies. The new innovation lab promotes digital skill development and supports Kingdom of Saudi Arabia’s Vision 2030.
The inaugural ceremony was chaired by Dr Ahmed Althnayyan, Deputy Minister for Future Jobs and Digital Entrepreneurship, Ministry of Communication and Information Technology, Saudi Arabia, and attended by dignitaries from Saudi ministries and semi-government agencies, as well as CXOs and directors from top Saudi banking and telecom companies.
The TCS All-Women Centre is a unique business model developed to serve customer needs and help local communities.
It harnesses the best of TCS’ expertise to train and develop professional capabilities in Saudi women to support them in pursuing fulfilling careers and realising their potential in the Kingdom.
The centre provides long term career opportunities in the fields of finance and accounting, human resource operations, supply chain management, IT and digital related services and helps customers across the globe. Recently, it won the Bronze Trophy at the King Khalid Sustainability Awards 2021 for ensuring global sustainability standards and practices.
Siemon hosted an African channel and service provider partner event at the stunning Southern Palm Beach Resort in Mombasa, Kenya. The three-day event took place in early December and provided an opportunity for Siemon and its longstanding distribution partner, Mart Networks Group, to reconnect with all of their partner companies and thank them for their continual hard work during these challenging past two years.
Representatives from partners across the continent followed the invitation to beautiful Diani Beach at the edge of the Indian Ocean to meet in a relaxed environment, strengthen business relations and have an opportunity to learn more about Siemon’s innovative IT infrastructure portfolio.
The meeting agenda included a conference, an excursion to Kisite Marine Park and Wasini Island where there were plenty of opportunities to watch dolphins and snorkel in the pristine waters, followed by a Kenyan barbeque dinner.
At the evening conference Siemon’s global intelligent building solutions specialist Bob Allan shared Siemon’s knowledge on the latest technology trends impacting intelligent building infrastructure and the company’s solutions portfolio for smart buildings. He was followed by Gary Bernstein, Siemon’s global data centre solutions specialist, who discussed key infrastructure considerations, including single-mode and multimode fibre cabling options, as enterprise data centres look to adopt next-generation speeds.
The event also celebrated the successes that Siemon and its long-standing channel partners are enjoying in Africa. As the continent continues to experience broad investment in the data centre sector, Siemon has firmly established itself as a leading provider of quality data centre design services and cutting-edge IT infrastructure solutions within the finance, education and colocation data centre markets. The company is set to grow its footprint in the region in 2022 as the growth in internet penetration and a rising demand for cloud-based services will result in further data centre development.
CyberKnight announced its financial results for FY2021. In its second year of operation, CyberKnight processed more than $29 million in orders, realising a revenue increase of more than 200% over the previous year.
This notable achievement and the rapid growth are the result of substantial investment into high-caliber business development and technical local teams across the Middle East, a keen focus on expanding the breadth of the channel partner ecosystem, successfully delivering extensive high-ROI marketing activities, maintaining strategic customer relationships, and effectively building out a market-leading Zero Trust Security focused portfolio offering.
A10 Networks announced significant ongoing success achieved by its channel programme in 2021, with 23 new strategic partners signed up in the last year and plans to further develop channel initiatives in 2022.
At the start of 2021, A10 Networks refocused its five key strategic channel pillars encompassing building ecosystems, channel enablement, lead generation activities, deal registration and working with distribution. At the end of 2021, A10 Networks signed up 23 new partners as a result of this focus on its channel, which now comprises over 80 partners and 30 distributors.
Furthermore, A10 Networks continues to work with strategic alliance partners, Dell and Ericsson. These alliances will be a key focus in 2022 driving combined technology solutions that deliver better business outcomes for customers. New business development initiatives are underway within the distribution community with joint-funded resources assigned in territories such as the Middle East, UK&I, Benelux, DACH, Africa and Scandinavia working with distributors such as Exertis, Ingram, Netex, V-Valley, 2SB, MUK and others.
A10 Networks also launched its new Affinity Technical Ambassador programme in EMEA in 2021 which is gaining great traction with strong technical collaborations within key partners underway to harness and enrich the knowledge level of partners. The company has also focused on taking partners on a progressive journey and its Elevate to Elite initiative has been successfully enabling partners to make the transition to Elite partner status.
CHRIS MARTIN, Channel Sales Leader for EMEA and APAC at A10 Networks.
Liferay announced that Link Development, a global provider of technology solutions and Liferay EMEA partner for close to two years now, has achieved multi-country Platinum Partner status, Liferay’s highest partnership level.
The distinction highlights Link Development’s know-how in consulting, partnering and delivering digital experiences for enterprises, for which it also won the Liferay Platinum Partner Award for outstanding performance. Joining this exclusive list brings further proof of its expertise in the digital experience market. The strategic alliance has exceeded the objectives of both organisations across Egypt, United Arab Emirates, Saudi Arabia, Bahrain, Kuwait, and Oman, as well as Canada and the United States.
Liferay’s DXP enables a unified and optimised experience across several platforms and scales up on the cloud for increased agility
and flexibility to drive digital transformation across verticals that include government, telecommunications, insurance, banking and retail, amongst others. It also empowers entities with seamless, creative customer experiences across various touchpoints and plays a crucial role in preventing data breaches and securing customers’ data.
On this occasion, Ahmed Saad, Alliance Manager at Liferay Middle East, expressed his delight in elevating the cooperation between Liferay and Link Development to a new level, highlighting mutual trust and paving the way for future collaborative accomplishments.
Finesse entered into an agreement with Barracuda to provide its customers with easy-to-deploy cloud security solutions. Through this agreement, Finesse’s expertise in digital transformation will be complemented by the power and simplicity of Barracuda’s cybersecurity solutions portfolio, enabling Finesse customers to reduce cybersecurity risk on their digital journeys.
Finesse delivers digital transformation solutions to more than 350 customers in industries like BFSI, healthcare, energy, education, and government. Barracuda offers enterprise-grade security solutions that protect employees, customers, their data and applications in the cloud from a wide range of threats through easy, comprehensive, and affordable solutions for email protection, application and
cloud security, network security, and data protection. Through Barracuda’s portfolio of products, Finesse will offer regional customers the ability to visualise, segment, and protect their cloud security structure, as well as respond seamlessly to zero-day threats by integrating Barracuda solutions into their existing security ecosystem.
Dubai Internet City announced a deal with Khazna Datacentres, one of the largest wholesale data centre providers in the Middle East and North Africa, to establish two state-of-the-art facilities to further support enterprises of all sizes by enabling the integration of technology across all business functions.
The government’s ambition of transforming the UAE into a smart country necessitates the deployment of secure cloud infrastructure and data storage across all industries, from government and residential services to healthcare and manufacturing. Combined with many businesses adopting hybrid work models on the heels of the global pandemic, the volume of digital data has increased significantly in recent years.
National strategies that encourage the development and deployment of IoT, AI and cloud computing are also increasing the demand for reliable storage. The growing reliance on the internet has elevated data security to a global priority, especially in relation to sensitive data generated by the healthcare and financial sectors. Wholesale storage providers such as Khazna will play a critical
role in catering to the recording, movement and security of increasingly large amounts of information. As digital adoption sweeps through the greater MENA region, the opening of data privacy and storage centres can enhance business activity.
The two new data centres will be strategically located in Dubai and will feature Khazna’s unique data centre pod design, which allows for the rapid scaling of operations when required. This offers partners long-term growth benefits, while businesses involved in 5G, smart city projects and cloud computing can also leverage Khazna’s technology.
Khazna Datacentres will operate a total of fourteen data centres, making it the UAE’s largest data centre provider.
Tata Communications and Zain KSA announced they have entered a strategic engagement to fuel digital transformation
journeys of enterprises and government organisations in Saudi Arabia. With this collaboration, the combined ecosystems will
deliver solutions and platforms to remodel cities with smart street lighting, smart waste management, connected workplace, healthcare and connected cars.
In a flagship project, Tata Communications and Zain KSA are working together to bring a smart street lighting solution to one of the key cities in KSA.
Tata Communications’ IoT ecosystem will serve as a one-stop-shop providing the hardware, platform, application and insights, while Zain KSA will expand the footprint with its business-to-business, B2B, offerings through joint projects related to software-defined wide area network, SD-WAN, and global contact centres, as well as the application of smart transport and Internet of Things, IoT, solutions enabling smart waste handling, smart metering and other smart city use cases, to name a few.
The Tata Communications and Zain KSA strategic engagement will serve Saudi’s enterprises and government institutions with advanced technologies such as IoT, 5G, Long Range Wide Area Network, Managed Security Services, SD-WAN and many others. It will also support environmental sustainability measures and the digital transformation of the region.
Top executives indicate switching to a burner may not be sufficient, and it is equally important to avoid accessing personal and organisational accounts.
The FBI has notified Olympic athletes to leave their personal cell phones at home and carry a burner phone to the Beijing Winter Olympics, citing the potential for malicious cyber activities and advising athletes to use a temporary phone while at the games. According to the FBI, there is no country that presents a broader threat to US ideas, innovation, and economic security than China. US intelligence officials have warned that officials and members of business and academia who travel to China can face possible risk of their personal devices getting hacked. While US athletes are allowed to compete, the Biden administration is not sending government officials to the games.
The National Olympic Committees in some Western countries are also advising their athletes to leave personal devices at home or use temporary phones due to cybersecurity concerns at the Games.
Top executives of the industry share their opinions on this advisory, also indicating that just switching to a burner may not be sufficient. It is equally important to avoid accessing personal and organisational credentials and accounts, even while using the burner phone. Transfer of data from the burner phone back to a primary device is best done through an intermediate account and device as well.
Read on for a deep dive on this subject.
Using a burner phone limits the exposure of the device to the trip itself. Assuming the burner is activated for the trip and discarded or even destroyed either before leaving China or on arrival in the USA, the opportunity for significant compromise is severely reduced.
There may be opportunity to monitor calls, texts and internet activity while the phone is in use within China, but this activity can be limited through good education on the risks for those travelling. Using a long-term device may result in the compromise remaining in place when the person returns to the USA, a far more serious concern and almost certainly the basis for the FBI statement.
The real objective of any compromise of a device would likely be to establish a method of access that remains available once the device leaves the country. Likely malware or spyware would be an unobtrusive monitoring and/or remote-access tool that could allow attackers access to the device wherever it travels. The biggest benefit for the malicious actors in this scenario is full control over the infrastructure from where the attack originates.
There is no need to compromise telcos, masts or Wi-Fi, as these are all within their sphere of control. This should make eavesdropping, while in country, relatively easy but also provide far more opportunity to probe for vulnerabilities without triggering infrastructure monitoring that exists outside of the country borders. That alone could provide opportunity to discover vulnerabilities that could be exploited long after the person leaves the country, avoiding checks on devices when the individuals arrive back in the USA.
With successful athletes potentially being invited into secure spaces, for example, The White House, those devices might offer a beachhead in terms of a continued attack. Not only that, being able to track athletes’ movements, conversations, messages, and potentially even compromise their encrypted social media feeds long-term, potentially offers opportunities to coerce individuals. High-profile individuals will always represent a richer target for attackers.
Our smartphones and tablets are full of sensitive personal data that we would not want anyone to have their eyes on or potentially steal. In the case of Olympic athletes, they might have photos of sensitive health documents or passports saved as a backup on their mobile devices.
That being said, regardless of whether athletes and press are using burner phones or not, they should be incredibly wary of any individual, app, or message that encourages them to share login credentials because the risk of being phished on mobile exists regardless of the type of device or operating system.
Furthermore, apps could easily be running malware in the background, especially if they are not being downloaded from a trusted source like the App Store or Play Store.
Also, there are concerns about the official Olympics application, so Lookout researchers took a look at the app, and found that it requires the user to enter some PII such as demographic information, passport information, travel and medical history. There also appears to be a list of forbidden words for censorship purposes.
The app also has a chat feature as well as file transfer capabilities between users. Considering the likelihood that the Chinese government could be monitoring all of this data, users should not use the app for anything more than the bare minimum. By the same token, they should enter as little information as they’re required to.
There is an identity angle here for us as well. It is all well and good using burner phones but if these are used to sign into accounts then there is an opportunity for device compromise to lead to identity compromise. A device these days can be temporary and easily replaced however a user’s digital identity is far more permanent.
As such it is as important, if not more, to protect the users’ identity and access as it is not the mobile device that magically grants access to data but the identities and the access these allow.
High-profile individuals will always represent a richer target for attackers
As with many high-profile international events, the Olympic Games generate a spike in economic activity and press coverage in the host nation, which we have seen attract the attention of cyber threat actors in the past.
With the Winter Olympics just around the corner, Mandiant has historically seen the Games attract the attention of cyber threat actors, but with them taking place in China this year, there are a few additional things to consider – whether you are attending, or part of an organisation with ties to the event.
Based on our understanding of threat activity surrounding previous Olympics, this activity could be in the form of nation-state actors and information operations campaigns using the media attention to embarrass rivals through hack-and-leak campaigns, website defacements or other disinformation.
We have also seen financial criminal actors capitalise on events like this to exploit increased tourism and local spending or use Olympic-themed subject lures in their malware campaigns targeting the public.
With the event in China this year, known to be one of the big four nations when it comes to cyber activity, we could also see reconnaissance activities on devices brought into the country by visitors. It is important to be aware that cyber activity could target athletes, officials and visitors, but also the different businesses that support the Olympics too, whether that is in industries like hospitality, telecoms or providing a sponsorship deal.
Leave your personal devices at home and take burner devices if you need to – ones that you will only use while visiting and replace afterwards. Secure these devices and accounts you will access with strong passwords.
Use a VPN at all times and enable multi-factor authentication wherever possible. Avoid accessing social media and banking if at all possible – pick up the phone and make a call for anything that requires credentials.
Remember your connections. It is not just about protecting yourself, but also organisations you’re linked with.
DAVID BROWN, Security Operations Director, Axon Technologies
In the past, where I come from, there was always an urgent demand for carrying secondary devices when traveling to certain countries that do not provide the same level of personal device ownership or have institutionalised censorship. However, carrying a second device also means that it must come with a secondary account.
At the end of the travel, the device and accounts used are scrubbed. Even then, at no point should these devices or accounts be used to access any sensitive data or systems such as banking or primary account servers like email or storage.
To retrieve data like photos once home, it is recommended to use the second account to log into its web storage and download it to an intermediated system for scanning before uploading it to the primary account.
It is not about malware or spyware; where institutionalised censorship is the law, they have lawful interception. They can see, record, and alter inline the data you send or receive. There is no need to push malware when they have the right to access anything on the device and everything in transmission. This interception extends to legitimate apps.
Furthermore, since they are running gateway proxies, both transparent and the Chinese firewall, they also control the end-to-end encryption of all transactions, so again, there is no need to push anything. When you leave and return home, they could retain access to your accounts and services as needed. At that point, they could push backdoors and infostealers to ensure access is maintained.
There is no reason why China would overly target average US sports citizens over any other global sports citizens; this is fearmongering. The truth is that targeting will be widespread for average citizens, most likely via natural language processing keywords, standard practices already in use. There will then be a list of high-priority targets of people of interest that will span all global sports citizens, in which real-time or near-time targeting is most likely with NLP.
Everyone should control their speech; you do not have freedom there. Mind what you say, always assume someone can hear everything. On top of that, we all know that there are hot button topics in China, so it is best to avoid discussing them. As the old saying goes, if you have nothing nice to say, say nothing at all.
It is not just about protecting yourself, but also organisations you’re linked with
DAVID BROWN, Security Operations Director, Axon Technologies
CRISTIANA KITTNER, Principal Analyst, Mandiant Threat Intelligence
The truth is that targeting will be widespread for average citizens, most likely via natural language processing
• With successful athletes potentially being invited into secure spaces, for example, The White House, those devices might offer a beachhead, in terms of a continued attack.
• High-profile individuals will always represent a richer target for attackers.
• Considering the likelihood that the Chinese government could be monitoring this data, users should not use the app for anything more than the bare minimum.
• Regardless of whether athletes are using burner phones or not, they should be incredibly wary of any message that encourages them to share login credentials.
• It is not just about protecting yourself, but also organisations you’re linked with.
• Avoid accessing social media and banking if at all possible.
• Pick up the phone and make a call for anything that requires credentials. Remember your connections.
• There is no reason why China would overly target average US sports citizens over any other global sports citizens; this is fearmongering.
• The truth is that targeting will be widespread for average citizens, most likely via natural language processing.
• There will then be a list of high-priority targets of people of interest that will span all global sports citizens.
• Consider, if you will, how many people will be in local proximity at different times of the games.
• With the current global tensions occurring around the world, you have to consider what the data on each device is worth.
• You need to also consider what that individual athlete or delegation representative is worth.
• A device these days can be temporary and easily replaced, however a user’s digital identity is far more permanent.
• While travelling to many countries, not just China, you should consider the same practice.
• When going through any border crossing, most authorities have the right to check your electronic devices and possibly clone them.
• While China is top of mind, you should always consider what other countries’ laws could result in.
• A very sophisticated culture of surveillance and censorship exists in China.
• Within China, authorities can request access to and access any data being transmitted.
• The official Olympic games application has been shown to have significant vulnerabilities.
Any event, such as the Olympics, draws high volumes of people which inherently means more opportunity for cyber adversaries. At the most basic level, using a cheap burner phone means that if the phone is lost or stolen, the impact to the owner is reduced.
Taking it a step further, one aspect that we have sadly seen grow in recent years is ransomware being used by bad actors to analyse the data on a device and use it either for profit or to blackmail the victim. It is unlikely that athletes would have state secrets on their phones that would be of value, but it is likely they may have personal information that they would be embarrassed for others to know, which as high-profile athletes could make them susceptible to coercion.
If a nation state is serious about compromising devices, it is likely they would be using zero-day attacks, threats that are not detected by common security tools. Today, most people do not see their mobile phone or tablets as a risk and so many have very weak security; easy to guess passwords, no anti-threat controls and are likely to click on anything that pops up.
If hackers attempt to compromise the mobile phones or tablets used by athletes or the traveling delegation from any nation in Beijing, there is a high likelihood they will be trying to install spyware and remote access trojans; software that allows the device to be controlled by third parties.
Consider if you will, how many people will be in a local proximity at different times of the games, country team meetings, key ceremonies. If hackers can compromise one device using the communications on that device, such as Bluetooth or other local peer-to-peer capabilities, they could analyse and compromise many other devices at the same time.
I do not anticipate US athletes being the only nation targeted. In fact, all nations are at risk. However, with the current global tensions occurring around the world, you have to consider what is the data on each device worth, which is loosely linked to the net-worth of the individual. But more importantly, you need to also consider what is that individual athlete or delegation representative worth and that is where the nationality of the device owner is key.
It is also important that people understand that a burner phone does not just mean a second phone: it means a phone that you can easily wipe clean, that cannot be personally traced to you, and from which you do not access any sensitive data, so you would simply use an encrypted messaging service that is temporary for that period of time.
Honestly, while travelling to many countries, not just China, you should consider the same practice, as when going through any border crossing most authorities have the right to check your electronic devices and possibly clone them. So yes, while China is top of mind right now, you should always consider what other countries’ laws could result in the same compromise and risks.
When inside a controlled network, anything you do online is already filtered and might not be the official website you think it is. So, it is very easy to insert malware that could steal credentials and passwords, exfiltrate sensitive data, steal identities, embed backdoor agents and much more.
It is not only a risk for US sports citizens but for all citizens from around the world, who should be vigilant and cautious when bringing devices that contain sensitive data or could be used later by attackers to gain persistent access long after the Olympics is over. The Olympics is the perfect venue to be able to infect as many people as possible.
It really depends on what risks your personal cell phone could expose, so before deciding to use a burner phone you should really understand what risks you are trying to reduce. If your cell phone is an extension of everything you do, aka your complete digital life, from health data, personal information, financial information and election voting to business data, then you should really consider: if your phone gets fully compromised, what would the impact be to you personally and your company?
A very sophisticated culture of surveillance and censorship exists in China, and it is worth noting that Chinese laws differ markedly from Western ones. Within China, authorities can request access to and access any data being transmitted within its geographical territory.
The official Olympic games application, which all athletes and officials are required to use, has been shown to have significant vulnerabilities that if exploited could lead to data on the handset being compromised.
Chinese authorities are deeply concerned about protecting China’s image both at home and overseas which has led them to become the world’s leading advanced digital authoritarian state. The Olympic Games application has been shown to contain censorship capabilities which are designed to safeguard the official State narrative and ensure China is perceived in a positive light.
Upon their arrival, athletes should expect to have their phones voice and data intercepted. There are a number of national security laws which are enforced that compel all communications companies to provide the information to the state’s intelligence and security services upon request.
A very sophisticated culture of surveillance and censorship exists in China
Top executives share their industry and technology forecasts for 2022 and ahead.
Executives from Alteryx discuss trends across democratisation of data, skills and the great resignation, artificial intelligence and machine learning.
Digital Transformation 2.0 will usher in a culture of analytics across business units as larger enterprises provide the self-service technologies and training to ensure the average knowledge worker is set up for success and able to directly perform analytics.
2022 will be the year of the Chief Transformation Officer. We will see a title and focus shift from Chief Data Officer to Chief Analytic Officer to Chief Transformation Officer, as the role of those leading the digital transformation journey focuses more on the results than the data or the analytic methods used.
As people move from company to company, we will see beloved technologies travel with them and become an established part of their new stack.
With the continued democratisation of analytics, data scientists need to evolve from problem solvers to teachers. Organisations are now looking to fill these roles with someone who can articulate and explain, not just code, to encourage people to be creative and think critically. However, there is an existing skills gap between data scientists as practitioners and those as teachers.
We will see the rise of data trusts and frameworks evolve, and organisations will shift their mindsets to sharing rather than data hoarding. We will see increasing use of synthetic data, differential privacy and other techniques to ensure security, privacy and legitimate use of data.
The digital world needs to ditch paper. Many organisations are still working from printed documents leaving pertinent data on the table that needs to be extracted. Getting the data out of paper has been difficult to date, but with computer vision and text analytics capabilities, organisations can extract insights from shipping invoices, paper records, receipts, etc.
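As a minimal sketch of the text-analytics step this describes, assuming the OCR stage has already turned a scanned shipping invoice into plain text, a few regular expressions can pull structured fields out of it (the field names and patterns below are illustrative assumptions, not a standard schema):

```python
import re

def extract_invoice_fields(ocr_text: str) -> dict:
    """Pull a few common fields out of OCR'd invoice text with regexes."""
    fields = {}
    # Invoice number, e.g. "Invoice No: INV-2041" (hypothetical format)
    m = re.search(r"Invoice\s*(?:No\.?|Number)[:\s]+([A-Z0-9-]+)", ocr_text, re.I)
    if m:
        fields["invoice_number"] = m.group(1)
    # Total amount, e.g. "Total: $1,249.50"
    m = re.search(r"Total[:\s]+\$?([\d,]+\.\d{2})", ocr_text, re.I)
    if m:
        fields["total"] = float(m.group(1).replace(",", ""))
    return fields

sample = "ACME Shipping\nInvoice No: INV-2041\nTotal: $1,249.50"
print(extract_invoice_fields(sample))  # {'invoice_number': 'INV-2041', 'total': 1249.5}
```

In practice the regex layer would sit behind a computer vision/OCR engine and be replaced or supplemented by trained extraction models, but the principle is the same: once the paper becomes text, the insights become queryable.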
The role of the citizen data scientist will evolve. Organisations will focus more on the relationship between people and AI, leading to increased spend on upskilling people as data literacy evolves into AI literacy.
We will move away from the term citizen data scientist and towards AI or analytics literate. Businesses will become more dependent on collective intelligence, the idea that better business decisions can be made by machines and humans working together.
More responsible AI will bridge the gap from design to innovation. While companies are starting to think about and discuss AI ethics, their actions are nascent, but within the next year we will see an event that will force companies to be more serious about AI ethics. An increasing number of companies will get more serious about AI ethics with transparent explainability, governance and trustworthiness at the centre.
AI becomes demystified and more approachable for the everyday business user. No-code and low-code will simplify and democratise AI – although data scientists will continue to focus on high value problems, the number of people who are able to participate in advanced analytics utilising automation, computer vision, natural language processing, and machine learning will increase. More companies will invest in AI-driven automated insights to complement their existing dashboards.
No-code and low-code will simplify and democratise
Historical data and representations are not enough for successful decision making and predictive intelligence needs to be blended into the process.
Even with heads of state coming to an agreement on sustainability requirements, it will largely fall on individual companies to enforce these standards upon themselves. While many organisations have already expressed, and even implemented, plans to reduce or eliminate carbon emissions, many have yet to adopt any strategy to make both immediate and long-term impacts.
Without a unified, standardised pact that holds both countries and companies accountable, minimal change will be made. Until such a standardisation exists, consumers and investors are the ones most likely to force companies to make the necessary shifts, as the younger and more environmentally conscious generations continue to grow into the largest global consumer base.
Delivering information just in time, instead of in traditional dashboard forms, which look in the rear-view mirror, will be critical in 2022. Historical data and representations are not enough for successful decision making. Predictive intelligence needs to be blended into the process.
Ultimately, these insights are needed at the point of decision and action, instead of in a separate operational location. Data fabric, business intelligence, AI, machine learning, and user experience all must come together in a single solution to be meaningful.
2021 has been a crazy year for supply chain professionals. A world of people who had no idea what a supply chain was at the beginning of the year now have a much better understanding of how the goods they purchase get from one place to another. The reality remains that a lot of the transport issues manufacturers are facing in 2021 are not going away. Be ready for plenty of supply chain news in 2022 including these important trends.
As long as several global economies continue to thrive, the demand for goods and services will continue to hold transportation rates, particularly ocean, at record levels. However, with inflation rearing its head in various parts of the globe, higher prices on products may lead to a consumer slowdown, allowing manufacturers and their suppliers some breathing room to restock their supplies.
That said, the backlog of existing demand will keep ocean transport rates higher until bottlenecks drain out. As we see supply and demand begin to balance out in the second half of 2022, transportation rates should creep lower heading into 2023.
With supply shortages stretching from groceries to semiconductors, many organisations have been forced to examine ways to bring crucial components closer to the final production process to ensure history doesn’t repeat itself. With many global organisations looking to localise larger portions of their supplier base, supply chains will find themselves better equipped to handle large demand spikes as they occur.
With the global vaccination effort proceeding at a slower than anticipated pace, new variants of the COVID-19 virus will continue to drive caution and hesitation regarding traveling and fully reopening businesses. As a result, organisations will move away from just-in-time JIT inventory strategies and will bulk up inventory levels, so they can avoid production disruptions.
This also allows organisations to use supply chain finance tools to extend payment terms to suppliers using innovative finance options with lenders. Organisations can build inventory strategies that are less susceptible to disruption while allowing their supply partners to maintain healthy capital levels.
As the business world continues to transition to remote-work environments, the definition of user experience continues to change. While voice-access capabilities have been heavily
Data fabric, BI, AI, ML, user experience must all come together in a single solution to be meaningful
hyped for some time in the enterprise arena, security controls will continue to tighten, and employees will need new ways of executing work away from traditional web screens.
In 2022, we expect that users will demand nearly full operational functionality through voice-enabled devices – with digital assistants that augment and automate tasks.
As ERP systems evolve into modern Enterprise Application Platforms, EAPs, look for expanded platform definitions to provide for composability not only in cloud environments, but also across hybrid cloud and on-premises environments. Composability will be broken down further to the business process level, and not just at the application level.
This means that enterprises will need a standard operating model and platform for consistent integration, workflow, data analysis, and extensibility. Users will want to build their own processes and experiences to match their exact needs, not simply take what’s out of the box.
No two businesses are the same. Users will demand easy and simple ways to define their business interactions in a flexible system. Therefore, expect the microservices discussion to accelerate, as companies strive to build and assemble their software systems, as if designing a floor plan for a new home.
Businesses will start deploying EAPs, through which business processes not only will be assembled to match needs, but also will
be self-sustaining and corrective, based on AI and intelligence that is baked into the framework.
The actual convergence of analytics, intelligence, and user experience will enable successful, real-time decision making.
Core and edge solutions are already connected, for the most part, and edge solutions do not refer to devices anymore. This view acknowledges that some business operations still want to maintain local control on premises. Being able to navigate a true hybrid cloud, on-premises business, while not impacting productivity, will be key.
Customers will need cloud innovations in the form of machine learning, for example. At the same time, they need the ability to apply such technologies to their on-premises systems – not just to stereotypical edge devices.
Users will demand easy and simple ways to define their interactions in a flexible system
AMEL GARDNER, Vice President and General Manager, Middle East and Africa, Infor.
Energy network operators will be stretched as they are asked to perform the magic trick of increasing supply while decommissioning fossil fuel plants.
For data and the datacentre industry, the pandemic disruption was also a major catalyst for accelerated digitalisation. Thankfully, most of the technology needed during the crisis was already in existence, supported by datacentre and telecoms infrastructure.
According to a PwC survey, 67% of Middle East consumers think they have become more digital, compared to the global average of 51%, with the highest percentage being in Egypt at 72%. This can be linked to governments moving towards smart cities.
The aforementioned crisis drove the rapid adoption of these technologies and sped up developments which were already underway. But what is most significant is that this change is likely to be irreversible. When you remove a catalyst, the reactions it caused do not reverse themselves. The increased reliance on datacentres, and by extension the telecoms infrastructure which connects us to them, is here to stay.
However, there are serious associated issues with this. A decades-long efficiency drive, which held datacentres to steady demand levels while processing much more, has run out of headroom.
Our economy and society have gone full throttle on data, exactly at the time when we need to put the brakes on energy consumption if we’re to combat climate change. There are no megabits without megawatts, and as we demand and produce more and more data, energy consumption levels will rise.
As the demand for electrical energy is set to soar, datacentre operators will face tough challenges in accessing scarce, new energy production.
The solution is to ramp up renewable energy production, not only to meet new demand, but to also displace current fossil-based production. So, it is not just the datacentre industry facing challenges. Energy network operators themselves will be stretched as they are asked to perform the magic trick of increasing supply while simultaneously decommissioning fossil fuel plants.
As such the challenge for datacentres will no longer be one of efficiency, but one of sustainability. New metrics, new approaches to datacentre design and operations will fall under greater scrutiny, as will the energy consumed by the overall
telecom infrastructure which has an energy requirement many times that of the datacentre industry.
We rely on data, data relies on power, and a significant gap between our wants and needs will soon emerge. On one side this appears as a crisis. However, on the other side, this will be the kind of gap that will attract serious investment and innovation. For the grid, this gap will enable new and existing private ventures to build out the renewable power we desperately need.
A seller’s market for power supply opens the door to new approaches and new models. For datacentres, it will solidify the economic case for a new relationship with power, not just as consumers but as sites which support the grid with energy services, storage and even power generation.
Data and power will realign, and in some cases that alignment will soon become a physical proximity, too. With economics and policy beginning to align in this manner, there is a case for datacentres to offer not just frequency response, but also to move into direct flexible supply to the grid. Sector coupling, then, could become one of 2022’s major headlines for the datacentre sector. ë
According to a PwC survey, 67% of Middle East consumers think they have become more digital
Once you understand your attack surface, you can create a threat landscape and threat profiles linked to cyber threat intelligence services.
The cybersecurity industry saw some key trends emerge from defenders and attackers in 2021. The defence trends were, in almost all cases, a direct result of the threat trends; they were reactive, and for many organisations, they came too late.
Attack surface management is an important area. We are predicting growth in this area, which supports the concepts of predictive defence. Once you understand your attack surface, you can create a threat landscape and threat profiles linked to cyber threat intelligence services with Priority Intelligence Requirements and Organisation Specific Intelligence Requirements.
These allow an organisation to shift from a reaction-based defence, right of boom, to a proactive-based defence, left of boom. The growth of proactive-based defence is an area we will push into 2022 and hope others will too.
One of the growing threat trends we have seen over the last year is the targeting of Managed Services Providers, MSPs, and Cloud Services Providers, CSPs. This targeting allows an attacker to have a significant impact per attack, as it can span numerous victims.
MSPs and CSPs have value but also risk. Running on someone else’s infrastructure means you lose control of how, and whether, that infrastructure is protected.
In response to this trend, defence is growing in attack surface awareness, commonly referred to as digital footprinting. We see a slow yet growing understanding of this need, as users of MSPs and CSPs now need to understand their entire attack surface, not just what is left in-house.
It’s no surprise that ransomware is still the leading threat trend. As the value of crypto rises, the greater the incentive for cybercriminals. Every time a victim pays, it guarantees further attacks against others and, in many cases, repeated attacks upon themselves.
In almost all cases of ransomware that we have investigated, unpatched remotely managed or cloud-hosted systems were the initial point of access. These systems loop back to the defence trend of attack surface awareness.
The world’s most fantastic AI threat prevention solution cannot save you if you leave the front door wide open, with a welcome mat out and no one to check the IDs of the people walking in or out of that door. The same is true for MSPs: they need to treat the security of their infrastructure as a critical service, offering complete vulnerability management and real-time monitoring and response within their managed infrastructure.
DAVID BROWN, Security Operations Director, Axon Technologies
The evident concern is corporate assets operating outside of the controlled environment; this needs to be handled in a draconian manner. The best way to manage these devices is with combinations of application and access controls: deploying connection-aware host-based firewalls, remote gateway proxies, and MFA VPN solutions.
On top of this level of access control, other requirements are software inventory management, agent-based policy auditing, vulnerability management, and fully managed anti-malware, host intrusion detection prevention system, all with reporting to real-time monitoring and response. In a more straightforward statement, the more visibility, the greater the ability to protect, detect and respond. ë
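As a toy illustration of how such a software inventory feeds vulnerability management, the sketch below matches installed packages against advisory data. All package names, versions, and the advisory structure are invented for illustration; real tooling would pull from a CMDB and live vulnerability feeds.

```python
# Hypothetical inventory of installed software, as an agent might report it.
installed = {"openssl": "1.1.1k", "nginx": "1.18.0", "redis": "6.2.1"}

# Hypothetical advisory data: versions known to be vulnerable.
known_vulnerable = {
    "openssl": {"1.1.1k", "1.1.1j"},
    "redis":   {"5.0.0"},
}

def vulnerable_packages(inventory, advisories):
    """Return installed packages whose exact version appears in an advisory."""
    return sorted(name for name, version in inventory.items()
                  if version in advisories.get(name, set()))

# Here openssl 1.1.1k matches an advisory, so it is flagged for patching.
print(vulnerable_packages(installed, known_vulnerable))
```

The point of the sketch is the author’s: without an accurate inventory, the matching step has nothing to work on, so visibility comes first.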
In almost all cases of ransomware investigated, unpatched remotely managed or cloud-hosted systems were initial points of access
Axon Technologies
There is a culture shift happening, organisations are less focused on devices and capex and more focused on business outcomes of technology investments.
Digital transformation is driving a proliferation of IoT devices, and machine-to-machine communications are growing rapidly. Connected devices outnumber people 5:1. Over the next three years, there will be 10x more connected devices than people, making automated, secure connectivity of IoT of paramount importance.
Without an automated way to onboard, provision, and secure these devices, organisations will be left vulnerable to security breaches, which are continually growing in sophistication.
As SASE deployments enter the early majority stage of the adoption lifecycle, the market will see a clear split in approaches. Small and medium-sized enterprises are likely to be attracted to all-in-one SASE offerings, where simplicity and one throat to choke take priority over advanced capabilities. On the other hand, large enterprises will remain unwilling to compromise on security, reliability, or the quality of user experience.
They will look to a dual-vendor approach, pairing a best-of-breed SD-WAN partner for on-prem security and WAN facing capabilities, with a fully-fledged cloud-delivered security partner delivering secure web gateway SWG, cloud access security broker CASB, and zero trust network access ZTNA services.
While much of the attention has been given to 5G cellular, on the campus and inside the enterprise we are on the cusp of a fast transition to Wi-Fi 6E. Wi-Fi 6E delivers high capacity with an additional 1200 MHz of new spectrum, while retaining backwards compatibility.
Leading market intelligence firm 650 Group expects over 200% unit growth of Wi-Fi 6E enterprise APs in 2022, indicating that enterprise organisations recognise the potential of 6E, especially with the continued reliance on activities such as videoconferencing, telemedicine, and distance learning.
Even as the pandemic recedes, work-from-home is here to stay. This new normal will drive the emergence of the microbranch or branch of one. In the early days of the pandemic, organisations scrambled to expand VPNs and deploy remote access points RAPs to connect their locked-down workforce and implement pop-up testing kiosks.
In 2022, we will see enormous growth for purpose-built microbranch offerings that combine enterprise-class Wi-Fi access with sophisticated multi-path WAN connectivity and advanced AIOps for reliability and consistent user experience. These microbranch offerings
will securely extend the enterprise to the branch of one.
DAVID HUGHES, Chief Product and Technology Officer, Aruba HPE.
There is a culture shift happening right before our very eyes – the increased value consumers are placing on experiences over things and the decline in needing to own something has already touched our everyday lives. This same shift will begin to play out in the enterprise as well in the coming year, with organisations being less focused on devices and capex and more focused on the business outcomes of their technology investments.
Organisations want greater financial flexibility and cost predictability, while being able to increase IT efficiency and keep pace with innovation. A flexible infrastructure consumption model allows for all of this, and for those organisations that are not fully ready to take the plunge, flexible consumption models provide the option to try before buying, so that enterprises can adopt the new model – or not – at their own pace. This will drive a big increase in demand for consumption-based services like NaaS in 2022. ë
Brands will continue to be judged on their digital experiences and only the best tools and insights to meet customer expectations will prevail.
The habits of consumers in the UAE mirror the world around them. The surging appetite for digital services, while initially a knock-on effect of a global pandemic, is unlikely to reverse itself when the crisis is over. Brands will continue to be judged on their digital experiences. Only the ones with the best tools and insights to meet heightened customer expectations will prevail.
A recent AppDynamics report App Attention Index 2021: Who takes the rap for the app? shows that 98% of UAE consumers, 14% higher than the global average, believe digital services have had a positive impact on their lives during the pandemic.
Consumers say they want the Total Application Experience, that means applications which are responsive, easy to use, secure, always on, and constantly improving to align with even more demanding user expectations. In the coming year, all brands will face the pressure to meet these demands for flawless customer experience.
The App Attention Index reveals a strong sense of attachment from customers towards the brands that supported their needs during the pandemic through digital services. A majority said that applications gave them a sense of control and a feeling of empowerment, acting as a lifeline to normality. And consumers are not shy to acknowledge and express gratitude for the efforts behind those digital experiences.
85% of UAE respondents said they were grateful to brands that invested in digital to ensure access to the services they most relied on, and 82%, 15% higher than the global average, indicated that they now feel more loyal to those brands that went the extra mile to deliver high quality applications.
Given that consumers are now perfectly aware of how a state-of-the-art application should look and work, it is no surprise that our report found they are no longer willing to put up with flawed applications. They rarely ask questions about the source of an outage, a slowdown, or any other deterioration in a digital service. They immediately resort to finger-pointing. In the UAE, 69% of consumers believe the responsibility for flawless, uninterrupted experiences lies solely with the brand.
73% of UAE consumers, 16% higher than the global average, say they are not prepared to give second chances to brands that disappoint them on the first try. This is, again, hardly a surprising finding, given the volume of choice within reach for consumers. Tolerance for poor digital experience has all but disappeared. And if an alternative brand manages to impress them the first time, then digital-savvy consumers will have no reason to return to the platform that irked them.
To respond to the demand for the total application experience, technologists need unified, real-time visibility into IT performance across their entire IT estate, from the applications themselves to the core infrastructure — full-stack observability. Without this level of comprehensive observability, they do not stand a chance of being able to rapidly identify and fix performance issues before they impact the end user. ë
98% of UAE consumers believe digital services have had positive impact on their lives during pandemic
The 2021 Edelman Trust Barometer shows that, among respondents in 28 countries, trust in the technology sector is declining, amid concerns about climate change.
The 2021 Edelman Trust Barometer shows that, among online survey respondents in 28 countries, trust in the technology sector is declining globally, amid concerns about climate change, job losses, and cyberattacks. These worries are all valid for the global security and surveillance sector. These are the technologies and insights that will continue to transform security in 2022 and beyond.
A global shortage of semiconductors has also seen companies explore in-house manufacturing and the potential of system on a chip SoC for relevant sectors. While this may be a very specific trend, combined with the substantial shifts caused by the pandemic, more businesses will consider SoCs for their security solutions going forward.
Sustainability is no longer just a trend, nor should it be deemed as such. With a global focus and push towards environmentally friendly principles and practices, exemplified by initiatives such as the UN Sustainable Development Goals towards industry, human settlements, and consumption and production, a business must exhibit sustainability in its offerings and examine new possibilities through a sustainable lens.
Companies must pay closer attention to their processes from end to end. They need to scrutinise their products and services in terms of sustainability factors, such as power efficiency, building materials, and ethical deployments.
A trend that’s emerged from taking a sceptical eye towards technology is zero trust networks. Built on the fundamental assumption that no device or entity connected to a network can be trusted, the deployment of these architectural setups is likely to accelerate and become the default approach.
In turn, this will dramatically impact video surveillance in the form of encryption, identification, and hardware and software maintenance. COVID-19 has also played a role in forming this approach, as remote working solutions call for more connected devices in a wider context.
A specific 5G-related trend that is likely to grow in leaps and bounds is the deployment of private 5G networks – wireless networks that use 5G-enabled technology and dedicated bandwidth to serve as a closed solution for a company. They are faster than public networks, more reliable, and offer an ideal situation for specific industries. These networks also present security benefits
which, when applied to the sector, could potentially streamline and improve solutions of varying size. This specific manifestation of technology is one to watch out for.
With the question of trust and increased scrutiny in cybersecurity, authenticity is becoming the next big hurdle in the age of data manipulation. This is valid for both hardware networks and video surveillance itself. How can you trust surveillance when you assign no value to its authenticity?
Deepfake technology is a growing threat. With improved methods of manipulating and altering images and videos, the authenticity of captured real-world events and people is compromised. This is not a problem exclusive to the security sector, but it is one that requires comprehensive solutions to overcome, such as applying digital signatures and verifying the source of data to specific hardware.
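One way such digital signatures can work is to attach a keyed hash to each frame at the device and verify it downstream, so any manipulation is detectable. The sketch below uses Python’s standard hmac module as a minimal illustration; the per-camera key and frame bytes are assumptions, not any vendor’s scheme, and real deployments typically use asymmetric signatures bound to device hardware.

```python
import hashlib
import hmac

# Assumption: a secret key provisioned into the camera at manufacture.
SECRET_KEY = b"device-provisioned-key"

def sign_frame(frame_bytes: bytes, key: bytes = SECRET_KEY) -> str:
    """Compute an HMAC so downstream systems can verify the frame's origin."""
    return hmac.new(key, frame_bytes, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes: bytes, signature: str, key: bytes = SECRET_KEY) -> bool:
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(key, frame_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

frame = b"\x00\x01raw frame data"
sig = sign_frame(frame)
assert verify_frame(frame, sig)                  # untouched frame verifies
assert not verify_frame(frame + b"edit", sig)    # any manipulation breaks the signature
```

The design choice the text points at is exactly this property: authenticity is established at the source, so a deepfaked or altered clip can no longer masquerade as camera output.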
The application of AI also shows promise in being able to detect when manipulation has occurred. Regardless, this is a challenge that multiple sectors have to contend with and work harder to combat.
All these trends factor into the need for businesses and other entities to rethink their security solutions for 2022 and beyond. With a focused and driven approach and by embracing the technology of the future, today’s challenges can be met head-on. ë
A specific 5G-related trend that is likely to grow in leaps and bounds is the deployment of private 5G networks
We will see it in agriculture, food production, fast food, entertainment, and hospitality, while other sectors gain from the automation and simplification of processes.
While IT teams and IT leaders have historically been called on to drive digitisation and increase value, the roles will be reversed in the post-pandemic world. Strategic decision making starts with digital experience and digital transformation, since they are now deeply connected to the successful operation of any company.
We see this for example in business analytics, where the analysis of user experience journeys or customer experience journeys become a crucial information source for strategic decisions.
Another example is increasing convergence between the online and offline world, which results in digital twin concepts being adopted beyond production, and any process being tested virtually before being considered for rollout.
The pandemic months have triggered a rapid increase in ransomware attacks as more and more people worked remotely. In turn, this opened up a multitude of new infection vectors.
Enterprises had to come to terms with the fact that many IT security processes and protocols are not well suited to the fight against ransomware, because it is virtually impossible to cut off all these infection routes, especially when criminals use social engineering.
Enterprises will rely on AI-based prevention across their whole domain and stringent zero trust policies. Rather than preventing IT attacks from happening, this approach minimises their impact. Once an infection happens, it is discovered almost instantaneously: infected areas are cordoned off and infected files replaced in almost real-time.
In 2022, artificial intelligence AI starts to permeate all industries. We will see it used in agriculture, food production, fast-food chains and the entertainment and hospitality sector. Agriculture and the food industry, for example, will use it for packing and processing, while other sectors gain most from general automation and the simplification of their processes.
As more industries use AI to stay competitive and innovate, there needs to be a technology foundation that can scale accordingly, and AI users need to move their AI projects
from standalone, siloed infrastructure onto shared, virtualised production environments.
FADI KANAFANI, Managing Director Middle East, NetApp.
Another driver is tiny machine learning. Experts are forecasting a massive increase in AI at the edge, down to very low cost, extremely resource constrained edge devices. Think sensors rather than compute devices. This is another generation of devices that feed the ever-growing edge-core-cloud data pipeline, which industries need to access and leverage to differentiate themselves.
Manufacturers in different branches of IT will be more vocal about their quantum computing strategy in 2022 – for example security providers, hyperscalers, storage companies, and GSIs, global systems integrators.
These manufacturers will also theorise how they can deliver quantum computing innovation as a service for their customers and overcome branch-specific limitations, for example building a data pipeline into the quantum computing cloud.
Green topics are on the rise, as demonstrated by the 2021 Climate Change Conference, the US infrastructure deal, or the traffic light coalition coming to power in Germany. We predict that businesses will head in the same direction.
This is partly due to regulatory pressure, for example to lower carbon dioxide emissions. But enterprises will also become intrinsically motivated to deliver green innovation.
Net Zero targets will become a priority for businesses in 2022, and they are impacting corporate decision-making already now. This will result in companies examining not just their own actions but their supply chain, digital and non-digital, as they strive to deliver net zero carbon emissions as quickly as possible. ë
Evolution of analytics will be possible by vendors who focus on hardware efficiency and offer granular controls for running analytics at intervals.
Within months of the pandemic, businesses were deploying different solutions to track occupancy in their buildings and control social distancing. Almost two years later, this trend is still growing because they are seeing value from the data collected.
Beyond safety objectives, organisations will embrace the use of spatial analytics data to reduce wait times, optimise staff scheduling, and enhance business operations. As businesses give employees the flexibility to split up their work time between the office and home, organisations will be looking for ways to best optimise their workplaces.
By using space utilisation intelligence, they will be able to analyse employee office attendance, monitor meeting room demands, and make informed floorplan changes such as adding more desk-sharing options.
In recent years, demand for video analytics solutions has been strong. However, because complex video analytics still require very powerful servers for adequate data processing, deploying analytics at an enterprise level isn’t always practical.
As we move into 2022, we believe video analytics applications will mature in ways that make them easier and more economical to deploy at scale. This evolution will be made possible by vendors who focus on hardware resource efficiency and offer more granular controls for running analytics at certain intervals or schedules, instead of continuously.
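Interval-based scheduling of this kind can be sketched simply: run the heavy analytics pass only once a cooldown has elapsed, rather than on every incoming frame. The class and names below are illustrative assumptions, not any vendor’s API.

```python
import time

class IntervalAnalytics:
    """Run an expensive analytics pass only every `interval` seconds,
    instead of continuously on every frame (an illustrative sketch)."""

    def __init__(self, interval: float):
        self.interval = interval
        self._last_run = float("-inf")
        self.runs = 0

    def maybe_analyse(self, frame, now=None) -> bool:
        """Return True if the heavy model ran for this frame."""
        now = time.monotonic() if now is None else now
        if now - self._last_run < self.interval:
            return False  # skip: still inside the cooldown window
        self._last_run = now
        self.runs += 1
        # ... run the heavy detection model on `frame` here ...
        return True

sched = IntervalAnalytics(interval=5.0)
# Simulate 30 frames arriving one second apart; only one frame per
# five-second window actually triggers the expensive analysis.
decisions = [sched.maybe_analyse(frame=None, now=t) for t in range(30)]
print(sum(decisions))  # 6 analysis runs instead of 30
```

This is the hardware-efficiency trade the text describes: the server does a fraction of the work while the operator keeps granular control over how often insight is refreshed.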
A report by Cybersecurity Ventures predicts that global cybercrime costs will reach $10.5 trillion annually by 2025. Growing at 15% per year, this cost is said to represent the greatest transfer of economic wealth in history.
All of this will usher in an entirely new model for cybersecurity that relies on continuous verification rather than just hardening networks and systems. Decision makers will need to implement more offensive cybersecurity strategies and choose partners who offer higher levels of automation to stay on top of potential threats.
As more businesses take a step towards trialling cloud applications, they will quickly understand the benefits of hybrid cloud, which will produce even greater forward momentum in the adoption of cloud technologies in the new year.
This could include implementing a digital evidence management system to speed up video and data sharing between different departments during investigations, deploying a cloud video management system to secure a high-risk remote location, or installing a physical identity access management PIAM solution to better manage access rights for all employees. Forward-thinking security leaders and their organisations will think less about how a product capability is delivered, and more about how and where they will employ this technology to improve and strengthen their security and data insights in 2022 and beyond. ë
Global cybercrime costs will reach $10.5T annually by 2025, representing the greatest transfer of economic wealth in history
While not providing any sort of blocking facility, AI can be used to identify suspicious activity, trigger automatic blocking, and alert security personnel.
Those intent on infiltrating computer systems to access, delete, exfiltrate or immediately extort sensitive data constantly evolve their approach to counteract measures being taken to block them. Inevitably this means those tasked with protecting computer systems and data also have to evolve their strategies.
While we see security breaches appear in the news with alarming frequency, what hits the headlines is just a small proportion of the true picture. Large, household name organisations are understandably reluctant to admit to gaps in their security setup, while smaller and medium sized organisations, even if they do go public, are less likely to make headline news.
With successful attack numbers likely to be much greater than we know, what have we learned about security in 2021, what should organisations be looking out for, and what should Chief Information Officers CIOs and Chief Information Security Officers CISOs consider as they think about bolstering cybersecurity defences in 2022?
Ransomware continues to be a powerful and potentially devastating type of cyberattack. In particular, Ransomware as a Service, RaaS has seen continued evolution during 2021. This phenomenon, whereby bad actors develop software and make it available to non-technical cybercriminals, has opened up more opportunity for targeting smaller and medium sized organisations.
The logic is clear. A bespoke attack on a large organisation can yield multimillion dollar payouts but needs technically astute execution. A generalised attack on smaller organisations via RaaS may have smaller individual yield, but a greater overall yield.
However, in its Sophos 2022 Threat Report, the cybersecurity firm says that the release of some materials relating to RaaS has helped it to identify tactics, techniques and procedures that might indicate an attack in progress, helping defenders thwart attacks.
RaaS will continue to be a significant threat in 2022. For CIOs and CISOs, the challenge is not just ensuring their defences are strong and able to cope with evolving ransomware strategies, but also that they have a suitable set of recovery plans in place to deal with issues when they arise, which they inevitably will.
The last two years have seen many organisations learn that they can work well with a distributed workforce, and this has become the norm for a significant number. A distributed workforce means that protecting a corporate network as a walled garden is no longer appropriate.
Today endpoint security is vital. That means not just securing a device, whether that’s a tablet, smartphone or laptop, but also being aware of how people are using these devices. Devices bring new threats into corporate networks and put your corporate data at risk.
With some businesses facing over 200,000 cyberattack threats per year, it is impossible for humans alone to manage this. While not providing any sort of blocking facility itself, AI can be used to identify potentially suspicious activity, trigger automatic blocking, and alert the IT and security personnel who judge whether the activity is accidental, malicious or allowable, mitigating the threat risk.
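A minimal stand-in for this kind of detection is a statistical baseline that flags activity far outside the norm and hands it to automation and analysts. The sketch below scores hourly event counts against a z-score threshold; the data and threshold are invented for illustration, and production systems use far richer models than this.

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=2.5):
    """Flag hours whose event count sits more than `threshold` standard
    deviations above the baseline. A toy stand-in for the ML-based
    detection described in the text, not any vendor's product."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma and (c - mu) / sigma > threshold]

# Hypothetical hourly login counts with one burst that no human could
# spot among hundreds of thousands of raw events.
hourly_logins = [100, 98, 102, 101, 99, 100, 5000, 97, 103, 100]
suspicious = flag_anomalies(hourly_logins)
print(suspicious)  # the burst at hour 6 is flagged

for hour in suspicious:
    pass  # here: trigger automatic blocking and alert security personnel
```

The division of labour matches the article’s point: the statistics surface the candidates and can trigger blocking automatically, while humans make the final accidental-versus-malicious call.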
CIOs and CISOs have an increasing need to be aware of the potential of AI in their cybersecurity armoury. 2022 could be a challenging year for CIOs and CISOs. The strategies they put in place now will stand them in good stead as the year progresses. ë
Ransomware as a Service, RaaS has seen continued evolution during 2021
These sustainability targets will require large-scale infrastructure commitments, while moving away from siloed products to platforms that deliver real-time data.
With most of the international community pledging to achieve net zero carbon emissions by 2050, the pressure is on; though at our current pace, the world is falling behind. Fortunately, we still have time to turn things around.
For energy and utilities organisations, this will require a significant commitment. Encouragingly, many are stepping up. The US is targeting a carbon pollution-free power sector by 2035. By 2030, the goal is to reduce emissions by 50-52%, on the path to net zero emissions by 2050. Some optimists predict the industry could hit net zero by 2035.
In Europe, 22 of the 25 largest power and gas utilities have set net zero emissions targets. Some coal-powered plants in Europe and other regions have shut down operations as their operators aim to become clean energy leaders in renewables.
But the road ahead will not be easy. In the US, the transition from fossil fuels to electricity generated by renewables will require the electric-power sector to simultaneously decarbonise, while supporting an increase of about 40% in electric load by as early as 2035.
These ambitious targets will require large-scale infrastructure commitments, not only to achieve the end goal but to manage the transition. Organisations must move away from siloed products to platforms or a system of systems strategy that delivers real-time data for real-time results.
This digitisation of the operation will allow energy and utilities organisations to accurately answer two important questions: how can power be distributed in the most efficient and socio-economic way possible, and how much power is required?
In tandem, organisations must become good stewards of their own operations, controlling and reducing carbon emissions generated throughout their supply chain and in the field. Again, technology will play a critical role. For example, predictive maintenance and scheduling will drive operational efficiencies to help organisations achieve these goals. The world must work together to hit our target by 2050.
As we saw in the first prediction, infrastructure must be modernised and strengthened to accommodate the increased electric loads for the sustainable energy goals we have set. This will require a complete overhaul of existing transmission facilities.
In the US, the construction of new transmission facilities will cost an estimated $314 to $504 billion. This doesn’t include the $1.8 to $2.1 trillion in new generation costs needed to hit the 2035 goal of a carbon pollution-free power sector.
To offset this, the Infrastructure Investment and Jobs Act was signed into law in the US in November 2021; it includes $1.2 trillion in spending, of which roughly $550 billion is new federal investment.
Europe adopted an EU-wide planning approach to transmission infrastructure, unlike the US, where it is managed by multiple entities. In October 2020, EU Member States agreed on a proposal to invest €998 million in critical European energy infrastructure projects.
Yet even with all of the new and increased spending, investment will fall short of the cost. This puts more pressure on energy and utility organisations, and the people they serve, to make up the shortfall.
Additional costs for transmission infrastructure include the undergrounding of vulnerable lines and equipment. In the US, the need to go underground takes on greater urgency due to the catastrophic wildfires sparked by fallen transmission lines during extreme weather events.
Once again, there are differences between Europe and the United States. Although Europe has a much larger population, it is more densely packed, so distribution occurs over a smaller area with greater cost and resource efficiency.
These whole-scale changes will transform energy and utilities infrastructure from below the ground up. ë
In 2022, investment in transmission infrastructure will increase from billions to trillions
Microsoft will continue to be the primary focus for cyberattacks in 2022 and defenders need to understand the risk of relying on Microsoft to protect them.
When it comes to ransomware, what we see today is not that simple. We now have ransomware cartels, like REvil, Conti, DarkSide, and others, and ransomware is no longer a single piece of malware but rather a comprehensive ransomware operation, or RansomOps, where the execution of the ransomware itself is just the final piece of a much longer attack chain.
There is too much focus on the ransomware executable, or how to recover once an organisation’s servers and data are already encrypted. That’s like fighting terrorism by focusing only on the explosive device or waiting to hear the boom to know where to focus resources.
RansomOps take a low and slow approach, infiltrating the network and spending time moving laterally and conducting reconnaissance to identify and exfiltrate valuable data. Threat actors might be in the network for days, or even weeks.
It’s important to understand how RansomOps work and be able to recognise Indicators of Behaviour that enable you to detect and stop the threat actor before the point of detonation when the data is actually encrypted, and a ransom demand is made. There is a growing trend of threat actors realising the value of targeting a supplier or provider up the chain in order to compromise exponentially more targets downstream. Rather than attacking 100 or 1,000 separate organisations, they can successfully exploit one company that unlocks the door to all the rest. It is the path of least resistance.
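As a rough illustration, behaviour-based detection of this kind can be sketched as a correlation of attack-chain stages observed on a host before the point of detonation. This is a toy model, not any vendor's product; the event names and scoring are hypothetical:

```python
# Illustrative sketch only: correlating Indicators of Behaviour (IoBs)
# across the RansomOps attack chain. Stage names are hypothetical.

RANSOMOPS_STAGES = [
    "initial_access",      # e.g. phishing payload executed
    "lateral_movement",    # e.g. unusual SMB/RDP sessions between hosts
    "reconnaissance",      # e.g. mass directory enumeration
    "data_exfiltration",   # e.g. large outbound transfer to an unknown host
]

def ransomops_risk(events):
    """Return (risk_score, stages_seen) for a host's event stream.

    Each observed pre-encryption stage raises the score, so defenders
    can intervene before the ransomware executable ever runs.
    """
    seen = [s for s in RANSOMOPS_STAGES if s in set(events)]
    return len(seen) / len(RANSOMOPS_STAGES), seen

score, stages = ransomops_risk(
    ["initial_access", "normal_login", "lateral_movement", "reconnaissance"]
)
print(score, stages)  # three of four pre-encryption stages seen: intervene now
```

The point of the sketch is that the signal lives in the sequence of behaviours, not in any single file hash or executable.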
The attacks we have seen have been part of cyber espionage campaigns from nation-state adversaries. Those attacks will likely continue, and we will see a rise in cybercriminals adopting the strategy as well. Companies that act as suppliers or providers need to be more vigilant, and all organisations need to be aware of the potential risk posed from the companies they trust.
The simple truth is that one way or another, Microsoft products are directly involved in the vast majority of cyber-attacks. Threat actors invest their time and effort identifying vulnerabilities and developing exploits for the platforms and applications their potential victims are using. Microsoft has a dominant role across operating systems, cloud platforms, and applications that makes it fairly ubiquitous.
As such, Microsoft will continue to be the primary focus for cyber-attacks in 2022. That is not really a revelation. Defenders need to understand the risk of relying on Microsoft to protect them when they cannot even protect themselves. Organisations that depend on Microsoft for security will find themselves making headlines for the wrong reasons.
I am not suggesting that organisations not use Microsoft products or services, but it is important to understand the risks and have a layered approach to defending those products and services against attacks.
The line no longer exists between national security and cybersecurity. Sometimes a nation-state adversary attacks a private company as part of a broader campaign and sometimes cybercriminals launch attacks with national security implications.
What we need to be aware of as we go into 2022 is the increasing cooperation and collaboration between these threat actors. Nation-state adversaries are not directly controlling many of these operations, but a combination of state-sanctioned, state-condoned, and state-ignored attacks create an environment where failure to act is equivalent to tacit approval and indicates that even if they are not actively working together, their objectives are often aligned. ë
What we need to be aware of as we go into 2022 is the increasing cooperation and collaboration between threat actors
Customers will continue to expect the same experience online, and retailers will have to create a seamless experience between the online and offline worlds.
Over the last two years, the retail landscape has changed immeasurably. Consumer behaviour and technologies that seemed like they were a decade away from coming to fruition are now normal. In fact, according to a Gartner 2021 CIO survey, nearly two-thirds of retail CIOs said that their relationships with their CEOs strengthened in 2020 as CIOs led the business through significant disruption.
Personalisation has been a mainstay of retail innovation for most of the past decade. We know by now that consumers demand not just personalisation but hyper-personalised experiences. In fact, 84% are willing to pay more for hyper-personalised services: experiences powered by data, analytics, and AI that give retailers the insights and capabilities to adapt to customers’ changing realities in real time.
In 2022, this trend is expected to continue as retailers take a more holistic approach to customer data. Instead of using customer data to send targeted promotions, retailers will use personalised insights to create simple, streamlined shopping processes.
The demand for a seamless shopping experience across all retail channels is becoming more and more apparent. Just five years ago, the GCC e-commerce sector was a small channel with a big future ahead of it. According to Statista, e-commerce in the GCC was expected to grow from $24 billion in 2020 to $50 billion by 2025, even before adjusting for the effect of the pandemic on e-commerce.
The market, however, grew an additional 6% due to the incremental increase in e-commerce adoption during the pandemic, and it will continue to grow. While growth may not immediately accelerate as rapidly as it did during the pandemic, most big retailers are now taking e-commerce far more seriously, especially when it comes to on-demand delivery. In 2022, expect e-commerce to take an even bigger slice of the retail pie.
But the growth in e-commerce is not restricted to business-to-consumer, B2C, sales. Manufacturers can take advantage of the e-commerce boom to become retailers themselves with a new direct-to-consumer, D2C, channel. That means cutting out third-party e-commerce sites that often take a large slice out of their revenues.
As e-commerce continues to grow, people’s expectations of physical retail stores will change. They may, for instance, be places where consumers go to experience the goods you are selling before buying online. To reclaim their former glory, brick-and-mortar stores will have to adjust to the demands of a modern, digitally native customer.
Customers who are accustomed to a certain level of CX will continue to expect the same experience online. Retailers will have to play into this by creating seamless experiences between the online and offline worlds, by creating experiential retail.
This is possible through a digital experience platform as it allows seamless integration of data from across the organisation for the best possible experience, no matter where the customer might be.
If the events of the past two years have taught us anything, it is that change can come quickly, and retailers need to be able to adapt to that change. ë
Retailers will use personalised insights to create simple, streamlined shopping processes
The extreme quantity of data being generated has already exceeded human scale but still needs to be processed intelligently, with AI being used to produce insights.
Cisco has revealed the technology trends that are expected to make a significant impact in 2022 and beyond. Technology is always evolving and moving in exciting new directions.
Ethical, responsible, and explainable AI - from design and development to deployment - will become a top priority for organisations and governments worldwide if we are genuinely interested in an inclusive future for all
The extreme quantity of data being generated has already exceeded human scale but still needs to be processed intelligently and, in some cases, in near real-time. This scenario is where machine learning (ML) and artificial intelligence (AI) will come into their own.
The challenge is that data has ownership, sovereignty, privacy, and compliance issues associated with it. And if the AI being used to produce instant insights has inherent biases built-in, then these insights are inherently flawed.
This leads to the need for ethical, responsible, and explainable AI. AI needs to be transparent, so everyone using the system understands how the insights have been produced. Transparency must be present in all aspects of the AI lifecycle – its design, development, and deployment.
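One widely used transparency technique is permutation importance: shuffle a single input feature and measure how much the model's accuracy drops, revealing which inputs actually drive an insight. The sketch below is a toy illustration with a synthetic model and data, not a production explainability pipeline:

```python
# Toy illustration of permutation importance as an explainability technique.
# The "model" and data are synthetic; real AI lifecycles would apply this
# kind of scrutiny across design, development, and deployment.
import random

random.seed(0)
# Feature 0 genuinely determines the label; feature 1 is pure noise.
rows = [([random.random(), random.random()], 0) for _ in range(200)]
rows = [(x, 1 if x[0] > 0.5 else 0) for x, _ in rows]

def model(x):
    # Stand-in for a trained model: it has learned to threshold feature 0.
    return 1 if x[0] > 0.5 else 0

def accuracy(data):
    return sum(model(x) == y for x, y in data) / len(data)

def permutation_importance(data, feature):
    """Accuracy drop when one feature's values are shuffled across rows."""
    shuffled = [x[feature] for x, _ in data]
    random.shuffle(shuffled)
    permuted = [(x[:feature] + [s] + x[feature + 1:], y)
                for (x, y), s in zip(data, shuffled)]
    return accuracy(data) - accuracy(permuted)

print(permutation_importance(rows, 0))  # large drop: feature 0 drives the model
print(permutation_importance(rows, 1))  # 0.0: feature 1 never mattered
```

Surfacing which inputs a system actually relies on is one concrete way to make the insights it produces explainable rather than opaque.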
Data deluge, data gravity, and the need for predictive insights will propel the Edge towards whole new application development and experience paradigms
Modern enterprises are defined by the business applications they create, connect to and use. In effect, applications, whether they are servicing end-users or are business-to-business focused or even machine-to-machine connections, will become the boundary of the enterprise.
The business interactions that happen across different types of applications will create an ever-expanding deluge of data. Every aspect of every interaction will generate additional data to provide predictive insights. With predictive insights, the data will likely gravitate to a central data store for some use cases. However, other use cases will require pre-processing of some data at the Edge, including machine learning and other capabilities.
The future of innovation and business is tied to unlocking the power of data in an application-driven world
Beyond enabling contextual business insights to be generated from the data, teams will be able to better automate many complex actions, ultimately getting to automated self-healing.
To achieve this future state, applications must be created with an automated, observable, and API (Application Programming Interface)-first mindset with seamless security embedded from development to run-time. Organisations will have the ability to identify, inspect, and manage APIs regardless of provider or source.
Only through the delivery of predictive and seamless Internet access will the metaverse be realised, and access to technology and innovation become ubiquitous
There is no doubt that the trend for untethered connectivity and communications will continue. The sheer convenience of using devices wirelessly is obvious to everyone, whether nomadic or mobile.
This always-on internet connectivity will further help alleviate social and economic disparity through more equitable access to the modern economy, especially in non-metropolitan areas, helping create jobs for everyone. But this also means that if wireless connectivity is lost or interrupted, activities must not come to a grinding halt.
Quantum computing and security will interconnect very differently than classical communications networks, which stream bits and bytes to provide voice and data information. Quantum technology is fundamentally based on a counterintuitive phenomenon in quantum physics: the entanglement between particles that enables them to share states. ë
Transparency must be present in all aspects of the AI lifecycle
The security landscape has evolved due to usage of cloud and personal devices, resulting in organisations becoming susceptible to cyberthreats and insider attacks.
Following the pandemic, hybrid work will be an expectation if not the norm at most organisations across the world. That means cybersecurity, AI, automation, and analytics will play increasingly significant roles in organisational efforts to support this way of working.
AIOps-driven monitoring will play a significant role in forecasting, capacity planning, combating alert fatigue, and maintaining the security posture of an organisation.
As employees continue to access organisational resources from different locations, traditional network-based security is becoming obsolete. The security landscape has evolved in part due to the accelerated shift to the cloud and usage of unvetted personal devices, resulting in many organisations becoming highly susceptible to cyberthreats and insider attacks.
In this scenario, a cybersecurity mesh model, with its central principle of Zero Trust, will gain more traction. The cybersecurity mesh model is a distributed approach in which smaller, individual security perimeters are built around people or objects acting as access points, thereby offering IT teams better security control.
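A minimal sketch of the idea follows, with hypothetical resources, roles, and policy fields: every request is evaluated against the small perimeter around the resource it targets, with default deny, rather than being trusted by network location:

```python
# Minimal Zero Trust-style access check, illustrative only.
# Resource names, roles, and policy fields are hypothetical.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_compliant: bool   # e.g. patched, disk-encrypted
    mfa_passed: bool
    resource: str

# Each resource gets its own small perimeter (the "mesh" idea):
# policy is evaluated per request, never assumed from network location.
POLICIES = {
    "payroll-db": {"roles": {"finance"}, "require_mfa": True},
    "wiki":       {"roles": {"finance", "engineering"}, "require_mfa": False},
}
USER_ROLES = {"amira": {"finance"}, "bob": {"engineering"}}

def allow(req: AccessRequest) -> bool:
    policy = POLICIES.get(req.resource)
    if policy is None or not req.device_compliant:
        return False                        # default deny
    if policy["require_mfa"] and not req.mfa_passed:
        return False
    return bool(USER_ROLES.get(req.user, set()) & policy["roles"])

print(allow(AccessRequest("amira", True, True, "payroll-db")))   # allowed
print(allow(AccessRequest("bob", True, True, "payroll-db")))     # wrong role
print(allow(AccessRequest("amira", False, True, "payroll-db")))  # bad posture
```

The design choice worth noting is the default deny: an unknown resource or non-compliant device fails closed, which is the central Zero Trust principle the mesh model builds on.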
When insights are presented directly within a business application, the chances of an organisation acting upon them are much higher than when those same insights are presented in standalone business intelligence software. For example, when insights on project efficiency are available within project management software, it’s easier for project managers to relate the findings to their daily work and implement measures to fix inefficiencies.
The way we train and deploy AI models is expected to change significantly in the coming year. With more sustainable techniques like meta learning, transfer learning, and causal AI expected to complement deep learning, AI and ML will eventually become full-fledged, embedded elements of contextual analytics workflows.
Organisations had to stumble their way through implementing their business continuity plans in response to the first lockdown. But with employees preferring hybrid work long term, further changes will have to be made in operating models to ensure hybrid work is streamlined and sustainable.
Despite self-service portals, the productivity of remote employees is still disrupted when an incident occurs. In the era of hybrid work, aspects like zero-touch service management that can handle machine-solvable incidents, digital experience monitoring to ensure high availability and constant improvements for end users, and increased adoption of desktop as a service and VDIs will be more important than ever.
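Zero-touch handling of machine-solvable incidents can be pictured as a simple triage step in front of the service desk. The incident types and remediation names below are hypothetical:

```python
# Hypothetical sketch of zero-touch service management: machine-solvable
# incidents are auto-remediated; everything else escalates to a human.

AUTO_REMEDIATIONS = {
    "vpn_session_expired": "restart_vpn_client",
    "disk_nearly_full":    "clear_temp_files",
    "password_expired":    "send_self_service_reset_link",
}

def triage(incident_type: str) -> str:
    action = AUTO_REMEDIATIONS.get(incident_type)
    if action is not None:
        return f"auto:{action}"       # resolved without a service desk agent
    return "escalate:human"           # e.g. hardware failure, unknown error

print(triage("disk_nearly_full"))    # auto:clear_temp_files
print(triage("laptop_wont_boot"))    # escalate:human
```

Even a rule table this small shows the payoff: the common, repetitive incidents never reach a human queue, which is what keeps remote productivity from stalling.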
There is likely to be an imbalance in the supply and demand of skilled employees in the cybersecurity space. To address their evolving needs, organisations will increasingly use the services of managed security service providers (MSSPs) and managed detection and response providers.
For instance, the increase in remote employees, cloud adoption, and the need to meet compliance regulations make identity and access management (IAM) a tedious process for most organisations. Since many organisations lack the necessary skills and resources to implement an IAM solution, more organisations will turn to Identity as a Service providers to fill this role. ë
Despite self-service portals, productivity of remote employees is still disrupted when an incident occurs
A unified infrastructure, centrally managed via a single point of control, will help channel teams to effectively manage distributed network environments.
Businesses have been made fully aware of how crucial digital transformation is to their future success and are looking to invest in IT services, support and solutions that match their new needs - distributed workforces, agile IT and all things cloud.
Channel partners who can tap into these trends and offer guidance, services, and solutions, will be the ones to succeed within this rapidly changing IT landscape.
Here’s our pick of the top three trends driving market demand in 2022:
In 2022, IT teams will need to focus on network simplification, and channel partners can get a step ahead by offering a unified network operating system. A unified infrastructure, centrally managed via a single point of control, will help IT teams to effectively manage distributed network environments, while also delivering a high-quality user experience within a single architecture.
Network simplification is even more crucial since the pandemic has shrunk many business budgets, making it essential for IT teams to have the capacity to manage distributed environments without damaging wider operations. By offering a unified infrastructure, channel partners can provide customers with new levels of operational simplicity so that IT teams can redirect precious resources to more business-critical areas.
There is no doubt that the outlook for 2022 is for plenty of cloud. Recent Aruba research found that 83% of IT decision makers were looking to increase their investments in cloud-based networking over the next 12 months and data from Canalys reveals that the channel helped fuel a 33% increase in cloud spend in 2020.
Upgrading to the cloud is a complex journey and there is opportunity for channel partners to step in with new product offerings - and it’s important to understand the deep need that customers have for trusted advisors to help them with this transition. The process can be overwhelming - particularly if customers don’t have the skills to understand the benefits of the cloud or whether it’s more cost-effective for them to deploy a hybrid cloud model.
The rapid uptake of cloud technologies has prompted a widespread re-think of IT consumption models and we have now moved from hardware-driven revenue to a software-first market - essentially a SaaS-based economy. In 2022, we can expect to see heightened demand for flexible subscription models.
Recent research from Aruba found that customers are becoming more open to exploring flexible models of consumption. Only 8% of IT decision makers said they would continue with solely Capex investments in light of the pandemic, compared to 55% who said they would look at SaaS models.
With many key services such as deployment and decommissioning included in subscription offerings, these models give IT staff more time to carry out the more complex, value-add business tasks. It is vital that channel partners broaden their own offerings to reflect this new demand. ë
In 2022, we can expect to see heightened demand for flexible subscription models
Before 2020, every business wanted AI. In 2021, every business discovered the need for AI. In 2022, every business will have AI. This is a culture change at industry scale.
Even before the pandemic, impartial observers such as third-party analyst firms were predicting surges in AI adoption. PwC, for example, predicted that AI would have a $320 billion impact on combined Middle East GDP by 2030. Growth like this was never likely to be confined to large enterprise activity.
Instead, smaller businesses moving to the cloud would dip their toes in the AI fountain. Once they found that business intelligence, machine learning, IoT, big data, and other AI-fed technologies were viable, useful, and affordable, they would consume more.
Before 2020, every business wanted AI. In 2021, every business discovered the need for AI. In 2022, every business will have AI. Enterprise AI will hence be supplanted by Everyday AI. This is a culture change at industry scale.
One-off projects and use-case specifications will be replaced with rapid rollouts. Specialist AI sub-departments will be replaced with companywide adherence to AI principles geared towards business longevity.
The empowerment of business specialists with out-of-the-box solutions-building capabilities is a trend that is gathering steam, and quantifiably so. One estimate predicts CAGR of 27.9% between now and 2026 for the worldwide low-code development platform market. Offerings are showing significant evolution to cater to a range of workflows and AI is becoming more and more common among low-code orchestrations.
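For a sense of scale, a 27.9% CAGR compounds to roughly 3.4x growth over five years:

```python
# What a 27.9% compound annual growth rate implies for the low-code
# platform market over a five-year horizon (e.g. 2021-2026).
cagr = 0.279
years = 5
growth_multiple = (1 + cagr) ** years
print(f"{growth_multiple:.2f}x")  # ≈ 3.42x the current market size
```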
Meanwhile, data scientists and more technical coders can apply their talents to more challenging problems, which means that we will likely see roles within organisations expand and change to accommodate low-code adoption.
Following on from the predicted low-code empowerment that will stretch the talents of both technology and domain specialists, we will begin to see data science unleashed as never before. Data science has suffered the same prediction-disappointment cycle as AI, but there are now strong indicators that experimentation is yielding results, and results are building confidence.
As organisations build their AI cultures and become more efficient in their governance, more data science projects are being launched. As this happens, we can expect governance to play an important role in risk-based assessment of new use cases, especially if they fall outside the comfort areas of existing projects.
If a stakeholder were advocating for a project where the required data was missing or incomplete, this could indicate a high probability of failure and would therefore need to be tightly managed. Understanding one’s own processes and modelling them accurately will be key to the success of these next-step ventures. But the potential value of successful outcomes will be too great to ignore, so expect to see a lot more experimentation in 2022.
Organisations that use AI may have new confidence in its efficacy, but regional governments anxious to build sustainable futures can be expected to call for innovation that benefits a wide range of people. Where governments opt not to step in directly, there is still the self-regulatory side of AI to consider. ESG emerged so strongly recently because the bottom line is no longer the only concern in the boardroom. ë
We can expect governance to play an important role in risk-based assessment of new use cases
Partner marketplaces revolve around expertise and niche of the channel partner and how well they meet customer demand with their selection of solutions.
Changes in the IT landscape have dominated the focus of many companies and institutions. To cope with all these developments, organisations must become increasingly dynamic. The changes include the growing acceptance of hybrid and multi-cloud environments, the need to address the evolving cyber threat landscape, and the ever-growing importance of data. These developments will also affect the channel landscape in 2022 and beyond.
An inevitable effect of the shift to the cloud is that channel partners must implement different business models. Of course, it remains important to spot and close opportunities. But the shift to the cloud means a shift to subscription models. By constantly monitoring the use of the implemented systems, you as a partner are able to identify where new tools can offer added value, or where existing tools can be used more effectively.
The shift to the cloud is not only changing the way partners address the market. The buying behaviour of users is also changing. When they start thinking in the language of multicloud, it is more and more about ecosystems: who does what in my environments? Which possible blocks do I build my systems with?
Marketplaces will become more important and will finally displace the product catalogue. Not only vendor marketplaces, but also partner marketplaces. These marketplaces revolve around the expertise and niche of the channel partner and how well they can meet customer demand with their selection of solutions.
In-depth knowledge of customers’ business operations is becoming essential because the future of channel sales is a constant interplay of monitoring and advising. This means channel partners must immerse themselves much more in their customers.
To remain future-proof, partners and vendors must work closely together to support customers in their challenges. Long-term cooperation is becoming the norm, in which the added value of solutions must be continuously proven, and active use is becoming the measure of success in channel sales. ë
An inevitable effect of the shift to the cloud is that channel partners must implement different business models
Future of channel sales is a constant interplay of monitoring and advising
Top executives from Infoblox look at networking and security trends coming up in 2022.
The move back to the office will be a slow one, even as offices continue to re-open in 2022, with many employees preferring to work from home. So, making your security location independent is key. If you have not done this already, do it now.
Enabling flexible workspaces and smaller offices rather than large campuses may become the trend, with more focus on endpoint security controls.
Mobile and personal devices continue to be a hot area of debate. How secure is your employee’s home network?
Ransomware is not going away, though the government will increasingly be getting involved. Prepare for it.
There will be increased use of DNS-over-HTTPS (DoH) by malware because DoH provides an encrypted channel to the DoH server. DoH adoption is increasing, and malware developers are more aware of it as a means to bypass security controls.
ISPs and enterprises will deploy DoH defensively on their own DNS infrastructure to prevent fallback to third-party DoH servers. Why now? DoH is a newer technology, relatively speaking. Being able to run your own DNS servers that support DoH is even newer.
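To see why DoH complicates inspection, it helps to look at what a DoH request actually is: an ordinary DNS query carried inside HTTPS. The sketch below builds the GET form of such a query per RFC 8484; the resolver URL is just a well-known public example, and no network traffic is sent:

```python
# How DoH hides DNS inside ordinary HTTPS traffic: a classic wire-format
# DNS query is base64url-encoded into an HTTPS URL (RFC 8484 GET form).
import base64
import struct

def dns_query(name: str, qtype: int = 1) -> bytes:
    """Build a minimal DNS wire-format query (qtype 1 = A record)."""
    # Header: ID 0 (as RFC 8484 recommends), RD flag set, one question.
    header = struct.pack(">HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    qname = b"".join(
        bytes([len(label)]) + label.encode() for label in name.split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", qtype, 1)  # class IN

def doh_url(name: str,
            resolver: str = "https://cloudflare-dns.com/dns-query") -> str:
    # RFC 8484 uses unpadded base64url for the ?dns= parameter.
    dns_b64 = base64.urlsafe_b64encode(dns_query(name)).rstrip(b"=").decode()
    return f"{resolver}?dns={dns_b64}"

print(doh_url("example.com"))
```

Because the resulting request is indistinguishable from any other HTTPS GET on port 443, network-level DNS monitoring never sees the query, which is exactly why defenders want clients resolving against DoH servers they control.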
More iOS app developers will be running DoH servers to capture DNS telemetry from clients. ë
Flexible workspaces rather than large campuses may become the trend
Meta has designed and built the AI Research SuperCluster, which will be the fastest AI supercomputer in the world when it is fully built out in mid-2022.
Developing the next generation of advanced AI will require powerful new computers capable of quintillions of operations per second. Meta has designed and built the AI Research SuperCluster, which is already among the fastest AI supercomputers running today and will be the fastest in the world when it is fully built out in mid-2022.
Meta researchers have already started using Research SuperCluster to train large models in natural language processing (NLP) and computer vision for research, with the aim of one day training models with trillions of parameters.
Research SuperCluster will help Meta’s AI researchers build new and better AI models that can learn from trillions of examples; work across hundreds of different languages; seamlessly analyse text, images, and video together; develop new augmented reality tools; and much more. Meta researchers will be able to train the largest models needed to develop advanced AI for computer vision, NLP, speech recognition, and more.
Research SuperCluster will help build new AI systems that can, for example, power real-time voice translations to large groups of people, each speaking a different language, so they can seamlessly collaborate on a research project or play an augmented reality game together.
Ultimately, the work done with Research SuperCluster will pave the way toward building technologies for the next major computing platform — the metaverse, where AI-driven applications and products will play an important role.
Meta has been committed to long-term investment in AI since 2013, when Meta created the Facebook AI Research lab. In recent years, Meta has made significant strides in AI in a number of areas, including self-supervised learning, where algorithms can learn from vast numbers of unlabelled examples, and transformers, which allow AI models to reason more effectively by focusing on certain areas of their input.
To fully realise the benefits of self-supervised learning and transformer-based models, various domains, whether vision, speech, language, or critical use cases like identifying harmful content, will require training increasingly large, complex, and adaptable models.
Computer vision, for example, needs to process larger, longer videos with higher data sampling rates. Speech recognition needs to work well even in challenging scenarios with lots of background noise, such as parties or concerts. NLP needs to understand more languages, dialects, and accents. And advances in other areas, including robotics, embodied AI, and multimodal AI will help people accomplish useful tasks in the real world.
High-performance computing infrastructure is a critical component in training such large models, and Meta’s AI research team has been building these high-powered systems for many years. The first generation of this infrastructure, designed in 2017, has 22,000 NVIDIA V100 Tensor Core GPUs in a single cluster that performs 35,000 training jobs a day. Up until now, this infrastructure has set the bar for Meta’s researchers in terms of its performance, reliability, and productivity.
In early 2020, Meta decided the best way to accelerate progress was to design a new computing infrastructure from a clean slate to take advantage of new GPU and network fabric technology. Meta wanted this infrastructure to be able to train models with more than a trillion parameters on data sets as large as an Exabyte — which, to provide a sense of scale, is the equivalent of 36,000 years of high-quality video.
(Left to right) Kevin Lee, RSC’s Technical Programme Manager, AI Research Infra Team at Meta; and Shubho Sengupta, Software Engineer, AI Research Infra Team at Meta.
• Fully realising the benefits of self-supervised learning and transformer-based models will require training increasingly large, complex, and adaptable models.
• Computer vision needs to process larger, longer videos with higher data sampling rates.
• Speech recognition needs to work well even with lots of background noise.
• NLP needs to understand more languages, dialects, and accents.
• Research SuperCluster will help Meta’s AI researchers build new and better AI models that can learn from trillions of examples.
• Research SuperCluster will help power real-time voice translations for large groups of people, each speaking a different language.
• Work done will pave the way toward building technologies for the next major computing platform – the metaverse.
• Meta’s previous AI research infrastructure leveraged only open source and other available data sets.
• Research SuperCluster translates into practice real-world examples from Meta’s production systems into model training.
• Meta can help advance research to perform downstream tasks such as identifying harmful content on platforms.
• Meta believes this is the first time performance, reliability, security, and privacy have been tackled at such a scale.
• Meta estimates some experiments could run for weeks and require thousands of GPUs.
• Research SuperCluster has to be researcher-friendly so Meta teams can explore a wide range of AI models.
• Meta researchers have already started using Research SuperCluster to train large models in natural language processing and computer vision.
• Meta expects such a step function change in compute capability to enable completely new user experiences, especially in the metaverse.
While the high-performance computing community has been tackling scale for decades, Meta also had to make sure it has all the needed security and privacy controls in place to protect any training data Meta uses.
Unlike with Meta’s previous AI research infrastructure, which leveraged only open source and other publicly available data sets, Research SuperCluster also helps ensure that Meta research translates effectively into practice by including real-world examples from Meta’s production systems in model training.
By doing this, Meta can help advance research to perform downstream tasks such as identifying harmful content on Meta platforms as well as research into embodied AI and multimodal AI to help improve user experiences on Meta’s family of apps. Meta believes this is the first time performance, reliability, security, and privacy have been tackled at such a scale.
AI supercomputers are built by combining multiple GPUs into compute nodes, which are then connected by a high-performance network fabric to allow fast communication between those GPUs. Research SuperCluster today comprises a total of 760 NVIDIA DGX A100 systems as its compute nodes, for a total of 6,080 GPUs — with each A100 GPU being more powerful than the V100 used in Meta’s previous system.
Each DGX communicates via an NVIDIA Quantum 1600 Gbps InfiniBand two-level Clos fabric that has no oversubscription. Research SuperCluster’s storage tier has 175 Petabytes of Pure Storage FlashArray, 46 Petabytes of cache storage in Penguin Computing Altus systems, and 10 Petabytes of Pure Storage FlashBlade.
Early benchmarks on Research SuperCluster, compared with Meta’s legacy production and research infrastructure, have shown that it runs computer vision workflows up to 20 times faster, runs the NVIDIA Collective Communication Library more than nine times faster, and trains large-scale NLP models three times faster. That means a model with tens of billions of parameters can finish training in three weeks, compared with nine weeks before.
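Those training-time and scale-up claims can be sanity-checked with simple arithmetic, using only the figures quoted in this article:

```python
# Sanity-check two figures quoted above (illustrative arithmetic only).
gpus_now, gpus_planned = 6_080, 16_000
scale_up = gpus_planned / gpus_now
print(f"GPU scale-up: {scale_up:.2f}x")  # ~2.63x, consistent with "more than 2.5x"

old_weeks, nlp_speedup = 9, 3
print(f"NLP training: {old_weeks // nlp_speedup} weeks instead of {old_weeks}")
```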
Designing and building something like Research SuperCluster is not a matter of performance alone but performance at the largest scale possible, with the most advanced technology available today.
When Research SuperCluster is complete, the InfiniBand network fabric will connect 16,000 GPUs as endpoints, making it one of the largest such networks deployed to date. Additionally, Meta designed a caching and storage system that can serve 16 TBps of training data, and Meta plans to scale it up to 1 Exabyte.
All this infrastructure must be extremely reliable, as Meta estimates some experiments could run for weeks and require thousands of GPUs. Lastly, the entire experience of using Research SuperCluster has to be researcher-friendly so Meta teams can easily explore a wide range of AI models.
• The first generation of Research SuperCluster was designed in 2017.
• The first generation consisted of 22,000 NVIDIA V100 Tensor Core GPUs in a single cluster, with 35,000 training jobs a day.
• AI supercomputers are built by combining multiple GPUs into compute nodes.
• Compute nodes are connected by a high-performance network fabric to allow fast communication.
• Research SuperCluster today comprises a total of 760 NVIDIA DGX A100 systems as its compute nodes, for a total of 6,080 GPUs.
• Research SuperCluster’s storage tier has 175 Petabytes of Pure Storage FlashArray, 46 Petabytes of cache storage in Penguin Computing Altus systems, 10 Petabytes of Pure Storage FlashBlade.
• When Research SuperCluster is complete, the InfiniBand network fabric will connect 16,000 GPUs as endpoints, making it one of the largest such networks.
• There was a need for a powerful storage solution that can serve Terabytes of bandwidth from an Exabyte-scale storage system.
• To serve the training’s needs, Meta developed a storage service, AI Research Store AIRStore, from the ground up.
• AIRStore utilises a new data preparation phase that pre-processes the data set to be used for training.
• AIRStore optimises data transfers so that traffic on Meta’s inter-datacentre backbone is minimised.
• In 2022, Meta will work to increase the number of GPUs from 6,080 to 16,000, which will increase AI training performance by more than 2.5x.
• The InfiniBand fabric will expand to support 16,000 ports in a two-layer topology with no oversubscription.
• Research SuperCluster is isolated from the larger Internet, with no direct inbound or outbound connections.
• Inside Research SuperCluster, traffic can flow only from Meta’s production datacentres.
• The entire data path from Meta storage systems to the GPUs is end-to-end encrypted.
• Before data is imported to Research SuperCluster, it must go through a privacy review process.
• The data is encrypted before it can be used to train AI models and data and decryption keys are deleted regularly.
A big part of achieving this was in working with a number of long-time partners, all of whom also helped design the first generation of Meta’s AI infrastructure in 2017. Penguin Computing, an SGH company and Meta’s architecture and managed services partner, worked with Meta’s operations team on hardware integration to deploy the cluster and helped set up major parts of the control plane.
Pure Storage provided a robust and scalable storage solution. And NVIDIA provided its AI computing technologies, featuring cutting-edge systems, GPUs, and InfiniBand fabric, and software stack components like NCCL for the cluster.
Research SuperCluster began as a completely remote project that the team took from a simple shared document to a functioning cluster in about a year and a half. The pandemic and industry-wide wafer supply constraints also brought supply chain issues that made it difficult to get everything from chips to components like optics and GPUs, and even construction materials — all of which had to be transported in accordance with new safety protocols.
To build this cluster efficiently, Meta had to design it from scratch, creating many entirely new Meta-specific conventions and rethinking previous ones along the way. Meta had to write new rules around its datacentre designs, including cooling, power, rack layout, cabling, and networking, as well as a completely new control plane, among other important considerations.
Meta had to ensure that all the teams, from construction to hardware to software and AI, were working in lockstep and in coordination with Meta partners.
Beyond the core system itself, there was also a need for a powerful storage solution, one that can serve terabytes of bandwidth from an Exabyte-scale storage system. To serve AI training’s growing bandwidth and capacity needs, Meta developed a storage service, AI Research Store AIRStore, from the ground up.
To optimise for AI models, AIRStore utilises a new data preparation phase that pre-processes the data set to be used for training. Once the preparation is performed one time, the prepared data set can be used for multiple training runs until it expires. AIRStore also optimises data transfers so that cross-region traffic on Meta’s inter-datacentre backbone is minimised.
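The prepare-once, reuse-until-expiry behaviour described for AIRStore can be sketched in generic terms. Everything below (the class name, TTL policy and keying scheme) is a hypothetical illustration of the pattern, not Meta’s actual API:

```python
import hashlib
import time

class PreparedDatasetCache:
    """Toy sketch of a prepare-once, reuse-until-expiry dataset cache."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._cache = {}  # dataset key -> (prepared data, timestamp)

    def get(self, raw_records, prepare):
        # Key the cache on the raw content so identical inputs share one prep.
        key = hashlib.sha256(repr(raw_records).encode()).hexdigest()
        entry = self._cache.get(key)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]              # reused across training runs
        prepared = prepare(raw_records)  # expensive pre-processing, done once
        self._cache[key] = (prepared, time.time())
        return prepared

cache = PreparedDatasetCache(ttl_seconds=3600)
data = cache.get([3, 1, 2], prepare=sorted)   # runs the preparation step
again = cache.get([3, 1, 2], prepare=sorted)  # served from the cache
```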
To build new AI models that benefit the people using Meta services — whether that’s detecting harmful content or creating new AR experiences — Meta needs to teach models using real-world data from Meta production systems.
Research SuperCluster has been designed from the ground up with privacy and security in mind, so that Meta’s researchers can safely train models using encrypted user-generated data that is not decrypted until right before training.
For example, Research SuperCluster is isolated from the larger Internet, with no direct inbound or outbound connections, and traffic can flow only from Meta’s production datacentres.
To meet Meta’s privacy and security requirements, the entire data path from Meta storage systems to the GPUs is end-to-end encrypted, with the necessary tools and processes to verify that these requirements are met at all times. Before data is imported to Research SuperCluster, it must go through a privacy review process to confirm it has been correctly anonymised, or that alternative privacy safeguards have been put in place to protect the data.
The data is then encrypted before it can be used to train AI models, and both the data and the decryption keys are deleted regularly to ensure older data is not still accessible. And since the data is only decrypted at one endpoint, in memory, it is safeguarded even in the unlikely event of a physical breach of the facility.
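The pattern of keeping data encrypted at rest and decrypting only in memory, immediately before use, can be illustrated with a deliberately simplified sketch. The XOR "cipher" below is for illustration only and is NOT real cryptography; a production system would use an authenticated encryption scheme:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher -- illustration only, NOT real cryptography.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Data is stored encrypted at rest...
key = secrets.token_bytes(32)
stored = xor_bytes(b"user-generated training example", key)

# ...and decrypted only in memory, right before training consumes it.
def train_step(ciphertext: bytes, key: bytes) -> int:
    plaintext = xor_bytes(ciphertext, key)  # plaintext exists only in this frame
    return len(plaintext)                   # stand-in for real training work

train_step(stored, key)
del key  # keys are deleted regularly, so older encrypted data becomes unreadable
```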
Research SuperCluster is up and running today, but its development is ongoing. Once Meta completes phase two of building out Research SuperCluster, Meta believes it will be the fastest AI supercomputer in the world, performing at nearly 5 exaflops of mixed precision compute. Through 2022, Meta will work to increase the number of GPUs from 6,080 to 16,000, which will increase AI training performance by more than 2.5x.
The InfiniBand fabric will expand to support 16,000 ports in a two-layer topology with no oversubscription. The storage system will have a target delivery bandwidth of 16 TBps and Exabyte-scale capacity to meet increased demand.
Meta expects such a step function change in compute capability to enable more accurate AI models for Meta’s existing services, and to enable completely new user experiences, especially in the metaverse.
Meta’s long-term investments in self-supervised learning and in building next-generation AI infrastructure with Research SuperCluster are helping us create the foundational technologies that will power the metaverse and advance the broader AI community as well.
• Early benchmarks have shown it runs computer vision workflows up to 20 times faster.
• Research SuperCluster runs NVIDIA Collective Communication Library more than nine times faster, trains large-scale NLP models three times faster.
• A model with tens of billions of parameters can finish training in three weeks compared with nine weeks before.
• Research SuperCluster is not a matter of performance alone but performance at the largest scale possible.
• Meta designed a caching and storage system that can serve 16 TBps of training data and plans to scale it up to 1 Exabyte.
• Meta estimates experiments could run for weeks and require thousands of GPUs.
• Once Meta completes phase two it will be the fastest AI supercomputer performing at nearly 5 Exaflops of mixed precision compute.
• In 2022, Meta will increase the number of GPUs from 6,080 to 16,000, which will increase AI training performance by more than 2.5x.
• The InfiniBand fabric will expand to support 16,000 ports in a two-layer topology with no oversubscription.
• The storage system will have a delivery bandwidth of 16 TBps and Exabyte-scale capacity to meet increased demand.
• Dynabook has continued Toshiba’s tradition of firsts.
• Dynabook is a dynamic, entrepreneurial new business, built on the foundations of Toshiba heritage.
• Dynabook has brought entry-level products to the market, and this has brought a lot of partners back to the vendor.
• Dynabook now provides non-laptop solutions to meet the requirements of different areas of the marketplace.
• Dynabook has launched a business in Turkey, investing in Middle East and Africa, as well as strengthening its team across Eastern Europe.
• Dynabook has the procurement strength of Foxconn, combined with Sharp components and Toshiba’s engineering facilities.
• Dynabook offers a range of accessories to support the lower margin laptop business.
• DynaEdge is a prime example of the vendor’s ability to innovate.
• Dynabook’s research has found 63% of organisations plan to deploy smart glasses within the next three years.
• Dynabook has continued Toshiba’s tradition of firsts by developing the first monocular AR solution.
• Dynabook developed the world’s thinnest and lightest hybrid laptop, in 2020.
Dynabook has broken away from its Toshiba heritage by moving into the B2B market space with non-laptop solutions like accessories, AR solutions and edge devices.
BY ARUN SHANKAR
In 1985, Toshiba developed the world’s first commercial laptop computer and the world’s first notebook computer. These two computing innovations allowed people to work safely and securely anywhere they wanted, and defined today’s mobile computing market.
Dynabook has continued Toshiba’s tradition of firsts by developing, among others, the first monocular AR solution, and the world’s thinnest and lightest hybrid laptop, in 2020. Under Toshiba the company primarily focused on the consumer business whereas Dynabook’s strategy is 100% B2B.
Dynabook has progressed in establishing a secure footing following its separation from Toshiba Corporation, amidst a dynamic and shifting environment. Dynabook has won business in the past years, with support and investment from Sharp fuelling its expansion.
Dynabook is a dynamic, entrepreneurial new business, built on the foundations of Toshiba heritage, Foxconn’s buying and manufacturing power, Sharp technologies in the field of displays, sensors and digital office solutions, and the financial stability that comes with the acquisition by Sharp Corporation in October 2018.
Amongst the first products that incorporate technologies developed by TCS and Sharp were Dynabook’s G-series laptops that combine a sub-kilogram weight with a 19-hour battery life.
Dynabook has expanded its portfolio extensively. The Toshiba PC business was predominantly focused on Ultrabooks, and mid-range and high-end propositions. Dynabook has brought more entry-level products to the market. This has brought a lot of partners back to the vendor, particularly those that are transacting with schools and SMBs.
And while laptops remain an integral part of its heritage and future strategy, Dynabook now provides non-laptop solutions to meet the requirements of different areas of the marketplace.
Dynabook has launched a business in Turkey, investing in Middle East, and Africa, as well as strengthening its team across Eastern Europe.
Dynabook has the procurement strength of Foxconn which, combined with Sharp components and Toshiba’s engineering facilities, makes for an efficient business model.
With its 100% focus on the B2B sector, its solutions offering does not carry tight margins. Dynabook offers a range of accessories to support the lower margin laptop business.
DynaEdge is a prime example of the vendor’s ability to innovate. The edge computing market has accelerated substantially over the past year, driven by factors such as the pandemic and 5G. Dynabook’s research has found 63% of organisations plan to deploy smart glasses within the next three years. In this case, Dynabook identified the demand and developed the product.
It is hard to accurately predict the future in this industry as it is so volatile. Take the pandemic as an example of new demands accelerating the development of certain technologies and deprioritising others. The future is to keep innovating and responding, and to avoid businesses becoming stagnant. The future of product development will be all about responding to demand. As a purely B2B focused brand, Dynabook is positioned quite differently from other brands in this space.
Dynabook’s heritage within the education space dates back 30+ years. Microsoft selected the Satellite Pro E10-S to be one of the first devices globally to run the education-focused Windows 11 SE. Dynabook understands every school, college and university is unique. That’s why Dynabook has developed a range of laptops and tablets.
Within the commercial space, Dynabook has expanded to offer its most comprehensive range of business devices, from ultra-mobile two-in-ones to budget-conscious all-rounders.
Dynabook remains a 100% channel-focused business.
As President, Damian Jaume explains his motivation and value system as he drives the strategy, business and innovation of Dynabook Europe.
I recently celebrated my 20th anniversary at Dynabook and have been fortunate enough in that time to work in almost every corner of the business. I started in a sales role in 2001, and since then have held various roles across the company which have enabled me to gain a broad view of all the moving parts of the business, and how they work together to deliver value to the organisation.
After working in sales, I then moved into more strategic roles. Firstly, planning for retail and commercial business lines, before shifting to operational financial planning and commercial management. These roles allowed me to grasp a true understanding of Dynabook’s business operations from the bottom up, and the experience gained is hugely influential in how I currently lead the business every day in my role as President of Dynabook Europe.
I took that position in 2014 and am proud to have recently become the longest serving President in the company’s history. All business leaders want to grow the company they lead, and I am no different – it is essential to succeed in the position.
Overcoming the challenge of growing Dynabook from a small part of a large corporation to a major player in its own right has shaped the last few years of my tenure. Yet, I want to have a significant cultural impact as well, to foster employees that are proud to work for Dynabook and are passionate about what they do.
“ALL BUSINESS LEADERS WANT TO GROW THE COMPANY THEY LEAD, AND I AM NO DIFFERENT”
In the cybersecurity arena, there are essentially two entry points – endpoints and servers – and they are fundamentally different from each other. An application hosted on a server can be accessed by both good guys and bad guys, points out Satya Gupta, Cofounder and Chief Technology Officer, Virsec.
Moreover, endpoints are protected by solutions called Endpoint Detection and Response, EDR, that have been built on two fundamental assumptions that no longer hold true.
EDR solutions have been built on the premise that all the signatures of attacks can be catalogued and that threat attackers will at some point of time or the other, use one of those signatures for their attacks. Another assumption built into EDR solutions is that the average time span of any threat attack is an extended duration, during which time the EDR solution can detect, respond and alert the end user into some action if necessary.
“These EDR technologies are not really suited to protecting servers anymore,” says Gupta. “The world is very different today. Attackers are coming in unannounced, and within seconds, they will have completely wiped your machine. They are not shy about employing the best skills, because people pay $15 million ransom these days, right?” he adds.
“INSTEAD OF FOCUSING ON THE ATTACKER, WE FOCUS ON THE APPLICATION,” SAYS SATYA GUPTA, CTO VIRSEC
Expectations that threat actors are obliged to use predefined techniques and signatures that can be added to a library and catalogued by EDR solutions for the benefit of customers can now be labelled as old school, feels Gupta. “It is like trying to count every drop of water in the ocean, that many techniques exist out there,” he adds.
Today’s reality check is that attackers can take advantage of vulnerabilities in code and, within seconds, infiltrate a network, launch ransomware and convert servers and industrial machines into bricks.
The Virsec approach
In order to meet the new realities of today and tackle modern day threat attacks, Virsec is adopting a new and completely different approach towards offering cyber security solutions. “What we do is instead of focusing on the attacker, we focus on the application. All other security controls focus on the attacker, we focus on the developer,” says Gupta.
Attacker techniques by default and for logical reasons have to keep changing, to avoid detection and to remain commercially successful. However, developers write code and use code to build applications. The code inside applications produces the same results irrespective of time, place or date. “In other words, code behaves remarkably the same way, and it does the same thing over and over again,” points out Gupta. This allows application code when executed to be used as a running timeline of application integrity.
When threat actors override application code by running their own code at the same time, this is detected against the baseline of application code.
“When the code starts doing things that are not part of the infrastructure, not part of the design, it is easy to spot that. That is the fundamental difference between our technology and everybody else’s technology. We are tracking code, they are tracking data,” says Gupta.
Inside the CPU
If an application always runs the developer code, it will not serve the purpose of the attacker, since they will not get access and control, into the organisation’s infrastructure. For that to happen, and for the attacker to inflict some damage, the attacker must be able to run their own code while the application code is being executed.
The approach adopted by Virsec is to focus on that point in time when attackers are able to gain the privilege of executing their code along with the application workload.
“Our technology is focused on detecting that moment of truth, when the attacker is gaining privileges of executing code on your system. Because after that happens, anything that they do is completely out of your control,” explains Gupta. “Whether they do ransomware, whether they do exfiltration, or both, and something else is an after effect of that step.”
Virsec tools read the executable files and the libraries of code and build a database of these files. This database is now carrying all the information about executable files that will be present in-memory when the enterprise’s application workloads are running.
Checklist
• EDR solutions have been built on the premise that all signatures of attacks can be catalogued.
• Another assumption built into EDR solutions is that the average time span of any threat attack is an extended duration.
• Code behaves remarkably the same way, and it does the same thing over and over again.
• This allows application code when executed to be used as a running timeline of application integrity.
• When threat actors override application code by running their own code at the same time, this is detected.
• The approach is to focus on that point in time when attackers are able to gain the privilege of executing their code.
• People are beginning to realise this notion of detecting and responding is not what you would like to do.
• Virsec works with languages and other standard frameworks since people use them all the time.
• Another market trend has been early adoption by selective market segments like government, defence, national and critical infrastructure.
• Early adopters and early innovators are those enterprises who cannot afford the risk of any type of attacks.
• Those market segments that cannot afford to perpetuate the outdated detect and respond approach are typically the most critical market segments of any nation.
When the applications are running, Virsec’s kernel module checks the integrity of the executable files being loaded into memory against those saved in its database. A logical question to ask at this stage would be: can malware from threat actors also change the executable file database maintained by Virsec?
“We check integrity in the kernel module. That is essentially out of bounds of malware,” explains Gupta.
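The integrity check Gupta describes can be approximated in user space with a hash allowlist. This is a simplified sketch of the general technique only; Virsec’s actual implementation runs in a kernel module and its internals are not public:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Fingerprint a binary image."""
    return hashlib.sha256(data).hexdigest()

# Provisioning: record the hashes of every known-good executable and library.
trusted = {
    "app.bin": sha256(b"\x7fELF...genuine application code"),
    "libcrypto.so": sha256(b"\x7fELF...genuine library code"),
}

def verify_load(name: str, image: bytes) -> bool:
    """Allow a module to load only if its hash matches the trusted database."""
    return trusted.get(name) == sha256(image)

# The unmodified binary loads; a tampered one is rejected.
assert verify_load("app.bin", b"\x7fELF...genuine application code")
assert not verify_load("app.bin", b"\x7fELF...patched by an attacker")
```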
Another logical question would be whether the database lookup and compare process by the Virsec tools actually hampers the performance of the application workload.
“We have very thin instrumentation and see very little performance impact,” elaborates Gupta.
Another way to view the decision-making process of the Virsec instrumentation is assessing whether the transition from function one to function two of the application code was legal, based on the disassembly created by the Virsec instrumentation at the beginning of the run-time.
“What matters is where did the instruction that is executing in the CPU come from. That is the domain in which we play,” continues Gupta.
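The notion of a "legal" transition between functions, derived from disassembly at the start of run-time, resembles control-flow integrity. A minimal sketch of that idea, using hypothetical function names:

```python
# Allowed call edges, as a disassembler might derive them from the binary.
allowed_edges = {
    ("main", "parse_request"),
    ("parse_request", "handle_request"),
    ("handle_request", "send_response"),
}

def transition_is_legal(caller: str, callee: str) -> bool:
    """A transition is legal only if it was present in the compiled design."""
    return (caller, callee) in allowed_edges

assert transition_is_legal("main", "parse_request")
# An attacker redirecting execution to injected code is not on the map:
assert not transition_is_legal("parse_request", "attacker_shellcode")
```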
Inside the application space
According to Gupta, there are essentially two kinds of applications. An application that has been written in a compiled language, or those that have been written for web in an interpreted language. While the attack mechanisms are different in the two types of applications, the foundational principle that the attacker wants to be able to run code on a machine still remains the same.
Web languages include Java, .NET, .NET Core, PHP, Ruby and Node.js, while developers also use web frameworks like Tomcat, WebLogic, WebSphere and Ruby on Rails.
Virsec works with those languages and other standard frameworks since people use them all the time, again and again. “Our technology grows by the number of frameworks and the number of languages that we protect. We protect 90% of the web languages today, but we are always adding some more exotic ones that come up,” explains Gupta.
Another market trend around the Virsec solution portfolio has been early adoption by selective market segments like government, defence, and national and critical infrastructure, amongst others. And there is a reason for that.
“More and more people are beginning to realise that this notion of detecting and responding is not what you would like to do. If you can have proactive protection, it is a whole lot better than detecting and responding,” points out Gupta. And the customers that are the earliest to realise this are the ones typically in the most critical market segments.
For Virsec, the early adopters and the early innovators are those enterprises who cannot afford the risk of any type of attack, and definitely those market segments that cannot afford to perpetuate the outdated detect and respond approach. This is typically the frontline and the most critical market segment of any nation.
“So, we are very ubiquitous in the sense that we can protect any workload. In fact, the mission of our company, our longer-term mission is to make cybersecurity irrelevant. It is a very profound statement,” sums up Gupta.
Commvault has been selected by United Arab Bank to implement Commvault Complete Data Protection, with built-in ransomware protection for all their workloads, across the whole organisation.
United Arab Bank decided to partner with Commvault to install Complete Data Protection as a single, powerful backup software solution for data protection. UAB, a leading financial solutions provider in the UAE, will benefit from comprehensive on-premises, public cloud, and hybrid multi-cloud workload coverage from a single platform and user interface.
All backup, archive, replication and disaster recovery across all the bank’s environments will be covered, including seamless integration with IBM AS400 and AIX systems; Commvault will offer UAB support for legacy and modern workloads within a single solution.
Commvault Complete Data Protection helps ensure data availability and business continuity across on-premises and cloud environments. The solution is easily managed through a web-based user interface with role-based access control, enabling fully permissioned self-service.
Gulf Bridge International, a global cloud, connectivity and content enabler, is deploying Ciena’s GeoMesh Extreme, powered by WaveLogic 5 Extreme, to increase GBI’s Smart Network capacity and performance. This upgrade will help GBI meet the demanding internet traffic requirements between the Gulf Cooperation Council GCC countries, Europe and India. This will also help prepare the region for major upcoming sporting events – including the world’s largest football competition – in Qatar and the wider region.
Surges in video streaming, cloud computing and 5G have driven bandwidth demand and the need for submarine and terrestrial network upgrades globally. GBI’s upgrade with Ciena uses the latest technologies to increase design capacity by 10 Tb/s, enhance its capabilities, and provide increased flexibility in delivery times.
GBI will leverage Ciena’s WaveLogic 5 Extreme WL5e to optimise its design capacity and help significantly reduce the cost per bit. Additionally, the embedded software intelligence in WL5e gives GBI the ability to dynamically fine tune capacity to meet changing network demands. Ciena’s Manage, Control and Plan MCP domain controller will provide software control and automation to accelerate network operations. With this network upgrade, GBI intends to provide the best possible digital experiences to its partners, customers and end-users alike.
As part of GBI’s infrastructure evolution, Ciena has increased GBI’s Smart Network lit capacity by 10 Tb/s.
Riverbed | Aternity announced that Shelf Drilling, an offshore drilling company providing shallow water services to the oil and gas industry, has successfully implemented Riverbed SaaS Accelerator and Riverbed SteelHead to overcome the inherent bandwidth limitations of the VSAT links used to connect its off-shore rigs to mission-critical on-premises and cloud-based applications. By delivering a 95% reduction to the company’s Intranet traffic while optimising core applications by 80%, the solution has helped Shelf Drilling streamline critical rig operations including order processing, staff training, and operations management.
Headquartered in Dubai, Shelf Drilling operates 30 rigs across eight countries and has 12 onshore locations, with approximately 3,000 employees and contractors. While technology plays a fundamental role in operations, the company’s offshore rigs present unique IT challenges.
To address these challenges, Ian Clydesdale, IT Director at Shelf Drilling, and his IT team implemented Riverbed SaaS Accelerator to accelerate business-critical SaaS applications and Riverbed SteelHead for WAN Optimisation to maximise network and application performance. These solutions integrate with key technologies such as Shelf Drilling’s VMware virtualisation software, JD Edwards, and Microsoft 365 to improve performance.
Riverbed | Aternity has enabled the company to achieve a compression rate of over 50% on 1.2 terabytes of email data. Performance monitoring traffic is also accelerated at an average rate of 67% per month, and for their ERP suite, the company experiences 81% optimisation along with a 95% reduction in database replication traffic. Additionally, Riverbed | Aternity streamlines replication of data from offshore instances to the virtual cloud environment.
As satellite links are the primary means of connectivity, these rigs typically have low bandwidth of just 1-2 Mbps and high latency of up to 750 milliseconds. As a result, some key applications require optimisation to manage data replication effectively, including the JD Edwards ERP platform, responsible for generating purchase orders in the procurement process, as well as the company’s operational management applications.
Additionally, all the usual administrative functions need to share and access large files, especially during key rig operations. Employee satisfaction is another aspect that’s heavily dependent on connectivity.
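The quoted link characteristics make the optimisation gains easy to picture. As a rough, hedged illustration (the 100 MB file size is our own example, and the idealised maths ignores latency, TCP behaviour and protocol overhead):

```python
# Back-of-envelope: moving a 100 MB file over a 2 Mbps VSAT link,
# before and after a 95% traffic reduction (figures quoted above).
def transfer_seconds(size_mb: float, link_mbps: float) -> float:
    """Idealised transfer time: payload bits divided by link rate."""
    return (size_mb * 8) / link_mbps

before = transfer_seconds(100, 2)        # full payload over the 2 Mbps link
after = transfer_seconds(100 * 0.05, 2)  # same link, 95% of traffic removed

print(f"before: {before / 60:.1f} min, after: {after / 60:.1f} min")
# before: 6.7 min, after: 0.3 min
```

In practice, the 750-millisecond round trips make chatty protocols far worse than this simple model suggests, which is why acceleration, not just compression, matters on such links.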
Since Riverbed | Aternity began optimising Shelf Drilling’s Internet traffic, outages are now less likely to affect data transfer. This is key to the company’s cloud-first strategy which includes the migration to Microsoft Azure in the UAE.
ServiceNow announced that Amadeus has selected the Now Platform to accelerate and secure the migration of its operations to the public cloud. As part of a major transformational effort for the company, Amadeus will leverage the Now Platform to enable its development teams to rapidly innovate and deliver a new way of working for the ever-changing travel market.
For over 30 years, Amadeus has relied on its own in-house tools and datacentre to build and deliver the critical solutions that help airlines and airports, hotels and railways, search engines, travel agencies, tour operators and other travel players to run their operations and improve the travel experience, billions of times a year, all over the world.
Together with ServiceNow, armed with the benefits of the cloud, Amadeus will scale and innovate across a business that helps connect 1.6 billion+ people a year to local travel providers across 190+ countries, better anticipating trends and customer behaviour, said Yannis Daubin, Area Vice President & General Manager of ServiceNow, France.
To further improve efficiency and excellence in service delivery, Amadeus has selected ServiceNow’s IT Service Management Pro ITSM, IT Operations Management Enterprise ITOM and Customer Service Management Pro CSM solutions.
Oracle announced that Qatar Airways has implemented Oracle Fusion Cloud Enterprise Performance Management as the multiple award-winning airline transitions through the pandemic and embarks on a major global expansion. With Oracle Cloud EPM, Qatar Airways financial teams gain the transparency and flexibility needed for more accurate planning, budgeting, and forecasting. The improved processes are helping the airline increase agility, improve insights and enhance business decision-making across the organisation.
The airline needed to automate and streamline its financial and planning processes to control and manage resources and investments, improve reporting capabilities, and align the organisation behind its ambitious expansion plans more effectively. With the commercial aviation sector recovering from the pandemic in the second half of 2021, Qatar Airways is quickly returning to its pre-pandemic network of 180 destinations and resuming its expansion program.
Oracle Cloud EPM enables Qatar Airways to better connect operational and financial data across HR, finance, supply chain and sales to improve management insights, accelerate decision-making, and enhance the company’s business modelling and planning. Moving business processes to Oracle Cloud EPM has also enabled the company to eliminate manual processes in financial reporting to improve the speed, accuracy and insights of reports.
Oracle Consulting is the implementation partner for this initiative.
Tech Mahindra announced it has been working with Telefónica Germany to digitally transform its microwave network with open software defined networking. Telefónica is working to standardise its management interface by collaborating with multiple vendors and partners. Tech Mahindra is helping Telefónica achieve this by bringing its domain expertise and continuous integration capabilities to the partnership. Tech Mahindra has been working with the telco to implement new standards for its DevOps model including enhancing its processes and requirements.
This includes the continuous management support of service operations for the organisation’s overall SDN Architectures. The project involves the deployment of microwave transmission automation aimed at elevating service delivery and network operations. All microwave devices in Telefónica Germany’s mobile
backhaul network are accessible via a single harmonised Network API which supports an open microservice framework.
Telefónica has been able to integrate 30,000 microwave links from multiple vendors into the OpenDaylight Controller using ONF TR-532 interface standard. This also reflects Tech Mahindra’s commitment to invest in open-source technologies to accelerate scaling out networks. The move to SDN will enable Telefónica to adopt automation functionality in the future more easily.
stc signed partnership agreements with a number of specialised companies to support the establishment of a modern, new generation of cloud-based datacentres. As part of the third phase of the datacentres project, stc recently launched a new datacentre in Jeddah. These steps aim to expand stc’s capabilities and capacities and accelerate the implementation of the Kingdom’s digital transformation goals through a flexible and global-level data distribution process.
These agreements are an extension of the third phase of stc’s newly launched datacentres project, the largest in the region. This phase would provide seamless data distribution on a
global level in compliance with the objectives of the Saudi Green Initiative for Environmental Sustainability.
The most prominent of these agreements are the ones concluded with SBM, with the aim of increasing the efficiency of datacentres using flexible and advanced technologies; Huawei, with the aim of supporting datacentres by ensuring a flexible flow of data and digital information traffic between different technical facilities to maintain business continuity; and MMR, with the aim of enhancing the infrastructure of modern datacentres.
In addition, stc inaugurated the new datacentre in Jeddah under the third phase of the datacentres project. It is the first neutral datacentre in the region, with a capacity of 1.2 megawatts and 150 server racks. It combines many digital services that ensure fast and secure access to the high-capacity local and international network that connects various IGW and MPLS networks.
The centre is part of stc’s strategic plans for datacentres, positioning the company as a gateway to the digital infrastructure of the Middle East and supporting the Kingdom’s digital transformation by providing important digital availability areas that deliver an integrated set of secure services and global-level service management. It also improves stc’s digital technologies and communication services across the major cities in the Kingdom, including Jeddah.
ThycoticCentrify announced new and expanded capabilities for its award-winning PAM solution, Secret Server. With the addition of new security controls, automation, and design updates, Secret Server builds on its industry-leading secrets management capabilities and ease-of-use to offer greater protection and higher productivity.
The latest Secret Server release allows organisations to rotate Secret Server’s master encryption key on demand. Rotating the encryption protecting the individual secrets housed within the digital vault provides an additional layer of protection that blocks external actors from gaining access to them.
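On-demand master key rotation is typically built on envelope encryption: each secret is protected by its own data key, and the data keys are wrapped under the master key, so rotating the master only re-wraps the small keys rather than re-encrypting the stored secrets. A minimal sketch of that pattern (our own toy illustration, not Secret Server’s implementation; the SHA-256 counter-mode XOR below stands in for a real authenticated cipher such as AES-GCM):

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with SHA-256(key || counter) blocks.
    Illustration only -- a real vault would use an authenticated cipher."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

class Vault:
    def __init__(self):
        self.master = secrets.token_bytes(32)
        self.wrapped = {}  # secret name -> data key wrapped under master key

    def store(self, name: str, secret: bytes) -> bytes:
        data_key = secrets.token_bytes(32)
        self.wrapped[name] = keystream_xor(self.master, data_key)
        return keystream_xor(data_key, secret)  # ciphertext kept by caller

    def read(self, name: str, ciphertext: bytes) -> bytes:
        data_key = keystream_xor(self.master, self.wrapped[name])
        return keystream_xor(data_key, ciphertext)

    def rotate_master(self):
        """Re-wrap every data key under a fresh master key. Stored
        ciphertexts are untouched, so rotation is cheap and on demand."""
        new_master = secrets.token_bytes(32)
        for name, wrapped_key in self.wrapped.items():
            data_key = keystream_xor(self.master, wrapped_key)
            self.wrapped[name] = keystream_xor(new_master, data_key)
        self.master = new_master

v = Vault()
ct = v.store("db-password", b"s3cret")
v.rotate_master()
print(v.read("db-password", ct))  # b's3cret' -- still readable after rotation
```

Because only the wrapped keys change, rotation is cheap enough to run on demand, as the release describes.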
Secret Server also streamlines the connection process for organisations that use jump boxes to protect access to critical resources. Rather than taking time to inject unique credentials at every connection point, users can now use a single key to navigate an entire route from launch, to jump box, to destination within a single session. Users can launch the end-to-end route via Secret Server or the interface of the Connection Manager session management tool.
To enhance auditing and compliance, Secret Server ensures that only one privileged user at a time can use a secret. When secrets aren’t checked back in to Secret Server after use, critical maintenance operations can’t be performed, and productivity slows. The latest release automatically checks in secrets for API connections after expiration.
Additionally, users now have more visibility into remaining time on a secret checkout and can extend the checkout if required.
The latest release also includes enhancements to the Secret Server interface, logging, and reporting to increase usability and accessibility through improved keyboard navigation and screen reader hints.
NetApp ONTAP has been recognised by the US National Security Agency for data-centric security capabilities that make it easier for organisations to protect their data
NetApp announced that NetApp ONTAP, the world’s leading storage operating system, is the first enterprise storage and data management platform to achieve Commercial Solutions for Classified CSfC validation for a data-at-rest DAR capability package. With this, organisations across the globe can benefit from NetApp ONTAP’s robust security capabilities to protect customers’ information on-premises and in remote locations from foreign actors, ransomware attacks or other data loss threats they may face.
A cybersecurity program led by the U.S. National Security Agency NSA, CSfC is a key component of the organisation’s commercial cybersecurity strategy. CSfC validates commercial IT products that have met the highest level of strict encryption standards and rigorous security requirements for both hardware and software solutions. Recently, the NSA has recommended that federal agencies hosting secret or top-secret data utilise storage solutions that have been CSfC validated.
Linksys, a global leader in home and business Wi-Fi solutions, announced the launch of its Wi-Fi 6 Cloud Managed Access Point.
As retail stores, medical offices, small businesses, and commercial spaces begin to operate at pre-pandemic levels and foot traffic continues to increase, the Wi-Fi 6 Cloud Managed Access Point’s 4x4 internal antennas deliver safer, more secure and faster Wi-Fi for areas needing to service a high density of concurrent client connections. With the control plane in the cloud and a cloud-native operating system running on the units, zero-touch provisioning, configuration, management, and monitoring are extremely efficient and simple. With access to Linksys Cloud, users can also scale with no limit to the number of networks and devices managed. Outfitted with a smaller and sleeker design, the new access point also comes with Linksys Cloud and free cloud management.
Managing massive datasets is excessively complex and costly. Seagate announced the new Exos Application Platform with a new controller featuring 2nd Gen AMD EPYC processors. The efficient, scalable, affordable end-to-end compute and storage platform delivers integrated compute and storage in a single enclosure optimising rack space utilisation, power efficiency, heat dissipation, and storage density.
The need for advanced storage solutions continues to rise to unprecedented levels with the ever-increasing growth in data generation. According to Rethink Data, a report commissioned by Seagate and conducted by the research firm IDC, enterprise data is expected to grow at an average rate of 42.2% over the next two years. The survey also found that only 32% of data available to enterprises is put to work; the remaining 68% goes unleveraged.
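Reading the 42.2% figure as an annual growth rate (our assumption about the survey’s framing), two years of compounding roughly doubles the data volume:

```python
# Compound growth: 42.2% per year for two years (annual rate assumed).
rate = 0.422
data = 1.0                      # today's volume, normalised
for _ in range(2):
    data *= 1 + rate
print(f"after two years: {data:.2f}x today's volume")  # after two years: 2.02x today's volume
print(f"unleveraged share today: {1 - 0.32:.0%}")      # unleveraged share today: 68%
```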
Exos AP options with the all-new AP-BV-1 controller offer exceptional compute and storage performance in a single chassis. With dual AMD EPYC processor-based controllers, the system delivers high availability or controller partitioning, with the flexibility of a common controller slot allowing connection to additional Exos E SAS expansion units in matched chassis. The architecture is perfectly balanced for current and future CPUs and drive capacities.
AMD EPYC processors offer a combination of features to help support the need for advanced storage solutions. In the Exos AP Enterprise Data Storage System Controller, AMD EPYC processors offer core counts of 8, 12, or 16 for varying levels of performance. The processors also provide the solution with dedicated PCIe® 4 lanes that deliver 200GbE network connectivity and high bandwidth to the SAS controller for faster HDD and SSD response. Finally, the Exos AP system supports 25GbE on the motherboard, providing base IO, which is often an added cost on competitor platforms.
Data drives today’s most innovative technology and business breakthroughs. Maximising the value of an organisation’s data is dependent on the ability to store, access, and activate as much data as possible. Seagate launched the new Exos X20 20TB and IronWolf Pro 20TB conventional magnetic recording CMR-based hard disk drives HDDs, increasing mass-capacity data storage capabilities.
Seagate’s Exos X20 enterprise HDD is designed for maximum storage capacity and the highest rack-space efficiency. Built with cloud storage in mind, the 20TB Exos X20 delivers performance for hyperscale datacentres and massive scale-out applications. With low latency of 4.16ms and repeatable response times, Exos X20 provides enhanced caching that performs up to three times better than solutions that only utilise read or write caching. Exos X20 also delivers an increased sustained data rate SDR of up to 285 MB/s.
With available Seagate Secure technology and a 2.5M-hr MTBF rating, enterprises count on Exos X20 to realise their greatest data and operational efficiencies, and highest storage densities in the datasphere.
The Exos X20 HDD can be paired with Seagate’s recently announced Exos CORVAULT intelligent storage system to deliver maximum data density in a small footprint. Built on Seagate’s 4U chassis accommodating 106 Exos enterprise drives in only seven inches 18 cm of rack space, CORVAULT offers over 2.12PB of SAN-level performance built on Seagate’s breakthrough storage architecture.
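The quoted chassis density checks out with quick arithmetic (decimal terabytes assumed):

```python
# 106 Exos enterprise drives at 20 TB each in one 4U chassis.
drives, tb_per_drive = 106, 20
raw_tb = drives * tb_per_drive
print(f"{raw_tb} TB = {raw_tb / 1000:.2f} PB per 4U")  # 2120 TB = 2.12 PB per 4U
```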
Seagate’s new IronWolf Pro 20TB HDD offers network attached storage NAS-optimised AgileArray technology to provide exceptional RAID reliability and compatibility during the heaviest NAS workloads that SMBs might require. Designed with built-in rotational vibration RV sensors, IronWolf Pro 20TB offers RV mitigation to provide reliable performance for NAS systems with little lag or downtime.
Exos X20 for cloud storage, hyperscale datacentres, massive scale-out apps
Lenovo announced the expansion of the Lenovo ThinkEdge portfolio with the introduction of the new ThinkEdge SE450 server, delivering an artificial intelligence AI platform directly at the edge to accelerate business insights. The ThinkEdge SE450 advances intelligent edge capabilities with best-in-class, AI-ready technology that provides faster insights and leading computing performance to more environments, accelerating real-time decision making at the edge and unleashing full business potential.
Designed to stretch the limitations of server locations, Lenovo’s ThinkEdge SE450 delivers real-time insights with enhanced compute power and flexible deployment capabilities that can support multiple AI workloads while allowing customers to scale. It meets the demands of a wide variety of critical workloads with a unique, quieter go-anywhere form factor, featuring a shorter depth that allows it to be easily installed in space constrained locations.
The GPU-rich server is purpose-built to meet the requirements of vertically specific edge environments, with a ruggedised design that withstands a wider operating temperature, as well as high dust, shock and vibration for harsh settings. As one of the first NVIDIA-Certified Edge systems, Lenovo’s ThinkEdge SE450 leverages NVIDIA GPUs for enterprise and industrial AI at the edge applications, providing maximum accelerated performance.
Security at the edge is crucial and Lenovo enables businesses to navigate the edge-to-cloud frontier confidently, using resilient, better secured infrastructure solutions that are designed to mitigate security risks and data threats. The ThinkEdge portfolio provides a variety of connectivity
and security options that are easily deployed and more securely managed in today’s remote environments, including a new locking bezel to help prevent unauthorised access and robust security features to better protect data.
The ThinkEdge SE450 is built on the latest 3rd Gen Intel Xeon Scalable processor with Intel Deep Learning Boost technologies, featuring all-flash storage for running AI and analytics at the edge and optimised for delivering intelligence. It has been verified by Intel as an Intel Select Solution for vRAN. This pre-validated solution takes the guesswork out of the evaluation and procurement process by meeting strictly defined hardware and software configuration requirements and rigorous system-wide performance benchmarks to speed deployment and lower risk for communications service providers.
Lenovo expands its ThinkEdge portfolio with GPU-rich edge server designed to accelerate business critical insights
New compact, ruggedised and quiet Lenovo ThinkEdge SE450 server delivers performance and scalability at the edge
Through an agile hardware development approach with partners and customers, the Lenovo ThinkEdge SE450 is the culmination of multiple prototypes, with live trials running real workloads in telecommunication, retail and smart city settings. The ThinkEdge SE450 AI-ready server is designed specifically for enabling a vast ecosystem of partners to make it easier for customers to deploy these edge solutions. As enterprises build out their hybrid infrastructures from the cloud to the edge, it is the perfect extension for the on-premises cloud currently supporting Microsoft, NVIDIA, Red Hat and VMware technologies.
Pure Storage announced FlashArray//XL, the newest member of the FlashArray family, designed for mission-critical, platinum tier enterprise applications — from massive databases to containerised and cloud-native apps. FlashArray//XL delivers unmatched performance and scale with a nearly 80% improvement in IOPS.
With the pace of business only increasing, application demand can shoot up in an instant and the need for new apps to be deployed requires IT to work in hours, not months. Previously, the
largest enterprises had to scale their top tier applications with legacy technology that required high management complexity, high power and energy costs, and disruptive forklift upgrades.
Now, with the new FlashArray//XL and Pure Fusion — Pure’s self-service, autonomous Storage-as-Code platform — Pure continues to add enterprise-grade scale to its promise of subscription storage that easily evolves to keep IT infrastructure agile and up to date.
FlashArray//XL and Pure Fusion were built to meet demand from new customers who wanted to standardise their storage environments on the simplicity of the Pure subscription model but needed new levels of scale. This demand is already seeing early validation with pre-release orders to standardise top enterprise workloads on FlashArray//XL arrays, as customers eliminate the last remnants of their legacy Dell EMC footprint.
Available now, FlashArray//XL delivers up to 5.78 PB effective capacity, as low as 150µs latency, and up to 36GB/s throughput, with industry-leading 5:1 data reduction average, 10:1 total efficiency, and proven 99.9999% availability in a 5U platform.
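Those headline figures can be unpacked with simple arithmetic; note that "effective" capacity already assumes the advertised data-reduction ratio. (Vendor effective-capacity maths varies by workload, so this is illustrative only.)

```python
# Physical flash implied by the quoted effective capacity and reduction ratio.
effective_pb, reduction = 5.78, 5
physical_pb = effective_pb / reduction
print(f"about {physical_pb:.2f} PB of physical flash behind 5.78 PB effective")
# about 1.16 PB of physical flash behind 5.78 PB effective
```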
Linksys announces availability of its latest and more affordable Wi-Fi 6 product, the Hydra Pro 6. The Linksys Hydra Pro 6 delivers the ultimate Wi-Fi 6 experience to 30+ devices per node across 2700 sq. ft. of coverage and wireless speeds up to 5.4 Gbps. Powered by the Qualcomm
Immersive Home 216 Platform plus access to a 160 MHz channel, the Hydra Pro 6 unleashes the true power of Wi-Fi 6 with reliable, incredibly fast connectivity and improved network efficiency for seamless video streaming, faster downloading and more. Intelligent Mesh technology offers whole home mesh Wi-Fi coverage that’s easily expandable by adding nodes.
The dual-band mesh system delivers a fast, reliable and secure connection at a more affordable price
At a time when device-heavy homes are more dependent on Wi-Fi than ever before, the powerful yet easy to use Hydra Pro 6 is a seamless addition to households and allows users to view and prioritise devices through the free Linksys app. Additional features include WPA3/WPA2-Personal encryption and SPI firewall, automatic security updates, parental controls, and a separate guest network.
The Linksys Hydra Pro 6 is now available in the UAE for AED 607.57 and will be available at most retailers in the country.
Drawing a line to represent your organisation’s digital perimeter is problematic, since such perimeters are now expanding outward like a balloon filling up with air.
The region’s cybersecurity leaders may be asking themselves variants of the question: how secure is my environment? Many of them have been swept along in a recent wave of unavoidable change. Migration to the cloud undoubtedly left regional CISOs wondering about a lot of things as they returned to the office.
Given the complexity of pandemic-era IT stacks, chief among these musings would surely be: how many people have access to my network? Who are they? Where are they? How can I find out? And, of course: what can I do about it?
Vendors and other trusted partners connect to your environment. IT leaders are aware of this, of course. But while they can say with confidence when employees are connecting and what they are doing when they connect, can they report with similar certainty on the activity of third parties? And it is not just about people. Third parties have their own endpoints and applications that may or may not adhere to your best practice standards.
We live in a region where regulatory burdens are becoming heavier. We must be able to report on what vendors are doing with our data and to what extent their technology infrastructure is aligned with our legal obligations.
BeyondTrust research from 2019 shows that, on average, more than 180 different vendors access an enterprise’s network every week and 58% of companies believe that access directly led to a breach. Consultants, service providers, contractors, and many others routinely access systems that lie a few lateral hops away from sensitive areas.
Inadequate auditing of third-party accounts can lead to a range of problems. Perhaps they introduce malware directly. Or an orphaned account is used weeks or months later by a threat actor. It is all too easy, especially when an account lies unused, for a malicious party to gain access to vendor-controlled systems, and exploit their vulnerabilities, moving from node to node, resource to resource, gaining control over assets and establishing themselves as an authorized insider.
VPNs are commonly used as a safe method of third-party access, but the privileges they grant take vital granular controls away from the security function. Malware can still piggyback on vendor sessions, all because the third-party account has unnecessary privileged access. The least-privilege principle applies to outside agencies as much as it does to employees. Best practices such as password management and session auditing are critical.
As challenging as it may sound, the goal here is to enforce policy across two or more companies. Requiring it and enforcing it are, of course, two different things, but you need to find a way to make sure that vendor passwords are regularly rotated, and that they have not been compromised. A privileged password management system is one option. Multifactor authentication, while not foolproof, is best practice and can prevent incidents where credentials have recently been stolen.
Controlled and highly visible inbound network access governed by
least-privilege access models should be implemented. Vendors commonly do not need admin access to complete their tasks. It also helps to use a just-in-time model, granting access privileges as they are required and ensuring they expire as soon as they are no longer needed.
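The just-in-time model described here boils down to grants with a built-in expiry, where every check fails closed once the window lapses. A minimal sketch with hypothetical names, not any vendor’s API:

```python
import time

class JITAccess:
    """Just-in-time vendor access: privileges carry a TTL and expire
    automatically, so nothing is granted standing or indefinitely."""
    def __init__(self):
        self.grants = {}  # (vendor, resource) -> expiry timestamp

    def grant(self, vendor: str, resource: str, ttl_seconds: int):
        self.grants[(vendor, resource)] = time.time() + ttl_seconds

    def is_allowed(self, vendor: str, resource: str) -> bool:
        expiry = self.grants.get((vendor, resource))
        if expiry is None or time.time() >= expiry:
            self.grants.pop((vendor, resource), None)  # drop stale grants
            return False
        return True

acl = JITAccess()
acl.grant("hvac-vendor", "bms-controller", ttl_seconds=3600)
print(acl.is_allowed("hvac-vendor", "bms-controller"))  # True while the window is open
print(acl.is_allowed("hvac-vendor", "domain-admin"))    # False: never granted
```

A real deployment would back this with session recording and approval workflows, but the core idea is the same: access exists only while it is needed.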
All activity should be fully monitored and auditable to enable strong forensic capabilities if a breach should occur. And IT staff should have enough control over the environment to be able to pull the plug on any suspicious process.
Secure remote access solutions that meet local, regional, and international standards will be critical components of any security model in 2022.
IT teams must be able to control, manage, and audit the access and activity of any network account, whether owned by an employee or an outside agent. Solutions must, naturally, integrate into workflows and business operations, and therefore must not hinder the performance of people or systems. But they must, at the very least, allow security teams to apply the best practices of least privileges, just-in-time, and zero trust.
Solutions should make it easy to automate secure authentication and password management through built-in MFA. Password policies should be open to control by the IT team and should be automatically applied thereafter, and the system should also automatically change user passwords and SSH keys regularly to prevent or mitigate attacks.
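Automatic rotation amounts to regenerating any credential older than the policy’s maximum age. A hedged sketch of that bookkeeping (a hypothetical helper, not a specific product’s API; a real system would also push the new password to the target account and vault it):

```python
import secrets
import time

class CredentialStore:
    """Illustrative scheduled rotation: passwords older than max_age
    are regenerated automatically, as the article recommends."""
    def __init__(self, max_age_days=30):
        self.max_age = max_age_days * 86400
        self.creds = {}  # account -> (password, created_at)

    def set(self, account):
        self.creds[account] = (secrets.token_urlsafe(24), time.time())

    def rotate_due(self, now=None):
        """Regenerate every credential past its maximum age; return the
        list of rotated accounts for audit logging."""
        now = time.time() if now is None else now
        rotated = []
        for account, (_, created) in list(self.creds.items()):
            if now - created >= self.max_age:
                self.creds[account] = (secrets.token_urlsafe(24), now)
                rotated.append(account)
        return rotated

store = CredentialStore(max_age_days=30)
store.set("svc-backup")
# Simulate the scheduler running 31 days later:
print(store.rotate_due(now=time.time() + 31 * 86400))  # ['svc-backup']
```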
Today’s marketplaces are filled with competitors. And competition is good for everyone. To fall out of the market because of a stale value proposition is one thing. But to go out of business because of the crippling cost of a data breach caused by a business partner that was given unnecessarily broad access to your network? That seems like a painful tragedy we would all like to avoid.
VPNs are used as a safe method of third-party access, but privileges they grant take granular controls away from the security function
Sitecore, the global leader in digital experience management software, announced the appointment of Jose Valles as President of the Europe, Middle East, and Africa region. Valles joins Sitecore in a year when the company announced a $1.2 billion investment plan to accelerate its growth, which has driven four acquisitions (Four51, Boxever, Moosend, and Reflektion) and sparked record growth for the company.
Valles will be responsible for driving the overall sales strategy and growth for the EMEA region for Sitecore. Earlier this year the region touted an expanded presence in key markets including France, Italy, Spain and Greece, all of which will fall under the remit of Valles.
Valles joins Sitecore from SAP, where he served multiple roles in the EMEA South region, leading both the CX and the Cloud ERP businesses. He also previously held executive positions in the telco industry in companies such as du, STC, and Telefonica, starting and scaling new businesses across the globe.
Cohesity announced that Kirk A Law has joined the company as senior vice president of research and development. In this role, Law has global responsibility for engineering, product management, and Cohesity’s ecosystem business.
The tech veteran brings more than 30 years of relevant experience to Cohesity with a passion for developing breakthrough products, scaling high-performing teams, enriching company cultures, and delighting customers and partners.
Law’s technology expertise spans multiple sectors including data management, governance, and security, with an extensive history in systems engineering, intelligent automation software, and enterprise product development for on-premises and cloud deployments.
Before joining Cohesity, Law served as a senior vice president of development at Tableau Software, a Salesforce company. Prior to that, Law was the senior vice president of engineering at Primary Data. He also held leadership roles at SanDisk, NetApp, Cacheflow Systems Inc., and SGI formerly Silicon Graphics Computer Systems. He began his career at the David Sarnoff Research Centre, which at the time was an R&D centre for the RCA Corp., where Law received several patents in digital TV processing technologies.
Palo Alto Networks announced the appointment of Helmut Reisinger as CEO for Europe, the Middle East and Africa and Latin America LATAM.
An internationally recognised business leader, as CEO of Orange Business Services, Reisinger led a global organisation of 28,000 employees supporting the digital transformation of enterprise customers around the world. Before joining Orange Business Services in 2007, Reisinger held leadership positions across Europe at Avaya Inc, NextiraOne Germany and Alcatel Austria.
Reisinger will work closely with Palo Alto Networks’ President BJ Jenkins to drive the acceleration of the company’s global growth strategy and will join CEO and Chairman Nikesh Arora’s management team.
Reisinger holds a PhD from the Vienna University of Economics and Business WU, and speaks English, French, German and Spanish.