NITECH: NATO Innovation and Technology – Issue 12



October 2024: SDoT Labelling Service achieved NATO SECRET approval!

Specializing in NATO-compliant cybersecurity solutions, infodas supports the Alliance and its members in achieving Data Centric Security and executing Multi-Domain Operations.

Cybersecurity & IT Consulting

Cloud & AI Consulting

SDoT Cross Domain Solutions

OPERATE AS ONE WITH A MULTI-DOMAIN COMBAT CLOUD

Multi-domain operations require rapid decisional superiority across all domains. The Airbus Multi-Domain Combat Cloud empowers operational system-of-systems by integrating sensors and effectors, coupled with command and control, to ensure digitised, AI-enhanced and cyber-resilient missions. This allows forces to act as one and helps to keep the world a safer place.

NATO EDGE 2024: A PIVOTAL MOMENT

Ludwig Decamps, General Manager, NCIA

As 2024 draws to a close, it is with great anticipation that we look forward to NATO Edge 24 in Tampa, Florida. This event represents a pivotal moment for NATO and our industry partners as we navigate the complexities of modern defence in an era characterized by rapid technological change and evolving global security challenges. As NATO Assistant Secretary General for Innovation, Hybrid and Cyber, Jean-Charles Ellermann-Kingombe aptly alludes to in his foreword, in today’s landscape, there is no security without technology.

This issue of NITECH is equally valuable for NATO Edge attendees and the wider NITECH community. Designed to be the perfect accompaniment to the NATO Edge agenda, it provides further detail on the key themes of the conference:

PARTNERSHIPS

We shine a spotlight on the vital partnerships that underpin our collective defence efforts. The importance of collaboration between NATO, industry and academia cannot be overstated. It is through these relationships that we can leverage cutting-edge innovations and strategic sourcing to enhance our operational capabilities. The lead article by Jennifer Upton, our Chief of Acquisition, provides an insightful overview of NATO Edge 24 and emphasizes the mutual benefits that will emerge from our collective participation.

SMART SOURCING

This edition also delves into the nuances of our Smart Sourcing Strategy, as outlined by Carol Macha, NCIA’s Chief Information Officer. As we adapt to the evolving landscape of defence procurement, our focus on efficiency and accessibility is critical. NCIA’s Ijeoma Ike-Meertens further elaborates on how our acquisition policies are being refined to facilitate smoother engagements with industry, ensuring that we remain responsive to both current and emerging threats.

CYBERSECURITY

Cybersecurity remains at the forefront of our agenda. Major General Dominique Luzeaux highlights NATO’s commitment to maintaining a unified approach as we pursue our digital transformation. As cyber threats grow in sophistication, so too must our defences. Frederic Jordan introduces the newly established Corporate Portfolio Management Office, designed to streamline our initiatives and enhance our cybersecurity posture. The discussions surrounding zero trust architecture and post-quantum security are particularly relevant as we navigate these complex terrains.

ARTIFICIAL INTELLIGENCE

Artificial intelligence (AI) continues to reshape our operational landscape. In this issue, we explore how NATO is embracing AI technologies responsibly and strategically. Our contributors provide invaluable insights into the implications of AI for defence operations. In particular, the interview with NCIA’s Ivana Ilic Mestric on the NATO AI Masterclass exemplifies our commitment to equipping NATO personnel with the knowledge necessary to harness the full potential of AI.

CLOUD ADOPTION

Cloud adoption is another cornerstone of our technological evolution. NCIA’s Principal Cloud Architect, Stefaan Vermassen, outlines how the recent CloudEx training exercise has helped the Alliance accelerate its cloud adoption journey and align it with NATO’s Enterprise Cloud Operating Model. The synergy between cloud technologies and 5G further underscores the transformative potential of these advancements for our operational agility and responsiveness.

As we gather in Tampa, we are reminded of the enduring bonds that unite NATO Allies. This year’s NATO Edge provides an unparalleled opportunity for dialogue and collaboration across the Atlantic, fostering a shared understanding of our defence challenges and priorities. It is a testament to our collective resolve to strengthen NATO’s capabilities in the face of uncertainty.

For those who are able to attend, I encourage you to immerse yourselves in the thoughtfully curated sessions and networking opportunities that await at NATO Edge. This conference is not merely an event; it is a convergence of ideas, expertise and shared vision for the future of defence and security. It is also the catalyst for establishing a dynamic community of tech and defence experts extending well beyond the NATO Edge walls. Together, we will explore the pathways that lead us toward a more secure and resilient Alliance.

Thank you for being part of this vital endeavour, and I look forward to engaging with each of you in the exciting discussions to come.

The 2024 NATO Edge conference is taking place at the Tampa Convention Center in Tampa, Florida, 3–5 December, 2024 (PHOTO: GETTY IMAGES)

Power your advantage

Critical communications solutions that elevate mission intelligence and forge forces’ advantage.

Harness the exponential potential of networks.

WHEN THE PRESSURE IS ON

Military open-source intelligence you can trust

NATO EDGE: HELPING TO BRIDGE TECHNOLOGICAL VALLEYS OF DEATH

Since time immemorial, security and technology have gone hand in hand. Without access to leading technology, it is impossible to ensure our own security. For the past 75 years, the Alliance’s technological edge has therefore been the backbone of its deterrence and defence posture. Today, the Alliance prides itself on a robust innovation ecosystem with leading research and development organizations, defence primes, tech companies, innovation hubs and venture-backed unicorns.

In recent years, the pace of technological innovation and the strategic competition surrounding the development of key technologies has increased exponentially. New technologies, such as artificial intelligence and autonomous systems, are acquiring greater strategic importance and already play a decisive role on the battlefield – while others such as quantum and biotechnologies have the potential to alter the character of conflict. We can observe this every day on the battlefield in Ukraine. Recognizing this, Allies agreed on an Innovation Cooperation Roadmap with Ukraine at the Washington Summit this year to support Ukraine’s urgent requirements and help build synergies between the Ukrainian and Allied innovation ecosystems.

Innovation today is largely driven by start-ups as well as small and medium-sized enterprises in civilian ecosystems. However, many of these applications have significant dual-use potential, as they can be repurposed for defence and security. Dual-use technological innovations can both strengthen the Alliance’s deterrence and defence capabilities and create new market opportunities for non-traditional companies venturing into the defence and security environment.

But, as they scale, this new breed of companies, unlike traditional defence primes, has to bridge several ‘valleys of death’ – critical phases in a start-up’s growth when prospective revenues are lacking and many start-ups fail. Two are particularly prominent: the ‘development valley’ and the ‘adoption valley’.

DIANA AND THE NIF

Over the past few years, NATO and its Allies have invested substantially in establishing engagement mechanisms with innovation ecosystems across the Alliance to help Allied innovators bridge the development valley of death. Through initiatives, such as NATO’s Defence Innovation Accelerator for the North Atlantic (DIANA) and the NATO Innovation Fund (NIF), the Alliance is fostering and developing defence innovation ecosystems by supporting the most promising early-stage innovators with defence expertise, exposure and access to capital, thus helping them bridge the first valley of death. This year, DIANA launched its second set of Challenge Calls attracting over 2,600 applications, nearly twice as many as in 2023. After becoming operational in summer 2023, the NIF has announced its first set of direct investments into promising start-ups and mission-aligned funds.

However, the availability of cutting-edge technologies alone does not guarantee that Allied militaries are able to effectively equip their operators. The military sector is lagging behind the civilian market in the adoption of new technologies. This leads to the adoption valley of death, as it is difficult for non-traditional companies, especially start-ups and venture-backed unicorns, to sustain operations due to the uncertainty around prospective revenues from the defence and security sector.

Clear demand signals and communication of military needs, a culture of innovation in the armed forces, more agile procurement and acquisition processes, and accessible technology testing and validation pathways can all enhance cooperation between non-traditional companies and the defence sector, providing those companies with opportunities to scale.

Recognizing the strategic imperative of expediting technology adoption, Allies agreed at the Washington Summit to develop NATO’s Rapid Adoption Action Plan in time for The Hague Summit in June 2025. Its objective is to enable Allies and NATO to acquire and use new technologies, delivering high-impact operational solutions at the speed of relevance.

It is against this backdrop that events like NATO Edge play a significant role in bringing NATO and its Allies closer to transatlantic innovation and technology ecosystems. This can help foster partnerships and synergies which are crucial for the success of the Alliance’s efforts to maintain its technological edge. It is only together, at this critical time for our security and for international peace and stability, that we can safeguard our shared democratic values and protect the Alliance’s one billion people.

Most innovation is being driven by start-ups and SME tech companies

A commercial alliance delivering decision-ready information for space domain awareness, maritime domain awareness, geospatial intelligence and beyond.

AWS for Defence

We bring together the most advanced, secure cloud infrastructure, broadest service portfolio, and extensive space industry expertise so you can simplify IT management and focus on your missions.

AWS provides global infrastructure and secure, scalable, mission-focused solutions that help defence organisations around the world outpace their adversaries, strengthen their mission advantage and rapidly adapt to shifting priorities.

Powered by AWS and the ARGUS Partner Alliance, ARGUS fuses best-in-class data with industry-leading infrastructure, analytics, visualization, security and compliance modules to deliver targeted insights for critical mission use cases.

Project Kuiper

Whether on land, in air, or at sea, having connectivity and access to information in the field is critical for defence organisations. Satellite communications systems provide critical capabilities – from delivering satellite imagery to enabling communications – that support military users operating around the globe.

Find out how AWS and Amazon’s Project Kuiper will help shape the future of global, ubiquitous and secure internet connectivity throughout the space domain for the United States and its allies.

Learn more ›

ARGUS is designed from the ground up as an open architecture to:

• Deliver market-leading tools and capabilities

• Access the right data to achieve critical insights

• Achieve instant interoperability and effortless integration.

Learn more ›

The AWS Cloud powers national security and defence organisations through:

• Accelerating access to cutting-edge technologies

• Increasing the speed, impact and scalability of innovation

• Enabling rapid information analysis and collaboration

• Securing their most mission-critical workloads.

Connect to our dedicated NATO Team ›

NCIA’S FLAGSHIP CONFERENCE: NATO EDGE

Simon: The first NATO Edge in Mons, Belgium, two years ago set the bar extremely high. It was a fantastic event that consolidated and enhanced the former NIAS and NITEC conferences that NCIA used to run each year before COVID-19. Combining the cybersecurity aspects of NIAS with the wider technology scope of NITEC means that NATO Edge covers almost the entire range of digital innovation and technology that the Alliance is championing.

Lara: The war in Ukraine had only just begun, yet its shadow hung over the entire event. It was, without doubt, a seminal moment for NATO. Now that Russia’s full-scale war on Ukraine is almost three years old, the world has witnessed how technological innovation is playing such a vital role. Ukraine’s ability to face down the overwhelming forces of the Russian aggressor is due, in large part, to the rapid development, introduction and modification of emerging and disruptive dual-use technologies, particularly in fields such as next-generation communications and drones.

Simon: That is why the second NATO Edge, hosted at the Tampa Convention Center in Florida, has focused not just on the development of those sorts of emerging technologies but also the ecosystems that are required to nurture that development.

Lara: It is why this issue of NITECH delves into the inner workings of partnerships and procurement. With the private/commercial sector now developing emerging technologies at a faster pace and at greater scale than the defence sector, it is vital that NATO can engage seamlessly with smaller companies — often start-ups that have little or no experience dealing with the defence sector and the complexities of the NATO procurement process. These smaller companies and academic institutions are key elements of the evolving supply chain driving innovation.

Simon: Indeed, the NATO Edge opening video highlights this by showing three characters working together — an NCI Academy professor, a SATCOM engineer and a data scientist.

Lara: Yes! I helped work on that video, and the concept highlights the essential collaboration between NATO, Nations, industry and academia that is necessary to keep us at the forefront of technology.

Simon: That’s great, and we’ll see real-life examples of similar collaboration throughout the conference. And let’s not forget, NCIA’s Chief of Acquisition, Jennifer Upton, was instrumental in establishing the NATO Edge agenda specifically so that these smaller companies could be drawn into the ecosystem and begin to engage with NATO and NCIA in a much more agile fashion.

Lara: NITECH is playing its part in supporting NATO Edge by mirroring its major themes and incorporating key snippets that the conference agenda was established to highlight. There is a real synergy between NITECH and NATO Edge, including simple yet helpful things like links to the conference website to help our readers who are unable to attend the conference to get a feel for what is being covered in Tampa this December.

Simon: That is also why we have included articles from NATO Edge speakers.

Lara: Correct, we have a set of articles from individuals who are presenting or participating in panels at NATO Edge. We have chosen speakers ranging from big hitters like Amazon Web Services to less well-known companies like Logpoint and Safran.AI, as well as Floridian academia.

Simon: That offers a flavour of the type of ecosystem that NATO is enthusiastically trying to develop.

Lara: As you would expect, a huge element of the conference agenda and the supporting exhibits are devoted to some of the most vital emerging digital technologies that will have a game-changing impact on defence: cybersecurity, artificial intelligence (AI) and the cloud.

Simon: It has been great to highlight NCIA’s part in developing the AI Masterclass alongside the commercial sector. Although everyone knows the term AI, few really understand the massive impact it is going to have. Decision-makers, a few of whom were not born into the digital era, really need a bit of help with this.

Lara: AI represents a fundamental landscape change and everyone needs to develop at least a basic understanding of what it is all about.

Simon: The cloud is another area that not everyone is confident about. So, it is great that we have been able to include NCIA’s Principal Cloud Architect, Stefaan Vermassen, to highlight the CloudEx event, where participants looked into cloud operating models and how they may impact NATO’s digital transformation.

Lara: All in all, this is another issue full of valuable insight about what’s going on in the realms of digital transformation, cybersecurity and emerging technologies.

Forewords

05 NATO Edge 2024: a pivotal moment

Ludwig Decamps, General Manager, NCIA

11 NATO Edge: Helping to bridge technological valleys of death

Jean-Charles Ellermann-Kingombe, Assistant Secretary General for Innovation, Hybrid and Cyber, NATO

14 NCIA’s flagship conference: NATO Edge

Lara Vincent-Young and Simon Michell, co-Editors, NITECH

Partnerships and strategic sourcing

20 Welcome to NATO Edge

Jennifer Upton, Chief of Acquisition, NCIA

26 NCIA’s Smart Sourcing Strategy

Interview with Carol Macha, Chief Information Officer, NCIA

32 NCIA acquisition policy

Ijeoma Ike-Meertens, Head of Acquisition Policy, NCIA

38 APSS partnerships

Interview with Laryssa Patten, Head of the Space Technology Adoption and Resilience Branch, NCIA, and Matt Roper, Chief of JISR Centre, NCIA

44 View from the Nations: USA delivering technology to Special Operations Forces

Lisa Sanders, Director of Science and Technology for Special Operations Forces, SOCOM Acquisition, Technology and Logistics

47 Accelerating the pace of capability delivery

Frederic Jordan, Head of Cyber Security Programme Delivery, NCIA

Cybersecurity

52 Securing NATO’s digital transformation

Major General Dominique Luzeaux (Dr Hab), Digital Transformation Champion and Special Advisor to the Supreme Allied Commander Transformation, NATO

58 Zero trust architecture

Christian Have, Chief Technology Officer, Logpoint

64 Towards post-quantum security

Professor Kwang-Cheng Chen, Department of Electrical Engineering, University of South Florida

Artificial intelligence

70 Sharing AI data

François Bourrier-Soifer, Deputy Chief Executive Officer, Safran.AI

73 Autonomy and the drones

Interview with Dr Claudio Palestini, NATO HQ, and Rene Thaens, Head of the Electronic Warfare and Surveillance Branch, NCIA

78 AI Masterclass

Interview with Ivana Ilic Mestric, Principal Data Scientist and Head of Data Science and AI Engineering Profession, NCIA

81 The future of AI in defence

Anna Metsäranta, Head of Sustainable AI, Solita

84 Impact of emerging and disruptive technologies on multi-domain operations

Alper Köprülü, Project Manager, Chief Technology Office, NCIA

87 Fighting AI deep fakes

Janet Coats, Managing Director, Consortium on Trust in Media and Technology, University of Florida College of Journalism and Communications

Cloud adoption

92 Cloud adoption journey

Chris Bailey, General Manager, Global National Security and Defence, Amazon Web Services

95 CloudEx aligning with NECOM

Stefaan Vermassen, Principal Cloud Architect within the Cloud Centre of Excellence, NCIA

98 Combining the power of cloud and 5G for defence applications

Philippe Agard, Vice President of Market Creation, Nokia Defense International

A flexible and secure path through the clouds

Artificial intelligence (AI), Software-as-a-Service (SaaS) and the cloud can accelerate productivity and innovation, but the full scale of their impact on an organization’s network is still unknown

Since the launch of ChatGPT, the AI hype cycle has switched to overdrive. According to some, it will either save the world or end it.

In reality, AI has been evolving for decades. The difference now is that a perfect storm of application development, hardware and the cloud will see its use massively proliferate. A new wave of digital transformation is imminent. The question is: how will an organization’s network cope?

The gathering clouds

AI might be grabbing the headlines, but it’s not the only digital application organizations are looking to the cloud for.

Successive waves of digital transformation mean multinational organizations now typically use 50 SaaS solutions. These are applications hosted in the public cloud or a SaaS provider’s own data centre, then delivered to users via an organization’s network.

Furthermore, 50% of organizations expect to add more SaaS within a year. With SaaS on the rise, it is important to remember that the productivity improvements it generates will be dependent on predictable, secure and resilient connectivity.

Like AI, SaaS demands real-time data processing and contextualization to drive quicker decision-making, which can accelerate productivity and efficiency. It requires authenticated and trusted connections from the organization to wherever the application is hosted. As the number of SaaS solutions an organization uses grows, the complexity of managing all these connections becomes a significant challenge.

Data diversity, quality and security

Returning to the subject of AI, hyperscalers today offer cloud infrastructure that can scale with Generative AI (GenAI) applications and deal with unpredictability and the pace of change. But the same flexibility and scale are also required from the network. This can be broken down into three areas that organizations should consider when developing their AI strategies:

• Data diversity — AI relies on diverse data inputs of various data types, sources and sizes, which are driven by an organization’s ‘AI data governance framework’. However, that diversity creates unpredictable, often ‘bursty’ network traffic. Cloud computing offers flexibility and pay-as-you-use models, yet most organizations’ networks have not kept pace. This can hinder GenAI adoption and transformation efforts.

• Data quality — GenAI strategies involve balancing quality and quantity. Large datasets are used initially to train GenAI models quickly, but to achieve higher quality outputs, GenAI models require smaller datasets with greater fidelity for quicker, more accurate insights. These datasets are in many different locations across many organizations and are better referred to as ‘wide data’. In an AI-enabled world, wide data needs to interoperate seamlessly.

• Data security — LLMs (Large Language Models) sit at the core of GenAI applications. They process sensitive and confidential data, making data security crucial. The need to prevent unauthorized access and data breaches is essential. Security measures such as encryption and access controls are vital for maintaining data integrity and reliability.
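The access-control side of the point above can be made concrete with a small sketch. Below is a minimal, illustrative token check using Python's standard-library `hmac` module; the token format, key handling and all names are invented for the example and are not drawn from the article or any specific product:

```python
import hashlib
import hmac
import secrets

# Illustrative only: in practice the key would come from a key-management service.
SECRET_KEY = secrets.token_bytes(32)

def sign_token(user: str) -> str:
    # Issue an access token: the user ID plus an HMAC tag over it.
    tag = hmac.new(SECRET_KEY, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{tag}"

def verify_token(token: str) -> bool:
    # Recompute the tag and compare in constant time; a mismatch means
    # the token was forged or tampered with.
    user, _, tag = token.partition(":")
    expected = hmac.new(SECRET_KEY, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

token = sign_token("analyst-1")
print(verify_token(token))            # True: genuine token
print(verify_token("analyst-1:bad"))  # False: forged tag rejected
```

The same idea, combined with encryption of the payload itself, is what keeps sensitive GenAI training and prompt data tamper-evident in transit.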

In the digital world, where sharing data between organizations is the new norm, traditional networks lack flexibility and scalability and struggle to overcome the challenges of complex configuration, interoperability, latency and performance. Moreover, future networks must offer the same pay-as-you-use commercial flexibility as the hyperscalers.

To access diverse datasets at scale, dense and secure global interconnectivity is essential. The telco edge is the ‘Goldilocks location’.

Matt Swinden
Digital Infrastructure Director, Business, BT

“Deploying cloud services that host an AI or SaaS at the edge of a network means interconnectivity is much easier with partners and there is good access to third-party data sources, making it a perfect location to unlock digital potential”

Goldilocks locations are ‘just right’ for harnessing AI’s power. In the context of networking, this means locations that allow the best balance between factors such as:

• Latency and bandwidth — ensuring data travels quickly;

• Security and accessibility — providing robust security while maintaining ease of access; and

• Cost and performance — achieving high performance without excessive costs.

Deploying cloud services that host AI or SaaS at the edge of a network means interconnectivity is much easier with partners and there is good access to third-party data sources, making it a perfect location to unlock digital potential.
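As a rough illustration of the ‘just right’ balance described above, the trade-off can be sketched as a simple weighted score over candidate hosting locations. Every location, figure and weight below is invented purely for the sketch and does not come from the article or from BT:

```python
# Hypothetical candidate locations: (latency_ms, security 0-10, monthly_cost_eur)
candidates = {
    "on-premises":   (2,  9, 40000),
    "telco edge":    (8,  8, 18000),
    "central cloud": (45, 8, 12000),
}

def goldilocks_score(latency_ms: float, security: float, cost: float) -> float:
    # Higher security is better; lower latency and cost are better.
    # The weights are arbitrary and exist only to illustrate the balance.
    return security * 10 - latency_ms * 0.5 - cost / 1000

best = max(candidates, key=lambda name: goldilocks_score(*candidates[name]))
print(best)  # telco edge
```

With these invented weights, the on-premises option loses on cost and the central cloud loses on latency, leaving the edge location with the best overall score, which is the intuition behind calling it the Goldilocks location.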

At BT, our transformative network-as-a-service (NaaS) platform, Global Fabric, will make it easier and quicker for organizations to securely connect their people, partners and devices to apps and digital services — including AI and SaaS — hosted across multiple clouds. Organizations benefit from scalable, secure, high-capacity and resilient connectivity pre-integrated into the world’s leading cloud locations and SaaS providers, ready to meet the growing and complex demands of breakthrough AI technologies. Global Fabric’s flexibility will be unprecedented. With legacy networks, setting up connectivity or making changes can take weeks. With Global Fabric, it happens in an instant, helping manage unpredictable, AI-driven spikes in traffic.

Global Fabric’s design is based on the idea of ‘interconnectivity’ — connecting to third-party clouds, solutions and partners. This contrasts with the internally focused thinking of the past, where networks primarily ‘intra-connected’ an organization’s users and systems. In the digital world, the ‘intra’ model is no longer fit for purpose.

To thrive in a digital world, multinational organizations like NATO need a seamless, secure and interconnected ecosystem. The more interconnections there are, the more they drive the organization forward. For organizations, flexible, robust and secure interconnectivity will be vital to onboarding AI and other cloud-hosted digital solutions such as SaaS. This, in turn, will help empowered leaders to make faster, better-informed decisions.

WELCOME TO NATO EDGE 24

Jennifer Upton has served as the Chief of Acquisition for the NCIA since 2020. As part of this critical role, she is the senior responsible owner and key organizer of NATO Edge. She is one of the best people to explain the mission and vision of NATO Edge 24 and highlight the benefits that industry, academia and the wider NATO community will gain from attending

I have spent my entire career supporting acquisition for the U.S. Department of Defense, mostly in support of the U.S. Special Operations Command, which is our partner for this year’s NATO Edge conference. I am currently responsible for maintaining the overall integrity of NCIA’s processes in all areas relating to acquisition, including procurement policy and procedures, logistics, facilities engineering, cost estimating and analysis, and industry relations. NCIA’s procurement portfolio, amounting to 600 million EUR in 2023, is growing dramatically with the increase in common funding from the Nations. My mission is to enable NATO’s transformation into a fully digital enterprise through the cost-effective, innovative and timely acquisition of tools and solutions that connect the Alliance and allow our allied leaders and service members to keep us safe.

As the head of NCIA’s industry relations, I serve as the Senior Responsible Owner for NATO Edge, reporting directly to the General Manager, Ludwig Decamps. My role is to ensure NATO Edge delivers upon NCIA’s strategic goals and NATO’s objectives. This flagship business and technology event aims to expand our dialogue and partnership with industry and not-for-profits, increase understanding between the various defence sectors, and enable pathways for tackling some of the Alliance’s most complex challenges together.

"NATO Edge enables you to gain insights into upcoming business opportunities, and this year we are excited to unveil our new technology roadmaps"

NATO EDGE 24 – NCIA’S LARGEST EVENT

Whether you attend or exhibit at NATO Edge, you can expect more connections and opportunities than at any previous NCIA event, as it is our largest conference to date. NATO Edge enables you to gain insights into upcoming business opportunities, and this year we are excited to unveil our new technology roadmaps. You can discuss and collaborate on planned projects, and network with NATO and government officials, along with industry and not-for-profit peers and partners. Finally, you will get an opportunity to present your products and solutions to NATO government and military experts and decision-makers, all gathered in one place.

We are sending a strong demand signal to industry and not-for-profit partners and changing the way we do business to accommodate the pledge for accelerating multinational procurement. Companies can expect a consistent and unprecedented demand for commercial, dual-use and military solutions coming their way, now and in the future. NATO is a reliable and attractive business partner with a mission that concerns us all: keeping our one billion citizens safe in the face of today’s dynamic global challenges. Practically, now is the time for companies to build capacities and engage with NATO, whether you are new to our acquisition process or an established partner. NATO Edge provides the opportunities to build the necessary foundations to secure critical supply chains and ensure resilience in the defence capabilities upon which we all depend.

REINFORCING NATO’S TRANSATLANTIC COOPERATION

As a first for NATO Edge and following on the heels of the historic NATO Summit held in Washington DC this summer, we are holding this year’s conference in the United States to reinforce our transatlantic defence cooperation at the operational level. This NATO Edge represents a great opportunity to enhance collaboration and understanding between NATO and companies based in both North America and Europe, especially for small- and medium-sized businesses that could not attend or exhibit at our previous events across the ocean.

We seek to learn from market experts about all Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) solutions spanning our set of portfolios, including Air and Missile Defence Command and Control, Joint Intelligence Surveillance and Reconnaissance, Cyber Security and NATO’s Consultation and Command Networks, just to name a few. Our top priorities for NATO Edge are reflected in the five themes for this year: Strategic Sourcing, Cloud Adoption, Cybersecurity, Artificial Intelligence and Partnerships. First, outsourcing to industry and not-for-profits is a key tool to keep up with NATO’s increasing requirements; a focus on strategic sourcing is pivotal to ensure readiness. Second, as NATO’s journey to the cloud has only just begun, we have the opportunity to learn from the best and most experienced regarding cloud transformation. Third, cyberspace is contested at all times, putting NATO and industry in the same boat when it comes to hardening our cyber defences. As for artificial intelligence (AI), commercial companies are pushing the technological frontier at breathtaking speed, but we also need to make sure we embrace the potential of AI safely and responsibly. Finally, partnerships with both seasoned and new players – including start-ups, small businesses, academia and think tanks – are crucial for all the areas I just mentioned and many more.

DIVERSIFYING NATO’S SUPPLY CHAIN

It does not matter if you represent an established prime contractor or a young start-up, a specialized boutique innovator or a generalist supplier, applied research or consultancy, defence or not. NATO Edge represents a unique and comprehensive opportunity to showcase the entire portfolio of NCIA requirements, as well as provide a fertile landscape for strengthening and diversifying NATO’s supply chain with all types of suppliers spanning a broad range of expertise.

Information is key for navigating NATO’s acquisition organization and procedures, and this year’s NATO Edge also serves as a source for practical how-to information. Presentations on upcoming business opportunities offer early information to help businesses plan their strategies and formulate potential teaming arrangements. My team will offer workshops on how to do business with us and provide opportunities to register in our e-procurement tool, NEO, giving all interested entities, regardless of size, the tools and knowledge necessary to participate in our business.

Attending NATO Edge is a fantastic opportunity to learn about our projects and procurement procedures, and how we are adapting to become a better business partner. I am delighted to encourage companies and individuals to join us in Tampa, and to engage on every level offered through this venue. Whether it is in conversations with Alliance representatives about different technologies, participating in the breakout sessions focused on specific problem sets, creating links throughout the attending supplier ecosystem, or joining the panel sessions to hear from NATO leaders, industry and not-for-profit experts and national authorities about our shared global challenges, NATO Edge is the place that brings all of this together, strengthening our collective defence in support of NATO’s dynamic mission.

NATO Edge is an interactive event comprising distinguished keynotes and panels, breakout sessions and networking opportunities (PHOTO: Tampa Convention Center (TCC))

How national security and defence missions protect data with Trusted Secure Enclaves on AWS

Chris Bailey is the General Manager of the Global National Security and Defence team for Worldwide Public Sector at AWS. He is an expert in delivering national security and defence cloud adoption programmes, including 30-plus years in the Defense Industrial Base (DIB).

Chris Bailey

General Manager, Global National Security and Defence team for Worldwide Public Sector at AWS

By 2030, NATO’s digital transformation will facilitate multi-domain operations (MDO) with interoperability, heightened situational awareness and data-driven decision-making. Allies need to be able to share data and communicate in near-real time to counteract evolving threats to the 1 billion citizens that NATO protects.

In support of this, NATO initiatives lead the effort to deliver a secure, scalable cloud environment to provide a digital backbone and enable MDO. Acknowledging the risk posed by legacy communication and information systems is critical to maintaining and strengthening the Alliance’s competitive edge. Collaboration among NATO Allies is paramount, so digital systems and all the standards and policies around them must be interoperable and secure at all times, in all environments, at all classifications.

From training to supporting the front line, AWS can provide solutions to help solve the challenges that formations, units and allies face. More than just providing compute and storage capability in the cloud, AWS can help intelligence, planning and operations teams leverage newer, cost-effective artificial intelligence (AI) and machine learning (ML), analytics, simulations and other technologies.

AWS provides a global infrastructure and secure, scalable and mission-focused solutions. This enables NATO and national defence organizations to address global threats and deliver military mission capabilities wherever and whenever they are required. With AWS Cloud, you can build once and deploy anywhere in the world in near-real time to support the mission. You gain the speed, impact and scalability of innovation to enable rapid information analysis and collaboration. You can secure the most mission-critical workloads and rapidly move essential applications away from physical IT infrastructure in any impacted location.

AWS created Trusted Secure Enclaves (TSE) for national security and defence organizations so they can rapidly establish a comprehensive cloud architecture for sensitive workloads. Organizations can use TSE Sensitive Edition (TSE-SE) to meet, and accelerate, cloud accreditation processes against national and Alliance security and compliance requirements. It is a reference architecture that provides a secure, compliant, isolated cloud configuration supporting customers’ mission needs, while preserving the benefits the cloud brings, such as speed, scalability and security. With TSE-SE architectures, organizations can build, assure and operate secure environments in the cloud more rapidly, reducing the time it takes to establish robust, compliant and scalable operational cloud environments from months to a few hours.

TSE-SE provides a standardized, repeatable and automated secure foundation from which to operate. It means organizations can establish their own operational security posture in the cloud. And TSE-SE lays the foundation to help organizations meet some of the most stringent security standards in the world, such as NIST 800-53 Moderate, Canada’s CCCS-Medium, Australia’s IRAP, the U.S. Department of Defense Impact Level 4 (DOD IL4) and FedRAMP Moderate. NATO Member Countries can trust each other’s compliance with NATO standards and so connect and interoperate faster with a common TSE-SE architecture. TSE-SE’s security controls have been mapped against NATO’s D32 security directive.

The TSE-SE environment includes the automatic deployment of services that can surface any compliance drift or security threats that may arise, including:

Amazon Security Hub — A cloud security posture management (CSPM) service that performs security best practice checks, aggregates alerts and enables automated remediation.

Amazon GuardDuty — A managed threat-detection service that continuously monitors for malicious activity and unauthorized behavior to protect AWS accounts and data stored in Amazon Simple Storage Service (Amazon S3).

AWS Key Management Service (AWS KMS) — A service that lets customers create, manage and control cryptographic keys across applications and AWS services.
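As a conceptual illustration (not AWS code), the compliance-drift detection these services perform boils down to comparing a declared security baseline against the observed account state. The control names and states below are illustrative assumptions, not AWS identifiers:

```python
# Conceptual sketch of compliance-drift detection, in the spirit of a
# CSPM service such as Security Hub: compare a declared baseline of
# required security controls against the observed account state.
# The control names and states below are illustrative assumptions.

REQUIRED_CONTROLS = {
    "threat_detection": "ENABLED",      # e.g. a GuardDuty-style detector
    "posture_management": "ENABLED",    # e.g. a Security Hub-style service
    "key_management": "CUSTOMER_KEYS",  # e.g. customer-managed KMS keys
}

def detect_drift(observed: dict) -> list:
    """Return the controls whose observed state deviates from the
    required baseline (a missing control also counts as drift)."""
    drift = []
    for control, required in REQUIRED_CONTROLS.items():
        if observed.get(control) != required:
            drift.append(control)
    return drift

# A partially compliant account: threat detection has been disabled.
observed_state = {
    "threat_detection": "DISABLED",
    "posture_management": "ENABLED",
    "key_management": "CUSTOMER_KEYS",
}
print(detect_drift(observed_state))  # ['threat_detection']
```

In practice the "observed state" would be populated from the cloud provider's APIs and the findings routed to automated remediation rather than printed.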

Security-sensitive customers around the world, from global, heavily regulated financial institutions to national healthcare providers, choose AWS. AWS complies with more than 143 security certifications and attestations, laws and regulations, privacy standards and alignments to industry frameworks. TSE-SE on AWS is an additional option designed specifically to support national security and defence organizations to accelerate their cloud adoption. This aligns with NATO’s objective to deliver a secure, scalable cloud environment, providing a digital backbone and enabling MDO.

When customers use TSE-SE, they can meet their sensitive and protected-level data security requirements and obligations under the AWS shared responsibility model. This shared responsibility model means customers retain control and have the flexibility they need to deploy the services they select.

Learn more about Trusted Secure Enclaves on AWS
Connect with a vetted AWS Partner for specialist support

NCIA’S SMART SOURCING STRATEGY

NCIA’s Chief Information Officer, Carol Macha, tells Maria Brandão Almeida, NCIA Conference and Events Coordinator, about the vision behind the Smart Sourcing Strategy and what NATO can expect from the first wave of its implementation

NCIA has begun an ambitious journey to outsource significantly more of its available work to industry. Underpinning this is NCIA’s Smart Sourcing Strategy, which focuses on analyzing sourcing options to find the best delivery method for each project or service under review. The new strategy considers both differentiated services, which are essential to NATO’s mission and less likely to be outsourced, and commoditized services, which are standard IT practices and more likely to be outsourced. It is through smart sourcing that NCIA delivers effective solutions while balancing in-house and outsourced service providers.

Why the shift in delivery approach?

As NATO looks at an increasingly complex geopolitical landscape, there is a growing demand for delivery of services and capabilities. As NATO’s pre-eminent IT service provider, NCIA must do all it can to use its resources effectively and efficiently. And this “clearly includes outsourcing more of our work to industry,” says Carol Macha, NCIA’s Chief Information Officer.

SCORECARD AND DECISION CRITERIA

In addition to the sourcing strategy, Carol introduced a sourcing scorecard with 11 questions to help initially decide whether to outsource a service or capability. Depending on the project or service being evaluated, the scorecard includes four grades per criterion, indicating the likelihood of success if the service is outsourced or not.

"The goal is to refocus, reskill and reallocate highly skilled and invaluable staff members towards more high-impact tasks that allow NCIA to prioritize its mission-critical activities"

In general, complex, differentiated services with more moving parts that are fluid and dynamic are less suitable for outsourcing, while regular, repeatable commoditized tasks are more suitable.

“Whenever we get work coming into NCIA, we want to take every opportunity to make the most appropriate decisions for it. This is why it’s called ‘smart sourcing’ and not ‘outsourcing’. We may not outsource everything, but we have to, at least, analyze the opportunities, weigh up the options and use industry more extensively than we have in the past,” says Carol.

MISSION-FOCUSED STRATEGY

Upon taking up the role as Head of the Alliance, the new Secretary General, Mark Rutte, outlined NATO’s top priorities: supporting Ukraine, increasing partnerships with other nations, increasing output and improving efficiencies in NATO’s investments. This certainly resonates with NCIA’s approach: “The Smart Sourcing Strategy aligns with these goals, particularly the latter. We’ll be transforming how we work by making more thoughtful decisions about the resources that we have — because they are limited. We have to use them as efficiently as we can,” Carol explains.

Therefore, should NCIA staff members find their tasks or responsibilities being outsourced, plans are being created to ensure those staff members have options, which could include staying assigned to the service in an oversight-style vendor management role, or being reassigned to similar work and transitioning to those responsibilities. “The goal is to refocus, reskill and reallocate our highly skilled and invaluable staff members towards more high-impact tasks that allow NCIA to prioritize its mission-critical activities,” Carol explains. “This is a very mission-focused decision that we’re making. The increase in demand for NCIA services cannot be met healthily and sustainably without the help of outsourcing. While we outsource to grow, our main priority is and remains our people. Outsourcing provides the opportunity for staff to upskill and pivot their expertise and skillsets towards emerging priorities.”

A STRONGER RELATIONSHIP WITH INDUSTRY PARTNERS

Smart sourcing is also an opportunity for NCIA to strengthen its engagement with industry and open the doors to new partner relationships.

As Carol explains, NCIA is now “empowering” its industry partners, who’ll have a more active and collaborative role in the deliveries with which they are involved. By outsourcing applicable work, NCIA is laying the groundwork for stronger partnerships with industry, leveraging their strengths in delivering services and capabilities through collaborative efforts. “In essence, we want to let industry use their entire portfolio of skills to better help us in the execution of our work,” says Carol. All in all, industry partners benefit in two ways: they receive more work, and they can operate at peak level. Both of these by-products further strengthen our partnerships with industry.

CURRENT AND GROWING EFFORTS

NCIA is accelerating its outsourcing efforts and currently has a robust set of nine to ten services in development for outsourcing, including a high-visibility internal service. NCIA launched the re-competition of REACH (Remote Access Connection Host) mobile laptops using a fully outsourced strategy. 2025 will also see the development of an outsourcing roadmap — with proactive identification of future services and capabilities to be outsourced. “Our goal is to make sure that using an outsourcing model becomes standard practice within NCIA, and that the Smart Sourcing Strategy is thoroughly integrated throughout our delivery model as soon as possible,” Carol continues.

CHALLENGES AND IMPLEMENTATION

There are, of course, challenges to be expected with the implementation of the Smart Sourcing Strategy, with the user experience being one of NCIA’s top priorities.

According to Carol, NCIA aims to make strategic choices in how it delivers services, while still ensuring a seamless experience for its customers and Alliance partners. It is, after all, essential that throughout every aspect of the strategy’s execution, NCIA and its partners are interconnected and operate as one. “We also need to enhance our collaboration with our vendors through regular performance analysis by cultivating strong relationships and overseeing all interactions more cooperatively,” she adds.

Throughout this journey, NCIA is taking care to ensure its staff are accommodated as much as possible without negative consequences, that industry is used more effectively, and that services and capabilities are successfully delivered to the Alliance. No matter how the role of industry is increased in NCIA deliveries, NCIA will always retain overall responsibility for successful delivery. “We may encounter resistance on this outsourcing journey, as often happens during transitional journeys,” but Carol emphasizes that this outsourcing change journey is the best way forward. “We have a responsibility to use our resources not just effectively, but also with intention. By thinking differently, we open ourselves to innovative solutions and meaningful change. Change journeys are never fast, but they’re always worth it.”

NCIA’s Smart Sourcing Strategy aligns with the new Secretary General’s priorities (PHOTO: NATO)

NSC: the National Secure Cloud

Business Development, infodas

How does the National Secure Cloud (NSC) contribute to increasing digital sovereignty?

The NSC, a joint project by leading German cybersecurity experts infodas, IABG, Kernkonzept and Utimaco, strengthens European digital sovereignty by providing a cloud infrastructure that is developed and operated by German companies. It enables the processing of classified information and meets the highest security standards. The use of carefully selected open-source technologies reduces dependence on international providers and thus strengthens control over national data.

What security mechanisms are implemented in the NSC to ensure the processing of data up to the SECRET classification level?

The NSC meets the strict requirements of the Classified Information Directive and uses technologies approved by the German Federal Office for Information Security (BSI). It offers security domains, secure domain transitions, hardware security modules for cryptography and strict access controls. These mechanisms ensure that data is protected up to the SECRET security classification level.

Why is the NSC a cost-effective solution for government and military organizations?

The NSC is modular and scalable, meaning that it can be expanded as required. It uses carefully selected open-source software, which minimizes licensing costs. In addition, the so-called Virtual Security Functions, like the virtualization of the infodas Secure Domain Transition (SDoT) Labelling Service and the SDoT Security Gateway, enable flexible adaptation without the need for expensive hardware changes. This makes the NSC a cost-effective solution that still meets the highest security requirements.

What role does open-source software play in the development of the NSC, and what advantages does it offer compared to proprietary solutions?

Open-source software plays a pivotal role in the NSC. It enables transparency, traceability and the ability to close security gaps quickly. Another advantage is that users are not dependent on a specific provider so there is no vendor lock-in. The open concept allows different providers to integrate their solutions into the NSC system, increasing security and flexibility.

How is the interoperability of the NSC ensured in international military operations such as within the NATO framework?

The NSC has been specifically developed for use in military and interoperability contexts. It supports the requirements of NATO and other international standards. Technologies such as the SDoT Gateway and SDoT Labelling Service, which are compliant with NATO STANAG 4774/8 and approved up to German GEHEIM (SECRET), EU SECRET and NATO SECRET, enable secure data exchange between different security domains, including in international cooperation. This ensures a high level of interoperability in the military environment.

What challenges do you foresee with the integration of new technologies such as artificial intelligence (AI) into the NSC?

Technologies such as AI can place very high demands on data processing and security. However, the NSC has a modular structure, which makes it easy to integrate new technologies. The use of AI could help to increase efficiency in safety-critical areas, but equally, the security of the processed data must be guaranteed at all times.

How does the NSC ensure security when transferring data between different security domains?

NSC uses SDoT security products that enable the secure exchange of data between different security domains. These products verify structured and unstructured data against defined rules before they cross domain boundaries. In addition, data encryption ensures that only authorized recipients are granted access. This ensures maximum security when exchanging data.

Security is the core principle of the NSC. Being able to handle and exchange data between different security domains is one of the main requirements that public agencies are facing. Without such capacities, using the cloud will not be possible for such stakeholders. Therefore, the consortium that developed the NSC — which infodas is a member of — has based its work on the proposition that the cloud functions as a Security Domain as a Service. Several technologies supplied by consortium partners ensure secure data exchange between security domains. One of the first steps was enabled thanks to a special operating system called L4Re, developed by Kernkonzept. With this special operating system, several security domains can now run in parallel while being fully isolated from one another.

Benedikt Meng

When it comes to transferring data between security domains that have different classification levels, infodas SDoT Gateway is required as infodas has gained decades’ worth of experience in this area. The solution enables secure data exchange based on whitelisting and zero-trust principles. It offers users full control and security over data flow and focuses on data loss prevention.
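The whitelisting approach described here can be sketched in a few lines. This is a simplified, deny-by-default illustration only; the field names and rules are assumptions, not the actual SDoT Gateway rule set:

```python
# Simplified sketch of whitelist-based cross-domain filtering:
# a message may cross the domain boundary only if every field is
# explicitly permitted and its value matches an allowed pattern.
# Field names and rules are illustrative assumptions, not the
# actual SDoT Gateway rule engine.
import re

WHITELIST = {
    "msg_type": re.compile(r"^(POSITION|STATUS)$"),
    "unit_id": re.compile(r"^[A-Z]{2}\d{4}$"),
}

def may_cross(message: dict) -> bool:
    """Deny by default: unknown fields or non-matching values block
    the message (zero-trust, data-loss-prevention posture)."""
    for field, value in message.items():
        rule = WHITELIST.get(field)
        if rule is None or not rule.fullmatch(str(value)):
            return False
    return bool(message)  # an empty message is also rejected

print(may_cross({"msg_type": "STATUS", "unit_id": "AB1234"}))  # True
print(may_cross({"msg_type": "STATUS", "free_text": "hi"}))    # False
```

The key design choice is the default: anything not explicitly allowed is blocked, which is what distinguishes whitelisting from the blacklist filtering common in commercial firewalls.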

Now, infodas is working on the virtualization of the SDoT Gateway to fully integrate it into the cloud system. The function will remain the same: secure data exchange in real time. The Gateway will still protect the so-called ‘HIGH’ side (the most classified domain) from other security domains, but in a virtual way that addresses the relevant cloud scenarios.

It is not only the SDoT Gateway that is currently being virtualized; the infodas Labelling Service is too. The Labelling Service is a STANAG 4774/8-compliant solution, usually combined with the SDoT Gateway to secure unstructured data flows. It creates an XML label that is cryptographically bound to highly classified data. On this basis, the SDoT Gateway can also secure unstructured data exchange. The virtualization of the Labelling Service is thus an important step towards a fully flexible solution for all types of data exchange. It will also be a key component of the NSC for data management. The Labelling Service will ensure that users have a clear overview of the classified data in the cloud and enable them to manage access to it between the different stakeholders, according to their security level.
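As a rough illustration of the label-and-bind idea (not the actual STANAG 4774/4778 schemas, whose element names and binding cryptography differ), a confidentiality label can be expressed in XML and bound to a payload with a keyed hash, so that neither can be altered without detection:

```python
# Rough illustration of label-and-bind: an XML confidentiality label
# is bound to a payload by a keyed hash over both, so tampering with
# either is detectable. Element names and the HMAC binding are
# simplifications, not the STANAG 4774/4778 formats.
import hashlib
import hmac
import xml.etree.ElementTree as ET

def make_label(classification: str, policy: str) -> bytes:
    label = ET.Element("ConfidentialityLabel")
    ET.SubElement(label, "PolicyIdentifier").text = policy
    ET.SubElement(label, "Classification").text = classification
    return ET.tostring(label)

def bind(label: bytes, payload: bytes, key: bytes) -> bytes:
    # The binding covers label AND payload: changing either breaks it.
    return hmac.new(key, label + payload, hashlib.sha256).digest()

key = b"shared-secret-key"
label = make_label("SECRET", "demo-policy")
tag = bind(label, b"report contents", key)

# Verification fails if the label is downgraded after the fact.
forged = make_label("UNCLASSIFIED", "demo-policy")
print(hmac.compare_digest(tag, bind(label, b"report contents", key)))   # True
print(hmac.compare_digest(tag, bind(forged, b"report contents", key)))  # False
```

A gateway holding the verification key can therefore trust the classification marking on unstructured data before deciding whether it may cross a domain boundary.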

What measures are being taken to protect the NSC against future threats from quantum computers?

The NSC is designed to be crypto-agile, which means that its cryptographic modules can be upgraded to post-quantum cryptography as soon as this is approved by the BSI. This ensures that the NSC is also prepared for future threats from quantum computers and meets the highest security standards in the long term.

What advantages does SmartNIC technology offer for the security and performance of the NSC?

SmartNIC technology enables the transmission of data from different security domains over a single cable without physically separating the entire network infrastructure. This increases the scalability and flexibility of the NSC and enables new security domains to be set up quickly and securely without the need for additional hardware.

How is the scalability and flexibility of the NSC ensured for large organizations with thousands of users?

NSC uses Kubernetes-based techniques to enable horizontal scaling without manual intervention. This allows additional computing resources to be made available as required. The cloud platform is modular and flexible, making it suitable for both small and large organizations. By integrating new technologies, such as SmartNICs, the NSC can meet the requirements of large organizations.
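The core of Kubernetes-style horizontal scaling is a simple ratio. The sketch below mirrors the documented Horizontal Pod Autoscaler formula; the workload figures are illustrative:

```python
# Kubernetes' Horizontal Pod Autoscaler sizes a workload with:
#   desiredReplicas = ceil(currentReplicas * currentMetric / targetMetric)
# The figures below are illustrative.
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float) -> int:
    """Scale out when the observed metric exceeds the target,
    scale in when it falls below it."""
    return math.ceil(current_replicas * current_metric / target_metric)

# 4 pods at 90% average CPU, targeting 60%: scale out to 6 pods.
print(desired_replicas(4, 90.0, 60.0))  # 6
# Load drops to 30% average CPU: scale back in to 2 pods.
print(desired_replicas(4, 30.0, 60.0))  # 2
```

Because the controller recomputes this ratio continuously, capacity tracks demand with no manual intervention, which is the property the NSC relies on for large user populations.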

What are potential military use cases for the NSC?

Cloud computing plays a critical role in multi-domain operations by providing the necessary infrastructure, data-sharing capabilities and flexibility to integrate and coordinate across various domains of warfare. NSC technology will facilitate unified data and information sharing by fusing data across all domains with real-time access. It will also help with cross-domain coordination and the delivery of a common operating picture, which will increase interoperability between forces. It can additionally support command and control (C2) through distributed C2 networks in the spirit of a Combined Joint All-Domain Command and Control (CJADC2) concept, while enforcing resiliency and redundancy from core to edge continuously.

NSC will also enable AI and data analytics for decision support and help to enable training and simulation for multi-domain operations integration in high-security environments.

Scan me to learn more

NCIA ACQUISITION POLICY

NCIA Head of Acquisition Policy, Ijeoma Ike-Meertens, explains how she is adapting the way the Agency procures equipment and services to make doing business with it easier and more efficient

With the pace of innovation constantly accelerating, organizations must be able to acquire technology as quickly and easily as possible to stay at the cutting edge. This is easier said than done, particularly for multinational organizations like NATO. The Alliance not only has to ensure its procurements are cost-efficient, but that the product or service being acquired is fit for purpose across the 32-nation Alliance. Balancing these two requirements can lead to a complex labyrinth of red tape that must be navigated by all prospective suppliers.

However, an overly burdensome procurement process can often dissuade small- and medium-sized companies from pursuing business opportunities with large public institutions like NATO. Simon Michell asks NCIA’s Head of Acquisition Policy what steps are being taken to encourage smaller, more agile companies to do business with NATO.

What are the key challenges facing tech companies when it comes to doing business with NCIA?

I believe the most prominent challenges tech companies face when engaging with NCIA are navigating complex procurement regulations, understanding the often rigid procurement processes and dealing with the length of time it can take to secure contracts. Many tech companies, especially those smaller in size, are used to fast-paced, agile environments. The bureaucracy inherent in NATO procurement can feel like a major barrier, particularly for businesses accustomed to quickly pivoting and iterating based on customer feedback. In addition, emerging technology solutions can struggle to fit neatly into pre-existing categories, making it difficult for NCIA to acquire cutting-edge innovations swiftly. Another issue is the lack of clarity on how to access opportunities or the expectations around compliance with NATO standards, which can be daunting to those who have not worked in this space before.

How have you been able to alleviate some of the hurdles facing small- to medium-sized tech companies?

One of our primary initiatives has been to simplify and demystify the procurement process. We are working to streamline the acquisition lifecycle, especially for small- to medium-sized tech companies, by providing clearer performance-based requirements (i.e. focusing more on the outcome rather than dictating how to get the work done) and by reducing unnecessary red tape in the period between the solicitation closing and a contract being awarded. We are also increasing outreach efforts through industry days in various NATO nations and creating direct lines of communication with our contracting officers, through which businesses can ask questions and receive guidance in real time during the solicitation period.

Another critical step has been expanding the use of innovative acquisition methods such as Rapid Procurement Vehicles (RPVs), particularly the use of Indefinite Delivery Indefinite Quantity (IDIQ) contracts for Commercial-Off-The-Shelf (COTS) supplies and services and Enterprise Agreements (EA) for software. These flexible contracting mechanisms allow faster delivery of goods and services, as well as greater flexibility when it comes to incorporating the latest technological solutions.

Why is it so important that the acquisition policy is nimbler and easier to navigate?

I think that in today’s rapidly evolving technological landscape, agility is critical to both NCIA and the tech industry. When acquisition policy is more flexible and easier to navigate, it benefits both parties. NCIA can more quickly adopt innovative solutions, keeping pace with technological advancements that help us meet mission-critical needs. On the flip side, tech companies, particularly smaller ones, may be more inclined to bid on contracts and bring their innovative ideas to the table if the barriers to entry are lower.

A more flexible acquisition policy also fosters greater competition, which leads to better pricing, more innovative solutions and, ultimately, more value for the taxpayer. We can’t afford to be bogged down by outdated processes in a world where technology is rapidly evolving. Our policies must be as forward-thinking and adaptable as the solutions we’re looking to acquire.

What are you most proud of since you have become Head of Acquisition Policy at NCIA?

I’m most proud of the establishment of a Joint Centre of Expertise (JCoE) with the NCIA Acquisition Office and the Chief Operating Office, aimed at accelerating the source selection phase of the acquisition process. The JCoE brings together seasoned acquisition professionals from across various offices in NCIA to streamline decision-making and ensure that the best solutions are selected more efficiently. By centralizing expertise, we are cutting down on the time it takes to evaluate bids and award contracts, while still maintaining the rigour and thoroughness necessary for high-stakes procurements.

NCIA Industry Days help to demystify the procurement process and give smaller companies the confidence to do business with NATO (PHOTO: NCIA)

In addition to speeding up the selection process, we are adopting a performance- and risk-based approach to acquisition, which shifts the focus from rigid compliance with processes to a more dynamic evaluation of outcomes and potential risks. This approach allows us to be more flexible and innovative, ensuring that we can adopt cutting-edge solutions while mitigating risks early in the process. It’s been a significant shift in how we assess vendor capabilities and align them with NCIA’s mission-critical needs.

Finally, I’m proud of our efforts to simplify and standardize acquisition processes across the board. We are working hard to reduce complexity and eliminate redundant procedures, making it easier for vendors – especially small- and medium-sized tech companies – to do business with NCIA. By standardizing key aspects of the acquisition process, we are making it more predictable and transparent, which benefits both NCIA and our industry partners.

What are your next priorities in the continued evolution of acquisition policy?

Our top priority is to continue simplifying and reforming the acquisition process to make NCIA a more attractive and accessible partner for businesses of all sizes.

We are actively reviewing our current policies and procedures to identify areas where we can reduce unnecessary complexity and streamline the steps required to compete for NCIA contracts. This means cutting down on administrative burdens, clarifying requirements and standardizing processes to create a smoother experience for companies looking to engage with us.

We’re also focusing on enhancing transparency and communication throughout the acquisition lifecycle. By improving how we communicate with vendors, particularly during the pre-solicitation and evaluation phases, we can ensure that companies better understand our needs and expectations from the outset, which will lead to more tailored and innovative proposals. This clarity will help us attract a broader pool of potential partners, including those who may have previously been deterred by the intricacies of NATO procurement.

In addition, we’re working on fostering closer collaboration between NCIA and the private sector. This includes building out initiatives that encourage more frequent engagement, such as industry days, direct consultations and feedback sessions with vendors. By doing so, we aim to position NCIA as not just a customer, but a partner that companies want to work with for long-term success.

Ultimately, our goal is to make NCIA’s acquisition process more agile, competitive and appealing, ensuring that we continue to attract the best and most innovative solutions to meet our evolving needs.

NATO’s complex procurement processes are evolving to encourage small- and medium-sized companies to compete for business (PHOTO: NCIA)

Banshee — bringing military 5G to the battlefield

Nathan Stenson

Vice President of Sales and Partners at Nokia Defense International

Dave Petersen

Vice President of Tactical Wireless and Business Development at Nokia Federal Solutions

What role will 5G play on the battlefield of the future and in wider military activities?

Nathan: At Nokia, we believe 5G already has a critical role to play on the battlefield by complementing existing military radio systems and networks, rather than replacing them. This role extends across multiple domains: rear echelon, forward operations and tactical battlefield.

The commercial world understands that there is an undeniable need to move processing closer to workloads. So, too, does the defence sector, which has an increasing multiplicity of sensors and systems in and around the battlefield — from biovital sensors on personnel to a host of platform-based sensors that are all working continuously to achieve information superiority. By locating that processing function as close to the troops on the ground as possible, it is possible to shorten the time required for decision-making, not just on the battlefield, but in all supporting dimensions.

However, when complementing existing military radio communications systems, there are some key requirements. For example, you will need a platform that can deliver the 5G bubble and integrate it with existing mobile ad hoc network (MANET) radio technology. In fact, you are going to need to be able to host a range of applications as well. And, of course, any 5G system placed on the battlefield will have to be able to cope with jamming activity. It would also be helpful to have a range of backhaul options too. Essentially, what I have just described is exactly what we have with the Banshee platform, which has been enabled through our acquisition of the Fenix Group.

Can you elaborate on the unique features that the Banshee 5G military network brings to the battlefield?

Dave: We developed a 4G product called Banshee based on Nokia technology, which we are now extending to 5G thanks to our joint design engineering project with Nokia. The design parameters of Banshee, in its earlier 4G state, originate, to a large extent, from the experience of some of the Fenix Group’s members as Special Forces veterans. As you can imagine, security is therefore of the utmost importance. Next is bandwidth. By that, I mean, if you have a lot of data available, you need to get that to the right people at the right time as quickly as possible. This is a key feature of Joint All Domain Command and Control and it is why Banshee’s multiband capability is a critical feature as it allows operating on more than a single 5G band.

The second thing Banshee offers is the ability to operate in a mobile, dynamic environment. For example, if a military operator has a 5G node on the battlefield and needs to extend it to another to give access to additional users or transport additional volumes of data, Nokia enables that through meshing. That means taking an existing network architecture the military already has and wrapping a private cellular network around it. That can be of immense use on the battlefield. Nokia has patented an edge-compute functionality combined with MANET radio technology to achieve this. We are now developing Banshee by combining those technologies to create a hybrid platform for MANET and 5G.

One of the other features that is critically important for battlefield use is making it simple. If you can’t teach a soldier how to use something like Banshee in a couple of days at the most, it is unlikely to get deployed in a real-world scenario. So, making it simple and easy to use and understand is an incredibly important feature for the battlefield production network that we have created.

The battlefield is a highly contested environment; how can Nokia ensure information security with 5G on operations?

Dave: Recent real-world events have shown that it is possible to infiltrate a Telco carrier network. Across the world, some police forces do it, as do some governments. That is why Nokia provides a package of extremely compact 4G and 5G systems for the battlefield that enable the military to take ownership of their infrastructure instead of leasing it — a private 5G network. This enables the military to transport their classified data across networks and communicate with a much more limited threat of interception.

We know that military planners want 5G because it provides bandwidth scalability to their existing networks. That is why we are incorporating multi-band functionality to allow them to continue to operate in a contested environment where jamming may be present. It is undisputed that 5G works extremely well back at the home base because there is not a lot of threat from electronic warfare and jamming there. But, in a frontline scenario, electronic warfare and jamming are very real threats. You need to have some resiliency and redundancy built into the network.

How is Nokia collaborating with NATO to advance 5G activities across the Alliance?

Nathan: Nokia has a long-standing relationship with NATO and other defence industry forums, with which we have been collaborating for many years.

In terms of 5G specifically, Nokia contributes to multiple NATO 5G-related initiatives. Nokia also attends NATO 5G field trials as we have an enormous amount to offer in terms of our wider 5G ecosystem and carrier-grade capabilities. As you would expect, we are also supportive of NCIA’s 5G Multinational Group.

Nokia is also a member of DIANA (Defence Innovation Accelerator for the North Atlantic). Last March, we announced that Nokia Bell Labs in Budapest, Hungary, was selected to be part of the NATO innovation ecosystem as both an accelerator and test site. Nokia Bell Labs provides access to cutting-edge testing facilities to DIANA’s current cohort of companies, as well as those DIANA will support in the future, giving them expert advice and access to test and trial their technologies in specialized environments. In short, wherever there is military 5G activity being developed, Nokia is on hand to contribute its experience, expertise and infrastructure. nokia.ly/defense

APSS PARTNERSHIPS

NCIA’s Rich Laing talks to the Head of the Space Technology Adoption and Resilience Branch, Laryssa Patten, and the Chief of NCIA’s JISR Centre, Matt Roper, about the increasing role commercial space partners will play in delivering decision-making data to the Alliance

The NATO Alliance currently faces a multitude of challenges from areas such as terrorism, pandemics, climate change, migratory flows and the re-emergence of geopolitical competition as highlighted within the NATO 2030 vision. As such, the complexity of today’s strategic environment exposes a pressing need to enable swift decision-making and coherent policy implementation.

At the Washington Summit in July 2024, 17 Nations committed to supporting the Alliance Persistent Surveillance from Space (APSS) programme. The APSS initiative aims to enhance and accelerate situational awareness and decision-making through a dedicated programme for access to shared data from space and the integration of this data into NATO systems.

Through pioneering the ability of the Alliance to harness space ISR capabilities and deliver operational effects from the Space Domain, the APSS initiative will deliver a transformational change in NATO’s ability to increase the level of situational awareness through access to data from space. In the early phases, APSS programme management is within NCIA’s Chief Technology Office, specifically the STAR (Space Technology Adoption and Resilience) team, to ensure that technology activities are aligned. Beyond that, the supporting Nations will be collectively and significantly increasing the amount of data and analytic support that Allied Command Operations (ACO) is able to leverage.

AQUILA – VIRTUAL CONSTELLATION

In addition to the objective of integrating space data into NATO systems, the APSS programme will establish an Alliance virtual space-based Intelligence, Surveillance and Reconnaissance (ISR) constellation to improve the reliability, availability, efficiency, flexibility and resiliency of access to data and services from space in support of NATO’s political and military activities and interests. This virtual constellation, named Aquila, will be constructed from the satellites committed to providing data in support of the Alliance including commercial satellite data.

Commercial Earth Observation capabilities are becoming increasingly capable and prevalent and will constitute a key element of Aquila to provide ACO with resilience and persistence when and where required.

Commercial Earth Observation capabilities can support the Alliance with data collected from a wide variety of sensors, and insights derived from said data that can accelerate decision-making.

The capabilities of commercial data and services from space continue to develop at pace — a development that has been increasingly visible to the public through commercial space companies delivering critical capabilities to Ukraine. The ready availability of such space-derived data supports national security through the timely provision of information and insights and provides images to support news stories across multiple channels, assisting in combating misinformation.

The commercial satellite industry continues to increase the number of satellites launched into orbit annually. In 2023, 2,781 new commercial satellites were launched — an increase of 20% from the previous year. By the end of 2023, 9,691 active satellites circled the Earth, translating to an increase of 361% over the past five years. In addition, the types and quality of sensors in space offer new and increased intelligence.
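The cited growth figures can be cross-checked with some simple arithmetic. The sketch below takes only the numbers stated above (2,781 launches in 2023, a 20% year-on-year increase, and 9,691 active satellites representing a 361% rise over five years); the derived 2022 and 2018 baselines and the compound annual growth rate are illustrative back-of-the-envelope values, not figures from the article.

```python
# Back-of-the-envelope check of the satellite growth figures cited above.
# Only the 2023 counts and percentage increases come from the article;
# everything derived here is illustrative arithmetic.

launched_2023 = 2781
launched_2022 = launched_2023 / 1.20            # implied by the 20% year-on-year increase

active_end_2023 = 9691
active_end_2018 = active_end_2023 / (1 + 3.61)  # implied by the 361% five-year increase

# Compound annual growth rate of the active satellite population over five years
cagr = (active_end_2023 / active_end_2018) ** (1 / 5) - 1

print(f"Implied 2022 launches: ~{launched_2022:.0f}")
print(f"Implied active satellites, end of 2018: ~{active_end_2018:.0f}")
print(f"Implied annual growth of active satellites: {cagr:.1%}")
```

The implied 2018 baseline of roughly 2,100 active satellites is consistent with the scale of growth the article describes: the active population has more than quadrupled in five years, an annual growth rate of around 36%.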

Laryssa Patten, Programme Manager for APSS, highlights that it is this growth of capacity and capability that provides the Alliance with the opportunity to augment current ISR capabilities and rapidly deliver strategically important capabilities against the needs of ACO. “The APSS Programme was established to address ACO’s growing need for space data and ensures that NATO maintains its technological edge. The unique vantage point of space allows for wide-area surveillance with persistence and access to denied areas. The commercial space sector’s capabilities play a crucial role in NATO’s APSS programme. In the past two years, the sector’s progress in both launch volume and product quality has been unprecedented, creating vast new opportunities for the Alliance that are seized via APSS.”

APSS BASIC ORDERING AGREEMENT (BOA)

Within APSS, NCIA has established a mechanism for the Alliance to rapidly access a wide array of companies within the commercial space arena. The APSS Basic Ordering Agreement (BOA) has more than 75 companies registered, each capable of delivering Earth Observation from high-end sensors, data insights or analytics. The BOA allows ACO’s requirements to be advertised to registered companies, inviting a rapid response regarding their ability to answer operational needs.

In August 2024, working closely with the ACO operational requirements, the first commercial contract (allowing direct tasking by SHAPE) was awarded to Planet Labs PBC for electro-optical imagery. On 20 August 2024, SHAPE directly tasked a commercial satellite for the first time; moving into a new era that entails a dynamic and active relationship with commercial space data providers.

This contract has allowed SHAPE to adapt existing processes to incorporate commercial tasking and has been supported by the development of an architecture for the integration of commercial data onto NATO systems. This is in line with the strategic aim of the APSS initiative, which is to unify, consolidate and integrate the Alliance’s space-based data services (commercial and national) to serve the intelligence and information requirements of NATO.

The Alliance’s need to harness the capabilities offered by the commercial satellite industry is not constrained to electro-optical imagery. Looking to the future, access to sensors such as synthetic aperture radar, infra-red and hyperspectral will augment the data collected organically by the Alliance. Maintaining awareness of the threats NATO faces also requires the ability to understand emerging incidents and potential threats. As such, it is increasingly important that there is resilience in those capabilities that can be tasked ‘24/7’ and in all weather conditions.

Planet Labs was awarded the first commercial contract under the APSS BOA. This Planet Labs image shows Ukraine’s Snake Island in the Black Sea (PHOTO: Planet)

An escalating amount of data provided to the Alliance can also be supported through the provision of advanced analytics and related services from commercial providers. Informing decision-making through data insights is an area that is being explored within the APSS programme. This includes efforts to integrate analytical solutions for the challenge of ever-increasing volumes of data being delivered to the Alliance Joint Intelligence, Surveillance and Reconnaissance (JISR) community. This work will leverage the commercial development of advanced analytics, integrating solutions into the NATO JISR processes and architecture.

Commercial space-data integration presents challenges to Alliance policies, processes and procedures — all of which are being addressed by the APSS programme. The technical integration of commercial space data and services into existing and future JISR architectures (alongside Alliance and nationally provided data), is a primary focus area for the JISR Centre. This includes the development of processes to ensure access to a broad and deep pool of data with the transfer of only essential data to Alliance systems, thereby reducing strain on storage and bandwidth.
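The "transfer only essential data" approach described above can be sketched as a metadata-first selection step: browse a lightweight catalogue of what is available, then fetch full products only for items that match operational criteria. The sketch below is a hypothetical illustration of that pattern; the record fields, thresholds and names are assumptions and do not reflect any actual NATO or vendor interface.

```python
# Hypothetical sketch of metadata-first filtering: only scenes matching
# operational criteria are transferred, reducing storage and bandwidth load.
from dataclasses import dataclass

@dataclass
class SceneMeta:
    scene_id: str
    aoi: str            # area of interest the scene covers (illustrative tag)
    cloud_cover: float  # fraction, 0.0 to 1.0
    size_gb: float      # full-product download size

# A toy catalogue of available scenes (metadata only, cheap to query)
catalogue = [
    SceneMeta("S-001", "black-sea", 0.05, 4.2),
    SceneMeta("S-002", "black-sea", 0.80, 4.0),  # too cloudy to be useful
    SceneMeta("S-003", "baltic",    0.10, 3.8),  # outside the area of interest
]

def select_scenes(catalogue: list[SceneMeta], aoi: str,
                  max_cloud: float = 0.2) -> list[SceneMeta]:
    """Return only the scenes worth transferring to downstream systems."""
    return [s for s in catalogue if s.aoi == aoi and s.cloud_cover <= max_cloud]

wanted = select_scenes(catalogue, "black-sea")
saved = sum(s.size_gb for s in catalogue) - sum(s.size_gb for s in wanted)
print(f"Transferring {len(wanted)} of {len(catalogue)} scenes; ~{saved:.1f} GB avoided")
```

In this toy example, only one of three scenes is transferred; the filtering decision is made against metadata alone, before any heavy imagery moves across the network.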

INTELLIGENCE AND ISR FUNCTIONAL SERVICES

Matt Roper, Chief of the JISR Centre within NCIA, highlights the importance of ensuring the APSS technical work (which is moving forward at pace) is informing the wider JISR Centre’s portfolio and helping future-proof the JISR enterprise. “Access to commercial data within the JISR community remains a critical element of our business. APSS has been able to address issues of data transfer, ownership, licensing, storage and access. By undertaking this work within the JISR Centre we are ensuring that the direction of travel is consistent with other Alliance programmes, which are also addressing data access, exploitation and dissemination requirements. One such project, the Intelligence and ISR Functional Services, also recognizes the need to access and integrate commercial data, and the work we’re doing on APSS is helping advance our understanding of that project’s needs.”

“Importantly, the APSS programme’s engagement with the commercial space industry partners that sit within the BOA has allowed for rapid advancement in our understanding of the challenges of data integration and access.”

As the commercial space sector continues to develop, with technical innovation increasing productivity and availability, the Alliance will continue to seek areas for further integration to augment organic and nationally provided capabilities. The Earth Observation portion of the overall global space economy is currently undergoing significant growth. Commercial technologies such as Synthetic Aperture Radar (SAR), hyperspectral, edge computing in space and thermal infrared remain in a relatively early phase of development but are advancing at pace. The APSS BOA will allow for these technologies, and more, to develop in line with Alliance requirements.

NATO signed a contract with Planet for the supply of Earth Observation imagery (PHOTO: Planet)

INDUSTRY PERSPECTIVE

Fortion® Massive Intelligence

Frédéric Julhes

Head of Defence Digital France Programmes, Airbus

Could you please explain what is behind Fortion® Massive Intelligence technology and why it was developed?

Our world is constantly evolving and so are our intelligence systems. Given recent changes to the strategic environment, increased threat complexity, the data explosion, the growing employment of Western forces on hybrid and asymmetric operations, and the inadequacy of the Cold War-derived intelligence architecture, it was essential for us at Airbus Defence and Space to review the traditional approach to intelligence systems.

This is why, a few years ago, we invested in a new approach and architecture for processing, exploitation and dissemination (PED) intelligence systems. The aim was to enable intelligence analysts to spend 80% of their time on added-value analysis tasks instead of searching for the right data. This is how Fortion Massive Intelligence was born!

Fortion Massive Intelligence helps defence and security customers be more efficient in their daily work, automatically processing huge amounts of data from all sources. Not only can it aggregate information from imagery, open sources, signals, cyber and human intelligence, but it can also integrate one’s own data and analytics.

The modular and flexible approach of the architecture enables adjustment of the components and services that can be offered to various intelligence customers, according to their existing software applications and IT infrastructure. It extracts the different features and capabilities of existing products and transforms them into microservices in the cloud, which can be public, private or hybrid. Thus, customers can select or deselect services depending on their needs and existing solutions.

Today, Fortion Massive Intelligence solves many deficiencies observed within the intelligence community, including the difficulty in searching, finding and accessing adequate data in a set of different databases with different formats and access rules; the complexity of integrating existing customers’ proprietary applications and databases; and the lack of flexible ‘need to know’ management processes, rules and capabilities.

Is Fortion Massive Intelligence technology already deployed? If so, what feedback do you have at this stage?

Yes, it has already been deployed within a French intelligence agency. Furthermore, Massive Vision, the dual-use version of Fortion Massive Intelligence, serves as the core for the development of a proof of concept (PoC) to support the Political-Military Assisted Decision Making (PM-ADM) capability development programme within NATO ACT, for a use case completely different from that of an intelligence system.

Even though I cannot disclose for which purpose the French customer is using our system, I can share some feedback we have received. Users appreciate the ability to access and select varied and heterogeneous data from multiple sources through a single search mechanism, called the ‘Federated Search’ function.

The artificial intelligence (AI) and analytics provided in most operations functions are of great help, as well as the reception of alerts based on the specific criteria they have defined. They also praise the easy integration of their databases and optimized data visualisation, which provides a better understanding of the data and facilitates decision-making. In addition, they report experiencing a drastic reduction in the time required to perform their analysis tasks thanks to automation and collaboration offered by Fortion Massive Intelligence and Massive Vision.
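The ‘Federated Search’ function described above follows a well-known integration pattern: a single query fans out to heterogeneous sources through per-source adapters, and the hits are normalized into a common record format. The sketch below illustrates that general pattern only; all class, function and source names are hypothetical, as Fortion Massive Intelligence’s actual interfaces are not public.

```python
# Illustrative sketch of a federated-search pattern: one query, many
# heterogeneous sources, results normalized into a common record shape.
# All names here are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Record:
    source: str   # which database or feed the hit came from
    ref: str      # source-native identifier
    summary: str  # normalized free-text summary

# Each adapter hides one source's format and access rules behind the same
# callable signature: query string in, list of normalized Records out.
def imagery_adapter(query: str) -> list[Record]:
    return [Record("imagery", "IMG-001", f"image tagged '{query}'")]

def osint_adapter(query: str) -> list[Record]:
    return [Record("osint", "OS-17", f"open-source report mentioning '{query}'")]

def federated_search(query: str,
                     adapters: list[Callable[[str], list[Record]]]) -> list[Record]:
    hits: list[Record] = []
    for adapter in adapters:  # a production system would query sources in parallel
        hits.extend(adapter(query))
    return hits

results = federated_search("harbour activity", [imagery_adapter, osint_adapter])
for r in results:
    print(r.source, r.ref, r.summary)
```

The key design point is that the analyst issues one query and receives one merged result set; each adapter is responsible for translating that query into its own source's format and access rules, which is where ‘need to know’ enforcement would also sit.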

Can you tell us more about the NATO PM-ADM project and describe how Supreme Allied Commander Transformation (SACT) plans to use it?

You might have heard of the PM-ADM programme launched by HQ SACT in 2023 and wondered what is behind it. This capability development programme aims to support NATO’s political and military leaders from the North Atlantic Council (NAC) and Military Committee (MC) in identifying potential threats to NATO’s interests at an early stage, considering all instruments of power (IoPs). Airbus was awarded a contract by HQ SACT to develop the proof of concept (PoC) demonstrator in the summer of 2023.

The PM-ADM programme is being developed incrementally to keep the stakeholders at the heart of the process and mitigate risks throughout the capability development process. Earlier this year, Airbus teams presented the initial versions of their PoC to SACT, his staff and the National Liaison Representatives in HQ SACT Norfolk in Virginia. The aim of this demonstration was to obtain direction and guidance from HQ SACT senior leadership for the subsequent stages of PoC development.

NATO ACT plans to demonstrate the PoC at the North Atlantic Council Military Committee away day in early 2025. They also plan to test it during the Coalition Warrior Interoperability 2025 Exercise (CWIX25), during which they will compare how decisions are taken in a crisis scenario without our tool and how they would have been taken with our support and artificial intelligence. This demonstrator, developed with the PM-ADM programme team at HQ SACT, will therefore be highly visible to NATO top decision-makers.

To what extent is it a key solution for NATO?

The PM-ADM demonstrator is shaping the future capability of NATO to make strategic decisions while being supported by artificial intelligence. With this tool, NATO will benefit from assisted decision-making in multidomain operations, acknowledging the impact of complex relationships between the political, military, economic, social, information, intelligence, law enforcement and environmental instruments of power. It will enhance NATO’s ability to conduct strategic foresight, decide faster than its adversaries and be stronger together.

What about the next steps? How do you envisage the future of Massive Intelligence? Have you already identified new use cases?

Airbus will keep progressing the development of the PM-ADM proof of concept in close collaboration with the PM-ADM programme team at HQ SACT. As part of PM-ADM, NATO recently asked us to develop a new use case based on a specific operational scenario that fits the current geopolitical situation.

Additional requests from our current customers should also result in including more capabilities in the tool, such as predictive analysis.

Once the request for proposal for the implementation of PM-ADM capability is published, Airbus will, of course, propose Massive Vision. Additionally, several other customers within NATO are interested in our demonstrator, which opens the door to new opportunities.

The market for this AI solution appears very promising.

https://intelligence.airbus.com/industries/defence/joint-isr/fortionmassive-intelligence/

VIEW FROM THE NATIONS

USA

DELIVERING TECHNOLOGY TO SPECIAL OPERATIONS FORCES

Lisa Sanders, Director of Science and Technology for Special Operations Forces, SOCOM Acquisition, Technology and Logistics, outlines the technologies she seeks and the ecosystem she relies on for their development and delivery

Today’s battlefield is complex, rapidly changing and technologically driven. Delivery speed of capabilities to Special Operations Forces (SOF) is a core function of the Acquisition, Technology and Logistics (AT&L) team at United States Special Operations Command (USSOCOM).

USSOCOM AT&L is responsible for leading the pathfinding, development, production and fielding of the cutting-edge capabilities required by SOF operators around the world.

USSOCOM AT&L keeps warfighters involved throughout the process of procuring new capabilities, beginning with identifying the most relevant technology for their unique battlespace. Representatives from each of our service components and theatre Special Operations Commands identify gaps and provide operational perspectives on proposed ongoing projects intended to mitigate those gaps.

USSOCOM AT&L has the responsibility of pathfinding and driving SOF acquisition using a diverse industry ecosystem.

Our work with small businesses continues to grow, driving innovation within the SOF enterprise through a variety of means.

USSOCOM AT&L’s Science & Technology (S&T) directorate and SOFWERX (our public-private partnership) uniquely leverage our Small Business Innovation Research programme, which is one entry point for small business engagement.

Other entry points include Vulcan and Engage SOF (eSOF), our electronic submission pathways to submit business proposals for assessment and receive feedback directly from the relevant SOF customer. We are also accelerating and increasing collaboration with the private capital investment ecosystem to maximize the power of the private capital market to address SOF problems and accelerate capability. This diversity of pathfinding and procurement capability enables SOF to bring in the most relevant, technologically advanced ideas to defend, and if necessary, fight and win in contested environments.

“Our partnership within the Department of Defense and interagency is key to SOCOM AT&L success”

In addition to our partnership with industry, our partnership within the Department of Defense (DOD) and interagency is key to SOCOM AT&L success. We depend on the capabilities of each service within the DOD to ensure we can connect to the greater joint force in the fight. The partnerships within greater DOD allow us to leverage and partner with others to quickly experiment, prototype and push capabilities to the field.

Technology is at the forefront of SOCOM AT&L’s mission and there are five key categories of AT&L modernization efforts:

• Warfighter performance: We invest in human-centric technologies and capabilities best-suited for the environments SOF operators work in.

• Emplacement and access: We provide capabilities that allow SOF teams to get to target and help avoid detection by adversary systems.

• Battlespace awareness: We field technologies that fuse data streams to support forces in all domains.

• Multi-domain command and control: We develop technology that processes information quickly and disseminates the most relevant information for the mission.

• Precision and scalable effects: We invest in the rapid integration of open architecture to integrate new technologies.

SOCOM AT&L works with warfighters to solve their unique challenges (PHOTO: U.S. DOD)

AT&L’s S&T directorate invests in discovering, developing and advancing creative technologies that address SOF capability gaps. In the fiscal year 2024 (FY24), we increased transition pathways and developed new partnerships with government, academic, commercial and international partners, increasing the speed of technology maturation. Two of S&T’s capability areas – Battlespace Awareness (BA) and Multi-Domain Communications and Computing (MDCC) – are particularly relevant to NITECH.

BA has ongoing investments in autonomous sensor delivery and placement for persistent situational awareness reporting, and in Artificial Intelligence and Machine Learning (AI/ML)-enhanced sensor fusion, enabling tactical and theatre data processing and analysis to support mission planning, execution and damage assessment. In FY24, we began projects to develop collaborative autonomy systems that autonomously deliver and place sensors, increasing SOF’s ability to anticipate and act on threats in contested environments and providing enhanced evaluation, analysis, prediction, mission planning and battle damage assessment.

MDCC is primarily focused on contested communications, defensive cyber operations and alternative navigation technologies in support of the command’s objectives. In FY24, S&T initiated nine new projects focused on automated systems of systems at the point of need, with technology development consisting of a variety of artificial intelligence and edge-computing applications.

USSOCOM is focused on serving as an early adopter of technology. Our unique position as a globally employed joint force deeply integrated with our international partners, combined with direct acquisition authorities, provides a perfect platform to work with the global technology ecosystem, delivering early user feedback into the technology development cycle.

Lisa Sanders speaks at SOCOM Innovation Foundry 15 in London, 17 April 2024, which brings together military, industry, academia, labs and futurists. (PHOTO: Courtesy photo)

ACCELERATING THE PACE OF CAPABILITY DELIVERY

THE C-PMO AS A CATALYST

FOR CHANGE

Frederic Jordan reveals the benefits of the newly established Corporate Portfolio Management Office (C-PMO), which he leads, and explains how it will transform NCIA’s delivery of NATO common-funded projects

What is the Corporate Portfolio Management Office (C-PMO)?

Established in June 2024 within NCIA’s Chief Operating Office (COO), the C-PMO serves as a central office to enhance the business value of common-funded projects. It provides a structured framework and the necessary tools for coordinating, prioritizing and overseeing the execution of those projects.

The C-PMO also drives changes in the areas of project management and delivery. It actively fosters innovation through the introduction of simpler, more streamlined processes that support capability delivery and continuously integrates best practices from the broader defence and technology sectors.

How does the C-PMO support NCIA’s new operating model?

NCIA is facing an unprecedented demand for its services and support, necessitating a transformative shift in its operating model. This new model emphasizes enhanced customer insights and more effective solutions delivered at the speed of operational relevance.

Within this framework, the C-PMO plays a pivotal role. First, it serves as a bridge between internal project teams, external partners and stakeholders, facilitating stronger collaboration and alignment on capability requirements. The C-PMO also fosters a growth mindset within project teams, emphasizing flexibility, adaptability and continuous learning.

Simultaneously, it promotes a more agile project management culture, collaborating with all business and functional areas to streamline and simplify the processes enabling project management and delivery.

What changes will the C-PMO introduce?

The C-PMO is committed to introducing and sustaining the radical changes that will enable the delivery of capabilities in increments of no more than 18 months.

At the core of these changes, and in close collaboration with the Joint Centre of Expertise (co-led by COO and the Acquisition Office), is the introduction of performance-based acquisition. This approach focuses on outcomes (the ‘what’) rather than prescriptive requirements on the ‘how’, encouraging contractors and partners to innovate and find the most efficient ways to deliver capabilities or services. By setting clear performance goals and metrics, performance-based acquisition allows suppliers the freedom to determine how best to meet those objectives, fostering creativity and accountability across the project or service-delivery lifecycle. The emphasis is on delivering results that meet operational needs, rather than strictly adhering to rigid requirements on how to deliver the work.

Performance-based acquisition also aligns well with the C-PMO’s broader goals of enhancing project efficiency and effectiveness. By focusing on measurable performance outcomes, NCIA can track progress more accurately and ensure that resources are allocated in ways that maximize value. This shift also creates incentives for vendors to continuously improve their processes and solutions, as their success is tied directly to their ability to meet or exceed performance metrics.

To support this new approach, the C-PMO is working with NCIA’s business and functional areas to improve internal resource management through the ring-fencing of resources allocated to projects during time-boxed activities. This ensures that teams remain fully focused on a single project at a time, minimizing distractions and competing priorities, and allows for deeper engagement with the project’s objectives, reducing inefficiencies that arise from task-switching and enhancing the overall quality of work. Moreover, it accelerates decision-making, as resources are fully dedicated to resolving issues in real time, leading to faster project completion and a higher likelihood of delivering on time and within budget.

In parallel to this, the C-PMO is working closely with the NCI Academy to support the development of a highly professionalized project management workforce. Through continuous and adaptive learning programmes, project managers will be equipped with the latest knowledge and techniques, ensuring they can respond to evolving challenges with agility and insight. This focus on workforce development is critical to ensuring that lessons learned from past projects are integrated into future initiatives, maintaining the momentum of capability delivery improvements.

What challenges does the C-PMO face?

Driving large-scale change in a complex organization is always a challenging endeavour that requires a strategic approach balancing clear vision-setting with adaptive, stakeholder-centred communication. The goal is to ensure that transformation initiatives are well communicated and integrated thoughtfully across all levels and aligned with organizational goals.

In the current environment, where there is an exceptionally high demand for NCIA resources, these challenges are magnified. Transformation initiatives must compete with other urgent projects for access to necessary resources, making it crucial to exercise agility and resource optimization. This context necessitates careful management of expectations from both internal and external stakeholders, ensuring that realistic targets are set to measure the benefits delivered by the C-PMO.

These benefits will be assessed through the expedited delivery of projects, improved use of resources and increased stakeholder satisfaction. These measures will not only serve as benchmarks for the C-PMO’s success but also provide a transparent mechanism for tracking progress.

Despite these challenges, or perhaps because of them, the C-PMO is a vibrant and dynamic team, energized by the opportunity to drive change. It is fully committed to advancing NATO’s mission with determination and purpose, continuously striving to meet the evolving needs of the organization.

How will the C-PMO evolve in the future?

As artificial intelligence (AI) technology advances, it will play a crucial role in transforming how the C-PMO manages data and decision-making processes. AI will enhance our ability to analyse vast amounts of data in real time, improving decision accuracy at every stage of a capability delivery lifecycle.

For instance, AI-driven insights can significantly optimize scenario planning and resource allocation. By predicting outcomes and potential risks, AI can help NCIA prioritize activities with the greatest potential impact while mitigating possible delays or resource constraints. In addition, AI will enable more dynamic risk management by continuously monitoring projects and detecting issues before they escalate. The C-PMO is working closely with NCIA’s AI experts to ensure AI is applied effectively and responsibly.

Looking forward, the C-PMO’s embracing of AI and data-driven approaches will fundamentally reshape how project data is exploited, paving the way for faster and more efficient capability delivery.

The C-PMO is working with the NCI Academy to support the development of a highly professionalized project management cadre (PHOTO: NCIA)

INDUSTRY PERSPECTIVE

Software security – who’s responsible?

Imagine going to purchase a new car and being asked whether you would like to add the options of safety glass, seatbelts, airbags or crumple zones.

Such basic car safety features are now taken for granted, and indeed any manufacturer not including them as standard would soon go out of business. Yet, when the first automobiles rolled off the production line over a century ago, their designs showed scant regard to the safety of drivers, passengers and the unfortunate pedestrians they occasionally hit. Instead, the design of those first cars prioritized functional performance, comfort and aesthetics. Safety was at best an afterthought.

The evolution of car safety in the intervening century can be viewed in two ways: yes, technological innovation has played a major role, but equally important was the gradual shift in the responsibility for safety from end-user to product manufacturer.

Now consider the current situation with regard to mission-critical software. How often do such products come with an accompanying ‘hardening guide’ — a document detailing how to minimize the risk from cyber threats? Some of these steps — such as applying critical patches or changing the default admin password — although straightforward, are so important that they present huge potential vulnerabilities if overlooked or delayed.

Clearly, until now, responsibility for the cybersecurity of software products has rested largely on the shoulders of the user.

The need for change

As our lives become ever more dependent on digital technology, the potential profits from cybercrime have skyrocketed, driving unprecedented rates of attack innovation and deployment. As well as posing an escalating risk to the business and reputations of IT-dependent organizations, cybercrime has started to disrupt the critical services and infrastructure upon which our society and lives depend — as witnessed in the state-sponsored cyberattacks that exacerbated the conflict in Ukraine.

As a consequence, minimum levels of cybersecurity are increasingly mandated by law through new regulations, such as the second European directive on Network Information Security (NIS2) coming into effect this October. With narrower reporting windows and heftier financial penalties (or even jail time) for non-compliance, NIS2 has raised the priority of cybersecurity across all sectors.

But just as cyber threats are increasing in volume and sophistication, our rapid adoption of new digital technologies has increased the complexity and attack surface of the infrastructure being targeted. As a result, chief security officers are facing unprecedented challenges, and with a global shortage of skilled cybersecurity staff, they need help.

Secure by Design and Secure by Default

For most of its history, IT software design (as with cars) mostly prioritized function over security, which has led to widespread vulnerabilities. To minimize the risk of these being exploited in cyberattacks, we the users have accepted the responsibility of applying regular patches to software we buy.

Recognizing that cybersecurity had become an uneven playing field with certain manufacturers and customer organizations shouldering more responsibility than others, national cyber strategy bodies in key countries began to address these shortcomings, highlighting the importance of building cybersecurity capabilities across society — and of using market forces and the tools of government to do so.

The result was ‘Secure by Design’, outlined in an international whitepaper led by the U.S. Cybersecurity & Infrastructure Security Agency (CISA).

Rather than impose strict new rules that may initially be unachievable for all but the largest technology manufacturers, Secure by Design provides voluntary guidance “intended to progress an international conversation about key priorities, investments and decisions necessary to achieve a future where technology is safe, secure and resilient by design and default.”

At its heart, Secure by Design focuses on three key principles: taking ownership of customer security outcomes, embracing radical transparency and accountability, and building an organizational structure and leadership to achieve these goals.

Jim Richberg, Head of Cyber Policy and Global Field CISO, Fortinet

The objective is to deliver IT products and services in a configuration that provides strong security “out of the box”, rather than expecting users to become product experts before they can use them securely.

The Secure by Design Pledge

Ultimately, to develop products that are inherently secure by design, cybersecurity considerations must inform every step of the product development lifecycle. For many manufacturers, this means considerable change to their processes and organizational structure, as the current (often separate) functions of development, operations and security become integrated under a common DevSecOps mission.

While the full transition represents a significant, multi-year investment for many organizations, its benefits can be seen long before the journey is complete. To that end, in 2024, Fortinet and other vendors, together with CISA, collaborated to make Secure by Design actionable by crafting a set of near-term measures that IT producers could undertake to begin the shift of responsibility for user cybersecurity back to the manufacturers.

The Secure by Design Pledge lets vendors of any size demonstrate their commitment towards this goal so that potential end users can readily distinguish them from vendors that haven’t made the commitment.

In signing this voluntary pledge, vendors commit to making progress each year on each of the following seven steps:

1. Increase the use of multi-factor authentication (MFA) in their products.

2. Phase out the use of default passwords.

3. Start eliminating common classes of vulnerability such as SQL injection or cross-site scripting.

4. Simplify or automate the process for applying security patches.

5. Create and publish a vulnerability disclosure policy — one that welcomes well-intentioned external vulnerability research.

6. Demonstrate increased transparency through faster, more complete vulnerability disclosure.

7. Include logging capabilities that can provide evidence of intrusions — facilitating faster detection and response across the industry.

Some of these steps, such as eliminating a whole class of vulnerability like SQL injection, may require migrating to a more secure programming language or framework that may not bring immediately visible improvements from a customer perspective. So, unless customers start demanding that their suppliers sign the pledge, or at least ask why they haven’t, then some vendors may remain reluctant to make the necessary investment.
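Eliminating a whole vulnerability class means removing the unsafe pattern from the toolbox altogether, not patching instances one by one. As a minimal, vendor-neutral sketch in Python, parameterized queries eliminate SQL injection because user input can never change the structure of the query:

```python
import sqlite3

# Contrast sketch (not from any vendor's codebase): the unsafe pattern that
# creates the SQL injection class, and the parameterized form that removes it.

def find_user_unsafe(conn, name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text.
    return conn.execute(f"SELECT id FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn, name):
    # Parameterized: the driver treats `name` strictly as data, so no input
    # can alter the structure of the query.
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "' OR '1'='1"                    # classic injection payload
print(find_user_unsafe(conn, payload))     # returns every row: [(1,)]
print(find_user_safe(conn, payload))       # matches nothing: []
```

The same logic underlies migrating to memory-safe languages: the bug class becomes inexpressible rather than merely less likely.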

Similarly, increasing transparency through faster and more complete disclosure of vulnerabilities will almost certainly result in more reported vulnerabilities and patches at first. As such, potential buyers need to understand that such elevated metrics may be a sign of positive progress towards Secure by Design rather than evidence of less secure products.

The reality is that all computer code contains errors, and it is preferable that vulnerabilities are discovered by the manufacturer rather than by those who would exploit them.

The way forward

Total security may never be achievable. Modern software applications often comprise millions of lines of code and can encounter a near infinity of use cases against which full testing is impossible. As such, Secure by Design will never be a perfect solution. But as customers recognize that some vendors are prepared to go the extra mile in shouldering some of their cybersecurity burden, the industry as a whole will gradually start to shift in the right direction.

Just as car drivers still need to drive safely and follow the highway code, software users need to continue adhering to cybersecurity best practices such as defence in depth. But, as progress is made, we will all be safer as a result.

www.fortinet.com/blog/industrytrends/fortinet-progress-onits-secure-by-design-pledgecommitments

www.fortinet.com/blog/cisocollective/what-cisos-need-to-knowabout-secure-by-design

www.fortinet.com/blog/industrytrends/How-proactive-responsibletransparency-benefits-customers

www.fortinet.com/blog/industrytrends/rsa-conference-2024embracing-responsible-radicaltransparency

SECURING NATO’S DIGITAL TRANSFORMATION

Major General Dominique Luzeaux (Dr Hab), NATO’s Digital Transformation Champion and Special Advisor to the Supreme Allied Commander Transformation, insists that the risks of failing to digitally transform the NATO Alliance could be fatal. He explains how the Alliance is pursuing its digital transformation vision

DIGITAL TRANSFORMATION: AN ESSENTIAL ASSET FOR NATO

Not only are digital technologies ubiquitous, they have also reshaped society from top to bottom, abolishing frontiers and taboos on one side, while creating digital divides on the other. Like so many other areas of human activity, warfare has also been deeply impacted. Digital technologies are pushing back the boundaries that even the most imaginative proponents of a revolution in military affairs had not foreseen. The Russia-Ukraine conflict bears witness to this phenomenon: cloud migration securing government data under threat, hackers using cheap commercial Realtek-based software-defined radios (RTL-SDR) to jam adversary signals, drones dropping molten thermite on troops… This is only the tip of the iceberg.

All of us are currently learning from the war in Ukraine. But this conflict represents only one kind of military technological revolution, where digitally enhanced weapons systems are being primarily used at the tactical level. Imagine the outcome of facing an adversary with access to world-class hyperscale capacities and abilities.

Paragraph 62 of the Vilnius Summit Communiqué of 11 July 2023 highlights digital transformation (DT) as a major challenge for NATO as it seeks to keep an edge over its adversaries. I quote, “Recognizing the urgency of a digitally transformed Alliance, we have endorsed a Digital Transformation Implementation Strategy to underpin our ability to conduct multi-domain operations, drive interoperability across all domains, enhance situational awareness, political consultation and employ data-driven decision-making.” Fulfilling this critically ambitious objective to prevail in a contested environment begins with an awareness of the brittleness of the digital infrastructure NATO has inherited from the Cold War alongside a multiplicity of diverse systems, processes and capabilities within the Alliance. Such impediments beg the question: are we really ready to win?

NATO MUST ACCELERATE ITS DIGITAL TRANSFORMATION TO ENABLE A MULTI-DOMAIN OPERATIONS ALLIANCE BY 2030

Multi-domain operations (MDO) are defined as “the orchestration of military activities, across all operational domains and environments, synchronized with non-military activities, to enable the Alliance to create converging effects at the speed of relevance”. But, beyond this official definition, the first key tenet of MDO is the extension of any action from land, air and sea to space and cyberspace. Furthermore, as ‘hybrid’ (Russia), ‘unrestricted’ (China) and ‘fifth- and sixth-generation’ (United States) warfare concepts strongly suggest, the information space and non-kinetic effects might become commonplace features of future conflicts. Thus, MDO should be understood as not only getting inside the enemy’s OODA (observe, orient, decide and act) loop as described by Colonel John Boyd’s strategy, but getting inside multiple OODA loops simultaneously so as to overwhelm the adversary’s decision-making processes.

Furthermore, as illustrated by Colonel John Warden’s Five-Ring model, this involves effects not only on fielded military formations, but also on the general population, infrastructure, organic key production assets and leadership. Even if some actions are not compliant with our ethos and rules of engagement, we have to be prepared to counter them as our potential adversaries might behave differently to us.

From a digital perspective, MDO rely fundamentally on the interconnection of forces, platforms and systems, and on secured, trusted and resilient digital systems and data. Just as solid foundations are a fundamental prerequisite for constructing a house, before building walls, working on plumbing and choosing the colour of the curtains, modernizing the digital backbone is a fundamental first step to an effective digital transformation. This should be done as quickly as possible by adopting proven commercial civilian solutions and not redeveloping ad hoc customizations. This backbone will provide the hardware and software necessary for the transmission and transport layers. It will also offer critical services for operational use, such as wireless or satellite-based communications. It is the cornerstone for basic interoperability between communication and information systems.

Adopting commercial contractor-operated cloud technologies must be a parallel endeavour.

Levels of confidentiality are not an impediment and should not be an excuse for inaction as existing cipher algorithms and hardware encryption devices provide technological solutions. The main challenge is overcoming directives based on outdated, risk-averse approaches. We must adapt in order to cope with evolving leading-edge technology and geopolitical situations.

With these foundational elements, sharing and exploiting huge amounts of data becomes possible, subject to their preliminary curation and labelling. This will not only identify, gather and cleanse the data, but it will also add the necessary context to each data type so it can be readily integrated into automated search and/or machine learning algorithms. The goal is to know what data is available, where it is, and to ensure its quality for further use. Sharing data requires defining an appropriate architectural framework or, in other words, a data fabric that federates various existing data platforms, technologies and services such as those provided by Nations or by civilian agencies that may be involved during a multi-domain operation. This is essential for delivering the right data to the right customer at the right time.
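As a loose illustration of that curation and labelling step (the field names here are invented for this sketch; real NATO confidentiality labels follow STANAG 4774 and are far richer), each raw record is cleansed and then wrapped with the context a data fabric needs to route and trust it:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical label structure for illustration only; not a STANAG 4774 label.
@dataclass
class LabelledRecord:
    payload: dict
    classification: str            # e.g. "UNCLASSIFIED", "RESTRICTED"
    source: str                    # originating system or nation
    collected_at: str              # UTC timestamp added at curation time
    quality_checked: bool = False

def curate(raw: dict, classification: str, source: str) -> LabelledRecord:
    """Cleanse a raw record and attach the context needed for federation."""
    cleaned = {k: v for k, v in raw.items() if v is not None}  # drop empty fields
    return LabelledRecord(
        payload=cleaned,
        classification=classification,
        source=source,
        collected_at=datetime.now(timezone.utc).isoformat(),
        quality_checked=bool(cleaned),
    )

record = curate({"track_id": 42, "speed_kts": None}, "UNCLASSIFIED", "sensor-A")
print(record.classification, record.payload)   # UNCLASSIFIED {'track_id': 42}
```

Once every record carries its label, source and quality status, downstream search and machine learning pipelines can filter on that context instead of guessing at it.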

As soon as these technological building blocks are ready, enhanced AI-assisted decision-making at the pace of relevance becomes achievable. Hence, there is a real sense of urgency to be ready by 2030.

THE ALLIANCE MUST ACQUIRE DIGITAL PRODUCTS DIFFERENTLY

Digital lifecycles are measured in years: two to three at the software application level, three to five at the information technology hardware level, and six to eight at the global system level. However, current procurement, development and implementation processes, as defined in the Common Funded Capability Delivery governance model (CFCDGM), are clearly not adapted to such a pace.

There is no doubt that procurement time, as well as programme decision time, must be drastically reduced. In addition, our existing traditional defence industry must evolve and open itself more to be able to collaborate with small- and medium-sized enterprises to a greater extent.

Accelerating procurements and giving them agility will make it possible to adapt to short technology cycles and take advantage of potential disruptions and/or new use cases. This agility must be reflected in the purchasing method itself, which necessitates a culture change within contracting authorities and purchasing agents, as well as overall governance. Priority must be placed on schedule- and user-focused outcomes, rather than on technical and performance scopes.

Engineering contracts, which offer high flexibility in placing orders, can be executed through framework agreements. The rationale behind framework agreements is to achieve savings in the cost of procurement as well as in the time and resources spent in the procurement process. Once a framework contract is in place, the completion of call-off orders is a quick and simple process.

At the Vilnius Summit in 2023, the former Secretary General, Jens Stoltenberg, said digital transformation was a major challenge (PHOTO: NATO)

Contract engineering must be outcome-driven. That is, the supplier must be committed, even if the purchase is adaptive, to the expected result. Periods and milestones must be determined, and the contracting authority should consider committing the contractor to an operations and application maintenance package. Provisions for penalties as well as incentives can be used to control the battle rhythm of development and delivery. Furthermore, cost drift must be controlled through rigorous monitoring and accurate order forecasts.

Consideration should be given to how to achieve the objectives. A prerequisite, therefore, is to clearly define the acceptance criteria for the functionalities delivered. Hence the need, right from the start, for integrated teams to work collaboratively in short cycles, bringing together operational requirements specialists, functional/technical experts and agents responsible for monitoring development and acceptance testing.

Without doubt, there are many ways to make purchasing more agile. However, it requires a culture change for purchasing agents, wider risk acceptance and increased skills in terms of defining the requirements and monitoring the execution.

WE ALL MUST GO ABOVE AND BEYOND IN ORDER TO OUTPACE OUR ADVERSARIES

The proliferation of intelligent sensors and actuators with the exponential growth of the Internet of Things (IoT), coupled with mobile intelligent 5G/6G systems, calls for a global balance of storage and computing resources. This balance must extend from the Edge (sensor/actuator level) to the Core (main, centralized, high-performance resources) with intermediate levels, to guarantee robustness and resilience in case of intentional or accidental loss of service.

Resilience can be achieved by switching to Software Defined Networks (SDN) integrating Secure Access Service Edge (SASE) but, as already emphasized, this requires laying the foundational building blocks first.

The good news is that the technology is already available, as is the commercial (civilian) industrial base. It is mainly a question of reaffirming our commitment to do it. Although not negligible, the investment is comparable to the acquisition of large combat platforms, and is definitely lower than the cost of defeat.

Effective multi-domain operations will require a rapid and robust digital transformation (PHOTO: NATO)

INDUSTRY PERSPECTIVE

At the forefront of cross-domain solutions

Our solutions ensure secure data transfers between networks with different security levels, allowing organizations to operate efficiently without compromising security. Arbit’s products are engineered to perform optimally in challenging environments, whether in military operations or critical infrastructure sectors, ensuring that data integrity and confidentiality are always maintained.

When and why did you establish Arbit Cyber Defence Systems?

I founded Arbit almost 20 years ago. Working as a contractor for the Danish defence sector, I learnt the importance of being able to communicate between networks of different classifications. This inspired me to develop the first version of the Arbit Data Diode in my basement. Since then, Arbit has endeavoured to be at the forefront of cross-domain solutions (CDS). We create advanced security systems tailored for organizations with stringent security and national confidentiality requirements, such as intelligence agencies and the defence sector.

Today, in contrast to the recent international acquisitions and mergers you see among CDS suppliers, Arbit remains an independent Danish company still majority-owned by myself and two other co-owners who have been working towards Arbit’s success for over 10 years. This stability in ownership is a central part of Arbit’s unique value proposition: trust created through transparency and stability.

Our international presence has now grown considerably and our systems are used by organizations in Europe, the Middle East and Asia. In addition, we actively participate in major international defence initiatives, including NATO Edge 2024, NATO CWIX 2024 and the European Defence Fund (SESIOP).

How has Arbit been collaborating with NATO and NCIA?

During CWIX 2024, Arbit successfully provided a fully operational and easily deployable solution capable of releasing data from PINK (high) to GREEN (low), according to STANAG 4774 and 4778. We utilized the modular design of the Arbit Trust Gateway to create the demo configuration in cooperation with NCIA. Next year, Arbit intends to participate in CWIX 2025 with even more capabilities.

Which trends do you see in military cyber and security?

There are several trends in military cybersecurity, most generated by the demand for multi-domain and multinational operations, for example, NATO forces in the Baltics. With a higher degree of interconnection between systems on the battlefield, the threat from cyberattack is also increasing. These trends are:

Introduction of cloud-based systems: The benefit of cloud-based systems is their high level of interoperability. However, this is also a threat, as a compromised cloud could affect the entire operation. Without proper segmentation, the IT environment is more vulnerable. Even without the cloud being directly compromised, the interconnection must be prepared for jamming and other disruptions. The systems also need to work as independent islands, which calls for a more loosely interconnected web of systems that supports graceful degradation.

Hardware — the missing trend in cybersecurity: Most discussions concerning military cybersecurity are about software: applications, operating systems and hypervisors. However, no software can compensate for compromised hardware. We believe that a stronger focus on the hardware level is required. No matter the certifications and approvals a piece of software has, it will never be more secure than the hardware platform on which it is executed. This also includes Field-Programmable Gate Array (FPGA) programming, which is just as dependent on a trustworthy FPGA platform as software is on a trustworthy processor.

Importing civilian technology into dual-use scenarios: In many areas, such as drones for example, civilian technology is being developed faster and is often at the forefront of technological evolution. However, to create reliable solutions for military use, the technology needs to be matured. This process involves testing and certification, which will provide the quality that is expected of military technology.

Rasmus Borch, CEO, Arbit Cyber Defence Systems
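The loosely interconnected, gracefully degrading behaviour described in the first trend can be sketched minimally (all names here are hypothetical): a node prefers the shared cloud picture but keeps operating as an independent island from its last-known local copy when the link is jammed or lost.

```python
# Hypothetical sketch of graceful degradation: the node keeps working
# from its local cache whenever the cloud link fails.

class CloudUnreachable(Exception):
    pass

class TacticalNode:
    def __init__(self, cloud_fetch):
        self.cloud_fetch = cloud_fetch   # callable that may raise CloudUnreachable
        self.local_cache = {}

    def get_picture(self):
        try:
            picture = self.cloud_fetch()
            self.local_cache = dict(picture)     # refresh the island's local copy
            return picture, "live"
        except CloudUnreachable:
            return self.local_cache, "degraded"  # keep operating independently

node = TacticalNode(lambda: {"tracks": 3})
print(node.get_picture())    # ({'tracks': 3}, 'live')

def jammed_link():
    raise CloudUnreachable("uplink jammed")

node.cloud_fetch = jammed_link
print(node.get_picture())    # ({'tracks': 3}, 'degraded')
```

The design choice is that degradation is explicit: callers always learn whether the picture is live or stale, rather than silently receiving old data.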

How does this relate to NATO’s focus on data-centric security (DCS) in multi-domain operations?

DCS is a growing trend in cybersecurity, and combining classification labels with data encryption is a sound principle. The classification label makes it easier for Information Exchange Gateways (like the Arbit Trust Gateway) to determine whether data is allowed to pass through. Encryption protects the data wherever it is stored, ensuring that only people with the right credentials can access it.

One common misunderstanding concerning DCS is that it removes the need for segmented networks. It is true that data at rest is encrypted, and if that encryption is trusted then the data could be stored or transported through a lower classified domain. However, if the data is accessed — and decrypted — then the environment of the data must have the right classification, such as Class II for SECRET information and so on.
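That access rule can be sketched minimally (the level hierarchy and names here are simplified placeholders, not STANAG 4774 structures): decrypted data may only live in a domain whose classification dominates, i.e. is at least as high as, the data’s label.

```python
# Simplified classification lattice for illustration only.
LEVELS = {"UNCLASSIFIED": 0, "RESTRICTED": 1, "CONFIDENTIAL": 2, "SECRET": 3}

def may_access(data_label: str, domain_level: str) -> bool:
    """Decrypted data may reside only where clearance meets or exceeds its label."""
    return LEVELS[domain_level] >= LEVELS[data_label]

# Encrypted SECRET data might transit a lower domain, but it may only be
# decrypted and used where the environment holds the matching clearance.
print(may_access("RESTRICTED", "SECRET"))       # True: high side may receive
print(may_access("SECRET", "UNCLASSIFIED"))     # False: no downward access
```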

You mention certifications, but what is the value of certifications?

We trust our clients — and clients should be able to trust their vendor. International certifications assure clients that our solutions have undergone rigorous evaluation by independent authorities, offering a higher level of trust and transparency beyond internal assurances. Critical, unbiased validation of products is essential.

Certifications such as Common Criteria EAL7+ from BSI Germany and NATO COSMIC TOP SECRET accreditation by the Danish Centre for Cyber Security (CFCS) ensure compliance with the highest security standards, offering clients assurance based on independent assessments rather than vendor claims.

Our Arbit Data Diode 10GbE, certified at Common Criteria EAL7+, guarantees 100% secure, one-way communication between networks, effectively protecting critical infrastructure from external threats and data breaches.

By integrating certified software like SUSE Linux, we provide comprehensive security across the entire network. This independent certification is essential for clients seeking robust, proven solutions to protect their critical assets and ensure that our technology meets the strictest security standards.

ARBIT’S ‘BUILDING BLOCKS’ FOR AIR-GAPPED SYSTEMS

Arbit provides two basic building blocks: the Arbit Data Diode, which solves the task of importing data unidirectionally into a classified network, and the Arbit Trust Gateway, which enables unidirectional release of data from a classified network, creating a secure platform that can support all required security policies.

In addition, Arbit provides advanced CDS that combine the two products. A good example is the Arbit Web Gateway. This enables an application residing on a higher classified network to send an HTTPS request locally on the higher classified network, which in turn is released to the lower classified network, if it conforms to all the security requirements of the Arbit Trust Gateway. The HTTPS reply from the lower classified network will then be sent back to the higher classified network using the Arbit Data Diode. This enables applications to operate even if they were not designed for air-gapped use.
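Conceptually (this is a generic sketch of the request/reply-over-one-way-channels pattern, not Arbit’s implementation), such a gateway uses two independent paths: a policy-checked release channel downward and a diode-style import channel upward. The queues below only model the direction of travel; real diodes enforce one-way flow in hardware.

```python
import queue

release_path = queue.Queue()   # high -> low: policy-checked release
diode_path = queue.Queue()     # low -> high: one-way data import

def release_if_allowed(request: dict) -> bool:
    # Stand-in policy check: only GET-style requests with no body may leave
    # the high side. Real gateways apply the full security policy here.
    if request.get("method") == "GET" and not request.get("body"):
        release_path.put(request)
        return True
    return False

# The high side issues a request; the gateway releases it downward.
print(release_if_allowed({"method": "GET", "path": "/weather"}))   # True
print(release_if_allowed({"method": "POST", "body": "secret"}))    # False

# The low side answers; the reply returns through the one-way import path.
request = release_path.get()
diode_path.put({"status": 200, "for": request["path"]})
print(diode_path.get())   # {'status': 200, 'for': '/weather'}
```

The key property is that neither channel is ever bidirectional: even the reply to a released request enters the high side only through the import-only path.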

Arbit recently signed a 20-year agreement with the Danish Ministry of Defence Acquisition and Logistics Organization (DALO), focusing on CDS for the entire Danish defence community. This is to increase network security across various military units, from HQ server installations to frontline units.

ZERO TRUST ARCHITECTURE: NEVER TRUST, ALWAYS VERIFY

Christian Have, NATO Edge Speaker and Chief Technology Officer at Logpoint, highlights the growing threat from the advanced persistent threat (APT) community and ransomware gangs, and explains why zero trust architectures are key to safeguarding an organization’s network and data

What, in your opinion, are the key evolving cyber and data assurance threats that organizations should be preparing for?

Everyone has heard of ransomware attacks. They are not new, but we have recently witnessed the emergence of a thriving black market where criminal gangs specializing in ransomware act as initial access brokers, i.e. ransomware as a service (RaaS). They compromise networks and then offer that access on the criminal market together with an assessment of the target’s potential willingness to pay, the required budget for the ransomware operation and the return on investment.

That is not new but one of the most alarming things we are now seeing is the increasing collaboration between these criminal entities and state-backed advanced persistent threat (APT) cells. This is particularly true in near-peer adversary countries, such as Iran and North Korea, as well as in peer adversaries like China and Russia. The authorities in the host nations of these criminals are increasingly putting pressure on them and only allowing them to continue to operate so long as they adhere to guidelines and direction from the state.

While the APT landscape has remained fairly static over the past few years, albeit with evolving tactics and techniques, there has been a significant rise in the sophistication of criminal gangs who have plenty of money to spend thanks to their illegal activities. Another worrying aspect is the fact that, as the West has been able to come together to shut down or incarcerate the leadership of many of the criminal gangs in their jurisdiction, the threat has actually increased. This is because the people below that most senior level have struck out on their own, resulting in a massive fragmentation of the ransomware community. These disenfranchised cybercriminals operate more autonomously in a far more decentralized fashion, functioning almost like a gig economy. They share payment and ransomware infrastructure as well as tooling and expertise. Most worrying is how this expertise now includes artificial intelligence (AI), which has the potential to be an absolute game-changer.

What is zero trust architecture and how can it enhance an organization’s data security?

Essentially, zero trust is a mindset. Zero trust architectures are networks protected by a rigorous set of access rules that permit only expected traffic to enter the network. Everything else is denied. To set one up, you compartmentalize your network by creating micro-segmentations. In this way, if your service does get breached, it is only that specific service that is affected. The intruder cannot move out of that micro-segment to another part of the network.

Moreover, the zero trust architecture acts like a detection sandbox where you have complete monitoring throughout. If absolutely everything is logged and registered, and you capture all of the security decisions that control access, you can see every unwanted encroachment on your network. The moment someone pings the network or tries to probe it, your display will light up like a Christmas tree. Then you can start to analyse what is happening and what the implications are for your network and services. A more traditional set-up would mean that you would have to model all of the potential future adversarial behaviours to try to work out what to do next. With a zero trust architecture, you don’t have to.
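A minimal sketch of that deny-by-default, log-everything idea (segment names and rules are hypothetical): one allow-list defines the expected traffic, and every decision, allowed or not, is recorded so probes immediately show up in monitoring.

```python
# Hypothetical micro-segment policy: expected traffic only, full decision log.
ALLOW_RULES = {
    ("web-segment", "db-segment", 5432),   # web tier may reach the database
}

decision_log = []

def check(src_segment: str, dst_segment: str, port: int) -> bool:
    allowed = (src_segment, dst_segment, port) in ALLOW_RULES
    # Every decision is recorded, so any probe "lights up" in monitoring.
    decision_log.append((src_segment, dst_segment, port, allowed))
    return allowed

print(check("web-segment", "db-segment", 5432))   # True: expected traffic
print(check("web-segment", "hr-segment", 22))     # False: denied and logged
print(decision_log[-1])   # ('web-segment', 'hr-segment', 22, False)
```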

Can you describe a successful implementation of zero trust architecture and explain the benefits?

To begin a successful zero trust journey, customers need to start with the network. First, they should conduct a risk assessment of the applications and services hosted on their network so that they can get a true understanding of how vulnerable they are and how they expect them to function. It is important not to disrupt the daily operations of the business when implementing a zero trust architecture, so the ‘deny everything’ rule is activated only once everything has been monitored and the customer begins to realize what should have been blocked from the network. Put more succinctly, you begin by employing an ‘allow everything’ rule, then gradually move towards denying unauthorized traffic across each segment and then each micro-segment of the network. That is what typically works best.

"Zero trust architecture acts like a detection sandbox, where you have complete monitoring throughout"
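The 'allow everything, then tighten' rollout can be sketched as a policy with a monitor mode that records what would have been denied before enforcement is switched on. The two-mode design and field names are illustrative assumptions, not a specific product's behaviour.

```python
# Toy staged rollout: in "monitor" mode all traffic passes but
# would-be denials are logged; once the log has been reviewed and
# the rule set completed, the engine is flipped to "enforce".

ALLOWED = {("hr", "payroll", 443)}  # hypothetical expected flows

class StagedPolicy:
    def __init__(self, mode: str = "monitor"):
        self.mode = mode          # "monitor" or "enforce"
        self.would_deny = []

    def check(self, src: str, dst: str, port: int) -> bool:
        permitted = (src, dst, port) in ALLOWED
        if not permitted:
            self.would_deny.append((src, dst, port))
            if self.mode == "enforce":
                return False      # deny for real
        return True               # monitor mode never blocks

policy = StagedPolicy()
assert policy.check("hr", "finance", 22) is True   # monitored, not blocked
policy.mode = "enforce"                            # after review, tighten
assert policy.check("hr", "finance", 22) is False  # now denied
```

Reviewing `would_deny` before flipping the mode is what prevents the 'deny everything' rule from disrupting legitimate daily operations.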

A successful zero trust architecture implementation should also allow the customer to map the capabilities they are gaining as security is tightened. For example, incident response time will have been drastically reduced and the time taken to investigate an attack is also significantly shortened. In addition, the customer can now start to undertake simulated attacks and see how the network responds. They can see how the process of micro-segmentation helps to reduce an attack’s blast radius. The customer can start to measure the resilience of the network rather than merely the number of attacks blocked by their firewall.

Why was it important for you to speak at the NATO Edge conference and what is your key message about zero trust architecture?

We work with a lot of defence, intelligence and critical infrastructure organizations but not NATO, so that is a big part of why we wanted to participate in NATO Edge. That said, our work with the Danish Ministry of Defence has brought us into contact with NATO and its Cooperative Cyber Defence Centre of Excellence (CCDCOE). Over time, we have come to realize that as a community we need to broaden the extent of our collaboration, not just within the defence sector but also as a bridge to and from it. There are a lot of organizations that can benefit from our expertise and experience and we are keen to share it. We would also like to participate in the CCDCOE’s Locked Shields exercise. We have put a team together to make that a possibility.

Logpoint specializes in cybersecurity and advanced threat detection, investigation and response (PHOTO: Logpoint)

Beware of the smartphone!

In today’s interconnected world, the smartphone has become an indispensable tool for personal and professional tasks. Smartphones allow us to stay connected, work remotely and manage our lives with just a few touches on a screen. While smartphones rely on standard encryption implemented by apps, they are not designed to protect classified or highly sensitive information, especially in environments vulnerable to adversarial state actors or highly sophisticated attackers. The risks aren’t always visible but remain constant: eavesdropping, tracking and data interception can occur even when the device isn’t actively in use.

This is where Sectra’s Tiger/S comes in, offering a secure communication solution specifically developed to protect against these threats. Unlike smartphones, which are built for general use, the Tiger/S is specifically developed for those who cannot afford any compromise in information security, such as government officials and NATO personnel. Sectra’s Dennis Buchinhoren delves into why smartphones fall short in security and explains how the Tiger/S offers the robust protection needed for high-stakes communication.

The hidden threats of smartphone use

While smartphones are powerful tools, they also have some serious vulnerabilities. From the moment you turn one on, it becomes a potential target for various attacks — and you don’t even need to be actively using it for risks to emerge. Let’s explore some of the hidden dangers:

Data vulnerabilities: Smartphones are constantly connected devices, whether through Wi-Fi, Bluetooth or cellular networks. This makes them vulnerable to hacking, malware, phishing attacks and unauthorized data access. Whether through malicious apps or unsecured connections, attackers can easily get unauthorized access to personal data, location information and communication logs. For NATO end-users, this could mean that crucial data could be intercepted or exploited by adversaries.

Lack of control: One of the biggest risks with smartphones is the difficulty in controlling data flow. With hundreds of apps requesting access to microphones, cameras and location services, it becomes almost impossible to monitor what’s being shared and with whom. Additionally, third-party services or even device manufacturers often store data in ways users are unaware of, creating further risks for unauthorized access.

Eavesdropping and tracking:

Even when a smartphone isn’t in use, it can still act as a listening device. Attackers can remotely activate the microphone or camera, turning your smartphone into a surveillance tool. This is particularly concerning for anyone handling classified information, as conversations or activities in supposedly secure environments can be monitored without your knowledge. Additionally, malware can silently extract vast amounts of information stored on the device, all without the user ever knowing.

With these threats in mind, it’s clear that smartphones present too many risks for users involved in sensitive operations and are not secure enough for environments requiring the utmost discretion and protection.

Tiger/S vs. smartphones: the key differences

To meet the demands of secure communication in a world full of digital threats, Sectra developed the Tiger/S: a purpose-built device with a unique, secure-by-design architecture, developed from the ground up to meet the stringent requirements of those who cannot afford to take risks. Unlike a smartphone, which is designed for multiple functions, the Tiger/S is narrowly focused on one thing: securing classified communication through speech, messaging and data transfer.

By focusing only on these functions, the Tiger/S eliminates many potential attack vectors that exist in multifunctional smartphones, such as apps, internet browsing or social media. The absence of these kinds of features reduces exposure to malware, phishing attacks and unauthorized data access. For example, Bluetooth is a convenient feature in modern smartphones, but it also presents a serious security risk. This is why we have avoided features such as Bluetooth in the Tiger/S, eliminating one of the most common entry points for hackers.

Dennis Buchinhoren, Sectra Account Director for NATO

Visit us at NATO Edge, 3-5 December, stand number S13

This December at NATO Edge in Tampa, Florida, we are excited to showcase how the Sectra Tiger/S addresses the unique security needs of NATO personnel. From enabling closed user-group calls to facilitating the exchange and centralized storage of classified information up to the classification level NATO SECRET, our solutions are designed to protect what matters most.

We’re also thrilled to showcase one of our latest innovations: the Sectra Secure Mission Pack, designed specifically for mission-based personnel and battlegroups. This solution ensures seamless, secure communication between field units, headquarters and operational teams, even in the most challenging environments.

Stop by for a hands-on demonstration and connect with Dennis and the team, who will be available throughout the event to discuss how our solutions can support your security needs. Want to guarantee dedicated time with our experts? Simply scan the QR code to schedule a one-on-one meeting. We look forward to seeing you at NATO Edge.

One absolutely key difference with the Tiger/S is the emphasis on a controlled supply chain. Sectra ensures that every component in this highly secure communication system is sourced from trusted vendors, following a rigorous evaluation process, both internally and through each Nation’s security authorities. This minimizes the risks of supply-chain attacks, which have become increasingly sophisticated over the past few years. The Tiger/S also has tamper-proof technology, meaning that any attempt to alter or compromise the device is immediately detectable, adding another layer of protection against sophisticated state actors.

Sectra’s Tiger/S is equipped with advanced end-to-end encryption that protects every communication. Whether you’re making a voice call, sending a message or transferring data, all content is encrypted from start to finish. The solution is quantum-resilient, meaning that it is designed to resist even the most advanced attacks in the future. When using the Tiger/S, you can also be confident that the user on the other end is who they claim to be. Strong multi-factor user authentication ensures that only authorized individuals can communicate with one another. This type of certainty is critical in NATO operations, where failing to verify the caller’s identity can jeopardize the mission.

In conclusion

Smartphones are incredibly useful but when it comes to handling classified information, they fall short. The everyday use of smartphones brings vulnerabilities that make them unsuitable for secure communication.

Sectra’s Tiger/S is specifically designed to counter these growing threats. With its advanced encryption, tamper protection and fully controlled supply chain, the Tiger/S creates a secure environment that goes far beyond what commercial smartphones can offer. For NATO personnel and anyone handling classified information, the Tiger/S isn’t just an alternative — it’s an essential tool for ensuring that communication remains secure.

communications.sectra.com

POST-QUANTUM SECURITY

TOWARDS QUANTUM-CLASSICAL SECURE INFORMATION SYSTEMS

Professor Kwang-Cheng Chen from the University of South Florida’s Department of Electrical Engineering offers a compelling insight into how quantum computing will have a massive impact on the ancient art of codes and code breaking

Advances in quantum computing are changing the frontiers of cybersecurity in two major ways. First, by the threat they pose to classical cryptography, and second, as a consequence of the opportunities they will offer to innovate new cryptographic systems. Although it will take some time for quantum information sciences to be commercially mature, it is worth exploring these two scenarios further from a precise quantum-information engineering point of view, as well as suggesting possible holistic facilitation of future secure information systems, alongside the status of international standards required for interoperability.

POST-QUANTUM CRYPTOGRAPHY

The seminal milestone of Shor’s algorithm for finding the prime factors of an integer (whole number), which has been brilliantly enhanced by a recent manuscript by Oded Regev, suggests the possibility of breaking prime factorization and the subsequent threats to public key cryptography that this would introduce. To respond to such a threat, the United States’ National Institute of Standards and Technology (NIST) established a mechanism to facilitate the development of so-called post-quantum cryptography (PQC), with an initial selection of four PQC algorithms in 2022 and the first finalized standards published in 2024.
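The number-theoretic core that Shor's algorithm exploits can be sketched classically. The quantum speed-up lies entirely in the order-finding step; the brute-force `order` function below stands in for it, so this toy only works for tiny moduli:

```python
# Classical post-processing of Shor's algorithm: given the order r of
# a mod n, derive non-trivial factors of n from gcd(a^(r/2) +/- 1, n).

from math import gcd

def order(a, n):
    # exponential-time order finding; this is the step a quantum
    # computer performs in polynomial time
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    g = gcd(a, n)
    if g != 1:
        return g, n // g       # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None            # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None            # trivial square root: retry with another a
    p = gcd(y - 1, n)
    return p, n // p

assert shor_classical(15, 7) == (3, 5)   # order of 7 mod 15 is 4
assert shor_classical(21, 2) == (7, 3)   # order of 2 mod 21 is 6
```

Breaking RSA-sized moduli this way is infeasible classically precisely because `order` takes exponential time; replacing it with quantum period finding is what makes the attack practical on a large fault-tolerant machine.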

Multiple PQC algorithms, based on different mathematical foundations, provide the capability to deal with attacks from future quantum algorithms and quantum computers. Current PQC families are built on two classes of technology. The lattice-based Cryptographic Suite for Algebraic Lattices (CRYSTALS) series implements ML-KEM (Kyber) and ML-DSA (Dilithium), the two lattice-based PQC standards recommended by NIST; this class provides complete PQC key-exchange and identity-authentication IP. The other, the Pyramid class of hash-based signatures, implements LMS, XMSS, SPHINCS+ and hash functions, focusing on post-quantum, extremely secure digital signature and authentication requirements.

Due to the complexity of PQC algorithms, traditional software implementation on a general-purpose processor may struggle with computational efficiency, because of much larger keys and much more complicated algorithms. This suggests the need for dedicated PQC integrated circuits (ICs), or PQC silicon IP integrated into a host IC. To illustrate such a hardware/IC approach to implementing PQC, several realizations have been developed or are in progress, verified on Field-Programmable Gate Arrays (FPGAs) or as sample ICs. These typically include Keygen/Encap/Decap for Kyber and Keygen/Sign/Verify for Dilithium, and are generally required to pass the Known Answer Tests (KATs). Such sample ICs do not require advanced semiconductor fabrication: Taiwan Semiconductor Manufacturing Company (TSMC) 40nm technology is sufficient, with decent power consumption and a satisfactory clock rate. Effective architecture and resilience to side-channel attacks will likely be considered in relation to the PQC hardware solution.
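The Keygen/Encap/Decap interface mentioned above is common to all key-encapsulation mechanisms, lattice-based or not. As a sketch of the contract these ICs implement, here is a toy Diffie-Hellman-based KEM in pure Python. It is deliberately not Kyber (and not post-quantum, nor secure at this parameter size); it only illustrates the three-operation shape, where encapsulation and decapsulation must yield the same shared secret.

```python
# Toy KEM illustrating the Keygen/Encap/Decap contract.
# Built on classical Diffie-Hellman for readability; ML-KEM (Kyber)
# exposes the same three-operation interface.

import hashlib
import secrets

P = 2**127 - 1   # small Mersenne prime, demo only -- far too small for real use
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1        # secret exponent
    pk = pow(G, sk, P)                       # public key
    return pk, sk

def encap(pk):
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)                        # ciphertext: ephemeral share
    ss = hashlib.sha256(str(pow(pk, r, P)).encode()).digest()
    return ct, ss                            # shared secret stays local

def decap(sk, ct):
    return hashlib.sha256(str(pow(ct, sk, P)).encode()).digest()

pk, sk = keygen()
ct, ss_sender = encap(pk)
assert decap(sk, ct) == ss_sender   # both sides derive the same secret
```

Known Answer Tests for a real implementation work the same way in spirit: fixed inputs must reproduce the exact keys, ciphertexts and shared secrets published in the standard's test vectors.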

QUANTUM KEY DISTRIBUTION

Quantum key distribution (QKD) was the first mechanism to demonstrate quantum computing, quantum communications and quantum cryptography. After extensive theoretical and experimental study, primarily by physicists and mathematicians, QKD has been undergoing standardization by the International Electrotechnical Commission/International Organization for Standardization (IEC/ISO) Joint Technical Committee 1 (JTC-1), the European Telecommunications Standards Institute (ETSI) and the International Telecommunication Union (ITU).

Figure 1: The OSI stack for classical networking and communication is shown in the blue box; PQC modules can be installed either in the physical layer or in applications, depending on the scenario. QKD can serve as an add-on module for key exchange or key distribution to achieve security in classical networking and communication. Although QKD leverages properties of quantum mechanics, its purpose is to facilitate classical secure communications and networking systems.

"In the noisy intermediate-scale quantum era, the interplay between quantum circuits and classical logic will be key to secure information processing"

Essentially, QKD leverages properties of quantum mechanics to exchange the key for a one-time pad, which is information-theoretically secure and thus considered the most secure classical communication method. However, the technique requires further consideration, especially from the engineering-systems perspective. In particular, there is a need to assess its robustness and its engineering implementation to examine the operation of QKD and its variants. If one considers using QKD to exchange the security keys for classical communication, and if the common assumption of authentic classical channels is made in QKD, then attacks on classical communication systems and channels should be taken into account.
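A one-time pad itself is a one-line operation once the key has been distributed (the hard part, which QKD addresses). A minimal sketch, with a locally generated random key standing in for QKD-derived bits:

```python
# One-time pad: XOR the message with a truly random key of equal length.
# Information-theoretically secure if the key is random, secret and
# never reused -- key distribution is the problem QKD solves.

import secrets

def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "pad must cover the whole message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # stand-in for a QKD-exchanged key
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message    # XOR is its own inverse
```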

Furthermore, imperfect quantum channels must also be taken into account in QKD operation, so performance degradation needs to be analyzed as part of any engineering-robust QKD proposal. As shown in Figure 1, an appropriate architecture, based on ISO’s Open Systems Interconnection (OSI) model, that integrates QKD and classical cryptography, including PQC, is under consideration for system architecture and subsequent interoperability in ongoing international standards.

QUANTUM FOURIER TRANSFORM

The engineering implementation of quantum computing will be another issue in future quantum-classical security mechanisms. The central functionality of Shor’s algorithm in attacking digital signatures is the Quantum Fourier Transform (QFT). Due to the nature of probabilistic measurement outcomes for any quantum system, realistic QFT should be considered to gauge the real cost and performance of quantum computing.
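The QFT on n qubits is the unitary matrix F with entries F[j][k] = ω^(jk)/√N, where N = 2^n and ω = e^(2πi/N). A pure-Python sketch that builds the matrix and checks it is unitary (so it preserves measurement probabilities):

```python
# Build the N x N Quantum Fourier Transform matrix and verify unitarity.

import cmath

def qft_matrix(n_qubits: int):
    # entries omega^(j*k) / sqrt(N), with omega = exp(2*pi*i / N)
    n = 2 ** n_qubits
    omega = cmath.exp(2j * cmath.pi / n)
    scale = 1 / n ** 0.5
    return [[scale * omega ** (j * k) for k in range(n)] for j in range(n)]

def max_unitarity_error(f):
    # largest deviation of F * F^dagger from the identity matrix
    n = len(f)
    err = 0.0
    for j in range(n):
        for k in range(n):
            s = sum(f[j][m] * f[k][m].conjugate() for m in range(n))
            err = max(err, abs(s - (1.0 if j == k else 0.0)))
    return err

F = qft_matrix(3)                    # 8 x 8 QFT for a 3-qubit register
assert max_unitarity_error(F) < 1e-9
```

On real hardware, each matrix entry is realized through a circuit of Hadamard and controlled-phase gates; gate imperfections perturb these entries, which is exactly the degradation a Q-EDA tool must model.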

I would suggest a quantum accelerator for QFT, the QFT accelerator, with regard to potential quantum-classical hybrid computing. A quantum electronic design automation (Q-EDA) tool has been developed that takes imperfect quantum gate implementation into account. With this, it is possible to demonstrate that an appropriate design of a QFT accelerator, using quantum photonics for example, can be quite different from the traditionally accepted implementation according to theory. Furthermore, my students and I have noted that properly blending quantum circuits with classical logic-control circuits may point to effective realizations on noisy intermediate-scale quantum (NISQ) computing devices, which are expected to be useful for the next two decades or longer and which might utilize a quantum accelerator in quantum-classical computing.

Quantum-secure information systems will not only consider PQC and QKD; many other innovative mechanisms provide potential breakthroughs beyond the scope of classical computing and communications, such as quantum secret sharing and so on. There is no doubt then that, in the NISQ era, striking the correct balance between quantum computing circuits and classical processing circuits is critical for achieving secure and interoperable quantum-classical information systems to compute, process and transmit classical information.

INDUSTRY PERSPECTIVE

Fusing geospatial intelligence to create actionable intelligence

Chief Executive Officer, EMDYN

While much discussion exists about geospatial intelligence, what exactly is it?

Geospatial intelligence (GEOINT) is the art of locating and making sense of features on the earth’s surface and associated activity using information from maps and data gathered by satellites, aircraft and unmanned aerial vehicles (UAVs), as well as ground-based sensors. It is mostly undertaken by analysts who often operate in siloed cells without ready access to other types of intelligence that would help to add valuable context and validation to the data they are examining.

There are other silos of information being gathered, stored and analysed by different intelligence personnel: signals intelligence (SIGINT), open-source intelligence (OSINT), human intelligence (HUMINT) as well as measurement and signals intelligence (MASINT). While all these separate silos may individually deliver invaluable information, much of the added value and the contextual relevance is never realised because of blind spots between them.

How does the ‘Fusion’ part of your geospatial intelligence offering work?

EMDYN’s philosophy lies in the knowledge that there are more efficient ways of extracting real value from these intelligence feeds. This is where ‘Fusion’ comes in. Fusion brings these other intelligence silos onto a map, enabling the data to tell a much richer story. In other words, EMDYN Platform pulls together on a single pane of glass multiple location data sources and other complementary data feeds that may not have a location aspect to them but are still highly relevant to the subject at hand. By bringing these together into one central application, we enable analysts to process and cross-validate vast swathes of relevant data extremely quickly without having to work across multiple software applications or request help from other IT specialists.
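At its simplest, fusing feeds in this way amounts to correlating records from separate sources by location and time. A deliberately simplified sketch follows; the feed structure, thresholds and `fuse` function are illustrative assumptions, not EMDYN's implementation.

```python
# Toy intelligence fusion: match records from two feeds when they fall
# within a distance and time window of each other.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance between two points on Earth, in km
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def fuse(feed_a, feed_b, max_km=5.0, max_hours=1.0):
    # pair up records that are close in both space and time
    matches = []
    for a in feed_a:
        for b in feed_b:
            near = haversine_km(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_km
            recent = abs(a["hour"] - b["hour"]) <= max_hours
            if near and recent:
                matches.append((a["id"], b["id"]))
    return matches

geoint = [{"id": "sat-1", "lat": 55.68, "lon": 12.57, "hour": 10.0}]
sigint = [{"id": "rf-7", "lat": 55.70, "lon": 12.55, "hour": 10.5},
          {"id": "rf-9", "lat": 48.85, "lon": 2.35, "hour": 10.2}]
assert fuse(geoint, sigint) == [("sat-1", "rf-7")]
```

A production system adds spatial indexing, many more feed types and analyst-defined workflows, but the cross-validation idea is the same: an event reported by two independent sensors at the same place and time is far more credible than either report alone.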

As EMDYN Platform is a government solution, it is often installed locally on site. Although technically we could run in the cloud, we don’t have a cloud offering. Instead, we allow people to fuse the data that we can commercially provision alongside their own internal data, without them having to transmit it to us.

The beauty of the system is its simplicity. It is a point-and-click solution designed so that analysts can deliver actionable intelligence in a flash. In short, we have purposefully developed EMDYN Platform to complement an organization’s digital transformation strategy. It is a disruptive capability that transforms intelligence-led workflows.

Why is geospatial intelligence important for national and regional security?

Decision-makers and intelligence analysts are looking for very specific answers to core questions, such as who, what, where, when, why and how. Although each of these questions is critical, it is the ‘where and when’ that are absolutely essential, as without knowing the location of an activity or incident and when it happened, you cannot truly make intelligence actionable. That is why GEOINT is one of the most valuable fields of intelligence there is.

Can you describe EMDYN Platform and offer a typical use case?

EMDYN Platform is not designed for a typical use case as such. Rather, it is a flexible solution that leverages multiple intelligence sources that can be overlaid onto a range of workflow options related to applications such as military intelligence, maritime domain awareness, counter-terrorism, disaster relief, border security and so forth. For example, a workflow could be based on a specific geographical location where it is known that something ‘interesting’ is taking place or has just happened, such as the sudden absence or appearance of a ship at a specific location. Or, it could be a more general workflow where there is a certain level of anxiety about the activities occurring across a much larger area: the Red Sea, perhaps, or the English Channel.

As such, EMDYN Platform is a powerful yet intuitive intelligence tool suited to determining not just what may be happening at a precise location but why. This is valuable not only for determining how to respond to that situation as it plays out but also for enabling an organization to make advance preparations for a likely situation that is yet to occur.

Who is the end user and who is the beneficiary of this technology?

EMDYN Platform was built specifically for analysts. This is a very important distinction. The system was not developed as a tool for software developers or database engineers. In developing our Platform, we were always adamant that an end user should not need a technical IT background to operate the application. It is a simple point-and-click operation developed to enable analysts to make insightful, verified reports that combine a multitude of data sources to create actionable intelligence.

These reports are destined for decision-makers higher up the command chain, perhaps even at the very top of the chain, be they political, military, commercial or operating in the public services. The end user is the analyst but the key beneficiary is the senior decision-maker.

How far down the command chain can the information or insights be accessed via EMDYN Platform?

EMDYN Platform is not a tactical system to be operated in the field; it is a desktop solution most likely to be located at a command centre or HQ. That said, actionable intelligence is clearly destined for the field. The intelligence that you can glean from our Platform could be used to send a drone on a surveillance or strike mission. Equally, it could be used to direct troops into an area of operation or naval assets to a specific area in the maritime domain.

www.emdyn.com/what-we-do/emdyn-platform

SHARING AI DATA

TRAINING AI MODELS FOR DEFENCE AND INTELLIGENCE: A LEGAL AND PROCUREMENT PERSPECTIVE

A NATO Edge speaker describes some of the legal and procurement challenges linked to AI for defence, based on the experience of Safran.AI (formerly Preligens) in developing and supplying AI models for leading Allied ISR organizations

Artificial intelligence (AI) in defence is distinct from its use in other industries. While AI training requires vast amounts of labelled data, this often contrasts with the practical availability of military datasets, which must be carefully controlled and protected. Defence data is unique due to its frequent classification and its links to national security and interests. Thus, training and procuring AI systems for defence raise unique challenges, particularly in the legal and procurement realms. So, let’s examine some of these challenges and provide perspective on reconciling AI operational effectiveness with defence-specific restrictions around data and the training environment.

TRAINING AI SYSTEMS

AI performance is tightly linked to the availability of quality labelled data. For defence use cases, this data is often out of industry’s reach and cannot be procured from commercial vendors. The classified nature of many defence datasets brings challenges in obtaining the data needed for AI training. However, training AI models on limited or synthetic datasets can result in suboptimal performance, limiting the technology’s full potential. The scarcity of high-quality, labelled training data poses one of the largest barriers to industry developing effective AI systems. Unlike most civilian use cases, where data is easier to collect, defence organizations face stringent regulations on data use, sharing and storage, which they pass on to their AI contractors.

Consequently, a key challenge in the development of defence AI is determining how industry can access sovereign and often classified data. Private companies require access to operational datasets to create effective solutions because training on such data is paramount for developing relevant models for operations that go beyond ‘AI on paper’ and ultimately provide gamechanging value to the end-user.

Nevertheless, granting such access at the scale that AI requires for training raises new challenges specific to AI development. Legal and policy frameworks governing how, when and to whom data is made available will need to be updated and refined to meet the requirements of AI. Private companies and public legal advisors must work together to define how to balance data security with the need for innovation. A critical issue is determining who has the authority to grant access to data and to whom it belongs — whether that be the military or a procurement agency.

Overcoming this challenge requires finding novel technical and legal arrangements compared to previous, less data-intensive technologies. Developing robust mechanisms between defence agencies and private contractors is essential, as is promoting data discoverability and exchange readiness within and between allied defence organizations. In this regard, NATO’s Alliance Data Sharing Ecosystem (ADSE) initiative, aimed at fostering secure data sharing at speed and scale, to further enhance situational awareness and data-driven decision-making, is highly promising.

PROCURING AI SYSTEMS

Procuring AI systems for defence raises intricate legal and technical challenges. Some of these arise from AI’s unique reliance on training data. Traditional procurement methods, designed for hardware or less complex software, may fall short of covering some specificities of AI capability development, especially when industry access to operational data is necessary. We often hear in conferences, working groups and the like about the importance of the feedback loops, and that ideally, we should speed them up as much as possible; but, let’s not forget that this also means really trusting each other, regardless of any considerations relating to security clearance aspects.

IMPLICATIONS OF DATA PROVISION IN AI CAPABILITY DEVELOPMENT

Legal frameworks governing data ownership and access must be carefully considered when contracting for the development of AI. A central issue is determining who owns and controls the data used to train the models. The effective custodian of the data is rarely the purchasing or procurement agency.

New data provision agreements must be developed to address the unique aspects of AI. These frameworks should unambiguously outline who has the authority to provide data, how it should be shared and the responsibilities of both the defence agency and industry stakeholders in protecting sensitive information.

Defence contractors must be prepared to handle classified datasets in compliance with strict legally binding rules. Their subsequent deletion, upon achieving model training, is not straightforward, as it may hinder future improvements and diminish the long-term value of those AI systems. Similarly, private companies also need to develop strategies for managing sovereign datasets. Current machine-learning operations (ML Ops) are often ill-suited to the technical constraints imposed by the defence sector, such as air-gapped classified networks. Engineers and legal teams will be expected to work more closely than ever to ensure compliance with data-handling protocols.

Data labelling is a challenge of its own since it may carry classification or intellectual property (IP) implications. Ensuring that labelling is performed near the data scientists working on the project is paramount, as they understand the technical nuances of the datasets and the specific requirements for effective AI training, which ultimately has a big impact on the performance of the end product.

CONSIDERATIONS AROUND DATA DERIVATIVES

It’s crucial to carefully define derivatives when data from national technical means is made available to industry for model training, in order to establish clear rules associated with them. Derivatives may consist of anything from the labelled data and the ‘labelling mask’ alone, through the training and testing databases, to the algorithms themselves. A proper distinction must be made as to what exactly falls within each category when ensuring compliance with classification, addressing IP concerns or managing export controls.

• Intellectual property (IP): Procurement agencies must carefully weigh the necessity of claiming unlimited rights over AI developments on the back of the prominent value of the data provided, particularly when claiming the core IP of the algorithms they would like to see developed. Doing so could stifle innovation, as private companies — especially start-ups — may be reluctant to invest in defence AI development if they ultimately risk losing control over their IP. Striking a balance between granting armed forces the necessary rights and claims on the AI systems they furnish training data for, while allowing private companies to retain ownership of key IP elements, is crucial.

• Classification: Training an AI model on classified data does not necessarily imply the classification of its derivatives. That being said, ‘contamination’ of an algorithm can occur in cases where information on the training dataset can be retrieved from any output generated by the AI solution or by reverse-engineering such a solution. This must be thoroughly assessed to define the adequate level of classification, if any. This is a technical challenge before being a legal challenge, which we can summarize as follows: would making algorithms available to third parties risk releasing classified information contained in the training data?

• Export control: Trained AI models, and their underlying code, must ultimately comply with export control requirements, acting as a final barrier to prevent sensitive AI technologies from falling into the wrong hands. In this respect, it may be questionable whether AI solutions running in central secured infrastructures should be regulated in the same way as those embedded in systems or platforms at the tactical edge, on the battlefield, and therefore as close to the adversary as possible.

Progressing data science literacy and awareness around some of these challenges will contribute to Allied defence organizations’ ability to take full advantage of AI’s potential while ensuring compliance with legal, ethical and security requirements.

While standards are yet to be defined at many levels, procurement agencies should embrace novel contracting mechanisms that allow for the rapid iteration and adoption of AI systems while accounting for their peculiarities.

Restrictions, particularly around data and associated derivatives, should systematically be balanced against the need for agility in getting the actual end-users — the warfighters — timely access to capabilities leveraging emerging and disruptive technologies.

AUTONOMY AND DRONES: PRINCIPLES OF RESPONSIBLE USE

NCIA’s Talia Goode asks Dr Claudio Palestini from NATO HQ and Rene Thaens, NCIA’s Head of the Electronic Warfare and Surveillance Branch, how NATO’s Autonomy Implementation Plan is helping to ensure that the Alliance is able to use autonomous systems effectively while adhering to the Principles of Responsible Use (PRU) guidelines

What does the NATO Autonomy Implementation Plan cover and why is it important for the Alliance?

Claudio: When we talk about autonomy, we are usually referring to drones, but it also covers decision-making and the extent to which we want humans to oversee machine decision-making.

Therefore, NATO developed the NATO Autonomy Implementation Plan in 2022 with the goal of creating a common understanding and a common strategy on dealing with autonomy. While autonomy can bring great benefits in supporting operations and missions, we also have to consider how to counter autonomous systems from potential adversaries. This is why it is crucial that we have a common operational concept so that all Allies act harmoniously. As a result of the Autonomy Implementation Plan, actions have been delegated to all stakeholders to ensure that all areas progress and develop.

When will the Autonomy Implementation Plan start having a governance impact on the Alliance?

Claudio: The NATO Autonomy Implementation Plan has already been adopted and we are now regularly reporting our progress and developments directly to Allies so that they can discuss further in committees. The plan includes a lot of delegated actions and tasks, so it is important to come together regularly as a community to assess the status of our actions and identify potential gaps. The community consists of all fields working on autonomy, aiming to develop a common understanding and standard practices across all domains from the operational and counter perspective. This makes us stronger as a network and strengthens NATO’s Autonomy Implementation Plan as well as each individual strand of the community.

How important is it that NATO employs autonomous systems in accordance with the Principles of Responsible Use (PRUs)?

Rene: Employing autonomous systems means trusting machines to take decisions without human approval and oversight, which is why governance is required throughout the conception, design, testing, implementation and operation of such capabilities.

PRUs provide a common baseline to assure trustworthiness and interoperability. However, we must accept that our potential adversaries are not necessarily constrained by the same ethical and legal frameworks. While NATO and NCIA are carefully manoeuvring within these guidelines, opponents can create and implement autonomous systems with malicious intent, free from the same ethical constraints. We, therefore, feel a sense of urgency to develop the technology as fast as possible while adhering to the guidelines.

What PRUs are most relevant to the use of autonomous systems?

Claudio: NATO’s six PRUs for AI in defence are: Lawfulness, Responsibility and Accountability, Explainability and Traceability, Reliability, Governability and Bias Mitigation. It is impossible to say that one principle is more important than another, as they are all relevant for autonomy. I would like to emphasize the idea that the PRUs enable multinational cooperation and interoperability. When we use systems in an operational context and several Nations are involved, we need to use the PRUs so that no Allies will be inadvertently harmed when operations are conducted.

NATO must adhere to ethical and legal frameworks as it develops and deploys autonomous systems (PHOTO: NCIA)

Within the context of counter-unmanned aircraft systems (C-UAS), we are facing new challenges of using autonomous systems to defend against autonomy being used in a threatening way, but we must remember to follow the PRUs outlined, even when using autonomy in a defensive capacity.

To what extent are NCIA’s development, training and exercise activities with autonomous systems impacted by the Autonomy Implementation Plan?

Rene: In NCIA’s role as communication and information systems provider, we deal with large volumes of data with the aim of transforming this into awareness and actionable intelligence. With the aid of machine learning, we can automate the processing of data. It is nearly impossible for humans to perform the same level of data analysis in the same timeframe as machine applications, so we must leverage this technology to accelerate decision-making. Adhering to the Autonomy Implementation Plan is an essential guideline in these first steps towards building and implementing autonomous systems, so that operators have full confidence that the systems provide trustworthy output.

There is still a need to translate the plan to the working level within NCIA. NCIA’s Chief Technology Office serves as the bridge between innovating with new technology and adhering to the PRU guidelines provided by the Alliance. NCIA works seamlessly and in close collaboration with NATO, which further ensures the responsible and accountable use of autonomy.

To be ready to use autonomy for defence, as well as in an operational way, we have to remain one step ahead of the game. Even though we are always using the PRUs as our framework, we have to consider how autonomy could be used maliciously for those who do not adhere to such standards and practices.

What role does industry play in helping NCIA and NATO to develop autonomous systems?

Rene: Autonomous systems have two crucial elements: perception and action. The machine needs situational awareness and then to be able to take action. We are still developing the perception stage, which means we are working on machine learning for digesting large volumes of data quickly, consistently and accurately. Simultaneously, we are always looking ahead and keeping an eye on what technologies are being developed as this can influence our own design and creation.

What we need at the implementation level is a roadmap outlining the steps we must take to ensure that, by the time we are ready to implement our own autonomous systems, they adhere to the plan. There is undoubtedly time pressure due to the threats that adversaries pose with their own autonomy technology, which is being developed in the dark, so to speak, so we must prepare as best we can. For this reason, it is important to understand and stay up to date with the science and technology world, so we are aware of new technologies as they are being developed. The defence sector is no longer the frontrunner in developing the latest technology, so we have a lot to learn from and gain by working with industry. Autonomous systems are already being used in many other sectors, and many governments have therefore already implemented legal frameworks and governance in the civil world.

In this sense, working with industry is valuable as they are constantly developing new equipment, capabilities and solutions. At NCIA, we engage regularly with industry to work together to develop the technology NATO requires. Engaging with experts in the field means we can build systems that are more resilient and capable than current generations. This puts great importance on our relationship with industry.

The world has witnessed how Ukraine is using autonomy to counter airborne threats on a daily basis (PHOTO: NCIA)

Sovereign and isolated cloud solutions

How long has Oracle been active in the defence and intelligence sector?

Oracle has been active in the defence and security sector since the company’s inception. In fact, our first customer was the United States Central Intelligence Agency (CIA). So, we have been working with the U.S. Department of Defense and other government and defence organizations around the world for over 40 years. Defence and security have always been top priorities for Oracle.

Can you describe the extent of Oracle’s work in the defence industry, both with governments and organizations such as NATO?

Governments and organizations like NATO are grappling with a need to access, analyse and act on intelligence faster than ever as warfare is increasingly fought on a digital battlefield. Yet, they are hamstrung with ageing technology, complex bureaucratic processes and rigid procurement regulations. This is where advanced cloud and artificial intelligence (AI) come in.

Fast migration to the cloud, with the choice of sovereign and multicloud environments, is critical. And it needs built-in AI and automated security. Oracle offers the world’s only next-generation cloud infrastructure that delivers the built-in security, interoperability and reliability the defence and intelligence community needs.

We enable customers to access and deploy cloud capabilities wherever they need them – within specific countries and data centres, in a dedicated or sovereign model, and in multicloud environments. This allows customers to choose the infrastructure that works best for them knowing that each option is secure by default.

In addition to public cloud, some of the distributed cloud solutions that we offer to address sovereignty and security include:

• Oracle EU Sovereign Cloud: Designed to help customers address European data privacy and sovereignty requirements.

• Oracle Alloy: A complete cloud infrastructure platform that enables partners to become cloud providers and offer a full range of cloud services to expand their businesses. For example, Nomura Research Institute (NRI) and Fujitsu use Oracle Alloy to help address digital sovereignty and regulatory compliance requirements.

• OCI Dedicated Region: A complete Oracle Cloud Infrastructure (OCI) cloud region in a customer’s data centre, with both data and control planes on-premises to help meet data residency and low-latency requirements.

• Oracle Government Cloud: Providing governments with a range of deployment models to address local data residency, classification, operational and security requirements.

• Oracle Cloud Isolated Region: Secure, air-gapped OCI solutions designed to meet the higher demands of global customers’ mission-critical classified workloads.

Oracle’s distributed cloud strategy provides customers with a flexible model that gives them access to the cloud and advanced AI capabilities while maintaining the highest levels of security and sovereignty. No other cloud provider offers this level of choice.

Why are defence organizations turning to sovereign and isolated cloud solutions to ensure some of the world’s most important data is safe and secure?

As the world becomes more digitized, the digital and the physical are starting to merge into a single domain. This is hugely important because the data within that emergent domain has become strategically important to all the organizations and entities that depend on it, including the military. Metaphorically speaking, data has become the fuel that powers the digital engine that drives operational effectiveness, productivity and efficiency. Therefore, it is imperative that data is kept secure so that its quality and accuracy can be trusted, and the insights it delivers can be utilized to their fullest extent.

Bram Couwberghs, Vice President of Defence for EMEA at Oracle

Delivering this level of security for organizations requires more than just a data layer; it demands a robust data platform built on secure, scalable infrastructure. The cloud provides these capabilities, especially for security, scalability, innovation and speed. However, organizations and workloads may have varying data assurance needs, so you also need choice and flexibility. In addition, organizations need help addressing regulatory compliance related to data security and regional sovereignty requirements so they can stay on top of security threats and concerns, and mitigate security-related outages. For instance, the military must prevent unauthorized third-party access, as tampering, copying or removal of data could lead to catastrophic consequences.

Our experience working with the defence community has led us to go in a very different direction compared to the other hyperscale cloud providers. To address the defence sector’s security and sovereignty concerns directly, we have developed isolated, air-gapped cloud infrastructures to deliver the high levels of security the military demand. I am not only referring to cybersecurity; we also have the highest levels of physical security. This and the security of the data are both controlled by the customer and not outsourced to regions of the world where these assurances cannot be offered.

Why is Oracle doubling down as the only cloud provider to offer isolated, compliant and secure cloud solutions?

When Oracle began developing its cloud, we listened carefully to our customers’ requirements, and we designed a solution that incorporated crucial aspects that military customers needed.

We identified five key essential requirements:

• Security: Oracle’s solutions provide cybersecurity, software-defined security and physical security in dedicated data centres, often at the customers’ own facilities.

• Computational power: Military organizations require enormous amounts of computational power especially when using AI.

• Connectivity and interoperability: Operations require reliable connectivity to enable warfighters to run successful missions while also interoperating with sister services, partners and allies. Oracle’s Isolated Cloud operates independently without being connected to large hyperscale infrastructure. Connectivity is assured even when subsea cables or satellites have been disrupted.

• Innovation and speed: As the decision cycle continues to shorten, operators must be able to react at speed. Oracle’s new applications and solutions enable innovation to take place at speeds hitherto unimaginable, with customers receiving regular updates and new features to enhance functionality.

• Reliability and resilience: Considered by Oracle to be akin to a weapons platform, Oracle’s cloud is reliable and resilient even under the most extreme conditions.

Oracle’s Isolated Cloud is the only cloud that can provide all of these services, solutions and advantages in an environment that prioritizes sovereignty, isolation and extra layers of physical defence. It can also be deployed in a small footprint — as small as 100m² — which means a minimal financial commitment from customers, making it manageable for any country’s budget, and the footprint can scale depending on requirements.

What are your thoughts on the importance of AI in the future of modern warfare and the opportunities it provides?

AI is already integrated into Oracle Cloud — from the infrastructure layer right up to the Enterprise Resource Planning, Human Capital Management, Supply Chain Management and Customer Experience applications. As a result, productivity, cost savings and innovation are being massively enhanced. And looking into the future, we expect AI will revolutionize three key elements: autonomous systems, decision-making and intelligence gathering.

As a former officer, I served in several war zones and remember how decisions that now take minutes to make often took as long as a day, sometimes longer. This decision-making speed, of course, relates to intelligence gathering and analysis. New military platforms — aircraft, ships and submarines — are becoming data-generating machines, but we only have the capacity to analyse a relatively small percentage of all that precious data. We believe AI will make a step-change difference to our analysis capacity in the future.

By utilizing AI to collect and analyse vast swathes of data from a myriad of different sources, at scale and at speed, military leaders can have more of the information they need to make critical, life-saving decisions on the battlefield.

And lastly, autonomous systems rely on AI to operate at the edge of the battlefield. That trend should continue. Taking that a step further, we expect AI will also enable autonomous systems to operate in those environments where humans find it most challenging — outer space and the depths of the ocean. Furthermore, when you combine those capabilities with the enterprise side of a military organization, you will have a system that allows data to breach the silos it was formerly constrained within, and, in doing so, unleash the full power of multi-domain operations.

www.oracle.com

AI MASTERCLASS

Mariana Antunes asks NCIA Principal Data Scientist and Head of Data Science and AI Engineering Profession, Ivana Ilic Mestric, to define NATO’s AI Masterclass and how it helps NATO senior personnel to understand the power of AI

Artificial Intelligence (AI) has rapidly emerged as one of the most transformative technologies of our time. As it gained momentum and began to be integrated into daily operations, processes and practices, it became apparent that, while everyone wanted to be a part of this technological revolution, few understood how to harness its full potential. In NATO’s case, AI fundamentally changed what the Alliance could achieve by creating new opportunities, but it also introduced new threats, risks and challenges.

In today’s fast-paced environment, adopting AI is essential to stay relevant and avoid falling behind technologically. However, while our decision-makers recognize the growing importance of AI, they sometimes lack the deep expertise to make informed decisions about how best to use it. AI may seem like a great solution for any given innovative challenge, but it isn’t necessarily always the right one. Therefore, it was important to bridge this knowledge gap to ensure a balanced and informed approach to the use of AI. This is why Ivana Ilic Mestric and her colleagues realized that it was vitally important to create an AI Masterclass for NATO decision-makers.

AI MASTERCLASS

“This tailored training provides key leaders with the confidence and knowledge to make informed decisions related to AI,” says Ilic Mestric, who played a key role in developing the AI Masterclass. “It also empowers and inspires them to understand the potential, as well as the limitations, of AI technologies.”

With NATO’s AI Masterclass, decision-makers can learn to recognize where they can efficiently benefit from AI and initiate actions that enable its faster adoption within their organizations.

“NATO’s AI Masterclass helps to prepare senior leaders to harness the power of AI for the benefit of the Alliance by developing a shared language around the basic concepts of AI and machine learning”

NATO’s AI Masterclass helps to prepare senior leaders to harness the power of AI for the benefit of the Alliance by developing a shared language around the basic concepts of AI and machine learning. It also introduces practical and comprehensible tools and guidelines for the responsible use of AI, and, by bringing together NATO’s strategy and current capabilities, it helps to identify the steps needed for a stronger future competence in utilizing AI.

The Masterclass — which is delivered by NCIA, industry experts and academics — combines theory and valuable use-case examples from within NATO and the Member Countries. The syllabus covers topics such as the fundamental principles, risks and security threats of AI and machine learning and how to manage and mitigate them. “The goal is to simplify technical processes for people who are not knowledgeable about the ins and outs of AI, in order for them to be able to successfully adopt and implement AI in defence environments where cybersecurity and protection against new threats is essential,” explains Ilic Mestric.

KNOWLEDGEABLE ECOSYSTEM

This initiative began in early 2023 with a small team, but it is now a well-recognized training opportunity available to the whole Alliance. Ilic Mestric wanted the training to be open Alliance-wide so that all decision-makers, particularly those driving capabilities, are well informed in this area.

“Since no decision in the Alliance exists in isolation, we need to ensure we create a knowledgeable ecosystem by educating everyone to facilitate future collaboration,” she says.

She continues: “As NATO’s tech and IT providers and enablers, any AI service and requirement comes to NCIA. The more knowledgeable our customers are, of both the technology and how to gain most value from it, the more it de-risks any AI project and increases the value that it provides to the users. Training NCIA decision-makers on every level ensures NATO can engage with industry faster and more effectively. It also makes us better prepared to adopt AI swiftly, effectively and responsibly.”

Such has been the overwhelming appreciation of the AI Masterclass that NCIA is now working closely with NATO Allied Command Operations (ACO) and the NCI Academy to introduce the AI Masterclass and other key AI coaching activities into NATO’s training catalogue.

THE FUTURE OF ARTIFICIAL INTELLIGENCE IN DEFENCE

Solita’s Head of Sustainable AI, Anna Metsäranta, explores key considerations on the role of artificial intelligence (AI) in future defence, bringing forth experiences from Solita’s work within the defence domain

PROMISE AND LIMITATIONS OF AI IN DEFENCE

AI technologies can enhance decision-making, improve situational understanding and increase operational efficiency across various military domains. Autonomous drones assist in reconnaissance, AI-driven predictive maintenance helps to anticipate equipment failures and advanced models help us to make sense of complex information environments. Beyond tactical advantages, AI enhances strategic decision-making through simulating conflict scenarios, optimizing resource allocation and analysing rival behaviour and geopolitical trends.

Future applications include advanced systems capable of real-time, multi-domain operations in dynamic situations. AI-enhanced command systems will reduce decision cycles by giving leaders faster access to critical data.

The possibilities must be balanced with an understanding of AI’s limitations and risks. Current AI systems struggle with changing environments, adversarial attacks and ethical dilemmas. Careless use could lead to unintended conflict escalation or ethical issues, including breaching international humanitarian law. AI deployment in defence requires a measured approach, ensuring responsible use and optimal human-machine collaboration.

BALANCING AI SUPPORT AND HUMAN AUTONOMY

One of the primary roles of AI in defence is to support human decision-making. While AI can process vast amounts of data and identify patterns that humans would miss, humans are superior at operating with sparse information in unforeseen circumstances. AI solutions should support human decision-making without introducing additional cognitive load.

A critical question arises: how much can we trust AI systems and rely upon them? We must carefully assess which military tasks can be safely automated, how the trustworthiness of AI outputs will be monitored and how appropriate human oversight will be secured. Balancing AI’s benefits with human judgment and autonomy is of paramount importance. Achieving this requires organizational capabilities well beyond data and technology.

ENABLERS FOR SUCCESSFUL AI ADOPTION

Successful adoption of AI hinges on an organization’s ability to acquire, develop, use and adapt AI technologies for effective problem-solving. The rapid pace of development makes it challenging to predict the future of AI, and betting solely on the latest AI technology is almost certainly a mistake.

Instead, the key is to invest in organizational capabilities that enable responsible and effective use of AI. These capabilities include AI literacy, organizational structures and leadership that support cross-functional collaboration, agile ways of working, scalable digital infrastructures and a culture that fosters innovation. When an organization has the necessary skills, structures and processes to use AI optimally alongside humans, they can more easily reap the benefits of future technologies too.

AI LITERACY: A CORE ENABLER

The first step towards successful AI adoption is to increase AI awareness and understanding across the organization. While not everyone needs to know the technical workings, it is vital for everyone to have a basic understanding of AI’s realistic possibilities, limitations and risks.

Solita worked closely with NCIA to develop the AI Masterclass (PHOTO: NCIA)

AI literacy empowers leaders to make informed decisions about adopting AI responsibly in their own contexts. To address this need, Solita collaborates with NCIA to provide AI Masterclasses for senior leaders. These training programmes raise strategic awareness of AI opportunities, discuss the limitations and risks of different types of AI technologies and allow leaders to apply their knowledge through relevant use cases.

THE WAY FORWARD FOR AI IN DEFENCE

AI education is a pragmatic place to start, but increasing AI maturity also requires experimentation with AI technologies. Defence organizations typically have long planning cycles and rigid procurement processes, which pose a challenge to the iterative approaches needed to learn from data and AI. To give sufficient focus to the responsible use of AI, organizations also need to adopt a human-centric design approach and work in multidisciplinary teams, both of which are rarely seen in military contexts.

To capture the benefits of AI, defence organizations must:

• Invest in AI training: Increase AI literacy at all levels to ensure informed decision-making and the appropriate use of AI technologies.

• Devise an AI strategy: Define the goals of AI use and the organizational, human and technical capabilities required to achieve them.

• Foster interdisciplinary collaboration: Form diverse teams of military strategists, operational users, AI experts, designers, human and social scientists, ethicists, lawyers and other relevant stakeholders to ensure a human-centric approach and holistic problem-solving.

• Integrate responsible AI practices: Deploy governance and responsible AI frameworks to ensure that AI use complies with international law and adheres to principles of responsible use, as well as introduce mechanisms to evaluate and manage the risks and impacts of AI solutions.

• Adopt agile ways of working and invest in scalable digital infrastructures: Create flexible procurement and development processes and build scalable digital infrastructures that support experimentation and learning from evolving AI technologies.

• Strengthen international cooperation: Collaborate with Allies and partners to share knowledge and best practices.

AI is expected to transform military operations, offering unprecedented capabilities in tactical decision-making, operational efficiency and strategic planning. However, realizing the potential of AI requires more than data and technology — it demands a holistic approach that considers not only data and AI but also human factors, organizational structures and ethical implications in unique defence contexts.

The future success of AI in defence requires a strategic intent backed up by investment to balance innovation with responsibility. By addressing AI skills development, strategic goals and requirements, interdisciplinary collaboration, responsible AI practices, agile ways of working, digital infrastructures and international collaboration, defence organizations can harness the potential of AI while mitigating risk.

The journey to integrate AI into military operations has started and the path ahead is exciting and challenging. By adopting a human-centric approach, we can work towards AI-enhanced military capabilities while upholding the values and principles that are fundamental to our defence.

Anna Metsäranta (second right) participating in an AI panel discussion hosted by NCIA (PHOTO: Solita)

IMPACT OF EMERGING AND DISRUPTIVE TECHNOLOGIES ON MDO

Alper Köprülü, Project Manager in NCIA’s Chief Technology Office, explains which Emerging and Disruptive Technologies are likely to have the biggest impact on the Alliance’s ability to conduct MDO (Multi-Domain Operations)

In today’s rapidly changing geopolitical environment, it is vitally important to provide NATO Member Countries with the advanced capabilities necessary for conducting operations across multiple domains. Thanks to technological advancements, these operational domains have expanded to encompass two areas that were not formerly accredited as separate NATO operational domains: cyber and space. Emerging and Disruptive Technologies (EDTs) affect all of these domains and expose the limitations of existing military technologies, some of which are falling behind their civilian counterparts. However, EDTs do not only bring disadvantages; they also offer many opportunities to advance existing military technologies in support of Multi-Domain Operations (MDO).

What is a Multi-Domain Operation (MDO)?

NATO Allied Command Transformation (ACT) defines MDO as the push for NATO to orchestrate military activities across all operational domains and environments. According to the ACT, these actions shall be coordinated with non-military activities, allowing the Alliance to achieve desired outcomes at the right time and place.

NATO’s Warfighting Capstone Concept emphasizes the critical need to develop tools that consistently harness data and innovation. It also highlights that ensuring interoperability across all domains is essential for maintaining NATO’s military effectiveness.

MDOs are complex to carry out effectively owing to the need for integration and coordination across many, if not all, of the land, sea, air, space and cyberspace domains simultaneously. Without having the MDO-adapted tools and capabilities, the chances of achieving military superiority are reduced.

What is the fundamental difference between ‘Joint Operations’ and MDO?

While Joint Command Structures enhance coordination among different elements of traditional armed services — army, navy and air force, for example — a ‘multi-domain’ mindset extends this coordination to include both military and non-military assets. It is this wider, more comprehensive approach that distinguishes these two types of complex warfare activities.

How does NCIA contribute to warfare development in MDO?

NCIA is playing an active role by executing its Science and Technology Programme of Work under its Innovation Portfolio and directly supporting ACT’s Innovation Branch. More specifically, the NCIA Chief Technology Office’s Innovation Team is leading an ACT-funded project that encompasses scenario and concept development for MDO through experimenting with EDTs. NCIA’s project team is composed of subject matter experts (SMEs) from various business areas. Moreover, a team of multidisciplinary experts is developing reference use cases where the inclusion of a wide range of EDTs can facilitate decision-making in MDO.

NCIA also collaborates with its partners in government bodies, industry and academia, fostering a comprehensive network of expertise and resources. The collaboration is particularly strong with pioneers in the fields of innovation and digital transformation.

What is the role of EDTs in MDO?

Maintaining superiority in MDO is complex and the defence sector should investigate new ways of conducting operations to cope with this increased complexity. EDTs undoubtedly have huge potential to be game-changing and prove their usefulness on the battlefield. They can provide the military with advanced capabilities and enhanced adaptability, as well as improve overall efficiency in operations. By taking advantage of these innovations, the Alliance will be better able to meet the dynamic challenges of modern warfare and ensure that continued superiority is maintained in all domains.

Which EDTs can make a difference in MDO?

To overcome the complexity of MDO, it is essential to employ a number of EDTs and integrate them into existing and future systems. The following examples are the most obvious:

• Artificial Intelligence (AI) can be employed to enhance decision-making by analysing vast amounts of data from multiple domains in real time, as well as providing predictive insights. This would optimize the allocation of resources, allowing pattern detection and adaptation of strategies to cope with dynamic battlefield conditions.

• Adopting and exploiting civilian telecommunications standards, such as 5G, can unlock new capabilities to ensure low latency and increased throughput in both peacetime and on the battlefield. Next-generation communications are key to operating effectively in a contested electromagnetic spectrum. The use of dynamic spectrum-management techniques is essential to optimize and protect these communication channels.

• Investing in quantum technologies will revolutionize cryptography by making communications more secure and enhancing data-processing capabilities. However, it is also essential to understand both the opportunities and challenges that post-quantum cryptography brings. Strengthening cybersecurity measures accordingly would help to protect information systems, ensuring secure communication and data integrity.

• Enhanced satellite systems can improve navigation, communication and intelligence, providing critical support for operations across all domains. Advanced sensors, higher-resolution satellite imagery and increased data-processing capabilities would enhance situational awareness, support reconnaissance missions and help to monitor critical infrastructure.

Most importantly, effective MDO requires integrated command and control (C2) systems. Ubiquitous C2 networks allow for better orchestration and organization of operations, ensuring rapid and effective responses to threats. This can be achieved by employing EDTs in future C2 systems while considering interoperability aspects.

Overall, investing in EDTs positions the Alliance at the forefront of technological innovation and delivers MDO-compatible defence capabilities.

How can NATO employ EDTs in MDO?

The process should start with the development of concepts by considering the state of the art in EDTs. This entails validating and evaluating them through experimentation and eventually adopting them as new capabilities.

The ultimate way to achieve success is fostering innovation and establishing partnerships with industry and academia. Coherence and collaboration between different NATO bodies is also key. In the meantime, NCIA, in partnership with ACT, will continue to contribute to the ‘concept development’ for several innovative applications to ensure that EDTs are fully considered and effectively integrated into MDO.

The Alliance’s ACT Innovation Branch successfully completed the NATO Innovation Continuum IGNITE event at Pensacola, Florida, looking into how to employ EDTs in MDO (PHOTO: NATO)

FIGHTING AI DEEPFAKES

Deepfakes, AI-generated media fabricated with deceptive intent, are becoming ever more ubiquitous, and the threat they pose has sparked global concern. The political and societal implications are significant: a June 2024 poll by the UK’s communications regulator, Ofcom, found that only 9% of people aged 16+ are confident in their ability to identify deepfakes.

Eve Michell, technology writer, spoke with Professor Janet Coats, Managing Director of the Consortium on Trust in Media and Technology at the University of Florida’s College of Journalism and Communications, to discuss this growing challenge

"Deepfakes are used to create messaging that is disingenuous or misleading, with the intention of changing people’s perceptions, beliefs or behaviours"

Can you tell us about your work at the Consortium on Trust in Media and Technology?

The Consortium on Trust in Media and Technology was started in 2019 as part of the University of Florida’s Moonshot initiatives, where we are trying to solve the big problem of trust and misinformation. We take an approach that is based on rigorous academic research, but also one where we can work on projects that go into the field quickly. Academic research can take a long time so we’re also focusing on work that is more practical, responsive and experimental.

I teach a general education course offered to students from all over the campus called Media Mastery, a term that I prefer to media literacy. Media Mastery is about refining the skillset to be a discerning consumer of media and understanding how to identify trusted sources of information. We live in the age of trusted influencers so one thing we’re teaching our students is how to find trusted, evidence-based journalists on platforms like TikTok.

We also teach people to look for classic ‘tells’ of deepfakes, such as the use of highly emotional language that appeals to the part of our brain that is easily outraged. Anger is one of the most powerful human emotions, so if we can find ways to cool it down and get people to question why content is making them angry, they can start to sense that their emotions are being manipulated. Deepfakes are rooted in manipulation and understanding this is vital to becoming a discerning and highly literate consumer of media, armed against these pervasive methods of disinformation.

How are deepfakes created and what risks do they pose?

A deepfake is false information — whether a video, text, audio or a combination of the three — that has been created through a technological process. Deepfakes have been around for a long time and people have been able to augment or alter images and change their meanings for decades. But with emerging generative artificial intelligence (GenAI) technology, people can now create something that is extremely convincing. It can be very difficult to tell the difference between a GenAI deepfake and an authentic image or video. GenAI also allows people to create deepfakes much faster and at a very low cost, if any cost at all. This lowers the barriers to creating them as you don’t need special equipment or training. All you need is access to a GenAI tool that has been trained on the data you want to use.

The motivations are inherently deceptive. Deepfakes are used to create messaging that is disingenuous or misleading, with the intention of changing people’s perceptions, beliefs or behaviours. They’re often disseminated to create a vision of somebody that is either more positive than in reality or, more commonly, more negative. There are significant implications, particularly in politics. We have seen deepfakes of high-profile politicians and officials supposedly saying things that they never would or mixing with people that they would normally take great pains to avoid.

However, the area we need to be the most concerned about is what’s happening at the local level. In the US, for example, many people don’t know what their election supervisor and officials look like. This opens up a massive opportunity for deception. There have been instances of people posting deepfakes of election officials giving false information about upcoming elections. This includes messages such as, “Don’t bother to send in postal votes as we are not counting them.” The more local you get, the bigger the risk — and the stakes are much higher than we may think.

Janet Coats is working on identifying and countering deepfakes at the University of Florida’s College of Journalism and Communications (PHOTO: University of Florida)

What methods are people using to spread deepfakes and how can we combat this?

There must be a vector for transmission and commonly this is social media, which has already been a hotspot for the proliferation of disinformation and misinformation for at least 15 years. Social channels are still the main distribution network and the bar is low for how technically sophisticated you have to be. There is the stereotype that Facebook has become a platform for Boomers (the generation born between 1946 and 1964) to share problematic memes and ‘fake news’, and there is definitely some truth in that.

However, video-first platforms like TikTok and Instagram Reels are now perhaps more obvious hosts for deepfakes. We know that deepfakes get the most traction on these platforms. Many users scroll through these feeds quickly, without stopping to think about what they’ve seen, which makes them ideal platforms to target users on. That said, malicious deepfake creators will choose the platform based on the demographic they are trying to reach.

What can be done to counter deepfakes?

The key questions we need to ask are: how do we help people to be more sceptical and how do we get them to question what they’re seeing?

There are a few routes we can take. One is to foster better media literacy to help people analyse the media they’re seeing. There are more and more media literacy programmes in secondary education and it makes a lot of sense to teach this from an earlier age.

There is also the potential to use AI as a tool against deepfakes. If we train a machine in deepfake detection, we could use AI tools as a detection and early warning system. At the moment, AI-detection tools are not quite sophisticated enough, but they will definitely get better and more effective.
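The detection-and-early-warning idea can be illustrated with a toy classifier. Everything below (the two features, the distributions and the threshold) is invented for illustration; real deepfake detectors learn far richer features from large labelled datasets.

```python
# Illustrative sketch only: a toy "deepfake early-warning" classifier.
# The two features (a blink-rate score and a compression-artifact score)
# and every number below are invented for illustration; real detectors
# learn far richer features from large labelled datasets.
import math
import random

random.seed(0)

def make_samples(n, fake):
    # Hypothetical feature distributions: fakes tend to score lower on
    # blink rate and higher on compression artifacts than real footage.
    return [((random.gauss(0.2 if fake else 0.5, 0.05),
              random.gauss(0.8 if fake else 0.3, 0.05)),
             1 if fake else 0)
            for _ in range(n)]

data = make_samples(200, fake=True) + make_samples(200, fake=False)

# Train a minimal logistic-regression model by gradient descent.
w0 = w1 = b = 0.0
for _ in range(200):
    for (x0, x1), y in data:
        p = 1 / (1 + math.exp(-(w0 * x0 + w1 * x1 + b)))
        err = p - y
        w0 -= 0.1 * err * x0
        w1 -= 0.1 * err * x1
        b -= 0.1 * err

def flag_for_review(x0, x1):
    """Return True if a clip should be routed to a human fact-checker."""
    return 1 / (1 + math.exp(-(w0 * x0 + w1 * x1 + b))) > 0.5

print(flag_for_review(0.2, 0.8))  # fake-like feature scores
print(flag_for_review(0.5, 0.3))  # authentic-like feature scores
```

The pattern, not the toy model, is the point: score incoming media automatically and route anything above a threshold to human reviewers, rather than relying on detection alone.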

This AI-generated deepfake was circulated to put pressure on the national government following Hurricane Helene in the US (PHOTO: AI-generated image)

INDUSTRY PERSPECTIVE

How the cloud can transform defence operations

Richard Goodman

EMEA Defence Lead, Hexagon’s Safety, Infrastructure & Geospatial division

As seen in civilian life, the adoption of cloud computing is transformative. For defence organizations, it addresses many of the limitations of traditional, on-premise IT infrastructure.

Although the cloud has been around for many years, there is still some confusion as to what it is and how it can be an advantage for defence organizations. Simply put, cloud computing is the delivery of computing services, including servers, storage, databases, networking, software, analytics and intelligence, over the internet (‘the cloud’). Key here — compared with traditional server-based computing — is the ability to scale required resources automatically and on-demand. There are differing configurations available, such as having the cloud on premise, having a hybrid mixture of cloud and local computing capabilities and using a mixture of data that is locally stored or delivered via the cloud. There are different scenarios where cloud capabilities can be very beneficial for defence, especially with respect to geographic data and geospatial software.
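That on-demand elasticity can be caricatured in a few lines of code. The per-instance capacity and utilization thresholds below are invented for illustration; in practice, cloud platforms expose this behaviour as managed autoscaling policies rather than hand-written loops.

```python
# Illustrative sketch of on-demand scaling: instances are added or removed
# as load changes. All numbers (capacity per instance, utilization
# thresholds) are invented for illustration only.

CAPACITY_PER_INSTANCE = 100  # requests/sec one instance handles (assumed)

def scale(instances, load):
    """Return the instance count needed for the current load,
    keeping utilization roughly between 50% and 80%."""
    while load > instances * CAPACITY_PER_INSTANCE * 0.8:
        instances += 1   # scale out under pressure
    while instances > 1 and load < (instances - 1) * CAPACITY_PER_INSTANCE * 0.5:
        instances -= 1   # scale in when demand drops
    return instances

n = 1
for load in [40, 300, 900, 900, 120, 40]:  # simulated demand over time
    n = scale(n, load)
    print(f"load={load:4d} req/s -> {n} instance(s)")
```

The contrast with traditional server-based computing is that the fleet returns to one instance when demand subsides, so capacity no longer has to be bought for the peak.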

Cloud computing is transformative with regard to the traditional way defence thinks about and provisions computing resources. So, what capabilities does defence need and how transformative is cloud computing?

Here are a few potential benefits:

• Collaboration: cloud-based collaboration allows seamless working where files can be stored and accessed, meetings can occur in real time and decisions can be completed cooperatively from a single, secure location. Enhancing this are solutions such as Software as a Service (SaaS) collaboration workspaces, where defence organizations and allied agencies can easily share and act on their data in a secure environment across the domains of land, sea, air, space and, of course, cyber.

• Resilience: this refers to the ability of a cloud-based system or application to offer continuity and recover from disruptions without interruption. For defence, however, resilience also means the planning and management of major events, operations and incidents. Harnessing the collective capabilities of diverse units and providing a single source of information through the entire event lifecycle enables safe, efficient and effective operations.

• Distribution: linked to resilience, this is a model that spreads cloud services and infrastructure across multiple geographic locations, instead of keeping them in a single data centre. The benefits of this are being exemplified in Ukraine. For effective uploading, managing and maximizing of geographic data in this model, solutions need to power applications with storage, visualization and collaboration tools as well as automated microservices for data processing.

• Scalability: the cloud’s ability to scale processing resources on demand is crucial for handling large datasets and simulations by creating new compute instances as demand changes. This scalability reduces the time taken to process data, for example in simulations. Real-world engineering and spatial problems can also be solved by leveraging this scalability to harness the power of cutting-edge simulation technologies. With the increased use of AI-based analytics in geospatial workflows, the availability of almost unlimited CPUs and GPUs on the cloud is a massive advantage to analysing both 2D and 3D data.

• Analysis: using cloud computing for uploading data from edge devices, carrying out preset analysis and data processing and then making these results available to interested commanders is a major benefit. This maximizes the return on data and helps to democratize knowledge and capabilities to forces through self-developed applications.

• Accessibility: as described already, using the cloud from disparate locations via differing media is fundamental. This access includes defence users accessing data and services that they may not have on hand locally. Sharing capabilities like data processing and visualization via cloud hosting, as well as workflows and tools, fosters geospatial tradecraft creation, common usage and secure sharing of the latest intel, making up-to-date information accessible and reusable.

• Situational awareness: with multiple data feeds and data stores, along with local real-time sensors, cloud computing is an ideal platform for providing real-time situational awareness, as found by the European Defence Agency CLAUDIA project. The key to cloud computing is to enable solutions to seamlessly work together, leveraging all relevant data and delivering mission-critical intelligence at scale. The DIoT (Defence Internet of Things) is as complex as the civilian IoT (Internet of Things), although more dynamic and ad hoc when operations require it. Having configurable middleware to link these edge devices is necessary for rapid collaboration in multi-nation operations.

• Connectivity: interoperability standards play a vital role in cloud computing, for transferring data, communications between applications and end-user security, as well as web services. As a member of numerous standards bodies, Hexagon implements standards-based COTS software for cloud use and helps to define future interoperability.

• Security: beyond moving computing off local machines, the cloud can separate users from the data entirely, so that all data is stored, processed and accessed via the cloud. With cloud-based permission and access management, users can be accredited regardless of their entry point.

• Updates: applying software releases and updates to cloud instances is much simpler than visiting thousands of end-user computers. Software updates are important for adapting to technology changes and advancements, incorporating user feedback, strengthening the robustness of the software and increasing the optimization of cloud resources. This ensures that all defence users are working with the same software capabilities, as well as from the same map.

• Cost efficiency: moving to a subscription model for computer infrastructure and software reduces the extremes of defence spending and overall costs.

For many defence organizations, having a hybrid approach to cloud computing is their first step. This can be using data on the cloud, such as Google Earth Engine repositories, along with local software and data. Capturing data workflows that utilize local and cloud-stored data and processes can reap the benefits of cloud computing while maintaining local control and repeatability of analysis.
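A hybrid workflow of this kind boils down to a simple rule: use locally held data when available, fall back to the cloud store otherwise, and record provenance so the analysis is repeatable. The store contents and dataset names below are invented for illustration.

```python
# Illustrative hybrid-data sketch: prefer local holdings, fall back to a
# cloud repository, and log provenance so analyses can be repeated.
# Store contents and dataset names are invented for illustration.

local_store = {"terrain_tiles": "local copy"}
cloud_store = {"terrain_tiles": "cloud copy", "satellite_imagery": "cloud copy"}

provenance = []  # record where each dataset came from

def fetch(dataset):
    """Return a dataset, preferring local storage over the cloud."""
    if dataset in local_store:
        provenance.append((dataset, "local"))
        return local_store[dataset]
    if dataset in cloud_store:
        provenance.append((dataset, "cloud"))
        return cloud_store[dataset]
    raise KeyError(f"{dataset} not held locally or in the cloud")

fetch("terrain_tiles")       # served locally
fetch("satellite_imagery")   # fetched from the cloud
print(provenance)
```

The provenance log is what preserves local control and repeatability: the same analysis can be rerun later against the same sources.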

Smaller defence organizations can benefit massively by adopting cloud-based data stores and processes from both commercial companies and other defence organizations. With minimal investment in infrastructure and training, they can kick-start and enhance their data analytics.

Cloud computing is unmatched in its scalability, flexibility and real-time data-processing abilities, making it a real asset for defence organizations with mission-critical operations.

CLOUD ADOPTION JOURNEY

Simon Michell talks to Chris Bailey, General Manager, Global National Security and Defence at Amazon Web Services, to find out why military organizations and governments are adopting the cloud as well as how secure their data is within it

Governments that review their IT ask whether cloud technology can support missions effectively and securely, from logistics through to near-real-time data analysis.

Chris Bailey, General Manager, Global National Security and Defence at Amazon Web Services, explains: “Governments worldwide are recognizing that this is imperative in the current global threat context. Recent major cyber incidents have reminded us how much organizations depend on the resilience of their digital operations. This applies equally to defence structures. Digital transformation equips military organizations with the innovation and solutions they need to achieve their missions.

Cloud technology is a key enabler in this respect, as it allows mission-based applications and services to be developed in a very agile way, scaled when needed and without excessive cost. It also provides defence organizations with the baseline for cybersecurity that they will not achieve with legacy IT infrastructure, as on-premises solutions are unable to match the breadth and constantly updated suite of security solutions that are available with the cloud. In short, you can’t focus on speed and innovation if your foundation is not solid.”

“ When starting their cloud adoption journey, governments ask how they can show that the cloud is a safe and secure environment”

There are common themes and questions about the cloud adoption journey that emerge, regardless of the stage of digital maturity a government or organization has reached. Bailey and his team have listened to many organizations who want to know how to get started.

“This is not an IT refresh, it is an organizational transformation. Change begins with a strong top-level leadership vision. But, the execution of this vision requires new ways of working across IT, procurement, security, legal and many other functions. The whole organization needs to evolve. Training is essential,” he explains.

Some organizations may accelerate digital transformation with a memorandum of understanding, such as the UK MoD MoU. Starting with a specific project as a proof of concept makes testing within a defined scope possible, building confidence and expertise, such as in the case of the US XVIII Airborne Corps.

GETTING STARTED

When starting their cloud adoption journey, governments ask how they can show that the cloud is a safe and secure environment. “AWS [Amazon Web Services] has been architected to be the most flexible and secure cloud computing environment available today,” says Bailey. He continues, “Our core infrastructure is built to satisfy the security requirements for military, global banks and other highly sensitive organizations. Our service offerings and associated supply chain are vetted and accepted as secure enough for top-secret workloads, which benefits all our customers globally.”

Resilience is critical to security, Bailey adds. “The AWS Cloud spans 34 Regions and 108 Availability Zones (AZs), with announced plans for six more AWS Regions and 18 more AZs. Each AWS Region has a minimum of three isolated and physically separate AZs within a geographic area. Each AZ has independent power, cooling and physical security and is connected via redundant, ultra-low-latency networks. Our infrastructure is monitored 24/7, offers multiple fault isolation capabilities to improve resilience, and allows encryption of all of the data flowing across the network before it leaves our secured facilities. This is the foundation of AWS infrastructure that provides cloud capabilities and underpins resilient services that can scale rapidly.”
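The multi-AZ design Bailey describes reduces to a simple failover rule: a service deployed across several zones keeps serving as long as any one zone is healthy. The zone names and health model below are invented for illustration and are not AWS APIs.

```python
# Illustrative sketch of multi-AZ failover: route requests to any healthy
# zone so that a single-zone outage does not interrupt service.
# Zone names and the health model are invented for illustration.

zones = {"az-1": True, "az-2": True, "az-3": True}  # zone -> healthy?

def route(zones):
    """Return a healthy zone to serve the request, or raise if all are down."""
    for name, healthy in sorted(zones.items()):
        if healthy:
            return name
    raise RuntimeError("total outage: no healthy zone")

print(route(zones))       # normal operation
zones["az-1"] = False     # simulate a zone failure
print(route(zones))       # service continues from another zone
```

With a minimum of three isolated zones per region, the service only fails entirely if every zone fails at once, which the independent power, cooling and networking are designed to make unlikely.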

Organizations also want evidence of accreditation from regulatory bodies. AWS, for example, supports 143 security standards and compliance certifications, more than any other offering, including PCI-DSS, HIPAA/HITECH, FedRAMP, GDPR, FIPS 140-2 and NIST 800-171, helping satisfy compliance requirements for virtually every regulatory agency around the globe. This is backed by a deep set of cloud security tools, with over 300 security, compliance and governance services and key features — more than any other cloud provider.

To enable organizations to keep control of their data, AWS uses a shared responsibility model with the customer; AWS manages and controls the components from the host operating system and virtualization layer down to the physical security of the facilities in which the services operate, and AWS customers are responsible for building secure applications. There are also best practices documents, encryption tools and other guidance for delivering application-level security measures.

CLOUD FOR DEFENCE ORGANIZATIONS

AWS created Trusted Secure Enclaves (TSE) on AWS for national security and defence organizations so they can build a comprehensive cloud architecture for sensitive workloads. Organizations can use TSE to meet and accelerate cloud accreditation processes against national and alliance security and compliance requirements.

The UK Ministry of Defence, one of the earlier cloud adopters, set out the reasons it believes the cloud is more secure than on-premises for Official level data, noting that: security patches can be applied faster; it’s easier to deploy security controls at scale; you can authorize everything and implement ‘separation of duties’ more easily.

The US Government has four data levels. It uses the commercial cloud for Unclassified and AWS GovCloud (US) for Restricted. For Secret and Top Secret, it uses regions that operate in separate AWS partitions and are physically and logically isolated from the internet.

AWS GovCloud (US) is ‘US persons’ only, is connected to the internet, and was built 13 years ago to meet US ITAR and US FedRAMP High requirements.

"However," Bailey says, "for governments starting their journey today, all requirements can be met in the cloud. And today, most US Government customers are using the commercial cloud: it has more services, the latest technology, more location options, and is lower cost."

Ukraine’s government accelerated its cloud adoption to meet the threat posed by Russia’s invasion, covering civilian functions such as healthcare, education, tax and land records as well as security and defence. The government moved 10+ petabytes of citizen data to AWS, including education records and digital classrooms; birth and family services; health records; personal and corporate financial services; bank accounts, ATM and online banking records; and property records. According to Mykhailo Fedorov, Deputy Prime Minister for Innovation, Education, Science and Technology – Minister for Digital Transformation, "This is core for the operation of the economy, of the tax system, of banks, and of government overall. This war proves that digital infrastructure is the most resilient one — you cannot destroy it easily with bombs."

Governments that choose to remain on-premises can expect capacity constraints and the emergence of shadow IT, as people find their own ways to access the services they require, which creates risk. Multimillion-euro networks may be compromised by a single email.

INSIGHTS FOR A SUCCESSFUL TRANSFORMATION

Those who decide to adopt the cloud, sometimes called the commercial cloud, can consider three insights that Bailey and his team suggest:

1. Align your organization, not just your leadership. Set a vision and bold goals. Then, leverage training to expose your teams to the journey ahead. You will need deeper training for technology specialists and awareness training for support functions. Leverage consultants to assist in transformation, jump-start projects and transfer knowledge.

2. Listen to your people. Start with users’ needs and the outcomes they require, rather than with policymakers. This helps build the case for transformation. Sometimes there are misplaced assumptions about what policies allow — challenge these.

3. Consider an Amazon leadership principle, bias for action, to overcome ‘analysis paralysis’. Very few decisions are one-way door decisions. Most can be tested and then adapted or even reversed, if necessary.

“One of the most powerful insights we learn from successful digital transformations is simply to get started,” says Bailey. “Collaborate with specialist partners, learn and iterate. The support and the technology are all there for you.”

CLOUDEX ALIGNING WITH NECOM

Stefaan Vermassen, NCIA’s Principal Cloud Architect within the Cloud Centre of Excellence, reveals how the CloudEx training exercise has helped the Alliance accelerate its journey to the cloud

NATO Edge Speaker

The digital revolution continues to reshape the defence landscape, and NATO stands at the forefront, embracing technological advancements to enhance operational effectiveness. At the heart of this transformation is cloud computing — an enabler that provides NATO with the scalability, agility and resilience needed to meet the complex challenges of the 21st century. The NATO Enterprise Cloud Operating Model (NECOM) introduces a cohesive governance framework and clear operating principles, marking a decisive shift in how NATO adopts and manages cloud technology.

THE PATH TO NECOM: A UNIFIED CLOUD STRATEGY

NATO’s initial forays into cloud computing have spanned several years, but the lack of coherence in cloud adoption slowed progress. Different parts of the NATO Enterprise had been implementing cloud solutions independently, leading to stovepiped approaches, fragmented governance and inconsistent application of security measures. This decentralized approach limited NATO’s ability to fully leverage the potential of the cloud.

NECOM changes all that. It brings the governance and management of cloud computing across the entire NATO Enterprise under one umbrella, providing a unified strategy for cloud adoption. The model operates at three key levels — strategic, operational and tactical — ensuring that cloud adoption is aligned with NATO’s broader goals and that all stakeholders, from policymakers to technical experts, are engaged.

The NECOM model was introduced with the NATO Cloud Computing Directive, which addressed existing pain points, such as lack of governance and insufficiently trained personnel. However, for NECOM to succeed, it needed the establishment of key governance roles, such as the Cloud Service Broker (CSB) and the Cloud Strategy Group (CSG), to oversee the cloud’s architecture, governance and lifecycle management. These roles became integral to the operationalization of NECOM and were formalized and tested during CloudEx 2024.

CLOUDEX: A CATALYST FOR CLOUD ADOPTION

The NATO Enterprise Cloud Exercise (CloudEx), conducted in February 2024 by NCIA, the Office of the NATO Chief Information Officer (OCIO) and Allied Command Transformation (ACT), played a critical role in finalizing core components of NECOM. This exercise brought together leaders and stakeholders from across the NATO Enterprise to validate NECOM’s governance roles by simulating use cases and gathering feedback.

CloudEx provided an invaluable platform for NATO leaders to engage with industry experts, learn about cloud operating models and discuss how cloud technologies could be leveraged to accelerate NATO’s digital transformation. One of the key takeaways from the exercise was the importance of aligning NECOM with other NATO cloud initiatives, such as the Protected Business Network (PBN) and the Cloud Security Technical Directives for NATO RESTRICTED and NATO SECRET classification levels. These efforts are crucial to ensure that NATO’s adoption of cloud technologies is secure, efficient and strategically sound.

The exercise highlighted the transformative potential of NECOM, particularly in providing a structured, coherent approach to cloud governance. By finalizing NECOM’s components through CloudEx, NATO can now move forward with a clear, actionable plan for cloud adoption, focusing on security, scalability and seamless integration across its operations.

KEY COMPONENTS OF NECOM

At the core of NECOM are two vital roles: the Cloud Service Broker (CSB) and the Cloud Strategy Group (CSG). The CSB, managed by NCIA, is responsible for the architecture, engineering and overall governance of cloud services within NATO. The CSB ensures that cloud services are delivered efficiently and meet NATO’s security and operational requirements. This role is essential for providing clarity on cloud brokerage and ensuring the cloud’s adoption is well-managed and consistent across the NATO Enterprise.

“NATO can now move forward with a clear, actionable plan for cloud adoption, focusing on security, scalability and seamless integration across its operations”

The CSG, led by OCIO, focuses on strategic planning and harmonizing cloud services across NATO. It acts as a decision-making body that aligns cloud projects and programmes with NATO’s broader strategic goals.

The CSG’s collaborative approach ensures that NATO’s cloud strategy is implemented coherently, enabling the NATO Enterprise to leverage the cloud’s full potential for operational and business IT needs.

NECOM’S IMPACT ON NATO’S DIGITAL TRANSFORMATION

With the formalization of NECOM, NATO is poised to accelerate its cloud adoption efforts. NECOM addresses several key challenges that have hindered NATO’s cloud journey, including:

• Governance: NECOM provides clear accountability for cloud architecture, deployment and operation, ensuring that cloud services are delivered efficiently and securely across NATO.

• Strategic alignment: NECOM ensures that the cloud supports NATO’s mission-critical operations by aligning cloud adoption efforts with NATO’s broader digital transformation goals.

• Scalability and agility: NECOM-compliant cloud services provide NATO with the flexibility to scale operations quickly, improve collaboration and respond to evolving threats.

• Security: NECOM’s focus on establishing secure cloud frameworks, such as the NATO Cloud Security Technical Directives, ensures that sensitive NATO information is protected while enabling the use of public cloud services for non-sensitive applications.

LOOKING AHEAD: THE FUTURE OF NATO’S CLOUD JOURNEY

NECOM and CloudEx underscore NATO’s commitment to embracing cloud technologies as a cornerstone of its digital transformation. As NATO continues to refine its cloud operating model, the lessons learned from CloudEx and the operationalization of NECOM will serve as guiding principles for future cloud initiatives.

Moving forward, NATO’s adoption of cloud technologies will not only enhance its operational capabilities but also foster greater collaboration and innovation across the NATO Enterprise. With NECOM in place, NATO is well-positioned to meet the challenges of the digital age, ensuring that it remains agile, resilient and ready to defend the security of its Member Countries. In addition, NECOM will be leveraged in industry engagements when acquiring new cloud services and will be further operationalized as part of the NATO Allied Command Transformation’s Protected Business Network Capability Programme Plan.

As NATO’s cloud journey unfolds, NCIA will continue to play a critical role in driving cloud adoption and ensuring that NECOM delivers on its promise of a unified, efficient and secure cloud environment for the NATO Enterprise.

CloudEx took place in February 2024 to test the readiness of NECOM for operationalization (PHOTO: NCIA)

COMBINING THE POWER OF CLOUD AND 5G FOR DEFENCE APPLICATIONS

Philippe Agard, Vice President of Market Creation at Nokia Defense International, explores the impact of cloud computing and 5G on the defence sector, highlighting its benefits, applications and challenges

The integration of cloud computing and 5G technology will revolutionize many sectors, with Defence being one of the most significant. As military operations become increasingly reliant on advanced digital technology, extreme bandwidth and interoperability, the convergence of 5G and the cloud offers many opportunities for enhancing operational efficiency, information superiority and overall mission effectiveness.

UNDERSTANDING 5G AND THE CLOUD FOR DEFENCE

5G delivers higher speeds and capacity, lower latency, greater reliability and a vast ecosystem of interoperable devices and sensors, supporting a wide range of applications, from real-time HD video streaming to augmented and virtual reality and the Internet of Things. It is becoming a key component of the defence sector’s digitalization effort across a broad set of use cases, from smart bases to forward operations and even up to the tactical front lines. All this is possible due to 5G tactical communications systems that support data-intensive or latency-sensitive applications.

Generally speaking, the cloud refers to a broad set of concepts related to the ability to move data and workloads through an information and communications technology network from central systems to the edge of the network. This definition can be extended to the far edge, with recent developments in edge computing, even though it is more about increasing local computing capability thanks to virtualization and miniaturization. Such a multi-cloud architecture has the advantage of positioning all types of data, and both intelligence and processing, close to the point of collection and/or consumption, which, similarly to 5G, improves latency, scalability, network resilience and agility.
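The placement logic behind such a multi-cloud architecture can be illustrated with a small sketch. The tier names, latency figures and workload parameters below are purely hypothetical assumptions for illustration, not drawn from any NATO system: a workload with a tight latency budget is scheduled to the far edge, close to the point of collection, while less time-critical workloads run in the central cloud where capacity is greatest.

```python
from dataclasses import dataclass

# Hypothetical tiers of a multi-cloud architecture, ordered from the
# point of collection outwards. Round-trip latencies are illustrative only.
TIERS = [
    ("far-edge", 5),    # on-vehicle / command-post compute
    ("edge", 25),       # deployable data centre at a forward base
    ("central", 120),   # rear-echelon / central cloud
]

@dataclass
class Workload:
    name: str
    max_latency_ms: int  # the application's latency budget

def place(workload: Workload) -> str:
    """Return the furthest-out tier (most capacity) that still meets the budget."""
    for tier, latency in reversed(TIERS):
        if latency <= workload.max_latency_ms:
            return tier
    # Nothing meets the budget: fall back to the tier closest to the data.
    return TIERS[0][0]

print(place(Workload("drone-video-analytics", 10)))  # tight budget -> far-edge
print(place(Workload("logistics-reporting", 500)))   # relaxed budget -> central
```

The design choice mirrors the paragraph above: data and processing are pushed only as far from the point of collection as the application's latency budget allows.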

Arguably, only when you combine 5G technology with multi-cloud technology can you deliver a service- and data-centric defence network that extends to all domains, including the tactical edge. This would provide more autonomy to deployed or advanced command posts closer to the battlefield, solving some of the issues linked to the distance between them and the rear echelon, where decision-making has historically occurred. Local processing makes it possible to shorten the decision cycle and act with immediate effect.

THE SYNERGY BETWEEN 5G AND THE CLOUD

From a technological standpoint, 5G core and radio network components can be treated as workloads or applications, which makes it easier to optimize their deployment at the ‘Tactical Edge’. For example, 5G core and baseband functions can be deployed on the same military-grade servers used in a deployable communications and information system, or embedded in cloud-capable compute servers already fitted to military vehicles that possess the necessary computing platforms.

Hence, the integration of multi-cloud technology with 5G networks fulfils the aspiration of federated mission networks and multi-domain operations (MDO), realizing the use of new digital services hosted in the distributed cloud across all locations. These are just a few of the ways cloud computing and 5G will impact defence digitalization:

• Enabling collaborative combat: Far-edge computing and 5G tactical communications power the battlefield of things, including unmanned vehicles, drone swarms, sensor networks and real-time high-definition video/infra-red cameras. 5G will allow a mobile command post, hosting far-edge computing, to consume enriched data. This boosts its command, control, communications, computers, cyber, intelligence, surveillance and reconnaissance (C5ISR) capabilities.

• Forward operations: Deployable cloud and 5G networks will enable operators to bring smart base services into forward operating bases. This will enhance activities such as base security, medical services, deployed HQ site interconnectivity, logistics, and maintenance, repair and overhaul (MRO).

• Naval task forces: Innovative digital services within naval task forces will be enhanced by providing access to services running from a cloud infrastructure. This infrastructure will be disseminated between several ships in a task force, enabled by 5G ship-to-ship, ship-to-drone, and in-ship mobile connectivity.

• Transfer of rich data and information: High-capacity exchange of information between the central cloud and edge cloud will be facilitated by enabling 5G slicing for communications service providers. This will enable the transfer of data between defence core and edge cloud infrastructure within a country.

Naval task-force operations will be transformed with the integration of the cloud and 5G (PHOTO: Crown Copyright)

• Improving homeland operations: Smart base services using 5G connectivity on a military base will connect to cloud-based digital services. This will enhance many aspects of base activities such as training, base security, healthcare, logistics and MRO.

CHALLENGES AND CONSIDERATIONS

While the potential benefits of integrating the cloud and 5G in defence are substantial, one significant challenge to adoption is the integration of existing military communication technologies and applications. We should not expect critical military systems that rely on traditional infrastructure or communication technology to vanish. Instead, 5G and the cloud will augment them by enabling the deployment of new data-centric services. The integration of these new infrastructures into existing military platforms, as well as coexistence and interoperability between the new cloud and 5G-based services and existing ones, will have to be managed in the smoothest possible way.

Another challenge that applies to any new technology in this era is ensuring proper security. With cybersecurity threats and risks increasing every day, securing cloud deployments and wireless interfaces is of paramount importance, particularly in networks used by and for defence agencies.

USHERING IN A NEW ERA OF DEFENCE CAPABILITY

The integration of cloud computing and 5G technology is set to transform the defence sector, enhancing operational capabilities, improving communication and enabling real-time data processing of a rich data set to turn it into useful information as quickly as possible. As military operations increasingly depend on advanced technologies, embracing their integration will be essential to improve NATO’s and Allied nations’ digital backbones and maintain strategic advantages on the battlefield. This year, NATO ACT’s Digital Backbone Experiment (DiBaX) in Riga has effectively demonstrated several benefits, particularly in the context of MDO.

By leveraging the power of the cloud and 5G, defence forces can enhance their readiness, responsiveness and effectiveness in an increasingly complex global landscape. Moreover, these two technological platforms are mature enough for adoption within defence command, control, communications, computers, cyber defence and combat systems and intelligence, surveillance and reconnaissance (C6ISR) systems, pending the establishment of proper policies for their deployment. It will also require careful planning, investment and collaboration among defence system and platform stakeholders. However, as these technologies continue to evolve, the potential for innovation in defence operations will only grow, paving the way for a new era of military capability and effectiveness.

“ Deployable cloud and 5G networks will enable operators to bring smart base services into forward operating bases”

NITECH

NATO INNOVATION AND TECHNOLOGY

ISSUE 12 | DECEMBER 2024

Editors

Lara Vincent-Young and Simon Michell

Project Manager

Andrew Howard

Editorial Director

Emilie Dock

Art Direction

Errol Konat

Layout

Billy Odell

Contributing Photographers

Marcos Fernandez Marin, Conrad Dijkstra, Francesc Nogueras Sancho

Printed by Micropress Printers Ltd

Published by

Chantry House, Suite 10a High Street, Billericay, Essex CM12 9BQ

United Kingdom

Tel: +44 (0) 1277 655100

On behalf of

NATO Communications and Information Agency (NCIA)

Oude Waalsdorperweg 61, 2597 AK The Hague, Netherlands

© 2024. The views and opinions, expressed by independent (non-NATO) authors, contributors and commentators in this publication, are provided in their personal capacities and are their sole responsibility. Publication thereof, does not imply that they represent the views or opinions of NCIA, NATO or Global Media Partners (GMP) and must neither be regarded as constituting advice on any matter whatsoever, nor be interpreted as such. References in this publication to any company or organization, as well as their products and services, do not constitute or imply any direct or indirect endorsement, recommendation or preference by NCIA, NATO or GMP. Furthermore, the reproduction of advertisements in this publication does not in any way imply endorsement by NCIA, NATO or GMP of products or services referred to therein.

Ready to innovate with NATO?

Join forces with NCIA to create the future of defence technology

Scan to explore our business opportunities
