The way forward: AI helps drive data centre CX
AI and cybersecurity
Deep reinforcement learning offers smarter cybersecurity
Not just limited to ChatGPT and Bard, generative AI is transforming the gaming industry
Innovators are paving the way for a more resilient, sustainable and efficient future. The rules have changed. It’s time for DISRUPTION.
Tech LIVE Virtual returns to highlight the innovators changing the industry through expert keynote speakers, interactive fireside chats and panel discussions. This exclusive one-day virtual event will bring together the greatest voices in the industry for an essential deep dive into the future of Technology, AI and Cyber.
Brought to you by BizClik, Technology, AI and Cyber Magazines, the event will shine a light on essential topics such as the AI revolution, quantum computing, the virtual workplace, technology’s place in sustainability and much more.
It’s time for DISRUPTION.
Position your business as a pioneer in Technology and showcase your values, products and services at Tech LIVE Virtual.
This is your chance to share your innovations with the technology community by making an impact in front of fellow decision-makers and influencers as well as accessing potential partners via an active and engaged audience.
See you on 8 June 2023.
We produce Digital Content for Digital People across 20+ Global Brands, reaching over 15M Executives
Digital Magazines
Websites
Newsletters
Industry Data & Demand Generation
Webinars: Creation & Promotion
White Papers & Research Reports
Lists: Top 10s & Top 100s
Events: Virtual & In-Person
Work with us
EDITOR-IN-CHIEF
MARCUS LAW
CHIEF CONTENT OFFICER
SCOTT BIRCH
MANAGING EDITOR
NEIL PERRY
PROOFREADER
JESS GIBSON
CHIEF DESIGN OFFICER
MATT JOHNSON
HEAD OF DESIGN
ANDY WOOLLACOTT
LEAD DESIGNER
HECTOR PENROSE
FEATURE DESIGNERS
MIMI GUNN
SOPHIE-ANN PINNELL
HECTOR PENROSE
SAM HUBBARD
JUSTIN SMITH
REBEKAH BIRLESON
ADVERT DESIGNERS
JORDAN WOOD
DANILO CARDOSO
CALLUM HOOD
VIDEO PRODUCTION MANAGER
KIERAN WAITE
SENIOR VIDEOGRAPHER
HUDSON MELDRUM
DIGITAL VIDEO PRODUCERS
MARTA EUGENIO
ERNEST DE NEVE
THOMAS EASTERFORD
DREW HARDMAN
JOSEPH HANNA
SALLY MOUSTAFA
JINGXI WANG
PRODUCTION DIRECTORS
GEORGIA ALLEN
DANIELA KIANICKOVÁ
PRODUCTION MANAGERS
JANE ARNETA
MARIA GONZALEZ
CHARLIE KING
YEVHENIIA SUBBOTINA
MARKETING MANAGER
INDIA BERRY
PROJECT DIRECTORS
KRIS PALMER
TOM VENTURO
MEDIA SALES DIRECTOR
JAMES WHITE
MEDIA SALES
JASON WESTGATE
MANAGING DIRECTOR
LEWIS VAUGHAN
CEO
GLEN WHITE
OpenAI’s GPT-4 is here, and it’s already helping big-name companies like Morgan Stanley and Stripe with its incredible new features and abilities
As GPT-4 emerges from the shadows with its own take on “Hello, World!”, it’s hard not to be impressed by OpenAI’s new GPT offering.
The new model now offers an extraordinary 25,000-word capacity – for reference, that limit is around the same word count as The Old Man and the Sea by Ernest Hemingway and only slightly less than George Orwell’s Animal Farm or John Steinbeck’s Of Mice and Men.
Morgan Stanley already harnesses GPT-4 to tap into its vast content library, while Stripe utilises this innovative AI model to accelerate workflows and identify valuable applications. GPT-4's groundbreaking capabilities include enhanced creativity, image input, extended text handling, and improved problem-solving.
This issue is a testament to the transformative power of AI, particularly GPT-4, in revolutionising industries and shaping the future of business. Join us as we embark on a journey through the AI landscape, bringing you expert analysis, in-depth discussions, and a closer look at the technology that is redefining the way we live and work.
marcus.law@bizclikmedia.com
“OpenAI’s GPT-4 now offers an extraordinary 25,000-word capacity – for reference, that limit is around the same word count as The Old Man and the Sea by Ernest Hemingway”
BIG PICTURE: The European quantum computer project – OpenSuperQ – is extended
THE BRIEF: Artificial synapses making memories for more sustainable AI
TIMELINE: The history of AI in the gaming industry
TRAILBLAZER: Colin Angle
FIVE MINS WITH: Dr Emilia Molimpakis
BELL: Evolving towards the Finance of the Future through Data and Analytics
SWITCH DATACENTERS: Sustainability strategies and getting the public on board
SAP: SAP’s Sam Castro on AI and risk resilience in manufacturing
Jamie Cairns, Chief Strategy Officer at Fluent Commerce, on how its order management platform enhances operational efficiency and customer experience.
Fluent Commerce is a global software company focused on distributed order management. Its cloud-native platform, Fluent Order Management, provides accurate and near real-time inventory data visibility, order orchestration, fulfilment optimisation, in-store pick and pack, customer service, and reporting to transform fulfilment into a competitive advantage.
As Jamie Cairns, Chief Strategy Officer at Fluent Commerce, explains, the process of managing orders begins with inventory data. “Being able to unify a view of inventory and then syndicating that inventory data out across a range of different channels lets you improve the customer experience,” Cairns comments.
That, in turn, has a range of different operational efficiency benefits, reducing costs by reducing split shipments, cancelled orders, and customer service calls.
As Cairns describes, order management represents an opportunity for retailers and B2B organisations to harness inventory data to provide real-world benefits. One of its recent innovations, Fluent Big Inventory, unifies those inventory sources in near real-time, enabling all systems to become inventory aware.
“It is not just about enhancing the order fulfilment process, which is typically what has been the domain of an order management system,” Cairns explains. “It’s about making inventory data available to other systems, like search, as well and ultimately being able to personalise search results based on inventory.” With changing customer preferences in recent years,
brands have had to adapt quickly. As Cairns explains, Fluent Order Management not only provides a robust software-as-a-service platform, but at a lower total cost so businesses can move quickly and meet customer expectations efficiently.
During the COVID-19 pandemic when stores were closed, Fluent Order Management enabled businesses to adapt quickly. “Stores still had inventory and there were huge spikes in online demand,” Cairns explains. “Our customers were able to adapt in a matter of a day to completely change their fulfilment workflows.
“Digital agility is essential,” he concludes. “We are not trying to predict what the future is, but to provide a toolset that allows you to adapt as the future evolves.”
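To make the unified, inventory-aware view Cairns describes more concrete, here is a minimal Python sketch of the pattern: aggregate sellable stock across locations into a single view, then let a downstream system such as search filter on it. This is an illustration only, not Fluent’s actual API; the record fields, function names and sample data are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical inventory records arriving from different locations/channels.
@dataclass
class InventoryRecord:
    sku: str
    location: str
    on_hand: int
    reserved: int

def unified_availability(records: list[InventoryRecord]) -> dict[str, int]:
    """Build a single sellable-quantity view per SKU across all locations."""
    availability: dict[str, int] = {}
    for rec in records:
        sellable = max(rec.on_hand - rec.reserved, 0)
        availability[rec.sku] = availability.get(rec.sku, 0) + sellable
    return availability

def filter_search_results(results: list[str], availability: dict[str, int]) -> list[str]:
    """Inventory-aware search: only surface SKUs a customer can actually buy."""
    return [sku for sku in results if availability.get(sku, 0) > 0]

records = [
    InventoryRecord("TSHIRT-M", "warehouse-1", on_hand=40, reserved=5),
    InventoryRecord("TSHIRT-M", "store-amsterdam", on_hand=3, reserved=3),
    InventoryRecord("JEANS-32", "store-amsterdam", on_hand=0, reserved=0),
]
avail = unified_availability(records)
print(filter_search_results(["TSHIRT-M", "JEANS-32"], avail))  # ['TSHIRT-M']
```

In a live deployment the aggregation would run continuously against feeds from warehouses, stores and suppliers – the “near real-time” property the platform emphasises – so every consuming system reads the same, current view of stock.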
The OpenSuperQPlus project is now underway. It will see a major expansion and enhancement of the OpenSuperQ project, bringing together the key partners of the national quantum initiatives of the Netherlands, France, Finland, Germany, Hungary, and Sweden, plus a number of full-stack quantum computing startups and other leaders in the sphere.
Part of the European Quantum Technology Flagship, the consortium for this project predicts that its key use cases in quantum simulation will span the chemical industry, materials science, and critical ML applications.
Only 40% of Australians trust AI at work
A study by the University of Queensland and KPMG Australia has revealed that only 40% of Australians trust AI at work. What’s more, only 43% of Australians believe their employer has practices to support the responsible use of AI.
Ambience Healthcare’s Ambience AutoScribe is a fully automated medical scribe, using state-of-the-art AI technology to address burnout in an overworked US healthcare system.
Technology startup Istari – backed by former Google Chief Executive Officer, Eric Schmidt, and founded by former US Air Force Assistant Secretary, Will Roper – announced plans to create a Matrix-inspired world where technology is created, tested, and even certified through modelling and simulation.
A University of Maryland School of Medicine physician-scientist will head a new, federally funded research programme to develop and test a whole blood product that can initially be used to transfuse wounded soldiers in the field within 30 minutes of injury.
We have a lot of engineers that have a thorough background in industrial design, meaning we have all the knowledge in-house that we need
Edgar Van Essen
Managing Director, Switch Datacenters
It's really important that the entire CoE engages in the transformational vision, and that all those involved share the same beliefs
Nantes Kirsten
Director of Financial Analytics & AI, Centre of Excellence, Bell Finance
A team of US researchers has developed a device that mimics the dynamics of the brain's synapses – the connections between neurons that allow for the transfer of information – a development that could potentially change how we approach memory consolidation in artificial systems and lead to a new era of advanced, energy-efficient AI technology.
This new device was developed by a team at the McKelvey School of Engineering at Washington University in St Louis, led by Professor Shantanu Chakrabartty, and uses quantum tunnelling to create an artificial synapse, providing a much simpler and more energy-efficient connection than previous methods.
The team's research, published in Frontiers in Neuroscience, showed that their artificial synapse could mimic some of the dynamics of biological synapses, allowing AI systems to continuously learn new tasks without forgetting old ones.
81% would be willing to pay more for a game with AI-improved NPCs
84% of gamers agree that current NPCs make a positive difference in gameplay
52% dislike repetitive NPC dialogue
76% want to see NPCs with better situational awareness
In terms of player immersion, AI technology is consistently pushing the boundaries of what’s possible. We take a look back at the history of AI’s deployment in the gaming industry and explore the current leaders in this sphere.
AI and ML pioneer Arthur Samuel devised the Samuel Checkers-Playing Program.
This was one of the first instances of a successful self-learning program in the world, and a clear (albeit very early) insight into the power and future of AI.
The 1980s saw a significant increase in the pace of AI’s deployment in gaming. One of the earliest (and most renowned) examples was PAC-MAN.
The PAC-MAN game broke new ground, using a simple AI model to control its iconic ghosts.
Another key landmark in the decade’s successes was the release of the space game Elite. For the creation of its vast universe, developers used a procedural generation system.
In the 2000s, games used AI in increasingly complex ways. The horror series F.E.A.R. was one of the leaders in this field, using AI systems to make enemy behaviour more sophisticated, unpredictable, and realistic.
The release of Halo represented a major advancement in the deployment of AI in games. This first-person shooter game – which has generated over $5bn in sales to date – is renowned for its highly realistic, responsive, and adaptive AI enemy characters.
The most recent versions of Microsoft Flight Simulator have been making big headlines – both in the gaming world and, indeed, in the wider tech industry.
The simulator has always been perceived as a tech pioneer, and the most recent version uses AI to generate an extremely high-fidelity version of the real world, with copies of more than 1.5bn buildings.
Alongside establishing (and driving the long-term success of) the personal robotics company iRobot, Angle also previously worked at the National Aeronautics and Space Administration’s Jet Propulsion Laboratory. There, he worked on the design of the rovers that, ultimately, led to Sojourner’s 1997 Mars exploration.
In short, Angle is one of the world’s leading influencers in robotics technologies. And, for more than 30 years, he has been at the forefront of this global industry and its extraordinary developments.
Three decades ago, Angle became one of the first-movers in the home robot industry. With the launch of the iconic floor-vacuuming robot, Roomba, his company, iRobot, marked one of the first steps in this transformative sector.
Speaking in an interview with The Verge, Angle explained that the idea for Roomba came from remarkably pressing customer demand.
“The idea came because everyone asked for it! I would literally introduce myself, and people wouldn’t say ‘good to meet you’ or ‘how are you’, but ‘When are you going to clean my floor?’ It was very predictable. And after a while, I would say back, ‘Well how much are you willing to pay?’ Because I didn’t know how to build a vacuum cleaner that was inexpensive.”
At the time, Angle and his team were already working with the cleaning giant S.C. Johnson, on a project to build new cleaning machines.
“And when that project was winding down, there were a few employees – including Joe Jones – who said, ‘We have this idea: I think we finally know enough to build a consumer robot’. And I said, ‘Okay, here are some resources. Show me what you can do’.”
Fast forward three decades and not only has the Roomba sold over 40 million units across the world, but Amazon also agreed to acquire iRobot in 2022 in a deal worth around $1.7bn.
Post-acquisition, Angle remains as iRobot’s CEO. Within this role – which he has held since co-founding the company in 1990 – he manages its strategic direction, and works with some of the world’s leading scientists, builders, and engineers.

As the CEO and Co-Founder of robotics giant iRobot, Angle has had an extraordinary degree of influence over the global personal robotics market

“The Roomba and the Braava are the best known iRobot products and the foundation of our business, but today, with smart homes coming online and connected devices all around us, there is so much more we can do”
Angle describes the company’s longstanding mission as “designing and building toward a world where humans and robots work together for the benefit of all”.
In the wider STEM field, Angle plays a key role in advancing education through robotics technologies. He assisted in the foundation of National Robotics Week in 2010, with the aim that “these events might inspire the next generation of builders, inventors, and problem solvers”.
To this aim, Angle is also the Chairman of the Board for the non-profit organisation Science from Scientists, which places scientists in the classroom and works to further improve STEM literacy amongst young people.
Further exemplifying the positive impact that robotics can have on the world, Angle is also a Board Member and the Executive Chairman of RSE (Robots in Service of the Environment). This organisation develops robots that solve some of our most pressing environmental problems. For instance, the non-profit’s first initiative was the development of an undersea robot, which could help to mitigate the destruction that is being caused by the invasive lionfish species in the western Atlantic.
“After decades in this field, I remain incurably optimistic about the future of people and robotics”
Dr Emilia Molimpakis is the CEO and Co-Founder of Thymia, a pioneering platform that uses video games to make accurate neuropsychological assessments, faster.
“We also take great pride in building a beautiful and engaging product, it is one of the core values of Thymia, so I love seeing every new development there and contributing to the design”
Dr Emilia Molimpakis has been at the helm of Thymia for almost three years now. In her creation and ongoing development of the platform, she utilised her decade-long experience in academia, working as a Neuroscientist and Linguist.
During this period of her career – most of which she spent at University College London – she used the way individuals produce and process language as a biomarker to gain deeper insights into various cognitive processes.
Molimpakis has worked with an extensive variety of conditions, including identifying cognitive decline in Alzheimer’s Disease patients and working towards treatments for depression – all through language use.
Interestingly, during her career Molimpakis has also worked as a Scientific Consultant for video game developers, using her psycholinguistics background to help them create progressively more difficult levels.
Molimpakis’ decision to launch Thymia was rooted in personal experience and seeing first-hand how healthcare systems can’t always cover all bases.
A friend of hers, who was working with a psychiatrist on a regular basis, attempted to take her own life.
This drove Molimpakis to devise a new, AI-led solution. Using her academic background, she worked to create a technology that could successfully identify a cognitive pattern such as this ahead of time.
Her goal – and the continued pursuit of Thymia – is to create an AI application technology that can support and enhance the mental health support systems that are in place.
Today, the company is growing rapidly, and is the subject of extensive interest – from the fields of tech to healthcare and beyond.
Can video games help diagnose, monitor, and treat depression? | Dr Emilia Molimpakis | TEDxLondon
The global mental health apps market was recently valued at $5.2bn
The mental health app market is expected to grow at a staggering 15.9% CAGR
As the CEO of this fast-growing company, Molimpakis explains the multifaceted nature of her role.
“There are two big things for me at this point; one is the intellectual challenge of developing Thymia’s clinical solution.”
“It’s a technically and scientifically complex task which involves multiple scientific disciplines, not just Neuroscience, Psychology and Linguistics (my specialties), but also Computer Vision, ethical AI and multi-modal ML. I love bouncing ideas off of Stefano, my Co-Founder, and other members of the team and seeing these take shape within hours.”
As the company continues to develop this potentially revolutionary application of AI, the experience of the end-user remains at the centre of Thymia’s focus.
“We also take great pride in building a beautiful and engaging product – it is one of the core values of Thymia – so I love seeing every new development there and contributing to the design.”
“The other thing that is quite simply incomparable is seeing how big of an impact this solution can have on the everyday lives of so many people.”
“I always smile every time a patient or other end-user tells us how much they love our games. This really is the reason we created Thymia, and it's also the reason I keep pushing hard every single day to make that day count, even on super challenging days.”
“One thing that is quite simply incomparable is seeing how big of an impact this solution can have on the everyday lives of so many people”
Executives from Appian, AWS, and Xebia share their collaborative efforts and excitement about their partnership in low-code, cloud, and sustainability.
Technology is instrumental to achieving next-level capabilities across industries. But organizations that want to operate sustainably must choose technology that lets them adhere to strong environmental, social, and governance principles.
Appian Corporation, a process automation leader, is a critical piece of the digital transformation and sustainability puzzle. The enterprise-grade Appian Low-Code Platform is built to simplify today’s complex business processes, with process mining, workflow, and automation capabilities.
“By quickly building apps that streamline and automate workflows, organizations are using Appian to make their processes for monitoring and reporting on ESG initiatives faster, simpler, and more effective,” says Meryl Gibbs, Emerging Industries Leader at Appian.
“Both AWS and Appcino are amazing partners of ours,” says Michael Heffner, VP Solutions and Industry Go To Market at Appian. “We have an extremely long legacy engagement with AWS as our trusted, go-to-market partner, and Appcino builds meaningful, business-focused applications on the Appian platform and is amazing in all things ESG.”
As an AWS leader enabling sustainability solutions built on the cloud, Mary Wilson, Global Sustainability Lead at AWS, talks about the partnership with Appian.
“Our objective is to help our customers achieve sustainability goals across their business operations,” says Wilson. “[This means] looking at data availability, meaning access to more data, and enabling actionable insights. Low-code, cloud-enabled technologies will allow organizations to build fast, learn fast, iterate, and continue to improve these insights to drive their sustainability outcomes.”
Tarun Khatri, Co-Founder & Executive Director of Appcino (product part of Xebia), explains just how critical ESG is in the face of digital transformation. “The investment community now considers ESG reporting as a major factor for measuring performance,” says Khatri. The collaboration will continually uncover new insights and provide customers the opportunity to accelerate their ESG goals with speed and security.
WRITTEN BY: İLKHAN ÖZSEVIM
PRODUCED BY: CRAIG KILLINGBACK

Bell is Canada’s largest communications company, providing advanced broadband wireless, TV, Internet, media and business communication services throughout the country. Their purpose is to advance how Canadians connect with each other and the world, enabled by a strategy that builds on their competitive strengths and embraces the new opportunities of the integrated digital future.
Founded in Montréal in 1880, Bell has a long history of connecting Canadians to the people and things that matter. From their earliest days, starting with the telephone, Bell continues to bring generations of Canadians together with the latest technology. Today, Bell is delivering the future to customers with an unmatched infrastructure investment in the best broadband fibre and wireless technologies, including the growing 5G network.
Through Bell for Better, they are investing to create “a better today and a better tomorrow”, by supporting the social and economic prosperity of their communities with a commitment to the highest environmental, social and governance (ESG) standards. This includes the Bell Let’s Talk initiative, which promotes Canadian mental health with national awareness and anti-stigma campaigns like Bell Let’s Talk Day, and significant Bell funding of community care and access, research and workplace leadership initiatives throughout the country.
Beginning their finance transformation journey back in 2019, Bell Finance saw the potential emerging technologies have to fundamentally transform the way that they work and the services that they provide, discerning that data and analytics (D&A) is core to what they do as a finance function. So fundamental, in fact, that they consider D&A and technology adoption to be a cornerstone of their entire transformation, if not the chief driving force behind it.
If you stop to consider what a finance department or an accounting function actually is, you’ll quickly realise that it begins and ends with data and information – and the packaging up of that information to either produce new insights on performance, help develop business forecasts, or to ultimately provide strategic recommendations and services to the organisation as a whole.
“It's really about being able to develop a leading-edge practice by moving away
from a traditionally federated approach, to instead managing data and analytics by channelling it into a centralised team and a centralised system,” says Matthew MacEwen, VP of Finance and lead of the Finance Transformation initiative at Bell, including the Finance Data and Analytics work stream. “This has really helped our finance function to accelerate its transformation and to have a specific focus, ensuring our entire finance community has access to all of the data that they need in order to carry out their day-to-day jobs, and help drive automation of manual processes.”
In 2019, MacEwen was asked by the CFO to help lead their finance transformation initiative, and to help develop their future state roadmap. At that time, the company had started centralising their CoE and MacEwen was – and still is – the leader of that group. In his D&A role, he leads finance transformation, as well as supporting
TITLE: DIRECTOR OF FINANCIAL ANALYTICS COE
INDUSTRY: FINANCE
LOCATION: CANADA
Nantes Kirsten leads the Financial Analytics Centre of Excellence (CoE) at Bell, a specialist team of financial data, strategy and automation experts developing solutions that drive efficiency and are enabling transformation within Bell’s Finance organisation. Kirsten’s involvement at Bell started as a management consultant supporting the formulation of the vision and roadmap for Bell Finance’s D&A transformation program.
Before joining Bell, Kirsten held various hands-on and leadership roles within risk and finance divisions, across Telecommunications, Financial Services and Retail industries.
“It's really important that the entire Centre of Excellence is engaged in the transformational vision, and that all those involved share the same beliefs”
NANTES KIRSTEN DIRECTOR OF FINANCIAL ANALYTICS COE, BELL
The sector is undergoing transformation at a rapid pace driven by changing consumer behaviours and new technologies.
Our team offers a forward-looking portfolio of Audit, Tax, Risk, Financial and Strategic Advisory services backed by a global network of skilled industry professionals positioned to help clients anticipate and exceed evolving expectations.
Dan Krausz on how KPMG is one of Bell’s strategic advisors on its finance transformation journey, operating as an integrated team to realize value
As Partner and Telecommunications Sector
Lead at KPMG in Canada, Dan Krausz has played a key role on Bell’s finance transformation journey over the past few years.
To be successful in the face of huge industry transformation, Krausz explains that one thing is vital: “Telecommunication organizations need access to more data, more frequently, that is well governed. Data is at the core of what finance does, so being able to make informed decisions, fast, is something that finance needs to drive.”
By working with Bell on their finance transformation project from the beginning, KPMG has helped shape the related vision and strategy.
“We started out by helping Bell determine what they wanted to achieve and then translated that into a practical roadmap for implementation,” explains Krausz.
An integral aspect of KPMG’s involvement in the transformation is the combined approach of delivering technology outcomes while bringing a business lens to the table, taking a use case approach.
He adds: “There have been several workstreams that we have been involved in. One that I’m most proud of is intelligent forecasting – we were able to develop AI models which use both internal and external data sources to produce forecasts. These not only improved forecast accuracy, but they also continue to learn and become more accurate over time, and drive efficiency into a process which was previously manual in nature. Another use case was for analytical process automation – significantly reducing the number of manual journal entries as part of the financial close process. This also provided additional granularity from a management reporting perspective given the details were maintained in the data warehouse.”
“A common theme throughout this project with Bell has been our collaboration, working shoulder to shoulder to not only deliver transformation outcomes, but to support the enhancement of Bell’s capabilities, helping to ensure that the teams are well equipped to leverage new technologies,” concludes Krausz.
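As a rough, generic illustration of the intelligent-forecasting pattern Krausz describes – a model trained on both internal and external drivers, re-fit as each period’s actuals arrive so accuracy improves over time – consider the scikit-learn sketch below. It is not Bell’s or KPMG’s implementation; every feature column and figure is invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: each row mixes internal drivers (prior-month revenue,
# subscribers in millions) with an external signal (consumer confidence index).
X = np.array([
    [100.0, 9.1, 55.2],
    [102.5, 9.2, 54.8],
    [101.8, 9.4, 56.1],
    [104.2, 9.5, 57.0],
])
y = np.array([102.5, 101.8, 104.2, 105.9])  # next-month revenue actuals

model = LinearRegression().fit(X, y)
print(model.predict([[105.9, 9.6, 57.4]]))  # forecast for the coming month

# "Continues to learn": when the month closes, append the new actual and
# re-fit, so each forecasting cycle draws on a longer history.
X = np.vstack([X, [105.9, 9.6, 57.4]])
y = np.append(y, 107.1)
model = model.fit(X, y)
```

The re-fit step is the simplest possible stand-in for a model that “becomes more accurate over time”; a production system would typically layer in proper time-series methods, backtesting and drift monitoring.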
TITLE: VICE PRESIDENT OF FINANCE TRANSFORMATION
INDUSTRY: FINANCE
LOCATION: CANADA
Matthew MacEwen is the Vice President of Finance, Customer Experience and Corporate Planning, Transformation and Finance Analytics at Bell Canada. In addition to supporting various Business Units in achieving their strategic, operational and financial objectives, Matthew is responsible for executing Bell Canada’s digital transformation for the Finance function. This includes advancing the function’s capabilities with the development and
some of their finance operation’s business, including business units from a more financial planning and analysis (FP&A) perspective.
The emerging technologies in question – and central to Bell Finance’s transformation – are, of course, artificial intelligence (AI) and the application of machine learning (ML) and predictive analytics within finance processes. These technologies have the potential to provide many organisational benefits, such as providing useful insights to minimise unpredictability, creating sophisticated forecasting capabilities and automating traditionally time-consuming and inefficient processes.
For a finance department, the knock-on effects of such technologies free up resources that can have a sweeping effect throughout the entire structure of an organisation. This is exactly what Bell Finance has set out to do – and they are already seeing the impact of such systems.
Another aspect of this transformation concerns not only technology and data, but also Bell Finance building new capabilities within the actual function, as well as the critical upskilling of their workforce so
MATTHEW MACEWEN VICE PRESIDENT OF FINANCE TRANSFORMATION, BELL
“One of the things that we learned was to take a more agile approach and to work on smaller developments that we iterate on – and then to build out from there – versus trying to do more big-bang developments from the outset”
that they are able to leverage these new technologies and provide services in more sophisticated forms. The dynamic balance between the transformation of processes and that of people is a delicate and ever-present one. These are not just changes aimed at possibilities weeks and months into the future, but incremental shifts that have a fundamental effect on the day-to-day workings of the organisation itself, with the potential to transform organisational processes as a whole. The Finance organisation therefore realises that
this transformation starts with people and people engagement.
“It's really important that the entire CoE engages in the transformational vision, and that all those involved share the same beliefs,” establishes Nantes Kirsten, Director of Financial Analytics & AI, Centre of Excellence.
“I realised early on in my career, as a quantitative risk-management consultant, that I enjoyed data science and the ability to translate data into actionable decisions and insights, and that’s when I decided to move into telecommunications where there’s an abundance of data and untapped insights.
At Bell, as the Director of Finance and Analytics at the CoE, we are trying to help use data to enable transformation and drive better insights – and there are so many others in the team that share this alignment between what we enjoy and the work we get to do,” he says.
The Financial Analytics CoE is focused on delivering the data and analytics component of the Finance 2025 transformation program, seeking to complete its transformation in the next few years; a transformation that will reshape Bell Finance from the bottom up and inevitably evolve the organisation closer to the Finance of the Future.
“I lead the financial data development team,” explains Kirsten, “alongside the solutions development teams that focus on visualisation, reporting automation, and financial data science initiatives like intelligent forecasting; we're currently focused on using the data and technology work-stream within finance 2025 to enable the transformation.”
Asked about the genesis of such a program, MacEwen says: “A program like this starts with leadership support and commitment. We had crucial and very strong support from our senior leadership
MATTHEW MACEWEN VICE PRESIDENT OF FINANCE TRANSFORMATION, BELL
“Beginning with leadership sponsorship, one of the first things we did in the early phases was to focus on people, and to find those within their finance function who had leading-edge skills in some of these technologies and to bring them together, and that really formed the nucleus of our centre of excellence”
team, starting with our CFO all the way through to the actual investment in the Program as well as in bringing people together to provide a different way of working for our finance function.”
Beginning with leadership sponsorship, one of the first things they did in the early phases was to focus on people and to find those within their finance function in possession of leading-edge skills in some of these technologies and to bring them together, “and that really formed the nucleus of our centre of excellence”.
“As we gathered those folks, along with some of the work that they were doing, into a single team, we started to build a team vision – and to create a roadmap that we then started to execute as a part of our longer-term strategy.”
The CoE acts as the brain as well as the executive centre of the finance transformation’s nervous system at Bell, and there are core capabilities and skills that determine how effective it is in bringing about this transformation.
Kirsten proposes that they are categorised broadly in two areas: “There are behavioural capabilities and technical capabilities that are required.”
“On the behavioural side, for any centre of excellence, you need to have an innovative, strategic and critical thinking cap on. It ensures that we keep driving our transformation mandate, but also helps keep our lives interesting. Also on the behavioural side is solid engagement and stakeholder management capability. It's critical that we have a strong stakeholder engagement capability in the CoE so that we continue to keep our finance partners involved, engaged, testing and adopting
the solution, and of course, collaboration is absolutely key.”
“On the technical side (and I don’t think this will come as a surprise), you need to have an intimate understanding of the finance organisation. As we’re a finance CoE, this is different from what you would see in other enterprises – typically a Centre of Excellence with an analytics focus. You naturally need the technical data mining, wrangling and visualisation competence, but the solution development is crisper with a good foundational finance understanding.
“Finally, it’s important that we are technology agnostic. We should be able to pick up a technology, learn how to code in it, use it, and to make sure that our solutions can be developed in any technology, any coding language and focus more on driving value through the use case, not leading with technology.”
The central data design and architecture principles for the Centre of Excellence

In terms of core data design and architecture principles, the Bell Finance CoE began the transformation journey with a keen understanding that the solution component (that is, the use case) was going to be the driving force at the helm.

“Pick an impactful use case and find the data that unlocks it”

NANTES KIRSTEN DIRECTOR OF FINANCIAL ANALYTICS COE, BELL
“Pick an impactful use case and find the data that unlocks it,” says Kirsten. “And that's one of the principles to which we adhere: find something that’s tangible to our clients and that adds value and then make sure we have centralised data that reconciles with upstream accounting systems that can enable development and automation of that use case.”
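That reconciliation discipline – centralised data that provably ties back to the upstream accounting systems – can be pictured with a tiny sketch like the one below. The account codes, balances and tolerance are hypothetical; this illustrates the principle, not Bell’s actual pipeline.

```python
# Reconcile centralised warehouse totals against the upstream accounting
# system, flagging any account whose balances diverge beyond a tolerance.
source_totals = {"4000-revenue": 1_250_300.00, "5000-opex": 830_450.25}
warehouse_totals = {"4000-revenue": 1_250_300.00, "5000-opex": 830_449.75}

TOLERANCE = 0.50  # acceptable rounding difference

def reconcile(source: dict[str, float], warehouse: dict[str, float]) -> list[str]:
    breaks = []
    for account, src_amount in source.items():
        diff = abs(src_amount - warehouse.get(account, 0.0))
        if diff > TOLERANCE:
            breaks.append(f"{account}: off by {diff:,.2f}")
    return breaks

print(reconcile(source_totals, warehouse_totals) or "reconciled")
```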
The other core aspect that they wanted to retain while reaching their transformation goal was “to develop principles that give us a blueprint, to be able to repeat what we've done”.
The use case would then lead to repeatable principles and processes (“recipes”) for the Data Development Lifecycle, and then to solid Data Governance around the solutions and lifecycle.
So, in terms of the data lifecycle, Bell Finance wants to make sure that they have core principles in place to answer the data lifecycle questions, for example how they gather requirements, how they develop and how they engage and train their finance users.
“‘How do we start experimenting and have those data lifecycle components at the core of our CoE initiatives?’ – these were all part and parcel of the principles that we needed to put in place. Then there are support considerations, such as the finance support for the end-users, so we needed to make
sure that we had a process in place for them to ask someone for help. We need our CoE as their first line of support.
“At the core of all of this is easily accessible, curated and centralised data.”

For such a transformation, then, the Centre of Excellence was necessary. This Centre of Excellence then had to be formed with systems in place that would sustain it – so that it could then go about forming the transformational process that would feed back into its own operations, like an organisational form of M.C. Escher’s ‘Drawing Hands’.
Essentially, the point is that the systems in place are open-ended and ever-adaptive to change – a necessary requirement for an effective transformation to take place. But,
of course, such a program is no easy feat, to say the least. As part of the CoE, both MacEwen and Kirsten acknowledge that, early on, one of the challenges that arose stemmed from attempting to tackle too much at once and “biting off more than we could chew”, which they both emphasise.
MacEwen says: “One of the things that we learned was to take a more agile approach and to work on smaller developments that we iterate on – and then to build out from there – versus trying to do more big-bang developments from the outset.”
Kirsten adds: “And you should never underestimate the amount of time testing and training of users will take. The end-users sitting in the finance organisation are so critical to adoption” – the crucial user-adoption aspect of the transformation – “so you want to make sure that they’re bought in and brought in from an early stage and that you allow sufficient time in your program plan for that piece.”
He also underlines that finding the right people – those that are passionate about finance, data and analytics, but also about the vision and practicalities of organisational transformation – is key and that, with such a transformation, “it’s critical to choose the right technology and advisors (internally as well as externally) to ensure that your vision is in line with best practices in the industry and with what best-in-class companies are doing so that, from a solution development and technology point of view, you’re not finding yourself on an island.”
Beyond senior support and the obvious financial investment needed in bringing about such a transformation, partners are invariably vital to such a process.
For their finance transformation, a blend of a strategic-advisory type of partner, as well as technology partners were needed.
“Ultimately,” says MacEwen, “we were looking for partners with a proven track record and experience in the field that had demonstrated capabilities in being able to deliver some of the specific work streams on a roadmap – be it around data or new capabilities with reporting or forecasting – as well as wanting to make sure that they were involved from a technology perspective.”
“We also have a core engine of subject-matter experts that are helping to deliver a lot of the more technical elements of our finance transformation program. A big part of our strategy, though, is to try to scale up as quickly as possible by leveraging the full breadth of our finance function – and this is
a great way to help ensure that we overcome some of those change-management and adoption issues that an organisation can encounter during such a transformation.”
In terms of technology capabilities and partners, Kirsten adds: “And just as ease of use is important, the next consideration is ease of integration.”
“In terms of this side of things, we wanted to pick technology that can speak to –and pull information from – our source
accounting systems, which host critical financial information that we don't share enterprise-wide, as it contains sensitive information relating to our corporate financial reporting. The technology needs to support that vision, but also have native documentation and metadata capability.
“In other words, whether developed by the finance users, or within the CoE - we need to be able to document what has been developed so that we don't build
solutions in a vacuum and just hand it over to finance, or they build solutions and create critical resource risks.
“We also wanted to make sure that the solution isn't just a desktop technology and that you can productionise the workflows that you build onto a server environment – which, again, removes the critical resource risk.
“As you go through this, I think it’s clear that Alteryx was one of those technologies that really supported our transformation journey; therefore, given the criteria, utilising them was a no-brainer as part of our technology kit to enable automation.”

“You need to have an innovative, strategic and critical thinking cap on in a CoE”
Kirsten says that Bell Finance have completed some of their more foundational development, specifically around their finance-governed data warehouse, but adds that it’s an evolution. Now, they’re scaling new capabilities with visualisation and reporting, trying to develop cross-company insights and analytics for their finance community, as well as starting to look to ML and AI to build new capabilities with intelligent forecasting and analytical process automation.
“We're in an accelerated development phase of being able to leverage these new capabilities,” Kirsten says, “where it's now become real for our finance function, and we're seeing the benefits. We're continuing to grow from that foundation that we’ve built, and scaling the solution is the most important piece over the next 18 months.”
MacEwen adds: “The telecommunications industry is currently going through a rapid transformation. When you think about new services emerging in the marketplace, with things like the launch of 5G, the Internet of Things (IoT), Multi-Access Edge Computing (MEC) and Cloud – these are all new types of services that telecoms globally are launching and trying to take advantage of.
“This is going to bring new and interesting ways for the finance organisation to analyse data and provide insights, bringing new capabilities along the way. We think there's going to be lots of interesting ways for our industry to monetise these emerging services, and finance will play an important role in that process.”
WE EXPLORE HOW DATA CENTRES ARE UTILISING AI TECHNOLOGIES TO NOT ONLY IMPROVE CUSTOMER EXPERIENCE, BUT ALSO TO BOOST PROVIDERS’ PROFITABILITY
WRITTEN BY: JOSEPHINE WALBANK

The transformative potential of AI technology is being increasingly recognised by the data centre industry. In fact, it is widely anticipated to be one of the industry’s biggest areas of investment over the next few years.
According to a recent report published by CoreSite – Artificial Intelligence: Charting the Way Forward for AI: 2022 Survey of IT Leaders and Service Providers on AI Deployment – AI usage is expected to accelerate rapidly.
In fact, researchers found that, over the next five years, a staggering 82% of respondents expect their company’s use of AI to increase. What’s more, no respondents anticipated that their company’s use of AI or ML would decrease.
Within the report, two of the leading benefits for service providers of using AI – as selected by respondents – were improving customer experience (CX) and limiting churn (40%), and driving innovation and revenue for new business (37%). So, how can we expect to see those benefits realised in the context of the data centre industry?
In order to successfully accommodate the rise of AI, big data, IoT, robotics and the metaverse, the world’s data centres are being required to expand at a phenomenal pace.
“We need to start replacing traditional computing power with intelligent computing power to handle modern computing scenarios,” urges Liu Jun, Vice President and GM of AI and HPC at Inspur Information.
“With the acceleration of global digital transformation and the development of the digital economy, optimising and upgrading digital infrastructure and building new data centres is not only vital, but inevitable. The digital economy is creating brand-new application scenarios that need more adaptive, efficient, and green data centres.”
Ironically, AI is proving to be a pivotal solution to the challenge that it itself raises. In everything from onboarding talent to improving operational efficiency, AI is advancing data centre capabilities across the board.
“Traditional general-purpose computing will have its place, but intelligent computing power will develop faster, and it will account for an ever-increasing proportion of total computing power in the future. This represents a fundamental change in the data centre concept. It’s now the era of the intelligent data centre,” Jun outlines.

“THE INTELLIGENT DATA CENTRE IS ABOUT BEING SMART AND MAXIMISING COMPUTING RESOURCES; IT’S ALSO ABOUT GOING GREEN”

LIU JUN
VICE PRESIDENT AND GM OF AI AND HPC, INSPUR INFORMATION
The concept of the intelligent data centre represents higher standards of performance and greater efficiency gains, which leads to more opportunities for innovation, better services, and sustainable technologies.
“The core idea of the intelligent data centre is that AI is dramatically more power intensive, requiring infrastructure specifically equipped to handle this high performance. That performance is best achieved with rack-scale architecture, which separates compute, storage, and network resources within a rack to be easily managed by APIs for ideal resource utilisation and infinite scalability,” Jun explains.
“It represents the full integration of hardware, applications, and algorithms working together, and is a complete rethink of how the data centre is operated and managed for maximum efficiency and peak performance.”
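To unpack the rack-scale idea Jun outlines – compute, storage and network sitting in separate pools within the rack, carved up per workload through an API – here is a deliberately toy Python sketch. It is conceptual only, not Inspur’s management stack, and every name and capacity figure is invented.

```python
from dataclasses import dataclass, field

@dataclass
class Rack:
    # Disaggregated pools that are allocated independently per workload.
    cpu_cores: int = 512
    storage_tb: int = 200
    network_gbps: int = 800
    allocations: dict = field(default_factory=dict)

    def allocate(self, workload: str, cores: int, tb: int, gbps: int) -> bool:
        """API-style allocation: grant only if every pool has headroom."""
        if cores <= self.cpu_cores and tb <= self.storage_tb and gbps <= self.network_gbps:
            self.cpu_cores -= cores
            self.storage_tb -= tb
            self.network_gbps -= gbps
            self.allocations[workload] = (cores, tb, gbps)
            return True
        return False

rack = Rack()
rack.allocate("ai-training", cores=256, tb=50, gbps=400)  # compute/network heavy
rack.allocate("cold-archive", cores=8, tb=120, gbps=20)   # storage heavy
print(rack.allocations, rack.cpu_cores, "cores free")
```

The point of the disaggregation is visible in the two calls: one workload drains the compute and network pools, the other the storage pool, and neither strands resources the way fixed server configurations would.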
The CoreSite report demonstrated the exceptional value of deploying AI solutions, in terms of improving the quality, consistency and variety of data centre services, enabling sites to operate more cost efficiently, and thereby significantly boost CX. In fact, the report drew a clear parallel between implementing AI and improving customer retention metrics.
Two of the primary ways in which AI is driving these successes in data centres are fostering sustainability drives, and resolving the threat of the talent shortage.
Although the world’s data demands show no sign of slowing, data centres are coming under increasing scrutiny. It is no longer sufficient to continue meeting demand – providers need to do so in a way that is sustainable. Otherwise, their customer retention figures will quickly plummet.

“TALENT IS ONE OF THE BIGGEST STRUGGLES FOR DATA CENTRES, WHETHER THAT INVOLVES ATTRACTING NEW TEAM MEMBERS OR RETAINING EXISTING COLLEAGUES”

MICK LANE
GLOBAL TECHNOLOGY SOLUTIONS MANAGER, CBRE
That’s where AI comes in.
“The intelligent data centre is about being smart and maximising computing resources; it’s also about going green,” says Jun.
“It provides high-quality and high-efficiency computing power that is proportionally much higher than traditional data centres, but also focuses on reducing carbon emissions and reducing energy consumption. It’s the balance needed for our digital future and our digital economy.”
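The efficiency comparison Jun draws is usually quantified with power usage effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment, with 1.0 as the theoretical ideal. A quick worked example follows; the kWh figures below are invented for illustration.

```python
# PUE = total facility energy / IT equipment energy; 1.0 is the ideal.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

legacy = pue(total_facility_kwh=1_800, it_equipment_kwh=1_000)       # 1.80
intelligent = pue(total_facility_kwh=1_250, it_equipment_kwh=1_000)  # 1.25
print(f"legacy: {legacy:.2f}, intelligent: {intelligent:.2f}")
```

On these example numbers, the legacy facility spends 80% overhead on cooling and power distribution for every unit of useful computing, against the intelligent facility’s 25% – the kind of gap that shows up directly in both carbon emissions and operating costs.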
According to research undertaken by Equinix, 62% of global IT decision-makers see a shortage of personnel with IT skills as one of the primary threats to their business.
“Talent is one of the biggest struggles for data centres, whether that involves attracting new team members or retaining existing colleagues,” explains Mick Lane, the Global Technology Solutions Manager at CBRE.
As such, the entire industry is eagerly anticipating the advancement of AI recruitment technologies.
And, beyond simply making manual tasks like CV scanning more efficient, AI is being deployed in data centre recruitment in extremely sophisticated, pioneering ways.
Taking one example of CBRE’s approach, Lane explains how the company deploys AI in data centre training and upskilling, to improve staff skills, reduce churn, and so keep the customer’s experience as positive as possible.
“CCAM training allows us to meet the demand for intelligent facilities by continuously assessing our people’s competence and confidence in relevant technologies and technical disciplines and to close any gaps identified within the engineering workforce via comprehensive training programmes,” Lane explains.
“Using AI to train our teams within data centres allows us to embrace new, advancing technologies and ensure our talent is prepared for the next generation of tech and remains ‘right skilled’. In adopting this approach, CBRE is helping ensure that our staff remain continuously employable in a fast-changing market.”
“WITH THE ACCELERATION OF GLOBAL DIGITAL TRANSFORMATION AND THE DEVELOPMENT OF THE DIGITAL ECONOMY, OPTIMISING AND UPGRADING DIGITAL INFRASTRUCTURE AND BUILDING NEW DATA CENTRES IS NOT ONLY VITAL, BUT INEVITABLE”
LIU JUN VICE PRESIDENT AND GM OF AI AND HPC, INSPUR INFORMATION
When it comes to public opinion, data centres are among the industries that face the biggest of uphill battles. The sector – and not always without good reason – has long been perceived as the antagonist of the world’s sustainability targets. And now, this long-standing reputation is proving a major obstacle to the growth of the sector.
Despite the fact that the data centre industry is full of pioneering sustainability advocates, each of whom is developing sophisticated, future-proof, green solutions, it’s still a challenge to reverse the state of public opinion.
So, how can companies that are leading the charge towards a greener future establish the right reputation in their marketplace and successfully demonstrate their sustainability commitment to the public?
In an exclusive interview with Edgar Van Essen, Managing Director of Switch Datacenters, we learned about how the company is deploying unrivalled green initiatives and ensuring those efforts don’t go unnoticed.
TITLE: MANAGING DIRECTOR
COMPANY: SWITCH DATACENTERS
Van Essen joined Switch Datacenters five years ago as Managing Director and – together with Gregor Snip (CEO & Co-Founder) – is responsible for the strategic, daily growth of the company.
Before joining Switch Datacenters, Van Essen was an Executive Vice President leading a large region for a renowned Swiss high tech company.
Van Essen has a long and successful track record in building out digital infrastructure markets in EMEA, and has been very successful in implementing new growth strategies across Europe. He holds a Master’s degree in Business from Rijksuniversiteit Groningen, and is driven by innovation and smart new business models.
Switch Datacenters is an Amsterdam-based data centre provider that has been at the forefront of the region’s sustainability focus for the last 15 years.

“Our role in the Dutch data centre market is to be the challenger. We’re coming from a relatively small position in retail and, over the last three years, have successfully moved into large hyperscale and wholesale site development. We’re not known to everybody, and we like to keep it that way,” explains Van Essen.

“We are not shouting from the rooftops what we do – that doesn’t help us in our plans to get the right locations and the right new sites in Amsterdam. So we have a deliberately low profile, but in Amsterdam, we know every street, and every potential building that we could turn into a data centre.”

Every year, Switch develops new data centres and adds to its expanding portfolio.

“So we have more than 120MW of data centre capacity in development in Amsterdam. But, when we have bigger customers on board, we might also follow them abroad,” Van Essen adds.

The company was founded to address what they saw as a gap in the market and bring a new approach to the industry.
“Our role in the Dutch data centre market is to be the challenger and innovator”
EDGAR VAN ESSEN MANAGING DIRECTOR, SWITCH DATACENTERS
Schneider Electric’s Thierry Chamayou explains why sustainability is a strategic imperative for the data centre industry
THIERRY CHAMAYOU, VICE PRESIDENT CLOUD & SERVICE PROVIDERS EMEA, SCHNEIDER ELECTRIC

“Within the last few years sustainability has moved from a ‘nice to have’ to one of the top three procurement considerations for end-users and operators,” says Schneider Electric’s Vice President, Cloud & Service Providers, EMEA, Thierry Chamayou.
Schneider Electric, which has signed the 17 science-based targets that form the foundation of the United Nations’ sustainable development goals, has made sustainability a fundamental focus of its operations, positioning itself as one of the world’s leading authorities on net zero. Today the company develops technologies for several critical sectors, including buildings, grids, industrial manufacturing and, of course, data
centres. It is here that Chamayou believes data centres are leading the charge and demonstrating that energy-intensive industries can be a key enabler for decarbonisation.
Sustainability has indeed become a strategic imperative for data centres and, for businesses embarking on this journey, Schneider Electric is leading by example – establishing new innovations and investing significant amounts of revenue in research and development. Chamayou tells us that the company is not only helping organisations improve efficiency and reduce emissions, but is helping businesses to establish strategies that will enable long-term, sustainable change.
A key example is its position as a leader in the Power Purchase Agreements (PPA) market, where it was recently ranked No. 1 for its NEO Network and Zeigo platforms. These acquisitions have enabled Schneider Electric to simplify the buying process by connecting customers with trusted experts, and offering exclusive market intelligence to accelerate decision making.
Chamayou tells us that there is “no trade-off”: sustainability is not just good for the planet, it has now become a central focus for organisations both in the industry and outside of it. Many businesses, for example, are becoming more climate-conscious and, as such, making significant investments in Greentech to futureproof and safeguard their operations.
In the data centre sector specifically, sustainability has been driven by end-user requirements, pushing operators to measure and prove their environmental impact in a multitude of ways. “One way in which the company is helping here is through its industry-first sustainability framework,” says Chamayou, “helping operators to measure their impact through five key areas – energy, greenhouse gas (GHG) emissions, water, waste, and land and biodiversity.”
For Schneider, these are vital, as they give data centre organisations fixed and quantifiable metrics for them to measure their progress towards improved sustainability standards.
“Ultimately, creating sustainable change comes down to setting a bold and actionable strategy,” continues Chamayou, “but no two strategies are entirely alike. Each customer will define its ambition in terms of climate impact, and we take a data-driven approach to help them drive change. To do that, we use our global platform with suites of different software, called EcoStruxure™.”
One example is EcoStruxure’s Resource Advisor, which enables customers to unlock greater optimisation, while giving them access to the analytics and reporting tools that are critical to the first phase of improving sustainability. For data centres in particular, this level of transparency and measurability is invaluable.
Interestingly, Schneider Electric’s efforts within the space coincide with the publication of several of its recent research papers. One example is ‘Sustainability at the Edge’, exploring the gap between enterprise plans and edge sustainability programs. One of the report’s key findings was the revelation of a ‘perception-versus-reality dilemma’ across much of the industry.
According to the report, “the maturity evaluations of nearly half of respondents (48%) did not match a previous answer”, as many enterprise organisations believe their sustainability programmes are more advanced than they are in reality. What’s more, the paper found that 73% of organisations surveyed ranked sustainability as their second-most important business priority. Critically, only 33% say they have created a strategic sustainability plan.
Chamayou tells us that having clear, definable metrics that data centres can use to shape their sustainability strategies will be an immense aid. Through its framework, technologies and expertise, Schneider Electric is helping the industry to create data-driven strategies, where success and progress are measurable.
coincidence, because Gregor (CEO) and his brother were running the first hosting company in the Netherlands.
“They felt really mistreated on pricing, flexibility and customisation. So they decided, typically entrepreneurially, to just build a data centre themselves. They had no clue what it was, but they did that, improved their designs over the years and, actually, they succeeded in doing so.”
From there, Gregor and his brother spent a lot of time optimising their technology and innovating in the data centre space.
“They achieved this not just by buying everything that was on catalogue, but by actually thinking for themselves what they needed and reinventing smarter and better
“We have totally forgotten to understand that we have to actually adapt our language to the language of the public and give something back to them”
EDGAR VAN ESSEN MANAGING DIRECTOR, SWITCH DATACENTERS
solutions. In the end, this led to very efficient solutions that we developed ourselves and in-depth knowledge on cooling systems, modular design, construction and energy efficiency.
“Today, Switch still has a natural focus on independent development. We have a lot of engineers that have a thorough background in industrial design, meaning we have all the knowledge in-house that we need.”
30MW – Switch Datacenters currently operates around 30MW of IT data centre capacity
“At this point in time, we’re one of the best at finding and developing new plots – and we do this in a totally different way to a lot of our competitors. The reason we are good at this is because we focus on Amsterdam, and we know the Amsterdam market inside-out. We know the language, we know the politics, we know how to get the power, and we know the right areas for new locations.”
And now, after almost two decades spent establishing its presence in the Dutch market, Switch’s next goal is to utilise its learnings, grow even bigger in Amsterdam and perhaps strategically expand across the continent.
“So we decided, ‘Let's first build a strong base in Amsterdam and, once we have that base and we have those customers on board, we will replicate that model in other countries’.”
Switch Datacenters was founded in 1998 and is one of the most sustainable data center operators and developers in Amsterdam. Today it owns three data centers in the Netherlands, and provides premium hosting, colocation and connectivity services for cloud, retail, and government organizations.
Sustainability is embedded within Switch Datacenters’ DNA, and it believes that data centers can be reliable, affordable, innovative and sustainable, all at the same time. Its location in Amsterdam also places it within a global trade hub, while its position within the FLAP-D market puts it in one of the world’s most competitive regions for data centers and cloud providers. Sustainability and innovation have therefore become key differentiators for Switch Datacenters, and are vital to its leadership position.
For more than a decade, Switch Datacenters and Schneider Electric have maintained a long-term, strategic partnership to build an efficient, adaptive and resilient data center platform. Across its portfolio, Switch Datacenters has worked with Schneider Electric to design and build its data centers, and today it uses turnkey solutions from Schneider Electric’s EcoStruxure™ for Data Centers architecture to deliver market-leading services.
Switch Datacenters designed its facilities primarily to minimize its CO2 emissions and environmental impact, and it was the first data center operator in Europe to reduce its Power Usage Effectiveness (PUE) to 1.1 by using revolutionary cooling concepts and innovative liquid cooling solutions.
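For context, Power Usage Effectiveness is the ratio of a facility’s total energy draw to the energy consumed by its IT equipment alone, so a PUE of 1.1 means only 10% overhead for cooling, power conversion and everything else. A worked example, with illustrative figures:

```latex
% PUE: total facility energy divided by IT equipment energy
\mathrm{PUE} = \frac{E_{\mathrm{total\,facility}}}{E_{\mathrm{IT\,equipment}}}
\qquad \text{e.g. } \mathrm{PUE} = \frac{11\,\mathrm{MWh}}{10\,\mathrm{MWh}} = 1.1
```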
From a power and resilience perspective, Switch Datacenters uses Schneider Electric’s Galaxy™ VX UPS with Lithium-Ion batteries, which offers up to 99% energy efficiency using Schneider Electric’s patented eConversion technology, alongside critical powertrain solutions including its Busbar and Medium Voltage (MV) panel technologies.
The collaboration between Switch Datacenters and Schneider Electric has resulted in an initial 30% cost saving and 25% greater energy efficiency, enabling Switch Datacenters to meet today’s demands for industry-leading uptime and sustainability, and to provision for a sustainable future.
According to Van Essen, the Amsterdam market is a particularly difficult one for data centres.
“The main challenges in the Amsterdam market are mainly about sustainability and getting power and permits. There's a lot of political pressure in this regard, and I think it all comes back to the public not understanding what data centres are about,” Van Essen explains.
Continuing in this vein, he asserts that one of the industry’s key flaws lies in the fact that its leaders and innovators talk exclusively in technical language. This mode of communication creates a big gap: politicians aren’t engineers, and the general public are unable to follow these explanations. As a result, those outside the
EDGAR VAN ESSEN MANAGING DIRECTOR, SWITCH DATACENTERS
“This optimisation game – data centre 1.0, as I call it – was very technology-oriented and not at all society-aware. That is actually what caused a lot of frustration in Amsterdam with the policymakers and the public”
Everyone wants clean air, to live in a nice neighbourhood, to work in a good workplace and to be able to travel safely from A to B. By making things better, more sustainable and smarter, Heijmans is creating that healthy living environment. We are a stock exchange-listed company that combines activities in property development, building & technology and infrastructure. In addition to this, we work safely and we add value to the places where we are active. This is how we build the spatial contours of tomorrow together with our clients: www.heijmans.nl/en/
industry are becoming increasingly alienated from – and therefore mistrusting of – it.
“We have totally forgotten that we have to actually adapt our language to the language of the public and, when using scarce resources like land, power etc, give something back to the public,” Van Essen explains. “I also think a lot went wrong there, specifically, after a lot of foreign capital came to Amsterdam and adopted a one-size-fits-all push out of their US headquarters.”
For many global enterprises, Van Essen says, a ‘copy-paste’ approach across all their new locations is common. However, this can be a serious hindrance to efficiency, highly detrimental to brand reputation, and a cause of slow innovation.
“This optimisation game – data centre 1.0, as I call it – was very technology-oriented and not at all society-aware.
That is actually what caused a lot of frustration in Amsterdam with both policymakers and the public, and it is now also becoming a broader European topic.”
“Amsterdam was one of the first cities that started to block the growth of data centres because, actually, they were consuming too much space and power, which were meant for other things. In the end, the industry was very much the root cause of creating this block, and you see the same happening now in Frankfurt, and even in London and Dublin.”
Van Essen says that, somewhat unexpectedly, Amsterdam’s long-standing leadership of this trend towards high sustainability demands has proven to be a major advantage of developing solutions there.
“Once you have a working solution in Amsterdam, the chance that you can copy-paste it into Frankfurt, Dublin and the others is pretty high. If you start the other way around – develop something in London and then try to get into Amsterdam – the likelihood that you will fail is pretty big,” Van Essen asserts.
“Amsterdam was one of the first cities that started to block the growth of data centres because, actually, they were consuming too much space and power”
EDGAR VAN ESSEN MANAGING DIRECTOR, SWITCH DATACENTERS
One of the biggest challenges for multi-tenant data centres (MTDCs) and their cloud-scale tenants is that they must think about infrastructure requirements in terms of workloads, rather than in space and power.
To meet future scalability and bandwidth demands, Cloud service providers must have the ability to rapidly deploy any type of workload at any required network speed, at any time, and at any location.
As a result, we see cloud-type data centers rapidly evolving as data speeds, power usage per rack and infrastructure complexity increase.
This requires smarter-than-ever data center designs: from smart power and cooling techniques to smart, future-ready, high-speed fiber infrastructure designs – from the entrance room in the grey space to and between the equipment and storage racks in the white space.
The objectives of a fiber infrastructure design are comparable to those of designing a highway: to meet current and future traffic demands, it needs to be safe and efficient, and to allow for the fast movement of traffic. Beyond that, overall cost, maintenance, sustainability and planning for anticipated future traffic must all be considered.
In an ideal world, MTDCs and their cloud-scale tenants would like a fiber infrastructure that can easily be migrated to higher speeds in the most sustainable way and at the lowest CAPEX and OPEX imaginable. Whether you run a two-fiber 100G backbone today and want to migrate to an 8- or 16-fiber 400G or 800G backbone tomorrow, or are already thinking about a 1.6T backbone, the solution is already there.
Interested in hearing more about a solution that offers ultra-low-loss connectivity, is highly modular and lightweight, allows for one-person installation, is sustainable, and drives down cost while improving efficiency? Let us know.
Contact
Bas Mondria, Sales Manager EMEA – bas.mondria@commscope.com
Dick Philips, Sr. Manager Global Cloud Accounts EMEA – dick.philips@commscope.com
“So, it’s all about understanding what is the best spot to start developing a new data centre formula. And we are 100% convinced that Amsterdam is the perfect ground for that. Not only due to the size of the market, but also the political environment we are in.”
Switch Datacenters has more than 120MW of data centre capacity in development in Amsterdam
Communicating sustainability – how to get the public on board
So, the twofold challenge becomes implementing future-proof sustainable solutions, while also communicating those efforts to clients and the public. As Van Essen states, preserving the industry requires nothing short of a new reputation.
“What we really now need to see happening, very quickly, is for the hyperscalers to start adapting their operational data centre technology models to actually become much more green by nature,” he urges.
“If they still build new, huge, land-consuming data centres designed on air cooling, the situation will never change. So we really call on the hyperscalers and the forward-looking companies to start implementing alternative cooling technologies and to start looking at site redevelopment, rather than building new sites from scratch.”
Despite being a relatively small industry player, Switch is renowned for being a major presence on the sustainability stage and a key voice in these global discussions.
“We have been a very active member of all the forward-looking innovation committees in the global industry for many years, and we try to contribute there as much as we can. We’ve been very active in the Open Compute Project, for instance; we’re also one of the founding members of the Sustainable Digital Infrastructure Alliance and a member of the Climate Neutral Data Centre Pact.”
To help drive innovation further in the sector, Switch even opened up one of its Amsterdam data centres as a test bed for new sustainable technologies.
“We said, ‘If you guys want to see how this works and try it out, come to us, and we will actually help you to develop your solutions’. Already, we have quite a number of forward-looking companies coming to us because they want to do something with liquid cooling, reusing heat or using refurbished equipment.”
“They naturally come to us now: we understand their way of thinking, and we actually facilitate their division models. And that's
“It's all about understanding what is the best spot to start developing a new data centre formula. And we are 100% convinced that Amsterdam is the perfect ground for that”
EDGAR VAN ESSEN MANAGING DIRECTOR, SWITCH DATACENTERS
totally different if you go to the big names – we take the opposite approach to them. We say, ‘Please come to us and, jointly, we will find the next level’.”
For instance, experimenting with reusing server heat is a key part of those activities. For Switch, reusing the heat generated by its servers is an essential part both of future-proofing tech design and of preparing its customers for new business models and future sustainability legislation.
Switch’s long-term ambition is that data centres will eventually become heating
plants for district heating, thereby helping cities, policy makers and utility providers to move away from the current fossil fuel heat plants much more quickly.
“Data centres will no longer be seen as enemies of the people, and will contribute to wider society in a much smarter way. And that's what we call data centre 2.0.”
It is through these market differentiators that Switch has been able to establish a unique, immensely strong relationship with its network of partners.
“Our partners see that we're the odd one out – the new kids on the block, in a certain way. They’re also seeing that we're growing quite substantially. So, now, we are really on a lot of innovation calls with really big suppliers,” Van Essen explains.
“They start to understand what we’re doing and why we’re doing it, and they see the value of our approach, which actually brings us into very strong strategic alliances. Despite the fact that we don’t yet have the volumes of the really large data centres, we still get the same level of board attention, because we bring much more to the table than just volume.”
In this way, Switch is adopting a multifaceted, intuitive approach to sustainability, bringing all the required elements together, while successfully showing the public that data centres can have a place in our greener future.
“It's all about understanding where the public opinion is heading to understand new things around the corner that will impact your business. It is absolutely not technology alone that defines our future. It's much more about understanding society and contributing to society – and that's where we differ from the rest.”
AI and video games have always gone hand-in-hand. But now, generative AI has the potential to completely disrupt and transform an entire industry
WRITTEN BY: MARCUS LAW
An integral part of video games since their inception in the 1950s, AI in gaming is nothing new.
Twenty years before Atari released Pong, AI was being used in games. One of the earliest examples is the 1951 mathematical strategy game Nim, in which two players take turns removing objects from piles; the computer was able to beat human players regularly. In the same year, the Ferranti Mark 1 machine at the University of Manchester was used to run a chess-playing programme, one of the first computer programmes ever written.
Today, generative AI tools have the potential to transform an industry which, according to a report by PwC, is expected to be worth as much as $321 billion by 2026.
Highlighted by the rising popularity of tools such as ChatGPT, generative AI has the potential to revolutionise numerous industries, including the video game industry. Video games have already come a long way since their inception, with improved graphics, sound, and gameplay mechanics, but generative AI could take them to the next level.
One of the most significant ways that generative AI can transform the video game industry is through procedural content generation: using algorithms to create game content – such as levels, maps, environments and characters – automatically. This can save game developers a significant amount of time and resources, while also creating a more varied and unique gameplay experience for players.
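As a minimal illustration of the idea – a deliberately simple sketch, not any studio’s actual pipeline – the Python snippet below generates a tile map from a random seed, so the same seed always reproduces the same level, which is the basic mechanism behind procedurally generated content:

```python
import random

def generate_level(seed: int, width: int = 16, height: int = 8,
                   wall_chance: float = 0.25) -> list[str]:
    """Generate a simple tile map: '#' for walls, '.' for walkable floor."""
    rng = random.Random(seed)   # seeding makes generation reproducible
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            if x in (0, width - 1) or y in (0, height - 1):
                row += "#"                    # solid border wall
            elif rng.random() < wall_chance:
                row += "#"                    # random interior obstacle
            else:
                row += "."                    # walkable floor
        rows.append(row)
    return rows

for line in generate_level(seed=42):
    print(line)   # the same seed always yields the same level
```

Real engines layer far more sophisticated rules on top, but the underlying principle – varied, reproducible content from a compact seed – is the same.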
For the 2020 release of its pioneering flight simulator, Microsoft partnered with the
Austrian startup Blackshark.ai, training an AI to generate a photorealistic 3D world from 2D satellite images.
Blackshark’s AI-driven technology enabled Microsoft’s Flight Simulator to display the surface of the entire planet in 3D – with over 1.5 billion photorealistic buildings – giving users an unprecedented immersive 3D flight experience and the largest open world in the history of video games.
With a team of 50+ AI specialists, geospatial engineers, data scientists, and real-time rendering developers, Blackshark developed a unique solution that uses the Microsoft Azure Cloud and AI to gain insights about our planet based on Bing Maps data.
“We have reconstructed approximately 1.5 billion buildings and detected over 30 million square kilometres of vegetation”, explained Stefan Habenschuss, Head of Blackshark’s Machine Learning Group, who described the game as ‘a demonstration of the power of AI’.
“In Flight Simulator, we look at 2D areas and then find footprints of buildings, which is actually a computer vision task,” said Blackshark Co-Founder and CEO Michael Putz, in a recent interview with TechCrunch. “But if a building is obstructed by a shadow
The global gaming industry could be worth US$321bn by 2026, according to PwC’s Global Entertainment and Media Outlook 2022-26.
The expansion is being driven by social and casual gaming, after millions of people picked up their controllers to escape the boredom and isolation of the COVID-19 lockdowns.
“People were looking for ways to both entertain themselves and maintain their social connections,” says Bartosz Skwarczek, Co-Founder and CEO of online gaming marketplace G2A.com.
“Gaming has so often been painted with the wrong brush – stereotyped as being isolating and unsociable. However, the pandemic has shown this could not be further from the truth.”
“We have reconstructed approximately 1.5 billion buildings and detected over 30 million square kilometres of vegetation”
STEFAN HABENSCHUSS HEAD OF MACHINE LEARNING GROUP, BLACKSHARK.AI
Generative AI can be used by the video game industry in a number of ways, including:
1. Procedural content generation: Generative AI can be used to create game content, such as levels, environments, characters and items, automatically. This can save time and resources for game developers and offer players a unique and varied gameplay experience.
2. NPC behaviour: Generative AI can be used to create non-playable characters (NPCs) with more realistic and varied behaviours. NPCs can learn and adapt to player actions, making the game feel more immersive.
3. Dialogue generation: Generative AI can be used to generate dialogue for NPCs, making them feel more like real characters with unique personalities and responses.
4. Game balancing: Generative AI can be used to balance game mechanics, such as difficulty levels and rewards, based on player behaviour and feedback (see the sketch after this list).
5. Player modelling: Generative AI can be used to model player behaviour and preferences, allowing the game to adapt to each player's individual playstyle.
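To make the game-balancing idea in point four concrete, here is a minimal, hypothetical sketch in Python – not any engine’s real API, just the core feedback loop of nudging difficulty towards a target win rate:

```python
from collections import deque

class DifficultyBalancer:
    """Adapt difficulty towards a target player win rate."""

    def __init__(self, target_win_rate: float = 0.5, window: int = 20):
        self.target = target_win_rate
        self.results = deque(maxlen=window)   # rolling record of recent outcomes
        self.difficulty = 1.0                 # 1.0 = baseline difficulty

    def record(self, player_won: bool) -> float:
        self.results.append(player_won)
        win_rate = sum(self.results) / len(self.results)
        # Winning too often -> make it harder; losing too often -> ease off.
        self.difficulty *= 1.05 if win_rate > self.target else 0.95
        self.difficulty = min(max(self.difficulty, 0.25), 4.0)  # clamp the range
        return self.difficulty

balancer = DifficultyBalancer()
for outcome in [True, True, True, False, True]:   # recent match results
    multiplier = balancer.record(outcome)
print(f"difficulty multiplier is now {multiplier:.2f}")
```

A production system would feed richer signals – completion times, retries, telemetry – into a learned model, but the principle of closing the loop on player feedback is the same.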
of a tree, we actually need machine learning because then it’s not clear anymore what is part of the building and what is not, because of the overlap of the shadow — but then, machine learning completes the remaining part of the building.”
Generative AI is an increasingly popular topic in gaming. Earlier this year, Roblox said it was testing a tool that could accelerate the process of building and altering in-game objects by getting AI to write the code. The tool lets anyone playing Roblox create items such as buildings, terrain, and avatars; change the appearance and behaviour of those things; and give them new interactive properties by typing what they want to achieve in natural language, rather than complex code.
In a blog post, James Gwertzman and Jack Soslow of venture capital firm Andreessen Horowitz explained how utilising generative AI will provide substantial benefits for developers.
“When talking to game developers who are experimenting with integrating generative AI into their production pipeline, the greatest excitement is over the dramatic reduction in time and cost,” they write. “One developer has told us that their time to generate concept art for a single image, start to finish, has dropped down from three weeks to a single hour: a 120-to-1 reduction. We believe similar savings will be possible across the entire production pipeline.”
However, despite these cost savings, they say that artists themselves are not in danger of being replaced by AI.
“One developer has told us that their time to generate concept art for a single image, start to finish, has dropped down from three weeks to a single hour”
JAMES GWERTZMAN & JACK SOSLOW, ANDREESSEN HOROWITZ
“For game developers, generative AI poses two core issues: who owns the content generated by AI, and does training data used without the consent of the data owners violate their copyright?”
MAGDALENE BEDI PARTNER, PILLAR LEGAL
“To be clear, artists are not in danger of being replaced,” they write. “It does mean that artists no longer need to do all the work themselves: they can now set initial creative direction, then hand off much of the time-consuming and technical execution to an AI. In this, they are like cel painters from the early days of hand-drawn animation, in which highly skilled “inkers” drew the outlines of animation, and then armies of lower-cost “painters” would do the time-consuming work of painting the animation cels, filling in the lines.”
One of the crucial legal issues surrounding generative AI is the ownership of intellectual property rights. The ability of AI systems to create new works without human input raises questions about who owns the rights to these creations, and how to protect them from infringement.
“Although there are present, emerging and potential applications for generative AI as a tool for creatives, some AI generators have been met by confusion and anger from artist communities, due to uncertainty about how generative AI impacts creators’ rights with respect to training data and generated content,” explains Magdalene Bedi, a partner at legal firm Pillar Legal. “For game developers, generative AI poses two core issues: who owns the content generated by AI, and does training data used without the consent of the data owners violate their copyright?
“The first core issue presented by generative AI is that of ownership: who owns the copyright to content produced by generative AI? If ownership isn’t clear, or if content generated by AI is public domain, then game developers will be limited in their ability to protect that content against infringement by others.
“The second core issue presented by generative AI is whether using data to train generative AI models without the data owners’ permission constitutes copyright infringement. Such copyright infringement may occur when training data is input into generative AI models, and in the subsequent content produced by such generative AI models.”
So, although AI promises nothing short of an industry revolution, before these models can be deployed at scale across the gaming sphere, these complex legal hurdles still need to be overcome.
SAP is a global software provider and a leader in enterprise business process software, including solutions to manage supply chains. SAP provides technologies supporting cloud and cloud platform environments, as well as artificial intelligence/machine learning (AI/ML) libraries, robotic process automation (RPA) and in-memory technology for high-end computing. SAP’s solutions for manufacturing execution and insights are part of a portfolio of products for supply chain management, and leverage these technologies.
“We’re an enterprise business software and technologies company,” says Sam Castro, Senior Director, Solution Management, LoB Digital Manufacturing.
Castro is a Senior Director at SAP and a part of the line of business manufacturing solution management team. The line of business covers the 27 manufacturing industries for which SAP provides software solutions.
“All of those industrial companies have needs around operations visibility, control and reporting,” Castro explains. “The different industries have different targets that they're after. Some are heavier on the asset side, some of them are heavier on product quality and yields, others are all about logistics and moving products around on-time through the supply chain.”
SAP is met with a diverse set of requirements and needs from its customers. Solution management takes these industry needs and applies them to market direction and invests them in the portfolio.
“We provide guidance on where to focus and the emphasis for development, and that strategy big picture where we want to take the products,” Castro explains.
In college, Castro completed a Bachelor's in computer engineering and a Master's in computer science at the Rochester Institute of Technology (RIT).
“I came from the hardware bridge to the software bridge very naturally after graduating,” says Castro. “I was dropped into the manufacturing floor because that is exactly where the hardware automation side bridges over into the software.”
He was faced with a great deal of raw data and digital signals from the automation layer, and was tasked with turning it into information — how does SAP make that translation?
“I started at the very lowest level and moved my way through Lighthammer Software, which was acquired by SAP back in July 2005,” says Castro. “I worked my way through SAP into the role that I'm in today.”
“Being a sustainable enterprise means that you're an efficient enterprise”
SAM CASTRO SENIOR DIRECTOR, SOLUTION MANAGEMENT, LOB DIGITAL MANUFACTURING, SAP
Sam Castro is Senior Director of Solution Management, LoB Digital Manufacturing at SAP
Reap the benefits of SAP’s shopfloor manufacturing execution system (MES) for your overall manufacturing process.
Enabling a new level of continuous production improvement, from raw materials to finished goods, while achieving Industry 4.0 benefits:
• Increased product quality
• Increased profitability
• Waste reduction
• Better staff utilization
• Easier regulatory compliance
• Increased customer satisfaction
Reach out to us for a consultation on how Industry 4.0 and Digital Transformation can help you achieve the “Factory of the Future”.
When you talk about risk resilience at SAP, it’s about how to handle the real world, not setting up a plan and adhering to it day in and day out.
“You would like it to be like clockwork, for sure,” says Castro. “Where everything always aligns and meshes the way that it's supposed to all the time, every second. But we know that's not always the case.”
Weather events, pandemics, labour shortages or large sporting events can cause supply chain issues. For Castro, resiliency is the byproduct of having to handle these unplanned, out-of-sync scenarios: the ability to detect that you’re out of sync with the original plan and to react to it in a coordinated manner.
“The faster you can do that, the faster you can correct that problem,” says Castro. “Then you’re able to identify how often those
deviations occur — that frequency of occurrence, that is your opportunity.”
Being able to quantify that opportunity and understand what those little deviations actually add up to, and how that impacts the business financially, is one of the key topics around what customers will hear about resiliency from SAP, says Castro.
“Sustainability is an overlay to that, sustainability is a byproduct of efficiency,” says Castro. “Being a sustainable enterprise means that you're an efficient enterprise. If things are running effectively, things are running safely, and in a very energy-friendly manner as well.”
Castro views the impact of the cloud on manufacturing as a positive one.
“There are benefits for the IT team from a maintenance perspective and a continuous update and management of that software package,” he explains.
Cloud users are not dealing with out-of-sync or outdated documentation, and they’re not dealing with security issues that creep into the environment over time. Updates and patches are handled in real time by the cloud hosting and software provider – the SaaS provider in the cloud environment. Castro views offloading that burden from the manufacturing layer, and from the IT teams that support it centrally and locally, as a big deal for organisations and businesses.
“It keeps that barrier to entry for managing efficient production and tracking off of those teams, and it puts it
“Here are the enablers of AI- and ML-type algorithms that you can use and put together how you see fit”
SAM CASTRO SENIOR DIRECTOR, SOLUTION MANAGEMENT, LOB DIGITAL MANUFACTURING, SAP
firmly on the shoulders of the software provider. What does that mean for the business? It means that the end users aren't working with stale software. You're not working with software that has a UI from 15 years ago. You're not working with an ad-hoc analytical environment that used to be cool but now uses plugins and stuff that your browser doesn't support and ultimately causes it to have problems,” Castro explains.
As businesses are not dealing with these issues from the end-user perspective,
TITLE: GLOBAL VICE PRESIDENT, CENTRE OF EXCELLENCE
INDUSTRY: MANUFACTURING
LOCATION: PENNSYLVANIA, US
Sam Castro joined SAP in July of 2005 with the acquisition of a small company called Lighthammer. He was responsible for implementation consulting, field enablement, custom development and training for the core products (Illuminator, Xacute, UDS, CMS). These products have since evolved into the core SAP Connected Manufacturing products you see today: Manufacturing Integration & Intelligence (MII) and Plant Connectivity (PCo).
Sam is now part of the SAP LoB Manufacturing Solution Management group, which is directly responsible for strategy, direction and customer adoption of all of the manufacturing products at SAP. He is specifically responsible for Industrial Analytics – that is, SAP MII, Digital Manufacturing for insights and Digital Manufacturing for execution – and he is the solution owner for Process MES products. In this role, he is actively working on mid- and long-term features and deliverables and how they are positioned within the broader SAP portfolio; he also provides guidance for product development investment.
they're able to take advantage of a very modern, easy to consume and use software experience and focus on their core business functions.
“Despite not directly interacting with it, the work around you is what's driving that environment for you,” says Castro.
“You're not putting that burden of three or four extra clicks on somebody, this is just software that's being driven from digital signals; from integration, automation, and the tasks that the operator is performing.”
This newer approach to software design is how SAP leverages the industry investment companies have made and it is what's ultimately reducing the impact that end users have on that environment themselves.
There are different pillars within organisations, which have their own priorities. CEOs, CIOs, CTOs and CFOs are all working together and have overlapping needs that drive different business cases. But they need to have the right information at the top layer to make the right decision for the lowest layers within the organisation. This doesn't happen unless there is a framework in place for the distribution and analysis of the data that is generated, from the very edges of the manufacturing and supply chain processes to the shop floor.
“If you don't have a way for that information to work its way up to the top, organisations really struggle to understand where the priority needs to be,” says Castro.
For manufacturers to focus on business value versus technology, Castro believes that they need to intelligently manage profitability
and investments. As a result of that additional profitability, they also need to protect that inflow of money and profitable behaviour for the company.
“Is that a CapEx investment? Is it an OPEX investment? Is it better granularity on product quality and an emphasis on quality for certain products or certain areas within a process that are very tricky and cumbersome?” asks Castro. “Maybe it's a new product that you're introducing and as a result, that process isn't fully stable yet. What is the emphasis in how much we put into that project to stabilise it? Those are the goals that are very coveted from the C-suite down, but they really are reliant from all edges of the supply chain and having that information roll all the way up.”
“Sustainability is an overlay to that, sustainability is a byproduct of efficiency”
SAM CASTRO SENIOR DIRECTOR, SOLUTION MANAGEMENT, LOB DIGITAL MANUFACTURING, SAP
With your SAP MII and ME system being paramount to achieving high-performance manufacturing, let RTS support your shopfloor solution to maintain peak plant operating performance at all times.
Our experienced and knowledgeable staff provide quick support response on a 24/7 basis. RTS not only provides support but will also deliver:
• System upgrades to newer versions
• Modifications to your existing MES solution
• Adding completely new capabilities to your existing system
• IoT connectivity
Reach out to us for details on how we can “support and service” your existing SAP MII and ME shop floor solution.
Enterprise-led manufacturing follows exactly in tune with this.
“The enterprise has to provide guidance to the manufacturing and supply chain teams as a whole,” says Castro. “Where do they want to see improvements, and how much are they willing to invest in those improvements – what’s it worth? How do you build that community up?”
To understand the role that manufacturing plays in an organisation’s reinvestment strategy, you must first understand where it matches up with other locales in the manufacturing environment.
“Manufacturing isn't just a single-faceted environment. It's often made up of plants that have been around for a long time, some
that were built up by your own organisation, some that came into the organisation through acquisition,” says Castro. “So you see different heritages and mentalities. They have this communal approach for how the plant manager wants to lead that group in the business forward.”
At SAP, being able to take advantage of AI standardisation in a universal way is important.
“You can take and apply these very technical algorithms in order to get information off them. Here's the technology, here are the enablers of data, here are the enablers of AI- and ML-type algorithms that you can use and put together how you see fit,” says Castro. “Then that carries over into the
application side, which says, we know we have these technologies, we know that this data is being generated from our transacting processes, so we have our own structured analytics pieces and now we can use these structures to drive our own models to influence our execution process.”
SAP has global partners, as well as local partners, who rely on its technology. When Castro talks about partnerships, he does not put one partner over another.
“We try to keep the community as open as possible,” he says. “We try not to promote one partner over another, because they're all very important to us.”
The openness of SAP and the openness of its software is for its customers to take advantage of, but also for their partners to put their own industry expertise behind.
“It is what gives SAP the power to leverage our own technologies – and partner-led innovation built on those technologies – to intelligently power our applications.”
“ You want it to be like clockwork, where everything always aligns. But we know that that's not always the case”
SAM CASTRO SENIOR DIRECTOR, SOLUTION MANAGEMENT, LOB DIGITAL MANUFACTURING, SAP
US researchers say deep reinforcement learning can offer a way for artificial intelligence to help protect computer networks in a world where state-sponsored hacking groups rub shoulders on the Dark Web with more traditional black hat types.
But the researchers say cybersecurity staff can relax: even though businesses worldwide are reporting increases in ransomware, they’re not about to be replaced by an AI-powered workforce. For now.
Researchers with the Department of Energy's Pacific Northwest National Laboratory developed a simulation environment to test multistage attack scenarios involving different types of adversaries. This dynamic attack-defence simulation environment allowed them to compare the effectiveness of different AI-based defensive methods under controlled test settings.
According to their research, deep reinforcement learning (DRL) was effective at stopping adversaries from reaching their goals up to 95% of the time in simulations of sophisticated cyberattacks.
While other forms of artificial intelligence are standard for detecting intrusions or filtering spam messages, deep reinforcement learning expands defenders’ abilities to orchestrate sequential decision-making plans in their daily face-off with adversaries. Deep reinforcement learning offers smarter cybersecurity by detecting changes in the cyber landscape earlier, and by offering the opportunity to take preemptive steps to scuttle a cyberattack.
“An effective AI agent for cybersecurity needs to sense, perceive, act and adapt, based on the information it can gather
Deep reinforcement learning offers smarter cybersecurity by detecting issues earlier, giving the opportunity to take preemptive steps to stop a cyberattack
and on the results of decisions that it enacts,” says Samrat Chatterjee, a data scientist who presented the team’s work. “Deep reinforcement learning holds great potential in this space, where the number of system states and action choices can be large.”
The outcome of this research offers promise for a role for autonomous AI in proactive cyber defence. The development of such a simulation environment for experimentation is itself a win, offering researchers a controlled way to compare the effectiveness of different AI-based defensive methods against multistage attack scenarios involving different adversaries.
Deep reinforcement learning (DRL) is emerging as a game-changing decision-support tool for cybersecurity experts. Unlike other forms of AI, which are limited to detecting intrusions or filtering spam messages, DRL allows defenders to learn,
“Our goal is to create a defence agent that can learn the most likely next step of an adversary, then respond”
SAMRAT CHATTERJEE CHIEF DATA SCIENTIST, PACIFIC NORTHWEST NATIONAL LABORATORY
The team trained defensive agents based on four deep reinforcement learning algorithms: DQN (Deep Q-Network) and three variations of what’s known as the actor-critic approach.
The agents were trained with simulated data about cyberattacks and then tested against attacks that they had not observed in training.
In the least sophisticated attacks (based on varying levels of adversary skill and persistence), DQN stopped 79% of attacks midway through the attack stages and 93% by the final stage.
For moderately sophisticated attacks, DQN stopped 82% of attacks midway and 95% by the final stage.
In the most sophisticated attacks, DQN stopped 57% of attacks midway and 84% by the final stage — far higher than the other three algorithms.
adapt, and make autonomous decisions in the face of rapidly changing circumstances. By orchestrating sequential decision-making plans, defenders can quickly respond to cyberattacks and prevent them from doing any damage.
One of the key benefits of DRL is its ability to detect changes in the cyber landscape early, allowing defenders to take preemptive steps to stop cyberattacks before they happen. With the threat of cyberattacks only set to increase, DRL offers a smarter and more proactive way to keep our computer networks safe.
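For readers unfamiliar with DQN – the strongest performer in the results above – the sketch below shows its two core ingredients in PyTorch: a small network that estimates a value for each defensive action in a given state, and epsilon-greedy action selection. The state and action sizes here are illustrative, not those used in the PNNL study:

```python
import random
import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Maps an observed network state to an estimated value per action."""
    def __init__(self, n_states: int = 16, n_actions: int = 23):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_states, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def select_action(q_net: QNetwork, state: torch.Tensor, epsilon: float) -> int:
    """Epsilon-greedy: explore occasionally, otherwise exploit the best Q-value."""
    if random.random() < epsilon:
        return random.randrange(q_net.net[-1].out_features)   # explore
    with torch.no_grad():
        return int(q_net(state).argmax())                     # exploit

q_net = QNetwork()
action = select_action(q_net, torch.zeros(16), epsilon=0.1)
print(f"defender chose mitigation action {action}")
```

Training then repeatedly nudges the network’s value estimates towards the rewards actually observed, which is what the reward-and-cost loop described below formalises.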
The research findings were documented in a research paper and presented at a workshop on AI for Cybersecurity during the annual meeting of the Association for the Advancement of Artificial Intelligence in Washington, D.C. The development of DRL for cybersecurity defence represents an exciting step forward in the battle against cyber threats. As technology advances, researchers will undoubtedly discover new and innovative ways to harness AI for cybersecurity, ensuring that our systems remain safe and secure.
In addition to Chatterjee and Bhattacharya, authors of the AAAI workshop paper include Mahantesh Halappanavar of PNNL and Ashutosh Dutta, a former PNNL scientist.
Good decisions get a positive reward
DRL is a powerful decision-making tool that combines reinforcement learning and deep learning to excel in complex environments that require a series of decisions. Positive rewards are given to reinforce good decisions that lead to desirable outcomes, while negative costs discourage bad choices that result in unfavourable results.
“Our algorithms operate in a competitive environment—a contest with an adversary intent on breaching the system. It’s a multistage attack”
SAMRAT CHATTERJEE CHIEF DATA SCIENTIST, PACIFIC NORTHWEST NATIONAL LABORATORY
This learning process through positive and negative reinforcement is similar to how humans learn many tasks. For instance, when a child completes their chores, they might receive positive reinforcement, such as a playdate with friends. Not doing their work could lead to negative reinforcement, such as losing digital device privileges. By mimicking this natural process of learning, DRL provides a promising approach to decision-making in the field of cybersecurity, enabling defenders to quickly adapt to changing situations and respond with greater efficiency.
“It’s the same concept in reinforcement learning,” says Chatterjee. “The agent can choose from a set of actions. With
each action comes feedback, good or bad, that becomes part of its memory. There’s an interplay between exploring new opportunities and exploiting past experiences. The goal is to create an agent that learns to make good decisions.”
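Chatterjee’s description maps directly onto the classic reinforcement learning update. The sketch below uses tabular Q-learning – a much-simplified relative of the deep methods used in the study – with invented states, actions and rewards:

```python
import random
from collections import defaultdict

# Q-table: estimated value of taking each action in each state.
Q = defaultdict(float)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration
ACTIONS = ["monitor", "block_ip", "isolate_host"]   # illustrative defender actions

def choose_action(state: str) -> str:
    """Explore occasionally; otherwise exploit past experience."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state: str, action: str, reward: float, next_state: str) -> None:
    """Good outcomes raise an action's value; negative rewards lower it."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One illustrative step: blocking an attacker's IP earns a positive reward.
update("recon_detected", "block_ip", reward=+1.0, next_state="attack_contained")
print(Q[("recon_detected", "block_ip")])   # the "memory" Chatterjee describes
```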
To evaluate the efficacy of the four DRL algorithms, the team leveraged OpenAI Gym, an open-source software toolkit, as a foundation for creating a custom and controlled simulation environment.
The researchers incorporated the MITRE ATT&CK framework, including seven tactics and 15 techniques used by three separate adversaries. Defenders were given 23 mitigation actions to halt or prevent an attack from progressing.
The attack was divided into several stages, including reconnaissance, execution, persistence, defence evasion, command and control, collection, and exfiltration, when data is transferred out of the system. The adversary was declared the winner if they successfully reached the final exfiltration stage.
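The team’s actual environment is not reproduced in this article, but a toy, Gym-style skeleton along the lines described above might look like the following – attack stages as states, mitigation choices as the defender’s actions, with entirely invented dynamics:

```python
import random

STAGES = ["reconnaissance", "execution", "persistence", "defence_evasion",
          "command_and_control", "collection", "exfiltration"]

class CyberDefenceEnv:
    """Toy Gym-style environment: an adversary advances through attack
    stages while the defender picks one mitigation action per step.
    Stage names follow the article; the transition rules are invented."""

    N_MITIGATIONS = 23   # the study gave defenders 23 mitigation actions

    def reset(self) -> int:
        self.stage = 0                       # adversary starts at reconnaissance
        return self.stage

    def step(self, action: int):
        # Invented rule: any mitigation has some chance of halting the
        # adversary; otherwise the attack advances one stage.
        if random.random() < 0.3:
            return self.stage, +1.0, True    # attack halted: positive reward
        self.stage += 1
        if self.stage == len(STAGES) - 1:
            return self.stage, -10.0, True   # exfiltration reached: defender loses
        return self.stage, -0.1, False       # attack continues: small penalty

env = CyberDefenceEnv()
state, done = env.reset(), False
while not done:                              # a random (untrained) defender
    state, reward, done = env.step(random.randrange(env.N_MITIGATIONS))
```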
By testing these DRL algorithms under these controlled conditions, the team was able to assess the strengths and weaknesses of each approach, providing valuable insights into the potential of this technology to enhance cybersecurity defence strategies.
“Our algorithms operate in a competitive environment—a contest with an adversary
intent on breaching the system,” says Chatterjee. “It’s a multistage attack, where the adversary can pursue multiple attack paths that can change over time as they try to go from reconnaissance to exploitation. Our challenge is to show how defences based on deep reinforcement learning can stop such an attack.”
“Our goal is to create an autonomous defence agent that can learn the most likely next step of an adversary, plan for it, and then respond in the best way to protect the system,” says Chatterjee.
Despite the progress, no one is ready to entrust cyber defence entirely to an AI system. Instead, a DRL-based cybersecurity system would need to work in concert with humans, says coauthor Arnab Bhattacharya, formerly of PNNL.
“AI can be good at defending against a specific strategy but isn’t as good at understanding all the approaches an adversary might take,” says Bhattacharya. “We are nowhere near the stage where AI can replace human cyber analysts. Human feedback and guidance are important.”
“The agent can choose from a set of actions. With each action comes feedback, good or bad, that becomes part of its memory”
SAMRAT CHATTERJEE CHIEF DATA SCIENTIST, PACIFIC NORTHWEST NATIONAL LABORATORY
With widespread disruption, the IoT is helping businesses streamline their supply chains, increasing efficiency, reducing costs, and improving visibility
Global supply chains have been severely impacted by the Russo-Ukrainian conflict and the COVID-19 pandemic, with disruptions caused by such events leading to manufacturing delays, shipping complications, panic buying, and increased energy costs.
But the rise of IoT devices is revolutionising the way organisations around the world operate, offering a wealth of benefits across global supply chains. These new technologies are helping businesses streamline their supply chains, increase efficiency, reduce costs and improve visibility across the entire process.
The fourth industrial revolution, Industry 4.0, relies upon IoT to join operational and informational technology, for the sake of improving quality, reducing risk and minimising cost. “IoT is a core part of Industry 4.0 because it allows us to build digital networks of machinery, devices, and infrastructure. By using IoT, organisations can assemble smart factories and supply chain processes which continuously collect data,” says David Beamonte Arbués, Product Manager (IoT & Embedded Products) at Canonical.
“Businesses can then apply AI and ML technologies which, once synchronised, remove silos in the supply chain process and allow unprecedented levels of transparency, automation, insight and control,” Arbués adds. “Industry 4.0 focuses heavily on interconnectivity, automation, ML and real-time data. It marries physical production and operations with smart technology – none of which would be possible without IoT at its core.”
A modern network must be able to respond easily, quickly and flexibly to the growing needs of today’s digital business. It must provide visibility and control of applications, users and devices on and off the network, and intelligently direct traffic across the WAN. It must be scalable and automate processes to deliver innovative new services. It must support IoT devices and utilize state-of-the-art technologies such as real-time analytics, ML and AI. And all of this must be provided with maximum security and minimum cost.
This is the power brought by the integration of two cloud-managed platforms, Cisco Meraki and Cisco Umbrella. The integration binds together the best of breed in cloud-managed networking and security. cisco.com
One of the most significant benefits that IoT technologies offer for supply chains is improved visibility. IoT devices can be integrated into every stage of the supply chain, from production to delivery, allowing businesses to track their goods in real time. This increased visibility enables businesses to make informed decisions about inventory management, transportation, and production schedules, which can lead to significant cost savings and improved customer satisfaction.
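As a minimal illustration of this kind of real-time visibility – hypothetical names and thresholds throughout – the sketch below models a cold-chain tracker that raises an alert the moment a shipment drifts out of its safe temperature range:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One telemetry message from an IoT tracker on a shipment."""
    shipment_id: str
    lat: float
    lon: float
    temp_c: float
    timestamp: datetime

def check_reading(reading: SensorReading,
                  min_c: float = 2.0, max_c: float = 8.0) -> str | None:
    """Flag cold-chain breaches so planners can act before goods spoil."""
    if not (min_c <= reading.temp_c <= max_c):
        return (f"ALERT {reading.shipment_id}: {reading.temp_c:.1f}C is "
                f"outside {min_c}-{max_c}C at ({reading.lat}, {reading.lon})")
    return None

reading = SensorReading("SHIP-001", 52.37, 4.90, 9.4,
                        datetime.now(timezone.utc))
alert = check_reading(reading)
if alert:
    print(alert)   # would feed a dashboard or trigger a re-routing decision
```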
In many cases, such information is also combined with third-party data sources, allowing organisations to react quickly to unexpected supply chain events, such as extreme weather events, geopolitical instability or sudden dips in customer demand.
“Industry 4.0 marries physical production and operations with smart technology – none of which would be possible without IoT at its core”
DAVID BEAMONTE ARBUÉS PRODUCT MANAGER (IOT & EMBEDDED PRODUCTS), CANONICAL
“Some of the recent supply chain disruptions we have seen are due to a lack of real-time visibility across the supply chain. For a well-functioning supply chain, across multiple levels, you must collect, integrate and analyse data to provide a single view of the supply chain”, explains Bjorn Andersson, Senior Director of Global IoT at Hitachi Vantara.
“This view should include data from sensors and devices, such as data associated with temperature and vibration. IoT enables this level of visibility, allowing
you to see if suppliers can meet their commitments, or to spot when a proactive action needs to be taken to prevent a disruption in production.”
IoT devices can also help businesses optimise their supply chains by providing real-time data on the performance of equipment and machinery. This data can be used to identify potential issues before they become major problems, helping reduce downtime and improving overall equipment performance, leading to increased productivity and cost savings.
Data-driven predictive maintenance solutions are helping businesses with profitability, sustainability and the personal safety of their people.
According to SAP, predictive maintenance seeks to prevent equipment failure and downtime by connecting IoT-enabled enterprise assets, applying advanced analytics to the real-time data they deliver, and using the resulting insights to inform educated, cost-effective, and efficient maintenance protocols.
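A crude stand-in for that analytics step can be sketched in a few lines: flag any reading that jumps well above its rolling baseline, so an inspection can be scheduled before the asset fails. The data and thresholds here are invented:

```python
import statistics

def flag_anomalies(vibration_mm_s: list[float],
                   window: int = 10, k: float = 3.0) -> list[int]:
    """Flag readings more than k standard deviations above the
    rolling baseline of the previous `window` readings."""
    alerts = []
    for i in range(window, len(vibration_mm_s)):
        baseline = vibration_mm_s[i - window:i]
        mean, stdev = statistics.mean(baseline), statistics.stdev(baseline)
        if vibration_mm_s[i] > mean + k * stdev:
            alerts.append(i)   # schedule an inspection before failure
    return alerts

readings = [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 0.9, 1.1, 1.0, 1.0, 1.1, 4.8]
print(flag_anomalies(readings))   # -> [11]: the sudden vibration spike
```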
“SCADA has been around for 40 years, but Industry 4.0 technologies such as AI, ML and IoT have taken it to a whole new level of sophistication”
ANDY HANCOCK
GLOBAL VP, DIGITAL SUPPLY CHAIN CENTRE OF EXCELLENCE, SAP
Today’s predictive maintenance has its roots in supervisory control and data acquisition (SCADA) – a control-system architecture formed of computers, networked data communications and graphical user interfaces for high-level supervision of machines and processes, for assets either in situ or in remote locations.
“SCADA has been around for 40 years,” says Andy Hancock, Global VP of SAP’s Digital Supply Chain Centre of Excellence (CoE), “but Industry 4.0 technologies such as AI, ML and IoT have taken it to a whole new level of sophistication.
“SCADAs were all about condition-based monitoring,” he adds. “So, ‘If this happens, then do that’. But what people increasingly want is a predictive element, such as gaining insight into the useful remaining life of an asset somewhere out in the field. With big data, it’s now possible to carry out trend analysis to get at this kind of information.”
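The kind of trend analysis Hancock describes can be illustrated with a simple linear extrapolation: fit a line to wear readings over time and project forward to the failure threshold. This is a sketch with made-up numbers, not SAP’s method:

```python
def remaining_useful_life(hours: list[float], wear_mm: list[float],
                          failure_threshold: float) -> float:
    """Fit a straight line to wear-over-time readings and extrapolate
    to the failure threshold; returns estimated running hours left."""
    n = len(hours)
    mean_x, mean_y = sum(hours) / n, sum(wear_mm) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(hours, wear_mm))
             / sum((x - mean_x) ** 2 for x in hours))
    intercept = mean_y - slope * mean_x
    hours_at_failure = (failure_threshold - intercept) / slope
    return hours_at_failure - hours[-1]

# Bearing wear measured every 100 running hours; failure expected at 5.0mm.
print(remaining_useful_life([0, 100, 200, 300], [1.0, 1.5, 2.1, 2.6], 5.0))
```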
But Hancock warns that data-driven solutions can also bring their own problems.
“With predictive maintenance, you should always be looking out for the exception,” he says. “Think of a temperature gauge on a
“With data, less is usually more. Systems can soon get inefficient if overloaded with data”
ANDY HANCOCK GLOBAL VP, DIGITAL SUPPLY CHAIN CENTRE OF EXCELLENCE, SAP
piece of equipment, which is feeding back data. If everything is running fine, it will always be roughly the same temperature. You don’t need to keep feeding back data about that piece of equipment. The only data you want to capture is if something changes – say, if the thermostat fails.
“With data, less is usually more. Systems can soon get inefficient if overloaded with data. And then you end up chucking more tech at the problem, where what you really need to do is cut back on the data.”
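A report-by-exception filter of the kind Hancock describes can be sketched in a few lines: only forward a reading when it moves outside a tolerance band around the last reported value. The band here is an arbitrary illustration:

```python
class DeadbandFilter:
    """Forward a sensor reading only when it drifts beyond a tolerance
    band around the last reported value - report-by-exception."""

    def __init__(self, tolerance: float):
        self.tolerance = tolerance
        self.last_reported: float | None = None

    def should_report(self, value: float) -> bool:
        if (self.last_reported is None
                or abs(value - self.last_reported) > self.tolerance):
            self.last_reported = value
            return True
        return False   # steady state: suppress the reading

f = DeadbandFilter(tolerance=2.0)        # a 2-degree band, for illustration
temps = [70.1, 70.3, 69.9, 70.2, 75.6]   # thermostat failing at the end
reported = [t for t in temps if f.should_report(t)]
print(reported)   # -> [70.1, 75.6]: the first reading plus the genuine change
```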
Predictive maintenance might feel like a new concept to many, but, says Hancock, some companies are already taking things to the next level, using tech to help drive something called “prescriptive maintenance”.
AI is revolutionising the supply chain industry, and it’s hardly surprising: automation with AI can reduce the time and money spent on traditionally manual tasks, making processes faster and more efficient.
Common supply chain tasks that can be automated for businesses include warehouse robotics, predictive analytics, digital process automation (DPA), optical character recognition (OCR), and data entry automation.
AI also increases productivity by saving time on tasks that can be automated, providing constant visibility without breaks or downtime, and increasing the capabilities of employees who are not experts in their business’s technology tools. It also streamlines the decision-making process, making it easier, faster and smarter.
Implementing AI in supply chains can benefit the business by creating visibility and optimisation. Supply chain managers can try an AI simulation tool to optimise their operations using real-world scenarios.
AI data analysis algorithms can also identify where the materials are being used and what materials are being wasted.
He says: “Imagine you have two identical assets. One's nicely tucked inside a factory and so is in a fairly constant environment, but the other asset is exposed to the elements on the seafront. Clearly, you’re going to have different performances from those two assets, and so you create prescriptive maintenance cycles that take environmental conditions into account.”
Having fine-tuned maintenance cycles – whether predictive or prescriptive – is important for preventing over-maintenance as much as under-maintenance, Hancock stresses. With sustainability now an issue of utmost importance for businesses, the green gains of predictive maintenance solutions are also very appealing.
“If you're over-maintaining an asset, you’re wasting money and resources,” Hancock concludes. “Why do an overhaul every six months when it’s not necessary? With predictive maintenance, you can use historical information via IoT sensor data and extend the maintenance cycle to seven or eight months, with no negative impact on the machinery.”
We spotlight the Top 10 women who are breaking new ground in the global AI industry and working with industry leaders including MIT, Google, and OpenAI
According to Deloitte’s critical report – The State of Women in AI Today – the AI industry has held a markedly persistent gender and diversity gap. But, as the sector seeks to evolve and future-proof itself, navigating the gender bias in AI will prove critical.
In fact, a staggering 71% of the Deloitte report’s respondents asserted that adding women to AI and machine learning will bring unique perspectives to high tech that are needed in the industry. What’s more, 66% believe that AI and ML solutions would benefit from having more diverse employees in designer and developer positions.
With this critical situation – and the next generation of AI talent – in mind, we’ve shared our pick of the top 10 women in AI, each of whom are responsible for some of the greatest advancements achieved to date.
Allie Miller has an extensive background spanning the fields of AI, ML, human-computer interaction, technology, cognitive science, analytics, product and user experience, consumer insights, startups, and venture capital.
As an accomplished AI entrepreneur, advisor and investor, Miller’s previous experience includes key positions at AWS, where she was Global Head of Machine Learning Business Development, Startups and Venture Capital, and at IBM, where she was Lead Product Manager for IBM Watson. In fact, Miller is the youngest ever woman to build an AI product at IBM.
Rana el Kaliouby is the Co-Founder of Affectiva, a software company that builds AI with an understanding of human emotions, cognitive states, activities, and the objects people use.
El Kaliouby has also published a memoir – Girl Decoded – in which she shares the details of her unique career and her personal, longstanding goal of humanising both our technology and the ways in which we connect with one another.
Shivon Zilis has held key roles in the leadership teams of many of the most famous AI companies. These include the likes of OpenAI, Neuralink, and Bloomberg Beta. She also worked at Tesla as a Project Director for its Autopilot product and chip design team.
She was also the youngest board member at both OpenAI and Neuralink. In 2015, Zilis was listed on the Forbes 30 Under 30 list for venture capital.
Regina Barzilay is an Israeli computer scientist, working as a Professor at MIT. She is also a leading member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).
In addition to her work in the field of natural language processing, Barzilay is making critical strides in the deployment of AI technologies to improve cancer diagnosis, for which she recently received the National Science Foundation's CAREER award.
Chelsea Finn is a Stanford University Assistant Professor, specialising in the fields of computer science and electrical engineering. Her Stanford lab – IRIS – is affiliated with both the wider Stanford AI laboratory, and the ML Group. She leads pioneering research in the field of intelligence through robotic interaction at scale.
Finn is also a key figure on Google’s industry-leading Google Brain team.
Cynthia Rudin is a Computer Science Professor at Duke University, known for her pioneering work in the fields of ML, applied ML, and causal inference. She has also held positions at Columbia, NYU, and MIT. She is a three-time winner of the INFORMS Innovative Applications in Analytics Award and, in 2022, received the Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity from the Association for the Advancement of Artificial Intelligence (AAAI).
Timnit Gebru is one of the most impactful advocates of AI DE&I. She is the Co-Founder of the Black in AI initiative, the Founder of The Distributed AI Research Institute, and was recently named by Time as one of its 100 most influential people in the world.
Gebru is also known as the Former Co-Lead of Google’s Ethical AI team, where a key part of her role was ensuring that Google’s AI products did not perpetuate racial bias. However, following the publication of her paper that challenged the ethics of AI language models at large – and criticised Google’s approach to this complicated matter – she left the position in 2020, after two years.
Fei-Fei Li is one of Stanford University’s most influential tech and AI professors.
Not only is she the inaugural Sequoia Professor in the university’s Computer Science Department, but she is also Co-Director of Stanford’s Human-Centred AI Institute, and previously served as the Director of its world-renowned AI Lab.
She is also the Co-Founder and Chairperson of AI4ALL, a non-profit organisation that works to achieve greater DE&I across AI education.
For the last 20 years, Kate Crawford has been one of the industry’s global thought-leaders on the question of AI’s political and social implications.
She holds a number of prestigious positions in academia, including being a Senior Principal Researcher at MSR-NYC, a Research Professor at USC Annenberg, and an Honorary Professor at the University of Sydney. She has also co-founded a number of interdisciplinary research groups – including the AI Now Institute at NYU, and Knowing Machines at USC – and advised policy makers in both the UN and the White House.
Her most recent book – Atlas of AI – has received extensive critical acclaim, described as "a sweeping view of artificial intelligence that frames the technology as a collection of empires, decisions, and actions that are together fast eliminating possibilities of a sustainable future on a global scale”.
Cynthia Breazeal is one of MIT’s leading computer science professors, and she has been an Associate Professor at the world-famous institution for over 20 years. She founded (and continues to direct) the institution’s Personal Robots group and is the Director of its Initiative on Responsible AI for Social Empowerment and Education.
Breazeal is renowned across the industry for her pioneering work in social robotics. Within her wider work, she focuses on human-robot interaction, leading the industry’s understanding of the long-term impact that social robots will have and the place that they can hold in our daily lives.
In keeping with her research and academic background, Breazeal is also the Founder and Chief Scientist of Jibo, a companion robot that strives “to foster a secure, supportive and exciting space in social AI companionship”.