Global Banking & Finance Review Issue 67 - Business & Finance Magazine
Shaping the Future of Banking: Diebold Nixdorf on Technology, Resiliency, and Sustainability
Helena Müller, VP Banking Europe, Diebold Nixdorf
Chairman and CEO
Varun SASH
Editor
Wanda Rich email: wrich@gbafmag.com
Managing Director
Martin Murphy
Project Managers
Megan Sash | Raj Gopal | Rima Attar
Customer Service Representative
Tamara Yavtushenko
Head of Operations
Robert Mathew
Office Manager
Priya KV
Business Consultants - Digital Sales
Paul Dus Davis | Cora Joseph | Shefali Kochhar | Aakarshita Gautam
Business Consultants - Nominations
Sara Mathew | Adam Luiz
Divyansh Vaid | Sowmya N | Ashish Mishra | Anurag Rajak
Business Analysts
Varshitha | Jackson Brize
Video Production & Journalist
Phil Fothergill
Graphic Designer
Jesse Pitts
Advertising
Phone: +44 (0) 208 144 3511 marketing@gbafmag.com
GBAF Publications, LTD
Alpha House
100 Borough High Street London, SE1 1LB United Kingdom
Global Banking & Finance Review is the trading name of GBAF Publications LTD
Company Registration Number: 7403411
VAT Number: GB 112 5966 21
ISSN 2396-717X
The information contained in this publication has been obtained from sources the publishers believe to be correct. The publisher wishes to stress that the information contained herein may be subject to varying international, federal, state and/or local laws or regulations.
The purchaser or reader of this publication assumes all responsibility for the use of these materials and information. However, the publisher assumes no responsibility for errors, omissions, or contrary interpretations of the subject matter contained herein, and no legal liability can be accepted for any errors. No part of this publication may be reproduced without the prior consent of the publisher.
editor
Dear Readers,
Welcome to Issue 67 of Global Banking & Finance Review. As the financial landscape continues to evolve, we’re delighted to bring you insights from industry leaders who are shaping this transformation. In this issue, we explore innovations in banking technology, sustainability practices, and data integrity that are setting new standards across the sector.
Our Cover Story, "Shaping the Future of Banking: Diebold Nixdorf on Technology, Resiliency, and Sustainability," features Helena Müller, Vice President of Banking Europe at Diebold Nixdorf. Recently in London, Helena shared her perspectives on how the company is pioneering solutions to meet the needs of today’s banks and customers. From ATM pooling and the future of physical branches to their commitment to resilient and sustainable practices, Diebold Nixdorf is redefining how financial institutions can deliver seamless, reliable service in a digital world (Page 24).
Kenneth Grant and Carlos Pareja’s article, "Scope 3 Emissions: Seeking Clarity in a Sea of Uncertainty," dives into the complex landscape of Scope 3 emissions reporting. As CFOs navigate evolving standards, this timely piece unpacks the challenges and opportunities in achieving transparency for long-term environmental impact (Page 18).
In "The Open Banking Challenge: Ensuring Compliance with API Standards for the Financial Industry," Jamie Beckland discusses the compliance and security standards necessary to support the open banking ecosystem. With regulatory expectations rising in the EU, US, and beyond, he highlights proactive steps that financial institutions can take to maintain robust, reliable API structures (Page 28).
Dr. Tendü Yogurtcu, CTO of Precisely, addresses the crucial issue of data integrity in "Navigating AI Bias: Ensuring Data Integrity in the Age of Generative AI." From the impacts of biased credit scoring to strategies for achieving reliable AI outcomes, Dr. Yogurtcu shares how organizations can reinforce data quality and governance to reduce AI bias and support fair, accurate insights (Page 32).
Professor Karl Schmedders from IMD contributes "Integrating Sustainability for Long-Term Business Resilience and Value Creation." As investors demand greater commitment to ESG, Professor Schmedders explores how the TCFD framework supports companies in building resilient, future-focused strategies that address climate risks and drive long-term growth (Page 20).
At Global Banking & Finance Review, we strive to be your trusted source of insights and perspectives in the financial sector. Whether you are an industry veteran or a curious newcomer, there is something here for you. We value your feedback and invite you to share your thoughts on how we can better serve your needs in future editions.
Enjoy the journey through our latest issue!
Wanda Rich Editor
Stay caught up on the latest news and trends by signing up for our free email newsletter, reading us online at http://www.globalbankingandfinance.com/ and downloading our app for the latest digital magazine, free on Google Play and the Apple App Store.
The Open Banking Challenge: Ensuring Compliance with API Standards for the Financial Industry
Jamie Beckland, Chief Product Officer, APIContext
The now and next for banking personalisation
Matt Phillips Head of Banking, UK and Ireland, Diebold Nixdorf
Banking in 2035: How emerging technologies will transform the way we bank
John Da Gama-Rose, Head of Banking & Financial Services for Global Growth Markets, Cognizant
Nageswar Cherukupalli, Senior Vice President, Head of Banking & Financial Services, Americas, Cognizant
BUSINESS
The Evolution of Architectural Trends in MarTech: From SOA to MACH
Mark Barrett, CRO, Aionic Digital
Navigating Global Crises: Strengthening Retail Resilience
Scope 3 Emissions: Seeking Clarity in a Sea of Uncertainty
Kenneth Grant, Managing Director, Energy Policy and Regulation, Berkeley Research Group
Carlos Pareja, Vice President, Internal Audit, Capital Planning, Morgan Stanley
Integrating Sustainability for Long-Term Business Resilience and Value Creation
Karl Schmedders, Professor of Finance, IMD
Dr. Michael Erkens and Dr. Ries Breijer
Kamran Hedjri, Group CEO,
FINANCIAL
Why a Shift in Regulatory Enforcement Demands a New Approach to Trade Surveillance
Joe Schifano, Head of Global Regulatory Affairs, Eventus
FINTECH
Art of Scaling from a Single ‘Killer’ Feature
Toby Strangewood, Co-Founder, Wake the Bear
Is there a danger of overregulation stifling competition? – Roger Alexander
Roger Alexander, Key Advisor, Chargebacks911
TikTok trend or the cause of financial fear?
Navigating AI Bias: Ensuring Data Integrity in the Age of Generative AI
Banks are ready to embrace AI, but customers are hesitant: How to bridge the trust gap with GenAI and customer communications
Scott Draeger SVP of Product Marketing & Vertical Solutions Smart Communications
How to Implement Unified High Availability (HA) and Disaster Recovery (DR) for SQL Server in Financial Services
Don Boxley Jr, CEO and Co-Founder, DH2i
How banks can mitigate cloud security threats
Furqan Siddiqui, SOC Operations Officer, Obrela
Steve Bradford, Senior Vice President EMEA, SailPoint
Cover Story
Shaping the Future of Banking: Diebold Nixdorf on Technology, Resiliency, and Sustainability
Helena Müller, VP Banking Europe, Diebold Nixdorf
How Close Are We to G20 Cross-Border Goals?
Cross-border payments are crucial to the global economy, facilitating transactions across countries and currencies. As such, addressing the challenges in this space is more pressing than ever, prompting G20 leaders to set an ambitious target: 75% of cross-border payments credited to the beneficiary within an hour by 2027. As we cross the halfway point in achieving this, Kamran Hedjri, Group CEO at PXP Financial, looks at how close we really are.
In today’s interconnected global economy, cross-border payments play a pivotal role in facilitating international trade and commerce. With a surge in low-value cross-border transactions, particularly in the digital services sector, addressing the challenges in this space has never been more important.
It’s encouraging that, since its endorsement in November 2020, the G20’s ambitious roadmap to transform cross-border payments by 2027 has been driving significant advancements towards making payments cheaper, faster, and more transparent.
Achieving the goal of 75% of cross-border payments being credited to the beneficiary within an hour means making a significant leap forward in speed and reliability, and there’s a lot still to be done.
The importance of cross-border payments in international trade
The value of cross-border payments is estimated to increase to over $250 trillion by 2027, equating to a rise of over $100 trillion in just 10 years. They have become a critical component of international trade, especially in the growing digital services sector.
By facilitating transactions across different countries, they allow businesses and individuals to engage in international commerce efficiently, provide a streamlined process for making international transactions, reducing the complexity and time associated with international trade, and, by making it easier to buy and sell across borders, help increase trade volumes, which contributes to global economic growth.
Cross-border payments are critical for businesses to expand into new international markets, tapping into a broader customer base and increasing revenue potential. Companies that effectively manage cross-border payments can gain a competitive edge by offering better pricing, faster delivery, and improved customer service.
And it’s not just businesses demanding a better crossborder service. Consumers increasingly expect seamless and instant payment experiences, regardless of geographical boundaries.
As the expansion of the digital services sector continues, the need for cross-border payments becomes even greater. They are essential for facilitating transactions in this sector, where services are often delivered digitally across borders. Many digital services operate on subscription or licensing models that require recurring cross-border payments, highlighting the need for reliable and efficient payment systems.
All this considered, it’s little wonder that the world’s economic heavyweights have placed great expectations on cross-border payments through the Financial Stability Board’s (FSB) G20 cross-border payments targets.
G20 Roadmap progress
It’s worth mentioning that, when the G20 targets were first agreed, there was no widely available data on how the industry was performing against the targets at the time. Although it was generally agreed there was a way to go, very few players or countries had a strong sense of how they were performing against the goals.
Nevertheless, a major milestone came in 2023 when Swift adopted ISO 20022 messages, establishing a standardised communication framework for cross-border transactions worldwide. Its implementation has provided the foundation to live up to the G20 targets, because the message standard allows banks to reduce costs, improve reconciliation and enhance financial crime detection.
However, data from October 2023 showed that only 18% of cross-border payments and cash management reporting messages were based on the ISO format, with fewer than 800 banking institutions systematically issuing payment messages with ISO syntax, proving there was still a considerable amount of work to do. Even today, migration needs to be faster, but the investment required to migrate core banking systems is a huge barrier, and corporates also need to be persuaded to adopt ISO 20022.
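To make the reconciliation benefit concrete, the sketch below shows the kind of structured, richly labelled data an ISO 20022-style credit transfer carries, and how easily a machine can extract it. The fragment is heavily simplified and illustrative only: real pacs.008 messages are schema-validated and carry the full ISO 20022 namespace, and all names and values here are invented.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment loosely modelled on an ISO 20022 pacs.008
# credit transfer; NOT a complete or schema-valid message.
message = """
<Document>
  <FIToFICstmrCdtTrf>
    <CdtTrfTxInf>
      <IntrBkSttlmAmt Ccy="EUR">1500.00</IntrBkSttlmAmt>
      <Dbtr><Nm>Acme Imports Ltd</Nm></Dbtr>
      <Cdtr><Nm>Globex GmbH</Nm></Cdtr>
      <RmtInf><Ustrd>Invoice 2024-0042</Ustrd></RmtInf>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>
"""

root = ET.fromstring(message)
tx = root.find(".//CdtTrfTxInf")
amount = tx.find("IntrBkSttlmAmt")

# Structured fields fall out directly -- this is what makes automated
# reconciliation and financial crime screening easier than with legacy
# free-text message formats.
print(amount.get("Ccy"), amount.text)   # EUR 1500.00
print(tx.find("RmtInf/Ustrd").text)     # Invoice 2024-0042
```

Because every field sits in a well-defined element, banks can match payments to invoices and run sanctions screening without the brittle text parsing that older message formats require.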
Elsewhere, industry initiatives led by Swift and supported by banks are having a transformative effect in creating fast, predictable, cost-effective and secure cross-border payments – namely, Swift GPI (Global Payment Initiative), which provides end-to-end tracking for high-value cross-border payments, and Swift Go, a comparable service focused on lower-value payments.
Efforts are also being made to interlink domestic payment systems to enable instant cross-border flows. So far, such schemes have found success in Singapore, India and other countries across South-East Asia, where connections have been established to solve specific cross-border instant payment use cases.
Other initiatives in development include the Immediate Cross-Border Payments (IXB) pilot devised by EBA Clearing, The Clearing House (TCH) and Swift to interconnect systems in the United States and Europe, as well as the European Payments Council’s (EPC) new One-Leg-Out Instant Credit Transfer (OCT Inst) scheme, which aims to use the existing SEPA (Single Euro Payments Area) payment rails for international instant credit transfers.
There are still steps to be taken
The rise of fintech has led to a range of innovations that offer faster, more secure, and cost-effective cross-border payment solutions, and mobile payment technologies have also made cross-border transactions more accessible and convenient. The missing ingredient is not innovation, it’s collaboration, particularly when it comes to addressing payment system interoperability.
A core issue is that, while cross-border payments are global by nature, regulation is enforced locally, so to drive interoperability, regulatory alignment is needed, for example, with respect to rules of anti-money laundering (AML), countering financing of terrorism (CFT) and data governance.
As yet, the goals of the Roadmap are not backed by regulatory or legal mandates, which raises concerns, and the ISO 20022 migration is a good example. In areas where central banks are mandating the use of ISO 20022, every single participant is working on the initiative. Where it is not mandated, the opposite is often true. Without clear and consistent consequences for missing the KPIs, and if local regulators fail to incorporate the Roadmap’s essentials, there is a real risk that the work will be deprioritised in favour of other activities.
Existing data frameworks that conflict with financial regulations with respect to data privacy and data sharing need to be resolved so the payments industry can drive the innovative solutions needed to meet that bold 2027 target.
The G20 goals will have a profound impact on end-users at a time when more people than ever need seamless cross-border payments. Now is the time to address the lack of formal mandates and make the G20 Roadmap part of the action plan of all players – governments, regulators, banks and technology providers. Collaboration is key, and only through it can we make the Roadmap a reality and achieve the G20 goals for cross-border payments.
Kamran Hedjri Group CEO
PXP Financial
The Open Banking Challenge:
Ensuring Compliance with API Standards for the Financial Industry
For banks, retailers and enterprise businesses, open banking and application programming interfaces (APIs) are a powerful combination that streamlines how financial data is exchanged. Moreover, APIs reduce IT complexity and simplify financial transactions for the financial industry.
However, as this landscape evolves, particularly with API usage increasing, meeting regulatory and compliance requirements for API reporting poses a significant challenge for financial institutions. Regulations such as the forthcoming EU PSD3 and US CFPB 1033 aim to address shortcomings in how APIs are built and deployed while ensuring quality and security are maintained throughout the API lifecycle. And, for the first time, they will include technology speed and availability requirements.
In the UK, should issues arise with APIs, open banking regulations require that they are reported to industry regulators. Issues can happen if the API is not aligned with the specification to which it was created, is not available in the valid format it is supposed to be, or if the
data in the API is not accurate. The UK has been at the forefront of the global open banking revolution thanks to the proactive attitude of its regulators, who created an Open Banking ecosystem whose API-based implementation many other jurisdictions now look to as a framework of best practice.
The US is seeking to adopt new open banking regulations dedicated to ensuring API quality and security standards are being met. While the UK has already adopted the Financial-grade API (FAPI) protocol, the US is currently in a listening period for new regulations – but those within the industry know the new regulations are fast approaching.
FAPI is a specialised set of standards and guidelines that aim to ensure the security and reliability of APIs used in the financial industry. It is defined by the OpenID Foundation, an industry body that’s been working on creating hardened API standards that work for sharing financial information, managing transactions, making
payments, checking balances, and more. It uses OAuth 2.0 and OpenID Connect as its base and then adds technical requirements for the financial industry and other industries that require higher API security. Indeed, the goal of FAPI is to provide a “higher level of security than provided by standard OAuth or OpenID Connect.”
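One way to see what "adds technical requirements" means in practice: among the protections the FAPI profiles draw on is PKCE (RFC 7636), a standard OAuth 2.0 extension that prevents a stolen authorisation code from being redeemed by an attacker. The sketch below, a minimal illustration rather than a full FAPI client, generates the verifier/challenge pair a client would use.

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    """Generate an RFC 7636 code_verifier and its S256 code_challenge."""
    # 32 random bytes, base64url-encoded without padding -> 43 characters.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # The challenge is the base64url-encoded SHA-256 hash of the verifier.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
# The client sends code_challenge (with method S256) on the authorisation
# request, then proves possession by sending code_verifier on the token
# request; the server re-hashes and compares.
print(len(verifier), challenge[:8])
```

Because only the hash travels on the front channel, an attacker who intercepts the authorisation code still cannot exchange it for a token without the original verifier.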
Security is a concern with APIs because as the number of APIs exposed increases, so does the exposed surface area. Should an API be poorly created or not maintained, gaps will appear with an increased likelihood of exploits. Since security around finance transactions is paramount, many look to FAPI to set the standard for API security.
In addition, API regulatory reporting requirements exist to ensure all APIs are compliant throughout their lifecycle, not just when they are first created. For instance, annual API reports are obligatory for organisations in the UK, and any violation must be reported immediately.
Unlike the annual reporting requirement in the UK, the US is likely to demand reporting to be conducted more frequently or even continuously. Globally, meeting API standard compliance continues to be a hot topic. Countries such as Australia, Brazil, Mexico, India and the UAE have either implemented regulatory requirements or are in the process of enforcing a certain version of the technical standard – meaning all businesses within that country will need to conform to that standard.
Organisations need to have monitoring capabilities in place for APIs to ensure they are compliant and conformant, especially to industry standards in the locations where they conduct business. Yet monitoring APIs and checking for API compliance can be slow and painful for businesses that don’t have the right tools, with much of it being a manual process. Furthermore, proactive API security and governance will be crucial to open banking’s future success; neglecting them risks problems with regulators and industry standards groups.
Therefore, organisations should implement robust controls for current API services, including real-time and automated API monitoring, access management, testing, and governance checks to gain the full context of the performance of APIs in use. This will assist organisations with potential service outages and security or conformance issues before customers, partners or regulators find out.
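As an illustration of what such automated checks might look like, the hypothetical sketch below evaluates a single API observation against availability, latency and conformance rules. The endpoint name and the 750ms threshold are invented for the example; real limits would come from the applicable standard or regulator.

```python
from dataclasses import dataclass

@dataclass
class APICheck:
    """One monitoring observation of an API endpoint."""
    endpoint: str
    status_code: int
    latency_ms: float
    schema_valid: bool

# Hypothetical SLA threshold -- illustrative only.
MAX_LATENCY_MS = 750

def compliance_issues(check: APICheck) -> list[str]:
    """Return a list of human-readable compliance findings (empty = clean)."""
    issues = []
    if check.status_code >= 500:
        issues.append("availability: server error")
    if check.latency_ms > MAX_LATENCY_MS:
        issues.append(f"performance: {check.latency_ms:.0f}ms > {MAX_LATENCY_MS}ms")
    if not check.schema_valid:
        issues.append("conformance: response does not match published spec")
    return issues

# A slow but otherwise healthy response triggers one performance finding.
print(compliance_issues(APICheck("/accounts", 200, 1200.0, True)))
```

In a production setting, observations like these would be collected continuously from real traffic or synthetic probes, so issues surface before customers or regulators encounter them.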
Ultimately, API performance is critical to ensure a strong user experience for core digital use cases like payment processing and transfers. Implementing these steps will help inspire customer confidence and ensure the organisation’s Open Banking services are delivered efficiently, safely and securely.
Jamie Beckland Chief Product Officer APIContext
The Evolution of Architectural Trends in MarTech: From SOA to MACH
The Marketing Technology (MarTech) industry has witnessed the ebb and flow of trends over the last decade, always promising revolutionary changes but often falling short of expectations. From the emergence of XML to the current fascination with Microservices, APIs, Cloud-Native, and Headless architectures encapsulated in the MACH framework, businesses are constantly seeking innovative solutions to stay ahead of the curve. Reflecting on past experiences and current industry shifts, it’s evident that while technology evolves, the fundamental principles of addressing business needs remain constant.
Nearly twenty years ago, the buzz surrounding Service-Oriented Architecture (SOA) permeated the tech landscape. XML, SOAP, and other related technologies were heralded as the panacea for enterprise integration and scalability challenges. However, amidst the excitement, there was a lack of clarity regarding the practical implications of adopting SOA. Misinterpretations led to confusion, with some mistaking requests for simple RSS feed implementations as inquiries about complex enterprise architecture decisions.
Despite the initial hype, SOA laid the groundwork for modern architectural paradigms, including the shift towards microservices. This evolution didn’t render SOA obsolete but rather integrated its principles into a more refined approach. Similarly, MACH represents the latest evolution in service-oriented architectures, emphasizing flexibility, scalability, and agility. Drawing on lessons learned from past trends like SOA, MACH offers a comprehensive approach to address the challenges of today’s digital landscape.
MACH isn’t merely a marketing gimmick. It’s a strategic framework designed to address contemporary challenges in the MarTech landscape and deliver real value to businesses. By embracing MACH, organizations can achieve significant benefits, including scalability, flexibility, faster time-to-market, and reduced vendor lock-in.
However, adopting MACH isn’t a one-size-fits-all solution. Just as SOA wasn’t a magic bullet for every business, MACH isn’t universally applicable. Factors such as organizational size, technical expertise, and infrastructure readiness must be carefully considered. While MACH holds promise for many, smaller businesses may find its complexity daunting, while larger enterprises may struggle with tightly coupled dependencies.
In the context of e-commerce, MACH presents compelling advantages, particularly for mid to large-scale businesses seeking composability and flexibility in their tech stack. However, its suitability varies depending on specific needs and constraints. For instance, while Headless architecture offers benefits in delivering consistent user experiences across channels, it may prove overly complex for smaller operations.
The advantages of MACH are numerous, ranging from scalability and agility to cost efficiency and enhanced user experiences. Yet, its implementation requires a thoughtful approach tailored to each organization’s unique circumstances. By carefully assessing needs, leveraging the right technologies, and embracing MACH principles where appropriate, businesses can navigate the complexities of modern MarTech architectures more effectively. Here are some key advantages to adopting MACH:
Scalability: A microservices architecture allows organizations to scale individual services independently based on demand. This enables better resource utilization and improved system responsiveness, essential for efficiently handling fluctuating workloads.
Flexibility and agility: With its microservices and API-first approach, MACH provides flexibility in the independent development, deployment, and scaling of services. This agility is crucial for adapting to changing business requirements and market dynamics, ensuring that organizations can stay responsive and competitive.
Faster time-to-market: Leveraging cloud-native technologies and DevOps practices, MACH enables faster development cycles, continuous integration, and continuous deployment. This results in quicker releases and a reduced time-to-market for new features and products.
Improved developer productivity: MACH often involves decentralized development teams taking ownership of individual microservices. This can lead to increased developer productivity as teams can work independently without being tightly coupled to other parts of the system.
Technology diversity: By emphasizing an API-first approach, MACH enables diverse technologies to communicate seamlessly. This allows organizations to choose the best tools and technologies for each microservice, promoting innovation and avoiding technology lock-in.
Enhanced user experience: MACH’s headless architecture decouples the front end from the back end, allowing for greater flexibility in designing user interfaces. This separation enables the delivery of a consistent user experience across various channels and devices, enhancing customer satisfaction and engagement.
Cost efficiency: Organizations are enabled to take advantage of cloud services leading to cost savings by optimizing resource usage, benefiting from pay-as-you-go models, and reducing the need for extensive infrastructure management.
Easier maintenance and updates: With its microservices architecture and decentralized development model, MACH makes maintenance and updates more manageable. Changes to one microservice can be made without affecting the entire system, reducing the risk of errors and downtime.
Better resilience and fault isolation: If one microservice fails, it doesn’t necessarily impact the entire system, improving resilience and system reliability, critical for ensuring uninterrupted business operations.
Integration with third-party services: MACH’s API-first approach facilitates seamless integration with third-party services, allowing organizations to leverage external tools and services easily, expanding their capabilities and offerings.
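The headless principle in particular can be shown in a few lines. In this hypothetical sketch, a channel-neutral "back end" returns plain data, and each front end decides independently how to present it; all names and values are invented for illustration.

```python
def product_api(product_id: str) -> dict:
    """Stand-in for a real headless content/commerce API call."""
    return {"id": product_id, "name": "Travel Card", "price_gbp": 9.99}

def render_web(p: dict) -> str:
    # The web front end turns the neutral data into HTML.
    return f"<h1>{p['name']}</h1><p>£{p['price_gbp']:.2f}</p>"

def render_push_notification(p: dict) -> str:
    # A mobile channel renders the same data completely differently.
    return f"{p['name']} now £{p['price_gbp']:.2f}"

p = product_api("sku-123")
print(render_web(p))
print(render_push_notification(p))
```

Because the presentation layers share nothing but the data contract, a new channel can be added, or an existing one redesigned, without touching the back end, which is the decoupling the headless part of MACH promises.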
As we continue to witness technological advancements and market shifts, the evolution of architectural trends in MarTech will persist. With careful consideration and strategic implementation, MACH can drive significant value and propel businesses toward success. While MACH represents the current pinnacle, it’s essential to remain adaptable and forward-thinking, recognizing that
the landscape will continue to evolve and new paradigms will emerge. In this ever-changing environment, the key lies in embracing innovation while staying grounded in principles that drive tangible business value.
Mark Barrett is an accomplished sales professional with a proven history of growing technology-focused companies. Mark has a deep background in running software and services firms and a proven track record of helping clients and partners achieve their goals. Prior to founding Aionic Digital, Mark spearheaded the creation of three successful sales organizations for start-ups and bootstrapped a brand-new SaaS product to a successful capital raise with over $12M in ARR. While Mark has held roles as CEO and President, his real passion is building successful sales organizations and cultivating extremely productive executive teams.
Mark Barrett CRO, Aionic Digital
Navigating Global Crises: Strengthening Retail Resilience
Ongoing global crises have created unprecedented challenges for retailers in 2024. Supply chain disruptions, unpredictable consumer demand, and logistical complexities have become the new norm. In such a turbulent environment, businesses urgently need stability and adaptability.
Ecommerce solutions have emerged as powerful tools for businesses navigating these uncertainties. Unified platforms capable of centralising operations across multiple sales channels offer retailers the visibility, agility and flexibility needed to thrive amidst the disruption.
As global crises persist, having an overview of these elements of a business in one place is a powerful tool for businesses seeking resilience and assurance. With innovative approaches and adaptable strategies, retailers can withstand the adversity and emerge stronger in an increasingly volatile world, as Georgia Leybourne, Chief Marketing Officer at Linnworks, explains.
Opportunity Brings Pressure
Retailers of every size now have unprecedented access to new customers across the globe. The extraordinary expansion of marketplaces over the past decade has transformed customer reach and enabled retail entrepreneurs to scale up at pace. This opportunity, however, places new demands on retail organisations to deliver top quality customer experiences every time – and the implications of failure are extremely severe. A supply chain glitch in the past might have led to one or two disgruntled customers whose products were delayed or unexpectedly unavailable. Today, that same glitch can mean a retailer fails to hit its marketplace Service Level Agreement (SLA), risking cutting off an entire revenue stream and potentially devastating the business.
And these supply chain glitches are hitting retailers thick and fast. The crisis in the Red Sea last year added up to two weeks to shipment times for ships rerouting via the Cape of Good Hope and saw costs soar. The damage to the Baltimore bridge has required a rapid rethinking of supply chains as goods are diverted to other ports. Ongoing uncertainty regarding inflation and interest rates continues to affect the retail cost base, from transport and goods to staff. And those are the big-ticket items. Every day retailers face seemingly random decisions that can have unforeseen consequences – such as a change to cycle lanes in London that interferes with regular delivery routes, adding cost and delays; or a new sustainability regulation introduced in another country that affects product manufacture or even the ability to sell in that geography.
Yet many small to medium-sized retailers are completely blind to the potential impact of these events. The reality of the situation only becomes apparent when suppliers fail to deliver, 3PLs confirm contracts cannot be fulfilled or customers complain – often loudly and publicly. Which is, more often than not, far too late.
Achieving Immediate Visibility
Technology has a vital role to play in enabling retailers of every size to improve their resilience. Real-time inventory management allows retailers to identify and respond to supply chain events. With up-to-date information, a retailer can confidently and quickly take the decision to relocate inventory to counteract blockages in the supply chain, for example, find a new source to replace missing inventory or swivel to a drop ship model in certain geographies.
The speed with which a retailer can identify a problem and quickly understand the business implications is key. Alternative routes and suppliers will have finite capacity and, while the enterprise sized retailers will already have supply chain contingencies in place, the rest of the market will be scrabbling to access both products and shipping capacity. The cost of these contingency strategies will, of course, be a key consideration. Retailers need not only real time visibility of logistics and operations but also finances.
This is where the increasing sophistication of technology is providing a compelling solution. By adopting Connected Commerce Operations systems, retailers can seamlessly integrate and automate all their commerce operations across multiple sales channels, from inventory management to order fulfilment. This additional insight improves financial governance by providing both day-to-day insight into profitability and a rapid understanding of the costs associated with risk mitigation strategies when a supply chain event occurs.
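That kind of rapid cost comparison can be sketched in a few lines. Every figure below is invented purely for illustration; a real system would draw live logistics quotes and revenue-at-risk estimates from the connected platform.

```python
# Hypothetical contingency options for a blocked shipping lane.
options = [
    {"name": "reroute via Cape of Good Hope", "extra_cost": 18000, "extra_days": 14},
    {"name": "air freight top-up",            "extra_cost": 42000, "extra_days": 2},
    {"name": "switch to drop-ship partner",   "extra_cost": 9000,  "extra_days": 5},
]

# Assumed revenue at risk per day of delay (illustrative).
COST_PER_DAY_DELAY = 2500

def total_impact(option: dict) -> int:
    """Direct extra cost plus the estimated cost of the added delay."""
    return option["extra_cost"] + option["extra_days"] * COST_PER_DAY_DELAY

best = min(options, key=total_impact)
print(best["name"], total_impact(best))  # switch to drop-ship partner 21500
```

The point is not the arithmetic but the speed: with logistics and financial data in one place, a retailer can run this comparison within moments of a disruption rather than days.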
Collaborative Approach
Being part of an ecosystem is enormously powerful, facilitating both the collaboration between partners required to quickly identify supply chain contingency options and, increasingly, the interplay between organisations that is reducing empty miles and improving fulfilment performance. With the right technology stack within a connected ecosystem, within moments of a supply chain problem occurring, a retailer can discover alternatives, highlight the associated costs and take a decision.
This insight also supports the proactive customer communication required to avoid the negative feedback that could affect marketplace ratings and overall reputation. Customers generally respond well when a retailer not only explains the reason for a delay but also simultaneously outlines its response and confirms when the product, or replacement, will be available.
Technology alone, however, is never a panacea. People and processes are also key. To avoid the risk of an internally created crisis, a retailer must also ensure everyone is not only using consistent information but also sharing that information. A marketing team should never embark on a massive sales outreach campaign without checking that warehouse operations are ready, for example. Warehousing should always inform the rest of the business when it decides to relocate or replace racking, to ensure any possible customer impacts are managed and mitigated. Without strong internal communication, a retailer will undermine its resilience and agility.
Conclusion
Supply chain resilience is now a core component of retail success: companies simply cannot assume goods will be delivered on time. With the complexities created by ecommerce and the rapid rise in customer demands, the supply chain has become an integral part of the overall customer journey – and one that no retailer can afford to take for granted.
The only way to mitigate the risk of potential devastation is to recognise the strategic importance of the supply chain. By implementing operational visibility and creating a collaborative model where the supply chain works hand in hand with the revenue and commercial side of the business, and with a wider ecosystem, a retailer can create the agility required to overcome each new challenge as it arises.
Georgia Leybourne Chief Marketing Officer at Linnworks
The now and next for banking personalisation
Balance is an extremely relevant word in the financial services industry right now. On one hand, the push to drive efficiencies and carve costs out of operating models has dominated strategic agendas for many organisations. On the other, the need to be more customer-focused than ever is vital to staying relevant and competitive. With these often-opposing priorities at the forefront of current decision making, what role is personalisation playing on both sides of the equation, and how can personalising services for the future help drive a growth agenda for financial services?
An inflection point for personalisation
In many cases the industry is at a point of inflection - and indeed reflection - when it comes to personalisation. Many financial institutions have been developing and offering increasingly personalised services, which has been shown to increase revenue and drive customer loyalty. Indeed, a recent study showed that 86% of financial institutions viewed personalisation as a clear strategic priority, with further investments in this area planned by 92% of the same group.
Despite this commitment, in some cases the personalisation agenda has been somewhat diluted and steered off track by other priorities, resulting in basic levels of customer satisfaction being missed. With efficiency drives taking priority for many, there is often a lack of depth in truly understanding pain points and the experiences consumers are looking for in today’s climate.
Going back to basics can often help resolve this as a starting point. Of course, longer term a truly emotional connection needs to go beyond the basics. As well as offering personalised solutions, financial institutions must be socially responsible, provide value-added services and support, and ultimately align with an individual’s values to create an emotional connection for the future.
The physical versus digital debate
There has been much discussion over recent years about where consumer services should be offered, and what the scope of consumer touchpoints should look like. As part of this, considering the role physical services play in the topic of personalisation is critical. For example, is it more important that the staff in your local branch recognise you and can tailor your in-branch service, or that you only get relevant product offerings through your banking app? Most consumers want a blend of the two, highlighting the importance of creating connected experiences.
Balancing people and technology to get the optimal combination of services creates the foundation for achieving overall customer satisfaction and driving revenue opportunities. Going too far down the automation route has, in some cases, led to the elimination of personal value. Getting back to creating a value-added exchange of experiences can leapfrog organisations ahead in the competitive landscape.
Inclusive human experiences that create a blend of both physical and digital banking services will enable financial institutions to be bold and take the required steps forward for growth. In addition, baking innovation and the latest technology into the consumer offering will facilitate a ‘routine innovation’ approach and ensure that services are relevant, timely and fit for the current economic climate.
Making data part of the journey
We all know the power of creating data-led customer journeys and the role they play in effective personalisation of services - but how advanced is the financial services sector in actually implementing such journeys? With an unpredictable economic and political climate, in some cases the focus on capitalising on data, and the insights it brings, has taken a back seat.
However, this is not the case for all organisations, with some taking strides ahead in their use of data. Many financial institutions are effectively embracing data to attract new customers as well as maintain loyalty within their existing customer base. By creating a true understanding of consumers through the intelligence of data, there are more opportunities to create a relationship of trust. Consumers want to feel recognised and understood, and this deeper level of connection is only achieved by adding additional value on top of the basic transactional services offered.
To move forward, the industry needs to embrace data as a non-negotiable and fundamental element of building consumer journeys in the overall customer approach. On top of the wealth of data financial institutions hold on their customers, there may also be a need to invest in external data and take a more proactive approach to utilising social and economic data to create that next level of meaningful and truly personalised customer experiences.
Taking the lead on education
The role of personalisation can also extend beyond simply offering more tailored and relevant services and journeys. Creating and maintaining continued trust and confidence in services goes back to creating that emotional connection.
Financial institutions play a vital role in our economy and in how we manage our money. With the economy likely to be more unpredictable in the future, does the industry have a responsibility to play more of an educational role rather than simply act as a transactional provider of banking services? Some organisations are spearheading the push for consumer education, and this is likely to pay dividends in an increasingly competitive environment. Not only can offering consumers tailored education and support with financial services help them embrace and utilise your individual services more effectively, it can also provide a platform for advancing financial inclusion as a whole – truly allowing consumer choice and empowerment for people across society.
In summary, the future of personalisation requires a dual approach. Firstly, financial institutions should take now as the opportunity to go back to basics and reassess whether they have a true understanding of their customers’ needs in today’s environment. Once this solid foundation has been built, embedding a commitment to personalisation across the consumer journey ecosystem will drive value creation, boost revenue opportunities and customer engagement - ultimately creating more collaborative and two-way customer relationships for the future.
Matt Phillips Head of Banking, UK and Ireland, Diebold Nixdorf
Scope 3 Emissions:
Seeking Clarity in a Sea of Uncertainty
Background
It is commonly recognized that Scope 3 emissions represent the majority of emissions for most companies. Unsurprisingly, investors and regulatory authorities continue to advocate for corporate financial disclosures to include greenhouse gas (GHG) emissions that arise from the goods and services acquired to produce a company’s outputs (upstream) and as a consequence of the use of its products (downstream). Yet chief financial officers (CFOs) face two dilemmas: 1) the methodological and operational challenges of determining the emissions of GHGs outside of the company’s control; and 2) the absence of regulatory clarity as to how they are to be disclosed, with multiple frameworks proffering guidance. And these challenges do not even account for the assurance, verification, and enforcement structure that is required but yet to be built.
Consequently, CFOs face new emission-reporting responsibilities and risks.
The Transition
In the midst of the call for “more detailed, consistent, reliable, and comparable information” on firm-specific, climate-related impacts, risks, and opportunities, multiple parties have developed frameworks offering guidance as to how such information is to be estimated, including the Greenhouse Gas Protocol, CDP, Global Reporting Initiative, Transition Plan Taskforce, and Taskforce on Nature-related Financial Disclosures (TNFD), as well as the IFRS.
However, the profusion of such frameworks, with their diverse guidance as to the data and methodologies that may or should be employed in calculating, estimating, and reporting Scope 3 emissions, runs contrary to investors’ needs for transparency, consistency, and comparability.
Fortunately for CFOs, the IFRS has been actively seeking to reduce the complexity of the reporting process, including for Scope 3 emissions, by entering into strategic relationships with regulatory authorities and framework providers. The agreements with regulators intend to improve the clarity of the reporting process by harmonizing the information to be disclosed. The agreements with framework providers seek to simplify the data collection and reporting process through the development of a common understanding of how the various frameworks map against each other and the reporting requirements put forth by the regulatory bodies.
Yet gaps remain. For example, the IFRS Climate-related Disclosures Standard (IFRS S2) allows relief from reporting Scope 3 emissions where an enterprise determines it impracticable to do so. In contrast, the EU’s regulation on corporate sustainability disclosures (EU CSRD) provides no such relief.
While the EU CSRD takes no position on the accuracy or precision of the current state of Scope 3 data, the uncertainty of estimating these emissions is evidenced by the range of approaches that may be employed when such information cannot, with “reasonable efforts,” be collected directly. These include “data from indirect sources,” “sector-average data,” “sample analyses,” “market and peer-groups,” “scenario or sensitivity analysis,” “spend-based data,” or “other proxies.”
In short, the situation imposes a heavy burden on CFOs: how is a company to report what is difficult to measure, is not yet fully defined, and requires choosing among potentially dissimilar options?
The Role of the CFO
The recent enactment of the EU’s sustainability disclosure requirements sets a landmark precedent and will provide insight for regulatory authorities, reporting enterprises, investors, and civil society organizations to agree on methodologies for reporting Scope 3 emissions information in a way that meets the needs of the capital markets.
But this will take time. During this period of regulatory adjustment, reporting enterprises remain responsible for meeting evolving reporting requirements. The following practices can serve to mitigate risks that must be managed by CFOs and controllers, who are instrumental in such a transition:
1. Maintain general ledger flexibility, including managerial information availability and reliability
2. Build processes to identify and assess materiality of climate-related risks, including Scope 3 emissions, and key assumptions employed in and limitations of those assessments
3. Assess the quality of the data—both internal and that acquired from third-party vendors—employed in meeting reporting requirements
4. Document processes and data transformation employed to generate climate-related information, including Scope 3 emissions
5. Establish independent validation mechanisms to challenge emissions-related processes, methodologies, and outputs, including estimated financial impacts
6. Benchmark the enterprise’s results against similar enterprises and/or industries
7. Ensure consistency with the type of assurance acceptable under IFRS and US Generally Accepted Accounting Principles (GAAP) when reporting CSRD metrics and criteria for GHG emissions
8. Develop a governance program that involves and informs senior corporate management, including the audit committee and board of directors, on proposed methodologies and key concerns from external and internal auditing required for the production of climate-related disclosures.
Conclusion
The EU’s CSRD begins the transition to the formal development of regulatory and accounting standards for the reporting of Scope 3 emissions-related financial and operational impacts. Given the complexity and uncertainty associated with the measurement of Scope 3 GHG emissions, the transition inevitably will present challenges for CFOs. Regulators, investors, and boards of directors will analyze the provided disclosures, with CFOs caught between calls for greater transparency and methodological alignment and the need to protect commercially sensitive information. We believe, however, the CFO can take actions that will enhance the robustness of the information sought by investors while mitigating the regulatory ambiguity inherent in the transition.
Carlos Pareja Vice President, Internal Audit, Capital Planning, Morgan Stanley
Kenneth Grant Managing Director, Energy Policy and Regulation, Berkeley Research Group
The opinions expressed are those of the authors and do not necessarily reflect the views of the organizations, their clients or [Publisher], or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.
Integrating Sustainability for Long-Term Business Resilience and Value Creation
Companies that proactively address climate risks are better equipped to adapt to a rapidly evolving business environment, making them more attractive to investors seeking sustainable long-term returns.
Addressing environmental concerns is now essential for the long-term success of a business. Climate change and the damage to essential global ecosystems are no longer just environmental concerns; they present significant financial risks that disrupt business operations and devalue assets.
The recent hurricanes in the Southeastern United States, Helene and Milton, have only highlighted the physical and financial damage of extreme weather events, once again emphasising the need for businesses to embed sustainability into their core strategies. But managing climate-related risks is not just about preventing losses—it is also about seizing opportunities for growth and innovation.
Sustainability Integration: Meeting Investor and Shareholder Demands
Investors and shareholders are increasingly favouring the integration of corporate responsibility into the fabric of a company. Seeing it as necessary for sustainable growth, they are pushing businesses to adopt ESG practices that address climate risks, ensuring resilience in the face of economic and environmental challenges. Businesses have been best guided by frameworks like that proposed by the Task Force on Climate-related Financial Disclosures (TCFD), which provides a structured approach for managing climate-related risks and opportunities.
The TCFD outlines four key elements essential for integrating sustainability: Governance, Strategy, Risk Management, and Metrics and Targets. These pillars help businesses develop robust systems for identifying and managing climate risks, improving transparency, and ensuring they meet investor expectations for long-term resilience and value creation.
Governance: Setting the Tone at the Top
This must begin with governance. As the foundation of effective ESG integration, boards and senior management must take ownership of climate-related risks and opportunities and embed them into the company’s overall risk management and decision-making processes. Investors want to see strong governance practices that demonstrate accountability and proactive management of ESG factors.
Companies that have clear governance structures in place are best able to anticipate regulatory changes and manage operational risks. Good practice here might include assigning board-level responsibility for climate risk. For example, Bayer’s CEO holds direct responsibility for climate protection in their role as Chief Sustainability Officer. With senior leadership actively engaged, companies are much more likely to develop forward-looking strategies that not only mitigate potential disruptions but also seize emerging opportunities in the transition to a low-carbon economy.
Strategy: Aligning Business Plans with Sustainability
To address financial risks, companies must then integrate ESG into their long-term strategies. The TCFD recommends that businesses map climate-related risks and opportunities under different climate change and climate policy scenarios, including those aligned with the goals of the Paris Agreement. Indeed, depending on the scenario – whether that be heightened supply chain regulation or an increase in catastrophic weather events – companies can be affected in diverse and nuanced ways.
Scenario analysis helps companies plan not only for the physical impacts of climate change but also for the market shifts associated with the transition to a low-carbon economy. Customers can increase demand for green products and avoid ‘bad’ products, which in turn dictates the market prices of goods.
But it is not only doom and gloom. Companies can actually capitalise on new market opportunities. For instance, companies that invest in clean technologies, renewable energy, or circular economy models are well-positioned to benefit from changing consumer preferences and regulatory support for sustainable solutions. In the steel and aluminium industries, for example, we can already see some companies moving away from fossil fuel-based energy to renewables.
In contrast, companies that fail to incorporate these strategies may find themselves exposed to transition risks, such as rising carbon prices and declining demand for carbon-intensive products. These companies may lose their competitive advantage as a result of inaction.
The TCFD framework highlights the importance of incorporating both physical and transition risks. This requires a comprehensive understanding of how climate risks can disrupt operations, impact financial performance, and affect long-term viability.
Physical risks, such as the hurricanes in the Southeastern United States, result in significant disruptions to supply chains, damage to infrastructure and, subsequently, financial loss. The only way to withstand these shocks is to improve infrastructure resilience, diversify supply chains, and conduct climate risk assessments.
Transition risks, on the other hand, arise from the global shift toward a lower-carbon economy. This shift, driven by regulatory changes, technological innovation, and evolving market preferences, impacts industries reliant on carbon-intensive processes. Companies that fail to adapt face financial penalties, reputational damage, and reduced market competitiveness. You only have to look at the case of Volkswagen’s emissions scandal for a taste of the huge reputational damage associated with haphazard practice. Effective risk management involves staying ahead of these trends, ensuring compliance with evolving regulations, and investing in sustainable innovations that align with market shifts.
Metrics and Targets: Measuring and Communicating Progress
Clear metrics and targets are essential for tracking progress in managing climate-related risks and opportunities. The TCFD emphasizes the importance of setting measurable goals related to emissions reduction, energy efficiency, and sustainability performance. By disclosing these metrics, companies provide transparency to investors, shareholders, and other stakeholders, demonstrating their commitment to managing ESG risks effectively.
Companies that set ambitious targets for reducing greenhouse gas emissions not only contribute to global climate goals but also position themselves as leaders in sustainability. If we take a look at Nestlé’s bold target to achieve and maintain 100% assessed deforestation-free primary supply chains for cocoa and coffee by 2025, we can understand why such companies are more likely to attract investment from ESG-focused funds and enjoy long-term market advantages. Moreover, tracking progress through clearly defined metrics allows businesses to adjust their strategies as needed, ensuring they remain resilient in the face of changing environmental and regulatory landscapes.
How Climate Risk Management Drives Long-Term Value Creation
Managing climate-related risks is not just about preventing losses—it is also about seizing opportunities for growth and innovation. Companies that proactively address climate risks are better equipped to adapt to a rapidly evolving business environment, making them more attractive to investors seeking sustainable long-term returns.
For instance, businesses that invest in renewable energy, sustainable supply chains, or eco-friendly products are tapping into growing consumer demand for sustainable solutions. Additionally, by mitigating risks related to extreme weather events or regulatory changes, these companies reduce the likelihood of financial disruptions and improve operational resilience.
Incorporating the TCFD’s recommendations into corporate strategy ensures that businesses are not only prepared for the risks posed by climate change but also positioned to capitalize on the opportunities presented by the transition to a sustainable economy. This forward-looking approach can lead to enhanced financial performance, improved brand reputation, and increased shareholder value.
Leading the Transition to a Sustainable Future
Integrating ESG into corporate governance, strategy, risk management, and metrics is no longer optional—it is a business imperative. The impacts of hurricanes Helene and Milton have shown the increasing urgency of addressing climate risks, while the global push towards sustainability offers significant opportunities for those willing to innovate and lead.
By adopting the TCFD framework and taking proactive steps to manage climate-related risks and opportunities, businesses can enhance their resilience, meet investor expectations, and generate long-term value. The companies that succeed in this new era will be those that embed sustainability into their core strategies, positioning themselves as leaders in the transition to a sustainable and resilient future.
Karl Schmedders Professor of Finance IMD
Shaping the Future of Banking: Diebold Nixdorf on Technology, Resiliency, and Sustainability
Diebold Nixdorf works with financial institutions across the globe to offer banking solutions that are more personalised, more accessible, more secure, more convenient, more flexible and more consistent across all channels. The company has evolved to meet those requirements and to collaborate as a strategic, committed partner on each client’s transformation journey. Diebold Nixdorf works with financial institutions in over 130 countries with the aim of building more personal connections through its AllConnect Services℠, Vynamic® software suite and DN Series® ATMs. These days, it’s about connecting each person to what they need, when they need it. Global Banking & Finance Review is pleased to present Diebold Nixdorf with the award for Best Banking Technology Solutions Provider Europe 2024, as well as the award for Excellence in Innovation – Banking Technology Solutions Provider Europe 2024.
We were glad to welcome the company's Vice President of Banking Europe, Helena Müller, to receive the awards, and later, I took the opportunity to talk to her about the success of the company.
Phil Fothergill: Helena Müller, welcome to London. It’s so good to talk to you. Congratulations on the awards from Global Banking & Finance.
Helena Müller: Thank you so much. I'm honoured to be here and to receive the awards.
Phil: Let's talk a little bit about the operation of your company. I think everybody knows now that technology is transforming the world of banking and the banking industry. What do you see as the role Diebold Nixdorf will play in that transformation?
Helena: As we know, the market has been evolving for a long time now and as a technology supplier, we take the responsibility to serve and be a partner to the financial institutions in their journey of transformation.
So, we see some of the markets moving into ATM pooling, which is a totally different journey from taking the decision to move into recycling and using the notes in recycling mode. We’re helping our customers navigate the path between digital and physical banking as the industry continues to evolve, using our innovative technology solutions to allow our customers to achieve their strategic goals and meet end-user needs.
Phil: When you say ATM pooling, what does that mean exactly?
Helena: ATM pooling is a utility model in which several financial institutions partner to deliver ATM services to the market as one shared service.
Phil: Well, obviously that is just part of giving a good service to clients and customers. How do you ensure that the service you provide is absolutely seamless and painless for clients and customers?
Helena: For Diebold Nixdorf, it's key to also connect the technology with intelligent software that allows the service to be available to the end users. The main focus is to serve end users in their journey, so they feel that the ATM channel is a part of their other payment transactions.
Phil: Now, I think most people know, particularly here in the UK, that many bank branches are disappearing thanks to technology and the fact that everything is being streamlined. Some people regret it, some don't. In your opinion, what would you say that the future of the physical bank branch is, and how is Diebold Nixdorf actually supporting those transitions and changes?
Helena: I believe that both the end user and efficiency are the main drivers for the transformation and also for the branches. I think we will see that the branches will continue to decline, but at the same time, it's important to connect the digital and physical transactions for the end users. The end users like to feel recognised within the ATM channel as well as in the digital payment channel. So, for us, it's important to connect these two worlds with each other and to serve our financial institutions with good solutions.
Phil: Taking that a stage further, for the average person using a banking facility, either as a business or as an individual, how can you make sure that this is seamless, and that people don't miss the old-fashioned techniques?
Helena: It will all be driven by innovation, and we will see different kinds of new technologies coming up into the market and the ATM channel will also adapt to these payment formats. Today, end users are used to using a card at the ATM, but it might also evolve to mobile payments and mobile transactions at the ATM. Also, going to the branch closures or transformations, the end user will drive what kind of services and touchpoints they want to see in the market and then financial institutions must deliver the right cash journeys in the most efficient way.
Phil: What about resiliency and reliability? There are sometimes, in some companies, faults that cause complete blackouts and so on. How do you ensure that things are smooth in that particular area?
Helena: The topic of resiliency is at the forefront. I think it is included in all strategies today and for the future, and we need to keep resilience in mind as a concept around all services for financial institutions moving forward. This is, for sure, very important within any future strategy.
Phil: Of course, sustainability is a kind of watchword at the moment, in every way and in every industry. How is that increasing focus on sustainability shifting the role Diebold Nixdorf is playing in that transformation?
Helena: Looking into the strategic work and the importance of sustainability here, financial institutions take care of cost efficiency; this is also where we deliver the services. At the same time, we look to the end users to make sure that everybody is included in the possibility to pay. Sustainability for financial institutions will cover cost-effectiveness, but also include everybody as an end user to be able to pay.
As we’ve mentioned, we’re seeing new banking formats and channels emerging that focus on sustainability and longevity. Implementing adaptable services is crucial, and flexible and modular designs of hardware and software are needed to be nimble within a dynamic market.
The efficiency of the services and products themselves is also very important. Reducing the manufacturing carbon footprint, optimising energy consumption, and using recycled and recyclable parts are all coming into focus.
Phil: Would that be very much the policy of Diebold Nixdorf as well?
Helena: For Diebold Nixdorf, we take an end-to-end approach to sustainability across all parts of our business, focusing on three key areas: creating the possibility for sustainable consumer interactions; reducing our carbon footprint, as well as helping our customers reduce theirs; and reducing cost and energy consumption across our customers’ operations.
As part of this it has been really important to make sure that the equipment that we manufacture and produce, and the parts we are sourcing, follow the concept of sustainability; also to ensure that we have less power consumption, that we are using more intelligent technology, and that the manufacturing parts are delivered in a better capacity.
We have also been using AI for many years to continuously improve the performance of ATMs. This creates a more sustainable way of managing systems, for example, performing remote diagnosis on an ATM to avoid sending an engineer out to the site unnecessarily, therefore reducing the carbon footprint of service operations.
Phil: And in that way, you can ensure better sustainability. Let's look at some of the services you provide. There are many, but one that I read about was DN Vynamic Software, which sounds very impressive and it probably is. So, do tell us a little bit about this. How does this actually enhance the banking experience for customers and clients?
Helena: We're very proud of the Vynamic software suite, which connects the customer journey across channels in a more seamless way. The modern ATM landscape is all about journeys. Consumers have come to expect connected experiences. DN Vynamic software allows financial institutions to create highly consistent yet deeply personalized banking experiences.
Equally important are operational journeys – those that enable the financial institution to achieve new levels of automation, efficiency, service and performance. Our software creates more simplified and sustainable operations, allowing the ATM channel to evolve and meet customer needs in a cost-effective way.
Phil: It’s excellent to get just that one particular item as an overview, but obviously there are other technological advancements as well. What do you see as being the important technological advancements in the future of banking and how will your company, Diebold Nixdorf, actually deal with that?
Helena: I think the main drivers will continue to be the end user as well as efficiency, driving technology and the demand for sustainable solutions. As a technology supplier, we have to address our solutions according to those requests.
Phil: Exciting times ahead, then. In the meantime, thank you, Helena, for coming to talk to us today in London. Once again, congratulations on the awards from Global Banking & Finance.
Helena: Thank you so much.
Helena Müller, Vice President of Banking Europe, Diebold Nixdorf
“The beautiful thing about stories is that everybody has one. These stories are told in the lines on our faces, in the glint of our eyes, and continued even on the soles of tiny feet. Your story is not too big or too small, and it certainly isn’t too quiet or too loud,” Absa Group stated earlier this year when it first introduced Your Story Matters—its new brand ambition—to the world. “Whatever shape it takes, it has power and deserves to be heard. There are many stories: stories of love, stories of hope, and stories of courage. Stories with pages you want to read over and over again and stories with pages you want to tear out. All of them have the power to inspire.”
This “power to inspire” has driven the ascension of Absa’s customer service to a new high standard that is unparalleled within South African banking. Combining the three key elements of (i) seamless customer experiences, (ii) human-centred empathy and (iii) outperformance as a financial-services provider, Absa has never been more focused on transforming itself into a purpose-led organisation, one that strives to listen and relate to customers to deliver the most appropriate solution and banking experience to each one.
This approach is paying off handsomely and could perhaps be described as revolutionary in customer-experience optimisation. From the outset, Your Story Matters’ powerful and instantly recognisable brand identity has helped build trust with both existing and potential customers, which, in turn, has fuelled greater loyalty to the brand and, ultimately, greater interest in Absa’s banking products and services. By being memorable and relatable, this branding is setting Absa apart from its competitors—another major draw for potential customers.
Its distinct focus on “human-centred empathy”, meanwhile, is helping to cultivate crucial emotional connections between Absa and its customers, duly fostering a strong sense of belonging and loyalty. Indeed, Your Story Matters is designed at its core to deliver a memorable, empathic experience to all customers, ensuring they feel seen and heard at every stage of their banking journeys with Absa. This is proving a game-changer for customer retention, with the bank’s investment
in brand recognition a vital precursor to establishing a robust foundation to attract and retain new customers, driving growth and success in the long term.
Breakthroughs in digitalisation have certainly helped in this regard, with Absa being able to offer a comprehensive one-stop shop for banking self-service via essential features that add value and impact customers’ lives positively. Customer experience, quality, innovation, security and, of course, Absa’s all-new and powerful brand identity are just some of the key pillars underpinning this digital strategy. As for specific digital-banking applications, a complete ecosystem has been strategically built on a “digital first” premise that involves a hugely popular mobile-banking app, connected (internet) banking, cell-phone banking (Unstructured Supplementary Service Data, or USSD), open web (Absa.co.za), a complete digital-sales platform, and chat (social-media) banking.
Convenience is paramount to the success of this ecosystem. Whereas previously, robust banking policies meant that customers were required to fulfil their requests at physical bank branches, today, Absa’s 400 self-service features have virtually eliminated the need for a customer to visit a branch. As such, customers can conduct almost all of their banking tasks from the comfort of their own homes.
Of course, that comfort can only be sustained when customers are wholly confident that their banking security cannot be compromised—an issue that has become an undeniable priority amid a massive shift in customers’ banking behaviours away from the branch and into the home. World-class digital safety features have thus been employed to provide both the customer and the bank with peace of mind.
In practice, it is arguably the advancements in biometrics that most aptly illustrate this concern, with the bank recently launching a joint venture, AbsaID Facial Biometrics, that enables customers to transact securely by effectively using their faces as their passwords. And it has proven unambiguously popular. Since its launch in 2022, more than 2.5 million verified facial-biometric customers who transact safely and securely through Absa’s banking channels have been recorded.
One can also observe Absa’s championing of empathy bearing fruit on the corporate social responsibility (CSR) front, with its flagship programme nothing short of behemothic in transforming South Africa’s diversity and inclusion landscape. On September 1, 2023, the bank implemented a Broad-Based Black Economic Empowerment (B-BBEE) scheme, allocating a significant 7 percent of the total Absa Group shareholding to structures that seek to mainly benefit black participants. This is achieved, firstly, via a 4-percent evergreen corporate social investment (CSI) component focused on education and youth-employability support and empowerment for thousands of black South African beneficiaries and, secondly, through a 3-percent employee Colleague Share Scheme called eKhaya, a Zulu word meaning “at home”. Participation in eKhaya is free, and while all permanent employees receive an equal allocation, black South African employees receive an additional 20-percent allocation.
The B-BBEE transaction also provides equity ownership to all of Absa’s permanent employees in South Africa, while employees in Absa Regional Operations and other international operations participate equally in a cash-equivalent component. This milestone development underscores Absa’s commitment to be an active force for good, demonstrating how the bank is actively putting its stated purpose to work: “Empowering Africa’s tomorrow, together… one story at a time.”
Absa has put in place a dynamic digital-banking ecosystem with the necessary agility and speed to solve diverse issues while providing a renewed edge and greater accountability, so the customer value chain stays on the cutting edge of banking. And with a digital strategy that will both deepen existing relationships and build ties with prospective and newly acquired customers, while continuously developing and delivering the latest digital capabilities and features, Absa clearly remains at the forefront of innovation—relentlessly updating its digital technology to match customer habits and behaviours and ultimately empowering customers with the gift of choice, when and where they want to use it.
Why a Shift in Regulatory Enforcement Demands a New Approach to Trade Surveillance
Regulators are changing tack. Significant enforcement actions demonstrate an increasing focus not just on whether market abuse occurs, but on whether firms’ surveillance capabilities and internal processes are fit for purpose.
One notable case this year involved deficiencies in wash trading surveillance practices. Despite utilizing modules from its vendor, the firm struggled with excessive false positives and could not adequately justify why it selected certain surveillance modules while omitting others. The regulatory focus here was clear: firms must deploy appropriate and complete surveillance modules, tailored to detect market abuse patterns like wash trades. If certain modules are not used, firms must be able to demonstrate why these exclusions are appropriate.
Elsewhere, a large financial institution faced enforcement action for repeatedly failing to accurately report millions of swap transactions over a five-year period. These issues stemmed from data ingestion problems, where the firm’s systems failed to capture and report critical trade data. Despite being subject to a prior regulatory order for similar deficiencies, the firm had not implemented adequate controls to prevent future occurrences. This enforcement action highlights the regulatory expectation for sustained compliance efforts, especially following remediation agreements.
Similarly, a global financial firm missed surveillance of billions of orders across multiple global trading venues over a period of several years. The firm’s surveillance system failed to ingest direct-from-exchange data, leading to major gaps in trade monitoring.
Lastly, another global financial firm permitted the continued placement of problematic orders near the close for months, despite receiving multiple warnings from the regulator. In this case, the surveillance tool was cited both for erroneous programming and for a lack of follow-up to repeated service requests from the firm.
Unpacking key enforcement trends
While all cases are unique, there are overarching trends that demonstrate where legacy surveillance systems are currently falling short.
Collectively, these enforcement actions suggest that regulatory examiners are increasingly focused on how firms manage internal controls and compliance procedures. This underscores the need for surveillance systems that can automate and enhance internal compliance reviews, helping firms stay ahead of regulatory requirements and avoid costly enforcement actions.
Yet one of the core issues across all of these enforcement cases is alert adjudication, with the process by which firms investigate and close alerts now under heightened scrutiny. Regulators are increasingly looking at whether firms can demonstrate a clear, documented process for handling surveillance alerts. The problem is that many firms are struggling with the volume of false positives, often caused by poorly configured thresholds or overly conservative alert settings.
This brings us to another area of concern: the calibration of surveillance tools. As seen in several enforcement actions, misconfiguration of surveillance tools can lead to an overwhelming number of false positives, which drain resources and hinder the ability to identify genuine market abuse.
Supervisory controls are also in the spotlight. Regulators have made it clear that firms must regularly review their surveillance processes to ensure they remain effective. In one case, a firm’s failure to capture billions of order messages over several years was traced to inadequate supervisory controls. In another, a hard-coded error in a surveillance procedure persisted for a long time.
Empowering firms to proactively mitigate risk
As they navigate heightened regulatory scrutiny in the U.S. and across other jurisdictions, firms must ensure that their surveillance systems are not only comprehensive but also flexible, auditable, and capable of handling complex trading behaviors.
This demands various capabilities across their surveillance programs to overcome ongoing challenges:
• Automation for efficiency: Automating routine alerts allows analysts to focus on exceptions and more complex cases. For example, automatically reviewing routine alerts based on bespoke compliance logic reduces the number of false positives and cuts down on manual work.
• Customization and flexibility: Programs must allow for flexible configuration and deep customization, enabling firms to fine-tune their surveillance thresholds according to their unique risk profiles, trading behaviors, and market conditions.
• Real-time data ingestion and accuracy: A key issue in several cases has been the failure to capture and report data in real-time. Systems must ensure that all relevant data is captured in real-time and that there are no gaps or delays in the ingestion process. Regular system audits and fail-safes should be built into surveillance programs to prevent such issues from occurring.
• Sustainability and auditability: The increasing regulatory focus on sustainable and auditable processes places the onus on firms to demonstrate their compliance efforts easily during regulatory audits. This requires clear audit trails, documented processes, and the ability to generate reports showing how alerts were handled from inception to closure.
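The automation and auditability points above can be illustrated with a minimal sketch. This is not any vendor's product or real surveillance API – the `Alert` structure, the thresholds, and the `triage` function are all hypothetical – but it shows the core idea: routine, low-risk alerts are auto-closed under codified compliance logic, and every decision is appended to an audit trail that can be replayed during a regulatory exam.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Alert:
    alert_id: str
    rule: str              # e.g. "wash_trade"
    score: float           # heuristic/model confidence, 0..1
    notional: float        # trade value involved
    status: str = "open"
    audit_trail: list = field(default_factory=list)

def triage(alert: Alert, auto_close_score: float = 0.2,
           notional_floor: float = 10_000.0) -> Alert:
    """Auto-close low-risk routine alerts; escalate everything else.

    Each decision is written to the alert's audit trail with a UTC
    timestamp, so the full alert lifecycle is reproducible later.
    """
    stamp = datetime.now(timezone.utc).isoformat()
    if alert.score < auto_close_score and alert.notional < notional_floor:
        alert.status = "closed-auto"
        alert.audit_trail.append(
            (stamp, f"auto-closed: score {alert.score} < {auto_close_score}, "
                    f"notional {alert.notional} < {notional_floor}"))
    else:
        alert.status = "escalated"
        alert.audit_trail.append((stamp, "escalated for analyst review"))
    return alert
```

A production system would layer case management and reporting on top, but the combination sketched here – explicit closure logic plus a decision log – is exactly what the documented-adjudication expectation calls for.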
As a trade surveillance vendor with an experienced team of former CCOs, we endeavor to anticipate emerging trends in surveillance requirements. We have worked closely with firms to ensure they are equipped with the tools they need to effectively mitigate risks highlighted in the risk assessments—while improving the overall efficiency and effectiveness of their surveillance programs.
Joe Schifano, Head of Global Regulatory Affairs, Eventus
Navigating AI Bias: Ensuring Data Integrity in the Age of Generative AI
Since the emergence of generative artificial intelligence (AI), the business landscape has transformed into a new world of efficiency and innovation. Many view generative AI as the ultimate silver bullet in organisations’ arsenals – it streamlines processes, discovers new avenues for innovation, analyses masses of data within seconds, and even personalises the customer experience.
This wide array of benefits has led to 34 percent of UK businesses implementing at least one form of artificial intelligence into their systems in 2024, and 59 percent have indicated that their spending on AI projects will rise compared to 2023.
However, despite its popularity, the use of AI does not always guarantee success. Many organisations have implemented it into their systems without adequately preparing their data and processes, leading to a minefield of issues.
When AI goes wrong
Although AI is often perceived as impartial, the reality is that it is a product of the data that fuels it. If there is a lack of representation in the data powering an AI model, for example, it is highly likely to produce biased outputs. In fact, AI bias – the production of incorrect outputs due to inaccurate, incomplete, or unreliable data – is one of the most prominent AI-related issues facing modern businesses. For many organisations, their data lives in silos, is stale or unstandardised, is full of duplicates, or lacks the insight required to make it usable. This translates into a range of problems, such as irrelevant or inaccurate outcomes, which can lead to significant real-world consequences.
For example, in the financial services industry, AI is often used for credit scoring to determine the creditworthiness of loan applicants. However, if the data used to train the AI model is biased, it can lead to unfair outcomes. For instance, if the historical data includes a disproportionate number of loan defaults from a particular demographic group, the AI model might learn to associate that group with higher risk. As a result, the model could unfairly deny loans to individuals from that demographic group, even if they are financially stable and creditworthy.
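One concrete way to surface this kind of skew is to compare approval rates between groups. The sketch below uses entirely synthetic decisions and a hypothetical `disparate_impact` helper; the "four-fifths rule" threshold it mentions is a widely used heuristic for flagging adverse impact, not a statutory test.

```python
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates, protected, reference):
    """Approval-rate ratio; values below ~0.8 (the 'four-fifths
    rule') are a common red flag for adverse impact."""
    return rates[protected] / rates[reference]

# Synthetic decisions: group B is approved far less often than group A.
decisions = [("A", True)] * 80 + [("A", False)] * 20 \
          + [("B", True)] * 50 + [("B", False)] * 50
rates = approval_rates(decisions)
ratio = disparate_impact(rates, "B", "A")   # 0.5 / 0.8 == 0.625
```

A ratio this far below 0.8 would prompt a closer look at both the training data and the model's decision logic before the system is trusted with real applicants.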
This is already occurring in the real world, in fact one financial institution’s AI system was found to be biased against women. The system was trained on historical data that reflected existing gender biases in lending practices. Consequently, women were more likely to be denied loans or offered less favourable terms compared to men with similar financial profiles. This bias not only perpetuated existing inequalities but also harmed the financial institution’s reputation and led to regulatory scrutiny.
Ultimately, as AI is integrated into all aspects of the business world, organisations have a responsibility to ensure it is accurate and reliable, as the ramifications can be detrimental. Underpinning these tools with trustworthy data is critical for any organisation looking to capitalise on the endless capabilities that AI can offer. Consequently, many organisations are looking to boost their data integrity by prioritising data integration, data quality and governance, location intelligence, and data enrichment.
Break down silos with data integration
Many organisations are vulnerable to AI bias as they have an array of data living in different systems, stored in different formats. When data is siloed across lines of business and across data platforms, it is extremely difficult to create an accurate and unified view of the organisation’s data. As a result, the AI’s outputs may not fully reflect the available information, potentially leading to ineffective recommendations, such as marketing campaigns that fail to consider the latest point-of-sale data.
By integrating critical data across cloud, on-premises, and hybrid environments as well as across business functions, organisations can help ensure that data is integrated, complete, consistent, and accurate – improving the reliability of AI results and reducing the risk of errors and biases.
Build robust frameworks for data quality and governance
Collating wide ranges of data and translating it into one format is not enough by itself to prevent AI bias. Data can still be inaccurate, inconsistent, missing, or contain duplicates. Consequently, including all relevant and critical data is only the first step – organisations must also ensure the quality and governance of the data fuelling AI models.
A robust data quality strategy should include tools that monitor the health of data on an ongoing basis and cleanse, de-duplicate, and validate critical data, while also producing dashboards and automated workflows. This helps companies proactively observe, detect, and address data quality issues faster and with less difficulty.
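As a rough illustration of what such tooling does under the hood, the following sketch (hypothetical records and rules, not any vendor's product) validates, checks completeness, and de-duplicates a small batch of customer records, reporting each issue it finds alongside the clean rows.

```python
import re

RECORDS = [
    {"id": 1, "email": "ana@example.com", "country": "GB"},
    {"id": 2, "email": "ana@example.com", "country": "GB"},   # duplicate
    {"id": 3, "email": "not-an-email",    "country": "GB"},   # invalid
    {"id": 4, "email": "bo@example.com",  "country": ""},     # incomplete
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_report(records, key="email"):
    """Split records into clean rows and (id, reason) issues."""
    seen, clean, issues = set(), [], []
    for rec in records:
        if not EMAIL_RE.match(rec["email"]):
            issues.append((rec["id"], "invalid email"))
        elif not rec["country"]:
            issues.append((rec["id"], "missing country"))
        elif rec[key] in seen:
            issues.append((rec["id"], "duplicate"))
        else:
            seen.add(rec[key])
            clean.append(rec)
    return clean, issues

clean, issues = quality_report(RECORDS)
```

In a real pipeline these checks would run continuously and feed the dashboards and automated workflows described above, rather than a one-off pass over a list.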
Data governance helps align technology, people, and processes, giving organisations a wider understanding of their data. This creates enhanced visibility, which strengthens the accountability and quality of an organisation’s data assets and allows them to be monitored for compliance with privacy and security regulations. By combining data quality and governance frameworks, organisations can considerably enhance the extent to which trustworthy insights can be drawn from their data and AI models. Data governance is also a key part of AI governance.
Enrich data for contextual relevance
Regardless of whether data is accurate or complete, if it lacks context its outputs will be vulnerable to biases, as it won’t be able to consider the subtle and complex details that could alter the value of the data. For example, if AI was used to predict flight demand across a calendar year and the data it used included the year of the COVID pandemic – without the correct context around it, the data would not be an accurate representation of normal flying habits. Therefore, for data to lack bias, it must also include context to eliminate the impacts of anomalies.
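As a toy version of that flight-demand example (hypothetical data and function names throughout), the sketch below builds a per-month demand baseline while excluding years flagged as anomalous, so a one-off shock such as the pandemic does not drag down the average.

```python
from collections import defaultdict

def seasonal_baseline(history, exclude_years=frozenset({2020, 2021})):
    """history: {(year, month): passenger_count}.

    Average each month across years, skipping flagged anomalous
    years so one-off shocks don't distort the forecast baseline.
    """
    sums, counts = defaultdict(float), defaultdict(int)
    for (year, month), value in history.items():
        if year in exclude_years:
            continue
        sums[month] += value
        counts[month] += 1
    return {m: sums[m] / counts[m] for m in sums}

# July demand: normal in 2019 and 2022, collapsed in 2020.
history = {(2019, 7): 1000, (2020, 7): 50, (2022, 7): 980}
baseline = seasonal_baseline(history)
# July baseline uses 2019 and 2022 only: (1000 + 980) / 2 == 990
```

The context lives in `exclude_years`: without that flag, the 2020 figure would pull the July baseline down to roughly 677 and misrepresent normal flying habits.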
Enriching data with trustworthy third-party datasets and geospatial insights can help companies to improve the diversity of their data whilst also revealing any undiscovered patterns that could have been previously overlooked. Datasets that enhance these insights include demographics data, detailed address information, consumer behaviour, points of interest data, and environmental risk factors.
When AI is fuelled by contextual data, organisations can be confident that they are facilitating the most relevant and reliable outputs for all possible applications, whilst minimising the likelihood of data bias.
Data integrity is the key to AI success
Organisations see AI as the technology world’s new gold rush and the ultimate way to remain competitive. However, as AI deployment grows, so will the number of organisations that suffer from AI-related issues, including bias. Mitigating bias requires de-biasing the data as well as addressing algorithmic bias and systematically inspecting algorithmic decision-making.
As a result, organisations must ensure that the data used to train their AI models and the data used for AI predictions is reliable. If the data is inaccurate or irrelevant, it can lead to significant repercussions, including regulatory fines and reputational damage.
Now more than ever, organisations need to be proactive when it comes to building a meaningful, sustainable data strategy that combines data integration, data governance and quality, location intelligence, and data enrichment. By doing so, they can use AI and generative AI to accelerate data-driven decisions.
Tendü Yogurtcu, PhD, Chief Technology Officer at Precisely
Rethinking the Cost of Sustainability Reporting
With new disclosure requirements pushing for enhanced transparency in environmental, social, and governance (ESG) practices, many companies are seeking to reduce expenses for preparing sustainability reports and complying with regulations. Our research, however, indicates that this strategy is ‘penny wise, pound foolish’. Unless companies rethink the cost of sustainability reporting, many may soon face its true costs: market value losses from inadequate reporting.
The most effective strategy for reducing the costs of sustainability reporting, paradoxically, is to invest more in it. Our findings suggest that the market value losses caused by inadequate reporting far exceed what would be considered reasonable
expenses to prepare a thorough report. Companies should, therefore, shift their view of sustainability reporting from a compliance obligation to a strategic investment – one where transparency is a competitive advantage.
A paradox
The European Corporate Sustainability Reporting Directive (CSRD), aiming to enhance transparency on sustainability issues, will impact nearly 50,000 European firms over the next three years. Similar regulations are in the making in the United States and other jurisdictions. Many companies consider these rules a regulatory burden, with significant compliance costs on the horizon.
However, the less you spend on sustainability reporting, the more costly it becomes. This paradox emerged from our research on the economic consequences of a sustainability reporting mandate. As the CSRD has yet to take effect, our focus has been on its predecessor, the Non-Financial Reporting Directive (NFRD). Like the CSRD, the goal of the NFRD was to create a standardized, transparent system for disclosing sustainability information, helping stakeholders make informed, sustainable decisions. Yet our research shows the regulation’s most significant impact has been on companies that might otherwise withhold or obscure sustainability information.
Companies that began disclosing sustainability information long before any reporting mandate, reflecting their genuine commitment to transparency, were already providing meaningful disclosures that reduced information asymmetry and increased investor trust; the Directive has had a minimal additional effect on them.
On the other hand, companies that began reporting only because it was mandated often chose to rely on generic, boilerplate language in their disclosures – a strategy that might lower the cost of preparing the report, but ultimately fails to add real value. The result? Greater information asymmetries and a decline in firm value.
How to get it right
Considering our findings on the negative consequences of inadequate reporting, the question arises: how can companies effectively minimize the true costs of mandated sustainability reporting, or even make it a profitable practice? Through our analysis of the sustainability reporting practices of firms with a genuine commitment to transparency, we have identified the most important strategies to consider.
• Be proactive and forthcoming: Companies that proactively engage in sustainability reporting are better positioned to create meaningful and valuable reports. This approach not only builds trust with stakeholders but also helps avoid the pitfalls of boilerplate disclosures.
• Do not withhold information: Reporting ‘bad news’ is often better than reporting ‘no news’ at all. If you withhold information, investors and other stakeholders may assume the worst and react accordingly, leading to a loss of confidence and a
potential drop in firm value.
• Tailor your reporting: Generic reports often fail to address stakeholders’ specific concerns. By tailoring reports to highlight the unique aspects of your business and its impacts, you can provide more relevant information, reducing information asymmetries and enhancing firm value.
• Leverage established frameworks: While some regulations allow flexibility in choosing reporting frameworks, aligning with well-recognized standards developed by the Global Reporting Initiative (GRI) or the International Sustainability Standards Board (ISSB) can lend credibility to your reports and make them more comparable across industries.
• Monitor and adjust continuously: Sustainability reporting is not a one-time task but an ongoing process. Regularly reviewing and adjusting your reporting practices in response to stakeholder feedback helps maintain transparency and relevance, ensuring that your reports continue to add value.
It’s an investment
Although the shift towards enhanced transparency in sustainability is not without its challenges, it doesn’t have to be a financial burden. Companies can shift their perspective on sustainability reporting from a compliance obligation to a strategic investment. After all, in today’s business environment, transparency isn’t just a regulatory requirement – it’s a competitive advantage.
Michael Erkens is a professor at Nyenrode Business University and an associate professor at Erasmus School of Economics. He specializes in financial and sustainability reporting and corporate governance. His research addresses firms’ disclosure practices, executive compensation, and the effect of regulations on firm behavior.
Ries Breijer, an assistant professor at Nyenrode Business University, has a strong interest in financial and sustainability reporting. His research focuses on the regulatory impacts on sustainability reporting, the consequent economic and real effects, and how changes in financial reporting standards can contribute to a more sustainable economy.
Dr. Ries Breijer
Dr. Michael Erkens
The Fintech Art of Scaling from a Single ‘Killer’ Feature
The financial sector isn’t for the faint-hearted.
It’s a category of established and well-funded heritage and high-street brands, as well as famed, PR-grabbing unicorns. There is also a constant wave of emerging fintechs entering the market, all aspiring to global scale on the promise of better, customer-centric FS solutions.
Having led strategy for well-established banks as well as for some of the most exciting fintech start-ups, I know that when a business has ‘cracked the tech’ and can genuinely deliver game-changing but broad FS solutions, it can be daunting to know where to start in your customer communications, given your long list of functional and emotional benefits. The issue is compounded by the fact that you are a new, unknown (and therefore untrusted) brand.
So it may reassure those businesses to know that, while it seems counter-intuitive, there is a strong argument for ‘less is more’ when launching and establishing yourself in market.
If your business has created a product that is genuinely amazing, with so many features and benefits that you know it will conquer the category, a clever launch strategy is to drip-feed your innovation piece by piece through your marketing, to avoid hitting your potential customer with a single macro solution that may be hard to comprehend and even harder to trust.
Airbnb has revolutionised a static and archaic category, but didn’t launch with its ‘live like a local’ messaging at the outset. Instead of leaping to the full potential of the product and the experiences it creates, its marketing team focused on immediate and more tangible benefits, that of monetizing your spare room, or staying somewhere cheaper than a hotel.
This approach requires the brand to consciously take a step back and acknowledge that the bigger brand promise and delivery actually needs to be simplified at first. Frustrating if you have much more under the hood but, I’d argue, necessary to ensure your launch has a clear focus and clear, understandable benefit.
Ask yourself, ‘what is the killer feature of my app or FS product?’. If you could only talk about one, what would it be? If that feature has a clear value exchange to your customer, then it’s often a much better way to launch with that, showing one card rather than the whole pack and ensuring consumers ‘get what you do’.
In short, sell an immediate benefit, not your entire dream.
The following ten financial service startups knew that. They started with one solid feature or benefit and doubled down on it. Only then did they scale into the financial multitaskers they are today.
The Single-Feature Success Stories
Robinhood went all in on commission-free trading. They slashed the fees and the hassle for new traders. Once they had them hooked, they branched out into crypto and banking, turning no-cost trading into a gateway for financial exploration.
Acorns made investing as easy as buying a coffee. Pop your spare change into an investment account and watch it grow. Their winning idea? People are more willing to invest pennies than piles of cash. Once trust was built, Acorns sprouted into banking and retirement services.
Venmo made splitting dinner bills a breeze, with a social twist. But here’s the deal: They didn’t just make payments easier, they made them fun. From there, it wasn’t just about paying your friend for pizza – it was about being part of a community that could also pay businesses.
Square offered a little white square that turned phones into cash registers. Magic for small businesses, right? They honed in on those who felt left out by the big payment processors and then broadened into loans, payroll, and more.
Stripe seized the confusion of online payments and threw it out the window. Catering first to developers with sleek, simple integration, they caught the tech wave and rode it straight into a sea of financial solutions.
WealthSimple simplified the complex world of investing and targeted those who wouldn’t know a stock from a bond. With robo-advisors handling the investments, they paved the way for tax services and private portfolios.
TransferWise (Wise) revolutionised the money transfer process by being transparent with fees. They underlined the ‘hidden fee’ pain point, scratched it out, and created a loyal user base that helped them grow into a financial wunderkind with borderless banking.
SoFi addressed the elephant in the room – student debt – by offering refinancing to save grads from sinking. Once they secured a foothold, they leapt into insurance, investing, and even a card to rack up credit.
Chime enticed with a no-fee checking account, hooking those tired of being nickel-and-dimed. The initial draw? An intuitive app and automated savings. They let their customers’ savings grow while adding credit builders and early access to earned wages.
Revolut targeted the nomads and travellers, slashing the costs of currency exchange, and delivered a multi-currency debit card worth talking about. With their foot in the door, they upped their game to include stock trading and cryptos.
These are all perks of stripping back to a killer feature or value exchange for launch.
Starting with a single feature isn’t being timid, it’s being strategic. These companies knew the potency of simplified marketing: one feature, one message. Effortlessly digestible for the chronically busy entrepreneur.
If your product isn’t exactly where you want it to be for launch, then zooming into a single feature that is ready – assuming you believe it to be a game-changer – can also reduce development costs pre-revenue.
Lastly, consider this: a unique offering carves out a niche, meaning reduced competition. These companies didn’t throw a hodgepodge of services to see what stuck. They owned one. And when you own your space, scaling isn’t just slapping on more features; it’s growth with purpose.
Toby Strangewood Co-Founder, Wake the Bear
Banking in 2035: How emerging technologies will transform the way we bank
The year is 2035. You wake up and check your finances through a voice-activated digital assistant, who appears as a hologram of Elvis. After being authenticated via voice and fingerprint biometrics, the late King of Rock and Roll provides a snapshot of your spending, savings, and investments in a personalised dashboard containing all your accounts and financial data in one place.
As futuristic as this may seem, this is how banking in just over a decade could look and feel. By investing in the right areas and harnessing the power of emerging technologies, today’s financial services industry will be almost unrecognisable in 2035. This will be down to five key technology trends already impacting finance today: quantum computing, artificial intelligence, Decentralised Finance (DeFi), digital ecosystems, and biometric security.
Quantum computing unlocks hyper-personalisation
The convergence of quantum technologies and the financial services industry can provide greater accuracy, efficiency, and security in the products and services shaping the sector in 2035.
One of the most transformative aspects of quantum computing will be enabling real-time hyper-personalisation of financial services. Traditional computing faces limitations in processing and deriving insights from massive datasets. Quantum, on the other hand, introduces game-changing power to analyse big financial datasets combined with deep learning algorithms.
Banks will use these capabilities to understand each customer’s unique needs, preferences, and values. When you walk into a bank branch, a biometrically authenticated quantum AI assistant will already have an intimate grasp of your financial personality, risk appetite, and life goals. All this information will save you time when applying for a credit line or micro-loan, or even creating an investment portfolio. Quantum computing technology will have tailor-made options ready for customers to choose from based on their preferences and financial capabilities.
AI will automate and secure banking as we know it
AI will be at the heart of many disruptions that take place in the banking industry by 2035. Banks are already using AI chatbots to handle customer service queries and for process automation. Integrating quantum power and generative techniques will unlock unprecedented applications for AI in the future.
Smart contracts running on decentralised networks will settle complex interbank transactions and derivatives trades near-instantaneously with complete transparency. Real-time analytics of operations data will allow AI to monitor risks and anomalies and initiate preventive measures continuously. Likewise, fraud identification models will leverage biometric surveillance and quantum pattern recognition to flag threats before they occur, meaning banking will be more secure than ever before.
DeFi and cryptocurrencies will redefine how money moves
Decentralised Finance (DeFi) – which removes financial intermediaries through blockchain networks – and cryptocurrencies will fundamentally change how money moves and banking operates by 2035. Direct access to crypto assets, peer-to-peer (P2P) transactions, and yield-generating open platforms will become mainstream thanks to integrations by trusted financial institutions.
Banks may provide access to regulated stablecoins fully collateralised by currency and government bonds to enable instant domestic and cross-border money movement 24/7. This will mean customers can directly exchange currencies during foreign travel or split bills across currencies in real time without fees or delays.
Banks will also allow clients to tokenise and trade assets, including real estate, precious metals, intellectual property, and more, through licensed decentralised exchanges wired into the traditional finance system. Open lending platforms will enable peer-to-peer loans using cryptocurrency as collateral, and interest rates could be set algorithmically based on supply and demand.
Banks embracing crypto today, like Goldman Sachs and DBS, will have a first-mover advantage as DeFi matures from its early days. Their critical role to ensure its success will be integrating decentralised finance into trusted and secure platforms.
Digital ecosystems and embedded banking weave finance into daily life
The trend of integrating financial services into non-financial platforms is gaining significant traction already with companies like Uber, Amazon, and Apple. From embedded finance to open banking and P2P payments and wallets, the industry’s new collaborative operating models are evolving how financial services are accessed, delivered, and experienced. This convergence will accelerate greatly by 2035.
Your digital interactions – messaging friends, shopping online, or riding in an autonomous vehicle – will seamlessly blend personalised financial features powered by banks and fintechs behind the scenes. For example, when grocery shopping online, you can apply for and receive real-time approval on a micro-loan through your bank to cover the purchase. Your personalised lease extension will be automatically initiated for your review and biometric e-signature – no need to schedule meetings or submit applications.
Biometrics will make banking invisible
John Da Gama-Rose, Head of Banking & Financial Services for Global Growth Markets, Cognizant / Nageswar Cherukupalli, Senior Vice President, Head of Banking & Financial Services, Americas, Cognizant
The proliferation of biometric authentication – fingerprints, facial recognition, iris scans, and more – will provide the security scaffolding to make banking invisible in 2035. Passwords and PINs will be replaced by your unique biological identity as your banking passcode.
This will enable more convenient and secure customer experiences. You may access your account via voice assistant just by speaking. Authorising payments could involve simply looking at your phone. Transferring money would only require a quick selfie.
The runway for 2035
The banking experience will be vastly different in 2035 with significant advancements in computing power, intelligent systems, decentralisation, and biometrics. Being digital-first will no longer be enough. Competing in 2035 will require becoming “technology-infused.” But amid this change, banks must keep financial access, inclusion, and sustainability central to ensure technology elevates humanity.
The future of banking will be defined by those who take decisive action today to shape it for the benefit of society. The runway to 2035 is open. It’s time for institutions to accelerate and take banking to the next level.
Banks are ready to embrace AI, but customers are hesitant: How to bridge the trust gap with GenAI and customer communications
Scott Draeger, SVP of Product Marketing & Vertical Solutions, Smart Communications
Scott Draeger is the SVP of Product Marketing and Vertical Solutions at Smart Communications. With a passion for collaboration, he focuses on how communications can be better for the recipient and perform better for the business. He started as a document designer using a collection of hardware and software technologies, before moving to the software side of the industry. His broad experience includes helping organizations improve heavily regulated customer communications all around the world.
Generative AI (GenAI) has unquestionable potential for the banking industry, with some estimates suggesting that it could add the equivalent of $200 billion to $340 billion to banking revenues each year. Banking leaders are already taking steps to capture that value, with over $20 billion invested in AI technology last year – the most significant spending by any sector worldwide – and nearly $100 billion in investment expected by 2027.
But, when it comes to the customer experience, banking customers are not ready to fully embrace AI just yet. Our recent survey of 2,000 customers in finance, healthcare, and insurance showed skepticism over the use of GenAI in their customer communications, with 63% concerned about ethical use and 66% about security. The report suggested a need for greater checks and balances so that the public would feel more comfortable with this still-new technology.
So, how are banks embracing AI, and what concerns do they need to address among consumers?
Fraud, trendspotting and customer service: AI’s greatest strengths in banking
It’s early days, but the banking industry has already identified three core areas to maximize the benefit of artificial intelligence.
1. Fraud Protection: The first is improving fraud detection, which cost the industry nearly $500 billion in 2023 alone. Banks are using AI to monitor transactions in real-time, detect unusual behavior and alert both customers and internal prevention teams to potentially fraudulent activity. Because these AI models can process data and recognize patterns far more effectively than the human mind, they have a significantly higher success rate, saving banks from heavy losses and protecting their customers.
2. Market Trends: AI is also crunching the numbers on market trends, helping investment managers make more intelligent decisions based on more data than they could ever manage before. These prediction tools can also be made customer-facing, which gives a bank’s clients greater access to individualized investment advice and helps them feel more in control of their financial decisions.
3. Customer Experience: Intelligent chatbots, customized service and AI-generated communications are helping banks on the third frontier: customer experience (CX). These investments help reduce wait times, drive greater efficiencies and improve the accuracy of communications they send to customers. But according to our consumer survey, it’s these investments that customers are also most wary of.
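The real-time transaction monitoring described in the first point can be sketched as a simple statistical check. This is a minimal, illustrative baseline using only the standard library; production fraud models are far more sophisticated, and the function name, threshold, and sample data here are hypothetical.

```python
from statistics import mean, stdev

def is_suspicious(history, new_amount, threshold=3.0):
    """Flag a new transaction whose amount deviates more than
    `threshold` standard deviations from the account's history.
    Illustrative only: real systems combine many more signals."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # No variation in history: anything different stands out.
        return new_amount != mu
    return abs(new_amount - mu) / sigma > threshold
```

A £5,000 charge on an account that normally sees £20–£35 transactions would be flagged, while a £40 charge would pass; an alert to the customer and the internal prevention team would follow.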
Customers are skeptical – and concerned – over GenAI in customer communications.
Our research warns of a potential trust gap between banks and their customers over the use of GenAI. Roughly two-thirds of those customers have ethical and security concerns (63% and 66%, respectively) about GenAI in the communications they receive, and less than a third (30%) believe the technology can outperform humans in creating customer communications content.
Consumers are not just skeptical; they want their institutions to put guardrails in place when deploying GenAI, and they want to know when it’s been used. A staggering 81% said that humans should check GenAI content, and over three-quarters want its use to be explicitly called out.
These sentiments are not dealbreakers, but banks must consider how customers feel about this groundbreaking new technology when using it in their communications.
How to bridge the perception gap – openness, security and best-in-class experience
Banks can begin by embracing honesty and transparency as the best policy for GenAI use. They should include disclaimers on content created with GenAI tools so that customers are fully aware of its use. This is also an opportunity to show that humans remain in the loop by adding a signature from the bank employee who checked over the AI-generated content and signed off on its use.
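The disclaimer-plus-signature policy above could be as simple as a templating step in the communications pipeline. A minimal sketch, with hypothetical function and parameter names and illustrative wording:

```python
def wrap_genai_content(body, reviewer_name, reviewer_title):
    """Attach a GenAI disclosure and a human sign-off to an
    AI-drafted customer communication. The wording is
    illustrative, not any bank's actual policy text."""
    disclaimer = ("This message was drafted with the help of "
                  "generative AI and reviewed by a member of our team.")
    signoff = f"Reviewed and approved by {reviewer_name}, {reviewer_title}"
    return f"{body}\n\n{disclaimer}\n{signoff}"
```

The point is that the disclosure and the named human reviewer travel with the content itself, so the customer sees both in every AI-assisted message.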
This policy of openness should extend to other aspects of the customer experience where AI is deployed, such as with chatbots and other customer service functions. Customers should also be able to offer feedback on how the technology works so they can feel like they are part of the process.
To address security concerns, banks could disclose the data used to train their customer-facing AI models. Specifically, they can explicitly call out that sensitive, secure and personal data are not being used so that customers can rest easy that their data remain under lock and key. Customers should also be able to opt out of these functions wherever possible to give them greater control over AI’s role in their banking experience.
Finally, there’s no better way to assuage skepticism than by demonstrating the positive power of AI. Banks must ensure that wherever AI is deployed, it delivers maximum benefit to customers and improves their overall experience. Whether through reduced wait times, more timely and accurate messages from their bank, or simply a more personalized experience, customers are more likely to embrace any new technology when they can see a positive impact on their own lives.
AI’s domination of news cycles, political debate and even kitchen table conversations is unlike any technology we’ve seen in the past quarter century. Not since the advent of the internet in the early 90s has an innovation promised such a profound change to our daily lives, so it’s understandable that consumers have strong opinions about its use. Our research shows that the banking industry has a real opportunity to show it understands these concerns and factor them into how it builds AI into every business layer, reaping its benefits for the coming decades.
How to Implement Unified High Availability (HA) and Disaster Recovery (DR) for SQL Server in Financial Services
The financial services sector, including banking, mortgages, credit cards, payment services, tax preparation and planning, accounting, and investing, is one of the world’s most important and influential sectors. Here, safeguarding reputation and trust is a must. A number of critical factors must be addressed to do so, but one that tops the list is ensuring that access to data and services remains uninterrupted and available at all times.
In order to do so, procedures and technology that ensure high availability (HA) and disaster recovery (DR) are crucial – not only for reputation and trust, but for complying with the industry’s strict regulations.
In this article, I will discuss one area that plays a key role in most financial organizations: maintaining a resilient and responsive Microsoft SQL Server infrastructure, which is unfortunately… no walk in the park.
The Challenge of SQL Server in Financial Services
As in most industries, SQL Server is the cornerstone of many financial organizations’ data architecture. These organizations commonly manage hundreds or even thousands of SQL Server instances, which can make effectively configuring HA and DR feel like a virtually impossible task. Add to that the complexity of ensuring compliance, managing costs, and meeting stringent service-level agreements (SLAs), and it becomes clear why so many financial organizations struggle to create a unified HA/DR strategy.
While SQL Server is a powerful platform and one of the world’s most widely adopted, it often requires very specific solutions to ensure HA and DR. Done wrong, instead of improving the financial institution’s HA and DR posture, it can introduce unnecessary complexity, potential failure points, and inflated costs – challenges that financial institutions, constrained by budgets and regulatory requirements, can’t afford.
Where to Begin? Planning a Resilient HA/DR Architecture
Step number one for IT professionals in financial services is to assess the current infrastructure and determine the availability needs for each application or database. Here, a tiered approach can serve to help prioritize resources and ensure the highest levels of protection for the most mission-critical services, while utilizing a more cost-effective approach to HA for applications with lower uptime requirements.
Here’s a breakdown of how you can categorize your HA/DR requirements:
• Tier 1 – Multi-Site HA + DR – For mission-critical apps with the strictest SLAs, ensure HA across multiple sites with SQL Server instances replicating between geographically separate locations – this ensures failover within and between sites for seamless recovery
• Tier 2 – Multi-Site Availability Groups (AG) Without HA – For apps with slightly less stringent availability demands, deploy SQL Server instances at two sites with AG replication, ensuring data availability without full intra-site HA
• Tier 3 – Single-Site HA Only – For services where intra-site HA is sufficient but cross-site DR isn’t required, reduce costs and complexity by focusing on local HA solutions
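The tiering decision above boils down to two questions per application: does it need cross-site DR, and does it need intra-site HA? A minimal sketch of that triage (function and parameter names are hypothetical, and real assessments would also weigh RTO/RPO targets and regulatory scope):

```python
def assign_tier(needs_cross_site_dr, needs_intra_site_ha):
    """Map an application's availability needs onto the
    three-tier model described above. Illustrative only."""
    if needs_cross_site_dr and needs_intra_site_ha:
        return 1  # Multi-site HA + DR: failover within and between sites
    if needs_cross_site_dr:
        return 2  # Multi-site AG replication, no full intra-site HA
    return 3      # Single-site HA only
```

Running every database in the estate through a triage like this gives a first-cut inventory of where the expensive multi-site topology is actually justified.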
Choosing the Right Tools and Technologies for Financial Services
Financial organizations face a myriad of challenges when trying to select technologies that offer flexibility and platform independence – yet, this is crucial. Consequently, IT professionals should opt for solutions that integrate with both Windows and Linux environments, which will allow for hybrid deployments that meet your specific needs.
The use of Extended Vhosts is one innovative approach to simplifying SQL Server architecture. Extended Vhosts offer a cost-effective means of providing intra-site and cross-site HA while leveraging the built-in capabilities of SQL Server and Availability Groups (AG) for replication.
What’s more, by reducing reliance on third-party replication solutions, financial institutions can significantly reduce technology costs and streamline their infrastructure.
Implementing a Tiered Model for SQL Server HA/DR
Once you’ve established a tiered approach, the next step is configuring your infrastructure accordingly:
• Tier 1 – Multi-Site HA + DR Configuration: Set up HA instances at each site, use AG replication to synchronize data, and configure automatic failover for uninterrupted service during both site and local failures
• Tier 2 – Multi-Site AG Without HA: Deploy SQL Server instances at multiple sites, using AG replication to ensure data redundancy and availability
• Tier 3 – Single-Site HA Configuration: Focus on local failover within a single site to ensure high availability for applications that don’t require cross-site DR
Optimizing Costs While Maintaining Security and Compliance
Cost is always top of mind for financial organizations, especially when dealing with SQL Server licensing. Consider reducing your Windows footprint and exploring Linux-based deployments to cut OS licensing fees. This approach is particularly beneficial in larger environments, where the licensing costs for hundreds of SQL Server instances can quickly escalate.
Other cost-saving measures include consolidating SQL Server instances using containerization, and eliminating third-party replication by fully leveraging SQL Server’s built-in AG capabilities.
Ensuring Continuous Monitoring and Testing
Once your HA/DR infrastructure is in place, continuous testing and monitoring are essential. Regularly test both intra-site and cross-site failover processes to verify they meet your SLAs and recovery objectives. Implement real-time monitoring and automated alerting to detect and address issues before they impact your production environment.
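A failover drill of the kind described above is, at its core, a timed loop: trigger the switch, poll the standby, and compare elapsed time against the recovery time objective (RTO). A sketch, assuming the caller supplies the two hooks (`trigger_failover` and `service_is_healthy` are hypothetical callables, not a real SQL Server API):

```python
import time

def run_failover_drill(trigger_failover, service_is_healthy,
                       rto_seconds=60.0, poll_interval=1.0):
    """Time a failover drill against an RTO. `trigger_failover`
    initiates the switch; `service_is_healthy` returns True once
    the standby is serving traffic. Returns (passed, elapsed)."""
    start = time.monotonic()
    trigger_failover()
    while not service_is_healthy():
        elapsed = time.monotonic() - start
        if elapsed > rto_seconds:
            return False, elapsed  # RTO breached: drill failed
        time.sleep(poll_interval)
    return True, time.monotonic() - start
```

Logging the elapsed time from each scheduled drill gives auditors concrete evidence that recovery objectives are met, rather than merely promised.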
Achieving Business Continuity in Financial Services
For financial institutions, a unified HA/DR strategy that leverages innovative technologies like extended vhosts can help achieve business continuity, reduce costs, and simplify infrastructure management. By planning ahead, streamlining architecture, and optimizing SQL Server deployments, financial organizations can protect their operations, ensure data availability, and meet even the most stringent regulatory and customer demands.
This approach can be adapted across industries where data availability, cost efficiency, and resiliency are paramount, but it’s particularly critical for financial services – where disruptions can affect the bottom line and customer trust.
Don Boxley Jr.
Co-founder and CEO, DH2i
How banks can mitigate cloud security threats
As custodians of sensitive financial data, banks and financial institutions face several cloud security threats that demand security controls and mitigation strategies.
Data breaches are a primary threat as malicious actors may infiltrate cloud systems to gain unauthorised access to confidential customer information such as account numbers, passwords, and transaction histories. Additionally, insider threats pose risks, where disgruntled employees or negligent staff members might compromise sensitive data.
Furthermore, the reliance on third-party cloud vendors introduces another layer of risk, as these entities may themselves become targets of attacks, or inadvertently expose data through misconfigurations or vulnerabilities in their infrastructure.
So, what can be done? Banks have a huge responsibility to operate within the relevant compliance framework(s), and to also reassure clients that their personal and sensitive data is protected. Robust cybersecurity measures are now essential for continued operation in the digital era.
To mitigate these growing threats, banks can adopt the following security controls to ramp up cloud security.
Implementing robust encryption protocols ensures data remains protected both in transit and at rest within the cloud. Meanwhile, ensuring strong access controls and authentication mechanisms are in place helps ensure unauthorised individuals cannot gain entry to sensitive systems and data.
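The access-control half of that advice amounts to a deny-by-default permission check. A minimal, illustrative sketch (the roles, actions, and function name are hypothetical; a real deployment would source entitlements from an IAM platform, not a hard-coded table):

```python
# Illustrative role-to-permission mapping for a bank's cloud systems.
ROLE_PERMISSIONS = {
    "teller":  {"read_account"},
    "analyst": {"read_account", "read_reports"},
    "admin":   {"read_account", "read_reports", "manage_users"},
}

def is_authorised(role, action):
    """Deny by default: unknown roles and unlisted actions both fail,
    so access must be explicitly granted to succeed."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default shape is the important part: an attacker who compromises a low-privilege identity gains only that identity's narrow slice of access.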
Regular security audits and vulnerability assessments help identify and address any weaknesses in the cloud infrastructure. A third-party assessment ensures honesty, neutrality and often, a higher level of expertise.
Employees must be kept aware of current and potential security issues. Invest in comprehensive training programmes and employee awareness activities regarding security best practices and the importance of safeguarding sensitive data.
And there’s no avoiding the fact that 24×7 monitoring enables banks to promptly detect and respond to potential security threats in their cloud environments, ensuring continuous protection of sensitive financial data. You need to ensure round-the-clock monitoring is in place, whether in-house or outsourced, manual, automated or hybrid.
Finally, banks need a robust incident response policy to swiftly address security breaches and mitigate the potential impact on both customers and the institution’s reputation.
To combat these evolving threats, the banking and financial sector must adopt a multi-layered approach to cybersecurity, including advanced threat detection, response tools and cybersecurity awareness training for all staff. Collaboration and information sharing between industries and governmental bodies are also crucial for staying ahead of emerging threats.
Robust data backup and recovery plans, and a zero-trust architecture are also imperative.
While the sector remains one of the most targeted by cyber criminals, getting all the above in place means we have a chance to foil attacks before they’ve even taken root.
Furqan Siddiqui, SOC Operations Officer
Obrela
Is there a danger of over-regulation stifling competition? – Roger Alexander
The debate between regulation and competition is one of the longest in modern business, and one of the least likely to ever be satisfactorily resolved. On one hand, supporters of deregulation argue that loosening rules allows companies to create more innovative products or seize upon efficiency savings. Those on the other side of the argument believe it’s in fact the regulations themselves that fuel creative solutions by creating frameworks in which innovation can take place.
Often, the best answer to debates like this isn’t a firm yes or no in either direction, but to humbly admit that the reality is far more complicated than what can be summed up in a simple binary. I can see two recent points of evidence for and against the idea that over-regulation stifles competition, but is it enough to answer the debate once and for all?
Open Banking: Inventing an Entire Industry
The creation of Open Banking is an instance in which an industry, today valued at $25.1 billion, was effectively created overnight by the stroke of a pen. The Payment Services Directive 2 (PSD2) rules adopted by many European countries did much to strengthen security and increase trust, but they also created the Open Banking framework, which has been a major boon for consumers, companies, and those trying to reduce chargebacks. The payments element of Open Banking doesn’t allow for chargebacks, which seems counterintuitive, but in practice it allows consumers to work directly with merchants to get their money back – something that we at Chargebacks911 have always advocated for.
Open Banking could be an industry worth hundreds of billions of dollars by the 2030s and employ thousands of people. From the start, it was very explicitly engineered to increase innovation by driving competition, and it has definitively proved that it can do this.
APP Fraud Rules – stifling innovation?
On the other hand, new rules coming into force on October 7, 2024 will require UK payment service providers to reimburse customers who fall victim to authorised push payment (APP) fraud. A good thing, surely? Not necessarily – the rules could stifle innovation, especially among smaller UK fintech companies.
Companies will be required to pay fraud victims within five days, and each individual claim could cost them as much as £85,000 (though a typical single act of APP fraud costs a business £11,000 or a member of the public £1,500). This short time limit means that if financial institutions want to dispute a claim for compensation in a similar way to what merchants can do with chargebacks, they will have very little time to do so. Consequently, many will be forced to pay out sums that they would not be required to do if they had time to properly investigate the claims. Make no mistake, fraudsters will find a way to exploit this system. I am far from the first person to have reservations about these rules: the UK Treasury and FCA share the same concerns.
Large PSPs might be able to absorb these costs, but they shouldn’t have to. The real damage will happen to smaller and likely more innovative companies— those that could become the ‘next big thing’ if they were allowed to grow unimpeded. Ironically, some of these
companies will be precisely those created by Open Banking regulations. Although the regulations are yet to come into effect, it seems that they are a prime example of how regulation can stifle growth.
So, what’s the answer?
While the answer to whether regulation helps or harms innovation is nuanced, we need transparency and a space for debate amongst financial institutions and the companies they work with. We should have complete information on upcoming regulations and a way in which to discuss them constructively in such a way that regulators can see our criticism and adjust regulations accordingly. So does regulation stifle innovation? Only if we don’t collaborate with all parties involved to ensure our systems foster fairness and growth.
Roger Alexander Key Advisor, Chargebacks911
A TikTok trend or the cause of financial fear? How the financial industry can navigate the threat of deepfake fraud
In 2021, a wave of deepfake videos started to emerge across the internet and social media. From humorous TikTok videos of Tom Cruise, to unsettling speeches from Morgan Freeman explaining synthetic reality, AI-driven deepfakes started to capture the attention of many internet users.
Over the past year, as AI has bled into the mainstream, technology that was once reserved for the experts has fallen into the hands of the everyday internet user. Now, whilst this has led to the development of some funny celebrity parodies across social media and even the development of the TV show ‘Deep Fake Neighbour Wars’, it has also opened the door to some very real, sci-fi-like threats.
Like many initially innocent technologies, deepfakes are now being exploited by malicious cyber criminals for nefarious means, with one of the latest victims being the world’s longest-serving central bank chief. Earlier this year, video and audio clips of the governor of the National Bank of Romania, Mugur Isarescu, were used to create a deepfake encouraging people to put money into a fraudulent investment scheme. While the National Bank of Romania issued a warning that neither the governor nor the bank was behind the investment recommendations, the incident underlines the severity of deepfake threats, especially for financial services, where organisations and customers could pay a high price as a result of disinformation.
With deepfake incidents in the fintech sector increasing 700% in 2023 compared with the previous year, let’s explore how financial services institutions can navigate these choppy waters and protect against AI-enabled impersonation scams.
The threat facing financial services
Unfortunately, the financial services industry is notoriously fertile ground for cyber-attacks. It’s a prime target given the monetary gain for fraudsters, the vast amounts of sensitive personal information involved, and the opportunity to deceive and manipulate customers, who place so much trust in financial institutions like banks.
It’s no wonder, then, that these types of impersonation scams are gaining traction across the UK among other countries. Just last summer, trusted consumer finance expert Martin Lewis fell victim to a deepfake video scam in which his computer-generated twin encouraged viewers to back a bogus investment project.
These types of attack are growing in prevalence. We’ve already seen a finance worker pay out $25 million after a video call with their deepfake CFO. Deepfakes could even be used to fraudulently open bank accounts and apply for credit cards. The danger and damage of deepfake scams are far ranging and banks cannot afford to sit still.
To combat this growing threat, the financial services industry needs stronger authentication than seeing and hearing. It’s not enough for financial experts or customers to simply trust their senses, especially over a video call where fraudsters will often utilise platforms with poorer video quality as part of the deceit. We need something more authoritative, along with additional checks. Enter identity security.
Distinguishing reality from the deepfake
To protect against deepfake threats, businesses need to batten down the hatches. Increased training for staff on how to spot a deepfake is essential. So is managing access for all workers – employees, but also third parties such as partners and contractors. Organisations must ensure these identities have only as much access as their roles and responsibilities allow – no more, no less – so that if a breach does occur, it is limited from spreading throughout the organisation. Data minimisation – collecting only what is necessary and sufficient – is also essential.
Stronger forms of digital identity security can also help prevent against an attack being successful. For instance, verifiable credentials, a form of identity that is a cryptographically signed proof that someone is who they say they are, could be used to “prove” someone’s identity rather than relying on sight and sound. In the event of a deepfake scam, proof could then be provided to ensure that the person in question is actually who they say they are.
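The verifiable-credential idea above rests on a signature over the claims that any relying party can check. The sketch below uses a symmetric HMAC so it stays within the standard library; real verifiable credentials use asymmetric signatures (e.g. Ed25519) so verifiers never hold the issuer's secret, and the function names here are hypothetical:

```python
import hashlib
import hmac
import json

def issue_credential(claims, issuer_key):
    """Sign a dict of identity claims with the issuer's key.
    Canonical JSON ensures the same claims always sign identically."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def verify_credential(credential, issuer_key):
    """Recompute the signature and compare in constant time.
    Any tampering with the claims invalidates the credential."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])
```

The key property for the deepfake scenario: a cloned face or voice cannot forge the signature, so a video caller who claims to be the CFO can be challenged to present a credential that cryptography, not human senses, verifies.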
Some emerging security tools now even leverage AI to defend against deepfakes, with the technology able to learn, spot, and proactively highlight the signs of fake video and audio to successfully thwart potential breaches. Overall, we’ve seen that businesses using AI and machine learning tools, along with SaaS and automation, scale as much as 30% faster and get more value for their security investment through increased capabilities.
Building a security backbone through AI-enabled identity security
As the battle rages against AI-enabled threats, the war goes far beyond deepfakes. Bad actors are leveraging AI to create more realistic phishing emails, masquerade as official bank sites to trick consumers, and accelerate the spread of malware.
Adhering to regulatory standards is of utmost importance in navigating this complex threat landscape. But this should be considered the baseline when it comes to enhancing security practices. To ensure businesses are best prepared to combat bad actors, regulation needs to be met with robust technology like AI-enabled identity security. Through this, organisations can scale their security programmes whilst gaining visibility and insights over their applications and data.
In today’s digital age, organisations cannot compete securely without AI. The reality is that cyber criminals have access to the same tools and technology that businesses use. But it’s not enough for businesses to simply keep pace with criminals. Rather, businesses need to get ahead by working closely with security experts to implement the necessary tools and technology which can help combat the rise in threats.
With over 9 in 10 (93%) financial service firms having faced an identity-related breach in the last two years, embedding a unified identity security programme that monitors everyone in your network will allow organisations to see, manage, control, and secure all variations of identity – employee, non-employee, bot or machine. This will help the financial services industry know who has access to what, and why, across their entire network – vital for detecting and remediating risky identity access and responding to potential threats in real time.
Only through a combination of increased training and stronger forms of digital identity security can banks and other financial institutions start to navigate the sea of fakes and help their customers do the same. As the pool of deception grows, investment in AI and automation to protect against such attacks must be a priority in 2024.
Steve Bradford, Senior Vice President EMEA, SailPoint