Earlier this year, Accenture published a fascinating article on the top 10 trends in banking in 2024, one of which was “the power of pricing”. The global professional services consultancy argued that banks had always understood the significant effect of optimised pricing on their top and bottom lines.
But, it said, they were now combining this intuition with generative AI and “more comprehensive data” to “turbocharge” scenario planning - bringing personalised pricing a step closer to reality.
TriFidus was reminded of this potentially game-changing development at the Banking Transformation Summit in London in June. See our summary of the event here.
Many of the conversations at the conference centred on the arrival of artificial intelligence and what it could mean for financial services.
Delegates were left in no doubt that adoption of this technology was critical to the future of the sector for a whole range of reasons - customer experience, risk management, organisational efficiencies and so on.
However, it was also evident from the discussions that banks are still weighing up AI’s efficacy rather than racing to implement it.
The time for tyre-kicking has passed: an implementation strategy is imperative
That stance will need to change, though, if established financial institutions are to avoid being left behind by more agile companies, which are wasting no time embedding these tools into their roadmaps.
At the very least, large banks are going to have to make their systems highly interoperable with digital platforms. Interoperability - the ease with which systems can exchange data and work together - has always been a problem for the industry, as new technology has been bolted onto legacy systems designed decades ago, and the operational risk it carries has only grown over the years.
A report by Accenture, looking into the commercial implications of interoperability, made some startling observations. A survey of thousands of executives found that businesses with high interoperability grew revenue six times faster than peers with low interoperability.
The research also found that “if two similar companies started with $10bn today, the company with high interoperability would stand to make $8bn more than the company with low interoperability over the next five years”.
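To get a feel for the arithmetic behind that claim, the short sketch below compounds two hypothetical growth rates - 1% a year versus 6% a year, a sixfold difference - from the same $10bn starting point. These rates are our own illustrative assumptions rather than figures from the Accenture research, but they show how a cumulative gap in the region of $8bn can open up within five years.

```python
# Illustrative only: compound two hypothetical revenue growth rates over five years.
# The rates below are assumptions made for the sake of the example, not figures
# taken from the Accenture research, whose methodology is not reproduced here.

STARTING_REVENUE_BN = 10.0   # both companies start with $10bn of annual revenue
LOW_INTEROP_GROWTH = 0.01    # assumed 1% annual growth for the low-interoperability firm
HIGH_INTEROP_GROWTH = 0.06   # six times faster, per the "grew revenue six times faster" finding
YEARS = 5

def cumulative_revenue(start: float, growth: float, years: int) -> float:
    """Total revenue earned over the period, assuming steady compound growth."""
    return sum(start * (1 + growth) ** year for year in range(1, years + 1))

low = cumulative_revenue(STARTING_REVENUE_BN, LOW_INTEROP_GROWTH, YEARS)
high = cumulative_revenue(STARTING_REVENUE_BN, HIGH_INTEROP_GROWTH, YEARS)

print(f"Low interoperability:  ${low:.1f}bn over {YEARS} years")
print(f"High interoperability: ${high:.1f}bn over {YEARS} years")
print(f"Gap:                   ${high - low:.1f}bn")
```

The exact numbers matter less than the direction of travel: even modest differences in growth rate, compounded year after year, translate into billions of revenue.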
Alongside this, embedded finance growth shows no signs of slowing down, according to global management consultancy McKinsey & Company.
It says that by the end of the decade, embedded finance revenues in Europe could exceed €100 billion, possibly accounting for between 10% and 15% of banking revenue pools and up to 25% of retail and SME lending.
There is no denying that innovation in banking is accelerating on many fronts. Yet it’s one thing to understand the potential of technology for financial services. Successfully executing its adoption is quite another.
Beyond the excitement, a sober assessment is vital
Fintechs, understandably, focus on the capabilities of their platforms. However, often forgotten amid bold marketing claims are some fundamental realities.
One of these - particularly for organisations that are deemed too big to fail - is the need to understand the myriad ramifications of shifting to modern systems or launching entirely new entities.
This is where accurate and dynamic financial modelling comes into its own. By combining deep industry knowledge with the right approach, it’s possible to demonstrate a convincing business case for groundbreaking innovation at a time when economic uncertainty reigns.
But what does financial modelling actually entail?
Firstly, it can focus on an entire organisation or a specific department. The aim here is to understand the different sources of value within a chosen context.
Secondly, it plots a wide range of likely and unlikely scenarios - the known knowns, the known unknowns and the unknown unknowns - to plan for and effectively manage financial risk.
Thirdly, a financial model not only provides a clear picture of the possible impacts of a project; it also helps to identify the best sources of capital funding, to contain costs and to maximise revenue. In the case of banks, it needs to highlight the regulatory capital and liquidity impact of the options under consideration. A simplified sketch of how these pieces fit together follows below.
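For readers who want to see what that looks like in practice, here is a deliberately simplified, hypothetical sketch of a driver-based model: a handful of operational drivers (customer numbers, revenue per customer, cost to serve, loan book) are translated into revenue, costs, operating profit and an indicative regulatory capital requirement, and the same logic is run across a small set of scenarios. All of the driver names, values and the flat capital ratio are illustrative assumptions, not figures from any real institution or regulatory regime.

```python
from dataclasses import dataclass

# A deliberately simplified, hypothetical driver-based model:
# operational drivers in, financial impacts out, run across several scenarios.

@dataclass
class Drivers:
    customers: int                # number of active customers
    revenue_per_customer: float   # average annual revenue per customer
    cost_to_serve: float          # annual operating cost per customer
    loan_book: float              # total lending, used for the capital calculation
    capital_ratio: float          # flat regulatory capital requirement (illustrative only)

def financial_impact(d: Drivers) -> dict:
    """Translate operational drivers into headline financial figures."""
    revenue = d.customers * d.revenue_per_customer
    costs = d.customers * d.cost_to_serve
    operating_profit = revenue - costs
    required_capital = d.loan_book * d.capital_ratio
    return {
        "revenue": revenue,
        "costs": costs,
        "operating_profit": operating_profit,
        "required_capital": required_capital,
    }

# Known knowns, known unknowns and (a proxy for) unknown unknowns,
# expressed as alternative sets of driver assumptions.
scenarios = {
    "base case": Drivers(1_000_000, 250.0, 180.0, 5_000_000_000, 0.105),
    "downside": Drivers(850_000, 230.0, 195.0, 4_200_000_000, 0.105),
    "severe but plausible": Drivers(700_000, 210.0, 210.0, 3_500_000_000, 0.12),
}

for name, drivers in scenarios.items():
    result = financial_impact(drivers)
    print(f"{name:>22}: profit £{result['operating_profit'] / 1e6:,.0f}m, "
          f"capital required £{result['required_capital'] / 1e6:,.0f}m")
```

A real model would run at monthly or quarterly granularity, carry far more drivers and apply the actual capital and liquidity rules the bank operates under; the point of the sketch is simply that every financial outcome traces back to an operational assumption that can be examined and challenged.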
Done well, this work gives leaders the ability to grow confidence in projects among internal and external stakeholders, building trust through transparency. And it works regardless of the complexity of a project or organisation.
Driving innovation with a clear view of the road ahead
As we think about the innovation journey banks are on, it’s perhaps helpful to return to Accenture’s 10 banking trends in 2024 and its commentary on “All the risk we cannot see” in financial services.
Its author, Michael Abbott, wrote: “In 2024, banks will be confronted by a variety of risks – some familiar, others less obvious.” He goes on to list five risks that banks should consider - and of course there are many more.
Banks are mandated to consider risks and the impact they will have, but without a financial model capable of demonstrating the holistic impact of multiple scenarios, Boards and Senior Management are left with gut feel and intuition. The fact is that even relatively small banks are too complex for this approach to work.
At TriFidus, we have numerous examples of insightful analysis that challenged received wisdom and led to better decision-making, through the rigorous use of financial models that start with operational drivers and end with financial impacts.
These powerful tools put the Board and Senior Management in the driving seat of transformational change, allowing them to chart their course with confidence.
To realise the full potential of technology, you need navigational aids and tools that give you 360-degree vision. When it comes to banking innovation, that equates to solid financial modelling.
Talk to us about how we can support the execution of your technology roadmap. Get in touch here.
Sources: