Deep tech in transition

Core banking technologies are being updated with new methodologies that bring out existing strengths

Fergal O’Riordan, director, CubeMatch: ‘Banks have a fundamental importance to the economy; that can’t be said of neo-banks, at least not yet. A bank’s IT has to be as good as that of governments’

Anyone who thinks banking never changes cannot have been paying attention. Not only are new players entering the market, but core banking products have also been evolving in response to current conditions, such as low (for now, at least) interest rates.

More recently, regulatory innovation, aimed at tightening compliance on the one hand and promoting innovation on the other, has resulted in some fundamental changes to banking.

Take PSD2, for example, the revised EU Payment Services Directive: at its core is the concept of ‘open banking’.

“Open banking has really changed the game. Other companies can now represent your data,” Fergal O’Riordan, director at CubeMatch, said.

As a global change and transformation consultancy specialising in financial services, CubeMatch is at the coalface of delivering change in an industry that cannot allow disruption.

Of course, disruption exists in the market, but banks cannot themselves perform the kind of sudden volte-face that leaves customers anxious. Reputations built over decades, or in some cases centuries, cannot be risked on a whim, and this need for stability extends to the hardware and software that run the world’s transactions.

“Banks have a fundamental importance to the economy; that can’t be said of neo-banks, at least not yet. A bank’s IT has to be as good as that of governments,” O’Riordan said.

This results in a certain caution, but it should not be confused with inertia. “Banks have slow-to-change technology and legacy systems, but what they are looking at is trying to be as nimble as the fintechs,” he said.

Consumer fintechs, such as the aforementioned neo-banks, obviously do not have legacy reputations to stand on. Nor do they have as wide a range of product offerings as the pillar banks. They do have some things easier, though.

“They have a lower cost base as they have no branch network,” O’Riordan said.

They also move fast. “A lot of average project times in banks might be nine to 12 months. With neo-banks it will be weeks,” he said.

This kind of agility is what CubeMatch wants to bring to its clients, and banks are calling for it rather than dragging their heels. O’Riordan said banks were working on new, agile technology and were not hidebound by their legacy systems.

“It’s, frankly, old thinking now to think of banks as sitting there in shell shock. Actually, they are very aware and have lots of programmes that are going off in every direction that have no connection with the mainframe,” he said.

“There is a very good future for banks. A lot are ten years into this [ongoing change] process.”

Technologies such as cloud computing have moved from being off-limits to being part of operations.

“Banks were traditionally suspicious of cloud, but now they’re getting into it. They’ve understood how to secure it,” he said.

Beyond security, one other factor is that cloud concepts, such as virtualisation, actually derive from the mainframe. As a result, O’Riordan said, “the cloud is what will eventually replace the mainframe down the line, whenever that actually does happen”.

Increasingly, CubeMatch is being asked to instil cultural change alongside technological change.

“We bring quality agile cultures. A lot of institutions are working on a hybrid model: they do two-week sprints, but then they tend to bank those sprints, no pun intended, and release every few months. They don’t release every four minutes like Netflix does, and that’s due to tools and extreme DevOps,” he said.

Of course, given what banks do, we would not want them on a continuous release cycle: video streaming going down is a nuisance; banking systems going down is a nightmare.

“What you need is appropriate agility. It would not be to the banks’ benefit to be like Netflix, and if someone is selling you that idea you should be very careful,” he said.

But DevOps, which merges development and IT operations, is not just about speed; it is also about getting things right.

“DevOps has a lot to say about quality, ensuring that your testing is built in: the pipeline deploys the code and runs functional tests before any human ever even sees it,” he said.
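To make that idea concrete, the sketch below shows, in broad strokes, what such a pipeline does: deploy a build to a staging environment and run automated functional checks before anyone is asked to review it. The function names here (deploy_to_staging, run_functional_tests) are illustrative assumptions, not any bank’s actual tooling.

```python
import sys


def deploy_to_staging(build_id: str) -> str:
    """Stand-in for a real deployment step; returns the staging address."""
    print(f"Deploying build {build_id} to staging...")
    return f"https://staging.example.internal/{build_id}"


def run_functional_tests(staging_url: str) -> bool:
    """Stand-in for automated functional tests run against staging."""
    print(f"Running functional tests against {staging_url}...")
    checks = {
        "login": True,
        "balance_enquiry": True,
        "payment_submission": True,
    }
    return all(checks.values())


def pipeline(build_id: str) -> None:
    url = deploy_to_staging(build_id)
    if not run_functional_tests(url):
        # Fail fast: a reviewer never sees a build that cannot pass its tests.
        sys.exit(f"Build {build_id} rejected before human review.")
    print(f"Build {build_id} passed automated checks; ready for review.")


if __name__ == "__main__":
    pipeline("2024.06.1")
```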

At its core, DevOps rests on automated infrastructure, but of a kind that brings scalability and a level of flexibility that were hitherto impossible. In effect, it creates a cloud world where you can spin up any kind of server you want without human intervention, except the click to say ‘go’.
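A minimal sketch of that “click to say ‘go’” idea follows. CloudClient and ServerSpec are hypothetical stand-ins for whatever provisioning API a given cloud provider exposes; everything after the initial call is automated.

```python
from dataclasses import dataclass


@dataclass
class ServerSpec:
    """Description of the server we want: size, memory and location."""
    name: str
    cpus: int
    memory_gb: int
    region: str


class CloudClient:
    """Illustrative stand-in for a real provider's provisioning API."""

    def provision(self, spec: ServerSpec) -> str:
        print(f"Provisioning {spec.name}: {spec.cpus} CPUs, "
              f"{spec.memory_gb} GB RAM in {spec.region}")
        return f"server-id-{spec.name}"


def spin_up_test_environment(client: CloudClient) -> list[str]:
    """The single 'go': requests every server the environment needs."""
    specs = [
        ServerSpec("app-server", cpus=4, memory_gb=16, region="eu-west-1"),
        ServerSpec("test-database", cpus=8, memory_gb=64, region="eu-west-1"),
    ]
    return [client.provision(spec) for spec in specs]


if __name__ == "__main__":
    spin_up_test_environment(CloudClient())
```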

There is a pleasing symmetry to this, then, as computing and IT have oscillated between centralised and decentralised models since the first digital computers appeared. But techniques like DevOps and technologies like the cloud take from both approaches.

“The history of computing is: we started with the mainframe and had dumb terminals to interact with it; then the PC arrived as processing power grew, so we moved to a client-server model. Cloud is essentially virtualised mainframe: it’s a single source of tremendous power, but it can spin up in minutes,” O’Riordan said.