Microservices and Mortgage Meltdown: Let’s Get Relational

It’s time for core banking IT systems to undergo a microservices overhaul. That means embracing a distributed relational data model.
Aug 30th, 2023 7:51am by
Image from Brian A Jackson on Shutterstock

Financial services are embracing digital transformation, but if the UK’s 2022 mortgage meltdown proved anything it’s that there’s a long way left to go.

The mortgage crisis saw many well-known lenders unable to update the business IT systems behind their mortgage products in time to keep up with surging interest rates. Rather than leave themselves exposed, they withdrew mortgage products, losing money and customers as a result.

Technology wasn’t supposed to be like this. So what went wrong?

Playing Catch-up

The financial sector is changing, but the pace is slowed by decades of IT legacy. Behind each mortgage product sits business IT systems responsible for different phases of the mortgage process, from web offers through approval to account and customer management.

Tales of IT legacy are almost as old as London’s venerable financial heart, but the IT problem is no longer siloed. Each of these systems is a monolithic application built on hundreds of thousands of lines of code, running on many different platforms: mainframes, client-server and hybrid cloud, bought or built in-house. A change to one system has consequences for the others in the chain, so the whole chain must be taken offline for work.

The situation was complicated by banks’ IT change processes. As banks have built or bought IT systems, the job of managing them has spread out across teams that extend to third parties with requests for change communicated through arcane, ticket-based systems. It took one building society using such a system seven days to change the interest rates displayed on its customer portal.

Finally, some organizations simply lacked joined-up digital processes to move at speed. I know of organizations where staff printed online applications for manual review — taking three days to deliver a decision.

It’s tempting to write off the meltdown as a black swan, the culmination of unique events, but this would be wrong. It was merely a microcosm — change is now business as usual. The period between March 2009 and December 2021 saw six interest rate changes; rates since December 2021 have changed more than 16 times with UK mortgage interest rates now hitting 15-year highs.

As debt increases, financial services firms have now become preoccupied with how best to gain customers and stop them from being poached by more adroit rivals over the coming 18 months. Those with better packages and a unified customer journey to onboard those newcomers will dominate.

The Stack: Done Right

The prescribed answer would seem obvious: unravel the business IT software monoliths behind the mortgage businesses and reimplement the functionality as microservices. The logic is compelling: rewrite integrated IT stacks as independent services that can be changed quickly and with minimal impact on the full application or on fellow services.

Continuous-delivery guru Dave Farley explains what this looks like: processes deployed independently of each other, that are loosely coupled, that are organized around business capabilities, that operate within a bounded context and that are owned and maintained by small teams. In the case of the mortgage meltdown scenario, an interest rate service could be quickly updated by the team dedicated to its maintenance without taking down the entire lending application.
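Those criteria can be sketched in a few lines. The example below is a minimal, illustrative Python sketch, assuming an in-memory store; the class and product names are hypothetical and not taken from any real lending system:

```python
from dataclasses import dataclass, field


@dataclass
class RateService:
    """Hypothetical interest-rate microservice owning its own bounded context."""
    rates: dict[str, float] = field(default_factory=dict)

    def get_rate(self, product_id: str) -> float:
        return self.rates[product_id]

    def update_rate(self, product_id: str, new_rate: float) -> None:
        # The owning team can ship this change without touching the
        # offer, approval or account-management services.
        self.rates[product_id] = new_rate


svc = RateService(rates={"fixed-5yr": 4.75})
svc.update_rate("fixed-5yr", 5.25)
print(svc.get_rate("fixed-5yr"))  # 5.25
```

Because the service owns its rate data within a bounded context, the team maintaining it can deploy an update without coordinating a release of every other system in the chain.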

But microservices can come with baggage. One issue people have been struggling with is the creation of distributed monoliths. One cause of this is continued reliance on a single database, which means services remain anchored to the data sources and lose their flexibility.

In the mortgage scenario, the application has one source of truth, but the whole application must be taken down to be updated.

The answer is a distributed data model. The catch is that the model must deliver the performance and reliability of the relational model, which has hitherto struggled to perform at this kind of scale.

The Relational Route

Relational has been a reliable engine of business: an architecture of rows, columns and tables founded on principles of transaction completeness and isolation. This has seen relational databases used in both transactional and analytical workloads. The relational model, however, has historically proved complicated to scale for cloud native, which opened the door to NoSQL document stores that achieve scale and speed using a different architecture. What we’ve seen, however, is firms trying to run relational workloads on NoSQL and having to rebuild their own rows, columns and tables as a result, at great cost and technical overhead.

It’s vital to pick the right database for this job: That means a system capable of delivering the reliable capabilities of relational but that’s distributed by design. This avoids using arcane practices and workarounds to make traditional relational scale and sidesteps the compromises of NoSQL.
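The reliability in question is the transactional guarantee the relational model provides. The sketch below illustrates it with Python’s built-in sqlite3 module standing in for a distributed SQL database (the table and account names are illustrative); a distributed-by-design relational system preserves these same SQL and transaction semantics at scale:

```python
import sqlite3

# sqlite3 stands in here for a distributed SQL database; the ledger
# schema and accounts are illustrative, not from any real system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ledger (account TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO ledger VALUES ('lender', 1000), ('borrower', 0)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE ledger SET balance = balance - 500 WHERE account = 'lender'")
        conn.execute("UPDATE ledger SET balance = balance + 500 WHERE account = 'borrower'")
        raise RuntimeError("simulated failure mid-transfer")
except RuntimeError:
    pass

# Atomicity: neither half of the failed transfer is visible.
balances = dict(conn.execute("SELECT account, balance FROM ledger"))
print(balances)  # {'lender': 1000, 'borrower': 0}
```

Running relational workloads on a NoSQL store means rebuilding this all-or-nothing behavior in application code, which is exactly the cost and overhead described above.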

The next step is to define the transactional model. At a high level that means understanding the use cases and the workloads. If we’re talking about a mortgage…

  • It means understanding the customer and business dynamics of the home-buying journey.
  • It means describing the data that will be used and the data store for systems like the financial ledger. It means describing what that ledger looks like.
  • It means specifying the systems and applications in the chain, the data flow and where the data would be updated.
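As an illustration of the last step, the sketch below models a rate change propagating through the systems in the chain; the system names and the in-process publish/subscribe mechanism are hypothetical stand-ins for whatever integration layer a real bank would use:

```python
from typing import Callable

# Hypothetical in-process pub/sub: each downstream system registers a
# handler for rate-change events published by the rate microservice.
subscribers: list[Callable[[str, float], None]] = []
seen: list[str] = []

def subscribe(name: str) -> None:
    subscribers.append(
        lambda product, rate, n=name: seen.append(f"{n}:{product}@{rate}")
    )

for system in ("offer", "approval", "customer-portal", "broker-portal"):
    subscribe(system)

def publish_rate_change(product: str, rate: float) -> None:
    # A single change to the rate service is broadcast to every system
    # that displays or acts on the rate.
    for handler in subscribers:
        handler(product, rate)

publish_rate_change("fixed-2yr", 6.1)
print(seen)  # all four systems have observed the new rate
```

Mapping this flow out up front is what makes it possible to say, for any given rate change, which systems must be updated and in what order.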

That’s vital as the data won’t simply flow through offer, approval and settlement systems. It’ll flow through customer, employee and broker portals too. A change to a mortgage rate microservice will have to ripple through each system.

With the database and transactional model in place, it’s finally possible to create functionally independent microservices. Modern cloud infrastructure is invariably built on container technologies that are managed through a variety of orchestration and life cycle management tools. While this is great for open systems and developer choice, it can also result in a complex infrastructure that makes cloud native difficult and slow to manage. This can present a particular problem when deploying data-driven applications in a microservices architecture because data-driven updates must propagate consistently across each service and system on that infrastructure.

It therefore makes sense to employ an orchestration system that works in lockstep with your database’s automated deployment and administration capabilities. Achieving this will mean that data- and application-level changes can be packaged up and rolled out as part of any container or cluster update and deployed automatically using consistent processes and without manual intervention. The result is microservices that can be updated independently, autonomously and isolated from other services, allowing banks to respond at speed to changing business events.
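One way to picture data- and application-level changes being packaged and rolled out together is a versioned migration runner that the deployment pipeline invokes on every rollout. The sketch below is a minimal, hypothetical Python example using sqlite3; in practice this capability would come from your database and orchestration tooling:

```python
import sqlite3

# Hypothetical migration runner: ordered schema changes applied exactly
# once, with the applied set tracked in the database itself.
MIGRATIONS = [
    ("001_create_rates",
     "CREATE TABLE rates (product TEXT PRIMARY KEY, rate REAL)"),
    ("002_add_effective_from",
     "ALTER TABLE rates ADD COLUMN effective_from TEXT"),
]

def migrate(conn: sqlite3.Connection) -> list[str]:
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (id TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT id FROM schema_version")}
    ran = []
    for mig_id, sql in MIGRATIONS:
        if mig_id not in applied:
            with conn:  # each migration commits atomically with its version row
                conn.execute(sql)
                conn.execute("INSERT INTO schema_version VALUES (?)", (mig_id,))
            ran.append(mig_id)
    return ran

conn = sqlite3.connect(":memory:")
first = migrate(conn)   # both migrations run on the first deploy
second = migrate(conn)  # a redeploy is a no-op
print(first, second)
```

Because a rerun is a no-op, the same automated process can be applied to every container or cluster update without manual intervention.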

What’s Next?

We are in the midst of the AI hype cycle and mortgage lenders are starting to think about how to integrate AI to improve the mortgage lending process. It can automate routine tasks, provide valuable insights, reduce risk and fraud and improve the customer experience — yet another reason to ensure your tech stack is capable of handling the influx of data that will be stored and analyzed to transform an antiquated industry.

Additionally, operational resilience — an organization’s ability to adapt and respond to disruptions or unexpected events while maintaining continuous operations without interruption — is going to be of huge importance in Finserv going forward, especially now that the stakes are high enough that governments have begun stepping in.

The UK is leading the way in holding financial firms responsible and accountable for their operational resiliency. Regulators have instructed financial firms to meet operational resilience requirements, overlaying governmental oversight on top of internal decision-making. Other countries are also pursuing similar regulatory initiatives in their financial sectors. One of the most significant is the European Union’s proposed Digital Operational Resilience Act (DORA), which seeks to ensure that all financial market participants have effective strategies and capabilities in place to manage operational resilience.

We are reaching the end of the “single cloud provider as automatic best practice” era. Centering your application architecture on cloud platform-agnostic services and tools has become an essential survival strategy that requires a distributed solution.

Conclusion

We’ve heard much about digitization in financial services, but 2022 proved just how much work is left outstanding. With businesses and consumers facing a challenging 18 months, it’s time for core banking IT systems to undergo a microservices overhaul. That means embracing a distributed relational data model to make the financial products — and the business — genuinely agile.

TNS owner Insight Partners is an investor in: Pragma.