The Database Takes the Wheel in Driving Developer Productivity
As organizations rush to digitally transform their businesses, a trend that was further accelerated by the global pandemic, serverless application architectures are becoming more popular. They allow development teams to rapidly bring new customer-facing applications to market without the overhead associated with managing and maintaining the underlying infrastructure.
Today’s microservice and serverless architectures offer real benefits. It’s easier to evolve services independently because their state is encapsulated behind APIs. You can easily enforce security and schema constraints, and database hardware can be scaled on a per-service basis. But there are downsides as well.
When an API changes, it must remain backward compatible. The more services and APIs a client needs to talk to, the slower and less reliable the application becomes. And with multiple databases, data is duplicated in different formats across many services and quickly drifts out of sync.
This directly impacts user experience: failed inventory checks and checkout flows in e-commerce apps, slow or lost messages in customer support tools, out-of-date market data in banking and cryptocurrency applications, and so on. The reasoning behind Brooks’s law captures the dynamic: the more potential communication paths there are in a system or an organization, the higher the communication overhead becomes.
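To make that growth concrete: with n communicating parties (services or people), the number of potential pairwise channels is n(n − 1)/2, which grows quadratically. A quick sketch:

```python
def communication_paths(n: int) -> int:
    """Number of potential pairwise communication channels among n parties."""
    return n * (n - 1) // 2

# Each new service adds a channel to every existing one, so overhead
# grows quadratically, not linearly.
for services in (3, 5, 10, 20):
    print(f"{services} services -> {communication_paths(services)} potential paths")
# 3 services -> 3 potential paths
# 20 services -> 190 potential paths
```

Doubling the number of services roughly quadruples the number of ways they can interact, and with it the integration surface a team must maintain.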
When applications are complex, developer productivity also takes a hit. To reclaim upfront and ongoing developer time, it’s worth considering a strategy that harkens back to development patterns of the past — that is, integrating these services through a shared database.
Why? Data shared across applications doesn’t require an additional layer of integration services. But does the conventional view that this simplification increases risks still hold true? I think the answer is no, especially when it comes to serverless architectures.
In the past, a common pattern looked like this: clients and services never talked to each other directly, but instead accessed the database over a low-latency local network. The database administrator (DBA) managed access controls and validated queries for performance and correctness. This freed local computing resources to run business and presentation logic without bearing security responsibilities.
As systems evolved, the relational database management system (RDBMS) model became an operational bottleneck, and the industry settled on microservices instead. Fast forward to today, though, and we have largely removed the limitations that drove teams away from the “integrate via the database” pattern.
Between modern databases, serverless operational models, APIs, modern query languages, and web-native security practices, we’re now at a point where the legacy database-oriented architecture makes sense again. A system built to integrate around a modern database (such as the one we developed at Fauna) can run everything in a fully transactional, strongly consistent context.
Customers adopting serverless architectures and integrating around a database usually combine a mobile or single-page dynamic web app with business logic implemented in GraphQL or a database-native query language. The apps and backend talk only to the database and to any additional API services.
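As a rough sketch of that shape — the endpoint URL, access key, and schema below are hypothetical placeholders, not any particular vendor’s actual API — a client can send a GraphQL query straight to the database’s HTTPS endpoint, with no intermediate service tier:

```python
import json
import urllib.request

# Hypothetical database GraphQL endpoint and access key -- placeholders only.
GRAPHQL_ENDPOINT = "https://graphql.example-db.com/graphql"
ACCESS_KEY = "YOUR_DATABASE_KEY"

def build_request(query: str, variables: dict) -> urllib.request.Request:
    """Package a GraphQL query as the standard HTTPS POST a
    GraphQL-capable database accepts."""
    payload = json.dumps({"query": query, "variables": variables}).encode()
    return urllib.request.Request(
        GRAPHQL_ENDPOINT,
        data=payload,
        headers={
            "Authorization": f"Bearer {ACCESS_KEY}",
            "Content-Type": "application/json",
        },
    )

# The client queries the database directly; no service layer in between.
query = """
query OrdersByCustomer($id: ID!) {
  customer(id: $id) { name orders { total status } }
}
"""
req = build_request(query, {"id": "cust-123"})
# urllib.request.urlopen(req) would execute it; omitted here because the
# endpoint above is a placeholder.
```

The point of the sketch is the topology: authentication, query validation, and transactional consistency are the database’s job, so the client needs nothing between itself and the data.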
When teams integrate their services and clients through a modern database architected for these application patterns, we’ve seen increased developer productivity along with better application availability, consistency, and performance. I encourage all developers to reevaluate their assumptions and take advantage of these new capabilities in their own applications.