5 Critical Multicloud Data Architecture Strategies
As digital transformation accelerates, so does the modernization of data infrastructure. Virtually every digital process requires — or is made dramatically better by — leveraging the skyrocketing volumes of data, at digital speed and scale, to make the very best decision in a given moment.
One aspect of data infrastructure modernization (and likely the most talked about) is the migration of data to the cloud. According to Gartner, by 2022 “75% of all databases will be deployed or migrated to a cloud platform, with only 5% ever considered for repatriation to on-premises.” Increasingly, clouds are the new data centers, the internet is the new network, and SaaS is the new application stack.
There’s a lot to look forward to in the cloud, and it’s hard to imagine any modern application that doesn’t in some way leverage cloud infrastructure. But, if you get complacent and settle for basic tools or default strategies, you’ll soon find your cloud costs soaring, the application stack stalled at scale, and your mission-critical applications beholden to a single vendor — practically negating all the real benefits of the cloud.
Here are five critical strategies to get your multicloud strategy stable, secure, and on the right track for long-term success.
- Before You Move Into “A” Single Cloud, Architect for “Any” Cloud: Internal application teams, and the databases and tools they leverage for data-rich applications, need to support multiple clouds. Committing to a single cloud may feel like a time-saver, and for very simple applications or basic instances it can be. But it costs you the flexibility to change course down the line, and the resiliency to spread critical applications across multiple clouds for scale or in times of duress. So make it a point to pick technologies that work across multiple clouds. The pace of innovation is only accelerating, and the flexibility to match each business need to the best cloud for an application in that moment keeps you free to pick a different cloud for another moment down the line.
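One way to put this into practice is to have application code program against a cloud-neutral interface and treat the provider as a pluggable backend. Here is a minimal Python sketch of that idea — all names are hypothetical, and an in-memory backend stands in for the real S3/GCS/Azure adapters a production registry would hold:

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Cloud-neutral interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in backend; real deployments would register per-cloud adapters here."""

    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]


# Hypothetical registry: adding "aws", "gcp", or "azure" entries later is a
# configuration change, not an application rewrite.
BACKENDS = {"memory": InMemoryStore}


def make_store(provider: str) -> ObjectStore:
    return BACKENDS[provider]()
```

The point of the pattern is that switching clouds touches only the registry and the adapter, never the application logic built on `ObjectStore`.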
- Demand Widespread Cloud Standards from Your Database: We’re at the point where the cloud has some very clear standards, as defined by the Cloud Native Computing Foundation (CNCF). And even proprietary innovations born in the largest clouds are now becoming open source and standards across multiple cloud vendors (the most obvious being Kubernetes, which originated inside Google more than a decade ago). Stick to the standards, reduce custom development, and set yourself up for multicloud success.
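To make the standards point concrete: a workload described as a standard Kubernetes Deployment runs on any conformant cluster, regardless of which cloud hosts it. A small illustrative Python sketch that emits such a manifest as a plain dict (names and defaults are my own, not from the article):

```python
def deployment(name: str, image: str, replicas: int = 3) -> dict:
    """Build a standard apps/v1 Kubernetes Deployment manifest.

    Because this is the CNCF-standard API shape, the same manifest is
    portable across any conformant cluster on any cloud vendor.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }
```

Contrast this with a vendor-specific service definition: the standard form needs no custom development to move between clouds.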
- “It Comes with the Cloud” is Not an Application Stack Strategy: The cloud platform vendors — Amazon, Google, Microsoft, and others — certainly paved the way and created a lot of innovation to get us here. They all have their own proprietary offerings including database services, orchestration frameworks, monitoring platforms, etc. But remember, these proprietary offerings are default additions to the vendor’s cloud offering. They are general purpose in design for a wide range of workloads and generally work well and are affordable for mild to moderate use. They are not architected for the mission-critical, massive scale of high-performance, big data applications like real-time fraud, financial transactions, instant decisioning, and a new wave of machine learning and AI. Even though you may not be able to use the default cloud offerings for your high-scale real-time mission-critical deployments, that doesn’t mean that you shouldn’t fully embrace the techniques of cloud simplicity and management that are exemplified by these general-purpose applications stacks. The best-in-class databases, tools, and other components in your stack are now optimized for the cloud, and mostly have similar levels of automation and simplicity of the “built-in” cloud provider options. You have to expect a little more sizing and planning work upfront to ensure the reliability and scale needed. But it’ll be worth the effort. You’ll achieve a degree of independence at a time when innovation is moving so fast that consolidation with today’s perceived “winner” can quickly seem like a bad bet just 12 months down the line.
- Deploy Compliance Zones by Geography: Where your data actually resides is a key factor in the design of multicloud operations. You’re a big business with big data, and almost always a big global footprint subject to a variety of industry-, country-, or even county-level compliance and privacy requirements. Read any user agreement or SLA from a cloud provider and you’ll quickly see that it doesn’t — and can’t — assume liability for privacy compliance. Moving your data to the cloud does not remove your responsibility for the privacy and compliance of user information.
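In practice, a compliance zone often boils down to a residency map: each record is routed to a cluster in the region its regulations require. A hypothetical Python sketch (the rule table and cluster names are illustrative, not real guidance):

```python
# Illustrative residency rules: country code -> compliance zone / cluster.
# A real table would be driven by legal review, not hard-coded.
RESIDENCY_RULES = {
    "DE": "eu-central",
    "FR": "eu-central",
    "US": "us-east",
    "SG": "ap-southeast",
}


def cluster_for(country_code: str, default: str = "us-east") -> str:
    """Pick the cluster whose geography satisfies the record's residency rules."""
    return RESIDENCY_RULES.get(country_code, default)
```

Keeping this routing decision in your own architecture — rather than assuming the cloud provider handles it — is exactly the responsibility the SLA leaves with you.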
- Architect for Success: Digital transformation is accelerating and data volumes are skyrocketing. One of the cloud’s greatest promises is elasticity, but you need to architect for elasticity across the entire stack. Whatever scale you’re planning for today, plan for at least 10x, and probably 100x, that level. In my work with large enterprises, a yearly doubling of transaction and data volumes seems to be the bare minimum. Mission-critical, data-intensive applications rapidly hit scalability walls that stall performance and leave you scrambling. And when you’re in a hurry to scale and the cloud cost model doesn’t match your revenue model, costs soar and the system quickly becomes unaffordable.
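The "yearly doubling" baseline compounds faster than intuition suggests, which is why the 10x–100x planning headroom matters. A quick worked projection in Python (the starting figure is illustrative):

```python
def projected_volume(current: float, yearly_growth: float, years: int) -> float:
    """Compound a current data or transaction volume forward.

    yearly_growth is the multiplier per year: 2.0 means doubling annually,
    the bare-minimum growth rate observed in large enterprises.
    """
    return current * (yearly_growth ** years)


# Doubling yearly, a 10 TB workload reaches 80 TB in three years
# and clears 10x the original volume in under four.
three_year = projected_volume(10, 2.0, 3)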
The cloud, like everything else, is not a silver bullet. But if you architect your data infrastructure using these guiding principles, you can build a foundation that gives you maximum flexibility, stability, and the ability to innovate even faster for handling the digital transformation and performance demands of global business.