Today’s enterprises need modern compute and storage platforms that can support the next generation of business applications across hybrid cloud footprints. They must do all this while keeping existing infrastructure running in a safe, predictable mode of operation. The essence of maintaining both modes of work, and eventually bridging the two with minimal disruption, can be captured by a single word: “and.”
What’s so important about the word “and”? “And” is inclusive. It connects old to new. Technology to culture. Scale up to scale out. On-premises to cloud.
You could argue that “and” is the only viable path forward if the modern enterprise is to remain competitive, since the alternative is a choice between secure obsolescence and disruptive chaos.
The challenge is clear: developers and infrastructure operators today must protect the past AND lay a path to the future for tomorrow’s business applications. The rapid rise of DevOps processes and teams is a great case in point. It was no longer enough for developers and infrastructure operators to exist in parallel universes; enterprises innovate faster when dev “and” ops collaborate on an iterative model to build, test, and maintain applications.
The need for elasticity and flexibility is why many organizations undergoing digital transformation have chosen the hybrid cloud route, which offers the benefits of both on-premises cloud infrastructure and public cloud services. Organizations need to expand their horizons and move past binary either/or choices.
Storage: Bridging the Gap Between Old ‘and’ New
Many organizations are still wedded to their legacy storage infrastructures. What becomes of them? The answer is, perhaps nothing — at least not immediately. In fact, when it comes down to it, all data storage is software-defined, even systems running on older hardware. Without software, that hardware is nothing but a doorstop.
Software-defined storage can provide an abstraction layer that overlays older hardware and enables data to be managed through modern application management solutions such as Linux containers.
Linux containers are quickly becoming the new normal for application development, and storage must keep pace. Container-native storage enables persistent, stateful storage to run side by side with container-based applications. Enterprises can adopt it while keeping their legacy hardware, or use container-native storage as an abstraction layer to the cloud. They don’t have to choose.
Open ‘and’ Open Source
The storage approach I’m advocating here is open: built on industry-standard hardware and components. It is also open source, with storage developed and curated by communities. Open source enables practitioners to have a say in the roadmap of their storage architectures and to contribute new features to the project. It means their technology is backed by a vibrant community whose collective purpose is the ongoing advancement and innovation of their storage, without regard to any individual company’s business gains. Open source also gives practitioners the freedom to choose the solutions that best fit their needs.
By taking an open and open source approach, you get the best of both worlds: the innovation emerging from open source communities, along with the solidity, integrity, choice, and staying power of being open.
Digital transformation can be hard to navigate. Organizations should seek out partners who can help them bridge existing and new solutions in their own time.
Digital transformation becomes a little easier when organizations realize they do not have to sacrifice one thing for another. They don’t need to choose one storage application or infrastructure over another. Perhaps they’ve chosen to run containers and virtual machines side by side. Those “and” choices define modern enterprise IT.
Indeed, a company transformed is one that embraces the concept of “and” so that it has the flexibility necessary to handle today’s evolving demands.
Red Hat is a sponsor of The New Stack.
Feature image via Pixabay.