IBM sponsored this post.
The infamous Ever Given cargo ship that wedged itself across the Suez Canal in March may have renewed concerns about the vulnerability of such massive vessels plying the narrow waterways of the world, but in other ways, it underscored the integrity of the humble shipping container.
Although blamed for “acting as a sail” in the heavy winds that swept across northern Africa at the time, causing the 400-meter-long ship to run aground on both sides of the canal, each of the 18,000 containers onboard emerged from the ordeal five days later without a scratch.
It wasn’t always this way. The intermodal shipping container, as Malcom McLean called it when he introduced it in 1956, came at a time when shipping was inefficient, time-consuming and expensive. McLean’s modular, reusable steel box design changed all that. It was easy to fill, load, stack and transport. It was portable and versatile. As its use grew, shipping costs dropped, efficiencies rose, and global commerce accelerated.
Not long ago a colleague of mine wrote about McLean’s invention and likened its impact to the impact that application containers were having in the realm of DevOps and DataOps. It was an apt analogy. The speed that containers bring to software development, from testing to deployment, as well as their portability, cannot be overstated. But I think we can take that analogy even further. Today, application containers are at the heart of modernization efforts and are being pulled into action to drive digital transformations across the globe, in particular to support the rising adoption of hybrid clouds.
Within the past 12 months alone, we’ve seen a dramatic spike in digital transformations as a result of the pandemic. According to Twilio, 97% of enterprise decision-makers said the pandemic led them to speed up their digital transformations in 2020. Similarly, a C-Suite study from the IBM Institute for Business Value found that 62% of executives plan to accelerate their digital efforts over the next two years due to the pandemic.
Fueling the transformations, according to researchers, have been hybrid cloud and artificial intelligence (AI), as companies look for greater agility, efficiency and resiliency. Driving that adoption has been the lightweight, portable container, which can traverse on-premises, cloud and edge environments with speed and ease. According to IDC, today more than 50% of companies have deployed containers, while 34% have put them into production.
Why Go Container-Native?
Considering the incredible momentum behind containers, it’s important to integrate container-native storage, the foundational data layer of the hybrid cloud, into any transformation. Data continues to spread across the enterprise, increasingly necessitating a storage infrastructure that can provide agility, streamlined data discovery, stronger governance and improved security, among other things.
But it’s important to point out the difference between container-native and container-enabled approaches to storage — both viable options, but with different objectives and results. Container-enabled storage is a great option when you want to “containerize,” or retrofit, your existing systems so they can support container workloads. Such solutions leverage the Container Storage Interface (CSI), a Kubernetes standard that relies on vendor-supplied drivers. Once the drivers are installed, existing systems can serve container workloads. Although such solutions are a sound, cost-effective way to containerize, they are somewhat limited in agility and portability, and offer fewer integration capabilities with Red Hat OpenShift.
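To make the container-enabled path concrete, the mechanism is usually a PersistentVolumeClaim that references a StorageClass backed by the vendor’s CSI driver. Here is a minimal sketch in Python that builds such a manifest as a plain dictionary; the class name `csi-block` and the claim name `app-data` are hypothetical, standing in for whatever class a real CSI driver registers on your cluster.

```python
def make_pvc(name: str, storage_class: str, size: str) -> dict:
    """Build a Kubernetes PersistentVolumeClaim manifest as a plain dict.

    The storageClassName ties the claim to a CSI-provisioned class, which
    is how a retrofitted (container-enabled) storage system serves
    container workloads.
    """
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            # "csi-block" is a hypothetical StorageClass name; substitute
            # the class your vendor's CSI driver actually registers.
            "storageClassName": storage_class,
            "resources": {"requests": {"storage": size}},
        },
    }

pvc = make_pvc("app-data", "csi-block", "10Gi")
print(pvc["spec"]["storageClassName"])  # → csi-block
```

Serialized to YAML and applied to a cluster, a manifest like this is all an application team needs to consume the retrofitted storage; the trade-offs the paragraph above describes live below this interface, in the driver itself.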
On the other hand, software-defined storage and modern data protection software that are engineered for container platforms are extremely agile, provide a high degree of portability and integrate tightly with Red Hat OpenShift. Container-native storage approaches also provide a consistent, application-aware management platform, from which organizations can easily develop capabilities like policy management and application-aware disaster recovery.
We’ve been at work on both approaches for years and offer solutions in each. In fact, our latest container-native technology offers even greater functionality. For example, it can eliminate the need to make copies of the data, a plague of digital transformations: today most platforms require far too many copies for replication, backup and other tasks. Our innovation leverages sophisticated caching that enables companies to access the data and then virtualize it for use. We also have an innovation, Active File Management, which is built on a global single namespace and enables data to be managed regardless of how many copies exist or where they reside.
Refining the Fuel
As digital transformations continue to accelerate, data is becoming increasingly distributed, straining the resources needed to find, access and retrieve it for analytics and AI. Remember, data is the fuel for AI, and as a result, the storage architecture plays an increasingly critical role in AI initiatives. Indeed, IDC projects that the storage opportunity associated with AI will grow to $5.4B by 2024, at a five-year CAGR of 18.6%.
We have a mantra that “You can’t do AI without IA (information architecture).” That means building a foundational data storage layer that is container-native, one that supports hybrid clouds and provides access to all the data your AI requires to deliver the most accurate outcomes. This is the true course for digital transformation.
Lead image via Pixabay.