
Want to Improve Your Agile Process? Look to Containers

Organizations are containerizing their applications and running their CI/CD pipelines on Kubernetes using container-native storage to reap the full benefit of the Agile process
Aug 5th, 2021 12:00pm by

Ami Kleinman
As VP R&D at Ionir, Ami Kleinman has overall responsibility for product design, development, and quality. His primary focus is building a first-rate development organization that adheres to a quality-oriented Agile development process and consistently delivers the product on schedule. He brings over 25 years of industry experience in product development, including 15 years in storage at companies such as EMC and IBM. Prior to joining Ionir, Ami was VP R&D at N2WS, a cloud data protection startup acquired by Veeam, and VP R&D at FilesX, a storage startup acquired by IBM. Ami began his career at Intel, where he worked in factory automation. He holds a BA in Computer Science from Yale University and an MBA from Florida International University.

Agile software development, which has become nearly ubiquitous over the last decade, was a revolution. Instead of waterfall development, with its long cycles that required planning six to nine months ahead, Agile came in and required planning only two to four weeks ahead for fast iterations, each of which was potentially shippable. At least, this is how Agile was sold to management.

It came with a catch, however, that most organizations moving toward Agile didn’t discover until the end of their first sprint. The catch is that the ability to iterate quickly is predicated on excellent quality assurance (QA) automation that can do regression testing in hours, which traditional waterfall does in months.

Merely good QA automation isn’t good enough for a truly efficient Agile development process, and most organizations were somewhere between poor and non-existent when it came to automation. The need to close the automation gap led directly to the rapid and ever-escalating demand for experienced automation engineers. However, organizations that were fortunate enough to hire good automation engineers and removed this impediment to their Agile process discovered another major roadblock: lots of quality automation requires lots of automation environments.

Creating the many automation environments required for Agile development can be a heavy lift, especially when working with physical hardware. Many organizations had to virtualize their development and test environments to get maximum value from their massive investment in automation and propel their Agile development processes forward.

Fortunately, around this time the industry saw the rapid rise of private and public clouds and infrastructure as code, all of which begat DevOps.

DevOps and VMs

DevOps capacity quickly became the next bottleneck in moving Agile software development forward, and organizations responded by hiring DevOps engineers and making a concomitant investment in infrastructure as code (IaC). But as DevOps engineers built out extensive CI/CD pipelines, the dominant framework was virtual machines, whether VMware virtual machines (VMs) in the private cloud or EC2 instances in AWS.

Herein lies the rub. VMs, by their nature, are relatively heavy and cumbersome. To spin one up, you must first make a copy of the VM template, which, depending on the size of the boot and data volumes involved, can easily take 10 to 20 minutes or longer. Booting and stabilizing the VM then takes several additional minutes. Spinning up several VMs in parallel can actually slow things down, because they often compete for storage IOPS, CPU and networking bandwidth.

A relatively simple pipeline that spins up a build environment, followed by a unit testing environment, followed by a regression testing environment can involve the creation and boot of several VMs and take several hours. An optimal Agile automation strategy mandates running a CI pipeline such as the one outlined above on each commit, which can easily mean dozens of pipelines a day. This can be a huge challenge when using VMs.
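To make the cost concrete, here is a back-of-the-envelope sketch of one such three-stage pipeline. The 10-to-20-minute template copy and multi-minute boot figures come from the discussion above; the per-stage runtimes and the 30-commits-a-day rate are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope math for one serial three-stage CI pipeline on VMs.
# Provisioning figures follow the text above; stage runtimes are assumed.

VM_COPY_MIN = 15   # VM template copy (10 to 20 minutes or longer)
VM_BOOT_MIN = 5    # booting and stabilizing the VM
STAGE_RUNTIME_MIN = {"build": 20, "unit-test": 30, "regression-test": 90}

def pipeline_minutes(provision_min: float) -> float:
    """Wall-clock minutes for one run: each stage provisions its own VM."""
    return sum(provision_min + runtime for runtime in STAGE_RUNTIME_MIN.values())

one_run = pipeline_minutes(VM_COPY_MIN + VM_BOOT_MIN)  # 200 minutes per run

# At 30 commits a day, provisioning alone burns 30 * 3 * 20 = 1800 VM-minutes
# of pure overhead -- before any contention for IOPS, CPU or network.
daily_overhead = 30 * len(STAGE_RUNTIME_MIN) * (VM_COPY_MIN + VM_BOOT_MIN)
print(one_run, daily_overhead)
```

Even with generous assumptions, one run takes over three hours, and a day's worth of commits spends 30 VM-hours on provisioning alone.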

Here Come Containers

The modern alternative to VMs as DevOps infrastructure is containers, which overcome the limitations that hamstring VMs. Containers are very lightweight and spin up as quickly as OS processes. A pipeline that takes 10 or 20 minutes to kick off with VMs can begin running in well under a minute with containers. What’s more, because they are lightweight, containers require significantly fewer CPU, network and storage resources, so they are less expensive to run than VMs.

VM infrastructures that don’t have the capacity to handle a CI/CD pipeline for each individual commit may be able to handle the load by repurposing the same hardware for Kubernetes and containers. When combined with Kubernetes, which has become the default container orchestrator in recent years, containers enable Agile organizations to take a huge step forward in extracting full value from their investment in automation and DevOps.
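On Kubernetes, a pipeline stage can be expressed as a standard `batch/v1` Job: the runner image starts as a container in seconds rather than waiting through a VM template copy and boot. The sketch below builds a minimal Job manifest as a plain Python dict; the stage name, registry and image are hypothetical placeholders.

```python
import json

# One CI stage expressed as a Kubernetes Job (batch/v1). The image name and
# registry below are hypothetical placeholders for a real CI runner image.
def ci_stage_job(stage: str, image: str, command: list) -> dict:
    """Build a minimal Job manifest that runs one pipeline stage to completion."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": f"ci-{stage}"},
        "spec": {
            "backoffLimit": 0,  # a failed stage fails this pipeline run
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{
                        "name": stage,
                        "image": image,
                        "command": command,
                    }],
                },
            },
        },
    }

job = ci_stage_job("unit-test", "registry.example.com/app-ci:latest",
                   ["make", "test"])
print(json.dumps(job, indent=2))
```

Submitted with `kubectl apply -f` (or from a pipeline controller), each commit’s stages become short-lived pods scheduled onto the repurposed hardware.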

And, indeed, forward-leaning organizations are containerizing their applications and running their CI/CD pipelines on Kubernetes to accelerate Agile development. To get the most value out of Kubernetes-based CI/CD environments, it’s important to couple them with container-native storage, which supports instantaneous cloning of volumes and immediate access to data volumes across Kubernetes clusters.
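As a concrete sketch, Kubernetes exposes this kind of cloning through a PersistentVolumeClaim whose `dataSource` points at an existing claim (the CSI volume-cloning feature). The claim names and storage class below are hypothetical, and whether the clone is effectively instantaneous depends on the storage backend.

```python
import json

# A PVC that starts life as a clone of an existing test-data volume via the
# CSI volume-cloning dataSource. Names and storage class are hypothetical;
# the clone semantics require a CSI driver that supports cloning.
def clone_pvc(name: str, source_pvc: str, storage_class: str, size: str) -> dict:
    """Build a PVC manifest that clones source_pvc at creation time."""
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "storageClassName": storage_class,
            "dataSource": {"kind": "PersistentVolumeClaim", "name": source_pvc},
            "accessModes": ["ReadWriteOnce"],
            "resources": {"requests": {"storage": size}},
        },
    }

manifest = clone_pvc("regression-test-data", "golden-test-data",
                     "container-native", "50Gi")
print(json.dumps(manifest, indent=2))
```

Applied with `kubectl apply -f`, each pipeline stage gets its own writable copy of the golden test data; with container-native storage the clone is metadata-only, so no blocks are copied up front.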

In such an environment, Kubernetes clusters can easily be dedicated to development, test, integration, and staging environments, which gives DevOps engineers maximum flexibility. Test data can be cloned between different pipelines and moved seamlessly between Kubernetes clusters, reducing the duration of pipeline runs while maintaining segregation between the different environments. With container-native storage, the time it takes to move data between Kubernetes clusters and between pipeline stages is reduced to mere seconds.

Container Native

Organizations using Agile software development methodologies have come a long way. Automation is broader and deeper than it used to be, and DevOps now supports complex CI/CD pipelines that allow organizations to reap the full benefit of their automation and accelerate development. However, organizations still running on VMs are hamstrung by pipelines that take many hours to run and consume significant resources. To shorten pipelines and run more of them, modern software development organizations are containerizing their applications and running their CI/CD pipelines on Kubernetes with container-native storage, reaping the full benefit of the Agile process and accelerating software development.
