Red Hat sponsored this post.
We were curious about what enterprises actually run on Kubernetes, so we reached out to 100 enterprise tech leaders across industries and asked them what workloads they were running on the platform. Some of the results, while not surprising to us, are not what you would expect if you have only read about Kubernetes and never used it.
For a start, the one workload that was supposedly unsuited to containers turns out to be one of the most commonly hosted: databases. Conventional wisdom holds that hybrid cloud applications should avoid state as much as possible, and that containers should generally be treated like cattle, not pets.
There is nothing more pet-like than a relational database. Yet 80% of our respondents said they run databases or data caches in Kubernetes. That included the expected data stores for cloud applications, such as Redis and MongoDB. More surprisingly, our survey showed that MySQL and PostgreSQL were also in this category and were also very common workloads.
That is a lot more saved state than would have been expected in a Kubernetes cluster back in 2015, but today, the platform is stable and reliable enough to host this type of workload and to do so in a redundant, backed-up fashion.
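What makes a database practical to host on Kubernetes today is largely the StatefulSet, which gives each replica a stable identity and its own persistent storage. A minimal sketch of such a manifest, built as a plain Python dict (all names and values here are hypothetical examples, not from the survey):

```python
# Hypothetical sketch: build a Kubernetes StatefulSet manifest for a
# PostgreSQL deployment. A StatefulSet gives each replica a stable
# network identity and its own PersistentVolumeClaim, so the database's
# data survives pod rescheduling.

def postgres_statefulset(name: str, replicas: int, storage: str) -> dict:
    """Return a StatefulSet manifest as a plain dict."""
    return {
        "apiVersion": "apps/v1",
        "kind": "StatefulSet",
        "metadata": {"name": name},
        "spec": {
            "serviceName": name,  # headless service for stable per-pod DNS
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": "postgres",
                        "image": "postgres:15",
                        "ports": [{"containerPort": 5432}],
                        "volumeMounts": [{
                            "name": "data",
                            "mountPath": "/var/lib/postgresql/data",
                        }],
                    }],
                },
            },
            # Each replica gets its own PVC, retained across restarts.
            "volumeClaimTemplates": [{
                "metadata": {"name": "data"},
                "spec": {
                    "accessModes": ["ReadWriteOnce"],
                    "resources": {"requests": {"storage": storage}},
                },
            }],
        },
    }

manifest = postgres_statefulset("orders-db", replicas=3, storage="10Gi")
print(manifest["kind"], manifest["spec"]["replicas"])  # → StatefulSet 3
```

The `volumeClaimTemplates` section is the key difference from a stateless Deployment: it is what turns a "cattle" pod into one whose data follows it around.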
That syncs up with what we saw around stateful workloads as a whole. Of our respondents, 77% said they were planning to run a mix of stateful and stateless workloads on Kubernetes. In fact, only 1% were deploying exclusively stateless workloads, and 9% said they were deploying mostly stateless workloads. The remaining 13% were planning to host mostly stateful workloads.
This does seem to indicate that the workloads being run on Kubernetes are far more in line with the business than with some mythical list of acceptable containerized applications. Instead, enterprises seem to be moving their existing workloads and expanding into new workloads that are related to artificial intelligence/machine learning (AI/ML), analytics and data management/ingestion.
In fact, data ingestion tools were the second-most-popular workloads in our survey, coming in with 66% of respondents stating they are deploying things like Apache Kafka, Apache Spark and other big data tools onto their Kubernetes estates.
In a similar vein to data ingestion, 60% of those surveyed ran logging and monitoring tools such as Elasticsearch and Kibana; 56% said they used web servers like NGINX; and application servers, mostly Java-related, came in at 52%.
While those application servers generally represent the older applications that have been at the core of enterprises since the Java revolution took place in the early 2000s, they were tied overall as a category with the up-and-coming AI/ML applications.
Of our survey respondents, 52% said they were working with Jupyter Notebooks, Python, TensorFlow, PyTorch and other AI/ML tools. Let’s stop and think about that for a moment: Right now, inside these 100 businesses, the future-looking AI/ML applications they are designing are taking up just as much compute time and cloud space as their traditional Java applications.
That is the heart of a sea change, rearing its head in the form of workloads themselves. Even three years ago, we were only beginning to see AI/ML workloads show up in these surveys, and now more than half of our participants are running these workloads.
While that is important for the future, it also indicates that those older workloads are making it onto Kubernetes at a steady clip. Seventy percent of those in our survey wanted flexibility in the way they modernize their Java and .NET applications, with a strong desire for those applications to coexist with microservices architectures and platforms.
When it comes to modernizing those legacy Java and .NET applications, there are a few concerns that remain top of mind for our survey participants. Fifty-seven percent said they wish to reduce modernization time and costs, while 52% said they are having trouble settling on the proper modernization technology.
Of course, technology is not always the bottleneck here. In fact, exactly half of those in our survey cited a lack of knowledge about their legacy applications as a major roadblock. This can be compounded by the lack of skills out there in the job market: 43% cited scarcity of skills as a challenge.
When we get more specific about the usage of Kubernetes and its various capabilities and features, we find that Kubernetes Operators are also pulling a great deal of weight when it comes to saving developers time and energy. Sixty-one percent of those using Operators said they saved developers’ time and reduced the need for a larger team; 49% said Operators simplified their deployment of applications; and 44% said they saw a reduction of errors with deployment and life-cycle management when using Operators.
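Those time savings come from what an Operator actually does under the hood: it runs a reconcile loop that compares the desired state declared in a custom resource with what is actually running, and converges the two. A stripped-down, dependency-free sketch of that loop (the resource names and fields are hypothetical):

```python
# Hypothetical sketch of the reconcile loop at the heart of the
# Operator pattern: compare declared (desired) state with observed
# state and emit the actions needed to converge them. A real Operator
# would watch the Kubernetes API and apply these actions; here we just
# compute them.

def reconcile(desired: dict, observed: dict) -> list:
    """Return the actions needed to move observed state toward desired."""
    actions = []
    # Create anything declared but missing; scale anything that drifted.
    for name, spec in desired.items():
        if name not in observed:
            actions.append(f"create {name} (replicas={spec['replicas']})")
        elif observed[name]["replicas"] != spec["replicas"]:
            actions.append(f"scale {name} to {spec['replicas']}")
    # Delete anything running but no longer declared.
    for name in observed:
        if name not in desired:
            actions.append(f"delete {name}")
    return actions

desired = {"cache": {"replicas": 3}}
observed = {"cache": {"replicas": 1}, "old-job": {"replicas": 1}}
print(reconcile(desired, observed))  # → ['scale cache to 3', 'delete old-job']
```

Because the loop runs continuously, drift and deployment errors are corrected automatically, which is the mechanism behind the error reduction our respondents reported.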
Just which benefit of Kubernetes saves the most time and trouble? Well, according to our survey, it is essentially the same benefit Kubernetes and Linux containers have always promised: easier hand-off from development to IT, and lower overhead overall.
Those benefits can be seen very clearly in our survey, where 70% of our respondents said one of their top three reasons for deploying Kubernetes and containers was to increase agility; 67% cited consistency across environments; and 60% said they moved to Kubernetes for the scalability.
While the original cloud revolution was a bit like a political campaign in which only one candidate could win, businesses have since adopted an open hybrid cloud model. In fact, no one in our survey said they were using only one cloud. Instead, 93% said they used exactly three clouds.
Indeed, 58% of respondents said that the ability to run these workloads consistently across the hybrid cloud is an important consideration.
While there is quite a future still ahead of Kubernetes, this survey showed us fairly conclusively that workloads of all shapes and sizes are now running comfortably on Kubernetes and inside Linux containers. Not only that, but they are being run in production and taking over the roles of legacy VM-based applications.
These newly containerized applications are also not just running in one cloud. They must function in multiple regions, across multiple cloud providers. This is where the real benefit of Linux containers and Kubernetes Operators can be brought to bear upon the IT estates of enterprises. By reducing the complexity to deploy these applications around the world, on-demand and closer to the data they are processing, businesses are increasing their velocity, and better utilizing their development resources.
If you would like to see the full results of this survey, you can download it here.