Kubernetes: Capitalizing on Containers from Edge to Core to Cloud
Kubernetes was introduced six years ago, and in that relatively short time it has cemented its position as the de facto standard for container orchestration and management, with a global fan base that grows faster every year. Kubernetes and cloud native technologies have a wide range of applications, serving as solid and ubiquitous connective tissue for countless open source innovations: from simplifying the lives of developers and supporting business applications, to enabling different types of infrastructure and adding AI/ML capabilities.
For many CIOs, Kubernetes is the top-tier container management system. Kubernetes-related entries on GitHub now number 60,682, a figure that is only expected to grow as Kubernetes use expands rapidly into the enterprise and into new use cases such as edge computing.
Kubernetes offers numerous advantages, particularly when supporting DevOps. It makes managing and logically organizing containerized applications and services quick and easy, and it automates many operational tasks, such as scaling and availability management. The ubiquity of Kubernetes service offerings among cloud providers, together with the portability of containers themselves, means that businesses can even move workloads between providers. With all these benefits, it's no surprise that its popularity is soaring.
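To make the scaling and availability point concrete, here is a minimal, illustrative Deployment manifest (the names and the nginx image are examples, not from the study): declaring a replica count is enough for Kubernetes to keep that many copies of an application running and to reschedule them if a container or node fails.

```yaml
# Illustrative example: a Deployment asking Kubernetes to maintain
# three replicas of a simple web server. If a pod or node fails,
# the controller automatically starts a replacement.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-web        # hypothetical name for illustration
spec:
  replicas: 3           # desired number of running copies
  selector:
    matchLabels:
      app: demo-web
  template:
    metadata:
      labels:
        app: demo-web
    spec:
      containers:
      - name: web
        image: nginx:1.21
        ports:
        - containerPort: 80
```

Applied with `kubectl apply -f deployment.yaml`, the desired state can later be changed with a single command such as `kubectl scale deployment demo-web --replicas=5`, which is the kind of operational task Kubernetes automates.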
SUSE recently commissioned a global study with Insight Avenue, an independent market research agency, to explore how IT leaders are rising to today's challenges. The study found that the average share of production workloads that are containerized is expected to grow from 27% today to 34% in one year and 47% in two years. Along with increased demand for containerization, there is also an appetite for container orchestration, reflecting a growing maturity as organizations continue their journey to fully cloud native solutions.
In a nutshell, Kubernetes is essential for organizations looking to boost innovation now and in the future. It plays a vital role in helping businesses accelerate application delivery with containerized and cloud native workloads. With the rise of edge computing and hybrid cloud, new needs surface. According to Insight Avenue Research, IT leaders expect adopting IoT (82%), edge computing (80%) and, of course, open source (70%) to make a tangible difference to the success of their business.
Let’s examine some of the use cases behind those numbers.
Edge to Core to Cloud
Today edge computing is implemented across many different industries, and it supports a wide range of applications. Edge computing, like cloud or core computing, requires infrastructure, data and application management, plus seamless integration with on-premises private data centers and public clouds. While the "as-a-Service" approach of clouds and the device-based approach of edge may initially seem very different, the two are tightly interconnected and interdependent.
Smart, self-driving cars are a good example of an edge-to-core-to-cloud use case. Fully equipped with sensors and compute units, these cars are the endpoints of a massively distributed edge architecture. They generate a huge amount of data. Some of it is processed locally, because time-critical driving reactions cannot tolerate the latency of a round trip to a data center, while other data goes back to the automotive manufacturers' data centers (on-premises or colocated) to be stored, filtered, processed and analyzed. The system is a two-way street, as the cars require updates and instructions too. Finally, the public clouds, which host centralized data and end-user applications, connect to and with the cars.
There are several obvious components in this scenario (cars, data centers, clouds), but what we can't miss is the wide-ranging foundation: the people, solutions and processes that make it all work. There are active users, as well as IT and developer teams, leveraging tools and processes to manage applications, data, infrastructure, security and observability. This layer makes the entire system a reality and provides the necessary support for all components.
Of course, edge computing networks get very large, very fast, and they include various types of clouds and infrastructure. But open source projects such as K3s and KubeEdge give us the means to run Kubernetes in edge environments, and other cloud native technologies let us manage the whole ecosystem, for example through multicloud or multicluster management.
Certainly, this scenario can easily be adapted to sectors outside the automotive industry, such as retail or health care. The prerequisite for all of these use cases is an integrated infrastructure that spans from edge to core to cloud. Cloud native technologies are key and are now being adapted for edge computing.
Most importantly, all these advances will enable Kubernetes and other cloud native projects to become the unifying platform that makes edge computing accessible for the enterprise. In today's revolutionary technology landscape, Kubernetes and cloud native solutions are essential for organizations looking to boost innovation now and in the future.
To learn more about Kubernetes and other cloud native technologies, consider coming to KubeCon + CloudNativeCon EU, Aug. 17-20, virtually.