
3 Reasons Why You Can’t Afford to Ignore Cloud Native Computing

25 Apr 2019 10:16am, by Anita Buehrle

KubeCon + CloudNativeCon sponsored this post. KubeCon + CloudNativeCon, May 20-23 in Barcelona, will feature key maintainers behind popular projects like Kubernetes, Prometheus, gRPC, Envoy, OpenTracing and other domain experts, adopters, and end users to further the education and advancement of cloud native computing.

Anita Buehrle
Anita has over 20 years’ experience in software development. She’s written technical guides for the X Windows server company, Hummingbird (now OpenText) and also at Algorithmics. She’s managed product delivery teams and developed and marketed her own mobile apps. Currently, Anita leads content and other market-driven initiatives at Weaveworks.

The past five years have seen a rapid evolution of technologies collectively known as cloud native computing. This shift to cloud native technology has fundamentally changed the way software is developed today, driving the adoption of DevOps practices and other new software delivery strategies. The main benefit is an increase in your team's velocity, which satisfies customer demands and keeps organizations competitive in the marketplace.

These technologies have also been instrumental in the success of tech powerhouses such as Netflix, as well as digital native companies like Uber and others changing business models in highly disruptive ways.

But while most organizations lack the DevOps resources of the Netflixes and Ubers of the world, these and other tech giants have created replicable cloud native frameworks that smaller enterprises can benefit from.

What Cloud Native Looks Like

Cloud native computing embodies a set of architectures, as well as best practices for developing and delivering applications built on them. Typically, cloud native applications are architected as a set of microservices that run in Docker containers, are orchestrated by Kubernetes, and are managed and deployed using DevOps and GitOps workflows:

Microservices Architectures

Docker containers lend themselves very well to microservices. When microservices run in separate containers, they can be deployed independently and even be written in different languages. Because containers are portable and operate in isolation from one another, it is easy to build a microservices architecture with them, and to move services from one environment to another, or to another public cloud, if you need to.
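As a sketch of how one service gets its own container, here is a minimal Dockerfile for a hypothetical Python-based microservice (the file names, base image and port are illustrative assumptions, not from a real project):

```dockerfile
# Hypothetical microservice: each service gets its own image,
# so it can be built, versioned and deployed independently.
FROM python:3.7-slim
WORKDIR /app

# Install the service's dependencies first, so Docker can cache this layer.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy in the service's source code.
COPY . .

# The service listens on its own port, isolated from other services.
EXPOSE 8080
CMD ["python", "service.py"]
```

Another team's service could use a completely different base image and language; the container boundary is what keeps them independently deployable.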

Automatic Deployment and Orchestration

Microservices are deployed and "orchestrated" by Kubernetes, an automated and scalable orchestration system. With Kubernetes, an application's containers are grouped into logical units for easy management and discovery. A key advantage of Kubernetes is its ability to scale with your app without you needing to add more resources to your Ops team.
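For illustration, a minimal Kubernetes Deployment manifest (the service name, image and port below are hypothetical) groups an app's containers into one labeled, logical unit and declares how many replicas should run:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app              # hypothetical service name
spec:
  replicas: 3                    # Kubernetes keeps three pods running
  selector:
    matchLabels:
      app: example-app           # the label that groups these pods
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
        - name: example-app
          image: example/app:1.0   # hypothetical container image
          ports:
            - containerPort: 8080
```

Scaling is then a one-line change: bump `replicas` and Kubernetes converges the cluster to the new desired state, with no extra provisioning work from your Ops team.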

Modern DevOps and GitOps Best Practices for Software Delivery

With applications running in containers and orchestrated by Kubernetes, the next step is to automate deployments to increase your team's productivity. A continuously automated flow of features is what distinguishes DevOps from other software development philosophies and practices, such as the waterfall model, where development follows an orderly sequence of stages. With Kubernetes, the most efficient way to speed up your team is to use tools and workflows that are already familiar to them. GitOps is the most effective way to run deployment pipelines to Kubernetes.

Now that you have an idea of what a cloud native application looks like, what are the main reasons for starting the cloud native transformation?

Reduced Time to Market and More Efficient Operations

Enterprises that are making the digital transformation and need to move their business forward to stay competitive are interested in creating what's been referred to as an "invisible infrastructure" for their development teams. To move faster, the infrastructure changes and requests that can slow down development need to be reconsidered. Because Kubernetes and the applications that run on it are almost completely declarative, infrastructure can be kept alongside application code in a source control system such as Git. With the entire system described in one place, it's then very simple for developers to make changes not only to their applications but to the infrastructure as well, a practice known as GitOps.

Developers Use Familiar Tools and Workflows with GitOps

GitOps is an operating model for building cloud native applications in Kubernetes. The goal of GitOps is to speed up development so that your team can make both infrastructure changes and application updates safely and securely to complex applications running in Kubernetes. It does this using the tools and workflows that developers use every day, like Git and GitHub.

What DevOps Is to the Cloud, GitOps Is to Cloud Native

Cloud native is the impetus behind the DevOps shift that has led us to a whole new set of methods and philosophies on how to do software development. Companies that adopt cloud native and DevOps best practices like GitOps can increase their deployment frequency from one or two deployments a week to 150 or more in a single day.

With the ability to continuously deploy changes, development teams can conduct advanced deployments like canary tests and roll out features to subsets of customers more easily. Also, since your entire system is kept in Git, rolling backward or forward is only a click away, which allows developers to make frequent, low-risk changes that can be easily backed out. With this kind of automation and velocity, enterprises can bring new ideas to production within minutes or hours instead of weeks or months, resulting in a greater rate of innovation and competitiveness.
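As a minimal sketch of that rollback story, here is plain Git on a hypothetical config repository (no real cluster involved); reverting a bad change is just another commit:

```shell
# Hypothetical GitOps config repo: the desired cluster state lives in Git.
mkdir -p /tmp/gitops-demo && cd /tmp/gitops-demo
git init -q
git config user.email "dev@example.com"   # throwaway identity for the demo
git config user.name "Dev"

# Desired state: 3 replicas of a (hypothetical) service.
echo "replicas: 3" > deployment.yaml
git add deployment.yaml
git commit -qm "Scale service to 3 replicas"

# A bad change lands...
echo "replicas: 300" > deployment.yaml
git commit -qam "Mistake: scale to 300 replicas"

# Rolling back is just another commit: revert the bad one.
git revert --no-edit HEAD >/dev/null
cat deployment.yaml   # prints: replicas: 3
```

In a real GitOps pipeline, an agent would watch this repository and reconcile the running cluster to match each commit, so the revert above would automatically undo the bad deployment.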

Competitive Advantage and Improved Bottom Line

The number one motivation for readjusting your software delivery and infrastructure strategy toward cloud native is competitive differentiation, which ultimately leads to an improved bottom line. There are also a number of other benefits to take advantage of once you've made the transition, including:

Increased Reliability and Scalability

On-demand elastic scaling or cloud bursting offers near limitless scaling of compute, storage and other resources. Enterprises can take advantage of built-in scalability to match any demand profile without the need for extra infrastructure planning or provisioning.

GitOps and DevOps best practices not only provide developers with a low-risk method of reverting changes, clearing the way for innovation; because you can also roll back cleanly, recovery from a complete cluster disaster is faster, too. Higher uptime guarantees mean businesses are more competitive and can offer more stringent service-level agreements and a better quality of service.

Lower Infrastructure Costs

Because cloud native technology enables pay-per-use models, the economies of scale are passed through, and spending shifts from CAPEX to OPEX. With the barrier of upfront CAPEX spending lowered, organizations can invest more in application development rather than in infrastructure. And since cloud native infrastructure is more flexible and portable, Total Cost of Ownership (TCO) is also much lower.

Attract and Retain Top Talent

Working with cloud native and other cutting-edge open source technology that lets you move faster and spend less time on infrastructure is appealing to developers. Hiring higher-quality developers results in better products, and therefore more innovation for your business. An added bonus is that open source contributions can help establish your reputation as a technology leader.

Reduced Vendor Lock-in

Cloud native gives you a choice of tools rather than leaving you stuck with legacy offerings. By taking advantage of multicloud-compatible tooling wherever possible, your applications become more portable and stay beyond the reach of predatory vendor pricing. You can more easily migrate to an alternate public cloud with better product offerings, or wherever compliance requires multicloud infrastructure.

Flexible Infrastructure with a Supportive Ecosystem and Community

The Cloud Native Computing Foundation (CNCF) was created four years ago and is the vendor-neutral home of Kubernetes, an open source system for automating the deployment, scaling and management of containerized applications. Kubernetes was originally created by Google, drawing on its years of experience running containerized workloads at scale, and today it has contributions from Amazon, Microsoft and Cisco, as well as more than 300 other companies.

The main mission of the CNCF is to build sustainable ecosystems and a community around a constellation of high-quality projects that support and manage containers for cloud native applications built on Kubernetes.

Off-The-Shelf Cloud Native Components

By taking advantage of the many incubating projects available in the CNCF, enterprises can easily set up the infrastructure and lay the groundwork for IT teams to innovate more. Before cloud native technology, adding a new business component to a monolithic platform meant hiring a team of consultants, and the work could take months to implement.

But now, much of that time is saved by using the CNCF's landscape of off-the-shelf, community-supported components. This allows businesses to focus on more important tasks, like introducing machine learning or other data science methods to increase their innovation and competitiveness.

Cloud native technology has introduced a different method of doing software development from 15 years ago, when it took a monumental coordination effort to deploy a single change. With technology like Kubernetes and its supporting ecosystem, and with best practices like GitOps, deploying changes on a continuous basis becomes trivial.

To learn more about containerized infrastructure and cloud native technologies, consider attending KubeCon + CloudNativeCon, May 20-23 in Barcelona.

Feature image via Pixabay.
