As Enterprises Experiment with Containers, Secure Access Remains a Challenge
The speed of business development is ever-accelerating, with faster software release cycles, automated continuous integration and continuous delivery pipelines, extreme scalability and elasticity of production environments, and public and private cloud adoption. Businesses — and their developers — need new tools and methods to keep up with the speed of innovation. In this environment, containers have emerged as the future of software deployment.
By bundling an application with its supporting code, libraries, settings and assets into a single package, containers make software deployment faster, easier and more consistent across computing environments.
Containers support the style of work that modern developers want by offering speed, efficiency and automation. Now, instead of having to spend valuable time worrying about configuring an application environment, developers can acquire everything they need to deploy an app with just a few lines of code in Kubernetes, the most popular orchestration platform for Docker container management.
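To illustrate just how little configuration this takes, here is a minimal Kubernetes Deployment manifest. It is a hypothetical sketch: the application name, image and replica count are placeholders, not details from this article.

```yaml
# Hypothetical example: a minimal Deployment that runs two replicas
# of a container image. Names and the image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
      - name: demo-app
        image: nginx:1.25   # stand-in for any packaged application image
        ports:
        - containerPort: 80
```

Applying this with `kubectl apply -f deployment.yaml` is typically all a developer needs to get the app running; Kubernetes handles scheduling, restarts and scaling from there.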
Enterprise + Containers
As a result, the use of containers is becoming more and more popular. Research indicates that the containers market will be worth more than $4.3 billion in 2022, and a recent survey of 519 companies shows that 81% are currently running container technologies, compared with just 55% in the previous year.
For startups and cloud-first businesses, experimenting with containers for software deployment is relatively straightforward, since they can run containers on fully cloud-hosted infrastructure such as Amazon Web Services, Google Cloud, or Microsoft Azure.
Like startups and cloud-first companies, enterprises also want to move toward experimenting with container-based deployments, so they can innovate and keep their developers happy. Many enterprises are rolling out internal “innovation incubators” that are empowered to think and work like startups within the corporate environment. Developers within these teams often have the latitude to work with the tools they prefer, which can include Docker.
But containers are not as simple for enterprises to implement. The realities of an enterprise’s highly regulated and risk-averse IT climate mean enterprises often can’t deploy containers to popular public cloud container platforms. Instead, enterprise developers in, for example, the financial sector, need to host Docker and Kubernetes within a more controlled virtual private cloud or hybrid infrastructure.
The Secure Access Problem
Here’s the crux of the problem. IT administrators still need to govern access to the container infrastructure, just like any other system. And it’s becoming more difficult as the types of systems that they must govern evolve.
Consider how secure access is changing to adapt to modern ways of working. In a legacy enterprise on-premises server environment, IT admins are already dealing with millions of unmanaged SSH keys, which may be redundant, poorly configured and non-compliant, even as they enable critical processes. Now throw on top of that the typical workflow of modern enterprise development teams, who spin up hundreds of new cloud servers each day but do only the minimum when it comes to security. They may tick the box by using Security Information and Event Management (SIEM) tools like Splunk, but shouldn’t be expected to take the extra step of creating an audit trail for day-to-day privileged access.
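As a rough illustration of why key sprawl is so hard to audit, the sketch below counts the public keys that grant access through a single `authorized_keys` file. This is a hypothetical illustration only; real key-discovery tooling scans every account on every host, while this inspects just one file.

```shell
#!/bin/sh
# Hypothetical sketch: count how many SSH public keys grant login
# access via one authorized_keys file. Each non-comment line in the
# file is one standing access grant that an admin must account for.
count_authorized_keys() {
    # Count lines that start with a key type (ssh-rsa, ssh-ed25519,
    # ecdsa-sha2-*, ...); blank lines and comments are ignored.
    grep -c -E '^(ssh|ecdsa)-' "$1"
}
```

Running `count_authorized_keys ~/.ssh/authorized_keys` for every user on every server hints at the scale of the problem: each line is an access grant that may belong to a departed contractor or a forgotten process.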
As a result, IT admins struggle to determine who has access to what, creating several major compliance questions. What if someone has access to root infrastructure, or customer credit card data, or other sensitive information, when they should not? What if a third-party contractor or freelancer continues to have access to critical IT resources long after the engagement has ended? It’s a compliance time bomb that’s unacceptable in any enterprise.
Chief information security officers (CISOs) are also under fire on multiple fronts, as they respond to somewhat conflicting demands from both developers and IT admins. Developers want as much freedom as possible to explore containers, while the IT admins want to retain a holistic view of their entire security landscape.
Put one way, in the face of growing complexity in IT environments, IT administrators need to find a way to keep environments secure and running smoothly, without adding friction to the speed of business. Put another, enterprises need to find a way to stay on the cutting edge of container technology, without making any sacrifices to their overall security.
So, how can enterprises take advantage of their Kubernetes container environments while ensuring they remain secure?
Enabling Secure Innovation
Enterprises need a way to control, manage, and automate up to millions of access keys for the operating systems that host Docker or Kubernetes. With full key lifecycle management capabilities, IT admins can retain a holistic view of the entire security landscape. Meanwhile, developers are free to explore containers as the future of software development, within that highly regulated environment.
IT admins don’t have to completely lose control of the security environment in order to give developers that freedom. And, internal auditors can rest easy, knowing that the company’s agile software development processes deliver regulatory compliance.
In this way, we can make it easy for enterprises to realize the business benefits of packaging and running software in containers, while ensuring cost-effectiveness, agility, and compliance at the same time.
Feature image by suju from Pixabay.