
The Year Ahead: The Rise of Containers as a Service

30 Dec 2015 12:07pm


The year 2016 will be the year for Containers as a Service (CaaS), according to Docker. Not only that, but “Ops-led Containers as a Service.”

As the company put it among its 2016 predictions:

“Deployment and use of containers in production will be greatly eased by Ops-led Containers as a Service architectures focused on enabling IT’s ability to deconstruct monolithic application architectures in favor of microservices. CaaS will succeed without requiring organizational changes as seen with the rise of DevOps, eliminating the need to retool and reskill by refocusing on what Ops can do for Dev …”

As Colm Keegan, senior analyst for Enterprise Strategy Group put it:

“Developers have largely driven the use of containers and done so somewhat independently. Operations folks, in their defense, haven’t had much exposure to container technology in the past and weren’t quite sure what to do with them. As a consequence of that, developers, out of necessity, started managing these environments themselves. … so out of necessity, they’ve had to assume operational control of this stuff. That becomes a problem because they’re spending a portion of their time wearing an operations hat instead of a developer’s hat.”

Offering it as a service means IT can sanction it as an enabled service and put all the needed controls such as security and governance around it, he said, providing the service not only in the local data center but also possibly across public clouds.

2015 brought widespread interest from operations organizations about how best to deploy and manage containers, according to Scott Johnston, SVP of product management for Docker. That has given rise to a number of vendors, including Rackspace, Joyent and Microsoft, as well as Docker, offering container services as “a true product category,” he said.

“It’s not just the recognition that we need to manage these containers, but offering a suite of products and having a very opinionated point of view about what’s required to manage those successfully,” Johnston said.

“How to manage it all the way from the developer’s laptop all the way through production was really not discussed or understood.”

“Operations is setting up these self-service portals that allow development teams from the very get-go [from laptop to production] to have an end-to-end solution. In contrast to some ops-centric or ops-only solutions out there, while ops is the buyer, they’re standing up infrastructure that’s going to be used and consumed by all players in the application-delivery pipeline. The reason it’s ops-led is that ops tends to own the servers, the data center and the underlying compute, storage, network resources.”

Applications are driving this next wave, and Ops wants not to get in the way of that but to support it, he said.

“They need products that can provide governance over those compute, network, storage resources and at the same time provide the flexibility, agility and portability of applications that the development teams are producing. Ops is writing the checks, but they’re trying to be a partner in this workflow that is largely driven by the application development teams,” Johnston said.

He used security as an example, saying 2016 will bring more requirements for end-to-end security solutions.

“You want to start with security from the get-go; you don’t want to do it just before it’s put into production because it’s just a Band-Aid at that point. One technology we’re seeing gaining a lot of traction in our ecosystem is image signing. Images are signed just when developers are going to share their application with others. Once it’s been signed, you can have the security team put their signature on it, the QA team put their signature on it … over time, you get a clear picture of what steps this particular container has been through.

“That allows the ops team to make real-time decisions – and ultimately automated decisions – on how to deploy that application. Say, ‘this container has super-secret IP, so it can only deploy in local data centers on premise,’ or ‘this is a simple test application that can be deployed to a public cloud.’ It’s a great example of how doing things at the beginning of the pipeline gives you huge benefits downstream. That’s just one example of how Containers as a Service provides the agility that developers are looking for and the control that Ops is looking for.”
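The policy decision Johnston describes can be sketched in a few lines. This is a hypothetical illustration, not Docker's actual signing (Notary/Content Trust) API: the team names, the policy rule, and the `deployment_targets` function are all assumptions made up for the example.

```python
# Hypothetical sketch of signature-driven deployment policy. The teams,
# policy, and function names are illustrative; real Docker image signing
# is handled by Notary / Docker Content Trust.

REQUIRED_FOR_PUBLIC_CLOUD = {"developer", "qa", "security"}

def deployment_targets(signatures, has_secret_ip):
    """Return the environments a signed image may be deployed to."""
    targets = ["on-premise data center"]  # vetted images can always run locally
    # Only images signed off by every team, and carrying no sensitive IP,
    # are allowed to leave the premises.
    if REQUIRED_FOR_PUBLIC_CLOUD <= set(signatures) and not has_secret_ip:
        targets.append("public cloud")
    return targets

print(deployment_targets(["developer", "qa", "security"], has_secret_ip=False))
print(deployment_targets(["developer"], has_secret_ip=True))
```

Because the signatures accumulate as the image moves through the pipeline, the same check can run unattended at deploy time, which is what makes the decision automatable.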

He sees similar developments with networking and storage.

“Enterprises are saying, ‘Don’t cause me to break my app as it moves through the application stages from dev to QA to prod. Let my app maintain its logical relationships and swap out the actual implementation as it goes through those stages.’ This is putting pressure on storage and networking vendors to produce implementation plug-ins, and it has given us great feedback in terms of what those APIs for networking and storage from an application viewpoint need to look like. So you’ll see focus on cleaner APIs at the application layer and increasingly different implementations of network and storage stacks.”
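The pattern behind that plug-in model can be shown in a language-agnostic sketch: the application binds only to logical resource names, while ops swaps the backing driver per stage. The driver classes and paths below are invented for illustration and do not correspond to any real Docker plugin.

```python
# Hypothetical sketch of the driver plug-in pattern: the app keeps its
# logical volume name while the implementation changes per stage.
# Class names, stages, and paths are illustrative only.

class LocalVolumeDriver:
    def mount(self, name):
        return f"/var/lib/volumes/{name}"

class VendorVolumeDriver:  # stand-in for a vendor storage plug-in
    def mount(self, name):
        return f"san://cluster/{name}"

DRIVERS = {"dev": LocalVolumeDriver(), "prod": VendorVolumeDriver()}

def mount_app_volume(stage, logical_name="app-data"):
    # The application only ever references the logical name; the backing
    # implementation is chosen by the deployment stage.
    return DRIVERS[stage].mount(logical_name)

print(mount_app_volume("dev"))   # local path in dev
print(mount_app_volume("prod"))  # vendor storage in prod
```

The cleaner the driver interface (here, just `mount`), the more freely vendors can innovate underneath it without breaking the application, which is the point Johnston and Cantrill both make.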

Joyent CTO Bryan Cantrill says those things have already been happening. He sees the real challenge as clearing up the confusion surrounding containers, work the Cloud Native Computing Foundation has taken on.

“The more I talk to practitioners, the more I believe this: the greatest impediments to containers are all these seemingly rival solutions in all these different parts of the stack and not really understanding how the pieces compete with one another or fit in with one another. Interoperability has been kind of a tertiary concern. What I think we’re going to see in 2016 is interoperability become a much more important concern. We need to shift our thinking from this kind of land-grab mentality into this interoperable world. We need to define some of these interface boundaries, then people can innovate underneath them.

“With microservices, we’re not going to move to some vertically integrated monolithic stack where one provider provides everything.”

He says that’s not something vendors want to hear because they’d like to think they can provide that next “magic stack.”

“It’s going to be much more heterogeneous, much more composable, much more interoperable than that.”

Jay Lyman, research manager for cloud management and containers at 451 Research, also noted growing interest from Ops in container management.


His organization’s Voice of The Enterprise (VOTE) survey of 991 enterprise-IT pros in Q1 2015 found that more than half of enterprises were in the discovery and evaluation phase with containers. But it also found significant interest in application containers beyond experimentation: more than 20 percent of enterprises indicated container use in trials and pilots, test and development, and initial or broad implementation of production applications.

“Given the fact that Amazon and other public clouds are a huge part of growing enterprise cloud adoption, it is not surprising to see user demand and vendor response for containers in a SaaS model, as companies such as Shippable and now Docker provide,” he said.

“We also saw this year more focus from vendors on the IT operations side of containers, beyond developers, which indicates that while the application container trend is still relatively early in its development, it is already maturing in the enterprise IT market.

“Finally, while we had a de facto standard for application container format from the start with Docker, it is still an open game at the container management and orchestration layer, with software and providers such as Amazon EC2 Container Service, CoreOS Tectonic, Docker Swarm, Google Container Engine, Kubernetes and Mesosphere all getting significant interest and use, but also used together in true open source coopetition.

“This is sort of the next big question for enterprises as they adopt application containers for more of their infrastructure management and application releases: What do I use to manage these containers at scale and in production? I expect we’ll begin to see a few of these emerge as the clear enterprise leaders over the course of 2016.”

Keegan’s research, conducted last month, also points to gradual adoption of containers in the enterprise and little use in production, contrary to previous reports.

In a survey of 308 IT decision-makers, 36 percent said they were using containers in a limited capacity, such as in a development environment inside a VM; 32 percent said they are currently testing them; 16 percent said they’re planning to use containers; and 8 percent said they’re interested in containers but have no current plans for them.

He said most enterprise folks he’s talked to are still looking to use containers inside VMs.

“I still think there’s a big educational element to it, trying to figure out how containers are going to co-exist with virtualized environments and what the longer-term implications are. Most people I’ve talked to don’t see them as mutually exclusive. You can have both. It’s not like containers are going to rule the universe and virtual machines are going away.”

Joyent and Docker are sponsors of The New Stack.

Feature Image: “Japanese garden” by Dean Hochman, licensed under CC BY 2.0.

