Why You Should Care About Docker

As I wander through the excited conversations at DockerCon, I wonder: how do I explain Docker to my wife, at home in Portland watching over our sick 18-month-old? What is so compelling about Docker that it would have me travel 600 miles to hang around with geeky 30-year-olds?
Much of the news out of Docker right now requires you to know the vagaries of cgroups, systemd, and LXC. Opening a conversation with those technologies is a surefire way to end it, unless you're talking to someone with a Stack Overflow or Server Fault reputation over 1K. I wanted to discover what makes people excited about Docker beyond these highly technical terms: what makes their jobs easier, what streamlines business applications, what makes companies stronger.
Docker accelerates technology adoption, even in conservative organizations
At lunch yesterday, I spoke with two developers from a large Fortune 500 financial services company. They described how difficult it is to inject new technologies into their environment. Security roles inside large organizations are designed to say “no” to new technologies, creating constant conflict with more progressive groups, like developers, who are eager to adopt and use them.
Docker, as a standardized delivery system, pushes the responsibility for resource allocation and security isolation into the container itself, taking it off the plates of security and operations teams. Though not a silver bullet, this makes it more likely that security teams will approve new technologies, since they only need to verify that the Docker container process is secure. This is a game changer.
Docker makes it trivial to maintain legacy OS and code
Docker makes it trivial to keep legacy operating systems running, no matter what flavor of Linux you use. Related to the issue above, many large organizations have aging legacy systems and codebases they must support, a problem small startups don't have. When I raised this with Fabio Kung from Heroku and Rafael Rosa, Fabio noted that Docker makes supporting legacy systems and code trivial: you no longer have to run expensive bare-metal servers to host each legacy system. With Docker you get an inexpensive alternative to heavyweight VMs (as long as your legacy system is, or runs on, a Linux variant). Docker reduces the pain and cost of maintaining old systems, and it even records the setup process in a versioned “Dockerfile.”
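To make that last point concrete, a Dockerfile can pin an old base image and record every setup step in version control. A minimal sketch follows; the base image tag, package, and paths are illustrative assumptions, not details from any system mentioned in this article:

```dockerfile
# Hypothetical example: reproduce a legacy environment by pinning an old base image.
FROM ubuntu:12.04

# Install the legacy runtime the application depends on (assumed here to be Python 2.7)
RUN apt-get update && apt-get install -y python2.7

# Copy the legacy codebase into the image (path is illustrative)
COPY ./legacy-app /opt/legacy-app

# Start the legacy service
CMD ["python2.7", "/opt/legacy-app/server.py"]
```

Because this file lives alongside the code, the exact steps to rebuild the legacy environment are versioned with it, rather than trapped in a snowflake server.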
Docker rapidly reduces “shipping” pain
Executives and decision makers tolerate CI servers, unit testing, and agile development practices, but they really care about one thing: shipping code, the last step in the development chain, often referred to as “deployment.” Ironically, even with all the tools mentioned above and more, deployment is still a major pain for developers. As Spotify engineer Rohan Singh pointed out to me yesterday, there is still a massive pain point between committing the final tested code and getting it running on production servers. Docker vastly simplifies this final step. That matters to executives and developers alike, and it gets features in front of customers faster.
Docker solves problems for the Fortune 500 as much as it does for startups
What was most interesting to me as I spoke with people at the conference was how developers from many large organizations saw huge value in Docker. Docker adoption and development is occurring at such a rapid pace that you would expect only startups and early adopters to be able to keep up, but as it turns out, Docker is quickly showcasing its relevance to organizations both large and small.
With companies large and small adopting and contributing to Docker, these benefits grow and grow. It is an exciting time to be at DockerCon 2014.
Chris Dawson is a developer and the creator of Teddy Hyde, a platform built on top of Twitter Bootstrap that uses the Jekyll blogging platform.
Flickr image via Creative Commons