As Docker’s founder Solomon Hykes said at DockerCon EU this week, “We’re all responsible for the quality of our code.”
Testing is good; not testing is bad. Adequate, thorough test coverage lets you update without breaking things, validate the functionality of those updates, keep your continuous integration and continuous delivery (CI/CD) pipeline healthy, and make sure refactoring doesn’t break existing functionality.
It’s just that simple: you have to test early and often. But sticking to this philosophy isn’t quite so simple. Besides the obvious hurdle of having too much to do and too little time to do it, testing can dramatically slow down deployment.
What you need is a way to test efficiently, without eating up time and resources. Docker has emerged as one possible solution.
Testing with Docker Compose
Frank has been using Docker for about two years, starting with version 0.7.0. Her talk focused on testing patterns, illustrated with simple code examples, as a way to alleviate at least some of the frustrations associated with testing.
“Docker is a really easy way to provide very predictable testing across many platforms,” Frank said.
She started off by offering a walk-through of how to test via Docker Compose, which she says makes it incredibly easy to create consistent, reproducible environments. “A lot of us use Docker Compose to set up development environments already, but then we stop there, we run tests outside Docker Compose,” Frank said.
“In most cases, you need to do a bit of setup to run different tests, and you may even need a different Dockerfile to run them, but it’s only going to be that easy,” she continued, offering the example below.
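The example from the talk is not reproduced here; the following is a minimal sketch of what such a setup might look like, assuming a hypothetical app with a database dependency and a separate Dockerfile for test dependencies (the service names `web`, `db`, and `tests` are placeholders, not from the talk):

```yaml
# docker-compose.yml (sketch) -- hypothetical services for illustration
version: "2"
services:
  web:
    build: .
    depends_on:
      - db
  db:
    image: postgres:9.4
  tests:
    build:
      context: .
      dockerfile: Dockerfile.test   # separate Dockerfile with test-only dependencies
    command: pytest
    depends_on:
      - db
```

With a layout like this, `docker-compose up web` runs the development environment, while `docker-compose run --rm tests` runs the test suite against the same `db` service.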
She also mentioned that, right away, you can run a one-off command against a service with Docker Compose.
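Assuming a compose file with a service named `web` (a hypothetical name), a one-off command looks like this; it requires a running Docker daemon:

```shell
# Run a one-off test command against a service defined in docker-compose.yml.
# --rm removes the throwaway container when the command exits.
docker-compose run --rm web python -m pytest
```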
More Docker Compose testing configuration options can be found in her slides. She has also found that Docker lends itself to better dependency management.
Frank also recommends using Docker Compose as a testing and development environment. “Docker delivers a predictable and reproducible testing environment. It will be identical to your development environment, and you can also change it and extend it easily,” Frank said.
How Docker can Enable Easy CI/CD Testing
First and foremost, if you want continuous integration (and really, if you want continuous delivery, too, without breaking anything), you need to be testing constantly, so that errors surface as a failed run of your CI/CD system rather than in production.
“We are responsible for pushing code and then reviewing it, but you can’t do that without test coverage,” Frank said. “We run our unit containers during development, so we should run them in the testing containers too.”
What if you could run testing inside containers as well, isolating the tests, during both development and deployment? Similarly, Frank argues that if you are running services in containers during development, why wouldn’t you run them in containers in production, too? This all enables you to fail fast and early.
She said at Codeship, “We have Jenkins running in a container and then we want our other services and we want Jenkins to have control.”
Running Docker in Docker is not always great …
But this isn’t all rosy. As Jérôme Petazzoni wrote, running Docker in Docker is not always great. He warns you to look out to make sure that the inner Docker isn’t trying to apply security profiles that conflict with the outer Docker.
“My changes worked (and all tests would pass) on my Debian machine and Ubuntu test VMs, but it would crash and burn on” another machine, Petazzoni said. There are also many combinations that don’t work, like running AUFS on top of AUFS. He said there are workarounds for many problems; for instance, “if you want to use AUFS in the inner Docker, just promote /var/lib/docker to be a volume and you’ll be fine.”
And to avoid data corruption, Petazzoni offers this last piece of advice: “The Docker daemon was explicitly designed to have exclusive access to /var/lib/docker. Nothing else should touch, poke, or tickle any of the Docker files hidden there.” Frank offers the alternative of binding the Docker socket into the container and sharing a single daemon.
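A sketch of that socket-binding alternative, assuming the official `docker` image (any image containing the Docker CLI would do): the inner client talks to the host’s daemon, so containers it starts are siblings of the CI container rather than nested inside it. This requires a running Docker daemon on the host.

```shell
# Mount the host's Docker socket into the container instead of
# running a second daemon inside it (no Docker-in-Docker).
docker run --rm \
  -v /var/run/docker.sock:/var/run/docker.sock \
  docker docker ps
```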
How to use Docker for Parallel Testing
Frank says that when you’re developing, you tend to think of a container as a small virtual machine that runs a specific workload or a specific application, “but if we can kind of change our thinking to create containers as processes and not just machines, we can allow ourselves to be more imaginative,” and run multiple processes. She illustrated it with the steps below:
Similarly to Docker Compose, you define different testing steps and each step is run independently as a container. “And the great thing is that Docker makes these things identical and you can spin off different groups,” Frank explained.
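A hedged sketch of what such a parallel step definition might look like, loosely based on Codeship’s `codeship-steps.yml` format (the step names, service name `app`, and commands are hypothetical):

```yaml
# codeship-steps.yml (sketch) -- each step runs as its own container;
# steps under "parallel" run at the same time and report independently
- type: parallel
  steps:
    - name: unit_tests
      service: app
      command: pytest tests/unit
    - name: lint
      service: app
      command: flake8 .
```

Because every step is an isolated container started from the same image, the groups cannot interfere with each other, and a failure in any one of them fails the build.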
Frank shared download information for Codeship’s own free tool that does this, jet, which runs locally in your development environment.
With jet and Docker, “They are all sort of mixing together but they are all aggregating into one source,” Frank said. “With this parallel pipeline kind of framework, you can add an additional pipeline to fail if quality decreases.”
She went on to say that, if you are committed to quality, you should compare your current coverage against what it was before.
Frank ended her talk with sound advice to all developers on the importance of testing: “Work harder to know you’re wrong.”
Docker is a sponsor of The New Stack.
The New Stack is a wholly owned subsidiary of Insight Partners. TNS owner Insight Partners is an investor in the following companies: Docker, Bit.