Docker-Based Dynamic Tooling: A Frequently Overlooked Best Practice

Containers are quickly becoming the universal deployment format among large enterprises and small companies alike. Docker naturally acts as the common ground between developers and operations, allowing for easy deployments and self-contained releases.
Using containers for deployments is indeed a welcome transition from the old bare-metal and Virtual Machine (VM) world, as their small footprint (both in size and startup time) has allowed organizations to deploy much more frequently than before. Decreasing the time between releases is a constant goal for any organization, because it ensures that new features reach customers as soon as they are implemented.
Unfortunately, this quick transition from VMs to Docker images has overshadowed another big, rarely mentioned advantage of containers: the benefit to developers and operations of using containers in the Continuous Integration (CI) process in the form of dynamic tooling. This is a game-changing characteristic of containers, arguably even more important than their use as a deployment artifact.
This misconception about how containers can be used in the CI/CD (Continuous Integration/Continuous Deployment) process is now so prevalent that “Docker adoption” is almost synonymous with production deployments. This could not be further from the truth, and in this article we will explain why leveraging Docker-based tooling is an important and independent part of the full Docker adoption process.
Dynamic Build Nodes with Docker
In a traditional CI environment, all machines that perform builds have a superset of all the possible tools a developer might need. Each node comes with pre-installed versions of the build, testing, and provisioning tools that have been adopted in the company.
Supporting multiple versions of the same tool is a big challenge, and in really large organizations where different teams use different technologies, the effort required to maintain the build nodes can quickly get out of hand.
The advent of containers (in the form of Docker) presents us with a more intuitive and streamlined approach: dynamic tooling. With dynamic Docker tooling, all build nodes start with exactly one thing installed, Docker itself, and nothing else.
Then, at build time, only the specific tools needed for the job at hand are launched as Docker containers. Once the build is finished, the build node reverts to its original state (i.e. completely empty of any tool).
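As a rough sketch, a build step that needs Maven could look like the following, where the image tag and paths are purely illustrative:

```sh
# Launch the build tool as a throwaway container; nothing is installed
# on the node itself, and --rm discards the container when the build ends.
docker run --rm \
  -v "$(pwd)":/workspace \
  -w /workspace \
  maven:3.9-eclipse-temurin-17 \
  mvn clean package
```

The node only ever holds the (cached) image; the tool itself never touches the host.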
This simple approach is very powerful and has several advantages both for developers and operators as we will see in the next sections.
The Dark Ages of Static Build Tooling — Developers’ Viewpoint
Now that we have seen how you can adopt Docker for the CI process alone, rather than full CD, we need to explain the advantages of Docker-based tooling. The easiest way to see the benefits is to look at the shortcomings of the traditional static build approach.
In a static tooling platform, the build nodes are long-running and loaded only with “approved” build tools. This creates a lot of productivity issues (and frustration) for developers:
- Upgrading to a new tool version must first be requested from operations, resulting in very slow upgrade cycles.
- Developers are forced to configure their own workstation according to what is available on the build node.
- Creating a brand-new project with new frameworks and tools requires a lot of effort, as all build nodes must be upgraded to accommodate it.
- Developers must keep track of build node capabilities and make sure that their build job is actually sent to a node that fulfills all the requirements.
- Using multiple versions of the same tool on a build node is always a big challenge. In extreme cases, developers are forced to change the libraries of their project simply because a build node was upgraded or downgraded to a different tool version.
The adoption of cloud-based architectures has further exacerbated this problem, as it is now possible for a single organization to deploy to multiple externally controlled platforms at the same time.
Using multiple cloud infrastructures requires provisioning and deployment tools that are completely different for each deployment target. The rate of tool development is very fast, and operators often cannot keep up with all the changes needed on a build node.
The end result is that developers are perpetually unhappy, feeling that the build platform is working against them. There is constant tension between developers and operators with regard to build tool availability.
The Benefits of Dynamic Docker Tooling for Developers
With dynamic Docker tooling, communication between developers and operators becomes very easy. There is only one hard requirement for a build node, and that is Docker itself.
Once Docker is installed on the build node, any developer can launch Docker images with the specific tools needed by a particular project. The operator is no longer an obstacle to adopting new frameworks and new libraries.
The dynamic nature of this approach comes from the fact that Docker containers are short-lived. They only exist as long as the respective build requires them. This is a huge difference compared to the traditional practice of having pre-installed tools in the build node.
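For example, two projects that need different versions of Node.js can build on the same node, back to back; the tags shown here are illustrative:

```sh
# Legacy project pinned to an older Node.js release
docker run --rm -v "$PWD":/app -w /app node:14 npm test

# Greenfield project on a current release, on the very same build node
docker run --rm -v "$PWD":/app -w /app node:20 npm test
```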
Developers are now happy (and more productive) because:
- They can use any version of a framework they choose right away.
- It is very easy to create new projects that use a completely new architecture.
- All build nodes are equal, so they can send their jobs to any node knowing beforehand that tool version mismatches will never happen.
- Using multiple versions of the same tool is now trivially easy (even within the same project).
- They are never forced to upgrade their library versions. Legacy projects can still use completely different tool versions than greenfield projects.
- Build nodes are “self-cleaning”, so they never have to worry about clashes between tool versions.
- Communication with operators becomes very straightforward. The only major topic to discuss becomes the version of the Docker daemon in the build node.
Dynamic Docker-based tooling feels like a renaissance to developers that are accustomed to the constraints of the traditional static build tooling approach.
Now let’s look at how operators benefit from dynamic tooling in CI.
The Dark Ages of Static Build Tooling — Operators’ Viewpoint
Operators (i.e. system administrators) traditionally spend the most effort on managing static build nodes. They are responsible for maintaining a huge list of “blessed” tool versions and making sure these tools are available to developers.
The complexity of such an approach can quickly result in a daily firefight, especially in organizations that use different tools and technologies.
To address this complexity of multiple build tools and versions, operators usually follow one of two methods:
- All build nodes are exactly the same and each one contains ALL build tools needed by every possible project used by developers.
- Different build nodes have different collections of build tools. Nodes are assigned special “labels” that show their capabilities.
Both methods have advantages and disadvantages. If all nodes in the build farm are exactly the same, then a special mechanism needs to be in place for handling multiple versions of the same tool, and each build node can quickly become bloated with tools. On the other hand, this makes life a bit easier for developers, since they can choose any node for their build.
Using different nodes for different tools solves the version clash, as each node can have a different version of the same tool. In this case, however, operators need to keep close track of which tool is installed on which node and make sure to upgrade all the relevant nodes when a new version comes along.
Developers also need to be aware of the latter approach as they must make sure that their build jobs are sent to the correct node. For example, a Python developer needs to specify that a job requires a node with the “python” label, while a JavaScript developer needs a node with the “javascript/npm” label, and so on.
In summary, static build nodes are a huge time sink for operators. There are companies where maintenance of build nodes is literally a full-time job.
The Benefits of Dynamic Docker Tooling for Operators
With dynamic Docker tooling, life becomes much easier for operators.
All nodes are easy to set up and maintain, especially if there is an existing Kubernetes cluster for builds, which is quickly becoming common practice. As mentioned, each build node needs just Docker installed and nothing else. All nodes are also exactly the same (by definition).
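As a sketch, provisioning a brand-new build node boils down to installing Docker itself, for example with Docker's convenience script; the ci-agent service user below is hypothetical:

```sh
# Install Docker via the official convenience script
curl -fsSL https://get.docker.com | sh

# Let the (hypothetical) CI agent's service user talk to the Docker daemon
sudo usermod -aG docker ci-agent
```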
With this straightforward approach, operators…
- maintain a list of approved tools, but do not need to install them beforehand;
- do not care about the exact version of tools used by developers;
- are not responsible anymore for tool upgrades (as developers can do it themselves);
- no longer face the problem of multiple versions of the same tool;
- can work on a homogeneous fleet of build machines; and
- don’t have to manage labels for nodes and keep track of which node has which tool.
Communication with developers is now very easy, as the only thing to discuss is the Docker version of the node.
One other advantage comes from the speed and small footprint of Docker containers. With the traditional static build approach, operators must always keep build nodes ready and available for jobs, even when no developer is actually building anything.
With Docker-based tooling, tools are launched by developers on demand and within seconds. When no builds are running, the nodes can easily be reassigned to another development team that uses completely different technologies.
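And because the tool containers are ephemeral, handing a node over to another team requires at most a single cleanup command:

```sh
# Remove leftover containers, images, networks, and build caches,
# returning the node to its "Docker only" baseline.
docker system prune -af
```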
In summary, Docker-based tooling frees the hands of operators and removes a daily burden from them.
Two Completely Orthogonal Ways to Use Docker
The central point of this article is that using Docker for dynamic build tools is a best practice that you can adopt today without actually using Docker itself for production deployments.
Using Docker as a deployment artifact and using Docker for build tooling are completely independent practices, and you can easily and effectively mix and match them depending on your organization.
Essentially, there are four possible stages of container adoption within a company:
- VM-based tooling with deployment on VMs (the old way).
- VM-based tooling with deployment on containers (most people are familiar with this approach).
- Docker-based tooling with deployment on VMs (a great way to gain benefits from containers).
- Docker-based tooling with deployment on containers (full Docker adoption — the holy grail).
The majority of Docker-related press focuses on Docker-based deployments, instead of Docker-based build tooling, making a lot of organizations oblivious to the benefits of the latter.
It should be clear from the stages above that Docker-based tooling can be adopted on its own (while deployments can still be targeted at VMs or bare metal). A lot of organizations try to jump on the container bandwagon by blindly attempting to use Docker in production deployments, without understanding that this is not the only possible approach.
In fact, Docker-based tooling can bring more benefits to the CI/CD process as it solves a lot of common productivity issues faced by developers, as we have seen in the previous sections.
The ability to create build environments on demand, instead of waiting for lengthy provisioning approvals, addresses one of the most frequent pain points between developers and operations.
At Codefresh, we have implemented this approach for CI/CD pipelines. Each step is its own container. Want to run Node? There is a Docker image for that. Want to run Maven? There is also a Docker image. Want to do a Canary rollout? There’s an image for that as well. Do you need Selenium? Do you need Terraform? Essentially everything that is offered as a Docker image can be used as a build step.
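As a simplified sketch, a pipeline definition along these lines might look like the following; the step names, image tags, and commands are illustrative:

```yaml
# Illustrative pipeline sketch: each step declares the Docker image it runs in,
# and all steps share the same cloned workspace.
version: "1.0"
steps:
  unit_tests:
    title: Run unit tests in a Node.js container
    image: node:20
    commands:
      - npm ci
      - npm test
  package:
    title: Build the artifact in a Maven container
    image: maven:3.9-eclipse-temurin-17
    commands:
      - mvn package
```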
You can still use Codefresh to deploy to traditional targets (i.e. VMs and bare metal machines), but the heart of the build platform is all about leveraging containers and Docker images with tools.
Developers can create pipelines where each build step runs in the context of a Docker image that contains the required tool. Version clashes, tool upgrades and build nodes with different labels are now a thing of the past.
We see dynamic Docker build tooling as a new approach that is changing the lives of both developers and operators and would love to see it gaining further acceptance within companies and organizations.