Docker Rolls out 3 Tools to Speed and Ease Development
There are good reasons to keep development local — including security, flow and tools. There are also good reasons to develop in the cloud, including on-demand resources and ease of use. Docker’s thinking — and key to its new offering — is to adopt a hybrid approach that allows developers to take advantage of both worlds.
“Given where we sit, Docker Desktop sitting on that local laptop, we see an opportunity to bring the best of local together with the best of cloud so it’s not local or cloud, it’s local and cloud,” Docker CEO Scott Johnston told The New Stack ahead of this week’s DockerCon conference in Los Angeles. “Conceptually, what this looks like is you can take advantage of the on-demand resources in the cloud […] and then on the local side, you can keep your existing workflows — you have infinite choice of local tools.”
Today, Docker introduced three new tools, all based on open source and open standards, that operate in this hybrid space.
1. Docker Scout GA for Insights
A developer might have to open many different browser tabs to access all the tools they use, Johnston said, citing a GitHub survey that found 31% of a development team’s time is spent finding and fixing security vulnerabilities.
Docker Scout GA is not a replacement for all those tools, but an adjunct. It uses APIs to integrate with them and consume their metadata. This lets it layer insights, policy evaluation and contextual remediation on top of Docker’s existing content, build automation and software bill of materials (SBOM) tools.
“What it is is a cloud service and a local service that integrates with all these tools and consumes all the metadata about the events that they’re doing,” Johnston said. “Each time an image goes to CI (continuous integration) or each time an image goes and gets committed into Git, that throws off an event. We collect that event to put together an end-to-end view of what is going on with that image.”
That allows Docker Scout to surface the right context at the right time: what’s inside the image, who touched it last, and what issues it has downstream. It can also offer recommendations on how to address any issues with the app.
“For example, if you have a library that you’re using in your image locally, we’re able to say ‘That library has a CVE (Common Vulnerabilities and Exposures) and that library actually is used in production, and so here we’re going to recommend you upgrade to this next version of that library so that you’re safe,’” he said. “It’s a proactive way of addressing security versus getting some alert 30 minutes later.”
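In practice, that kind of check can be run from the Docker Scout CLI that ships with Docker Desktop. The sketch below uses the documented `docker scout` subcommands; the image name `myorg/myapp:latest` is a hypothetical placeholder, and the commands assume Docker Desktop with the Scout CLI plugin is installed and the image exists locally or in a registry.

```shell
# Quick summary of vulnerabilities in an image (hypothetical image name):
docker scout quickview myorg/myapp:latest

# List the CVEs found in the image's packages:
docker scout cves myorg/myapp:latest

# Ask Scout for base-image upgrade recommendations:
docker scout recommendations myorg/myapp:latest
```

The `recommendations` output is where the "upgrade to this next version so you're safe" guidance Johnston describes shows up, rather than as an alert after the fact.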
One customer who will speak at DockerCon had a team of developers, each with their own toolchain. Using Scout, they were able to extend coverage protection and gather insights across more than 300 repos within one hour, Johnston said.
2. Next Generation Docker Build
Docker Build takes source code and turns it into a container image. Often, this build runs on the local machine, and it eats up a sizable chunk of the developer’s time.
“Today, developers report spending up to an hour a day waiting for the builds to finish; and that could be one big build or it could be a number of small builds, but an hour a day in a 24-hour day, or maybe eight to 10-hour workday — it’s still a lot of time,” Johnston said. “Docker Build today is really constrained by what resources are available on that laptop.”
The next-generation Docker Build leverages the cloud on the backend to speed up builds, and it does so without any change to the tools, workflow or configuration, he explained. It promises to speed builds by as much as 39 times, by automatically taking advantage of on-demand cloud-based servers and team-wide build caching.
“We take advantage of the cloud on the backend to say that looks like a big build, let’s go off and put you on a much bigger server in the cloud with faster CPU, more memory, faster disk IO,” he said. “We’ve seen compression of build times 39 times, where an hour build can be compressed to just over a minute and a half.”
That gives developers an hour of their day back, he added.
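Because the next-generation Build plugs into the existing `docker buildx` workflow, switching to a cloud builder is a matter of registering it and pointing builds at it. The sketch below follows the buildx `cloud` driver syntax; the org and builder names are hypothetical placeholders, and the commands assume a Docker account with the cloud build service enabled.

```shell
# Register a cloud builder (hypothetical org/builder name):
docker buildx create --driver cloud myorg/default

# Run the same build as before, but on the remote builder;
# the Dockerfile, context and flags are unchanged:
docker buildx build --builder cloud-myorg-default --tag myorg/myapp:latest .
```

The point Johnston makes is that only the `--builder` target changes: the Dockerfile, tooling and workflow stay exactly as they were, while the compute, memory, disk I/O and team-wide cache move to the cloud.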
3. Docker Debug
Developers can spend as much as 60% of their time debugging applications. Much of that time is spent finding and configuring tools and setting up environments instead of actually debugging, an ephemeral process that doesn’t preserve state between sessions, Johnston said.
Docker Debug is a language-independent, integrated toolbox for debugging local and remote containerized apps, speeding up the debug process.
“It works on local and remote containers, and it’s got all the tools all in one, so that they can spend time problem-solving and not have to waste time setting up, tearing down, configuring and jumping around,” Johnston said.
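The entry point for this is the `docker debug` command in Docker Desktop. The sketch below assumes Docker Desktop with the Debug feature enabled (a paid-plan feature at launch); the container name is a hypothetical placeholder.

```shell
# Attach a debug shell, preloaded with common tools,
# to a running container (hypothetical name):
docker debug my-running-container

# It also works directly on images, including slim ones
# that ship without a shell of their own:
docker debug nginx:alpine
```

Because the toolbox is injected at attach time rather than baked into the image, slim production images stay slim, and the developer avoids the setup and teardown work Johnston describes.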