
Docker CTO Explains How Docker Can Support AI Efforts

In this episode of The New Stack Makers, Docker CTO Justin Cormack explained how Docker is making AI models easier to deploy locally.
Nov 28th, 2023 10:09am

Docker isn’t new to artificial intelligence: Data scientists have been using Docker for years, back when AI and machine learning were still niche endeavors, according to Docker CTO Justin Cormack.

“People have been using Docker for AI/ML over the years, but it’s been a very specialized niche area for a long time. It’s been data scientists building models, often for image processing or prediction… all those kinds of internal applications,” Cormack told The New Stack during this episode of The Makers podcast. “Many of them use Docker for this, it’s very common to deploy using Docker.”

Then OpenAI released ChatGPT last year, creating a massive explosion of interest in using Docker for AI, Cormack said. Organizations and developers started to see how large language models (LLMs) could help them be more productive.

“There’s been a massive explosion of interest and people were asking us, ‘How do I get started with this stuff? Can you make this experience?’” he said.

Now, organizations are particularly interested in the retrieval-augmented generation (RAG) stack, which allows companies to customize existing LLMs with their own data so users can query organization-specific knowledge, he explained.
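To make the idea concrete, here is a minimal, self-contained sketch of the RAG pattern: retrieve the most relevant organization-specific documents for a question, then stuff them into the prompt sent to the model. The corpus, the keyword-overlap scoring, and the prompt wording are illustrative stand-ins for the embedding search and LLM call a real stack would use.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# Everything here is a stand-in: a real stack would embed the query,
# search a vector database, and send the prompt to an LLM.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query
    (a stand-in for embedding similarity search against a vector store)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Stuff the retrieved, organization-specific documents into the prompt."""
    joined = "\n".join(f"- {doc}" for doc in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

corpus = [
    "Our internal build tool is invoked with `acme build --target prod`.",  # hypothetical docs
    "Deployments to staging require approval from the release channel.",
]

question = "How do I build for prod?"
prompt = build_prompt(question, retrieve(question, corpus))
print(prompt)  # this prompt would then be sent to the LLM
```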

To do that, Docker teamed up with Ollama, which allows developers to run Llama 2, Code Llama, and other models locally.
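As an illustration of what running a model locally looks like, the sketch below sends a prompt to Ollama on its default local port. It assumes Ollama is already running and has pulled the llama2 model; the /api/generate endpoint and payload follow Ollama's documented REST API at the time of writing, so verify against current Ollama docs.

```python
# Query a locally served Ollama model (assumes `ollama` is running on
# localhost:11434 and the llama2 model has been pulled).
import json
import urllib.request

payload = json.dumps({
    "model": "llama2",
    "prompt": "Explain what a Dockerfile is in one sentence.",
    "stream": False,   # request a single JSON response instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```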

“It’s much simpler to get started locally and a lot of people are really excited about what you can deliver locally as well because there’s a big GPU shortage,” Cormack said. “It’s hard to build applications sometimes that are not in the cloud with GPUs.”

Docker also teamed up with Neo4j, a graph database company. Neo4j has added extensions for vector database support, which is how developers store the extra data that the LLMs retrieve, Cormack explained.
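A hedged sketch of that pattern is shown below, using LangChain's Neo4jVector integration with embeddings from the local Ollama model. The import paths, connection details, and sample texts are assumptions and vary across LangChain and Neo4j versions; treat it as illustrative rather than canonical.

```python
# Store company-specific text in Neo4j as vectors, then retrieve the most
# similar documents for a user question (assumed LangChain integrations;
# import paths differ between LangChain versions).
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Neo4jVector

embeddings = OllamaEmbeddings(model="llama2")  # embed text with the local model

store = Neo4jVector.from_texts(
    texts=[
        "Support tickets are triaged in the #oncall channel.",  # hypothetical company data
        "The payments service exposes a /healthz endpoint.",
    ],
    embedding=embeddings,
    url="bolt://localhost:7687",   # Neo4j connection details are assumptions
    username="neo4j",
    password="password",
)

# Retrieve the documents most similar to a question, ready for prompt stuffing.
for doc in store.similarity_search("Where are support tickets triaged?", k=1):
    print(doc.page_content)
```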

Finally, Docker worked with LangChain, the most popular framework for working with LLMs, to bring it together with Docker Compose. Users can access Docker Compose in the Learning Center on the latest version of Docker Desktop.

“You can just one click and bring up a stack and then start iterating, experimenting, change the code, build your own code for your own data source,” he said.
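Putting the pieces together, a sketch like the following shows how a LangChain retrieval chain could combine the local Ollama model with the Neo4j vector store from the earlier sketch once the Compose stack is up. Class names and import paths are assumptions that depend on the LangChain version in use.

```python
# Wire the local LLM and the Neo4j-backed retriever into a question-answering
# chain (assumed LangChain classes; verify against your installed version).
from langchain.chains import RetrievalQA
from langchain_community.llms import Ollama

llm = Ollama(model="llama2")   # served by the local Ollama container

# `store` is the Neo4jVector instance from the previous sketch.
qa = RetrievalQA.from_chain_type(llm=llm, retriever=store.as_retriever())

print(qa.invoke({"query": "Where are support tickets triaged?"}))
```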

To test it, Docker built a simple app that pulls in questions from Stack Overflow and lets users choose which questions to ingest. Users can then ask about a specific Stack Overflow domain the model might not otherwise know about, such as new technologies or a niche area, he said.
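The sketch below illustrates the kind of ingestion such an app could perform: fetch questions for a chosen tag from the public Stack Exchange API and hand them to the vector store. The tag, page size, and the downstream add_texts call are illustrative assumptions, not details from the demo itself.

```python
# Fetch recent Stack Overflow questions for a tag via the Stack Exchange API v2.3
# and print them; in a RAG app these would be embedded and stored for retrieval.
import requests

resp = requests.get(
    "https://api.stackexchange.com/2.3/questions",
    params={
        "site": "stackoverflow",
        "tagged": "docker",     # hypothetical niche domain to ingest
        "pagesize": 5,
        "order": "desc",
        "sort": "activity",
        "filter": "withbody",   # include the question body, not just metadata
    },
    timeout=30,
)
resp.raise_for_status()

for q in resp.json()["items"]:
    print(q["title"], "->", q["link"])
    # In an app like the one described above, the title and body would be
    # embedded and stored, e.g. store.add_texts([q["title"] + "\n" + q["body"]]).
```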

“You feed in all these questions, and it will use those to actually answer in a lot of detail and give you citations and references to that,” Cormack said.

Docker is also using its own data to build an AI solution. One of the challenges Cormack hopes to solve is helping new users get started with Docker and Dockerize their applications. The company also hopes to help existing developers overcome the challenges they encounter when using Docker.

“We decided that we take this experience we have and build a tool using LLMs, and other pieces of tech to really help you,” he said. “It’s actually an interactive notebook in Visual Studio Code, with other editors in the future, but … it can give you suggestions about what you might want to do, it can look at your application, and it can see what kind of application it is, what language you’re using, and help you write a Dockerfile for that.”

More Episodes from DockerCon 2023

Debugging Containers in Kubernetes — It’s Complicated

Will GenAI Take Jobs? No, Says Docker CEO

TNS owner Insight Partners is an investor in: Docker.