Spotify’s Golden Path to Kubernetes Adoption Had Many Twists and Turns
Spotify is well known worldwide for its music service. Less well known is that its path to Kubernetes deployment has been a road with many twists and turns.
It may also surprise many that Spotify is a veteran user of Kubernetes, and that it owes much of its product-delivery capability to its agile DevOps practices. Indeed, Spotify increasingly relies on a container and microservices infrastructure and cloud native deployments, which allow its DevOps teams to continually improve the overall streaming experience for millions of subscribers.
In this edition of The New Stack Analysts podcast, part of The New Stack’s recent coverage of end-user Kubernetes adoption, Jim Haughwout, head of infrastructure and operations at Spotify, shares the company’s cloud native adoption war stories and discusses its past and present Kubernetes challenges. Alex Williams, founder and publisher of The New Stack; Cheryl Hung, vice president of ecosystem at the Cloud Native Computing Foundation (CNCF); and Ken Owens, vice president of cloud native engineering at Mastercard, hosted the podcast.
Spotify has continued to expand its use of Kubernetes month over month since adopting it a few years ago. Even before it began to consider the potential benefits Kubernetes might offer, Spotify had already started shifting its operations to a containerized infrastructure.
“When we originally looked at Kubernetes, we were in an interesting situation, because we already had an in-house orchestration solution we had built, and, anecdotally, launched it as an open source [alternative] the very same week Kubernetes was launched,” said Haughwout. “So we did a lot of work to essentially make the transition to Kubernetes incredibly easy for developers, and to make it so that we could have hundreds of teams work across shared clusters securely and safely together.”
Despite that early start, Spotify only began to shift to Kubernetes “in earnest” about a year and a half ago. Kubernetes has since benefited Spotify’s DevOps in two key ways. The first is how the platform has helped to “reduce toil,” said Haughwout.
“We want to take away the need for engineers, who have to manage infrastructure, to worry about scaling up and down so they can just simply build and deploy features,” he explained.
The second main benefit is how the adoption of a cloud native infrastructure has enabled the music streaming giant to add a number of new tools and platforms to improve its production pipeline and operations. “One of the big reasons that we’ve been working diligently with the Cloud Native Computing Foundation is to make it easy to adopt a lot of infrastructure,” Haughwout said. “Kubernetes has become a kind of lingua franca of cloud native technology that opens the door for us to get into lots of other technologies.”
During the past year, Spotify has been expanding the number of services it runs on Kubernetes while taking advantage of its highly distributed structures. For example, Haughwout described how Spotify moved data pipelines and machine learning to Kubernetes and “relies on it to build ephemeral environments.”
The road to Kubernetes adoption can, of course, be fraught with challenges. In Spotify’s case, Haughwout recounted the challenges of empowering several hundred autonomous engineering teams to “move as quickly as possible.”
“They’re working on building and iterating features and experiments that people use on Spotify every day and we want them to keep working on that,” Haughwout said. “So one of our challenges is how do we migrate them to things like Kubernetes, when you have 299 million monthly users, without interrupting the music stream and without slowing them down.”