How WeatherBug Uses Microservices Without Containers
Since 2015, weather forecasting service WeatherBug has been using microservices to simplify and transform the large, home-built monolithic applications in its IT infrastructure, turning their components into lighter, more manageable systems.
But instead of running its microservices the common way, inside containers, WeatherBug has taken an innovative approach: it runs them directly on Amazon Web Services' Elastic Compute Cloud (Amazon EC2) platform, simplifying deployment even further.
It may have been an unusual road, said Eddie Dingels, senior vice president of engineering for WeatherBug, but it has turned out to be the correct path for the company and for the IT problems he and his staff were trying to correct.
WeatherBug, which has been around since 1993, was built on large, database-driven applications dating back to the company's founding, said Dingels. Many of them were unwieldy and massive, conceived before mobile phones and apps were ever envisioned. Those old monolithic applications were difficult to update or modify quickly, which had become a necessity in a world of mobile devices and the many apps consumers want to run on them.
By 2015, it was clear that changes were needed, so the company began exploring alternatives to its aging infrastructure.
“We were finding that doing a new deployment was hard, so a lot of our rationale came down to increasing the speed and the pace of innovation” with a new architecture, said Dingels. When the team first looked at microservices, it became clear that they could decouple the work of different development teams, so applications and services could be scaled independently and operate more autonomously.
That was the beauty of microservices, he said: breaking apart the monolithic applications meant that development from that moment forward could be done more quickly and efficiently, getting code into production much faster. That is also when the idea of using EC2 arose.
EC2 is an environment that lets companies use microservices without necessarily adding the extra overhead of a container platform such as Docker, said Dingels.
“If you look at AWS, there’s a number of different ways you can run compute,” he said. With EC2, you have a host environment that can run pretty much anything, including microservices. That virtual host acts much like a container, just without the formal container, he added.
For WeatherBug, the microservices help connect the various mobile and desktop apps it offers to new features, configurations, code and changes that happen on the fly within the company’s infrastructure, said Dingels. That includes using microservices to help apps get real-time information on a user’s location so they can get accurate weather details.
“Each of these is an independently running service, an independent piece of code running on top of EC2, and they’re not connected into a single service,” said Dingels. “What’s key here is we’re breaking apart the functions that our services do into the smallest possible elements that we can independently deploy and iterate on.”
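As a concrete illustration of that decomposition, the sketch below shows what one such "smallest possible element" might look like: a single-purpose HTTP service that returns current conditions for a ZIP code, small enough to run on its own EC2 host with no container runtime. The handler, the placeholder data and the `serve` helper are all hypothetical, not WeatherBug's actual code.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder data; a real service would query a weather data store.
CONDITIONS = {"10001": {"temp_f": 68, "sky": "clear"}}

class ConditionsHandler(BaseHTTPRequestHandler):
    """One tiny, single-purpose service: GET /<zip> returns JSON conditions."""

    def do_GET(self):
        zip_code = self.path.strip("/")
        body = json.dumps(CONDITIONS.get(zip_code, {})).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def serve(port=0):
    """Start the service on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ConditionsHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Because the service owns exactly one function, it can be redeployed, scaled or rewritten without touching any sibling service, which is the independence Dingels describes.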
The company could have done the same thing with a container platform but was happy to avoid the extra complexity, he said. On EC2, each microservice lands in its own auto-scaling group and can be managed independently from a scaling perspective.
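An EC2 auto-scaling group grows or shrinks a service's instance count according to a policy, commonly target tracking on a metric such as average CPU. The function below sketches the core arithmetic of that style of policy; the target, bounds and defaults are illustrative assumptions, not WeatherBug's actual configuration.

```python
import math

def desired_capacity(current, avg_cpu, target_cpu=50.0, min_size=2, max_size=20):
    """Target-tracking-style scaling sketch: choose an instance count that
    would move average CPU toward target_cpu, clamped to the group's bounds.
    All parameters here are hypothetical defaults for illustration."""
    if avg_cpu <= 0:
        return min_size
    # If CPU is twice the target, roughly double the fleet, and vice versa.
    desired = math.ceil(current * avg_cpu / target_cpu)
    return max(min_size, min(max_size, desired))
```

Because each microservice has its own group, a spike in one service (say, radar tiles during a storm) scales that fleet alone, leaving every other service's capacity untouched.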
“Now you might say we’re not taking advantage of Kubernetes and Docker containers and the many advantages to using those,” said Dingels. “They are definitely on our roadmap, but I don’t believe you need to do them to use microservices.”
If someone says that containers must be used to run microservices, said Dingels, “I would argue that they’ve limited themselves. The whole idea of microservices is that you don’t limit your technology choices.”
Breaking up applications into smaller components with microservices allows those components to be easily verified and deployed elsewhere with less disruption to other code.
After the first few weeks of the original microservices discussions in early 2015, it became clear to the WeatherBug team that it was the right strategy to use to dramatically overhaul their development processes.
“It almost immediately made sense,” said Dingels. “It wasn’t like there was a large amount of convincing to do. Everybody was feeling the problem, which was really around deployment and regression” and how much time it took doing those things with the monolithic applications from the past. “What we found was that breaking apart the services and deploying them on AWS made perfect sense and that it worked very well without necessarily putting in Docker, which was the only thing available at the time.”
Significant improvements have been seen inside WeatherBug since the move to microservices began, including faster release cadences (often multiple times per week today, compared to months in the past) as well as improved code stability and reduced troubleshooting and QA regression time, according to Dingels. “We did find that regression can be done extremely quickly when you’re dealing with microservices.”
Today, about 99 percent of the old monolithic applications inside the company have been transformed using microservices as part of an ongoing process, and for several years, all new coding has been done using them as well. “There are some remnants of those monoliths that still exist, legacy pieces, but it just hasn’t made sense to go back in and break them apart yet,” said Dingels.
It’s almost impossible to determine how much the switch to microservices has cost WeatherBug since the work has been progressive over time rather than started and ended on specific dates, he said. “We were kind of doing the work as the plane was flying, and we said we want to do microservices.”
There have been some challenges along the way, both expected and unexpected. First, dependencies become harder to map because the code is separated, but that can be resolved with tools such as AWS X-Ray, which automates the process, said Dingels. In addition, the log files that helped find problems in the monoliths are less effective with microservices because the code is segmented across services, but other tools, including the ELK stack, can help resolve those problems.
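Centralized logging pipelines like ELK work best when each service emits structured, machine-parseable log lines rather than free-form text. The sketch below shows one common approach, a JSON log formatter that tags every line with its service name so records from many independently deployed services can be correlated in Elasticsearch; the field names and helper are illustrative assumptions, not WeatherBug's schema.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Format each record as one JSON object per line, so a log shipper in
    an ELK (Elasticsearch/Logstash/Kibana) pipeline can index fields
    directly instead of regex-parsing monolith-style text logs."""

    def __init__(self, service_name):
        super().__init__()
        self.service_name = service_name

    def format(self, record):
        return json.dumps({
            "service": self.service_name,  # lets ELK group lines per service
            "level": record.levelname,
            "message": record.getMessage(),
        })

def make_logger(service_name, stream):
    """Build a logger for one microservice that writes JSON lines to stream."""
    logger = logging.getLogger(service_name)
    logger.setLevel(logging.INFO)
    handler = logging.StreamHandler(stream)
    handler.setFormatter(JsonFormatter(service_name))
    logger.addHandler(handler)
    return logger
```

With every service emitting the same fields, a query like `service:forecast-svc AND level:ERROR` in Kibana replaces grepping one giant monolith log.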
Yet despite its container-free approach so far, Dingels said that WeatherBug could one day also end up deploying Docker or Kubernetes or another container management system because there are some potential benefits to those approaches as well.
For some uses, a container management system could increase the company’s IT capabilities even more, he said, by providing better utilization of the underlying infrastructure. Another benefit is that containers would make it easier to scale quickly on AWS EC2, he said.
“Getting new compute resources takes more time running on EC2 alone than it does using containers,” said Dingels. “Now you can imagine that with weather we see a pretty wide variation in usage, especially depending on where our users are located,” whether in urban areas such as New York City or rural areas such as Montana.
“I would say it’s on the roadmap,” he said of a future container system deployment. “One of the things we have found is that it really comes back to where you’re going to get the most bang for your buck.”
EastBanc Technologies, WeatherBug’s technology partner, helped facilitate this interview. EastBanc Tech and Earth Networks/WeatherBug have partnered for years on many initiatives, from native cloud migration and deployments to architecture, development and third-party exposure of APIs. Along the way, EastBanc Tech introduced WeatherBug to the power of microservices and established innovative cross-cloud deployments that take advantage of the unique strengths of the different cloud providers.