
Arriving at the Edge with OpenStack

May 17th, 2019 6:00am
Feature image via Pixabay.

For years, you have heard the drumbeat of Big Data and the Internet of Things, but now, as more and more devices come online, from smart devices in the home to autonomous vehicles on the road, the flow of data is increasing exponentially, as is the need to process it. With that growth comes the reality that existing infrastructure simply can’t process all of that data in a timely manner unless some of that processing moves to the edge.

Last month at the Open Infrastructure Summit in Denver, Colorado, nearly three dozen sessions addressed the topic of the edge, and a variety of use cases and technical approaches were on display, from industrial IoT and “smart warehouses” to sessions on how telecommunication companies are using the edge to make the low latency and high bandwidth of 5G a reality. There was even discussion of how to use Kata Containers to deploy FaaS on the edge.

The vast array of discussions makes it apparent that “the edge” is no longer a topic on the outer boundaries of what is possible, never mind what is necessary. One panel description succinctly framed the problem: “applications like IoT, AR/VR, and connected cars all require extreme proximity to the end user, making the edge the natural location for these low-latency, high-bandwidth applications.”

Jaromir Coufal, principal product manager with Red Hat, explained in an interview with The New Stack that whether you are talking about the edge in terms of telecommunications or in terms of enterprise applications, a common denominator is often that the hardware behind the compute is, as suggested by the name, in a physically remote location out on the edge of the network. This presents some challenges that are unique to edge computing that Coufal says OpenStack can help to solve.

“When we start talking about deploying at the edge, you can imagine that there are a lot of end-users who are consuming the services, you’re talking about millions or tens of millions of end-users per country, depending on the size of the country. Therefore, you need to have a lot of distributed sites, because you need to have certain coverage in certain areas,” said Coufal. “This drives the need to decrease IT expertise when you’re doing those deployments and operational management of those sites. You can’t have an expert at every single location.”

Coufal went on to explain that, because of these challenges, it is important that edge infrastructure be easy to deploy and maintain, and that it can be managed remotely with as little human intervention as possible. It is equally important that infrastructure deployed at the edge be self-healing and resilient, so that when something goes wrong, a technician does not immediately have to go out into the field, which adds cost and downtime.
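To make that hands-off model concrete, here is a minimal sketch, written with the openstacksdk Python library, of a watchdog that polls instances at a remote edge site and reboots anything that has fallen into an error state. The cloud name “edge-site-1”, the polling interval, and the reboot-on-ERROR policy are illustrative assumptions, not a Red Hat or OpenStack reference design.

    # Illustrative sketch: poll a hypothetical remote edge site ("edge-site-1"
    # in clouds.yaml) and hard-reboot any instance in ERROR, so a technician
    # does not have to travel to the site for routine recovery.
    import time
    import openstack  # pip install openstacksdk

    def heal_edge_site(cloud_name="edge-site-1", interval=300):
        conn = openstack.connect(cloud=cloud_name)  # credentials from clouds.yaml
        while True:
            for server in conn.compute.servers():
                if server.status == "ERROR":
                    # A hard reboot is a blunt but remote-friendly first remediation step.
                    conn.compute.reboot_server(server, "HARD")
            time.sleep(interval)

    if __name__ == "__main__":
        heal_edge_site()

A real deployment would layer alerting and escalation on top of something like this, but the point is that the remediation loop runs centrally, against many distributed sites, with no one on location.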

Julia Kreger, principal software engineer with Red Hat and OpenStack Ironic project team leader, expanded on this idea in terms of telecommunications companies, explaining that the virtualization of network functions also allows for network operators to move the management of these services to the edge without having to physically go there themselves.

“Operators are needing to push their management of their physical, bare metal machines, and those physical data center assets, further out, closer to the cell sites and closer to the end user,” said Kreger. “When we started out with the internet, you had MAE-East and MAE-West, which were two of the original major telco peering points, that helped define the internet. And as time went on, that distributed out across the world, and connections got closer and closer, and now people are shipping cargo containers to sit in parking spots, with fiber connections and power, because they need to get just that much closer to the end user.”
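As an illustration of that kind of remote bare-metal management, the following openstacksdk sketch enrolls a node with OpenStack Ironic from a central control plane and walks it through Ironic’s provision states. The node name, IPMI address, and credentials are placeholders, and this is a simplified sketch of the API rather than an operator’s actual workflow.

    # Hedged sketch: enroll a remote bare metal node with Ironic so it can be
    # managed centrally. All identifiers below are placeholders.
    import openstack

    conn = openstack.connect(cloud="central-undercloud")  # hypothetical cloud name

    node = conn.baremetal.create_node(
        name="edge-cell-site-42",              # hypothetical node name
        driver="ipmi",
        driver_info={
            "ipmi_address": "198.51.100.10",   # example address
            "ipmi_username": "admin",
            "ipmi_password": "secret",
        },
    )

    # Walk the node through Ironic's state machine: enroll -> manageable -> available.
    conn.baremetal.set_node_provision_state(node, "manage")
    conn.baremetal.wait_for_nodes_provision_state([node], "manageable")
    conn.baremetal.set_node_provision_state(node, "provide")
    conn.baremetal.wait_for_nodes_provision_state([node], "available")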

Sandro Mazziota, director of product management with Red Hat, went on to explain that the edge use case for telecommunications companies goes even further than providing a physically remote location for compute and remote management of that service. As telecoms move to expand 3G/4G and deliver 5G networks, they need to do so not simply by adding more and better hardware, but also by providing new services at the edge on top of a virtualized hardware stack. This becomes part of the transition lifecycle.

“People have to augment bandwidth capacity, and now, with hardware virtualization, they have the ability to augment to the edge as well. This didn’t exist when we launched 3G,” said Mazziota. “They are virtualizing network infrastructure. Instead of buying dedicated hardware, now we see the trend of people buying virtual solutions and deploying them to the edge.”
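A minimal sketch of that “virtual solution” pattern, assuming an edge region reachable through the standard OpenStack compute API, might look like the following; the image, flavor, and network names stand in for whatever a VNF vendor would actually ship.

    # Illustrative sketch: boot a network function as a VM at an edge site
    # instead of racking dedicated hardware. Names are placeholders.
    import openstack

    conn = openstack.connect(cloud="edge-site-1")  # hypothetical edge region

    image = conn.image.find_image("vendor-vnf-image")    # placeholder image name
    flavor = conn.compute.find_flavor("vnf.medium")      # placeholder flavor
    network = conn.network.find_network("provider-5g")   # placeholder network

    server = conn.compute.create_server(
        name="packet-gateway-vnf",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    conn.compute.wait_for_server(server)  # block until the VNF instance is ACTIVE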
