Cox Edge sponsored this post.
Edge computing is getting more attention of late, because there are advantages to having computing power and data storage near the location where they're needed. As edge computing needs grow, users are likely to take a hard look at whether public cloud giants like AWS, Azure and Google are their best choice, or whether their local ISP is better suited for the job.
ISPs, including cable, DSL and mobile providers, claim several advantages over public cloud providers when delivering SaaS and other services: low-latency, high-bandwidth connections, fewer security vulnerabilities, regional regulatory compliance and greater data sovereignty. While they must also demonstrate that they can deliver services robust enough to meet DevOps needs, ISPs can offer tremendous benefits and fill gaps in current cloud computing offerings.
“A key concern cloud customers have when leveraging their microservices architecture for the applications they offer or rely on is how to achieve and maintain ultra-low latency,” said Ron Lev, executive director, new growth at Cox Communications. “What do they need to get there? You can say they need to turn to the service provider that owns the last mile.”
Cloud providers’ business models typically depend on offering customers economies of scale. Because they emphasize cheaper capacity over performance, they often lack the localized infrastructure required to meet low-latency requirements, which are especially critical for edge workloads, distributed databases and other latency-sensitive applications, Lev said.
A number of technologies exist, of course, to help improve latency and throughput, ranging from DNS-based routing to load balancing. A load balancer, for example, can dynamically steer requests toward the best-performing connection. However, these technologies can fall short for application developers and users, Lev said.
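The dynamic optimization described above can be sketched as a latency-aware load balancer: probe each backend periodically and route requests to whichever responds fastest. A minimal illustration follows; the backend names and the hardcoded probe results are hypothetical stand-ins for real health-check measurements, not anything from a specific product.

```python
def probe_latency(backend):
    """Return the measured response time for a backend, in milliseconds.
    Simulated here; a real probe would issue an HTTP health check or TCP ping."""
    simulated_ms = {
        "us-west.example": 12.0,   # hypothetical nearby node
        "us-east.example": 78.0,   # hypothetical cross-country node
        "eu-west.example": 145.0,  # hypothetical overseas node
    }
    return simulated_ms[backend]

def pick_backend(backends):
    """Route the next request to the backend with the lowest measured latency."""
    return min(backends, key=probe_latency)

backends = ["us-west.example", "us-east.example", "eu-west.example"]
print(pick_backend(backends))  # prints "us-west.example"
```

Real load balancers layer retries, health windows and weighting on top of this idea, but the core decision is the same: measure, then route to the fastest path.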
Let’s walk through an example. Consider a public cloud customer in Seattle running an application whose microservices are spread across the United States. The cloud provider’s centralized database might be located in, say, New Jersey; that distance alone increases latency and slows data throughput, since NoSQL and other commonly used databases were not originally designed for highly distributed applications. And if the end user issues a query that requires database-side computation? Service can slow down even further.
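A back-of-the-envelope calculation shows why distance alone hurts. Light in fiber travels at roughly 200,000 km/s, so a cross-country hop of a few thousand kilometers imposes a latency floor before any processing happens, and a chatty application that makes several database round trips per user request multiplies that floor. The distance and trip count below are illustrative, not measured values.

```python
FIBER_KM_PER_MS = 200.0   # light in fiber: ~200,000 km/s, i.e. 200 km per ms
DISTANCE_KM = 3_900       # rough Seattle-to-New Jersey fiber path (illustrative)

one_way_ms = DISTANCE_KM / FIBER_KM_PER_MS   # ~19.5 ms each way
rtt_ms = 2 * one_way_ms                      # ~39 ms per round trip, best case

DB_ROUND_TRIPS = 5        # a chatty request may hit the database repeatedly
floor_ms = DB_ROUND_TRIPS * rtt_ms           # ~195 ms before any real work

print(f"RTT floor: {rtt_ms:.1f} ms; request floor: {floor_ms:.1f} ms")
```

Moving the data a few hundred kilometers from the user, rather than a few thousand, shrinks this floor by an order of magnitude, which is the core of the last-mile argument.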
Regardless of whether load balancers or other technologies are in place to help lower latency, in the preceding scenario, “you are still taking a hit for data processing for customer-facing edge applications in infrastructure,” Lev said. An ISP’s infrastructure of routers and data centers with direct, local connections largely erases the latency and throughput issues associated with public cloud connections, he said.
The ISP industry’s evolution also means it is “virtualizing the telecom stack,” said Chetan Venkatesh, CEO and president of Macrometa, which offers a serverless platform. “This is creating new opportunities for them to partner with cloud vendors to build out and serve tools like virtual machines and container runtimes that cloud native developers are accustomed to using to build apps and services,” Venkatesh said.
The Security Connection
It is also often beneficial for the SaaS and cloud provider to own and have direct control over the infrastructure in a localized way that only an ISP can provide, said Ron Edgerson, a senior application security consultant for cybersecurity advisory services provider Coalfire.
It is very important that SaaS and cloud providers own and manage their infrastructure so that “they control the data that is either stored, routed, or presented,” Edgerson said. “Doing so allows their customers to efficiently secure their edge infrastructure; not doing so could just be considered negligent,” he said. “No matter what, additional risk will be introduced. But with these providers knowing and controlling their own infrastructure, it gives them the opportunity to minimize the inherent risk for their customers.”
The more APIs and routed and rerouted connections an application has across the internet, the more potential entry points intruders have. For DevOps teams creating and deploying distributed applications, a public cloud network raises serious security concerns, because network attackers are presented with many more potential targets, some experts say. “With a larger attack surface that public cloud service providers have, we’re likely to see an increase of routing and injection-based attacks, because let’s not forget that the server code is stored on the edge device or network,” Edgerson said.
An ISP cloud provider is also well-positioned to meet data sovereignty needs, as compliance requirements can differ according to region and geographic areas. In the United States, for example, state-specific data laws apply to highly regulated industries, such as health care, insurance, and finance, said Maddison Long, vice president of products at CloudOps, a cloud consulting company.
Conversely, with a centralized cloud infrastructure supporting different regions, a hyperscale cloud provider is not necessarily “going to offer the data sovereignty — nor the latency — that an organization may need if it’s in a highly regulated industry,” Long said.
“If organizations run workloads in a hyperscale cloud region, such as in the central U.S., these workloads could transcend multiple states and make compliance a challenge,” Long said. A service provider is in a good position to be able to guarantee that data resides in a specific location, he added. With such an approach, “you’re deploying in a city, such as Las Vegas, San Diego, and Phoenix, and not in a region. In this sense, an edge deployment is a more granular, cloud-like experience. This will help these highly regulated industries maintain their regulatory compliance.”
The data sovereignty an ISP provides can also help to protect networks and cloud services from having compliance issues with the stricter regulations to come. “Data compliance laws are only going to get more complex, and since there’s no overarching national U.S. data law, regulated enterprises are going to need to find a way to meet the evolving compliance requirements across several states,” Long said.
ISP cloud providers are also well positioned to provide edge computing services. Besides the benefits edge computing itself offers, “ISPs can customize how content is stored in order to more efficiently serve content to end-users, whether it’s gaming, web apps, video conferencing, etc.,” Edgerson said.
Beyond the security advantages detailed above, edge computing can also provide efficient redundancy to protect against DDoS attacks, Edgerson said: “We’re talking about service providers being able to continue serving content even if the master server is down — that’s the bottom line.”
Since ISPs are natively geo-distributed and physically closer to customers, “working with ISPs to get applications and application data closer to users puts cloud and SaaS companies at an advantage for maintaining low-latency connections because of the physics of networks and proximity,” Venkatesh said.
Ultimately, the low latency that ISP cloud providers can offer is key: it lets DevOps customers deliver ultra-low-latency edge applications to their own end users, Edgerson said.
“People don’t really have time for time these days, and since applications of all types help to run the world we know, they must be optimized accordingly,” he said. “If the web is late, the world is late.”
Amazon Web Services and Cox Edge are sponsors of The New Stack.
Featured image by Amvia.