Edge / IoT / Machine Learning

The Challenge of Scaling the Intelligent Edge

17 Aug 2021 5:00am

The IT industry is at an inflection point when it comes to edge computing, according to Andy Nelson. Nelson is a principal architect for cloud and the data center at Insight Enterprises, a B2B and IT solutions and services provider.

Various proof-of-concept (POC) and proof-of-value (POV) projects have proven the ROI and business benefits of intelligent edge deployments that bring artificial intelligence (AI) and machine learning (ML) to edge environments, Nelson told The New Stack.

Insight Enterprises has run multiple POC and POV intelligent edge deployments for organizations that cover as many as 50 sites, he said. Such initiatives have given companies in verticals ranging from retail and healthcare to financial services and manufacturing a clearer idea of the value of the intelligent edge.

The challenge now is figuring out how to scale these deployments to hundreds or thousands of sites so organizations can take full advantage of the business-critical data they are generating at the edge, he said. The technology needed is available now, Nelson said. Tech companies just need to bring the hardware together into a form factor that contains the components needed to run the workloads yet is small enough to be housed in edge locations.

“We’ve had GPUs and edge devices for a while,” Nelson said. “The challenge is, how do I package it so that it doesn’t need as much power and cooling, doesn’t need as much serviceability? The problem with a standard GPU is it needs air cooling, so can I do it with a smaller GPU that’s embedded? Or can I do it with an FPGA [field-programmable gate array]? Intel’s coming out with a bunch of embedded FPGAs. They’ve actually got their own GPU that doesn’t take as much power and cooling. They’re not targeting the gamer community. They’re totally targeting this use case. … The vendors have seen that there is an industry that needs a smaller form factor — tighter, more hardened. The technology is available and the pieces fit together, but it’s not at the right size and footprint so that we can manage this and pay for it at scale.”

Rise of the Edge

The edge essentially has become the third leg on the stool that is the IT world, joining on-premises data centers and the cloud. In a widely and increasingly distributed world, where data and applications are being accessed and created outside of traditional data centers, the edge plays a key role in collecting, processing and analyzing that data closer to the devices that generate it: remote systems, handheld devices and sensors.

Artificial intelligence and machine learning capabilities are being pushed to the edge to accelerate analytics and automation, which means more useful information is derived faster and business-critical decisions based on the information can be made more quickly.

The edge will only grow in importance. There will be 25.44 billion Internet of Things (IoT) devices worldwide by 2030, according to Statista, and all of them will be creating data. Global spending on edge computing will grow an average of 12.5% a year to hit $250.6 billion in 2024, IDC analysts predict.

From SCADA to Edge

To an extent, the idea of the IoT and edge isn’t anything new. Utility companies, the military and other verticals for years had Supervisory Control and Data Acquisition (SCADA)-based systems complete with computers, network and graphical user interfaces that enabled users to control industrial processes remotely. But the technology was proprietary and lacked intelligence, Nelson said.

New edge systems need graphics processing with GPUs or FPGAs, intelligence with AI and ML, and the horsepower to run the workloads in that environment. The software needs to be open, cloud native, containerized and managed by Kubernetes or similar technology. And to scale, it all needs to fit into a form factor about the size of a notebook that can squeeze into a space-constrained area that may be environmentally hostile: very hot or cold, loud, dusty, that sort of thing.

“All of this comes together in this intelligent edge conversation as we need operational excellence like other industries have done, and we need to adapt that to the new-school technology,” Nelson said. “We need to figure out the devices and the ecosystem and how we code and mine the data. Then we also figure out what data. If you remember the old data lake conversation or … Hadoop, Hadoop was going to solve world peace. It’s kind of fallen off. This is a similar conversation.”

Data Is Key

All companies have important data that can be monetized, he said. They can run it through analytics, use heat-mapping technology and leverage the video they’re collecting, all in the hopes of running their businesses more efficiently, creating new solutions and services or understanding their customers better. Retail stores can spot buying trends, convenience stores can figure out the best areas to place products, hospitals can keep track of equipment, and energy companies can use Bluetooth beacons to keep track of what’s happening on distant oil rigs.

But they need the right technology with the right components running the right software at the right scale.

Many of the POC and POV projects Insight has run have leveraged Raspberry Pi systems, essentially entry-level, inexpensive commodity hardware. That won’t cut it at scale. Raspberry Pis aren’t made for massively scaled deployments: there are no GPUs for video processing, little storage, and not enough compute power or bandwidth to run AI and ML workloads, and edge devices also need to be secure. The intelligent edge at scale needs much of that capability available locally.

Making the Hardware Fit

OEMs like Dell Technologies, Lenovo, Hewlett Packard Enterprise and Cisco Systems are working to develop systems that have the necessary horsepower and the small form factor to live at the edge. The hardware also needs to be affordable: a $100 Raspberry Pi is attractive, but a $500 system loaded with the right components will be a better investment.

The hardware needs to be in place for a long time — five to 10 years or so — and the operational technology (OT) infrastructure needs to be able to handle multiple vendors, devices and versions of firmware and to manage them all in a similar fashion.

“There’s a whole bunch of new vendors that have popped up into that space,” Nelson said, adding that both the established and new OT vendors must embrace “that model of, ‘We’re going to write to APIs. REST APIs. We’re going to burn firmware. We’re going to manage the little embedded Linux loads that go on these devices.’ That’s the interesting part for our industry. Nobody’s had this figured out yet. We’re getting the scaling part figured out, so we’re working really hard internally to come up with different architectures and different blends of tools and pieces to put that together.”

Open Is Key to Edge Software

Like the hardware, the software challenge is making what is already available manageable in a smaller environment, he said. A key requirement is that it has to be open. Siemens, for example, has a rich portfolio of hardened device and management software, but it’s all closed and not DevOps friendly.

“What we’re seeing is, how do I take a standard Kubernetes load or a standard container footprint that my DevOps developer buddies are comfortable with and get that into the device?” Nelson said. “That’s got to run on a little bit of Linux generally. How do I pick a slimmed-down Linux embedded — maybe potentially even embedded Linux — but a slimmed-down Linux load? Then how do I do firmware updates and security patches on that? All of that stuff has been around in the data center for years, so it’s not like we’re necessarily saying that Linux and the computers are new. Partitions aren’t new, but how do I do it at about a $500 to a $1,000 price point and maybe $100 license per device or $50 per device?”
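The slimmed-down load Nelson describes can be illustrated with a minimal container image: compile the workload once, then ship only a static binary on a stripped-down base with no shell or package manager. This is a hypothetical sketch; the `edge-agent` binary and the image paths are assumptions, not part of any vendor’s actual stack.

```dockerfile
# Stage 1: build a fully static Go binary so it needs no system libraries.
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o /edge-agent ./cmd/edge-agent

# Stage 2: distroless base image — no shell, no package manager,
# just the workload, keeping the on-device footprint small.
FROM gcr.io/distroless/static
COPY --from=build /edge-agent /edge-agent
ENTRYPOINT ["/edge-agent"]
```

The resulting image is typically a few megabytes plus the binary itself, which matters when devices sit behind constrained links and firmware-style updates have to be pushed to thousands of sites.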

Developers and application owners are becoming the focus of what is happening at the edge, he said. It’s becoming a world of ModOps, a phrase bringing together DevOps and application modernization.

“With DevOps, they just want APIs they can consume, they just want containers they can consume, and that’s why they love the cloud,” Nelson said. “What we’re talking about is … how do I give a developer a cloud-like experience? How do I give them a developer-centric experience everywhere?”

A Focus on Containers and Kubernetes

The goal for Insight is to create a container platform that will have Kubernetes (K3s is a slimmed-down, single-binary version of Kubernetes that could fit this environment) and everything developers need underneath it to write to a Kubernetes API and REST API, such as data protection and replication. They won’t have to worry about the underlying hardware — it will support whatever the developers need — and if the customer decides to switch from one hardware vendor to another, as long as they can load a Kubernetes variant on it and meet the needs of their specific use case, they shouldn’t care what the hardware is.
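Under that model, the developer-facing surface stays plain Kubernetes regardless of the box underneath. A minimal sketch of what a workload on such a K3s-based edge platform might look like (the workload name, image and node label here are hypothetical examples, not Insight’s actual platform):

```yaml
# Hypothetical edge workload; the developer writes standard Kubernetes
# manifests no matter which hardware vendor sits underneath.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: shelf-analytics            # assumed example workload
spec:
  replicas: 1
  selector:
    matchLabels:
      app: shelf-analytics
  template:
    metadata:
      labels:
        app: shelf-analytics
    spec:
      nodeSelector:
        node-role.example.com/edge: "true"   # assumed label for edge nodes
      containers:
      - name: inference
        image: registry.example.com/shelf-analytics:1.0  # hypothetical image
        resources:
          limits:
            memory: "512Mi"        # edge devices are resource-constrained
            cpu: "500m"
```

Because K3s is a certified Kubernetes distribution, the same manifest applies with `kubectl apply -f` whether the cluster runs in a data center or on a $500 edge box.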

“That’s our ModOps strategy in a nutshell,” he said. “We’re just trying to commoditize or mask all of the underlying infrastructure because developers are key in all of the ROI and use case and developing business value and blah, blah, blah.”

How fast all of this will happen is hard to predict, he said. The COVID-19 pandemic illustrated how unpredictable events can upend the IT world. Businesses — particularly ones with razor-thin margins like retail — need an advantage and AI and ML at the edge can deliver that. Some large clients have told Insight that they have already built an IoT strategy and they want Insight to run it for them.

“They’re saying, ‘Here just you do it. You fix what we did. We did the proof of concept. Now you scale it,’” he said. “That’s a litmus test of certain industries and certain forward thinkers have figured this out. But I think a lot of people also understand that it’s hard and not baked yet. … You can probably see in your mind’s eye how [the pieces] should fit. It’s just a matter of getting a very clear picture and actually pulling it all together. We know how to make wedding cakes, but how do I turn that into a cupcake and then a cake pop? It’s kind of the same analogy.”

The New Stack is a wholly owned subsidiary of Insight Partners. TNS owner Insight Partners is an investor in the following companies: Made For, run.ai, MADE, Bit.

Feature image by Yeshi Kangrang on Unsplash.
