Birth of the Cloud: A Q&A with Vint Cerf and Linode’s Christopher Aker
I have had the opportunity to work with a number of tech pioneers over the course of my career. So when the chance came to interview two who were at the forefront of the internet and the cloud, I jumped at it.
Vint Cerf was there — quite literally — at the start of this thing we call the internet. Cerf, along with Bob Kahn, perfected packet-switching technology, laying the groundwork for TCP/IP and the decentralized internet. (Currently, Cerf is a vice president and chief internet evangelist at Google.) Years after the creation of TCP/IP, Christopher Aker used that new network to build a cloud before public cloud providers were a thing.
As Linode, the company Aker built, turns 18 this year, I asked Cerf and Aker to weigh in on where we’ve been, where we are today, and where we’re going.
You’ve both been in the business of cloud for many years. Looking back to when you first started in this business, how has cloud evolved?
Vint Cerf: I look back at cloud and see its seeds in the 1960s, when people were depicting data centers as giant buildings with some steam coming out of the top. What we have in today’s world — 60 years later — is approximately the same thing. There may not be any steam coming out of the top, but we have facilities for computers that encompass acres. So it’s the reincarnation of the time-shared, industrial-grade system. In that sense, architecturally, cloud isn’t too different. Of course, now we have remote access to large-scale computing through the web and through browsers. And it’s all time-shared and dynamically supplied in parallel, so we’ve simply come full circle.
Christopher Aker: When you look through the lens of the last 17 or 18 years, where we are today in Infrastructure-as-a-Service is more or less exactly where the growth trajectory and evolution of technology predicted we’d be by now. As a service provider, it’s remarkable when you consider that the adoption predictions early on may actually have been a little underhyped.
What obstacles are inhibiting some organizations or individuals from transitioning to the cloud?
VC: I think transitioning to the cloud is appealing to most organizations, but the big obstacle to overcome is risk. Imagine that you are not yet using cloud resources. You’re running on wholly owned facilities. You inevitably discover that maintenance is expensive, not to mention the capital outlay, and you don’t have surge capability. Of course, you would like to have those capabilities available to you, but at the same time, you’re concerned about the safety, security, and privacy of your own customers’ data. I mean, if you’re serving other people, and you use a third-party cloud, the first thing you want to be reassured of is that your customers are not exposed by having operations taking place in the cloud. Everyone is concerned about that. That said, we’ve put a lot of effort into making the cloud secure, and our customers always hold the key to their own data.
CA: Concepts like cloud take time to spread, and it takes time for organizations to build up their bench strength of talent to take advantage of it. The industry has consistently delivered new abstractions to ease this transition, and the most recent iteration of this is serverless. And while from our position as a service provider it doesn’t seem like the majority of enterprises and individual developers are thinking, “Maybe we should go serverless,” I think it’s starting to happen.
How can cloud providers make cloud computing more open and accessible?
VC: This is a common dilemma, especially in the competitive business arena. In almost all proprietary situations, a company is trying to get an edge. It is trying to attract customers by creating a product that nobody else has — one that works better, cheaper, and faster. My impression is that customers provide the countervailing force. Through their demands for choice and flexibility, customers provide the motivation for companies to take a more open approach. So what typically happens, at least in my experience, is that every business looks for an edge, but then the customers demand that the edges be replicated and made available in other choices so that they are no longer locked into that particular thing. If you are paying attention to customer needs, you may very well conclude that you need to go build something that the other person has so that you’re competitive. Using open source solutions is the way businesses can meet customer demand without losing investments in proprietary IP, but then they must innovate to find their differentiation elsewhere.
“I wish that I had done IPv6 first instead of IPv4.” – Vint Cerf
CA: When people think about open source, it’s easy for them to equate it to using open source software as an alternative to too-expensive proprietary software licenses. And that’s certainly part of what open source is all about. But open source also applies to infrastructure and the cloud world. In fact, we built Linode squarely upon the concept of open infrastructure with a goal of making cloud accessible to everyone. Before Linode, your infrastructure options were restricted. You had to buy a whole machine or rent a whole machine, and it was expensive and often went underutilized. But now for a couple of bucks a month, you can have the best of both worlds — share the expensive machine with others and have a low barrier of entry from a financial point of view.
Looking back over the years, are there moments that stand out to you as an inflection point where cloud computing took off?
CA: I think we’d be remiss to not mention the iPhone and mobile as a key inflection point, because that literally put the power of cloud computing into the palm of your hands. And you didn’t have to be a command-line jockey to be able to drive it and consume it. Smartphones and mobile communications have truly transformed our world, and that’s a perfect example of how cloud computing fulfilled its promise.
Are there innovations that weren’t created that you wish had been?
VC: Well, I wish that I had done IPv6 first instead of IPv4. The other thing is the use of crypto. We almost could have injected it into the system. It showed up around 1976 as a concept. I remember thinking, “You know, it would have been nice if we could have had that available.”
CA: Yes, in hindsight, I think we made the address space too small and probably would have done IPv6 or something similar to it from the very beginning.
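The gap Cerf and Aker are describing is easy to quantify: IPv4 uses 32-bit addresses, while IPv6 uses 128-bit addresses. A quick illustrative calculation (not part of the interview) shows the difference in scale:

```python
# IPv4 addresses are 32 bits, IPv6 addresses are 128 bits.
ipv4_addresses = 2 ** 32    # about 4.3 billion
ipv6_addresses = 2 ** 128   # about 3.4 * 10^38

print(f"IPv4: {ipv4_addresses:,} addresses")
print(f"IPv6: {ipv6_addresses:.2e} addresses")
```

IPv4’s roughly 4.3 billion addresses seemed ample in the 1970s but proved far too small for a world of billions of connected devices, which is the hindsight both men are pointing to.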
Where do you see the concept of “multicloud” progressing? Will it help encourage more interactions and/or prevent a rise in monopolies?
VC: I am excited about multicloud for the same reason that I was excited about the Internet. It’s one thing to be able to move from one place to another and then run something. The next possibility is running something in more than one cloud at the same time and allowing interactions to take place. This is the next step in the multicloud environment. Kubernetes helps, but there’s more to it than that. If you’re going to run things concurrently in more than one cloud, you have to be able to understand what’s going on in all of the clouds that you’re operating in. I think commonality is going to be demanded by customers with regard to the management of cloud-based operations.
CA: I think it’s incredibly important that there’s competition. Conversely, I don’t think a vast ecosystem of proprietary and closed options is good for anybody either. Fortunately, what we typically see is that markets consolidate, and we usually end up with a few strong players in the end. In the cloud market specifically, multicloud essentially is the driver for a healthy equilibrium. Because multicloud makes sense to consumers for many reasons, consumer demand for multicloud will be a driving force for moderating the cloud market, preventing both monopolization and fragmentation.
What’s next for the cloud? Where are we going from here?
VC: We’re rapidly approaching the point where many billions of devices are on mobile networks that then connect to the public internet. We’re also seeing the utility in having some computing near that collection of devices that offers local capability without having to go all the way out to the cloud. Part of that is to reduce latency, part of it is to contain where information goes, and part of it is to allow machine learning actions to take place locally, near a sensor system, for example. Clearly, edge computing is emerging as the next milestone in our evolution.
CA: Cloud is probably headed for a split. That is, large hyperscale players will optimize for mass-scale use cases and large enterprises, while alternative cloud service providers like us will focus on creating services and user experiences that are tailored for smaller enterprises and individual developers. It’s hard for a provider tuned to one market to serve the other just as well.
Disclosure: Author Mike Maney runs global corporate communications for Linode.