
How 5G and Public Clouds Will Shape the Future of Applications

27 May 2020 8:14am, by Spencer Kimball

Cockroach Labs sponsored this post.

Spencer Kimball
Spencer is the co-founder and CEO of Cockroach Labs, where he maintains a delicate balance between his love for programming distributed systems and the excitement of helping the company grow smoothly. He cut his teeth on databases during the dot-com heyday and had a front-row seat at Google for a decade's worth of their evolution.

This past February, before the pandemic was deemed a pandemic and New York City was still humming along as usual, I sat down at Cockroach Labs HQ to talk about the future of data in application development. It's a broad topic and one that's about to undergo a fundamental and irreversible shift. We'll get to why in just a minute.

In the three months since that conversation, a lot has changed in the world. COVID-19 has rendered remote work a necessity. It’s led to unprecedented levels of internet usage. And the lasting changes it will bring to the economy and tech landscape are yet to come. But instead of re-routing or altering the course of the future of data, COVID-19 is merely acting as an accelerant to the transformation that was already underway.

Before the pandemic, two technological tectonic plates were moving inexorably toward one another. The first: the continued improvement and accessibility of public clouds. The second: the impending rollout of 5G. These two capabilities are about to unlock latency levels that were previously only available to Fortune 500 companies. As these latency levels become more prevalent, users will come to expect them. In a pre-pandemic world, these latency levels were a nice-to-have. But in a post-pandemic world, they’ll be a must-have.

Here’s what the transformation will look like, and what application developers — and application users — can anticipate in a 5G world.

Global Applications and the 100 Millisecond Rule

Today, public clouds are used primarily to replicate data across different data centers and availability zones to enhance resilience. But that strategy does not, by itself, solve the problem of latency. It's no secret that latency is starting to matter more and more. You simply can't deliver adequate latency for a user in Australia if their request has to hop all the way to Virginia to talk to the database, then travel all the way back to the user in Australia. In fact, you can't make that round trip in less than several hundred milliseconds. Which brings us to the 100 Millisecond Rule.
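To see why several hundred milliseconds is unavoidable, a back-of-the-envelope calculation helps. The sketch below is my own illustration, not from the article: it assumes light in fiber travels at roughly two-thirds the speed of light in a vacuum (about 200,000 km/s), and it ignores router hops, indirect cable paths, and database processing time, all of which only make things worse.

```python
# Best-case round-trip latency over fiber for a single-region database.
# Assumption: signal speed in fiber is ~200,000 km/s (about 2/3 of c).

FIBER_SPEED_KM_PER_MS = 200  # ~200,000 km/s expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Theoretical best-case round trip, ignoring hops and processing."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# The great-circle distance from Sydney to Northern Virginia is roughly
# 15,700 km -- and real network paths are longer than the great circle.
print(f"{round_trip_ms(15_700):.0f} ms")  # ~157 ms, already past 100 ms
```

Since a single user action often triggers multiple round trips to the database, the real figure lands comfortably in the "several hundred milliseconds" range.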

The first mention of the 100 Millisecond Rule that I encountered came from the Department of Defense. They did a study on command and control systems in which they asked the question: what's the maximum latency from the moment a user makes some sort of action, to when that action has a visible effect? For example, when a user hits a key on a keyboard, when should you guarantee that a letter appears on the screen? The answer, according to the Department of Defense, is 100 milliseconds. When you go beyond 100 milliseconds, you've introduced a delay that's perceptible to a human being. That delay breaks the illusion in an application experience that should feel instantaneous.

Today, companies are trying to serve a global audience with applications deployed primarily in a single availability zone. They send everyone on the planet, regardless of location, to that one place. This is the old model and it’s incapable of satisfying the 100 Millisecond Rule for a global (or even bi-coastal) user base.
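The gap between the old single-region model and a global application can be sketched in a few lines. The figures below are hypothetical round-trip estimates I chose for illustration; the region names and numbers are assumptions, not measurements from the article.

```python
# Sketch: the latency a user sees when everyone is routed to one region
# versus being routed to the nearest of several. All figures hypothetical.

REGION_RTT_MS = {
    # user location -> {region: round-trip milliseconds}
    "sydney":   {"us-east": 200, "ap-southeast": 15,  "eu-west": 280},
    "london":   {"us-east": 80,  "ap-southeast": 260, "eu-west": 10},
    "new_york": {"us-east": 10,  "ap-southeast": 210, "eu-west": 75},
}

def best_region(user: str) -> tuple[str, int]:
    """Route the user to whichever region answers fastest."""
    region, rtt = min(REGION_RTT_MS[user].items(), key=lambda kv: kv[1])
    return region, rtt

for user in REGION_RTT_MS:
    single = REGION_RTT_MS[user]["us-east"]  # old model: everyone -> one region
    region, rtt = best_region(user)          # global app: nearest region
    print(f"{user}: {single} ms -> {rtt} ms via {region}")
```

Only the nearest-region column stays under the 100 Millisecond Rule for every user; the single-region column fails for anyone far from it. The hard part, of course, is keeping the data consistent once it lives in all of those regions.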

A global application is an application that delivers a local experience to a global audience. To understand global applications, you just need to look at the companies that have built them: Google, Facebook, Netflix, Apple, and HBO. But if we were to measure the number of hours engineers have spent building that data architecture, it would add up to centuries or maybe even millennia of engineering hours.

That is what it took to build global applications. And what they achieved in those centuries, if you want to boil it down, is a global customer base that has what feels like a local experience.

Local Experiences Are The Next Big Opportunity

Right now, it's not uncommon to tap something on your mobile device and wait seconds before the thing you expect actually happens. And we're used to that. At the same time, we're ubiquitously addicted to these mobile devices. In Manhattan, everyone is bumping into each other because they're walking around with their heads down, looking at Slack, Instagram, Twitter, YouTube, TikTok, or some other new hotness I'm too old to know about.

The incredible amount of time that an average person now spends in these virtual realities, enduring a latency-laced experience, is an unbelievable opportunity that’s ripe for the next generation of application developers.

People always wonder, "What's going to be the next big platform shift?" "What's the next iPhone?" "Is it gonna be an Apple Watch?" (Nope!) It's the out-of-the-box infrastructure that enables developers to deliver local experiences to global audiences — a platform for the people, with the global capabilities that Netflix, Facebook, Twitter and HBO spent cumulative centuries of engineering hours building.

5G Is a Historic Leap in Network Latency

Picture yourself speaking in front of a crowded room or, more fitting today, on a Zoom call with hundreds of people tuned in. Think about the ways you read the faces of the people listening. Think of all the subtle cues that you use to evaluate the experience. "She yawned, she's bored, I'm terrible." "Oh, he's nodding, I must be killing it — or is he just being nice?" Now imagine building that kind of interpersonal nuance into application interactions. Can you reimagine Facebook or Twitter when the interactions happen in something approaching real time? The next generation of applications will harness the capabilities of public cloud resources and 5G to deliver incredibly low latencies and, ultimately, near real-time experiences for those using them. New applications will utterly alter the user experiences that people spend half their waking hours consuming.

5G may sound like an incremental step forward for cellular networking. But the jump from 4G to 5G is, in my view, a historic leap. It's rare in telecommunication networks for there to be a drop in latency like the one that's expected here. And it will have a tremendous impact on the ways that people re-imagine the multibillion-dollar user platforms of today. The last time we saw such a shift in speed was when we went from dial-up to DSL. For anybody who was around at that time, it was amazing. The applications that came out of that shift were tremendous. We went from text-based games to multi-user interactive applications. And now we're at Fortnite. But there is something beyond Fortnite. And 5G will help us get there.

New Consumption Models Will Empower Developers

What's more, this new generation of applications isn't going to come exclusively from large companies with millions in VC funding. That's because the lessons learned at places like Facebook and Google are quickly being boiled down and packaged into general-purpose systems. It's a formidable challenge to take on this work. But it's happening.

We’ve seen this ourselves at Cockroach Labs. Many of the startups using CockroachDB are run by ex-Google, ex-Uber, and ex-Facebook engineers. These engineers have arrived at their next project with a very different attitude than they had when they joined Google, or Facebook or Uber five or ten years ago. This generation of engineers has been through the paradigm shift. They understand distributed systems. They’re using public clouds, microservices, and modern database technology.

Engineers are not going to start with infrastructure that’s going to have to be re-platformed again and again. They’re going to demand the capabilities they’ve seen at Google, Facebook, and Uber. And companies are rushing to fill that gap by providing managed service software and infrastructure-as-a-service.

The Future of Application Development

Application development was already on the precipice of a massive shift. And then COVID-19 hit. Consumer demand for low-latency technologies, and the transportive experiences they make possible, is higher than ever before. Appetites for Zoom hangouts that feel like real life, for MMORPGs that are truly immersive, and for new use cases we haven't even dreamt of yet, are running high. And the perfect storm of 5G, the improved public cloud, and managed service software is going to make that a reality.

It’s a brave new world on the horizon. One in which previously unthinkable applications will be developed, not by the Facebooks and the Googles of the world, but by individual developers and small teams. The coronavirus pandemic is making our world more distributed than ever before — and the impending changes to application development will make us ready to meet that reality.

Feature image from Pixabay.
