Cloud Native / DevOps / Security / Sponsored

Where Is Cloud Native Security Going in the Long Run?

10 Feb 2021 10:55am

First things first. Many cloud native security problems are the same old security problems we’ve always failed to handle properly, as Matthew Chiodi, Palo Alto Networks’ chief security officer for the public cloud, recently pointed out. That said, we must also deal with newer security problems. And as more workloads move to the cloud and adopt cloud native development patterns, a new approach to security will be required over the long term. These long-term concerns include securing application programming interfaces (APIs); navigating new cloud native security metrics and culture; and turning toward machine learning and open source software.

To get a better idea of what’s driving these long-term security trends and how best to adapt security to cloud native architectures, we talked with John Morello, Palo Alto Networks vice president of product. Morello is a prominent expert in cloud native security, a co-author of the U.S. National Institute of Standards and Technology’s (NIST) guide to container security and former chief technology officer of cloud native security startup Twistlock, which was acquired by Palo Alto Networks in 2019. Here’s what he had to say.

API Security Becomes Key

Long before cloud native computing was a thing, APIs were commonly used to connect programs with each other. Today, they’re the most common way we link together microservices and containers. Unsurprisingly, even as we use them to transfer data and connect services, hackers target APIs to suck down data and introduce malware. Indeed, APIs have become a long-term security concern.

“It’s not so much that the APIs themselves are inherently insecure or necessarily the weakest link. But it’s often what attackers target simply because they’ve been designed to be programmatically accessible,” Morello explained. “They also present a more interesting kind of attack surface for somebody to probe to figure out what else it could be made to do, besides what it was intended to do.”

In addition, Morello continued, applications are transitioning from a development pattern where most of the communication between an application’s components was internal and not exposed over the network, to what we have today. Now, he said, “microservices and decomposed applications are where we’re going, and those individual components are being explicitly designed to work together over the network. And, thus, are exposing more of their internal mechanics over the network.”


That openness is very handy for today’s cloud-based world, but, Morello observed, “the attack surface is much larger than it was historically. If you don’t have a good security model, and the right tooling to protect those microservices, you can be much more easily attacked.”

It also doesn’t help, as Morello pointed out, that the skill level required to abuse APIs is much lower. Today, “APIs are typically exposed as just an HTTP interface, often exchanging JSON. In years past, we were dealing with some sort of C++ or C compromise, or a buffer overflow. There, the level of skill required to understand, debug and ultimately exploit that was relatively high. Whereas if you’re looking at a wide set of APIs exposed over microservices and an application, all of those things are very easily discoverable. This isn’t to say that the new pattern is bad, it’s just that there are always trade-offs with everything in security. One of the trade-offs now is that you’ve made it more discoverable.”

Regardless of what kind of API you use, there are several things you can do to keep APIs from being open doors into your software. These include:

  • Practice good API hygiene. APIs should be designed with authentication, access control, encryption and activity monitoring. API keys must be protected and not reused.
  • Use authentication tokens. Back these with trusted identities and use them to control services and resource access.
  • Use standard API secure frameworks. These include the Open Cloud Computing Interface (OCCI) and the Cloud Infrastructure Management Interface (CIMI).
  • Use encryption and signatures. Encrypt your data in transit using TLS. Require signatures to supplement user authentication.
  • Identify and patch software holes. As always, keep on top of your sub-programs and components security patches.
  • Use quota throttling. Place quotas on how often your API can be called and track its use over history.
  • Use API Management to support API security schemes.
  • Use an API gateway. These enable you to authenticate traffic and control and analyze how the APIs are used.
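A couple of the practices above, token-based authentication and quota throttling, can be sketched in a few lines. The snippet below is a minimal illustration using only the Python standard library; the key, limits, and function names are hypothetical, and a production system would use a vetted token format (such as JWT or OAuth 2.0 bearer tokens), a real secret store, and a shared cache for rate limiting rather than in-process state.

```python
import hmac
import hashlib
import time
from collections import defaultdict

SECRET = b"demo-secret"      # hypothetical signing key; load from a secret store in practice
RATE_LIMIT = 5               # max calls per key per window
WINDOW_SECONDS = 60

def sign_token(api_key):
    """Issue an HMAC-SHA256 token for an API key (stand-in for a real token service)."""
    return hmac.new(SECRET, api_key.encode(), hashlib.sha256).hexdigest()

def verify_token(api_key, token):
    """Constant-time comparison, so verification doesn't leak the token via timing."""
    return hmac.compare_digest(sign_token(api_key), token)

_calls = defaultdict(list)   # api_key -> timestamps of recent calls

def check_quota(api_key, now=None):
    """Sliding-window quota: allow at most RATE_LIMIT calls per WINDOW_SECONDS."""
    now = time.time() if now is None else now
    recent = [t for t in _calls[api_key] if now - t < WINDOW_SECONDS]
    if len(recent) >= RATE_LIMIT:
        _calls[api_key] = recent
        return False
    recent.append(now)
    _calls[api_key] = recent
    return True
```

In a real deployment these checks would typically live in the API gateway, so every service behind it gets authentication and throttling without re-implementing them.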

Security Metrics for a Cloud Native Future

Put it all together, and as Morello wisely observed, “You can’t have security as something that comes as an afterthought. The biggest lesson for organizations to take away from this is the importance of embedding security along the lifecycle of the application. And we really have a unique opportunity with the way that cloud native applications are built and deployed, to do that in a much more comprehensive manner than you’ve ever been able to practically do before.”

To make sure all this works out as well and securely as we’d like over the long term, Morello strongly suggested we make use of DevOps metrics. But, you can’t let yourself “get overwhelmed with the sheer amount of tools and terminology and capabilities out there. What I always tell people, at least from a security standpoint, is that probably the most important thing to measure is not the number of vulnerabilities within the environment, because you’re always gonna have vulnerabilities. Even if you’re doing your job perfectly, there’s always going to be vulnerabilities… The key metric is, ‘how long do those vulnerabilities live within your application? How quickly are you remediating them?'”

Using that data, you should look for “the mean time to patch or to remediate vulnerabilities after they’ve been discovered. Ideally, you want to see that trend go down over time,” he said. “I often tell people the ideal situation would be whatever your sprint cycle is, you know, 150% of that is a pretty good goal to have in terms of the mean time to fix a vulnerability. That way, within this sprint or the very next one, you’re going to correct vulnerabilities as they’re discovered within your application. If you’re doing that well, and you’re doing that early in the development process, and you’re redeploying your application with those fixes in it, you’re going to really stay ahead of the curve of vulnerabilities as they’re being discovered.”
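Morello’s metric is straightforward to compute from vulnerability records. The sketch below assumes hypothetical (discovered, fixed) dates and a two-week sprint; the record format is illustrative, not taken from any particular scanner.

```python
from datetime import date

# Hypothetical vulnerability records: (discovered, fixed). The dates are illustrative.
vulns = [
    (date(2021, 1, 4), date(2021, 1, 15)),
    (date(2021, 1, 11), date(2021, 1, 20)),
    (date(2021, 1, 18), date(2021, 2, 5)),
]

SPRINT_DAYS = 14
TARGET_DAYS = SPRINT_DAYS * 1.5   # Morello's "150% of your sprint cycle" goal

def mean_time_to_remediate(records):
    """Average number of days between discovery and fix."""
    days = [(fixed - found).days for found, fixed in records]
    return sum(days) / len(days)

mttr = mean_time_to_remediate(vulns)
print(f"MTTR: {mttr:.1f} days (target: <= {TARGET_DAYS:.0f} days)")
```

The point is the trend, not the snapshot: computing this per sprint and watching whether it falls below roughly 1.5× your sprint length tells you if remediation is keeping pace with discovery.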

Think about it, Morello continued. “If you look at the big exploits in the news, it’s rare that those exploits are because of some amazing skill set. For the great majority of those cases, people are compromised for two reasons. They have insecure configurations that were known to be poor practices, yet nonetheless persist in the environment. Or they had known vulnerabilities, in many cases known about for years, that again remain persistent in the environment and were not remediated.”

Morello went on, “If you’re not addressing those two really core aspects of security, stopping vulnerabilities in software and stopping insecure configurations in your infrastructure, you’re always playing from behind. Customers want to talk about the latest esoteric memory injection attack or some CPU flaw. But, in most cases, they’ve got dozens of machines with years-old critical vulnerabilities. Take care of the things that are within your grasp to fix because the attackers are going to go for those easy holes first.”

More Secure with Open Source and Machine Learning

Moving on, Morello also believes “developer-driven security goes beyond just checking during the build process and finding vulnerabilities and config problems, it goes to actually enabling your developers to effectively program the security tool, utilizing the same artifacts that they’re using to build their application natively, without them having to do extra work.”

That means realizing that when “you look at the modern application stack today, you can really say open source has pretty much won that whole stack.” So, naturally, Palo Alto Networks uses open source as the foundation for its own programs using technologies such as “Red Hat’s Universal Base Images (UBI) for our base layer.” Palo Alto Networks also contributed to Docker and Red Hat OpenShift. “We’ve done a lot of work in the open source community overall.”

Morello confessed he worked at Microsoft from 2000 to 2014, when there was a lot of debate over open source. But today, among customers, there’s no debate, he said. “They’re using the right things to accelerate their business as much as possible, and that’s open source.”

Specifically, Morello spoke about CRI-O, the lightweight container runtime for Kubernetes. With the most recent release of Prisma Cloud, CRI-O is protected just as much as Docker. Morello said, “just like the runtime support — blocking based on vulnerability and compliance policy — we’ve added the ability to do custom compliance checks for CRI-O, to have CRI-O specific compliance checks. This actually assesses the secure configuration of the CRI-O environment that underpins your Kubernetes clusters.”

In short, if you like the Prisma Cloud security you’ve gotten while running Docker, you can now get the same protection in a completely Docker-less environment.

Looking ahead, Morello said that with open source tools, “we’ll be able to have a much more automatic approach to runtime defense. Security is always about defense-in-depth and having multiple layers. That means you need runtime defense, good vulnerability management, and hygiene for the build process.”

Some of this will come from “being able to do unsupervised machine learning (ML) of your application. So, as you deploy an application into an environment that Prisma Cloud protects, it automatically learns things such as ‘what are the normal behavioral patterns for that given image? What are the processes that it runs? What’s the process tree look like?'”

Using this data and open source ML tools, Prisma Cloud automatically creates a least-privilege model of what it allows the application to do at runtime. So even if there is some exploit in the future, as long as that model is defined, and we know what it says is normal and allowed, anything attackers try to run outside it is going to be automatically prevented without having to create additional manual rules, Morello said.
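Conceptually, the learn-then-enforce loop Morello describes reduces to building a behavioral baseline and treating anything outside it as a violation. The class below is a deliberately simplified illustration of that idea; Prisma Cloud’s actual models (process trees, network flows, file access patterns) are far richer and not public, and the names here are invented for the sketch.

```python
class ProcessModel:
    """Toy baseline model: learn which processes an image normally runs,
    then enforce least privilege by blocking anything outside that set."""

    def __init__(self):
        self.learning = True
        self.allowed = set()

    def observe(self, process_name):
        # During the learning window, every observed process is assumed normal.
        if self.learning:
            self.allowed.add(process_name)

    def finalize(self):
        # Freeze the baseline; from here on the model enforces, not learns.
        self.learning = False

    def permit(self, process_name):
        # After learning, only baseline processes are allowed to run.
        return self.learning or process_name in self.allowed

model = ProcessModel()
for proc in ["nginx", "nginx", "logrotate"]:   # observed normal startup behavior
    model.observe(proc)
model.finalize()

print(model.permit("nginx"))   # baseline process: allowed
print(model.permit("xmrig"))   # never observed: blocked, no manual rule needed
```

The design choice worth noting is the explicit cutover from learning to enforcement: a model that keeps learning in production would happily learn the attacker’s behavior as “normal.”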

“The real unmatched challenge right now,” Morello concluded, “is how do you create a level of learning that’s in the software that you use to protect those environments, so that you’re bothering the human operators as little as possible and only for things that really require that human intervention.”

Does that sound like science fiction? It’s not. It’s Palo Alto Networks using open source and ML to protect our ever more complex cloud native computing programs. We’re not there yet, but we’re on our way.

Feature image by Antonio López from Pixabay.
