
We Need to Rethink Risk in Vulnerability Management

If we don’t have these honest conversations around risk today, tomorrow will have us all focused on the wrong thing.
Feb 1st, 2024 6:18am
Image from ESB Professional on Shutterstock.

With the number of vulnerabilities discovered in software growing annually, we need a frank conversation in software circles about what constitutes risk. Decades ago, when the vulnerabilities discovered in a month could be counted on your fingers (and toes, in a bad month), it was easy enough to avoid any and all risk by patching every vulnerability found. Those days are well behind us.

Today, we find more than 1,500 vulnerabilities a month, on average. What worked before simply doesn’t scale to this level, so we need to look at the root of our vulnerability management practices, which ultimately brings us back to risk.

The problem isn’t necessarily more vulnerabilities; in the last 20 years, the sheer volume of software has increased exponentially, with vulnerabilities growing linearly with it.

At the same time, while exploitation rates have risen as well, they have not risen to the same degree. How many vulnerabilities are actively exploited each year? Per the Cybersecurity & Infrastructure Security Agency (CISA), only 4% of all vulnerabilities discovered have been publicly exploited.
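That 4% figure suggests a simple triage rule: prioritize findings that appear on an exploited-in-the-wild list, such as CISA's Known Exploited Vulnerabilities (KEV) catalog, ahead of the bulk of unexploited ones. The sketch below illustrates the idea; the CVE identifiers and the exploited set are made up for illustration, not real data.

```python
# Hypothetical sketch of risk-based triage: split scan findings into
# "urgent" (on a known-exploited list, e.g., CISA's KEV catalog) and
# "deferred" (everything else). All identifiers below are illustrative.

def triage(findings, known_exploited):
    """Partition CVE IDs by whether they are known to be exploited."""
    urgent = [cve for cve in findings if cve in known_exploited]
    deferred = [cve for cve in findings if cve not in known_exploited]
    return urgent, deferred

# Illustrative scan results and a made-up exploited set.
findings = ["CVE-2024-0001", "CVE-2024-0002", "CVE-2024-0003"]
known_exploited = {"CVE-2024-0002"}

urgent, deferred = triage(findings, known_exploited)
print(urgent)    # the one actively exploited finding
print(deferred)  # everything that can wait for a scheduled patch cycle
```

If only a few percent of findings land in the urgent bucket, remediation effort concentrates where a breach is actually likely, which is the risk-based posture the rest of this article argues for.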

Red Hat recently published a five-part blog series discussing this very challenge. On the one hand, we all want to avoid any risk when it comes to software because breaches are expensive to deal with. But, according to Verizon, less than 10% of breaches are due to software exploitation. By focusing exclusively on software issues, we’re making an expensive problem even more expensive by spending money in the wrong place.

Given low exploitation and "due-to-software" breach rates, and a heavy focus on software vulnerabilities instead of the actual sources of breaches, a lot of money is being spent remediating the wrong thing, especially if the end goal is to reduce the probability of a breach. And isn't that the goal?

The goal isn’t to fix all software vulnerabilities just for the sake of fixing them. The goal is to avoid a potentially catastrophic and expensive event that affects your business. That means focusing on the real source of breaches: misconfigurations, spoofing, phishing, compromised passwords, social engineering and the like. Notice a recurring theme here? It’s what we call “the human element.”

This is why it’s so important to take a fresh look at a really old problem. Misunderstanding the end goal of vulnerability management and the costs associated with it means we will continue to invest in an area of diminishing returns while potentially ignoring those areas with a higher return on investment.

This old way of thinking is further cemented by new regulations and legislation introduced globally by governments and regulatory bodies that tell us to “fix everything” and not just the things that actually matter or are risky. We don’t know why, and we can’t quantify how expensive it might be, but an old “best practice” is now a requirement, so money is spent and breaches continue to happen. And those breaches become more widespread and more expensive.

When security becomes compliance, it’s no longer security. Compliance tells us to complete a task because someone, usually a regulator or a government agency, made it a requirement. Security helps us to minimize risk. There is an inherent tension here, and while being compliant won’t necessarily make you more secure, being more secure should be what makes you compliant. Are we heading in the wrong direction?

This is the conversation the industry needs to be having, and the time to have that conversation was yesterday, before the introduction of new requirements that further reinforce outdated practices. If we don’t have these honest conversations around risk today, tomorrow will have us all focused on the wrong thing. This challenge affects us all: software vendors, legislators, customers and end users.

It’s time for traditional beliefs in patch management to be examined and discussed so we can focus on a future that is truly based on risk mitigation beyond just software. We need a balanced approach that focuses on data protection, adopting security principles like “secure by design, secure by default,” automation and better testing, configuration management and monitoring for changes, as well as human education.

For much greater detail about the conversations that need to happen, and the events building up to them, read the series on the Red Hat Blog.
