Unmaintained Dependencies and Other Ways to Measure CI/CD Security
How many reports are needed to answer the important questions about the security of the software supply chain? This week we look at five recent studies, with a focus on CI/CD and open source. As always, the analysis goes beyond the press release-based reporting you may have read elsewhere.
First off, we want to announce the results of a poll The New Stack conducted from April 14 through May 5, 2020. The raw data and tabulated results are in a publicly available workbook. We aren’t trumpeting the results with fancy charts because the sample size was small (79), but the answers add context to our analysis of the other surveys. For example, almost three-quarters (58 of 79) think the percentage of software component dependencies that are out of date (i.e., a newer version has been released) should be a performance metric for DevOps teams. A Synopsys study found that 82% of analyzed codebases have components that have not been updated in the last four years. The industry will need to get more granular before this type of KPI can truly be implemented.
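To make the idea concrete, an out-of-date dependency percentage could be computed with something like the following minimal Python sketch. The package names, version snapshots, and naive version parsing are hypothetical assumptions for illustration; real tooling would need to handle pre-releases, epochs, and non-semantic version schemes.

```python
def parse(version):
    # Naive semantic-version parse ("2.22.0" -> (2, 22, 0)).
    # Real tooling should use a proper version library.
    return tuple(int(part) for part in version.split("."))

def outdated_pct(installed, latest):
    """Percentage of dependencies whose installed version lags
    the latest published release."""
    stale = sum(
        1 for name, ver in installed.items()
        if parse(ver) < parse(latest[name])
    )
    return 100 * stale / len(installed)

# Hypothetical snapshot of a project's dependencies.
installed = {"requests": "2.22.0", "flask": "1.1.2", "jinja2": "2.11.2"}
latest    = {"requests": "2.23.0", "flask": "1.1.2", "jinja2": "2.11.2"}

print(f"{outdated_pct(installed, latest):.0f}% of dependencies are out of date")
```

Even a toy calculation like this surfaces the granularity problem: teams would still need to decide what counts as "out of date" (any newer release? a newer major version? a release with a security fix?) before the metric could drive decisions.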
Although our question dealt with the DevOps role, another survey, this one from GitLab, found that there continue to be problems getting developers to be accountable for finding code vulnerabilities. We’ve previously noted that there are many problems when security is a shared responsibility. Our findings indicate that the types of tools the security team accesses differ substantially depending on which team leads the selection of tooling used in the deployment or release management stage of the CI/CD pipeline.
Snyk’s State of Open Source Security Survey is still in the field and looks at how and when container images are analyzed for security. Based on our findings, we expect security teams to be more interested in scanning container images than other job roles are.
Source: Synopsys’ “2020 Open Source Security and Risk Analysis (OSSRA) Report”.
Readers can find below additional analysis from three surveys as well as interesting questions from two surveys actively being fielded.
- Eighty-two percent of codebases have components that are more than four years out of date, according to due diligence audits of over a thousand commercial applications conducted by the Synopsys Cybersecurity Research Center. Licensing problems exist in almost three-quarters of codebases. The report would be more useful if it provided data about the average number of unmaintained components and quantified how often these issues result in a high-risk vulnerability that needs to be addressed immediately.
- Seventy percent of audited code is open source, a jump from the 2015 report’s 36% figure. Instead of calculating the statistic based on lines of code, Synopsys used the number of bytes (i.e., the storage consumed) of the actual code. Although the report says 99% of codebases have an open source component, it does not describe the percentage of components that are open source. Just like counting the microservices in an application is difficult, so too is creating a measurable definition of a software component.
- Synopsys has software that helps companies identify and remediate these issues. Future reports can be strengthened by providing additional data as well as describing how representative its clients’ audited codebases are.
- Estimate of faster development is widely overstated (Part 1). The survey answers indicate that 60% are deploying code multiple times a day, once a day, or once every few days, but that is not really a jump over last year’s results. The 2019 version of the question found that 43% deployed more than once a day and another 45% deployed between once a day and once a month. GitLab didn’t disclose how many are deploying once a week, so it is almost impossible to make a true apples-to-apples comparison.
- Estimate of faster development is widely overstated (Part 2). It is believable that 83% of the developers surveyed are releasing code faster and more often than last year. A follow-up question asked people to quantify these gains, and 29% couldn’t provide information about release speed. The survey’s answer choices compound the problem of quantifying release speed, as the “slowest” possible increase according to the survey was twice as fast as the year before (35% of the responses). With an additional 29% deploying 10x as fast, that means almost two-thirds said they were deploying at least twice as fast year-over-year. If deployment speed went from two days to one, that may be a credible finding, but why wasn’t it possible for a respondent to say their speed increased anywhere from 1% to 199%?
- Sample size is only one indicator of valid findings. Despite 3,650 survey responses, GitLab’s latest report is limited by a survey that likely under-represents companies with more than 100 employees (50%) and over-represents GitLab users (59% use it as their primary Git tool for work). This resulted in an undercount in the use of Jenkins and GitHub Actions for continuous integration (builds). Also, although a section of the report purports to represent the views of security pros, most of the people answering this question were not primarily responsible for security.
- Narrowly speaking, 38% of DevOps implementations include a CI/CD platform. We have no idea what “implementation” means in the context of this survey, but 82% of survey respondents have had DevOps in place for more than a year. We also don’t know exactly what a CI/CD platform is, although it probably refers to a vendor solution (like GitLab’s) that incorporates many aspects of the software development lifecycle.
- Security teams access security scans before applications are released. Static application security testing (SAST) is the only type of scanning that a majority of respondents’ information security functions access prior to production. Information security teams are more likely to utilize dependency scans at companies where DevOps led the selection of the scan tool.
- Survey fatigue is real. The unique value of our four-question flash poll didn’t stand out among a plethora of surveys delving into this subject.
- How do you vet the open source packages you use?
- How do you ensure the workloads you are running on Kubernetes are configured securely?
- Do you have good quality data from across the toolchain to accurately report on development progress and release timing so all stakeholders are prepared?
Snyk and GitLab are sponsors of The New Stack.
Feature image by Pixabay.