STATE OF SOFTWARE SECURITY VOLUME 10
Welcome Letter
Executive Summary
Overall State of Software Security
How prevalent are application flaws? What proportion of flaws are fixed? How quickly are flaws fixed? Does DevSecOps drive faster fixing? Is security debt rising or falling?
A Look at Application Security Testing
How often are applications tested? How regularly are applications tested?
Not All Flaws Are Created Equal
What types of flaws are most common? How common are severe and exploitable findings? Does flaw prevalence differ by language?
Not All Flaws Are Remediated Equally
Which flaws are fixed most often? Is remediation out of focus? How fast are flaws fixed? The Elusive “Average” Which flaws are fixed the fastest?
Breaking Down Security Debt
Do priorities contribute to security debt? Is there a security debt-to-income ratio? Is fix capacity a constant? What is security debt comprised of?
Regional Breakouts
Does software security change by region?
Key Takeaways
Appendix: Methodology
LETTER FROM TIM JARRETT, CHRIS WYSOPAL, AND CHRIS ENG
Tim Jarrett, Senior Director, Product Management
Chris Wysopal, Founder and Chief Technology Officer
Chris Eng, Chief Research Officer
Welcome to the 10th volume of Veracode’s flagship report, the State of Software Security (SOSS). This is a big milestone for the application security industry, and for us — a decade of SOSS!
As we reflect back over the past 10 volumes, we’re struck by both the enormous change and growth in our industry (and in our own company), and also what has remained the same. We’ve seen AppSec awareness grow in leaps and bounds since we started down this SOSS path a decade ago. When we were working on SOSS Volume 1, we spent most of our time trying to explain and advocate for application security. Today, we spend far less time talking about what AppSec is, and more time talking about how to build an effective, mature application security program.
At the same time, the core problem we are trying to solve today is not that far removed from the problem we were trying to solve 10 years ago. In State of Software Security v1, we concluded that “Most software is indeed very insecure.” We could use that same statement in Volume 10. However, we are seeing some positive AppSec signs in 2019. Organizations are increasingly focused on not just finding security vulnerabilities, but fixing them, and prioritizing the flaws that put them most at risk. Though vulnerabilities are introduced as part of the development process, the data suggests that finding and fixing vulnerabilities is becoming just as much a part of
the process as improving functionality.
Even with the strides the industry has made over the past 10 years, there’s plenty
of room for improvement — especially regarding the time it takes to make those fixes.
In talking with our customers, and examining the data we used for this year’s report,
the notion of security debt has emerged as a significant pain point. Just as with
credit card debt, if you start out with a big balance and only pay for each month’s new spending, you’ll never eliminate the balance. In AppSec, you have to address the new security findings while chipping away at the old. Easier said than done, but we unearthed some data points for this year’s report that shed light on a path forward, and highlight some of the practices that help our customers tackle their security debt. This year’s analysis highlights compelling evidence that a steady, regular scanning cadence not
only improves fix rates, but also lightens the security debt load.
Thanks for being part of this big milestone on our AppSec journey. We started Veracode with a mission to secure the world’s software. Today, that mission remains, with the added focus of enabling you to create, innovate, and “change the world” with software, without being held back by security concerns. We hope the best practices outlined in this report play a role in that goal.
Here’s to the next 10!
SINCERELY,
Executive Summary
In 2011, Marc Andreessen wrote an article in the Wall Street Journal that included the now-famous phrase “software is eating the world.” Eight years on, that statement rings truer than ever. It’s not a stretch to say that software is eating the cybersecurity world as well. The fallout from not integrating security early in the development lifecycle has never been more apparent. And our annual report on the
State of Software Security (SOSS) has never been more important.
This year also marks an important milestone for the SOSS itself. It’s our 10th edition! We’ve observed and learned a lot over the last decade producing this report, and we’re especially excited to share some “Then vs. Now” comparisons.
Number of applications tested:
VOL. 1: 1,591 → VOL. 10: 85,000
That’s a growth of 50x!

Applications with at least one flaw:
VOL. 1: 72% → VOL. 10: 83%
That’s an increase of 11%!

Applications with high-severity flaws:
VOL. 10: 20%
That’s a decrease of 14%!

Average number of days to fix flaws:
VOL. 1: 59 → VOL. 10: 171
But the median remained 59 days. This indicates most fixes happen quickly, but there’s a long and growing tail of unresolved findings.

Pass rate for OWASP Top 10 policy scans:
IMPROVED BY ALMOST 10%
Beyond those 10-year views, we learned
more about the state of software security in 2019.
POLICY COMPLIANCE
Most applications fail to pass initial tests based on the OWASP Top 10 and SANS 25 industry standards.

FLAW BUSTING
56% of software flaws eventually get fixed, and developers address high-severity flaws at above-average rates.
Half of applications showed a net reduction in flaws over the sample time frame. Another 20% either had no flaws or showed no change. This means 70% of development teams are keeping pace or pulling ahead in the flaw busting race!
Those who read last year’s SOSS may remember a heavy emphasis on flaw persistence timeframes and what contributes to making them longer or shorter. We return to that topic this year, but focus on the accumulating security debt in applications caused by those persistent flaws and long fix timeframes.
Here are some key findings we’ll expound on in this report:
SECURITY DEBT
The chance that flaws will ever be dealt with diminishes the longer they stick around, resulting in accumulating “security debt” in many applications.

5x less security debt in organizations that scan their code more than 300 times per year. A more regular testing cadence also corresponds to driving down security debt, and frequent scanners tripled fix rates over teams that scan infrequently.

MEDIAN FIX TIME OF SCANNED FLAWS
68 days for applications scanned 12 or fewer times per year
19 days for applications scanned 260+ times per year
That’s a 72% reduction!

Certain languages appear more prone to the buildup of security debt than others; some carry 3x to 5x more unresolved flaws than .NET over a sample period.
CORE LESSON
It’s a near certainty that your applications have security flaws of various types. The likelihood of remediating those flaws in a comprehensive and timely manner is not nearly as certain. The ability to do this consistently — and thereby drive down security debt rather than rack it up — is what separates leading and lagging SDLC programs.
Overall State of Software Security
As stated previously, this is the 10th edition of the SOSS.
As we review what the data tells us about important trends over the last year, it makes sense to reflect back on what we’ve seen during the last decade as well. Let’s do that now, in fact, starting with a data point that shows just how much the SOSS has grown over the years.
Way back in Volume 1, we studied scan results from 1,591 applications. In Volume 10, we have the privilege of testing over 85,000 applications. That’s over a 50-fold increase in sample size!
That’s pretty impressive, but perhaps even more so is the level of depth we’re now able to achieve in that analysis. We’ve teamed up once again
with the data scientists and storytellers at the Cyentia Institute to level up that analytical prowess to maximize value to our readers. And with a massive dataset spanning 85,000 applications, 1.4 million scans, and nearly 10 million security findings at our disposal, you’re in for an analytical treat in the pages that follow!
Number of Apps Tested in SOSS Volume 1 vs. Volume 10

FIGURE 1 Comparison of the number of apps tested in SOSS Vol. 1 (1,591 applications) vs. Vol. 10 (85,000 applications), over a 50-fold increase in sample size. Each symbol represents 500 applications. Source: Veracode SOSS Vol. 10
JARGON WATCH
Flaw Prevalence
The proportion of applications that have a (type of) flaw.
How prevalent are application flaws?
This first question seems simple on the surface but gets deep
pretty quick. We’ll dip a toe into those waters now, and wade in progressively deeper through the report. We’ve already mentioned that we discovered about 10 million flaws across 85,000 applications. Beyond that, 83% of those applications had at least one flaw in the initial scan run by customers. That’s squarely within the range of
our most recent volumes, but somewhat higher than the inaugural prevalence of 72% recorded way back in Volume 1. We attribute that upward shift to the broader set of applications tested and expanded scanning capabilities developed over that timeframe.
FIGURE 2 Proportion of applications with at least one flaw in the initial scan (apps with no flaws vs. apps with at least one flaw). Source: Veracode SOSS Vol. 10

FIGURE 3 Pass rates for OWASP Top 10 and SANS 25 compliance testing (apps that pass vs. apps that don’t pass each standard). Source: Veracode SOSS Vol. 10
Beyond overall prevalence, we closely track the OWASP Top 10 vulnerabilities and SANS 25 software errors because of their status as consensus listings of the critical flaws across the industry. The pass rate for OWASP Top 10 compliance on the initial scan reversed a three-year decline by rising to 32%. That’s not the highest ever recorded — that peak happened in 2016 — but the 10-year trend in Figure 3 shows things are moving in the right direction. The pass rate on tests based around the SANS 25, surprisingly, matches exactly what we tested in Volume 1.
We now know that most applications are flawed, but how
serious are those findings? Overall, we discovered high-severity (level 4 or 5) vulnerabilities in 20% of applications, a 14% improvement over the equivalent statistic measured 10 years ago. Thus, the overall prevalence of findings rose over the last decade but fewer of them constitute a serious risk to applications. If you want more information on the types of flaws discovered and which ones are considered more severe, sit tight. Many pages lie ahead.
What proportion of flaws are fixed?
Prevalence conveys a key aspect of the state of application security, but more important still is whether these issues are
dealt with in an effective and timely manner. Fix rate offers one way of looking at that and measures the proportion of discovered flaws that are closed or remediated. The overall fix rate across all flaws is 56%, which lands right in the neighborhood of recent years (52% in 2018; 58% in 2017).
Logic holds that not all flaws are fixed with equal urgency,
and the evidence presented in Figure 5 backs that conclusion. Findings in the OWASP and SANS lists, for instance, receive slightly preferential treatment over general flaws. High-severity flaws are roughly 15% to 20% more likely to be remediated than those of lower severity. Again, none of this is terribly surprising. The main takeaway is that application teams achieve better-than-average fix rates for the flaws they prioritize. We’ll talk more about what gets prioritized and why later.
FIGURE 5 Fix rate across all flaws and for various categories of flaws (All, OWASP 10, SANS 25, Severity 4, Severity 5). Source: Veracode SOSS Vol. 10

FIGURE 4 Proportion of applications with higher-severity flaws in the initial scan (apps with no high-severity flaws vs. apps with at least one). Source: Veracode SOSS Vol. 10
JARGON WATCH
Fix Rate
The proportion of discovered flaws that are successfully closed or remediated.
FIGURE 6 Distribution of flaw fix rates across applications with at least one flaw (top: full distribution; bottom: fix rates excluding 0% and 100%). 29% of apps fix all flaws; 16% of apps fix no flaws. Source: Veracode SOSS Vol. 10
We’re glad to see a slight skewing toward the upper end of that distribution and hope this report will motivate even more to cross over to the right side of the fix rate chasm.
Another interesting angle is how fix rate applies to individual applications. Figure 6 grants us that perspective. On the top, we
see that development teams fix nothing for 16% of applications and successfully close all flaws in 29% of apps. Upon further investigation, we noted that many applications at these opposing ends of the spectrum had very few flaws.
Because these opposing extremes dominate the scale in Figure 6, we removed them from the bottom chart to focus more closely
on the majority of applications that fall in the middle. Other than the spikes at 33%, 50%, and 66% that show the influence of small denominators (i.e., 1 of 3 fixed, 1 of 2, 2 of 3), there’s a nice concave shape of the “bridge” connecting the two extremes.
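The small-denominator effect is easy to see with a quick sketch: an application with only a handful of flaws can land on only a few possible fix-rate values, so the histogram naturally spikes at fractions like 33%, 50%, and 66%. A minimal illustration in Python (the flaw counts are made up for demonstration, not Veracode data):

```python
from collections import Counter
from fractions import Fraction

# With few flaws, only a few fix rates are possible: an app with 3 flaws
# can only land on 0%, 33%, 67%, or 100%. Tally every reachable rate for
# hypothetical apps carrying 1-4 flaws each.
possible_rates = Counter()
for total_flaws in range(1, 5):
    for fixed in range(total_flaws + 1):
        possible_rates[Fraction(fixed, total_flaws)] += 1

# 0% and 100% are reachable from every denominator; 50% from two of them,
# which is why those values spike in a fix-rate histogram.
for rate in sorted(possible_rates):
    print(f"{float(rate):7.0%} reachable {possible_rates[rate]} way(s)")
```

Because 0% and 100% are reachable from every flaw count, those extremes dominate; mid-range spikes appear exactly where several small denominators coincide.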
JARGON WATCH
Mean Time to Remediation (MTTR)
The mean (average) time it takes to fix flaws discovered in an application.
Median Time to Remediation (MedianTTR)
The median time it takes to fix flaws discovered in an application.
How quickly are flaws fixed?
Having covered the overall fix rate for application flaws, we now turn attention to how long it takes development teams to roll out those fixes. Readers of last year’s SOSS may remember a heavy emphasis on fix timelines using survival analysis techniques. We’ll delve even deeper into that topic this time around; let’s start simple with the common measure of Mean Time to Remediation (MTTR). As the name implies, MTTR measures the average time it takes
to remediate flaws.
Figure 7 contrasts the MTTR observed in SOSS Volume 1 with that of our current sample. The results are eye-opening, to say the least. MTTR nearly tripled over the ensuing decade, raising the question of what’s going on with software security in the 2010s. That comparison is deceptive, however, because the average of
flaws suffers from the flaw of averages.
FIGURE 7 Distribution of Mean Time to Remediation among closed application flaws, Volume 1 vs. Volume 10 (median: 59 days). Source: Veracode SOSS Vol. 10
Not to be overly mean,1 but the average becomes an
unreliable measure of “typical” values in skewed distributions.
And time-to-remediation creates a very long-tailed distribution (take a sneak peek at Figure 21 if you want proof). That long tail is comprised of unresolved findings that inflate the MTTR. By way of comparison, the median time-to-remediation (MedianTTR) of flaws in the development cycle during the last year is just two months — equal to the MTTR from back in the SOSS Volume 1 report. Thus, typical fix times haven’t gotten worse; the tail of ever-accruing “security debt” just got a lot longer.
Figure 7 shows a tripling of average fix time over the last decade, which seems to suggest software security may have lost its way. But the median fix time remains unchanged from 10 years ago. Thus, typical fix times haven’t gotten worse; the tail of ever-accruing “security debt” just got a lot longer.
1 Stats jokes always regress to being mean.
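The mean-versus-median gap is easy to reproduce. A minimal sketch with invented fix times (not Veracode data): most flaws close within a couple of months, while a small long-lived tail drags the mean far above the median.

```python
import statistics

# Invented fix-time sample (days): 90 flaws close within a couple of
# months, while 10 linger for a year or more.
quick_fixes = [30, 45, 59, 59, 70] * 18
long_tail = [400, 700, 1000, 1500, 2000] * 2
days_to_fix = quick_fixes + long_tail

print(statistics.median(days_to_fix))  # → 59.0: the "typical" fix time
print(statistics.mean(days_to_fix))    # → 159.34: inflated by the tail
```

Just 10% of the sample nearly triples the mean while leaving the median untouched, which is exactly the pattern behind Figure 7.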
It’s not shown in Figure 8, but those same frequent scanners also tripled their fix rate and reduced security debt by five-fold! “I should scan my apps more often” is the smart mental note to make here.
2 260 scans approximates an average of one scan per working day (52 weeks × 5 days).
Does DevSecOps drive faster fixing?
As with financial debt, escaping from under security debt requires changing habits to pay down balances. The integration of software development and IT operations (DevOps), and of security into those processes (often called DevSecOps), has certainly changed habits over the last several years. We do not have a definitive way to distinguish development teams that practice Dev(Sec)Ops, but we can look for certain observables tied to behaviors in keeping with that spirit.
The frequency and cadence of security testing are two such observables. In general, we expect a DevOps-oriented team to conduct frequent security scans of their code at regular intervals during the development lifecycle. Furthermore, we’d hope to see evidence that those behaviors correlate with faster fix timelines.
Figure 8 shows that hope has some merit. The MedianTTR for applications scanned 12 or fewer times a year (less than once per month, on average) stands at 68 days. Those with an average scan frequency of daily or more (260+ scans2) knocked that statistic way down to 19 days. That’s a 72% reduction in MedianTTR.
FIGURE 8 Effect of scan frequency on fix rate and time-to-remediation (Median TTR). Source: Veracode SOSS Vol. 10
1-12 scans per year (monthly or less avg): 68 days
13-52 scans (monthly-weekly avg)
53-260 scans (weekly-daily avg): 26 days
260+ scans (daily+ avg): 19 days
JARGON WATCH
Fix Capacity
The number
of flaws a development team can close relative to the number of flaws discovered. Usually expressed as a negative or positive ratio.
Is security debt rising or falling?
That notion of security debt brings us to arguably the most defining indicator of the state of software security in 2019 — whether applications are accruing or eliminating flaws over time. To that end, Figure 9 measures the overall fix capacity of development teams by comparing the number of flaws found in an application’s first and last scans.
Overall, 30% of applications show an increased number of flaws in their latest scan. This doesn’t necessarily imply those teams are doing a bad job managing flaws — it could represent a period of rapid growth and change — but it does reveal evidence of accruing security debt. If these applications are on a path similar to those of virtuous venture-backed startups, then we hope to see them escape their negative security burn rate in the near future.
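The comparison behind Figure 9 can be sketched as a simple classification of each application by its first- and last-scan flaw counts. A hypothetical example (the app names and counts are made up for illustration):

```python
# Hypothetical first-scan vs. last-scan flaw counts per application:
apps = {
    "app-a": (12, 4),  # fixed more than it introduced
    "app-b": (0, 0),   # never had flaws
    "app-c": (7, 7),   # treading water
    "app-d": (5, 9),   # accruing security debt
}

def classify(first: int, last: int) -> str:
    """Bucket an app the way Figure 9 does, by net flaw change."""
    if first == 0 and last == 0:
        return "no flaws"
    if last < first:
        return "reduced flaws"
    if last == first:
        return "same"
    return "increased flaws"

for name, (first, last) in apps.items():
    print(f"{name}: {classify(first, last)}")
```

Aggregating those buckets across the portfolio yields the percentages Figure 9 reports, with "increased flaws" marking the applications accruing security debt.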
FIGURE 9 Difference in the number of flaws found between first and last scans of the sample period, by category (All Flaws, OWASP Top 10, SANS, Sev 4, Sev 5), showing the percent of applications that reduced flaws, had no flaws, stayed the same, or increased flaws. Source: Veracode SOSS Vol. 10
Half of application teams drove down flaws over the sample time frame. Another 20% either had no flaws or showed no change.