In 2012, the number of publicly reported software vulnerabilities jumped by 26 percent, the biggest increase in security issues in five years.
Bad news? Not necessarily. While 2012 reversed a five-year decline in vulnerability disclosures, the past decade also saw a steady reduction in the number of easily exploitable, critical-severity flaws. Two reports -- one released earlier this month and another scheduled for release next week -- analyzed more than a decade of data and found both encouraging and worrying signs for software security.
The reports highlight the fact that vulnerabilities will not go away and that companies must find ways to minimize their impact, says Stefan Frei, research director of NSS Labs, a security consultancy.
"Vulnerabilities are here to stay," he says. "I don't think that in five years' time we will have eliminated vulnerabilities from any software product."
In its report released in early February, NSS Labs analyzed almost 54,000 vulnerabilities in nearly 21,000 software products using data from the National Vulnerability Database (NVD). Next week, network-security firm Sourcefire plans to release its own analysis spanning more than two decades of software flaws.
Both reports find that the number of publicly reported software vulnerabilities peaked in 2006, then declined over the next five years. In 2012, however, the tally of software flaws jumped -- by more than a quarter, according to NSS Labs' analysis.
Here are four lessons from the data, according to the experts who crunched the numbers.
1. Focus on security reduces exploitability, severity.
First the good news: Easily exploitable, critical-severity vulnerabilities are increasingly uncommon.
In 2012, fewer than half of all vulnerabilities were easily exploitable, down from approximately 95 percent in 2000. In addition, fewer high-severity flaws were found: The number of vulnerabilities with a Common Vulnerability Scoring System (CVSS) score of 7.0 or higher dropped to 34 percent of reported issues in 2012, down from a high of 51 percent in 2008.
The numbers indicate "a clear -- but slowing -- trend towards an increase in attack complexity," Frei stated in the NSS Labs' report.
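To make the severity breakdowns above concrete, here is a minimal Python sketch of how such shares can be computed from vulnerability records. The record layout and sample CVE identifiers are illustrative only, not the actual NVD schema or real scores:

```python
# Illustrative sketch: computing the share of high-severity flaws
# (CVSS base score >= 7.0) from a list of vulnerability records.
# The record layout and scores below are made up for demonstration.

def high_severity_share(vulns, threshold=7.0):
    """Return the fraction of vulnerabilities scored at or above threshold."""
    if not vulns:
        return 0.0
    high = sum(1 for v in vulns if v["cvss"] >= threshold)
    return high / len(vulns)

sample = [
    {"id": "CVE-XXXX-0001", "cvss": 9.3},  # hypothetical entries
    {"id": "CVE-XXXX-0002", "cvss": 5.0},
    {"id": "CVE-XXXX-0003", "cvss": 7.5},
    {"id": "CVE-XXXX-0004", "cvss": 4.3},
]
print(high_severity_share(sample))  # 0.5
```

The same filter, applied year by year over NVD data, is how figures like the 34 percent and 51 percent shares cited above are derived.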
2. Still more than enough flaws.
While there are thousands of software vulnerabilities out there, opportunistic attackers tend to exploit only a dozen or two in any single year -- typically those exploits baked into cybercriminal toolkits.
Unfortunately, there are more than enough highly critical flaws to go around. In 2012, more than 9 percent of the publicly reported vulnerabilities had both a CVSS score of 9.9 and a low attack complexity, according to NSS Labs' analysis. Adobe, Mozilla, and Oracle -- the company that supports Java -- are the developers with the most easy-to-exploit vulnerabilities, accounting for almost half of the high-severity issues.
"You take all these thousands of vulnerabilities, and maybe only a dozen are being regularly exploited," says Zulfikar Ramzan, chief scientist at Sourcefire.
Companies should focus on fixing or mitigating the vulnerabilities that are included in exploit kits, he says.
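Ramzan's advice amounts to a set intersection: patch the overlap between your backlog and the flaws known to ship in exploit kits first. A minimal sketch, with entirely illustrative CVE lists (the kit contents here are assumed, not taken from Sourcefire's report):

```python
# Illustrative sketch: triaging a patch backlog against CVEs
# reportedly bundled into exploit kits. Both sets are hypothetical.

backlog = {"CVE-XXXX-1111", "CVE-XXXX-2222", "CVE-XXXX-3333"}
exploit_kit_cves = {"CVE-XXXX-2222", "CVE-XXXX-3333", "CVE-XXXX-9999"}

# Patch these first: the flaws attackers are actively weaponizing.
priority = sorted(backlog & exploit_kit_cves)
print(priority)  # ['CVE-XXXX-2222', 'CVE-XXXX-3333']
```

In practice the known-exploited list would come from a threat-intelligence feed rather than a hard-coded set.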
3. New developers, new technology are fertile fields.
New technologies have always been fertile ground for vulnerability research; as a constant incubator of new technologies and frameworks, the Web is a steady source of software vulnerabilities, says Jacob West, chief technology officer of Hewlett-Packard's Enterprise Security Products group, which plans to release its own vulnerability analysis at the coming RSA Conference.
In the past decade, four of the top six vulnerabilities were Web-based software issues, he says. Today, more than half of Web sites are vulnerable to cross-site scripting issues.
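The cross-site scripting class West cites typically arises when untrusted input is echoed into a page without escaping. A minimal Python illustration using only the standard library's html.escape (the render function itself is hypothetical):

```python
import html

def render_greeting(user_input):
    # Unsafe: interpolating raw input lets attacker-supplied markup
    # (e.g. a <script> tag) execute in the victim's browser.
    unsafe = "<p>Hello, " + user_input + "</p>"
    # Safe: escape HTML metacharacters before interpolation.
    safe = "<p>Hello, " + html.escape(user_input) + "</p>"
    return unsafe, safe

unsafe, safe = render_greeting("<script>alert(1)</script>")
print(safe)  # <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;</p>
```

Modern template engines perform this escaping automatically, which is one reason mature frameworks see fewer such holes than hand-rolled Web code.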
[While zero-day attacks -- targeting previously unknown and unpatched vulnerabilities -- are a widespread concern, companies need to test their security against known vulnerabilities as well. See More Exploits For Sale Means Better Security.]
"Enterprise applications are on a slower life cycle -- they are slower to patch," West says. "The other side of that coin, however, is that something like the Web, with an agile life cycle, can be updated faster, but may be more likely to have security holes introduced."
Other technologies that have yet to develop mature, secure development life cycles include mobile applications and industrial control systems, he says.
4. Private markets competing for disclosure.
The decline in highly critical vulnerabilities may not necessarily be good news. HP and NSS Labs both theorize that security-intelligence startups founded by vulnerability researchers, along with other channels for private sales, have kept some of the most valuable vulnerabilities from ever being publicly disclosed.
"There is a bigger markup for critical vulnerabilities than there used to be," HP's West says. "There are black markets; there are private sources. So we think a lot of those high criticality vulnerabilities are being siphoned off."
The number of high-severity vulnerabilities sold through the two most popular white-market programs, HP TippingPoint's Zero Day Initiative and iDefense's Vulnerability Contributor Program, peaked in 2011 at 18 percent, then fell last year to 6.3 percent. Since 2005, the programs have disclosed an average of 7 percent of high-severity issues.
The trend "correlates with reports of the vulnerability and exploit market rapidly expanding in 2012," the NSS Labs' report states. "These changes in the security industry likely affect the share of established programs and could change the dynamics of the vulnerability handling processes in the future."