Half Of Retail, Healthcare Sites 'Always Vulnerable'

Finding vulnerabilities in custom web applications isn't the major problem; fixing them in a timely fashion is, a new report from WhiteHat Security finds.

Sara Peters, Senior Editor

May 21, 2015

5 Min Read

When it comes to fixing vulnerabilities in websites and custom web applications, the security industry has a long way to go, according to research released today by WhiteHat Security. Because of slow remediation rates, half of healthcare and retail sites were considered "always vulnerable" throughout the year. While regular static code analysis, clear accountability, and a risk-driven approach could improve the situation, researchers say there's still much room for innovation.

According to WhiteHat's Website Security Statistics Report, the second- and third-most likely vulnerabilities to be found in applications were information leakage (56 percent) and cross-site scripting (47 percent) -- still common after all these years, largely because legacy code remains so prevalent. Yet the most likely of all vulnerabilities was insufficient transport layer protection, at 70 percent. That's due to Heartbleed -- which is really an infrastructure vulnerability, not a custom web application vulnerability.

WhiteHat wasn't certain whether to include tests for those kinds of vulnerabilities, because software developers complained that they'd take the blame for holes that were really the infrastructure team's job to fix. Yet what WhiteHat actually found was that including data about infrastructure vulnerabilities improved organizations' overall metrics, because infrastructure teams fixed vulns faster than software development teams. They seem to have that part of the process down better.

Part of the reason for this, says WhiteHat Security founder Jeremiah Grossman, is "In the custom web application world, all the exploits are one-off."

So the time to exploit isn't as quick as it is with widespread vulnerabilities in infrastructure or off-the-shelf products, and therefore the custom web app world has gotten away with being less orderly with its process of fixing vulnerabilities, he says. "We're not there yet."

Another reason, he says, is that development teams may be working on products that drive business and make money; therefore, there will be more financial pressures affecting when apps can be updated or when the release of buggy new products can be delayed. "When infrastructure bugs pop up, it's nothing more than a patch," says Grossman, "it's not a lot of business aspects involved."

Business aspects may make less of an impact on vulnerability status if someone is clearly held accountable for data or system breaches. According to the report, organizations with accountability tend to find and fix more security holes.

When software development teams are accountable, the number of open vulnerabilities tends to be lower; when security teams are accountable, vulnerabilities are remediated more quickly and at a higher rate. Yet, the number of open vulnerabilities is lowest (8) and the rate of remediation highest (40 percent) when the board room is held accountable.

"If the developers are accountable, the code that they write ... is actually going to get better," says Grossman. But "you have to have executive-level accountability, otherwise none of the interests fall into [complete] alignment," he says.


WhiteHat found far more companies (35 percent) citing risk reduction as the primary reason for resolving vulnerabilities than companies citing compliance (14 percent). This is a major change from 2013, when respondents ranked compliance number one. The companies using a more risk-based method had more vulnerabilities (23 per site, vs. 12 per site), a lower remediation rate (18 percent, vs. 86 percent), and a shorter time-to-fix (115 days, vs. 158 days). Why?

"This is where we're anthropologists more than anything else," says Grossman. They can't be certain, but the way WhiteHat interprets those results is that those driven by compliance do a very good job of doing the bare minimum required by regulators. 

So companies using a compliance-based approach find fewer vulnerabilities because they don't look as hard for them; and although they remediate most of those holes, they may take their time about it, just making sure it's done before the auditor's next visit. 

"Once everyone got compliant and the attacks didn't stop, everyone decided 'okay what's going on,'" and evolved to a risk-based approach that caused them to keep looking for more threats, beyond what the auditors demanded, says Grossman. So while those companies' "numbers are worse optically, their security might be better."

There were a variety of methods organizations used that reduced the volume of vulnerabilities in their websites or shortened the time to fix. However, "We didn't really see ... best practices that worked for every organization, no matter the circumstances," says Grossman.

That said, the report did find that ad hoc reviews of high-risk apps had a "dramatic positive effect" on the average number of vulnerabilities. Organizations that never do ad hoc reviews averaged 35 vulnerabilities per site, while those that did an ad hoc review for every major release had only 10.

Further, static code analysis made a big difference on time-to-fix. The average time-to-fix for organizations that never did static analysis was 157 days, while those that did daily code analysis fixed vulns in 96 days.

Even 96 days, however, isn't soon enough for Grossman, who stressed that the industry needs to find ways to make fixing vulnerabilities faster and easier -- particularly considering that websites may have multiple holes open at the same time.

According to the report, the average number of vulnerabilities per site is quite low -- ranging from 2 in the public administration sector to 11 in transportation and warehousing. Yet it all adds up to very long windows of exposure (the number of days an application has one or more serious vulnerabilities open).

According to the report, 55 percent of retail/trade sites, 50 percent of healthcare/social assistance sites, and 35 percent of finance/insurance sites were always vulnerable (vulnerable every day of the year). Education was the best performing on window of exposure -- 27 percent always vulnerable, 40 percent rarely (30 days per year or less).
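The report's "window of exposure" metric, as defined above, counts the number of days in the year on which a site had at least one serious vulnerability open, and sites are bucketed by that count (e.g., "always vulnerable" means every day of the year, "rarely" means 30 days or less). A minimal sketch of how such a metric could be computed from vulnerability open/close dates -- the function names, thresholds, and sample data here are illustrative assumptions, not taken from WhiteHat's methodology:

```python
from datetime import date, timedelta

def window_of_exposure(intervals, year_start, year_end):
    """Count the days in [year_start, year_end] on which at least one
    vulnerability was open. Each interval is (opened, closed); closed=None
    means the hole is still open. Overlapping intervals are not
    double-counted because days are collected into a set."""
    exposed_days = set()
    for opened, closed in intervals:
        start = max(opened, year_start)
        end = min(closed or year_end, year_end)
        day = start
        while day <= end:
            exposed_days.add(day)
            day += timedelta(days=1)
    return len(exposed_days)

def exposure_class(days, year_length=365):
    """Bucket a site the way the report describes: 'always vulnerable'
    means every day of the year, 'rarely' means 30 days or less."""
    if days >= year_length:
        return "always vulnerable"
    if days <= 30:
        return "rarely vulnerable"
    return "sometimes vulnerable"

# Illustrative example: one hole open Jan 1-10, another open from
# June 1 onward and never fixed.
intervals = [
    (date(2014, 1, 1), date(2014, 1, 10)),
    (date(2014, 6, 1), None),
]
days = window_of_exposure(intervals, date(2014, 1, 1), date(2014, 12, 31))
# 10 days in January plus 214 days from June 1 through Dec. 31 -> 224 days
```

A site with any vulnerability open all 365 days lands in the "always vulnerable" bucket regardless of how many distinct holes it has, which is why sectors with low average vulnerability counts can still show high always-vulnerable percentages.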

"Now it's time for the whole industry to get better at fixing vulnerabilites," says Grossman. "That's where the innovation needs to be."

About the Author(s)

Sara Peters

Senior Editor

Sara Peters is Senior Editor at Dark Reading and formerly the editor-in-chief of Enterprise Efficiency. Prior to that, she was senior editor for the Computer Security Institute, writing and speaking about virtualization, identity management, cybersecurity law, and a myriad of other topics. She authored the 2009 CSI Computer Crime and Security Survey and founded the CSI Working Group on Web Security Research Law -- a collaborative project that investigated the dichotomy between laws regulating software vulnerability disclosure and those regulating Web vulnerability disclosure.
