
Vulnerability Management

5/21/2015 08:02 AM

Half Of Retail, Healthcare Sites 'Always Vulnerable'

Finding vulnerabilities in custom web applications isn't the major problem; fixing them in a timely fashion is, a new report from WhiteHat Security finds.

When it comes to fixing vulnerabilities in websites and custom web applications, the security industry has a long way to go, according to research released today by WhiteHat Security. Because of slow remediation rates, half of healthcare and retail sites were considered "always vulnerable" throughout the year. While regular static code analysis, clear accountability, and a risk-driven approach could improve the situation, researchers say there is still significant room for innovation.

According to WhiteHat's Website Security Statistics Report, the second- and third-most likely vulnerabilities to be found in applications were information leakage (56 percent) and cross-site scripting (47 percent) -- still common after all these years, largely because legacy code remains so widespread. Yet the most likely of all vulnerabilities was insufficient transport layer protection, at 70 percent. That's due to Heartbleed -- which is really an infrastructure vulnerability, not a custom web application vulnerability.
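Cross-site scripting persists largely because legacy code interpolates user input straight into HTML. As a minimal illustration (not drawn from the report), the Python snippet below shows the unsafe pattern and the standard-library fix, using a hypothetical `payload` value:

```python
import html

# A typical XSS payload submitted as "user input".
payload = "<script>alert('xss')</script>"

# Unsafe: user input interpolated directly into the page markup --
# a browser would execute the injected script (reflected XSS).
unsafe = f"<p>Hello, {payload}!</p>"

# Safe: escape HTML metacharacters so the payload renders as inert text.
safe = f"<p>Hello, {html.escape(payload)}!</p>"

print(unsafe)  # the <script> tag survives intact
print(safe)    # &lt;script&gt;... -- displayed, not executed
```

The fix is one function call, which is part of why XSS is considered an easy hole to close; the report's point is that legacy codebases rarely get that call retrofitted everywhere it is needed.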

WhiteHat debated whether to integrate tests for those kinds of vulns, because software developers complained that they'd take the blame for holes that were really the infrastructure team's job to fix. Yet what WhiteHat actually found was that including data about infrastructure vulnerabilities improved organizations' overall metrics, because infrastructure teams fixed vulns faster than software development teams. They seem to have that part of the process down better.

Part of the reason for this, says WhiteHat Security founder Jeremiah Grossman, is "In the custom web application world, all the exploits are one-off."

So the time to exploit isn't as quick as it is with widespread vulnerabilities in infrastructure or off-the-shelf products, and therefore the custom web app world has gotten away with being less orderly with its process of fixing vulnerabilities, he says. "We're not there yet."

Another reason, he says, is that development teams may be working on products that drive business and make money; therefore, there will be more financial pressures affecting when apps can be updated or when the release of buggy new products can be delayed. "When infrastructure bugs pop up, it's nothing more than a patch," says Grossman, "it's not a lot of business aspects involved."

Business aspects may make less of an impact on vulnerability status if someone is clearly held accountable for data or system breaches. According to the report, organizations with accountability tend to find and fix more security holes.

When software development teams are accountable, the number of open vulnerabilities tends to be lower; when security teams are accountable, vulnerabilities are remediated more quickly and at a higher rate. Yet, the number of open vulnerabilities is lowest (8) and the rate of remediation highest (40 percent) when the board room is held accountable.

"If the developers are accountable, the code that they write ... is actually going to get better," says Grossman. But "you have to have executive-level accountability, otherwise none of the interests fall into [complete] alignment," he says.

Anthropology

WhiteHat found far more companies (35 percent) citing risk reduction as the primary reason for resolving vulnerabilities than companies citing compliance (14 percent). This is a major change from 2013, when respondents ranked compliance number one. The companies using a more risk-based method had more vulnerabilities (23 per site, vs. 12 per site), a lower remediation rate (18 percent, vs. 86 percent), and a shorter time-to-fix (115 days, vs. 158 days). Why?

"This is where we're anthropologists more than anything else," says Grossman. They can't be certain, but the way WhiteHat interprets those results is that those driven by compliance do a very good job of doing the bare minimum required by regulators. 

So companies using a compliance-based approach find fewer vulnerabilities because they don't look as hard for them; and although they remediate most of those holes, they may take their time about it, just making sure it's done before the auditor's next visit. 

"Once everyone got compliant and the attacks didn't stop, everyone decided 'okay what's going on,'" and evolved to a risk-based approach that caused them to keep looking for more threats, beyond what the auditors demanded, says Grossman. So while those companies' "numbers are worse optically, their security might be better."

There were a variety of methods organizations used that reduced the volume of vulnerabilities in their websites or shortened the time to fix. However, "We didn't really see ... best practices that worked for every organization, no matter the circumstances," says Grossman.

That said, the report did find that ad hoc reviews of high-risk apps had a "dramatic positive effect" on the average number of vulnerabilities. Organizations that never do ad hoc reviews averaged 35 vulnerabilities per site, while those that did an ad hoc review for every major release had only 10.

Further, static code analysis made a big difference on time-to-fix. The average time-to-fix for organizations that never did static analysis was 157 days, while those that did daily code analysis fixed vulns in 96 days.
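How an organization wires static analysis into its workflow is what drives that difference: checks that run on every build surface flaws while the code is still fresh in a developer's mind. As a toy sketch of the idea (not a description of WhiteHat's Sentinel product or any specific commercial analyzer), a static check walks a parse tree looking for known-risky patterns without ever running the code:

```python
import ast

def find_risky_calls(source: str):
    """Toy static check: flag calls to eval/exec, a common injection sink.

    Walks the abstract syntax tree of the source, so nothing is executed --
    that is what makes the analysis "static".
    """
    risky = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in {"eval", "exec"}):
            risky.append((node.lineno, node.func.id))
    return risky

snippet = "x = eval(user_supplied)\nprint(x)"
print(find_risky_calls(snippet))  # [(1, 'eval')]
```

A real analyzer tracks data flow across files and frameworks, but the workflow point is the same: run it daily (or per commit) and fail the build on findings, so fixes land in days rather than months.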

Even 96 days, however, isn't soon enough for Grossman, who stressed that the industry needs to find ways to make fixing vulnerabilities faster and easier -- particularly since websites may have multiple holes open at the same time.

According to the report, the average number of vulnerabilities per site is quite low -- ranging from 2 in the public administration sector to 11 in transportation and warehousing. Yet it all adds up to very long windows of exposure (the number of days an application has one or more serious vulnerabilities open).
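The window-of-exposure metric counts calendar days, not individual bugs, which is why a handful of slowly fixed holes can add up to "always vulnerable." A minimal sketch of the calculation, using hypothetical open/close dates rather than any figures from the report:

```python
from datetime import date, timedelta

def window_of_exposure(vulns, year=2015):
    """Days in the year on which at least one serious vuln was open.

    vulns: list of (opened, closed) date pairs; closed=None means
    the hole was never fixed during the year.
    """
    start, end = date(year, 1, 1), date(year, 12, 31)
    exposed = set()  # a day counts once, however many vulns overlap it
    for opened, closed in vulns:
        day = max(opened, start)
        last = min(closed or end, end)
        while day <= last:
            exposed.add(day)
            day += timedelta(days=1)
    return len(exposed)

vulns = [(date(2015, 1, 1), date(2015, 3, 1)),   # fixed after two months
         (date(2015, 2, 15), None)]              # never fixed
print(window_of_exposure(vulns))  # 365 -- "always vulnerable"
```

Two overlapping holes are enough to cover the whole year here, which mirrors the report's finding: a low average vulnerability count per site is compatible with a very long window of exposure.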

According to the report, 55 percent of retail/trade sites, 50 percent of healthcare/social assistance sites, and 35 percent of finance/insurance sites were always vulnerable (vulnerable every day of the year). Education was the best performing on window of exposure -- 27 percent always vulnerable, 40 percent rarely (30 days per year or less).

"Now it's time for the whole industry to get better at fixing vulnerabilities," says Grossman. "That's where the innovation needs to be."

Sara Peters is Senior Editor at Dark Reading and formerly the editor-in-chief of Enterprise Efficiency. Prior to that she was senior editor for the Computer Security Institute, writing and speaking about virtualization, identity management, cybersecurity law, and a myriad ...
