SQL injection drops out of WhiteHat Security's top 10 website vulnerability list

Websites now contain fewer serious security vulnerabilities, but the majority still have at least one serious flaw that can lead to a major compromise.

Some 86 percent of websites have at least one serious bug that could be used in an attack, while the total number of serious bugs per website dropped from 79 in 2011 to 56 in 2012, according to new data released today by WhiteHat Security on the state of website security.

WhiteHat's report, based on data from tens of thousands of websites from some 650 of its customer organizations, also correlated software development life cycle data from 76 customers surveyed by the vendor.

All in all, the report demonstrates how cleaning up websites -- the top attack vector these days -- doesn't happen overnight.

Even organizations that are schooling their developers in security, running Web application firewalls (WAFs), and performing static code analysis are seeing mixed results in their app security, the report found -- in some cases, with more vulnerabilities in their websites. Those with WAFs had 11 percent more vulnerabilities, for instance, while organizations that ran static-code analysis on their websites had 15 percent more flaws.

"Websites are no less hackable today than before," says Gabriel Gumbs, director of solution architecture at WhiteHat Security. "But organizations are doing a lot more -- performing more training and static-code analysis."

Chris Wysopal, CTO of Veracode, says the WhiteHat report's findings reflect the reality of the long, slow road to attaining more secure Web apps. "We are making some progress. But look how long it took Microsoft from the [Bill] Gates memo to where they could say, 'Our products are better.' That took at least five years, and some would argue even longer," Wysopal says. "You are seeing a big uptick in the last couple of years of people testing in the development phase and fixing a lot of the issues there. I think we're still two to three years away from saying things are really getting better."

Wysopal notes that Veracode's recent state of software security report had similar findings to WhiteHat's 86 percent number when it came to the number of vulnerabilities in Web apps. "This aligns with the number of Web apps we found that didn't comply with the OWASP Top 10 -- at 87 percent. Not complying with the OWASP Top 10 is roughly equivalent to at least one serious vulnerability," Wysopal says.

[For the first time in nearly four years, the top malware threat plaguing enterprises is not the Conficker worm: Web-based attacks have taken over. See Microsoft: Worms And Rogue AV Dying, Web Threats Thriving.]

Organizations resolved 61 percent of their serious vulnerabilities, and it took them an average of 193 days to fix them. Just under 18 percent of websites had known flaws open for less than 30 days, according to WhiteHat's findings.

SQL Injection Is Up, SQL Injection Is Down
SQL injection, one of the most prolific attack vectors in the past year or so, was actually on the decline in websites: It didn't even make WhiteHat's top 10 vulnerabilities list for 2012, dropping to No. 14 from No. 8 in 2011. Only 7 percent of websites were found with SQL injection flaws, down from 11 percent, according to WhiteHat's data. "SQL injection is still accounting for a large percentage of data record losses," Gumbs says. "I really was a little surprised that SQL injection was down. You'd expect it to normalize or stay around the same area."
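The article doesn't show what a SQL injection flaw looks like in practice, so here is a minimal sketch -- the table, names, and payload are hypothetical, using Python's built-in sqlite3 module -- contrasting a vulnerable string-concatenated query with the standard parameterized-query fix:

```python
import sqlite3

# Throwaway in-memory database with a hypothetical users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_vulnerable(name):
    # BAD: attacker-controlled input is concatenated into the SQL string,
    # so quote characters in the input can rewrite the query's logic.
    query = "SELECT role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # GOOD: a parameterized query treats the input strictly as data.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

# A classic tautology payload: the vulnerable query becomes
# SELECT role FROM users WHERE name = 'nobody' OR '1'='1'
payload = "nobody' OR '1'='1"
print(find_user_vulnerable(payload))  # leaks every row: [('admin',), ('user',)]
print(find_user_safe(payload))        # returns [] -- no such user
```

The fix is the same in every language and database driver: bind values through placeholders instead of building query strings by hand.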

Veracode's data shows a much different picture with SQL injection. SQL injection flaws were found in one-third of applications tested by Veracode, and a previous trend in declining SQL injection flaws has basically come to a standstill, with 32 percent of Web applications presenting SQL injection flaws from the first quarter of 2011 to the second quarter of 2012.

"We are seeing three times what [WhiteHat] saw last year. We're not seeing a drop in SQL injection," Wysopal says.

Why the discrepancy? Wysopal says it likely has to do with the difference between dynamic and static code testing. "Most of the things [vulnerabilities] we both tested for lined up about the same," he notes. But using either just static or just dynamic testing won't find everything. "Using static and dynamic testing both is what you should be doing. That's the main message" here, he says.

"And we don't collect data [to know] if a website is Internet-facing. With internal websites, companies might ... let SQL injection slide," he says, which also could account for the different findings between the two vendors on SQL injection bugs.

The top two vulnerability types found in websites last year were information leakage (55 percent) and cross-site scripting (53 percent), WhiteHat says. The other top 10 bugs were content spoofing (33 percent), cross-site request forgery (26 percent), brute force (26 percent), fingerprinting (23 percent), insufficient transport layer protection (22 percent); session fixation (14 percent), URL redirector abuse (13 percent), and insufficient authorization (11 percent).
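Cross-site scripting, the No. 2 flaw on that list, follows the same inject-untrusted-input pattern as SQL injection, but the sink is HTML rather than a query. A minimal sketch (hypothetical function names, using Python's standard html module) shows the flaw and the output-encoding fix:

```python
import html

def render_comment_vulnerable(comment):
    # BAD: untrusted input is interpolated directly into the page's HTML,
    # so any markup in the input executes in the victim's browser.
    return "<p>" + comment + "</p>"

def render_comment_safe(comment):
    # GOOD: escaping converts markup characters into inert entities
    # (< becomes &lt;, > becomes &gt;, and so on).
    return "<p>" + html.escape(comment) + "</p>"

payload = "<script>alert(1)</script>"
print(render_comment_vulnerable(payload))  # script tag survives intact
print(render_comment_safe(payload))        # rendered as harmless text
```

In practice this encoding is usually handled by a templating engine that escapes by default, but the principle -- encode untrusted data for the context it lands in -- is the same.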

Industry sector-wise, all but the IT and energy industries have fewer vulnerabilities in their websites than in years past. IT companies have the highest number of bugs per website, at 114; government websites contain the fewest serious flaws, an average of eight. Banking websites have an average of 11, the report shows. Media and entertainment sites fix the most bugs, with an 81 percent remediation rate.

Organizations running Web app development frameworks actually experienced an increase in bugs of 62 percent. "We wanted to be able to say it went down because app development frameworks help remediate those problems out of the door when coding," Gumbs says. "We suspect they are leveraging frameworks and assuming there are other parts of the SDLC that they don't need to address."

"You live by compliance and die by compliance," he says.

Some 57 percent of organizations say they provide security training for their software developers. The good news is that those that do have 40 percent fewer flaws in their websites and fix them 59 percent faster. Meanwhile, the 39 percent that use static-code analysis on their websites end up with 15 percent more flaws, 26 percent slower resolution times, and a 4 percent lower remediation rate.

"Web apps are still a targeted, rich environment where a lot of organizations are being hit. There are fewer vulnerabilities overall, but the numbers [show] it's still very much a problem," Gumbs says.

The full 2013 WhiteHat Security Website Security Statistics Report is available here (PDF) for download.


About the Author(s)

Kelly Jackson Higgins, Editor-in-Chief, Dark Reading

Kelly Jackson Higgins is the Editor-in-Chief of Dark Reading. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise Magazine, Virginia Business magazine, and other major media properties. Jackson Higgins was recently selected as one of the Top 10 Cybersecurity Journalists in the US, and named as one of Folio's 2019 Top Women in Media. She began her career as a sports writer in the Washington, DC metropolitan area, and earned her BA at William & Mary. Follow her on Twitter @kjhiggins.
