
Commentary
12/9/2015 10:30 AM
Kevin E. Greene

‘Re-innovating’ Static Analysis: 4 Steps

Before we pronounce the death of static analysis, let's raise the bar with a modern framework that keeps pace with the complexity and size found in today's software.

Static analysis isn't dead, as some have suggested. Has static analysis lost some of its luster? Absolutely! Many tool studies suggest that static analysis tools (commercial and open source) are underperforming on certain types of bugs or weakness classes. But one of the reasons I like tool studies is that they help you understand what a tool can and cannot do -- provided that you have developed the test cases to measure whether the tool actually detected the coding issue or violation.

Tool studies also help you better understand the behavior and characteristics of static analysis tools for a given code construct or style of coding. Static analysis tools perform differently on different program structures, and understanding why tools fail on certain types of code is essential if we are going to raise the bar in static analysis capabilities and innovate with confidence.
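As a toy illustration of that structural sensitivity (my own example, not drawn from any of the studies), consider the same stack buffer overflow written two ways in C: a direct copy that most analyzers flag easily, and the identical flaw routed through a helper function, which tools lacking interprocedural data-flow analysis frequently miss.

#include <stdio.h>
#include <string.h>

void direct_copy(const char *input)
{
    char buf[16];
    strcpy(buf, input);   /* CWE-121: overflows for inputs of 16+ chars */
    printf("%s\n", buf);
}

/* The same unsafe copy, one call deeper. Whether it is dangerous
 * depends on the caller, so a purely function-local analysis cannot
 * connect the 16-byte buffer to an oversized source. */
static void helper(char *dst, const char *src)
{
    strcpy(dst, src);
}

void indirect_copy(const char *input)
{
    char buf[16];
    helper(buf, input);   /* same weakness, harder for tools to see */
    printf("%s\n", buf);
}

int main(void)
{
    direct_copy("short");     /* safe at runtime; the flaw is latent */
    indirect_copy("short");
    return 0;
}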

The results of many of these tool studies haven't been favorable. In fact, one could argue that, given the simplicity of the test cases used, static analysis tools should be performing much better. For instance, the Juliet test case suite, funded by the NSA Center for Assured Software, is a collection of synthetic Java and C/C++ test cases, meaning they are created as examples with well-characterized weaknesses.
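For readers who haven't seen the suite, a Juliet-style case pairs a "bad" function containing exactly one well-characterized weakness with a "good" counterpart that fixes it, so a scorer can check that a tool flags the former and stays quiet on the latter. The following is a paraphrased sketch in that spirit (not an actual Juliet file):

#include <stdio.h>

/* bad: dereferences a pointer that is NULL on the taken path (CWE-476).
 * A scorer expects a tool to report the printf line. */
static void cwe476_bad(void)
{
    int *data = NULL;
    printf("%d\n", *data);
}

/* good: same structure, but the pointer is valid. A tool that flags
 * this function too is charged with a false positive. */
static void cwe476_good(void)
{
    int value = 42;
    int *data = &value;
    printf("%d\n", *data);
}

int main(void)
{
    cwe476_good();
    (void)cwe476_bad;   /* compiled but not run; it exists to be analyzed */
    return 0;
}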

A criticism of Juliet is that the test cases don't represent real-world software. Given that the test cases are synthetic and less complex than real-world programs, you might expect tools to perform much better on them, but that hasn't been the case. I'm aware of at least four tool studies where the results have been mediocre across the board: the OWASP Benchmark, the NIST Static Analysis Tool Exposition (SATE), the NSA Center for Assured Software, and a project funded at IUPUI, led by Dr. James Hill.

One revelation from the tool studies is that each tool does something really well -- a "sweet spot." Most of the tools have several sweet spots, but outside of them the tools tremendously underperform. It should be noted that, overall, the commercial static analysis tools fare better than the open-source tools, though some studies suggest that open-source tools may be better at finding a particular weakness.

Improving static analysis

Static analysis tools are not dead; they just need to be updated to keep pace with modern-day software. The software assurance community needs to put more emphasis and investment into research and development to find new breakthroughs and advancements in techniques that improve static analysis capabilities.

Organizations that buy static analysis tools have to put more pressure on commercial tool vendors to invest in R&D so that tools can be modernized and improved. Adding rules and heuristics does not fix the problem long term, nor does it provide the innovation to keep pace with the evolution of software. We saw with the Heartbleed vulnerability in OpenSSL that vendors can add rules and heuristics to identify the weakness that exposed the vulnerability -- after the fact. That none of the tools was able to detect the weakness beforehand is the crux of the problem with static analysis tools and capabilities.
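The underlying pattern is easy to describe after the fact: Heartbleed boiled down to trusting a length field taken from the request itself. The sketch below (heavily simplified and hypothetical, not OpenSSL's actual code) shows the shape of the over-read and, in spirit, the one-line bounds check that was missing.

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Simplified sketch of the Heartbleed pattern (CWE-130 / CWE-125):
 * the response echoes 'claimed_len' bytes even though the record only
 * actually contains 'actual_len' bytes of payload. */
char *build_heartbeat_response(const uint8_t *payload,
                               size_t actual_len,
                               size_t claimed_len)
{
    /* The missing bounds check that the patch added, in spirit: */
    if (claimed_len > actual_len)
        return NULL;                  /* discard malformed request */

    char *resp = malloc(claimed_len);
    if (resp == NULL)
        return NULL;

    /* Without the check above, claimed_len could be up to 64KB while
     * actual_len is tiny, so memcpy would read adjacent heap memory
     * (keys, session data) into the response. */
    memcpy(resp, payload, claimed_len);
    return resp;
}

int main(void)
{
    const uint8_t payload[] = { 'h', 'i' };
    /* claimed_len (65535) far exceeds actual_len (2): request rejected */
    char *resp = build_heartbeat_response(payload, sizeof(payload), 65535);
    free(resp);
    return 0;
}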

I want to share with you a research project that I'm funding to push forward the state of the art in static analysis capabilities. The Static Analysis Tool Modernization Project (STAMP) is an attempt to address the lack of innovation around static analysis tools. The goal of STAMP is to modernize static analysis tools, creating better techniques that can scale to the complexity and size of today's software. The inspiration for STAMP came from the HGTV show Property Brothers, in which the brothers find neglected homes and invest money to renovate them. STAMP has the potential to similarly renovate -- re-innovate -- static analysis capabilities. STAMP will focus on four key areas:

1. Develop improved code constructs and test cases that represent real-world programs (modern software). This will address some of the shortcomings of Juliet and, to a certain extent, of newer test case suites such as the OWASP Benchmark project. The next generation of test cases developed in STAMP will help baseline existing state-of-the-art static analysis tools.

2. Conduct an in-depth tool study to understand what tools can and cannot do in terms of coverage across the various weakness classes. Identifying the gaps and strengths in static analysis tools will help pinpoint the areas where capabilities need to be "modernized."

3. Develop a modernization framework to improve the capabilities of static analysis tools. Engaging in R&D to explore new techniques, methods, and services will help make static analysis tools more precise and sound, and achieve what many call "security at speed."

4. Score and label static analysis tools based on the areas where they perform well and the areas where they struggle with regard to coverage. A consumer report will be developed to better educate and guide the software assurance community in purchasing and procuring static analysis capabilities. Often, when you purchase or procure a static analysis tool, you don't really know what the tool missed. The scoring and labeling will help organizations mix and match features across static analysis tools, leveraging the strengths of each to cover a wider attack surface. (A toy sketch of such per-weakness scoring follows this list.)
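To make the scoring idea concrete, here is a toy sketch of how per-weakness-class recall and precision might be tabulated into a label. The tool numbers are invented for illustration, not measurements from any study; a real STAMP-style study would derive them from running a tool against a scored test case corpus.

#include <stdio.h>

struct cwe_score {
    const char *cwe;        /* weakness class */
    int expected;           /* bad cases in the corpus */
    int detected;           /* bad cases the tool flagged (true positives) */
    int false_pos;          /* good cases the tool flagged anyway */
};

int main(void)
{
    /* Hypothetical results for one tool. */
    struct cwe_score scores[] = {
        { "CWE-89  (SQL injection)",    120, 102,  9 },
        { "CWE-121 (stack overflow)",   200,  88, 31 },
        { "CWE-476 (NULL dereference)", 150, 139,  4 },
    };
    size_t n = sizeof(scores) / sizeof(scores[0]);

    printf("%-30s %7s %10s\n", "Weakness class", "Recall", "Precision");
    for (size_t i = 0; i < n; i++) {
        double recall = (double)scores[i].detected / scores[i].expected;
        double precision = (double)scores[i].detected /
                           (scores[i].detected + scores[i].false_pos);
        printf("%-30s %6.0f%% %9.0f%%\n",
               scores[i].cwe, 100.0 * recall, 100.0 * precision);
    }
    return 0;
}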

One of the interesting and unique aspects of working with researchers and computer scientists who study static analysis, users of commercial static analysis tools, and the commercial tool vendors themselves is that I get so much useful information about problem areas. One common theme I hear is that no one tool can give you the coverage you need. Organizations should be able to read a label on a given static analysis tool (the same way nutrition labels appear on foods) to understand its strengths -- its "sweet spots."

Before we pronounce the end or death of static analysis, let's see what innovation and improvements STAMP will provide to help raise the bar in static analysis tools and capabilities. Static analysis is just one context -- alongside DAST and IAST -- that can be leveraged to help reduce false positives while providing more visibility into real bugs and potential vulnerabilities in software.

All application security testing approaches have their limitations; to think that one is superior to the others is a bit naive. I've funded research that has shown how Hybrid Analysis Security Testing (HAST) can really improve software analysis capabilities by fusing the context of SAST and DAST together for better application security situational awareness. There is no uber approach or tool! We are seeing in other technology areas that vendors are opening up their platforms with APIs because customers want better situational awareness across their technology investments to improve overall threat management. I see the same happening with the resurrection of static analysis and a shift away from relying on a "black box" technology solution.
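As a rough, hypothetical illustration of the HAST idea (my own toy model, not the design of the funded research), a static finding can be triaged by whether dynamic testing reached and confirmed the same location -- a second source of context that separates confirmed bugs from probable noise:

#include <stdio.h>

/* Toy model of hybrid (SAST + DAST) triage. The data is invented. */
struct finding {
    const char *location;     /* file:line reported by the SAST tool  */
    int dast_reached;         /* did a dynamic test execute this code? */
    int dast_confirmed;       /* did it observe the faulty behavior?   */
};

static const char *triage(const struct finding *f)
{
    if (f->dast_confirmed) return "confirmed vulnerability";
    if (f->dast_reached)   return "reached, not confirmed";
    return "static-only (needs review)";
}

int main(void)
{
    struct finding findings[] = {
        { "auth.c:88",   1, 1 },
        { "parse.c:312", 1, 0 },
        { "util.c:45",   0, 0 },
    };
    for (size_t i = 0; i < 3; i++)
        printf("%-12s -> %s\n", findings[i].location, triage(&findings[i]));
    return 0;
}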

Kevin Greene is a thought leader in the area of software security assurance. He currently serves on the advisory boards of the New Jersey Institute of Technology (NJIT) Cybersecurity Research Center and Bowie State University's Computer Science department. Kevin has been very ...
 

Comments
KevGreene_Cyber (Author), 12/15/2015 | 2:33:16 PM
Re: The *term* static analysis should die
Jeff... that's not the case. I think there are ways (we have been exploring) to correlate and bring disparate results together, as well as the context of various security testing activities -- see ASTAM. So we have already explored new ways to bring results together, and I think we are moving in the right direction. Lots of exciting things to come... Thanks for reading...
planetlevel (Author), 12/15/2015 | 12:26:30 PM
The *term* static analysis should die
Hi Kevin - nice article.  Just wanted to check if you thought it was me suggesting static analysis is dead -- which I don't.  I have written that the *term* static analysis should die.  At least for security.  I think I've been clear that I use and trust static analysis for many types of code quality analysis.

http://www.contrastsecurity.com/blog/why-its-time-for-the-terms-static-analysis-and-dynamic-analysis-to-die

The basic idea is that it's the information -- the context -- available to the tool that matters, not the fact that it happens to be static, dynamic, interactive, runtime, or whatever.  We should be talking about whether a tool has access to accurate data flow, backend connections, http requests/responses, libraries/frameworks, etc...  That's what tools need to start getting accurate.

And I want to be clear. This isn't about correlating the results of single-approach tools. My experience is that it's very difficult to accurately correlate.  And even if you do, you lose the "sweet spots" that were only found by one tool and so didn't correlate with anything.  I'm talking about merging the analysis techniques themselves into a single tool - so that the analysis engine itself can use a broad range of contextual information when identifying vulnerabilities.

Static analysis is a key security technology. But in my view it needs the information from other security analysis approaches closely integrated during analysis. I look forward to seeing the results of STAMP.

KevGreene_Cyber (Author), 12/14/2015 | 1:13:09 PM
Re: nice post
Thanks for the support. Much appreciated.
Poissonpraveen (Apprentice), 12/11/2015 | 8:03:51 AM
nice post
Nice post and good piece of information; keep it up, bro...