Application Security | Commentary
9/30/2014 11:00 AM
Kevin E. Greene

Software Assurance: Time to Raise the Bar on Static Analysis

The results from tool studies suggest that using multiple tools together can produce more powerful analytics and more accurate results.

I recently had an interesting conversation with Barton Miller, chief scientist of the Software Assurance Marketplace (SWAMP), about the after-effects of Heartbleed and the challenges facing static analysis. SWAMP is a project I'm sponsoring at the Department of Homeland Security to improve software quality and raise the bar on static analysis capabilities.

I wanted to know if the problems associated with static analysis can be attributed to a lackluster analysis engine. Are the core engines in static analysis tools robust enough to keep pace with the complexity and size of modern software? These tools appear to lack depth and breadth, which forces them to oversimplify and make inaccurate assumptions about code; as a result, they miss (simple) things and produce a generous amount of false positives.

Generating false positives can be annoying and very time consuming for developers to triage. However, when static analysis tools miss things, users need to know what was missed. That's why it is important for me, in the role I'm in, to sponsor and support tool studies that establish what a tool can and cannot do -- such as the ones run by the National Institute of Standards and Technology (NIST) through its Software Assurance Metrics And Tool Evaluation (SAMATE) program, the tool study by the National Security Agency (NSA) Center for Assured Software, and a tool study I'm sponsoring through the Security and Software Engineering Research Center (S2ERC).

Tool studies are essential for improving static analysis capabilities. They model the behavior of tools to help identify gaps in techniques and provide evidence of a tool's strengths and weaknesses. What is important to note is that tools perform differently on different program structures. Not all Java code is written the same, and not all C/C++ code is written the same, so program structure (as seen with OpenSSL) strongly impacts how static analysis tools perform. My end goal with tool studies is to understand where the gaps are and then innovate -- sponsor research and development projects to create new techniques and capabilities that will help advance the state of the art, specifically improving open-source static analysis capabilities.

Many organizations have a structure based on various development contracts (some outsourced), with a host of developers who have different coding styles and use different programming languages to support their enterprise-wide application environments. Given this complexity, it is not realistic for an organization to use one static analysis tool to satisfy all of its software assurance needs.

This fallacy, or lack of understanding, of static analysis also creates residual risk in many organizations: weaknesses are present in the software code, but the tool cannot produce evidence attributing them to a particular coding violation. An organization that uses static analysis to assess a new system or application gets a report from a tool, remediates what is stated in the report, and then deploys that system or application on a production network without knowing what risks remain. The residual risk associated with static analysis could give an adversary an attack vector to exploit vulnerable systems.

There is no über tool; all tools struggle to some degree with tool coverage. Every tool has a sweet spot (or several sweet spots), some thing or things it does very well; for instance, some tools may be really good at identifying SQL injection, cross-site scripting, or code quality bugs or issues, but may not analyze other weakness classes that well.
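To make the sweet-spot idea concrete, here is a minimal sketch in Python. The tool names and CWE coverage lists are made up for illustration only -- they are not results from any of the tool studies mentioned here -- but the arithmetic shows why running several tools together widens coverage of weakness classes.

```python
# Hypothetical coverage data: which CWE weakness classes each tool handles well.
# Tool names and CWE lists are illustrative placeholders, not measured results.
TOOL_SWEET_SPOTS = {
    "tool_a": {"CWE-89 (SQL injection)", "CWE-79 (cross-site scripting)"},
    "tool_b": {"CWE-79 (cross-site scripting)", "CWE-476 (NULL dereference)"},
    "tool_c": {"CWE-120 (buffer overflow)", "CWE-401 (memory leak)"},
}

def combined_coverage(tools):
    """Union of weakness classes covered when the given tools are run together."""
    covered = set()
    for name in tools:
        covered |= TOOL_SWEET_SPOTS[name]
    return covered

if __name__ == "__main__":
    single = combined_coverage(["tool_a"])
    combo = combined_coverage(["tool_a", "tool_b", "tool_c"])
    print(f"One tool covers {len(single)} weakness classes: {sorted(single)}")
    print(f"Three tools cover {len(combo)} weakness classes: {sorted(combo)}")
```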

The results from tool studies suggest that using multiple tools together can improve coverage and the accuracy of results. As tool studies produce more powerful analytics and results, mixing and matching tools (open-source and commercial) can help organizations reach deeper into their code to reduce the residual risk associated with conducting static analysis.

This is where SWAMP plays a key role in helping create better tools and providing a way for software researchers to discover new techniques and capabilities in static analysis. The SWAMP analysis framework uses CodeDx to bring together the many sweet spots of static analysis tools: CodeDx takes results from disparate static analysis tools, then normalizes and correlates them. I like to say, "The sum of many is better than the sum of one."
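CodeDx's actual normalization and correlation logic is its own, but the general idea of merging findings from disparate tools can be sketched roughly like this. The report fields, file names, and deduplication key below are my own simplification, not CodeDx's format.

```python
from collections import defaultdict

# Each tool emits findings in its own format; map them to a common shape.
# The raw dictionaries here are hypothetical simplifications of real tool output.
def normalize(tool_name, raw_finding):
    return {
        "tool": tool_name,
        "file": raw_finding["file"],
        "line": int(raw_finding["line"]),
        "cwe": raw_finding.get("cwe", "CWE-unknown"),
    }

def correlate(findings):
    """Group normalized findings that point at the same weakness (file, line, CWE)."""
    groups = defaultdict(list)
    for f in findings:
        groups[(f["file"], f["line"], f["cwe"])].append(f["tool"])
    return groups

if __name__ == "__main__":
    raw = [
        ("tool_a", {"file": "src/example.c", "line": "120", "cwe": "CWE-125"}),
        ("tool_b", {"file": "src/example.c", "line": "120", "cwe": "CWE-125"}),
        ("tool_b", {"file": "src/login.java", "line": "42", "cwe": "CWE-89"}),
    ]
    for key, tools in correlate([normalize(t, r) for t, r in raw]).items():
        print(key, "reported by", tools)
```

A weakness reported by more than one tool is both a deduplication target and, arguably, a higher-confidence finding -- the kind of context that makes correlated results easier to triage than a stack of separate reports.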

Kevin Greene is a thought leader in the area of software security assurance. He currently serves on the advisory board for New Jersey Institute of Technology (NJIT) Cybersecurity Research Center, and Bowie State University's Computer Science department. Kevin has been very ...
Comments
KevGreene_Cyber,
User Rank: Author
10/7/2014 | 10:09:36 AM
Re: complementary sweet spots
@Sara -- from my experience organizations typically use one tool -- the concept of best of breed has died and people have bought into the concept of UTM (Unified Threat Management).  We've seen this on the network side of the shop with Cisco, Juniper, and Fortinet... The same has happened with the AppSec/SwA tools -- all in one.  But that locks organizations into that proprietary solution.  The feedback I got from organizations is that it takes too much time to triage multiple reports from tools -- or it takes resources to bring in a new tool.  So that becomes a barrier to introducing additional tools into the workflow.  The SWAMP eliminates that barrier and enables the developer to focus on the weaknesses that matter the most. The bigger shops or more mature organizations tend to use multiple tools, but have to glue together results from various tools.  I'm sharing with the community that we have solved that problem and are able to leverage the context from various tools to help dive deeper into weaknesses in code.
Sara Peters,
User Rank: Author
10/6/2014 | 4:10:28 PM
complementary sweet spots
As you mention, Kevin, it's better to use multiple tools, instead of just one, because different tools excel at different things. In your experience, do most organizations and developers combine tools like this, or do they too often pick just one?
Marilyn Cohodas,
User Rank: Strategist
10/2/2014 | 1:43:31 PM
Re: Tool Compilation
Thanks, Kevin. We'll be looking forward to you sharing the insight you get from SWAMP as the project evolves. 
KevGreene_Cyber,
User Rank: Author
10/1/2014 | 12:09:19 PM
Re: Tool Compilation
@Marilyn -- we have not determined that yet.  However, there is some data to suggest which tools may work well together depending on the language and the program structure of the code.  For open-source tools, we can definitely share some insight on that, but for commercial tools you are restricted from sharing information based on the EULA --

The SWAMP opened in Feb of 2014, and we are still building the analytics around this notion.
Marilyn Cohodas,
User Rank: Strategist
10/1/2014 | 8:59:37 AM
Re: Tool Compilation
Following on Ryan's comment, Kevin: Have you determined through SWAMP (thus far) what is the most popular or most effective tool combo for SA? 
RyanSepe,
User Rank: Ninja
9/30/2014 | 2:46:24 PM
Re: Tool Compilation
I'll take a look, thanks! Just a quick question: do you specialize in one genre of tool, or do you group tools by genre to appeal to more users? (NetSec tools, InfoSec tools, etc.)
KevGreene_Cyber,
User Rank: Author
9/30/2014 | 1:16:03 PM
Re: Tool Compilation
@Ryan... Thanks for your comment.  CodeDx bundles open-source tools and allows you to bring in others as well (commercial and open-source) in one GUI.  The goal is to provide a cost-effective solution to help formalize aspects of software assurance in organizations.  You should give it an eval and let us know what you think.  Also, create an account in SWAMP and let us know what you think.
RyanSepe,
User Rank: Ninja
9/30/2014 | 1:08:47 PM
Tool Compilation
Good article. More often than not, a single tool yields less thorough results than tools used in correlation. It takes time and effort to find the precise tools needed, and each endeavor requires different data, so it's hard to foresee which tools are needed.

For tools that are open source, why not have them combined, in a GUI or command-line fashion, in one distribution? Allow functionality to build upon those packages for data correlation, and your analysis efficiency should increase tenfold.
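A rough sketch of what that command-line wrapper could look like -- one entry point that runs several open-source analyzers and collects their raw output for later correlation. The analyzer commands and flags below are placeholders, not a real distribution or any specific tool's interface.

```python
import subprocess
import json
import sys

# Placeholder commands for open-source analyzers; swap in the real tools and
# flags you actually use. This is an illustration of wrapping several tools
# behind one command line, not a shipped distribution.
TOOLS = {
    "analyzer_one": ["analyzer-one", "--report=json"],
    "analyzer_two": ["analyzer-two", "--format", "json"],
}

def run_all(target_dir):
    """Run each configured analyzer on target_dir and collect its raw output."""
    results = {}
    for name, cmd in TOOLS.items():
        try:
            proc = subprocess.run(cmd + [target_dir], capture_output=True, text=True)
            results[name] = proc.stdout
        except FileNotFoundError:
            results[name] = f"{name} not installed; skipped"
    return results

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "."
    print(json.dumps(run_all(target), indent=2))
```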