Dark Reading is part of the Informa Tech Division of Informa PLC



Kevin E. Greene

Certifying Software: Why We’re Not There Yet

Finding a solution to the software security and hygiene problem will take more than an Underwriters Laboratories seal of approval.

There’s no arguing that acquirers of software need assurance that the software they purchase is safe and stable to use. However, I struggle with the notion that analyzing software and assigning a pass/fail rating is the best solution, given that many state-of-the-art software assurance tools, technologies, and capabilities have not kept pace with the complexity and size of modern software. Of particular concern to me are the performance, precision, and soundness challenges of many static analysis tools, both open source and commercial.

Static analysis is listed by Underwriters Laboratories as one of the assessments that will be used to identify weaknesses in software, along with activities such as fuzz testing, evaluation of known vulnerabilities, malware hunting, and static binary analysis. While this all makes sense, there is a dirty little secret about static analysis tools that is largely ignored: there is residual risk in using any of them.

The problem with residual risk is that we don’t know which parts of the software a tool was unable to analyze due to "opaqueness," parts of the code written in ways the tool cannot understand. We also don’t know what percentage of actual weaknesses a tool correctly reports, or at what point a tool starts to oversimplify and miss true bugs because of performance constraints. My point is that there needs to be some ground truth established for software assurance tools that both sets a baseline and measures what the tools can and cannot do.
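To make the "opaqueness" point concrete, here is a hypothetical Python sketch (the class and function names are my own invention, not from any study): the dangerous call is reached only through a dynamically resolved attribute name, a pattern that causes many static analyzers to silently lose the taint flow rather than report it.

```python
# Hypothetical illustration: a taint flow that many static analyzers lose
# track of because the call target is resolved dynamically at runtime.
import subprocess

class ReportRunner:
    def run_safe(self, name: str) -> str:
        # Analyzer-friendly: fixed argument list, no shell involved.
        return subprocess.run(["echo", name],
                              capture_output=True, text=True).stdout

    def run_unsafe(self, name: str) -> str:
        # shell=True with interpolated input is a classic injection sink.
        return subprocess.run("echo " + name, shell=True,
                              capture_output=True, text=True).stdout

def generate(action: str, user_input: str) -> str:
    # The sink is chosen by string concatenation at runtime ("opaque" code).
    # A tool that cannot resolve getattr() never connects user_input to the
    # shell=True call, so the weakness is silently unanalyzed, not reported.
    handler = getattr(ReportRunner(), "run_" + action)
    return handler(user_input)
```

The residual risk here is not a false positive or a false negative in the usual sense: the tool never even models the code path, and nothing in its output tells the user that.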

Federally Funded Programs: Trying to Move the Needle
There have been several attempts by the NSA Center for Assured Software, NIST, and other federally funded programs to conduct tool studies to better understand areas where modernization is needed. The DHS Science & Technology Static Tool Analysis Modernization (STAMP) research project will not only modernize a set of open-source static analysis tools, but will also offer a structured way of measuring static analysis tools and provide a "consumer report" that identifies the strengths and weaknesses of each tool.

The government took the lead in addressing this problem by conducting these studies, but, in many cases, researchers were restricted from sharing the results due to license agreements or DeWitt clauses, which prevent distributing test results to the security community at large. Many commercial vendors also refused to share their results as part of their participation in NIST’s Static Analysis Tool Exposition (SATE).

I personally believe that sharing tool study results (in a healthy way) will ultimately stimulate competition and innovation, and improve software assurance. Georgetown University, for example, recently examined license agreements and similar clauses in a white paper titled "Vendor Truth Serum," which explores how to share results from tool studies in a way that helps organizations make informed choices, accomplish satisfactory testing in minimal time at minimal expense, and choose the set of tools that best matches the software under test.

The Role of Secure Architecture and Design
While many of the technologies called out in the UL 2900 series (you have to pay for the documents) are behind the power curve of innovation, it’s important to carefully consider secure design, architectural, and engineering principles early in the process, before a system is developed, because poor architecture decisions introduce security flaws that often lead to breaches. A Carnegie Mellon study suggests that over 90% of security breaches can be traced back to poorly designed and developed systems.

Mehdi Mirakhorli, Ph.D., an assistant professor in the department of software engineering at Rochester Institute of Technology, believes the problem of security architecture erosion is exacerbated by the fact that popular software engineering tools and environments fail to integrate architecture analysis and security thinking into developers’ daily activities. As a result, programmers are not kept fully informed of the consequences of their programming choices or refactoring activities on the security design of the system.

For instance, a third-party library with good security hygiene used in the wrong way can be as bad as, or worse than, a flawed third-party library. Understanding the secure design principles of a system or application is important because it gives context for how security tactics and features should be implemented. An in-depth assessment of a system’s design and architecture is a critical component, especially when there are plans to tie software liability to UL certification. With the emergence of the Internet of Things (IoT), secure design becomes even more important, as seen in the recent discovery of security flaws in Johnson & Johnson insulin pumps.
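A minimal sketch of the "good library, wrong usage" point, using Python’s standard `ssl` module (the function names here are my own, chosen for illustration): the library itself is well maintained and secure by default, but two configuration lines quietly remove the protection it provides, and no vulnerability scan of the library would ever flag it.

```python
# Hypothetical sketch: the same trusted library, used two ways.
import ssl

def context_good() -> ssl.SSLContext:
    # Secure defaults: certificate chain and hostname are both verified.
    return ssl.create_default_context()

def context_bad() -> ssl.SSLContext:
    # Wrong usage of a secure library: verification is disabled, so any
    # man-in-the-middle certificate is silently accepted.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

Only an architecture- and design-aware review, one that asks *why* verification was turned off here, catches this class of flaw; scanning the library’s code for known weaknesses never will.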

I would also like to see a greater emphasis on secure design reviews and improved strategies to assess whether or not security is actually "built in." Testing and assessing the system using architecture and design context will help guide the use of static and dynamic analysis, malware hunting, static binary analysis, and fuzz testing.

Forward Leaning with More Innovation
Overall, I like the concept of certifying software, or some variation of a consumer report for software security like the one Mudge and Sarah Zatko developed. The consumer report format gives the user or acquirer the information to make informed decisions and quantitatively compare products. My hesitation is with the science and innovation around the technologies and capabilities of software assurance tools. As a program manager in software assurance R&D, I work with computer scientists from industry and academia and understand how far behind software assurance tools are in keeping pace with modern software.

One of my researchers, Henny Sipma, Ph.D., suggests that static analysis capabilities are 15 years behind. Other computer scientists and researchers, such as Bart Miller and James Kupsch, described issues in static analysis as far back as April 2014, when none of the existing tools were able to find the weakness behind the Heartbleed vulnerability.
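The weakness behind Heartbleed was a length field supplied by the peer that was trusted when echoing a buffer back. A hypothetical Python model of that pattern (this is an illustration of the bug class, not OpenSSL’s actual code; Python slicing clamps rather than over-reads, so the leak of adjacent memory is modeled explicitly):

```python
# Hypothetical model of the Heartbleed bug class: trusting a peer-supplied
# length field. Not OpenSSL code; adjacent memory is simulated as bytes.
import struct

MEMORY = b"payload!" + b"SECRET-KEY-MATERIAL"  # simulated adjacent memory

def heartbeat_vulnerable(request: bytes) -> bytes:
    # Request format: 2-byte big-endian claimed length, then the payload.
    (claimed_len,) = struct.unpack(">H", request[:2])
    # Bug: echo back claimed_len bytes without checking it against the
    # actual payload size, so extra adjacent bytes leak to the peer.
    return MEMORY[:claimed_len]

def heartbeat_fixed(request: bytes) -> bytes:
    (claimed_len,) = struct.unpack(">H", request[:2])
    payload = request[2:]
    if claimed_len > len(payload):
        return b""  # the actual fix: discard malformed requests
    return payload[:claimed_len]
```

The check that was missing is a single bounds comparison, which is exactly why its absence was so hard for tools tuned to pattern-match known weakness signatures to spot.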

We need more science and innovation not only in static analysis, but in all automated tools designed to look for bad things in software. According to Mirakhorli, "an analysis of reported CVEs suggests that roughly 50% of security problems are the result of software design flaws, poor architectural implementation, violation of design principles in source code, and degradation of security architecture. Unfortunately, such problems are exacerbated by the fact that current tools do not provide enough architecture-centric analysis to detect erosion of security architecture in the code and generate appropriate recommendations for the developers on how to fix these design issues."

It’s definitely time to raise the bar with better tech innovation and novelty so that movements and initiatives like UL certification can help shape a healthier and safer software world.

Kevin Greene is a thought leader in the area of software security assurance. He currently serves on the advisory board for New Jersey Institute of Technology (NJIT) Cybersecurity Research Center, and Bowie State University's Computer Science department. Kevin has been very ...