Dark Reading is part of the Informa Tech Division of Informa PLC



If You Build It, They'll Crash It

Software failure can happen, and it can be very costly, sometimes even costing human lives

On the first day of class in college, mechanical engineering students find out firsthand what happens when engineering designs fail by watching the Tacoma Narrows Bridge shake itself to death.

By contrast, computer science students are asked to write "hello world" in C and told that they can build almost anything they can imagine. It's no wonder that we have a large and growing software security problem! We seem to have forgotten that software can fail.

Software failure happens. Perhaps the first day of computer science class should cover the Ariane 5 explosion, the Therac-25 radiation therapy machine, the Denver International Airport automated baggage handling system, and the loss of the Mars Climate Orbiter. Those four stories remind us that software failure can happen, and that it can be very costly, sometimes even costing human lives.

It gets more complicated though, since software failure can be maliciously induced. When bad guys enter the picture, avoiding software failure becomes a serious challenge.

But No One Would Ever Do That
I have been deeply involved in software security analysis since 1995, having delivered hundreds of engagements, and having personally witnessed and helped uncover a number of spectacular and never-before-seen software security problems. Together, these problems account for tens of millions of dollars worth of business risk.

Across these engagements, about 80% of the time that a software security defect (an implementation bug or a design flaw) is revealed to the team that built the code, the reaction is the same: incredulity. The two most common remarks we encounter are, "Well, you're not supposed to do that," and "No one would ever actually do that."

Getting past the naïveté of software developers takes some doing. Or maybe it's optimism that needs getting past. You see, developers are by their very nature an optimistic lot. They are constantly tasked with building something very complex from nothing, and they never even bat an eye; they just start coding. These kinds of people need to be reminded that bad people exist so they can better develop their cynicism. Reminding developers that nefarious people rob banks even though they're not supposed to is a start. After that, you can get straight into development "can'ts" and "won'ts."

The most powerful way to open the eyes of a development team is to show them a living exploit on their own stuff. There's nothing like examples from your own code base to make training come alive. The only problem with this idea is that developing a demonstrable exploit takes serious work. A slightly easier approach is to pull closely related examples from history and talk about those.

I wrote the book Exploiting Software with Greg Hoglund for just that reason. Really, it's pretty simple. Know your enemy, understand how they tick, and find out what kinds of tools are likely to be wielded against your creation.

The Power of Evil Thoughts
The best way to build something that doesn't roll over and die when attacked is to buy a black cowboy hat and think like a bad guy when you're designing and building it. This idea comes as no surprise to network security practitioners, who have been using tools like SATAN, Nmap, and Nessus to scan their networks for exploitable vulnerabilities ever since Dan Farmer and Wietse Venema came up with the idea in 1995. (Don't forget that way back then, Dan was fired for his trouble. Ironically, system administrators who don't use such network scanning technology today are fired for incompetence.)

The problem is that most software people still don't know that they should think like attackers, nor are they armed with the knowledge to do so properly. That needs to change.

Security vendors don't help much either. Some of the early "application security testing" tools on the market have laughably stupid "security tests" baked right into them. One example is a test that sends 50 "a"s through port 80 in the hope of overflowing some arbitrary input buffer. Tests like those are just plain dumb. We have yet to see the Attack of the 50 A's in the wild, but it is highly worrisome (not)!
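To see why such a fixed-length probe is nearly useless, here is a minimal C sketch of the classic overflow pattern such a test hopes to trip. The handler and names are hypothetical, invented for illustration; the point is that a 50-byte probe can never reach a buffer boundary at 64 bytes or beyond, so the scanner cheerfully reports vulnerable code as safe.

```c
#include <stdio.h>
#include <string.h>

#define BUF_LEN 64  /* hypothetical fixed-size input buffer */

/* Vulnerable pattern: strcpy() copies until the NUL terminator,
   writing past buf whenever the input is BUF_LEN bytes or longer. */
void handle_request_unsafe(const char *src) {
    char buf[BUF_LEN];
    strcpy(buf, src);  /* overflows when strlen(src) >= BUF_LEN */
    printf("handled %zu bytes\n", strlen(buf));
}

/* Whether a probe of the given length would cross the buffer
   boundary. A 50-byte string of "a"s falls short of 64, so the
   "Attack of the 50 A's" never triggers this bug. */
int probe_would_overflow(size_t probe_len) {
    return probe_len >= BUF_LEN;
}
```

A meaningful test would vary the input length (and content) systematically rather than hard-coding one magic number, which is exactly what real fuzzing does.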

What we need is a science of attacks. A clear discussion of attack patterns and of the tools commonly found in an attacker's toolkit is a start.

The NASCAR Effect
Nobody watches NASCAR racing to see cars driving around in circles; they watch for the crashes. That's human nature. People prefer to see, film, and talk about the crashes rather than talk about how to build safer cars.

This same phenomenon happens in software. It seems that when it comes to software security, people would rather talk about software exploits, how things break, and the people who carry out attacks than about software security best practices. I've seen evidence of this in my own work, where my "bad guy" books consistently outsell the "good guy" ones 3 to 1. I suppose that instead of being discouraged by this effect, we need to take advantage of it to get people interested in the software security problem.

In the end, we will need to integrate security into the software development lifecycle (as described in Software Security). But maybe the best way to start is to get a better handle on what attacks really look like. Perhaps then we'll learn more about what deep trouble we're in.

Gary McGraw is CTO of Cigital Inc. Special to Dark Reading
