Application Security
Commentary
Jeff Williams
2/6/2014 12:45 PM

The 7 Deadly Sins of Application Security

How can two organizations with the exact same app security program have such wildly different outcomes over time? The reason is corporate culture.

The kneejerk approach to application security is to start finding and fixing vulnerabilities. The problem with these reactive programs is that they end up being expensive witch-hunts that don’t change the way code is built. Instead, we need to think of those vulnerabilities as symptoms of a deeper problem that lies somewhere in the software development organization.

Over the past 15 years, I’ve worked with a variety of organizations, both large and small, to improve their application security capabilities. One thing I’ve noticed is that two organizations with the exact same application security activities can have wildly different results over time. One organization will improve, steadily stamping out entire classes of vulnerabilities. The other will continue to find the same problems year after year.

The difference is culture. In some organizations, security is an important concern that is considered a part of every decision. In others, security is considered a productivity killer and a waste of time. These "culture killers" will, most certainly, undermine and destroy your application security program. Let’s take a look at the seven most deadly security sins...

Sin 1: Apathy. In 2002, Bill Gates famously drafted his "Trustworthy Computing" memo, in which he made clear that "Trustworthy Computing is the highest priority for all the work we are doing. We must lead the industry to a whole new level of Trustworthiness in computing." Executives have the power to make security a priority or a joke. What messages are you sending with your security decisions and actions?

Sin 2: Secrecy. Consider the mission of the Open Web Application Security Project (OWASP): "To make software security visible, so that individuals and organizations worldwide can make informed decisions about true software security risks." Making security visible in your organization will ensure that development teams and business organizations are on the same page. Secrecy, on the other hand, leads to confusion and blind decision-making, and can even create disincentives for secure coding. Do you treat every vulnerability as an opportunity to improve, or as a secret to cover up as quickly as possible?

Sin 3: Forgetfulness. Security is created through an evolutionary process. "Builders" create a security system, and "breakers" challenge that security. Each iteration advances security a little bit. However, many organizations don't save the improvements made during each iteration. They don't learn. Instead, their developers continue to make the same mistakes year after year. Organizations that learn from their mistakes capture the lessons in standards, technical defenses, training, and other forms.
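One way to capture a lesson so it cannot be forgotten is to turn it into an automated check. As a purely illustrative sketch in Python, a team that was burned by SQL injection might add a build-time rule that flags string-concatenated SQL; the rule, the regex, and the function name here are all hypothetical:

```python
import re

# Hypothetical "lesson learned" rule: flag SQL built with string
# concatenation or %-formatting, a pattern that caused past injection
# bugs in this (imaginary) codebase. Parameterized queries pass.
SQL_CONCAT = re.compile(
    r"(SELECT|INSERT|UPDATE|DELETE)\b[^\"']*[\"']\s*(\+|%)\s",
    re.IGNORECASE,
)

def find_risky_sql(source: str) -> list:
    """Return 1-based line numbers where SQL looks concatenated."""
    return [
        lineno
        for lineno, line in enumerate(source.splitlines(), start=1)
        if SQL_CONCAT.search(line)
    ]
```

A check like this runs in seconds in a build pipeline and encodes the lesson permanently, instead of relying on reviewers to remember it.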

Sin 4: Promiscuity. Some organizations allow development teams to adopt new technologies without any security analysis. This might be an open-source library, a new framework, or a new product. Eventually, vulnerabilities are identified in the new technology, but by then it is either technically or contractually too late to fix. The solution is to ask the hard questions before you get in bed with a new technology. What is the security story behind the technology? How can you verify security? How have the vendors handled security issues in the past?
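One hedged sketch of that due diligence: before (and after) adopting a dependency, check the pinned versions against a deny-list built from security advisories. Everything below, package names, versions, and advisory IDs, is invented for illustration; in practice the data would come from an advisory feed or a tool such as OWASP Dependency-Check:

```python
# Sketch: audit pinned dependencies against known-vulnerable versions.
# Package names, versions, and advisory IDs are made up for this example.
KNOWN_VULNERABLE = {
    ("somelib", "1.2.0"): "EXAMPLE-2014-0001",
    ("otherlib", "0.9.1"): "EXAMPLE-2014-0002",
}

def audit(pinned):
    """pinned: {package_name: version}. Return matching advisory IDs."""
    return [
        advisory
        for (name, version), advisory in KNOWN_VULNERABLE.items()
        if pinned.get(name) == version
    ]
```

Running a check like this on every build means the answer to "what is the security story behind this technology?" gets asked continuously, not just once at adoption time.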

Sin 5: Creativity. Done right, security is boring. Every time we see custom security controls, we find serious problems. Everyone knows not to build their own cryptography, or at least they should. But the same reasoning applies to authentication, access control, input validation, encoding, logging, intrusion detection, and other security defenses.
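To make the "boring is good" point concrete, here is the difference in Python between a "creative" homegrown password hash and the boring, vetted standard-library primitive built for the job. This is only a sketch; maintained libraries for bcrypt, scrypt, or Argon2 are equally boring, equally good choices:

```python
import hashlib
import hmac
import os

# DON'T: a "creative" homegrown scheme -- unsalted and fast, so it is
# trivially attacked with precomputed tables and GPUs.
def homegrown_hash(password):
    return hashlib.md5(password.encode()).hexdigest()

# DO: the boring, vetted primitive -- salted, deliberately slow PBKDF2
# straight from the standard library.
def hash_password(password, salt=None):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, digest)
```

Nothing in the second version is clever, which is exactly the point.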

Sin 6: Blame. Some people like to blame developers for security flaws. Economic theory suggests that this is the efficient approach, since developers are in the best position to prevent security glitches from occurring. But blaming developers for application security problems is exactly the wrong thing to do. Blame creates a dangerous feedback loop where developers despise security, security teams exaggerate their findings to get attention, and everyone ends up blindly trusting applications. Remember the words of Ice-T, "Don’t hate the playa, hate the game."

Sin 7: Faith. Many organizations rely on a quick scan or static analysis to see if their applications are "good to go" without really understanding what they are getting. For example, access control is a critical security defense, but most tools can't test it at all. The same goes for many encryption and authentication defenses. So you'll need a different strategy to verify that those controls are in place and effective. Blind faith is not a defense.
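A scanner cannot know that user A must not read user B's invoice; only a test that encodes the business rule can verify it. Here is a minimal sketch of such a check, with an in-memory "app" and hypothetical names standing in for a real HTTP client and API:

```python
# Minimal access-control sketch. INVOICES and get_invoice() are
# hypothetical stand-ins for a real application under test.
INVOICES = {"inv-1": {"owner": "alice", "total": 100}}

def get_invoice(invoice_id, requesting_user):
    """Return (status, invoice); only the owner may read an invoice."""
    invoice = INVOICES.get(invoice_id)
    if invoice is None:
        return 404, None
    if invoice["owner"] != requesting_user:
        # Horizontal access control: deny other authenticated users.
        return 403, None
    return 200, invoice

# The access-control test a scanner cannot write for you:
assert get_invoice("inv-1", "alice")[0] == 200    # owner allowed
assert get_invoice("inv-1", "mallory")[0] == 403  # other user denied
```

A handful of tests like these, run on every build, verify the one defense that generic tools are structurally blind to.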

Consider your software development culture and ask yourself if you’ve made secure coding simple and fun. Just remember: All the tools and processes in the world won’t lead to secure code unless you tackle the culture killers. Good luck!

A pioneer in application security, Jeff Williams has more than 20 years of experience in software development and security. Jeff co-founded and is the CTO of Aspect Security, an application security consulting firm that provides verification, programmatic and training ...

Comments
Marilyn Cohodas, User Rank: Strategist
2/10/2014 | 8:59:38 AM
Importance of culture
Importance of culture
Jeff, I couldn't agree more with your point about the importance of culture in setting the tone for attitudes about security. This is true on all levels -- from software app development to enterprise-wide security architectures to end user awareness. And the message clearly has to come from the top.
J_Brandt, User Rank: Apprentice
2/11/2014 | 1:14:43 PM
Culture is #1
Excellent piece. It seems like almost weekly I am making the point to my clients that culture can override and invalidate any security hardware, software, or procedure. I'll be referencing your 7 deadly sins to help "tackle the culture killers!"
Marilyn Cohodas, User Rank: Strategist
2/11/2014 | 1:33:19 PM
Re: Culture is #1 -- True, but what's the deadliest sin?
Glad you liked the column, J_Brandt. Curious to know which of the 7 deadly sins stands out as the most deadly in your experience.
J_Brandt, User Rank: Apprentice
2/11/2014 | 5:39:36 PM
Re: Culture is #1 -- True, but what's the deadliest sin?
Definitely Sin 1: Apathy. With no incentive at the highest levels, nothing can be done. No education, no awareness, no tools, no procedures -- nothing can happen.
planetlevel, User Rank: Author
2/12/2014 | 10:13:44 AM
Re: Culture is #1 -- True, but what's the deadliest sin?
Some security professionals have commented to me that the best way to bootstrap a security culture is to get hacked. And I have seen companies pour effort into application security after a hack. But I'm not convinced. I've worked with a number of clients that have managed to create a very strong security culture without a devastating hack. And I've also seen those post-hack efforts fade over time -- meaning they didn't really change the culture.

To me the best signal of a great security culture is that security has been made "visible" -- there are artifacts available, and people encourage informed and rational discussion of security issues. Beware of that "black-and-white" thinking! What other signals do you see of either great or terrible security culture?