Dark Reading is part of the Informa Tech Division of Informa PLC


Application Security

8/27/2014
04:35 PM

10 Common Software Security Design Flaws

Google, Twitter, and others identify the most common software design mistakes -- compiled from their own organizations -- that lead to security woes and how to avoid them.

It's not all about the security bugs: Mistakes in how a software application's security is designed can lead to major breaches like that suffered by the mega-retailer Target.

Security experts from Cigital, Google, Twitter, HP, McAfee, EMC, RSA, Harvard University, George Washington University, Athens University of Economics and Business, the Sadosky Foundation, and the University of Washington, working as part of the IEEE Center for Secure Design, today published a report that pinpoints 10 of the most common software security design flaws they collectively found in their own development programs.

"When you can solve a problem at the [software] design phase, it automatically solves a bunch of problems later on in the stages," says Neil Daswani, who is with Twitter's security engineering team. "It's very cost-effective to solve security at the design stage."

The organizations came up with a top 10 list during a workshop session this spring, where each brought along examples of design flaws it had experienced. "What we did as a group of companies is dump out a list" based on the overlap in all the design issues brought to the table, Daswani says.

To date, the security industry has mostly been laser-focused on finding and eradicating security vulnerabilities, or bugs. There are plenty of lists available, such as the OWASP Top 10, that catalog the most common software bugs in development. But design flaws -- such as using encryption incorrectly or not validating data properly -- can also be exploited by attackers or lead to security bugs. These flaws can be less noticeable on the surface but just as deadly if abused by an attacker.

"Getting software designers and architects [to focus on] what they need to think about when building software is just as important as getting developers to think about bugs," says Gary McGraw, CTO at Cigital and a member of the team behind the new "Avoiding the Top 10 Software Security Design Flaws" report. "Unfortunately, not much attention is paid to that."

With cross-site scripting (XSS) vulnerabilities, for example, a simple design change can wipe out the possibility of those bugs in an application, he says. "You can make a change to the design of the API" of an application that could eliminate an entire category of bugs.
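As a sketch of what such an API-level design change might look like (the helper names here are hypothetical, not from the report): if the application's output API escapes by default, forgetting to escape is no longer possible, and raw output becomes an explicit, reviewable exception.

```python
import html

# Hypothetical output API: escaping is the default path, so an entire
# class of XSS bugs (forgotten escaping) is designed out of the app.
def render_text(value: str) -> str:
    """Escape user-supplied data before it reaches the browser."""
    return html.escape(value, quote=True)

def render_raw(trusted_markup: str) -> str:
    """Deliberately unescaped; callers must justify using this."""
    return trusted_markup

print(render_text('<script>alert(1)</script>'))
# &lt;script&gt;alert(1)&lt;/script&gt;
```

The design choice is that the safe call is the short, obvious one, while the unsafe call has a name that stands out in code review.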

According to McGraw, Target's data breach was a real-world example of a design flaw leading to a hack. The environment was "crunchy on the outside and chewy in the middle." As a result, it was "easy to get to the middle where all the data was stored" once the attackers had compromised the point-of-sale system. "The design of the communications and storage… were [poorly] done," he says. "Often a bug provides a toehold into a system to be exploited because of bad" security design.

Twitter already is implementing its own software security design flaw prevention program based on the report. Daswani says an internal Twitter document specifically recommends how to design its software securely. "For example, we recommend the use of a certain number of templating frameworks -- the developers choose one -- so it's more likely that their code won't be vulnerable to cross-site scripting" and other bugs.

One reason XSS is so prevalent in software today is that, when you build a web application, it's easy to design it with inherent flaws at the user interface. "If you're not careful about what data you output to the user interface, sometimes the browser at the other end can get confused and think the data might be code that has to be executed," he says. "By recommending a templating language, it makes a clear delineation on what is considered code and what is to be considered data. That sure makes it harder to have a bug like cross-site scripting."
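That delineation between code and data can be illustrated with a toy example (standard-library Python only; real templating frameworks are far more thorough): the template text is trusted code, while every substituted value is treated as data and escaped.

```python
import html
import re

# Minimal auto-escaping template: {placeholders} in the template are
# replaced with HTML-escaped values, so data can never become markup.
def render(template: str, **values: str) -> str:
    return re.sub(
        r"\{(\w+)\}",
        lambda m: html.escape(values[m.group(1)], quote=True),
        template,
    )

page = render("<p>Hello, {name}!</p>", name="<img src=x onerror=alert(1)>")
print(page)
# <p>Hello, &lt;img src=x onerror=alert(1)&gt;!</p>
```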

The report recommends how to prevent each of the 10 most common software security design flaws:

1. Earn or give, but never assume, trust.
2. Use an authentication mechanism that cannot be bypassed or tampered with.
3. Authorize after you authenticate.
4. Strictly separate data and control instructions, and never process control instructions received from untrusted sources.
5. Define an approach that ensures all data are explicitly validated.
6. Use cryptography correctly.
7. Identify sensitive data and how it should be handled.
8. Always consider the users.
9. Understand how integrating external components changes your attack surface.
10. Be flexible when considering future changes to objects and actors.
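Principle No. 5 lends itself to a compact sketch (the field names and rules here are illustrative, not from the report): rather than scattering ad hoc checks through the code, all inbound data passes through one explicit, allow-list validation layer.

```python
import re

# One central validation layer: every field has an explicit allow-list
# rule, and anything unrecognized or non-matching is rejected.
VALIDATORS = {
    "username": re.compile(r"[a-z0-9_]{3,32}"),
    "port": re.compile(r"[0-9]{1,5}"),
}

def validate(field: str, value: str) -> str:
    pattern = VALIDATORS.get(field)
    if pattern is None or not pattern.fullmatch(value):
        raise ValueError(f"rejected {field}: {value!r}")
    return value

print(validate("username", "alice_01"))  # alice_01
```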

"It's [an] important point that the vendor community in software security and even OWASP has a very myopic focus on bugs," McGraw says. "That leads some customers to believe that software security is a bug problem" only, but design flaws account for about half of software security issues.

Tom Brennan, global vice president of the OWASP Foundation, says the approach of attacking the source of the problem makes sense, and his organization is on the same page as the others.

"I am glad to see colleagues promoting a proactive risk approach to the core source of the problem and not the symptoms," Brennan says. "IEEE, MITRE, OWASP, ISC2 ASAC, and other associations have been shifting the focus in security from bug hunts and finding bugs to identifying common design flaws to communicate more effectively with the technology risk officers."

In practice, "as a penetration tester, we continue to identify, prioritize, and make recommendations for individual findings," he says. "Providing guidance from individual vulnerabilities to eradication of entire classes of problems elevates the discussion to the board of directors" level.

Daswani says the best time to ensure secure coding is from the get-go, with development and design teams working together. "It's important to include development teams as part of the design. Entire classes of bugs can be knocked out."

Dan Kaminsky, chief scientist at White Ops, calls the design flaw approach interesting. "The hard problem in computer security is operationalization -- how do we take the hard-won lessons learned from years of getting hacked and apply them to real-world systems that really need to be protected?" he says. "The IEEE here is doing something interesting. This guidance is for architects -- not necessarily of the end software, but of the frameworks that everything is built out of. These are security principles for our Legos and, if made properly concrete, will be helpful."

Addressing potential design flaws even as early as during the requirements phase could help.

"It's important to ask questions that identify what all the different ways the software will get used," Daswani says. "The more you can put in design-level criteria and decisions in place, the more you can help mitigate entire classes of bugs before the code is even developed."

A copy of the full report, "Avoiding the Top 10 Software Security Design Flaws," is available for download from the IEEE Center for Secure Design.

Kelly Jackson Higgins is Executive Editor at DarkReading.com. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise ... View Full Bio

Comments
anon9106759839, User Rank: Apprentice
8/27/2014 | 6:51:54 PM
Nah
Big names, but little value can come from this conversation.

Application security problems stem from attacks. MITRE CAPEC describes the underlying model for attacks, while PTES, OSSTMMv4, and OWASP guides such as the Testing Guide and ASVS 2.0 standards cover the open methods.

There are also models (CWE) and methods (OWASP Dev Guide, SAFEcode, Microsoft SDL, etc) for building secure software, but this is where security and appdev activities are split.

On Twitter, someone important today said, "a design flaw is a property of the design that allows an attacker to violate one of your security objectives".
Marilyn Cohodas, User Rank: Strategist
8/28/2014 | 8:16:02 AM
Re: Nah> "little value can come from this conversation"
@anon9106759839 -- Are you saying that there is no significant relationship between security and appdev? Or that the conversation will not lead to a viable solution?
Stratustician, User Rank: Moderator
8/28/2014 | 9:25:10 AM
Re: Nah> "little value can come from this conversation"
I think there is definite value in reminding folks that security is closely tied to application development. While many flaws surface only during a security attack, strong code at the onset -- especially if groups like these industry folks can start to identify where they are seeing code vulnerabilities -- will hopefully lead to better code overall for these applications and reduce the risks. You can't eliminate every potential threat, but at least you've narrowed the attack field by closing known vulnerabilities.
Marilyn Cohodas, User Rank: Strategist
8/28/2014 | 9:34:21 AM
Re: Nah> "little value can come from this conversation"
Well said, @Stratustician. Narrowing down the field of code vulnerabilities is definitely a valuable endeavor.
Robert McDougal, User Rank: Ninja
8/28/2014 | 11:18:44 AM
Re: Nah
I must disagree with your assessment wholeheartedly. I can tell you from direct experience that secure coding practices are not currently taught in our colleges. That leads to developers who don't understand the importance of using stored procedures and prepared statements, which in turn leads to applications with easily preventable vulnerabilities.

Secure coding will not fix all vulnerabilities, but if done correctly it will prevent known vulnerabilities such as SQL injection or XSS from making their way into future applications.
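The prepared-statement pattern the commenter describes can be sketched in a few lines using Python's built-in sqlite3 module (the table and data here are illustrative): the SQL text is fixed control, and the user's value is bound separately as data, so an injection string simply matches nothing.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# A classic injection attempt that would subvert a string-built query.
user_input = "' OR '1'='1"

# Prepared statement: the query text is fixed; the value is bound as
# a parameter, so it is only ever treated as data, never as SQL.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection string matched no name
```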
Kelly Jackson Higgins, User Rank: Strategist
8/28/2014 | 11:35:27 AM
Re: Nah
Good point about the MITRE, OWASP, and other models. What I thought was particularly interesting about the IEEE report was that the recommendations come from real-world design flaws the participants themselves experienced -- Twitter, Google, etc.
GonzSTL, User Rank: Ninja
8/28/2014 | 2:22:29 PM
Secure Software Design
I received my CompSci degree quite a while back, and even then, the practice of input validation and communication compartmentalization was stressed in all my programming classes. My involvement in IT throughout the years encompasses software development, network architecture, server infrastructure, storage architecture, desktop standardization, virtualization, etc., so I can see things from the broad picture as well as from individual areas. In all those IT domains, the vast majority of exploits come from software design security flaws and, secondarily, improper configurations.

What I believe is that there is tremendous pressure to deliver applications and technology, and sometimes that leads to shortcuts or bypassing certain aspects of development. If security considerations are part of the whole development process, and rigidly enforced from inception to delivery, then perhaps we would see a dramatic drop in exploitable software flaws. The question is, why are the shortcuts and bypasses allowed, and who allows them? Improper oversight seems to be the culprit, either due to lack of knowledge or understanding, or faulty risk management in the development process. Simply stated, security considerations should be enforced from beginning to end.
Kelly Jackson Higgins, User Rank: Strategist
8/28/2014 | 3:51:35 PM
Re: Secure Software Design
Great perspective, @GonzSTL. The go-to-market/release pressures are the biggest issue with much of app development, for sure. But you raise another good point about a lack of oversight and enforcement of good secure coding practices.
macker490, User Rank: Ninja
8/29/2014 | 8:10:20 AM
cart and horse
Remember: the OS must protect the apps, rather than the reverse. You are always going to have a bad app someplace, and if it can get an unauthorized update into the OS, you're toast.

You must start with a secure OS and then proceed with the authentication of inputs -- particularly software, but also data, especially anything financial or sensitive in nature.
DarkReadingTim, User Rank: Strategist
8/29/2014 | 9:32:37 AM
Re: Secure Software Design
Great to see our old friend Neil Daswani in Dark Reading again! One of the things that strikes me each year when OWASP posts its Top 10 Vulnerabilities list is how many of the vulnerabilities are old. I mean, *really* old like SQL injection and buffer overflow. I wonder, why do these well-known vulns continue to occur with such great frequency, and isn't there something that could be done at the development level to prevent them?