Application Security
8/27/2014
04:35 PM

10 Common Software Security Design Flaws

Google, Twitter, and others identify the most common software design mistakes -- compiled from their own organizations -- that lead to security woes and how to avoid them.

It's not all about the security bugs: Mistakes in how a software application's security is designed can lead to major breaches like that suffered by the mega-retailer Target.

Security experts from Cigital, Google, Twitter, HP, McAfee, EMC, RSA, Harvard University, George Washington University, Athens University of Economics and Business, the Sadosky Foundation, and the University of Washington, working as part of the IEEE Center for Secure Design, published a report today that pinpoints 10 of the most common software security design flaws they collectively found in their own development programs.

"When you can solve a problem at the [software] design phase, it automatically solves a bunch of problems later on in the stages," says Neil Daswani, who is with Twitter's security engineering team. "It's very cost-effective to solve security at the design stage."

The organizations came up with a top 10 list during a workshop session this spring, where each brought along examples of design flaws it had experienced. "What we did as a group of companies is dump out a list" based on the overlap in all the design issues brought to the table, Daswani says.

To date, the security industry has mostly been laser-focused on finding and eradicating security vulnerabilities, or bugs. There are plenty of lists available, such as the OWASP Top 10, that catalog the most common software bugs found in development. But design flaws -- such as using encryption incorrectly or failing to validate data properly -- can also be exploited by attackers or lead to security bugs. These flaws can be less noticeable on the surface but just as deadly if abused by an attacker.

"Getting software designers and architects what they need to think about when building software is just as important as getting developers to think about bugs," says Gary McGraw, CTO at Cigital and a member of the team behind the new "Avoiding the Top 10 Software Security Design Flaws" report. "Unfortunately, not much attention is paid to that."

With cross-site scripting (XSS) vulnerabilities, for example, a simple design change can wipe out the possibility of those bugs in an application, he says. "You can make a change to the design of the API" of an application that could eliminate an entire category of bugs.

According to McGraw, Target's data breach was a real-world example of a design flaw leading to a hack. The environment was "crunchy on the outside and chewy in the middle." As a result, it was "easy to get to the middle where all the data was stored" once the attackers had compromised the point-of-sale system. "The design of the communications and storage… were [poorly] done," he says. "Often a bug provides a toehold into a system to be exploited because of bad" security design.

Twitter already is implementing its own software security design flaw prevention program based on the report. Daswani says an internal Twitter document specifically recommends how to design its software securely. "For example, we recommend the use of a certain number of templating frameworks -- the developers choose one -- so it's more likely that their code won't be vulnerable to cross-site scripting" and other bugs.

One reason XSS is so prevalent in software today is that, when you build a web application, it's easy to design it with inherent flaws at the user interface. "If you're not careful about what data you output to the user interface, sometimes the browser at the other end can get confused and think the data might be code that has to be executed," he says. "By recommending a templating language, it makes a clear delineation on what is considered code and what is to be considered data. That sure makes it harder to have a bug like cross-site scripting."
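The delineation Daswani describes can be sketched in plain Python. This is a minimal illustration using the standard library's html.escape, not any specific templating framework Twitter recommends internally; real templating languages apply this kind of escaping automatically at every output point.

```python
import html

def render_naive(comment: str) -> str:
    # Untrusted data is spliced straight into the markup: the browser
    # may interpret attacker-supplied text as executable script.
    return "<p>" + comment + "</p>"

def render_escaped(comment: str) -> str:
    # Escaping marks the boundary: everything in `comment` is treated
    # as data, never as code, so the script tag arrives inert.
    return "<p>" + html.escape(comment) + "</p>"

payload = "<script>alert('xss')</script>"
print(render_naive(payload))    # script tag survives intact
print(render_escaped(payload))  # &lt;script&gt;... rendered as text
```

The point of a templating language is that developers never have to remember to call the escaping function themselves; the framework makes the safe path the default.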

The report recommends how to prevent each of the 10 most common software security design flaws:

1. Earn or give, but never assume, trust.
2. Use an authentication mechanism that cannot be bypassed or tampered with.
3. Authorize after you authenticate.
4. Strictly separate data and control instructions, and never process control instructions received from untrusted sources.
5. Define an approach that ensures all data are explicitly validated.
6. Use cryptography correctly.
7. Identify sensitive data and how it should be handled.
8. Always consider the users.
9. Understand how integrating external components changes your attack surface.
10. Be flexible when considering future changes to objects and actors.
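Principle 4 (strictly separating data from control instructions) is the design rule whose violation produces classic injection bugs. A minimal sketch, using Python's sqlite3 with a hypothetical table and input (not taken from the report):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Flawed design: untrusted data is concatenated into the control
# channel (the SQL text), so the input can rewrite the query itself.
unsafe = "SELECT role FROM users WHERE name = '" + user_input + "'"
rows_unsafe = conn.execute(unsafe).fetchall()

# Principle 4 applied: the query text is fixed; the input travels as a
# bound parameter and can only ever be treated as data.
rows_safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(rows_unsafe)  # [('admin',)] -- injection succeeded
print(rows_safe)    # []           -- input matched nothing
```

The fix is structural, not a filter: because the parameterized API keeps data and control in separate channels, the entire bug class disappears by design rather than one exploit at a time.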

"It's [an] important point that the vendor community in software security and even OWASP has a very myopic focus on bugs," McGraw says. "That leads some customers to believe that software security is a bug problem" only, but design flaws account for about half of software security issues.

Tom Brennan, global vice president of the OWASP Foundation, says the approach of attacking the source of the problem makes sense, and his organization is on the same page as the others.

"I am glad to see colleagues promoting a proactive risk approach to the core source of the problem and not the symptoms," Brennan says. "IEEE, MITRE, OWASP, ISC2 ASAC, and other associations have been shifting the focus in security from bug hunts and finding bugs to identifying common design flaws to communicate more effectively with the technology risk officers."

In practice, "as a penetration tester, we continue to identify, prioritize, and make recommendations for individual findings," he says. "Providing guidance from individual vulnerabilities to eradication of entire classes of problems elevates the discussion to the board of directors" level.

Daswani says the best time to ensure secure coding is from the get-go, with development and design teams working together. "It's important to include development teams as part of the design. Entire classes of bugs can be knocked out."

Dan Kaminsky, chief scientist at WhiteOps, calls the design flaw approach interesting. "The hard problem in computer security is operationalization -- how do we take the hard-won lessons learned from years of getting hacked and apply them to real-world systems that really need to be protected?" he says. "The IEEE here is doing something interesting. This guidance is for architects -- not necessarily of the end software, but of the frameworks that everything is built out of. These are security principles for our Legos and, if made properly concrete, will be helpful."

Addressing potential design flaws even as early as during the requirements phase could help.

"It's important to ask questions that identify what all the different ways the software will get used," Daswani says. "The more you can put in design-level criteria and decisions in place, the more you can help mitigate entire classes of bugs before the code is even developed."

A copy of the full report is available here for download.

Kelly Jackson Higgins is Executive Editor at DarkReading.com. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise ...

Comments
JasonSachowski,
User Rank: Author
9/3/2014 | 11:35:58 AM
Re: Secure Software Design
@billkarwin, I agree with the answer to your question about how writing secure code transcends much more than just those born in the '90s. For argument's sake, the same statement can be made about older and younger generations who have other collateral factors at play, such as not understanding the technology or perhaps lacking attention to detail.

@RyanSepe and @MarilynCohodas, I think you are right that we need to introduce the generations that follow us to the fundamentals of information/cyber/digital security much earlier than college or university. Looking back at how fast technology has evolved in our lifetimes, one can only imagine what technologies the next generations will bring, which reinforces the fact that we have to educate earlier and make it a part of their everyday lives.

In my experience, software security in the education system today is looked at as somewhat of a security specialization and not a practice that is available in normal software development programs. I will say that it's great to see communities of InfoSec professionals actively involved in providing elementary schools with basic information/cyber/digital security, but after this, it really needs to be continued as part of the daily curriculum.
Marilyn Cohodas,
User Rank: Strategist
9/2/2014 | 10:40:50 AM
Re: Secure Software Design
Good point, Ryan. Security awareness really does have to be baked more deeply into our educational system, doesn't it. And not just at the higher-ed level, where newbie programmers are drilled in the most secure way to design apps. I think security awareness about threats should be started in elementary schools, in the same way children are schooled to avoid putting themselves in harm's way in the physical world.
RyanSepe,
User Rank: Ninja
9/2/2014 | 10:07:18 AM
Re: Secure Software Design
@Marilyn Cohodas, I think this comes down to specialization within education. Until recently there were very few collegiate programs that dealt specifically with cyber security and information security; I have seen more and more pop up in recent years. Information Technology and Computer Science are generalized overviews of the subject matter, basically outlining the different aspects at a lower level. If you were to specialize in an area, you would receive a higher level of understanding and a deeper knowledge base. If students took programs that specialized in app development, I am sure app security would be one of the courses in the core curriculum, not just an individual unit of a single course. This would allow a more in-depth learning process and transfer over into development.
Marilyn Cohodas,
User Rank: Strategist
9/2/2014 | 8:23:17 AM
Re: Secure Software Design
@RyanSepe and @billkarwin, your back-and-forth about the generation gap in secure software development & education is great -- and extremely interesting. But why aren't app designers coming out of computer science programs with a deeper understanding of the importance of building secure applications?   Why isn't that a given?
RyanSepe,
User Rank: Ninja
9/1/2014 | 5:13:04 PM
Re: Secure Software Design
I agree with you that education is the key here. The point I was trying to get across is that just because someone is in the infancy of their career doesn't mean they don't have the theoretical components to write secure code. That is all. I was conveying that with the nature vs. nurture argument. If nature, or experience, is rivaled by nurture, which is the crux of the argument, then logically the inexperienced (I mean this in the sense of application) have the components to create secure code.

Your middle inquiry obviously uses reductio ad absurdum to berate the above statement. However, unfortunately you have heard of this being the case, both with your example and with secure code. Many times it's not until a breach happens that institutions decide a change needs to be made, and some software developers may be unaware of the hole in the first place. Hence new vulnerabilities. Is this due to a security flaw, a new attack strategy, or maybe both? You cannot protect against something you are unaware of. (Heartbleed) It wasn't until this was discovered that the design flaw was even brought to light. This is highly unfortunate, and it is our job as security professionals to try to show the core value of security to other departments in the institution.

I agree with many of the points you make, especially in the way of education being the key. But to say that people who are surrounded by technology and have it ingrained in their daily lives are no different from a test group that doesn't have it and is only provided it later in life is a frivolous and prideful notion.

I would say that security as a newer notion is valid, wherein the people born in this generation, or the one after, will be the ones I speak of. Security needs to be a principle taught from a young age. Only then will people outside security be fully reached.
billkarwin,
User Rank: Apprentice
9/1/2014 | 3:40:29 PM
Re: Secure Software Design
Ryan, if people born in the '90s have such a high proclivity for technology, then why aren't they writing more secure code? I see both young and old individuals unwittingly writing vulnerable code. It has nothing to do with what year they were born, and everything to do with how aware they are of the risks and the consequences.

If they aren't educated to be aware of secure coding practices, then what's the alternative? Wait until they have a personal experience of being responsible for a security disaster because of the poor code they wrote?

Just like becoming a convert to using antivirus software, or becoming a convert to making backups diligently, after losing all one's files.
RyanSepe,
User Rank: Ninja
8/31/2014 | 10:50:34 AM
Re: Secure Software Design
Good analogies. To some extent I do feel that education plays a big role here from the start. (Education is always the key in many situations; the better informed you are, the better decisions can be made.) Younger analysts, too, should be cognizant of security measures.

Similarly, if you were born in the '90s you have a higher proclivity for technology and its aspects because you were born in that time period of rapid growth. (Born into it.) It might be a little biased, and I normally don't like using generalized statements, but statistically speaking that is the case.

If we look at this from a scrum standpoint, using an adaptive method of vulnerability assessment is the most conducive to the environment. Vulnerabilities and attack vectors deviate and adapt, so the ones coding the software need to adapt their focus as well.
billkarwin,
User Rank: Apprentice
8/30/2014 | 2:25:49 PM
Re: Secure Software Design
Tim, your question reminds me of a story my mother experienced. She joined a group of volunteers at the local college to help young people register to vote (in the US this is not automatic; you have to fill out a form when you turn 18). She and her group did this every year. One year, one of the other women complained, "We've been helping to register students to vote for ten years! When are they going to learn to do it themselves?" The rest of the group had to remind her that every year a new crop of students turns 18, and those individuals had naturally never registered before.

This is also the reason that well-known security vulnerabilities continue to be a problem, even decades after the remedies were first understood. The developer community gets a new crop of newbie programmers every year. They have never had to think about secure programming while doing class assignments, and they're even less likely to have done so if they are self-taught.

Yes, there are well-known fixes for old security flaws. At least, they're well-known to us experienced programmers. It's our responsibility to spread the word and educate all developers to program in a secure way by default.
DarkReadingTim,
User Rank: Strategist
8/29/2014 | 9:32:37 AM
Re: Secure Software Design
Great to see our old friend Neil Daswani in Dark Reading again! One of the things that strikes me each year when OWASP posts its Top 10 Vulnerabilities list is how many of the vulnerabilities are old. I mean, *really* old like SQL injection and buffer overflow. I wonder, why do these well-known vulns continue to occur with such great frequency, and isn't there something that could be done at the development level to prevent them?
macker490,
User Rank: Ninja
8/29/2014 | 8:10:20 AM
cart and horse
Remember: the O/S must protect the apps rather than the reverse. You are always going to have a bad app someplace, and if it can get an unauthorized update into the O/S, you're toast.

You must start with a secure O/S and then proceed with the authentication of inputs, particularly software, but also data, particularly anything financial or sensitive in nature.