8/24/2015
03:15 PM

What Drives A Developer To Use Security Tools -- Or Not

National Science Foundation (NSF)-funded research by Microsoft Research, NC State, and UNC-Charlotte sheds light on what really makes a software developer scan his or her code for security bugs.

Software developers are most likely to run security vulnerability scans of their code if their peers are doing so.

The infamous cultural gap between software developers and cybersecurity may be just as much about mindset and psychology as about technology when it comes to the use of security tools, a team of computer science and psychology researchers from Microsoft Research, North Carolina State University (NC State), and the University of North Carolina-Charlotte has found.

"The power of seeing someone you know using the tool was the most substantial way it predicted their likelihood of using" it, says Emerson Murphy-Hill, the lead researcher for the National Science Foundation (NSF)-funded project, and an associate professor of computer science at NC State University. That's because seeing a tool in a real-world setting--especially in use by someone you work with--is a more effective "testimonial" than an outside recommendation, he says.

Another major factor in security tool adoption by developers is corporate culture, where managers encourage developers to employ security tools, according to the research. "Developers are fairly independent people. I figured management wouldn't have that big of a role to play" in determining their use of the tools, Murphy-Hill says. "[But] What management says made a difference: developers actually paid attention to whether management encourages the use of tools."

And interestingly, developers who said they worked on products in which security was important were not much more likely to use security tools than other programmers, the researchers found in a survey of developers from 14 companies and five mailing lists. They came up with nearly 40 predictors of security tool use, which they detailed in an academic paper that they will present next week at the Symposium on the Foundations of Software Engineering in Bergamo, Italy. They also will present two other related papers: "Questions Developers Ask While Diagnosing Potential Security Vulnerabilities with Static Analysis" and "Bespoke Tools: Adapted to the Concepts Developers Know."

The second study (and paper) looked at whether security tools provide the information developers really need to determine if there's a legitimate problem in the code and if so, how to fix it. They armed 10 developers--novice and experienced ones--with an open-source static-analysis security tool called Find Security Bugs to scan for bugs in a vuln-ridden open source software program.
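The paper doesn't reproduce the test program here, but the kind of warning such a static-analysis tool raises is easy to illustrate. Below is a minimal, hypothetical Java sketch (the class and method names are invented for illustration, not taken from the study) of the classic pattern these tools flag: untrusted input concatenated straight into a SQL query.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class OrderLookup {
    private final Connection connection;

    public OrderLookup(Connection connection) {
        this.connection = connection;
    }

    // A static-analysis tool such as Find Security Bugs would flag this method:
    // the customerId parameter (tainted if it comes from, say, an HTTP request)
    // is concatenated directly into the SQL string -- a classic injection sink.
    public ResultSet findOrders(String customerId) throws Exception {
        Statement stmt = connection.createStatement();
        return stmt.executeQuery(
                "SELECT * FROM orders WHERE customer_id = '" + customerId + "'");
    }
}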

The programmers found that the tool offered multiple resolution options, but not sufficient contextual information about the pros and cons of each fix. "We found that this made it difficult for programmers to select the best course of action," Murphy-Hill says. The tool also failed to connect the dots between notifications that were related to the same problem, for instance, which caused more confusion.
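To see why the missing context matters, consider the hypothetical finding sketched above. A tool will often list several remediations without weighing them; the sketch below shows two common candidates and, in comments, the kind of trade-off information the developers in the study said they lacked. Again, this is an illustrative example, not code from the research.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderLookupFixes {
    private final Connection connection;

    public OrderLookupFixes(Connection connection) {
        this.connection = connection;
    }

    // Option 1: a parameterized query. Generally the preferred fix, but a tool
    // that simply lists it alongside other options gives no hint of that.
    public ResultSet findOrdersParameterized(String customerId) throws Exception {
        PreparedStatement stmt = connection.prepareStatement(
                "SELECT * FROM orders WHERE customer_id = ?");
        stmt.setString(1, customerId);
        return stmt.executeQuery();
    }

    // Option 2: validate the input before building the query. This silences the
    // warning only if the validation is strict enough, and it is easy to get
    // wrong -- exactly the pros-and-cons context a bare warning rarely supplies.
    public ResultSet findOrdersValidated(String customerId) throws Exception {
        if (!customerId.matches("[0-9]+")) {
            throw new IllegalArgumentException("customer id must be numeric");
        }
        PreparedStatement stmt = connection.prepareStatement(
                "SELECT * FROM orders WHERE customer_id = " + customerId);
        return stmt.executeQuery();
    }
}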

"A lot of research has been about the technical stuff, how we can add more power, do more to make it more sophisticated, how to find more and more bugs," Murphy-Hill says. "But there's the other side of it, too: some [developer] has to deal with the output of those tools. What's making the difference for them? How do they make a choice of whether to do something with the vulnerabilities, ignore the tool, or spend more time on code … Where do they spend their time and how do we make these tools better."

He acknowledges that a more polished commercial tool might have resulted in different experiences for the developers in the research, but the study wasn't focused on any individual tool. "When you're looking at a vulnerability with tainted data potential, you have to figure out where the data came from," he says.
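That chase back to the data's origin is the work the tool leaves to the developer. Here is a hypothetical sketch of the kind of path a programmer ends up reconstructing by hand, continuing the earlier example (the servlet controller and its method names are invented for illustration):

import javax.servlet.http.HttpServletRequest;

public class ReportController {
    private final OrderLookup lookup;

    public ReportController(OrderLookup lookup) {
        this.lookup = lookup;
    }

    // Source: the value originates as an HTTP request parameter...
    public void handle(HttpServletRequest request) throws Exception {
        String customerId = request.getParameter("customer");
        render(customerId);
    }

    // ...passes through an intermediate method that gives no hint of taint...
    private void render(String customerId) throws Exception {
        // ...and finally reaches the sink the tool flagged (findOrders above).
        lookup.findOrders(customerId);
    }
}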

Plus, one security tool may be better for one user than another, he says.

That's where the so-called "bespoke" security tools come in for developers: "Tools that consider the programmer as a person are more valuable," he says. The researchers are developing prototype tools that automatically learn and adapt to the programmer's expertise and interest, and they hope that will inspire tool vendors. 
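The researchers describe those prototypes only at a high level, but the idea can be sketched as a warning presenter that adapts what it shows to what it has learned about the individual developer. The code below is purely a hypothetical illustration of that concept, not the researchers' implementation.

import java.util.HashMap;
import java.util.Map;

public class AdaptiveWarningPresenter {
    // A rough per-developer count of how many findings in each warning category
    // have already been resolved; a real prototype would learn richer signals.
    private final Map<String, Integer> resolvedByCategory = new HashMap<>();

    public void recordResolved(String category) {
        resolvedByCategory.merge(category, 1, Integer::sum);
    }

    // Developers who have already fixed several findings in a category get a terse
    // reminder; developers new to the category get the full explanation as well.
    public String present(String category, String shortMessage, String fullExplanation) {
        int seen = resolvedByCategory.getOrDefault(category, 0);
        return seen >= 3 ? shortMessage : shortMessage + "\n" + fullExplanation;
    }
}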

Kelly Jackson Higgins is Executive Editor at DarkReading.com. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise ...

Comments
Erik Klein,
User Rank: Apprentice
9/24/2015 | 12:58:29 PM
What if the developer didn't need to USE a tool?
This article reports on the factors that influence a software developer to proactively stop the process of developing software, perform a context-switch, and execute a security tool ... steps that do not contribute to the functional deliverables of the SDLC.

Of course there would be pushback ... unless the developer is compensated for secure code.

But what if accurate application security vulnerabilities could be identified before code check-in without any of the steps mentioned above?  What if the application security vulnerabilities were identified simply from the FUNCTIONAL development and usage of the system?

As a former developer (and current AppSec tooling guy), I am always looking for ways to invisibly inject security into the SDLC ... ways that do NOT require a new line item in a project plan, an extra step in the coding / development process, or a self-imposed "wait state" in order to get application security results ... and, ideally, to have application security vulnerabilities identified continuously and in real-time as an invisible and natural by-product of the process of building and testing software in an SDLC without regard for "Security Testing".

I have found that passive IAST products are capable of achieving this goal, not only enabling developers to identify and fix their vulnerabilities before the code leaves their desktop, but actually proactively reaching out to them to show the exact line of code that is vulnerable ... ALL WITHOUT A SCAN or extra step ... all in real-time from performing the very act that all developers do before checking in code ... FUNCTIONAL sanity/smoke testing.

Contrast Security provides such a solution.
Dr.T,
User Rank: Ninja
8/26/2015 | 4:07:40 PM
Re: Monkey see Monkey do
A comfortable medium is somewhere we keep the balance of CIA: Confidentiality, Integrity, Availability. If you push one side more than the others, it eventually causes other problems. Keep in mind that there will never be a "secure" system.
Kelly Jackson Higgins,
User Rank: Strategist
8/26/2015 | 4:07:25 PM
Re: Security makes it complex
Wow, @Dr. T. That's a shame. That says a lot about the problem. No one expects a dev to write perfect code--not possible--but if they had more support in writing more secure and better code, maybe they would find it challenging yet realistic.
Dr.T,
User Rank: Ninja
8/26/2015 | 4:05:12 PM
Re: Monkey see Monkey do
From a psychological perspective, security may not even really matter; we are all concerned about privacy in most cases.
Dr.T,
User Rank: Ninja
8/26/2015 | 4:03:16 PM
Re: User-Experience
Agree. User experience may be an important factor in why some of our code is not as secure as it could be.
Dr.T,
User Rank: Ninja
8/26/2015 | 4:01:57 PM
Re: Monkey see Monkey do
Monkey tests can really catch security vulnerabilities that a standard set of test action items would miss.
Dr.T,
User Rank: Ninja
8/26/2015 | 3:59:46 PM
Security makes it complex

I know some of my developer friends stopped doing development once they started thinking that it is getting harder and harder to write code in a secure way, so they just gave up. :--))
RyanSepe,
User Rank: Ninja
8/25/2015 | 11:47:22 AM
Re: Monkey see Monkey do
Very interesting. Looking at it from a psychological perspective, it would seem that developers are hardcoded for functionality while security folks are hardwired toward safeguards. How do we reach a comfortable medium?
Kelly Jackson Higgins,
User Rank: Strategist
8/25/2015 | 9:12:23 AM
Re: Monkey see Monkey do
I thought it was interesting to look at the issue from a psychological perspective. That may well be a key element in bridging the gap between the dev and security worlds.
RyanSepe,
User Rank: Ninja
8/25/2015 | 9:08:00 AM
User-Experience
Also, user experience is another event that needs to be understood here. If the developer has a good experience with the tool they are more likely to use it in the future than if they had a bad experience. This principle is universal.