What Drives A Developer To Use Security Tools -- Or Not
National Science Foundation (NSF)-funded research by Microsoft Research, NC State, and UNC-Charlotte sheds light on what really makes a software developer scan his or her code for security bugs.
August 24, 2015
Software developers are most likely to run security vulnerability scans of their code if their peers are doing so.
The infamous cultural gap between software developers and cybersecurity may be as much about mindset and psychology as about technology when it comes to the use of security tools, a team of computer science and psychology researchers from Microsoft Research, North Carolina State University (NC State), and the University of North Carolina-Charlotte has found.
"The power of seeing someone you know using the tool was the most substantial way it predicted their likelihood of using" it, says Emerson Murphy-Hill, the lead researcher for the National Science Foundation (NSF)-funded project, and an associate professor of computer science at NC State University. That's because seeing a tool in a real-world setting--especially in use by someone you work with--is a more effective "testimonial" than an outside recommendation, he says.
Another major factor in security tool adoption by developers is corporate culture, where managers encourage developers to employ security tools, according to the research. "Developers are fairly independent people. I figured management wouldn't have that big of a role to play" in determining their use of the tools, Murphy-Hill says. "[But] What management says made a difference: developers actually paid attention to whether management encourages the use of tools."
And interestingly, developers who said they worked on products in which security was important were not much more likely to use security tools than other programmers, the researchers found in a survey of developers from 14 companies and five mailing lists. They came up with nearly 40 predictors of security tool use, which they detailed in an academic paper that they will present next week at the Symposium on the Foundations of Software Engineering in Bergamo, Italy. They also will present two other related papers: "Questions Developers Ask While Diagnosing Potential Security Vulnerabilities with Static Analysis" and "Bespoke Tools: Adapted to the Concepts Developers Know."
The second study (and paper) looked at whether security tools provide the information developers really need to determine if there's a legitimate problem in the code and, if so, how to fix it. The researchers armed 10 developers--novice and experienced ones--with an open-source static-analysis security tool called Find Security Bugs to scan for bugs in a vulnerability-ridden open source software program.
The programmers found that the tool offered multiple resolution options, but not sufficient contextual information about the pros and cons of each fix. "We found that this made it difficult for programmers to select the best course of action," Murphy-Hill says. The tool also failed to connect the dots between notifications that were related to the same problem, for instance, which caused more confusion.
"A lot of research has been about the technical stuff, how we can add more power, do more to make it more sophisticated, how to find more and more bugs," Murphy-Hill says. "But there's the other side of it, too: some [developer] has to deal with the output of those tools. What's making the difference for them? How do they make a choice of whether to do something with the vulnerabilities, ignore the tool, or spend more time on code … Where do they spend their time and how do we make these tools better."
He acknowledges that a more polished commercial tool might have resulted in different experiences for the developers in the research, but the study wasn't about focusing on an individual tool. "When you're looking at a vulnerability with tainted data potential, you have to figure out where the data came from," he says.
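To make the tainted-data idea concrete, here is a hypothetical illustration (not code from the study) of the kind of flow a static-analysis tool such as Find Security Bugs flags as a SQL injection risk: user-controlled input concatenated directly into a query string, contrasted with the parameterized form that keeps data out of the SQL text.

```java
public class TaintExample {
    // Tainted flow: user input is concatenated into a SQL string.
    // A static-analysis tool would flag this as a potential SQL injection.
    static String vulnerableQuery(String userInput) {
        return "SELECT * FROM users WHERE name = '" + userInput + "'";
    }

    // Safer pattern: a parameterized query keeps data separate from the SQL;
    // with JDBC, the value would be bound via PreparedStatement.setString.
    static String parameterizedQuery() {
        return "SELECT * FROM users WHERE name = ?";
    }

    public static void main(String[] args) {
        String attack = "' OR '1'='1";
        // The attacker's payload becomes part of the SQL text itself,
        // changing the query's logic rather than being treated as data.
        System.out.println(vulnerableQuery(attack));
        System.out.println(parameterizedQuery());
    }
}
```

A tool reporting this finding must trace where `userInput` originated, which is exactly the "figure out where the data came from" work Murphy-Hill describes.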
Plus, one security tool may be better for one user than another, he says.
That's where the so-called "bespoke" security tools come in for developers: "Tools that consider the programmer as a person are more valuable," he says. The researchers are developing prototype tools that automatically learn and adapt to the programmer's expertise and interest, and they hope that will inspire tool vendors.