There is a lesson to be learned from a locksmith living 150 years ago: Attackers and criminals are the only parties who benefit when security researchers fear the consequences for reporting issues.

Katie Moussouris, Founder & CEO, Luta Security

May 12, 2015

The recent example of a software vendor leveraging laws like the Digital Millennium Copyright Act (DMCA) to intimidate a security researcher is counterproductive. The researcher and team at the security consulting firm IOActive took a risk by attempting to report security flaws in a digital lock, and the company that makes the lock didn't exactly welcome the news.

While we don’t know all the details, according to multiple press reports, IOActive tried to contact the vendor privately before public disclosure, and that vendor responded through its lawyers, who mentioned the DMCA. As Chris Soghoian, staff technologist for the ACLU, tweeted about this incident, "Having a lawyer respond to security researchers is like asking your neighbor to turn down the music w/ a gun in your hand. It won't end well."

Sadly, this phenomenon is all too common in the history of security research, and it has a chilling effect on the field. Attackers and criminals are the only parties who benefit when security researchers fear the consequences of reporting issues.

The year 1853 called. They want their disclosure debate back.

A locksmith living over 150 years ago named Alfred Charles Hobbs said it beautifully when discussing whether revealing lock-picking techniques publicly was acceptable: "Rogues are very keen in their profession, and know already much more than we can teach them respecting their several kinds of roguery."

It is ironic that modern lock manufacturers have not learned the lessons of their industrial-age forebears; it shows we haven't sufficiently shifted the norms of vendor behavior in more than a century and a half.

Hackers gonna hack.
When vendors lack a process and the ability to receive, investigate, remediate, and communicate about security vulnerabilities, the first reaction is often to call in the lawyers. However, software bugs are not usually fixed by lawyers, threats, or intimidation. These tactics simply distract all parties from the only route that ensures our collective security: fixing the bug.

Back when I founded Symantec Vulnerability Research, I made t-shirts for the team that said simply: "All software contains bugs."

The maturity of a vendor's product security is measured in part by how it handles vulnerability reports. Those who are unable to deal gracefully with external parties who are trying to warn them of security holes are putting their users, and possibly the Internet as a whole, at risk.

Recently, I worked with the MIT Sloan School of Management and Harvard Kennedy School on research, sponsored by Facebook, using system dynamics modeling of the 0day market. Among other things, the research concluded that defenders should try to increase the rate of finding vulnerabilities through incentives for reporting bugs. Responding to friendly hackers with legal intimidation runs counter to this research and to all recommended best practices.

5 Stages of Vulnerability Response Grief: A Standard Approach
Denial. Anger. Bargaining. These are all emotional reactions to a technical problem. The cure? Acceptance. This short video offers a humorous look at a serious issue. Unfortunately, the phenomenon is still ongoing, and organizations will benefit from quickly understanding that these reactions do nothing to improve their security posture.

As I write this from the 25th-anniversary meeting of the ISO SC27 working group in Malaysia, I am happy to report that we already have standard guidelines in the form of ISO 29147 (Vulnerability disclosure) and ISO 30111 (Vulnerability handling processes). These standards are available to help organizations adopt vulnerability handling, coordination, and public disclosure processes. Will a set of standards end the disclosure debate once and for all? Not entirely, but it is an important first step.

Hackers can help prevent attacks if they can come forward without fear of prosecution. Encourage research, offer proper incentives, and have a safe and transparent way to receive potential security issue reports.

Prosecute crime, not research. The result is a safer Internet for everyone.

About the Author

Katie Moussouris

Founder & CEO, Luta Security

Katie Moussouris is the founder and CEO of Luta Security, a company offering unparalleled expertise to create robust vulnerability coordination programs. Luta Security specializes in governments and multi-party supply chain vulnerability coordination. Moussouris recently testified as an expert on bug bounties & the labor market for security research for the US Senate and has also been called upon for European Parliament hearings on dual-use technology. She was later invited by the US State Department to help renegotiate the Wassenaar Arrangement, during which she successfully helped change the export control language to include technical exemptions for vulnerability disclosure and incident response.

Moussouris is co-author of an economic research paper on the labor market for bugs, published as a book chapter by MIT Press in 2017, and presented on the first system dynamics model of the vulnerability economy and exploit market in 2015, as part of her academic work as a visiting scholar at MIT Sloan School. She is the former chief policy officer for HackerOne.
