The industry rewards cutting corners rather than making software safe. Case in point: the Equifax breach.

Dimitri Stiliadis, CEO of Aporeto

September 25, 2017


There is plenty of blame to go around after the Equifax incident, and I'm not trying to be an apologist for the credit rating company. The problem is that the wrong incentives are driving software development. Unless we change the incentives, security will continue to be a problem. The question remains: what can we do to avoid the "next Equifax"?

The Economics of Software
Let's consider the situation from the perspective of a software organization or a developer. When was the last time that a developer got a bonus or a promotion for taking longer to complete a project because he or she wanted to improve security? When was the last time that a product manager got rewarded for stopping a software release because of a software vulnerability or because of lack of proper security reviews? When was the last time that a software vendor took responsibility for bad code rather than blaming the end users? When was the last time that a venture capitalist upped an investment's valuation because of the company's security processes?

If software were a car, we would knowingly ship it with faulty seatbelts or airbags, hope there wouldn't be an accident, and make the driver sign an end-user agreement releasing us from all liability.

Fast feature delivery is the core incentive in software design. Our mantra is "prototype fast, fail fast." The subtext is "cut corners to test business models faster." The practice is to worry about security once the product is mature and has customers. In reality, that moment rarely comes: as a product becomes more successful, other customer issues and business priorities eclipse security concerns.

The Equifax Vulnerability
Take, for example, the now infamous Apache Struts vulnerability (CVE-2017-5638), which lets an attacker plant a crafted payload in the Content-Type HTTP header and achieve remote execution of arbitrary code.

When one looks carefully at the code, it is evident that a parser didn't follow the formal specification. Section 14.17 of IETF RFC 2616 precisely defines the language and format allowed in the Content-Type field of an HTTP header: its value must be a well-formed media type, whose grammar is defined in Section 3.7 of the same RFC.
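To make the gap concrete, here is a minimal sketch, in Java, of what strict validation against that grammar could look like. The class name and the simplified treatment of parameters and quoted strings are my own illustrative assumptions, not the actual Struts or Equifax code; a production parser would implement the full grammar, including charset rules.

    import java.util.regex.Pattern;

    // A minimal sketch of strict Content-Type validation against the
    // media-type grammar of RFC 2616: type "/" subtype *( ";" parameter ).
    // Names and simplifications here are illustrative, not Struts code.
    public final class ContentTypeValidator {

        // token = one or more characters that are not CTLs or separators
        // (RFC 2616, sec. 2.2); note that '{' and '}' are separators.
        private static final String TOKEN = "[!#$%&'*+\\-.^_`|~0-9A-Za-z]+";

        // quoted-string: double-quoted text with backslash escape pairs
        private static final String QUOTED = "\"(?:[^\"\\\\]|\\\\.)*\"";

        // media-type = type "/" subtype *( ";" parameter )
        private static final Pattern MEDIA_TYPE = Pattern.compile(
                TOKEN + "/" + TOKEN
                + "(?:\\s*;\\s*" + TOKEN + "=(?:" + TOKEN + "|" + QUOTED + "))*");

        private ContentTypeValidator() {}

        // Accept a header value only if it matches the grammar exactly.
        public static boolean isValid(String headerValue) {
            return headerValue != null
                    && MEDIA_TYPE.matcher(headerValue.trim()).matches();
        }
    }

Run against the exploit traffic, a check this strict fails closed: the OGNL payloads sent against Struts begin with "%{", and '{' is a separator that can never appear in a token, so the header is rejected before any framework code tries to interpret it.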

Could we have designed the parser the right way? Could we have predicted all malformed content in this field and avoided the debacle? Could it have been tested ahead of time?

Applying rigorous engineering to the problem would require a formal, mathematically correct parser implementing the exact definition of the complete standard, plus fuzzing in unit tests to flush out violations. We know how to do that, but the standard runs to many pages, and implementing it rigorously takes several days of work that produce no "new feature." In other words, the business sees no value in the activity. As a result, software developers have neither the time nor the incentive for such rigor.
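As a sketch of what that fuzzing could look like, the test below drives the hypothetical ContentTypeValidator from the earlier sketch with fixed known-good and known-bad cases plus a loop of random byte strings; a real harness would use a coverage-guided fuzzer such as Jazzer or AFL rather than a seeded java.util.Random.

    import java.nio.charset.StandardCharsets;
    import java.util.Random;

    // A sketch of fuzz-style testing for the validator above. The fixed
    // cases pin down the invariants we care about; the random loop checks
    // that no input can crash the parser.
    public class ContentTypeFuzzTest {

        static void check(boolean cond, String msg) {
            if (!cond) throw new AssertionError(msg);
        }

        public static void main(String[] args) {
            // Well-formed media types must always be accepted.
            check(ContentTypeValidator.isValid("application/json"), "rejects valid type");
            check(ContentTypeValidator.isValid("text/html; charset=utf-8"), "rejects valid params");

            // The Struts-style OGNL prefix must always be rejected:
            // '{' is a separator, never a token character.
            check(!ContentTypeValidator.isValid("%{(#_='multipart/form-data')}"), "accepts exploit");

            // Random inputs: the validator must never throw, and anything
            // it accepts must contain the mandatory type/subtype slash.
            Random rng = new Random(42); // fixed seed keeps the run reproducible
            for (int i = 0; i < 100_000; i++) {
                byte[] buf = new byte[rng.nextInt(64)];
                rng.nextBytes(buf);
                String candidate = new String(buf, StandardCharsets.ISO_8859_1);
                if (ContentTypeValidator.isValid(candidate)) {
                    check(candidate.contains("/"), "accepted a value with no media type");
                }
            }
            System.out.println("fuzz run completed");
        }
    }

Note that nothing here ships a feature, which is exactly why, under today's incentives, such tests rarely get written.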

Bending Standards, Breaking Security
I am speculating, but it appears that several web application firewall (WAF) vendors shipped parsers that followed the RFC to the letter; in several incident responses, their firewalls blocked this attack immediately. I would not be surprised, though, if some of them had earlier been forced to disable that check or remove other security precautions because an application violated some part of the standard, say, with a custom media type that enabled an application feature. Even library and framework developers often decline to enforce standards in full because some user demands the "customization flexibility" to deliver faster.

Bending standards or cutting corners to achieve fast software delivery is commonplace. Businesses frequently ask security engineers to remove controls because they "break" the application. Feature delivery takes precedence over security posture because feature delivery generates revenue.

Economics Is Killing the "Engineering" in Software Engineering
The behavioral and economic models of software operations reward fast delivery rather than quality and security. Security does not add to the top line. Software engineering rigor is often considered an impediment because it would fundamentally change the profitability dynamics of the software industry. This is the underlying cause of most security vulnerabilities.

But there is hope. The fact that Equifax lost 35% of its market cap in five days, destroying several billion dollars of wealth in the process, could be the trigger to change this equation. Security expert Bruce Schneier, for one, argues for government intervention.

If economic or regulatory incentives reward strict engineering rigor in software design, we will address a significant fraction of our accelerating security breaches. Until then, we will all continue to cut corners to pay the bills, or risk getting a bad credit score from Equifax.


About the Author

Dimitri Stiliadis

CEO of Aporeto

Dimitri Stiliadis is the CEO and co-founder of Aporeto, where he leads the technology and company vision. Prior to Aporeto, he was the co-founder and CTO of Nuage Networks and CTO of the Non-Stop Laptop Guardian at Alcatel-Lucent. Before that, he held several leading roles in Bell Labs Research, where he led a series of research programs with fundamental contributions in networking, security, algorithms, and distributed systems, and was instrumental in the commercialization of these technologies. Dimitri received a PhD and an MSc in computer engineering from the University of California, Santa Cruz.

