A closer look at the debate surrounding this market

Dark Reading Staff, Dark Reading

November 20, 2012

Over the past 15 years of cybersecurity discussion, you can hardly have missed the biannual flare-ups concerning vulnerability disclosure. Whether it's due to the slow fuse of a software vendor silently patching a legacy vulnerability, or the lightning strike of a zero-day being dropped at a security conference, these brushfires rage furiously for a short period until the tinder and the media are exhausted ... until the next time.

More recently, these flare-ups have come to envelop the exploit development business, and there's a tremendous amount of confusion and (dare I say it) misinformation being thrown into the mix. The passion and vigor with which a small number of very vocal players are driving their agendas are obscuring many of the facts of the business.

The commercial exploit development business goes by many names -- depending on which agenda you're seeking to push and, frankly, who your customers are likely to be. For example, I have traditionally used the term "weaponization" of vulnerabilities, vendors of protection products often use the term "proof of concept," while those employed in the production of exploit material simply refer to it as "product."

Most security professionals greatly underestimate the amount of effort and time that goes into the creation of stable, reliable exploits. There's also a misconception that all vulnerabilities can be turned into exploits. Oh, that's so far from the truth as to be laughable. It may take a few hours for an automated fuzzer to uncover a batch of bugs within a particular software package, and it may take a few days for a bug hunter to personally crawl through those discovered bugs and determine which are, in fact, security vulnerabilities, but it will likely take many weeks of skillful effort and determination to turn a handful of those vulnerabilities into something that could be used to gain control of or manipulate the software package from a "hacker" perspective. The process of turning a vulnerability into an exploit is therefore an expensive proposition.
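To put that first, automated step in perspective, the sketch below shows roughly what a dumb mutation fuzzer looks like (the target binary, seed file, and signal-based crash detection here are illustrative placeholders, not any particular tool): a few dozen lines of script can churn out crashing inputs overnight, while everything downstream -- triaging those crashes and coaxing one of them into a stable, reliable exploit -- is where the weeks of skilled manual work go.

```python
import os
import random
import subprocess
import tempfile

SEED_FILE = "sample.png"   # placeholder seed input
TARGET = "./parse_image"   # placeholder target binary
ITERATIONS = 1000

def mutate(data: bytes) -> bytes:
    """Flip a handful of random bytes in the seed input."""
    buf = bytearray(data)
    for _ in range(random.randint(1, 8)):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def main() -> None:
    seed = open(SEED_FILE, "rb").read()
    crashes = 0
    for _ in range(ITERATIONS):
        # Write a mutated copy of the seed to a temp file and feed it to the target.
        with tempfile.NamedTemporaryFile(delete=False) as tmp:
            tmp.write(mutate(seed))
            path = tmp.name
        try:
            proc = subprocess.run([TARGET, path], capture_output=True, timeout=5)
            # On POSIX, a negative return code means the process died on a
            # signal (e.g. SIGSEGV) -- a candidate bug worth triaging by hand.
            if proc.returncode < 0:
                crashes += 1
                os.rename(path, f"crash_{crashes:04d}.bin")
                continue
        except subprocess.TimeoutExpired:
            pass  # hangs are worth a second look, too
        os.unlink(path)
    print(f"{ITERATIONS} iterations, {crashes} crashing inputs to triage")

if __name__ == "__main__":
    main()
```

Real harnesses add coverage feedback, crash deduplication, and sandboxing, but even then the output is a pile of crashes and suspect bugs -- not an exploit.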

This season's brushfires have centered on small boutique companies that specialize in weaponizing vulnerabilities and happen to sell to government entities. Companies that are willing to craft and sell exploits for vulnerabilities that haven't yet been disclosed to the vulnerable software vendors are, as far as the anti-exploit-sale lobbyists are concerned, the equivalent of magnesium flares thrown into a tinder-dry California forest at the height of summer.

The construction, deployment, and use of exploits are a critical component of modern security practices. Both commercial and freeware penetration testing tools include thousands of exploits -- and are typically employed to categorically prove that a particular vulnerability exists within the environment under test. In other realms -- such as clandestine statecraft -- reliable exploits can be considered specialized skeleton keys for accessing information that could prevent a physical altercation. Regardless, the business of exploit development is a core component of modern cybersecurity doctrine. How someone answers the question of whether exploit development is "ethical," "moral," or legally justifiable is more a reflection of how well they understand the business in its entirety.

The heated debate and subsequent focus on those magnesium-flare companies exemplify a rather poor understanding of how vulnerabilities are discovered and weaponized, and of who's consuming the exploits.

I use the term "boutique" to describe these vendors for a reason. Compared to the mainstay of the exploit development ecosystem, they're offering a niche, high-priced product. These small companies are like the booths at the annual arts and crafts fair that takes over the parking lot outside an Ikea store. Each booth offers an assortment of themed homemade artwork for a high, but not extortionate, price. While some of the artwork is fascinating and interesting, in the grand scheme of things it's a novelty. At the end of the day, if you want a functional set of chairs, you'd probably just go into the store the parking lot belongs to. In that analogy, that store is likely to be a major defense contractor.

For at least the past two years, the major defense contractors (in all the G20 countries) have been establishing or greatly expanding their "cyberwarfare" capabilities for sale to their long-standing government customers. Apart from the public posting of recruitment flyers and the occasional disclosure from a freelance bug hunter who's sold a vulnerability to one of the contractors, there's very little discussion of their participation within the ecosystem. Defense contractors tend to shun media attention -- it's not good for business (unlike the boutique exploiters). But based on the resources they're able to throw at the problem and the scale of recruitment going on, the bulk of the exploit market is being served by them. And rightly so! Pretty much all of the major defense contractors have had strong research arms (and contractual requirements) for information warfare tools since before the Internet even existed.

There's one other area of the debate I wanted to touch on, and that's the role of commercial security consulting practices. The myopic focus on boutique exploit development shops as an unethical scab on the industry (not my words), and the assumption that if these businesses did not exist the whole market would go away, are ignorant of how easily any organization or government can "game" the system.

Having worked as a security consultant for many years on contracts for clients around the world, I can tell you that the most efficient and cheapest way of acquiring zero-day vulnerabilities, and the exploits to accompany them, is simply to go forth and hire a team of reverse engineers and bug hunters. The majority of large security consulting businesses have a stable of highly skilled and trained reverse engineers and, given an appropriately scoped statement of work, can deliver exploits for any software package of choice. In fact, this is increasingly becoming a standard line of business for high-end consulting practices. It's not even nefarious.

Let me give you a (not entirely made up) example. A large national telecommunications provider in an Arab state had to choose among three vendors' software products as part of its solution evaluation process. One component of the evaluation was, "Does the software introduce new vulnerabilities to the system?" To determine that, a team of four reverse engineers was contracted to spend three months on-site at the regional facility, identify all the bugs they could within each of the software packages, and develop "proof of concept" code to show whether each vulnerability was remotely exploitable. The product with the least severe vulnerabilities would then win that category of the evaluation.

For the consultants, this is an increasingly standard contract. The client wants to diligently evaluate any new products from a security context prior to purchase and deployment, and they're paying top dollar for any findings. At the end of the engagement, the vulnerabilities, exploits, and report all belong (exclusively) to the client. More often than not, the contract expressly forbids the consultants and their company from disclosing these new findings to the original software vendors. I suspect that the volume of "zero-day" exploits generated around the world through these kinds of contracts exceeds the combined output of defense contractors and "boutique" weaponization companies by quite a margin.

As for gaming the system... the fact that the three software packages were the same ones used by one of the client's competitors in another country is irrelevant (and likely unknown to the consultants). What the client does with that information afterward is entirely its business.

So the next time someone is fanning the latest brushfire and proclaiming that weaponized exploit development is evil and that purveyors of such exploits should be regulated or licensed as arms dealers, try to look beyond the marketing hype of a single boutique shop and its pretty driftwood chairs.

Gunter Ollmann is CTO of Damballa.
