He pioneered vulnerability research, but he's no hacker. Chris Wysopal -- a.k.a. "Weld Pond" and formerly of the famed hacker group known as the L0pht -- is now co-founder and CTO of startup Veracode, a new security software testing services provider.
Wysopal, who co-authored the industry's de facto standard on the responsible disclosure of vulnerabilities, has literally written the book on software security testing: The Art of Software Security Testing. He and other former executives of @stake (purchased by Symantec in 2004) recently rolled their software security testing technology into a service offering from Veracode, which they launched this week.
Wysopal spoke to Dark Reading senior editor Kelly Jackson Higgins about the new business, bugs, and his book. (See Security Startups Make Debut.)
DR: What's the origin of the binary-testing technology that Veracode now offers in its new services? And why didn't Symantec adopt it?
CW: We developed it at @stake... The first couple of years it was a skunk-works project and we were figuring out if it could be done. Nobody had done binary analysis that was as accurate as source-code analysis.
Symantec purchased @stake for its consulting business. We came along as part of the company and it didn't seem like there was a good fit. Symantec had divested itself of developer tools and things of that nature. So [we spun] out the technology and it took about a year to do it... Symantec got a small equity stake in the company [Veracode].
DR: Why binary-code testing for more secure software products?
CW: There's no silver bullet. A flight-control system, for instance, has to be very secure. You need to do design reviews of it, source code reviews, and security testing... The fact of the matter is testing costs a lot of money and takes a lot of time. And there is software out there that doesn't need to be secure, like an internal Web application to look up the [company] cafeteria menu.
We're trying to make it easier and more cost-effective for all software to get a security analysis. Binary analysis is cheaper if you are offering it as a service -- you don't need to install software on lots of different desktops and have developers running the tools... We can fit into the development cycle and its different milestones. You don't need extra resources to get a security analysis.
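At its simplest, analyzing a compiled binary means auditing the artifact itself rather than its source. The toy sketch below (not Veracode's actual technique, which models control and data flow in disassembled code) illustrates the basic idea with a naive "strings"-style scan of a byte blob for calls to known-dangerous libc functions; the blob and banned list are made up for illustration:

```python
# Functions historically associated with buffer overflows.
BANNED = [b"strcpy", b"gets", b"sprintf"]

def flag_banned_calls(binary: bytes) -> list:
    # Naive scan for banned symbol names in the raw bytes of a binary.
    # Real binary analysis disassembles the code and tracks data flow,
    # but even this shows the point: no source code is required.
    return [name.decode() for name in BANNED if name in binary]

# A fabricated byte blob standing in for a compiled ELF binary.
fake_binary = b"\x7fELF...\x00strcpy\x00printf\x00gets\x00"
print(flag_banned_calls(fake_binary))  # ['strcpy', 'gets']
```

Because the input is a single compiled artifact, a scan like this can run centrally as a service, which is the deployment advantage Wysopal describes.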
DR: Why The Art of Software Security Testing?
CW: I got the idea about two or three years ago when I was at a large software vendor [client] site, and three or four of us from @stake were doing security testing. We kept interfacing with the QA [quality assurance] people, asking them, "Do you have a testing tool that can do this? We want to take it and modify it." This got them asking, "What are you actually doing, and can we sit here and watch you do this?" We [realized] we should really write all this stuff down so other QA people can learn, too.
We are trying to bridge some of the artistry of penetration testers, who are traditionally self-taught, to a more formalized software development process. I think [security software testing] has really only been formalized in software development groups in the last two or three years... Before that, it was being done by forward-thinking companies that would hire people from @stake and have them come in and be part of their QA process.
We want to bridge the gap between the security world and the QA practitioner.
DR: How much has changed in the security research world since you first co-wrote the definitive "responsible disclosure" RFC for the industry?
CW: I think it's changed a little bit. Software vendors are more responsive than when we first wrote that document four years ago. There's always been a group of people that doesn't necessarily want to work with the vendor and give them free security research. They think vendors are not going to take security seriously and build their products more securely [without public disclosures]. Some of those ideas may have been true several years ago, but I think most vendors understand that security vulnerabilities in their software are not good for sales: If you Google a software name, the first five hits usually include vulnerabilities.
The problem I have with the Month of Bugs is that it gives vendors no time to respond to the issues. It's a good thing to raise awareness of a particular class of problems and vulnerabilities in a certain area. But cramming it into a month, where there's no way for a vendor to respond in that timeframe, is not necessarily a good thing.
DR: Do you consider yourself a reformed hacker?
CW: I see myself mostly as a software developer.
My QA, software, and vulnerability research background has helped me understand the big picture, and having a diversified view of security is important. I think that actually sitting down and doing vulnerability research is important to understanding how to secure software.
My first foray into writing security tools was software called L0phtCrack that became a product at @stake, and then at Symantec.
DR: What scares you most about security today?
CW: We're just starting to scratch the surface of some of the vulnerabilities in the whole Web application space -- Cross Site Request Forgery (CSRF) and Ajax-style development. I don't think we understand the ramifications of a very powerful, rich client/service model where the code is sort of running all over the place.
We're definitely very interested in coming up with solutions for these dynamic mashup applications. We are coming up with testing techniques to find vulnerabilities [here].
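The CSRF risk Wysopal mentions arises because a browser attaches a victim's session cookies to any request, including one triggered from an attacker's page. A standard mitigation (the synchronizer-token pattern, shown here as a minimal sketch under assumed names -- this is not Veracode's tooling) is to require a per-session token the attacker cannot know:

```python
import hashlib
import hmac
import secrets

def make_csrf_token(session_id: str, secret_key: bytes) -> str:
    # Derive a per-session token by HMACing the session ID with a
    # server-side secret; the token is embedded in legitimate forms.
    return hmac.new(secret_key, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid_csrf_token(session_id: str, secret_key: bytes, submitted: str) -> bool:
    # A cross-site forged request carries the session cookie but not
    # the token, so it fails this check. compare_digest is constant-time.
    expected = make_csrf_token(session_id, secret_key)
    return hmac.compare_digest(expected, submitted)

secret = secrets.token_bytes(32)
token = make_csrf_token("session-abc", secret)
print(is_valid_csrf_token("session-abc", secret, token))          # True
print(is_valid_csrf_token("session-abc", secret, "forged-token")) # False
```

Testing for the absence of such a check is exactly the kind of Web-application technique the book and Veracode's services aim to formalize.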
DR: So what else should we expect to see from Veracode?
CW: We didn't just take a binary static analyzer and bolt a Web front-end onto it. We built a rich software assurance platform where we could plug in multiple types of engines and take in threat feeds and other types of vulnerability information. You can look for us to announce other services on top of that platform.
Kelly Jackson Higgins, Senior Editor, Dark Reading