Microsoft's Vista Hacker Speaks: 7 Lessons Learned

Chris Paget served on the "final security review" team that assessed Vista before release. Check out what he learned about software hardening.
If you're a Microsoft conspiracy theorist seeking a smoking gun over Redmond's security practices, prepare to be disappointed. In so many words, that was the message delivered by security expert Chris Paget of Recursion Ventures--job title: chief hacker--who five years ago was part of a handpicked "final security review" team called in to assess Microsoft Vista for security defects on the eve of its release.
Vista's developers had expected their code to be near-perfect. Thanks to the efforts of Paget--a self-professed Unix aficionado--the release of Vista was delayed, as the three-month code review tripled in personnel size and project duration.
Speaking earlier this month at Black Hat, a UBM TechWeb event in Las Vegas, just days after his five-year nondisclosure agreement (NDA) with Microsoft expired, Paget outlined the code review's security lessons for Microsoft, and what other independent software vendors can learn too:
1. Hire Outsiders: Microsoft hired several outside security experts to review the Vista code for bugs, then supported them. "We had full access to everything," said Paget. "We had a really spectacular team working with us at Microsoft, where we got anything we needed--documentation, source code, or a [response from a] developer who wasn't returning our calls because we had asked tough questions." To his knowledge, this was the first time Microsoft had used outside reviewers to assess an operating system code base.
2. Price Bugs: Is it worth hiring a code-review team before shipping a product? To find out, calculate the cost of every pre-production bug found in code. "At the time, Microsoft had come out with an internal study that said it was $250,000 to fix a security bug" in production code, said Paget. That cost included numerous factors, comprising not just software engineering time, but also the cost of the bug to Microsoft's reputation.
3. Commoditize Code Review: Knowing how much it costs to fix a bug in production code gives software developers a business case for ensuring bugs never make it into product releases. Hypothetically speaking, if Microsoft spent $1 million on a code review that unearthed five new bugs, it saved money: at $250,000 per production bug, those five bugs would have cost $1.25 million to fix after release.
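The break-even reasoning above can be sketched as a quick calculation. The dollar figures come from the article; the function name is mine.

```python
# Break-even check for a pre-release code review, using the figures
# cited in the article: $250,000 to fix a security bug in production,
# and a hypothetical $1 million review budget.
COST_PER_PRODUCTION_BUG = 250_000
REVIEW_COST = 1_000_000

def review_pays_off(bugs_found: int) -> bool:
    """True if the bugs caught pre-release would have cost more
    to fix in production than the review itself cost."""
    return bugs_found * COST_PER_PRODUCTION_BUG > REVIEW_COST

print(review_pays_off(5))  # five bugs avoided: $1.25M > $1M -> True
print(review_pays_off(3))  # three bugs: $750K < $1M -> False
```

The same arithmetic also shows why the per-bug estimate matters: it sets the minimum number of findings a review must produce to justify its price.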
4. Build A Bug Bar: As part of its security development lifecycle (SDL), Microsoft maintains a document known as the Bug Bar, specifying how to classify any given bug. This was relevant because the Bug Bar clearly stated that certain types of bugs, if not fixed before Vista was ready to ship, would result in their related feature being struck from Vista. Developers took notice of that clause.
5. Model Threats: Microsoft required developers to "threat model" Vista features, resulting in "some superb documentation," said Paget. "We had folks on the team, all they were doing was sitting and reviewing threat models for months and months and months." By spotting when two features might interact in a bad way, developers could resolve the problem before release.
6. Search For "Bug": The code review team searched for profanity in Vista code base comments and "found surprisingly little." But whenever they discovered a comment along the lines of "bug bug" or "fix me," they paid careful attention.
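A minimal sketch of the kind of comment sweep described above: scan source lines for comment markers like "bug bug" or "fix me" and flag them for review. The marker list and the C-style comment syntax are assumptions for illustration; the sample snippet is invented.

```python
import re

# Regexes for C-style comments and for the "pay attention" markers
# mentioned in the article ("bug bug", "fix me"); both assumptions.
MARKERS = re.compile(r"(bug\s*bug|fix\s*me)", re.IGNORECASE)
COMMENT = re.compile(r"(//.*|/\*.*?\*/)")

def flag_suspect_comments(source: str) -> list[tuple[int, str]]:
    """Return (line_number, comment_text) pairs worth a closer look."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for comment in COMMENT.findall(line):
            if MARKERS.search(comment):
                hits.append((lineno, comment.strip()))
    return hits

sample = """\
int parse(char *p) {
    // FIXME: no bounds check here
    return do_parse(p);  /* bug bug: fails on empty input */
}
"""
for lineno, comment in flag_suspect_comments(sample):
    print(lineno, comment)
```

Even a crude scan like this is cheap to run across an entire code base, which is presumably why the review team treated such markers as a high-signal starting point.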
7. Use Secure Development: ""Microsoft's security process, in my opinion is spectacular," said Paget. "I strongly recommend it to everyone I talk to, Microsoft has really set the gold standard about how to implement SDL for a large software product. Please quote me on that."