About a year ago, I proposed that IT departments move away from the Windows monoculture. My argument was simple. First, using different vendors for different platforms breaks up your patch cycles; if we continue down the path of "standardization," then every second Tuesday of the month we will be doing QA on our desktops, laptops, PDAs, servers, PBXs, ATMs, and manufacturing equipment all at once. Second, in a monoculture, every worm or virus that hits exposes the entire organization to downtime and outages.
The Windows monoculture reminds me of the dangers of arboreal monoculture. At one time the entire eastern half of the United States was blanketed in American chestnut trees. Visit a building built before 1900 and you see the rich, warm color of chestnut wood everywhere. Thanks to the chestnut blight, first discovered in 1904, practically the entire population of some 4 billion trees was wiped out.
The middle of the last century was a golden age (or should I say a green one?) for Midwestern towns. Towering elm trees lined every street, arching overhead to form arboreal tunnels.
Until Dutch elm disease wiped them out.
The latest scourge of the Midwest is the emerald ash borer, a stupid little beetle from Asia whose larvae burrow under the bark of mature ash trees, completely girdling the trunk and killing the tree within a single season. In the county where I live, there are reported to be over 4 million dead trees already.
I would like to argue that Microsoft should also abandon its strategy of Windows on everything. Imagine the benefits:
- Opportunity for innovation. Microsoft employs some of the smartest people in the world. If individual product teams could design the best platform for each new device, they could create best-of-breed products. Why shackle the game developers with antiquated code, file systems, and memory management? Is Windows really the best platform for an ATM? Or a PBX? A home media controller? By allowing independent development and limiting code reuse across platforms, Microsoft could do some amazing things.
- Fewer emergency patches. The less code is shared across platforms, the fewer emergency patches have to be deployed to everything at once. Remember WMF, the zero-day vulnerability discovered last Christmas? Because the same graphics code runs everywhere, that one forced Microsoft to issue a critical patch even for Vista!
- Better division of labor. Servers are very different from desktops: they do the same thing over and over, so they should be optimized for transaction processing and availability. The overhead of supporting things like Windows Explorer on a server is unwarranted. Just as in the IT shop, where server admins and developers need not be the same people as their desktop counterparts, the platforms themselves need not be the same.
- Better security. Compared to hackers, security researchers, and criminals, Microsoft has effectively infinite resources. So why give attackers the keys to the kingdom every time a new platform is released? Why should the people who brought you CoolWebSearch, MSBlaster, and MyDoom be immediately armed with the tools and techniques to attack the Windows cell phone? Or the Windows ATM? Or the Windows heart monitor?
If security were really the prime motivator at Microsoft, it would abandon its strategy of world domination through Windows standardization and instead embrace variation and innovation.