Commentary | George Crump | 6/5/2009 11:20 AM

What Is Deduplication And Why Should You Care?

A couple of days ago I was speaking at an event in Dallas and was reminded that sometimes those of us in storage get too wrapped up in, well, storage and that IT professionals have other things to worry about than just storage. I asked the audience how many of them had done anything with deduplication. Only 30% had, although 100% wanted to know more.

With all the news about NetApp and EMC in a bidding war to buy Data Domain, it might make sense for us to pause a moment and explain why these two companies are willing to pay almost $2 billion for the market-leading provider of this technology.

Deduplication, at its simplest level, examines data and compares it to data that is already stored. If that data is identical, instead of storing a second copy, the deduplication technology establishes a link to the original data. It requires significantly less storage space to establish a link than to store the file again.
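To make the idea concrete, here is a minimal sketch of content-addressed deduplication. It assumes fixed-size chunking and SHA-256 fingerprints, and the class and file names are hypothetical; real products typically use variable-size chunking and additional verification, but the principle is the same: identical data is stored once, and everything else is just a reference.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # fingerprint -> chunk bytes (stored once)
        self.files = {}    # file name -> list of fingerprints (the "links")

    def write(self, name, data):
        refs = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in self.chunks:   # net-new data: store it
                self.chunks[fp] = chunk
            refs.append(fp)             # duplicate data: store only a reference
        self.files[name] = refs

    def read(self, name):
        return b"".join(self.chunks[fp] for fp in self.files[name])


store = DedupStore()
store.write("backup_monday.img", b"A" * 8192 + b"B" * 4096)
store.write("backup_tuesday.img", b"A" * 8192 + b"C" * 4096)  # mostly the same data
print(len(store.chunks))   # 3 unique chunks stored, not the 6 that were written
```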

Deduplication first gained traction as a technology to enhance disk backup. Without deduplication, your disk backup had to scale to store multiple full backups and several weeks' worth of incremental backups. Even with the plummeting price of ATA storage, the cost of configuring a disk array to hold even a month's worth of backups was enormous, to say nothing of the power, cooling, and space the array required.

If you do backups, you know that this data, especially in full backups, is highly redundant, and this is where deduplication shines. As a result, backup was the first market in which the technology became a requirement, and companies like Data Domain and Avamar became market leaders. Avamar was snatched up by EMC, but Data Domain made it all the way to becoming a public company.

What really drove Data Domain's success in the backup space is the ability to replicate backup data to another site. This was an often-requested feature when disk-to-disk backup first started to become viable, but given the volume and speed at which backup data is created, standard replication wouldn't work across normal WAN bandwidth. Deduplication gets around this because it stores only changed or net-new data, and only that data needs to be replicated, which is much more WAN friendly.
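Continuing the toy DedupStore sketch above (names are hypothetical, not any vendor's API), a deduplication-aware replicator only has to ship the chunks the remote site has not already seen; the reference lists that reconstruct each backup are cheap to send.

```python
def replicate(local, remote, name):
    """Ship only chunks the remote site lacks, then the (cheap) reference list."""
    refs = local.files[name]
    missing = {fp for fp in refs if fp not in remote.chunks}   # net-new data only
    for fp in missing:                                         # this is all that crosses the WAN
        remote.chunks[fp] = local.chunks[fp]
    remote.files[name] = list(refs)
    return len(missing), len(refs)


remote = DedupStore()
print(replicate(store, remote, "backup_monday.img"))    # (2, 3): first copy ships 2 unique chunks
print(replicate(store, remote, "backup_tuesday.img"))   # (1, 3): only the changed chunk moves
```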

These capabilities in backup alone are not enough to justify a $2 billion investment in deduplication technology. What is driving these companies to pay this kind of money is what deduplication can do for the rest of the storage spectrum: primary storage and archive storage.

For example, if a company armed with deduplication can implement it in primary storage in a way that causes little to no performance impact yet increases storage efficiency 60% to 70%, that could get interesting. Imagine you needed 80 TB of storage but one of your vendors only needed to supply 40 TB because it had this technology; clearly that vendor would have a significant advantage in winning your business.
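The capacity math behind that advantage is simple. Using hypothetical numbers, a back-of-the-envelope sketch (which assumes the savings apply uniformly across the data) translates a space-efficiency figure into raw disk required:

```python
logical_capacity_tb = 80                      # hypothetical capacity the applications consume
for savings in (0.0, 0.5, 0.6, 0.7):
    raw_tb = logical_capacity_tb * (1 - savings)
    print(f"{savings:.0%} space savings -> {raw_tb:.0f} TB of raw disk")
# 0% -> 80 TB, 50% -> 40 TB, 60% -> 32 TB, 70% -> 24 TB
```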

Clearly this technology is not limited to Data Domain; a host of other vendors can provide compression, deduplication, or both at different levels of storage. The publicity generated by this bidding war obviously helps Data Domain, but it also helps many of the other deduplication suppliers.

What all of this should be telling you is that deduplication is important, that how it is used and implemented in the various storage tiers matters, and that this is as good a time as any to begin learning about and implementing the technology.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
