Commentary | George Crump | 6/5/2009 11:20 AM

What Is Deduplication And Why Should You Care?

A couple of days ago I was speaking at an event in Dallas and was reminded that sometimes those of us in storage get too wrapped up in, well, storage and that IT professionals have other things to worry about than just storage. I asked the audience how many of them had done anything with deduplication. Only 30% had, although 100% wanted to know more.

With NetApp and EMC in a bidding war to buy Data Domain, it makes sense to pause for a moment and explain why these two companies are willing to pay almost $2 billion for the market-leading provider of this technology.

At its simplest, deduplication examines data and compares it to data that is already stored. If the data is identical, instead of storing a second copy, the deduplication system establishes a link to the original data. Storing a link requires significantly less space than storing the file again.
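To make the idea concrete, here is a minimal sketch of hash-based deduplication, assuming fixed-size chunks and SHA-256 fingerprints. The class and variable names are hypothetical, and real products typically use variable-size chunking and on-disk indexes rather than this toy in-memory store.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; shipping products often use variable-size chunking


class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self):
        self.chunks = {}  # fingerprint -> chunk bytes (stored once)
        self.files = {}   # filename -> list of fingerprints (the "links")

    def write(self, name, data):
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if this fingerprint has never been seen before.
            if fp not in self.chunks:
                self.chunks[fp] = chunk
            refs.append(fp)
        self.files[name] = refs

    def read(self, name):
        return b"".join(self.chunks[fp] for fp in self.files[name])


store = DedupStore()
store.write("full_backup_monday.img", b"payroll data " * 10000)
store.write("full_backup_tuesday.img", b"payroll data " * 10000)  # identical content
print(len(store.chunks))  # far fewer unique chunks stored than chunks written
```

Writing the second, identical backup adds no new chunks at all; only a short list of fingerprints is recorded, which is exactly the space savings the technology trades on.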

Deduplication first gained traction as a technology to enhance disk backup. Without deduplication, your disk backup had to scale to store multiple full backups and several weeks' worth of incremental backups. Even with the plummeting price of ATA storage, the cost to configure a disk array to store even a month's worth of backups, let alone the power, cooling, and space required by the array, was enormous.

If you do backups, you know that backup data, especially in full backups, is highly redundant, and this is where deduplication shines. As a result, backup was the first market in which the technology became a requirement, and companies like Data Domain and Avamar became market leaders. Avamar was snatched up by EMC, but Data Domain made it all the way to becoming a public company.

What really drove Data Domain's success in the backup space is the ability to replicate backup data to another site. This was an often-requested feature when disk-to-disk backup first started to become viable, but given the volume and speed at which backup data is created, standard replication wouldn't work across normal WAN bandwidth. Deduplication gets around this because it stores only changed or net-new data, and only that data needs to be replicated, which is much more WAN friendly.
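Reusing the toy DedupStore from the sketch above, here is a hedged illustration of why deduplicated replication is so WAN friendly: only chunks the remote site has never seen need to cross the wire, while fingerprints and file metadata are comparatively tiny. The replicate function and its behavior are assumptions for illustration, not any vendor's actual protocol.

```python
def replicate(source, target, name):
    """Copy one file's metadata to the target, shipping only the chunks it lacks."""
    refs = source.files[name]
    missing = [fp for fp in refs if fp not in target.chunks]  # usually a small fraction
    for fp in missing:
        target.chunks[fp] = source.chunks[fp]  # only net-new data crosses the WAN
    target.files[name] = list(refs)            # fingerprint list is tiny by comparison


remote = DedupStore()
replicate(store, remote, "full_backup_monday.img")   # first pass ships the unique chunks
replicate(store, remote, "full_backup_tuesday.img")  # identical content: nothing new to send
```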

These capabilities in backup alone are not enough to justify a $2 billion investment in deduplication technology. What is driving these companies to pay this kind of money is what deduplication can do for the rest of the storage spectrum: primary storage and archive storage.

For example, if a company armed with deduplication can implement it in primary storage in a way that causes little to no performance impact yet increases storage efficiency by 60% to 70%, that could get interesting. Imagine you needed 80 TB of storage but one of your vendors only needed to supply you with 40 TB because it had this technology; clearly that vendor would have a significant advantage in winning your business.
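A back-of-the-envelope calculation makes the vendor math clearer. Assuming deduplication removes a given fraction of the stored data, the logical capacity a fixed amount of raw disk can hold works out as follows (a hypothetical helper, not a vendor sizing tool):

```python
def effective_capacity(raw_tb, reduction):
    """Logical data that fits on raw_tb of disk if dedup removes `reduction` of it."""
    return raw_tb / (1 - reduction)

for r in (0.5, 0.6, 0.7):
    print(f"40 TB raw with {r:.0%} reduction holds ~{effective_capacity(40, r):.0f} TB logical")
```

At a 50% reduction, 40 TB of raw disk holds the 80 TB in the example above; at 60% to 70%, the same 40 TB stretches to roughly 100 to 133 TB.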

Clearly this technology is not limited to Data Domain; a host of other vendors can provide compression, deduplication, or both at different tiers of storage. The publicity generated by this bidding war obviously helps Data Domain, but it also helps many of the other deduplication suppliers.

What all of this should tell you is that deduplication is important, that how it is used and implemented in the various storage tiers matters, and that this is as good a time as any to begin learning about and implementing the technology.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
