6/5/2009 11:20 AM
George Crump
Commentary

What Is Deduplication And Why Should You Care?

A couple of days ago I was speaking at an event in Dallas and was reminded that sometimes those of us in storage get too wrapped up in, well, storage and that IT professionals have other things to worry about than just storage. I asked the audience how many of them had done anything with deduplication. Only 30% had, although 100% wanted to know more.

With all the news about NetApp and EMC in a bidding war to buy Data Domain, it makes sense to pause for a moment and explain why these two companies are willing to pay almost $2 billion for the market-leading provider of this technology.

Deduplication, at its simplest, examines data, compares it to data that is already stored, and, if the two are identical, stores a link to the original data instead of a second copy. Storing a link requires significantly less space than actually storing the file.
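To make that concrete, here is a toy sketch of the idea, assuming fixed-size chunks and SHA-256 fingerprints; the `DedupStore` name and chunk size are illustrative assumptions, not any vendor's actual implementation (real products often use variable-size chunking):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking, for illustration only

class DedupStore:
    """Toy content-addressed store: each unique chunk is kept once;
    files are just lists of links (digests) into that pool."""

    def __init__(self):
        self.chunks = {}   # SHA-256 digest -> chunk bytes (stored once)
        self.files = {}    # file name -> list of digests (the "links")

    def write(self, name, data):
        digests = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if it has never been seen before
            self.chunks.setdefault(digest, chunk)
            digests.append(digest)
        self.files[name] = digests

    def read(self, name):
        return b"".join(self.chunks[d] for d in self.files[name])

store = DedupStore()
payload = b"A" * 8192
store.write("backup-monday", payload)
store.write("backup-tuesday", payload)   # identical data: no new chunks stored
print(len(store.chunks))                 # -> 1 unique chunk, despite two 8KB writes
```

The second write costs only the links, which is exactly why highly redundant data deduplicates so well.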

Deduplication first gained traction as a technology to enhance disk backup. Without deduplication, your disk backup had to scale to store multiple full backups and several weeks' worth of incremental backups. Even with the plummeting price of ATA storage, the cost to configure a disk array to store even a month's worth of backups, let alone the power, cooling, and space the array required, was enormous.

If you do backups, you know that this data, especially in full backups, is highly redundant, and that is where deduplication shines. As a result, backup was the first market in which the technology became a requirement, and companies like Data Domain and Avamar became market leaders. Avamar was snatched up by EMC, but Data Domain made it all the way to becoming a public company.

What really drove Data Domain's success in the backup space is the ability to replicate backup data to another site. This was an often-requested feature when disk-to-disk backup first became viable, but given the volume and speed at which backup data is created, standard replication wouldn't work across normal WAN bandwidth. Deduplication gets around this because it stores only changed or net-new data, and only that data needs to be replicated, which is much more WAN friendly.
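The WAN savings can be sketched in the same toy terms: assume fixed-size chunks and a remote site that already holds the fingerprints from last night's backup, so only chunks the remote side lacks cross the wire. This is an illustration of the principle, not Data Domain's actual replication protocol:

```python
import hashlib

CHUNK = 4096  # fixed-size chunking, for illustration only

def digests(data):
    """Map each chunk's SHA-256 fingerprint to the chunk itself."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest(): data[i:i + CHUNK]
            for i in range(0, len(data), CHUNK)}

monday = b"customer records " * 1000          # last night's full backup
tuesday = monday + b"one new order record"    # tonight's backup: mostly identical

remote = digests(monday)                      # fingerprints already at the DR site
to_send = {d: c for d, c in digests(tuesday).items() if d not in remote}

full_size = len(tuesday)
wire_size = sum(len(c) for c in to_send.values())
print(f"full backup: {full_size} bytes, replicated: {wire_size} bytes")
```

Even in this tiny example, only the final modified chunk needs to travel; across nightly fulls of mostly unchanged data, that is the difference between a replication job that fits in a WAN window and one that never finishes.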

These capabilities in backup alone are not enough to justify a $2 billion investment in deduplication technology. What is driving these companies to pay that kind of money is what deduplication can do for the rest of the storage spectrum: primary storage and archive storage.

For example, if a company armed with deduplication can implement it in primary storage with little to no performance impact while increasing storage efficiency 60% to 70%, that gets interesting. Imagine you needed 80TB of storage but one of your vendors only needed to supply 40TB because it had this technology; clearly that vendor would have a significant advantage in winning your business.

Clearly this technology is not limited to Data Domain; a host of other vendors can provide compression, deduplication, or both at different levels of storage. The publicity generated by the bidding war obviously helps Data Domain, but it also helps many of the other deduplication suppliers.

What all of this should tell you is that deduplication is important, that how it is used and implemented in the various storage tiers matters, and that this is as good a time as any to begin learning about and implementing the technology.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
