Deduplication Checklist

Here are five key questions to ask before committing to a data deduplication system.

Howard Marks, Network Computing Blogger

May 12, 2008


No doubt about it: data deduplication can be a magic bullet for backup. Organizations that apply it intelligently will see faster backups, easier restores, and a reduction in power, space, and cooling costs. But put in the wrong solution, and you may instead find yourself walking the unemployment line.

Nowhere in IT does the phrase "Your mileage may vary" apply more than with data deduplication. Data reduction ratios vary depending on the type of data being backed up, the rate at which data changes between backups, and the backup scheme used.
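To make that concrete, here's a rough, back-of-the-envelope way to estimate a reduction ratio from the change rate and backup scheme. The scheme, change rate, retention, and compression figures below are illustrative assumptions, not measurements from any product.

```python
# Rough estimate of a deduplication ratio for a weekly-full / daily-incremental
# scheme. Every input here is an illustrative assumption; plug in your own figures.

def estimate_dedup_ratio(primary_tb, daily_change_rate, retention_weeks, compression=2.0):
    """Estimate the logical-to-physical reduction for weekly fulls plus daily incrementals."""
    fulls = retention_weeks                  # one full backup per retained week
    incrementals = retention_weeks * 6       # six incrementals per week
    logical = fulls * primary_tb + incrementals * primary_tb * daily_change_rate
    # After the first full, later fulls and incrementals mostly repeat data the
    # store has already seen; only the changed blocks are genuinely new.
    unique = primary_tb + (fulls - 1 + incrementals) * primary_tb * daily_change_rate
    physical = unique / compression          # local compression applied to the unique blocks
    return logical / physical

# Example: 10 TB of primary data, 1% daily change, 8 weeks of retention.
print(round(estimate_dedup_ratio(10, 0.01, 8), 1))   # ~10.9, i.e. roughly 11:1
```

Double the daily change rate or shorten the retention period and the ratio drops quickly, which is exactly why vendor-quoted ratios rarely match what you'll see.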

To help companies choose the best technology for their needs, I've identified five key questions to ask:

Where to deduplicate?
Organizations looking to bring sanity to remote-office backups should consider remote-office/branch-office backup software such as Asigra's Televaulting or EMC's Avamar that deduplicates at the source server, reducing the bandwidth needed to back up across the WAN. Larger branches, or those with less reliable WAN connections, are better served by deduplicating appliances that can replicate globally deduplicated data.
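The bandwidth saving comes from fingerprinting data at the client and shipping only the chunks the central repository hasn't already seen. Here's a minimal sketch of that idea; it uses fixed-size chunks and SHA-256 fingerprints for simplicity and is not the actual protocol Televaulting or Avamar uses.

```python
# Minimal sketch of source-side deduplication: the client fingerprints chunks
# and ships only the ones the central repository has not already seen.
# This illustrates the general idea, not any vendor's actual protocol.
import hashlib

CHUNK_SIZE = 64 * 1024  # illustrative fixed-size chunks; real products typically use variable-size chunking

def backup_file(path, known_fingerprints, send_chunk):
    """Send over the WAN only the chunks the repository does not already hold."""
    sent = skipped = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            fp = hashlib.sha256(chunk).hexdigest()
            if fp in known_fingerprints:
                skipped += 1               # duplicate: reference it, send nothing
            else:
                send_chunk(fp, chunk)      # new data: this is the only WAN traffic
                known_fingerprints.add(fp)
                sent += 1
    return sent, skipped
```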

Pure Windows shops can look at Data Storage Group's ArchiveIQ, an innovative backup program that deduplicates data at the backup server. I expect EMC, CommVault, and Symantec to add backup-server deduplication over the next year or two.

How fast do you need to back up?
While vendors like to talk about speeds and feeds, the main thing is whether a given backup device is fast enough for your needs. Vendors claim their in-line deduping targets can handle 200 GB to 800 GB an hour, and their post-processing virtual tape libraries (VTLs) have data ingestion rates of up to 34 TB an hour. But the latter then may need several hours to deduplicate the data.

In addition to overall performance, make sure to look at how fast the appliance you're considering can handle a single backup stream from your biggest backup job, and how long the deduplicating post-process will take to complete.
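A quick arithmetic check puts those figures in context. The job size, mid-range in-line rate, post-process ingest rate, and deduplication-pass time below are all assumptions; substitute the numbers from your own environment and the vendor's spec sheet.

```python
# Quick backup-window check using vendor-style figures.
# All numbers are illustrative assumptions; plug in your own.

job_tb = 6.0                      # size of the biggest nightly backup job, in TB

inline_gb_per_hr = 500            # mid-range in-line dedupe target (200-800 GB/hour claimed)
inline_hours = job_tb * 1024 / inline_gb_per_hr

post_ingest_tb_per_hr = 10        # post-process VTL ingest rate (well below the 34 TB/hour peak claim)
post_dedupe_hours = 4             # assumed time for the deduplication pass afterward
post_total_hours = job_tb / post_ingest_tb_per_hr + post_dedupe_hours

print(f"In-line: {inline_hours:.1f} h until the job is backed up and deduplicated")
print(f"Post-process: {job_tb / post_ingest_tb_per_hr:.1f} h to ingest, "
      f"~{post_total_hours:.1f} h until deduplication completes")
```

The post-process box clears your backup window far sooner, but the deduplicated copy you'd replicate offsite isn't ready until the second pass finishes.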


Does the technology work with your backup software?
Content-aware products rely on their knowledge of the data formats that the backup applications write in. Pair a content-aware solution with a backup application it isn't equipped to manage, and you won't get any deduplication.
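In other words, the engine has to recognize the backup application's on-disk or on-tape layout before it can find duplicate files inside the stream. The sketch below illustrates that gating step; the format names and magic bytes are purely hypothetical.

```python
# Why content awareness matters: duplicates can only be found inside a backup
# stream whose layout the engine can parse. Format names and magic bytes here
# are purely hypothetical, for illustration only.

KNOWN_FORMATS = {
    b"BKAPP_A": "parser for backup application A",
    b"BKAPP_B": "parser for backup application B",
}

def choose_parser(stream_header: bytes):
    """Return a parser if the backup application's format is recognized, else None."""
    for magic, parser in KNOWN_FORMATS.items():
        if stream_header.startswith(magic):
            return parser
    # Unknown format: the stream is stored as an opaque blob and gets
    # effectively no deduplication benefit.
    return None
```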

What interface?
Deduplicating targets come with network-attached storage and/or VTL interfaces. The NAS interface makes it easier to manage data; after all, you can't delete part of a tape, real or virtual. The problem with NAS is that it's limited to 1-Gbps Ethernet, while VTLs run over 4-Gbps Fibre Channel. If you need backup speeds of more than about 300 GB an hour, VTL is the way to go.
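The 300-GB-an-hour figure falls out of simple link arithmetic. The efficiency factor below is a rough assumption for sustained backup traffic over a single link.

```python
# Back-of-the-envelope throughput ceilings for a single backup link.
# The 0.7 efficiency factor is a rough assumption for sustained traffic.

def gb_per_hour(link_gbps, efficiency=0.7):
    bytes_per_sec = link_gbps * 1e9 / 8 * efficiency
    return bytes_per_sec * 3600 / 1e9

print(f"1-Gbps Ethernet (NAS): ~{gb_per_hour(1):.0f} GB/hour")       # ~315 GB/hour
print(f"4-Gbps Fibre Channel (VTL): ~{gb_per_hour(4):.0f} GB/hour")  # ~1260 GB/hour
```

On a single gigabit pipe, the NAS interface tops out right around that 300-GB-an-hour mark.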

How does the technology scale?
The first data corollary to Murphy's Law states, "Data will grow to fill all available space." As a result, no matter what backup appliance or VTL you buy today, in two or three years you'll need a bigger one.

Look for devices that can expand to at least twice their initial capacity. Gateway devices that use storage area networks and appliances that can be expanded by adding drive trays are more flexible than standalone devices. NEC's Hydrastor, with a grid architecture of accelerator and storage nodes and essentially no maximum capacity, is especially well suited to organizations with fast-growing or unpredictable needs.
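A simple projection shows why. The 35% annual growth rate below is only an assumption; use your own trend line.

```python
# Sanity-check the "buy room to double" guideline with a simple growth projection.
# The 35% annual growth rate is an assumption; substitute your own trend.

def years_until_full(current_tb, capacity_tb, annual_growth=0.35):
    years = 0
    while current_tb < capacity_tb and years < 10:
        current_tb *= 1 + annual_growth
        years += 1
    return years

# A 20-TB appliance that is 60% full today:
print(years_until_full(12, 20))   # fills up in about 2 years at 35% growth
```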



About the Author

Howard Marks

Network Computing Blogger

Howard Marks is founder and chief scientist at Deepstorage LLC, a storage consultancy and independent test lab based in Santa Fe, N.M., concentrating on storage and data center networking. In more than 25 years of consulting, Marks has designed and implemented storage systems, networks, management systems and Internet strategies at organizations including American Express, J.P. Morgan, Borden Foods, U.S. Tobacco, BBDO Worldwide, Foxwoods Resort Casino and the State University of New York at Purchase. The testing at DeepStorage Labs is informed by that real-world experience.

He has been a frequent contributor to Network Computing and InformationWeek since 1999 and a speaker at industry conferences including Comnet, PC Expo, Interop and Microsoft's TechEd since 1990. He is the author of Networking Windows and co-author of Windows NT Unleashed (Sams).

He is co-host, with Ray Lucchesi, of the monthly Greybeards on Storage podcast, where the voices of experience discuss the latest issues in the storage world with industry leaders. You can find the podcast at: http://www.deepstorage.net/NEW/GBoS

