Commentary | George Crump | 5/28/2010 10:54 AM

The Roll Down Hill Effect Of Primary Storage Deduplication

The adoption rate of deduplication in primary storage has been relatively low so far. Users have concerns about performance impact, data integrity, and how much capacity savings they will actually see. Clearly, each of these concerns needs to be addressed. When it comes to capacity savings, though, there is a key component that might get overlooked: the roll down hill effect of proper primary storage deduplication.

Thus far, the big winner in deduplication has been the backup process. If you are doing weekly full backups, there is plenty of redundant data, and you can post some incredible efficiency gains. This is not the case, or at least should not be, in primary storage. With the exception of virtualization images, it's unlikely that you will see double-digit storage efficiency gains from deduplication alone. If you see typical efficiency claims of 12X in backup deduplication, expect maybe a 5X gain in primary storage deduplication.
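
As a rough illustration of what those ratios mean, the short Python sketch below converts a deduplication ratio into the physical capacity actually consumed. The effective_capacity helper and the 100 TB workload are assumptions for illustration, not a vendor sizing formula.

    # Hypothetical back-of-the-envelope math for the ratios above.
    def effective_capacity(logical_tb: float, dedup_ratio: float) -> float:
        """Physical TB needed to hold logical_tb at a given ratio (12 means 12:1)."""
        return logical_tb / dedup_ratio

    logical = 100.0  # assumed: 100 TB of logical data

    # 12:1 in a backup repository vs. 5:1 on primary storage
    print(f"Backup  (12X): {effective_capacity(logical, 12):.1f} TB physical")  # 8.3
    print(f"Primary ( 5X): {effective_capacity(logical, 5):.1f} TB physical")   # 20.0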

If you stop there, though, you're missing an important part of the picture: the roll down hill effect of primary storage deduplication. If, and that is an important if, your primary storage deduplication technology can keep data in an optimized state throughout its entire life cycle, then you can see tremendous residual value. With primary storage deduplication, snapshots, replication, clones, and extra copies of data (just-in-case copies) all come at near-zero capacity cost. For example, you can dump your database every ten minutes if you want to; deduplication will curtail the capacity growth that would normally create.
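
To make that concrete, here is a minimal Python sketch of why frequent dumps cost so little on a deduplicating store: blocks are indexed by their content hash, so a new dump that shares most of its blocks with the previous one adds very little physical data. The fixed 4 KB block size and SHA-256 hashing are illustrative assumptions; real systems often use variable-size chunking.

    import hashlib
    import os

    BLOCK_SIZE = 4096
    store: dict[str, bytes] = {}  # content hash -> block (the dedup index)

    def write(data: bytes) -> tuple[int, int]:
        """Write data in fixed-size blocks; return (blocks written, blocks newly stored)."""
        new = 0
        blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
        for block in blocks:
            digest = hashlib.sha256(block).hexdigest()
            if digest not in store:  # only unseen content consumes capacity
                store[digest] = block
                new += 1
        return len(blocks), new

    dump1 = os.urandom(BLOCK_SIZE * 1000)  # first dump: 1000 unique blocks
    dump2 = dump1[:BLOCK_SIZE * 990] + os.urandom(BLOCK_SIZE * 10)  # ~1% changed

    print(write(dump1))  # (1000, 1000): everything is new the first time
    print(write(dump2))  # (1000, 10): only the changed blocks consume space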

The key issue is whether and when primary storage deduplication will need to "re-inflate" data to a non-optimized state. Keeping data optimized throughout its lifecycle, across every tier of storage it lands on, is critical for making deduplication make sense in primary storage. In fairness, there may be times you want to re-inflate on purpose and remove the dependency on the deduplication hash table. That decision will depend on how much you trust your deduplication technology to maintain its metadata and provide rich data integrity features.
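
One way a deduplicating store can earn that trust is to verify every block against its content hash on read, so corrupted data or metadata is caught instead of silently returned. The sketch below (reusing the store and hashing assumptions from the previous example) is a hypothetical illustration of that check, not any product's actual integrity mechanism.

    import hashlib

    class IntegrityError(Exception):
        """Raised when a block no longer matches the hash that indexes it."""

    def read_block(store: dict[str, bytes], digest: str) -> bytes:
        """Fetch a block by content hash and re-verify it before returning."""
        block = store[digest]
        if hashlib.sha256(block).hexdigest() != digest:
            # The hash table entry points at data that no longer matches it.
            raise IntegrityError(f"block {digest[:12]}... failed verification")
        return block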

Deduplication technology tries to fix the capacity explosion problem faced by most data centers. Where deduplication is succeeding right now, in backup repositories, it fixes that problem after it has already occurred. Primary storage deduplication that maintains data in its optimized state fixes the problem before it becomes one. Properly implemented, primary storage deduplication could significantly reduce the storage demands of your data center.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
