News | Commentary
6/8/2010 10:25 AM
George Crump

Does Deduplication Make Storage Capacity Planning Difficult?

With all the technologies now available to optimize the use of primary storage capacity, and deduplication is only one of them, the guidelines for estimating how much capacity you need in a given year have to change. In some ways storage capacity planning is more difficult than it has been in the past. It has to keep up with new storage system capabilities such as thin provisioning, compression, and deduplication.

Storage capacity planning of a few years ago seems like a relatively simple task compared with the capacity planning of today. You estimated the capacity you were going to need based on organic growth and new application needs, doubled that number, and ordered the storage. In many cases no one batted an eye at the process. If you apply that same logic today, you may end up with 50% or more of your capacity purchase never being used. In fact, several vendors are claiming, and even guaranteeing, that you will need less storage if you replace your current storage solution with theirs.
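To make the "old math" concrete, here is a minimal sketch of that traditional rule of thumb, using hypothetical numbers (the function name and inputs are illustrative, not from any real planning tool):

```python
# A sketch of the "old math": estimate organic growth plus new-application
# needs, then double the result before ordering. All figures are hypothetical.
def old_capacity_plan(current_tb: float, organic_growth_pct: float,
                      new_app_tb: float) -> float:
    """Return the capacity (TB) the traditional rule of thumb would order."""
    projected_need = current_tb * organic_growth_pct / 100 + new_app_tb
    return projected_need * 2  # double the estimate "to be safe"

# 100 TB today, 30% organic growth, 10 TB of new applications:
ordered = old_capacity_plan(current_tb=100, organic_growth_pct=30, new_app_tb=10)
print(ordered)  # 80.0 TB ordered against a 40 TB projected need
```

With optimization technologies in play, much of that doubled margin can end up as shelfware, which is the problem the rest of this column addresses.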

You could continue to use the old math when calculating storage capacity needs and enjoy all the extra free capacity. It is important to remember, though, that storage is not wine; it does not become more valuable with age. Unused capacity is wasted budget dollars as well as wasted power and cooling. The time has come to factor all these techniques into your next capacity purchase or storage system upgrade. Of these capabilities, deduplication, compression, and thin provisioning will probably have the most impact.

Primary storage deduplication has been discounted by some in the industry. There are concerns about performance impact and data integrity; both are technology issues that either have been or will be overcome. Some solutions now claim microseconds of added latency without altering the data format. The other, more legitimate concern is how much duplicate data you really have on primary storage. In the past I would have said this was a valid concern, until server and desktop virtualization arrived. Now there can be terabytes upon terabytes of redundant data on the system. Deduplication can address that problem and result in massive savings. Estimating how much deduplication should factor into your capacity planning is difficult. If the environment is going to be heavy on the virtualization side, I would suggest planning for at least a 3:1 reduction in the amount of storage you were going to purchase, maybe more.
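The core idea can be sketched in a few lines: chunk the data into fixed-size blocks, hash each block, and store only one copy per unique hash. This is a simplified illustration, not any vendor's implementation; real systems also handle variable chunking, collisions, and metadata:

```python
import hashlib

def dedup_ratio(data: bytes, block_size: int = 4096) -> float:
    """Chunk data into fixed-size blocks; count unique blocks by content hash."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    unique = {hashlib.sha256(b).digest() for b in blocks}
    return len(blocks) / len(unique)

# Ten identical stand-in "VM images" on primary storage deduplicate
# down to one physical copy: a 10:1 reduction.
image = b"".join(bytes([i]) * 4096 for i in range(8))  # one 32 KB image
pool = image * 10                                      # ten redundant copies
print(f"{dedup_ratio(pool):.0f}:1")  # 10:1
```

This is why virtualized environments, where many guest images share the same operating system blocks, are such fertile ground for the technique.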

Compression is another optimization technique to consider. Compression gains capacity across almost all files; it does not require duplicate data. The data does need to be compressible, of course, but in almost every case the net result is at least a 2:1 gain. In most cases compression is not an inhibitor to deduplication; most of the solutions work together, and some are even integrated.
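You can get a feel for the 2:1 figure with a standard compressor. The sketch below uses Python's built-in zlib on a made-up repetitive log sample; real-world ratios vary widely by data type:

```python
import zlib

def compression_ratio(data: bytes) -> float:
    """Ratio of original size to zlib-compressed size."""
    return len(data) / len(zlib.compress(data))

# Repetitive data (logs, databases, text) easily clears the 2:1 mark;
# already-compressed data (JPEG, video) will not.
sample = b"timestamp=2010-06-08 level=INFO msg=heartbeat ok\n" * 1000
print(f"{compression_ratio(sample):.1f}:1")  # well above 2:1 for this sample
```

Note that compression works within each object while deduplication works across objects, which is why the two are complementary rather than competing.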

Thin provisioning helps in an area that deduplication and compression do not: capacity that is allocated but not in use, essentially storage that is captive to a particular server. You can't compress or deduplicate something that is not there. The only way to optimize this capacity is to free it from being bound to a particular server. As we discuss in our Thin Provisioning White Paper, the technology is no longer limited to optimizing new application deployments; it also applies to ongoing application use. Modern thin provisioning technology can reclaim deleted space from volumes as well.
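A toy model makes the allocated-but-unused point clear: a thin pool charges physical capacity only when blocks are written, not when volumes are created. The class and figures below are hypothetical, for illustration only:

```python
# A hypothetical model of thin provisioning: physical capacity is consumed
# on write, while logical volume sizes are only promises against the pool.
class ThinPool:
    def __init__(self, physical_tb: float):
        self.physical_tb = physical_tb
        self.used_tb = 0.0          # physically written
        self.provisioned_tb = 0.0   # sum of logical volume sizes

    def create_volume(self, logical_tb: float) -> None:
        self.provisioned_tb += logical_tb   # no physical cost yet

    def write(self, tb: float) -> None:
        if self.used_tb + tb > self.physical_tb:
            raise RuntimeError("pool exhausted: time to buy capacity")
        self.used_tb += tb

pool = ThinPool(physical_tb=20)
for _ in range(10):
    pool.create_volume(logical_tb=10)  # 100 TB promised to ten servers
pool.write(3.0)                        # but only 3 TB actually written
print(pool.provisioned_tb, pool.used_tb)  # 100.0 3.0
```

A thick-provisioned array would have dedicated the full 100 TB up front; the thin pool serves the same volumes from 20 TB and defers the rest of the purchase until real usage demands it.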

In our next entry we will discuss how to roll all this information together to plan your next capacity upgrade or to plan a new storage system purchase.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
