George Crump, President, Storage Switzerland

June 10, 2010

As discussed in our <a href="http://www.informationweek.com/blog/main/archives/2010/06/does_deduplicat.html">last entry</a>, all of the storage optimization strategies will affect how much capacity you need to purchase in your next upgrade. The problem is that much of the savings depends on your data. You will hear vendors state something like "your actual mileage will vary," and that is very true. With that as the backdrop, how do you make sure you don't overshoot or, worse, undershoot on your next capacity estimate?

The first step is to understand which of the available optimization technologies your vendor offers. Very few offer all three: thin provisioning, deduplication, and compression. Also, some offer the technologies on only certain types of storage (NAS or block). For example, all three optimization techniques are becoming common on NAS storage, so if NAS is your primary storage platform you are in great shape. If you need block storage, you may have to investigate further.
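A rough way to see why "mileage varies" is that the three technologies compound: each one shrinks whatever data the previous one left behind. Here is a minimal sketch of that arithmetic; the savings percentages are hypothetical placeholders, not vendor figures.

```python
# Sketch: estimate physical capacity needed after optimization.
# The savings fractions are hypothetical examples -- actual results
# depend entirely on your data.

def physical_capacity_needed(logical_tb, thin, dedupe, compress):
    """Savings compound multiplicatively: each stage shrinks what's left."""
    remaining = logical_tb
    for savings in (thin, dedupe, compress):
        remaining *= (1.0 - savings)
    return remaining

# Example: 100 TB provisioned, with modest assumed savings per stage.
print(physical_capacity_needed(100, 0.30, 0.25, 0.20))  # ~42.0 TB
```

Note that stacking three modest savings rates already cuts the purchase by more than half, which is why combined figures like the 90% reduction discussed below are plausible for highly redundant data.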

If your budgeting process allows it, our number one recommendation is to not spend your entire storage budget upfront. Set aside as much as 20% of the budget and see what the storage optimization technologies actually deliver. This allows you to be fairly aggressive with your estimates; just make sure that you and your supplier are ready to bring in more storage quickly. As we discussed in the last entry, the amount of efficiency can vary between data types. As a general rule of thumb, we find that 50% optimization is a safe bet if you are storing server virtualization images or have large home directories, again depending on the combination of optimization technologies you choose. For example, we have seen server virtualization environments experience a 90% reduction in capacity needs when thin provisioning, deduplication, and compression are all used together. The set-aside for additional storage is critical in case you are too aggressive in your planning; you'll want pre-approval to bring in more capacity. If the optimization works as planned, or even better than planned, you can either spend the money elsewhere or enjoy a bonus (maybe) for not spending all of your IT budget.
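To make the set-aside concrete, here is the same math as a short sketch; the budget, price per terabyte, and the 50% planning figure are illustrative assumptions only.

```python
# Illustrative budget math for the 20% set-aside strategy. All dollar
# figures and the $/TB price are made-up examples, not quotes.

budget = 100_000           # total storage budget (example)
set_aside = 0.20           # hold back 20% as a pre-approved reserve
cost_per_tb = 1_000        # assumed street price per raw TB (example)

upfront = budget * (1 - set_aside)
reserve = budget * set_aside

raw_tb = upfront / cost_per_tb
assumed_savings = 0.50     # the "safe bet" rule of thumb from above
effective_tb = raw_tb / (1 - assumed_savings)

print(f"Buy now: {raw_tb:.0f} TB raw, ~{effective_tb:.0f} TB effective at 50% savings")
print(f"Reserve: ${reserve:,.0f} held for a quick follow-on purchase")
```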

If you cannot set aside part of the budget and you are in a "spend it or lose it" situation, then I have two recommendations. First, if even pessimistic expectations for capacity optimization mean your budget will buy more storage than you need, consider purchasing a portion of that capacity as solid state disk (SSD). This not only consumes budget but, as we discuss in our Visualizing SSD Readiness Guide, also provides accelerated performance for the applications that can justify it. That guide shows you how to determine which applications will benefit most from SSD.
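One hedged way to size that SSD purchase: plan disk capacity against your most pessimistic savings estimate, then put whatever budget is left toward flash. The prices below are stand-in numbers for illustration, not market quotes.

```python
# Sketch: spend the surplus from pessimistic planning on SSD.
# Capacities and $/TB prices are placeholder assumptions.

budget = 100_000              # must be spent this cycle (example)
capacity_needed_tb = 60       # projected logical need (example)
pessimistic_savings = 0.25    # worst-case optimization estimate

raw_tb = capacity_needed_tb / (1 - pessimistic_savings)
hdd_spend = raw_tb * 1_000    # assumed $1,000 per raw disk TB

surplus = max(budget - hdd_spend, 0)
ssd_tb = surplus / 10_000     # assumed $10,000 per SSD TB

print(f"Disk: {raw_tb:.0f} TB (${hdd_spend:,.0f}); SSD: {ssd_tb:.1f} TB from ${surplus:,.0f}")
```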

Second, and probably the worst case, is that you think you will need the capacity and you are in a spend-it-or-lose-it situation. My concern here is that the optimization techniques will work better than you expect; in this scenario you could truly end up with shelves of unused capacity. What we suggest is to buy the capacity as planned, but when the product arrives, don't connect it all. You can even leave it sitting in the rack; just don't power it on. At least this way you are not paying to power it, and users don't see TBs of free capacity and get sloppy in their use of that storage. Also, it is much harder to deactivate a shelf once it is in use, since most storage systems stripe data vertically across separate shelves; it is better not to enable a shelf until you are sure you are going to need it. This again depends on a system that makes it easy to add capacity with minimal or no downtime.
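If you do rack dark shelves, a simple trigger for powering one on is a utilization threshold. The sketch below uses an arbitrary 80% trigger; the right number depends on how quickly your particular array can absorb a new shelf.

```python
# Sketch of a "dark shelf" policy: capacity sits racked but unpowered
# until utilization crosses a threshold. The 80% trigger is an
# arbitrary example, not a vendor recommendation.

def should_activate_shelf(used_tb, active_tb, threshold=0.80):
    """True when it's time to power on the next idle shelf."""
    return active_tb > 0 and (used_tb / active_tb) >= threshold

# Example: 66 TB used of 80 TB active is 82.5% utilized -- expand.
print(should_activate_shelf(66, 80))  # True
```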

Regardless of your purchasing flexibility, you do need to factor these new capacity optimization techniques into your capacity plan. The more aggressive you can be, the less capacity you will need. Once you have run the optimization-enabled system for a few weeks and can see what kind of reduction you are actually getting on your data, you can decide whether to spend the remaining budget elsewhere (like SSD) or save it for the future.
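Measuring what you actually achieved is simple division: logical data written versus physical capacity consumed. A minimal sketch, with invented sample numbers:

```python
# Sketch: compare the reduction you planned for against what the
# system actually delivered. Sample numbers are invented.

def observed_savings(logical_tb, physical_tb):
    """Fraction of capacity saved: 0.50 means the data shrank by half."""
    return 1.0 - (physical_tb / logical_tb)

planned = 0.50                      # the savings you budgeted for
actual = observed_savings(40, 14)   # e.g. 40 TB written, 14 TB consumed
print(f"Planned {planned:.0%}, achieved {actual:.0%}")  # Planned 50%, achieved 65%
# If actual beats planned, the reserve can go elsewhere (like SSD).
```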

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.

About the Author(s)

George Crump

President, Storage Switzerland

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for datacenters across the US, he has seen the birth of such technologies as RAID, NAS, and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation’s largest storage integrators, where he was in charge of technology testing, integration, and product selection. George is responsible for the storage blog on InformationWeek's website and is a regular contributor to publications such as Byte and Switch, SearchStorage, eWeek, SearchServerVirtualization, and SearchDataBackup.
