
News

6/10/2010 08:55 AM
George Crump
Commentary

Implementing Storage Capacity Planning In The Modern Era

As discussed in our last entry, all of the storage optimization strategies will affect how much capacity you need to purchase in your next upgrade. The problem is that much of the savings depends on your data. Vendors will tell you something like "your actual mileage may vary," and that is very true. With that as the backdrop, how do you make sure you don't overshoot, or worse, undershoot on your next capacity estimate?

The first step is to understand which of the available optimization technologies your vendor offers. Very few offer all three: thin provisioning, deduplication, and compression. Also, not every vendor offers these technologies on all types of storage (NAS and block). For example, all three optimization techniques are becoming common on NAS storage, so if NAS is your primary storage platform you are in great shape. If you need block storage, you may have to investigate further.

If your budgeting process allows it, our number one recommendation is to not spend your entire storage budget upfront. Set aside as much as 20% of the budget and see what the storage optimization technologies actually deliver. This lets you be fairly aggressive with your estimates; just make sure that you and your supplier are ready to bring in more storage quickly. As we discussed in the last entry, the amount of efficiency varies between data types. As a general rule of thumb, we find that 50% optimization is a safe bet if you are storing server virtualization images or have large home directories, again depending on the combination of optimization technologies you choose. For example, we have seen server virtualization environments experience a 90% reduction in capacity needs when thin provisioning, deduplication, and compression are all used together. The set-aside is critical in case you are too aggressive in your planning; you'll want pre-approval to bring in more capacity. If the optimization works as planned, or even better, you can either spend the money elsewhere or enjoy a bonus (maybe) for not spending all of your IT budget.
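To make that arithmetic concrete, here is a minimal sketch of the estimate. All of the figures (projected logical data, cost per TB, the reduction rates, the 20% hold-back) are hypothetical placeholders for illustration, not vendor numbers:

```python
# Hypothetical capacity-planning sketch: estimate how much raw capacity
# to buy when optimization (thin provisioning, deduplication, compression)
# is expected to reduce what you actually store.

def raw_capacity_needed(logical_tb, reduction):
    """Raw TB to purchase for a given expected reduction (0.0-1.0)."""
    return logical_tb * (1.0 - reduction)

logical_tb = 100.0           # projected data over the budget cycle (assumed)
safe_reduction = 0.50        # the rule-of-thumb "safe bet" above
aggressive_reduction = 0.90  # best case seen with all three techniques

cost_per_tb = 500.0                # placeholder price, not a quote
budget = logical_tb * cost_per_tb  # what a no-optimization plan would cost
hold_back = 0.20 * budget          # the 20% set-aside

for label, r in [("safe (50%)", safe_reduction),
                 ("aggressive (90%)", aggressive_reduction)]:
    tb = raw_capacity_needed(logical_tb, r)
    print(f"{label}: buy {tb:.0f} TB, spend ${tb * cost_per_tb:,.0f}, "
          f"hold back ${hold_back:,.0f} in case reduction falls short")
```

If the aggressive estimate misses, the hold-back covers the shortfall; if it holds, the unspent budget can go elsewhere.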

If you cannot set aside some of the budget and you are in a "spend it or lose it" situation, then I have two recommendations. First, if even pessimistic expectations for capacity optimization leave your budget able to cover more storage than you need, consider buying a portion of that capacity as solid state disk (SSD). This not only consumes budget but, as we discuss in our Visualizing SSD Readiness Guide, also provides accelerated performance for the applications that can justify it. That guide will show you how to determine which applications will benefit most from SSD.

Second, and probably the worst case, is that you think you will need the capacity and you are in a spend-it-or-lose-it situation. My concern here is that the optimization techniques will work better than you think. In this scenario you could truly end up with shelves of unused capacity. What we suggest here is to buy the capacity as you planned, but when the product comes in, don't connect it all. You can even have it sitting in the rack; just don't power it on. At least this way you are not paying to power it, and users don't see TBs of free capacity that tempt them to get sloppy in their use of that storage. Also, it is much harder to deactivate a shelf once it is in use, since most storage systems stripe data vertically across separate shelves. It's better not to enable a shelf until you are sure you are going to need it. This again depends on a system that allows capacity to be added with minimal or no downtime.

Regardless of your purchasing flexibility, you do need to factor these new capacity optimization techniques into your capacity plan. The more aggressive you can be, the less capacity you will need. Once you have run the optimization-enabled system for a few weeks and can see what kind of reduction you are getting on your data, you can decide whether to spend the remaining budget elsewhere (like SSD) or save it for the future.
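As a rough illustration of that few-weeks check-in, here is a hedged sketch that compares the logical data stored against the physical capacity it actually consumes. The numbers and the reduction_target threshold are assumptions for illustration; real figures would come from your array's own reporting tools:

```python
# Hypothetical post-deployment check: compare logical data stored against
# physical capacity consumed to see what reduction the optimization
# features are actually delivering.

def observed_reduction(logical_tb, physical_tb):
    """Fraction of capacity saved by optimization (0.0-1.0)."""
    return 1.0 - (physical_tb / logical_tb)

logical_tb = 40.0    # data written so far, per the array's reports (assumed)
physical_tb = 14.0   # raw capacity actually consumed (assumed)

reduction = observed_reduction(logical_tb, physical_tb)
reduction_target = 0.50  # the plan's "safe bet" assumption

if reduction >= reduction_target:
    print(f"Seeing {reduction:.0%} reduction: plan holds; consider "
          "spending the set-aside elsewhere (e.g., SSD).")
else:
    print(f"Only {reduction:.0%} reduction: trigger the pre-approved "
          "purchase of additional capacity.")
```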

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
