Commentary | George Crump | 4/29/2010 09:38 AM

Fixing Storage Utilization Without A Refresh

In the final part of our storage utilization series we address how to improve storage utilization without refreshing the storage itself. This is, unfortunately, the most difficult way to improve storage utilization.

It is much easier to start from a clean slate with a storage system designed to maintain high levels of utilization. The first step in improving the utilization of an existing system is to determine whether the effort is even worth it.

One of the goals of higher utilization is to make sure that if drives and shelves are installed and drawing power, they hold enough data to justify their power and space consumption as well as their cost. As mentioned in the earlier entries in this series, storage utilization needs to be addressed in two areas. The first, and the one most often thought of, is increasing the efficiency of stored data via compression, deduplication, or migration to archival storage.

These methods work only on actual data. The larger issue is often storage capacity that is assigned to particular servers but not in use; that capacity is captive. It is the room for growth you assign to each server as it connects to the storage system.
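To make the distinction concrete, here is a minimal Python sketch using hypothetical figures (none of these numbers come from the article): data-efficiency techniques shrink what has been written, while captive capacity is the gap between what was allocated and what was written.

    # Hypothetical figures for illustration only.
    physical_tb = 100.0   # raw usable capacity in the array
    allocated_tb = 80.0   # capacity carved into volumes and assigned to servers
    written_tb = 30.0     # data the servers have actually written

    captive_tb = allocated_tb - written_tb   # assigned but unused capacity
    deduped_tb = written_tb / 2.0            # assumed 2:1 dedup/compression ratio

    print(f"Allocation rate:  {allocated_tb / physical_tb:.0%}")  # 80% -- looks healthy
    print(f"True utilization: {written_tb / physical_tb:.0%}")    # 30% -- the real number
    print(f"Captive capacity: {captive_tb:.0f} TB")               # 50 TB powered but idle
    # Note: dedup/compression reduces written_tb but leaves captive_tb untouched.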

The first method, making already-stored data consume less space, works on existing storage systems as well as new ones. The problem is that it often compounds the second issue, captive storage, by making even more space available but unused. The most viable way to address the captive-capacity problem is to implement a thinly provisioned storage system, or one that can easily expand capacity on existing volumes.
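The core idea behind thin provisioning is allocate-on-write. As a rough Python sketch (a simplified model, not any particular vendor's implementation), a thin volume advertises its full logical size but draws physical extents from a shared pool only when blocks are first written:

    class ThinPool:
        """Simplified model: extents are consumed on first write,
        not when a volume is created."""
        def __init__(self, physical_extents):
            self.free = physical_extents

        def create_volume(self, logical_extents):
            # Creation consumes no physical extents at all.
            return ThinVolume(self, logical_extents)

    class ThinVolume:
        def __init__(self, pool, logical_extents):
            self.pool = pool
            self.logical_extents = logical_extents
            self.mapped = {}  # logical extent -> physical extent

        def write(self, extent, data):
            if extent not in self.mapped:        # allocate on first write only
                if self.pool.free == 0:
                    raise RuntimeError("thin pool exhausted")  # why monitoring matters
                self.pool.free -= 1
                self.mapped[extent] = len(self.mapped)
            # ... write data to the mapped physical extent ...

    pool = ThinPool(physical_extents=1000)
    vol = pool.create_volume(logical_extents=5000)  # oversubscribed on purpose
    vol.write(0, b"block")
    print(pool.free)  # 999 -- only the written extent consumed physical space

The flip side, as the sketch's exhaustion check hints, is that oversubscribed pools must be watched: the free-space problem moves from per-volume to pool-wide.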

If your current storage system can't do this, there are add-on solutions that can provide it for you, delivered either through a NAS software/gateway solution or an external software-based storage virtualization application. From that point forward, new volumes you create can be thin provisioned. While this does not optimize what is underutilized now, it at least stops the problem from getting worse.

Dealing with existing captive capacity, though, is a bigger challenge, and unfortunately it is also where the utilization issue is typically worst. What makes this difficult is that the best practice for most array systems is to create volumes whose drives span vertically across several shelves in the array. Even if you could simply shrink a volume, its drives would still be in use; nothing could be powered off, and nothing would be saved.

The best way to address this would be to implement a "thin aware" file system that can also perform "thin migrations," alongside the new thin provisioning software. You could then define a new volume that uses fewer physical drives and shelves and, leveraging a thin migration (the ability to copy only actual data), move the data from the old volume to the new one. With the right file system, this migration could be done live, without impact on users. Depending on the space available, you may have to migrate just a few volumes at a time. As you take volumes offline, you should be able to begin shutting down drive shelves.
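The essence of a thin migration can be sketched in a few lines of Python (a simplified, file-level stand-in for a real migration engine): copy only the chunks that contain data, and seek past empty runs so unwritten space never moves.

    import os

    CHUNK = 1024 * 1024  # 1 MiB

    def thin_copy(src_path, dst_path):
        """Copy only non-zero chunks; seek past zero runs so the
        destination stays sparse and unwritten space is never copied."""
        zero = bytes(CHUNK)
        with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
            while True:
                chunk = src.read(CHUNK)
                if not chunk:
                    break
                if chunk == zero[:len(chunk)]:
                    dst.seek(len(chunk), os.SEEK_CUR)  # leave a hole, copy nothing
                else:
                    dst.write(chunk)
            dst.truncate()  # extend to the full logical length without writing zeros

A real array- or file-system-level migration works on block maps rather than zero detection, but the payoff is the same: a 2 TB volume holding 200 GB of data moves 200 GB, not 2 TB.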

This may seem like a lot of work to increase utilization. It is. The payoff, though, can be racks of storage that are no longer needed, along with all the other benefits that the modernized storage software and thin-provisioned file system deliver. In some cases the power savings alone, or the freeing up of power for other systems, can justify the effort.
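Some back-of-the-envelope arithmetic (the shelf count, wattage, and electricity rate below are all assumptions, not figures from the article) shows why the power case can stand on its own:

    # All figures are assumptions; substitute your own shelf count and power draw.
    shelves_retired = 6
    watts_per_shelf = 450    # assumed draw of a fully loaded drive shelf
    dollars_per_kwh = 0.10   # assumed electricity rate

    kwh_per_year = shelves_retired * watts_per_shelf * 24 * 365 / 1000
    print(f"{kwh_per_year:,.0f} kWh/yr, about ${kwh_per_year * dollars_per_kwh:,.0f}/yr")
    # ~23,652 kWh and ~$2,365 per year, before counting cooling overhead.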

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
