Maximizing The Storage Budget - Capacity Optimization
In this economy, maximizing what you have and cost-justifying what you need becomes a much sought-after skill. In many organizations the IT budget, and the storage budget along with it, is not growing, and I often hear that the budget is the same but teams are not allowed to spend right now, which is worse than the budget being cut. Regardless, spendable IT dollars are a precious commodity.
As we detailed in our article "Maximizing Your Storage Cost Cutting Efforts", the first step in maximizing your budget is to understand what the situation looks like right now. How much of your storage assets are you actually using, and is there wasted space that can be reclaimed?
With this information in hand, the next step in maximizing the storage budget is to look for areas to optimize the investment instead of buying additional capacity. First, look at unstructured data, which is often thought of as user home directories but is really any data not stored in a database. There are excellent, low-cost solutions available that can optimize the file servers or NAS systems that host this data.
For example, companies like Storwize and Ocarina Networks can significantly reduce the size of the unstructured data store. These technologies work either in-line or as a post process to compress and/or deduplicate primary storage by as much as 90% or more. This type of reduction can not only delay the need for additional storage this year; it could defer purchases for several years, showing a positive gain on future budgets.
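To make the deduplication idea concrete, here is a minimal sketch of post-process, fixed-size-chunk deduplication. It is purely illustrative and is not any vendor's actual implementation; the 4 KB chunk size and SHA-256 fingerprint are assumptions for the example.

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep only unique ones,
    indexed by their SHA-256 fingerprint (post-process style)."""
    store = {}   # fingerprint -> chunk: the deduplicated chunk store
    recipe = []  # ordered fingerprints needed to rebuild the original
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store each unique chunk once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    """Reassemble the original data from the chunk store and recipe."""
    return b"".join(store[h] for h in recipe)

# Highly redundant data (e.g., repeated OS or VM images) dedupes well:
# 19 identical chunks plus 1 unique chunk collapse into just 2 stored chunks.
data = b"A" * 4096 * 19 + b"B" * 4096
store, recipe = deduplicate(data)
assert rebuild(store, recipe) == data
stored = sum(len(c) for c in store.values())
print(f"reduction: {1 - stored / len(data):.0%}")  # prints "reduction: 90%"
```

Real products add variable-size chunking, hash-collision handling, and metadata overhead, which is why actual reduction ratios depend heavily on how redundant the data is.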
As we highlighted in our article "Optimized Migrations", some of these solutions can also handle migrating data from primary storage to secondary storage while optimizing its capacity requirements. An optimized secondary tier is important as you free up primary storage space. Being able to store the same amount of data in significantly less space and at a lower cost widens the delta between primary and secondary storage costs and makes a project of this sort more valuable.
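The migration policy behind such a project can be as simple as moving files that have not been accessed recently. The sketch below is a hypothetical illustration of that idea, not any product's actual logic; the 180-day threshold and directory layout are assumptions, and it relies on access times being tracked on the file system.

```python
import os
import shutil
import time

def migrate_cold_files(primary: str, secondary: str, max_idle_days: int = 180):
    """Move files not accessed within max_idle_days from the primary
    tier to the (optimized) secondary tier, preserving relative paths."""
    cutoff = time.time() - max_idle_days * 86400
    moved = []
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            if os.stat(src).st_atime < cutoff:  # cold: not read recently
                rel = os.path.relpath(src, primary)
                dst = os.path.join(secondary, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)  # frees up primary storage space
                moved.append(rel)
    return moved
```

A production tool would also leave a stub or link behind so users can still find the migrated file, and would apply the compression or deduplication discussed above on the secondary tier.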
NAS vendors are realizing that optimized storage is an important capability for them to offer. NetApp has been offering deduplication on its systems for some time now, and it can provide particular value when storing virtual server images. Other companies, like Isilon and BlueArc, are partnering with optimization software companies to enhance their offerings. ONStor is leveraging its use of the ZFS file system to offer compression, with deduplication coming soon.
The choices available to more efficiently manage and store unstructured data are maturing and are becoming a requirement in the space. In our next entry we will look at what you can do to optimize block-based storage to better invest budget dollars.
About the Author
George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.