Data Thinkage


George Crump, President, Storage Switzerland

November 12, 2009


Data storage capacity is cheap. For most environments, obtaining enough capacity is no longer a challenge; managing that capacity is the problem. Growth, especially in unstructured data, continues unabated. Deciding what data should be where is one of the biggest challenges the storage manager faces today. Users don't want to think about where data should be stored, and storage managers don't have the time to think about it.

The good news is that with disk archiving and MAID-based storage systems, you can now store data cost-effectively for a very long time. Scale and power efficiency, whichever is the higher priority, can be addressed by systems from companies like EMC, Nexsan, Permabit, and others. The challenge is getting data to those systems.

The first option is the manual method: use tools from companies like APTARE or Tek-Tools to analyze the storage environment, determine which files have not been accessed in a given period of time, and then move those files to one of the secondary tiers of storage described above. The challenge lies in how you actually move that data and where the user goes to find it once the move has been completed.
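To make the idea concrete, here is a minimal sketch of that manual pass in Python. The mount points, the 90-day inactivity window, and the mirror-the-directory-layout convention are assumptions for illustration only; in practice a reporting tool would supply the access data rather than a full file system walk.

```python
#!/usr/bin/env python3
"""Sketch of the manual method: find files not accessed in N days and
move them to a secondary (archive) tier. Paths and the 90-day window
are illustrative assumptions, not settings from any particular tool."""

import os
import shutil
import time

PRIMARY_TIER = "/mnt/primary"   # assumed primary NAS mount
ARCHIVE_TIER = "/mnt/archive"   # assumed archive/MAID mount
STALE_AFTER_DAYS = 90           # assumed inactivity threshold

def archive_stale_files(dry_run=True):
    cutoff = time.time() - STALE_AFTER_DAYS * 86400
    for root, _dirs, files in os.walk(PRIMARY_TIER):
        for name in files:
            src = os.path.join(root, name)
            try:
                last_access = os.stat(src).st_atime
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if last_access >= cutoff:
                continue  # still active, leave it on the primary tier
            # Mirror the directory layout on the archive tier so users
            # (or a stub/link mechanism) can still find the file.
            rel = os.path.relpath(src, PRIMARY_TIER)
            dst = os.path.join(ARCHIVE_TIER, rel)
            if dry_run:
                print(f"would move {src} -> {dst}")
            else:
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)

if __name__ == "__main__":
    archive_stale_files(dry_run=True)  # report first; set False to move
```

Even a simple script like this makes the two open questions obvious: the move itself is easy, but nothing here tells the user where the file went.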

I also like that we are seeing API sets from cloud providers like Nirvanix, Iron Mountain, and others move toward metadata tagging. When software applications support this, a user should be able to set a migration policy, and even a retention policy, at the point of creation. If we can get users to take the extra step of providing that metadata, it could be a big relief for storage managers.
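The sketch below illustrates the policy-at-creation idea only; it is not the API of Nirvanix, Iron Mountain, or any other provider. It uses Linux extended attributes as a stand-in for a provider's tagging call, and the attribute names and policy values are invented for the example.

```python
#!/usr/bin/env python3
"""Illustration of tagging a file with migration and retention policy
at the point of creation. Extended attributes (Linux file systems such
as ext4 or XFS) stand in for a cloud provider's tagging API; the
attribute names below are hypothetical."""

import os

def create_with_policy(path, data, migrate_after_days, retain_years):
    with open(path, "wb") as f:
        f.write(data)
    # Hypothetical tags a migration engine could read later.
    os.setxattr(path, "user.policy.migrate_after_days",
                str(migrate_after_days).encode())
    os.setxattr(path, "user.policy.retain_years",
                str(retain_years).encode())

def read_policy(path):
    return {attr: os.getxattr(path, attr).decode()
            for attr in os.listxattr(path)
            if attr.startswith("user.policy.")}

if __name__ == "__main__":
    create_with_policy("report.pdf", b"...",
                       migrate_after_days=30, retain_years=7)
    print(read_policy("report.pdf"))
```

The point is simply that the policy travels with the file from the moment it exists, so the storage manager never has to guess at its value later.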

Alternatively, there is the concept of global file systems, or file virtualization. Some manufacturers, like NetApp, build this into their NAS solutions; others offer a broader solution that supports multiple manufacturers' storage. Companies like F5, AutoVirt, and EMC's Rainfinity all provide this technology. Think of file virtualization as DNS for files: you don't need to know a file's location, just its name.
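A toy sketch of the DNS-for-files analogy, with made-up filer names and paths: clients resolve a logical name through a global namespace and never see which physical device holds the data.

```python
#!/usr/bin/env python3
"""Toy model of the global namespace behind file virtualization.
The logical paths, filer names, and shares are invented for the example."""

class GlobalNamespace:
    def __init__(self):
        # logical path -> physical location (filer, share, path)
        self._table = {}

    def publish(self, logical_path, physical_location):
        self._table[logical_path] = physical_location

    def resolve(self, logical_path):
        return self._table[logical_path]

ns = GlobalNamespace()
ns.publish("/corp/finance/q3.xls", r"\\filer-a\vol1\finance\q3.xls")

# A migration engine can move the data and simply update the mapping;
# the logical name users see never changes.
ns.publish("/corp/finance/q3.xls", r"\\archive-b\vol7\finance\q3.xls")
print(ns.resolve("/corp/finance/q3.xls"))
```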

These systems can act automatically on file attributes like age or last access, moving data from one storage platform to another transparently to the user. Something to be careful with is the level of granularity at which the policy can be applied. Some file virtualization solutions are only granular to the folder level, meaning the folder as a whole has to meet the policy, not the individual files within it.
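Here is a short sketch of that granularity difference, assuming a 180-day age policy: a file-level policy qualifies each stale file on its own, while a folder-level policy qualifies a folder only when every file inside it is stale.

```python
#!/usr/bin/env python3
"""Illustration of file-level vs. folder-level policy granularity.
The 180-day threshold is an assumed policy value."""

import os
import time

STALE_AFTER_DAYS = 180  # assumed policy threshold
CUTOFF = time.time() - STALE_AFTER_DAYS * 86400

def is_stale(path):
    return os.stat(path).st_atime < CUTOFF

def files_to_move_file_level(folder):
    """File-level granularity: each stale file qualifies on its own."""
    paths = [os.path.join(folder, f) for f in os.listdir(folder)]
    return [p for p in paths if os.path.isfile(p) and is_stale(p)]

def folder_qualifies_folder_level(folder):
    """Folder-level granularity: one recently used file keeps the whole
    folder (and everything in it) on the primary tier."""
    paths = [os.path.join(folder, f) for f in os.listdir(folder)]
    files = [p for p in paths if os.path.isfile(p)]
    return bool(files) and all(is_stale(p) for p in files)
```

With folder-level granularity, a single active file can pin gigabytes of dormant data to the primary tier, which is exactly the behavior to check for before buying.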

To address the ever-increasing variety of data tiers available, data management is quickly evolving: manual tools are becoming easier to use, self-service models are coming to market, and technologies like file virtualization that make the whole process transparent are maturing. Maybe data management won't require so much brain power in the future.


George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.


