
Commentary
7/12/2010 10:27 AM
George Crump

Containing The Cost Of Keeping Data Forever - Capacity

As we stated when we began the Keeping Data Forever strategy, the reason we can even consider this as a viable strategy is because technology has provided us with solutions to the challenges associated with it. In this entry we will look at some of the ways to contain the costs associated with this strategy. We will look at containing the capacity costs.

The first and most obvious cost to contain is the cost of storing all of this information forever. There is little doubt that disk archive systems can now scale to the multi-petabyte capacities a keep-it-forever strategy may entail, but you don't want to buy all of that capacity today. Look for a system that lets you add capacity only as you need it, not before. And just because many storage systems can meet the capacity demands of a keep-it-forever strategy doesn't mean you should consume that capacity if you don't have to. You are going to want to curtail capacity growth as best you can, which makes capacity optimization a key requirement.

Capacity optimization should come in at least two forms: compression and deduplication. Deduplication, the ability to identify redundant data and store it only once, captures all the attention. While it is important in a keep-data-forever strategy, an archive should not contain the level of redundancy that backups do. As we stated in our article "Backup vs. Archive", backups send essentially the same data over and over again. Archiving should be a one-time event: data is archived once, replicated to a redundant archive, and then removed from primary storage. Some duplication will certainly exist, but deduplication will not deliver the same return on investment here that it does in backup. Your mileage will vary, but expect about a 3X to 5X reduction.
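To make the deduplication idea concrete, here is a minimal sketch of content-addressed, fixed-size-chunk deduplication in Python. It is an illustration only, not any particular archive product's implementation; the ChunkStore class, the 4 KB chunk size, and the ratio helper are all hypothetical.

import hashlib

class ChunkStore:
    """Minimal content-addressed store: each unique chunk is kept once."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}       # sha256 digest -> chunk bytes
        self.raw_bytes = 0     # bytes submitted by callers
        self.stored_bytes = 0  # bytes actually kept after dedup

    def add(self, data):
        """Split data into fixed-size chunks, keep each unique chunk once,
        and return the list of digests needed to rebuild the original."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.raw_bytes += len(chunk)
            if digest not in self.chunks:   # duplicate chunks are not stored again
                self.chunks[digest] = chunk
                self.stored_bytes += len(chunk)
            recipe.append(digest)
        return recipe

    def reduction_ratio(self):
        """Logical-to-physical ratio, e.g. 3.0 means a 3X reduction."""
        return self.raw_bytes / self.stored_bytes if self.stored_bytes else 0.0

Feeding the same file to add() twice stores its chunks only once, which is exactly why backup streams (full after full) deduplicate so well and a one-time archive does not.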

While compression does not deliver the same percentage gains that deduplication does when duplicate data is present, compression works across almost all data, redundant or not. Cutting every file roughly in half may yield greater total savings than achieving a 3X reduction on only a few files. The best choice, though, is to combine the two techniques for maximum total reduction.
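As a rough illustration of combining the two techniques, the hypothetical Python helper below deduplicates blocks first and then compresses only the unique ones with zlib; the function name and block handling are assumptions for illustration, not a description of any specific archive system.

import hashlib
import zlib

def store_blocks(blocks, store=None):
    """Deduplicate blocks, then compress only the unique ones.
    Returns (logical_bytes, physical_bytes) so the combined reduction can be measured."""
    store = {} if store is None else store   # digest -> compressed block
    logical = physical = 0
    for block in blocks:
        logical += len(block)
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:              # only new, unique blocks cost capacity
            packed = zlib.compress(block)    # compression helps even non-redundant data
            store[digest] = packed
            physical += len(packed)
    return logical, physical

Even when few blocks repeat, the compression step can still roughly halve typical compressible data, which is where the across-the-board savings come from.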

Tape as a means to contain capacity costs cannot be left out of the discussion. As we discuss in our article "What is LTFS?", IBM's new tape-based file system for LTO makes tape more viable than ever for long-term data retention. Tape and disk archives should no longer be viewed as competitors but as complementary: disk fills the intermediate role of storing data for three to seven years, and tape stores data for the remainder of the retention period. Several solutions can automatically move data from disk to tape after a given timeframe.
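A sketch of how such an automated disk-to-tape move might look, assuming an LTFS volume mounted as an ordinary file system: the mount paths, the five-year cutoff (the middle of the three-to-seven-year window), and the migrate_old_files helper are hypothetical, not features of any specific product.

import shutil
import time
from pathlib import Path

DISK_ARCHIVE = Path("/archive/disk")   # hypothetical disk-archive mount
TAPE_ARCHIVE = Path("/archive/ltfs")   # hypothetical LTFS (tape) mount
MAX_DISK_AGE = 5 * 365 * 24 * 3600     # ~5 years, in seconds

def migrate_old_files(now=None):
    """Move any archived file older than the cutoff from disk to the LTFS mount.
    Because LTFS presents the tape as a file system, a plain move is enough."""
    now = time.time() if now is None else now
    for path in DISK_ARCHIVE.rglob("*"):
        if path.is_file() and now - path.stat().st_mtime > MAX_DISK_AGE:
            target = TAPE_ARCHIVE / path.relative_to(DISK_ARCHIVE)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(target))   # the write to tape happens via LTFS

Run on a schedule, a policy like this keeps the disk tier within its three-to-seven-year role while tape absorbs the long tail.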

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
