News | Commentary
6/15/2010 10:00 AM
George Crump

Revisiting The Keep It All Forever Retention Strategy

Each day, it seems, a new regulation is placed on businesses, and almost every one adds to the data management burden in the data center. In the past I have advised against the keep-it-all-forever mentality of data retention, but now it may be the only way left to protect the business.

The first justification for a keep-it-all-forever mentality is that the technology to archive years, even decades, worth of data is more viable than ever. If you are going to keep everything, accessibility to that archive is critical. Disk-based archive technology meets that demand: it has improved its ability to scale to multiple petabytes of capacity and to optimize storage consumption through compression and deduplication. Tape as an archive does not want to be left out either. As we discuss in our article "What is LTFS?", LTO-5 brings a new ease-of-use factor to tape-based data. LTFS potentially makes reading data from tape as simple as reading data from a USB thumb drive.
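The deduplication that disk-based archives lean on can be sketched in a few lines: split data into chunks, key each chunk by its hash, and store each unique chunk only once. The toy Python example below uses fixed-size chunking and an in-memory store; the `ChunkStore` name and 4 KB chunk size are illustrative assumptions, not any vendor's implementation (real products typically use variable-size, content-defined chunking).

```python
# Minimal sketch of deduplicated storage: each unique chunk is stored
# once, keyed by its SHA-256 digest; files are lists of digests.
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking, for simplicity

class ChunkStore:
    """In-memory dedup store (illustrative only)."""
    def __init__(self):
        self.chunks = {}  # digest -> chunk bytes (unique chunks only)
        self.files = {}   # filename -> ordered list of digests

    def put(self, name, data):
        digests = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            d = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(d, chunk)  # store only if not seen before
            digests.append(d)
        self.files[name] = digests

    def get(self, name):
        # Reassemble the file from its chunk digests.
        return b"".join(self.chunks[d] for d in self.files[name])

store = ChunkStore()
store.put("a.bin", b"x" * 8192)  # two identical 4 KB chunks
store.put("b.bin", b"x" * 8192)  # a fully duplicate file
print(len(store.chunks))                   # 1 -- only one unique chunk kept
print(store.get("b.bin") == b"x" * 8192)   # True -- data reassembles intact
```

Two files totaling 16 KB reduce to a single stored 4 KB chunk, which is the capacity optimization the archive vendors are after.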

The cloud as an archive cannot be left out of the discussion either. It provides the most incrementally scalable option, and for businesses that don't have petabytes of storage to archive it may be the most cost-effective option as well. Depending on the provider, cloud storage may also deliver better long-term operational efficiency, relieving the data center of the data management burden in addition to saving upfront costs.

The second justification for a keep-it-all-forever mentality is the improved ability to move older data to secondary storage. Data archiving software has certainly improved, but most interesting are file virtualization products. As we discuss in our article "What is File Virtualization", the ability to seamlessly move data to and from an archive-optimized disk repository without constant IT intervention is critical. In a keep-everything strategy, policies need to be set broadly and then executed automatically; there may no longer be the time or resources available to manage down at the file level. As the number of regulations increases and focused enforcement of those regulations continues, the number of recovery/discovery requests is going to skyrocket. IT cannot afford to be involved in delivering this data, nor does it really want to be. The transparent recovery that file virtualization provides is key.

In our next entry we will discuss what else has to be in place for a keep-it-all-forever data retention strategy to be successful. We will then conclude this series with a cost justification of a keep-everything strategy vs. a strict data-elimination strategy.

Track us on Twitter: http://twitter.com/storageswiss


George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
