Commentary | George Crump | 6/15/2010 10:00 AM

Revisiting The Keep It All Forever Retention Strategy

Each day, it seems, a new regulation is placed on businesses, and almost every one of them adds to the data management burden in the data center. In the past I have advised against the keep-it-all-forever mentality of data retention, but now it may be the only way left to protect the business.

The first justification for a keep-it-all-forever mentality is that the technology to archive years and even decades worth of data is more viable than ever. If you are going to keep everything, accessibility to this archive is critical. Disk-based archive technology meets that demand: it has improved its ability to scale to multiple petabytes of capacity and to optimize that capacity through compression and deduplication. Tape as an archive does not want to be left out either. As we discuss in our article "What is LTFS?", LTO-5 brings a new ease-of-use factor to tape-based data; LTFS potentially makes reading data from tape as simple as reading data from a USB thumb drive.
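To illustrate where deduplication's capacity savings come from, here is a minimal sketch of content-addressed chunk storage in Python. The chunk size, store layout, and function names are assumptions for illustration only, not the design of any particular archive product.

import hashlib
import os

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB chunks (an illustrative choice)

def archive_file(path, store_dir):
    """Split a file into chunks and store each unique chunk once,
    keyed by its SHA-256 hash. Returns the list of chunk hashes
    (the 'recipe') needed to reassemble the file later."""
    os.makedirs(store_dir, exist_ok=True)
    recipe = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            chunk_path = os.path.join(store_dir, digest)
            # Identical chunks across files are written only once --
            # this is where a keep-everything archive earns back capacity.
            if not os.path.exists(chunk_path):
                with open(chunk_path, "wb") as out:
                    out.write(chunk)
            recipe.append(digest)
    return recipe

A second copy of the same file, or a file that shares most of its blocks with one already archived, adds almost nothing to the store.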

The cloud as an archive cannot be left out of the discussion either. It provides the most incrementally scalable option, and for businesses that don't have petabytes of data to archive it may be the most cost-effective option as well. Depending on the provider, cloud storage may also deliver better long-term operational efficiency, relieving the data center of the data management burden in addition to saving upfront costs.
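The incremental nature of cloud archiving is easy to see in practice. The sketch below assumes an S3-compatible provider and the boto3 library; the bucket name and key prefix are placeholders, and the storage class or archive tier would depend on your provider.

import boto3

def archive_to_cloud(local_path, key, bucket="example-archive-bucket"):
    """Push one archive object to an S3-compatible store.
    Capacity grows one object at a time -- no upfront purchase of
    petabytes of disk or tape."""
    s3 = boto3.client("s3")
    # Pick the provider's archive/infrequent-access tier as appropriate.
    s3.upload_file(local_path, bucket, f"archive/{key}")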

The second justification for a keep-it-all-forever mentality is the improved ability to move older data to secondary storage. Data archiving software has certainly improved, but most interesting are file virtualization products. As we discuss in our article "What is File Virtualization?", the ability to seamlessly move data to and from an archive-optimized disk repository without constant IT intervention is critical. In a keep-everything strategy, policies need to be set broadly and then executed automatically; there may no longer be the time or resources available to manage down at the file level. As the number of regulations increases and focused enforcement of those regulations continues, the number of recovery/discovery requests is going to skyrocket. IT cannot afford to be involved in delivering this data, nor does it really want to be. The transparent recovery that file virtualization provides is key.
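To make the "set policies broadly, execute automatically" idea concrete, here is a minimal sketch of an age-based move policy in Python. The directory names, the one-year threshold, and the symlink-stub approach are assumptions for illustration; a real file virtualization layer performs this kind of movement transparently within the namespace itself rather than through stubs.

import os
import shutil
import time

AGE_THRESHOLD = 365 * 24 * 3600  # move anything untouched for a year (illustrative)

def tier_old_files(primary_dir, archive_dir):
    """Move files not accessed within the threshold to the archive tier,
    leaving a symlink behind so the original path still resolves."""
    now = time.time()
    for root, _dirs, files in os.walk(primary_dir):
        for name in files:
            src = os.path.join(root, name)
            if os.path.islink(src):
                continue  # already tiered
            if now - os.path.getatime(src) < AGE_THRESHOLD:
                continue  # still active, leave it on primary storage
            dst = os.path.join(archive_dir, os.path.relpath(src, primary_dir))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)
            # The stub keeps access transparent to users and applications,
            # so IT is not pulled into every recovery or discovery request.
            os.symlink(dst, src)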

In our next entry we will discuss what else has to be in place for a keep-it-all-forever data retention strategy to be successful. We will then conclude this series with a cost justification of a keep-everything strategy vs. a strict data-elimination strategy.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
