6/15/2010 10:00 AM
George Crump
Commentary

Revisiting The Keep It All Forever Retention Strategy

Each day a seemingly new regulation is placed on businesses, and almost every one of these regulations adds to the data management burden in the data center. In the past I have advised against the keep it all forever mentality of data retention, but now it may be the only way left to protect the business.

The first justification for a keep it all forever mentality is that the technology to archive years, even decades, worth of data is more viable than ever. If you are going to keep everything, accessibility to the archive is critical. Disk-based archive technology meets that demand: it has improved its ability to scale to multiple petabytes of capacity and to optimize that capacity through compression and deduplication. Tape does not want to be left out of the archive discussion either. As we discuss in our article "What is LTFS?", LTO-5 brings a new ease-of-use factor to tape-based data. LTFS potentially makes reading data from tape as simple as reading data from a USB thumb drive.
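To make the capacity-optimization point concrete, here is a minimal sketch of the idea behind chunk-level deduplication: data is split into fixed-size chunks, each chunk is keyed by its hash, and a chunk shared by many files is stored only once. This is an illustration of the general technique, not the implementation any particular archive product uses; the chunk size and data are made up for the example.

```python
import hashlib

def store_chunks(data: bytes, chunk_size: int, store: dict) -> list:
    """Split data into fixed-size chunks, storing each unique chunk once
    keyed by its SHA-256 digest. Returns the key list needed to
    reassemble the original data."""
    keys = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # a duplicate chunk costs no new space
        keys.append(key)
    return keys

def restore(keys: list, store: dict) -> bytes:
    """Reassemble data from its chunk keys."""
    return b"".join(store[k] for k in keys)

store = {}
file_a = b"A" * 4096 + b"B" * 4096
file_b = b"A" * 4096 + b"C" * 4096  # shares its first chunk with file_a
keys_a = store_chunks(file_a, 4096, store)
keys_b = store_chunks(file_b, 4096, store)
# 16 KB of logical data is held in only 3 unique 4 KB chunks
```

The more redundancy there is across an archive (old backups, copied project folders), the larger the gap between logical and physical capacity becomes.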

The cloud as an archive cannot be left out of the discussion either. It provides the most incrementally scalable option, and for businesses that do not have petabytes of storage to archive it may be the most cost-effective option as well. Depending on the provider, cloud storage may also deliver better long-term operational efficiency, relieving the data center of the data management burden in addition to saving upfront costs.
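The upfront-versus-incremental trade-off can be framed as simple arithmetic. The sketch below compares a rough five-year cost of owning archive disk against paying a provider per TB-month; every price in it is a purely hypothetical assumption for illustration, not a quote from any vendor.

```python
def five_year_cost(tb: float,
                   disk_cost_per_tb: float = 300.0,      # hypothetical upfront $/TB
                   disk_opex_per_tb_year: float = 60.0,  # hypothetical power/admin $/TB/yr
                   cloud_cost_per_tb_month: float = 8.0  # hypothetical $/TB-month
                   ) -> tuple:
    """Return (on_prem, cloud) five-year cost estimates in dollars.
    On-prem pays capital upfront plus yearly operating expense;
    cloud pays only a monthly per-TB fee."""
    on_prem = tb * disk_cost_per_tb + tb * disk_opex_per_tb_year * 5
    cloud = tb * cloud_cost_per_tb_month * 12 * 5
    return on_prem, cloud

on_prem_50tb, cloud_50tb = five_year_cost(50)
```

The useful point is not the specific numbers but the shape of the spend: cloud converts a large capital purchase into an operating expense that grows only as the archive does.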

The second justification for a keep it all forever mentality is the improved ability to move older data to secondary storage. Data archiving software has certainly improved, but most interesting are file virtualization products. As we discuss in our article "What is File Virtualization?", the ability to seamlessly move data to and from an archive-optimized disk repository without constant IT intervention is critical. In a keep-everything strategy, policies need to be set broadly and then executed automatically; there may no longer be the time or resources available to manage down at the file level. As the number of regulations increases and focused enforcement of those regulations continues, the number of recovery/discovery requests is going to skyrocket. IT cannot afford to be involved in delivering this data, nor does it really want to be. The transparent recovery that file virtualization provides is key.
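The broad, automatically executed policy described above can be sketched in a few lines. The example moves any file untouched for a year to an archive path and leaves a symlink behind so applications still find it at the original location, a crude stand-in for the transparent recall a real file virtualization layer provides. The one-year threshold is an arbitrary illustrative policy, not a recommendation.

```python
import os
import shutil
import time

ARCHIVE_AFTER_DAYS = 365  # hypothetical broad policy: untouched for a year

def migrate_old_files(primary: str, archive: str,
                      max_age_days: int = ARCHIVE_AFTER_DAYS) -> list:
    """Move files whose last access is older than max_age_days from
    primary to archive storage, leaving a symlink at the original
    path so readers are redirected transparently. Returns the names
    of the files that were migrated."""
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for name in os.listdir(primary):
        src = os.path.join(primary, name)
        if os.path.isfile(src) and not os.path.islink(src):
            if os.stat(src).st_atime < cutoff:
                dst = os.path.join(archive, name)
                shutil.move(src, dst)
                os.symlink(dst, src)  # original path still resolves
                moved.append(name)
    return moved
```

A production file virtualization product adds what this sketch omits: recall on access, policy by file type and owner, and an index that survives hardware refreshes, which is exactly why the hands-off operation matters at scale.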

In our next entry we will discuss what else has to be in place for a keep it all forever data retention strategy to be successful. We will then conclude this series with a cost justification of a keep everything strategy vs. a strict data elimination strategy.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
