Commentary
6/15/2010 10:00 AM
George Crump

Revisiting The Keep It All Forever Retention Strategy

Each day a seemingly new regulation is placed on businesses, and almost every one adds to the data management burden in the data center. In the past I have advised against the keep-it-all-forever mentality of data retention, but now it may be the only way left to protect the business.

The first justification for a keep-it-all-forever strategy is that the technology to archive years, even decades, worth of data is more viable than ever. If you are going to keep everything, accessibility to this archive is critical. Disk-based archive technology meets that demand: it has improved its ability to scale to multiple petabytes of capacity and to optimize that capacity through compression and deduplication. Tape as an archive does not want to be left out either. As we discuss in our article "What is LTFS?", LTO-5 brings a new ease-of-use factor to tape-based data. LTFS potentially makes reading data from tape as simple as reading data from a USB thumb drive.

The cloud as an archive cannot be left out of the discussion either. It provides the most incrementally scalable option, and for businesses that don't have petabytes of data to archive it may be the most cost-effective option as well. Depending on the cloud storage provider, it may also deliver better long-term operational efficiency, relieving the data center of the data management burden in addition to saving upfront costs.
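The upfront-cost trade-off above can be sketched with a simple back-of-the-envelope model. Everything in this example is a hypothetical assumption for illustration: the per-TB prices, the growth rate, and the idea that on-premises disk is provisioned for peak capacity while cloud is billed monthly on current usage. It is not a vendor quote or a definitive TCO method.

```python
# Hypothetical cost model: on-premises disk archive vs. cloud archive
# over a multi-year retention horizon. All prices and growth rates are
# illustrative assumptions, not vendor figures.

def archive_costs(tb_start, growth_rate, years,
                  onprem_capex_per_tb=300.0,   # assumed upfront $/TB for disk
                  onprem_opex_per_tb=50.0,     # assumed yearly power/admin $/TB
                  cloud_per_tb_month=10.0):    # assumed cloud archive $/TB/month
    """Return (on-prem total, cloud total) in dollars over the horizon."""
    onprem = cloud = 0.0
    tb = tb_start
    for _ in range(years):
        onprem += tb * onprem_opex_per_tb        # yearly operating cost
        cloud += tb * cloud_per_tb_month * 12    # pay-as-you-go, billed monthly
        tb *= 1 + growth_rate                    # archive keeps growing
    # On-prem must also buy disk for the peak capacity it ends up holding.
    onprem += tb * onprem_capex_per_tb
    return onprem, cloud

onprem, cloud = archive_costs(tb_start=50, growth_rate=0.3, years=5)
print(f"on-prem: ${onprem:,.0f}  cloud: ${cloud:,.0f}")
```

Varying `tb_start` shows the article's point: below a certain archive size the pay-as-you-go cloud total undercuts the capital outlay, while at petabyte scale the on-premises numbers start to win.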

The second justification for a keep-it-all-forever mentality is the improved ability to move older data to secondary storage. Data archiving software has certainly improved, but most interesting are file virtualization products. As we discuss in our article "What is File Virtualization", the ability to seamlessly move data to and from an archive-optimized disk repository without constant IT intervention is critical. In a keep-everything strategy, policies need to be set broadly and then executed automatically. There may no longer be the time or resources available to manage down at the file level. As the number of regulations increases and focused enforcement of those regulations continues, the number of recovery/discovery requests is going to skyrocket. IT cannot afford to, nor does it really want to, be involved in delivering this data. The transparent recovery that file virtualization provides is key.
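A broad, automatically executed policy of the kind described above can be sketched as follows. This is a minimal illustration, not any product's actual mechanism: real file virtualization leaves a transparent stub behind so recovery is invisible to users, whereas this sketch simply relocates files. The paths and the 365-day idle threshold are assumptions chosen for the example.

```python
# Minimal sketch of a broad, automated archive policy: any file not
# accessed within a set period is moved from primary to archive storage,
# preserving the relative directory layout. The threshold and tier
# locations are illustrative assumptions.

import os
import shutil
import time

def apply_archive_policy(primary, archive, max_idle_days=365):
    """Move files idle for more than max_idle_days from primary to
    archive. Returns the relative paths of the files that were moved."""
    cutoff = time.time() - max_idle_days * 86400
    moved = []
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getatime(src) < cutoff:      # last access too old
                rel = os.path.relpath(src, primary)
                dst = os.path.join(archive, rel)
                os.makedirs(os.path.dirname(dst) or archive, exist_ok=True)
                shutil.move(src, dst)
                moved.append(rel)
    return moved
```

The point of setting the policy this broadly is exactly what the article argues: one rule covers the whole tree, runs unattended on a schedule, and nobody has to make per-file decisions.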

In our next entry we will discuss what else has to be in place for a keep-it-all-forever data retention strategy to be successful. We will then conclude this series with a cost justification of a keep-everything strategy vs. a strict data-elimination strategy.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
