Risk | Commentary
12/10/2012 12:24 PM
George Crump

Find The Right Data Archive Method

Backup-as-archive is an increasingly viable storage solution, especially for companies that don't have strict data retention requirements.

In almost every study I've done and seen, one fact remains consistent: at least 75% of the data that's stored on primary storage has not been accessed for more than one year.

This data really should be archived. Products that move such data transparently to an archive have improved dramatically in recent years, and it may be time to reconsider data archiving. In this column I'll look at the competing methods for archiving data; in my next column I'll look at some of the competing archive targets.

Backup As Archive

While not the ideal location, backup is the archive for many companies. Archive purists will not agree with me, but I believe backup products can in some cases solve the archive need, especially for companies that don't need to meet government regulations or other requirements on retaining data. Backup may also be the most realistic way to archive data since most companies are already doing it. As I discussed in this article, many organizations count on backups for long-term data retention instead of using a separate archive product.


One reason backup-as-archive has lately gained legitimacy is that backup software can now build larger metadata tables (data about the data in the backup) and search that metadata more effectively; some products even offer content search. Improvements in backup products' scalability are another reason the backup-as-archive approach is more practical than it used to be.
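To make the metadata point concrete, here is a minimal sketch of the kind of searchable backup catalog described above: one row of metadata per backed-up file, queryable long after the backup job ran. The schema, paths, and sample rows are hypothetical, not any vendor's actual catalog format.

```python
import sqlite3

# Toy backup catalog: file metadata plus an indexed content summary.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE catalog (
        path TEXT, backup_job TEXT, backed_up_at TEXT,
        size_bytes INTEGER, content_summary TEXT
    )
""")
conn.executemany(
    "INSERT INTO catalog VALUES (?, ?, ?, ?, ?)",
    [
        ("/finance/q3-report.xlsx", "weekly-042", "2012-10-01", 420_000,
         "quarterly revenue forecast"),
        ("/hr/handbook.pdf", "weekly-042", "2012-10-01", 2_100_000,
         "employee policies"),
    ],
)

def search(term):
    """Find backed-up files by path or by indexed content."""
    cur = conn.execute(
        "SELECT path, backup_job FROM catalog "
        "WHERE path LIKE ? OR content_summary LIKE ?",
        (f"%{term}%", f"%{term}%"),
    )
    return cur.fetchall()

print(search("revenue"))  # -> [('/finance/q3-report.xlsx', 'weekly-042')]
```

The point is that once this metadata exists and is indexed, "where is that file?" becomes a query against the catalog rather than a restore-and-look exercise.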

The key limiting factor for disk backup products has not been how many disks they can add to a shelf, but how far their deduplication tables scale, which is another metadata issue. One approach we've seen vendors take is to segment the deduplication table into multiple tables as the data ages. This lowers deduplication effectiveness but allows for longer retention without lengthy deduplication table lookups dragging down current backup performance. Eventually, though, deduplication engines will need to be improved in order to scale, as discussed in this article.
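The age-segmentation trade-off can be illustrated with a toy sketch. The chunk size and fingerprinting scheme here are deliberate simplifications, not a real product's design: new chunks are deduplicated only against the current era's table, so a chunk that also exists in an aged-out era gets stored twice (lower dedupe ratio), but lookups never have to scan the large historical tables.

```python
import hashlib

CHUNK = 4  # unrealistically small chunk size, for illustration only

def fingerprints(data: bytes):
    """Fingerprint every fixed-size chunk of a data stream."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)}

old_era = fingerprints(b"ABCDEFGHWXYZ")  # table for aged-out backups
current = set()                          # active dedupe table

def ingest(data: bytes):
    """Dedupe new chunks against the *current* table only."""
    stored = 0
    for i in range(0, len(data), CHUNK):
        h = hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        if h not in current:   # old_era is deliberately not consulted
            current.add(h)
            stored += 1        # chunk written even if an old era has it
    return stored

# "ABCD" also exists in the old era, but is stored again:
print(ingest(b"ABCD1234"))  # -> 2
print(ingest(b"ABCD1234"))  # -> 0 (fully deduped within the current era)
```

Consulting only the small current-era table is what keeps backup-window performance flat as the archive grows, at the cost of some duplicate chunks across eras.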

One thing the backup-as-archive method typically does not address is the problem cited above: the stale data is never removed from primary storage. Backup-as-archive is best for companies that are less concerned with how much data they keep on primary storage and primarily need a way to retain information in case they need it later.

Archive As Archive

Because backup is becoming more viable as a long-term retention area, archive solutions are taking a different approach. Just as the solutions that move data from primary storage to archive storage are improving, so is the ability to browse the archive independently of a specific archive application. Most archives now simply show up as a network mount. They can also combine tape and disk to deliver excellent restore performance at maximum cost-effectiveness.

The key to archive success is to move it upstream, where it can play a more active role in primary storage. Because of the high level of transparency and fast recovery time, archiving data after 90 days of data inactivity will likely have no impact on productivity -- and maximum impact on cost reduction.
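As a rough illustration of such a 90-day policy, the sketch below walks primary storage, moves anything not accessed in 90 days to an archive mount, and leaves a symlink behind so users still find the file in its original place. The paths are placeholders, and real archive movers use stub files or filesystem drivers rather than plain symlinks; this only shows the shape of the policy.

```python
import shutil
import time
from pathlib import Path

DAYS = 90  # inactivity threshold from the policy described above

def archive_inactive(primary: Path, archive: Path, now=None):
    """Move files unread for DAYS days to the archive mount,
    leaving a symlink stub so the path still resolves on primary."""
    now = now or time.time()
    cutoff = now - DAYS * 86400
    # Collect candidates first so we never mutate a tree mid-walk.
    candidates = [f for f in primary.rglob("*")
                  if f.is_file() and not f.is_symlink()
                  and f.stat().st_atime < cutoff]
    for f in candidates:
        dest = archive / f.relative_to(primary)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(dest))
        f.symlink_to(dest)  # transparent stub on primary storage
    return candidates
```

Because the stub resolves transparently, a user opening an archived file simply experiences a slightly slower read, which is why a 90-day threshold can cut primary capacity sharply with little visible impact.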

There's a lot to be gained by removing 75% or more of your data from primary storage: backups get faster, and an investment in higher-speed storage (SSD) for the remaining data becomes easier to justify. Data integrity will also improve, since most archive solutions perform ongoing data integrity checks, protecting you from silent data corruption (bit rot).

In my next column I'll look at some of the products that are competing for your archive dollars: disk appliances, object storage systems, cloud storage providers, and, of course, tape.


Comments
psmails950, 12/11/2012 | 10:03:54 PM
re: Find The Right Data Archive Method
Great article, George. Many organizations start with a backup-as-archive strategy but eventually recognize the limitations of a "backup only" approach. As you stated, the difference between archiving and simple data retention lies in the ability to extract value from your data, but the most recent IDC Digital Universe study illustrates that less than 3% of potentially useful data is even tagged, meaning most organizations don't even know what they're saving.

In response to that challenge, forward thinking organizations treat archiving and backup as different but complementary pillars of an overall data protection strategy and are investing in archiving solutions like EMC SourceOne that enable them to classify application and file data regardless of location, make intelligent decisions about where to store the data, and efficiently search the data for compliance or other business purposes.

Although archiving and backup requirements are best served by separate applications, an increasing number of organizations are also benefiting from the tremendous cost and efficiency gains of leveraging a common protection storage solution like EMC Data Domain for backup and archive. If anyone is interested in learning more, check out www.emc.com/archive.