Risk
12/10/2012 12:24 PM
George Crump
Commentary

Find The Right Data Archive Method

Backup-as-archive is an increasingly viable storage solution, especially for companies that don't have strict data retention requirements.

In almost every study I've done and seen, one fact remains consistent: at least 75% of the data that's stored on primary storage has not been accessed for more than one year.

This data really should be archived. Products that move such data transparently to an archive have improved dramatically in recent years, and it may be time to reconsider data archiving. In this column I'll look at the competing methods for archiving data; in my next column I'll look at some of the competing archive targets.
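Identifying those archive candidates is straightforward in principle: walk the file system and flag anything whose last access time is older than a year. A minimal sketch (assuming access times are being recorded; many systems mount with `relatime` or `noatime`, which weakens this signal):

```python
import time
from pathlib import Path

ONE_YEAR = 365 * 24 * 3600  # seconds

def stale_files(root, max_age=ONE_YEAR, now=None):
    """Yield (path, days_idle) for regular files whose last access
    time is older than max_age seconds -- archive candidates."""
    now = time.time() if now is None else now
    for path in Path(root).rglob("*"):
        if path.is_file():
            idle = now - path.stat().st_atime
            if idle > max_age:
                yield path, idle / 86400
```

A real archiving product would also weigh ownership, file type, and retention policy, but even this crude scan is usually enough to confirm the 75% figure on your own primary storage.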

Backup As Archive

While not the ideal location, backup is the archive for many companies. Archive purists will not agree with me, but I believe backup products can in some cases solve the archive need, especially for companies that don't need to meet government regulations or other requirements on retaining data. Backup may also be the most realistic way to archive data since most companies are already doing it. As I discussed in this article, many organizations count on backups for long-term data retention instead of using a separate archive product.


One reason backup archiving has lately gained legitimacy is that backup software can now create larger meta-data tables (data about the data in the backup) and can better search that data. Some products now even offer content search capabilities. Improvements in backup products' scalability are another reason the backup-as-archive approach is more practical than it has been.
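Conceptually, that meta-data catalog is just a searchable table of what each backup contains -- file names plus extracted content snippets. A minimal sketch of the idea using SQLite (the table layout and function names here are illustrative assumptions, not any vendor's schema):

```python
import sqlite3

def make_catalog():
    """In-memory catalog of backed-up files: path, size, backup
    date, and an extracted-content snippet for content search."""
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE catalog (
        path TEXT, size INTEGER, backed_up TEXT, content TEXT)""")
    return db

def index_file(db, path, size, backed_up, content):
    db.execute("INSERT INTO catalog VALUES (?, ?, ?, ?)",
               (path, size, backed_up, content))

def search(db, term):
    """Match the term against both file names and extracted content."""
    like = f"%{term}%"
    return [row[0] for row in db.execute(
        "SELECT path FROM catalog WHERE path LIKE ? OR content LIKE ?",
        (like, like))]
```

The point of the sketch is the shape of the problem: once the catalog holds content as well as names, "find every backup containing this phrase" becomes a query instead of a restore-and-grep exercise.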

The key limiting factor for disk backup products has not been how many disks they can add to the shelf, but how far their deduplication tables scale. This is another meta-data issue. One approach we've seen vendors take is to segment their deduplication table into multiple tables as the data ages. This lowers deduplication effectiveness, but allows for longer storage without impacting current backup performance due to lengthy deduplication table lookups. Eventually, though, deduplication engines will need to be improved in order to scale, as discussed in this article.
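The segmentation trick can be sketched in a few lines. In this toy content-addressed store (an illustration of the approach described above, not any vendor's implementation), new chunks dedupe only against the current table segment, so write-path lookups stay small; the cost is that a chunk re-appearing after a rollover is stored again:

```python
import hashlib

class SegmentedDedup:
    """Content-addressed chunk store whose hash table is rolled
    over as data ages: writes dedupe only against the current
    segment, trading some duplication for fast lookups."""

    def __init__(self):
        self.segments = [{}]  # list of {hash: chunk}; last is current

    def store(self, chunk: bytes) -> str:
        h = hashlib.sha256(chunk).hexdigest()
        current = self.segments[-1]
        if h not in current:      # only the current segment is checked
            current[h] = chunk
        return h

    def rollover(self):
        """Retire the current segment, e.g. as data ages past a cutoff."""
        self.segments.append({})

    def fetch(self, h: str) -> bytes:
        for seg in reversed(self.segments):  # reads search all segments
            if h in seg:
                return seg[h]
        raise KeyError(h)
```

Note the asymmetry: reads still consult every segment, but reads are rare on aged data; it is the write-path lookup that must stay fast for current backup performance.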

One thing the backup-as-archive method typically does not deliver is the benefit cited above: removal of stale data from primary storage. Backup-as-archive is best for companies that are less concerned with how much data they are storing on primary storage and primarily need a way to retain information in case they need it later.

Archive As Archive

Because backup is becoming more viable as a long-term retention area, archive solutions are taking a different approach. Just as the solutions that move data from primary storage to archive storage are improving, so is the ability to browse the archive independently of a specific archive application. Most archives now simply show up as a network mount. They can also leverage both tape and disk for excellent restore performance and maximum cost-effectiveness.

The key to archive success is to move the archive upstream, where it can play a more active role alongside primary storage. Given the high level of transparency and fast recovery times, archiving data after 90 days of inactivity will likely have no impact on productivity -- and maximum impact on cost reduction.
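A 90-day policy like that can be sketched as a sweep that relocates idle files to archive storage and leaves a stub behind so reads remain transparent. This minimal version uses a symlink as the stub (real archive products use file-system drivers or HSM stubs, and the function name here is an assumption for illustration):

```python
import shutil
import time
from pathlib import Path

NINETY_DAYS = 90 * 24 * 3600  # seconds

def archive_idle_files(primary, archive, max_idle=NINETY_DAYS, now=None):
    """Move files idle longer than max_idle from primary to archive
    storage, leaving a symlink behind so access stays transparent."""
    now = time.time() if now is None else now
    moved = []
    for path in list(Path(primary).rglob("*")):
        if path.is_file() and not path.is_symlink():
            if now - path.stat().st_atime > max_idle:
                dest = Path(archive) / path.relative_to(primary)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.move(str(path), str(dest))
                path.symlink_to(dest)  # stub: reads follow the link
                moved.append(path)
    return moved
```

Because the stub resolves automatically on access, users never need to know which tier a file lives on -- which is exactly why a 90-day window is tolerable in practice.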

There's a lot to be gained by removing 75% or more of your data from primary storage: backups will get faster, and investment in higher-speed storage (SSD) for the remaining data can be justified. Data integrity will also improve, since most archive solutions perform ongoing data integrity checks, protecting you from silent data corruption (bit rot).
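Those ongoing integrity checks amount to recording a checksum for every file at ingest and periodically re-reading the data to verify it -- a "scrub." A minimal sketch of the technique (function names are illustrative, not a product API):

```python
import hashlib
from pathlib import Path

def checksum(path):
    """SHA-256 of a file, read in blocks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def build_manifest(root):
    """Record a checksum for every archived file at ingest time."""
    return {str(p.relative_to(root)): checksum(p)
            for p in Path(root).rglob("*") if p.is_file()}

def scrub(root, manifest):
    """Re-verify the archive against the manifest; return the files
    whose contents no longer match (silent corruption / bit rot)."""
    return [name for name, digest in manifest.items()
            if checksum(Path(root) / name) != digest]
```

Production systems add repair (rebuilding a corrupted copy from a replica or parity), but detection is the part primary storage typically never does at all.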

In my next column I'll look at some of the products that are competing for your archive dollars: disk appliances, object storage systems, cloud storage providers, and, of course, tape.


Comments
psmails950
User Rank: Apprentice
12/11/2012 | 10:03:54 PM
re: Find The Right Data Archive Method
Great article, George. Many organizations start with a backup-as-archive strategy but eventually recognize the limitations of a "backup only" approach. As you stated, the difference between archiving and simple data retention lies in the ability to extract value from your data, but the most recent IDC Digital Universe study illustrates that less than 3% of useful data is even tagged, meaning most organizations don't even know what they're saving.

In response to that challenge, forward-thinking organizations treat archiving and backup as different but complementary pillars of an overall data protection strategy, and are investing in archiving solutions like EMC SourceOne that enable them to classify application and file data regardless of location, make intelligent decisions about where to store the data, and efficiently search the data for compliance or other business purposes.

Although archiving and backup requirements are best served by separate applications, an increasing number of organizations are also benefiting from the tremendous cost and efficiency gains enabled by leveraging a common protection storage solution like EMC Data Domain for backup and archive. If anyone is interested in learning more, check out www.emc.com/archive.