News
2/25/2009 06:40 PM
George Crump
Commentary

Better Storage Practices To Improve Backup

Backup is the thorn in the side of many otherwise smoothly running IT operations. It is probably no coincidence that the backup process is almost always handed to the newest hire, or assigned as the penalty for missing the assignments meeting. The truth is that backup should be simple -- all you're doing is copying data to tape. The problem generally has nothing to do with the backup process; it has more to do with how primary storage is managed and optimized.

The primary problem is size: there is too much data, and too much of it is stagnant and unchanging. All of this unchanged data most often has to traverse the standard IP network or, in the best case, a SAN. While disk backup targets, especially those with data deduplication, help store backup data optimally, they don't fix the problem at its source.

Fixing problems at the source is where you can derive the most improvement for the least investment. For example, look at real-time compression products like those from Storwize. These solutions compress file-based data in place and, in most cases, achieve better than 60% data reduction. For user access, data is compressed and decompressed on the fly, transparently, with little, if any, performance impact. The backup application can access the data in its compressed format, cutting the amount of data transported across the network, and subsequently stored on backup disk and/or tape, by more than half. Interestingly, these inline compression appliances are compatible with deduplication appliances and actually improve their effective reduction rates.
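To see why compressing at the source pays off, consider this toy sketch (not the Storwize product, just the general mechanic) using Python's zlib. A highly redundant buffer stands in for stagnant file data; real-world files won't compress this dramatically, but the principle is the same: the backup job can move the compressed bytes and users can still get the original back on demand.

```python
import zlib

# Toy stand-in for stagnant, redundant file data (not real-world content)
data = b"2009-02-25 06:40:00 status=OK user=jdoe action=login\n" * 2000

# Compress once, in place; the backup job reads `compressed` directly
compressed = zlib.compress(data, level=6)
reduction = 1 - len(compressed) / len(data)

# Transparent user access: decompress on the fly when the file is read
original = zlib.decompress(compressed)
print(f"reduction: {reduction:.0%}")
```

The key design point is that the backup application never has to decompress: it ships the already-reduced bytes across the network and stores them that way.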

The next step is to look at software that reduces the amount of data sent across the network in the first place. Source-side reduction backup applications either perform deduplication, like EMC's Avamar, or block-level incremental backups, like SyncSort's Backup Express. While deduplication has the advantage of maximum backup disk optimization, block-level incremental (BLI) technology creates a backup volume that can be used for other purposes because it is itself a readable file system. BLI also has less impact on the server being protected, making more frequent backups practical.
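The idea behind block-level incremental backup can be sketched in a few lines: hash the volume in fixed-size blocks and ship only the blocks whose hashes changed since the last run. This is a minimal illustration of the concept, with an assumed 4 KB block size, not SyncSort's or EMC's actual implementation.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size, for illustration only

def block_hashes(volume: bytes) -> list:
    """Hash the volume in fixed-size blocks."""
    return [hashlib.sha256(volume[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(volume), BLOCK_SIZE)]

def changed_blocks(previous: bytes, current: bytes) -> list:
    """Return indices of blocks that differ from the last backup."""
    old = block_hashes(previous)
    new = block_hashes(current)
    return [i for i, h in enumerate(new) if i >= len(old) or h != old[i]]
```

Only the returned blocks need to cross the network. Applying them to the previous backup image yields a complete, mountable copy of the volume, which is why a BLI target can double as a readable file system.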

Finally, it makes sense to archive all this stagnant data off primary storage. Most studies indicate that more than 80% of this data can be archived and removed from the backup process entirely. Disk archive products like those from Permabit, NexSAN, or Copan Systems let you be more aggressive than tape-based systems allow, dramatically reducing your investment in primary storage as well as power costs.

Getting data to this archive can be as simple as using the operating system's move command. In our next entry we will look at a new technique: an optimized move.
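An age-based move is easy to script. The sketch below walks a primary-storage tree and relocates anything untouched past a cutoff; the one-year threshold is an assumption, and a real job would preserve the directory layout and handle name collisions rather than flattening everything into one archive directory.

```python
import os
import shutil
import time

def archive_stale(src_dir, archive_dir, age_days=365):
    """Move files not modified within `age_days` from primary storage to the archive."""
    cutoff = time.time() - age_days * 86400
    moved = []
    os.makedirs(archive_dir, exist_ok=True)
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) < cutoff:
                # Note: flattens paths; a production script would keep the tree
                shutil.move(path, os.path.join(archive_dir, name))
                moved.append(name)
    return moved
```

Everything moved this way drops out of the nightly backup window immediately, which is where the 80% figure above translates directly into shorter backups.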

View our Webcast on Primary Storage Optimization to learn more.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
