Commentary
2/25/2009 06:40 PM
George Crump

Better Storage Practices To Improve Backup

Backup is the thorn in the side of many otherwise smoothly running IT operations. It is probably no coincidence that the backup process is almost always assigned to the newest hire, or handed out as the penalty for missing the assignments meeting. The truth is that backup should be simple -- all you're doing is copying data to tape. The problem in general has nothing to do with the backup process; it has more to do with how primary storage is managed and optimized.

The primary problem is size: there is too much data, and too much of it is stagnant and unchanging. All of this unchanged data most often has to traverse the standard IP network or, in the best case, a SAN. While disk backup targets, especially those with data deduplication, help to store this backup data optimally, they don't fix the problem at its source.
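To see why deduplication targets shrink backups of stagnant data so well, here is a minimal sketch (not any vendor's implementation) that hashes fixed-size blocks and counts how many are unique across repeated full backups of an unchanging volume:

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks; real appliances often use variable-size chunking

def dedup_stats(data: bytes, block_size: int = BLOCK_SIZE):
    """Split data into blocks and report total vs. unique block counts."""
    unique = set()
    total = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        unique.add(hashlib.sha256(block).hexdigest())
        total += 1
    return total, len(unique)

# Stagnant data backed up nightly: ten "full" copies of the same 1 MB volume.
volume = b"".join(i.to_bytes(4, "big") * 1024 for i in range(256))  # 256 distinct 4 KB blocks
backups = volume * 10                                               # ten identical full backups
total, unique = dedup_stats(backups)
print(f"{total} blocks stored as {unique} unique blocks "
      f"({total / unique:.0f}:1 reduction)")
```

Because the volume never changes between backups, every block after the first full copy is a duplicate, which is exactly why unchanging data dedupes at such high ratios on the target -- while still crossing the network ten times.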

Fixing problems at the source is where you can derive the most improvement for the least investment. For example, look at real-time compression products like those from Storwize. These solutions compress file-based data in place and, in most cases, achieve better than 60% reduction. For user access, data is compressed and decompressed on the fly, transparently and with little, if any, performance impact. The backup application can read the data in its compressed format, effectively cutting in half the amount of data transported across the network and subsequently stored on backup disk and/or tape. Interestingly, these inline compression appliances are compatible with deduplication appliances and actually improve their effective reduction rates.
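As a rough illustration of the idea (not Storwize's actual technology), the following sketch keeps data compressed at rest, decompresses transparently on read, and exposes the compressed bytes directly to a hypothetical backup path, so the backup only ever moves the smaller form:

```python
import zlib

class CompressedStore:
    """Toy in-place compression layer: data rests compressed; user reads
    decompress on the fly, while a backup path fetches the raw compressed bytes."""

    def __init__(self):
        self._files = {}

    def write(self, name: str, data: bytes) -> None:
        self._files[name] = zlib.compress(data, 6)

    def read(self, name: str) -> bytes:
        # Transparent to the user: decompressed on access.
        return zlib.decompress(self._files[name])

    def read_compressed(self, name: str) -> bytes:
        # What a backup application would transport and store.
        return self._files[name]

store = CompressedStore()
original = b"quarterly report, mostly repetitive text " * 2000
store.write("report.txt", original)
assert store.read("report.txt") == original  # users see the data unchanged
ratio = len(store.read_compressed("report.txt")) / len(original)
print(f"backup moves only {ratio:.0%} of the original bytes")
```

The compressed bytes are what a downstream deduplication appliance would then see, which is why the two technologies can stack.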

The next step is to look into software that will reduce the amount of data sent across the network in the first place. These source-side reduction backup applications either perform deduplication, like EMC's Avamar, or block-level incremental backups, like SyncSort's Backup Express. While deduplication has the advantage of maximum backup disk storage optimization, block-level incremental (BLI) technology creates a backup volume that can be used for other purposes because it is itself a readable file system. BLI also has less impact on the server being protected, making more frequent backups practical.
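A block-level incremental can be sketched as comparing per-block hashes against the previous backup and shipping only the blocks that changed; the `incremental` helper below is an illustrative toy, not Backup Express's implementation:

```python
import hashlib

BLOCK = 4096

def block_hashes(volume: bytes):
    """Hash every fixed-size block of the volume."""
    return [hashlib.sha256(volume[i:i + BLOCK]).digest()
            for i in range(0, len(volume), BLOCK)]

def incremental(prev_hashes, volume: bytes):
    """Return only the blocks whose hash changed since the last backup."""
    cur = block_hashes(volume)
    changed = {i: volume[i * BLOCK:(i + 1) * BLOCK]
               for i, h in enumerate(cur)
               if i >= len(prev_hashes) or h != prev_hashes[i]}
    return changed, cur

# First backup: every block is "changed".
vol = bytearray(b"\0" * BLOCK * 100)           # 100-block volume
full, hashes = incremental([], bytes(vol))

# A small edit then touches a single block...
vol[5 * BLOCK] = 0xFF
delta, hashes = incremental(hashes, bytes(vol))
print(f"full: {len(full)} blocks, incremental: {len(delta)} block(s)")
```

Because only one block crosses the network on the second pass, the load on the protected server and the LAN stays low -- which is what makes more frequent backups practical.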

Finally, it makes sense to archive all this stagnant data off primary storage. Most studies indicate that more than 80% of this data can be archived and removed from the backup process entirely. With disk archive products like those from Permabit, NexSAN, or Copan Systems, you can be more aggressive about archiving than with tape-based systems, dramatically reducing your investment in primary storage and its power costs.

Getting data to this archive can be as simple as using an operating system's move command. In our next entry we will look at a new technique: an optimized move.
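A simple age-based move along those lines might look like the sketch below; the 180-day threshold and the use of modification time are assumptions, and a real archive policy would likely also consider access time, ownership, and exclusion rules:

```python
import shutil
import time
from pathlib import Path

STALE_DAYS = 180  # assumed threshold; tune to your retention policy

def archive_stagnant(primary: Path, archive: Path, days: int = STALE_DAYS):
    """Move files untouched for `days` days from primary storage to the
    archive, preserving the relative directory layout."""
    cutoff = time.time() - days * 86400
    moved = []
    for path in list(primary.rglob("*")):   # snapshot the listing before moving
        if path.is_file() and path.stat().st_mtime < cutoff:
            dest = archive / path.relative_to(primary)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))
            moved.append(dest)
    return moved
```

Anything moved this way drops out of the nightly backup window entirely, which is where the 80% reduction cited above would come from.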

View our Webcast on Primary Storage Optimization to learn more.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
