News | Commentary
11/3/2008 11:59 AM
George Crump

Primary Storage Data Reduction - A Process

Primary storage data reduction is a series of steps you can take to reduce the amount of capacity dedicated to Tier 1 storage. The most common techniques are archiving, compression, data deduplication, and the use of intelligent storage systems. The question often comes up: What should I do first?

The first step should always be to archive existing data to an archive solution. Since this can reduce primary storage consumption by as much as 80%, no matter what step follows, archiving clears the way for it. Consider a disk-based solution such as those from Permabit, Copan Systems, or Nexsan. These allow for easy access and rapid retrieval, resulting in greater confidence in a more aggressive archive plan.
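
To make that first step concrete, here is a minimal sketch of an age-based archive pass. The paths, the 90-day cutoff, and the move-everything policy are illustrative assumptions, not any vendor's product behavior; a disk archive like those named above would simply sit behind the archive mount point.

```python
import os
import shutil
import time

# Illustrative assumptions: paths and the 90-day cutoff are examples only.
PRIMARY = "/mnt/tier1"
ARCHIVE = "/mnt/archive"
CUTOFF_DAYS = 90

def archive_stale_files(primary=PRIMARY, archive=ARCHIVE, cutoff_days=CUTOFF_DAYS):
    """Move files not accessed in cutoff_days from primary storage to the
    archive, preserving the directory layout."""
    cutoff = time.time() - cutoff_days * 86400
    for root, _, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            try:
                if os.stat(src).st_atime < cutoff:
                    dst = os.path.join(archive, os.path.relpath(src, primary))
                    os.makedirs(os.path.dirname(dst), exist_ok=True)
                    shutil.move(src, dst)  # a real tool might leave a stub or symlink
            except OSError:
                continue  # file vanished or is inaccessible; skip it

if __name__ == "__main__":
    archive_stale_files()
```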

If you're in the process of selecting a new primary storage system, the next step is to consider a system that can do thin provisioning and intelligent data movement. Archiving will have driven down the amount of primary storage you need to purchase; thin provisioning will reduce that even further. Thin provisioning lets you allocate as much storage as an application may ever need, but consume that storage only as it is actually used. According to some studies, this can cut purchased capacity by 70%.
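
The mechanics are easier to see in a toy model. The sketch below is purely illustrative, not any array's actual allocator: every volume advertises a large virtual size, but physical blocks are drawn from a shared pool only on first write.

```python
class ThinPool:
    """Toy model of a thin-provisioned pool: physical blocks are consumed
    from shared capacity only when first written (illustrative only)."""

    def __init__(self, physical_blocks):
        self.free = physical_blocks

    def allocate(self):
        if self.free == 0:
            raise RuntimeError("pool exhausted: time to add real capacity")
        self.free -= 1

class ThinVolume:
    def __init__(self, pool, virtual_blocks):
        self.pool = pool
        self.virtual_blocks = virtual_blocks  # what the host sees
        self.mapped = set()                   # blocks actually backed by disk

    def write(self, block):
        if block >= self.virtual_blocks:
            raise IndexError("write past end of volume")
        if block not in self.mapped:          # first touch: consume real space
            self.pool.allocate()
            self.mapped.add(block)

# A 1,000-block pool can back ten volumes that each *advertise* 1,000 blocks;
# real capacity is consumed only as data lands.
pool = ThinPool(physical_blocks=1000)
vols = [ThinVolume(pool, virtual_blocks=1000) for _ in range(10)]
vols[0].write(0)
vols[0].write(0)  # rewriting an already-mapped block consumes nothing new
print(pool.free)  # 999
```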

Regardless of whether you decide on a new primary storage system, the next step should be an inline, real-time data compression device like those provided by Storwize. These devices allow a 60%-plus reduction of NFS- and CIFS-mounted data with little to no performance impact. Even databases and VMware images compress well, yet maintain or even improve overall performance. The reason real-time compression comes so early in the process is that it's simple to implement and shows reduction across all data.
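
Storwize's appliances are proprietary, but the underlying idea, compressing transparently on the write path and inflating on the read path, can be shown with any generic compressor. The sketch below uses zlib strictly as a stand-in; it is not Storwize's algorithm, and the sample data and ratio are illustrative.

```python
import zlib

def write_compressed(payload: bytes, level: int = 6) -> bytes:
    """Compress on the way to disk (zlib as a stand-in compressor)."""
    return zlib.compress(payload, level)

def read_decompressed(stored: bytes) -> bytes:
    """Inflate on the way back; the application never sees compressed bytes."""
    return zlib.decompress(stored)

# Illustrative payload: repetitive records, like much NFS/CIFS file content.
payload = b"customer_record,ACTIVE,2008-11-03\n" * 1000
stored = write_compressed(payload)
assert read_decompressed(stored) == payload
print(f"{len(payload)} -> {len(stored)} bytes "
      f"({100 * (1 - len(stored) / len(payload)):.0f}% reduction)")
```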

Finally, there is deduplication. There are two types and, depending on your environment, they can have a big payoff for you. First is general-purpose deduplication, right now championed primarily by Network Appliance, although Riverbed has announced plans to move its WAN deduplication technology into primary storage. The differences between the two approaches are worth a separate blog entry, one we will get into later. Ideal candidates for general-purpose deduplication are VMware images and, to some extent, user home directories.
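
General-purpose deduplication typically works by fingerprinting blocks and storing each unique block exactly once. The fixed-block sketch below is a generic illustration, not NetApp's or Riverbed's implementation; the block size and hash choice are assumptions.

```python
import hashlib

BLOCK_SIZE = 4096  # assumption; real systems use various fixed or variable sizes

class DedupStore:
    """Toy block store: identical blocks are kept once, keyed by SHA-256."""

    def __init__(self):
        self.blocks = {}   # fingerprint -> block data
        self.files = {}    # file name -> ordered list of fingerprints

    def put(self, name: str, data: bytes):
        recipe = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            fp = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(fp, block)  # store the block only if unseen
            recipe.append(fp)
        self.files[name] = recipe

    def get(self, name: str) -> bytes:
        return b"".join(self.blocks[fp] for fp in self.files[name])

# Two near-identical "VM images" share almost every block.
store = DedupStore()
base = b"".join(bytes([i]) * BLOCK_SIZE for i in range(100))  # 100 distinct blocks
store.put("vm1.img", base)
store.put("vm2.img", base[:-1] + b"\xff")  # differs by a single byte
print(len(store.blocks))  # 101 unique blocks stored instead of 200
```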

Lastly, there are application-specific deduplicators, represented today by Ocarina Networks. By focusing on particular applications and file types, solutions like Ocarina's ECOsystem can eliminate redundant data that might get past general-purpose deduplication tools.

For example, a photo site might have thousands of images that suffer from red-eye. The site may go in and remove the red-eye, storing each corrected image as a separate file. Most deduplication solutions would treat the original and corrected versions as entirely unique files, and each would be stored in full. An application-specific solution would identify these as similar files and store only the unique bytes that make up the corrected images. While the use case is narrower than general-purpose deduplication, the payoff can be enormous.
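
The red-eye example amounts to delta storage: keep the original once and record only the bytes that changed in the corrected copy. The sketch below fakes this with a naive byte-level diff over equal-length files; Ocarina's actual content-aware techniques are far more sophisticated (and can look inside image formats), so treat this strictly as a conceptual illustration.

```python
def make_delta(original: bytes, revised: bytes):
    """Record only the (offset, new_byte) pairs where revised differs.
    Naive illustration; assumes the two files are the same length."""
    assert len(original) == len(revised)
    return [(i, revised[i]) for i in range(len(original)) if original[i] != revised[i]]

def apply_delta(original: bytes, delta) -> bytes:
    patched = bytearray(original)
    for offset, value in delta:
        patched[offset] = value
    return bytes(patched)

# A "photo" where red-eye correction touched a handful of pixel bytes.
photo = bytes(1_000_000)                 # stand-in for a 1 MB image file
fixed = bytearray(photo)
for offset in (1000, 1001, 1002, 5000, 5001):
    fixed[offset] = 0x7F                 # the "corrected" pixels
fixed = bytes(fixed)

delta = make_delta(photo, fixed)
assert apply_delta(photo, delta) == fixed
print(f"stored {len(delta)} changed bytes instead of a second 1 MB copy")
```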

These solutions aren't mutually exclusive and in many cases complement each other. While this is the recommended workflow, the important part is to get started with any of these steps and then revisit the others as time and need allow.

There is still time. Join us for our Webcast today at noon CST: Demystifying Primary Storage Data Reduction.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
