News | Commentary
11/3/2008 11:59 AM
George Crump

Primary Storage Data Reduction - A Process

Primary storage data reduction is a series of steps you can take to reduce the amount of capacity dedicated to Tier 1 storage. The most common techniques are archiving, compression, data deduplication, and the use of intelligent storage systems. The question often comes up: what should I do first?

The first step should always be to archive existing data to an archive solution. Since this can reduce primary storage consumption by as much as 80%, no matter what step follows, archiving clears the way for that step. Consider a disk-based solution such as those from Permabit, Copan Systems, or Nexsan. These allow for easy access and rapid retrieval, resulting in greater confidence in a more aggressive archive plan.
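The basic archive step can be sketched in a few lines: sweep the primary file system and move anything not accessed in roughly two years to the archive tier. The paths, the age threshold, and the function name below are all illustrative assumptions, not any vendor's behavior; a real archive product would also preserve metadata and typically leave stubs or links behind.

```python
import shutil
import time
from pathlib import Path

def archive_stale_files(primary: Path, archive: Path, cutoff: float) -> int:
    """Move files last accessed before `cutoff` from primary to archive.

    A hypothetical sketch only: real archiving tools preserve permissions,
    leave stubs for transparent recall, and verify data after the move.
    """
    moved = 0
    for path in primary.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            dest = archive / path.relative_to(primary)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))  # data now lives on the archive tier
            moved += 1
    return moved

# Example: archive anything untouched for ~2 years.
# archive_stale_files(Path("/mnt/primary"), Path("/mnt/archive"),
#                     time.time() - 2 * 365 * 86400)
```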

If you're in the process of selecting a new primary storage system, the next step is to consider a system that can do thin provisioning and intelligent data movement. Archiving will have driven down the amount of primary storage you need to purchase; thin provisioning will reduce that even further. Thin provisioning allows you to allocate as much storage as an application may need, but only consume that storage as it is used. According to some studies, this can result in a 70% reduction in purchased capacity.
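The idea behind thin provisioning can be shown in a minimal sketch: the volume advertises a large virtual capacity to the application, but physical blocks are allocated only when first written. The class and names below are illustrative, not any vendor's API.

```python
class ThinVolume:
    """A toy thin-provisioned volume: capacity is promised up front,
    but physical space is consumed only on first write."""

    BLOCK_SIZE = 512

    def __init__(self, virtual_blocks: int):
        self.virtual_blocks = virtual_blocks  # capacity promised to the app
        self.backing = {}                     # block index -> data, allocated lazily

    def write(self, block: int, data: bytes) -> None:
        if block >= self.virtual_blocks:
            raise IndexError("write past virtual capacity")
        self.backing[block] = data            # physical space consumed only now

    def read(self, block: int) -> bytes:
        # Unwritten blocks read back as zeros, as on a freshly formatted disk.
        return self.backing.get(block, b"\x00" * self.BLOCK_SIZE)

    @property
    def physical_blocks(self) -> int:
        return len(self.backing)

vol = ThinVolume(virtual_blocks=1_000_000)    # ~500 MB promised to the application
vol.write(0, b"x" * 512)
vol.write(42, b"y" * 512)
print(vol.physical_blocks)                    # 2 -- only written blocks consume space
```

The gap between `virtual_blocks` and `physical_blocks` is exactly the capacity you didn't have to buy up front.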

Whether or not you decide on a new primary storage system, the next step should be an inline, real-time data compression device like those provided by Storwize. These devices allow for a 60%-plus reduction of NFS- and CIFS-mounted data with little to no performance impact. Even databases and VMware images compress well, yet maintain or even improve overall performance. The reason real-time compression comes so early in the process is that it's simple to implement and shows reduction across all data.
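To get a feel for the kind of reduction inline compression delivers on typical file-server data, here is a small illustration using stock zlib on a deliberately redundant buffer. The data and the 60%-plus figure it echoes are illustrative; actual ratios depend entirely on the data mix.

```python
import zlib

# A redundant buffer, standing in for the repetitive records common on
# NFS/CIFS shares. Real-world mixes compress less uniformly than this.
data = (b"customer_record:" + b"0" * 48) * 1024

compressed = zlib.compress(data, level=6)      # level 6 is zlib's default trade-off
reduction = 1 - len(compressed) / len(data)
print(f"{len(data)} -> {len(compressed)} bytes ({reduction:.0%} reduction)")
```

An inline appliance does the same kind of transform transparently, in the data path, so applications never see the compressed form.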

Finally, there is deduplication; there are two types and, depending on your environment, they can have a big payoff for you. First is general-purpose deduplication, right now championed primarily by Network Appliance, although Riverbed has announced plans to take its WAN deduplication technology and move it into primary storage. The differences between the two approaches are worth a separate blog entry, which we'll get into later. Ideal candidates for general-purpose deduplication are VMware images and, to some extent, user home directories.
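Why VMware images dedupe so well comes down to shared operating-system blocks. A minimal sketch of fixed-block deduplication, assuming 4 KB blocks and SHA-256 fingerprints (real products add hash verification and often variable-length chunking):

```python
import hashlib

BLOCK = 4096

def dedupe(stream: bytes):
    """Store each unique 4 KB block once; return the block store and the
    ordered 'recipe' of hashes needed to reconstruct the stream."""
    store = {}     # hash -> block data, stored exactly once
    recipe = []    # ordered hashes to rebuild the original stream
    for i in range(0, len(stream), BLOCK):
        block = stream[i:i + BLOCK]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)   # new blocks stored; duplicates just referenced
        recipe.append(h)
    return store, recipe

# Two "VM images" that share the same base OS blocks:
base = b"A" * BLOCK * 10                        # stand-in for common OS data
vm1 = base + b"vm1-config".ljust(BLOCK, b"\x00")
vm2 = base + b"vm2-config".ljust(BLOCK, b"\x00")

store, recipe = dedupe(vm1 + vm2)
print(len(store))   # 3 unique blocks stored instead of 22
```

Twenty identical base blocks collapse to one stored copy, which is exactly the effect that makes farms of near-identical VM images such strong candidates.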

Lastly, there are application-specific deduplicators, represented now by Ocarina Networks. By focusing on a particular application like ECO-System, these solutions can eliminate redundant data that might get past general-purpose deduplication tools.

For example, a photo site might have thousands of images where the pictures suffer from red-eye. The site may correct the red-eye in each photo, storing the result as a new image, and most deduplication solutions would treat the original and corrected versions as totally unique files, storing each photo twice. An application-specific solution would identify these as similar files and store only the unique bytes that make up the corrected images. While the use case is narrower than general-purpose deduplication, the payoff can be enormous.

These solutions aren't mutually exclusive and in many cases complement each other. While this is the recommended workflow, the important part is to get started with any of these steps and then revisit the others as time and need allow.

There is still time. Join us for our Webcast today at noon CST: Demystifying Primary Storage Data Reduction.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
