Commentary | George Crump | 12/1/2008 11:34 AM

The Primary Storage Temptation

As IT staffs get stretched even thinner, a challenge is arising over what to do with primary storage: the need for capacity keeps growing, dollars to buy more of it are limited, and so are the resources to properly manage it.

The temptation is to just buy more primary storage, because adding capacity to your existing infrastructure is relatively easy. It may even feel easier and less expensive to add to primary storage than to implement a data management process that removes old data and stores it on a disk-based archive like those offered by Permabit, Nexsan, or Copan Systems.

The obvious issue with just adding more storage is the cost of primary storage vs. the cost of secondary storage. Even if you decide to use SATA storage from your primary supplier, that storage typically costs more than SATA storage from other suppliers, and you would still have to develop a strategy to move old data to the SATA tier.

The other challenge with leaving data on primary storage is that, for the most part, it has to stay in its native state. While some compression and data reduction can be performed on primary storage, greater gains can be made on archive storage. Since there is less performance pressure on the archive tier, more aggressive steps can be taken to reduce the actual capacity needed, and most archive systems have some form of compression or deduplication built in.
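To make those capacity gains concrete, here is a minimal sketch of the kind of content-hash deduplication an archive system might apply. The fixed-size 64 KB chunking, the in-memory chunk store, and the SHA-256 digests are illustrative assumptions, not any particular vendor's implementation:

import hashlib

CHUNK_SIZE = 64 * 1024  # fixed-size chunks; real archive systems often use variable-size chunking

def dedupe_file(path, chunk_store):
    # Split the file into fixed-size chunks, store each unique chunk once,
    # and return the list of digests needed to reconstruct the file.
    manifest = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:
                chunk_store[digest] = chunk  # new data: store the chunk itself
            manifest.append(digest)  # duplicate data costs only a reference
    return manifest

When the archive holds many near-identical files -- old versions, copies, backup sets -- the chunk store grows far more slowly than the raw data, which is exactly where the capacity reduction comes from.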

For more advanced data reduction, companies such as Ocarina Networks offer tools that scan active storage for stagnant data and, before moving it, apply a series of compressors and deduplication passes to achieve maximum capacity reduction, then move the smaller files to the archive storage system. Because these tools work in the background on stagnant data, they can take extra time to squeeze out the maximum reduction, even on files that do not typically yield great data reduction results.
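The background process described above can be illustrated with a rough sketch (this is not Ocarina's actual product logic): walk the primary volume, flag files untouched for more than a year, compress them, and rewrite them under an archive path. The 365-day threshold and the use of gzip are stand-in assumptions:

import gzip
import os
import shutil
import time

STAGNANT_DAYS = 365  # assumed policy: files untouched for a year count as stagnant

def archive_stagnant(primary_root, archive_root):
    cutoff = time.time() - STAGNANT_DAYS * 86400
    for dirpath, _, filenames in os.walk(primary_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getmtime(src) >= cutoff:
                continue  # recently modified; leave it on primary storage
            rel = os.path.relpath(src, primary_root)
            dst = os.path.join(archive_root, rel + ".gz")
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            # Compress on the way out; the archive tier can afford the CPU time.
            with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
                shutil.copyfileobj(fin, fout)
            os.remove(src)  # reclaim the primary-tier capacity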

If the purchase cost of the archive storage is already dramatically lower on a per-terabyte basis, and data reduction then shrinks the actual space required on that archive, the overall cost of the secondary storage drops even further. For example, if 10 TB of stagnant data can be stored in 2 TB of archive space, the savings in acquisition costs and power requirements, the reduction in the backup window, and the improved data retention make the investment in an archive process obvious.
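That arithmetic is easy to check. With hypothetical per-terabyte prices (the dollar figures below are purely illustrative), the 10-TB-into-2-TB scenario works out as follows:

# Hypothetical per-TB prices, purely for illustration.
PRIMARY_COST_PER_TB = 3000  # assumed cost of a primary-storage terabyte
ARCHIVE_COST_PER_TB = 800   # assumed cost of an archive SATA terabyte

stagnant_tb = 10  # stagnant data currently sitting on primary storage
reduced_tb = 2    # the same data after compression and deduplication

cost_leave_on_primary = stagnant_tb * PRIMARY_COST_PER_TB  # 10 x 3000 = $30,000
cost_move_to_archive = reduced_tb * ARCHIVE_COST_PER_TB    # 2 x 800 = $1,600

print(f"Leave on primary: ${cost_leave_on_primary:,}")
print(f"Move to archive:  ${cost_move_to_archive:,}")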

In our next entry, we will discuss the problems that expanding primary storage creates beyond the cost factors.

Join us for our upcoming Webcast on Improving IT Efficiency.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
