News
12/1/2008 11:34 AM
George Crump
Commentary

The Primary Storage Temptation

As IT staffs get stretched even thinner, a challenge is arising in what to do with primary storage. There is still the need for more and more capacity, there are limited dollars to buy more of it, and there are limited resources to properly manage it.

The temptation is simply to buy more primary storage, because adding capacity to an existing infrastructure is relatively easy. It may even feel easier and less expensive than implementing a data management process that removes old data and stores it on a disk-based archive like those offered by Permabit, Nexsan, or Copan Systems.

The obvious issue with just adding more storage is the cost of primary storage versus the cost of secondary storage. Even if you use SATA storage from your primary supplier, it typically costs more than SATA storage from other suppliers, and you would still have to develop a strategy to move old data to the SATA tier.

The other challenge with leaving data on primary storage is that, for the most part, it must stay in its native state. While some compression and data reduction can be performed on primary storage, greater gains can be made on archive storage: because there is less performance pressure on the archive tier, more aggressive steps can be taken to reduce the actual capacity needed. Most archive systems have some form of compression or deduplication built in.

For more advanced data reduction, companies such as Ocarina Networks offer tools that scan active storage for stagnant data and, before moving it, apply a series of compressors and deduplication passes to achieve maximum capacity reduction. The smaller files are then moved to the archive storage system. Because these tools work in the background on stagnant data, they can take extra time to squeeze out every last byte, even on files that do not typically yield good data reduction results.
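The scan-reduce-move workflow described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual tool: the directory names, the 180-day stagnancy threshold, and the use of simple gzip compression (rather than a commercial compressor/deduplication pipeline) are all assumptions for the example.

```python
import gzip
import shutil
import time
from pathlib import Path

def archive_stagnant(primary: Path, archive: Path,
                     stagnant_days: int = 180) -> list[Path]:
    """Scan primary storage for files untouched for `stagnant_days`,
    compress each one, and move the smaller copy to the archive tier.
    A real archiving product would also deduplicate and verify the copy."""
    cutoff = time.time() - stagnant_days * 86400
    archive.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in primary.rglob("*"):
        if f.is_file() and f.stat().st_mtime < cutoff:
            target = archive / (f.name + ".gz")
            with f.open("rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)  # compress on the way out
            f.unlink()                        # free the space on primary
            moved.append(target)
    return moved
```

Running this periodically from a scheduler would approximate the "background process on stagnant data" model: active files never move, while anything past the age threshold leaves the primary tier in reduced form.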

The purchase cost of archive storage is already far lower on a terabyte-for-terabyte basis; once data reduction shrinks the actual archive space required, the overall cost of the secondary tier drops further. For example, if 10 TB of stagnant data can be stored on 2 TB of archive space, the savings in acquisition cost and power, the reduced backup window, and the improved data retention make the investment in an archive process obvious.
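The 10 TB-on-2 TB example reduces to simple arithmetic. The per-terabyte prices below are hypothetical placeholders (not quoted vendor figures) chosen only to show the shape of the calculation.

```python
def archive_savings(stagnant_tb: float, reduction_ratio: float,
                    primary_cost_per_tb: float,
                    archive_cost_per_tb: float) -> float:
    """Dollars saved by moving reduced stagnant data off primary storage."""
    cost_on_primary = stagnant_tb * primary_cost_per_tb
    archive_tb = stagnant_tb / reduction_ratio  # e.g. 10 TB at 5:1 -> 2 TB
    cost_on_archive = archive_tb * archive_cost_per_tb
    return cost_on_primary - cost_on_archive

# 10 TB of stagnant data, 5:1 reduction, assumed $3,000/TB primary
# versus $500/TB archive: 30,000 - 1,000 = 29,000 saved on acquisition alone.
print(archive_savings(10, 5, 3000, 500))
```

Power, cooling, and backup-window savings would come on top of this acquisition-cost figure.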

In our next entry, we will discuss the problems that expanding primary storage creates beyond the cost factors.

Join us for our upcoming Webcast on Improving IT Efficiency.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
