News
12/1/2008
11:34 AM
George Crump
Commentary

The Primary Storage Temptation

As IT staffs are stretched ever thinner, a challenge arises over what to do with primary storage: the need for capacity keeps growing, but the dollars to buy more of it and the resources to properly manage it are limited.

The temptation is to simply buy more primary storage, because adding capacity to your existing infrastructure is relatively easy. It may even feel easier and less expensive than implementing a data management process that removes old data and stores it on a disk-based archive like those offered by Permabit, Nexsan, or Copan Systems.

The obvious issue with simply adding more storage is the cost of primary storage versus the cost of secondary storage. Even if you use SATA storage from your primary supplier, it typically costs more than SATA storage from one of the other suppliers, and you would still have to develop a strategy for moving old data to that SATA tier.

The other challenge with leaving data on primary storage is that, for the most part, it has to stay in its native state. While some compression and data reduction can be performed on primary storage, greater gains can be made on archive storage. Because archive storage is under less performance pressure, more aggressive steps can be taken to reduce the capacity actually needed, and most archive systems have some form of compression or deduplication built in.
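To make the deduplication idea concrete, here is a minimal sketch of block-level deduplication in Python. It is purely illustrative, not any particular archive vendor's implementation: each unique block is stored once, and an ordered list of hashes records how to reconstruct the original data.

import hashlib

def deduplicate(blocks):
    """Store each unique block once; keep an ordered recipe of hashes."""
    store = {}   # hash -> block contents, stored only once
    recipe = []  # ordered hashes that reconstruct the original stream
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        recipe.append(digest)
    return store, recipe

# Ten logical blocks, but only three distinct ones.
blocks = [b"A" * 4096, b"B" * 4096, b"A" * 4096] * 3 + [b"C" * 4096]
store, recipe = deduplicate(blocks)
logical = sum(len(b) for b in blocks)
physical = sum(len(b) for b in store.values())
print(f"logical {logical} bytes -> physical {physical} bytes ({logical / physical:.1f}:1)")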

For advanced data reduction, companies such as Ocarina Networks offer tools that not only scan active storage for stagnant data but also, before any data movement, run a series of compressors and deduplication passes to achieve maximum capacity reduction on the files, then move the smaller files to the archive storage system. Because these tools work in the background on stagnant data, they can take extra time to squeeze out the maximum reduction, even on files that do not typically yield good data reduction results.
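As a rough illustration of that scan-then-shrink-then-move pattern, the sketch below walks a primary file system, picks files that have not been accessed in roughly 180 days, compresses them on the way to an archive path, and removes the originals. The paths, the age threshold, and the use of gzip are all hypothetical stand-ins; Ocarina's actual compressors and policies are proprietary.

import gzip
import os
import shutil
import time

STAGNANT_AFTER = 180 * 24 * 3600  # treat ~180 days without access as stagnant (assumed policy)

def archive_stagnant(primary_root, archive_root):
    now = time.time()
    for dirpath, _, filenames in os.walk(primary_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if now - os.path.getatime(src) < STAGNANT_AFTER:
                continue  # recently accessed; leave it on primary storage
            rel = os.path.relpath(src, primary_root)
            dst = os.path.join(archive_root, rel + ".gz")
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            # Background work on stagnant data, so heavier compression is affordable.
            with open(src, "rb") as fin, gzip.open(dst, "wb", compresslevel=9) as fout:
                shutil.copyfileobj(fin, fout)
            os.remove(src)  # reclaim the primary capacity

# Example (hypothetical mount points):
# archive_stagnant("/mnt/primary", "/mnt/archive")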

The purchase cost of archive storage is already dramatically lower on a terabyte-for-terabyte basis, and when the actual space required on the archive is reduced by these means, the overall cost of the secondary storage drops even further. For example, if 10 TB of stagnant data can be stored on 2 TB of archive space, the savings in acquisition costs and power requirements, the shorter backup window, and the better data retention it enables make the investment in an archive process obvious.
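Here is that 10 TB example as back-of-the-envelope arithmetic. The per-terabyte prices are placeholder assumptions for illustration, not quotes from any vendor.

stagnant_tb = 10            # stagnant data currently sitting on primary storage
archive_tb_needed = 2       # capacity after compression/deduplication (5:1 reduction)

primary_cost_per_tb = 5000  # assumed $/TB for primary storage
archive_cost_per_tb = 1000  # assumed $/TB for SATA-based archive storage

cost_to_grow_primary = stagnant_tb * primary_cost_per_tb   # $50,000
cost_of_archive = archive_tb_needed * archive_cost_per_tb  # $2,000

print(f"Keep on primary: ${cost_to_grow_primary:,}")
print(f"Move to archive: ${cost_of_archive:,}")
print(f"Acquisition savings: ${cost_to_grow_primary - cost_of_archive:,}")

Even before counting the power and backup-window savings, the acquisition gap alone makes the case.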

In our next entry, we will discuss the problems that expanding primary storage creates beyond the cost factors.

Join us for our upcoming Webcast on Improving IT Efficiency.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
