News
8/15/2008
05:38 PM
George Crump
Commentary

Oh, Tier 3...


Remember about five years or so ago, when life was simple? We had fast SCSI and Fibre Channel drives for data, and we had tape for backup. Seemed perfect. Then came the ATA-based drives, and you were told to move your older data to them and start sending backups to disk. Then powering the data center, and storage in particular, became a problem; that created another use for ATA: put the drives in standby mode, spin them down, put them to sleep, and eventually turn them off. As is usually the case, the hardware is ahead of the software and there's limited automation to leverage all of this, so what's a user to do?

There are so many variations on Tier 3 that it's hard to categorize this entry and catch every permutation. First, what type of data should go on Tier 3? Ideally, everything that isn't currently being accessed (old data), plus copies of current data where there is value in freezing the state of that data for some reason; for example, a database archive or a copy of a PowerPoint presentation that you are about to modify heavily. This does NOT include backup data, however. Backups belong on another disk tier: Tier 4. Tier 3, then, is essentially data at rest, but data that might need to be accessed in the future, so you want to keep it on a medium that can still deliver that data back to you in short order.

The challenge has been to understand how the various manufacturers have responded to this market. One of the first incarnations, and still one of the most popular, is simply adding shelves of ATA drives to an existing system, or bolting on an external box of cheap ATA RAID. Both of these strategies have limited value unless you have a specific need for a scratch area or something of that nature. The exception is storage systems that can auto-migrate old data blocks to this storage on an as-needed basis.
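The auto-migration described above is a feature of the storage array itself, but the underlying policy, "anything untouched for N days is data at rest," is simple enough to sketch at the file level. The snippet below is a minimal, hypothetical illustration; the 180-day threshold and any mount paths you pass in are assumptions, not a vendor's implementation.

```python
import os
import time
from pathlib import Path

# Hypothetical policy: files untouched this long are Tier 3 candidates.
AGE_DAYS = 180

def tier3_candidates(root: Path, age_days: int = AGE_DAYS):
    """Yield files under root whose last access time is older than age_days."""
    cutoff = time.time() - age_days * 86400
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = Path(dirpath) / name
            try:
                if p.stat().st_atime < cutoff:
                    yield p
            except OSError:
                continue  # file vanished or is unreadable; skip it

# Usage (paths are illustrative):
#   for p in tier3_candidates(Path("/mnt/tier1")):
#       print(p)
```

In practice, a block-level auto-tiering array makes this decision per block rather than per file, and with no host-side script at all; the sketch only shows the shape of the aging policy.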
If your storage system can't do this for you automatically, either change storage systems or don't use Tier 3 storage in this manner. Regardless of the capabilities of your Tier 1 or Tier 2 offering, where things get interesting is with systems that focus specifically on the data-retention market. They address key requirements, like portability, scalability, density, power management, data integrity, and cost efficiency, that the ATA-shelf solutions lack. By moving (not copying) data, either manually or in an automated fashion, you get that data off primary storage while at the same time giving yourself a data vault.

When I mention a data vault or retention, the first thought is usually compliance or litigation readiness. While these are important, think of the vault as another kind of value: assets. As the wealth of retained information grows and the ability to index its content improves year by year, data as an asset will be a key strategic initiative in many enterprises. The common requirement in data indexing will be the ability to access that data. What (if anything) you choose to index today, and the application you index it with, may be different tomorrow. Having that data stored behind a simple, open file-system interface like CIFS or NFS will be critical.

Next we will finish up our "Tour of Tiers" with Tier 4.
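The "move, don't copy" step is worth making concrete: relocating a file to the vault should leave no duplicate on primary storage and should preserve the directory layout so the data stays findable over an NFS or CIFS export. This is a hedged sketch, not any vendor's migration engine; the mount points are hypothetical placeholders.

```python
import shutil
from pathlib import Path

def move_to_vault(src: Path, primary_root: Path, archive_root: Path) -> Path:
    """Move (not copy) a file to the vault, mirroring its directory layout.

    Example roots (hypothetical): primary_root=/mnt/tier1, archive_root=/mnt/tier3,
    where /mnt/tier3 is an NFS or CIFS export on the retention tier.
    """
    rel = src.relative_to(primary_root)          # keep the same relative path
    dest = archive_root / rel
    dest.parent.mkdir(parents=True, exist_ok=True)
    # shutil.move copies data and metadata across filesystems, then removes
    # the source -- i.e., a true move, leaving nothing behind on Tier 1.
    shutil.move(str(src), str(dest))
    return dest
```

Because the archive side is an ordinary file tree, whatever indexing application you run today or swap in tomorrow can crawl it through the same open CIFS/NFS interface.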

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
