Commentary
8/13/2008 03:29 PM
George Crump

Tier Matching

Tiered storage can be difficult to manage and one of the challenges to its acceptance is the amount of effort it takes to move data between those tiers. We've written about several methods to move data between tiers in previous blogs, but in some cases the decision isn't that complicated.

A great example is Tier 0. As I suggested in our last entry, Tier 0, or solid state disk (SSD), really has two options: DRAM-based SSDs or Flash-based SSDs. DRAM-based SSDs are primarily supplied by Solid Data, Texas Memory Systems, and Solid Access. Texas Memory is also shipping Flash-based systems, along with traditional storage manufacturers EMC and Sun. Eventually, most storage manufacturers will supply some form of Flash-based SSD.

When deciding what data to place on Tier 0, the decision is straightforward. For all practical purposes, you're looking for applications that are disk I/O bound. Typically, these will be applications whose data you are either short stroking or considering short stroking. While there are exceptions, for many enterprises these are going to be databases, and specifically certain files within those databases. I have previously discussed the decision between DRAM and Flash; what it comes down to is whether DRAM's cost differential is justified by its significant advantage in write I/O performance. Don't rule out DRAM-based SSD until you look at the price. Traditional storage manufacturers may charge a premium for their SSDs (which are Flash only), so you may be able to get a DRAM-based SSD from one of the SSD-focused suppliers for not much more money than a Flash-based SSD from a storage supplier adding it to its offering.

Tier 1 is for all other production data that needs some level of performance -- the rest of the database application that didn't make it to the SSD, for example. We are still a few years away from SSDs being at the point where an entire database application can reside on them. When looking at these solutions, remember that some suppliers, such as 3PAR and Compellent, have solved many of the downsides of short stroking; if you have an application that needs more performance but can't quite justify the leap to Tier 0, give them consideration. In addition to databases, it makes sense for some file data to be on this tier as well.

Tier 2 should be for production systems where performance has to be acceptable but not top end. User home directories front-ended by a NAS head are a good example. In my opinion, this type of storage, for now anyway, still needs to be Fibre Channel in the medium-sized to large data center.
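As an aside on the Tier 0 test above: "disk I/O bound" usually shows up as devices that are busy nearly all the time with high average I/O wait. Here is a minimal sketch of that screening logic in Python; the metric names and thresholds are illustrative assumptions (the kind of numbers you might sample from `iostat -x` over a busy period), not vendor guidance.

```python
# Hypothetical sketch: flag Tier 0 (SSD) candidates from iostat-style
# per-device metrics. Threshold values are illustrative assumptions.

def tier0_candidates(devices, util_pct=80.0, await_ms=10.0):
    """Return device names that look disk I/O bound: sustained high
    utilization combined with high average I/O wait time."""
    return [
        name
        for name, stats in devices.items()
        if stats["util_pct"] >= util_pct and stats["await_ms"] >= await_ms
    ]

# Example metrics as might be sampled over a busy hour (made up).
sampled = {
    "db_redo_log": {"util_pct": 97.0, "await_ms": 22.0},  # I/O bound
    "db_datafile": {"util_pct": 85.0, "await_ms": 14.0},  # I/O bound
    "home_dirs":   {"util_pct": 12.0, "await_ms": 3.0},   # fine on Tier 2
}

print(tier0_candidates(sampled))  # ['db_redo_log', 'db_datafile']
```

In practice you would sample these counters repeatedly and look for sustained pressure, not a single snapshot, before committing an application's files to SSD.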
Most Tier 2 storage is now as reliable as Tier 1 storage and can take advantage of the same storage technologies (snapshots, replication, etc.). In many cases the two tiers can live in the same storage system: your Tier 1 storage may be just a few shelves of 15k RPM small-capacity drives, while your Tier 2 might be several racks of 10k RPM medium-capacity drives.

An important point here is not buying the highest-capacity drives available. The higher the drive capacity, the longer the RAID rebuild time, and RAID rebuild may be THE issue in production storage. The time it takes to rebuild a RAID 5 or RAID 6 volume continues to increase, which is another reason I advise against SATA in these two tiers. A few vendors are trying to address RAID rebuild issues, and we will do a deeper dive in an upcoming entry.

Tier 3 has become very interesting, with many different levels within the tier, and will take a blog entry all by itself. How about next time?
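One footnote before signing off, to put the capacity-versus-rebuild-time point above in rough numbers: a rebuild has to read or write the entire failed drive, so time scales with capacity divided by the sustained rebuild rate. The sketch below uses an assumed 50 MB/s rebuild rate purely for illustration; real arrays rebuild slower under production load.

```python
# Back-of-the-envelope estimate of RAID rebuild time. The 50 MB/s
# sustained rebuild rate is an assumption for illustration only.

def rebuild_hours(capacity_gb, rebuild_mb_per_s):
    """Estimated hours to rebuild one failed drive of the given size."""
    return (capacity_gb * 1024) / rebuild_mb_per_s / 3600

# 146 GB 15k drive vs. a 1 TB high-capacity drive at the same rate
small = rebuild_hours(146, 50)    # ~0.8 hours
large = rebuild_hours(1000, 50)   # ~5.7 hours
print(f"{small:.1f} h vs {large:.1f} h")
```

The exposure window, during which a second drive failure can cost you the volume, grows roughly sevenfold here, which is why drive capacity, not just drive count, belongs in the Tier 1/Tier 2 purchase decision.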

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
