
Commentary
2/8/2010 04:14 PM
George Crump

The Importance Of QoS In Automated Tiering

In a conversation I had a few weeks ago with Pillar Data's CEO, Mike Workman, we discussed his recent blog entry on the "Auto Tiering of Data". In this blog he brings up several important considerations as vendors and users begin to examine automated tiering. One I'd like to elaborate on is QoS in Automated Tiering.

Quality of Service (QoS) has, of course, been with us in networking for a while. We have it enabled in IP routers, we are beginning to see it in network interface cards, and now we are even seeing it in Fibre Channel networks with NPIV. Automated tiering brings the concept to storage: active data is automatically migrated up to faster storage, like SSD or even DRAM, while inactive data is gradually migrated down to slower SATA-based storage. This is a great start, and compared to the alternative it should provide a performance boost in almost every situation.
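As a rough illustration of that baseline behavior, here is a minimal sketch of a naive, activity-only tiering pass. The tier names, thresholds, and data layout are hypothetical, not any vendor's actual implementation:

```python
# Hypothetical sketch of naive activity-based tiering: promote blocks that
# were accessed frequently in the last interval, demote blocks that were not.
# Tier names and thresholds are illustrative assumptions only.

PROMOTE_THRESHOLD = 100   # accesses per interval before data moves up
DEMOTE_THRESHOLD = 5      # accesses per interval before data moves down

def retier(blocks):
    """blocks: list of dicts with 'id', 'tier' ('ssd' or 'sata'), 'accesses'."""
    for block in blocks:
        if block['tier'] == 'sata' and block['accesses'] >= PROMOTE_THRESHOLD:
            block['tier'] = 'ssd'      # active data migrates up
        elif block['tier'] == 'ssd' and block['accesses'] <= DEMOTE_THRESHOLD:
            block['tier'] = 'sata'     # inactive data migrates down
        block['accesses'] = 0          # reset the counter for the next interval
    return blocks

# Example: one hot block gets promoted, one cold block gets demoted.
blocks = [
    {'id': 'db-index', 'tier': 'sata', 'accesses': 450},
    {'id': 'old-logs', 'tier': 'ssd',  'accesses': 2},
]
print(retier(blocks))
```

Note that this policy looks at activity and nothing else, which is exactly the limitation discussed next.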

Merely identifying active data and moving it to a high-performance tier is a brute-force method that should eventually give way to a more intelligent approach where possible. Just because data is active does not mean it belongs on the fastest and most expensive tier of storage. In most automated tiering environments the fast tier is a finite repository, and it may be impractical to keep all the active data on that tier. Very active but unimportant data could prevent slightly less active, but more important, data from ever making it to the high-speed tier.

What is needed is a more granular QoS capability in automated tiering systems: the ability to exclude or include data by type or location, for example. Eventually these systems need to learn who the requester is. If the request comes from a small number of users on a relatively slow network connection, leave the data on mechanical storage; if the requester is an application or a large number of users, move the data up to the performance tier.
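One way to picture that kind of QoS-aware decision is the sketch below, which weights raw activity by a business-priority class and by who is asking, so that a finite fast tier goes to the data that can actually benefit. The class names, weights, and thresholds are assumptions for illustration, not any shipping product's logic:

```python
# Hypothetical QoS-aware placement score: activity alone does not decide
# promotion; business priority and the requester profile weigh in too.
# All class names, weights, and thresholds are illustrative assumptions.

QOS_WEIGHT = {'critical': 3.0, 'standard': 1.0, 'archive': 0.2}

def placement_score(accesses, qos_class, requesters, slow_network):
    score = accesses * QOS_WEIGHT[qos_class]
    if requesters > 50:        # an application or a large user population
        score *= 2.0
    if slow_network:           # a few users on a slow link gain little from SSD
        score *= 0.25
    return score

def choose_for_fast_tier(candidates, fast_tier_capacity):
    """Rank candidates by score and fill the finite fast tier in order."""
    ranked = sorted(
        candidates,
        key=lambda c: placement_score(c['accesses'], c['qos_class'],
                                      c['requesters'], c['slow_network']),
        reverse=True)
    return ranked[:fast_tier_capacity]

# A very active but archive-class data set loses its SSD slot to a slightly
# less active, critical data set that serves an application.
candidates = [
    {'id': 'scratch-data', 'accesses': 900, 'qos_class': 'archive',
     'requesters': 3,   'slow_network': True},
    {'id': 'orders-db',    'accesses': 600, 'qos_class': 'critical',
     'requesters': 200, 'slow_network': False},
]
print(choose_for_fast_tier(candidates, fast_tier_capacity=1))
```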

For the time being, solid state disk and DRAM are finite resources. You want to make sure you are putting not just active data on these tiers, but data that can actually take advantage of them.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
