Dark Reading is part of the Informa Tech Division of Informa PLC


Cloud
9/8/2008 09:39 AM
George Crump
Commentary

Cloud Storage's Weakness

Cloud storage has one glaring weakness compared with traditional storage offerings: it does not get cheaper over time. Today, some services will increase your capacity each year at "no extra charge," but you are still paying the same amount for data written last year as for data written this year.

If you think about it, that just does not make sense. Data written last year is far less likely to be accessed or referenced than data you wrote yesterday. Why should you pay the same amount for that storage? This is another area where the back-end architecture of the cloud solution is critical. Cloud storage -- or any storage solution, for that matter -- should be able to move data to progressively less expensive storage. That could be higher-density but slower disk drives or, depending on access needs, even another form of media such as optical or tape.
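The idea of letting storage cost decay with data age can be sketched as a simple age-based tiering policy. The tier names, age thresholds, and per-GB prices below are purely illustrative assumptions, not any vendor's actual offering:

```python
from datetime import timedelta

# Hypothetical tiers, ordered from most to least expensive.
# Thresholds and $/GB-month rates are illustrative assumptions.
TIERS = [
    ("fast-disk",    timedelta(days=30),  0.15),
    ("dense-disk",   timedelta(days=365), 0.05),
    ("tape-archive", timedelta.max,       0.01),
]

def tier_for(age: timedelta) -> str:
    """Return the first (cheapest acceptable) tier whose age threshold covers this data."""
    for name, max_age, _rate in TIERS:
        if age <= max_age:
            return name
    return TIERS[-1][0]

print(tier_for(timedelta(days=5)))    # data written this week stays on fast disk
print(tier_for(timedelta(days=400)))  # year-old data falls through to archive media
```

A real system would run a policy engine like this periodically and migrate objects whose tier assignment has changed, which is what lets the monthly cost of old data fall instead of staying flat.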

To further drive down their costs and your monthly fee, the architecture also should allow for lower energy consumption as data ages. It should be able to take advantage of drives that can spin down, or even have the intelligence to power a drive off completely -- nothing is greener than off.
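A spin-down decision can be as simple as an idle timer. The threshold here is an assumption for illustration; real MAID-style systems tune it against spin-up latency and drive wear:

```python
# Minimal spin-down policy sketch: power a drive down once no I/O has
# touched it for a set idle period. The 4-hour threshold is an assumption.
SPIN_DOWN_AFTER_HOURS = 4.0

def should_spin_down(hours_since_last_io: float, powered_on: bool) -> bool:
    """A drive is a spin-down candidate only if it is on and has been idle long enough."""
    return powered_on and hours_since_last_io >= SPIN_DOWN_AFTER_HOURS

print(should_spin_down(6.0, True))   # idle long enough -> candidate
print(should_spin_down(1.0, True))   # recently active -> keep spinning
```

The trade-off to watch is that the first access to a spun-down drive pays a spin-up delay, which is why this fits aged, rarely accessed data rather than the active tier.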

Powering down drives is just the start. The node or shelf that holds these drives consumes a fair amount of power, so the ability to completely power down nodes within the architecture will further drive down the cost of storing older data. Even manufacturers without the grid-like architecture that seems to be common in cloud storage should look at powering down drive shelves that are full of powered-down drives.

The most important part of all of this is automation. Whether it is a public cloud, private cloud, or disk-based archive, these solutions are simply too large to expect the user to keep up with managing data movement and power management. The system itself will need to decide automatically where data should be placed and how power should be managed in those environments. Eventually, these systems should include a built-in billing system that charges users based on the different classes of storage they are using.
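Billing by storage class boils down to multiplying the capacity held in each class by that class's rate. The class names and $/GB-month rates below are assumptions for illustration, not any provider's actual pricing:

```python
# Illustrative per-class monthly rates ($/GB-month); values are assumptions.
RATES = {"fast-disk": 0.15, "dense-disk": 0.05, "tape-archive": 0.01}

def monthly_bill(usage_gb: dict[str, float]) -> float:
    """Sum each class's stored GB times its rate to get the month's charge."""
    return sum(RATES[cls] * gb for cls, gb in usage_gb.items())

bill = monthly_bill({"fast-disk": 100, "dense-disk": 500, "tape-archive": 2000})
print(f"${bill:.2f}")  # 100*0.15 + 500*0.05 + 2000*0.01 = $60.00
```

Notice that most of the capacity in this example sits on the cheapest tier but contributes the smallest share of the bill -- exactly the cost curve the article argues aging data should follow.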

Cloud services will always need to be cost competitive. Suppliers whose road maps include the ability to automatically move data to less expensive, power-managed media will be better able to maintain that price advantage. When you evaluate the architecture, look for solutions that allow data to become less expensive with age.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
