Dark Reading is part of the Informa Tech Division of Informa PLC


Cloud

9/8/2008 09:39 AM
George Crump
Commentary

Cloud Storage's Weakness

Cloud storage has one glaring weakness compared with traditional storage offerings: it does not get cheaper over time. Today, some services will increase your capacity each year at "no extra charge," but you are still paying the same rate for data written last year as for data written this year.

If you think about it, that just does not make sense. Data written last year is far less likely to be accessed or referenced than data you wrote yesterday. Why should you pay the same amount for that storage? This is another area where the back-end architecture of the cloud solution is critical. Cloud storage -- or any storage solution, for that matter -- should be able to move data to progressively less expensive storage. That could be higher-density but slower disk drives or, depending on access needs, some other form of media such as optical or even tape.
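The age-based placement policy described above can be sketched in a few lines. This is purely illustrative -- the tier names and age thresholds are hypothetical, not taken from any vendor's actual architecture:

```python
from datetime import datetime, timedelta

# Hypothetical tier thresholds: data untouched for 30+ days moves to
# dense/slow disk; data untouched for a year moves to offline media
# (optical or tape). Ordered from coldest tier to hottest.
TIERS = [
    (timedelta(days=365), "offline"),     # optical/tape, possibly powered off
    (timedelta(days=30), "dense_disk"),   # high-density, slower, cheaper disk
    (timedelta(0), "primary"),            # fast disk for recently written data
]

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Pick the cheapest tier whose age threshold the data has passed."""
    age = now - last_access
    for threshold, tier in TIERS:
        if age >= threshold:
            return tier
    return "primary"
```

A periodic background job would call `choose_tier` for each object and migrate anything whose current placement no longer matches, which is what lets last year's data cost less than yesterday's.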

To further drive down their costs and your monthly fee, the architecture should also allow for lower energy consumption as data ages. Providers should be able to take advantage of drives that can spin down, or even a system intelligent enough to power drives off completely -- nothing is more green than off.
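The spin-down-then-off progression might look something like the following sketch. The idle thresholds here are made-up values for illustration, not a real power-management policy:

```python
# Hypothetical idle thresholds: spin a drive down after an hour of
# inactivity, and power it off entirely after a day -- because
# nothing is more green than off.
SPIN_DOWN_SECS = 3600
POWER_OFF_SECS = 24 * 3600

def drive_power_state(idle_seconds: float) -> str:
    """Map a drive's idle time to its cheapest safe power state."""
    if idle_seconds >= POWER_OFF_SECS:
        return "off"
    if idle_seconds >= SPIN_DOWN_SECS:
        return "spun_down"
    return "spinning"
```

An access to a spun-down or powered-off drive would of course pay a wake-up latency penalty, which is why this only makes sense for data the tiering policy has already identified as cold.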

Powering down drives is just the start. The node or shelf that holds these drives consumes a fair amount of power as well. The ability to completely power down nodes within the architecture will help further drive down the cost of storing older data. Even manufacturers without the grid-like architecture that seems to be common in cloud storage should look at the ability to power down drive shelves that are full of powered-down drives.

The most important part of all of this is automation. Whether it is a public cloud, a private cloud, or a disk-based archive, these solutions are going to be too large to expect the user to keep up with managing data movement and power management by hand. The system itself will need to decide automatically where data should be placed and how power should be managed in those environments. Eventually these systems should include a built-in billing system that charges users based on the different classes of storage they are using.
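That built-in billing idea reduces to a simple calculation: charge each user for the capacity they hold in each class of storage. The per-GB rates below are invented for illustration only:

```python
# Hypothetical per-GB monthly rates by storage class (illustrative
# numbers, not any provider's actual pricing). Colder tiers cost less,
# so data that ages into them automatically gets cheaper.
RATES_PER_GB = {
    "primary": 0.15,
    "dense_disk": 0.05,
    "offline": 0.01,
}

def monthly_bill(usage_gb: dict) -> float:
    """Sum the charge for each storage class a user consumes."""
    return sum(RATES_PER_GB[tier] * gb for tier, gb in usage_gb.items())
```

Under a scheme like this, the tiering and power-management automation directly shows up on the customer's invoice, rather than every gigabyte being billed at the same flat rate forever.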

Cloud services will always need to be cost competitive. Suppliers whose road maps include automatically moving data to less expensive, power-managed media will be better able to maintain that price advantage. When you look at an architecture, make sure you look for solutions that allow data to become less expensive with age.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
