George Crump

The Reality Of Private Clouds

In his blog "Clouds Are Only in the Sky" yesterday, Richard Martin suggested that a cloud must be on the public Internet for it to truly be a cloud and that if something resembling a cloud is used internally then it must be utility computing. He makes a very good point; however, I respectfully disagree.

I explained private clouds and their value in an entry last week. In fairness, my focus often centers on the storage aspect, while Rick's spans the entire cloud infrastructure.

While private clouds and utility computing, along with their storage counterparts, have similar capabilities, and either term can easily be stretched to cover the other, private clouds and the companies creating the technology emphasize different aspects of the term.

Private clouds share a public cloud's significant focus on being distributed. This distribution can span geographies as well as components within the platform, and it can be achieved by simple multisite replication like that done by cloud-specific platforms from Parascale and Bycast, by traditional archive platforms like Permabit and Copan Systems, or by altogether new approaches to distributing data via block dispersion similar to what Cleversafe is performing.

Private clouds also share a public cloud's focus on simple, massive, and linear scalability, achieved merely by adding components to an active environment. Another focus in most cases is the use of standard "off the shelf" hardware that is then either virtualized or arranged in some sort of multinode grid. Adding components delivers scale not only in capacity but also in compute. In fact, one of the battle lines in cloud storage in particular is whether you should scale capacity and performance together in the same module, or break them apart so that capacity and performance can each be scaled independently. The scale private clouds deliver will be far beyond what most people associate with utility computing.

Finally, private clouds in many cases will be the result of a migration from a public cloud. It will be common, as an organization grows, to look at the costs of having a service outsourced and decide to bring that capability in-house. In doing so, they will likely want to keep the components of the architecture that worked well while improving on the weak points.
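To make the distribution approaches above concrete, here is a minimal sketch of the block-dispersion idea in Python. This is purely illustrative and not based on any vendor's actual implementation (Cleversafe-style systems use stronger k-of-n erasure codes); it splits a block into k data slices plus a single XOR parity slice, so any one slice can be lost to a node or site failure and the block still reconstructed.

```python
def disperse(block: bytes, k: int = 4) -> list:
    """Split `block` into k equal data slices plus one XOR parity slice.

    Tolerates the loss of any single slice. Illustrative only: real
    dispersal systems use k-of-n erasure coding, not simple parity.
    """
    block += b"\x00" * ((-len(block)) % k)   # pad to a multiple of k
    size = len(block) // k
    slices = [block[i * size:(i + 1) * size] for i in range(k)]
    parity = bytearray(size)
    for s in slices:                          # parity = XOR of all data slices
        for i, byte in enumerate(s):
            parity[i] ^= byte
    return slices + [bytes(parity)]


def reconstruct(slices: list, missing: int) -> bytes:
    """Rebuild the slice at index `missing` by XOR-ing the survivors."""
    size = len(next(s for s in slices if s is not None))
    rebuilt = bytearray(size)
    for idx, s in enumerate(slices):
        if idx == missing:
            continue
        for i, byte in enumerate(s):
            rebuilt[i] ^= byte
    return bytes(rebuilt)
```

Spreading the resulting slices across sites gives the geographic distribution described above: a single failed location does not make the data unreadable, which is the property both public and private clouds are after.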
Keeping the term cloud will be a more logical and efficient way to communicate to the users of that service what is happening: "We still have our cloud; it's just behind our firewalls now." People will create private clouds much the way companies created intranets. They like the concept; they just want to keep it to themselves.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
