Cloud

10/25/2010 01:54 PM
George Crump
Commentary

What Business Data Should Be In The Cloud?

In our last entry we discussed different ways that you can move data into the cloud, something I call onramps. In theory the ability now exists to put all your data types on a cloud storage platform, but is that the right choice for your business? How do you determine which data you should put in the cloud?

The answer, as we will discuss in our upcoming webinar, "What's Your Cloud Storage Strategy?", is, like almost everything else in IT: it depends. It depends on what your key internal storage challenges are and what the internal resistance to using an external service might be. Notice what is not on that list: the size of your company, the amount of IT resources you have, and the amount of data you have. While it is often assumed that cloud storage is only for small businesses, there are cloud storage solutions for businesses of all sizes, including large enterprises.

The first area to examine is how much data is being accessed on a moment-by-moment basis. As you may have noticed from the discussion in our last entry, there is now an onramp or cloud gateway for almost every data type, ranging from backups to primary block storage. The moment-by-moment change rate plus the data type will determine how large the local gateway cache will need to be and how often data will need to be recalled from the cloud. The total size of the data set is for the most part irrelevant, other than the per-GB cost to store it, and that cost should be relatively static. What delays an application is the movement of data from the cloud back to your local cache. The more often data can be served from the local cache, either through smart caching algorithms or a large cache space, the better. Several cloud storage providers also charge extra for transfers out of the cloud back to local storage, which can lead to a surprise on your bill. Since most onramps or gateways give you a choice of provider, it makes sense to know what the hidden extras are from each provider.
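To make that cost behavior concrete, here is a back-of-envelope sketch in Python of how static storage charges and cache-miss egress charges add up. The per-GB rates, the hit ratios, and the monthly_cost helper are illustrative assumptions only, not any provider's actual pricing.

```python
# Back-of-envelope estimate of monthly cloud storage spend behind a gateway.
# The per-GB rates below are placeholders, not real pricing -- substitute the
# figures from your provider's rate card.

STORAGE_RATE_PER_GB = 0.10   # assumed monthly cost to keep a GB in the cloud
EGRESS_RATE_PER_GB = 0.15    # assumed cost to pull a GB back out of the cloud

def monthly_cost(total_gb, cache_hit_ratio, gb_read_per_month):
    """Estimate a month's bill: static storage plus egress for cache misses."""
    storage = total_gb * STORAGE_RATE_PER_GB
    # Only reads that miss the local gateway cache travel back from the cloud.
    egress = gb_read_per_month * (1.0 - cache_hit_ratio) * EGRESS_RATE_PER_GB
    return storage + egress

# Example: 10 TB stored, 2 TB read per month, with a good vs. poor cache hit ratio.
for hit_ratio in (0.90, 0.60):
    print(f"hit ratio {hit_ratio:.0%}: ${monthly_cost(10_000, hit_ratio, 2_000):,.2f}")
```

Note that the storage line stays flat as long as the data set does, while the egress line moves with how well the gateway cache absorbs reads.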

The impact of restoring data back from the cloud, and its potential extra cost, is one of the reasons that backup and archive data have been so popular. The transfer is almost always one way: upload. Also, most large recoveries can happen from the local cache and don't need the data stored in the cloud; the backup copy in the cloud mostly serves as a long-term retention area. As you move into using cloud storage for primary data, the transfer issues become a bit more thorny. The easiest use case to deal with is the file share. Most files on a file server are active for only a few days and then become dormant, which makes this an ideal use case for cloud storage: let the older files migrate to the cloud. Even if they do need to be recalled from cloud storage later, typically only a single user is impacted by the delay in access, and a single file access is relatively fast.
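As an illustration of the "let the older files migrate" idea, here is a minimal sketch of an age-based scan of a file share. The 30-day cutoff, the /srv/fileshare path, and the upload_to_cloud helper are hypothetical stand-ins for whatever policy and API your gateway actually provides.

```python
# Illustrative age-based migration policy for a file share: files untouched
# for more than AGE_DAYS are candidates to move to the cloud tier.

import os
import time

AGE_DAYS = 30  # assumed cutoff; most file-share data goes dormant within days

def find_dormant_files(root):
    """Yield file paths whose last-modified time is older than the cutoff."""
    cutoff = time.time() - AGE_DAYS * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                yield path

# Hypothetical usage -- upload_to_cloud() is not a real library call:
# for path in find_dormant_files("/srv/fileshare"):
#     upload_to_cloud(path)   # replace with your gateway's or provider's API
#     os.remove(path)         # or leave a stub/pointer, depending on the gateway
```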

Databases are a bit trickier. Here, look for applications where only a small portion of the data is accessed on a regular basis. Microsoft SharePoint is a good example of a "ready for cloud now" data set, as are some mail systems that store attachments and messages as discrete files. In the near future, don't rule out busy, transaction-oriented databases. As the developers of these platforms embrace the availability of cloud storage, they can build in ways to automatically segment or tier sections of data so that it can be stored on different storage types, and the cloud could be one of those types.
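A sketch of the kind of routing rule such a platform might apply when it tiers data by activity is shown below. The 90-day threshold, the partition names, and the choose_tier function are assumptions for illustration, not a feature of SharePoint or any specific database.

```python
# Hypothetical routing rule for activity-based tiering: recently accessed
# partitions stay on local primary storage, dormant ones move to a cloud tier.

from datetime import datetime, timedelta

COLD_AFTER = timedelta(days=90)  # assumed cutoff for "dormant" data

def choose_tier(last_accessed, now):
    """Return 'local' for recently used partitions and 'cloud' for dormant ones."""
    return "cloud" if now - last_accessed > COLD_AFTER else "local"

# Example: two partitions of a document library, judged as of this article's date.
now = datetime(2010, 10, 25)
partitions = {
    "2010-Q2-attachments": datetime(2010, 6, 1),   # untouched for months
    "current-projects": datetime(2010, 10, 20),    # active this week
}
for name, accessed in partitions.items():
    print(name, "->", choose_tier(accessed, now))
```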

The second common decision point is the initial load of data into the cloud. How do you get it all there? That will be the focus of our next entry.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
