Dark Reading is part of the Informa Tech Division of Informa PLC



01:54 PM
George Crump

What Business Data Should Be In The Cloud?

In our last entry we discussed different ways that you can move data into the cloud, something I call onramps. In theory the ability now exists to put all your data types on a cloud storage platform, but is that the right choice for your business? How do you determine which data you should put in the cloud?

The answer, as we will discuss in our upcoming webinar, "What's Your Cloud Storage Strategy," is, like almost everything else in IT: it depends. It depends on what your key internal storage challenges are and what the internal resistance to using an external service might be. Notice that this list does not include the size of your company, the amount of IT resources you have, or the amount of data you store. While it is often assumed that cloud storage is only for small businesses, there are cloud storage solutions for organizations of all sizes, including large enterprises.

The first area to examine is how much data is being accessed on a moment-by-moment basis. As you may have noticed from the discussion in our last entry, there is now an onramp or cloud gateway for almost every data type, ranging from backups to primary block storage. The moment-by-moment change rate, combined with the data type, determines how large the local gateway cache needs to be and how often data will need to be recalled from the cloud. The total size of the data set is largely irrelevant, other than the per-GB cost to store it, and that cost should be relatively static. What delays an application is the movement of data from the cloud back to your local cache. The more often data can be served from the local cache, either through smart caching algorithms or a larger cache, the better. Also, several cloud storage providers charge extra for transfers out of the cloud back to local storage, which can lead to a surprise on your bill. Since most onramps and gateways give you a choice of provider, it makes sense to know what the hidden extras are from each one.
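To make the cache-sizing and egress-cost trade-off above concrete, here is a back-of-envelope sketch. The working-set window, headroom factor, and per-GB egress price are illustrative assumptions, not figures from any particular provider.

```python
# Back-of-envelope sizing for a cloud gateway's local cache.
# All constants below are illustrative assumptions, not vendor figures.

def recommended_cache_gb(daily_change_gb: float, working_set_days: int = 7,
                         headroom: float = 1.25) -> float:
    """Size the local cache to hold the recent working set plus headroom,
    so most reads are served locally instead of recalled from the cloud."""
    return daily_change_gb * working_set_days * headroom

def monthly_egress_cost(recalled_gb: float, price_per_gb: float = 0.09) -> float:
    """Estimate the 'surprise' transfer-out charge some providers add
    when data moves from the cloud back to local storage."""
    return recalled_gb * price_per_gb

# Example: 50 GB of change per day with a one-week active working set.
cache = recommended_cache_gb(50)      # 437.5 GB of local cache
egress = monthly_egress_cost(200)     # $18.00 if 200 GB is recalled in a month
print(f"cache: {cache} GB, egress: ${egress:.2f}")
```

The point of the headroom factor is that a cache sized exactly to the working set thrashes as soon as access patterns shift; the egress function shows why recall frequency, not total data size, drives the monthly bill.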

The impact of restoring data from the cloud, and its potential extra cost, is one reason that backup and archive data have been so popular in the cloud. The transfer is almost always one way: upload. Most large recoveries can also happen from the local cache and don't need the copy stored in the cloud; the cloud copy mostly serves as a long-term retention area. As you move into using cloud storage for primary data, the transfer issues become thornier. The easiest use case to deal with is the file share. Most files on a file server are active for only a few days and then become dormant, which makes them ideal for cloud storage: let the older files migrate to the cloud. Even if a file does need to be recalled from cloud storage later, typically only a single user is impacted by the delay in access, and a single file access is relatively fast.
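The file-share policy above can be sketched in a few lines: walk the share and flag files whose last access is older than some threshold as migration candidates. The root path and the 30-day threshold are assumptions for illustration (and note that on file systems mounted with `relatime`, access times may be updated only approximately).

```python
# Minimal sketch of the file-share aging policy: files not accessed in
# N days become candidates to migrate to cloud storage, while active
# files stay on the local share. Threshold and paths are illustrative.
import os
import time

def dormant_files(root: str, max_idle_days: int = 30):
    """Yield paths under `root` whose last access time is older than
    `max_idle_days` days."""
    cutoff = time.time() - max_idle_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    yield path
            except OSError:
                continue  # file vanished or is unreadable; skip it

# Usage (hypothetical migration hook):
#   for path in dormant_files("/srv/fileshare"):
#       queue_for_cloud_migration(path)
```

A real gateway would also leave a stub or reparse point behind so the file reappears transparently on recall; this sketch only covers the selection step.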

Databases are trickier. Here, look for applications where only a small portion of the data set is accessed on a regular basis. Microsoft SharePoint is a good example of a "ready for cloud now" data set, as are some mail systems that store attachments and messages as discrete files. In the near future, don't rule out busy transaction-oriented databases. As the developers of these platforms embrace the availability of cloud storage, they can build in ways to automatically segment data into tiers so that it can be stored on different storage types, and the cloud could be one of those types.
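The auto-tiering idea above amounts to a policy that maps how recently a record was used to a storage tier, one of which can be the cloud. The tier names and age thresholds below are invented for illustration; a real platform would tune them per workload.

```python
# Hypothetical age-based tiering policy: map days since last access to a
# storage tier. Tier names and thresholds are illustrative assumptions.

def pick_tier(days_since_access: int) -> str:
    """Choose a storage tier for a record based on access recency."""
    if days_since_access <= 7:
        return "local-ssd"   # hot: busy transactional rows
    if days_since_access <= 90:
        return "local-disk"  # warm: occasionally read
    return "cloud"           # cold: dormant attachments, old documents

print(pick_tier(2))    # local-ssd
print(pick_tier(30))   # local-disk
print(pick_tier(365))  # cloud
```

The design choice worth noting is that the policy is evaluated per record (or per attachment), not per database, which is what lets a mostly-hot transactional system still push its cold tail to cheap cloud storage.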

The second common decision point is the initial load of data into the cloud. How do you get it all there? That will be the focus of our next entry.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
