Commentary
George Crump
10/28/2009 10:24 AM

File Virtualization, The Ultimate Cloud Gateway?

In our last entry we talked about the use of cloud storage as a backup target, but another ideal use case for cloud storage is to use it as an archive area. Almost every IT organization has old data that they want or must keep, but are struggling with where to keep it. Its ability to identify, automatically move and transparently recall data could make file virtualization the ultimate cloud gateway.

As we discussed in our white paper "Using Cloud Archive", moving data to a cloud archive requires two things: a method to analyze that data and qualify it for movement, and a vehicle to get it to the cloud. Many cloud providers, like Iron Mountain and Nirvanix, have established a mixed-model cloud service in which a gateway is placed in your data center. This gateway acts as a NAS-to-cloud translator and lets you move data to the cloud as simply as moving it to a network mount point. This is where file virtualization systems from companies like F5 and AutoVirt come into play.

File virtualization, also known as a global file system, allows multiple network shares to be addressed as one. Your users typically see only a single mount point, while the file virtualization software manages the physical aspects of the storage infrastructure behind the scenes. Think of it as a DNS server for files: most of us don't get to a web site via its IP address, we use the URL. In similar fashion, file virtualization abstracts the actual location of a file away from the user. Once in place, this enables a multitude of capabilities, one of the most significant being the transparent movement of data.
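
To make the analogy concrete, here is a minimal sketch in Python of the lookup a global namespace performs. The share names and paths are invented for illustration, not taken from any vendor's product: the client asks for a logical path and the virtualization layer resolves it to wherever the file physically lives, much as DNS resolves a name to an address.

# Conceptual sketch of the "DNS for files" idea behind file virtualization.
# The logical-to-physical mapping below is maintained by the virtualization
# layer; users and applications only ever see the logical path.
namespace = {
    "/corp/projects/plan.doc": "//nas-tier1/share01/projects/plan.doc",
    "/corp/archive/2007/report.pdf": "//cloud-gw/archive/2007/report.pdf",
}

def resolve(logical_path: str) -> str:
    """Return the physical location for a logical path, like DNS returns an IP."""
    return namespace[logical_path]

# The file can be moved between shares later; only the table entry changes,
# so the logical path the user sees never has to change.
print(resolve("/corp/archive/2007/report.pdf"))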

The transparent movement of data is typically used to move inactive data off of primary storage and onto second- or third-tier storage. For example, you could clear inactive data off your expensive tier 1 NAS and put it on power-managed storage like Nexsan's AutoMAID or on capacity-optimized storage like Permabit or Data Domain. Depending on your needs, all of these devices are viable secondary tiers. Cloud storage can potentially be one of those tiers, or an additional tier for even longer-term retention. The file virtualization software would manage the network share that the cloud gateway presents just as it would any other share.
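
A rough sketch of the kind of age-based policy that drives this movement is shown below. The mount points, the 180-day threshold, and the use of a simple file move are illustrative assumptions only; in a real deployment the virtualization layer would also update its namespace so that any later access is recalled transparently.

# Minimal sketch of an age-based movement policy, the sort of rule a file
# virtualization layer might apply behind the scenes. Paths and threshold
# are assumptions, not any vendor's implementation.
import os
import shutil
import time

PRIMARY = "/mnt/tier1-nas"        # expensive primary NAS (assumed mount point)
ARCHIVE = "/mnt/cloud-gateway"    # share presented by a cloud gateway (assumed)
AGE_LIMIT = 180 * 24 * 60 * 60    # move files not accessed for ~180 days

def sweep(primary=PRIMARY, archive=ARCHIVE, age_limit=AGE_LIMIT):
    now = time.time()
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            if now - os.path.getatime(src) > age_limit:
                # Recreate the directory layout on the archive tier, then move
                rel = os.path.relpath(src, primary)
                dst = os.path.join(archive, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)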

Going further, the file virtualization server or appliance could also act as the cloud gateway. Combining the two into one solution eliminates the need for a separate device in your data center and provides some flexibility. For example, you could send data to two different cloud providers for redundancy, or send different data types to different cloud suppliers based on their expertise. Some cloud providers may specialize in the retention of medical images, for instance, and offer retention and compliance management as part of their value-add.
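
As a sketch of that routing idea, the snippet below mirrors most data to two assumed providers and directs one specialist data type (medical images) to a dedicated provider. The provider names and the upload call are placeholders, not any vendor's API.

# Hedged sketch of provider routing: everything goes to two mirror targets
# for redundancy, except data types with a specialist provider.
import os

MIRROR_TARGETS = ["provider-a", "provider-b"]       # assumed redundant targets
SPECIALISTS = {".dcm": "medical-archive-provider"}  # e.g., DICOM medical images

def upload(provider: str, path: str) -> None:
    # Placeholder: a real gateway would call the provider's API or write to its share
    print(f"uploading {path} to {provider}")

def archive(path: str) -> None:
    ext = os.path.splitext(path)[1].lower()
    targets = [SPECIALISTS[ext]] if ext in SPECIALISTS else MIRROR_TARGETS
    for provider in targets:
        upload(provider, path)

archive("/corp/archive/scan0001.dcm")
archive("/corp/archive/2007/report.pdf")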

Cloud storage's biggest role may not be in backup but in archive. Unstructured data is an ever-growing problem, and file virtualization may be the best way to get that data to the cloud.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
