Commentary
8/31/2009 10:55 AM
George Crump

The Foundation Of The Data Asset

In my last entry we discussed Making Data an Asset. This entry will focus on where that data asset should be stored. What is needed is a strong storage foundation: one designed to last for years, if not decades, but also one that stores data efficiently and is, of course, complementary to the enterprise-class indexing described in our last entry.

This foundation is similar to the one we described in our recent article "The Foundation of DeDupe's Next Era". The difference is that here we are looking at data as an asset rather than merely something we have to retain. To ascertain what data is worthy of being an asset, we need indexing and classification performed across the data center by solutions from companies like Index Engines or Kazeon. Those solutions have to be enterprise in scale yet simple to implement: a single appliance, or a few, that can index the entire enterprise in a reasonable amount of time. Once the data has been mined, data that has value as an asset should be stored in a vault-like storage location as soon as it becomes static.

These types of storage vaults are being provided today by companies like EMC, Nexsan, NEC, and Permabit. Their goal is to retain data for years, store that data efficiently, and provide retention capabilities that maintain a chain of custody on that data.

These solutions must also be quick to search, and they must be able to move data back into production quickly. Although tape can now be indexed, disk archives may be a more ideal storage vault for asset data, since data that is an asset has a high probability of needing to be found and restored quickly. While indexing tape is important, indexing the data that is on disk today, so that you can understand which data is an asset and then retrieve it quickly, can give a company a competitive advantage.

This storage vault needs the other attributes of any disk archive. Beyond being able to find the asset, redundancy may be the most important attribute of this storage vault. The archive must be able to survive multiple component failures and still deliver the data back to you. Once you have classified something as an asset, storing it just in case you need it is no longer enough. You have verified that there is a high probability you will need it, and when you do, you have to be sure you can get it back.

Once the organization realizes that there is some "gold" in the data that they are keeping, the investment to find and then store that gold becomes obvious. It allows the IT team to more easily cost justify future purchases that will benefit not just the data assets but all the data in the enterprise.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.
