Risk
4/20/2012
01:29 PM
George Crump
Commentary

How To Protect Big Data Analytics

Big data analytics often means big challenges when it comes to data protection. Here are some things to keep in mind when you're working in these environments.

(Slideshow: Big Data Talent War: 10 Analytics Job Trends)
Data protection is often the forgotten part of any data center trend, and the launch of big data initiatives is no exception: protection is too often an afterthought. What makes big data, and especially big data analytics, particularly challenging, as I discussed in a recent column, is that it is the perfect storm for a data protection disaster.

Big data analytics has all the characteristics you don't want to see when you're trying to protect data. First, the sample set is often unique--for example, a device that monitors a soil sample every 30 seconds, a camera that takes thousands of images every minute, or a cell phone call center that logs millions of text messages. All that data is unique to the moment it was captured; if it is lost, it is impossible to recreate.
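
To get a feel for how quickly unrecoverable data accumulates, a rough back-of-envelope calculation helps. The rates and record sizes below are hypothetical illustrations, not figures from any real deployment:

```python
# Rough sizing for unique, non-recreatable data streams.
# All rates and record sizes are hypothetical assumptions.

SECONDS_PER_DAY = 86_400

# A soil sensor sampling every 30 seconds, ~1 KB per reading
soil_readings_per_day = SECONDS_PER_DAY // 30          # 2,880 samples/day
soil_bytes_per_day = soil_readings_per_day * 1_024

# A camera capturing 2,000 images per minute at ~500 KB each
camera_bytes_per_day = 2_000 * 60 * 24 * 500 * 1_024

# A call center logging 5 million text messages a day at ~200 bytes each
sms_bytes_per_day = 5_000_000 * 200

total_gb = (soil_bytes_per_day + camera_bytes_per_day + sms_bytes_per_day) / 1_024**3
print(f"Unique data at risk per day: {total_gb:,.1f} GB")
```

Even with these modest per-record sizes, the camera feed alone generates over a terabyte of irreplaceable data per day--none of which can be regenerated after a loss.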

That uniqueness also means that the data probably will not deduplicate well. As I discussed in a recent article, you may need to either turn off deduplication or at least factor in a very low effective ratio in such environments. This means that the capacity of the backup appliance may need to be much closer to the size of the real data set than in other backup situations, where you may be counting on a high level of dedupe effectiveness.
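
The capacity impact of a low effective dedupe ratio is easy to quantify. The data set size, retention policy, and ratios below are illustrative assumptions; typical backup workloads often see far higher effective ratios than unique analytic data does:

```python
# Sketch: backup appliance capacity needed under different dedupe ratios.
# Data set size, retained copies, and both ratios are hypothetical.

def required_capacity_tb(dataset_tb: float, retained_copies: int,
                         dedupe_ratio: float) -> float:
    """Logical data retained, divided by the effective dedupe ratio."""
    return dataset_tb * retained_copies / dedupe_ratio

dataset_tb = 500   # hypothetical analytics data set
copies = 8         # e.g., weekly fulls retained for two months

typical = required_capacity_tb(dataset_tb, copies, dedupe_ratio=10.0)
big_data = required_capacity_tb(dataset_tb, copies, dedupe_ratio=1.2)

print(f"Typical workload (10:1):   {typical:,.0f} TB usable needed")
print(f"Unique big data (1.2:1): {big_data:,.0f} TB usable needed")
```

Under these assumptions the same retention policy demands roughly eight times the usable capacity when the data barely deduplicates, which is why appliance sizing for analytics backup looks so different.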

[ Bigger data sets mean bigger compliance challenges. Read more at Big Data's Dark Side: Compliance Issues. ]

The sheer number of files that can be resident in big data analytic environments is also a challenge. For the backup application and the appliance to churn through billions of files, the bandwidth to the backup server and/or the backup appliance needs to be large, the receiving devices must be able to ingest data at the rate it can be delivered, and both need significant CPU processing power.
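
Working backward from the backup window shows why both bandwidth and per-file processing rate matter. The file count, average size, and window below are hypothetical:

```python
# Sketch: can the pipe and the appliance finish inside the backup window?
# File count, average file size, and window length are hypothetical.

files = 2_000_000_000        # two billion small files
avg_file_bytes = 4 * 1024    # 4 KB average
window_hours = 8

total_bytes = files * avg_file_bytes
window_seconds = window_hours * 3600

needed_gbps = total_bytes * 8 / window_seconds / 1e9   # sustained line rate
files_per_second = files / window_seconds              # metadata churn rate

print(f"Data to move: {total_bytes / 1e12:.1f} TB")
print(f"Sustained throughput needed: {needed_gbps:.1f} Gb/s")
print(f"Files processed per second:  {files_per_second:,.0f}")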

There's also a database component to big data that needs to be considered. Analytic information is often processed into either an Oracle or Hadoop environment of some sort, so live protection of that environment may be required. This means a smaller number of larger files need to be backed up.

This is a worst-case mixed workload for high-performance backup: billions of small files alongside a small number of large files, which may break many backup appliances. Finding one that can ingest this mixed workload at full speed, with a deduplication configuration that won't impact performance, and that can scale to massive capacities may be the biggest challenge in the big data backup market. You may also have to consider tape, and if so, the disk backup vendor needs to know how to work with it.
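
One way to reason about such a mixed workload is to route files to different protection strategies by size. This is a minimal sketch; the thresholds and policy names are illustrative assumptions, not features of any particular backup product:

```python
# Sketch: routing a mixed backup workload by file size.
# Thresholds and policy names are illustrative assumptions.

def choose_policy(size_bytes: int) -> str:
    if size_bytes < 1 * 1024**2:        # billions of small analytic files:
        return "aggregate-and-stream"   # pack into large containers first
    if size_bytes < 10 * 1024**3:       # mid-size files: many parallel streams
        return "parallel-streams"
    return "tape-tier"                  # very large, cold database extracts

# Example: a 4 KB sensor log, a 512 MB intermediate file, a 50 GB extract
sizes = [4 * 1024, 512 * 1024**2, 50 * 1024**3]
print([choose_policy(s) for s in sizes])
```

Packing small files into large containers before they hit the appliance is one common way to avoid the per-file overhead that breaks ingest performance; the large cold extracts are the natural candidates for the tape tier mentioned above.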

The other form of big data, big data archive, should be less of an issue if it's designed correctly. If the design uses tape as part of the archive, then backup can be built in as part of the workflow. Designing the storage infrastructure for big data archive environments will be the subject of an upcoming column.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Storage Switzerland's disclosure statement.

Comments
JTAYLOR9009, 4/20/2012 | 7:36:11 PM
re: How To Protect Big Data Analytics
Interesting article. RDBMS & NoSQL both have their places in the world. I like to think of them as tools; sometimes you need a sledgehammer and sometimes you need a chisel. Both are equally important.
MarkNeilson, 2/23/2014 | 8:38:51 AM
re: How To Protect Big Data Analytics
Analytics is always essential for handling the de-duplication problem. It clearly exposes the overall issues and the growing demand for data management in the coming years. Bigger data also brings compliance issues, so it is important to take care of data quality and management alongside these concerns. Data cleansing software can also help with these issues.