Commentary
6/30/2010 11:07 AM
George Crump

Keeping Data Forever vs. Data Retention

Keeping data forever vs. data retention is going to become an increasingly fierce battle. In the past, data retention strategies always won, but as we discussed in the first entry in this series, the technology is now available to store data forever, and as we discussed in the second entry, the technology is there to find it when you need it.

The alternative to a keep-it-forever strategy is to have a very specific data retention strategy, something I used to promote. The challenge with implementing fixed data retention strategies is that you first have to get various non-IT departments to decide exactly how long their data needs to be retained. Herding cats may be an easier task. Many will say they want their data kept forever anyway, and then you have to convince them why they shouldn't. Obviously, a keep-it-forever strategy gives them exactly what they want, and giving people what they want is always popular.

Other departments will want their information deleted rather quickly, or to follow some obscure guideline. The reality is that different types of data need to be stored for varying lengths of time, and the regulations that dictate those timeframes are often vague and change frequently. The challenge is that most people don't store or tag their information by how it should be retained; they either don't have the time, don't know how to tag it, or wouldn't know what the retention policy was even if they could tag it. The odds of properly categorizing all the data, in all its forms, into the right retention windows are stacked against you. The man-hours to properly identify, up front and on an ongoing basis, all the data being created in your enterprise, and then to move that data into the right retention buckets at just the right time, are going to be staggering.
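To make that categorization burden concrete, here is a minimal sketch, in Python, of what a retention rule table and bucketing function might look like. Every department name, tag, and timeframe below is hypothetical, invented purely for illustration; it is not drawn from any particular product, regulation, or vendor API.

from datetime import date, timedelta

# Hypothetical retention windows, in days. In practice these numbers come
# from regulations that are often vague and change frequently.
RETENTION_RULES = {
    "finance": 365 * 7,
    "hr": 365 * 5,
    "engineering": 365 * 2,
    "marketing": 180,
}
DEFAULT_RETENTION_DAYS = 365  # what untagged data falls back to

def deletion_date(department: str, created: date) -> date:
    """Return the date on which an item becomes eligible for deletion."""
    days = RETENTION_RULES.get(department, DEFAULT_RETENTION_DAYS)
    return created + timedelta(days=days)

# An untagged or mis-tagged document silently falls into the default
# bucket, which is exactly the gap described above.
print(deletion_date("finance", date(2010, 6, 30)))
print(deletion_date("untagged", date(2010, 6, 30)))

Even this toy version exposes the problem: someone has to decide what goes in the table, keep it current as the rules change, and make sure every piece of data carries the right tag in the first place.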

Finally, and probably most damning to retention policies, digital assets are too portable. Even if you build the perfect data retention strategy, maintain it, and verify that data is deleted at just the right time, employees have a tendency to look after themselves first, not the organization. It is difficult to stop an employee who finds some incriminating data that may hurt the organization but helps or protects them; they can, for example, email the data to a personal email address or copy it to a USB stick. You have to assume that if the data was going to hurt the organization, it is going to get out somehow. It seems like it always does. The organization's best bet, other than never doing anything wrong, is to at least know about potential threats and be prepared to defend itself. If the data is deleted as part of a retention policy, that is hard to do.

In our next entry we will wrap up this series by looking at the costs associated with a keep-data-forever strategy and how to keep those costs under control. The strategy needs to be accomplished while meeting the typical cost challenges beyond hard costs: power, cooling, and space.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
