Dark Reading is part of the Informa Tech Division of Informa PLC



George Crump

Deletion And Reclamation - The Ultimate Deduplication Strategy

With all the products available today for optimizing storage through deduplication and/or compression, one of the best methods is often overlooked: deletion and reclamation.

When you mention deleting files as a data management method, hearts stop beating throughout IT. We all know the old rule: the moment you delete a file, even one that has not been accessed in years, is the moment a user will want it back. Beyond that old rule, there may be very legitimate legal retention reasons for keeping files on storage.

IT professionals need a tool, from companies like Tek-Tools or APTARE, that allows them to audit and trend data across the enterprise. Tools like this let the IT team identify inactive data, see how long it typically stays inactive, and judge whether it is likely to remain inactive. The goal, of course, is to keep this data on the least expensive storage tier that still delivers acceptable performance.
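Even without a commercial reporting tool, the core of this kind of audit can be sketched in a few lines. The script below is a minimal illustration, not a substitute for those products: the 180-day threshold and the scanned path are arbitrary assumptions, and it relies on last-access time, which some systems suppress with `noatime` mounts.

```python
import os
import time

def find_inactive_files(root, days=180):
    """Return (path, size_bytes, days_idle) for files not accessed in `days` days."""
    cutoff = time.time() - days * 86400
    inactive = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if st.st_atime < cutoff:
                idle_days = int((time.time() - st.st_atime) / 86400)
                inactive.append((path, st.st_size, idle_days))
    # Largest files first -- the best candidates for tiering or deletion.
    return sorted(inactive, key=lambda rec: rec[1], reverse=True)

if __name__ == "__main__":
    # "/srv/data" is a placeholder path for this sketch.
    for path, size, idle in find_inactive_files("/srv/data"):
        print(f"{size:>12}  {idle:>5}d  {path}")
```

Run repeatedly over time, even a simple report like this provides the trend data the article describes: which files went idle, and whether they stayed that way.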

These same tools can identify data that is dormant but likely to become active again, as well as data that simply cannot be deleted because of policy or regulation. The best course of action is to move this data to a disk archive storage platform, where it can be deduplicated and compressed for longer-term retention.
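The move step itself is straightforward once the policy has classified the files. The sketch below assumes the archive tier is simply another mounted path; the function name, arguments, and preserve-the-layout choice are all illustrative, not any particular product's API.

```python
import os
import shutil

def archive_files(paths, source_root, archive_root):
    """Move each file to the archive tier, preserving its directory layout
    so the data remains findable under the same relative path."""
    moved = []
    for path in paths:
        rel = os.path.relpath(path, source_root)
        dest = os.path.join(archive_root, rel)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.move(path, dest)  # copies then deletes when crossing mounts
        moved.append(dest)
    return moved
```

In practice an archiving product would also leave a stub or link behind so users and applications can still locate the relocated data, but the basic relocation is no more than this.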


With all this data identified and either deleted or moved, how do you recapture the space? The capacity has been freed, but it is still assigned to the server that originally housed the data. While most operating systems can now grow a volume, they cannot shrink one.

Even thin provisioning struggles to return freed space to the rest of the storage pool. Most thin provisioning systems cannot distinguish blocks that once held now-deleted files from blocks holding live data, because the file system records deletions only in its own metadata and never tells the array.

Clearly there are other advantages to archiving or deleting data, but if space reclamation is your primary goal, it is questionable how much space you will actually be able to repurpose. Without advances in how file systems handle recently deleted data, you may be just as well off leaving the data in place and compressing or deduplicating it.

As we discuss in our Thin Provisioning White Paper, what is needed here is cooperation between file system vendors and storage hardware vendors. For example, Symantec has partnered with companies like 3PAR and HDS to develop a thin-aware API set that allows this space to be reclaimed after a large deletion.

This type of technology needs to be deployed quickly, and its use spread across other operating systems. Technologies that increase storage efficiency, like deduplication, compression, archiving, and thin provisioning, remain valuable but cannot reach their full potential without it.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.

