
Delete all you like, but it won't free up space

You've been (de)duped ...


Comment: Networker blog author Preston de Guise has pointed out a simple and inescapable fact: deleting files on a deduplicated storage volume may not free up any space.

De Guise points out that in un-deduplicated storage "there is a 1:1 mapping between amount of data deleted and amount of space reclaimed", and that space reclamation is near-instantaneous. With deduplication, neither need be true.

Huh? Think about it. You add files to a deduplicated volume, and any blocks of data in them that are identical to blocks already stored are deduplicated out of existence and replaced by pointers. The stored file shrinks. This carries on as more files are added and the volume's capacity gets used up. You notice, and start deleting files to reclaim space. You may then find that much of the deleted files' originally fat content was actually skinny pointers, and you reclaim a few bytes of space instead of megabytes or terabytes. Oops; you just got stuffed by deduplication.
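To put numbers on that, here is a minimal Python sketch of a content-addressed block store - a toy model, not any vendor's implementation; the ToyDedupeStore class, the 4KB block size and the sha256 hashing are illustrative assumptions. Identical blocks are stored once and later copies become pointers, so deleting a heavily deduplicated file hands back far fewer bytes than its logical size:

import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size

class ToyDedupeStore:
    """Toy content-addressed store: one physical copy per unique block."""

    def __init__(self):
        self.blocks = {}    # digest -> block bytes (physical storage)
        self.refcount = {}  # digest -> number of file references
        self.files = {}     # file name -> list of digests (the "pointers")

    def write(self, name, data):
        digests = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            d = hashlib.sha256(block).hexdigest()
            if d not in self.blocks:      # unique block: store it for real
                self.blocks[d] = block
            self.refcount[d] = self.refcount.get(d, 0) + 1
            digests.append(d)             # duplicate block: only a pointer is kept
        self.files[name] = digests

    def delete(self, name):
        # Deleting a file only drops its pointers; the physical blocks stay put
        # until a separate reclamation pass decides nothing else needs them.
        for d in self.files.pop(name):
            self.refcount[d] -= 1

    def physical_blocks(self):
        return len(self.blocks)

# Two near-identical "backups": the second costs almost nothing to store ...
store = ToyDedupeStore()
monday = b"".join(bytes([i]) * BLOCK_SIZE for i in range(40))   # 40 distinct blocks
store.write("monday.img", monday)
store.write("tuesday.img", monday + bytes([99]) * BLOCK_SIZE)   # 40 old + 1 new block
print(store.physical_blocks())   # 41 physical blocks, not 81

# ... and deleting it gives almost nothing back.
store.delete("tuesday.img")
print(store.physical_blocks())   # still 41: the fat file was mostly skinny pointers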

Space reclamation with dedupe also requires the dedupe function to do some scanning once a file is deleted:

Whenever data is deleted from a deduplication system, the system must scan remaining data to see if there are any dependencies. Only if the data deleted was completely unique will it actually be reclaimed in earnest; otherwise all that happens is that pointers to unique data are cleared. (It may be that the only space you get back is the equivalent of what you’d pull back from a Unix filesystem when you delete a symbolic link.)

Not only that, reclamation is rarely run on a continuous basis on deduplication systems – instead, you either have to wait for the next scheduled process, or manually force it to start.
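That dependency scan can be pictured as a mark-and-sweep pass over the toy store sketched above - again an illustrative model, not any product's actual housekeeping. It walks every surviving file's pointers and frees only blocks nothing still references, which is exactly why it tends to run on a schedule rather than at delete time:

def reclaim(files, blocks):
    # Mark: scan the pointers of every file that still exists -- the
    # "see if there are any dependencies" step described above.
    referenced = set()
    for digests in files.values():
        referenced.update(digests)

    # Sweep: only blocks that were unique to the deleted data are freed.
    reclaimed = 0
    for digest in list(blocks):
        if digest not in referenced:
            reclaimed += len(blocks.pop(digest))
    return reclaimed

# Run on a schedule (or forced by hand) against the toy store above:
#   freed = reclaim(store.files, store.blocks)
#   print(freed)   # 4096 bytes -- one block -- for a deleted 41-block file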

His conclusion is this:

The net lesson? Eternal vigilance! It’s not enough to monitor and start to intervene when there’s say, 5 per cent of capacity remaining. Depending on the deduplication system, you may find that 5 per cent remaining space is so critically low that space reclamation becomes a complete nightmare.

He recommends the use of "alerts, processes and procedures targeting" a set of capacity utilisation levels such as 60 per cent, 70 per cent, 75 per cent and so on.
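That tiered-alerting idea amounts to something as simple as the watchdog sketch below; the tiers and the alert hook are illustrative, not de Guise's own tooling:

WARN_LEVELS = (0.60, 0.70, 0.75, 0.80, 0.85, 0.90)  # example utilisation tiers

def check_capacity(used_bytes, total_bytes, alert=print):
    """Raise an alert for the highest utilisation tier that has been crossed."""
    utilisation = used_bytes / total_bytes
    crossed = [level for level in WARN_LEVELS if utilisation >= level]
    if crossed:
        alert(f"dedupe pool at {utilisation:.0%} (tier {crossed[-1]:.0%} crossed): "
              "start housekeeping now, not at the last 5 per cent")

check_capacity(used_bytes=7_600, total_bytes=10_000)   # 76%: the 75% tier fires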

Great idea. Preston de Guise is a clever guy. ®
