NetApp slims for Storage Foundation

Goes on thin reclamation diet

NetApp is supporting Symantec's thin reclamation API, almost two years after it was announced.

Symantec announced its thin reclamation API as part of a Veritas Storage Foundation update in October 2008, with 3PAR first off the blocks in supporting it. IBM gave the API the thumbs up a year later. It solves a problem with thinly-provisioned block storage: when a host server deletes files, the storage array underneath knows nothing about it.

With thinly-provisioned storage, a host server application only consumes storage as it writes data. Although it may have a logical allocation of, say, 10TB, if it has written only 5TB of data then that is all the storage array actually gives it, driving up disk utilisation. If that application then deletes a 250GB file, the space remains allocated by the storage array, sending disk utilisation down again. Cue a host running Veritas Storage Foundation software: it sends a message to the array using the thin reclamation API, and the array hunts down the deleted file's blocks, reclaims the space and returns it to its general storage pool for use elsewhere.
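To make that flow concrete, here is a minimal sketch in Python of a thinly-provisioned pool. The names (ThinPool, write, delete, reclaim) are entirely hypothetical; this is not Symantec's API or NetApp's code, just an illustration of why deleted blocks stay allocated until the array is told to let them go.

TB = 1.0              # work in terabytes for readability
GB = TB / 1000.0

class ThinPool:
    """Toy model of a thinly-provisioned volume on a storage array."""
    def __init__(self, logical_size_tb):
        self.logical_size = logical_size_tb  # what the host thinks it has
        self.allocated = 0.0                 # physical capacity actually consumed
        self.live_data = 0.0                 # data the host still cares about

    def write(self, size_tb):
        # Physical capacity is only consumed as the host writes data.
        self.allocated += size_tb
        self.live_data += size_tb

    def delete(self, size_tb):
        # The host drops a file, but the array has no idea: the blocks
        # stay allocated until something tells it otherwise.
        self.live_data -= size_tb

    def reclaim(self):
        # What the thin reclamation message achieves: space backing
        # deleted files goes back to the array's general pool.
        freed = self.allocated - self.live_data
        self.allocated = self.live_data
        return freed

pool = ThinPool(logical_size_tb=10 * TB)  # 10TB logical allocation
pool.write(5 * TB)                        # host has written 5TB
pool.delete(250 * GB)                     # host deletes a 250GB file
print(f"Allocated before reclaim: {pool.allocated:.2f}TB")           # 5.00TB
freed = pool.reclaim()
print(f"Freed {freed:.2f}TB, allocated now {pool.allocated:.2f}TB")  # 0.25TB, 4.75TB

In real life the array, not a toy object, frees the blocks, but the before-and-after numbers are the point: the 250GB backing the deleted file goes back into the pool for use elsewhere.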

NetApp calls this hole punching and had introduced a host-based deleted file space reclamation facility with SnapDrive for Windows back in 2008. It wanted an industry-standard way of doing this, but that has not come to pass and so now it's going with the Symantec flow.

In these days of focus on general data reduction, having disk capacity occupied by deleted files is just silly. Veritas Storage Foundation users with NetApp arrays will now be able to increase their disk utilisation rate. The larger the IT shop and the higher the file deletion rate, the more effective this API will be. ®
