BridgeSTOR: They called us mad, but we've put deduped data on TAPE!

Are we crazy or what?!

Tape Summit BridgeSTOR has announced it will store deduplicated data on tape, a medium almost everybody else considers unsuitable for deduplication.

John Matze said his company's Data Deduplication File System (DDFS) works with the open Linear Tape File System (LTFS) so that deduped data on tape can be rehydrated and restored. Matze was, at one time, the chief technology officer for tape and disk data protection company Overland Storage.

A deduplication system typically involves an array of disk drives and a controller that removes duplicated blocks of stored file data, replacing each with a reference into an index of master blocks; some form of content hashing usually decides whether a block is unique. Rehydration, repopulating deduped files with the original blocks so applications can access them, requires that index of master blocks.
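BridgeSTOR hasn't published DDFS internals, but the general content-hashing idea can be sketched in a few lines of Python. This is a minimal illustration, not BridgeSTOR's code: the fixed 4KB block size, SHA-256 hashing and in-memory dict index are all our assumptions.

import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; real systems often chunk variably

def dedupe(data: bytes):
    """Store each unique block once, keyed by its content hash."""
    index = {}   # hash -> master block (the "index of master blocks")
    recipe = []  # ordered hash references describing the original data
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in index:  # the content hash decides uniqueness
            index[digest] = block
        recipe.append(digest)
    return index, recipe

def rehydrate(index, recipe) -> bytes:
    """Rebuild the original data from the master blocks and the recipe."""
    return b"".join(index[digest] for digest in recipe)

original = b"x" * BLOCK_SIZE * 8 + b"some trailing bytes"
index, recipe = dedupe(original)
assert rehydrate(index, recipe) == original
print(f"{len(recipe)} blocks referenced, {len(index)} unique blocks stored")
# prints: 9 blocks referenced, 2 unique blocks stored

The interesting part for tape is the rehydrate step: it needs arbitrary access to the master blocks, which is cheap on disk and expensive on a sequential medium.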

Because tape is a sequential medium and slower than disk to access, it has not been viewed as a practical medium for storing deduped data: rehydration means fetching master blocks scattered across the medium, which could take a long, long time.

CommVault has a deduplication-to-tape capability, but not much has been heard about it.

Precisely how DDFS combines with LTFS to make deduping tape a practical proposition will be revealed on Tuesday, 30 October, when Matze discusses the technology at the Tape Summit during the SNW Powering the Cloud event in Frankfurt, Germany.

A benefit of deduplication is that the effective capacity of a tape cartridge is multiplied by the deduplication ratio. An LTO-6 cartridge holding 3.2TB of raw data could hold 32TB of logical data at a 10:1 deduplication ratio.

The cost-per-GB of storing such data would be comparatively trivial, opening the door to vast tape archives with an enormous density of data, far outstripping any other archival technology - if DDFS works. For example, a tape library with, say, 10,000 LTO-6 cartridges could hold 10 x 10,000 x 3.2TB, equalling 320,000TB, or 312.5PB counting 1,024TB to the petabyte. That is quite a prospect, and the technology will be examined closely to see whether it can deliver on such a dizzying promise.
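Those back-of-the-envelope sums are easy to check. Here they are as a short Python snippet; the 10:1 ratio and 10,000-cartridge library are the illustrative figures above, not measured results:

raw_tb_per_cartridge = 3.2  # LTO-6 raw capacity cited above, in TB
dedupe_ratio = 10           # illustrative 10:1 deduplication ratio
cartridges = 10_000         # hypothetical library size

effective_tb = raw_tb_per_cartridge * dedupe_ratio  # 32TB per cartridge
library_tb = effective_tb * cartridges              # 320,000TB in total

print(f"Per cartridge: {effective_tb:.0f}TB")
print(f"Library total: {library_tb:,.0f}TB = {library_tb / 1024:.1f}PB (at 1,024TB per PB)")
# Library total: 320,000TB = 312.5PB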

BridgeSTOR says its technology uses "on-premises Virtual Deduplication Appliances" and the company has a "Deduplication as a Service" (DaaS) operating-expense-only cost model. Clearly it sees cloud service providers as prospective customers for its technology. ®
