Data Domain sticks neck out on deduping

Dopey notions from speed freaks?

Comment Can block-level access storage area network (SAN) data be deduplicated? Data Domain thinks so, but nobody else does.

The company also reckons deduping data on solid state drives will solve the SSD price problem. Deduplication is the removal of repeated data patterns from files to drastically shrink the space they occupy; it is used particularly with backed-up data, where successive versions of files are stored again and again in case they need to be recovered. By identifying duplicated and redundant patterns and replacing them with pointers to a gold or master copy, the space needed by the backups can be cut, sometimes to a tenth or less of the original backup volume.
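The pointer-to-gold-copy idea can be shown in a few lines. The sketch below is illustrative only, assuming fixed-size 4KB chunks and SHA-256 fingerprints; a commercial engine such as Data Domain's is far more sophisticated (variable-size chunking, on-disk indexes and so on), but the principle is the same.

import hashlib
import os

CHUNK = 4096  # assumption: fixed 4KB chunks; real engines often vary chunk size

def dedupe(data):
    store = {}    # fingerprint -> unique ("gold") chunk
    recipe = []   # ordered fingerprints needed to rebuild the data
    for i in range(0, len(data), CHUNK):
        piece = data[i:i + CHUNK]
        fp = hashlib.sha256(piece).hexdigest()
        store.setdefault(fp, piece)   # each unique pattern is stored once
        recipe.append(fp)             # a repeat costs one pointer, not one chunk
    return store, recipe

def rehydrate(store, recipe):
    # The reverse of deduplication: follow the pointers to rebuild the data.
    return b"".join(store[fp] for fp in recipe)

# Ten identical "backups" of a 1MB file shrink to one copy plus pointers.
original = os.urandom(1024 * 1024)
store, recipe = dedupe(original * 10)
assert rehydrate(store, recipe) == original * 10
assert sum(len(c) for c in store.values()) == len(original)   # ten-to-one saving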

Such deduplication speeds backup, and reduces the number of disks needed and the electricity required to spin them; it's good news all round, except that deduplication imposes a processing burden beyond what a normal drive array controller can cope with. That is why many deduplication products land incoming raw data on their disk drives first and deduplicate it after the files have arrived, an approach termed post-processing.

Data Domain relies on the most powerful Intel processors to drive its controllers, and is currently using 4-core Xeons in its top-end DD690 product. It processes the raw data as it comes in: in-line deduplication.
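The distinction between the two approaches comes down to where the dedupe step sits relative to the first disk write. A toy sketch, with an invented Disk class and fingerprinting helper, modelling no vendor's actual pipeline:

import hashlib

def fingerprint(chunk):
    return hashlib.sha256(chunk).hexdigest()

class Disk:
    """Toy backup target standing in for the drive array."""
    def __init__(self):
        self.blocks = []

def inline_backup(stream, disk, seen):
    # In-line: fingerprint each chunk before it touches disk, so only
    # unique chunks are ever written. The dedupe engine must keep pace
    # with the raw ingest rate - hence the hunger for controller CPU.
    for chunk in stream:
        fp = fingerprint(chunk)
        if fp not in seen:
            seen.add(fp)
            disk.blocks.append(chunk)   # duplicates never consume disk I/O

def post_process_backup(stream, disk, seen):
    # Post-process: land the raw data at full speed first, then make a
    # second pass that fingerprints chunks and reclaims duplicate space.
    disk.blocks = list(stream)          # fast landing; needs full raw capacity
    kept = []
    for chunk in disk.blocks:           # later pass, after the backup window
        fp = fingerprint(chunk)
        if fp not in seen:
            seen.add(fp)
            kept.append(chunk)
    disk.blocks = kept                  # space reclaimed only now

chunks = [b"A" * 4096, b"B" * 4096, b"A" * 4096]   # one repeated chunk
d1, d2 = Disk(), Disk()
inline_backup(chunks, d1, set())
post_process_backup(chunks, d2, set())
assert len(d1.blocks) == len(d2.blocks) == 2       # same end state, different path

Both paths end with the same two unique chunks on disk; the difference is that the in-line path must fingerprint at ingest speed, which is why Data Domain throws its biggest Xeons at the problem, while the post-process path needs enough raw capacity to land everything first.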

A two-way funnel of assumptions has confined deduplication to file data. It works like this: deduplication achieves its maximum benefit with highly redundant data; such data is typically backup and archive file data; therefore deduplication is treated as a file-level access process.

The general view is that deduplication can't be applied to transaction-level data, the stuff requiring the fastest reading and writing on storage arrays, because deduping and rehydrating (rehydration being the reverse of deduplication) take up too much time and slow the work rate of the servers involved. The net effect is that SAN data, accessed at block level, with the transaction data stored on tier 1 drives, is not deduped.
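The latency objection is easiest to see on the read path. In a toy model, with names invented for illustration, a deduped block read needs an extra level of indirection before it can return data:

# Toy model of the read path. All names are invented; no real array
# is this simple.

# Without dedupe: a logical block address (LBA) maps straight to data.
plain_lba_map = {0: b"payload-0", 1: b"payload-1"}

def plain_read(lba):
    return plain_lba_map[lba]                # one lookup

# With dedupe: the LBA resolves to a fingerprint, which is then looked
# up in a shared chunk store - an extra hop on every single read.
chunk_store = {"fp-a": b"payload-0"}
deduped_lba_map = {0: "fp-a", 1: "fp-a"}     # two LBAs share one gold chunk

def rehydrating_read(lba):
    fp = deduped_lba_map[lba]                # step 1: resolve the pointer
    return chunk_store[fp]                   # step 2: fetch the gold copy

assert plain_read(0) == rehydrating_read(0) == rehydrating_read(1)

One extra lookup looks trivial here, but on a real array it can mean extra disk seeks and decompression on every I/O, which is exactly what tier 1 transaction workloads cannot tolerate.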

De-duping block-access SAN data

Enter Frank Slootman, the CEO of dedupe market leader Data Domain. He says that dedupe is a technology that should be in storage arrays - all storage arrays. EMC, a strong competitor, takes the view that all Data Domain is doing is selling a storage array feature, and that this is wrong: deduplication is not a stand-alone feature, but needs integrating with your information infrastructure.

Slootman agrees: any Data Domain product is a storage array that happens to include a great dedupe engine. As a storage array it needs access interfaces, and it has started out with file-level ones: NFS, CIFS and NDMP. It is adding OST, Symantec's OpenStorage technology API. Slootman said: "We'll announce our alliance with OST protocol later this month. We have black-boxed it - it's no longer visible."

The benefit will be faster data transfer from Symantec software into Data Domain's products.

Another new access protocol would be a block-level one. "We've researched this and proved the ability to do it," Slootman told us. "I wouldn't exclude that you would see that from us but I'm not announcing it. Backup and archive are file-oriented and that's our market today. The front-end data is transaction-based and it's block-level access. It (deduplicating it) absolutely is possible."

The well-telegraphed new top-of-the-range product from Data Domain should arrive by mid-year, and will likely use 8-core Xeon controllers. Slootman said: "We refresh the top end of our line every year. You're going to get much bigger, much faster. The amount of throughput behind a single controller will be absolutely off the chart." He reckons that Data Domain's in-line dedupe will write data faster than some post-process dedupers can land raw data on disk.

What about clustering Data Domain boxes, so that they could scale more and, in theory, offer protection against node failure? Slootman said: "The technology already exists. We've been working on it for the last two and a half years. It's much, much more complex than a single node product. It's coming out likely before the end of this (calendar) year (and will be) installed at customers' sites."

He says the fundamental problem is not how big the system gets, it's how fast it gets: "We have the fastest dedupe heads in the industry by far, in-line or post-process." This company is fixated on speed.
