Data Domain sticks neck out on deduping

Dopey notions from speed freaks?

Comment Can block-access storage area network (SAN) data be deduplicated? Data Domain thinks so, but nobody else does.

The company also reckons deduping data on solid state drives will solve the SSD price problem. Deduplication is the removal of repeated data patterns in files to drastically shrink the space they occupy; it is used particularly with backed-up data, where versions of files are repeatedly stored in case they need to be recovered. By identifying duplicated and redundant patterns and replacing them with pointers to a gold or master copy, the space needed by the backups can be cut, sometimes to a tenth or less of the original backup data.
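To make the mechanism concrete, here is a minimal, hypothetical sketch of hash-based deduplication using fixed-size chunks. This is not Data Domain's engine: the 4KB chunk size, the SHA-256 fingerprints and the in-memory index are all illustrative assumptions, and shipping products typically use content-defined chunking and on-disk indexes.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real engines often chunk on content boundaries

def dedupe_store(data: bytes, store: dict) -> list:
    """Store each unique chunk once, keyed by its SHA-256 fingerprint,
    and represent the data as a list of pointers (fingerprints)."""
    pointers = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in store:      # first sighting of this pattern: keep the "gold" copy
            store[fp] = chunk
        pointers.append(fp)      # repeats cost a pointer, not another 4 KB
    return pointers

store = {}
p1 = dedupe_store(b"A" * 8192 + b"B" * 4096, store)   # night 1 backup
p2 = dedupe_store(b"A" * 8192 + b"C" * 4096, store)   # night 2: mostly unchanged data
print(len(store) * CHUNK_SIZE, "bytes stored for 24576 bytes of backups")  # prints 12288
```

In this toy case two 12KB backups sharing 8KB of data land as just three unique chunks, halving the space; the more repetitive the backup stream, the bigger the win.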

Such deduplication speeds backup, and reduces the number of disks needed and the electricity required to spin them; it's good news all round, except that deduplication imposes a processing burden above and beyond what a normal drive array controller can cope with. That means many deduplication products land incoming raw data on their disk drives first and deduplicate it after the files have come in, an approach termed post-processing.
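Continuing the hypothetical sketch above (and reusing its dedupe_store function and store index), the difference between the two approaches is simply where the dedupe work sits relative to the disk write:

```python
def inline_write(data: bytes, store: dict) -> list:
    # In-line: dedupe sits in the data path, so only unique chunks
    # ever reach disk; the controller pays the hashing cost up front.
    return dedupe_store(data, store)

def post_process_write(data: bytes, staging: list) -> None:
    # Post-process, step 1: land the raw data as-is (a fast, dumb write).
    staging.append(data)

def post_process_drain(staging: list, store: dict) -> list:
    # Post-process, step 2: a background job dedupes the staged data
    # after the backup window closes, then the raw copies can be freed.
    pointers = [dedupe_store(d, store) for d in staging]
    staging.clear()
    return pointers
```

In-line never writes duplicates but needs controller grunt in the data path; post-process accepts data at raw disk speed but needs staging capacity and a second pass.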

Data Domain relies on the most powerful Intel processors to drive its controllers, and is currently using 4-core Xeons in its top-end DD690 product. It processes the raw data as it comes in: in-line deduplication.

A two-way funnel has confined deduplication to file data. It works like this: because deduplication achieves its maximum benefit with highly redundant data, and because that data is typically backup and archive file data, deduplication has been treated as a file-level access process.

The general view is that deduplication can't be applied to transaction-level data, the stuff requiring the fastest reading and writing on storage arrays, because deduping and rehydrating (reassembling the original data from its pointers) take up too much time and slow the work rate of the servers involved. The net effect is that block-access SAN data, where transaction data is stored on tier 1 drives, is not deduped.
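For illustration, rehydration in the toy sketch above is just pointer-chasing (this reuses store and p1 from the first sketch). It is cheap for a sequential restore, but for random block-level reads every access would add index lookups, which is the latency objection in a nutshell:

```python
def rehydrate(pointers: list, store: dict) -> bytes:
    # Rebuild the original data by following each pointer back to its
    # stored chunk; on a real array each lookup can mean extra disk I/O.
    return b"".join(store[fp] for fp in pointers)

assert rehydrate(p1, store) == b"A" * 8192 + b"B" * 4096
```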

De-duping block-access SAN data

Enter Frank Slootman, CEO of dedupe market leader Data Domain. He says dedupe is a technology that should be in storage arrays, all storage arrays. EMC, a strong competitor, takes the view that Data Domain is merely selling a storage array feature as a standalone product, and that this is wrong: deduplication is not a stand-alone feature but needs integrating with your information infrastructure.

Slootman agrees: any Data Domain product is a storage array that happens to include a great dedupe engine. As a storage array it needs access interfaces, and it started out with file-level ones: NFS, CIFS and NDMP. It is adding OST, Symantec's OpenStorage technology API. Slootman said: "We'll announce our alliance with OST protocol later this month. We have black-boxed it - it's no longer visible."

The benefit will be faster data transfer from Symantec software into Data Domain's products.

Another new access protocol would be a block-level one. "We've researched this and proved the ability to do it," Slootman told us. "I wouldn't exclude that you would see that from us but I'm not announcing it. Backup and archive are file-oriented and that's our market today. The front-end data is transaction-based and it's block-level access. It (deduplicating it) absolutely is possible."

The well-telegraphed new top-of-the-range product from Data Domain should arrive by mid-year and will likely use 8-core Xeon controllers. Slootman said: "We refresh the top end of our line every year. You're going to get much bigger, much faster. The amount of throughput behind a single controller will be absolutely off the chart." He reckons that Data Domain in-line dedupe will write data faster than some post-process dedupers can land raw data on disk.

What about clustering Data Domain boxes, so that they could scale more and, in theory, offer protection against node failure? Slootman said: "The technology already exists. We've been working on it for the last two and a half years. It's much, much more complex than a single node product. It's coming out likely before the end of this (calendar) year (and will be) installed at customers' sites."

He says the fundamental problem is not how big these systems get, it's how fast they get: "We have the fastest dedupe heads in the industry by far, in-line or post-process." This company is fixated on speed.
