
Data Domain doing faster dedupe

Doubling up and up


Data Domain has increased the speed of its deduplication by between 50 and 100 per cent through a software update.

The speed boost varies by model: 50 per cent on the DD510, DD530 and DD580, 58 per cent on the DD565, 90 per cent on the DD690g and DDX array, and 100 per cent on the DD120.

The DD690g had a 1.4TB/hour throughput rating when it was introduced in May last year. Now it is rated at 2.7TB/hour (with Symantec NetBackup OpenStorage and a 10GbE connection).

It makes you wonder whether the code was that bad; clearly there was room for improvement. What Data Domain has done is get its deduplication code to execute more efficiently on multi-core CPUs, so that more of the work is done in parallel. The code, which uses a technology Data Domain calls Stream Informed Segment Layout (SISL), has been tuned to make better use of the available cores.
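Data Domain hasn't published how SISL schedules work across cores, but the general idea of fanning segment fingerprinting out over multiple CPUs can be sketched roughly as below. This is a minimal illustration, not Data Domain's code: the fixed 8KB segment size, SHA-1 fingerprint and four-worker pool are all assumptions made for the example.

    # Illustrative sketch only: fan segment fingerprinting out across CPU cores.
    # Not Data Domain's SISL code; segment size, hash and worker count are assumptions.
    import hashlib
    from concurrent.futures import ProcessPoolExecutor

    SEGMENT_SIZE = 8 * 1024  # assumed average segment size (8KB)

    def fingerprint(segment: bytes) -> str:
        """Content fingerprint used to spot segments that have already been stored."""
        return hashlib.sha1(segment).hexdigest()

    def split_into_segments(data: bytes, size: int = SEGMENT_SIZE) -> list[bytes]:
        """Naive fixed-size segmenting; real appliances use variable-size chunking."""
        return [data[i:i + size] for i in range(0, len(data), size)]

    def dedupe_stream(data: bytes, workers: int = 4) -> list[bytes]:
        """Fingerprint segments in parallel across cores, then keep only unique ones."""
        segments = split_into_segments(data)
        with ProcessPoolExecutor(max_workers=workers) as pool:
            prints = list(pool.map(fingerprint, segments))
        seen, unique = set(), []
        for fp, seg in zip(prints, segments):
            if fp not in seen:      # first time this content has been seen
                seen.add(fp)
                unique.append(seg)
        return unique

    if __name__ == "__main__":
        sample = b"backup data " * 100_000   # highly redundant, like most backup streams
        kept = dedupe_stream(sample)
        print(f"{len(split_into_segments(sample))} segments in, {len(kept)} actually stored")

The point is that fingerprinting is CPU-bound and embarrassingly parallel, which is why extra cores translate almost directly into extra deduplication throughput.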

Data Domain's platform operating software has revved from DD OS v4.5 to v4.6. Shane Jackson, Data Domain's senior director for product and channel marketing, contrasts competing vendors, who he says rely on adding disk spindles to boost deduplication speed, with Data Domain's reliance on CPU performance. "Intel shows up with a faster processor more often than Seagate shows up with a faster drive," he says.

That's over-egging the pudding, as all dedupe vendors rely on both software and disks. Data Domain does appear to have software algorithms as good as, if not better than, most of its rivals; certainly good enough for it to suggest its products can be used to deduplicate some primary storage applications.

The neat aspect of this is that Data Domain is widely expected to introduce new, Nehalem-boosted hardware later this year. With eight cores available there should be another doubling, or near-doubling, of performance compared to the quad-core Xeons used today. That means a DD690-class product could ramp its performance up to 5.4TB/hour, meaning 90GB/min or 1.5GB/sec.
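The arithmetic behind that projection is simple enough; here is a quick back-of-envelope check (the doubling itself is speculation about what the Nehalem parts will deliver, not a vendor figure):

    # Back-of-envelope conversion for the projected Nehalem-class figure.
    current_tb_per_hour = 2.7                         # DD690g with NetBackup OST
    projected_tb_per_hour = current_tb_per_hour * 2   # assumed doubling on eight cores

    gb_per_hour = projected_tb_per_hour * 1000        # decimal TB -> GB
    gb_per_min = gb_per_hour / 60
    gb_per_sec = gb_per_min / 60

    print(f"{projected_tb_per_hour} TB/hour = {gb_per_min:.0f} GB/min = {gb_per_sec:.1f} GB/sec")
    # -> 5.4 TB/hour = 90 GB/min = 1.5 GB/sec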

Sepaton and Diligent, the latter now owned by IBM, emphasise their deduping speed, and NetApp also pushes its ASIS dedupe into some primary data deduplication applications.

Two thoughts: first, it looks as if a deduping race is on. Second, it begins to look as if inline deduplication is quite viable for the majority of backup applications.

Offline dedupe vendors say that, to keep backup speeds high, you really should land the backup data uninterrupted by any processing and dedupe it afterwards. At speeds of up to 750MB/sec now, and with 1.5GB/sec coming, Data Domain would argue that most backup applications could be deduped inline, avoiding the need for a substantial chunk of disk capacity set aside to land the raw data.
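To put that capacity argument in numbers, here is a rough sizing sketch, assuming an illustrative eight-hour backup window at the 750MB/sec ingest rate quoted above; the window length is an assumption for the example, not a vendor figure.

    # Rough sizing of the "landing zone" a post-process dedupe box must reserve.
    # Ingest rate and backup window are illustrative assumptions, not vendor figures.
    ingest_mb_per_sec = 750            # raw backup ingest rate
    backup_window_hours = 8            # assumed nightly backup window

    raw_landed_tb = ingest_mb_per_sec * 3600 * backup_window_hours / 1_000_000
    print(f"Post-process landing zone needed: ~{raw_landed_tb:.1f} TB of raw disk")

Inline dedupe writes only the unique segments in the first place, so that staging capacity never needs to be provisioned at all. ®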
