Data Domain doing faster dedupe

Doubling up and up

Data Domain has increased the speed of its deduplication by between 50 and 100 per cent through a software update.

The speed boost varies: 50 per cent on the DD510, DD530 and DD580 models, 58 per cent on the DD565, 90 per cent on the DD690g and DDX array, and 100 per cent on the DD120.

The DD690g had a 1.4TB/hour throughput rating when it was introduced in May last year. Now it is 2.7TB/hour (with Symantec NetBackup OpenStorage and a 10GbitE connection).

It makes you wonder whether the code was that bad; clearly there was room for improvement. What Data Domain has done is get its deduplication code, built around a technology it calls Stream-Informed Segment Layout (SISL), to do more of its work in parallel on multi-core CPUs, so that it makes better use of the available cores.
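
SISL itself is proprietary, but the general shape of inline, fingerprint-based deduplication is straightforward enough to sketch. The following is a minimal, hypothetical Python sketch (not Data Domain's code; the fixed segment size, SHA-1 fingerprints and in-memory index are all illustrative assumptions) showing a backup stream being split into segments, fingerprinted and checked against an index, with the CPU-heavy hashing fanned out across cores.

```python
# Hypothetical sketch of inline, fingerprint-based deduplication.
# NOT Data Domain's SISL code: the 8KB fixed segments, SHA-1 hashes
# and plain dict index are assumptions made for illustration.
import hashlib
from concurrent.futures import ProcessPoolExecutor

SEGMENT_SIZE = 8 * 1024  # assumed average segment size


def fingerprint(segment: bytes) -> str:
    """CPU-heavy step: hash a segment to get its fingerprint."""
    return hashlib.sha1(segment).hexdigest()


def split(stream: bytes, size: int = SEGMENT_SIZE):
    """Naive fixed-size segmenting (real appliances use variable-size chunking)."""
    for i in range(0, len(stream), size):
        yield stream[i:i + size]


def dedupe(stream: bytes, index: dict) -> tuple[int, int]:
    """Dedupe one backup stream inline; returns (segments stored, segments skipped)."""
    stored = skipped = 0
    segments = list(split(stream))
    with ProcessPoolExecutor() as pool:      # fingerprinting spread across cores
        fingerprints = pool.map(fingerprint, segments)
    for seg, fp in zip(segments, fingerprints):
        if fp in index:                      # duplicate: keep a reference only
            skipped += 1
        else:                                # new data: store the segment
            index[fp] = seg
            stored += 1
    return stored, skipped
```

Run from under an `if __name__ == "__main__":` guard, a second pass over the same data reports everything as skipped. The point of the sketch is only that the expensive fingerprinting step parallelises naturally, which is why extra cores translate so directly into extra throughput.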

Data Domain's platform operating software has revved from DD OS v4.5 to 4.6. Shane Jackson, senior director for product and channel marketing at Data Domain, contrasts competing vendors who, he says, rely on adding disk spindles to boost deduplication speed, with Data Domain's reliance on CPU speed. "Intel shows up with a faster processor more often than Seagate shows up with a faster drive," he says.

That's over-egging the pudding, as all dedupe vendors rely on software and disks. Data Domain, it appears, happens to have software algorithms as good as, if not better than, most of its rivals', certainly good enough for it to suggest its products can be used to deduplicate some primary storage applications.

The neat aspect of this is that Data Domain is widely expected to introduce new, Nehalem-boosted hardware later this year. With eight cores available there should be another doubling or near-doubling of performance compared to the current quad-core Xeons being used. That means a DD690-type product could ramp its performance up to 5.4TB/hour, meaning 90GB/min or 1.5GB/sec.
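
The conversion is easy to check, bearing in mind that the doubling factor is this article's speculation rather than a vendor figure:

```python
# Back-of-envelope check of the projection above. The 2x factor for an
# eight-core Nehalem box is speculation, not a Data Domain figure.
current_tb_per_hour = 2.7                         # DD690g on DD OS 4.6
projected_tb_per_hour = current_tb_per_hour * 2   # assumed near-doubling

gb_per_min = projected_tb_per_hour * 1000 / 60    # 5.4TB/hr -> 90GB/min
gb_per_sec = gb_per_min / 60                      # -> 1.5GB/sec
print(f"{projected_tb_per_hour:.1f}TB/hr = {gb_per_min:.0f}GB/min = {gb_per_sec:.1f}GB/sec")
```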

Sepaton and Diligent (the latter now owned by IBM) emphasise their deduping speed, and NetApp also pushes its ASIS dedupe into some primary data deduplication applications.

Two thoughts: first, it looks as if a deduping race is on; second, it begins to look as if inline deduplication is quite viable for the majority of backup applications.

Offline dedupe vendors say that, to keep backup speeds high, you really should land the backup data uninterrupted by any processing and dedupe it afterwards. At speeds of up to 750MB/sec now, and with 1.5GB/sec coming, Data Domain would say that most backup applications could be deduped inline, avoiding the need for a substantial chunk of disk capacity kept aside to land the raw data.
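
To put rough numbers on that trade-off, here is a hypothetical back-of-envelope comparison; the backup window and ingest rate are assumptions for the example, not figures from Data Domain or the offline-dedupe camp. Post-process needs a landing area sized for the whole raw backup, while inline needs none provided the dedupe engine can keep pace with the incoming stream.

```python
# Illustrative comparison of inline vs post-process (offline) dedupe.
# All inputs are assumptions chosen for the example, not vendor figures.
backup_window_hours = 8
ingest_gb_per_sec = 1.0              # raw backup stream the appliance must absorb

# Post-process: the entire raw stream lands on disk before dedupe runs.
landing_capacity_tb = ingest_gb_per_sec * 3600 * backup_window_hours / 1000

# Inline: no landing area, provided dedupe throughput matches ingest.
inline_dedupe_gb_per_sec = 1.5       # the projected DD690-class rate above
keeps_pace = inline_dedupe_gb_per_sec >= ingest_gb_per_sec

print(f"Post-process landing area needed: ~{landing_capacity_tb:.1f}TB")
print(f"Inline dedupe keeps pace with ingest: {keeps_pace}")
```

At an assumed 1GB/sec ingest over an eight-hour window, that is roughly 29TB of landing disk the inline approach does without, which is the gist of Data Domain's argument. ®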
