Data Domain doing faster dedupe

Doubling up and up

Data Domain has increased the speed of its deduplication by between 50 and 100 per cent through a software update.

The speed boost varies from 50 per cent on DD510, 530, and 580 models, 58 per cent on the DD565, up to 90 per cent on the DD690g and DDX array, and 100 per cent on the DD120.

The DD690g had a 1.4TB/hour throughput rating when it was introduced in May last year. Now it is 2.7TB/hour (with Symantec NetBackup OpenStorage and a 10GbE connection).

It makes you wonder whether the original code was that inefficient; clearly there was room for improvement. What Data Domain has done is get its deduplication code to exploit multi-core CPUs better, so that more of the work is done in parallel. The code, which uses a technology Data Domain calls Stream Informed Segment Layout (SISL), has been tuned to make better use of the available cores.
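That multi-core point is easy to picture with a toy example. The sketch below is purely illustrative: it assumes fixed 8KB segments and a SHA-1 fingerprint index, and it is not Data Domain's SISL code, whose segmenting, index layout and stream handling are its own. It simply shows why the hash-and-lookup stage of dedupe speeds up when it can be spread across more cores.

```python
# Illustrative only: a toy fingerprint-based dedupe pass parallelised across
# CPU cores. Segment size and hash choice are assumptions, not Data Domain's.
import hashlib
import os
from concurrent.futures import ProcessPoolExecutor

SEGMENT_SIZE = 8 * 1024  # 8KB fixed segments (assumed for illustration)

def fingerprint(segment: bytes) -> str:
    """Return a content fingerprint for one segment."""
    return hashlib.sha1(segment).hexdigest()

def dedupe(data: bytes, workers: int = os.cpu_count() or 1):
    """Hash segments in parallel, then keep only the unique ones."""
    segments = [data[i:i + SEGMENT_SIZE] for i in range(0, len(data), SEGMENT_SIZE)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        prints = list(pool.map(fingerprint, segments, chunksize=64))
    index = {}  # fingerprint -> segment; stands in for the on-disk index
    for fp, seg in zip(prints, segments):
        index.setdefault(fp, seg)
    return len(segments), len(index)

if __name__ == "__main__":
    raw = os.urandom(SEGMENT_SIZE * 16) * 8  # repetitive stream, like repeated backups
    total, unique = dedupe(raw)
    print(f"{total} segments in, {unique} unique segments stored")
```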

Data Domain's platform operating software has revved from DD OS v4.5 to 4.6. Shane Jackson, senior director for product and channel marketing at Data Domain, contrasts competing vendors who, he says, rely on adding disk spindles to boost deduplication speed, with Data Domain's reliance on CPU speed. "Intel shows up with a faster processor more often than Seagate shows up with a faster drive," he says.

That's over-egging the pudding, as all dedupe vendors rely on both software and disks. Data Domain does appear to have deduplication algorithms that are as good as, if not better than, most of its rivals', certainly good enough for it to suggest its products can be used to deduplicate some primary storage applications.

The neat aspect of this is that Data Domain is widely expected to introduce new, Nehalem-boosted hardware later this year. With eight cores available there should be another doubling or near-doubling of performance compared to the current quad-core Xeons being used. That means a DD690-type product could ramp its performance up to 5.4TB/hour, meaning 90GB/min or 1.5GB/sec.
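The arithmetic behind those numbers is simple, treating 1TB as 1,000GB in the usual vendor fashion; a quick sketch (the doubling itself is, of course, speculation about future Nehalem kit rather than a vendor claim):

```python
# Back-of-envelope check of the projected DD690-class figures quoted above.
tb_per_hour = 2.7 * 2          # current 2.7TB/hour, doubled (speculative)
gb_per_min = tb_per_hour * 1000 / 60
gb_per_sec = gb_per_min / 60
print(f"{tb_per_hour:.1f}TB/hour = {gb_per_min:.0f}GB/min = {gb_per_sec:.1f}GB/sec")
# -> 5.4TB/hour = 90GB/min = 1.5GB/sec
```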

Sepaton and Diligent (the latter now owned by IBM) emphasise their deduping speed, and NetApp also pushes its ASIS dedupe into some primary data deduplication applications.

Two thoughts: first, it looks as if a deduping race is on. Second, it begins to look as if inline deduplication is quite viable for the majority of backup applications.

Offline dedupe vendors say that, to keep backup speeds high, you really should land the backup data uninterrupted by any processing and dedupe it afterwards. At speeds of up to 750MB/sec now, and with 1.5GB/sec speeds coming, Data Domain would say that most backup applications could be deduped inline, avoiding the need for a substantial chunk of disk capacity set aside to land the raw data. ®
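To put a rough, hedged number on that "substantial chunk of disk capacity", here is a back-of-envelope sketch. The eight-hour backup window and 10:1 dedupe ratio are assumptions for illustration, not vendor figures; only the 750MB/sec ingest rate comes from the article above.

```python
# Rough illustration of the inline vs post-process trade-off, using assumed
# figures rather than any vendor's sizing guidance.
backup_rate_gb_per_sec = 0.75      # ~750MB/sec, the current DD690g-class figure
backup_window_hours = 8            # assumed nightly window
dedupe_ratio = 10                  # assumed 10:1 reduction

raw_backup_tb = backup_rate_gb_per_sec * 3600 * backup_window_hours / 1000
print(f"Raw data landed per night:      {raw_backup_tb:.1f}TB")
print(f"Post-process staging area:      {raw_backup_tb:.1f}TB of extra disk")
print(f"Inline (deduped on the way in): {raw_backup_tb / dedupe_ratio:.1f}TB stored")
```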
