Data Domain is working on global deduplication

Today Nehalem-based boxes, tomorrow the world!

Data Domain has been working on global deduplication across its DDX arrays for a couple of years, but the technology is not yet ready.

The DDX arrays are collections of four, eight or 16 Data Domain deduplication controllers, each of which dedupes independently of the others. If the array members shared a global deduplication map or index, the overall deduplication ratio would increase.
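The effect of sharing one index across controllers can be sketched in a few lines. This is a hypothetical illustration, not Data Domain's implementation: it models each controller as a stream of fixed chunks, with duplicates detected by hash lookup either in per-controller indexes or in a single shared one.

```python
import hashlib

def dedupe(streams, shared_index):
    """Count chunks stored after dedup. Each stream models one controller's
    ingest; shared_index=True models a global index across all controllers.
    Returns (logical_chunks, stored_chunks)."""
    if shared_index:
        indexes = [set()] * len(streams)   # one index, deliberately aliased
    else:
        indexes = [set() for _ in streams]  # independent per-controller indexes
    logical = stored = 0
    for stream, index in zip(streams, indexes):
        for chunk in stream:
            logical += 1
            digest = hashlib.sha1(chunk).digest()
            if digest not in index:         # unseen chunk: store and index it
                index.add(digest)
                stored += 1
    return logical, stored

# Two controllers ingesting largely identical backup data.
backup = [b"chunk%d" % (i % 10) for i in range(100)]
for shared in (False, True):
    logical, stored = dedupe([backup, backup], shared)
    print("shared=%s ratio=%.0f:1" % (shared, logical / stored))
```

With independent indexes, each controller stores its own copy of every unique chunk (here, a 10:1 ratio); the shared index stores each unique chunk once across the whole array (20:1), which is exactly the ratio gain a global index promises.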

Brian Biles, a Data Domain co-founder and its product management VP, was speaking at SNW in Frankfurt, and said: "It's very difficult to build at scale. We've been in development for two years. When it's ready we'll announce."

When asked about Nehalem use he answered: "We may well use Nehalem 6- and 8-core processors. It would be a sensible extrapolation of our history."

Our feeling is that such products would probably arrive in the first half of 2010. There was no suggestion that the global deduplication technology was ready to roll; that looks like a possibility for the second half of 2010, if not later.

Biles contrasted NetApp filers with ASIS deduplication and Data Domain's deduplication products by saying that NetApp filers were built to preserve high random IOPS while deduplicating: "They do a less effective job of data reduction but without sacrificing IOPS bandwidth too much."

Data Domain's products "are built for relatively fewer IOPS and more sequential I/O." The products are not optimised for transaction performance, but are highly optimised for data ingest. This means that we shouldn't expect Data Domain to bring out near-primary data deduplication products any time soon.

Biles also compared Data Domain and Ocarina data reduction approaches, in response to a suggestion that Data Domain products might be augmented by using Ocarina technology. Data Domain set out to solve a data protection problem whereas Ocarina set out to solve a media management problem: "I think it [Ocarina] is in a different market that's not that synergistic. It's a different choice from how to optimise data protection."

The implication is that even if Ocarina offered OEM deals for its technology, Data Domain would not be enthusiastic. ®
