Data Domain is working on global deduplication

Today Nehalem-based boxes, tomorrow the world!

Data Domain has been working on global deduplication across its DDX arrays for a couple of years, but the technology is not yet ready.

The DDX arrays are collections of 4, 8 or 16-controller Data Domain deduplication systems which dedupe independently of each other. If they were to use a global deduplication map or index across the array members then the overall deduplication ratio would increase.
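To illustrate the point, here is a minimal Python sketch, not Data Domain's implementation and with chunking, hashing and index structure all assumed for illustration, of why a shared fingerprint index across controllers raises the overall ratio when duplicate data lands on different boxes:

```python
import hashlib

def chunks(data, size=8):
    # Fixed-size chunking purely for illustration; real deduplication
    # appliances typically use variable-size, content-defined chunks.
    return [data[i:i + size] for i in range(0, len(data), size)]

def dedup_ratio(streams, global_index):
    """Return logical bytes / stored bytes, one backup stream per controller.

    With global_index=True every controller consults one shared fingerprint
    index; otherwise each keeps its own, so a chunk already stored on another
    controller gets stored again.
    """
    logical = stored = 0
    shared = set()
    local = [set() for _ in streams]
    for node, data in enumerate(streams):
        index = shared if global_index else local[node]
        for chunk in chunks(data):
            logical += len(chunk)
            fingerprint = hashlib.sha256(chunk).hexdigest()
            if fingerprint not in index:
                index.add(fingerprint)
                stored += len(chunk)
    return logical / stored

# Two controllers each receive an identical backup of 100 unique 8-byte chunks.
backup = b"".join(b"chunk%03d" % i for i in range(100))
print(dedup_ratio([backup, backup], global_index=False))  # 1.0 -- duplicates stored twice
print(dedup_ratio([backup, backup], global_index=True))   # 2.0 -- cross-controller duplicates skipped
```

In the independent case each controller stores its own copy of the same chunks, so the array-wide ratio stays at 1:1; with a shared index the second controller recognises the fingerprints and stores nothing new.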

Brian Biles, a Data Domain co-founder and its product management VP, was speaking at SNW in Frankfurt, and said: "It's very difficult to build at scale. We've been in development for two years. When it's ready we'll announce."

When asked about Nehalem use he answered: "We may well use Nehalem 6- and 8-core processors. It would be a sensible extrapolation of our history."

Our feeling is that such products would probably arrive in the first half of 2010. There was no suggestion that the global deduplication technology was ready to roll; that looks like a possibility for the second half of 2010, if not later.

Biles contrasted NetApp filers with ASIS deduplication and Data Domain's deduplication products by saying that NetApp filers were built to preserve high random IOPS while deduplicating: "They do a less effective job of data reduction but without sacrificing IOPS bandwidth too much."

Data Domain's products "are built for relatively fewer IOPS and more sequential I/O." The products are not optimised for transaction performance, but are highly optimised for data ingest. This means that we shouldn't expect Data Domain to bring out near-primary data deduplication products any time soon.

Biles also compared Data Domain and Ocarina data reduction approaches, in response to a suggestion that Data Domain products might be augmented by using Ocarina technology. Data Domain set out to solve a data protection problem whereas Ocarina set out to solve a media management problem: "I think it [Ocarina] is in a different market that's not that synergistic. It's a different choice from how to optimise data protection."

The implication is that even if Ocarina offered OEM deals for its technology, Data Domain would not be enthusiastic. ®
