Data Domain makes de-dupe box for corporate boonies

Beams back remote office data to mothership

Data Domain wants to give its de-duplication technology a comfier fit in the wiring closet with a shrunken system for remote offices.

The DD120 is a miniaturized edition of Data Domain's current data de-dupe appliances, and is designed specifically to replicate data from a branch office to a company's data center HQ. By eliminating redundant data, the box reduces both network traffic and storage.

The 1U DD120 provides 150GB/hour inline de-duplication throughput. The physical storage capacity of the system is 750GB, which can hold between 7TB and 18TB of data with the wizardry of de-duplication, the company claims.
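A quick back-of-the-envelope check of those claimed figures (a sketch, not anything from Data Domain's data sheet) shows what de-dupe ratio they imply:

```python
# Implied de-dupe ratio from the DD120's claimed figures:
# 750GB physical capacity holding 7TB to 18TB of logical data.
PHYSICAL_GB = 750
LOGICAL_LOW_GB = 7 * 1000   # 7TB
LOGICAL_HIGH_GB = 18 * 1000  # 18TB

low_ratio = LOGICAL_LOW_GB / PHYSICAL_GB    # ~9x
high_ratio = LOGICAL_HIGH_GB / PHYSICAL_GB  # 24x

print(f"Implied de-dupe ratio: {low_ratio:.0f}x to {high_ratio:.0f}x")
```

In other words, the company is claiming roughly 9x to 24x reduction, which depends heavily on how repetitive the backup data is.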

Although Data Domain already angles its DD510 and DD530 appliances at branch location backup, the new DD120 specifically comes bundled with a $2,000 license for Data Domain's Replicator software. That option in the latter two devices starts at an extra $3,500.

The DD120 is slower and has less storage capacity than its older siblings but comes in at a lower price. The new system starts at $12,500 compared to the next box up, the DD510, with a $19,000 starting price.

The 3U DD510 has a physical storage capacity of 3.75TB, and a logical, de-duped capacity of 135TB. Maximum throughput is 290GB/hour.

The DD120 supports CIFS or NFS protocols and the Symantec Veritas NetBackup OpenStorage interface.

The capacity-liberating promises of de-duplication technology have sparked an arms race amongst storage vendors like Hitachi Data Systems, EMC, Network Appliance and Quantum. Last week, NetApp updated its OnTap operating system to allow de-dupe capabilities for primary data storage. NetApp takes the opposite approach to the technology, using post-process de-dupe, which removes redundant data after it's written to disk. Data Domain does its voodoo inline, before the data is written to backup storage.
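The difference between the two camps can be sketched with a toy content-hash scheme (an illustrative simplification, not either vendor's actual algorithm): both split the stream into chunks and keep one copy per unique chunk, but inline de-dupe drops duplicates before they hit disk, while post-process de-dupe writes everything first and sweeps out redundancy later.

```python
# Toy sketch of inline vs post-process de-duplication.
# Chunks are identified by a content hash; one copy is kept per unique chunk.
import hashlib

def chunks(data: bytes, size: int = 4):
    """Split the stream into fixed-size chunks (real systems often use variable-size chunking)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def inline_dedupe(stream: bytes) -> dict:
    """Inline: only unique chunks are ever written to the store."""
    store = {}
    for c in chunks(stream):
        store.setdefault(hashlib.sha256(c).hexdigest(), c)
    return store

def post_process_dedupe(stream: bytes) -> dict:
    """Post-process: the full stream lands on disk first, then a later pass removes duplicates."""
    written = list(chunks(stream))  # everything written, redundancy included
    store = {}
    for c in written:
        store.setdefault(hashlib.sha256(c).hexdigest(), c)
    return store

data = b"AAAABBBBAAAABBBBCCCC"
print(len(chunks(data)), "chunks in,", len(inline_dedupe(data)), "unique chunks stored")
```

Both approaches end up storing the same unique chunks; the trade-off is that inline de-dupe saves the disk writes up front at the cost of hashing in the data path, while post-process de-dupe needs the full landing space but keeps ingest fast.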

The DD120 data sheet can be found here. (PDF warning) ®
