Dell and the dedupe appliance conundrum

What's going on?

Comment Dell is announcing a new deduplication appliance on Monday, the DL2000, while simultaneously saying dedupe will move on from a backup and appliance focus to something broader and more pervasive. What's going on?

Let's try to join up some dots in this release with previous Dell statements - and with EMC's statements and activities - and see where they lead us.

The DL2000 release said: "Dell’s deduplication strategy is to foster and encourage the rapid evolution of dedupe technology into a storage environment where the functionality exists everywhere. As deduplication matures quickly, it will move beyond backup storage – where it primarily resides today – to other data types including near-primary, archive, file, and object storage solutions."

The DL2000 is an integrated deduping backup appliance and so represents, we might say, first-generation deduplication. The second generation will, in Dell's view, encompass deduplication of near-primary data, archive data, and file and object storage.

Dell's storage, apart from the EqualLogic PS6000 products, comes from EMC, so EMC's vocabulary is a useful guide. Near-primary storage refers to general, unstructured information held on primary, tier 1 (Fibre Channel) storage - Clariion CX in Dell/EMC terms. File storage, in the general EMC sense, means a filer: network-attached storage (NAS), and Dell sells EMC's Celerra NX4 NAS product. Object storage, in EMC terms, means Centera, which Dell does not sell.

The implication is that Dell will provide deduplication for Clariion CX near-primary storage and for Celerra NX4 filer storage. But here's a question: how can Dell provide object storage deduplication when it doesn't supply an object storage product? Is there a hint here that Dell is going to take on and sell EMC's Centera product?

Let's return to the DL2000 release for a moment and find another telling statement: "Dell believes that incorporating deduplication functionality into ISV, application and storage software can provide significant benefits to customers."

What does this mean? To me it says that Dell will supply deduplication software technology that can be incorporated into ISV, application and storage software. This seems to be another aspect of second-generation deduplication.

Follow the implication and imagine a piece of application software and its stored data, held either on direct-attached storage (DAS) or on a storage array. The application has code that deduplicates its stored data. Okay, where is that done? On the server running the application, if the data is on DAS, or on the storage array, using the array controller's CPU cycles?
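To make that concrete, here is a toy sketch of the hash-based, fixed-block deduplication such embedded code might perform. This is purely illustrative Python - the names (DedupeStore, chunk_size) are made up for the example, not anything Dell has announced:

    import hashlib

    class DedupeStore:
        """Toy content-addressed store: identical chunks are written once."""

        def __init__(self, chunk_size=4096):
            self.chunk_size = chunk_size
            self.chunks = {}  # SHA-256 digest -> chunk bytes
            self.files = {}   # file name -> ordered list of digests

        def write(self, name, data):
            refs = []
            for i in range(0, len(data), self.chunk_size):
                chunk = data[i:i + self.chunk_size]
                digest = hashlib.sha256(chunk).hexdigest()
                self.chunks.setdefault(digest, chunk)  # store only if unseen
                refs.append(digest)
            self.files[name] = refs

        def read(self, name):
            return b"".join(self.chunks[d] for d in self.files[name])

    store = DedupeStore()
    store.write("a.bin", b"hello world" * 1000)
    store.write("b.bin", b"hello world" * 1000)  # duplicate: adds no new chunks
    assert store.read("b.bin") == b"hello world" * 1000

Write the same data twice and only the reference list grows, not the stored chunks - that's the whole trick, wherever the code happens to run.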

We have dedicated deduplication products today because dedupe is CPU- and disk-intensive, and running it on general-purpose servers could cripple them. Okay, but Nehalem servers are coming and server virtualisation is here already, so couldn't we find the CPU cycles to do it on the servers?
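A back-of-the-envelope test shows where the CPU goes: deduping means fingerprinting every incoming block with a cryptographic hash before any index lookups or disk I/O even start. This little benchmark (illustrative Python; the 64MB figure and 4KB block size are arbitrary choices for the example) hashes a data stream block by block on a single core:

    import hashlib
    import os
    import time

    data = os.urandom(64 * 1024 * 1024)  # 64MB stand-in for incoming backup data
    start = time.perf_counter()
    for i in range(0, len(data), 4096):
        hashlib.sha256(data[i:i + 4096]).digest()  # fingerprint every 4KB block
    elapsed = time.perf_counter() - start
    print("hashed %d MB in %.2f seconds" % (len(data) >> 20, elapsed))

Scale that up to a nightly backup stream and you can see why the cycles have to come from somewhere - spare cores on the application server, or somewhere else.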

Alternatively, we could parse "storage software" to mean storage array controller software and have the dedupe execute on the array, using spare CPU cycles - array controllers tend to be multi-core Xeons now - with the ISV and application software telling the array what to dedupe and where to place the deduped data.
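On that reading the host side only declares policy and the array does the heavy lifting. Here's a hypothetical sketch of that split - the StubArray interface below is invented for illustration, and no real array exposes exactly this:

    class StubArray:
        """Stand-in for an array controller's management interface (invented)."""

        def __init__(self):
            self.volume_options = {}

        def set_volume_option(self, lun, key, value):
            self.volume_options.setdefault(lun, {})[key] = value

    class HostDedupePolicy:
        """Host-side ISV/application layer: declares WHAT to dedupe and WHERE
        the deduped data should land; the array's own cores do the hashing."""

        def __init__(self, array):
            self.array = array

        def apply(self, lun, dedupe=True, target_tier="sata"):
            # No data flows through the host here - policy only.
            self.array.set_volume_option(lun, "dedupe", dedupe)
            self.array.set_volume_option(lun, "post_dedupe_tier", target_tier)

    HostDedupePolicy(StubArray()).apply("lun0", target_tier="sata")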

That chimes with ideas of multi-tiering inside arrays. So far no-one is talking much about multi-tiered DAS, but we can suppose it might come. The picture is easier to understand with networked storage than with DAS, since DAS implies the application server's CPU cycles are spent deduping, which seems a poor use of them.
