Feeds

HDS adds ROBO on-ramp to content platform

Tenants can sub-let now too

Hitachi Data Systems has added remote-office caching to its multi-tenant Hitachi Content Platform (HCP) archive product, and now allows tenants to sub-let their space, with individual archive policies for each sub-tenant.

HCP is HDS's archive platform, storing up to 40 billion objects in an Ethernet cluster of ingest and retrieval nodes and offering up to 40PB of capacity. It is targeted at private and public cloud providers, with each tenant (user) having its own namespace and specific policies covering retention, compression, replication and so on. It uses HDS's VSP, USP and AMS storage arrays as its storage platform, with the HCP nodes providing the archive functionality layer on top of them.

Version 4.0 increases the granularity of the multi-tenant feature added to HCP last year in v3.0. A user or tenant of the HCP can now have multiple namespaces with each namespace providing storage for a component of the tenant's organisation, such as order-processing, manufacturing, sales, etc. This has been a much-requested feature from HDS's cloud and service provider customers, and is supported by extended chargeback and reporting facilities such as I/Os per namespace and total capacity consumed.
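How per-namespace chargeback of this kind might be tallied can be sketched in a few lines of Python. This is purely illustrative: the namespace names, field names and figures below are invented for the example, not drawn from HCP.

```python
# Illustrative sketch of per-namespace chargeback reporting:
# aggregate I/Os and capacity per tenant namespace, plus a total.
# All data and field names here are hypothetical.

def chargeback_report(namespaces):
    """Return one report line per namespace, plus a TOTAL line."""
    lines = []
    total_ios = 0
    total_bytes = 0
    for ns in sorted(namespaces, key=lambda n: n["name"]):
        lines.append(f"{ns['name']}: {ns['ios']} I/Os, {ns['bytes']} bytes")
        total_ios += ns["ios"]
        total_bytes += ns["bytes"]
    lines.append(f"TOTAL: {total_ios} I/Os, {total_bytes} bytes")
    return lines

# Hypothetical usage for two namespaces within one tenant.
usage = [
    {"name": "order-processing", "ios": 1200, "bytes": 5_000_000},
    {"name": "manufacturing",    "ios": 300,  "bytes": 2_000_000},
]
for line in chargeback_report(usage):
    print(line)
```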

HDS has also added Hitachi Data Ingest (HDI) nodes: devices with NFS and CIFS interfaces that are intended for use in remote or branch offices and hold up to 4TB of data. These link up to the ingest nodes in a central HCP installation and are closely integrated with them.

They should not be regarded as NAS heads, according to Lynn Collier, Hitachi's EMEA software and solutions director; they are caches. Each HDI system can be a two-node cluster and, Collier said, thousands of users can be supported on HDI, with Active Directory and LDAP integration.

When an HDI node becomes full and more data is added, existing data is sent to the central HCP site, with a stub left behind so users can still "see" the data and access it if they wish; in that case it is pulled back out to the HDI node.

An algorithm keeps data in the HDI cache for as long as it is active. There is also no need to back up an HDI system, Collier said, because of this automated transmission of data to the central HCP.
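The stub-and-recall behaviour described above can be sketched as a toy least-recently-used cache in Python. Class and method names here are invented for illustration; HDI's actual internals are not public.

```python
# Illustrative sketch only: an edge cache that migrates cold files to a
# central store, leaving stubs behind, and recalls data on access.
# Names and structure are hypothetical, not HDI's real design.

class Stub:
    """Placeholder left in the cache after a file migrates to the core."""
    def __init__(self, name):
        self.name = name

class EdgeCache:
    def __init__(self, capacity, core):
        self.capacity = capacity  # max full (non-stub) files held locally
        self.core = core          # central store stand-in (a dict)
        self.files = {}           # name -> bytes or Stub
        self.lru = []             # access order, least recent first

    def write(self, name, data):
        self.files[name] = data
        self._touch(name)
        self._evict_if_full()

    def read(self, name):
        obj = self.files[name]
        if isinstance(obj, Stub):        # recall: pull data back from core
            obj = self.core[name]
            self.files[name] = obj
            self._evict_if_full(skip=name)
        self._touch(name)
        return obj

    def _touch(self, name):
        if name in self.lru:
            self.lru.remove(name)
        self.lru.append(name)

    def _evict_if_full(self, skip=None):
        # Migrate least-recently-used full files to the core, leaving
        # stubs, until the cache is back under capacity.
        while self._full_count() > self.capacity:
            for name in self.lru:
                if name != skip and not isinstance(self.files[name], Stub):
                    self.core[name] = self.files[name]  # push to the core
                    self.files[name] = Stub(name)       # leave a stub behind
                    break

    def _full_count(self):
        return sum(1 for v in self.files.values() if not isinstance(v, Stub))

# Hypothetical usage: a two-file cache fills up, the oldest file is
# replaced by a stub, and reading it transparently recalls the data.
core = {}
edge = EdgeCache(capacity=2, core=core)
edge.write("a", b"orders data")
edge.write("b", b"invoices data")
edge.write("c", b"report data")   # cache full: "a" migrates, stub remains
assert isinstance(edge.files["a"], Stub)
assert edge.read("a") == b"orders data"   # recalled from the core
```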

The data being sent to the central HCP installation is not deduplicated. That feature is being looked at by HDS and may appear on the HCP roadmap.

HDS has an ongoing initiative to integrate third-party search, e-discovery, legal hold and compliance applications with HCP. We understand HDS may build appliances, each an HDI system plus third-party software, to give those applications simplified access to HCP-stored data.

The HCP search function can search in a federated way, as network-attached storage (NAS) devices with a network link to HCP can have their content searched by HCP. There is a relationship with CommVault, with an HCP API providing HCP access to Simpana facilities and Simpana access to HCP-stored data. There is also a relationship with Symantec and HCP can receive streaming data input from Enterprise Vault.

HDS sees the main competition for HCP as EMC's Centera and NetApp's SnapVault. It positions HCP as a private or public cloud, highly scalable content storage system, with Collier saying the HDI nodes provide a cloud on-ramp in remote offices or at public cloud access points.

The HCP v4.0 and HDI products are available immediately. ®
