Researchers reveal radical RAID rethink

“Pipelined erasure coding” helps storage to scale at speed

Singaporean researchers have proposed a new way to protect the integrity of data in distributed storage systems and say their “RapidRAID” system offers top protection while consuming fewer network, computing and storage array resources than other approaches.

RAID – redundant arrays of inexpensive disks – has been a storage staple for almost a quarter of a century. The technique spreads data across a number of disks, with redundancy in the form of mirrored copies or parity, so that failure or loss of a single spindle does not result in data loss. When a drive dies, RAID means a new drive can be added to an array and the data from the original drive will be restored onto it. Different “levels” of RAID work with varying quantities of disks and deliver different levels of reliability.
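
For a flavour of how parity-based RAID survives a dead spindle, here's a minimal Python sketch of RAID 5-style XOR parity. It is illustrative only – a real array stripes data and rotates parity across all of its drives:

    def xor_blocks(blocks):
        """XOR a list of equal-length byte blocks together."""
        result = bytearray(len(blocks[0]))
        for block in blocks:
            for i, b in enumerate(block):
                result[i] ^= b
        return bytes(result)

    data = [b"disk0data", b"disk1data", b"disk2data"]
    parity = xor_blocks(data)  # stored on a fourth disk

    # Simulate losing disk 1: rebuild it from the survivors plus parity.
    rebuilt = xor_blocks([data[0], data[2], parity])
    assert rebuilt == data[1]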

RAID has, of late, become less popular as various scale-out architectures offer different approaches to redundant data storage. The technique is also challenged by multi-terabyte disk drives, as the sheer quantity of data on such disks means rebuilding a drive can take rather longer, and hog more IOPS, than many users are willing to endure.

Erasure codes are one of the techniques challenging RAID. The approach splits data into fragments and expands them with redundant, mathematically-derived parity fragments, which are then spread across a wider pool of disks; the original data can later be re-assembled from a subset of the fragments, drawn from multiple sources, even if some fragments are lost. Erasure codes feature in the Google File System, Hadoop’s file system, Azure and several commercial products.
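
To give a feel for the idea – this is a toy sketch, not any particular production implementation – the Python below builds a tiny Reed-Solomon-style erasure code over a small prime field. Three data symbols become five fragments, and any three of the five are enough to rebuild the original:

    P = 257  # a small prime field, big enough to hold one byte per symbol

    def encode(data, n):
        """Treat the k data symbols as polynomial coefficients and
        evaluate at points 1..n to produce n fragments."""
        return [(x, sum(d * pow(x, i, P) for i, d in enumerate(data)) % P)
                for x in range(1, n + 1)]

    def decode(fragments, k):
        """Recover the k data symbols from any k fragments by
        Lagrange interpolation over the field."""
        pts = fragments[:k]
        coeffs = [0] * k
        for j, (xj, yj) in enumerate(pts):
            basis = [1]   # coefficients of this point's basis polynomial
            denom = 1
            for m, (xm, _) in enumerate(pts):
                if m == j:
                    continue
                new = [0] * (len(basis) + 1)
                for i, c in enumerate(basis):   # multiply basis by (x - xm)
                    new[i] = (new[i] - xm * c) % P
                    new[i + 1] = (new[i + 1] + c) % P
                basis = new
                denom = denom * (xj - xm) % P
            scale = yj * pow(denom, P - 2, P) % P   # Fermat inverse for division
            for i in range(k):
                coeffs[i] = (coeffs[i] + scale * basis[i]) % P
        return coeffs

    data = [11, 22, 33]                      # k = 3 data symbols
    fragments = encode(data, 5)              # n = 5 fragments, one per node
    assert decode(fragments[2:], 3) == data  # any 3 of the 5 suffice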

Some have even described erasure codes as delivering RAIN – a redundant array of inexpensive nodes – that is positioned as a successor to RAID.

The Singaporean researchers’ work, available on arXiv, proposes a new scheme called RapidRAID that goes beyond other implementations of erasure codes, reducing the amount of storage required to create a viable archive while also cutting the time required to create it.

The team thinks this is possible thanks to what it calls “pipelined encoding”, under which:

“… the encoding process is distributed among those nodes storing replicated data of the object to be encoded, which exploits data locality and saves network traffic. We then arrange the encoding nodes in a pipeline where each node sends some partially encoded data to the next node, which creates parity data simultaneously on different storage nodes, avoiding the extra time required to distribute the parity after the encoding process is terminated.”
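
A rough Python sketch of that pipeline – with made-up blocks and coefficients, not the paper's actual RapidRAID codes – looks something like this:

    P = 257  # toy prime field; real codes work over GF(256)

    def node_step(partial, local_block, coeff):
        """One pipeline hop: fold this node's replica into the partial parity."""
        return [(p + coeff * b) % P for p, b in zip(partial, local_block)]

    replicas = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]  # one replicated block per node
    coeffs = [3, 5, 7]                            # illustrative encoding coefficients

    partial = [0, 0, 0]
    for block, c in zip(replicas, coeffs):
        # Each node encodes data it already stores locally, then forwards
        # the partially encoded result to the next node in the pipeline.
        partial = node_step(partial, block, c)

    # The last node ends up holding a finished parity block; because each hop
    # leaves partially encoded data behind, parity accumulates across the
    # pipeline instead of being shipped around after encoding finishes.
    print(partial)

The point is that each hop touches only locally stored data and makes one small network transfer, which is where the claimed savings in network traffic and encoding time come from.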

The paper then proposes RapidRAID, a set of erasure codes which, just like RAID, offer different levels of data protection.

Tests of the new codes are described in the paper, which compares RapidRAID to the Reed-Solomon erasure codes used in many current implementations. In a test involving 50 thin clients and 16 EC2 instances, the researchers report that RapidRAID came out ahead on several measures.

The researchers therefore declare RapidRAID a viable big data enabler, but conclude that more work is needed before it can be deemed suitable for applications that require more than two copies of data.

The codes are available for download on GitHub. ®
