Researchers reveal radical RAID rethink

“Pipelined erasure coding” helps storage to scale at speed

Singaporean researchers have proposed a new way to protect the integrity of data in distributed storage systems and say their “RapidRAID” system offers top protection while consuming fewer network, computing and storage array resources than other approaches.

RAID – redundant arrays of inexpensive disks – has been a storage staple for almost a quarter of a century. The technique spreads data across a number of disks, along with mirrored copies or parity information, so that failure or loss of a single spindle does not result in data loss. When a drive dies, a replacement can be added to the array and the dead drive’s data rebuilt from the surviving disks. Different “levels” of RAID work with varying quantities of disk and deliver different degrees of redundancy and performance.
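
The rebuild described above can be illustrated with a simplified, RAID-5-style parity scheme (a toy Python sketch, nothing like real array firmware): because parity is the XOR of the data blocks, any one lost block is simply the XOR of everything that survives.

```python
# Toy RAID-5-style parity sketch: parity disk = XOR of data disks,
# so a single failed disk can be rebuilt from the survivors.

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

# Three "disks" of data plus one parity disk
disks = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(disks)

# Disk 1 dies; rebuild its contents from the survivors plus parity
rebuilt = xor_blocks([disks[0], disks[2], parity])
assert rebuilt == b"BBBB"
```

The cost the article alludes to is visible even here: rebuilding one block requires reading every surviving disk in full, which is why rebuilds of multi-terabyte drives take so long.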

RAID has, of late, become less popular as various scale-out architectures offer different approaches to redundant data storage. The technique is also challenged by multi-terabyte disk drives, as the sheer quantity of data on such disks means rebuilding a drive can take rather longer, and hog more IOPS, than many users are willing to endure.

Erasure codes are one of the techniques challenging RAID. An erasure code splits data into fragments and adds redundant parity fragments, spreading the pieces across a wider pool of disks or nodes; the original data can then be re-assembled from any sufficiently large subset of those fragments, even after some are lost. Erasure codes feature in the Google File System, Hadoop’s file system, Azure and several commercial products.
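
A toy illustration of the idea, assuming a Reed-Solomon-style polynomial code over the prime field GF(257) for readability (production systems use GF(256) with lookup tables): k data symbols become n fragments, and any k of those fragments rebuild the original.

```python
# Toy Reed-Solomon-style erasure code over GF(257) -- illustrative only.
# k data symbols -> n fragments; ANY k fragments recover the data.

P = 257  # prime modulus

def encode(data, n):
    """Treat the k data symbols as polynomial coefficients and
    evaluate at x = 1..n to produce n (x, value) fragments."""
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(data)) % P)
            for x in range(1, n + 1)]

def decode(frags, k):
    """Lagrange-interpolate any k fragments back to the polynomial's
    coefficients, i.e. the original data symbols."""
    frags = frags[:k]
    coeffs = [0] * k
    for j, (xj, yj) in enumerate(frags):
        basis = [1]   # coefficients of prod_{m != j} (x - xm)
        denom = 1
        for m, (xm, _) in enumerate(frags):
            if m == j:
                continue
            # multiply basis polynomial by (x - xm)
            basis = [(a - xm * b) % P
                     for a, b in zip([0] + basis, basis + [0])]
            denom = denom * (xj - xm) % P
        scale = yj * pow(denom, P - 2, P) % P  # modular inverse
        for i in range(k):
            coeffs[i] = (coeffs[i] + scale * basis[i]) % P
    return coeffs

data = [104, 105, 33]           # three data symbols
frags = encode(data, 5)         # five fragments: survive any two losses
assert decode([frags[4], frags[1], frags[2]], 3) == data
```

Storing five fragments for three symbols costs 1.67x the raw data, versus 3x for triple replication – the efficiency argument that makes erasure codes attractive at scale.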

Some have even described erasure codes as delivering RAIN – a redundant array of inexpensive nodes – that is positioned as a successor to RAID.

The Singaporean researchers’ work, available on arXiv, proposes a new scheme called RapidRAID that goes beyond other implementations of erasure codes, reducing the amount of storage required to create a viable archive while also speeding the time required to create that archive.

The team thinks this is possible with what it calls “pipelined erasure coding”, under which:

“… the encoding process is distributed among those nodes storing replicated data of the object to be encoded, which exploits data locality and saves network traffic. We then arrange the encoding nodes in a pipeline where each node sends some partially encoded data to the next node, which creates parity data simultaneously on different storage nodes, avoiding the extra time required to distribute the parity after the encoding process is terminated.”
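
A hedged sketch of that pipelining idea (an illustrative simplification, not the paper’s actual RapidRAID codes): each node already holds a replica of one data piece, receives a partial parity from its upstream neighbour, folds in its own piece, stores the result and forwards it downstream. Parity accumulates as it moves, rather than being computed in one place and shipped out afterwards.

```python
# Hedged sketch of pipelined encoding (NOT the paper's real codes):
# partial parity flows node-to-node; each node adds coeff * local_piece
# (element-wise, mod a small prime) and keeps a copy of the running sum.

P = 257

def pipeline_encode(pieces, coeffs):
    """Node i holds pieces[i] locally. The partial parity passed to
    node i is sum(coeffs[j] * pieces[j]) for j < i; node i folds in
    its own piece and stores the new partial before forwarding it."""
    partial = [0] * len(pieces[0])
    stored = []
    for piece, c in zip(pieces, coeffs):
        # fold the locally stored replica into the incoming partial
        partial = [(p + c * s) % P for p, s in zip(partial, piece)]
        stored.append(list(partial))  # this node's stored parity
    return stored  # stored[-1] is the fully encoded parity

pieces = [[1, 2], [3, 4], [5, 6]]   # replicas held at three nodes
coeffs = [1, 2, 3]                  # hypothetical code coefficients
stored = pipeline_encode(pieces, coeffs)
# last node ends up with 1*[1,2] + 2*[3,4] + 3*[5,6] mod P
assert stored[-1] == [22, 28]
```

The point of the arrangement is that each hop carries only one partial block, and every node finishes holding parity data at the same moment the pipeline drains – no separate distribution phase.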

The paper linked to above then proposes RapidRAID, a set of erasure codes which, just like RAID, offer different levels of data protection.

Tests of the new codes are described in the paper, which compares RapidRAID to the Reed-Solomon erasure codes used in many current implementations. In a test involving 50 thin clients and 16 EC2 instances, the researchers proclaim RapidRAID superior in some ways.

The researchers therefore declare RapidRAID a viable big data enabler, but conclude that there’s more work to be done before it can be declared suitable for applications that require more than two copies of data.

The codes are available for download on GitHub. ®
