Revolution speeds stats on Hadoop clusters

R language teaches 'meaningful' math to elephants

Revolution Analytics, the company that extends R, the open source statistical programming language, with proprietary add-ons, is making available a free set of extensions that allow its R engine to run atop Hadoop clusters.

Now statisticians who are familiar with R can do analysis on unstructured data stored in the Hadoop Distributed File System (HDFS), the data store used for the MapReduce method of chewing on unstructured data pioneered by Google for its search engine and mimicked and open sourced by rival Yahoo! as the Apache Hadoop project.

R can now also run against HBase, the non-relational, column-oriented distributed data store that mimics Google's BigTable and is essentially a database for Hadoop for holding structured data. Like Hadoop, HBase is an open source project distributed by the Apache Software Foundation.

With MapReduce, unstructured data is broken up and spread across server nodes (with replication at multiple points for performance as well as fault tolerance); the map step chews on each piece in parallel, rather than in series as you would have to do on a single machine, and the reduce step then aggregates the results.
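The map-then-reduce flow can be sketched in miniature. This is a hedged, single-process Python sketch of the word-count idiom, not actual Hadoop code; on a real cluster the map calls would run in parallel on the nodes holding each chunk:

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    # Map: emit a (word, 1) pair for every word in one chunk of input.
    return [(word, 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Reduce: aggregate the emitted pairs by key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# The "cluster": two chunks of data, as if split across two nodes.
chunks = ["the quick brown fox", "the lazy dog and the cat"]
mapped = chain.from_iterable(map_phase(c) for c in chunks)
print(reduce_phase(mapped))
```

The key property is that each `map_phase` call sees only its own chunk, so the work distributes trivially; only the small intermediate pairs travel to the reducer.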

With the marriage of R and Hadoop, explains David Champagne, chief technology officer at Revolution Analytics, the R engine is installed atop each Hadoop node in the cluster. Instead of programming a reduction algorithm in Java, as you do in Hadoop, you set up an R algorithm from an R workstation, and it is parceled out to the Hadoop nodes by Hadoop's mapping function. Statistical analysis is thus done in parallel on the data stored in HDFS.

You don't do MapReduce and then extract data that comes back to the workstation for the analysis, but you chew on data right where it is in the cluster, and then aggregate it. In essence, R is using Hadoop as a grid controller, managing where specific algorithms run and the data they run against.

"We allow not just sums, means, and averages, which can be done easily in Java or Python, but statistically meaningful analysis," says Champagne. And you don't have to know jack about Java or MapReduce to run an R algorithm against a Hadoop cluster with either HDFS or HBase as its data store. "We want to hide some of the complexity of the MapReduce approach from R programmers and statisticians."

The tool that makes this integration possible is called RevoConnectR for Apache Hadoop, and Champagne tells El Reg that it was tested against Cloudera's CDH3 commercial distribution of Hadoop combined with the Revolution R Enterprise 4.3 stats engine. But you can take the Revolution R Community Edition and plunk it down on an open source Hadoop cluster (from Apache or from Cloudera) and the Hadoop connector for R will also work.

You can download the R connector for Hadoop from GitHub. Both Revolution Analytics and Cloudera want to encourage customers to use their commercial releases – and pay for tech support, of course. But the R-Hadoop connector itself is supplied for free either way. ®
