
SGI inks reseller deal with Cloudera

Hadoop clusters stacked and racked, data not included

Silicon Graphics is chasing elephants – stuffed elephants, that is – and it has enlisted Cloudera as a partner in its big-data safari.

Technically speaking, Silicon Graphics doesn't have to launch a Hadoop cluster business so that companies can do the kind of analytics that search engine giants such as Google and Yahoo! do on the unstructured data that makes up the Web and how we surf around on it, leaving our trails of cookies and Web logs. Some of the biggest names on the Internet – including Yahoo!, Amazon.com, and Facebook – have already deployed the Hadoop MapReduce tool and its associated Hadoop Distributed File System on SGI iron, and some big US government agencies have tapped SGI to build Hadoop clusters – sometimes in containerized data centers.
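To make the MapReduce idea concrete, here is a minimal sketch of the sort of job such a cluster chews on: counting hits per URL in web server logs stored in HDFS. It is written against the stock Apache Hadoop Java API; the log layout (space-separated fields, URL in the seventh column), the class names, and the paths are hypothetical, chosen for illustration rather than taken from anything SGI or Cloudera ships.

// Minimal MapReduce sketch: tally hits per URL from web logs in HDFS.
// The log format (space-separated, URL in the seventh field) is an
// assumption for illustration only.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class UrlHitCount {

    // Map phase: one log line in, one (url, 1) pair out.
    public static class HitMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text url = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split(" ");
            if (fields.length > 6) {          // assumed log layout
                url.set(fields[6]);
                context.write(url, ONE);
            }
        }
    }

    // Reduce phase: sum the counts emitted for each URL.
    public static class HitReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text url, Iterable<LongWritable> counts, Context context)
                throws IOException, InterruptedException {
            long total = 0;
            for (LongWritable c : counts) {
                total += c.get();
            }
            context.write(url, new LongWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "url hit count");
        job.setJarByClass(UrlHitCount.class);
        job.setMapperClass(HitMapper.class);
        job.setCombinerClass(HitReducer.class);   // pre-aggregate on each node
        job.setReducerClass(HitReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. /logs/raw
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // e.g. /logs/hits
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

In practice the compiled jar would be submitted to the cluster with the standard hadoop jar launcher, with the input and output paths pointing at directories in HDFS, and the framework takes care of spreading the map and reduce tasks across however many nodes the cluster has.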

SGI says that it has done Hadoop installations that span as many as 40,000 server nodes, and has done individual clusters that weigh in at 4,000 nodes. That is the upper limit of Hadoop cluster size at Yahoo! and elsewhere these days, and according to insiders at Yahoo!, the search and media company has grown from 25,000 nodes running Hadoop – which is used to crunch clickstreams, scan emails, and serve up customized content – to 42,000 nodes with over 200PB of data in 2010. Yahoo! expects to hit around 60,000 or so Hadoop nodes by the end of this year.

Not everyone has this scale of Hadoopery, of course. But just as every company thought during the dot-com boom that it needed to have a retail operation on the Web, every company during today's big-data boom thinks that every little bit of information on its systems is somehow valuable and can be mined for money. So they are saving everything and trying to figure out how to chew on it with tools like Hadoop.

For SGI, Hadoop clusters have been largely bespoke work, but in June the company began offering prefabbed Hadoop clusters, marrying its Rackable lines of power-efficient, double-sided rack servers and its Altix ICE supercomputing rack servers with the open source distribution of Hadoop maintained by the Apache Software Foundation, which inherited the Hadoop stack after it was developed by Yahoo! a number of years ago.

But, unsurprisingly, governments and enterprises want commercial support for open source software, and they want systems that are not bespoke, but instead configured and tuned to work well together and delivered as finished machines that are ready to be loaded up with data.

SGI Hadoop clusters, unstructured data not included

So last Friday, SGI inked a reseller agreement with Cloudera, which fancies itself the Red Hat of Hadoop and whose ranks include some of the key people from Yahoo! who created the open source clone of the analytics behind an earlier generation of Google's search engine. Hadoop and its related Hadoop Distributed File System clone the MapReduce technique created by Google, which most certainly was never open source.

Under the agreement with Cloudera, SGI will distribute the current CDH3 distribution from Cloudera, and offer Level 1 technical support for the software as well. Cloudera is providing Level 2 and Level 3 support for the CDH3 software for now, but over time, says Bill Mannel, vice president of product marketing at SGI, the company will build up more Hadoop expertise and eventually offer Level 2 support and might even offer the whole enchilada.

At the moment, SGI is cooking up reference configurations on its Rackable machines, and hopes to have them ready by the Hadoop World conference in three weeks and shipping by the end of November.

In the meantime, if you are building a Hadoop cluster, SGI can build one in its factory and ship it to you ready to chew through all that unruly data.

This time around, SGI plans to focus mostly on peddling its energy-efficient Rackable machines for Hadoop clusters, with the Altix ICE clusters, which are designed for HPC supercomputing workloads, taking a backseat. The company is also offering InfiniBand networking as an alternative to Ethernet for lashing the server nodes together, but it is not yet clear what advantage this may have or how that might affect the price of a Hadoop cluster.

"We are going to show bang, bang for the buck, and power savings," says Mannel, referring to the competitive advantage SGI thinks it can bring to Hadoop workloads. ®
