SGI inks reseller deal with Cloudera

Hadoop clusters stacked and racked, data not included

Silicon Graphics is chasing elephants – stuffed elephants, that is – and it has enlisted Cloudera as a partner in its big-data safari.

Technically speaking, Silicon Graphics doesn't have to launch a Hadoop cluster business so that companies can do the kind of analytics that search engine giants such as Google and Yahoo! do on the unstructured data that encompasses the Web and how we surf around on it, leaving our trails of cookies and Web logs. Some of the biggest names on the Internet – including Yahoo!, Amazon.com, and Facebook – have already deployed the Hadoop MapReduce tool and its associated Hadoop Distributed File System on SGI iron, and some big US government agencies have tapped SGI to build Hadoop clusters – sometimes in containerized data centers.
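
To give a flavour of what those clusters actually chew on, here is a minimal sketch of the sort of job they run: a Python mapper and reducer for Hadoop Streaming that tallies page hits out of web server logs. The log format and field position are hypothetical stand-ins, not anything SGI or Cloudera ships; a real job would match whatever schema the site's logging produces.

#!/usr/bin/env python
# mapper.py - emit a (url, 1) pair for each web log line.
# Assumes a hypothetical space-delimited log where field 7 is the URL.
import sys

for line in sys.stdin:
    fields = line.split()
    if len(fields) > 6:
        print("%s\t1" % fields[6])

#!/usr/bin/env python
# reducer.py - sum the counts for each URL. Hadoop's shuffle phase
# delivers mapper output here sorted by key, so identical URLs arrive
# on consecutive lines.
import sys

current_url, count = None, 0
for line in sys.stdin:
    url, value = line.rstrip("\n").split("\t", 1)
    if url == current_url:
        count += int(value)
    else:
        if current_url is not None:
            print("%s\t%d" % (current_url, count))
        current_url, count = url, int(value)
if current_url is not None:
    print("%s\t%d" % (current_url, count))

Hadoop Streaming pipes each block of log data stored in HDFS through the mapper, shuffles and sorts the intermediate pairs, and hands them to reducers spread across the cluster – the same pair of scripts runs unchanged whether the cluster has 40 nodes or 4,000.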

SGI says that it has done Hadoop installations that span as many as 40,000 server nodes, and has done individual clusters that weigh in at 4,000 nodes. That is the upper limit of Hadoop cluster size at Yahoo! and elsewhere these days, and according to insiders at Yahoo!, the search and media company has grown from 25,000 nodes running Hadoop – which is used to crunch clickstreams, scan emails, and serve up customized content – to 42,000 nodes with over 200PB of data in 2010. Yahoo! expects to hit around 60,000 or so Hadoop nodes by the end of this year.

Not everyone has this scale of Hadoopery, of course. But just like every company thought during the dot-com boom that they needed to have a retail operation on the Web, every company during today's big-data boom thinks that every little bit of information on their systems is somehow valuable and can be mined for money. So they are saving everything and trying to figure out how to chew on it with tools like Hadoop.

For SGI, Hadoop clusters have been largely bespoke work, but in June the company began offering prefabbed Hadoop clusters, marrying its Rackable lines of power-efficient, double-sided rack servers and its Altix ICE supercomputing rack servers with the open source distribution of Hadoop maintained by the Apache Software Foundation, which inherited the Hadoop stack after it was developed by Yahoo! a number of years ago.

But, as is not surprising to anyone, governments and enterprises want commercial support for open source software, and they want systems that are not bespoke, but instead configured and tuned to work well together and delivered as a finished system that is ready to be loaded up with data.

SGI Hadoop clusters, unstructured data not included

So last Friday, SGI inked a reseller agreement with Cloudera, which fancies itself the Red Hat of Hadoop distributions, and which includes some of the key people from Yahoo! who created the clone of the analytics behind an earlier generation of Google's search engine. Hadoop and its related Hadoop Distributed File System clone the MapReduce technique created by Google, which most certainly was never open source.
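
Stripped of all the distributed plumbing, the technique itself fits in a few lines. The toy Python sketch below – purely illustrative, not anything from Hadoop's own code – runs the three phases on a single machine: map each record to key/value pairs, shuffle the pairs into per-key groups, then reduce each group. Hadoop does the same thing, except the map and reduce tasks are farmed out across thousands of nodes and the data lives in HDFS.

# Toy single-machine illustration of the MapReduce idea (not Hadoop code).
from collections import defaultdict

def map_phase(records):
    # Map: turn each record into (key, value) pairs.
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle_phase(pairs):
    # Shuffle: group all values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: collapse each group of values to a single result.
    return {key: sum(values) for key, values in groups.items()}

clicks = ["home search home", "search cart checkout", "home"]
print(reduce_phase(shuffle_phase(map_phase(clicks))))
# {'home': 3, 'search': 2, 'cart': 1, 'checkout': 1}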

Under the agreement with Cloudera, SGI will distribute the current CDH3 distribution from Cloudera, and offer Level 1 technical support for the software as well. Cloudera is providing Level 2 and Level 3 support for the CDH3 software for now, but over time, says Bill Mannel, vice president of product marketing at SGI, the company will build up more Hadoop expertise and eventually offer Level 2 support and might even offer the whole enchilada.

At the moment, SGI is cooking up reference configurations on its Rackable machines, and hopes to have these ready by the Hadoop World conference in three weeks and shipping by the end of November.

In the meantime, if you are building a Hadoop cluster, SGI can build one in its factory and ship it to you ready to chew all that unruly data.

This time around, SGI plans to focus mostly on peddling its energy-efficient Rackable machines for Hadoop clusters, with the Altix ICE clusters, which are designed for HPC supercomputing workloads, taking a backseat. The company is also offering InfiniBand networking as an alternative to Ethernet for lashing the server nodes together, but it is not yet clear what advantage this may have or how that might affect the price of a Hadoop cluster.

"We are going to show bang, bang for the buck, and power savings," says Mannel, referring to the competitive advantage SGI thinks it can bring to Hadoop workloads. ®
