
Serengeti corrals Hadoop with VMware hypervisors

Squeezing elephants inside ESXi VMs

Hadoop World 2012 It is a wonder that VMware, the virtualization juggernaut that wants to rule the cloudy world, has not yet rolled up its own Hadoop stack and made it part of its Cloud Foundry project. But with a new project called Serengeti, the company is taking another step in that direction.

VMware has already created a Spring framework tuned specifically for corralling Hadoop big data elephants, which debuted at the end of February. The mashup pairs the Spring Java application framework with the Apache Hadoop distribution, which seems natural enough given that Hadoop is itself a Java application.

But the framework is not about somehow running Hadoop better, so much as it is about giving Java developers a better way to create applications that can make use of Hadoop's MapReduce data sifting and Hadoop Distributed File System data store.
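To make the "data sifting" idea concrete, here is a toy, framework-free Python sketch of the MapReduce model that Hadoop implements at scale. This is purely illustrative – it is not Hadoop, Spring Hadoop, or Serengeti code, and the function names are our own:

```python
from collections import defaultdict

def map_phase(lines):
    # Map step: emit a (key, value) pair for every word seen.
    # In Hadoop, many mappers would do this in parallel across HDFS blocks.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle/reduce step: group values by key and sum them.
    # Hadoop shuffles pairs to reducers by key before this aggregation.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["the quick brown fox", "the lazy dog"]
print(reduce_phase(map_phase(lines)))  # word counts across all input lines
```

On a real cluster the same two phases run across many DataNodes, with the JobTracker scheduling the work – which is exactly the machinery Serengeti packages into VMs.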

With Serengeti – an open source project just like the acquired Cloud Foundry, and in fact a sub-project of Cloud Foundry – VMware has made contributions to the Hadoop stack that make it virtualization-aware, so its components can be packaged up in discrete virtual machines: one for each kind of physical server in a normal Hadoop cluster, whether that is the NameNode for HDFS, the JobTracker node that schedules MapReduce work, or the myriad DataNodes across which HDFS spreads its data.

Other Hadoop management, query, and data abstraction tools can also be packaged up inside ESXi virtual machines. The end result is that customers can build and manage Hadoop clusters just as they would manage raw infrastructure clouds, providing replication, failover, and other services for the virtualized nodes.

David McJannet, director of vFabric product marketing at VMware, tells El Reg that another benefit of running Hadoop in virtual rather than physical mode is that you can use the virtualization layer to multitask a cluster – perhaps running Hadoop batch jobs at night, and then other kinds of analytics in other VMs by day when Hadoop is not chewing so heavily.

The other benefit of Serengeti is that it can deploy an entire Hadoop stack on a single physical server for development or demonstration purposes, or on a small cluster if that is what a shop has, and balance the work across the limited number of nodes.

Serengeti is a collection of scripts, called recipes, written for the Chef management framework, with some Ruby code thrown in for good measure. You can read the release notes for the Serengeti 0.5 Hadoop deployment tool here (PDF) and download the binary of the code there. The source code will also be available for download on GitHub under an Apache 2.0 license.

At the moment, Serengeti only knows how to deploy Hadoop onto ESXi 5.0 hypervisors, but VMware will backcast this to earlier releases at some point in the future. And Serengeti being an open source project, it can be tweaked to support other virtual machine hypervisors. (Don't expect VMware to do the work, though.)

Serengeti 0.5 supports the Apache Hadoop 1.0 stack (which is based on the core Apache 0.20 code and which shipped in January) as well as CDH3 from Cloudera, Data Platform 1.0 from Hortonworks, and Greenplum HD 1.0 from EMC (which is based on the Apache code, not the M5 distribution from MapR Technologies). Serengeti does not yet support the Apache 0.23 code or the Hadoop 2.0 stack that is based on it. Future Serengeti releases will, however, as that code moves out of alpha.

In addition to launching Serengeti at the Hadoop World event, VMware has tweaked that Spring-Hadoop mashup to the 1.0.0 M2 level. With this release, the Kerberos security in the Spring Framework is aware that it needs to secure HDFS, MapReduce, and Pig, a higher-level programming language that is used to create MapReduce jobs (in Pig Latin, of course).

The new Spring Hadoop M2 supports the Cascading application framework, which was created separately for Java applications running on top of Hadoop (that seems to be two Java frameworks at the same time, as far as El Reg can see). The support creates links between the frameworks, which VMware calls taps, allowing Cascading to inherit and use the file, TCP, Twitter, and RSS adapters for Spring, for either gathering up inbound data or pushing outbound data.

The Spring framework's Data Access Object (or DAO) method for linking to database tables is now integrated with the HBase database layer that you can run atop HDFS. Spring Hadoop 1.0.0 M2 also supports WebHDFS, an add-on that gives REST APIs to the Hadoop file system. ®
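For reference, the WebHDFS convention mentioned above maps file system operations onto plain HTTP requests of the form /webhdfs/v1/<path>?op=<OPERATION>. A minimal Python sketch, with a hypothetical host name and helper function of our own devising (port 50070 is the NameNode's default HTTP port in the Hadoop 1.x era):

```python
from urllib.parse import urlencode

def webhdfs_url(host, path, op, user=None, port=50070):
    # Build a WebHDFS REST URL; every file system call becomes an
    # HTTP request against /webhdfs/v1/<path> with an op= parameter.
    params = {"op": op}
    if user:
        params["user.name"] = user  # optional pseudo-auth identity
    return f"http://{host}:{port}/webhdfs/v1{path}?{urlencode(params)}"

# Hypothetical example: read a file over plain HTTP instead of the HDFS wire protocol.
url = webhdfs_url("namenode.example.com", "/logs/day1.txt", "OPEN", user="hadoop")
print(url)
# An HTTP client (urllib.request.urlopen, curl, etc.) could then fetch this URL.
```

This is what lets non-Java clients – including Spring's REST-friendly tooling – talk to HDFS without linking against the Hadoop libraries.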
