Hadoop goes 'open core' with Cloudera Enterprise

All-star startup fattens stuffed elephant

Cloudera – the commercial Hadoop outfit – has unveiled its first for-pay product: Cloudera Enterprise, an augmented version of the open source distributed data crunching platform designed specifically for production environments.

Cloudera Enterprise – announced today at the Hadoop Summit in Santa Clara, California – beefs up Hadoop with several proprietary management, monitoring, and administration tools, and it's sold on a subscription basis, priced according to the size of your Hadoop cluster. The all-star Silicon Valley startup has adopted an "open core" model, enhancing an open source Hadoop core with additional software that carries a price tag.

"We've been in the market with customers for coming up on two years now, supporting Hadoop in real enterprise production environments," Cloudera CEO Mike Olson, tells The Register. "We've learned a lot about how customers use [Hadoop], what it does well, and what makes it difficult to deploy and operate. As a result of all of that activity."

Additional proprietary tools include integration with LDAP directory servers for user authentication and access control; dashboards for controlling and managing the flow of data into Hadoop clusters; and a user interface for cluster management and administration. Buyers also receive maintenance updates and support.

The heart of this new enterprise product is the company's open source Hadoop distro – Cloudera’s Distribution for Hadoop (CDH) – which just graduated to version 3, also announced today. CDH consists of Apache Hadoop and eight additional open source projects.

These include Hive (a SQL-like query language developed at Facebook), Pig (a lower-level language developed by Yahoo!), HBase (a distributed database developed by the now Microsoft-owned Powerset), Sqoop (a MySQL connector built by Cloudera), Oozie (the Hadoop workflow system), and ZooKeeper (a means of juggling distributed services from a central location), as well as two new projects just opened up by Cloudera under an Apache license: Flume and Hue.

Flume is Cloudera's data loading infrastructure, while HUE – short for Hadoop User Experience – is the web-based Hadoop GUI formerly known as the Cloudera Desktop. HUE provides a graphical user interface for creating and submitting jobs on a Hadoop cluster, monitoring the cluster's health, and browsing stored data. Typically, clusters are managed via the command line.
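
For a sense of what HUE puts behind a browser, here is a minimal sketch – not Cloudera's code, just the stock org.apache.hadoop.fs.FileSystem API – of the sort of HDFS directory listing its file browser surfaces; the namenode address and path are made up for illustration.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsDir {
        public static void main(String[] args) throws Exception {
            // Point the client at the cluster's namenode (address is hypothetical).
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://namenode.example.com:8020");
            FileSystem fs = FileSystem.get(conf);

            // List a directory - the same information HUE's file browser renders as a web page.
            for (FileStatus status : fs.listStatus(new Path("/user/demo"))) {
                System.out.printf("%s\t%d bytes%n", status.getPath(), status.getLen());
            }
            fs.close();
        }
    }

The same listing is what an administrator would otherwise pull from the command line with hadoop fs -ls /user/demo.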

Based on Google’s proprietary software infrastructure, Hadoop is a means of crunching epic amounts of data across a network of distributed machines. Named for the yellow stuffed elephant belonging to the son of project founder Doug Cutting, the platform underpins online services operated by everyone from Yahoo! and Facebook to Microsoft.

Hadoop mimics GFS, Google's distributed file system, and MapReduce, Mountain View's distributed number-crunching platform. In 2004, Google published a pair of research papers on these infrastructure technologies, and Doug Cutting used the papers to build a platform that would back Nutch, his open source web crawler. Hadoop was open sourced at Apache, and it was bootstrapped by Yahoo!, which hired Cutting in 2006, before he left for Cloudera.

The platform consists of the Hadoop Distributed File System (HDFS), which spreads data across the cluster's machines, and Hadoop MapReduce, which runs computation over that data where it is stored.
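
To make that division of labour concrete, here is a minimal sketch of the canonical word-count job written against Hadoop's Java MapReduce API. It is illustrative rather than anything Cloudera ships: the map step turns each line of text read from HDFS into (word, 1) pairs, the reduce step sums those counts across the cluster, and the input and output paths are placeholders supplied on the command line.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Map: emit (word, 1) for every word in the chunk of input this node is handed.
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reduce: sum the counts emitted for each word across the whole cluster.
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            // Input and output live in HDFS; the paths are supplied on the command line.
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Packaged into a jar, a job like this is launched with hadoop jar wordcount.jar WordCount <input> <output>, and the results land back in HDFS.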

Previously, Cloudera's only proprietary product was the free Cloudera Desktop, which has now been open sourced as HUE. The company offered its own Hadoop distro and various other open source tools in tandem with support, training, and certification services. But Olson and company say they've long been planning to add a subscription revenue stream.

"For the first time we're able to go to market with the stance that if you're using just HDFS and MapReduce, you're not getting the volume you should be out of Hadoop," says company co-founder Cloudera Jeff Hammerbacher, who worked on Hadoop at Facebook. "At Facebook, HDFS and MapReduce provided an excellent starting point for building infrastructure to management and extract value from datasets, but we had a large variety of tools surrounding those two." ®
