Google File System II stalked by open-source elephant

Cloudera: Anything Google can do...

As Google rolls out GFS2 - a major update to the custom-built file system underpinning its online infrastructure - the company's former infrastructure don sees no reason why the open source world can't follow suit.

Famously, Christophe Bisciglia taught a course at the University of Washington meant to educate rising computer scientists in Google's epic data juggling ways, and earlier this year, he brought his Big Data know-how to Cloudera, a star-studded startup that's helping to mimic Google's data prowess via open source software.

Cloudera is what you might call a Red Hat for Hadoop, the Apache-hosted open source platform based on the original Google File System (GFS), and MapReduce, Mountain View's distributed number-crunching platform. The startup offers services and support around its own Hadoop distro. Hadoop already underpins online services from Yahoo!, Facebook, Microsoft (believe it or not), and other web outfits, but it might be applied to almost any task dependent on processing unusually large amounts of data.
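For readers unfamiliar with the model, here is a minimal single-process sketch of the MapReduce contract the article refers to, using the classic word-count example. Real Hadoop distributes the map and reduce phases across a cluster; this toy version only illustrates the programming shape, and none of it is Hadoop or Google code.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the input split.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key, then sum each key's values.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["the quick brown fox", "the lazy dog"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(pairs)["the"])  # 2
```

The appeal is that the framework, not the programmer, handles distributing those two phases across machines and recovering from failures.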

Details on Google's GFS2 are slim. After all, it's Google. But based on what he's read, Bisciglia calls the update "the next logical iteration" of the original GFS, and he sees Hadoop eventually following in the (rather sketchy) footsteps left by his former employer.

"A lot of the things Google is talking about are very logical directions for Hadoop to go," Bisciglia tells The Reg. "One of the things I've been very happy to see repeatedly demonstrated is that Hadoop has been able to implement [new Google GFS and MapReduce] features in approximately the same order. This shows that the fundamentals of Hadoop are solid, that the fundamentals are based on the same principles that allowed Google's systems to scale over the years.

"I don't think there's anything in Google's internal systems or GFS2 that is out of scope or out of range for Hadoop. The only issue is the reality that Google has been working on it for much longer."

GFS is celebrating its 10th anniversary, and Hadoop didn't get its start until 2005, after Google published a pair of research papers describing GFS and MapReduce. The project was founded by Nutch crawler creator Doug Cutting, who named it after his son's yellow stuffed elephant.

Google says it's been brewing GFS2 - if that's what it's called - for about the last two years. And it's now part of Caffeine, a rewrite of Google's search indexing system.

With GFS - and the Hadoop File System - a master node oversees data spread across a series of distributed chunkservers. Chunkservers store, yes, chunks of data, each about 64 megabytes. Meanwhile, GFS2 uses not only distributed slaves, but distributed masters as well.
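The master's job can be pictured as pure metadata bookkeeping. The sketch below is hypothetical (not real GFS or HDFS code): it shows how a master might map a large file into fixed 64MB chunks and assign each chunk to replica chunkservers. The round-robin placement and replication factor of three are illustrative assumptions standing in for the real placement policy.

```python
CHUNK_SIZE = 64 * 1024 * 1024  # the roughly 64MB chunk size cited for GFS/HDFS
REPLICATION = 3                # assumed replication factor, for illustration

def chunk_layout(file_size, chunkservers):
    """Return master metadata: one entry per chunk, with its replica hosts."""
    num_chunks = (file_size + CHUNK_SIZE - 1) // CHUNK_SIZE  # ceiling division
    layout = []
    for i in range(num_chunks):
        # Round-robin placement stands in for the real placement policy.
        replicas = [chunkservers[(i + r) % len(chunkservers)]
                    for r in range(REPLICATION)]
        layout.append({"chunk": i, "replicas": replicas})
    return layout

servers = ["cs0", "cs1", "cs2", "cs3"]
meta = chunk_layout(200 * 1024 * 1024, servers)  # a 200MB file -> 4 chunks
print(len(meta))            # 4
print(meta[0]["replicas"])  # ['cs0', 'cs1', 'cs2']
```

Because clients only ask the master for this small table and then talk to chunkservers directly, the master becomes the bottleneck only when there are very many clients - which is exactly the pressure distributed masters relieve.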

Among other things, this improves speed. "Multiple masters is going to increase your read capacity," Bisciglia says. "It's going to allow you to have many more clients accessing the file system without inducing all of that load onto the master. From a read perspective, this is very natural. It's a reasonably understood engineering task."

It's unclear whether GFS2 also distributes writes across multiple masters. Either way, the new setup provides added redundancy: if one master goes down, (many) others are there to pick up the slack.

Originally, when masters went down, GFS had no hot failover at all. But Google eventually made amends, and now, the Hadoop project is developing its own hot failover. "I watched a lot of the growing pains of GFS during my time at Google," says Bisciglia. "I remember back when there was no failover for the master and there were no [user storage] quotas in the file system. And these are some of the things you're now seeing from Hadoop."

Cloudera's latest test distro - CDH2, based on Hadoop 0.20 - adds quotas. "System administrators can allocate quotas to groups and users... You can much more effectively share a cluster among a larger group of users. You don't want to cut a cluster into small, isolated independent pieces. You want to manage resource sharing on a higher level, so that when one user is not using it, someone else can. And we now allow for all that."
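The bookkeeping behind such quotas is simple to sketch. The following is a hypothetical illustration, not Hadoop code: each user gets a byte allowance, and a write is rejected once it would exceed that allowance. The class name, users, and numbers are all invented for the example.

```python
class QuotaTracker:
    """Toy per-user space-quota ledger, illustrating the idea only."""

    def __init__(self):
        self.quota = {}  # user -> allowed bytes
        self.used = {}   # user -> bytes consumed so far

    def set_quota(self, user, limit_bytes):
        self.quota[user] = limit_bytes
        self.used.setdefault(user, 0)

    def charge(self, user, nbytes):
        """Charge a write against the user's quota; refuse if it would exceed it."""
        if self.used.get(user, 0) + nbytes > self.quota.get(user, 0):
            return False
        self.used[user] += nbytes
        return True

tracker = QuotaTracker()
tracker.set_quota("alice", 100)
print(tracker.charge("alice", 60))  # True
print(tracker.charge("alice", 60))  # False: would exceed the 100-byte quota
```

The point Bisciglia makes follows from this design: unused allowance is just headroom on a shared cluster, not capacity walled off into an isolated slice.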

Anything Google can do... ®

Bootnote

Cloudera touts CDH2 here. For now, the original CDH1 remains the typical choice for production systems, but Bisciglia expects that CDH2 - which also includes more stable APIs - will receive its official release sometime this fall.

On October 2, Cloudera is hosting a Hadoop conference in New York. Until September 21, it's offering discounted tickets for developers here.

