
Facebook warehousing 180 PETABYTES of data a year

The Social Network open-sources ‘Corona’ tool used to manage the deluge


Facebook’s data warehouse grows by “Over half a petabyte … every 24 hours”, according to a note The Social Network’s Engineering team has published alongside a new open source release.

The note says “ad-hoc queries, data pipelines, and custom MapReduce jobs process this raw data around the clock to generate more meaningful features and aggregations.”
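For the curious, a custom MapReduce job of the kind the note describes looks broadly like the minimal Java sketch below, written against the standard Apache Hadoop API. The log format, field layout and class names are illustrative assumptions, not anything from Facebook’s actual pipeline.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative only: counts log events per user ID in tab-separated raw logs.
public class EventCount {

  public static class EventMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text userId = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context ctx)
        throws IOException, InterruptedException {
      // Assumed format: userId <TAB> eventType <TAB> timestamp
      String[] fields = line.toString().split("\t");
      if (fields.length >= 1 && !fields[0].isEmpty()) {
        userId.set(fields[0]);
        ctx.write(userId, ONE);  // emit (user, 1) for each raw event
      }
    }
  }

  public static class SumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text user, Iterable<IntWritable> counts, Context ctx)
        throws IOException, InterruptedException {
      int total = 0;
      for (IntWritable c : counts) total += c.get();  // sum the 1s per user
      ctx.write(user, new IntWritable(total));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "event count");
    job.setJarByClass(EventCount.class);
    job.setMapperClass(EventMapper.class);
    job.setCombinerClass(SumReducer.class);  // pre-aggregate on the map side
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Thousands of jobs along these lines, running around the clock, are the scheduling load at issue in what follows.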

But vanilla-flavoured Apache Hadoop can’t do that job at Facebook’s scale, so the company created the code in question, dubbed Corona, which extends the big data darling’s capabilities to cope with the deluge of data it collects each day.

The note explains: “We initially employed the MapReduce implementation from Apache Hadoop as the foundation of this infrastructure, and that served us well for several years. But by early 2011, we started reaching the limits of that system.”

Those limits saw compute clusters clogged by MapReduce’s scheduling bottlenecks, while resource management struggled to keep pace with Facebook’s enormous demands.

Facebook characterises MapReduce, Hadoop-style, with the following illustration.

[Image: Facebook's depiction of Hadoop at work]

Corona, by contrast, offers the configuration depicted below.

[Image: Facebook's Corona tool]

Facebook says Corona rocks for the following reasons:

“Corona introduces a cluster manager whose only purpose is to track the nodes in the cluster and the amount of free resources. A dedicated job tracker is created for each job, and can run either in the same process as the client (for small jobs) or as a separate process in the cluster (for large jobs). One major difference from our previous Hadoop MapReduce implementation is that Corona uses push-based, rather than pull-based, scheduling. After the cluster manager receives resource requests from the job tracker, it pushes the resource grants back to the job tracker. Also, once the job tracker gets resource grants, it creates tasks and then pushes these tasks to the task trackers for running. There is no periodic heartbeat involved in this scheduling, so the scheduling latency is minimized.”
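To make that push-based flow concrete, here is a deliberately toy-sized sketch of the idea in Java. Every class and method name is invented for illustration; Corona’s real implementation is the code on GitHub, not this.

import java.util.ArrayDeque;
import java.util.Queue;

// Toy sketch of the push-based scheduling flow described above.
// All names are assumptions for illustration; this is not Corona's API.
public class PushSchedulingSketch {

  // A grant of free capacity on one node.
  record ResourceGrant(String node) {}

  // A unit of work bound for a specific node.
  record Task(String jobId, int taskId, String node) {}

  // Tracks only the nodes in the cluster and their free resources.
  static class ClusterManager {
    private final Queue<ResourceGrant> freeSlots = new ArrayDeque<>();

    ClusterManager() {
      // Pretend two nodes each advertise two free slots.
      for (String node : new String[] {"node-1", "node-2"}) {
        freeSlots.add(new ResourceGrant(node));
        freeSlots.add(new ResourceGrant(node));
      }
    }

    // Push-based: grants are sent straight back to the requesting job
    // tracker, instead of waiting for task trackers to poll via heartbeat.
    void requestResources(JobTracker jobTracker, int wanted) {
      while (wanted-- > 0 && !freeSlots.isEmpty()) {
        jobTracker.onGrant(freeSlots.poll());
      }
    }
  }

  // One dedicated job tracker per job, as the note describes.
  static class JobTracker {
    private final String jobId;
    private int nextTaskId = 0;

    JobTracker(String jobId) { this.jobId = jobId; }

    // On receiving a grant, create a task and push it to the node at once;
    // no periodic heartbeat sits in the loop, so latency stays low.
    void onGrant(ResourceGrant grant) {
      Task task = new Task(jobId, nextTaskId++, grant.node());
      System.out.println("push " + task + " -> " + grant.node());
    }
  }

  public static void main(String[] args) {
    ClusterManager clusterManager = new ClusterManager();
    JobTracker jobTracker = new JobTracker("job-42");
    clusterManager.requestResources(jobTracker, 3);
  }
}

The sketch exists only to show the direction of the messages: grants flow from the cluster manager to the job tracker, and tasks flow from the job tracker out to the nodes, with no polling in between.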

The post also details how Facebook introduced the new tool and, along the way, offers some insight into the scale of the company’s infrastructure with the revelation that the rollout began on a modestly sized cluster of 500 nodes, to “get feedback from early adopters.”

A 1,000-node trial yielded the first scaling problem before the tool was rolled out to all of the company’s servers.

The company has now made Corona available on GitHub. By doing so it has played by the right open source rules, given that the Engineering note suggests the company believes Corona will be a crucial tool “for years to come”.

Given the note says Facebook’s data warehouse “has grown by 2500x in the past four years” (a compound rate of roughly sevenfold growth per year), Corona looks to have serious data-handling grunt. And that’s just the warehouse: how much other data Facebook holds is not disclosed. Nor is just what Corona will deliver in terms of products or data analysis.

It may therefore be sensible, if one were to relax and partake of Corona’s namesake beverage, to admire the technical achievements described here, but to reserve judgement on what they may enable. ®
