Facebook 'open sources' custom server and data center designs

The last rule of Google 'Fight Club'

The Penthouse

According to Jay Park, Facebook's director of data-center design, the company chose Prineville for its new facility because the rural Oregon town had the necessary networking and power infrastructure as well as the appropriate climate for efficiently cooling the facility. "We can maximize the free cooling," he said.

On one level, the data center is designed to more efficiently deliver power to its servers. Typically, Park said, there is a power loss of between 11 and 17 per cent when you transfer power all the way to a data center's servers, but the Prineville center takes this figure down to 2 per cent, thanks to the use of a single transformer rather than the four-transformer setup used in the typical data center.
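
A rough way to see where that saving comes from: per-stage conversion losses multiply. The sketch below uses hypothetical per-stage loss figures - they are not Facebook's published numbers - to show how a four-stage chain can land in the low-teens loss range Park cites, while a single transformer stage stays near 2 per cent.

```python
# Back-of-the-envelope sketch of cumulative power-distribution losses.
# The per-stage loss figures are illustrative assumptions, not numbers
# published by Facebook.

def delivered_fraction(stage_losses):
    """Multiply per-stage efficiencies to get the end-to-end fraction delivered."""
    delivered = 1.0
    for loss in stage_losses:
        delivered *= 1.0 - loss
    return delivered

typical = delivered_fraction([0.02, 0.03, 0.04, 0.03])  # four assumed conversion stages
prineville = delivered_fraction([0.02])                 # single transformer stage

print(f"four-stage chain loses {(1 - typical) * 100:.1f}% before the servers")
print(f"single stage loses     {(1 - prineville) * 100:.1f}% before the servers")
```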

The system does away with a central UPS. For every six racks of servers, there's a single 48 volt DC UPS integrated with a 277 volt AC server power supply. "We eliminated some single points of failure, so we actually improved reliability by up to six times," Park said, adding that he dreamed up the facility's electrical design in the middle of the night and, with no paper available, sketched it out on a napkin.

At the facility, outside air comes through a grill in a "penthouse" at the top of the data center, where equipment is used to remove water from the air. If the air is too cold, it will actually be mixed with hot air from, well, the data center's servers. The outside air is then pushed down to the data center. Park said the temperature here will range from 65 to 80 degrees Fahrenheit, and humidity will range from 45 to 60 per cent.
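
For illustration only, here is a simplified sketch of that air-handling decision, using the 65-to-80-degree supply band Park cites. The thresholds and mode names are assumptions, humidity treatment is left out, and this is not Facebook's actual control logic.

```python
# Hypothetical sketch of the "penthouse" air-handling decision described
# above: mix in hot server exhaust when outside air is too cold, use it
# directly when it falls in the supply band, and fall back to evaporative
# cooling when it is too warm. Humidity handling is omitted.

SUPPLY_BAND_F = (65.0, 80.0)   # supply-air temperature band cited by Park

def air_handling_mode(outside_temp_f: float) -> str:
    low, high = SUPPLY_BAND_F
    if outside_temp_f < low:
        return "mix with hot server exhaust"   # warm the intake air
    if outside_temp_f <= high:
        return "free cooling"                  # outside air used as-is
    return "evaporative cooling"               # no chillers on site

for temp in (40, 72, 95):
    print(f"{temp}F -> {air_handling_mode(temp)}")
```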

Facebook data center - AMD motherboard

The AMD version of the mother of all social-networking motherboards

As Facebook has said in the past, Park also indicated that the facility will use heat from the servers to warm its built-in office space. There are no chillers, but there is a system that provides additional cooling with evaporated water.

Facebook's Amir Michael, part of the company's hardware-design team, described the Prineville servers as "vanity-free". Michael said that Facebook removed "all the plastic bezels" and "almost all the screws" and anything else that "didn't make sense". The chassis is taller than a standard server - 1.5U - and this let the company use taller heat sinks. Offering more surface area, he said, they're more efficient at cooling components. This, in turn, means that Facebook needn't force as much air onto the servers.

But the design uses larger fans as well - measuring about 60mm - because, Michael says, these too are more efficient. The servers also include snaps and spring-loaded plungers designed to make it easier for technicians to remove and replace parts.

Facebook has built both AMD and Intel motherboards, both manufactured by Quanta. As with the chassis, Michael and crew sought to remove as many components as possible, including expansion slots and other connectors. According to Michael, the voltage regulators on the motherboard achieve 93 per cent efficiency. The entire system weighs six pounds less than a traditional 1U server, Michael said.

There are two connectors on each server's power supply, one for the 277 volt AC input and another for the 48 volt battery backup system. The entire power supply, Michael said, achieves 94 per cent efficiency.
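
As a rough mental model of that dual-input arrangement, the sketch below has the server draw from the 277 volt AC feed in normal operation and fall back to the 48 volt DC battery string when the AC feed drops out. The voltage thresholds and the selection logic are assumptions for illustration, not details from Facebook's design.

```python
# Hypothetical illustration of the dual power inputs described above: the
# server runs from the 277 V AC feed normally and switches to the 48 V DC
# battery backup when the AC feed fails. Thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class PowerInputs:
    ac_volts: float   # 277 V AC feed from the facility transformer
    dc_volts: float   # 48 V DC feed from the rack's battery cabinet

def select_source(inputs: PowerInputs) -> str:
    AC_NOMINAL, DC_NOMINAL = 277.0, 48.0
    if inputs.ac_volts > 0.9 * AC_NOMINAL:
        return "277 V AC feed"
    if inputs.dc_volts > 0.9 * DC_NOMINAL:
        return "48 V DC battery backup"
    return "no usable input"

print(select_source(PowerInputs(ac_volts=277.0, dc_volts=48.0)))  # normal operation
print(select_source(PowerInputs(ac_volts=0.0, dc_volts=48.0)))    # AC outage
```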

The company has also built its own rack, known as a "triplet rack," housing three columns of thirty servers. That's a total of 90 servers per rack. Servers are mounted on shelves rather than rails. There's a battery cabinet in each rack for backup power.

According to Heiliger, the data center is 38 per cent more efficient than Facebook's existing leased data centers, but the cost is about 20 per cent less. The company began testing the data center at the end of last year, Heiliger tells The Reg, and it began taking live traffic over the past month.

Facebook data center - server racks

Rack up the pokes!

Facebook broke ground on the Prineville data center in January 2010. Previously, the company leased data-center space from third parties. At the time of the groundbreaking, Facebook said it would use outside air and evaporated water to cool the facility, rather than depend on chillers. About 60 to 70 per cent of the time, outside air will be sufficient, the company said then, but during the warmer and more humid days of the year, an "evaporative cooling system" will kick in. Heiliger told us at Thursday's event that outside cooling could potentially happen year-round.
