Facebook 'open sources' custom server and data center designs

The Penthouse

According to Jay Park, Facebook's director of data-center design, the company chose Prineville for its new facility because the rural Oregon town had the necessary networking and power infrastructure as well as the appropriate climate for efficiently cooling the facility. "We can maximize the free cooling," he said.

On one level, the data center is designed to more efficiently deliver power to its servers. Typically, Park said, there is a power loss of between 11 and 17 per cent when you transfer power all the way to a data center's servers, but the Prineville center takes this figure down to 2 per cent, thanks to the use of a single transformer rather than the four-transformer setup used in the typical data center.
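Those percentages translate into a lot of usable power at facility scale. Here's a back-of-the-envelope sketch of the difference; the 1 MW feed and the 14 per cent mid-point are illustrative assumptions, not figures from Facebook:

```python
def power_delivered(input_kw: float, loss_pct: float) -> float:
    """Power that actually reaches the servers after distribution losses."""
    return input_kw * (1 - loss_pct / 100)

facility_input_kw = 1000  # hypothetical 1 MW utility feed

typical = power_delivered(facility_input_kw, 14)     # mid-point of 11-17%
prineville = power_delivered(facility_input_kw, 2)   # single-transformer design

print(f"Typical data center: {typical:.0f} kW delivered")    # 860 kW
print(f"Prineville design:   {prineville:.0f} kW delivered") # 980 kW
print(f"Extra useful power:  {prineville - typical:.0f} kW per MW of input")
```

In other words, cutting distribution losses from roughly 14 per cent to 2 per cent frees up on the order of 120 kW of server power for every megawatt drawn from the grid.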

The system does away with a central UPS. For every six racks of servers, there's a single 48 volt DC UPS integrated with a 277 volt AC server power supply. "We eliminated some single points of failure, so we actually improved reliability by up to six times," Park said, adding that he dreamed up the facility's electrical design in the middle of the night, and with no paper available, he sketched it out on a napkin.

At the facility, outside air comes through a grille in a "penthouse" at the top of the data center, where equipment removes water from the air. If the air is too cold, it is mixed with hot air from, well, the data center's servers. The outside air is then pushed down onto the data center floor. Park said the temperature here will range from 65 to 80 degrees Fahrenheit, and humidity will range from 45 to 60 per cent.
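The blending step Park describes amounts to a simple mixing calculation. Here's a minimal, hypothetical sketch of that logic; the 95°F return-air temperature and the linear mixing model are assumptions for illustration, not Facebook's actual controls:

```python
def mix_ratio(outside_f: float, target_low_f: float = 65.0,
              return_air_f: float = 95.0) -> float:
    """Fraction of hot server return air to blend in so the supply air
    reaches at least target_low_f. Returns 0 when the outside air is
    already warm enough on its own."""
    if outside_f >= target_low_f:
        return 0.0
    # Linear mixing: supply = r * return_air + (1 - r) * outside = target
    return (target_low_f - outside_f) / (return_air_f - outside_f)

# A 40°F winter morning needs roughly 45% recirculated server exhaust
print(round(mix_ratio(40.0), 2))  # 0.45

# A 70°F day needs none at all
print(mix_ratio(70.0))  # 0.0
```

The same arithmetic explains why there are no chillers: as long as the outside air sits at or below the top of the 65-80°F band, mixing and evaporation alone can hold the supply temperature in range.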

Facebook data center - AMD motherboard

The AMD version of the mother of all social-networking motherboards (click to enlarge)

Echoing what Facebook has said in the past, Park indicated that the facility will use heat from the servers to warm its built-in office space. There are no chillers. But there is a system that provides additional cooling with evaporated water.

Facebook's Amir Michael, part of the company's hardware-design team, described the Prineville servers as "vanity-free". Michael said that Facebook removed "all the plastic bezels" and "almost all the screws" and anything else that "didn't make sense". The chassis is taller than the standard server - 1.5U - and this let the company use taller heat sinks. Offering more surface area, he said, they're more efficient at cooling components. This, in turn, means that Facebook needn't force as much air onto the servers.

But the design uses larger fans too - measuring about 60mm - because, Michael says, these are likewise more efficient. The servers also include snaps and spring-loaded plungers designed to make it easier for technicians to remove and replace parts.

Facebook has built both AMD and Intel motherboards, both manufactured by Quanta. As with the chassis, Michael and crew sought to remove as many components as possible, including expansion slots and other connectors. According to Michael, the voltage regulators on the motherboard achieve 93 per cent efficiency. The entire system weighs six pounds less than the traditional 1U server, Michael said.

There are two connectors to the power bricks on each server, one for the 277 volt input and another for the 48 volt battery backup system. The entire motherboard, Michael said, achieves 94 per cent efficiency.

The company has also built its own rack, known as a "triplet rack," housing three columns of thirty servers. That's a total of 90 servers per rack. Servers are mounted on shelves rather than rails. There's a battery cabinet in each rack for backup power.

According to Jonathan Heiliger, Facebook's vice president of technical operations, the data center is 38 per cent more efficient than Facebook's existing leased data centers, while its cost is about 20 per cent lower. The company began testing the data center at the end of last year, Heiliger tells The Reg, and it began taking live traffic over the past month.

Facebook data center - server racks

Rack up the pokes! (click to enlarge)

Facebook broke ground on the Prineville data center in January 2010. Previously, the company leased data-center space from third parties. At the time of the groundbreaking, Facebook said it would use outside air and evaporated water to cool the facility, rather than depend on chillers. About 60 to 70 per cent of the time, outside air will be sufficient, the company said at the time, but during the warmer and more humid days of the year, an "evaporative cooling system" will kick in. Heiliger told us at Thursday's event that outside cooling could potentially happen year-round.
