Facebook 'open sources' custom server and data center designs

The last rule of Google 'Fight Club'

The Penthouse

According to Jay Park, Facebook's director of data-center design, the company chose Prineville for its new facility because the rural Oregon town had the necessary networking and power infrastructure as well as the appropriate climate for efficiently cooling the facility. "We can maximize the free cooling," he said.

On one level, the data center is designed to more efficiently deliver power to its servers. Typically, Park said, there is a power loss of between 11 and 17 per cent when you transfer power all the way to a data center's servers, but the Prineville center takes this figure down to 2 per cent, thanks to the use of a single transformer rather than the four-transformer setup used in the typical data center.
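
The arithmetic behind those figures is worth spelling out, since losses compound across every conversion stage between the utility feed and the server. The sketch below (Python, with hypothetical per-stage efficiencies chosen only so the totals land near the numbers Park quotes) multiplies the stages together to show how a multi-transformer chain ends up in the 11-to-17-per-cent range while a single-transformer feed loses roughly 2 per cent.

    # Illustrative only: per-stage efficiencies are assumptions, not Facebook's
    # published figures; only the headline totals come from Park's remarks.
    def delivered_fraction(stage_efficiencies):
        """Multiply per-stage efficiencies to get the fraction of utility power
        that actually reaches the server."""
        fraction = 1.0
        for eff in stage_efficiencies:
            fraction *= eff
        return fraction

    # A conventional chain: several transformers plus a central UPS, each a few
    # per cent lossy (hypothetical round numbers).
    conventional = [0.98, 0.97, 0.97, 0.96, 0.97]
    # The Prineville approach: a single transformer feeding 277V AC to the racks.
    prineville = [0.98]

    for name, chain in (("conventional", conventional), ("prineville", prineville)):
        loss = 1.0 - delivered_fraction(chain)
        print(f"{name}: ~{loss:.1%} lost before the server")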

The system does away with a central UPS. For every six racks of servers, there's a single 48 volt DC UPS integrated with a 277 volt AC server power supply. "We eliminated some single points of failure, so we actually improved reliability by up to six times," Park said, adding that he dreamed up the facility's electrical design in the middle of the night, and with no paper available, he sketched it out on a napkin.
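
To make the reliability claim concrete: a central UPS is one shared dependency, so a single fault can drop every rack behind it, while the distributed design confines a UPS fault to the six racks it serves. The snippet below is a minimal sketch of that failure-domain argument, with a hypothetical data-hall size; only the one-UPS-per-six-racks topology comes from Park's description.

    # Hypothetical rack count; the 6-racks-per-UPS grouping is from the article.
    TOTAL_RACKS = 600
    RACKS_PER_UPS = 6

    central_blast_radius = TOTAL_RACKS        # central UPS fails: everything behind it drops
    distributed_blast_radius = RACKS_PER_UPS  # one small 48V unit fails: six racks drop

    print("racks affected by a single UPS failure:")
    print(f"  central UPS design:      {central_blast_radius}")
    print(f"  distributed 48V design:  {distributed_blast_radius}")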

At the facility, outside air comes through a grill in a "penthouse" at the top of the data center, where equipment is used to remove water from the air. If the air is too cold, it will actually be mixed with hot air from, well, the data center's servers. The outside air is then pushed down to the data center. Park said the temperature here will range from 65 to 80 degrees Fahrenheit, and humidity will range from 45 to 60 per cent.
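
In control terms, the penthouse is doing a simple mixing job: when the filtered outside air is colder than the supply band, recirculate enough hot-aisle exhaust to bring the blend back into range. The sketch below uses an assumed linear-mixing model, not Facebook's actual control logic; only the 65-to-80-degree supply band comes from Park.

    SUPPLY_BAND_F = (65.0, 80.0)   # supply-air range quoted by Park

    def exhaust_fraction(outside_f, exhaust_f, target_f=SUPPLY_BAND_F[0]):
        """Fraction of hot-aisle exhaust to blend with outside air so the mixed
        stream reaches the target supply temperature (simple linear mixing)."""
        if outside_f >= target_f or exhaust_f <= outside_f:
            return 0.0             # outside air is warm enough, or no exhaust heat to gain
        return min((target_f - outside_f) / (exhaust_f - outside_f), 1.0)

    # Example: a 40°F morning with 95°F hot-aisle exhaust (both hypothetical).
    print(f"recirculate ~{exhaust_fraction(40.0, 95.0):.0%} exhaust air")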

Facebook's AMD motherboard: the AMD version of the mother of all social-networking motherboards

Echoing what Facebook has said in the past, Park also indicated that the facility will use heat from the servers to heat its built-in office space. There are no chillers, but there is a system that provides additional cooling with evaporated water.

Facebook's Amir Michael, part of the company's hardware-design team, described the Prineville servers as "vanity-free". Michael said that Facebook removed "all the plastic bezels" and "almost all the screws" and anything else that "didn't make sense". The chassis is taller than a standard server, at 1.5U, and this let the company use taller heat sinks. Offering more surface area, he said, they're more efficient when cooling components. This, in turn, means that Facebook needn't force as much air onto the servers.
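
The surface-area point follows from the basic convection relation: heat shed scales with the product of the convection coefficient and the fin area, so a taller sink with more area needs a weaker (slower, quieter) airflow to move the same wattage. The numbers below are assumptions for illustration; only the claim that more area means less forced air comes from Michael.

    CPU_WATTS = 95.0     # hypothetical chip heat load
    MAX_RISE_C = 40.0    # hypothetical allowed rise above inlet air

    def required_h(fin_area_m2):
        """Convection coefficient needed so that Q = h * A * dT still holds."""
        return CPU_WATTS / (fin_area_m2 * MAX_RISE_C)

    # A 1U-class sink versus a roughly 50 per cent taller 1.5U-class sink.
    for area in (0.05, 0.075):
        print(f"fin area {area} m^2 -> need h of about {required_h(area):.0f} W/(m^2.K)")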

But the design also uses larger fans, because, Michael says, these too are more efficient. The fans measure about 60mm. The servers also include snaps and spring-loaded plungers designed to make it easier for technicians to remove and replace parts.
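
The larger-fan claim lines up with the standard fan affinity laws: airflow scales with speed times diameter cubed, while power scales with speed cubed times diameter to the fifth, so a bigger fan turning slowly can move the same air for much less power. The figures below are a rough, assumed illustration of that scaling, not measurements from Facebook's design.

    def relative_power(diameter_ratio):
        """Power of a geometrically similar fan delivering the same airflow,
        relative to the baseline fan (affinity-law scaling)."""
        speed_ratio = 1.0 / diameter_ratio ** 3   # hold airflow Q ~ N * D^3 constant
        return (speed_ratio ** 3) * (diameter_ratio ** 5)

    # e.g. stepping up from a hypothetical 40mm baseline to the 60mm fans described
    print(f"~{relative_power(60 / 40):.0%} of the baseline fan's power")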

Facebook has built both AMD and Intel motherboards, both manufactured by Quanta. As with the chassis, Michael and crew sought to remove as many components as possible, including expansion slots and other connectors. According to Michael, the voltage regulators on the motherboard achieve 93 per cent efficiency. The entire system weighs six pounds less than the traditional 1U server, Michael said.

There are two connectors to the power bricks on each server, one for the 277 volt input and another for the 48 volt battery backup system. The entire motherboard, Michael said, achieves 94 per cent efficiency.
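
Reading the power figures together (an inference from the numbers quoted, not something Facebook spells out), roughly 2 per cent lost in distribution and a 94 per cent efficient conversion on the board would put wall-to-board efficiency somewhere around 92 per cent:

    distribution_eff = 0.98   # ~2 per cent distribution loss quoted by Park
    board_eff = 0.94          # 94 per cent figure quoted by Michael
    print(f"combined wall-to-board efficiency: ~{distribution_eff * board_eff:.0%}")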

The company has also built its own rack, known as a "triplet rack," housing three columns of thirty servers. That's a total of 90 servers per rack. Servers are mounted on shelves rather than rails. There's a battery cabinet in each rack for backup power.

According to Heiliger, the data center is 38 per cent more efficient than Facebook's existing leased data centers, while the cost is about 20 per cent less. The company began testing the data center at the end of last year, Heiliger tells The Reg, and it began taking live traffic over the past month.

Facebook's server racks: rack up the pokes!

Facebook broke ground on the Prineville data center in January 2010. Previously, the company leased data-center space from third parties. At the time of the groundbreaking, Facebook said it would use outside air and evaporated water to cool the facility, rather than depend on chillers. Outside air will be sufficient about 60 to 70 per cent of the time, the company said then, but during the warmer and more humid days of the year, an "evaporative cooling system" will kick in. Heiliger told us at Thursday's event that outside cooling could potentially happen year-round.
