Facebook beats the heat in North Carolina data center

It's not the heat or the humidity

The techies at Facebook may like to "move fast and break things," as company founder Mark Zuckerberg admonished them to do three years ago, before the social media juggernaut went public. But the one thing they don't want to do is break a data center and all of the servers and storage running inside it.

But using outdoor air cooling in its Forest City, North Carolina data center, which started serving up applications in April using the company's custom, open sourced data center designs, was a tad risky.

Daniel Lee, a mechanical engineer at Facebook, posted some stats about the Forest City data center in a blog over at the Open Compute Project, where those open source server and data center designs live.

Among other things, the charts show that the weather has been kind so far to the data center. When the heat has been on, the humidity has been low enough that evaporative cooling of the outside air has been sufficient to keep Zuck's servers from melting and Wall Street from gnashing its teeth even more than it already has about Facebook's finances.

As Lee puts it, Facebook knew that it was taking some risks with trying to cool the data center with outside air in a region that, as far as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines were concerned, was out of bounds because of the humidity levels.

Facebook's first data center, in Prineville, Oregon, used outside air cooling as well, but Prineville is an arid environment with cool nights and winters and hot, dry summers. As long as the air is dry, you can cool it with water.

But back in the swampy air of the US East Coast, you can get both heat and humidity, and then all the air conditioning in the world doesn't seem to help. And if you are trying to do outside-air cooling for the 100,000 servers in the Forest City data center, when the humidity and the heat are both on the rise you have a problem.

Lee says that Facebook hedged its bets and installed a direct expansion (DX) coil system – basically, an air conditioner like we have in our homes – in the Forest City data center, just in case it got too hot or too humid. Or both at the same time.

Because the weather is quite a bit different in Oregon and North Carolina, Lee says that the data center was tweaked to have a server inlet relative humidity of 90 per cent, considerably higher than the 65 per cent in Prineville, and a server inlet temperature of 85 degrees instead of 80 degrees out in Prineville.

And it worked.

The DX coils have not been switched on at all this year, despite the dry bulb temperatures outside the Forest City data center hitting 100 degrees or higher several times in June and July.

The heat and humidity cut Facebook some slack in North Carolina this summer

In fact, says Lee, on July 1, when the afternoon outside temperature was 102 degrees (as measured by a dry bulb thermometer), the relative humidity was only 26 per cent. And when the air was particularly moist, Facebook got lucky in that the temperatures were not that high, so it could mix hot, relatively dry air coming off the servers back into the inlet stream, lowering the overall humidity of the inlet air while keeping it cool enough to pass back over the servers again.
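Lee doesn't publish the psychrometric math behind that recirculation trick, but it can be sketched with a simplified model. All the numbers below are hypothetical, and the model makes two textbook assumptions: saturation vapour pressure follows the Magnus approximation, and the two air streams mix linearly in temperature and moisture content at constant atmospheric pressure:

```python
import math

P_ATM = 1013.25  # total air pressure, hPa (sea-level assumption)


def saturation_pressure(t_c):
    """Saturation vapour pressure in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))


def humidity_ratio(t_c, rh):
    """kg of water vapour per kg of dry air at temperature t_c, RH in 0-1."""
    p_v = rh * saturation_pressure(t_c)
    return 0.622 * p_v / (P_ATM - p_v)


def relative_humidity(t_c, w):
    """Relative humidity (0-1) of air with humidity ratio w at t_c."""
    p_v = w * P_ATM / (0.622 + w)
    return p_v / saturation_pressure(t_c)


def mix(t1, w1, t2, w2, frac1=0.5):
    """Back-of-the-envelope adiabatic mix: temperature and humidity
    ratio both combine roughly linearly by mass fraction."""
    return (frac1 * t1 + (1 - frac1) * t2,
            frac1 * w1 + (1 - frac1) * w2)


# Muggy outside air: 20 C at 95 per cent relative humidity.
t_out, rh_out = 20.0, 0.95
w_out = humidity_ratio(t_out, rh_out)

# Server exhaust: the gear adds heat but no moisture, so the exhaust has
# the same humidity ratio, just at a higher temperature (say 45 C) --
# which means a much lower relative humidity.
t_exh, w_exh = 45.0, w_out

# Blend the two streams 50/50 and the inlet RH drops well below 95%.
t_mix, w_mix = mix(t_out, w_out, t_exh, w_exh)
rh_mix = relative_humidity(t_mix, w_mix)
print(f"inlet mix: {t_mix:.1f} C at {rh_mix:.0%} RH (outside was {rh_out:.0%})")
```

The point of the sketch is the direction of the effect, not the exact figures: because the servers heat the air without adding water to it, folding exhaust back into the inlet always pushes the relative humidity down, at the cost of a warmer inlet temperature.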

The upshot is that you don't have to put your data centers in arid or cool areas to make outside-air cooling work – so long as you are happy to step outside the boundaries of the ASHRAE guidelines, as Facebook did:

If you can read this ASHRAE chart and ignore it, Facebook wants to hire you

Another interesting thing is that the Forest City data center was projected to have a power usage effectiveness (PUE) rating of between 1.06 and 1.08. PUE is the ratio of all the power coming into the data center to the power consumed by the servers, storage, and networking gear alone. A lot of old data centers have a PUE of around 2 or worse, while the best ones from Google, Facebook, and Yahoo! are around 1.10, and sometimes a little lower.
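The ratio itself is simple arithmetic; a minimal sketch with hypothetical loads shows what a 1.07 PUE actually means:

```python
def pue(total_facility_kw, it_load_kw):
    """Power usage effectiveness: all power entering the data center
    divided by the power consumed by the IT gear alone."""
    return total_facility_kw / it_load_kw

# Hypothetical example: a PUE of 1.07 means just 70 kW of overhead
# (cooling, power distribution, lighting) for every 1,000 kW of IT load.
# An old-school facility at PUE 2.0 burns 1,000 kW of overhead for the
# same IT load.
print(pue(total_facility_kw=1070.0, it_load_kw=1000.0))  # → 1.07
```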

Despite the harsher weather conditions, Lee says the Forest City data center has attained a PUE of 1.07, smack dab in the projected range, for the summer. And it even bested the Prineville data center, which had a PUE of 1.09 over the same time. ®
