Facebook beats the heat in North Carolina data center

It's not the heat or the humidity

The techies at Facebook may like to "move fast and break things," as company founder Mark Zuckerberg admonished them to do three years ago, before the social media juggernaut went public. But the one thing they don't want to break is a data center and all of the servers and storage running inside it.

But using outdoor air cooling in its Forest City, North Carolina data center – which started serving up applications in April, using the company's custom and open sourced data center designs – was a tad risky.

Daniel Lee, a mechanical engineer at Facebook, posted some stats about the Forest City data center in a blog over at the Open Compute Project, where those open source server and data center designs live.

Among other things, the charts show that the weather has been kind so far to the data center. When the heat has been on, the humidity has been low enough that evaporative cooling of the outside air has been sufficient to keep Zuck's servers from melting and Wall Street from gnashing its teeth even more than it already has about Facebook's finances.

As Lee puts it, Facebook knew that it was taking some risks with trying to cool the data center with outside air in a region that, as far as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines were concerned, was out of bounds because of the humidity levels.

Facebook's first data center, in Prineville, Oregon, used outside air cooling as well, but Prineville is an arid environment with cool nights and winters and hot, dry summers. As long as the air is dry, you can cool it with water.

But back in the swampy air of the US East Coast, you can get both heat and humidity, and then all the air conditioning in the world doesn't seem to help. And if you are trying to do outside-air cooling for the 100,000 servers in the Forest City data center, when the humidity and the heat are both on the rise you have a problem.

Lee says that Facebook hedged its bets and installed a direct expansion (DX) coil system – basically, an air conditioner like we have in our homes – in the Forest City data center, just in case it got too hot or too humid. Or both at the same time.
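The bet Facebook was hedging comes down to wet-bulb temperature: evaporative cooling can only pull air down to roughly its wet bulb, so the DX coils are needed only when the wet bulb climbs above the server inlet target. A minimal sketch of that go/no-go check, using Stull's empirical wet-bulb approximation – the 100°F, 25 per cent RH afternoon is an illustrative reading of ours, not one from Lee's blog:

```python
import math

def wet_bulb_c(t_c: float, rh_pct: float) -> float:
    """Stull's empirical wet-bulb fit, valid roughly for RH 5-99%, T -20..50C."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

inlet_target_f = 85.0                           # Forest City's raised inlet setpoint
wb_f = c_to_f(wet_bulb_c(f_to_c(100.0), 25.0))  # a hot but dry afternoon (illustrative)
print(f"wet bulb ~{wb_f:.0f}F; "
      f"{'evaporative cooling suffices' if wb_f < inlet_target_f else 'DX coils needed'}")
```

On a 100°F day at 25 per cent relative humidity, the wet bulb works out to the low 70s Fahrenheit, comfortably below the 85°F inlet target – which is why a scorching-but-dry summer never forced the DX coils on.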

Because the weather is quite a bit different in Oregon and North Carolina, Lee says that the data center was tweaked to have a server inlet relative humidity of 90 per cent, considerably higher than the 65 per cent in Prineville, and a server inlet temperature of 85 degrees instead of 80 degrees out in Prineville.

And it worked.

The DX conditioners have not been switched on at all this year, despite dry bulb temperatures outside the Forest City data center hitting 100 degrees or higher several times in June and July.

The heat and humidity cut Facebook some slack in North Carolina this summer

In fact, says Lee, on the afternoon of July 1, when the outside temperature was 102 degrees (as measured by a dry bulb thermometer), the relative humidity was only 26 per cent. And on days when the air was particularly moist, Facebook got lucky in that temperatures were not that high, and it actually mixed hot, relatively dry air coming off the servers back into the server inlet stream to lower the overall humidity of the inlet air – which was still cool enough to pass back over the servers again.
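That mixing trick is straightforward psychrometrics: blending hot, dry return air into cool, moist outside air raises the temperature of the blend, and since warmer air can hold more moisture, its relative humidity drops. A rough sketch – the Magnus saturation-pressure formula is standard, but the stream temperatures, humidities, and mixing fraction below are made-up illustrations, not Facebook's numbers:

```python
import math

def sat_vp_hpa(t_c: float) -> float:
    """Magnus formula for saturation vapour pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def mix(t1: float, rh1: float, t2: float, rh2: float, frac1: float):
    """Mix two air streams by mass fraction. At fixed total pressure,
    temperature and vapour pressure blend roughly linearly; relative
    humidity is then recomputed against the warmer blend's saturation."""
    e1 = rh1 / 100 * sat_vp_hpa(t1)
    e2 = rh2 / 100 * sat_vp_hpa(t2)
    t = frac1 * t1 + (1 - frac1) * t2
    e = frac1 * e1 + (1 - frac1) * e2
    return t, 100 * e / sat_vp_hpa(t)

# 70% cool, moist outside air (22C, 95% RH) mixed with
# 30% hot, drier server exhaust (35C, 30% RH):
t, rh = mix(22, 95, 35, 30, 0.70)
print(f"mixed stream: {t:.1f}C at {rh:.0f}% RH")
```

With those illustrative numbers, the blend lands around 26°C at roughly two-thirds relative humidity – a few degrees warmer than the outside air, but far enough below the 90 per cent inlet limit to be safe to push back over the servers.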

The upshot is that you don't have to put your data centers in arid or cool areas to make it work – so long as you are happy to step outside the boundaries of the ASHRAE guidelines, as Facebook did:

If you can read this ASHRAE chart and ignore it, Facebook wants to hire you

Another interesting thing is that the Forest City data center was projected to have a power usage effectiveness (PUE) rating of between 1.06 and 1.08. PUE is all of the power coming into the data center divided by the power consumed by the server, storage, and networking gear. A lot of old data centers have a PUE of around 2 or worse, while the best ones from Google, Facebook, and Yahoo! are around 1.10, and sometimes a little lower.
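The arithmetic is simple division, which makes it easy to see how little overhead a 1.07 leaves for cooling, lighting, and power distribution. A sketch with illustrative power figures of our own, not Facebook's meter readings:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power in,
    divided by the power consumed by the IT gear alone."""
    return total_facility_kw / it_load_kw

# A facility drawing 10,700kW in total to run a 10,000kW IT load
# leaves just 700kW of overhead for everything else:
print(round(pue(10_700, 10_000), 2))  # 1.07
```

By contrast, a legacy data center at PUE 2.0 burns a full watt of overhead for every watt delivered to a server.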

Despite the harsher weather conditions, Lee says the Forest City data center has attained a PUE of 1.07, smack dab in the projected range, for the summer. And it even bested the Prineville data center, which had a PUE of 1.09 over the same time. ®
