
Facebook beats the heat in North Carolina data center

It's not the heat or the humidity


The techies at Facebook may like to "move fast and break things," as company founder Mark Zuckerberg admonished them to do three years ago, before the social media juggernaut went public. But the one thing they don't want to do is break a data center and all of the servers and storage running inside it.

But using outdoor air cooling in its Forest City, North Carolina data center – which started serving up applications using the company's custom, open sourced data center designs in April – was a tad risky.

Daniel Lee, a mechanical engineer at Facebook, posted some stats about the Forest City data center in a blog post over at the Open Compute Project, where those open source server and data center designs live.

Among other things, the charts show that the weather has been kind so far to the data center. When the heat has been on, the humidity has been low enough that evaporative cooling of the outside air has been sufficient to keep Zuck's servers from melting and Wall Street from gnashing its teeth even more than it already has about Facebook's finances.

As Lee puts it, Facebook knew that it was taking some risks with trying to cool the data center with outside air in a region that, as far as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines were concerned, was out of bounds because of the humidity levels.

Facebook's first data center, in Prineville, Oregon, used outside air cooling as well, but Prineville is an arid environment with cool nights and winters and hot, dry summers. As long as the air is dry, you can cool it with water.

But back in the swampy air of the US East Coast, you can get both heat and humidity, and then all the air conditioning in the world doesn't seem to help. And if you are trying to do outside-air cooling for the 100,000 servers in the Forest City data center, when the humidity and the heat are both on the rise, you have a problem.

Lee says that Facebook hedged its bets and installed a direct expansion (DX) coil system – basically, an air conditioner like we have in our homes – in the Forest City data center, just in case it got too hot or too humid. Or both at the same time.

Because the weather is quite a bit different in Oregon and North Carolina, Lee says that the data center was tweaked to have a server inlet relative humidity of 90 per cent, considerably higher than the 65 per cent in Prineville, and a server inlet temperature of 85 degrees instead of 80 degrees out in Prineville.
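The widened Forest City operating envelope can be sketched as a simple check. The thresholds (85°F inlet temperature, 90 per cent relative humidity, versus 80°F and 65 per cent at Prineville) come from the article; the decision function itself is an illustrative assumption, not Facebook's actual control logic:

```python
# Widened Forest City server-inlet envelope, per the figures above.
# The thresholds are from the article; the function is an illustrative
# sketch, not Facebook's real control system.

FOREST_CITY_MAX_INLET_F = 85.0   # vs 80 degrees F at Prineville
FOREST_CITY_MAX_RH = 0.90        # vs 65 per cent at Prineville

def within_envelope(inlet_temp_f: float, inlet_rh: float) -> bool:
    """Return True if server inlet conditions fall inside the widened
    envelope, i.e. free (outside-air) cooling can continue."""
    return (inlet_temp_f <= FOREST_CITY_MAX_INLET_F
            and inlet_rh <= FOREST_CITY_MAX_RH)

print(within_envelope(84.0, 0.80))  # True  - free cooling is fine
print(within_envelope(86.0, 0.50))  # False - inlet air is too hot as-is
```

On a hot, dry day, inlet air that fails the temperature check can still be brought inside the envelope by evaporative cooling before it reaches the servers.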

And it worked.

The DX conditioners have not been turned on all year, despite dry bulb temperatures outside the Forest City data center hitting 100 degrees or higher several times in June and July.

The heat and humidity cut Facebook some slack in North Carolina this summer


In fact, says Lee, on July 1, when the outside temperature hit 102 degrees (as measured by a dry bulb thermometer) during the afternoon, the relative humidity was only 26 per cent. On days when the air was particularly moist, Facebook got lucky in that temperatures were not that high, and it mixed hot, relatively dry air coming off the servers back into the inlet stream to lower the overall humidity of the inlet air, which remained cool enough to pass back over the servers.
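That mixing trick can be sketched with basic psychrometrics: temperature and humidity ratio blend roughly linearly by dry-air mass, and the blended stream ends up warmer but at a lower relative humidity. The numbers and the 30 per cent recirculation fraction below are made up for illustration, and the Magnus formula is a standard approximation, not anything from Facebook's design:

```python
import math

def sat_vapor_pressure_kpa(t_c: float) -> float:
    """Saturation vapour pressure over water, Magnus approximation (kPa)."""
    return 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))

def humidity_ratio(t_c: float, rh: float, p_kpa: float = 101.325) -> float:
    """kg of water vapour per kg of dry air."""
    pv = rh * sat_vapor_pressure_kpa(t_c)
    return 0.622 * pv / (p_kpa - pv)

def relative_humidity(t_c: float, w: float, p_kpa: float = 101.325) -> float:
    """Invert a humidity ratio back to relative humidity at t_c."""
    pv = w * p_kpa / (0.622 + w)
    return pv / sat_vapor_pressure_kpa(t_c)

def mix(t1, rh1, t2, rh2, frac2):
    """Blend stream 1 with a fraction frac2 of stream 2 (by dry-air mass).
    Simplification: temperature and humidity ratio both mix linearly."""
    t = (1 - frac2) * t1 + frac2 * t2
    w = (1 - frac2) * humidity_ratio(t1, rh1) + frac2 * humidity_ratio(t2, rh2)
    return t, relative_humidity(t, w)

# Moist 22 C outside air at 90% RH, blended with 30% hot (38 C), dry
# (20% RH) server return air: warmer, but noticeably less humid.
t, rh = mix(22.0, 0.90, 38.0, 0.20, 0.30)
print(f"mixed inlet: {t:.1f} C, {rh:.0%} RH")
```

The blended air in this example rises a few degrees but drops well below the 90 per cent humidity ceiling, which is exactly the trade Facebook was making.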

The upshot is that you don't have to put your data centers in arid or cool areas to make outside-air cooling work – so long as you are happy to step outside the boundaries of the ASHRAE guidelines, as Facebook did:

If you can read this ASHRAE chart and ignore it, Facebook wants to hire you


Another interesting thing is that the Forest City data center was projected to have a power usage effectiveness (PUE) rating of between 1.06 and 1.08. PUE is the ratio of the total power coming into the data center to the power consumed by the server, storage, and networking gear. A lot of old data centers have a PUE of around 2 or worse, while the best ones from Google, Facebook, and Yahoo! are around 1.10, and sometimes a little lower.
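The PUE arithmetic is simple enough to show directly. The kilowatt figures here are made-up round numbers for illustration, not Facebook's actual load:

```python
# PUE as defined above: total facility power divided by IT equipment power.
# A PUE of 1.0 would mean zero power spent on cooling and other overhead.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness ratio."""
    return total_facility_kw / it_equipment_kw

print(pue(1070.0, 1000.0))  # 1.07 - every watt of IT load costs 0.07W overhead
print(pue(2000.0, 1000.0))  # 2.0  - a typical legacy data center
```

At Forest City's 1.07, only about 7 per cent of the facility's power draw goes to everything other than the IT gear itself.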

Despite the harsher weather conditions, Lee says the Forest City data center has attained a PUE of 1.07, smack dab in the projected range, for the summer. And it even bested the Prineville data center, which had a PUE of 1.09 over the same time. ®
