Facebook beats the heat in North Carolina data center
It's not the heat or the humidity
The techies at Facebook may like to "move fast and break things," as company founder Mark Zuckerberg admonished them to do three years ago, before the social media juggernaut went public, but one thing they don't want to break is a data center and all of the servers and storage running inside it.
But using outdoor air cooling in its Forest City, North Carolina data center, which started serving up applications using the company's custom and open sourced data center designs in April, was a tad risky.
Daniel Lee, a mechanical engineer at Facebook, posted some stats about the Forest City data center in a blog over at the Open Compute Project, where those open source server and data center designs live.
Among other things, the charts show that the weather has been kind so far to the data center. When the heat has been on, the humidity has been low enough that evaporative cooling of the outside air has been sufficient to keep Zuck's servers from melting and Wall Street from gnashing its teeth even more than it already has about Facebook's finances.
As Lee puts it, Facebook knew that it was taking some risks with trying to cool the data center with outside air in a region that, as far as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines were concerned, was out of bounds because of the humidity levels.
Facebook's first data center, in Prineville, Oregon, used outside air cooling as well, but Prineville is an arid environment with cool nights and winters and hot, dry summers. As long as the air is dry, you can cool it with water.
But back in the swampy air of the US East Coast, you can get both heat and humidity, and then all the air conditioning in the world doesn't seem to help. And if you are trying to do outside-air cooling for the 100,000 servers in the Forest City data center, when the humidity and the heat are both on the rise, you have a problem.
Lee says that Facebook hedged its bets and installed a direct expansion (DX) coil system – basically, an air conditioner like we have in our homes – in the Forest City data center, just in case it got too hot or too humid. Or both at the same time.
Because the weather is quite a bit different in Oregon and North Carolina, Lee says that the data center was tweaked to have a server inlet relative humidity of 90 per cent, considerably higher than the 65 per cent in Prineville, and a server inlet temperature of 85 degrees Fahrenheit instead of the 80 degrees out in Prineville.
And it worked.
The DX conditioners have not been turned on all year, despite the dry bulb temperatures outside of the Forest City data center being 100 degrees or higher several times in June and July.
In fact, says Lee, on July 1, when the outside temperature was 102 degrees (as measured by a dry bulb thermometer) during the afternoon, the relative humidity was only 26 per cent. In some cases, when the air was particularly moist, Facebook was lucky that temperatures were not that high, and it actually mixed hot, relatively dry exhaust air coming off the servers back into the server inlet to lower the overall humidity of the inlet air, which stayed cool enough to pass back over the servers.
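The decision logic described above can be sketched roughly as follows. This is a much-simplified illustration, not Facebook's actual control system, which works from full psychrometric conditions rather than two thresholds; only the set points (85 degrees Fahrenheit inlet temperature, 90 per cent relative humidity) come from the article, and the function name is hypothetical:

```python
# Forest City server inlet set points, per Lee's figures
MAX_INLET_TEMP_F = 85.0   # degrees Fahrenheit
MAX_INLET_RH = 90.0       # per cent relative humidity

def cooling_mode(outside_temp_f: float, outside_rh: float) -> str:
    """Pick a cooling strategy from outside dry bulb temp and humidity
    (a hypothetical simplification of the real controls)."""
    if outside_temp_f <= MAX_INLET_TEMP_F and outside_rh <= MAX_INLET_RH:
        return "outside air"  # free cooling, no water or DX needed
    if outside_rh <= MAX_INLET_RH:
        # Hot but dry: evaporative cooling brings the air down to set point
        return "evaporative"
    # Humid: mix hot, dry server exhaust back into the inlet stream to
    # lower relative humidity; DX coils are the last resort
    return "mix return air (DX as last resort)"

# The July 1 afternoon from the article: 102F dry bulb, 26 per cent RH
print(cooling_mode(102, 26))  # evaporative
```

The point of the hedge is visible in the last branch: so far the DX coils have never been needed, because either the evaporative path or the return-air mix has kept inlet conditions inside the set points.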
The upshot is that you don't have to put your data centers in arid or cool areas to make outside-air cooling work – so long as you are happy to step outside the boundaries of the ASHRAE guidelines, as Facebook did.
Another interesting thing is that the Forest City data center was projected to have a power usage effectiveness (PUE) rating of between 1.06 and 1.08. PUE is the total power coming into the data center divided by the power consumed by the server, storage, and networking gear. A lot of old data centers have a PUE of around 2 or worse, while the best ones from Google, Facebook, and Yahoo! are around 1.10, and sometimes a little lower.
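The PUE arithmetic is simple enough to show directly. The load figures below are hypothetical, chosen only to illustrate how the reported 1.07 ratio falls out; the article does not give Forest City's actual power draw:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power.
    A PUE of 1.0 would mean every watt goes to the IT gear."""
    return total_facility_kw / it_load_kw

# Hypothetical example: a facility drawing 10,700 kW overall to feed a
# 10,000 kW IT load lands at the 1.07 Facebook reports for Forest City.
print(round(pue(10_700, 10_000), 2))  # 1.07

# An old-school data center burning as much on cooling and power
# distribution as on computing comes out at 2.0.
print(round(pue(20_000, 10_000), 2))  # 2.0
```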
Despite the harsher weather conditions, Lee says the Forest City data center has attained a PUE of 1.07, smack dab in the projected range, for the summer. And it even bested the Prineville data center, which had a PUE of 1.09 over the same time. ®