Facebook's first data center DRENCHED by ACTUAL CLOUD

Revealed: Cloud downed by ... cloud!

Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers.

Though Facebook has previously hinted at this via references to a "humidity event" within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in the summer of 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.

"I got a call, 'Jay, there's a cloud in the data center'," Parikh says. "'What do you mean, outside?'. 'No, inside'."

There was panic.

"It was raining in the datacenter," he explains.

The problem occurred because of the ambitious chiller-less air-conditioning system the data center used. Unlike traditional facilities, which rely on electricity-intensive, direct-expansion cooling units to maintain a low, steady temperature, consumer internet giants such as Google and Facebook have been on a tear building facilities that use outside air instead.

In Prineville's first summer of operation, a fault in the facility's building-management system caused hot, low-humidity air from the hot aisles to be endlessly recirculated through a water-based evaporative cooling system that sought to cool it down. Each pass through the evaporative system picked up more moisture, so by the time the air returned to the cold aisle it was so wet it condensed on the servers.

As Facebook rather dryly put it at the time:

This resulted in cold aisle supply temperature exceeding 80°F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and few were automatically shut down due to power supply unit failure.
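
To get a sense of how little headroom those numbers left, you can plug them into the standard Magnus dew-point approximation. This is a back-of-the-envelope sketch, not anything from Facebook's own tooling:

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Dew point via the Magnus approximation (a, b coefficients from Sonntag 1990)."""
    a, b = 17.62, 243.12  # valid for roughly 0-50 degC
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

temp_c = (80.0 - 32.0) * 5.0 / 9.0   # the reported 80 degF, in Celsius (~26.7)
td = dew_point_c(temp_c, 95.0)       # the reported 95% relative humidity
print(f"dew point: {td:.1f} degC / {td * 9 / 5 + 32:.1f} degF")
# dew point: 25.8 degC / 78.4 degF -- barely below the supply temperature
```

At the quoted conditions the air was less than 1°C from saturation, so anything even marginally cooler than the supply air, a power supply casing for instance, would start collecting water.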

Some servers broke entirely because their front-facing power supplies shorted out. For a few minutes, Parikh says, you could stand in Facebook's data center and hear the pop and fizzle of Facebook's ultra-lean servers obeying the ultra-uncompromising laws of physics.

Facebook learned from the mistake, and now designs its servers with a seal around the power supply, or, as Parikh calls it, "a rubber raincoat."

"This is one of those things. When you are 100 per cent aircooled it's awesome from an efficiency perspective, but the range you have to operate in is much, much wider," Parikh says.

The company also improved its building-management system to make sure that the error couldn't happen again. These days, Facebook's data centers are some of the most efficient bit barns in the entire cloud industry – they even sometimes beat Google's own facilities.
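
Facebook hasn't published the fixed control logic, but the general shape of such a safeguard is easy to sketch: an interlock that refuses to run the evaporative stage when the supply air is already near saturation. Everything below (the function names, the 3°C margin) is hypothetical illustration, not Facebook's actual BMS code:

```python
import math

def dew_point_c(temp_c, rh_percent):
    a, b = 17.62, 243.12                      # Magnus approximation, as above
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

def evaporative_stage_allowed(temp_c, rh_percent, margin_c=3.0):
    """Hypothetical interlock: only run the misting stage if the supply air
    is at least margin_c degC above its dew point; otherwise the BMS should
    pull in drier outside air rather than recirculating the hot aisle."""
    return temp_c - dew_point_c(temp_c, rh_percent) > margin_c

print(evaporative_stage_allowed(26.7, 95.0))  # False -- the 2011 conditions
print(evaporative_stage_allowed(30.0, 40.0))  # True -- hot, dry air is fine
```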

Since then, the giant hasn't been graced with any other clouds within its cloud. But we do wish it would happen again, just so someone could snap a picture. ®
