The chill in Mountain View
Google is currently operating a chiller-less data center in Belgium, and Microsoft is building one in Ireland. But these are a tad different. Microsoft is using Direct eXpansion (DX) cooling – similar to traditional air conditioning – while it seems that Google uses a software system that automatically shifts loads to other data centers when outside temperatures get too high. The system is called Spanner, and though Google has been coy about its use of Spanner, the company has publicly presented a paper on the platform.
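Google hasn't published the details of how Spanner decides when to shed load, but the basic idea – move work away from sites whose outside air is too warm for chiller-less cooling – is simple enough to sketch. The function below is purely illustrative: the temperature threshold, site names, and even-split policy are our assumptions, not anything Google has described.

```python
# Illustrative sketch of temperature-driven load shifting across data
# centers. This is NOT Google's actual Spanner logic; the threshold,
# site names, and even-split policy are assumptions for the example.

TEMP_LIMIT_C = 27.0  # assumed safe intake temperature for free-air cooling


def rebalance(loads, temps, limit=TEMP_LIMIT_C):
    """Shift load off any site whose outside temperature exceeds `limit`.

    loads: dict mapping site -> current load units
    temps: dict mapping site -> outside temperature in Celsius
    Returns a new dict of site -> load.
    """
    loads = dict(loads)
    hot = [s for s in loads if temps[s] > limit]
    cool = [s for s in loads if temps[s] <= limit]
    if not cool:
        return loads  # nowhere cooler to go; leave everything in place
    for site in hot:
        share = loads[site] / len(cool)  # spread shed load evenly
        loads[site] = 0
        for target in cool:
            loads[target] += share
    return loads
```

A toy run: with Brussels at 30°C and Dublin at 15°C, `rebalance({"brussels": 10, "dublin": 10}, {"brussels": 30, "dublin": 15})` empties the Belgian site and leaves Dublin carrying the full 20 units. A real system would of course weigh capacity, latency, and data placement, not just the thermometer.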
Facebook is building a second custom-built data center in western North Carolina, where local tax breaks have made the region a data-center hot spot housing several big names, including Google and Apple. The weather in North Carolina is less temperate, so the company may have to make changes to its cooling systems. And Heiliger told us that the company is already making changes to its hardware designs for use in the North Carolina facility.
There have been rumblings that Facebook would switch to ARM servers or other "massively multi-core" server designs, but Heiliger indicated to us that there are no definite plans to do so. But he did say that the company is always evaluating new designs.
The company is not using the sort of modular data center design popularized by Google and picked up by the likes of Microsoft. Google has long used such designs, and it has long built its own servers. The company did reveal some of its designs in the spring of 2009, but this was years after the fact – and these were apparently not its latest designs.
Heiliger's "Fight Club" line was surely aimed at Google. When we asked Heiliger about Facebook's decision to release its server and data-center designs, he equated the decision to open sourcing back-end software, an area in which Facebook is also putting Google to a certain amount of shame.
"We think the bigger value comes back to us over time," he told us, "just as it did with open source software. Many people will now be looking at our designs. This is a 1.0. We hope this will accelerate what everyone is doing." ®
I don't much like Facebook due to their cavalier attitude to privacy, but as a network programmer who frequently works on cloud services, it's infuriating that everyone has different server designs when for the most part they're all doing the same thing. I share their hope that this will help standardise things.
"all those microwaves"
Back in the late '90s we were flashing our 33.3k US Robotics modem racks up to 56.6k (woot! that was exciting at the time!), and to do so we had to run down to our colo at the Qwest facilities in Tucson. Security was light, and you got to walk past hundreds of modem racks separated by chain-link fencing to get to your cage, amongst all the other ISPs like AOL, Compuserve, etc.
You can't help but wonder how much damage one man with a supersoaker could do in that situation lol.
Open racks are easy to work with but damn! One aw-shit can really mess up your day/week/year/life. And aw-shits are as likely as death and taxes.
Off topic and meandering memories coming back... sitting in that colo with the lights out and watching thousands of people connecting and disconnecting to the racks, seeing your call failover work repopulating cards after flashing firmware... kinda profound, wondering wtf they are all doing at 3am. Good stuff.