
Google cools data center with bottom of Baltic Sea

Newspaper destruction metaphor goes chillerless


Google's new data center on the coast of Finland will be cooled entirely with sea water.

The web colossus tells Swedish magazine Computer Sweden this is the first Google data center – and, to its knowledge, the only data center in the world – chilled solely with water from the sea. As Data Center Knowledge points out, this will not only reduce the facility's load on local water utilities, but also allow it to operate without power-scarfing chillers, much like Google's new-age facility in Belgium.

In February of 2009, Google paid €40 million (roughly $52 million) for a 53-year-old paper mill in Kymenlaakso, Finland, and less than a month later, the company confirmed it would be transformed into a data center. Google said that the mill purchase included approximately 166 hectares (410 acres) of land and that it would spend roughly €200 million ($260 million) – including the mill purchase price – erecting one of its top-secret server and storage facilities, which now number nearly forty worldwide.

It was a perfect metaphor for the gradual destruction of the newspaper and magazine business. Global paper-maker Stora Enso shut down its Summa Mill early in 2008, citing a decrease in newsprint and magazine-paper production that led to "persistent losses in recent years and poor long-term profitability prospects." These days, the world gets its news and magazine stories through things like, yes, Google data centers.

According to an English translation of the Computer Sweden article and a summary from Royal Pingdom, Google's Finland data center will pull cold water from the floor of the Baltic Sea using pipes up to two meters in diameter and (refurbished) twenty-year-old pumps once used by the mill.

The facility will also store water in an old paper mill silo, but this water will only be used in case of a fire. It will not be used for cooling.

Last year, Google let slip that its data center in Saint-Ghislain, Belgium was built without chillers. It uses nothing but outside air to keep temperatures down, and if the outside air gets too hot, it shifts the data center's compute loads to other facilities with the apparent help of a proprietary platform known as Spanner.

According to a PowerPoint presentation delivered by the company last year, Spanner is a “storage and computation system that spans all our data centers [and that] automatically moves and adds replicas of data and computation based on constraints and usage patterns.” This includes constraints related to bandwidth, packet loss, power, resources, and “failure modes.”
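Google has not published Spanner's internals, but the constraint-driven placement it describes can be illustrated with a toy sketch: filter candidate data centers by hard constraints (capacity, failure state), then score the survivors against the soft constraints the presentation mentions. All names, fields, and weights below are hypothetical, for illustration only.

```python
# Toy sketch of constraint-based replica placement. This is NOT
# Spanner's actual algorithm (which is unpublished); it only
# illustrates the idea of placing replicas under constraints.

def place_replica(datacenters, required_capacity):
    """Pick the data center that satisfies hard constraints
    (free capacity, not failed) with the lowest composite cost."""
    candidates = [
        dc for dc in datacenters
        if dc["free_capacity"] >= required_capacity and not dc["failed"]
    ]
    if not candidates:
        raise RuntimeError("no data center satisfies the constraints")

    # Composite cost over soft constraints; the weights are made up.
    def cost(dc):
        return (0.5 * dc["packet_loss"]
                + 0.3 * dc["power_price"]
                + 0.2 * (1.0 - dc["free_capacity"]))

    return min(candidates, key=cost)

dcs = [
    {"name": "hamina", "free_capacity": 0.6, "failed": False,
     "packet_loss": 0.01, "power_price": 0.2},
    {"name": "st-ghislain", "free_capacity": 0.3, "failed": False,
     "packet_loss": 0.02, "power_price": 0.4},
]
print(place_replica(dcs, 0.25)["name"])  # the lower-cost site wins
```

A real system would of course solve this jointly across many replicas and locations rather than greedily per replica, which is presumably where the large-scale optimization Gill describes below comes in.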

Google senior manager of engineering and architecture Vijay Gill alluded to the technology during an appearance at the cloud-happy Structure 09 mini-conference in San Francisco earlier this year. “What we are building here...is warehouse-sized compute platforms,” Gill said, referring to Google's worldwide collection of data centers. “You have to have integration with everything right from the chillers down all the way to the CPU.

“Sometimes, there’s a temperature excursion, and you might want to do a quick load-shedding — a quick load-shedding to prevent a temperature excursion because, hey, you have a data center with no chillers. You want to move some load off. You want to cut some CPUs and some of the processes in RAM.”

This was apparently a reference to the Belgium data center, and he indicated the company could do this sort of thing automatically and near-instantly – i.e. without human intervention. “How do you manage the system and optimize it on a global level? That is the interesting part,” he said.

“What we’ve got here [with Google] is massive — like hundreds of thousands of variable linear programming problems that need to run in quasi-real-time. When the temperature starts to excurse in a data center, you don’t have the luxury of sitting around for half an hour… You have on the order of seconds.”
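Gill's description — shed load within seconds when a site's temperature excurses — amounts to a fast rebalancing decision. A minimal greedy sketch of the idea (not Google's actual solver, which he characterizes as large-scale linear programming; all site names and numbers here are invented):

```python
# Toy load-shedding: trim a hot site down to a safe load level and
# push the excess onto the coolest sites with spare capacity.
# Purely illustrative; Google's real system is proprietary.

def shed_load(sites, hot_site, threshold):
    """Move load off a site whose temperature has excursed,
    filling the coolest destination sites first."""
    excess = max(0.0, sites[hot_site]["load"] - threshold)
    sites[hot_site]["load"] -= excess

    # Prefer cooler destinations: sort by temperature, ascending.
    for name, s in sorted(sites.items(), key=lambda kv: kv[1]["temp"]):
        if name == hot_site or excess <= 0:
            continue
        room = s["capacity"] - s["load"]
        moved = min(room, excess)
        s["load"] += moved
        excess -= moved

    if excess > 0:
        raise RuntimeError("not enough spare capacity elsewhere")
    return sites

sites = {
    "belgium": {"load": 0.9, "capacity": 1.0, "temp": 31.0},
    "finland": {"load": 0.4, "capacity": 1.0, "temp": 12.0},
    "ireland": {"load": 0.7, "capacity": 1.0, "temp": 18.0},
}
shed_load(sites, "belgium", threshold=0.5)
# belgium is trimmed to the threshold; finland, the coolest
# site with room to spare, absorbs the shed load.
```

Solving this greedily per incident is seconds-fast but locally optimal; an LP formulation can instead rebalance all sites at once against all constraints, at the price of a much larger problem — hence Gill's "hundreds of thousands of variables."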

Presumably, the Finland data center will benefit from Spanner as well. That PowerPoint presentation indicates that Google plans on scaling Spanner to between one million and 10 million servers, encompassing 10 trillion (10^13) directories and a quintillion (10^18) bytes of storage. All this would be spread across “100s to 1000s” of locations around the globe.

The Finland data center will also make use of wind power, with at least some coming from a brand new 12 MW wind park adjacent to the facility. The wind park is owned by the local power company, but some of the land it sits on was donated by Google. Currently, it includes four 3 MW wind turbines with rotors that are 100 meters (328 feet) in diameter.

The data center is still under construction, and it's scheduled to go live next year. ®


