Some like it hot ... very hot: How to use heat to your advantage in your data center

We try to melt preconceptions about staying cool

Analysis Heat has traditionally been the sysadmin's enemy. We may have turned technology to our advantage and chipped away at heat's wasteful nature over the years, but our old foe has remained.

From turning data centers into walk-in fridges and separating hot and cold aisles, to cold aisle containment and positive pressure, we've tried everything – and along the way the standard operating temperature has crept up from 18°C to around 23°C.

As compute requirements grow, heat production increases – so what does the future hold?

Surprisingly, the future could be one where we ride rising heat levels and turn them back to the greater good.

After years of us keeping server inlet temperatures at Baltic levels, the hardware vendors have come out in force to say their kit is robust enough to stand higher inlet temperatures. We almost didn't believe it, but servers (and other hardware) will run comfortably at 22-23°C – a full 4-5°C higher than conventional wisdom has always taught us.

When you consider that rise in standard operating temperatures and plot the revised inlet temperature against the mercury outdoors, it suddenly adds an interesting wrinkle to planning your data center and its cooling setup.

In May 2015, for example, nowhere in the UK recorded a temperature higher than 23.4°C. That's a peak temperature less than half a degree above the temperature that we're chilling our cold aisles down to, and for reference, the mean temperature for the month was a measly 9.6°C.

That means all that power spent cranking the computer room air conditioning round the clock was a monumental waste, as you were making the aisle temperature barely lower than the air outside. Enter free-air cooling.
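
For the back-of-an-envelope crowd, the sums are easy to sketch. Here's a few lines of Python – the 1°C margin for heat picked up in fans and ductwork is our illustrative assumption, not vendor guidance – that count how much of a day outside air alone could hold a cold aisle at its setpoint:

def free_cooling_fraction(outdoor_temps_c, setpoint_c=23.0, margin_c=1.0):
    # Fraction of readings cool enough to supply the cold aisle directly.
    # margin_c leaves headroom for heat gained in fans and ductwork.
    temps = list(outdoor_temps_c)
    usable = sum(1 for t in temps if t <= setpoint_c - margin_c)
    return usable / len(temps)

# A toy day loosely modelled on that chilly UK May: mild afternoon, cold night.
may_day = [8, 7, 7, 6, 6, 7, 9, 11, 13, 15, 17, 18,
           19, 19, 18, 17, 15, 13, 12, 11, 10, 9, 9, 8]
print(f"{free_cooling_fraction(may_day):.0%} of hours on outside air alone")  # 100%

Feed that a real hourly series instead of our toy one and you'll see why the CRAC units barely earn their keep for most of a British year.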

This is not as simple as just throwing the data hall doors open to the elements. Let's face it, you'd be inviting inconsistencies, moisture, and all sorts of particulate nasties into your lovely clean environment. But if you've got heating, ventilation, and air-conditioning that supports it, you can open the external vents, divert the fans, and flood your cooling system with good old-fashioned British temperatures.

Then your system will strip out the nasty particulates, balance the humidity and pressure, and cycle the air into your data hall. Sometimes it's cold enough outside that you actually need to pump hot air into the system to balance the inbound temperature. A clever system will modulate the free-air-cooling vents – opening and closing them at just the right moment, like the keys on a saxophone – to ensure the minimum of electrical effort to maintain temperature and pressure.
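
To make the saxophone analogy concrete, here's a minimal sketch of that damper decision in Python – the function name and the 5°C mixing threshold are our illustrative assumptions, not any building-management vendor's actual API:

def damper_action(outdoor_c, setpoint_c=23.0, mix_below_c=5.0):
    # One control tick: decide what the HVAC does with the outside air.
    if outdoor_c > setpoint_c:
        # Warmer outside than the cold aisle: shut the vents and fall
        # back to mechanical (CRAC) cooling.
        return "vents closed, mechanical cooling"
    if outdoor_c < mix_below_c:
        # So cold that raw outside air would overshoot: blend in
        # recirculated hot-aisle exhaust to warm the supply.
        return "vents open, mix in hot-aisle return"
    # The sweet spot: filter, condition, and supply outside air as-is.
    return "vents open, free cooling"

A real system would modulate damper angles and fan speeds continuously rather than flip between three states, but the decision boundaries are the same.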

If you were smart enough to site your data center somewhere nice and cold – like the edge of the Arctic Circle, as Facebook did, or, for everybody else, somewhere accessible like Scotland – then you'll barely need to use your mechanical cooling at all.

If you're really fancy, you might like water-cooling, cutting out the HVAC middleman altogether. It feels horrifically counter-intuitive to be voluntarily introducing fluids of any kind into a data hall, but most of the major tin manufacturers offer water-cooling for heat-intensive blade systems, so it can't be all bad.

IBM was the first major hardware vendor to embrace water-cooling decades ago, and is still at the forefront of the technology today. It has another weapon in its arsenal though: the hot-water-cooled supercomputer.
