Take the heat from data centres’ PUE pitch

It’s hard to match new data centres' efficiency, but you can improve your own

Data centre openings have become a dime a dozen of late, nearly always featuring (here in Australia at least) a suit from the operator talking up the new facility’s power usage effectiveness (PUE) rating as a compelling reason to move your kit within its walls.

PUE takes the total amount of energy that goes into the building and divides it by the amount consumed by actual working kit – servers, SANs and other computing devices. If one incoming kilowatt leaves 0.8kW to power kit, PUE is 1.25. The lower the PUE the better, as a low number means the data centre’s energy overheads are low and the prices you’ll be charged to lodge kit within its walls should be commensurately lower.
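For the arithmetic-minded, the calculation is trivial. Here’s a minimal sketch in Python, using the illustrative one-kilowatt example above rather than any real facility’s figures:

    def pue(total_kw: float, it_kw: float) -> float:
        # PUE = total energy entering the facility / energy consumed by IT kit
        return total_kw / it_kw

    print(pue(1.0, 0.8))  # 1.25, matching the example above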

The best greenfield data centres can achieve PUEs of around 1.2, and when they do their operators aren’t shy about letting us know. The recently opened HP Aurora and Macquarie Telecom data centres, for example, both made their 1.3 PUE scores a centrepiece of their launch events, and mentioned the rating as a reason to abandon on-premises kit and instead move it into their hallowed data halls.

That’s just the kind of argument one’s CEO is likely to read in an airline magazine, before asking some not-especially-well-informed questions about Reg readers' data centres or server rooms and what they cost to operate.

Leaving aside the many reasons it might be impractical to send all your kit outside your office, the first point to make when fielding questions about PUE is that new data centres have the benefit of being entirely new. That confers some advantages, given that newer technology and designs nearly always improve on their predecessors.

A second point is that just because a data centre offers a low PUE doesn’t mean it will be cheaper to run your computers within its walls.

“PUE is just looking at power in and power out, but does not look at the efficiency of computing,” says Per Grandjean-Thomsen, Engineering Manager, UPS, at Emerson Network Power. “A 1kW server with 1,000 IOPS does not compare to a 1kW server with 100 IOPS.”

That makes computers that crunch more data with the same, or less, electricity consumption a fine way to reduce the cost of computing on your premises. Or in a third-party facility.
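To put Grandjean-Thomsen’s point into a few lines of Python: the metric that matters is work per watt, not watts alone. The IOPS and wattage figures below are illustrative assumptions, not vendor specifications:

    # Compare servers on useful work per watt rather than raw draw.
    def iops_per_watt(iops: float, watts: float) -> float:
        return iops / watts

    old_box = iops_per_watt(100, 1000)    # 0.1 IOPS per watt
    new_box = iops_per_watt(1000, 1000)   # 1.0 IOPS per watt
    print(f"New kit does {new_box / old_box:.0f}x the work per watt")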

“It all starts with new IT equipment,” says Michael Mallia, Eaton Industries marketing manager for power quality. “The latest hardware is always more efficient.”

If the boss wants lower electricity bills, you can therefore start to think about new silicon. Virtualising those shiny new servers is another piece of low-hanging fruit, given the likelihood it will allow you to run fewer physical machines.
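The consolidation sums are easy to sketch. The 10:1 consolidation ratio and 400W average draw below are assumptions for illustration only, not benchmarks:

    import math

    physical_servers = 50
    consolidation_ratio = 10      # VMs each new host can absorb (assumed)
    watts_per_server = 400        # average draw per old box (assumed)

    hosts_after = math.ceil(physical_servers / consolidation_ratio)
    kw_saved = (physical_servers - hosts_after) * watts_per_server / 1000
    print(f"{hosts_after} hosts instead of {physical_servers}; "
          f"~{kw_saved:.1f}kW shaved off the bill, before cooling savings")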

A look at your UPS is also worthwhile: if yours is set up to provide more power than you really need, it will be consuming more electricity than necessary. A smaller, or newer and more efficient, model might just pay for itself, after careful consideration of the many factors contributing to your need for uptime.
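The reason an oversized UPS wastes money is that double-conversion units tend to be least efficient when lightly loaded. The curve below is a toy assumption to show the shape of the problem, not any vendor’s specification:

    # Illustrative only: UPS efficiency typically sags at low load fractions.
    def ups_efficiency(load_fraction: float) -> float:
        return 0.94 - 0.10 * (1 - load_fraction) ** 2

    lightly_loaded = ups_efficiency(0.2)   # big UPS carrying a small load
    well_matched = ups_efficiency(0.8)     # UPS sized closer to demand
    print(f"{lightly_loaded:.1%} vs {well_matched:.1%} efficient")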

The next thing to do to make your server room less thirsty for electrons is tidy it up.

“I've seen nests of cables that block exhaust heat,” says Grandjean-Thomsen. “That means the server temperature goes up, so the fans have to work harder.” Harder-working fans mean more heat. And more heat means your air conditioning needs to work harder, which means more electricity consumption.

Tidying up cabling and other impediments to airflow can therefore help servers to run cooler.
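There’s a reason those cable nests matter so much: the standard fan affinity relationship has fan power rising roughly with the cube of fan speed, so a modest obstruction gets expensive quickly. A rough sketch, with made-up wattages:

    # Fan affinity law: power scales roughly with the cube of fan speed.
    def fan_power(base_watts: float, speed_ratio: float) -> float:
        return base_watts * speed_ratio ** 3

    print(fan_power(20, 1.0))   # 20W with clear airflow
    print(fan_power(20, 1.3))   # ~44W once fans spin 30% faster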

Looking at absences can also help, as there's no point in letting air cooled at considerable expense waft into spaces where it won't be useful. “Too often I walk around data centres and racks are not fully populated, with empty spaces not covered by blanking panels,” says Schneider IT's vice president for APAC, Paul Tyrer. “Plugging those gaps with blanking panels - cheap plastic strips that go across the front of a rack - stops cold air going into those spaces.”

Eaton's Mallia recommends a similar tactic, namely wedging pieces of foam into gaps between racks to stop cold air going to waste in those spaces.

Another useful tactic for reducing electricity consumption is running the data centre at a higher temperature. That may sound like madness, but Mallia says many data centres and server rooms are set to run at temperatures below those at which the equipment they contain will happily operate. “Making small adjustments of even one or two degrees to your desired temperature takes huge load away from cooling systems,” he says.
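A commonly quoted rule of thumb puts the saving at roughly four per cent of cooling energy per degree Celsius the setpoint rises. Treat that figure, and the baseline below, as assumptions rather than measurements:

    SAVINGS_PER_DEGREE_C = 0.04  # rule-of-thumb assumption, not a measurement

    def cooling_savings_kwh(annual_cooling_kwh: float, degrees_raised: float) -> float:
        return annual_cooling_kwh * SAVINGS_PER_DEGREE_C * degrees_raised

    print(cooling_savings_kwh(100_000, 2))  # ~8,000kWh a year back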
