Sun goes eco-friendly with data center compression
Finally, value from StorageTek
By moving to energy-efficient servers and using flywheel UPS systems (which have no batteries), Sun has cut electricity use by one million kilowatt-hours per month. That's enough juice to power around 1,000 homes in Colorado, and it shaves about $1m a year off Sun's electricity bills. It also works out to a reduction of about 11,000 metric tons of carbon dioxide, which is roughly six per cent of Sun's global footprint.
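As a back-of-the-envelope check, those figures hang together. The per-kWh rates below are implied by the numbers Sun reported, not quoted by the company:

```python
# Sanity check on Sun's reported savings.
# All derived rates are implied by the article's figures, not quoted by Sun.

kwh_saved_per_month = 1_000_000
kwh_saved_per_year = kwh_saved_per_month * 12        # 12 million kWh/year

dollars_saved_per_year = 1_000_000
implied_price_per_kwh = dollars_saved_per_year / kwh_saved_per_year
# ~ $0.083/kWh, a plausible commercial electricity rate

homes_powered = 1_000
implied_kwh_per_home_per_year = kwh_saved_per_year / homes_powered
# ~ 12,000 kWh per home per year

co2_tons_avoided = 11_000
implied_kg_co2_per_kwh = co2_tons_avoided * 1_000 / kwh_saved_per_year
# ~ 0.92 kg CO2 per kWh, consistent with a coal-heavy grid like Colorado's

print(round(implied_price_per_kwh, 3),
      round(implied_kwh_per_home_per_year),
      round(implied_kg_co2_per_kwh, 2))
# prints: 0.083 12000 0.92
```

In other words, every implied rate lands in a believable range, so the claims are at least internally consistent.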
The Broomfield site is more eco-friendly than the Louisville data center in other ways. For one, the flywheel UPS systems do away with the lead-acid batteries of a conventional UPS. For another, the new data center's water treatment system for the chillers (which ultimately move the heat outside the data center walls) does not rely on chemical treatments, and the chilling operations use 675,000 fewer gallons of water a year.
The Broomfield site is Sun's first "day lit" data center, which took employees a little while to get used to, though apparently they like the sunlight. It is also, according to Sun, the largest data center in the world using Liebert's XD chilling system, which allows up to 30 kilowatts of power draw and heat dissipation inside a single rack.
The XD system attaches heat absorbers (radiators running in reverse) to a server or storage rack and removes heat by pumping R134a, an automotive refrigerant, from the gear to a chiller in the data center. This refrigerant is not something you'd want to drink (unless you like bad wine), and it is about five times as efficient as water at moving heat out of the system.
Sun also has a major data center in Santa Clara, California, which it showed off in August 2007 after a consolidation and upgrade project that compressed the facility from 254,000 square feet down to 127,000 square feet.
Sun cut the server count from 2,177 units to 1,240, and storage arrays dropped from 738 to 225. All of this gear was squeezed into 65 racks, replacing the 550 racks it previously occupied. The result: about $1.1m in annual power and cooling savings, while the aggregate performance of Sun's server infrastructure rose by 456 per cent.
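The compression ratios implied by those numbers are worth spelling out. A quick sketch, using only the figures above:

```python
# Consolidation ratios implied by the Santa Clara figures in the article.

def pct_reduction(before, after):
    """Percentage reduction going from before to after."""
    return round(100 * (before - after) / before, 1)

floor_space = pct_reduction(254_000, 127_000)  # square feet
servers     = pct_reduction(2_177, 1_240)
arrays      = pct_reduction(738, 225)
racks       = pct_reduction(550, 65)

print(f"floor space: -{floor_space}%")  # -50.0%
print(f"servers:     -{servers}%")      # -43.0%
print(f"arrays:      -{arrays}%")       # -69.5%
print(f"racks:       -{racks}%")        # -88.2%
```

The rack count fell much faster than the server count, which is the point of the exercise: denser gear, fewer boxes, and far fewer racks to power and cool.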
When Sun caught the green bug a few years back, it had over 1.3 million square feet of data center space scattered around the globe, including more than 1,500 data closets and labs as well as plenty of big data centers. Monroe and his colleagues have been consolidating and then compressing all of this computing to drive out costs. To date, Sun has cut its total data center square footage by about 60 per cent, and in its Silicon Valley data centers it has cut operating expenses by 30 per cent.
Having done the Broomfield project, Sun is, of course, selling what it has learned as data center strategy, design, and build-out services. These services cover new spaces built on slab floors, as Sun did in Broomfield, and are distinct from the eco services suite Sun launched in August 2007, when the Santa Clara data center was rolled out and Sun was focused strictly on assessing, optimizing, and virtualizing existing data centers.
One last thought: notice how Sun didn't use containerized data centers, even when it had the chance? I'm just saying... ®