Whatever happened to Green IT?
When low energy levels are a good thing
Call it green computing or sustainable IT: ten years ago it was all the rage. The IT press was filled with articles about it. Today, it’s hard to find a headline that mentions it. What happened?
Green IT gained real traction in 2007-2008, as companies vied for position as the most sustainable on the block, often with massive projects. IBM invested $1bn in Project Big Green, consolidating data centres and axing thousands of servers. In 2006, BMW started a green IT programme that saw it take gas from a waste dump to generate data centre electricity. Other more outlandish projects envisioned data centres powered with wood chips.
The green agenda also took hold down at the desktop level. Thin client vendors jumped on the bandwagon at the time, in what was a relatively new move for them. They switched their terminology to green computing, moving away from ease of management and lower cost. The desktop argument hasn’t gone away, even if not many are talking about it.
According to IDC, thin clients grow at around one per cent annually on average, although the last two years have been a rollercoaster ride, with sales jumping 7.9 per cent in 2014 and then falling 6.7 per cent in 2015. The analyst firm puts the success of thin clients a year ago down to the end of XP support rather than a sudden sustainability push on the part of CIOs. The power reduction is a welcome side benefit, though.
By turning existing computers into thin client devices, researchers at the Fraunhofer Institute for Environmental, Safety and Energy Technology (UMSICHT) in Germany found that businesses can cut their overall desktop carbon footprint by up to 47 per cent over three years.
Incidentally, the greatest saving comes from extending the life of the computer by converting it to a thin client device, the Institute found, as the manufacturing of the computer was responsible for the bulk of CO2e emissions.
A cost-driven decision
The call for green IT always revolved around cost and practicality, rather than corporate responsibility. In 2005, when the greening first took hold, the likes of Intel began making dual-core CPUs that used the extra cores to run tasks in parallel without drawing more power.
This move was partly due to the difficulty of pushing more power through smaller integrated circuits, but was driven equally, if not more, by cost. At Intel’s press conference to launch those chips, Google fellow Urs Hölzle said that Google was paying the equivalent of half the capital cost of its servers again in energy. Dual core would make PCs and servers more efficient.
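Hölzle’s point is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, using purely illustrative assumptions (a $3,000 server drawing 300W, a data centre PUE of 1.8, power at $0.10/kWh, a three-year service life) rather than Google’s actual figures:

```python
HOURS_PER_YEAR = 8760

def lifetime_energy_cost(watts, pue, price_per_kwh, years):
    """Total electricity cost for one server over its service life.

    PUE (power usage effectiveness) scales the server's own draw up
    to account for cooling and distribution overhead in the facility.
    """
    kwh = (watts / 1000) * pue * HOURS_PER_YEAR * years
    return kwh * price_per_kwh

capex = 3000.0  # assumed purchase price of the server
energy = lifetime_energy_cost(watts=300, pue=1.8, price_per_kwh=0.10, years=3)
print(f"Energy cost: ${energy:,.0f} ({energy / capex:.0%} of capex)")
```

Under those assumptions the lifetime electricity bill lands at roughly $1,400, close to half the purchase price, which is the ballpark Hölzle was describing.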
Then, there was the cost of real estate. The economy hadn’t crashed yet. Land was still costly. The pressure was on to squeeze as much computing capacity into a data centre as possible in a small space, without reaching the power capacity of the building.
These days, the fervour has died away. Green IT headlines rarely feature in the press. Google Trends references have sputtered, and you’ll find few El Reg stories on the topic.
Some initiatives around green computing have waned. CompTIA, for example, killed off its Green IT certification in 2013, just three years after introducing it. Its justification was that green IT is just a part of how companies do business these days, rather than a separate discipline. But IT managers need to play catch-up here, as many seem unable to measure, or plan for, sustainability.
In a recent survey of 150 facilities managers in the UK, France and Germany by the Green Grid, 43 per cent said that they don’t have a data centre energy efficiency strategy. Almost a decade after the initial big greening, seven in ten people who run data centres couldn’t ‘entirely quantify’ the environmental effects of their facility. Green computing doesn’t seem exactly baked into their DNA, then.
What stopped ‘green computing’?
Several of the trends that threw green IT into the spotlight have since reversed, which likely contributed to its marketing demise. The obvious one is cost. The main interest in energy efficiency for a resource-constrained CIO is the saving on the bottom line, but concerns over the proportion of computing cost spent on energy aren’t as urgent now. These days, cutting energy usage might be a nice-to-have, but it isn’t the low-hanging fruit it used to be, because energy is to be had on the cheap - for now.
In 2007-8, when the marketing noise around green IT was at its loudest, oil prices were also at their highest, and this fuelled concern about energy costs. Real-dollar energy costs ticked upwards significantly between 2005 and 2008, after a period of protracted decline. In 2009, the economy crashed, and oil did too.
Seven years later, it’s still low, and so are energy prices in real dollars. US wholesale energy prices fell during 2015, according to the EIA, and they have been falling consistently in the UK too, for several quarters. Thanks to fracking, gas prices have tanked since their 2008 peak, and oil prices, which never recovered from that peak either, are now at their lowest level in decades.
In any case, companies have more options for saving money than they did ten years ago. We’ve seen the rise of hyperscale cloud providers who focus extensively on energy efficiency because they consume power on such a massive scale. Saving a watt or two per node generates massive savings for that crowd. Smaller data centres find it harder to realise those economies of scale, so the incentives for them are lower.
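The per-node arithmetic makes the scale effect concrete. A quick sketch, where the fleet size and electricity price are illustrative assumptions rather than any particular provider’s numbers:

```python
HOURS_PER_YEAR = 8760

def annual_saving(watts_saved, nodes, price_per_kwh):
    """Annual electricity saving from shaving a few watts per server,
    summed across an entire fleet."""
    kwh = (watts_saved / 1000) * nodes * HOURS_PER_YEAR
    return kwh * price_per_kwh

# One watt saved across a hypothetical million-server fleet at $0.10/kWh:
print(f"${annual_saving(1, 1_000_000, 0.10):,.0f} per year")

# The same watt across a 500-server shop barely registers:
print(f"${annual_saving(1, 500, 0.10):,.2f} per year")
```

The same engineering effort that saves a hyperscaler close to a million dollars a year saves a small operator a few hundred, which is why the efficiency obsession concentrates at the top.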
Another driver of the green IT agenda was regulatory pressure, but that too has receded in some parts of the world. While there have been some regional greenhouse gas emissions trading initiatives in the US, the federal efforts promised by the Obama administration in 2008 failed to materialise: the American Clean Energy and Security Act failed to make it through Congress in 2009.
Why it isn’t coming back
There are still some drivers that could push green IT. Obama’s Clean Power Plan has put emissions reduction back on the table. In the UK, the Carbon Reduction Commitment (CRC) still puts pressure on data centre operations.
The impetus is still there, because the overall energy draw from data centres is increasing. In 2013, data centres in the US chewed up 91bn kWh of electricity, according to the Natural Resources Defense Council, which said this amounts to the output of around 34 large (500 megawatt) coal-fired power plants. That figure will rise to around 140bn kWh (50 plants) by 2020, it said.
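The NRDC’s plant-equivalence figure can be roughly reproduced from the 91bn kWh number. The sketch below assumes a ~61 per cent capacity factor for a coal plant; the NRDC’s own methodology may differ:

```python
HOURS_PER_YEAR = 8760

def plant_equivalents(annual_kwh, plant_mw=500, capacity_factor=0.61):
    """Number of coal plants whose yearly output matches annual_kwh.

    A plant's yearly output is nameplate capacity scaled by the
    fraction of the year it actually runs (the capacity factor).
    """
    per_plant_kwh = plant_mw * 1000 * HOURS_PER_YEAR * capacity_factor
    return annual_kwh / per_plant_kwh

print(f"{plant_equivalents(91e9):.0f} plants")  # prints 34
```

With those assumptions, 91bn kWh works out to 34 plants, matching the NRDC figure; the 2020 projection of 50 plants for 140bn kWh implies slightly different rounding or assumptions on its part.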
So why isn’t the green IT rhetoric coming back? We could posit all kinds of reasons, but the simplest explanation is that it was only ever a marketing term. Large companies with high levels of maturity may have jumped onto the bandwagon when it first emerged, and vendors love something new to sell.
But most companies in the SMB space, which is where most of the economy happens, simply have better things to do. Where green IT does exist, it isn’t referenced explicitly. It’s just viewed as what it always really was: a general wish for cost efficiency, indexed to the economic and political pressures of the time. Ten years on, we’re just being more honest about it.
Some regulations are already making their way into California, where the data centre industry is huge and where new facilities must use outside air or water economisers and air flow management. Over in the UK, there are no mandates on how data centres should handle their cooling, but the CRC puts pressure on operators to reduce their carbon output.
The biggest sustainable computing move may not be happening in end-users’ data centres at all. Instead, some are seeing a move to the cloud as the simplest way to slash short-term costs and cut through the whole Gordian knot. Being able to play the sustainability card would be a welcome side-benefit.
Pick your statistic; most suggest that the move to third-party cloud providers is happening, to some degree. McKinsey in 2014, for example, forecast a drop in infrastructure spending from 27% to 19% over the coming three years, attributing it to a planned increase in cloud spending.
IDG’s 2015 enterprise cloud computing survey found an increase in adoption at various levels. 72% of its 962 respondents had at least one application, or a portion of their computing infrastructure, in a cloud-based service that year, compared with 57% in 2012.
Sustainable computing wasn’t anywhere in sight in the list of reasons for adoption, though. The closest was the reduction of resource waste, an objective shared by two thirds of participants. Lower TCO was still the primary driver.
Green IT isn’t a discrete item on the agenda for many companies these days. If anything, it may be a guiding principle, with efficiency informing policy decisions going forward. If the economy continues to nose downward, the emphasis for many IT shops this year will be on saving money quickly and painlessly. If that happens to keep environmentally minded executives happy at the same time, so much the better. ®