World Meteorological Organization says climate data is uncool

Weather wonks call for more frequent collation of climate baselines

The World Meteorological Organization's Commission for Climatology has called for governments to refresh their “climate normals” more often.

Climate normals are thirty-year chunks of weather observations that are used as baselines for comparison with more recent events. The World Meteorological Organization (WMO) says the data most commonly used today covers 1961 to 1990.

The WMO says that data isn't optimal because “rising atmospheric concentrations of greenhouse gases are changing the Earth’s climate much faster than before.”

The statement doesn't say where those greenhouse gases are coming from, but does say relying on old data means “decision-makers in climate-sensitive industries may be basing important decisions on information that may be out of date.”

The organization's preferred method to create better data is twofold. For starters, it wants all nations to quickly adopt a climate normal spanning 1981 to 2010, which some countries have already decided to do of their own accord. The second initiative it recommends is updating climate normals every ten years, “so that the 30-year climate normal to be used in the 2020s would be 1991-2020.”
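The decadal-update rule described above is mechanical enough to sketch in a few lines. Here is a minimal Python illustration (the function name and shape are my own, not a WMO artifact) that maps any year to the 30-year normal that would be in effect under the proposal, so that the 2020s resolve to 1991-2020 and the present decade to 1981-2010:

```python
def normal_period(year):
    """Return the (start, end) years of the 30-year climatological
    standard normal in effect during `year`, assuming normals are
    refreshed every ten years as the WMO proposes.

    Example: any year in the 2020s maps to (1991, 2020).
    """
    end = (year // 10) * 10   # most recent completed decade boundary
    return (end - 29, end)    # 30-year window ending at that boundary
```

For instance, `normal_period(2015)` gives `(1981, 2010)`, the baseline the WMO wants adopted now, and `normal_period(2025)` gives `(1991, 2020)`, matching the regulation's worked example for the 2020s.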

“This approach would satisfy modern needs for current information and standardize weather and climate information and forecasts around the world,” the organization says, adding that updating climate normals more often will also make it easier to put modern technology to work on analysis of climate data. More data, more often, also offers the chance to tap the increasing amounts of data gathered by the planet's proliferating weather monitoring stations.

The proposal won't junk 1961-1990 data, which the WMO advocates retaining “as the base period for monitoring and assessing long-term climate variability and change … over the course of this century and beyond.”

The ideas outlined above were discussed over the last week at a meeting of the Commission for Climatology and condensed into a technical regulation titled “Calculating Climatological Standard Normals Every 10 Years”. That regulation will be debated at the World Meteorological Congress, the WMO's governing body, which meets for three weeks from 25 May 2015.

Maybe the WMO can figure out where the greenhouse gases are coming from before then? ®
