Mystic Met Office predicts neighbourhood Thermageddon

Modelling 'totally inadequate' last year - why trust it now?

On Thursday, the Met Office launched its new report on global warming: UK Climate Projections 2009, otherwise known as UKCP09. This is based on the output of Hadley Centre climate models that predict temperature increases of up to 6°C, with wetter winters, drier summers, more heatwaves, rising sea levels, more floods and all the other catastrophes that one would expect from similar exercises in alarmism.

What makes this report different from any of its predecessors is the resolution of the predictions that the Met Office is making. They are not just presenting a general impression of what might happen globally during this century, or even of how climate change could affect the UK as a whole. They are claiming that they can predict what will happen in individual regions of the country - down to a 25km square. You can enter your postcode and find out how your street will be affected by global warming in 2040 or 2080.

All this is rather unexpected. In May last year, I posted here and here about a world summit of climate modellers that took place at Reading University. On the agenda was one very important problem for them: even the most powerful supercomputers developed so far are not capable of running the kind of high-resolution models that they claim would allow them to reduce the uncertainty in their predictions, and also to make the detailed regional predictions that policymakers would like to have so that they can build climate change into infrastructure planning.

Here are a couple of excerpts from the conference website:

The climate modelling community is therefore faced with a major new challenge: Is the current generation of climate models adequate to provide societies with accurate and reliable predictions of regional climate change, including the statistics of extreme events and high impact weather, which are required for global and local adaptation strategies? It is in this context that the World Climate Research Program (WCRP) and the World Weather Research Programme (WWRP) asked the WCRP Modelling Panel (WMP) and a small group of scientists to review the current state of modelling, and to suggest a strategy for seamless prediction of weather and climate from days to centuries for the benefit of and value to society.

A major conclusion of the group was that regional projections from the current generation of climate models were sufficiently uncertain to compromise this goal of providing society with reliable predictions of regional climate change.

Modellers also fretted that the GCMs, or General Circulation Models, were blunt instruments.

Current generation climate models have serious limitations in simulating regional features, for example, rainfall, mid-latitude storms, organized tropical convection, ocean mixing, and ecosystem dynamics. What is the scientific strategy to improve the fidelity of climate models?

This was summed up by Julia Slingo (at that time Professor of Meteorology at Reading University, and chair of part of the conference) in a report by Roger Harrabin on the BBC News website:

So far modellers have failed to narrow the total bands of uncertainties since the first report of the Intergovernmental Panel on Climate Change (IPCC) in 1990.

And Julia Slingo from Reading University admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

Doom Your Way

Professor Slingo said several hundred million pounds of investment were needed.

“In terms of re-building something like the Thames Barrier, that would cost billions; it’s a small fraction of that.

“And it would allow us to tell the policymakers that they need to build the barrier in the next 30 years, or maybe that they don’t need to.”

If, since the conference, several hundred million pounds had been invested in producing a new generation of supercomputers a thousand times more powerful than the present one, and the Met Office had already developed and run the kind of high-resolution models that were so far beyond the scientists' grasp just a year ago, then I suspect this would have seeped into the media and we would have heard about it. So far as I am aware, the fastest supercomputers are still a thousand times slower than the modellers consider necessary for credible regional-scale modelling of the climate.
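
Incidentally, the "thousand times" figure is not plucked from the air. Here is a minimal back-of-envelope sketch - my own illustration, not the Met Office's arithmetic - assuming compute cost scales roughly with the cube of horizontal resolution (two horizontal dimensions plus the CFL-limited timestep), and taking an order-of-magnitude resolution gain as the target:

def compute_factor(resolution_gain, exponent=3):
    """Rough multiplier on compute cost for a given gain in horizontal
    resolution, assuming cost ~ resolution_gain ** exponent: doubling
    resolution quadruples the number of grid columns and, via the CFL
    stability condition, roughly halves the usable timestep."""
    return resolution_gain ** exponent

# Going from a ~250km global grid to the ~25km squares UKCP09 quotes
# is a tenfold resolution gain (the ~250km starting point is my assumption):
print(compute_factor(10))  # 1000 - the same order as the machines Slingo asked for

On that crude scaling, vertical refinement and added model complexity only push the factor higher.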

So I wondered whether Professor Slingo had anything to say about the Met Office’s new report.
