Global Warming IS REAL, argues sceptic mathematician - it just isn't THERMAGEDDON

IPCC hid the good news? Let's find out

Hot in here? Not so much...

Q. The biggest surprise to readers who haven't followed this closely is how little warming we'll see in this century. It isn't what you read in the papers - or hear from NGOs or activists. How much warmer is it going to be?

Lewis: If you take the IPCC AR5 figures, we have total forcing of 2.3 W/m2 [watts per square metre] anthropogenic and another 0.1 W/m2 natural - we haven't had much natural forcing - say 2.4 W/m2 in total. Of that, between 0.4 and 0.6 W/m2 is being absorbed by the ocean (Trenberth's reanalysis, which is not based on new observations). The most recent estimate is that under 0.5 W/m2 is being absorbed in the ocean. That would mean around a fifth of that 2.4 W/m2 is still going into the ocean, while the rest is cancelled by the higher radiation from the Earth because of its higher temperature.

So warming of 0.75˚C to 0.8˚C has nullified three-quarters of the forcing we've had. In equilibrium it would have to nullify the other quarter too. But that would take a long, long time.
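The energy-budget arithmetic above can be sketched in a few lines. This is a minimal sketch, not Lewis's actual calculation: the 3.7 W/m2 figure for a doubling of CO2 is an assumed canonical value not stated in the interview, while the other round numbers come from the answers above.

```python
# Illustrative energy-budget arithmetic using the figures quoted above.
F_TOTAL = 2.4    # W/m^2, total forcing (2.3 anthropogenic + ~0.1 natural)
Q_OCEAN = 0.5    # W/m^2, heat still being absorbed by the ocean
DT = 0.8         # degrees C, warming over pre-industrial levels
F_2X = 3.7       # W/m^2, forcing from a CO2 doubling (assumed canonical value)

radiated = F_TOTAL - Q_OCEAN          # forcing already cancelled by warming
frac_into_ocean = Q_OCEAN / F_TOTAL   # share still going into the ocean

# Energy-budget equilibrium sensitivity: scale the observed warming by
# the ratio of doubled-CO2 forcing to the forcing already balanced.
ecs = F_2X * DT / radiated

print(f"{frac_into_ocean:.0%} into ocean, {radiated / F_TOTAL:.0%} cancelled")
print(f"implied equilibrium sensitivity ~ {ecs:.1f} C")
```

Run with these inputs, the implied sensitivity comes out near the low end of published estimates, which is the thrust of Lewis's argument.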

Q. There's a contradiction here between the two methods. You and some others are using the energy conservation model, while the high numbers come from climate models (general circulation models, or GCMs). Why, when the "science is certain", are there such different outcomes?

Lewis: The basic reason they get such high figures - if you take the ones starting from pre-industrial levels - is that the models have a higher sensitivity going forward: they use an average TCR of 1.8-1.9˚C in practice, which is more than the 1.5˚C allowed for past forcing.
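Since transient warming scales roughly linearly with TCR, the gap between those two figures translates directly into higher projections. A rough sketch, taking the midpoint of the quoted model range:

```python
# How much extra transient warming the models' TCR implies, relative to
# the TCR consistent with past forcing. Illustrative arithmetic only.
TCR_MODELS = 1.85   # C, midpoint of the 1.8-1.9 range quoted above
TCR_OBSERVED = 1.5  # C, the value quoted as allowed for past forcing

# Transient warming scales linearly with TCR for a given forcing path,
# so the ratio of the two TCRs is the ratio of the projections.
excess = TCR_MODELS / TCR_OBSERVED - 1
print(f"models project ~{excess:.0%} more transient warming")
```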

Some models also have some non-linearity in them. Most are linear - the response goes up in a straight line with the forcing - but others show a worsening at high forcing levels. The main difference between models is the feedbacks.

Q. Why don't climate models reflect the basic "equation"?

Lewis: The models are not seeded from the equation. They supposedly obey basic physics in how they model thermodynamics and circulation. What determines ECS is the feedbacks in the model. Now, some of these are based on reasonable physics - like the water vapour feedback and the lapse rate feedback. But some are far from perfect. The resulting spread is in the 2:1 range.

When it comes to cloud feedbacks, they cannot do it from the basic physics at all. So the modellers "parameterise" the clouds. That doesn't come from basic physics, and the cloud feedback varies WILDLY between models. That is why some models have an ECS of 2.1˚C and others 4.6˚C.

Q. Clouds are pure guesswork?

Lewis: Yes.

Q. And obviously some clouds increase albedo, cooling the system, and some act as a kind of insulator and reduce "cooling"…

Lewis: Yes. Some of the emissions scenarios produced by models are pretty aggressive. RCP 8.5, for example, is in the 90th percentile. Cloud feedback varies from -0.4 W/m2 in CCSM4 (from the National Center for Atmospheric Research, NCAR) up to +1.2 W/m2 for IPSL-CM5A (from the Institut Pierre-Simon Laplace). That is a huge range; the way they try to analyse them is complicated.
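A sketch of how that cloud-feedback spread widens the ECS spread. Only the cloud figures (-0.4 and +1.2) come from the interview; the Planck response, the combined other feedbacks, and the 3.7 W/m2 doubling forcing are assumed textbook-style values used purely for illustration.

```python
# How the quoted cloud-feedback range translates into an ECS range.
# Feedbacks are in W/m^2 per degree K of warming.
F_2X = 3.7          # W/m^2 forcing per CO2 doubling (assumed canonical value)
LAM_PLANCK = 3.2    # W/m^2/K basic blackbody (Planck) response (assumed)
LAM_OTHER = 1.3     # W/m^2/K water vapour + lapse rate + albedo combined (assumed)

def ecs(lam_cloud):
    """Equilibrium sensitivity for a given cloud feedback strength."""
    net_damping = LAM_PLANCK - LAM_OTHER - lam_cloud
    return F_2X / net_damping

for name, lam in [("CCSM4-like", -0.4), ("IPSL-CM5A-like", 1.2)]:
    print(f"{name}: cloud feedback {lam:+.1f} -> ECS {ecs(lam):.1f} C")
```

With these assumed background values, swinging the cloud term alone from -0.4 to +1.2 roughly triples the implied sensitivity - the point Lewis is making about parameterisation.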

Q. You've spent a lifetime modelling - surely they're not completely useless?

Lewis: Models are extremely useful, but better at some things than others. They're pretty good at atmospheric circulation. But when it comes to ECS there's really no reason to think they're going to be accurate.

Q. So what is the new evidence that has changed the estimates? A little or a lot?

Lewis: It's because the estimate for the forcing from aerosols - which includes pollution and the particulates from volcanic eruptions - has been cut. This has two effects. Firstly, the TCR estimate is now lower. Secondly, there is less scope for natural internal variability to be responsible for the temperature increase that we have had.

Q. The "wildcard" is less 'wild'?

Lewis: The IPCC would argue - and I wouldn't dispute this - that we can be more certain that greenhouse gas is the most important forcing, the bulk of which is anthropogenic. But at the same time our best estimate of its effect should be lower.

Q. Can you elaborate on your approach? It appears to take slices of history, comparing time frames of the same length where the forcings were well known. Essentially a deductive approach?

Lewis: With a climate sensitivity of 2˚C, it means that we've had warming of 0.8˚C over pre-industrial levels and have 1.2˚C to go. [CO2 hasn't doubled over pre-industrial levels. In modern times it has varied with the glacial cycle between 180 and 280 parts per million, and is currently at around 400 parts per million. However, it has been much higher than the current level before; a mystery regulator prevents "runaway global warming".]
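The bracketed CO2 figures can be tied to the sensitivity number with standard arithmetic: forcing, and hence equilibrium warming, rises logarithmically with concentration. A sketch using only the values quoted above (the logarithmic relation itself is textbook physics, not from the interview):

```python
import math

# How far along a CO2 doubling we are, and what that implies at
# equilibrium for the 2 C sensitivity used above. CO2 alone - other
# forcings are ignored in this sketch.
ECS = 2.0              # degrees C per doubling of CO2, as quoted above
C_PREINDUSTRIAL = 280  # ppm
C_NOW = 400            # ppm

# Warming is logarithmic in concentration, so the fraction of a full
# doubling achieved so far is log2 of the concentration ratio.
doublings = math.log2(C_NOW / C_PREINDUSTRIAL)
eq_warming_from_co2 = ECS * doublings

print(f"{doublings:.0%} of a doubling so far")
print(f"~{eq_warming_from_co2:.1f} C at equilibrium from CO2 alone")
```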

