Two studies suggest for the first time a clear link between global warming and extreme precipitation.
There's a sound rule for reporting weather events that may be related to climate change. You can't say that a particular heatwave or a particular downpour – or even a particular freeze – was definitely caused by human emissions of greenhouse gases. But you can say whether these events are consistent with predictions, or that their likelihood rises or falls in a warming world.
Weather is a complex system. Long-running trends, natural fluctuations and random patterns are fed into the global weather machine, and it spews out a series of events. All these events will be influenced to some degree by global temperatures, but it's impossible to say with certainty that any of them would not have happened in the absence of man-made global warming.
But over time, as the data build up, we begin to see trends which suggest that rising temperatures are making a particular kind of weather more likely to occur. One such trend has now become clearer. Two new papers, published by Nature, should make us sit up, as they suggest for the first time a clear link between global warming and extreme precipitation (precipitation means water falling out of the sky in any form: rain, hail or snow).
Updated below with comments by Piers Corbyn. The full report by George Monbiot is at The Guardian.
This is a personal recording I uploaded to YouTube - the sound quality is poor. GR
The CO2 experiment presented at the famous Royal Institution in London is mentioned in this clip from the BBC Panorama production "What's Up With The Weather", broadcast on 28 June. It was used to show viewers that the "science is sound" on CO2 being a "Greenhouse Gas".
Now comes what I consider to be the "flaw" in that experiment: it was carried out by replacing a normal atmosphere with an atmosphere of 100% CO2.
The question is this: was that the correct way to show how CO2 changes the Earth's atmosphere?
Earth's atmosphere contains about 0.04% CO2, and the "human" contribution is about 4% of that. The simple fraction to use to work out the amount of "man-made" CO2 is therefore 4% of 0.04%,
thus: 4/100 × 4/10000 = 16/1,000,000.
This fraction cancels down to 1/62,500, so of the whole atmosphere you could say 62,499/62,500 is "normal" and 1/62,500 is "man-made" CO2.
We can take this a step further and scale the 0.04% CO2 up to a complete atmosphere, as in the experiment. To do that you would have to increase the level of CO2 by 2,500 times (10000/4 = 2500). On that exaggerated scale the "man-made" part corresponds to 1/(2,500 × 62,500) = 1/156,250,000, so an atmosphere of 100% CO2 would represent 156,249,999/156,250,000 "normal" and only 1/156,250,000 "man-made".
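To make the arithmetic above easy to check, here is a short Python sketch using exact fractions. The percentages are the figures quoted in the text, not independently verified:

```python
from fractions import Fraction

co2_share = Fraction(4, 10000)         # 0.04% of the atmosphere is CO2 (figure from the text)
human_share_of_co2 = Fraction(4, 100)  # ~4% of that CO2 attributed to humans (figure from the text)

man_made = co2_share * human_share_of_co2
print(man_made)               # 1/62500 of the whole atmosphere

scale_up = Fraction(10000, 4) # factor needed to take 0.04% CO2 up to 100%
print(scale_up)               # 2500

print(man_made / scale_up)    # 1/156250000, the "man-made" share on the experiment's scale
```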
How can that experiment demonstrate how CO2 changes the atmosphere? All it demonstrates is the result of having a 100% concentration of CO2, and there is no chance of that arising from "man-made" CO2.
If they had conducted the experiment with a normal atmosphere and added an extra 1/62,500 of CO2, there would be NO CHANGE! And yet we are investing billions in the "green" industry on the basis that an increase of "man-made" CO2 changes the atmosphere.
There is NO experiment that can show this change for such a small increase in CO2 concentration; it could only be shown through "mother nature".
The more CO2 in the atmosphere the bigger plants grow, the bigger they grow the more ground they cover, and the more ground they cover the cooler the ground becomes!
The amount of CO2 in the real atmosphere is so trivial that this so-called "experiment" is irrelevant even if it means something in itself. However, to deal totally with the nonsense we have to look at this profoundly dishonest experiment in detail.
In the experiment (at the end of the clip) it is noted that CO2 (or water vapour etc.) filling a tube "absorbs more heat" (infra-red radiation) passing through it than ordinary air does.
This 'experiment' is a non-equilibrium situation. Of course the CO2 absorbs infra-red. However, in doing so the gas itself warms and in turn emits more infra-red, until the whole tube of gas is the same temperature as the warm end.
THIS is the final situation whether you double, quadruple or halve the amount of CO2 in the tube. The end result is that the gas is the same temperature as the source. The final situation takes longer to reach if there is less CO2 or water vapour, but it is still reached.
The temperature at the source end of the tube does not go up.
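To make this equilibrium claim concrete, here is a deliberately crude Python sketch. It is a toy relaxation model of my own, not real radiative transfer: the tube is assumed to relax toward the source temperature at a rate proportional to the absorber concentration. Under that assumption every concentration reaches the same final temperature; the concentration only changes how quickly it gets there.

```python
import math

def tube_temperature(conc, t, t_source=50.0, t_start=20.0, k=1.0):
    """Toy model: the gas temperature relaxes exponentially toward the
    source temperature, at a rate proportional to absorber concentration.
    conc -- relative absorber concentration (e.g. 0.5, 1.0, 2.0)
    t    -- elapsed time (arbitrary units)
    """
    return t_source - (t_source - t_start) * math.exp(-k * conc * t)

for conc in (0.5, 1.0, 2.0, 4.0):
    # Every concentration approaches the same equilibrium (t_source);
    # a higher concentration just gets there sooner.
    print(f"conc={conc:4.1f}  T(t=2)={tube_temperature(conc, 2):6.2f}  "
          f"T(t=50)={tube_temperature(conc, 50):6.2f}")
```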
BY NOT WAITING FOR EQUILIBRIUM TO BE REACHED THE EXPERIMENT PROPAGATES THE LIE THAT CO2 ITSELF 'WARMS' THE ATMOSPHERE.
This 'experiment' does nothing to support the global warmers' CO2 theory.
NEVERTHELESS, is there ANY truth in the idea of CO2 having an overall effect in the real atmosphere?
OBSERVATIONALLY there is no effect in hundreds, thousands or millions of years of data. But could there be any effect under any argument? And if there could be, why isn't there?
A simple argument is to consider a uniform-temperature layer of air near the ground. It is not made warmer by increasing the CO2 within it, and the same goes for any number of such layers (each of uniform temperature): the CO2 in each will not change its temperature.
HOWEVER there is something else happening in the atmosphere. It is under gravity and is therefore cooler higher up, irrespective of any infra-red-absorbing properties of the constituents of the atmosphere. [This is the case because a molecule moving upwards slows down (cools) under gravity; or, in standard gas thermodynamics, a parcel of gas moving upwards expands where pressure is lower, and in doing so does work and cools.]
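The cooling-with-height described in the bracketed note is, in standard textbook terms, the dry adiabatic lapse rate, Γ = g/c_p. A quick Python check using standard physical values (not figures from the text):

```python
g = 9.81      # gravitational acceleration, m/s^2
cp = 1005.0   # specific heat of dry air at constant pressure, J/(kg*K)

lapse_rate = g / cp        # cooling per metre of ascent, K/m
print(lapse_rate * 1000)   # ~9.8 K per kilometre of altitude
```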
So we have to consider the atmosphere as a stack of layers of gas, each of uniform temperature UNAFFECTED by the CO2 within it. As you go up in the atmosphere the consecutive layers are cooler, but increasing or decreasing the CO2 in each layer has no effect.
End of story? Not quite. There seems to be a problem with what happens at the top of the atmosphere. Radiation leaves there into space, and the height of that 'emission' layer must be higher up if there is more CO2 or water vapour around to absorb above any given layer. That 'top layer' has to reach the standard temperature needed to radiate enough energy away into space. This means more CO2 or water vapour makes the whole atmosphere a small bit 'deeper' ('higher'), other things being equal. Therefore, the argument goes, if the reduction of temperature with height (the lapse rate) stays the same, the air near the top must get warmer (a 'hotspot' is predicted) and the surface must be warmer.
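For reference, the "standard temperature needed to radiate away enough energy" is usually obtained from the Stefan-Boltzmann law. The Python sketch below uses standard textbook values, then shows how this argument turns a raised emission height into surface warming via the lapse rate; the 100 m rise is an arbitrary illustrative figure of mine, not a number from the text.

```python
S = 1361.0        # solar constant, W/m^2
albedo = 0.3      # Earth's planetary albedo
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2*K^4)

# Effective emission temperature: the temperature the emission layer
# must have so that outgoing radiation balances absorbed sunlight.
T_eff = ((1 - albedo) * S / (4 * sigma)) ** 0.25
print(T_eff)  # ~255 K

# The argument in the text: if more CO2/water vapour lifts the emission
# layer by dz while the lapse rate stays fixed, the surface warms by
# lapse_rate * dz.
lapse_rate = 9.8e-3   # K/m, the dry adiabatic value from above
dz = 100.0            # hypothetical rise of the emission height, m
print(lapse_rate * dz)  # ~1 K of implied surface warming
```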
HOWEVER NEITHER EFFECT IS OBSERVED. In fact there is a COLDSPOT! There are various arguments around about this, but see the presentation in WeatherActionnews 27 - one of the red bold items in COMMENTS in the following link...