The amount of CO2 in the real atmosphere is so trivial that this so-called 'experiment' is irrelevant even if it means something in itself. However, to deal fully with the nonsense, we have to look at this profoundly dishonest experiment in detail.
In the experiment (at the end of the clip) it is noted that CO2 (or water vapour etc.) filling a tube "absorbs more heat" (infra-red radiation) passing through it than air itself does.
This 'experiment' is a non-equilibrium situation. Of course the CO2 absorbs infra-red. However, in doing so the gas itself warms and also emits more infra-red itself, until the whole tube of gas is the same temperature as the warm end.
THIS is the final situation whether you double, quadruple or halve the amount of CO2 in the tube. The end result is that the gas is the same temperature as the source. The final situation takes longer to reach if there is less CO2 or water vapour, but it still gets there.
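To make the equilibrium point concrete, here is a toy numerical sketch. All the numbers in it (source temperature, starting temperature, rate constants) are invented for illustration, not measured values: the point is only that the amount of absorber sets how FAST the gas approaches the source temperature, never the temperature it ends up at.

```python
import math

# Toy relaxation model for the tube argument: the gas temperature relaxes
# toward the source temperature, and the absorber amount only sets the
# RATE constant, not the final (equilibrium) temperature.
# All values below are illustrative assumptions.

T_SOURCE = 320.0   # K, warm end of the tube (assumed value)
T_START = 290.0    # K, initial gas temperature (assumed value)

def tube_temperature(t, rate):
    """Gas temperature at time t (s) for a given relaxation rate (1/s)."""
    return T_SOURCE - (T_SOURCE - T_START) * math.exp(-rate * t)

# More CO2 -> faster absorption -> larger rate constant (illustrative only)
rate_low_co2 = 0.01
rate_high_co2 = 0.04

for label, rate in [("low CO2", rate_low_co2), ("high CO2", rate_high_co2)]:
    early = tube_temperature(60.0, rate)
    late = tube_temperature(10000.0, rate)
    print(f"{label}: T(60 s) = {early:.1f} K, T(final) = {late:.1f} K")
```

Run it and you see the high-CO2 tube is warmer EARLY ON, but both tubes finish at the same source temperature — which is exactly why stopping the experiment early is misleading.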
The temperature at the source end of the tube does not go up.
BY NOT WAITING FOR EQUILIBRIUM TO BE REACHED THE EXPERIMENT PROPAGATES THE LIE THAT CO2 ITSELF 'WARMS' THE ATMOSPHERE.
This 'experiment' does nothing to support the Global warmers' CO2 theory.
NEVERTHELESS, is there ANY truth in the idea of CO2 having an overall effect in the real atmosphere??
OBSERVATIONALLY there is no effect in hundreds, thousands or millions of years of data. But could there be any effect under any argument? And if there could be, why isn't there one?
A simple argument is to consider a uniform-temperature layer of air near the ground. It is not made warmer by increasing the CO2 within it, and the same goes for any number of layers (each of uniform temperature): the CO2 in each will not change their temperature.
HOWEVER there is something else happening in the atmosphere. It is under gravity and therefore cooler higher up, irrespective of any infra-red-absorbing properties of the constituents of the atmosphere. [This is the case because a molecule moving upwards slows down (cools) under gravity; or, in standard gas thermodynamics, a parcel of gas moving upwards expands where the pressure is lower and, in so doing, does work and cools.]
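The cooling-with-height point can be put in numbers with the standard dry adiabatic lapse rate, Γ = g/c_p, using the usual textbook values for g and c_p:

```python
# Dry adiabatic lapse rate: Gamma = g / c_p (standard textbook values).
g = 9.81       # m/s^2, gravitational acceleration
c_p = 1004.0   # J/(kg K), specific heat of dry air at constant pressure

lapse_rate_per_km = g / c_p * 1000.0  # K per km
print(f"dry adiabatic lapse rate ~ {lapse_rate_per_km:.1f} K/km")
```

So a dry parcel of air cools by roughly 9.8 K for every kilometre it rises, purely from gravity and thermodynamics — no radiative properties needed.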
So we have to consider the atmosphere as a stack of layers of gas, each of uniform temperature UNAFFECTED by the CO2 within it, with the consecutive layers getting cooler as you go up; but increasing or decreasing the CO2 in each layer has no effect.
End of story? Not quite. There seems to be a problem with what happens at the top of the atmosphere. Radiation leaves there into space, and the height of that 'emission' layer must be higher up if there is more CO2 or water vapour around to absorb above any given layer. That 'top layer' has to reach the standard temperature needed to radiate enough energy away into space. This means the whole atmosphere is made a small bit 'deeper' ('higher') by more CO2/water vapour (other things being equal). Therefore, the argument goes, if the reduction of temperature with height (the lapse rate) stays the same, the air near the top must get warmer (a 'hotspot' is predicted) and the surface must be warmer.
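The warmers' 'raised emission height' argument reduces to simple arithmetic: predicted surface warming = lapse rate × rise in emission height. The 150 m rise used below is purely an illustrative assumption, not a measured figure:

```python
# The warmers' argument in arithmetic form: if the emission height rises
# by delta_h while the lapse rate Gamma stays fixed, the whole temperature
# profile shifts up and the surface warms by Gamma * delta_h.
# The delta_h value is an illustrative assumption only.

lapse_rate = 6.5e-3   # K/m, typical average tropospheric lapse rate
delta_h = 150.0       # m, assumed rise in emission height (illustrative)

surface_warming = lapse_rate * delta_h
print(f"predicted surface warming ~ {surface_warming:.2f} K")
```

Note this prediction stands or falls on the lapse rate staying fixed and the emission height actually rising — exactly the assumptions challenged below.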
Here I put forward THREE reasons why the Infra-Red absorbing gas Under Gravity (IrGrav) effect (the sensible name for the so-called 'greenhouse effect') in the real atmosphere DOES NOT GIVE WARMING:-
1. The atmosphere has day/night variations which cause extra upper-level cooling with more CO2/water vapour - giving the observed COLDSPOT!
2. The argument that the lapse rate is constant is probably false.
3. PLANTS with MORE CO2 grow faster and cause extra SURFACE TRANSPIRATION COOLING.
Have a read! We will have to get the arguments right ourselves because the warmers will pull more and more tricks!