Friday, April 5th 2013, 7:12 AM EDT
‘Global temperatures are warmer than at any time in at least 4,000 years… and over the coming decades are likely to surpass levels not seen on the planet since before the last ice age.’ That was the pithy message offered by New York Times eco-columnist Justin Gillis, reporting on a new reconstruction of past global temperatures published in Science last month. The Atlantic was blunter: ‘We’re Screwed: 11,000 Years’ Worth of Climate Data Prove It.’
The Science paper is an attempt to chart changes in global temperatures over the past 11,000 years. In the absence of actual thermometer records from any earlier than the late seventeenth century, paleoclimatologists use ‘proxy’ data - things like tree rings - to estimate changing temperatures. The researchers, led by Shaun Marcott of Oregon State University, found that ‘Early Holocene (10,000 to 5,000 years ago) warmth is followed by ~0.7 degree Celsius cooling through the middle to late Holocene (less than 5,000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago’.
Then, however, things changed dramatically: ‘Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75 per cent of the Holocene temperature history.’ The accompanying graph of temperature changes, as shown in the Atlantic article, is startling. Temperatures are more or less stable until just over 1,000 years ago, when a marked cooling begins. Then, after a recovery from the Little Ice Age, the line takes off like a rocket in the twentieth century. What clearer evidence could there be for manmade global warming?