Tuesday, March 12th 2013, 8:39 PM EDT

Much was made in the media of a new report claiming that modern-day temperatures are the highest in 5,000 years. Moreover, the investigators assert that this century's temperature rise is “unprecedented,” echoing the claims of climate change alarmists over the past 30 years. Various news outlets seized upon this report as final proof that the world is headed for a hot, steamy demise because of human greenhouse gas (GHG) emissions. There are, however, a number of problems with that assertion. Chief among them are the methodology used to generate the global temperature history and the comparison of proxy data with instrument data from recent times. This may be science, but it is being used to deceive the public into believing that anthropogenic global warming (AGW) is a crisis on an unprecedented scale.
Appearing in the journal Science, a publication with a notably biased stance regarding the theory of AGW, the report of a new study of historical global temperatures has reignited global warming fever in the news media and blogosphere:
- Past Century's Global Temperature Change Is Fastest On Record – Christopher Joyce on America's National Public Radio.
- The world is hottest it has been since the end of the ice age - and the temperature's still rising – the UK's Independent.
- Global warming is epic, long-term study says – Ben Brumfield on CNN.
- Global Warming Has Already Caused Unprecedented Change – John Light on billmoyers.com.
- Global warming escalation: Alarming new study shows rapid increase of hell on Earth – the Prairie Dog Press on allvoices.com.
Aside from mostly getting the story wrong—the report did not claim that the present is the hottest time since the end of the ice age—the news mavens were beside themselves with glee. Nothing like pending disaster to help increase ratings and readership. It really is amazing the level of scientific ignorance among the news media “experts” that cover the environment. For them, climate change is the gift that keeps on giving.
The study itself, titled “A Reconstruction of Regional and Global Temperature for the Past 11,300 Years,” was authored by Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, and Alan C. Mix. Beginning with a bow to climate science convention, the report's abstract is not as bombastic as the media reporting but clearly does obeisance to global warming dogma.
Surface temperature reconstructions of the past 1500 years suggest that recent warming is unprecedented in that time. Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally distributed records. Early Holocene (10,000 to 5000 years ago) warmth is followed by ~0.7°C cooling through the middle to late Holocene (<5000 years ago), culminating in the coolest temperatures of the Holocene during the Little Ice Age, about 200 years ago. This cooling is largely associated with ~2°C change in the North Atlantic. Current global temperatures of the past decade have not yet exceeded peak interglacial values but are warmer than during ~75% of the Holocene temperature history. Intergovernmental Panel on Climate Change model projections for 2100 exceed the full distribution of Holocene temperature under all plausible greenhouse gas emission scenarios.
No wonder the warmist media were excited. But should they have been? Notice that even in the abstract, the authors admit that current temperatures have not exceeded the temperatures of more than 5,000 years ago, a time commonly known as the Holocene Climate Optimum. In fact, they reinforce the commonly held history of Holocene temperatures, including the Little Ice Age, a cool period that several previous studies have tried to erase. Even more important is how this study was conducted and the experimental limits on its results. To examine these we need to understand something about the use of foraminifera as a proxy for temperature.
The accepted climate history of the Holocene. Source: The Resilient Earth.
In 1947, Harold Urey, a nuclear chemist at the University of Chicago, discovered a way to measure ancient temperatures. The secret was in the oxygen isotopes found in fossil sea shells. The proportions of the different oxygen isotopes an organism takes up from sea water vary with the water's temperature during the organism's life. Basically, the ratio of 18O to 16O serves as a proxy thermometer.
In 1955, Cesare Emiliani, a geology student from Italy working in Urey's laboratory, used this method to create a record of historical temperatures. Emiliani measured the oxygen isotopes in microscopic shells from foraminifera, a kind of ocean plankton. Tracking the shells layer by layer in long cores of clay extracted from the seabed, he constructed a record of temperature variations. This is essentially the same method used in the Marcott et al. paper.
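To make the proxy idea concrete, here is a minimal sketch in Python of how an oxygen-isotope measurement is commonly turned into a temperature estimate. The quadratic calibration is the classic Epstein-style carbonate paleotemperature equation; its coefficients and the reference isotope ratio shown here are illustrative values, not the specific calibrations used by Marcott et al.

```python
# Illustrative oxygen-isotope paleothermometer (not the calibration used in the paper).

def delta_18o(ratio_sample, ratio_standard=0.0020052):
    """delta-18O in per mil: deviation of a sample's 18O/16O ratio from a
    reference standard (the standard ratio here is illustrative)."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

def temperature_from_d18o(d18o_calcite, d18o_water=0.0):
    """Epstein-style calibration (coefficients approximate):
        T (deg C) = 16.5 - 4.3*(dc - dw) + 0.14*(dc - dw)**2
    Warmer water -> the shell takes up relatively less 18O -> lower delta-18O
    -> higher inferred temperature."""
    d = d18o_calcite - d18o_water
    return 16.5 - 4.3 * d + 0.14 * d * d

if __name__ == "__main__":
    for d in (-1.0, 0.0, 1.0, 2.0):
        print(f"delta-18O = {d:+.1f} per mil  ->  T ~ {temperature_from_d18o(d):.1f} C")
```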
There are a number of problems with using foraminifera sediment data to reconstruct a history of ancient temperatures. Here is a quote from chapter 3 of the Global Warming Field Guide by John Weaver, Jeff Braun, and William J. Szlemko, “Measuring Temperature in the Distant Past, The Art of Developing Temperature Proxy Data”:
The theory is that these creatures form near the surface at a certain sea temperature, then fall to the ocean floor over time, leaving behind a permanent record of historic sea temperatures – which are loosely related to air temperatures. Unfortunately, settling rates can be variable due to ocean currents and other, lesser, factors. Plus, settling rates are extremely slow for these tiny shells, so there can be mixing of many years before they actually make it all the way to the bottom. In the end, there is good news and bad news. The good news is that sediment layers contain much longer records than do ice core samples. The bad news is that it is nearly impossible to resolve the year-to-year differences that are possible with ice core data. The resolution for sediment cores is more likely on the order of hundreds of years, although the records cover several million years. In this sense, perhaps, ice core data and sediment cores sort of provide complimentary information.
Scientists know that all sources of palaeoclimatic proxy data differ according to their resolution, spatial coverage, and the time period to which they pertain. There are several types of uncertainty lurking in the proxy method used in this study, including temporal, spatial, and measurement uncertainty. To start with, the number of samples is fairly small, only 73 records scattered around the world, and they vary in quality. The Southern Hemisphere is acknowledged as being underrepresented. Then there is the resolution with which the proxy temperatures can be dated.
“The 73 globally distributed temperature records used in our analysis are based on a variety of paleotemperature proxies and have sampling resolutions ranging from 20 to 500 years, with a median resolution of 120 years,” the authors state. How did they compensate for differences in their datasets? “We account for chronologic and proxy calibration uncertainties with a Monte Carlo–based randomization scheme,” they explain.
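The paper's wording suggests a procedure along these lines: jitter every record's ages and proxy calibration within assumed uncertainties, re-stack the records after each perturbation, and read the uncertainty off the spread of the resulting stacks. The Python sketch below is a guess at that general scheme, assuming Gaussian errors and simple interpolation onto a common time grid; the function name, error magnitudes, and toy records are placeholders, not the authors' actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_stack(records, grid, n_draws=1000, age_sigma=150.0, cal_sigma=0.3):
    """Crude Monte Carlo stack: every draw jitters each record's ages (chronologic
    uncertainty) and adds a calibration offset (proxy-calibration uncertainty),
    interpolates onto a common time grid, and averages across records."""
    stacks = np.empty((n_draws, grid.size))
    for k in range(n_draws):
        draws = []
        for ages, temps in records:
            jittered = ages + rng.normal(0.0, age_sigma, size=ages.size)
            shifted = temps + rng.normal(0.0, cal_sigma)
            order = np.argsort(jittered)
            draws.append(np.interp(grid, jittered[order], shifted[order]))
        stacks[k] = np.mean(draws, axis=0)
    return stacks.mean(axis=0), stacks.std(axis=0)  # mean stack and 1-sigma envelope

# Toy usage: three fake proxy records spanning the Holocene (years before present).
grid = np.arange(0, 11300, 100.0)
records = [(np.sort(rng.uniform(0, 11300, 60)), rng.normal(0.0, 0.5, 60)) for _ in range(3)]
mean_stack, one_sigma = monte_carlo_stack(records, grid)
```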
“In addition to the previously mentioned averaging schemes, we also implemented the RegEM algorithm to statistically infill data gaps in records not spanning the entire Holocene, which is particularly important over the past several centuries.” It seems that their complete record of Holocene temperature contains a lot of gaps and uncertainties that have been filled with estimates and randomness. That is far from the conclusive record of global temperatures trumpeted by the media.
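RegEM itself (regularized expectation maximization) is considerably more involved, but the basic idea of statistically infilling gaps can be shown with a much simpler iterative stand-in: start each missing value at the time-slice mean, then repeatedly re-estimate it from a record-effect plus time-effect model until the filled values stop changing. This toy sketch is a simplified substitute for, not an implementation of, the RegEM algorithm the authors used.

```python
import numpy as np

def iterative_infill(X, n_iter=100, tol=1e-6):
    """Toy EM-style infilling of a records-by-time matrix X (NaN marks a gap).
    Gaps start at the time-slice (column) mean and are iteratively replaced by
    the value predicted from an additive record-effect + time-effect model."""
    X = X.copy()
    missing = np.isnan(X)
    if not missing.any():
        return X
    X[missing] = np.take(np.nanmean(X, axis=0), np.where(missing)[1])
    for _ in range(n_iter):
        previous = X[missing].copy()
        grand = X.mean()
        row_effect = X.mean(axis=1, keepdims=True) - grand   # per-record offset
        col_effect = X.mean(axis=0, keepdims=True) - grand   # per-time-slice offset
        estimate = grand + row_effect + col_effect           # full (records x times) prediction
        X[missing] = estimate[missing]
        if np.max(np.abs(X[missing] - previous)) < tol:
            break
    return X
```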
In the past, a number of studies based on tree rings have gotten things notoriously wrong; Mann's “hockey stick” graph, for example, resulted from the improper merging of several tree ring studies. “Published reconstructions of the past millennium are largely based on tree rings and may underestimate low-frequency (multicentury-to-millennial) variability because of uncertainty in detrending ... whereas our lower-resolution records are well suited for reconstructing longer-term changes,” the authors claim. Some of that older data is provided for comparison in the figures below, with descriptions by the authors.

Comparison of different methods and reconstructions of global and hemispheric temperature anomalies. (A and B) Globally stacked temperature anomalies for the 5° × 5° area-weighted mean calculation (purple line) with its 1σ uncertainty (blue band) and Mann et al.'s global CRU-EIV composite mean temperature (dark gray line) with their uncertainty (light gray band). (C and D) Global temperature anomalies stacked using several methods (Standard and Standard5x5Grid; 30x30Grid; 10-lat: Arithmetic mean calculation, area-weighted with a 5° × 5° grid, area-weighted with a 30° × 30° grid, and area-weighted using 10° latitude bins, respectively; RegEM and RegEM5x5Grid: Regularized expectation maximization algorithm-infilled arithmetic mean and 5° × 5° area-weighted). The gray shading [50% Jackknife (Jack50)] represents the 1σ envelope when randomly leaving 50% of the records out during each Monte Carlo mean calculation. Uncertainties shown are 1σ for each of the methods. (E and F) Published temperature anomaly reconstructions that have been smoothed with a 100-year centered running mean, Mann08Global, Mann08NH, Moberg05, WA07, Huange04, and plotted with our global temperature stacks [blue band as in (A)]. The temperature anomalies for all the records are referenced to the 1961–1990 instrumental mean. (G and H) Number of records used to construct the Holocene global temperature stack through time (orange line) and Mann et al.'s reconstruction (gold vertical bars). Note the y axis break at 100. The latitudinal distribution of Holocene records (gray horizontal bars) through time is shown. (I and J) Number of age control points (e.g., 14C dates) that constrain the time series through time.
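For readers wondering what “area-weighted with a 5° × 5° grid” means in practice: records are binned into latitude-longitude cells, averaged within each cell, and the cell averages are then combined with weights proportional to cell area (roughly the cosine of latitude), so that a cluster of cores in one region does not dominate the global mean. The sketch below, with hypothetical record locations, is a generic illustration of that technique, not the paper's code.

```python
import numpy as np

def area_weighted_mean(lats, lons, values, cell=5.0):
    """Bin point values into cell-by-cell degree boxes, average within each box,
    then combine the box averages weighted by cos(latitude) as a proxy for area."""
    lat_idx = np.floor((np.asarray(lats) + 90.0) / cell).astype(int)
    lon_idx = np.floor((np.asarray(lons) + 180.0) / cell).astype(int)
    boxes = {}
    for la, lo, v in zip(lat_idx, lon_idx, values):
        boxes.setdefault((la, lo), []).append(v)
    total, weight_sum = 0.0, 0.0
    for (la, _lo), vals in boxes.items():
        center_lat = -90.0 + (la + 0.5) * cell
        w = np.cos(np.radians(center_lat))
        total += w * np.mean(vals)
        weight_sum += w
    return total / weight_sum

# Hypothetical usage: two North Atlantic cores share one grid cell, so together they
# count once; the lone Southern Hemisphere core is weighted by its own cell's area.
print(area_weighted_mean(lats=[64.0, 64.5, -40.0], lons=[-20.0, -19.0, 150.0],
                         values=[0.8, 1.0, 0.2]))
```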
Notice how they include a sharp uptick in temperature at the end of each graph, supposedly the unprecedented temperatures mentioned at the beginning of this article. The only reason the recent values seem so anomalous is that the older temperatures are averages spanning anywhere from 20 to 500 years (median 120 years), owing to the inherent temporal uncertainty in the foram data. Modern temperatures, taken directly with instruments, should not be compared directly with the foram proxy-derived data. To present an unbiased representation, the graphs should all have stopped a decade ago.
“Because the relatively low resolution and time-uncertainty of our data sets should generally suppress higher-frequency temperature variability, an important question is whether the Holocene stack adequately represents centennial- or millennial-scale variability,” the authors confess. What this means is that sudden excursions in temperature during times past would be totally undetectable from the proxy data. To merge modern data with historical proxy data in this manner is disingenuous to say the least and, given the tone of the abstract, might even be construed as intentionally misleading.
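The effect the authors describe can be shown with a few lines of arithmetic: a warming spike of roughly modern size and duration, placed in the middle of a long record, nearly disappears once that record is averaged at the proxies' effective resolution. The sketch below uses a simple centered running mean as a stand-in for that smoothing; the spike size, duration, and window are illustrative numbers, not values taken from the paper.

```python
import numpy as np

# Annual "true" temperature history: flat, with a 1 deg C spike lasting 100 years.
years = np.arange(0, 2000)
true_temp = np.zeros(years.size)
true_temp[900:1000] = 1.0                     # a century-long, 1 deg C excursion

# Stand-in for proxy smoothing: a centered running mean at ~300-year resolution.
window = 300
kernel = np.ones(window) / window
smoothed = np.convolve(true_temp, kernel, mode="same")

print(f"Peak of the spike in the annual record: {true_temp.max():.2f} deg C")
print(f"Peak after {window}-year smoothing:     {smoothed.max():.2f} deg C")
# The smoothed peak is roughly 1.0 * 100/300 ~ 0.33 deg C: the spike is mostly erased,
# which is why a sharp modern uptick cannot fairly be compared to the smoothed proxy curve.
```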
What was the hottest year of the Holocene? What was the hottest decade? No one knows, but they almost certainly occurred during the “temperature plateau” extending from 9,500 to 5,500 years ago. Even by the flawed comparisons made in this paper, the averaged temperatures during the Holocene Climate Optimum were higher than today's. If the year-to-year variability could somehow be reconstructed, we would surely find many years with much higher temperature spikes during that 4,000-year period of global warmth.
While this study is interesting and useful in its results, it is clear that the authors tried to spin those results into a prop for the failed AGW theory. It certainly had the intended effect: news outlets around the world announced the new, conclusive proof of AGW causing an unprecedented temperature increase, none of them realizing, or perhaps caring, that the comparison was invalid. This may be science, but it is science distorted; it is the science of deception.
Be safe, enjoy the interglacial and stay skeptical.