Tuesday, August 21st 2012, 12:32 PM EDT
Whew, it has really been hot lately…along with all those unprecedented droughts and storms! How can there be any lingering doubt about global warming? Right? So isn’t it a really good thing that, as Ryan Lizza reported in a June New Yorker article, “The president has said that the most important policy he could address in his second term of office is climate change”? In other words, he will further energize his EPA’s war on fossil energy and double down on his “green energy” subsidy agenda. If this doesn’t help to fix the economy, that priority will just have to wait.
Premised upon recent weather in some U.S. regions, the global warming crisis narrative has been driving lots of media traffic. For example, an Investor’s Business Daily op-ed piece by Eugene Robinson titled “Feeling the Heat: It’s Too Hot to Be a Global Warming Skeptic” notes that “…the nation’s capital and its suburbs are in post-apocalypse mode. About one-fourth of all households have no electricity, the legacy of an unprecedented [that word again] assault by violent thunderstorms…” He went on to say: “Yes, it’s always hot here in summer. Yes, we always have thunderstorms, but never like these.”
Robinson offered a sensible disclaimer admitting that no single extreme weather event can be definitively blamed on climate change, while also taking issue with those who dismiss climate change as a “figment of scientists’ imagination, or even as a crypto-socialist one-worldish plot to take away our God-given SUVs,” since, as he put it, “the data are beginning to add up.”
Fair enough. I agree that anyone who thinks climate change is illusory probably isn’t intellectually qualified for a license allowing them to drive an SUV, or any other motor vehicle that will outpace a riding lawnmower for that matter. And as for any “crypto-socialist one-worldish plot” to take away our choice to own one, that won’t be necessary. Imposition of the Obama administration’s radical new automotive CAFE standards will take care of that right here within our own government, avoiding any need to depend upon the U.N.
But regarding that “data adding up” to support a war against fossil fuels under a man-made climate crisis banner…well, maybe that is something that warrants a bit more attention.
Mr. Robinson supported his reasoning by citing a NOAA statement that “the past winter was the fourth-warmest in the United States since record-keeping began in 1895”, along with NASA Goddard Institute for Space Studies (NASA-GISS) surface temperature reports that indicate “nine out of the warmest 10 years on record have occurred since 2000”.
So to begin, let’s consider the first statement from a global warming perspective (because that’s what “global” really means). That warm 2011/2012 U.S. winter (the U.S. accounts for about 1.5% of the Earth’s surface) would certainly have been a very welcome difference from what much of the world experienced. A European cold spell killed more than 500 people. More than 140 perished in Ukraine, along with hundreds of others in France, Serbia and the Czech Republic. Europe’s 2,860-kilometer Danube River, crucial for transport, power, industry and fishing, froze over, as did nearly all rivers in the Balkans. More than 130 villages in Bulgaria went without electricity.
Closer to those of us in the lower forty-eight, Fairbanks, Alaska reported the coldest January temperatures since 1971, reaching -24°F. The coldest January average temperature there occurred in 1906 (-36.4°F).
Just like its former governor, Alaska has continued to go rogue. After experiencing its record-breaking cold winter, the state is now reported to have had one of its coldest Julys, averaging 53°F during the beginning of the month, about 12 degrees below average. The National Weather Service predicts this will soon change. If so, some will undoubtedly point to this as even more evidence of an imminent global warming disaster.
And what about the statement claiming that nine out of the ten warmest years occurred since 2000? According to an article posted in the UK’s Daily Mail, recent readings taken from more than 30,000 measuring stations, quietly released by the U.K.’s Met Office and the University of East Anglia Climatic Research Unit, show that world temperatures haven’t warmed over the past 15 years.
Shortly afterwards, a Met Office spokesman issued a response charging that the article was misleading. While much of the criticism challenged the article’s central premise that changes in solar activity during the coming “Cycle 25” will produce a net cooling influence (contradicting the Met’s projections), it also rejected the assertion that there has been no warming for 15 years. After admitting that its future temperature projections are “probabilistic in nature,” and that several decades of data will be required to assess them, it said: “However, what is absolutely clear is that we have continued to see a trend [italics added] of warming, with the decade of 2000-2009 being clearly the warmest in the instrumental record going back to 1850.”
But while the Met response refers to a warming “trend”, it should be remembered that the temperature record began at the end of the Little Ice Age. Yup, there has certainly been a warming trend since then. And while the recent decade has been warm, the trend over the past decade (measured in degrees Celsius per decade) is approximately zero. Yes, temperatures have been essentially flat. Any measured temperature trend over “x” years depends upon what number you establish for “x” and the year you begin measuring. It can readily be argued that the Daily Mail article was correct on this matter.
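That sensitivity to the choice of window can be sketched with a toy calculation. The anomaly series below is entirely synthetic, invented only to illustrate the point, not real temperature data: a series that rises and then flattens shows a clearly positive trend when fitted from its start, and an essentially zero trend when fitted from the plateau.

```python
# Toy illustration (synthetic data, NOT real temperatures): the fitted
# trend depends on which year the measurement window begins.
import numpy as np

years = np.arange(1975, 2012)
# Hypothetical anomaly series: steady rise through 1997, flat afterward.
temps = np.where(years < 1998,
                 0.02 * (years - 1975),   # rising ~0.2 C per decade
                 0.02 * (1998 - 1975))    # plateau after 1998

def trend_per_decade(start_year):
    """Least-squares slope, in degrees C per decade, from start_year on."""
    mask = years >= start_year
    slope, _ = np.polyfit(years[mask], temps[mask], 1)
    return slope * 10

print(trend_per_decade(1975))  # whole record: clearly positive
print(trend_per_decade(1998))  # post-1998 window: essentially zero
```

Same data, two opposite-sounding headlines; the only difference is the start year handed to the fit.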
Regarding the possibility that the global climate will soon enter a substantial cooling phase attributable to a weak new solar cycle, this may ultimately prove true as well. Many prominent scientists consider it likely, pointing to the important cloud-forming influence of cosmic rays, which reach the atmosphere in greater numbers during periods of reduced sunspot activity. More clouds tend to make conditions cooler, while fewer often cause warming.
Met Office projections hold that the greenhouse effects of man-made carbon dioxide are far stronger than the Sun’s influences, and sufficiently so not only to overwhelm potential solar cooling, but to produce net warming. These findings are fiercely disputed by solar experts. They point out that the Met’s assessment is based upon highly theoretical climate models that exaggerate CO2 influence, while failing to account for numerous other important contributing factors.
Judith Curry, a well-known climatologist who chairs the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, finds the Met’s confident determination of there being a “negligible” solar climate impact “difficult to understand”. She has stated that “The responsible thing to do would be to accept the fact that the models may have severe shortcomings when it comes to the influence of the Sun”.
Dr. Curry also notes important contributions of 60-year Pacific and Atlantic Ocean temperature cycles, observing that they have been “insufficiently appreciated in terms of global climate”. When both oceans were cold in the past, such as from 1940 to 1970, the climate cooled. The Pacific “flipped” back from a warm to a cold mode in 2008, and the Atlantic is also thought likely to flip back in the next few years.
Global temperatures have been rising since the Little Ice Age ended in the mid-19th century…and according to NASA-GISS, have increased about 0.8°C (1.5°F) since 1880. About half of all estimated warming since 1900 occurred before the mid-1940s, despite continuously rising CO2 levels since that time.
Yes, let’s realize that climate change is very real, dating back to always. It began to occur long before humans invented agriculture, smokestacks, and gasoline-fueled internal combustion engines. In fact, a recent study conducted by German researchers using tree ring data reveals that 2,000 years ago Romans wore cool togas with good reason. Summer temperatures between 21 and 50 AD were about 1 degree Celsius warmer than now, and they were just as warm during the Medieval period about 1,000 years later.
Lead author Professor Jan Esper of Johannes Gutenberg University in Mainz said: “We found that previous estimates of historical temperatures during the Roman era and the Middle Ages were too low.” While 1 degree Celsius may not seem significant, he notes that the findings “are significant with respect to climate policy, as they will influence the way climate changes are seen in context of historical warm periods.”
The study documented temperatures dating back to 138 BC, indicating that the world has been on a “long-term cooling trend” punctuated with a couple of warm spells for two millennia until another warming occurred during the twentieth century. In general, there was a slow cooling of 0.6 degrees Celsius over the earlier period.
As for those “worst time ever” extreme weather conditions that Eugene Robinson referred to, remember that as recently as 1,000 years ago Icelandic Vikings were raising cattle, sheep and goats in grasslands on Greenland’s southwestern coast. Then, around 1200, temperatures began to drop, and Norse settlements were abandoned by about 1350. Atlantic pack ice began to grow around 1250, and shortened growing seasons and unreliable weather patterns, including torrential rains in Northern Europe, led to the “Great Famine” of 1315-1317.
Temperatures dropped dramatically again in the middle of the 16th century, and although there were notable year-to-year fluctuations, the coldest regime since the last Ice Age (the so-called “Little Ice Age”) dominated the next hundred and fifty years or more. Food shortages killed millions in Europe between 1690 and 1700, followed by more famines in 1725 and 1816. The end of this period witnessed the brutal winter temperatures suffered by Washington’s troops at Valley Forge in 1777-78, and Napoleon’s bitterly cold retreat from Russia in 1812.
Although temperatures and weather conditions have been generally mild over about the past 150 years, we should remember that significant fluctuations are normal. In fact, the past century has witnessed at most two (and very possibly only one) periods of warming. The first definite warming period occurred between 1900 and 1945. Because CO2 levels were relatively low then compared with now, and didn’t change much, they couldn’t have been the cause of warming before 1950. And if that earlier warming resulted from natural influences, why is more recent warming being attributed to increased atmospheric CO2 emissions?
A recent reanalysis of U.S. temperature trends recorded at National Oceanic and Atmospheric Administration surface stations indicates that there has been only about half as much warming over the past 30 years as was previously believed (+0.155°C/decade vs. +0.309°C/decade). This spurious doubling of estimates is attributed to serious siting problems at many NOAA recording locations, along with erroneous post-measurement data adjustments which exaggerated temperatures upward between 1979 and 2008. The new analysis, conducted by Anthony Watts (California), Evan Jones (New York), Stephen McIntyre (Toronto), and John Christy (Department of Atmospheric Science, U. of Alabama), applied a more advanced station rating method which revealed that “station siting does indeed have a significant effect on [recorded] temperature trends.”
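The siting effect described there can be sketched with a deliberately artificial example (all numbers below are invented for illustration and are not the study’s data): a slowly growing warm bias at a poorly sited station, such as encroaching pavement, roughly doubles that station’s fitted trend even though the underlying climate signal is identical.

```python
# Hypothetical sketch (synthetic data, NOT the Watts et al. dataset):
# a growing warm bias at a poorly sited station inflates its trend.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2009)
true_trend = 0.0155                      # deg C per year (0.155 C/decade)
signal = true_trend * (years - years[0])

def decadal_trend(series):
    """Least-squares slope of series, in degrees C per decade."""
    slope, _ = np.polyfit(years, series, 1)
    return slope * 10

# Well-sited station: true signal plus small measurement noise.
good = signal + rng.normal(0, 0.05, years.size)

# Poorly sited station: the same signal plus a bias that grows by
# 0.45 C over 30 years (e.g., spreading asphalt), plus noise.
bad = signal + np.linspace(0.0, 0.45, years.size) + rng.normal(0, 0.05, years.size)

print(decadal_trend(good))  # close to the true 0.155 C/decade
print(decadal_trend(bad))   # roughly double the true trend
```

Averaging many such biased stations into a network mean produces exactly the kind of trend inflation the reanalysis describes.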
Regarding global temperatures, while some measurements suggest warming between 1975 and 1998 (a strong Pacific Ocean El Niño year), some scientists seriously question whether solid evidence of that increase exists. (A future article will be devoted exclusively to this important subject.)
Yet even if that 1975-1998 warming occurred, U.K. Hadley Centre and U.S. NOAA radiosonde (balloon) instrument analyses fail to show any evidence whatsoever of the telltale human CO2 emission-influenced warming “signature” in the upper troposphere over the equator that is predicted by all U.N. Intergovernmental Panel on Climate Change (IPCC) global circulation models.
Regarding false alarms linking recent global warming to more extreme weather conditions, CBS News, CNN, the Christian Science Monitor, and even Nature.com recently covered a NOAA press release stating that “La Niña-related heat waves, like experienced in Texas in 2011, are now 20 times more likely to occur during a La Niña today than La Niña fifty years ago.” Texas State Climatologist and Texas A&M Professor John Nielsen-Gammon discusses the indefensibility of this seriously hyped pronouncement in his July 20 Chron.com Climate Abyss blog. While he notes that, based upon simplistic modeling assumptions, the “20 times more likely” prediction goes too far, he agrees with the report’s conclusion that La Niña heat waves are “distinctly more probable” today, with global warming but one contributing factor among others.
There is little dispute that the severe droughts experienced in the Texas and Oklahoma panhandles have been caused primarily by natural La Niña cooling of equatorial Pacific Ocean sea surface temperatures, which shifted the jet stream farther north than usual. The resulting stronger jet stream, aimed toward the northwestern part of the U.S., brought drier and warmer conditions across the southern and eastern portions of the country.
It appears that the U.S. really is in store for some more extreme weather. Evelyn Browning-Garriss, owner and author of the monthly Browning Newsletter climate publication, predicts this will result from a transition from current cooler-than-normal La Niña ocean conditions to developing hotter-than-normal El Niño temperatures.
Many natural factors are known to contribute to these longer-term climate and short-term weather changes, although even the most sophisticated climate models, and the theories they are based on, cannot predict their timing, scale (either up or down), or future impacts…much less the marginal contributions of CO2, a trace atmospheric “greenhouse gas” which has been branded as a primary culprit and endangering “pollutant”.
And if you’re really worried about catastrophic global warming, perhaps take cheer that it isn’t likely to last very long. Consider that we are currently about 10,000 years into a typical 12,000- to 18,000-year-long interglacial period. Assuming that climate history over the past 400,000 years continues to repeat its pattern with nearly electrocardiogram-like regularity, maybe we should think about enjoying this brief intermission before the next life-unfriendly Ice Age covers much of the Northern Hemisphere with glaciers miles thick for the next 90,000 years.
So if human carbon dioxide emissions actually do make any difference, are you feeling any better about those SUVs and coal plants now?