BOSTON—For decades, astronomers and climatologists have debated whether a prolonged 17th-century cold spell, best documented in Europe, could have been caused by erratic behavior of the sun. Now, an American solar physicist says he has new evidence to suggest that the sun was indeed the culprit.
The sun isn’t as constant as it appears. Instead, its surface is regularly beset by storms of swirling magnetic fields. As a result, like a teenager plagued with acne, the face of the sun often sprouts relatively dark and short-lived “sunspots,” which appear when strong magnetic fields inhibit the upwelling of hotter gas from below. The number of those spots waxes and wanes regularly in an 11-year cycle. However, even that cycle isn’t immutable.
In 1893, English astronomer Edward Maunder, studying historical records, noted that the cycle essentially stopped between 1645 and 1715, a period during which the sun was almost devoid of sunspots. In 1976, American solar physicist John “Jack” Eddy suggested there might have been a causal link between this “Maunder Minimum” in the number of sunspots and the contemporaneous Little Ice Age, when average temperatures in Europe were a degree centigrade lower than normal.
One might expect the absence of dark spots to make the sun slightly brighter and hotter. But the absence of other signs of magnetic activity, such as the bright patches of very hot gas known as faculae, more than compensates for this effect. So in fact, the sun’s total energy output is lower during a solar minimum. If the minimum is prolonged, as it was in the second half of the 17th century, the dip in output might indeed affect Earth’s climate.