We want to take a step back from “news” in the sense of recent headlines to ask alarmists a question that matters a great deal: Roughly when do you think the effects of man-made climate change began to be felt? Was it around 2000? Perhaps 1983? 1970? One gets a surprising range of answers, including 1920, 1850 or even 5000 BC. But as the Science and Environmental Policy Project recently reminded us, the great philosopher of science Karl Popper wrote that “In so far as a scientific statement speaks about reality, it must be falsifiable: and in so far as it is not falsifiable, it does not speak about reality.” So if we’re trying to test a very important theory by comparing its predictions and testing its explanations against crucial evidence, it would be helpful to know what the evidence is meant to be. And it’s surprisingly hard to find out.
Perhaps you think it’s obvious. The typical alarmist argument seems to be that the real effects hit around 2000: they point to a supposed run of hot years since then, a supposed increase in hurricanes, or at least in their intensity, an explosion of wildfires from Fort McMurray to Australia and the Amazon to California, accelerating sea-level rise, jellyfish “taking over the world” and diseases from zika to allergies, all more or less in the last two decades. Isn’t that what they say?
Bear with us here. We have previously observed that global wildfire counts have not increased but decreased in the past two decades. Which is certainly pertinent to the claim spreading like wildfire that the effects of climate change kicked in recently and include setting everything on fire as well as blowing it down, washing it away etc. in a disastrous “new normal”. As for instance in the recent NBC headline “Biden slams Trump as ‘climate arsonist’ as fires ravage West” or the one about “Northern Hemisphere summer was hottest on record, scientists say/ The worrying milestones come as historic wildfires and extreme weather events in the U.S. have sharpened focus on global warming and the catastrophic impacts of climate change.”
To stick with just one standard news outlet, NBC also wrote that “As wildfires rage, climate experts warn: The future we were worried about is here/ As a forest ecologist who has been studying the links between wildfires and climate change since the early 1990s, Susan Prichard is well aware that global warming is contributing to longer and more intense fire seasons around the world. Yet, nothing could have prepared her for the past 12 months.” In which case, presumably, the past 12 months are qualitatively different than even the past 12 years, let alone the past 12 decades. NBC goes on “’What strikes me is that the future we were really worried about and that us climate scientists talked about for decades, we’re living through that now,’ Prichard, a research scientist at the University of Washington, said.”
Right. So you predicted it for decades and you think it’s happening. Now. That’s a theory we can test, and we do so by pointing out that there hasn’t been an increase in wildfires; there’s been a decrease. Or consider James Hansen’s pivotal 1988 U.S. Senate testimony, in which he said the effects of climate change were already underway. In that case wildfires should have been getting worse since the 1980s. And what if the effects started much earlier?
It has been credibly claimed. For instance in his well-researched and very interesting The Little Ice Age: How Climate Made History 1300-1850, Brian Fagan points the fingerbone of blame at “a vast migration from Europe by land-hungry farmers and others not only to North America but much further afield, to Australia, New Zealand, and southern Africa. Millions of hectares of forest and woodland fell before the newcomers’ axes between 1850 and 1890, as intensive European farming methods expanded across the world. The unprecedented land clearance released vast quantities of carbon dioxide into the atmosphere, triggering for the first time humanly caused global warming. Wood also fueled the early stages of the Industrial Revolution in the United States, adding to rising levels of greenhouse gases. Global temperatures began to rise slowly after 1850. They climbed more rapidly in the twentieth century as the use of fossil fuels proliferated and greenhouse gas levels continued to soar. The rise has been even steeper since the early 1980s, with record-breaking summer heat and mild winters during the 1990s.”
So here we have climate change in the late 19th century, again in the 20th and hitting hard in the 1990s. Unfortunately this theory, by explaining everything, explains nothing. If a trend persists from 1850 to 2020 it’s global warming. If something changes suddenly in the mid-20th century, it’s proof of global warming. If it changes suddenly in the 1980s, it’s proof of global warming. As the book was written in 2000, he has nothing to say about things changing suddenly in the past two decades, which seem to cause many people to gaze with longing at the 1990s he considers so dismal.
Fagan’s account does allow alarmists to call the retreat of the glaciers in the early 20th century man-made, a process that is somewhat awkward for other proponents of the theory with a more recent focus. But it still doesn’t explain why the retreat was faster a century ago than more recently. The effects should be accelerating, not tapering off, shouldn’t they?
It might seem that 1850 is an outlier. But there are other even more extreme claims out there. In a 2012 video about the crucial role of orbital changes that nevertheless endorsed the alarmist account of the recent past, Dan Britt claimed that “we should be dropping into the next Ice Age” from a peak warmth about 8000 years ago “given the solar input”. Indeed, he calculates that the next glaciation would have started around 5000 BC “absent of human activity”. But, he says, the invention of agriculture led to mass slash-and-burn land clearing, millennia before the invention of writing, which, by adding 20 ppm of CO2 to the atmosphere, plus lots of methane from beef cattle and rice cultivation, turned a (potentially lethal) decline into a steady state.
It’s an ingenious theory. But it too presents a fairly major problem, as does any claim that what happened in 1900 or 1915 or even 1940 was man-made. Namely it posits such a high degree of climate sensitivity to CO2 that it is impossible to understand why we have not gone half-way to Venus without getting in a rocket ship since 1980. On the other hand, it avoids a real inconvenience in the Hansen/IPCC view, namely that if a lot of warming between the early 19th and the mid-20th centuries was normal, it means (a) there’s much less recent warming to blame on people or (b) they need some theory as to why the natural warming stopped just when ours started, changing causes mid-stream without any disruption in the observed effects.
There is another possibility, and in the scientific spirit we want to consider various hypotheses. As Ed Hoskins argues, it may well be that as atmospheric CO2 increases, its effects diminish logarithmically; in effect it “saturates” the relevant absorption frequencies. And of course if so, one could have a high initial sensitivity and major effects a century ago, and progressively less and less impact even as GHGs increase. But in that case there’s no crisis; we are indeed already “there”, and the effects, such as they were for good or ill, have more or less arrived.
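The logarithmic point can be illustrated with the standard simplified forcing fit (Myhre et al. 1998, roughly 5.35 ln(C/C0) W/m²), which is not from the article itself but is the conventional expression of the diminishing-returns behaviour described above. The function name and the ppm increments below are purely illustrative:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing from CO2 relative to a pre-industrial
    baseline c0_ppm, using the common fit delta_F = 5.35 * ln(C/C0) in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive 40 ppm increment adds less forcing than the one before it,
# which is the "saturation" behaviour in question.
for c in (280, 320, 360, 400, 440):
    extra = co2_forcing(c + 40) - co2_forcing(c)
    print(f"{c} -> {c + 40} ppm: +{extra:.3f} W/m^2")
```

On this formula a doubling of CO2 always yields the same forcing (about 3.7 W/m²) no matter the starting level, so in absolute ppm terms the early increments do most of the work, consistent with the argument that high early sensitivity implies modest marginal effects today.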
The bottom line is that it is hard to pin down and dissect these crucial difficulties if the alarmists play fast and loose with their start date. We can’t test any of it unless they tell us: we think we can prove, and à la Popper invite you to falsify, a relationship between atmospheric CO2 and temperature over the following specific period.
If they can’t or won’t name the period, it’s not science. And indeed Fagan alone is all over the map. At one point he says we now live in “an era so warm that sixty-five British bird species laid their eggs an average of 8.8 days earlier in 1995 than in 1971” and “brush fires consumed over 500,000 hectares of drought-plagued Mexican forest in 1998” and, he says, “the sea level has risen in Fiji an average of 1.5 centimeters a year over the past nine decades.” But then the effect in Fiji has been going on for a century, in Britain for three decades and in Mexico perhaps ten years. And while we acknowledge the complexity of climate, at some point your global warming has to be global or it’s not real warming.
So tell us. Roughly when is the dividing line between natural processes dominating and human ones dominating? We can’t begin to test the hypothesis about CO2 until you tell us roughly what phenomenon it’s meant to explain. Is it divergent conditions in the late 20th century or early 21st? The whole 20th century? Everything since 1500? Everything since the Neolithic? Because unless we know, we can’t test your theory. In which case it’s not a real theory.
Oh, one more thing about Fagan’s book, which is well worth reading despite the flaws we are highlighting here. He depicts the Little Ice Age as a very difficult period for human beings not just because cold is bad generally but because… wait for it… the weather went from the comparative stability of the Medieval Warm Period (are you listening, Michael Mann?) to “an irregular seesaw of rapid climatic shifts, driven by complex and still little understood interactions between the atmosphere and the ocean… an endless zigzag of climatic shifts” and a lot of really nasty storms including the one that doomed the Spanish Armada. Contrary to the Al Gore/Michael Mann line about stability until recently, he says “human relationships to the natural environment and short-term climate change have always been in a complex state of flux. To ignore them is to neglect one of the dynamic backdrops of the human experience.” But he claims that our own era presents not climate-driven weather extremes but fairly uncharacteristic stability, though stability was also a feature of the earlier, natural Medieval Warm Period. So can we at least get a firm statement on whether climate change causes weather to become more or less stable? Something we could in principle test against evidence?