Before believing those computer models about the coming apocalypse, or some strange possible drawback to massive arrogant geoengineering, we want to return to this issue of predicting the past. Charles Rotter on Watts Up With That alerts us to a new study about the influence of plants on temperature that admits it’s complicated. Which might seem like a molehill rather than a mountain, while the fact that it’s all about computer modeling might lead you to wonder if there’s even a mole. But we like it anyway, because it uses real-world data as the starting point for an attempt to explain a real, massive past temperature fluctuation, the Holocene Climatic Optimum. That episode had nothing to do with CO2, so alarmists tend to brush it aside, even if they haven’t yet tried to high-stick it right out of the stadium.
The press release about this study by a post-doctoral associate at Washington University in St. Louis notes that Alexander “Thompson had long been troubled by a problem with models of Earth’s atmospheric temperatures since the last ice age. Too many of these simulations showed temperatures warming consistently over time. But climate proxy records tell a different story. Many of those sources indicate a marked peak in global temperatures that occurred between 6,000 and 9,000 years ago.” Aka the models don’t work worth a darn.
They’re certainly simple-minded, which is the fault not of the hardware or even the software but the “wetware”, the programmers. But the data tell a different story. And to his credit, Thompson listens and doesn’t just sneer at Uber drivers.
As with too many scientific papers, the prose makes it hard to understand what he thinks he discovered. But the actual paper does contain the fairly clear statement that “The model-data mismatch, termed the ‘Holocene Temperature Conundrum,’ potentially exposes uncertainty in our understanding of the ways in which the climate system responded to changes in forcings during the Holocene”. Potentially. And then he does open the door to the notion that the Holocene Thermal Maximum did not happen (“the existence of an HTM remains unresolved, if there was indeed a global HTM”), since if the data stubbornly refuse to conform to the model, one easy solution is to chuck the data.
Another might be to chuck the plants, because the apparent hypothesis is that “Increases in vegetation cover warm the land surface by enhancing the surface absorption of shortwave (SW) radiation directly through lowered albedo (24, 25) and indirectly through limiting dust mobilization (26).” So more CO2 and warmth mean more plants, more plants mean more warmth, and voilà, a runaway greenhouse effect of exactly the sort we… um… don’t see as temperatures cool from the HTM, with fluctuations, toward the chilly present.
Still, the paper admits the temperature fluctuated in ways the models can’t explain and that the models need to do a better job of fitting the data. Which amazingly in this area is gigantic progress.
“They’re certainly simple-minded, which is the fault not of the hardware or even the software but the ‘wetware’, the programmers”
In my industry I call that “problems with the organic interface”.
No, no, no. The purpose of models is to fit the narrative, not the data. Get with the plan, will you!
Not being a climate scientist, I have to rely on my own observations. I observe that when I walk from my bare field into the wood lot, it is cooler in the woodlot at any time of the year. Now that I have seen this article, I know that it is actually warmer in the woodlot. These guys need to get off the computer and take their thermometers on a walk through the country.