The big news, or news hole, on the environment in Canada this week is that the government scrapped a bunch of historical weather data because they didn’t like it very much. So they chucked everything from 1850 to 1949, and backfilled the blanks with computer model simulations of 1950-2005 instead. We aren’t living in George Orwell’s Oceania, of course. But it’s mighty odd to hear the erasing of actual data in favour of the made-up kind justified as “an important next step in giving our decision-makers even greater access to important climate data for long-term planning.” Simulation is data, and data is down the memory hole.
It’s perfectly reasonable to be skeptical of old temperature data. Have you seen a 19th-century thermometer? Or log book? And indeed it’s reasonable to be skeptical of a good deal of the modern stuff too, given the patchy nature of the measurement network and well-known concerns about where the thermometers we do have are located. But the biggest problem, as a great many scientists know, isn’t inaccurate data; it’s no data at all.
So it makes sense to try to clean up old and new data sets, and to be frankly wary of gaps and other deficiencies. The problem comes when people discard the data, and their skepticism along with it, and proceed to fill in the blanks with stuff made up by computers. Then they circle around, notice that the computer-made data agree with the computer-made predictions, and say yup, the planet’s blazing away and man-made CO2 did it.
The modelers and their political acolytes, we fear, are so used to substituting computer simulations for what happens outside the laboratory that they’ve forgotten they’re doing it. They’re not engaged in conscious or outwardly-directed deceit; they really think the right way to figure out what’s going on with climate is to build a model that says we’re ruining it, then check the model’s findings against the model’s findings, or feed the model’s outputs back in as inputs in a closed circle, and hail the match as evidence-based proof positive. But it’s not.
As Roy Spencer recently warned, computer models are built on a pile of shifting scientific sand. They depend fundamentally on knowing the “energy balance”: how much energy comes into our entire planet Earth and how much goes out again. Because of the First Law of Thermodynamics, if the former is bigger the planet heats up; if the latter is bigger it cools; and if they’re about the same it doesn’t do much. The problem is, the scientists don’t know how much is coming in or going out.
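For readers who like to see the bookkeeping, the “in versus out” logic above can be sketched as a toy zero-dimensional energy-balance model. To be clear, this is a back-of-the-envelope illustration using standard textbook figures, not anything drawn from Environment Canada’s models; the albedo and emissivity values are assumptions.

```python
# Toy zero-dimensional energy-balance model: the "energy in vs. energy out"
# bookkeeping described above. All parameter values are illustrative
# textbook figures, not anything from an actual climate model.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
S0 = 1361.0        # solar constant, W/m^2
ALBEDO = 0.30      # fraction of sunlight reflected back to space (assumed)
EPSILON = 0.61     # effective emissivity standing in for the greenhouse effect (assumed)

def energy_in():
    """Absorbed solar radiation, averaged over the whole sphere (W/m^2)."""
    return S0 * (1 - ALBEDO) / 4

def energy_out(temp_k):
    """Outgoing longwave radiation at surface temperature temp_k kelvin (W/m^2)."""
    return EPSILON * SIGMA * temp_k ** 4

def equilibrium_temperature():
    """The temperature at which energy in equals energy out."""
    return (energy_in() / (EPSILON * SIGMA)) ** 0.25

# Below the equilibrium temperature more energy comes in than goes out, so
# the planet warms; above it, more goes out than comes in, so it cools.
```

With these assumed numbers the balance point lands near 288 K, roughly today’s mean surface temperature; which is exactly the point, since the emissivity was chosen with that answer in mind.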
That’s the scientists’ problem. The problem for the rest of us is that instead of admitting how complicated it is, they make computer models filled with invented variables for lots of other important stuff they can’t measure either, like cloud cover, all jiggled around until the energy balance is neutral (which flies in the face of masses of data showing that temperature has fluctuated dramatically over centuries, millennia and longer). Then they add digital CO2 based on what they think CO2 would do, and hold up as conclusive proof an almost completely made-up result in which CO2 causes massive warming inside a computer. But, Spencer protests, “they have only demonstrated what they assumed from the outset. It is circular reasoning.”
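The tune-then-force step Spencer is criticizing can likewise be sketched in a few lines. Again, this is a deliberately crude caricature with made-up numbers, not a real general circulation model; the “cloud factor” here is the stand-in for the unmeasured variables being jiggled.

```python
# Sketch of the tuning step criticized above: pick an unmeasurable "cloud"
# fudge factor so the modeled budget balances exactly at today's observed
# temperature, then add a CO2 forcing and watch the model warm. The numbers
# and the one-line "model" are illustrative, not any real climate model.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4
ABSORBED_SOLAR = 238.0    # assumed absorbed sunlight, W/m^2
T_OBSERVED = 288.0        # roughly today's mean surface temperature, K

# Step 1: "jiggle" the unmeasured variable until energy in == energy out today.
cloud_factor = ABSORBED_SOLAR / (SIGMA * T_OBSERVED ** 4)  # tuned, not measured

def modeled_equilibrium(forcing=0.0):
    """Equilibrium temperature after adding a forcing (W/m^2) to the tuned budget."""
    return ((ABSORBED_SOLAR + forcing) / (cloud_factor * SIGMA)) ** 0.25

# Step 2: add digital CO2 (about 3.7 W/m^2 per doubling is the usual figure)
# and the model warms -- but only because step 1 assumed the budget balanced.
warming = modeled_equilibrium(3.7) - modeled_equilibrium(0.0)
```

By construction the tuned model reproduces today’s temperature perfectly, and by construction any added forcing warms it; neither result tells you anything the tuning didn’t already assume, which is the circularity being objected to.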
It’s even worse when it’s circular evidence too. And when, because the models assume man-made warming from 1950 on, they necessarily project a cooler past. As Lorrie Goldstein points out, Blacklock’s Reporter, which originally broke this story about the Canadian government discarding displeasing data, “notes that in many cases the observed temperatures scrapped by Environment Canada in creating its computer models, were higher in the past than today.” Gosh. Who saw that coming?
Well, everybody. The unsurprising result was to wipe out things like Vancouver’s record temperature from 1910, Toronto’s blazing 1852 summer, or Brandon, Manitoba’s scorching 1936 heat wave.
Down the memory hole indeed.