Various news outlets pounced on the claim that 2020 had just tied 2016 for the hottest year ever. Which ought to raise the suspicions of anyone with an elementary understanding of statistics, because of the spurious claim of precision that, like a clock striking thirteen, calls into question all that precedes and all that follows. The National Post, headlining its story “1.25C”, didn’t exude a whiff of doubt. And NBC, after identifying the source of this figure as the European Union's Copernicus Climate Change Service, shelved journalists’ fabled skepticism and quoted an NOAA scientist saying “We need another dictionary to help us describe how these extremes continue to play out and unfold year after year”, instead of asking who measured the temperature anywhere to a hundredth of a degree, let alone who did it everywhere.
Even the bizarre metaphor about needing another dictionary was not challenged. After all, the dictionary already contains words like “fiery” and “disastrous” and “apocalyptic” and “we’re all going to die”. But it also contains words like “exaggeration” and “absurd” and “better check it out”. And you should.
A Google search of “2020 hottest year” generated 503 million hits in 0.48 seconds. And we believe it. But we do not believe that anyone knows what the temperature of “the Earth” was to that degree of precision four years ago, or in the year that ended only 13 days ago, let alone in the pre-satellite era. In fact, it is impossible that they could.
As those who survived even high school statistics know, if you measure two things to one decimal place and then multiply them together, you may well get a result with two decimal places. For instance 4.2 times 5.6 is 23.52. But unless the inputs were exact, that is, known to be precisely 4.20 and 5.60, their product is actually less precise than either initial number; it is the uncertainties that multiply, not the certainties.
Suppose for instance the inputs have been rounded to one decimal place and the range of real possible starting values stretches from 4.16 to 4.24 for the first and from 5.56 to 5.64 for the second. Multiply the two lower bounds together and the two upper bounds together, that is, 4.16 times 5.56 and 4.24 times 5.64, and the range of results runs from 23.13 to 23.91. Suddenly the uncertainty is not in the second decimal place, or even the first, but pushing into whole numbers. Oh dear.
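You can check this arithmetic yourself in a few lines. Here is a minimal sketch of interval arithmetic using the hypothetical rounded inputs from the example above; the bounds are the illustration’s, not real measurements.

```python
# Sketch of how rounding uncertainty propagates through multiplication,
# using the hypothetical inputs from the example in the text.

def product_interval(lo1, hi1, lo2, hi2):
    """Smallest and largest possible products of two positive intervals."""
    return lo1 * lo2, hi1 * hi2  # valid only because all bounds are positive

# Nominal values, already rounded to one decimal place
nominal = 4.2 * 5.6  # looks precise to two decimals

# But suppose the true values could lie anywhere in these ranges
lo, hi = product_interval(4.16, 4.24, 5.56, 5.64)

print(f"nominal: {nominal:.2f}")         # 23.52
print(f"range:   {lo:.2f} to {hi:.2f}")  # 23.13 to 23.91
```

The range spans nearly eight tenths, so the two decimal places in the nominal 23.52 are an artifact of the arithmetic, not information about the inputs.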
Now you may be tempted to say decimal shmecimal, the point is that the planet was like super-hot man. But how do you know how hot it was except by taking real, reliable, believable measurements? And once you start asking questions about measurement, the uncertainties loom. Or should.
The Post uncritically asserted that “In 2020, temperatures globally were an average of 1.25 degrees Celsius higher than in pre-industrial times, tying with 2016 as the world’s warmest on record and rounding off the hottest decade globally. The Arctic and northern Siberia continued to warm more quickly than the planet as a whole in 2020, with temperatures in parts of these regions averaging more than 6C above a 30-year average.”
Oh really? More than 6°C? Where? And over how large an area? Because as we noted last week, our home planet is a complicated object exhibiting very complicated patterns of temperature change in both directions at the macro and even the micro scale. And yes, in both directions. How could it be otherwise? If it really is 1.25°C warmer than during the Little Ice Age overall, but any significant part has warmed by six or seven degrees, then there must be significant parts that have not merely warmed less but actually cooled. Whereas if you just picked one or two places having freakishly hot spells, well, anyone with sufficient data can go back to any time you like and find similar cherries.
For instance Spain, which just set an all-time record low temperature and suffered a massive, lethal snowstorm. And if you’re tempted to say Spain Shmain, it’s just one place, we’re talking the future of mankind on Earth, China also experienced a massive cold spell including an all-time record low in Qingdao and Beijing’s lowest Jan. 7 low in over half a century. Iran had lethal snowfalls.
Going back to “pre-industrial times” is another major piece of statistical legerdemain that ought to be called out, and not only because we do not know what the temperature of any place was back then to a decimal place; for almost the entire land surface, and all of the ocean, we have only proxies involving wild uncertainty. It is also a cheat because the IPCC itself says that much of the warming before 1950 may well have been natural. So that famous 1.25°C ought to be reduced considerably to eliminate the natural warming before 1950, even if the result is less scary. And there’s more or, rather, less.
Unless someone threw the off switch on Earth’s furnace in 1950, it must also be true that some of the warming since then was natural as well. Which stands to reason, since the historical record, for all its imprecision, does show long natural warming and cooling cycles lasting many centuries. From peak to peak of the Roman and Medieval Warm Periods was roughly 1,000 years, and from trough to trough of the Dark Ages and the Little Ice Age more like 1,200. On that basis the natural warming that started around 1700 ought to continue until about 2600, meaning, yes, we’re still in it.
Once you grasp this point, you realize that this famous, if spurious, “average of 1.25 degrees Celsius higher than in pre-industrial times” only gives you at most about three quarters of a degree of man-made warming to play with. And so unless the margin of uncertainty about the temperature in 1950 and in 2016 is less than three eighths of a degree either way, instead of an exact match to two decimal places we have noise bigger than the signal.
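For what it’s worth, the back-of-the-envelope arithmetic runs like this; the 0.5°C natural share and the ±0.375°C margin are this column’s illustrative assumptions, not measured quantities:

```python
# Back-of-the-envelope check of the signal-vs-noise point.
# All inputs here are the column's illustrative assumptions.

total_warming = 1.25   # claimed rise since "pre-industrial times", in C
natural_share = 0.50   # pre-1950 warming, treated here as natural

# What is left for man-made warming: about three quarters of a degree
manmade_signal = total_warming - natural_share

margin = 0.375         # hypothetical uncertainty either way, per year

# Each year's figure is really an interval of width 2 * margin, so the
# noise band is as large as the supposed man-made signal itself
noise_band = 2 * margin
print(manmade_signal)                 # 0.75
print(noise_band >= manmade_signal)   # True
```

Two yearly figures whose uncertainty intervals are each 0.75°C wide can overlap almost anywhere within that band, which is why declaring a “tie” to two decimal places tells you nothing.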
Can anyone show us data sets going back before 1979, and indeed back to “pre-industrial times”, that are accurate to 0.375 of a degree? Anyone at all? In fact, does anyone think we have reliable measurements of the Earth’s temperature in 2016 to that pesky 0.375 of a degree? No. Of course not. Because yet again there is a host of uncertainty concealed by journalistic prose here.
The total surface of the Earth, land and sea, is about 500 million square kilometers, 70% of it water. So even if you took 500 million evenly spaced measurements every day, without which it’s hard to claim you know what the “average” temperature was for the year, you would still only be sampling one square centimetre per square kilometer, for one minute out of the 1,440 in each day. That is one ten-billionth of the area and less than one one-thousandth of the time, fractions which multiply together to roughly one part in fourteen trillion of the relevant space and time. To claim on that basis to know the temperature to two decimal places and declare a tie is worse than fatuous.
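The sampling fraction can be verified in a few lines; the one-square-centimetre sensor and the one-minute reading per day are this column’s illustrative assumptions:

```python
# Sketch of the sampling-fraction arithmetic above. The figures
# (500 million stations, 1 cm^2 sensors, one one-minute reading per day)
# are the column's illustrative assumptions.

CM2_PER_KM2 = 100_000 ** 2   # 1 km = 100,000 cm, so 1e10 cm^2 per km^2
MINUTES_PER_DAY = 24 * 60    # 1,440

spatial_fraction = 1 / CM2_PER_KM2       # one cm^2 sampled per km^2
temporal_fraction = 1 / MINUTES_PER_DAY  # one minute sampled per day

combined = spatial_fraction * temporal_fraction
print(f"one part in {1 / combined:,.0f}")  # one part in 14,400,000,000,000
```

Fourteen trillion to one is the gap between the space-time actually measured and the space-time being averaged, before a single instrument error is even considered.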
It amounts to saying here’s a scary number that’s also silly. But never mind that it’s silly. Just be scared.
We are. But not of your results. Of basing policy on such mathematical gibberish.