
Precisely approximate

02 Apr 2025 | OP ED Watch

One of the flags of pseudoscience, and not only on climate, is a spurious degree of precision in output claims and a lack of curiosity about input data. For instance Matthew Wielicki recently warned that “Sea level is one of those metrics we’re constantly told reveals the truth about climate change…. But few stop to ask: what exactly is ‘sea level,’ and how do we even measure it?” And if they do, they discover that satellite measurements are subjected to “massive adjustments” that “depend on models, assumptions, and data processing protocols that aren’t always transparent. And here’s where things start to smell fishy.” Meanwhile Scientific American emailed an article headlined “As the world gets closer to the mark 1.5 degrees Celsius in Paris climate agreement, scientists are racing to establish a single way to monitor current warming”. But if they don’t have a way to measure it, how does anyone know we’re closing in on it? How confident can anyone be about “current warming” if, say, it’s a pastiche of temperatures going in a variety of directions while a large, mostly interpolated warming in Antarctica drives up the average?
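To make that last question concrete, here is a toy sketch in Python (hypothetical anomalies and area weights, not any agency’s actual grid) of how much one sparsely observed, heavily interpolated region can move an area-weighted global average:

```python
# Toy illustration only: hypothetical regional anomalies and area weights,
# not real data from any temperature product.
regions = {
    # name: (anomaly in °C, fraction of Earth's surface area)
    "tropics":       (0.05, 0.40),
    "mid-latitudes": (0.10, 0.40),
    "arctic":        (0.60, 0.10),
    "antarctic":     (0.00, 0.10),  # few stations; value is largely interpolated
}

def global_mean(regs):
    """Area-weighted mean of the regional anomalies."""
    return sum(anom * weight for anom, weight in regs.values())

print(f"Global mean with Antarctic at 0.0 °C: {global_mean(regions):.2f} °C")

# Now suppose the interpolation scheme fills the same data-sparse region
# with a large warm anomaly instead.
regions["antarctic"] = (1.50, 0.10)
print(f"Global mean with Antarctic at 1.5 °C: {global_mean(regions):.2f} °C")
```

With these invented numbers, a single interpolated region covering a tenth of the planet more than doubles the “global” anomaly, which is exactly the sort of sensitivity the question is pointing at.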

Wielicki dives into things that most people do not, including the fact that sea level rise, or even fall, varies dramatically around the world for geological reasons. And while alarmists make the simplistic assumption that it’s as if we had water rising in a bathtub, so that it wouldn’t matter to the obvious overall trend if some particular sponge was sinking or some plastic boat was bobbing up and distorting local readings, in reality the “raw radar data is almost unusable without massive adjustments” because of issues like “atmospheric interference, wave height, water vapor, satellite altitude, Earth’s gravitational field, and the wobbly orbit of the satellite itself.”

Time for some humility? Oh dear us, no. Instead we get estimates to two decimal places with a side of angry dogmatism, such as a story from MSN, originating with that cynosure of climate science, Moneycontrol (but who’s checking sources?), growling:

“The rising sea is becoming harder to ignore. Scientists tracking ocean levels found that 2024 saw an unexpected surge in sea level rise. The rapid increase is linked to warming waters and melting ice, adding to long-term concerns about climate change. In 2024, global sea levels rose at 0.23 inches (0.59 cm) per year. This rate was higher than the expected 0.17 inches (0.43 cm) per year. NASA researchers say ocean warming was the biggest reason for this jump.”
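The only thing in that paragraph a reader can check from the armchair is the arithmetic, so here is a quick sanity check of the quoted rates (a sketch using only the numbers in the excerpt):

```python
# Convert the quoted sea level rates from inches per year to centimetres
# and millimetres per year, using only the figures in the excerpt.
CM_PER_INCH = 2.54

for label, inches in [("2024 observed rate", 0.23), ("expected rate", 0.17)]:
    cm = inches * CM_PER_INCH
    print(f"{label}: {inches} in/yr = {cm:.2f} cm/yr ({cm * 10:.1f} mm/yr)")
```

Converted back from the rounded inch figures, the observed rate comes out at 0.58 cm/yr rather than the quoted 0.59, which is only rounding, but it is a small reminder of how little weight the second decimal place in such figures actually carries.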

As for the data, well, the rhetoric is rock-solid:

“NASA has monitored ocean height for over 30 years, beginning with the TOPEX/Poseidon satellite in 1992. Today, Sentinel-6 Michael Freilich continues the mission, tracking sea levels with high accuracy. Its twin, Sentinel-6B, will launch soon to ensure uninterrupted data collection.”

No curiosity, no skepticism, no credibility. Just predictable panic:

“Scientists are closely watching these changes to refine predictions about future sea level rise. As ocean temperatures continue to climb, their impact on coastal communities worldwide will only grow.”

The Scientific American piece, from its even less sober-minded “Climatewire”, is even worse. It starts:

“The world can’t seem to agree on when the planet will exceed a key temperature threshold in the Paris climate agreement. Nearly 200 nations committed back in 2015 to pursuing efforts to keep global temperatures from exceeding 1.5 degrees Celsius above preindustrial levels. But there is no official metric for determining when the world has crossed that line into increasingly catastrophic impacts.”

Here you’ll notice a kind of double pseudoscience. Not only do they not know how to measure this thing they know we’re fast approaching, they also know it demarcates whatever bad thing is now happening from “increasingly catastrophic impacts”. Which translates into: they don’t know what it is, but they know what it will do.

Then there’s the dreaded “argument from authority”, because the piece goes on:

“Enter an international team of scientific experts.”

Ooooh. Not just experts. Scientific ones. And an international team.

Now send in the decimals:

“Some of their preliminary findings are detailed in the WMO’s latest State of Climate report, which estimates that current global warming is somewhere between 1.34 degrees and 1.41 degrees compared with the 1850-1900 average.”

What on Earth is the point of giving us two decimal places when the spread between their own estimates is nearly a tenth of a degree? And to dig the hole deeper, are you really telling us you know the global average temperature from 1850 to 1900 to even one decimal place? How did you measure it?
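As a minimal illustration of the complaint (the rounding rule is mine, not anything in the report), here is what those estimates look like when trimmed to the precision their own spread supports:

```python
# The WMO range of "current warming" estimates quoted above, in °C.
low, high = 1.34, 1.41
spread = high - low  # disagreement between the three methods

print(f"Spread between the estimates: {spread:.2f} °C")
# If the methods disagree by the better part of a tenth of a degree,
# the second decimal place carries no information worth reporting.
print(f"Rounded to one decimal: {round(low, 1)} to {round(high, 1)} °C")
```

At one decimal place the whole range collapses to roughly 1.3 to 1.4 °C, which is about all the disagreement between the methods licenses.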

Funnily enough the actual report blows past 1.5°C in the first sentence of the Foreword from the Secretary-General of the World Meteorological Organization, which assembled this massive balanced team of 10 committed alarmists. She writes:

“The annually averaged global mean near-surface temperature in 2024 was 1.55 °C ± 0.13 °C above the 1850–1900 average. This is the warmest year in the 175-year observational record, beating the previous record set only the year before. While a single year above 1.5 °C of warming does not indicate that the long-term temperature goals of the Paris Agreement are out of reach, it is a wake-up call that we are increasing the risks to our lives, economies and the planet. Over the course of 2024, our oceans continued to warm, sea levels continued to rise, and acidification increased.”

So suddenly the error bar is 0.13, not the 0.07 spread we were just given. And the temperature span is 1.42 to 1.68. And predictably the seas are rising in wrath and blah blah blah, a term that here includes “send more money”:

“Investment in National Meteorological and Hydrological Services is more important than ever to meet the challenges and build safer, more resilient communities.”
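Before leaving the Foreword’s numbers, the arithmetic at least is easy to check (a quick sketch using only the figures quoted above):

```python
# The Foreword's 2024 figure: 1.55 °C with a stated ±0.13 °C uncertainty.
central, err = 1.55, 0.13
print(f"2024 annual range: {central - err:.2f} to {central + err:.2f} °C")

# The spread between the report's three "current warming" estimates.
print(f"Spread between methods: {1.41 - 1.34:.2f} °C")
```

Which is where the 1.42 to 1.68 span and the 0.07 comparison above come from.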

Nowhere in here is there any discussion of how they measure temperature. Which is surely kind of important to the story. Instead they eventually, on p. 25 of 42 (numbered p. 21), get around to the IPCC, not the Paris Agreement, definition of the dreaded 1.5°C tipping-point threshold disaster thingy:

“By this definition, 1.5 °C of warming would only be confirmed once the observed temperature has reached that level over a 20-year period, 10 years after the year of exceedance. Thus, there would be a 10-year delay in recognizing and reacting to exceedance of the long-term temperature goal. Even taking a shorter average of 10 years, as done in IPCC AR6 and the First Global Stocktake, would result in a 5-year delay. Several approaches are under consideration by WMO and the international scientific community to enable more timely reporting. These approaches fall broadly into three categories. The first combines the most recent 10 years of observed historical temperature with climate model projections for the next 10 years. The second fits a trend or function to the historical data to better estimate where the long-term warming is today. The third estimates the human factor in the historical change by estimating the underlying warming resulting from historical changes in key human drivers of the climate system such as greenhouse gases.”

Ugh. The first splices a computer guess about what’s going to happen onto an unreliable estimate of what did, the second extrapolates a curve fitted to that same unreliable estimate, and the third tests the models by comparing their outputs with their outputs, discarding even real historical data in favour of a reading backward of what CO2 would have caused if CO2 caused what the models were told to assume it causes.
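For the curious, here is a rough Python sketch (entirely synthetic numbers, not any agency’s dataset) of what the second approach, fitting a trend to the historical record, and the IPCC-style 20-year average actually involve:

```python
import numpy as np

# Synthetic annual global anomalies (°C): an invented 0.02 °C/yr trend
# plus noise, standing in for the observed record. Not real data.
rng = np.random.default_rng(0)
years = np.arange(1995, 2025)
anoms = 0.9 + 0.02 * (years - 1995) + rng.normal(0, 0.08, years.size)

# Second approach in the WMO list: fit a trend to the historical data and
# evaluate it at the present year to estimate "current" warming.
slope, intercept = np.polyfit(years, anoms, 1)
current_estimate = slope * years[-1] + intercept
print(f"Trend-based estimate for {years[-1]}: {current_estimate:.2f} °C")

# The IPCC-style definition: a 20-year average centred on a given year,
# which cannot be computed until 10 of those years have happened.
centre = 2014  # latest year with a full 20-year window in this series
window = anoms[(years >= centre - 9) & (years <= centre + 10)]
print(f"20-year average centred on {centre}: {window.mean():.2f} °C "
      f"(only available ~10 years after the fact)")
```

Both numbers depend on choices, the window length, the form of the trend, and above all the input series itself, which is the nub of the complaint: the resulting “current warming” figure is a construction, not a reading taken off an instrument.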

The result is a blur:

“The best estimates of current global warming from these three approaches are between 1.34 °C and 1.41 °C compared to the 1850–1900 baseline; however, given the uncertainty ranges, the possibility that we have already exceeded 1.5 °C cannot be ruled out.”

Nor the possibility that they’re comparing babbles to boranges since none of this stuff is real, rigorously measured actual temperature. And they admit they don’t care:

“Ultimately, it is essential to recognize that, regardless of the methodology used to track global temperature, every fraction of a degree of warming matters. Whether it is at a level below or above 1.5 °C of warming, every additional increment of global warming leads to changes in extremes and risks becoming larger.”

But how do they know, since they have no idea how warm it was in 1875 or today and the historical data on storms, fires etc. don’t support an increase in extreme weather? And the paper makes no effort to explain where they got the supposed temperature in the former year or last year and what margin of error exists in, say, how hot they think it was during the Franco-Prussian War or why they think so.

Way down low they do talk about the various data sets, and you could chase those references to try to figure out what they did to conjure up their results. But the critical point is that none of this has to do with measuring more carefully or more exactly, or expressing greater caution about spurious precision.

It’s all about manipulating unsatisfactory inputs to create scarier outputs. Which boils down to this: if you accept their assumptions, their conclusions follow. QED. Which is hardly new.

Indeed, as Wielicki observes of “an intriguing NOAA graphic” showing sea level rise as indicated by “satellites”, or more exactly obscure satellite data massaged to fit a preconceived vision:

“The official record shows a clean, steady upward march of sea level at an average rate of 3.0 ± 0.4 mm per year since 1992. But if you look closely, every noticeable jump in the rate of sea-level rise aligns with the transition from one satellite to the next. Each new satellite ‘corrects’ the record upward… never downward. Somehow, with every new generation of equipment, the urgency of the crisis deepens.”

Oh that’s original. So here’s a cliché of ours to push back with: caveat emptor.
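Wielicki’s point about inter-satellite “corrections” is at least easy to illustrate in the abstract. Here is a toy sketch (invented numbers, not the actual TOPEX/Jason/Sentinel record) of how a small step offset introduced at a satellite handover shifts the fitted rate of rise:

```python
import numpy as np

# Toy spliced sea level record: an invented underlying rise of 2.5 mm/yr
# measured by two hypothetical satellites, with a +5 mm step introduced
# at the handover in year 15. Not the real altimetry record.
years = np.arange(0, 30)
true_level = 2.5 * years                      # mm above an arbitrary datum
handover = years >= 15
spliced = true_level + np.where(handover, 5.0, 0.0)

rate_true, _ = np.polyfit(years, true_level, 1)
rate_spliced, _ = np.polyfit(years, spliced, 1)
print(f"Underlying rate:        {rate_true:.2f} mm/yr")
print(f"Rate with +5 mm offset: {rate_spliced:.2f} mm/yr")
```

In this made-up series a single 5 mm offset at the handover adds roughly 0.25 mm/yr to the fitted 30-year trend, which is why the direction and documentation of such corrections matter so much.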

3 comments on “Precisely approximate”

  1. So they can tell you what the temperature range above, say, the year 1850 is, to hundredths of a degree? But not when we'll cross that arbitrary threshold of 1.5C above 1850's temps? Or even if we've already crossed said threshold?! Call me skeptical...

  2. Great article! The pseudoscience exposed by these grifters is breathtaking! The fact that nobody knows the "average global temperature" today, much less in 1875, doesn't slow them down one bit, but the US pulling 100% of government funding for this garbage will bring it all to a grinding halt!

  3. "Way down low they do talk about the various data sets, and you could chase those references to try to figure out what they did to conjure up their results. But the critical point is that none of this has to do with measuring more carefully or more exactly, or expressing greater caution about spurious precision."
    Once data have been "adjusted" they cease to be data and become merely estimates of what the data might have been had it been collected timely from properly selected, calibrated, sited, installed and maintained sensors. The "various datasets" are merely estimate sets. Two decimal place precision involving measurements "adjusted" in the first decimal place is laughable.
    Approximately 96% of the US near-surface temperature data is compromised. It is likely that the remaining global data is also compromised, though nobody has bothered to evaluate those sites.
