Here’s a strange one. Question the link between man-made CO2 and global warming and nearly everyone rushes to shoosh you, including most of the people opposed to carbon taxes. But as we’ve already pointed out, there’s a surprisingly poor correlation between atmospheric CO2 and temperature throughout history and prehistory. And now it turns out there’s a surprisingly poor correlation between atmospheric CO2 and man-made CO2. It’s even odder because, as human production of CO2 rose, so did atmospheric CO2, which is broadly not true of temperature. (Not from 1850 until the early 20th century, not in the mid-20th century, and not in the early 21st.) But now a study in Health Physics looks at isotopes, because natural CO2 contains 14C and the man-made kind essentially does not (because 14C is created by cosmic rays hitting atmospheric nitrogen, but its half-life is about 5,730 years, so it’s long gone in fossil fuels that have been underground since the last saurus brontoed). So the air should have lots of 14C-free CO2 from all our fossil fuel burning. But the study finds that the share of 14C-free CO2 in the atmosphere is far lower than its share of total emissions since 1750 would predict. So even if you could prove that the increase in CO2 since that date caused the Little Ice Age to end, you couldn’t finger humans as the cause of the whole process.
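For readers who want to check the half-life arithmetic, here is a minimal sketch of the standard radioactive-decay formula. It is only an illustration of why fossil carbon is "14C-dead"; the burial ages are hypothetical round numbers, not figures from the study.

```python
# Simple exponential decay: fraction of 14C remaining after a given time.
# The 5,730-year half-life is the standard textbook value; the burial
# ages below are illustrative assumptions, not data from the paper.
HALF_LIFE = 5730.0  # 14C half-life in years

def fraction_remaining(years: float) -> float:
    """Fraction of the original 14C left after `years` of decay."""
    return 0.5 ** (years / HALF_LIFE)

# Carbon cycled through the living atmosphere within a human lifetime
# still has nearly all its 14C:
print(round(fraction_remaining(80), 3))

# Carbon buried for, say, 100 million years has none worth measuring,
# which is why fossil-fuel CO2 carries essentially no 14C signature:
print(fraction_remaining(100e6))
```

Even a mere million years underground leaves a fraction around 10^-53, far below any detection threshold, so the isotopic distinction the study relies on is sharp.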
In fact we got there first on this one too. It is a mantra among alarmists that exactly half of man-made CO2 is absorbed in the natural carbon cycle and the rest is not, a suspiciously round as well as precise figure. And just plain suspicious: If 95% of the CO2 that is emitted in a year is reabsorbed, it stands to reason that 95% of the natural and 95% of the man-made would be absorbed, since plants don’t discriminate based on 14C. And the main reason this process is rejected by “the science” is precisely a question of policy-based evidence-making: If it were so, then the accumulation of CO2 would be a natural matter, due to changes in the natural absorption process… just like it was for 560 million years before James Watt.
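The non-discrimination argument is just a mass balance, and it can be sketched in a few lines. The flux numbers below are hypothetical round figures chosen for illustration, and the 95% uptake share is the post's own assumed-for-argument value, not a measurement.

```python
# Back-of-envelope mass balance under the post's assumption that sinks
# (plants, oceans) absorb the same share of CO2 from every source,
# since photosynthesis does not discriminate by 14C content.
# All figures are illustrative assumptions, not measured fluxes.
natural_emissions = 750.0  # assumed natural CO2 flux per year (arbitrary units)
human_emissions = 37.0     # assumed fossil-fuel CO2 flux per year
uptake_fraction = 0.95     # assumed share of *all* emitted CO2 reabsorbed

def left_airborne(emitted: float, uptake: float = uptake_fraction) -> float:
    """CO2 from one source still in the air if sinks take the same cut of each."""
    return emitted * (1.0 - uptake)

# Non-discriminating sinks leave the same 5% of each source aloft:
print(round(left_airborne(human_emissions), 2))    # small human remainder
print(round(left_airborne(natural_emissions), 2))  # larger natural remainder
```

The point of the sketch is the contrast: under uniform uptake, the human share of the airborne remainder equals the human share of emissions, rather than the "exactly half of man-made CO2 stays aloft" figure the mantra asserts.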
In fact there’s good evidence, accidentally revealed by Al Gore’s An Inconvenient Truth, that a warmer Earth has more atmospheric CO2, not the other way around, because heat makes the oceans degas. That evidence being the awkward pattern from those famous ice cores of temperature increases generally preceding CO2 increases by around 800 years, roughly the time between the supposedly non-existent Medieval Warm Period and the modern rise in atmospheric CO2, while temperature drops also seem to cause CO2 drops with the same lag. As the paper points out, and it’s secretly not controversial, “During the last long glacial period, the oceans absorbed a large amount of CO2 from the atmosphere.” Why? Because it was cold.
The last long glacial period did not happen because CO2 fell. CO2 fell because the last long glacial period happened. Scientists say.
The heresy continues. “It appears in the figure that Earth is still in the Holocene interglacial period that started 11,500 y ago. Its peak temperature change over the 11,500 years, thus far in 1950, appears to be significantly less than those over the three previous interglacial periods. Its peak CO2 appears less than 300 ppm and less than the peak value in the previous interglacial period. Thus, the increase in CO2 that Earth has been experiencing since 1800 appears to have started more than 5,000 years ago.”
Yes, it’s worth reading twice. Not only were previous interglacials warmer than today’s unprecedented temperatures, but their man-free atmospheric CO2 levels were also higher. So even if you’re going to say CO2 drives temperature, with the aid of a time machine, you still have to concede that nothing happening today is different from anything that happened before humans started creating GHGs. Nothing. Well, except it being colder than is typical even of these relatively cold periods.