Overheated Claims on Temperature Records

Published January 29, 2018

Now that the excitement over the news that 2017 was one of the hottest years on record has died down, it is time for a sober second thought. Did the January 18 announcements, in which the National Oceanic and Atmospheric Administration (NOAA) ranked 2017 as the third-hottest year since 1880 and NASA ranked it as the second-hottest, actually mean anything?

Although the Los Angeles Times called 2017 “a top-three scorcher for planet Earth,” neither the NOAA nor the NASA ranking is significant. One would naturally expect the warmest years to sit at the top of a warming record. And thank goodness we have been in a gradual warming trend since the depths of the Little Ice Age in the late 1600s.

Regardless, recent changes have been too small to notice and are often smaller than the government’s own estimates of the uncertainty in the measurements. In fact, we lack the data to properly compare today’s climate with the past.

This is because, until the 1960s, temperature data was collected using mercury thermometers at weather stations concentrated mostly in the United States, Japan, the UK, and eastern Australia. Most of the rest of the planet had very few temperature-sensing stations. And the Earth’s oceans, which constitute 70% of the planet’s surface area, had little more than the occasional station, each separated from its neighbors by thousands of kilometers.

The data collected at the weather stations in this sparse grid had, at best, an accuracy of +/-0.5 degrees Celsius. In most cases, the real-world accuracy was no better than +/-1 degree Celsius. Averaging such poor data in an attempt to determine global conditions cannot yield anything meaningful. Displaying average temperature to tenths or even hundredths of a degree, as is done in the graphs by NOAA and NASA, clearly defies common sense.
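A minimal sketch, in Python, of the statistical point at issue (the station count, the assumed true temperature, and the bias value are all hypothetical, chosen only for illustration): averaging many readings does shrink purely random errors by roughly the square root of the sample size, but it cannot remove a systematic error that the instruments share, and it cannot say anything about regions where no stations exist at all.

```python
import random

random.seed(42)

N_STATIONS = 200     # hypothetical number of stations
TRUE_TEMP = 15.0     # assumed true regional mean, deg C (illustrative)
RANDOM_SIGMA = 0.5   # per-station random error, deg C (the +/-0.5 figure)
SHARED_BIAS = 0.3    # hypothetical systematic bias common to all stations

# Simulate one reading per station: true value + shared bias + random noise.
readings = [TRUE_TEMP + SHARED_BIAS + random.gauss(0, RANDOM_SIGMA)
            for _ in range(N_STATIONS)]

mean = sum(readings) / N_STATIONS

# Random errors average down roughly as sigma / sqrt(N).
expected_random_error = RANDOM_SIGMA / N_STATIONS ** 0.5

print(f"Average of {N_STATIONS} stations: {mean:.3f} deg C")
print(f"Assumed true value:             {TRUE_TEMP:.3f} deg C")
print(f"Residual error:                 {mean - TRUE_TEMP:+.3f} deg C")
print(f"Expected random component:      +/-{expected_random_error:.3f} deg C")
# The residual stays near the shared bias (~0.3 deg C): no amount of
# averaging removes an error every station has in common, nor does it
# fill in the parts of the globe that have no stations at all.
```

Under these assumptions, the random component of the average is small, but the systematic component survives intact, which is why quoting an average to hundredths of a degree says nothing about the accuracy of the underlying measurements.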

Modern weather station surface temperature data is collected using precision thermocouples. But, starting in the 1970s, less and less ground surface temperature data was used for plots such as those by NOAA and NASA. This was done initially because governments believed that satellite monitoring could take over from most of the ground-based data collection. But the satellites did not show the warming forecast by computer models. So, bureaucrats closed most of the colder rural surface temperature-sensing stations, thereby yielding the warming desired for political purposes.

Today, there is virtually no data for approximately 85% of the Earth’s surface. Indeed, there are fewer weather stations in operation now than there were in 1960.

So, the surface temperature computations by NOAA and NASA after about 1980 are meaningless. Combine this with the problems in the early data and the conclusion is unavoidable: it is not possible to know how the Earth’s so-called average temperature has varied over the past century and a half.

In an exclusive interview with the Reuters news agency on January 9, U.S. Environmental Protection Agency (EPA) Administrator Scott Pruitt is reported to have “reaffirmed plans for the EPA to host a public debate on climate science sometime this year that would pit climate change doubters against other climate scientists.”

Referring to possible temperatures in the distant future, Pruitt said, “I think the American people deserve an open, honest, transparent discussion about those things.”

That is an understatement. For if a proper climate science debate is actually held, the public would soon realize that there is insufficient data of any kind – temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, et cetera – to determine how today’s climate differs from the past, much less to predict the future. The effort to switch from coal and other fossil fuels, America’s least expensive and most abundant power sources, to unreliable and expensive alternatives to supposedly control the climate will then finally be exposed for what it really is: the greatest hoax in history.

[Originally Published at the Moultrie News]