Transforming the Science Used For (Climate) Regulations
Climate Change Weekly #257
The massive number of rules and regulations issued in the United States each year imposes enormous costs on individuals and businesses. Regulatory costs top $1.9 trillion annually, amounting to $14,842 per U.S. household. That is nearly $15,000 unavailable to pay for health insurance, medicine or medical bills, college expenses, groceries, a new car, or vacations.
Because the economic and human health implications of regulations are profound, the science they are built upon must be unimpeachable. Yet the public can have little confidence the rules forced upon them are scientifically justified. Regulatory agencies repeatedly take actions to keep the science they’ve used to justify regulations secret, hidden from review by outside researchers and the congressional committees charged with exercising oversight over them.
The science used to justify the Obama administration’s climate policies lacked transparency. Nevertheless, because the costs of those policies, in both dollars and personal freedom, are so high, they are slowly being brought to light. The revelations demonstrate why secret science must not be allowed to shape federal regulations going forward.
A number of questions have been raised concerning the science used to justify the Obama administration’s Environmental Protection Agency (EPA) Clean Power Plan (CPP) rules. A recent study by James Wallace II, Joseph D’Aleo, and Craig Idso shows the warming trend upon which EPA’s endangerment finding was based relied on politically manipulated global average surface temperature (GAST) data sets. The three GAST data sets were adjusted to remove historic cyclical temperature patterns, making past temperatures appear cooler, and current temperatures warmer, than was actually measured, resulting in a steeper-than-measured increase in temperature across the twentieth century.
The U.S. Office of Management and Budget estimates every $7.5 million to $12 million in regulatory costs imposed on the economy results in a life lost. We can all be thankful CPP never took effect.
EPA, with reason to paint the benefits of its plan in the best light possible, estimated CPP would prevent 21,000 premature deaths through 2030. The Energy Information Administration pegged the cumulative cost of the rule through 2030 at $1.23 trillion in lost gross domestic product. Applying OMB’s cost-per-life estimate to that figure, the 21,000 lives saved are dwarfed by the 102,500 to 164,000 early deaths CPP could be expected to cause.
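The death-toll range cited above follows directly from dividing EIA’s cost estimate by OMB’s cost-per-life range; a minimal sketch of that arithmetic, using only the numbers quoted in this article:

```python
# Statistical lives lost implied by CPP's projected GDP cost,
# using OMB's range of regulatory cost per life lost ($7.5M-$12M).
cpp_cost = 1.23e12           # EIA: cumulative GDP loss through 2030
cost_per_life_low = 7.5e6    # OMB lower bound
cost_per_life_high = 12e6    # OMB upper bound

deaths_high = cpp_cost / cost_per_life_low   # upper estimate: 164,000
deaths_low = cpp_cost / cost_per_life_high   # lower estimate: 102,500

print(f"Implied early deaths: {deaths_low:,.0f} to {deaths_high:,.0f}")
# -> Implied early deaths: 102,500 to 164,000
```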
Importantly, almost all the lives anticipated to be saved by CPP were attributable not to reduced carbon dioxide emissions (carbon dioxide is not toxic at any reasonably expected level) but to “co-benefits” of reducing already-regulated pollutants, primarily particulate matter. Steve Milloy, founder of JunkScience.com, recently requested the New England Journal of Medicine retract the key studies used by EPA to claim reducing particulate matter will save thousands of lives annually. In support of his request, Milloy cites a series of articles showing air pollution at current levels is not causing illness or premature deaths in the United States.
In CCW 240 I detailed how researchers at the National Oceanic and Atmospheric Administration (NOAA), rushing to bolster President Barack Obama’s efforts to negotiate the Paris climate agreement, violated their agency’s own data quality and documentation policies to publish now thoroughly discredited studies purporting to show there had been no 18-year pause in rising temperatures. Prior to this disclosure, NOAA had rebuffed efforts by the House Committee on Science, Space, and Technology to obtain the studies’ underlying data and any documents related to their findings. Now we know why. After the agency refused to deliver all the requested data, the committee issued its first subpoena in 21 years to compel disclosure of the taxpayer-funded research. Ultimately the committee issued a record 25 subpoenas for scientific research used to shape rules during Obama’s final two years in office. Regulatory agencies often ignored the subpoenas. This is unacceptable.
Federal agencies should be required to disclose all the science, models, and information exchanges used to make agency decisions, and no agency should be allowed to use any report to support or justify a rule if the research is not open to verification by outside parties. Every government research contract should require recipients to make available all assumptions, models, data, and e-mail exchanges related to the contracted research upon receiving a Freedom of Information Act request or a request by the congressional committee with oversight responsibility.
Researchers who reject such oversight and the universities or private research institutes employing them should be denied government research grants until they agree to these terms. Transparency of all taxpayer-funded research or any research used to make rules imposed on the public should be the norm.
In addition, regulatory agencies should be required to supply any information requested by any congressional committee with oversight authority over them. Any agency employee or administrator who refuses to comply with a congressional request for information in a timely manner should be open to discipline, including fines and immediate termination.
— H. Sterling Burnett
SOURCES: GAO Report; United States Senate Committee on Environment and Public Works; United States Senate Committee on Environment and Public Works; On the Validity of NOAA, NASA, and Hadley CRU Global Average Surface Temperature Data & The Validity of EPA’s CO2 Endangerment Finding; Retraction Request Made for NEJM Air Pollution-Kills Study; CCW 240; and Environment & Climate News
IN THIS ISSUE …
RECENT ARCTIC TEMPERATURES, ICE NOT UNPRECEDENTED
The authors of this Climate of the Past article use 60 sets of proxy data to reconstruct historic summer temperatures in the Arctic and subarctic northern regions including Greenland. They find current high temperatures are not unprecedented. “The reconstruction shows a pronounced Medieval Climate Anomaly (MCA), from approximately 960 to 1060 in the common era (CE), characterized by a sequence of extremely warm decades over the entire area.” According to the researchers, “The medieval warming was followed by a gradual cooling into the Little Ice Age, with 1580–1680 CE as the longest centennial-scale cold period, culminating between 1812 and 1822 CE. ... At the same time there is evidence for a drastic reduction in sea-ice on the Greenland shelf, which is reflected by rather high summer temperatures over Greenland and Baffin Island during that decade.”
In short, the research shows during the MCA summer temperatures were as high as in the late twentieth and early twenty-first centuries. And Greenland warmed considerably again, and ice declined sharply, for a decade-long period more than 120 years before humans began to add significant greenhouse gases to the atmosphere.
SOURCE: Climate of the Past
CHINA’S WEATHER LESS EXTREME DURING WARMING
Using continuous severe weather data collected since 1951 from more than 500 manned weather stations and 483 other weather observatories spanning the entire Chinese mainland, an international team of researchers found the frequency of extreme weather events has declined significantly throughout China since 1960, with hail storms, thunderstorms, and high wind events decreasing by nearly 50 percent on average. The total number of days with severe thunderstorms and damaging winds began falling in 1960, while the number of dangerous hail storms began falling sharply in the 1980s, just as warnings of weather extremes due to human-caused global warming began to heat up.
The decline in severe weather closely correlates with the weakening of the East Asian summer monsoon, the primary driver of moisture and severe weather over China.
GREENLAND TEMPERATURES, PAST AND PRESENT
Recent research shows the narrative that humans are causing unprecedented warming and ice loss in Greenland is wrong. Despite an ongoing rise in carbon dioxide emissions, Vencore Weather reports, “On Independence Day, July 4th, Summit Station in Greenland may have experienced the coldest July temperature ever recorded in the Northern Hemisphere at -33°C. Much of Greenland has been colder-than-normal for the year so far and has experienced record or near record levels of accumulated snow and ice since the fall of last year.” Typical daily maximum temperatures at Summit Station run around -35°C in winter (January) and -10°C in summer (July), meaning this July 4th’s low temperature was closer to temperatures expected in the middle of winter than during the summer, with Greenland’s still-accumulating snow and ice running at near record levels in 2017.
Studies of the Arctic’s past climate also show the region has experienced both warmer temperatures and much less ice historically than during the present purported period of unusual human-caused warming.
One paper in Scientific Reports shows Greenland has been cooling since 2005, with volcanic activity being a significant driver of centennial- to millennial-scale temperature.
Research published in The Holocene examining deposits of certain shallow-water marine mollusks found Norway’s Svalbard archipelago in the Arctic Ocean was warmer than the present during at least three periods between 10,000 and 4,000 years ago, when carbon dioxide concentrations were approximately 150 ppm lower than they are now. In particular, August temperatures on Svalbard were 6°C warmer than now from 10,200 to 9,200 years ago; warmed again around 8,200 years ago, when the researchers estimate Svalbard was 4°C warmer than present for a period lasting more than 2,000 years; and warmed again briefly during the Medieval Warm Period 900 years ago.
CRITICAL FACTS ABOUT LARSEN ICE SHELF
Adrian Luckman, professor of glaciology and remote sensing at Swansea University, who leads a team studying the decline of the Larsen C ice shelf, writes that the Delaware-sized iceberg that recently separated from Larsen C, the fourth-biggest ice shelf in Antarctica, didn’t break off due to climate change.
Luckman described the breakaway as “a rare but natural occurrence ... not a warning of imminent sea level rise.”
Satellite images from the 1980s show the rift where the iceberg broke free was a long-established feature of Larsen C, predating any recent atmospheric warming. In addition, Luckman said atmospheric warming “is not felt deep enough within the ice shelf” to cause the rift and ultimate break, and any recent ocean warming “is an unlikely source of change given that most of Larsen C has recently been thickening.”
Luckman also notes the calving does not signal an imminent collapse of the remaining part of Larsen C shelf. Luckman writes, “even if … [Larsen C] were to eventually collapse, many years into the future, the potential sea level rise is quite modest. Taking into account only the catchments of glaciers flowing into Larsen C, the total, even after decades, will probably be less than a centimeter”—which is less than half an inch.
SOURCE: The Conversation