History shows Earth’s climate goes through cycles, long and short, tied to a variety of natural factors. In the latter part of the 20th century, some scientists began to wonder about the causes of a modest warming, then cooling, then warming, which had been occurring since the mid-1800s. They also began to worry about the possible implications of continued warming.
Unfortunately, before scientists had gotten very far along in their research, politicians became involved, virtually destroying any chance for unbiased research.
In 1961, President Dwight D. Eisenhower delivered his now-famous farewell address warning of two dangers: the growing arms industry driven by the Cold War, which he referred to as the “military-industrial complex,” and the growing government influence over the development and use of science and technology. Eisenhower, noting scientific research was increasingly funded by governments, warned science’s aims might become corrupted. He believed science could eventually be used as a force for advancing the political aims of a scientific-technological elite. Eisenhower said, “We must also be alert to the … danger that public policy could itself become the captive of a scientific-technological elite.”
The history of governments’ involvement in climate research shows just how prescient Eisenhower’s warning was.
When the United Nations Framework Convention on Climate Change was adopted by governments in 1992, the die was cast. The convention was established on the assumption that human carbon-dioxide emissions were causing temperatures to rise to what many said were dangerous levels. The UNFCCC was established to determine ways to limit temperature increases. When the Intergovernmental Panel on Climate Change (IPCC) was formed, it was charged with understanding the human causes of climate change, not with determining the cause of warming. As the saying goes, “If you have a hammer, every problem looks like a nail,” and so it was with the IPCC. The leaders of the panel are political appointees, and each Assessment Report and Synthesis Report issued by the IPCC on the state of climate science is vetted, altered and approved by member governments.
Although the scientists working on the various IPCC reports generally do good work, when their findings conflict with the panel’s dogma about humans causing dangerous global warming, the findings are ignored, downplayed or, in the summary reports, even altered.
For instance, one section of Climate Change 1995: The Science of Climate Change: Contribution of Working Group I to the Second Assessment Report of the Intergovernmental Panel on Climate Change originally stated, “While some of the pattern-base studies discussed here have claimed detection of a significant climate change, no study to date has positively attributed all or part [of the climate change observed] to [man-made] causes.” Political intervention led to the statement being altered to read, “The body of statistical evidence in Chapter 8, when examined in the context of our physical understanding of the climate system, now points to a discernible human influence on the global climate.”
Scientists are charged with studying the human causes of climate change, but the IPCC’s “physical understanding of the climate system” is rather limited. Its own reports admit it has “low” understanding of 75 percent of the factors impacting climate change, although this hasn’t stopped the panel from expressing a high degree of confidence that human greenhouse gas emissions drive climate change.
More recently, political leaders at the IPCC have honestly admitted the push to limit carbon-dioxide emissions is not about protecting human health or the environment; it’s about giving governments control over the world’s economy. In February 2015, Christiana Figueres, executive secretary of the U.N. Framework Convention on Climate Change, said, “This is probably the most difficult task we have ever given ourselves, which is to intentionally transform the economic development model for the first time in human history.”
Cutting carbon-dioxide emissions by 80 percent below 2005 levels, as demanded by the convention, would bring per capita carbon-dioxide emissions down to levels not seen since the 19th century, because renewables can’t replace fossil fuels and carbon-dioxide emissions can’t be captured and sequestered underground for thousands of years. That means returning emission levels to a time before cars, trucks, airplanes, computers, cell phones, refrigerators, air conditioners, heating, electric lighting, electric tools, nighttime sporting events and concerts, and the long list of other modern technologies that make life longer, healthier and more fulfilling. In short, it means forgoing the vast majority of the technological innovations that have made Western societies wealthy.
The use of coal, oil, gasoline and natural gas makes modern life possible. Where fossil fuels are in regular use, people are wealthy, and where they are not used, poverty, disease and hunger are rife. Repudiating the demands of the governments behind the U.N. convention will allow fossil fuels to improve the lives of billions of people by providing low-cost energy for centuries to come.
• Donn Dears is a retired senior executive at General Electric. H. Sterling Burnett is a research fellow on energy and the environment at the Heartland Institute.
A recent USA Today/Rock the Vote survey of millennials shows 80 percent of millennials support transitioning to “mostly clean” or renewable energy by 2030. Although their hearts may be in the right place, few millennials appear to realize how much energy their lifestyle actually consumes, where this energy comes from, and how much it would cost to transition to a nation that’s powered predominantly by renewables by 2030.
As a millennial myself, I’m quite familiar with this phenomenon. Many of my peers don’t understand electricity doesn’t just come from the wall; e-mail isn’t necessarily green because it isn’t printed on paper; and a lifestyle that revolves around binge-watching Netflix has a real impact on the environment.
One environmental group estimated U.S. data centers consumed 91 billion kilowatt-hours of electricity in 2013, the same as the annual output of 34 large (500-megawatt) coal-fired power plants, and projects these data centers will consume the equivalent of 50 coal-fired power plants’ output by 2030.
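As a rough sanity check, the stated equivalence works out if one assumes a coal-plant capacity factor of roughly 61 percent; that capacity factor is our assumption for illustration, not a figure given in the report:

```python
# Rough sanity check: how many 500-MW coal plants does 91 billion kWh
# of annual consumption correspond to? The 61% capacity factor is an
# assumed value, not one cited in the report above.
PLANT_MW = 500
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.61  # assumed typical for a coal-fired plant

annual_output_kwh = PLANT_MW * 1_000 * HOURS_PER_YEAR * CAPACITY_FACTOR
data_center_kwh = 91e9  # 2013 estimate cited in the text

plants_needed = data_center_kwh / annual_output_kwh
print(round(plants_needed))  # 34, matching the figure cited
```

At a higher assumed capacity factor the plant count drops, so the comparison is sensitive to that one assumption.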
It’s ironic the generation that will consume more energy in their lifetimes than any before them, one that uses energy-gobbling technology for virtually every aspect of their lives—including dating apps, social media, finding a taxi, and even ordering from Taco Bell—can be so oblivious to how much energy they consume and where it comes from.
Most of the millennials I’ve spoken to drastically overestimate the amount of energy generated from wind and solar power in the United States. I am often met with incredulous looks when I explain the United States generates only about 2 percent of its total energy consumption from wind and solar combined and that these two sources of power produce less energy for the nation than burning wood.
Just four sources of energy account for 89.5 percent of the total energy produced in the United States. Thirty-five percent comes from oil, 28 percent from natural gas, 18 percent from coal, and 8.5 percent from nuclear.
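A quick check confirms the four shares cited above do sum to the stated combined total:

```python
# U.S. energy production shares (percent) as cited in the text.
shares = {"oil": 35.0, "natural gas": 28.0, "coal": 18.0, "nuclear": 8.5}

combined = sum(shares.values())
print(combined)  # 89.5, the stated combined share
```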
These forms of energy dominate the mix because they are the most affordable sources and because renewables simply aren’t ready to be used as the country’s primary power sources. Wind and solar are unreliable; they generate energy only when the wind blows or the sun shines, and we have no way of storing this energy. Think of an electric car with no battery, and you will have an idea of why our power system can’t rely on renewables.
For these reasons, the U.S. Energy Information Administration, a division of the U.S. Department of Energy, estimates the world will still generate approximately 80 percent of its total energy from fossil fuels in 2040.
Germany and some other nations have aggressively pursued renewable energy, and they are paying a big price for it. Consumer electricity prices in Germany are approximately three times as high as prices in the United States, and wind and solar constitute only about 8.9 percent and 5.7 percent of Germany’s electricity generation, respectively.
Although renewables are unlikely to become staples for energy generation anytime soon, it’s not surprising millennials would want to transition to an economy powered mostly by clean or renewable energy; many of us grew up with our teachers telling us the world would soon run out of fossil fuels and we had to prepare for a switch to renewable energy. Those predictions were completely wrong. Hydraulic fracturing, also known as fracking, virtually guarantees decades, if not centuries, of oil and natural gas, and it has made theories of “peak oil” a thing of the past.
Surveys and polls are very susceptible to how the questions are worded. When questions offer people a presumed benefit, without discussing the costs or consequences of the policy in question, results are overwhelmingly positive. This was likely the case with this survey. If provided with all of the information about renewables, including their disadvantages, millennials would likely be less enthusiastic about relying so heavily on renewable energy.
By Nancy Thorner & Bonnie O’Neil –
Under President Obama’s leadership, America has witnessed unusual and unexpectedly serious clashes between citizens and authority. Tension between Blacks and the police has escalated, creating a racial divide most Americans believed had been healing. It seemed particularly odd that racial tensions would increase rather than decrease after the country elected its first Black President.
The Ferguson situation in August 2014 was not the first indication of a growing divide, but it was certainly the most publicized. Media crews captured the shocking scenes of chaos: mass looting, buildings burning, and riot police trying to control mobs of angry protestors, some of whom had weapons they used against the police. Predictably, the media gave the issue enormous coverage, providing the world with a front-row seat to the mayhem.
The end result was a tragedy: a once-decent town was left trashed, and the majority of its good citizens were left wondering why and how the disaster happened, because they, too, became victims. Both White and Black people lost businesses, jobs, and income. One could only speculate how one very unfortunate incident, which turned out to be far from what it first appeared, escalated into such a tragic conclusion. Questions and accusations emerged as to why and who had stirred up what appeared to be organized protestors with weapons that were used to harm the police.
Not long after Ferguson, racial tensions and conflicts increased in other large cities and were again highlighted by the media. Black leaders continued to make claims of police injustices. Protestors in various pockets of the country marched the streets chanting slogans: “No justice! No peace! No racist police!” and “Black Lives Matter.” The crowds became increasingly dangerous when protestors chanted: “What do we want? Dead cops. When do we want it? Now.” That was followed by a Black man executing two NYPD officers while they were sitting in their patrol car. Tension between Blacks and the police continues today in many cities throughout America.
The problem is not exclusive to ethnic concerns. Consider what happened in April 2014, when a serious clash occurred between ranchers and federal government agencies in the western part of our country. In what reminded Americans of old western movie scenarios, a serious standoff took place between armed ranchers and law enforcement. The problem was the result of a legal dispute between the United States Bureau of Land Management (BLM) and cattle rancher Cliven Bundy. Bundy was not the only rancher who resented decisions and contract changes by the BLM that negatively impacted ranchers, but he was the first to take a visibly bold stand against the government’s intrusion into his ability to use his land, which seriously impacted his family’s life. Once again, a situation between a group of citizens and authorities received national attention.
To best understand the conflict, it is necessary to know a little about the history and geography of the Great Basin, as well as the people who live and work there. It is a high-desert region of 200,000 square miles between the Sierra Nevada in California and the greater Rocky Mountains in Utah, running from Southern Oregon all the way to Northern Baja, Mexico. The area is sparsely populated because water is fickle and scarce there, rendering the landscape fragile. Ranchers are among the very few who find the area tolerable, and for over a century they have chosen to make it home and raise their families there.
Sustainable Land Management Fuels Conflict
In more recent years, the ranchers and authorities have been engaged in a series of disputes over the land and new requirements, due in part to the input of environmentalists who have become more engaged in protecting the area. The recent conflicts likely have roots in what the U.N. calls Sustainable Land Management. The World Bank describes sustainable land management as a process operating in a charged environment between environmental protection and the demand for ecosystem services, one that must also sustain the productivity of agriculture and forestry in the face of demographic growth and increasing pressure on land use.
Sustainable land management is the process by which the resources of land are put to good effect. It covers all activities concerned with the management of land as a resource, from both an environmental and an economic perspective. This article will focus on sustainable land management within the charged environment that presently exists among ranchers, environmentalists, and the government. All three have varying interests regarding the management of the land, and their inability to compromise has resulted in tense conflicts. The American government and the United Nations appear to have more sympathy with the environmentalists, because the environmentalists’ interests regarding land use are more similar to their own than the ranchers’ are.
Ranchers have had continuing issues with the Bureau of Land Management (BLM) for decades. Numerous ranchers have claimed the BLM has been taking over private lands and is using unscrupulous methods to do so, but not until recently have those complaints been widely known or garnered national news coverage. A man by the name of Cliven Bundy changed all that.
BLM vs. Bundy in 2014
The media seems to prefer, and therefore covers, stories with an emotional basis. That is understandable, because personal-interest stories sell newspapers. However, the media does not always provide all the facts, some of which can alter judgments as to who is in the right or wrong. That is largely the case when dealing with the government and the general public. The media can be biased, and it can have its own reasons for highlighting one argument over another. In the case of the ranchers, it did a relatively poor job of explaining the history of the area and what forced Cliven Bundy and his son to make such a bold move, defying the government and all its massive power and thus risking the possibility they could be killed. Many watching the story were unaware that the ranchers’ property line runs along unoccupied federal land, without any discernible fences marking where private land ends and federal land begins. Little or nothing was mentioned about the continuing disagreements between the BLM and most ranchers in the area, or about whether there was any substantial evidence Bundy had deliberately allowed cattle to graze beyond what the government considers the ranchers’ borders. Was the public informed that environmentalists disliked what they perceived as ranchers violating the land, and that the environmentalists had the ear of government officials?
The media likewise barely mentioned that ranchers believed the federal agency had been purposely targeting them for decades with a myriad of questionable actions, perceived as a way to pressure them into selling their land and/or leases to the government. For instance, according to ranchers such as Brian Cunningham, the BLM ignored the original grazing permits each rancher had signed. Starting in the 1990s, the BLM began making incremental changes to what was permissible on the property, each one reducing the rights originally granted to the ranchers and thus hindering them from making a decent living.
According to Mr. Cunningham, the televised standoff between Bundy and the BLM was the result of complete exasperation over continued negative government intervention in the ranchers’ affairs. One can only imagine the jubilation Bundy experienced when other ranchers from all over the region joined him in his stand to keep his property. That they came armed and ready to defend Bundy sent a message to the BLM and all watching: this was not just about one man and his family; it was about what the BLM had done and was continually doing to them all. With television crews in place filming the dramatic scene, the federal deputies prudently backed off and a disaster was averted.
That one battle forced other ranchers to realize that, while their options for fighting the government were few and weak, their only chance to keep their land was to draw attention to their plight in the ongoing conflict among the federal government, longtime ranchers, and the relative newcomers to the dispute, the environmentalists.
On January 2, 2016, the stage was set for more national attention to be directed toward the ranchers’ dilemma when another ongoing problem between ranchers and the government developed. This specific problem involved father and son ranchers Dwight and Steven Hammond.
The article to follow will provide information about the ranchers’ 41-day standoff at the Malheur National Wildlife Refuge and the resulting tragedy. Unlike the Ferguson story, which was televised day and night for weeks, the perceived problems encountered by the ranchers when facing the law received far less attention. Note: The four remaining occupiers of the Malheur National Wildlife Refuge surrendered yesterday morning, Thursday, February 11, 2016, bringing an end to the standoff on its 41st day.
Article 2 will explain facts few individuals have seen in the national media about the ranchers’ last standoff. Unlike the Black vs. Police story of defying authority, the ranchers vs. the BLM tells quite a different story.
Supporters of education reform who advocate for government-funded choice mechanisms, such as vouchers, tend to argue the problems in K–12 schools in the United States are primarily economic matters, not pedagogical. This view is validated by much data, but the concept ought to be extended further to say the economic marketplace in which K–12 education operates needs more than vouchers to become as efficient as it needs to be to deliver a quality education to each and every child.
I recently reviewed trends in the performance levels of private and public schools, as reported by The Nation’s Report Card (NRC)—a congressionally mandated project administered by the National Center for Education Statistics, within the Institute of Education Sciences at the U.S. Department of Education—and found a modest but significant correlation between student achievement and the level of competition created by the availability of school choice in the form of vouchers and/or charter schools.
Where choice exists, student performance levels are improving faster than where it is absent, but the pace of student proficiency gains has been quite slow. This indicates many decades will be required for these schools to reach proficient academic performance levels. That’s not going to be good enough, so the United States must seek additional ways to energize the K–12 marketplace. An important missing ingredient is accurate consumer information that would enable parents and others to make wise choices in the selection of schools and other educational services.
Currently, most parents and other stakeholders operate in a sea of misinformation about school performance levels and other school characteristics. Public schools in every state I reviewed were found to have lied routinely and pervasively about student proficiency levels; typically, states deem twice as many students proficient as the NRC does. Proficiency numbers are not usually available at the school or district level, leaving parents and others in the dark as to the performance levels of their local schools.
Private schools, in contrast, tend to hide behind their unearned reputation of being superior to their public school neighbors. It is rare to find a private school that publicizes its student performance levels, but the Nation’s Report Card tells us something about the national comparisons of public to private schools. When judged by how schools educate the same economically disadvantaged children, the surprising results revealed by NRC are public schools and private schools are tied in mathematics, with each only educating about 20 percent of 8th grade students to proficient levels. For reading, private schools are somewhat more effective.
I believe getting honest school performance information into the hands of parents will energize K–12 school reform and bring about the desired results. When parents are informed consumers, they will make better choices, and this will help invigorate the K–12 marketplace so the actual reforms will be nearly automatic. When this “informational choice” is combined with the power provided by vouchers, parents and other stakeholders will know how to hold schools accountable for poor results.
One significant problem is parents are not actively seeking such information about schools. Many parents are complacent and believe the propaganda school officials tell them. This suggests a need for additional remedies that will induce parents to want valid information about their local schools. Parents (and taxpayers) would surely be alarmed at the degradation of their schools if they knew the truth. There are several ways to induce them to seek out this information. One is to point out the many scandals that have occurred in K–12 education systems across the country. The education system is rife with conflicts of interest, corruption, and a lack of accountability. Reformers should identify these problems and publicize them. Bad schools can be sued, and the notoriety of the lawsuits can garner attention.
On the positive side, schools can advertise using honest and sobering statistics. Those who homeschool can play a role by providing information to their neighbors, on the Internet, and through the homeschooling grapevine. They can encourage other parents to be part-time homeschoolers by, at minimum, having their children tested independently of any school. Knowledge of those test results can spur competition as “word gets around.”
Reformers can publicize new methods and best practices, such as online self-paced instruction, in print and digital forums, spreading new educational developments to every part of the globe.
The beauty of informational choice is that it doesn’t cost much. Private organizations can do it. Responsible operators of schools and other educational services also have an interest in providing this valuable consumer information through the use of honest and aggressive marketing.
The only question that remains: Who will step forward to get this started?
When the stock market had its epic 2008 collapse, millions of Americans angrily questioned how something so outrageous could occur without the media giving it much notice. Yes, there had been some wise prognosticators prior to 2008 pointing out the existence of the housing market bubble, and free-market advocates have been arguing for decades the government’s heavy involvement in the mortgage market poses a significant risk to the market as a whole, but few in the media paid much attention. Then the crash happened.
In many ways, the current student loan debt crisis is very similar. American college graduates and students currently have over $1.2 trillion in outstanding debt, and every second, according to MarketWatch, “the outstanding balance of the nation’s student loans is growing by an estimated $2,726.27.” There are many factors that have contributed to the rising amount of student debt, not the least of which is the government’s increased involvement in the student loan market. Under President Obama’s administration, student loan debt and tuition rates have skyrocketed.
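Annualized, MarketWatch’s per-second figure implies the nation’s outstanding balance grows by roughly $86 billion a year; this is a straightforward extrapolation of the cited number, not a figure from the article:

```python
# Annualize MarketWatch's per-second student-loan growth figure.
PER_SECOND = 2726.27  # dollars per second, as cited in the text
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

annual_growth = PER_SECOND * SECONDS_PER_YEAR
print(f"${annual_growth / 1e9:.0f} billion per year")
```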
None of this, however, changes the importance of getting a college degree in today’s market. While it’s certainly possible to be successful without a college degree, the reality is many employers won’t hire an applicant for a quality job unless he or she has a college degree. What can prospective students do then? Well, one alternative to the traditional education model is to pursue an online degree. Online degrees are now being offered by an increasing number of quality educational institutions, and you simply cannot beat the value in terms of dollars and cents. They aren’t for everyone, to be sure, but they are an excellent option for lots of students who have no trouble learning online, who have families at home, or who are working through their college years.
To help students and prospective students find accurate information about online college degrees, SR Education Group created OnlineU.org, which recently released its updated list of the “2016 Most Affordable Online Colleges & Degrees,” available here: http://www.onlineu.org/most-affordable-colleges
If you or someone you know is interested in getting or finishing a degree online, I’d consider taking a look at that well-researched site. Also, check out OnlineU.org’s “Top Online Colleges” list to see online programs offered by Stanford University, Columbia (an Ivy League college), and other great schools.
Online education is an excellent way to obtain a college degree without the incredibly high amounts of debt that come with the traditional model. As more top universities recognize the potential of offering an online degree, this market will expand, providing greater access to an affordable education for millions of Americans across the country.
NOTE: This author has not received any compensation from OnlineU.org or any online college or university. The author has no financial connection to any of the parties mentioned in the article.
The sudden death of Supreme Court Justice Antonin Scalia immediately sparked discussion of whom President Obama will appoint to take his place, and whether the U.S. Senate should take up the nomination in an election year. A look back at Obama’s views on Supreme Court nominees seems appropriate.
On January 26, 2006, the junior senator from Illinois took to the floor of the United States Senate to explain why he was voting “no” to confirm Samuel Alito as an associate justice on the U.S. Supreme Court.
Young Sen. Obama’s reasoning was sound in a few sections of his quick, six-minute speech. I was especially struck by this bit about restraining executive power and respecting the Constitution’s checks and balances, views an older President Obama would certainly find to be nonsense.
When it comes to how checks and balances in our system are supposed to operate, the balance of power between the executive branch, Congress, and the judiciary, Judge Alito consistently sides with the notion that a president should not be constrained by either Congressional acts, or the check of the judiciary. He believes in the over-arching power of the president to engage in whatever policies the president deems to be appropriate.
I’m glad that, as usual, Obama was wrong — even the younger version. Alito was among the five justices (including the late Scalia) who stopped Obama’s illegal Clean Power Plan because (presumably) Alito does not believe “in the over-arching power of the president to engage in whatever policies the president deems to be appropriate.”
This bit from Young Obama also made sense — even though he was wrong to worry about Alito not being inclined to ensure that the president should be “constrained by the Constitution and our laws.”
In all of these cases, we believe that the president deserves our respect as commander-in-chief, but we also want to make sure that the president is bound by the law — that he remains accountable to the people who put him there, that we respect the office, and not just the man, and that office is bounded and constrained by our Constitution and our laws. And I don’t have confidence that Judge Alito shares that vision of our Constitution.
Full transcript below the video, which I clipped from C-SPAN2.
Transcript picked up after some customary pleasantries:
“… There are some who believe that the president, having won the election, should have complete authority to appoint his nominee and the Senate should only examine whether or not the justice is intellectually capable, and an all-around good guy. That once you get beyond intellect, and personal character, there should be no further question as to whether the judge should be confirmed.
“I disagree with this view. I believe firmly that the Constitution calls for the Senate to advise AND consent. I believe that it calls for meaningful advice and consent that includes an examination of a judge’s philosophy, ideology, and record. And when I examine the philosophy, ideology, and record of Samuel Alito, I am deeply troubled.
“I have no doubt that Judge Alito has the training and qualifications necessary to serve. As has been already stated, he has received the highest rating from the ABA, he is an intelligent man, and an accomplished jurist. There’s no indication that he is not a man of fine character. But when you look at his record, when it comes to his understanding of the Constitution, I found that in almost every case he consistently sides on behalf of the powerful against the powerless. On behalf of a strong government or corporation against upholding Americans’ individual rights and liberties.
“If there is a case involving an employer and an employee, and the Supreme Court has not given clear direction, Judge Alito will rule in favor of the employer. If there is a claim between prosecutors and defendants, if the Supreme Court has not provided a clear rule or decision, then he’ll rule in favor of the state. He’s rejected countless claims of employer discrimination, even refusing to give some plaintiffs a hearing for their case. He’s refused to hold corporations accountable numerous times for dumping toxic chemicals into water supplies, even against the decisions of the EPA. He’s overturned a jury verdict that found a company liable for being a monopoly when it had 90 percent market share in that industry at that time. It’s not just his decisions in individual cases that give me pause, though. It’s that decisions like these are the rule for Samuel Alito, rather than the exception.
“When it comes to how checks and balances in our system are supposed to operate, the balance of power between the executive branch, Congress, and the judiciary, Judge Alito consistently sides with the notion that a president should not be constrained by either Congressional acts, or the check of the judiciary. He believes in the over-arching power of the president to engage in whatever policies the president deems to be appropriate.
“As a consequence of this, I am extraordinarily worried about how Judge Alito might approach the numerous issues that are going to arise as a consequence of the challenges we face with terrorism. There are issues like wiretapping, monitoring of emails, other privacy concerns that we have seen surface over the last several months. The Supreme Court may be called to judge as to whether a president can label an individual US citizen an enemy combatant, and thereby lock them up without the benefit of trial or due process. There may be considerations with respect to how the president can prosecute the war in Iraq, and issues related to torture.
“In all of these cases, we believe that the president deserves our respect as commander-in-chief, but we also want to make sure that the president is bound by the law — that he remains accountable to the people who put him there, that we respect the office, and not just the man, and that office is bounded and constrained by our Constitution and our laws. And I don’t have confidence that Judge Alito shares that vision of our Constitution.
“In sum, I’ve seen an extraordinarily consistent attitude on the part of Judge Alito that does not, I believe, uphold the traditional role of the Supreme Court as a bastion of equality and justice for United States citizens. Should he be confirmed, I hope that he proves me wrong. I hope that he shows the independence that I think is absolutely necessary in order for us to protect and preserve our liberties and our freedoms as citizens. But at this juncture, based on a careful review of his record, I do not have that confidence, and for that reason I will vote no and would urge my colleagues to vote no on this confirmation.”
In testimony before Congress, John Christy, Alabama’s state climatologist and director of the Earth System Science Center at the University of Alabama in Huntsville, explained that the data used by the National Oceanic and Atmospheric Administration to proclaim record temperatures are biased in a number of ways. The ground-based data come from thermometers located near sources of artificial heat, including concrete and air conditioner exhausts, and the ocean data come from ship engine water intake valves. By contrast, Christy notes, satellite-derived temperatures offer global coverage and are not affected by the heat island effect. Christy noted climate models show 2.5 times as much warming as has been observed by satellites and weather balloons.
Because the satellite measurements challenge the narrative of a discernible human impact on climate, Christy noted, “[t]here have been several well-funded attacks on those of us who build and use such datasets and on the datasets themselves. … It is a bold strategy in my view to actively promote the output of theoretical climate models while attacking the multiple lines of evidence from observations. Note that none of the observational datasets are perfect and continued scrutiny is healthy, but when multiple, independent groups generate the datasets and then when the results for two completely independent systems (balloons and satellites) agree closely with each other and disagree with the model output, one is left scratching one’s head at the decision to launch an offensive against the data.”
Christy also pointed out actual observations show the frequency and intensity of extreme events are not increasing, disproving claims based on climate models.
An article by Senja Post published in the journal Public Understanding of Science examined the “ideals and practices” of German scientists as they communicated climate change research findings to the public. Post surveyed German climate scientists holding the position of full professor and actively engaged in climate research, finding “the more climate scientists are engaged with the media the less they intend to point out uncertainties about climate change and the more unambiguously they confirm the publicly held convictions that it is man-made, historically unique, dangerous and calculable.”
In addition, the more convinced scientists were that rising carbon dioxide levels are causing dangerous climate change, the more they worked with the media to spread that message. Post’s survey also revealed German climate scientists object to publishing results indicating climate change is happening more slowly than expected, which in her words “gives reason to assume that the German climate scientists are more inclined to communicate their results in public when they confirm rather than contradict that climate change is dramatic.”
A new study by the Rights and Resources Initiative shows implementation of the United Nations’ Paris climate agreement could displace up to 4.1 million people living in heavily inhabited forests and another 0.9 million who depend on such areas for their economic well-being. The agreement calls for those areas to be designated “ecologically protected” in Liberia and the Democratic Republic of Congo (DRC), two of the poorest countries on Earth. Western-backed programs to expand forests and limit their use, in order to reserve forests as carbon sinks dedicated to removing carbon dioxide from the atmosphere, could make millions homeless.
According to Andy White of the Rights and Resources Initiative, “Governments have targets to expand their protected areas, and now with new climate funding being available the risk is they will use this to expand in a way that doesn’t respect local rights. It could result in the displacement of millions of people.” Andrew Follett of the Daily Caller reports the DRC and Liberia, with the support of Western governments and environmental organizations, have already displaced millions of residents from their historic forest homelands. The DRC has removed 17 million people, almost a quarter of the country’s population, from existing protected areas.
Under new programs funded by Germany and environmental non-profit groups, the DRC is planning to set aside 12 to 15 percent of its forested land as ecologically protected areas. Liberia has committed to turning 30 percent of its forests into ecologically protected areas in exchange for $150 million in developmental aid from Norway.
Concerning the impact on people being removed from their forest homes, Follett quotes a Mbuti tribal leader in the DRC saying, “Our new masters … like the animals more than humans and do not mind that people suffer as long as the animals are happy.”
by Julie Kelly and Jeff Stier
In its endless attempt to turn the country into one giant Weight Watchers meeting, the Obama Administration tucked away a little-known provision in the health care law that authorizes the FDA to force certain businesses to post the calorie count for every menu item.
Lawmakers on both sides of the aisle are unhappy with the new proposed rules on menu labeling (five years in the making) and this week, Congress will consider H.R. 2017, the Common Sense Nutrition Disclosure Act, to scale back some of the FDA’s most onerous rules set to take effect later this year.
As two culinary experts who have had long careers in both cooking and writing about food, we agree this legislation can’t wait.
Last year, the FDA issued a set of inflexible rules that even suggests handcuffs and jail time for those restaurateurs who fail to comply. The proposed policy is a cassoulet of bureaucratic incursions – not to mention a violation of our right to eat in peace – that targets neighborhood restaurants, grocery stores and even vending machines. Convenience stores would be required to post calorie information wherever food is displayed, which could basically be every two feet if you think about how your local convenience store is configured.
Businesses would have to print multiple menus, with the FDA going so far as to classify advertisements as menus that need calorie counts. “In our business, we send out lots of advertising flyers, top boxes with flyers and put up posters in stores,” Domino’s executive vice president Lynn Liddle told a House committee last summer. “None of these were ever intended as menus.” The FDA won’t even allow a range of calories for items like pizza and sandwiches, creating an impossible demand that would be laughable if it didn’t turn your corner pizza joint owner into a felon (Domino’s Pizza offers 34 million ways to order a pizza).
And the costs are exorbitant: The White House’s Office of Management and Budget estimates the new law will cost $1 billion and require a staggering 14 million compliance hours which, in this administration’s parlance, makes it a job creator.
The FDA is largely following the Bloombergian approach to appetite control. In 2008, Mayor Bloomberg made New York City the first city in the nation to require restaurant chains with 15 or more stores to post the calorie count of all menu items. At the time, about one-third of the city’s restaurants fell under that category. The law was a flop: it failed to achieve the mayor’s top goal of reducing obesity rates in the Big Apple and did nothing to change the number of calories diners consumed. A survey released last year by NYU’s Langone Medical Center found no change in the eating habits of New Yorkers in the last six years: “Researchers found that the average number of calories bought by patrons at each sitting between January 2013 and June 2014 was statistically the same as those in a similar survey of 1,068 fast-food diners in 2008, when New York City initially imposed menu labeling. Diners were surveyed at major fast-food chains: McDonald’s, Burger King, KFC, and Wendy’s.”
The study actually found an uptick in calorie intake from 2008 to 2013-2014. Shortly after the policy was introduced in 2008, calories at labeled restaurants were 783 per meal; in 2013-14, the calorie range was 804-839 per meal.
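As a rough arithmetic check on the figures quoted above (this is illustrative back-of-the-envelope math, not the study’s own methodology), the reported shift from 783 calories to a range of 804–839 calories works out to an increase of roughly 3 to 7 percent:

```python
# Illustrative check of the NYU Langone survey figures quoted above.
# These numbers come from the article text; the percentages are our arithmetic.
baseline = 783                      # calories per meal shortly after 2008 labeling
later_low, later_high = 804, 839    # reported per-meal range for 2013-14

pct_low = (later_low - baseline) / baseline * 100
pct_high = (later_high - baseline) / baseline * 100
print(f"Increase: {pct_low:.1f}% to {pct_high:.1f}%")  # roughly 2.7% to 7.2%
```

Small as those percentages are, they run in the opposite direction from what the labeling policy promised.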
“Our study suggests that menu labeling, in particular at fast-food restaurants, will not on its own lead to any lasting reductions in calories consumed,” says study senior investigator Brian Elbel, PhD, an associate professor at NYU Langone’s Department of Public Health.
H.R. 2017, sponsored by Reps. Cathy McMorris Rodgers (R-WA) and Loretta Sanchez (D-CA) and scheduled for consideration in the House this week, would blunt the same costly yet futile Bloombergian plan for the country. The Common Sense Nutrition Disclosure Act decriminalizes mistakes in menu labeling; the sandwich shop employee won’t go to jail if he accidentally exceeds the stated calorie count with an extra piece of cheese. It would also exempt convenience and grocery stores from the law.
This could be a final rebuke to an administration that has tried in vain for seven years to stem obesity rates and make Americans “healthier.” From a well-intentioned “Let’s Move” campaign to harsher school lunch rules that will likely yield the opposite result of the stated goals, the Obama Administration’s food policies will have to be viewed as one of its biggest failures. And as more questions are raised about nutrition science – and we are now finding out that much of what we’ve been told by the federal government about how we should eat has been dead, flat wrong – the next administration would be well-advised to reverse course.
Julie Kelly is a cooking teacher and food policy writer. She has been published in the Wall Street Journal, National Review, Forbes, The Hill and Huffington Post. Jeff Stier is a senior fellow at the National Center for Public Policy Research in New York, NY; you can reach him on Twitter @JeffaStier.
If you don’t visit Somewhat Reasonable and the Heartlander digital magazine every day, you’re missing out on some of the best news and commentary on liberty and free markets you can find. But worry not, freedom lovers! The Heartland Weekly Email is here for you every Friday with a highlight show. Subscribe to the email today, and read this week’s edition below.

300 Scientists Request Explanation for Temperature Shenanigans
H. Sterling Burnett, Climate Change Weekly
In support of U.S. House Science Committee Chairman Lamar Smith’s investigation of the National Oceanic and Atmospheric Administration (NOAA), approximately 300 scientists, engineers, economists, and other experts sent a letter requesting an explanation for what appears to be manipulated temperature data. The letter challenges NOAA’s findings that eliminated the 18-year “pause” in global temperature increase and demands the agency reveal its data and methodology for public examination. READ MORE

Parents Lack Honest Data Necessary for School Choice Decisions
David V. Anderson, The Hill
For any market to work efficiently, there needs to be a free flow of information so consumers can make educated choices. This is a missing ingredient in America’s education system. Accurate consumer information would enable parents to make wise choices in the selection of schools and other educational services. Unfortunately, most parents today operate in a sea of misinformation about school performance levels and other characteristics. READ MORE

Supreme Court Blocks Obama’s Clean Power Plan
H. Sterling Burnett, The Heartlander
The U.S. Supreme Court did something remarkable on Tuesday: It respected the separation of powers and finally shouted “ENOUGH!” to the lawless rule of the Environmental Protection Agency. The Court stayed President Barack Obama’s “Clean Power Plan,” a radical, law-by-decree scheme that would put this nation’s enormously complex energy-delivery system into the hands of central planners in Washington, DC. READ MORE

Bonus Podcast: In The Tank (ep. 24) – Exploring the World of Think Tanks
Hosts Donny Kendal and John Nothdurft have tweaked the format of their weekly podcast to highlight the work of think tanks across the country. Segments now include “Better Know a Think Tank,” in which a think tank spokesman talks about his or her organization’s mission, and “Featured Work of the Week,” which highlights recent research from a think tank. Donny and John kick off the feature by talking about (you guessed it) The Heartland Institute, as well as The Heritage Foundation’s “World Economic Freedom Ranking.” LISTEN TO MORE

Never Lose a Debate with a Global Warming Alarmist!
The Heartland Institute’s newest book, Why Scientists Disagree About Global Warming, demolishes the most pernicious myth in the global warming debate: that “97% of scientists” believe mankind is the cause of a global warming catastrophe. Heartland President Joseph Bast, who edited the book, will discuss his findings and bid a fond farewell to one of the coauthors, Robert Carter, who passed away on January 19, at a free event at Heartland’s headquarters in Arlington Heights, Illinois, on March 9. Go to Amazon.com or the Heartland store [store.heartland.org] now and order a copy, or become a Heartland donor and get a free copy!

Ride-Sharing Saves Time, Money, and Lives
Jesse Hathaway, Real Clear Policy
The popular peer-to-peer ride-sharing service Uber enjoyed a recent victory in New York City’s “war on ride-sharing.” An impact study released by Mayor Bill de Blasio showed Uber had not contributed significantly to the traffic congestion that has worsened recently in the Big Apple. Perhaps this will convince lawmakers to give up their efforts to hold back the wave of the future, and instead reap the benefits of the refreshingly entrepreneurial sharing economy. READ MORE

The Politics Behind the Anti-Fossil Fuels Campaign
Donn Dears and H. Sterling Burnett, Washington Times
Political leaders at the United Nations’ Intergovernmental Panel on Climate Change last year admitted their push to limit carbon dioxide emissions is not about protecting human health or the environment: It’s about giving governments control over the world’s economy. Said Christiana Figueres, executive secretary of the U.N. Framework Convention on Climate Change: “This is probably the most difficult task we have ever given ourselves, which is to intentionally transform the economic development model for the first time in human history.” This moment of accidental honesty must not be forgotten. READ MORE

NYC Public Schools Try to Keep Vouchers to Themselves
Joy Pullmann, School Choice Weekly
A little-known fact about public schools is that they have the ability to contract out special-needs children to private schools or tutors. A recent federal investigation found 83 percent of public schools are not “fully accessible” to children with disabilities. The decision to contract these students out lies only with public schools themselves. Parents who want to put their child in a different school do not have that right. READ MORE

Medical School Costs Keep Young Doctors Out of Primary Care
Justin Haskins, Consumer Power Report
Health care reform efforts over the past two decades have focused primarily on creating ways to help more Americans obtain health insurance. But little has been done to address the rapid increases in medical school costs plaguing hundreds of thousands of new doctors. Unless those costs come down – via market forces that are currently crowded out by government loans – America is going to see an accelerating drop in vital primary care physicians. READ MORE

Featured Podcast: Kyle Maichle: Free Speech on College Campuses
The hysteria at Yale University over “offensive” Halloween costumes supposedly threatening “safe” and “comfortable” spaces for students put free speech on campus into the national spotlight. Kyle Maichle, project manager for constitutional reform at The Heartland Institute, joins host Donald Kendal in a conversation about these basic free speech issues. Maichle discusses the history of speech codes on campus and gives an update on some of the more recent threats to freedom of expression at our institutions of higher learning. LISTEN TO MORE

Surgeon Posts Prices Online to Improve Transparency, Competition
Tony Corvo, The Heartlander
One of the major contributing factors to the increasing costs of health care is the failure of hospitals and physicians to tell consumers the cost of procedures and treatments. One surgeon in Oklahoma City has found a simple way to help: He tells his patients the prices. His clinic’s website gives people the opportunity to research prices in advance and choose between the Surgery Center and other providers. How revolutionary! “I’m a free-market guy,” said Dr. Keith Smith. “We decided to put our money where our mouth was and post the prices online for everyone to see.” READ MORE

Invest in the Future of Freedom!
Are you considering 2015 gifts to your favorite charities? We hope The Heartland Institute is on your list. Preserving and expanding individual freedom is the surest way to advance many good and noble objectives, from feeding and clothing the poor to encouraging excellence and great achievement. Making charitable gifts to nonprofit organizations dedicated to individual freedom is the most highly leveraged investment a philanthropist can make. Click here to make a contribution online, or mail your gift to The Heartland Institute, One South Wacker Drive, Suite 2740, Chicago, IL 60606. To request a FREE wills guide or to get more information to plan your future please visit My Gift Legacy http://legacy.heartland.org/ or contact Gwen Carver at 312/377-4000 or by email at email@example.com.
With less than a year left in office, President Barack Obama is upping the pressure on America’s fossil fuel industries with a slew of new regulations and tax proposals.
Last month, Obama ordered a moratorium on new coal leasing on federal lands. Unlike oil and gas production, which occurs primarily on private land, 40 percent of U.S. coal mining takes place on federal lands.
Coal use has been declining in power generation for several years, in part because of abundant natural gas.
But coal still accounts for 39 percent of our electricity, remains the cheapest way to generate power, and is a major export commodity.
Over time, the moratorium will drive up electricity costs for households and businesses, destroy jobs, diminish federal, state and local tax revenues and widen America’s trade deficit.
The administration also wants to put further limits on offshore drilling for oil and gas.
Under the 2012-2017 lease program, no sales were permitted on the Pacific Coast, the Atlantic Coast, the eastern third of the Gulf of Mexico and much of Alaska.
Then last October, the Interior Department canceled two planned lease sales for Arctic drilling rights and denied two companies’ requests for lease extensions.
Now the Bureau of Ocean Energy Management is putting together its 2017-2022 program for offshore lease sales.
A first draft of the plan, released last year, would allow offshore drilling on the Atlantic outer continental shelf, parts of the Gulf of Mexico and limited tracts of the Arctic Ocean north of Alaska.
But environmental groups are pushing for a total moratorium on offshore drilling. More than 100 East Coast communities have passed anti-drilling resolutions, while 100 members of Congress, as well as 650 state and local elected officials, have expressed opposition to Atlantic drilling.
In today’s low-price environment, it might seem counterintuitive that investors would want to bid on leases in fields that have no production history and limited seismic testing. But they’re thinking long term — perhaps 10 to 20 years out.
Should the final 2017-2022 lease plan be modified to prohibit drilling in the Atlantic, not only will imports increase but communities along the East Coast will be forfeiting high-wage jobs, income and tax revenue.
In its most recent attack on hydrocarbons, the Obama administration is proposing a $10-per-barrel tax on oil companies as part of its fiscal 2017 budget.
Ostensibly, the tax — which is equivalent to 30 percent at today’s prices — would help finance the Highway Trust Fund while providing money to invest in “sustainable transportation programs.”
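The “30 percent” equivalence implies a crude-oil price of roughly $33 per barrel, consistent with prices when this was written. A quick sketch of the arithmetic (the $33 price is our assumption for illustration, not a figure from the budget proposal):

```python
# Illustrative arithmetic for the proposed $10-per-barrel oil tax.
# The $33/barrel price is an assumed early-2016 figure, not from the proposal.
tax_per_barrel = 10.0
assumed_price = 33.0   # dollars per barrel (assumption)

effective_rate = tax_per_barrel / assumed_price * 100
print(f"Effective tax rate: {effective_rate:.0f}%")  # about 30%
```

Note that because the tax is a fixed dollar amount, the effective rate rises as oil prices fall, hitting producers hardest exactly when they are least able to absorb it.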
In practice, it would further erode the earnings of an industry going through its greatest financial crisis in 25 years.
Oil imports would rise, because foreign producers would not be subject to the tax.
Today, with the world drowning in oil, and gasoline and diesel prices at their lowest levels in 15 years, the siren song of “keep it in the ground” (or keep it under the ocean floor) may be alluring.
But the American economy runs on energy, and oil, gas and coal will be providing the lion’s share of transportation and power generation fuel for decades to come.
It simply makes more economic and political sense to develop our own hydrocarbon resources, in an environmentally responsible manner, rather than increase our dependency on imported energy.

[Originally published at the Star-Telegram]
Net neutrality absolutists are overreaching yet again in their push for a practical FCC ban of ISP zero-rating offers under the FCC’s case-by-case “General Conduct Standard” review, claiming violations of the “bright-line rules” in the FCC’s 2015 Open Internet Order against blocking, throttling and paid prioritization.
The problem here is that net neutrality absolutists, in exploiting the political pejorative power of the word ‘discrimination,’ have politically oversold their Title II net neutrality policy as “bright-line” ‘non-discrimination’ bans, implying no discrimination allowed, when Title II actually only bans “unjust and unreasonable discrimination.”
This is a distinction with a huge difference, and it apparently is giving the net neutrality absolutists fits. They want to imagine that Title II codifies their absolutist ‘no discrimination’ frame when it clearly does not.
They want to find a technical “gotcha” in every zero-rating or sponsored-data offering, no matter how unreasonable their conclusion, so they can ask the FCC to ban it under their concept of what a ‘no discrimination’ principle should be.
Consider T-Mobile’s Binge On: Stanford Professor Van Schewick charges that it violates net neutrality because it favors commercial content over Google-YouTube’s user-generated videos. How is it unreasonably discriminatory when almost every other company can interface with T-Mobile’s non-discriminatory Binge On interface, but the world’s most dominant and most technically advanced digital video distributor, Google-YouTube, somehow can’t figure out what everyone else has figured out?
EFF also tested T-Mobile’s Binge On offering and charges that it violates net neutrality because it throttles everyone the same way to provide more data for less cost. How is that unjust and unreasonable network management?
With the news that Verizon’s Go90 video service is now using Verizon’s FreeBee Data 360 offering, some are insinuating sponsored data plans like this are somehow a net neutrality violation when offered as an “open, non-exclusive service available to other content providers on a non-discriminatory basis.” How is it unreasonably discriminatory when any content provider can enjoy the same thing Go90 does?
The problem here is that the net neutrality absolutists demanded the FCC adopt their politically correct, “bright-line” ‘no discrimination’ version of net neutrality, and not what Title II allows under the law and decades of FCC and court precedents.
The FCC’s Open Internet Order states: “Clear, Bright-Line Rules: Because the record overwhelmingly supports adopting rules and demonstrates that three specific practices invariably harm the open Internet—Blocking, Throttling, and Paid Prioritization—this Order bans each of them, applying the same rules to both fixed and mobile broadband Internet access service.” (Para 14) (Bold added for emphasis)
In promising not just net neutrality rules but “bright-line” (i.e., absolute) open Internet rules, and in claiming the three banned practices – blocking, throttling and paid prioritization – “invariably” (i.e., absolutely always) harm the Open Internet, the FCC has created a real problem for itself.
How can the FCC reasonably rule under its ‘General Conduct Standard’ that any action that blocks or throttles Internet traffic can never be reasonable network management, when the FCC itself requires ISPs to block robo-calls to users, and when ISPs are expected by the FCC to filter (i.e., block) viruses and malware and to throttle the traffic of edge-driven denial-of-service attacks that routinely assault ISP networks?
How can the FCC reasonably enforce absolute “bright-line” bans on blocking and throttling when the FCC Open Internet Order states: “The record broadly supports maintaining an exception for reasonable network management. We agree that a network management exception to the no-blocking rule, the no-throttling rule, and the no-unreasonable interference/disadvantage standard is necessary for broadband providers to optimize overall network performance and maintain a consistent quality experience for consumers while carrying a variety of traffic over their networks.” (para 215)
The FCC once again has got itself in a pickle in promising conflicting outcomes. On one hand it tells the public, industry and the reviewing court absolutely that blocking and throttling harm the Open Internet, but on the other hand it rules an exception is necessary for ‘reasonable network management.’
In conclusion, the reason FCC net neutrality policy is such a mess is that the absolutists are not really complaining about ISP “blocking, throttling, or paid prioritization.”
Their real beef is much less a Title II regulatory unreasonable discrimination problem, and much more akin to a lawsuit under antitrust law that alleges an ISP is a vertically-integrated monopoly that needs to be forever enjoined by a court from vertical integration.
And it is telling that the net neutrality absolutists effectively are trying to use Title II authority as antitrust law to preemptively ban a whole category of market participants from certain types of market behavior, with no investigation or finding of market power, and with no evidence that the targeted market participants actually did anything wrong, because no court would entertain such a baseless and senseless notion of antitrust law.
At bottom, net neutrality absolutists are demanding the FCC preemptively and absolutely ban market behaviors that no American law considers unreasonable on their face.
Scott Cleland served as Deputy U.S. Coordinator for International Communications & Information Policy in the George H. W. Bush Administration. He is President of Precursor LLC, a research consultancy for Fortune 500 companies, and Chairman of NetCompetition, a pro-competition e-forum supported by broadband interests.
Hosts Donny Kendal and John Nothdurft continue to explore the world of think tanks in episode #25 of the In The Tank Podcast. This weekly podcast features (as always) interviews, debates, roundtable discussions, stories, and light-hearted segments on a variety of topics on the latest news. The show is available for download as part of the Heartland Daily Podcast every Friday. Today’s podcast features work from the Georgia Public Policy Foundation, the Mercatus Center, and the Libertas Institute.
Better Know a Think Tank
In this “Better Know a Think Tank” segment, Donny and John talk to Benita Dodd, Vice President of the Georgia Public Policy Foundation. Benita talks about the background, history and mission of the Georgia-based think tank, and what they are currently working on.
Featured Work of the Week
Featured this week is a report by the Mercatus Center titled “The Proper Role of the FDA for the 21st Century.” The authors explain how the FDA drifted so far from its original mission, what its proper role should be, and how to get it back to that narrow role.
YouTube video – Freedom and the FDA: The Matt Bellina Story
In the World of Think Tankery
Today Donny and John discuss a bizarre law out of Utah called the Zion Curtain – a law that requires restaurants and bars to pour alcoholic drinks out of the view of patrons. Our friends at the Libertas Institute, a Utah-based libertarian think tank, brought this to our attention.
Here are a handful of upcoming events that you may be interested in attending.
Mercatus Center – Conversations with Tyler: A Conversation with Nate Silver (Tuesday, Feb 16) @ George Mason University
Georgia Public Policy Foundation – Leadership Breakfast: Georgia Criminal Justice Reform: Looking Ahead, Staying Ahead – (Wednesday, Feb 17) @ The Georgian Club, Atlanta
Libertas Institute – Annual Liberty Forum with keynote speakers Larry Reed & Matt Kibbe (Saturday, May 7)
Mackinac Center For Public Policy – Let Them Work: Solutions for Michigan’s Overbearing Occupational Licensing Laws (Wednesday, Feb 17) in Lansing, Michigan.
Heartland Daily Podcast – Marita Noon: Presidential Candidates’ Energy Positions, and The Truth About Renewable Subsidies
In today’s edition of The Heartland Daily Podcast, Marita Noon, executive director for Energy Makes America Great Inc. and the companion educational organization Citizens’ Alliance for Responsible Energy (CARE), joins Managing Editor of Environment & Climate News H. Sterling Burnett. Noon joins the podcast to discuss the presidential candidates’ positions on energy and other energy-related topics.
Together, Energy Makes America Great Inc. and CARE work to educate the public and influence policymakers on energy, its role in freedom, and the American way of life. Also in the podcast, Noon talks about some of the positive provisions in an energy bill presently being developed in the Senate, and the truth about renewables and energy subsidies.
According to the United States Geological Survey, nearly half the land in the Western United States is owned by the federal government. This includes 84.9 percent of land in Nevada (hiding UFOs requires lots of space), 64.9 percent of Utah, 61.6 percent of Idaho, 61.2 percent of Alaska, 52.9 percent of Oregon, 48.1 percent of Wyoming, and 45.8 percent of California. Meanwhile, the federal government owns only about 5 percent of the land in states east of the Mississippi River. Altogether, Uncle Sam owns roughly 640 million acres of land.
In March 2012, Utah Gov. Gary Herbert (R) signed the Utah Transfer of Public Lands Act into law, which instructs the federal government to relinquish more than 20 million acres of land to the State of Utah. Although Utah has yet to bring forward a suit in an attempt to enforce the law, a move that is expected to bring strong opposition from the federal government, similar legislation is being considered in nine other Western states. These states argue that if the federal government turns over its property in the West to the states, the result will be better environmental stewardship of the land, lower management costs, and increased productivity.
Environmentalists support federal government land ownership in Western states because, they say, these lands contain the most biologically and environmentally valuable ecosystems in the nation, which need to be protected by federal officials from less environmentally concerned states. “If not for federal policies for public land management,” University of Wyoming professor Debra Donahue told the New York Times, “America would lack a world-class system of national parks, wildlife refuges and wilderness areas.” This is undeniably true; however, national parks, national monuments, wildlife refuges, and federal wilderness areas (FWAs), essentially the only parts of the West tourists ever lay their eyes on, would be excluded from any future land transfers.
Most of the land held by the U.S. Bureau of Land Management, excluding national parks, monuments, and FWAs, is the result of historical accident, not environmental concerns. During the Progressive and New Deal Eras, Congress created federal agencies to control Western lands under the belief central authorities would dispassionately apply science to determine the best use of natural resources. But as Montana State University professor of economics Holly Fretwell writes, “Science cannot determine whether hiking, biking or timber harvest is a higher-valued use. Instead, management decisions—regarding recreation use, commodity production or restoration activities—depend on budget appropriations and special interest battles.”
Fretwell says this leads to gross mismanagement of public lands, leaving Western communities at risk of wildfires, soil erosion, and other environmental problems that impose steep economic costs.
Without allowing market forces to have a greater say in how federal lands are used, Western states will continue to suffer economic and environmental disadvantages. A recent study from the Institute for Energy Research showed that, even in a time of stagnant demand, opening up federal lands to just oil, coal, and gas production would bring many benefits to the states, including $663 billion in increased GDP and 2.7 million jobs created over the next 30 years. The report also found a $3.9 trillion increase in federal tax revenues, and a $1.9 trillion increase in state tax revenues, over the next 37 years. (In addition to coal, oil, and natural gas, federal lands are also a major source of softwood timber, hard metals, and grazing areas.)
A study comparing state trust lands in Arizona, Idaho, Montana, and New Mexico to federal multiple-use lands from 2009–13 by the Property and Environment Research Center (PERC) found these states earned a combined $14.51 for every dollar spent managing state trust lands, while the federal government earned only 73 cents for every dollar it spent managing federal lands. On a per-acre basis, the states earned $34.60 while the feds lost $4.38.
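PERC’s comparison boils down to two simple ratios, sketched below in Python. The revenue, cost, and acreage inputs are hypothetical stand-ins for illustration only, since the study’s underlying totals are not quoted here.

```python
# Sketch of PERC's two metrics: dollars earned per dollar spent on
# management, and net earnings per acre. The inputs below are
# hypothetical; the study's actual revenue, cost, and acreage totals
# are not given in this article.
def management_returns(revenue, cost, acres):
    """Return (earned per dollar spent, net dollars per acre)."""
    per_dollar = revenue / cost
    per_acre = (revenue - cost) / acres
    return per_dollar, per_acre

# Hypothetical figures, for illustration only.
per_dollar, per_acre = management_returns(revenue=14_510_000.0,
                                          cost=1_000_000.0,
                                          acres=400_000.0)
print(f"${per_dollar:.2f} earned per dollar spent, net ${per_acre:.2f} per acre")
```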
PERC writes, “State trust management has demonstrated its ability to resist excessive political influence, respond to market signals, and accommodate new resources over time.… Managing these lands should provide a rich source of revenues to benefit the public, but is instead coming at a high cost to taxpayers.”
Not only would a transfer of Western federal lands to the states be better for the preservation of much of the land and the residents of the states, it would also be a better deal for taxpayers nationwide.
After rejecting Medicaid expansion for each of the last three years, Nebraska lawmakers are once again counting the cost.
And once again, the cost is high.
The Nebraska Department of Health and Human Services this week posted a report by Optumas, a private actuarial firm, analyzing the 10-year cost of LB1032, state Sen. John McCollister’s (R-Omaha) bill to expand Medicaid by adopting a Transitional Health Insurance Program.
Using Medicaid dollars, newly eligible enrollees would purchase qualified health plans (QHPs) through the state’s health insurance exchange. Under the Affordable Care Act, the federal government would pay 100 percent of new Medicaid expansion costs for the first three years, with the federal share declining to 90 percent thereafter.
But the Optumas study finds that even with federal assistance, the 10-year cost of Nebraska’s share for the proposed Medicaid expansion plan would reach almost $1 billion.
Here are the report’s 10-year projections:
“Total Spend”: $14,781,000,000
“Federal Share”: $13,803,000,000
“State Share”: $978,000,000
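A quick sanity check shows the three line items are internally consistent; the state share is taken here as $978 million, consistent with the “almost $1 billion” 10-year cost cited above.

```python
# Consistency check of the Optumas 10-year projections (in dollars).
# The state share is taken as $978 million, consistent with the
# "almost $1 billion" 10-year state cost cited in the article.
total_spend   = 14_781_000_000
federal_share = 13_803_000_000
state_share   =    978_000_000

# Federal and state shares should sum to total spend.
assert federal_share + state_share == total_spend

print(f"State share: ${state_share / 1e9:.3f} billion")
print(f"Federal portion of total spend: {federal_share / total_spend:.1%}")
```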
And that’s a “conservative estimate,” said Calder Lynch, DHHS Director of Medicaid and Long-Term Care, in a press release:
“Because there are many uncertainties, these costs could rise with a change of the federal matching rate or the under-collection of member contributions as required in the bill. I have serious concerns from both a fiscal and policy perspective.”
McCollister’s plan follows a Medicaid expansion model Arkansas adopted in 2014, which grossly underestimated the number of enrollees (300,000 actual vs. 215,000 projected), according to a study by the Platte Institute for Economic Research. In Arkansas’ Failed Medicaid Experiment: Not a Model for Nebraska, authors Jonathan Ingram and Nicholas Horton write that 41 percent of Arkansans are now on the state’s Medicaid rolls. They warn Nebraskans to heed their neighboring state’s cautionary tale:
This new approach to Medicaid expansion is unaffordable and unpredictable, pushes adults out of private insurance and into taxpayer-funded welfare, puts the truly needy on the chopping block, discourages work, and shrinks the economy. So it should be no surprise that, last year, Iowa policymakers scrapped the model entirely and Arkansas enacted legislation to repeal the expansion altogether at the end of 2016. Nebraska policymakers should learn from these mistakes, not repeat them.
Nebraska Governor Pete Ricketts (R) has made Arkansas’s experiment a pillar in his argument against McCollister’s plan.
The Heartland Institute’s Health Care News will cover Nebraska’s Medicaid deliberations in its next issue.
The U.S. Supreme Court (SCOTUS) did something remarkable on Tuesday: It momentarily respected the separation of powers and finally shouted “ENOUGH!” to the lawless rule of the Environmental Protection Agency. SCOTUS issued a stay on Obama’s “Clean Power Plan,” which is a radical law-by-decree scheme to do nothing less than put this nation’s enormously complex energy-delivery system into the hands of central planners in Washington.
It was Clinton advisor Paul Begala who once said: “Stroke of the pen. Law of the land. Kinda cool.” Not anymore … at least for now, in this case.
Here are the top three takeaways of this historic moment in SCOTUS history.
1. Ding, dong, the Clean Power Plan is dead.
With this stay, the rule is suspended until President Obama is out of office. On the fastest of tracks, SCOTUS will hear arguments this summer and issue a ruling in December (after the election) or in January (after a new president is inaugurated). Even if the EPA “Clean Power Plan” rule is upheld, the next Republican president will cancel it. And while it may be likely that a President Hillary Clinton would keep that rule in effect, I don’t think that’s a guarantee. Hillary would want to put her own stamp on a climate agenda, not merely rubber-stamp Obama’s. And if she has any hope of setting her own climate agenda, a Republican Congress will demand she start over on this front.
That said, it is not likely SCOTUS would stay the rule and then let it go back into effect. This extraordinary move is justified only if the Court thinks the plaintiffs, who want the Clean Power Plan nullified, are most likely to prevail.
2. The Paris Climate Agreement from COP-21 is now “all dead,” instead of “mostly dead.”
If you took the time to read the eco-left’s comments of woe back in December, you’d clearly see what a defeat COP-21 was for them. Oh, some leftist outfits made happy noises about how this will help “battle climate change,” and the MSM trumpeted the Paris Agreement as an “historic moment.” But the fact of the matter is this: The document that came out of COP-21 was a complete failure. It is a sham. The agreement is not a treaty. It is not legally binding on any nation. It has no enforcement mechanisms. And even what it promises to do — keep the global temperature from rising more than 1.5 degrees Celsius by 2100 (because 2.0 degrees would KILL US ALL!!!) — is just as likely to happen as not, no matter how much CO2 emissions grow or abate … and they will surely grow.
Global temperatures, measured by satellite, show no upward trend since the late 1990s — despite the fact that about one-third of all human CO2 emissions since the dawn of the Industrial Revolution happened in that time span. Many solar scientists have been noting for years that they’ve observed historically low sunspot activity and solar energy — a possible repeat of the so-called Maunder Minimum, which has been tied to periods of global cooling, such as the “Little Ice Age” that ended in the mid-1800s.
Anyway, back to Obama. He left Paris saying: It doesn’t matter that the Paris Agreement isn’t a treaty. It doesn’t matter that there are no enforcement mechanisms. I will instruct the EPA to essentially outlaw coal-fired power plants in the United States over the next decade. And because the EPA’s rule-making is almost never overturned — by either a court or a subsequent administration — this will be the “law of the land” in the United States. So let it be written! So let it be done!
Well, so much for the president’s will being law, at least in this case. Which brings us to …
3. SCOTUS has had enough of Obama Imperialism
It’s a little late, but SCOTUS has finally put its foot down. As Marita Noon, executive director of the Citizens Alliance for Responsible Energy, noted in a release from The Heartland Institute, this is the first time SCOTUS has stayed an EPA rule. Why? Because the EPA (and Obama) were so obviously and egregiously overstepping their authority — and their arrogance about it throughout the Obama presidency was probably their undoing in the eyes of the Court.
As the applicants for the stay noted in their brief: After SCOTUS ruled in 2015 that EPA was abusing its rule-making authority under the Clean Air Act, the EPA bragged on its own blog that the decision was moot. EPA knew that industry would be compelled to operate under the assumption that the rule would be upheld. To do otherwise was foolish — not only from a business standpoint (can’t be caught flat-footed compared to competitors), but a legal one (we’ll be liable if we don’t comply by the deadline). In other words, EPA spiked the football and said, explicitly, that SCOTUS doesn’t matter — even when it rules against it.
I want to note this bit from the plaintiff’s brief, joined by the attorneys general from 29 of our 50 states:
In short, EPA extracted ‘nearly $10 billion a year’ in compliance from power plants before this Court could even review the rule …
Where did power plants get that $10 billion? From you and me, the consumers of electricity. Obama’s Clean Power Plan is not a rule that punishes Big Energy corporations. It’s a rule that punishes you and me, the consumers of the energy we need to live. Remember that the next time you plug in your iPhone or hear your heat pump kick in.
Well, it appears that five justices decided they’ve had enough of EPA’s and Obama’s corruption of the law-making process. (BTW: It’s a scandal that the decision on the stay was not unanimous. Even the Court’s liberals should have proper respect for the separation of powers — for ego’s sake, if not the Constitution.)
A rule as sweeping, significant, and expensive to consumers as the Clean Power Plan must originate in Congress, duly pass, and be signed by the president. Or, if part of a treaty, it must be submitted to and approved by the Senate. I think we can now expect (or at least hope) that SCOTUS will finally uphold that basic constitutional principle. We should hope Justice Antonin Scalia gets to write the majority opinion, and not merely contribute a biting concurring opinion. This decision needs to really sting.
Not to get our hopes up, but this is shaping up to be a very rare judicial victory for the rule of law over the rule of bureaucrats and an imperial president. But we need many, many more to turn the ship of state even remotely toward Constitutional governance after eight years of Obama’s rule-by-decree.
[First published at Ricochet.]
In today’s edition of The Heartland Daily Podcast, Kyle Maichle, project manager for constitutional reform at The Heartland Institute joins Host Donald Kendal to talk about free speech issues on college campuses.
Since the controversy at Yale over appropriate Halloween costumes, free speech issues have littered the news. From free speech codes to safe spaces, this topic has gained national attention. Maichle comes on the podcast to give us background information and updates on these situations. Maichle also brings up the report recently released by the Foundation for Individual Rights in Education (FIRE) that ranks colleges based on their free speech restrictions.
The January-February 2016 issue of Audubon Magazine (Figure 1) proclaims “Arctic on the Edge: As global warming opens our most critical bird habitat, the world is closing in.” In reality, the magazine’s writers and editors have gone over the edge, with wildly misleading “reports” on the Arctic.
The magazine is awash in misstatements of fact and plain ignorance of history, science and culture. It epitomizes the false claims that characterize “news coverage” of “dangerous manmade climate change.” The following analysis corrects only some of the most serious errors, but it should raise red flags about nearly every claim Audubon makes.
The first part of this issue devotes pages to each of the countries surrounding the Arctic Ocean. The Finland page says “storms become more severe” with warming. The writers are either clueless or intentionally misleading; they likely never took Earth science or meteorology and are oblivious to atmospheric fluid dynamics.
The pole-to-equator temperature difference drives the strength of storms. If there actually is Arctic warming, that temperature difference declines, and storm strength becomes less severe – not more so.
The Norway page describes the Black-legged Kittiwake and speculates that warming in the Barents Sea attracts herring, which feed on kittiwake prey. The authors are clearly unaware that natural warming and cooling cycles have been occurring for centuries. On a map derived from the Norwegian Polar Institute’s examination of ship logs (Figure 2), a green dashed line depicting reduced Nordic Sea ice extent demonstrates extensive warming in the Barents Sea in 1769. During that particular warm period, ocean currents and weather conditions made Svalbard and even parts of Novaya Zemlya ice-free.
The Greenland page purports to show “Greenland Warming.” However, it was warmer than today during the Medieval Warm Period, and abundant new ice formed in Greenland during the past century. Enough snow and ice accumulated on the Greenland Ice Sheet that Glacier Girl, the P-38 airplane that landed there in 1942, was buried in 268 feet of ice before she was recovered in 1992. That’s 268 feet in 50 years, well over 5 feet of ice accumulation a year, much of it during a period when Earth was warming and Greenland was supposedly losing ice.
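The Glacier Girl figure implies an average accumulation rate that is easy to verify:

```python
# Average ice accumulation implied by the Glacier Girl recovery:
# 268 feet of ice over the 50 years between 1942 and 1992.
ice_depth_ft = 268
years = 1992 - 1942

rate = ice_depth_ft / years
print(f"Average accumulation: {rate:.2f} feet of ice per year")  # 5.36
```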
Audubon’s cover photo features a Russian oil rig amid an ice-covered Arctic Ocean. It is intended to instill fear, by suggesting that a once solidly icy Arctic is melting rapidly. However, history shows that the Nordic ice extent has been decreasing since at least the 1860s, and probably since the depth of the Little Ice Age, around 1690. In fact, historic data (Figure 3) indicate that multi-decadal variability of Nordic Sea ice extent (some 30-45% more or less ice during each cycle) has been occurring for over 150 years.
Toward the end of the January-February issue is an account of a visit to Wainwright, Alaska, an Inupiat village of about 556 natives, located on the Arctic Ocean in North Slope Borough. The Native Inupiat much prefer to maintain their subsistence culture, which has been their tradition since their ancestors settled nearby about 13,000 years ago.
The caption to the Audubon photograph of the village emphasizes rising ocean waters. However, most of Alaska has falling sea levels, the result of the isostatic adjustment of northern North America. This rebound effect began with the melting of the Wisconsin Ice Sheet, as Earth emerged from the Wisconsin Ice Age and entered the Holocene between 15,000 and 10,000 years ago. The nearest tide gauge to Wainwright is Prudhoe Bay, and sea level rise there is very small: 1.20 mm/year +/- 1.99 mm/year (up to 7.9 inches per century) – so small that sea levels might actually be falling there, as well, when margin of error is considered.
The Audubon writers mention “melting permafrost” numerous times, but when the Natives spoke about this in 1979, they clearly did not view it as a problem. In fact, in their own words, recorded in The Inupiat View, the Natives specifically say melt water is scarce in North Slope Borough. What has happened in the years since?
First, the North Slope has a summer: from early June until mid-September, air temperatures average above 32 degrees Fahrenheit. Wainwright’s extreme maximum once reached 80 degrees Fahrenheit! During summer the soil thaws, creating an “active layer.” The surface is not permanently frozen but thaws for part of every year. Whether there actually is problematic “melting permafrost,” as claimed by Audubon, can be determined only by finding the long-term trend in the thickness of the active layer.
Specialists studying this phenomenon publish reports in the Circumpolar Active Layer Monitoring Network, in NOAA’s annual Arctic Report Card, and elsewhere. The 2012 Report Card edition had an extensive section on permafrost. A quote from this edition pours freezing water on Audubon’s “melting permafrost” claim: “Active-layer thickness on the Alaskan North Slope and in the western Canadian Arctic was relatively stable during 1995-2011,” it notes.
The NOAA Arctic Reports do have a heavy dose of alarmist rhetoric, especially in the boilerplate introductory sections, but the actual measurements and data present nothing that supports the alarmist polemic of the day. The long-term pattern shows centuries-long slow warming, with multi-decadal fluctuations; significant or alarming anthropogenic trends are simply not there.
Audubon should stay away from areas where it has no expertise – specifically, imagined or invented catastrophic anthropogenic global warming. Audubon’s equivocal policy on wind power ostensibly calls on wind energy developers to consider planning, siting and operating wind farms to avoid bird carnage; the Society claims to support “strong enforcement” of laws protecting birds and wildlife. On the other hand, the same Audubon policy speaks about “species extinctions and other catastrophic effects of climate change” and “pollution from fossil fuels.”
When read together, this schizophrenic policy clearly puts Audubon on the side of climate alarmism – with the loss of birds and bats merely a small price to pay in an effort to “save the planet.”
Another article shows that Audubon’s alarmist climate claims, rather than bird safety, clearly dominate president David Yarnold’s concerns. Beneath a picture of a forest fire, an editorial quotes him: “Climate change is the greatest threat to birds and biodiversity since humans have been on the planet.”
Yarnold’s editorial is rife with alarmist propaganda: increasing drought (data show drought decreasing in the United States over the past 110 years in regions where we have temperature and rainfall measurements) … increasing forest fires (not so, according to actual data) … increasing species extinctions (virtually no extinctions have occurred except on isolated islands where predators were introduced by humans) … and more flooding (there has been nothing outside normal experience).
Audubon needs to concentrate on saving birds and other flying creatures from the very real death machines that kill countless thousands, perhaps millions, of them every year. These killing machines include wind turbines, which chop up raptors, songbirds and bats, and heliostats (installations using mirrors to concentrate the sun’s rays), which incinerate them.
Bats pollinate crops and consume insects. However, the number of bats killed has been conservatively estimated at 600,000 annually, and may be as high as 900,000. The Ivanpah solar-to-electrical-energy plant in California’s Mojave Desert actually ignites birds in flight; the dying birds are called “streamers,” because they emit smoke as they fall from the sky. One report estimates that over 100 golden eagles and 300 red-tailed hawks are killed yearly by wind turbines at California’s Altamont Pass, but another calculates that millions of birds and bats are killed every year by US wind turbines.
Audubon needs to get some real science in its research and show true empathy for the human-caused deaths that our flying friends face on a daily basis.
Robert Endlich served as a weather officer in the US Air Force for 21 years. He has a BA in geology and an MS in meteorology and is a member of Chi Epsilon Pi, the national meteorology honor society. (A more extensive version of this article can be found on MasterResource.org.)
The news is filled with the everyday zigzags of those competing against each other for the Democrat and Republican Party nominations to run for the presidency of the United States. But one of the most important issues receiving little or no attention in this circus of political power lusting is the long-term danger from the huge and rising Federal government debt.
The Federal debt has now crossed the $19 trillion mark. When George W. Bush entered the White House in 2001, Uncle Sam’s debt stood at $5 trillion. When President Bush left office in January of 2009, it had increased to $10 trillion. Now into seven years of Barack Obama’s presidency, the Federal debt has almost doubled again.
And it is going to get much worse, according to the Congressional Budget Office. On January 26, 2016, the CBO released its latest “Budget and Economic Outlook” analysis for the next ten years, from 2016 to 2026.
Continuing Deficits and Growing National Debt
The economists at the CBO estimate that the Federal budget deficit for fiscal year 2016 will be $544 billion, or $105 billion more than Uncle Sam’s budget deficit in fiscal year 2015. And each year’s budget deficit will continue to be larger than the previous year’s from here on. Indeed, the CBO estimates the Federal government’s annual deficits will once more be over $1 trillion starting in 2022 and thereafter.
Between 2016 and 2026, the Federal debt, as a result, is projected to increase by a cumulative amount of almost $9.5 trillion, for a total national debt of around $30 trillion just ten years from now.
The reason for the continuing ocean of Federal red ink is that while government revenues are projected to be around 49 percent higher in fiscal year 2026 ($5.035 trillion) than in fiscal year 2016 ($3.376 trillion), government spending will be over 63 percent higher in fiscal year 2026 ($6.401 trillion) than in fiscal year 2016 ($3.919 trillion).
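Those growth rates follow directly from the CBO dollar figures; a quick check (amounts in trillions):

```python
# Growth in Federal revenues and outlays implied by the CBO figures
# quoted above (amounts in trillions of dollars).
revenues_2016, revenues_2026 = 3.376, 5.035
outlays_2016, outlays_2026 = 3.919, 6.401

revenue_growth = (revenues_2026 / revenues_2016 - 1) * 100
outlay_growth = (outlays_2026 / outlays_2016 - 1) * 100

print(f"Revenue growth, FY2016-FY2026: {revenue_growth:.1f}%")  # about 49%
print(f"Outlay growth,  FY2016-FY2026: {outlay_growth:.1f}%")   # over 63%
print(f"Implied FY2026 deficit: ${outlays_2026 - revenues_2026:.3f} trillion")
```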
Understanding the Fiscal History of America
The famous Austrian-born economist Joseph A. Schumpeter (1883-1950) once wrote an article, “The Crisis of the Tax State” (1918). He said the following about a country’s fiscal history:
“[A country’s] budget is the skeleton of the state stripped of all misleading ideologies – a collection of hard facts . . . The fiscal history of a people is above all an essential part of its general history. An enormous influence on the fate of nations emanates from the economic bleeding which the needs of the state necessitates, and from the use to which its results are put . . . The view of the state, of its nature, its forms, its fate [are] seen from the fiscal side . . .
“The spirit of a people, its cultural level, its social structure, the deeds its policy may prepare – all this and more is written in its fiscal history, stripped of all phrases. He who knows how to listen to its message here discerns the thunder of world history more clearly than anywhere else . . . The public finances are one of the best starting points for an investigation of society, especially though not exclusively of its political life.”
A hundred years ago, around 1913, before the beginning of the First World War, all levels of government in the United States – Federal, State, and local – taxed and spent less than 8 percent of national income, with the Federal government absorbing less than half of this amount.
By 1966, Federal outlays alone took 17.2 percent of Gross Domestic Product; they are projected to reach 21.2 percent in 2016 and 23.1 percent of GDP by 2026. Over the fifty years between 1966 and 2016, Federal outlays as a percentage of GDP increased by nearly 24 percent, and they will grow more over the next decade.
The Welfare State Drives the Deficits and the Debt
What can America’s fiscal history, as Schumpeter suggested, tell us about the direction and drift of government over the last half-century and looking to the future? Perhaps not too surprisingly for both supporters and critics of the welfare state, it has been and is being driven by the continuing expansion of the “mandatory spending” of the redistributive “entitlement” programs.
In 1966, the intergenerational redistribution program known as Social Security absorbed 2.6 percent of GDP; in 2016, it will suck up 4.9 percent, for a nearly 90 percent increase. And by 2026, Social Security spending will represent 5.9 percent of GDP, for a 20 percent increase over the coming decade. (See my article, “There is No Social Security Santa Claus.”)
Major federally funded health care programs (Medicare, Medicaid and related programs) siphoned off a mere 0.1 percent of GDP in 1966; in 2016 this will have increased to 5.6 percent of GDP, a more than fiftyfold increase over fifty years. By 2026, the CBO estimates, these Federal health care programs (now including ObamaCare) will take 6.6 percent of GDP, for a nearly 18 percent increase in the next ten years. (See my article, “For Healthcare the Best Government Plan is No Plan.”)
Summing over all of these and related mandatory entitlement spending programs, in 1966 the redistributive welfare state absorbed 4.5 percent of the nation’s Gross Domestic Product; in 2016 it will absorb 13.3 percent of GDP, and 15 percent of GDP in 2026. That is nearly a 200 percent increase over the fifty years between 1966 and 2016, and an additional 13 percent increase between 2016 and 2026.
Due to all of the deficit spending to finance this redistributive largess over what the government collects in tax revenues to fund it, interest on the Federal debt will increase from 1.4 percent of GDP in 2016 to 3.0 percent of GDP in 2026, or more than a 100 percent increase in the interest cost on the national debt over the next ten years as a percentage of GDP.
Welfare state spending plus mandatory interest payments on the Federal debt now absorbs around 60 percent of everything Uncle Sam spends.
For a point of comparison in this tilted direction of government spending, all non-entitlement spending represented 11.5 percent of GDP in 1966, will be down to 6.5 percent of GDP in 2016, and is projected to be 5.2 percent of GDP in 2026. This represents a decrease in non-entitlement spending, as a percentage of GDP, of about 43 percent over the last fifty years, with another 20 percent decline as a percentage of GDP over the coming decade.
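The percentage changes implied by these GDP shares can be computed directly from the figures quoted above; a quick sketch:

```python
# Percentage changes implied by the CBO shares of GDP quoted above.
def pct_change(old, new):
    """Percentage change from old to new (negative = decline)."""
    return (new - old) / old * 100

print(f"Entitlement spending, 1966-2016:     {pct_change(4.5, 13.3):+.0f}%")   # about +196%
print(f"Entitlement spending, 2016-2026:     {pct_change(13.3, 15.0):+.0f}%")  # about +13%
print(f"Non-entitlement spending, 1966-2016: {pct_change(11.5, 6.5):+.0f}%")   # about -43%
print(f"Non-entitlement spending, 2016-2026: {pct_change(6.5, 5.2):+.0f}%")    # -20%
```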
Now in absolute terms all government spending has grown over the last fifty years. But what America’s fiscal history highlights, looking over the half-century that is behind us, is that it is the dynamics of a growing domestic welfare state that is fundamentally driving the country’s financial ruin.
The Force of Collectivist Ideology and Political Privilege
This has been coming about due to two fundamental and interconnected factors at work: First, the ideology of a right to other people’s wealth and income, and, second, the democratization of political privilege.
For more than a century, now, the older American political tradition of classical liberalism, with its belief in individual liberty, economic freedom and constitutionally limited government, has been slowly but surely eroded by the “progressive” ideal of political, social and economic collectivism.
These dangers were already present in the late nineteenth century with the rise of the socialist movement, and its then appearance on this side of the Atlantic. “The workers,” however, were not the vanguard of socialism in either Europe or America. It was mostly intellectuals and political philosophers who arrogantly dreamed dreams of new and “better worlds” designed and planned according to what they considered a more moral and “socially just” society. (See my article, “American Progressives are Bismarck’s Grandchildren.”)
It’s Not Your Fault and Others Owe You
Over the decades, for a century now, the socialist criticisms of capitalist society have eaten away, little by little, the understanding, belief in, and desire for a truly free market society. Your pay seems to be too low in comparison to what you think or have been told you deserve? It must be due to the exploitation and unfairness of profit-making businessmen.
You’re afraid you might not have the health care you want or the retirement money you think you’ll need? Surely it must be because “the rich” have squandered their unearned wealth on things other than what “the people” really need. Your child cannot attend the top-notch college or university you want for your offspring’s future? That can be cured, along with those other injustices, by taxing or regulating those who have more than you and who don’t deserve it.
The social game is rigged; nothing is your fault, it is all due to those who have more than you, and who don’t pay their “fair share” to fund what “working people” like you need and have a “right” to.
When such thinking is repeated enough, time after time, over years and, now, generations, a large number of people in our society implicitly take it all to be true. If only government has sufficient taxing and regulating authority, the world can be made better for “the many” against the greed and social disregard of the few (the “one percent”).
Losing the Spirit and Practice of Individualism
The dangers in all this were warned about long ago, for instance by J. Laurence Laughlin, the economist who founded the economics department at the University of Chicago. In his 1887 book, The Elements of Political Economy, Laughlin said:
“Socialism, or the reliance on the state for help, stands in antagonism to self-help, or the activity of the individual. That body of people certainly is the strongest and the happiest in which each person is thinking for himself, is independent, self-respecting, self-confident, self-controlled and self-mastered. Whenever a man does a thing for himself he values it infinitely more than if it is done for him, and he is better for having done it . . .
“If, on the other hand, men constantly hear it said that they are oppressed and downtrodden, deprived of their own, ground down by the rich, and that the state will set all things right for them in time, what other effect can that teaching have on the character and energy of the ignorant than the complete destruction of all self-help?
“They begin to think that they can have commodities which they have not helped to produce. They begin to believe that two and two make five. It is for this reason that socialistic teaching strikes at the root of individuality and independent character, and lowers the self-respect of men who ought to be taught self-reliance . . .
“The right policy is a matter of supreme importance, and we should not like to see in our country the system of interference as exhibited in the paternal theory of government existing in France and Germany.”
What Professor Laughlin feared and warned about nearly 130 years ago has increasingly come to pass in the social attitudes and political desires and demands of too many of our fellow countrymen. European collectivism has invaded, and continues to conquer, America’s original spirit and politics of individualism.
The Rise of Democratized Privilege
The other force at work in bringing about our growing fiscal socialism is what I would suggest calling democratized privilege. Before the rise of democratic governments in the nineteenth century, the State was seen as a political force for exploitation and abuse. Under monarchy, kings and princes used their taxing and policing powers to plunder their subjects for their own gain, as well as for the benefit of the aristocrats and noblemen who gave allegiance, obedience and support to the monarch. Power, privilege and plunder were for the political few at the expense of the many in society.
At first, the call for democratic government was a call to place limits on the powers of kings and their lords-of-the-manor supporters, so as to restrain political abuses that threatened or violated individuals’ rights to their lives, liberty and private property.
But with the rise of socialist and welfare-statist ideas as the nineteenth century progressed, there emerged a new ideal: a welfare-providing government for “the masses.” The view came to be that government was no longer a fearful master needing restraint and limits. No, democratically elected government was now conceived as “the people’s” servant, there to do their bidding and to provide them with benefits.
People hoping to gain favors and privileges from the new democratic governments formed themselves into groups of common economic interest. In this way, they aimed to pool the costs of the lobbying and politicking required to obtain what they increasingly came to view as their “right,” that is, the things to which they were told, and demanded, they were “entitled.”
No longer were redistributive privileges to be limited to the few, as under the old system of monarchy. Now privileges and favors were to be available to all, heralding a new age, an Age of Democratized Privilege. More and more people are dependent upon government spending of one form or another for significant portions of their income. And what the government does not redistribute directly, it furnishes indirectly through industrial regulations, price and production controls, and occupational licensing procedures.
Government Dependency and Resistance to Repeal
As dependency upon the State has expanded, the incentives to resist any diminution in either governmental spending or intervention have increased. All cuts in government spending and repeal of interventions threaten an immediate and often significant reduction in the incomes of the affected, privileged groups.
And since many of the benefits that accrue to society as a whole from greater market competition and more self-responsibility are not immediate but rather are spread out over a period of time, there are few present-day advocates of a comprehensive reversal of all that makes up the modern welfare state, and most certainly not in an election year.
While it may not be the center of political discussion and debate in this election year, the dilemma of ever-worsening government deficits and expanding national debt is not going to go away.
It will have to be faced and confronted eventually. But as Joseph Schumpeter pointed out, the fiscal history of a country reveals the underlying ideological and cultural currents at work that pull a nation in a particular direction.
The real dilemma is not whether this or that government program can be cut, or its growth slowed, at present and future taxpayers’ expense. The real challenge is to reverse the political and cultural trends toward ever-greater fiscal, redistributive socialism.
This will require a strong and articulate revival of a culture and a politics of individualism. It is, ultimately, a battle of ideas, not budgetary line items.