Does fracking cause housing prices to fall? The answer to that question is more difficult than it might seem. Many anti-fracking activists have claimed oil and natural gas development has led to substantial decreases in property values in areas where drilling occurs, but other places, such as North Dakota, saw property values skyrocket during the boom in oil production.
In this episode of The Heartland Daily Podcast, Kayla Harris, director of energy and environmental policy at Ballotpedia, helps answer this question. Harris discusses with research fellow Isaac Orr the results of her recent study on the impact of fracking on property values in several Colorado counties, finding that fracking has led to increases in property values in some areas and decreases in others.
If you don’t visit Somewhat Reasonable and the Heartlander digital magazine every day, you’re missing out on some of the best news and commentary on liberty and free markets you can find. But worry not, freedom lovers! The Heartland Weekly Email is here for you every Friday with a highlight show. Subscribe to the email today, and read this week’s edition below.
Cruz, Trump Go on Record on Climate, Energy
H. Sterling Burnett, Climate Change Weekly
The two leading Republican presidential candidates gave comprehensive statements on the topics of energy and climate when they responded to a survey by the American Energy Alliance. Both Trump and Cruz are skeptical of the policies implemented by the Obama administration to prevent purported dangerous anthropogenic global warming. Both candidates reject imposing a carbon tax, complain of regulatory overreach, and promise to scale back Obama’s Clean Power Plan. While the candidates agreed on a number of subjects, they did disagree on energy subsidies and the renewable fuel mandate. READ MORE
IRS Serves Government, Not Taxpayer
Jesse Hathaway, Real Clear Policy
With Tax Day upon us, now is the perfect time to remind ourselves that the Internal Revenue Service is the government Leviathan’s enforcer, nothing more. The IRS’s disregard for public safety was recently highlighted by a new report from the Government Accountability Office. The report showed the IRS failed to make recommended improvements, leaving taxpayers’ private information at the mercy of hackers, both domestic and foreign. READ MORE
Prosecute Climate Deniers? No, First Amendment Protects Debate
H. Sterling Burnett, The Philadelphia Inquirer
The global warming debate has reached a new, and chilling, level. U.S. Attorney General Loretta Lynch has discussed with the FBI the possibility of pursuing legal action against companies, research institutions, and scientists who debate whether humans are causing catastrophic climate change. This is an unprecedented politicization of justice and an attack on our First Amendment right to freedom of speech.
Featured Podcast: Yaron Brook: America’s Misguided Fight Against Income Inequality
“Income inequality” has once again become a rallying cry for the Left. Since achieving the goal of perfect equality is impossible, politicians like Bernie Sanders can forever champion the objective of increasing equality regardless of the amount of government intervention currently in place. Yaron Brook, president and executive director of the Ayn Rand Institute, joins the Heartland Daily Podcast to explain the truth about “income inequality alarmism” and why equal is, in fact, unfair. LISTEN TO MORE
Coming Next Week to Arlington Heights: The Vaping Wars!
If you love discussions about liberty, you will not want to miss the great series of events Heartland has lined up through the spring. On Wednesday, April 20, a discussion will take place about the “vaping wars” – the government’s war on e-cigarettes. On Wednesday, May 11, Ryan Yonk, executive director of Strata Policy, comes to Heartland to discuss the new book, Nature Unbound: Bureaucracy and the Environment. We hope to see you here in Arlington Heights, but if you are unable to attend in person, the events will be live-streamed and archived on Heartland’s YouTube page. SEE UPCOMING EVENTS HERE
Exelon Again Threatens to Close Nuclear Plant Unless Taxpayers Subsidize Operations
H. Sterling Burnett, The Heartlander
Exelon, a utility with the largest fleet of nuclear plants in the United States, is once again seeking corporate welfare from taxpayers. Exelon told Illinois lawmakers it will close its Southern Illinois Clinton nuclear plant if it does not receive new financial support from the state in 2016. This isn’t the first threat. In 2015, Exelon said it would close three plants in the state unless Illinois agreed to impose a special surcharge that would have provided an additional $300 million to the company. READ MORE
College Admissions Offices Adapt to Homeschooling Boom
Andy Torbett, The Heartlander
As the popularity of homeschooling continues to grow, colleges and universities are changing their admissions processes to better accommodate homeschooled children’s unique backgrounds and skills. “There are colleges that are actively recruiting homeschooled students,” said Lennie Jarratt, The Heartland Institute’s project manager for education transformation. “Just one decade ago, this didn’t happen.” Could this be a solution to the radical left’s takeover of K–12 and higher education? READ MORE
Bonus Podcast: Brian Blase: Obamacare’s Broken Promises
In 2009, the Affordable Care Act was touted as a cure-all, required to expand health insurance access and reduce health care costs nationwide. In the six years since Obamacare’s passage, the shortcomings have become evident. Brian Blase, senior research fellow at the Mercatus Center, joins the Heartland Daily Podcast to discuss the disparity between Obamacare’s promises and the reality that exists three years into its full implementation. LISTEN TO MORE
Insurers Are Abandoning Obamacare Exchanges
Justin Haskins, Consumer Power Report
One of the most ambitious aspects of the Affordable Care Act (ACA) was the creation of health insurance marketplaces, which proponents said would increase market competition and lead to lower costs for consumers and insurers. A new report by The Heritage Foundation shows ACA has actually limited consumers’ insurance options by driving health insurers out of the Obamacare marketplace. According to the report, there are now only 287 “exchange-participating insurers,” down from 395 insurers in 2013. READ MORE
Help Us Stop Wikipedia’s Lies!
Joseph L. Bast, Somewhat Reasonable
Many people rely on our profile on Wikipedia to provide an objective description of our mission, programs, and accomplishments. Alas, the profile they find there is a fake, filled with lies and libel about our funding, tactics, and the positions we take on controversial issues. Wikipedia refuses to make the changes we request. It even deletes and reverses all the changes made by others who know the profile is unreliable. We need your help! READ MORE
School Choice Fosters Racial, Economic Equity
Joy Pullmann, School Choice Weekly
A common criticism thrown at school choice advocates is that school choice programs create segregation in the education system. Reality shows quite the opposite effect. Kevin Chavous, founding board member for the American Federation for Children, responds to this frequently made claim by pointing out that school choice programs typically serve far more minority students than is representative of their states’ populations, thereby serving to desegregate public and private schools. READ MORE
Invest in the Future of Freedom! Are you considering 2016 gifts to your favorite charities? We hope The Heartland Institute is on your list. Preserving and expanding individual freedom is the surest way to advance many good and noble objectives, from feeding and clothing the poor to encouraging excellence and great achievement. Making charitable gifts to nonprofit organizations dedicated to individual freedom is the most highly leveraged investment a philanthropist can make. Click here to make a contribution online, or mail your gift to The Heartland Institute, One South Wacker Drive, Suite 2740, Chicago, IL 60606. To request a FREE wills guide or to get more information to plan your future, please visit My Gift Legacy at http://legacy.heartland.org/ or contact Gwen Carver at 312/377-4000 or by email at email@example.com.
The media is spreading catastrophic global warming news based on satellite temperature data through February 2016. On March 3, 2016, the University of Alabama in Huntsville (UAH) reported that the February 2016 global temperature of 0.83 degrees C surpassed the previous record of 0.74 degrees C, set in April 1998. These temperatures are anomalies: differences from the 30-year average for 1981 to 2010. The data set runs from 1979, when satellite temperature measurements were first made, to the present.
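For readers unfamiliar with anomalies, the arithmetic is simple: subtract a long-term baseline mean from each reading. A minimal sketch in Python, using made-up numbers rather than actual UAH data:

```python
# Illustrative only: the baseline values below are hypothetical, not UAH data.
# An "anomaly" is a reading's departure from a long-term baseline average.
baseline = [14.1, 14.0, 13.9, 14.0, 14.2, 13.8]  # pretend 1981-2010 means, deg C
baseline_mean = sum(baseline) / len(baseline)    # 14.0 deg C

def anomaly(reading_c, baseline_mean_c):
    """Departure of a single reading from the baseline mean, in deg C."""
    return reading_c - baseline_mean_c

# A reading of 14.83 deg C against a 14.0 deg C baseline is a +0.83 anomaly.
print(round(anomaly(14.83, baseline_mean), 2))
```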
Associated Press writer Seth Borenstein wrote on March 17, 2016, “Freakishly hot February obliterates global weather records.” New York Times reporter Justin Gillis wrote on March 22, 2016, “Scientists Warn of Perilous Climate Shift Within Decades, Not Centuries.” Expect more scary stories from other writers who live off reports from the scientific community that generates climate change (global warming) information.
UAH has posted its latest satellite global temperature data, which runs through the end of March 2016, shown in Fig. 1.
Fig.1 Latest Global Average Tropospheric Temperatures
The March 2016 temperature fell to 0.73 degrees C, just below the previous record of 0.74 degrees C set in April 1998. The satellite data show a temperature rise since 1979 of 0.12 degrees C per decade, or 1.2 degrees C per century, which places the earth’s warming below the limit set by the 2015 Paris Climate Accord.
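The per-decade trend quoted above is a least-squares slope fitted to the monthly anomalies. A hedged sketch of that fit, using a synthetic series rising at exactly 0.12 degrees C per decade rather than the real, noisy UAH record:

```python
# Synthetic monthly anomalies, 1979 through 2016, rising 0.12 deg C/decade.
xs = [1979 + i / 12 for i in range(456)]          # decimal years, monthly steps
ys = [0.012 * (x - 1979) + 0.05 for x in xs]      # anomalies, deg C

# Ordinary least-squares slope: covariance(x, y) / variance(x).
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
slope_per_year = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
                  / sum((x - x_mean) ** 2 for x in xs))

print(round(slope_per_year * 10, 3))    # deg C per decade
print(round(slope_per_year * 100, 2))   # deg C per century
```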
Data over thousands of years show approximately 500-year cycles of planetary warming and cooling. We are currently in the Current Warming Period, which commenced around 1850. It was preceded by the Little Ice Age, which lasted from approximately 1350 to 1850. Thus continued global warming should be anticipated until after the start of the 22nd century.
SUPER EL NINO CAUSES TEMPERATURE RISE
Along the Equator, stretching from New Guinea to western South America, is a region in which prevailing winds and sea surface temperatures create weather systems that impact the planet. When sea surface temperatures are warmer than normal, the system is called El Nino; the countering cool phase is called La Nina. La Nina normally follows an El Nino. El Nino is Spanish for “the boy” or “the Christ Child,” a name given because the peak of an El Nino usually occurs around Christmas. These systems have been observed for centuries and are not caused by carbon dioxide from burning fossil fuels.
An El Nino system formed in early 2015 and became a Super El Nino that drove the February 2016 global temperature to the highest level observed by satellites over their period of measurement, 1979 to the present. A La Nina is expected to follow this event; the questions are when it will arrive and how much cooling will take place.
An excellent explanation of the influence of El Nino on global temperatures is given in the March 18, 2016 Reuters article “How much clarity do we have on transition to La Nina?” by Karen Braun. The article shows graphs of monthly surface and subsurface temperatures measured along the Equator.
Fig. 2 illustrates monthly El Nino 3.4 sea surface temperatures for eight El Nino events occurring since 1982/83. El Nino 3.4 designates the area from 165 degrees W to 90 degrees W and 5 degrees S to 5 degrees N. Since one degree spans about 69 miles at the Equator, the El Nino 3.4 region is roughly 5,200 miles by 690 miles, or about 3.6 million square miles.
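The arithmetic behind those figures is easy to check. A back-of-the-envelope sketch, using the approximation of 69 statute miles per degree at the Equator (the true east-west spacing shrinks slightly away from the Equator, but this box hugs it):

```python
MILES_PER_DEGREE = 69  # approximate length of one degree at the Equator

lon_span_deg = 165 - 90   # 165 W to 90 W -> 75 degrees of longitude
lat_span_deg = 5 + 5      # 5 S to 5 N   -> 10 degrees of latitude

width_mi = lon_span_deg * MILES_PER_DEGREE    # 5,175 mi (roughly 5,200)
height_mi = lat_span_deg * MILES_PER_DEGREE   # 690 mi
area_sq_mi = width_mi * height_mi             # about 3.6 million square miles
print(width_mi, height_mi, area_sq_mi)
```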
For the temperature scale, anomalies above 0.5 degrees C mark El Nino periods and anomalies below -0.5 degrees C mark La Nina periods. If the three-month average anomaly is above 0.5 degrees C, an El Nino is considered in progress; if it is below -0.5 degrees C, a La Nina is in progress. Anomalies between -0.5 and 0.5 degrees C are neutral.
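The classification rule just described can be sketched in a few lines of Python. The anomaly values below are invented for illustration; the thresholds follow the plus-or-minus 0.5 degree C convention in the text:

```python
def running_3mo(anoms):
    """Three-month running means of a monthly anomaly series, deg C."""
    return [sum(anoms[i:i + 3]) / 3 for i in range(len(anoms) - 2)]

def classify(mean_c):
    """Label a three-month mean anomaly per the +/-0.5 deg C thresholds."""
    if mean_c > 0.5:
        return "El Nino"
    if mean_c < -0.5:
        return "La Nina"
    return "neutral"

anoms = [0.2, 0.6, 0.9, 1.1, 0.4, -0.1, -0.6, -0.9]  # hypothetical deg C
print([classify(m) for m in running_3mo(anoms)])
```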
Fig. 2 El Nino Sea Surface Temperatures
The bolder lines mark the strongest El Nino events over these years. As shown, the strongest El Nino prior to 2015 was the 1997/98 event. In November 2015, the current El Nino’s sea surface temperature exceeded the maximum temperature of the 1997/98 El Nino. The current sea surface temperature has begun a slight decline similar to that shown by the 1997/98 El Nino.
As seen for the 1997/98 El Nino, rapid cooling took place after the peak temperature in early 1998, resulting in a super La Nina later that year. The 2015/16 data did not show rapid cooling by the end of February 2016. However, given the 0.11 degree C drop in global temperature in March 2016 shown in Fig. 1, one can infer substantial decreases in sea surface temperature are taking place.
A more recent April 7, 2016 Reuters article, “Notion of a delayed La Nina might have been hasty: Braun,” shows a more rapid decline in El Nino 3.4 sea surface temperature for March, given in Fig. 3. This trend, along with changes in prevailing winds, indicates a switch to La Nina by July.
Fig. 3 Revised El Nino Sea Surface Temperatures
Perhaps a little more insight about the future of the current El Nino may be found by examining subsurface Pacific Ocean temperatures along the Equator. Fig. 4 gives monthly measured ocean subsurface temperatures along the Equator to a depth of 300 meters for the eight El Nino events. These measurements span a greater length along the Equator than the region shown in Fig. 2: the distance from 130 degrees E to 80 degrees W, or about 10,300 miles.
Fig. 4 El Nino Subsurface Temperatures
By the end of February, the 2015/16 El Nino subsurface temperature had not turned negative; however, the lower March temperature in Fig. 1 suggests this has since taken place. The trajectory of this Super El Nino is similar to that of 1997/98, and there is a strong chance of considerable global cooling by the end of the year.
NO GLOBAL WARMING FOR 58 YEARS
Another perspective on the 2016 global temperature comes from the March 7, 2016 article “NOAA Radiosonde Data Shows No Warming For 58 Years” by Tony Heller. In its press briefing declaring 2015 the “hottest year ever,” NOAA stated it had a 58-year radiosonde (balloon) temperature record but showed only the last 37 years in a graph. Fig. 5 shows the graph released by NOAA.
Fig. 5 NOAA’s 37-YEAR SATELLITE AND RADIOSONDE TEMPERATURES
Tony Heller found the missing years of radiosonde data, 1958-1976, in the scientific journal article “Global Temperature Variation, Surface-100mb: An Update into 1977” in the June 1978 Monthly Weather Review. This data is shown in Fig. 6, which indicates global temperatures declined from 1958 until 1977.
Fig. 6 Temperature variation for World Surface-100MB and World Surface. Eruptions of Mt. Agung and volcano Fuego (Guatemala) are indicated
Mr. Heller combined Figs. 5 and 6 into Fig. 7 and added a horizontal red line, which shows there has been no global warming in the troposphere from 1958 to 2016. Purists might complain the 1958 to 1977 data cover a region from the earth’s surface to an elevation of 100 mb (about 54,000 ft.), while the 1979 to 2016 data are for five discrete elevations from 5,000 ft. to 40,000 ft. However, the data at the five discrete elevations are quite similar, which makes the comparison valid.
Fig. 7 Radiosonde data from 1958 to 2016
The NOAA and NASA press releases claimed 2015 had the hottest global surface temperatures since measurements began in 1880. The physics of warming by the greenhouse gas carbon dioxide has the warming occurring throughout the atmosphere, from the earth’s surface to the stratosphere. Thus the true measure of the influence of increasing atmospheric carbon dioxide from burning fossil fuels is satellite or radiosonde data. Over the period 1958 to 2016, atmospheric carbon dioxide increased from 315 to 402 ppm. Since no appreciable global warming is shown by atmospheric temperatures over that period, one may infer increasing atmospheric carbon dioxide from burning fossil fuels has no significant influence on global warming.
DOES NOAA FUDGE DATA?
In his article, Tony Heller noticed NOAA had two databases for radiosonde atmospheric temperatures: (1) a database covering 1958 to 2011 and (2) a database covering 1958 to 2016. The graph in Fig. 5, from the later database, shows about 0.5 degrees C of warming from 1979 to 2010. However, the original 2011 database shows little warming during that period.
Fig. 8 NOAA 2016 Radiosonde Database Minus 2011 Database
Fig. 8 shows the 2011 database subtracted from the 2016 database; the revisions produce greater warming from 1992 to the present and less warming from 1958 to 1992.
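The comparison in Fig. 8 amounts to a month-by-month subtraction of the two series. A minimal sketch with invented numbers (the actual NOAA values are not reproduced here):

```python
# Hypothetical anomaly values, deg C, keyed by month; not real NOAA data.
db_2011 = {"1958-01": -0.20, "1992-01": 0.05, "2010-01": 0.20}
db_2016 = {"1958-01": -0.30, "1992-01": 0.05, "2010-01": 0.35}

# Subtract the older database from the newer one for months present in both.
diff = {month: round(db_2016[month] - db_2011[month], 2)
        for month in db_2011 if month in db_2016}
print(diff)  # negative = revisions cooled that month; positive = warmed it
```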
Changes in NOAA global temperature databases are also illustrated by a June 4, 2015 article published in Science that eliminated the pause, or “hiatus,” in global surface temperatures from 1998 to 2014. NOAA Director Thomas Karl said, “Adding in the last two years of global surface temperature data and other improvements in the quality of the observed record provide evidence that contradict the notion of a hiatus in recent global warming trends. Our new analysis suggests that the apparent hiatus may have been largely the result of limitations in past datasets, and that the rate of warming over the first 15 years of this century has, in fact, been as fast or faster than that seen over the last half of the 20th century.” Much controversy was generated by this study, and many in the scientific community claimed it was wrong.
Texas Congressman Lamar Smith is heading a committee to investigate allegations of NOAA altering global temperature databases.
U.S. CLIMATE CHANGE SCIENCE PROGRAM REPORT
The U.S. Climate Change Science Program (CCSP) published an April 2006, 180-page report, “Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences.” The report showed comparisons of vertical global temperature distributions in the atmosphere computed by Parallel Climate Models (PCM) with actual radiosonde (balloon) temperature measurements. The data are displayed with a vertical axis of altitude, given on the left side as pressure in millibars and on the right side in kilometers (km). The horizontal axis is latitude from 75 degrees S to 75 degrees N.
Fig. 9 All temperature changes were calculated from monthly-mean data and are expressed as linear trends (in ºC/decade) over 1979 to 1999.
Fig. 9 shows that from altitudes of 1.5 km to 9 km, temperature changes are small over this 20-year interval. This provides some agreement with Tony Heller’s article “NOAA Radiosonde Data Shows No Warming For 58 Years.”
Fig. 10 Computer-modeled zonal atmospheric temperature changes from all forcings (greenhouse gas increase dominates)
Fig. 10 (all forcings) shows calculated global atmospheric temperatures from January 1958 to December 1999. Temperatures are given by the change over this time period. Fig. 10 shows a very distinct “hot spot” from altitudes of 4 km to 16 km and latitudes from 30 degrees S to 30 degrees N. This hot spot is caused by increased greenhouse gases (mostly carbon dioxide) over that time period. The radiosonde measurements shown in Fig. 9 show no “hot spot”. This clearly indicates the models for predicting global temperature changes are wrong on how they handle additions of “greenhouse gases” (predominately carbon dioxide) to the atmosphere.
The Abstract for the CCSP report contains the following information:
“Previously reported discrepancies between the amount of warming near the surface and higher in the atmosphere have been used to challenge the reliability of climate models and the reality of human-induced global warming. Specifically, surface data showed substantial global-average warming, while early versions of satellite and radiosonde data showed little or no warming above the surface. This significant discrepancy no longer exists because errors in the satellite and radiosonde data have been identified and corrected. New data sets have also been developed that do not show such discrepancies.
“… For recent decades, all current atmospheric data sets now show global-average warming that is similar to the surface warming. While these data are consistent with the results from climate models at the global scale, discrepancies in the tropics remain to be resolved. Nevertheless, the most recent observational and model evidence has increased confidence in our understanding of observed climate changes and their causes.”
The vast differences between Figs. 9 and 10 indicate calculations from models have little agreement with experiments. The U. S. CCSP has taken the position “if models don’t agree with experiments; then the experiments are wrong.” Unfortunately, most readers never go beyond the Abstract and the CCSP conclusions are considered fact.
With a considerable amount of fanfare, NOAA and NASA announced 2015 and possibly 2016 are the warmest years since record keeping started in 1880. Media supporters like Seth Borenstein and Justin Gillis circulated scary articles announcing this threat to the world from carbon dioxide from burning fossil fuels causing catastrophic global warming, and urging that the United States lead all nations in immediately finding alternative energy sources to replace fossil fuels, regardless of the economic cost.
The validity of NOAA and NASA’s assertions is called into question by the experimental data cited in this article. It is quite likely the present Super El Nino will change to a Super La Nina that brings global temperatures back to levels seen a few years ago. Will reporters like Seth Borenstein and Justin Gillis report to the public the errors of their recent assertions? I think not.
On October 5, 2009, President Obama issued an executive order, Federal Leadership in Environmental, Energy, and Economic Performance, that set policies for reducing greenhouse gas emissions for the rest of his term in office. This order explains the motivation for the climate policies of government organizations over the past seven years. The vast waste of tax dollars and impediments to fossil fuel production may be the reason for economic stagnation in spite of the United States becoming the planet’s leading fossil fuel energy producer. A paper by Dr. James H. Rust, “President Obama Demands Agreement With Climate Policies,” on The Heartland Institute’s website gives examples of compliance with the executive order by government, education, and commercial organizations.
One of the sources of surface temperature data is the United States Historical Climatology Network (USHCN), which gives temperature data for the contiguous United States. Walter Dnes wrote an essay, “USHCN Monthly Temperature Adjustments,” which gives references 1 through 6 describing in detail monthly adjustments to USHCN data from 1872 to the present. These adjustments made present temperatures warmer, made earlier temperatures cooler, and eliminated the 1930s period of heat waves and droughts.
A January 20, 2016 paper, “No Pause in NASA Climate Science Corruption,” shows NASA-GISS has doubled reported global warming over the past 15 years by altering its data, while completely ignoring satellite data.
Numerous studies show NOAA and NASA made adjustments to temperature data to show unwarranted global warming. Real Climate published a paper showing adjustments by both NOAA and NASA to U. S. and other nation’s temperature data.
The United Kingdom has been exceptional in reporting news of bogus temperature data. British journalist James Delingpole wrote the January 30, 2015 article “FORGET CLIMATEGATE: THIS ‘GLOBAL WARMING’ SCANDAL IS MUCH BIGGER,” which points out that the world’s three surface data sources for global temperatures have adjusted their raw data. The sources are NASA-GISS; NOAA, which maintains the dataset known as the Global Historical Climate Network; and the University of East Anglia Climatic Research Unit and Met Office data records known as Hadcrut. Mr. Delingpole found no satisfactory reasons for the temperature adjustments.
A famous saying by Albert Einstein goes, “No amount of experimentation can ever prove me right; a single experiment can prove me wrong.” The NOAA, NASA-GISS, and U.S. CCSP reinterpretation of this remark is “computer models are always right; no amount of experiments can prove them wrong.”
Investigations by Congress are in order to clear up discrepancies in NOAA and NASA-GISS temperature data from year to year and the reported accuracies of climate models by groups like the CCSP.
A new report from the Government Accountability Office, a nonpartisan government agency tasked with auditing, evaluating, and investigating government affairs for Congress, faults the Internal Revenue Service for failing to properly secure taxpayer data, leaving taxpayers’ private information at the mercy of hackers, both domestic and foreign. The report, delivered to IRS chief John Koskinen on March 28, says the IRS has failed to make recommended improvements to its financial and information-technology procedures.
Unfortunately for taxpayers, the IRS has little motivation to protect the sensitive data it collects, because the agency views government, not taxpayers, as its consumer.
According to the report, the IRS “has not always effectively implemented access and other controls, including elements of its information security program, to protect the confidentiality, integrity, and availability of its financial systems and information.” Also: “These weaknesses — including both previously reported and newly identified — increase the risk that taxpayer and other sensitive information could be disclosed or modified without authorization.”
Other violations of good IT security practices cited in the report include failures to encrypt taxpayers’ vital information, weak passwords on servers containing taxpayer data, and easy-to-evade physical security. For example, the report says non-employees could plausibly sneak by security guards in some IRS data centers and gain access to secure systems.
GAO auditors can issue reports until they’ve run out of printer paper and toner ink, but until lawmakers get tough with the IRS, the taxman will have no incentive to shape up. Living things consume resources, grow, react to stimuli, and reproduce. Government agencies such as the IRS consume taxpayer money, hire more staff, make new rules and regulations, and spin off new divisions and departments on a regular basis.
From the politically motivated “enhanced auditing” of conservative organizations committed by Lois Lerner and her employees to repeated data breaches, the government agency U.S. citizens are forced to deal with every year on April 15 has treated Americans with contempt. Why? Precisely because Americans are forced to “do business” with the IRS, there is little reason for the IRS to provide better customer service, protect private data, or apply tax laws in a neutral, nonpartisan way.
As Bruce Yandle, dean emeritus of Clemson University’s College of Business and Behavioral Science, wrote for the Foundation for Economic Education, the solution to poor service from public servants is to reduce the power government has over citizens, so there are fewer opportunities for corruption and inefficiency.
“We must take action to reduce occurrences that corrupt the political process,” Yandle wrote. “But how? First, by limiting the domain of government action. Then, when the domain is limited, by requiring transparency and regular agency reports that demonstrate choice neutrality, by encouraging competition from the loyal opposition, and by showing constant vigilance.”
Also like living things, the IRS has an instinct for self-preservation, and lawmakers must harness that desire to align the agency’s actions with the best interests of taxpayers. By reducing the IRS’ size and power, as well as the power of the federal government in general, abuse and corruption become less attractive, and agencies are forced to concentrate on their core competencies.
Just as a trainer disciplines a disobedient animal, Americans need to demand their government start working for them again, instead of the current status quo of Americans constantly working to feed the government leviathan.
In The Tank Podcast (ep34): Rich States, Poor States, Tax Freedom Day, PEV Subsidies, and Global Warming Thought Crimes
John Nothdurft returns in episode #34 of the In The Tank Podcast. This weekly podcast features (as always) interviews, debates, and roundtable discussions that explore the work of think tanks across the country. The show is available for download as part of the Heartland Daily Podcast every Friday. Today’s podcast features work from ALEC, the Tax Foundation, the Freedom Foundation of Minnesota, and the Competitive Enterprise Institute.
Featured Work of the Week
This week’s featured work is ALEC‘s Rich States, Poor States report (9th edition). The report ranks all 50 states on 15 economic policy variables, including various tax, regulatory, and labor policies, giving state lawmakers yearly comparisons of how policies are helping or hurting their economic outlook. John and Donny discuss the winners and losers of this report and talk about their major takeaways.
In the World of Think Tankery
Today Donny and John get set for Tax Day. They discuss Tax Freedom Day, a measurement determined by the Tax Foundation. Tax Freedom Day is the date “when the nation as a whole has earned enough money to pay its total tax bill for the year.” 2016’s Tax Freedom Day falls on April 24, 114 days into the year.
Speaking of taxes, Donny and John discuss a tax credit recently proposed in Minnesota for electric vehicles. The Freedom Foundation of Minnesota published an article titled “It’s Time to Pull the Plug on Electric-Car Subsidies.” While this article focuses on the state effort to prevent this tax credit, Donny and John talk about the implications of the national tax credit.
The last topic discussed by Donny and John is the crackdown on free speech regarding global warming by a coalition of attorneys general. The Competitive Enterprise Institute recently received a subpoena demanding it turn over a decade’s worth of documents, including emails, donor information, statements, and other documents relating to climate change policy. Donny and John talk about the chilling effect this move has on the global warming debate and how it amounts to an attack on the First Amendment.
New Hampshire Gov. Maggie Hassan signed into law on April 5 House Bill 1696 to modify and renew through 2018 the state’s Medicaid expansion program under the Affordable Care Act (ACA), which state lawmakers first adopted in 2014.
Although the law is scarcely a week old, the people of New Hampshire are less than seven months away from influencing whether their elected officials renew Medicaid expansion through 2020.
Why so soon? Most of the New Hampshire lawmakers who will decide Medicaid expansion next time around will hold their offices as a result of the general election on Nov. 8, just seven months from now.
State lawmakers will vote in the spring of 2018 either to allow the Medicaid expansion program to sunset at the end of the year, as HB 1696 prescribes, or renew the program again through 2020. In that year alone, New Hampshire will spend $47 million to extend Medicaid coverage to 50,500 newly eligible enrollees, according to HB 1696’s fiscal note. If the program attracts more enrollees than projected, as other states’ Medicaid expansion programs have, New Hampshire will pay even more.
As the people of New Hampshire prepare to elect candidates to represent their values and interests in the General Court, voters should consider three popular Medicaid expansion myths and their rebuttals.
One myth is Medicaid expansion empowers states. In fact, the federal share of Medicaid expansion makes states more dependent on the federal government. In 2020, the federal share for New Hampshire will be $509 million. New Hampshire will grow more beholden to the federal government as it grows more dependent on federal money. Like the ACA itself, this federal overreach will further upset the balance of power between federal and state governments, known as federalism.
The Founding Fathers, in contrast to the majority of current New Hampshire lawmakers, saw value in limiting the federal government’s reach into matters the 10th Amendment reserves to the states, such as providing for the health and welfare of their citizens.
A second myth is popular among alarmists: If New Hampshire doesn’t renew its Medicaid expansion program, other states with Medicaid expansion programs will get billions of New Hampshire dollars in the form of federal shares.
The truth is, if New Hampshire allows its Medicaid expansion program to sunset, the federal share it was receiving would not go to other states. It would go unspent. “There is no magic pot of Obamacare money” in Washington, D.C., waiting to be distributed to other states, Nicolas Horton, a senior research fellow at the Foundation for Government Accountability, told Health Care News in March. “All of the money the federal government is spending on Obamacare’s Medicaid expansion is being added to the national debt.”
The national debt will reach close to $20 trillion when President Barack Obama leaves office on Jan. 20, 2017. The money New Hampshire currently receives in the form of the federal share never was New Hampshire’s, except in the sense New Hampshire shares the national debt. Whether people realize it or not, this debt belongs to every present-day American and countless Americans yet to be born into debt they did not create.
To say New Hampshire’s money could go toward expanding Medicaid in another state is misleading, because the money is not New Hampshire’s and technically does not exist.
A third myth is HB 1696’s addition of work requirements protects the program from abuse by newly eligible, able-bodied Medicaid recipients.
The Republican-controlled New Hampshire House of Representatives left HB 1696’s work requirements vulnerable to removal by the federal Centers for Medicare and Medicaid Services (CMS). A split in the House prompted Speaker Shawn Jasper to break a 181–181 tie on March 9 in favor of including a severability clause for work requirements in the bill. The adopted amendment, offered by state Rep. Karen Umberger, ensures the Medicaid expansion program will remain in effect even if CMS holds the legislation’s work requirements invalid.
On its face, the law’s inclusion of work requirements offers politicians a convenient cop-out when constituents challenge their representatives’ support of Medicaid expansion. Wise voters will recognize that a lawmaker’s consent to severing work requirements is, by definition, a vote not to require work.
The decision whether to renew New Hampshire’s Medicaid expansion program through 2020 rests not merely with lawmakers in the General Court in 2018, but with voters at the ballot box in November.
New Hampshire’s latest Medicaid expansion law increases the state’s dependence on Washington, D.C., spends money the country doesn’t have and which never belonged to New Hampshire, and does not truly require able-bodied recipients to work.
Proponent lawmakers, like Medicaid expansion myths, deserve busting.
The FCC’s AllVid proposal is déjà vu. We have seen Google-YouTube’s piracy-as-negotiating-leverage MO in action before.
Google’s puppeteering of FCC-sponsored piracy in the AllVid set-top box proposal is not the first time Google has used piracy promotion anticompetitively to gain a market advantage for YouTube’s monopsony power — i.e., its market power from being the only repository in the world where one can access a copy of almost every video created, whether legal or pirated, and where Google often promotes pirated videos near the top of its search results.
Don’t take my word for it; listen to Google executives’ own words in Google’s internal Gmails, captured for posterity in the Statement of Undisputed Facts filed in Viacom v. Google-YouTube, the copyright suit filed in 2007 and settled in 2014.
These undisputed facts/Gmails spotlighted and organized below are damning for three reasons.
First, they prove that for ten years Google has been trying to “pressure premium content providers to change their model towards free,” which strongly suggests Google is using its extraordinary political influence over the federal government and the FCC to anticompetitively extort value that Google-YouTube could not negotiate competitively in a free market, because it demands premium video content be made available to it for free or near free.
Second, they prove that Google knows full well that its willful blindness to profiting from mass piracy is both anticompetitive and predatory.
Third, they help expose the FCC’s apparent willful blindness that their purported set-top box AllVid NPRM does not have a limited, narrow and containable effect on competition for just the set-top box market segment, but that it actually has broad, uncontainable and predictable ancillary impacts that are demonstrably anticompetitive, monopolistic and monopsonistic to the value of the most valuable corpus of video content in the world.
What’s at stake?
That’s because what is really at stake in AllVid is not the roughly $20b in cable set-top box revenues that the FCC myopically touts to justify its proposal. What is at stake is content that generates ten times as much in annual revenues as set-top boxes, roughly $200b in annual video content revenues. (Annual TV advertising revenues were ~$80b in 2015, per Strategy Analytics estimates, and annual multichannel video revenues were ~$120b in 2015, per SNL Kagan estimates.)
Summary of the Sordid Story that Google’s Gmail Trail Tells
For perspective, I have organized the most telling undisputed quotes from Google execs that lay bare a damning legal fact predicate for Google-YouTube’s anti-competitive behavior. It shows:
(1) Prior to buying YouTube, senior Google executives were actively considering an anti-competitive strategy of extortion – i.e., threatening mass copyright infringement to extract better terms for access to valuable content.
(2) At the same time, YouTube on its own was knowingly and aggressively facilitating rampant video piracy of valuable content in order to grow its value and sell the company at the highest price.
(3) Google then knowingly bought YouTube fully aware that it was buying an Internet video distribution site dependent on piracy for its traffic, growth, and value.
(4) Just a few months after buying YouTube, Google formalized a program of effective predatory copyright infringement and willful blindness to piracy to try and sign content on more favorable terms, i.e. an extension of its original anticompetitive extortion strategy.
(5) Since then, Google has continued and perfected YouTube’s copyright arbitrage practice — openly welcoming and benefiting from copyright infringement for the period from upload to DMCA takedown. (Just last week, Google reported that copyright owners requested the takedown of 22 million infringing URLs in that one-week period alone.)
Google execs’ incriminating Gmails that Google did not dispute in federal court
(1) Prior to buying YouTube, senior Google executives were actively considering an anti-competitive strategy of forcing a free video model on premium content providers by threatening mass copyright infringement to extort better terms for access to others’ valuable content.
“On June 8, 2006, Google Senior Vice President of Product Management Jonathan Rosenberg emailed Google CEO Eric Schmidt and Google co-founders Larry Page and Sergey Brin a Google Video presentation that stated the following: ‘Pressure premium content providers to change their model towards free; Adopt “or else” stance re prosecution of copyright infringement elsewhere; Set up “play first, deal later” around “hot content.”’ The presentation also stated that ‘[w]e may be able to coax or force access to viral premium content,’ noting that Google Video could ‘Threaten a change in copyright policy’ and ‘use threat to get deal sign-up.’” [Bold added for emphasis.] Viacom v. YouTube SUF #161
(2) At the same time, the revenue-less YouTube startup knowingly aided and abetted video piracy in order to grow its traffic virally so that it could then sell the company at the highest price.
“Steal it!” … “we have to keep in mind that we need to attract traffic. How much traffic will we get from personal videos?” YouTube Co-founder Steve Chen SUF #44
“If you remove the potential copyright infringements… site traffic and virality will drop to maybe 20% of what it is.” YouTube Co-founder Steve Chen SUF #55
“But we should just keep that stuff on the site. I don’t really see what will happen. What? Someone from CNN sees it? He happens to be someone with power? He happens to want to take it down right away? He get in touch with cnn legal. 2 weeks later, we get a cease & desist letter. We take the video down.” YouTube co-founder Steve Chen SUF #47
“We’re going to have a tough time defending we are not liable… when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.” YouTube Co-founder Steve Chen SUF #40
“Save your meal money for some lawsuits!” YouTube co-founder Hurley SUF #38
“…concentrate all our efforts in building up our numbers as aggressively as we can through whatever tactics, however evil.” YouTube co-founder Chen SUF #85
(3) Then Google knowingly bought YouTube, aware it was buying a piracy-driven and piracy-dependent Internet video distribution site, despite substantial high-level internal opposition.
“It crosses the threshold of Don’t be Evil to facilitate distribution of other people’s intellectual property…” “It’s a cop out to resort to dist-rob-ution.” Google Video Manager Ethan Anderson SUF #164
“…is changing policy [to] increase traffic beforehand that we’ll profit from illegal downloads how we want to conduct business? Is this Googley?” Google Co-founder Sergey Brin quoted SUF #162
“I think we should beat YouTube… but not at all costs. [They are] a video Grokster.” Google’s Eun to CEO Eric Schmidt before the deal was done SUF #158, #162
(4) After buying YouTube, Google knowingly operated a piracy-tolerant Google-YouTube in accordance with its original pre-YouTube extortion strategy: forcing media companies into revenue deals with Google if they wanted Google to protect their video content from mass piracy.
“Audio fingerprinting system[s] whereby the content partner can send ‘reference fingerprints’ to Audible Magic’s database are now live as well and are only offered to partners who enter into a revenue deal with us.” Google Manager David Eun, 2-15-07, SUF #216 [underline added for emphasis]
(5) After owning YouTube for several months Google was aware of growing mass copyright infringement by Google-YouTube:
“…a trend we see is that people upload copyrighted videos to their private videos… and then invite large numbers of people to view the video which bypasses our copyright restrictions” Google-YouTube employee Julie Havens in a 7-18-07 internal Google email SUF #199
Google-YouTube’s predatory copyright infringement and willful blindness to mass piracy is exceptionally anticompetitive and profitable because it:
Generates an unbeatable cost advantage by avoiding the market cost of propertied goods for which law-abiding competitors must pay;
Creates an unfair, jump-the-gun, time-to-market advantage, by ignoring the rule of law standard of securing permission from property owners before use in the marketplace, a business practice that law-abiding competitors must respect;
Spawns and maintains a matchless online monopsony index/inventory advantage that no law-abiding competitor could hope to assemble; and
Kneecaps property-based, subscription-monetization models which compete with Google’s piracy-friendly, free advertising model.
Google’s forced video commons strategy is the ultimate predatory anti-competitive business practice, in that it unlawfully destroys the value of any copyrighted innovation and creative proprietary trade secret advantage a competitor may produce in a free market.
In short, Google-YouTube has an undisputed, demonstrable anticompetitive pattern of behavior over a decade that seeks to predatorily extort better wholesale video pricing by threatening to devalue, debase, and destroy video programmers’ businesses via willful blindness to mass video piracy on YouTube.
It should be beneath the FCC to allow itself to be used as Google’s de facto “muscle,” extorting and forcing via government mandates terms that monopolist Google could not fairly negotiate by itself in the vibrantly competitive ~$200b pay-TV marketplace.
In today’s edition of The Heartland Daily Podcast, we listen in as Bruno Behrend, Heartland senior fellow for education policy, speaks before the Great Homeschool Convention in Cincinnati, Ohio. He discusses education choice and how homeschooling is blazing the trail.
Behrend talks about the importance of education choice and how we must embrace more of a market system, funding the child instead of the bureaucracy. Behrend also explains his ideal plan for education and how it would foster growth and innovation in the school system.
The vaunted “97% consensus” on dangerous manmade global warming is just more malarkey
By now, virtually everyone has heard that “97% of scientists agree: Climate change is real, manmade and dangerous.” Even if you weren’t one of the 31 million followers who received this tweet from President Obama, you most assuredly have seen it repeated everywhere as scientific fact.
The correct answers to those three claims are “yes,” “some,” and “no.” Yes, climate change is real. There has never been a period in Earth’s history when the climate has not changed somewhere, in one way or another.
People can and do have some influence on our climate. For example, downtown areas are warmer than the surrounding countryside, and large-scale human development can affect air and moisture flow. But humans are by no means the only source of climate change. The Pleistocene ice ages, Little Ice Age and monster hurricanes throughout history underscore our trivial influence compared to natural forces.
As for climate change being dangerous, this is pure hype based on little fact. Mile-high rivers of ice burying half of North America and Europe were disastrous for everything in their path, as they would be today. Likewise for the plummeting global temperatures that accompanied them. An era of more frequent and intense hurricanes would also be calamitous; but actual weather records do not show this.
It would be far more deadly to implement restrictive energy policies that condemn billions to continued life without affordable electricity – or to lower living standards in developed countries – in a vain attempt to control the world’s climate. In much of Europe, electricity prices have risen 50% or more over the past decade, leaving many unable to afford proper wintertime heat, and causing thousands to die.
Moreover, consensus and votes have no place in science. History is littered with theories that were long denied by “consensus” science and politics: plate tectonics, germ theory of disease, a geocentric universe. They all underscore how wrong consensus can be.
Science is driven by facts, evidence and observations – not by consensus, especially when it is asserted by deceitful or tyrannical advocates. As Einstein said, “A single experiment can prove me wrong.”
During this election season, Americans are buffeted by polls suggesting which candidate might become each party’s nominee or win the general election. Obviously, only the November “poll” counts.
Similarly, several “polls” have attempted to quantify the supposed climate change consensus, often by using simplistic bait-and-switch tactics. “Do you believe in climate change?” they may ask.
Answering yes, as I would, places you in the President’s 97% consensus and, by illogical extension, implies you agree it is caused by humans and will be dangerous. Of course, that serves their political goal of gaining more control over energy use.
The 97% statistic has specific origins. Naomi Oreskes is a Harvard professor and co-author of Merchants of Doubt, which claims those who disagree with the supposed consensus are paid by Big Oil to obscure the truth. In 2004, she claimed to have examined the abstracts of 928 scientific papers and found a 100% consensus with the claim that the “Earth’s climate is being affected by human activities.”
Of course, this is probably true, as it is unlikely that any competent scientist would say humans have no impact on climate. However, she then played the bait-and-switch game to perfection – asserting that this meant “most of the observed warming of the last 50 years is likely to have been due to the increase in greenhouse gas concentrations.”
However, one dissenter is enough to discredit the entire study, and what journalist would believe any claim of 100% agreement? In addition, anecdotal evidence suggested that 97% was a better figure. So 97% it was.
Then in 2010, William Anderegg and colleagues concluded that “97–98% of the climate researchers most actively publishing in the field support … [the view that] … anthropogenic greenhouse gases have been responsible for most of the unequivocal warming of the Earth’s average global temperature” over a recent but unspecified time period. (Emphasis in original.)
To make this extreme assertion, Anderegg et al. compiled a database of 908 climate researchers who published frequently on climate topics, and identified those who had “signed statements strongly dissenting from the views” of the UN’s Intergovernmental Panel on Climate Change. The 97–98% figure is achieved by counting those who had not signed such statements.
Silence, in Anderegg’s view, meant those scientists agreed with the extreme view that most warming was due to humans. However, nothing in their papers suggests that all those researchers believed humans had caused most of the planetary warming, or that it was dangerous.
The most recent 97% claim was posited by John Cook and colleagues in 2013. They evaluated abstracts from nearly 12,000 articles published over a 21-year period and sorted them into seven categories, ranging from “explicit, quantified endorsement” to “explicit, quantified rejection” of their alleged consensus: that recent warming was caused by human activity, not by natural variability. They concluded that “97.1% endorsed the consensus position.”
However, two-thirds of all those abstracts took no position on anthropogenic climate change. Of the remaining abstracts (not the papers or scientists), Cook and colleagues asserted that 97.1% endorsed their hypothesis that humans are the sole cause of recent global warming.
Again, the bait-and-switch was on full display. Any assertion that humans play a role was interpreted as meaning humans are the sole cause. But many of those scientists subsequently said publicly that Cook and colleagues had misclassified their papers – and Cook never tried to assess whether any of the scientists who wrote the papers actually thought the observed climate changes were dangerous.
My own colleagues and I did investigate their analysis more closely. We found that only 41 abstracts of the 11,944 papers Cook and colleagues reviewed – a whopping 0.3% – actually endorsed their supposed consensus. It turns out they had decided that any paper which did not provide an explicit, quantified rejection of their supposed consensus was in agreement with the consensus. Moreover, this decision was based solely on Cook and colleagues’ interpretation of just the abstracts, and not the articles themselves. In other words, the entire exercise was a clever sleight-of-hand trick.
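The arithmetic behind that 0.3% figure is easy to check. A minimal sketch, using only the abstract counts quoted above (the variable names are my own):

```python
# Abstract counts quoted in the discussion above (Cook et al. 2013 review).
total_abstracts = 11944       # papers whose abstracts were reviewed
explicit_endorsements = 41    # abstracts explicitly endorsing the stated consensus

# Share of reviewed abstracts that explicitly endorsed the consensus.
endorsement_share = explicit_endorsements / total_abstracts * 100
print(f"{endorsement_share:.1f}%")  # prints 0.3%
```

Rounded to one decimal place, 41 of 11,944 is indeed the 0.3% figure cited above, a far cry from 97.1%.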
What is the real figure? We may never know. Scientists who disagree with the supposed consensus – that climate change is manmade and dangerous – find themselves under constant attack.
Harassment by Greenpeace and other environmental pressure groups, the media, federal and state government officials, and even universities toward their employees (myself included) makes it difficult for many scientists to express honest opinions. Recent reports about Senator Whitehouse and Attorney General Lynch using RICO laws to intimidate climate “deniers” further obscure meaningful discussion.
Numerous government employees have told me privately that they do not agree with the supposed consensus position – but cannot speak out for fear of losing their jobs. And just last week, a George Mason University survey found that nearly one-third of American Meteorological Society members were willing to admit that at least half of the climate change we have seen can be attributed to natural variability.
Climate change alarmism has become a $1.5-trillion-a-year industry – which guarantees it is far safer and more fashionable to pretend a 97% consensus exists, than to embrace honesty and have one’s global warming or renewable energy funding go dry.
The real danger is not climate change – it is energy policies imposed in the name of climate change. It’s time to consider something else Einstein said: “The important thing is not to stop questioning.” And then go see the important new documentary film, The Climate Hustle, coming soon to a theater near you.
David R. Legates, PhD, CCM, is a Professor of Climatology at the University of Delaware in Newark, Delaware.
A new study published in Environment International indicates hydraulic fracturing, commonly called “fracking,” and the heavy truck traffic that is associated with it would have a negligible impact on air quality if fracking were to be used extensively in the United Kingdom. Interestingly, the authors of the study appear to be a little disappointed with their findings, which may be why they decided to emphasize maximum exposure in a shorter timeframe in their study, rather than exposures over more realistic scenarios.
Hydraulic fracturing is a technique for extracting oil and natural gas from stubborn rocks, such as shale and tight sandstone. In less than 10 years, fracking has turned the United States from an “also-ran” to an energy superpower that has nearly doubled its oil production since 2008, making it the largest producer of natural gas in the world. The technique could also boost natural gas production in the United Kingdom, but fracking has been met with staunch opposition from environmental groups concerned about the potential impacts drilling, production, and heavy truck traffic may have on the region.
Heavy vehicles are associated with producing higher levels of noise, road damage, and air pollution in the form of small particulates—which form as a result of fuel combustion in all vehicles—compared to lighter vehicles. The authors of the paper developed a traffic impact model to produce an environmental assessment of both the short-term and long-term impacts of fracking at individual sites, as well as regional impact analysis.
According to the model developed by the researchers, heavy vehicle traffic related to fracking for an individual well, multi-well pad, or even a region would be negligible compared to the traffic of the region as a whole or the emissions associated with other established industrial sectors. However, the researchers did suggest there could be an increase in particle emissions in the air during the fracturing process, which requires hundreds of trucks hauling water and sand to a well site, although the study was not clear on whether pollution standards were likely to be exceeded.
Particulate matter and other particle pollution can have an effect on the health of nearby residents if it exceeds the health-based safety standards, but people must generally be exposed to these levels of particulate matter for long periods for it to have negative health effects. This is why exposure to harmful particles is typically measured as a time-weighted average over a series of years (three years in the United States) to determine whether it will have adverse human health impacts. The short durations for which heavy traffic would take place during fracking would be unlikely to affect these longer-term averages.
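To see why a short-lived traffic spike barely moves such an average, consider a hypothetical sketch. The concentration values below are invented for illustration, not taken from the study:

```python
def multi_year_average(annual_means):
    """Average of annual mean concentrations over the window,
    loosely mirroring the multi-year averaging described above."""
    return sum(annual_means) / len(annual_means)

# Hypothetical annual mean PM2.5 levels (micrograms per cubic meter).
baseline_years = [10.0, 10.0, 10.0]      # three years of background levels
with_drilling_year = [10.0, 10.0, 10.5]  # one year includes a drilling-traffic bump

print(multi_year_average(baseline_years))      # 10.0
print(multi_year_average(with_drilling_year))  # ~10.17
```

Even a noticeable single-year bump nudges the three-year average only slightly, which is the study’s point about short-duration fracking traffic.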
Ironically, fracking may actually have an important role in reducing particle pollution in the United Kingdom in the coming decades, because Britain has been burning large quantities of diesel fuel, which emits far more particulates into the air than natural gas, for electricity generation.
Larger supplies of affordable natural gas will be essential if the United Kingdom wants to replace coal- and diesel-powered generation systems, which produce nitrogen oxides and particulate matter at significantly higher levels than natural-gas-fired power plants. Also, burning natural gas instead of coal emits half as much carbon dioxide into the atmosphere, and natural gas emits about one-third less carbon dioxide than gasoline or diesel.
Although the authors of the study don’t seem thrilled about the results, people in Britain should be, because it shows over a longer baseline—the entire operational lifetime of a pad—fracking would result in negligible relative increases compared to baseline traffic impacts. These findings, in addition to the environmental benefits of natural gas compared to coal or diesel, should make environmentally conscious people in the United Kingdom eager to consider the environmental benefits of fracking.
In today’s Health Care News Podcast, Brian Blase, senior research fellow at the Mercatus Center at George Mason University, joined Health Care News Managing Editor Michael Hamilton to discuss the disparity between promises many Americans were told the Affordable Care Act (ACA) would fulfill, and the stunning reality three years into the ACA’s implementation and six years after President Barack Obama signed the ACA into law.
Blase and Hamilton walk through the ruins. For most individuals and employers, health insurance costs more. Only half the number of people expected to gain coverage in the federal exchanges have done so. More Americans are insured, but the primary vehicle for insuring them has been expansion of Medicaid at the state level, an expensive program notorious for delivering sub-par health care. Insurers have dropped out or severely narrowed provider networks, limiting quality and consumer choice.
As part of The Heartland Institute’s continuing series of book and movie events designed to showcase freedom, the book “Drilling through the Core,” edited and with an introduction by Peter W. Wood, was presented by Wood on Wednesday, April 6, in the newly named Andrew Breitbart Freedom Center at Heartland’s Arlington Heights facility, 3939 North Wilke Road, Arlington Heights, IL 60004.
As noted on the back outside cover of “Drilling through the Core”:
“For the first time in history, Americans face the prospect of a unified set of national standards for K-12 education. While this goal sounds reasonable, and Common Core has been presented as a state-led effort, it is anything but. This book analyzes Common Core from the standpoint of its deleterious effects on curriculum — language arts, mathematics, history, and more — as well as its questionable legality, its roots in the aggressive spending of a few wealthy donors, its often-underestimated costs, and the untold damage it will wreak on American higher education. At a time when more and more people are questioning the wisdom of federally mandated one-size-fits-all solutions, ‘Drilling through the Core’ offers well-considered arguments for stopping Common Core in its tracks.”
Peter W. Wood is an anthropologist and former provost. He was appointed president of the National Association of Scholars in January 2009. Before that he served as NAS’s executive director (2007-2008) and as provost of the King’s College in New York City (2005-2007). Wood is the author of several books, including “A Bee in the Mouth: Anger in America Now” in 2007 and in 2003 “Diversity: The Invention of a Concept.”
Peter Wood was introduced by Lennie Jarratt, Project Manager – Education Transformation at The Heartland Institute.
According to Peter Wood, “The Common Core Is Dead.” It died of parental opposition, teacher opposition, political defection, and, perhaps most importantly, flat-out academic failure. But it would be foolish to think dead things can’t hurt us. Consider Bernie Sanders’ resurrection of socialist economic theories twenty-five years after the burial of the Soviet Union. Dead things can likewise take the living with them, as in the case of Jeb Bush and his unconditional support of Common Core, from which he realized sizable financial gains.
Common Core was first conceived by its architects, David Coleman and Jason Zimba, as a solution to the achievement gap between white and Asian students on the one hand and black and Hispanic students on the other. Finding this concept difficult to sell to the general public, they changed the premise: Common Core would now make all students ready for college and careers.
According to Peter Wood, Common Core was never intended to raise standards. Instead, it was a plan to establish a nationwide floor that would also be a ceiling. In other words, Common Core “was anti-excellence wrapped in the gift wrap and tinsel of excellence.”
A study by the liberal Brookings Institution in March of this year found no evidence that Common Core State Standards have made much of a difference during the six-year period when NAEP scores have been stagnant. The good news in the report is that Common Core does not appear to be the cause for the NAEP stagnation, as states not accepting Common Core suffered the same stagnation.
This troubling stagnation, as explained by Mr. Wood, has its basis in the following: an increase in single-parent families (the top factor), family dysfunction, financial insecurity, and immigration, all of which result in poor school performance and which likewise prove that changing the standards for K-12 education was never going to change the level of student performance. Common Core’s fine-tuned curriculum has seemingly moved in the opposite direction, which explains why SAT and ACT scores have dipped in the Common Core era.
Common Core Language Arts and Math Standards evaluated
Instruction in literature has declined, replaced by nonfiction. Why? Because Common Core insists that students learn best by treating everything as informational text, despite the ability of literature to teach students how to read beyond the literal text. Through literature, students learn how to see the forest and not just the pine needles. Common Core leads students into the territory of pine needles.
Regarding Common Core’s English Language Arts standards, Mr. Wood knows of no college that would value an approach to literature that chops everything into fine pieces and then dissolves content, so students come away not knowing why they read “Moby Dick” or any other book. However, such a spoon-fed diet of fragmented versions of great literature conforms to how Common Core views literature. In its fragmented approach, Common Core is able to ward off literature as dangerously privileged or even elitist.
Common Core math slows the pace of math instruction. Before Common Core, almost all states expected students to master basic addition and subtraction by third grade; Common Core decided fourth grade would do. The multiplication table and long division have been moved from 5th grade to 6th grade. Algebra is kicked up to 9th grade. Often there is no room for pre-calculus instruction: logarithms are barely mentioned, parametric equations are absent, and arithmetic series are omitted. Adults can live without this mathematical knowledge, but the door is being shut for millions of students on careers in fields where a solid foundation in math is critical.
The thinning out of the math standards betrays the two main promises made by the already-mentioned Common Core architects, David Coleman and Jason Zimba: that the standards would make students college- and career-ready, and that they would be internationally benchmarked against standards at least as high as those of countries that excel in math. Last year the U.S. ranked a dismal 28th.
Many parents have noticed that their children are being taught tediously complicated forms of computation in primary school, methods that seem deliberately meant to drive a wedge between parent and child. Geometry is now being taught in a way tried before in the Soviet Union in the 1980s, where it was deemed a failure and discarded.
The aims of Common Core
As to what kind of people we want our children to become, as inferred from the nature of the Common Core standards, the outcome is summarily set forth by Peter Wood:
Common Core aims to make children into well-organized utility-maximizers — people who do not waste time contemplating hard problems or dreaming big dreams, but who have a ready means to cut things down to the size they already know how to handle. The perfect job for a Common Core graduate is probably coding.
Parroting the confession made by one of the Common Core architects, Common Core defenders now use the excuse that the initial “college ready” promise of Common Core was meant to convey readiness to attend a “community” college.
Peter Wood is adamant that Common Core is finished and that a resurrection by die-hard partisans can’t be achieved, for dead is dead! Mr. Wood mused about how the Common Core mess will be cleaned up, who will pay for it, and what will come next.
Not so sure about “dead is dead”
Given the millions of dollars the Gates Foundation provided to set the stage for Common Core (some of which was used as bribe money to convince cash-strapped states to sign on to Common Core sight unseen) and the large investments school districts have made in textbooks, teacher training, and computers to support the Common Core tests, the Common Core curriculum can’t be eliminated simply by wishing it away or waving a magic wand.
Consider what happened this past December when both the U.S. Senate and the House voted to continue this nation’s federal boondoggle in education. Despite talking points about getting the feds out of creating standards, there is still a requirement that states continue to maintain high state standards, a clear nod to the continuation of the much-hated Common Core State Standards. Furthermore, states must continue to submit their state plans for review and approval by the U.S. Secretary of Education.
Eagle Forum described the bill, “Every Student Succeeds,” as “Common Core by a New Name and on Steroids.” Supported by the owners of the Common Core standards, the bill (S 1177) was guided through Congress by Senator Lamar Alexander (R-TN).
Suggestions given by Mr. Wood as to how parents can survive waiting out the bad years ahead:
1) Move children out of public schools.
2) Keep children in public schools but work extra hard at home to compensate for Common Core’s poor delivery of essential knowledge and its mis-channeling of children’s intellectual development.
Peter Wood’s current focus is mitigating the upstream damage to higher education. One of his battles will be fighting the continuing effort of the College Board, under David Coleman’s stewardship, to institutionalize as much of Common Core as possible through the SAT and Advanced Placement examinations.
A live-streamed YouTube video of the Peter Wood event can be seen here.
Upcoming Heartland events in April will feature F. H. Buckley, author of “The Way Back: Restoring the Promise of America,” on Thursday, April 14, from 5:30 p.m. to 8:00 p.m.
On Wednesday, April 18, Brian Fojtik and Victoria Vasconcellos will speak about the impact the new “vapor wars” over e-cigarettes are having on science, public policy, business, and jobs.
Events are free. To register, visit Heartland’s event page or call Heartland at 312/377-4000.
In this election cycle, we hear a lot about the “establishment.” Most people are not really sure who they are, but they are sure that they do not like them. The anger toward the establishment is not party specific and has propelled two unlikely candidates: Donald Trump on the Republican side and Senator Bernie Sanders for the Democrats.
The faithful following these outsiders may be more about “the grassroots trying to teach the establishment a lesson,” as Gary Bauer posited last month, than about affection for either man. In an InfoWars video, reporter Richard Reeves, at the University of Texas at Austin, speaks to Wyatt, a young man who’d just voted for Sanders. Wyatt indicates that most of his fellow students likely voted for Sanders as well. The surprise is his comment about the students’ second choice: “Donald Trump.” Why? He’s not “establishment.” Wyatt admits he didn’t consider voting for anyone else—just Sanders and Trump.
The establishment has been slow to grasp the public’s rejection of an increasingly distrusted political class.
However one might define the “establishment,” it certainly includes long-time Washington politicians like Senators Harry Reid (D-NV), Bill Nelson (D-FL), Ron Wyden (D-OR), John Thune (R-SD), Orrin Hatch (R-UT), and Mitch McConnell (R-KY)—who have just engaged in the exact tactics that have fed the voter frustration aimed at them. Avoiding a vigorous debate, they are using a must-pass bill to sneak through millions in totally unrelated taxpayer giveaways to special interests in the renewable energy industry—and they hope voters won’t notice.
The bill is the Federal Aviation Administration (FAA) Reauthorization Act. On April 6, using an unrelated House bill (H.R. 636) that will serve as the legislative shell for the Senate’s FAA measure (S. 2658), the Senate began consideration to reauthorize the FAA for 18 months. It is expected that the bill will be voted on this week, followed by the House—which will take it up when it is back in session.
Funding for the FAA expired in September and received a 6-month extension, which expired again on March 31. To avoid a shutdown, Congress passed another extension, which President Obama signed on March 30. This legislation authorized federal spending on aviation and related aviation taxes through mid-July 2016.
Both the House and Senate have been grappling with a multi-year aviation bill. Now, FAA reauthorization only has about two weeks to be debated and approved before it will be shoved aside to make way for budget proceedings. One major point of conflict is the renewable energy tax breaks. Because the Senate FAA bill includes a tax title, it is open to unrelated tax amendments.
Many renewable energy tax credits were extended in the omnibus spending package that was passed late last year, but Democrats claim that in the chaos of last minute negotiations, some were “unintentionally” left out. According to Morning Consult, Thune said: “This is what [Democrats] always viewed as the best opportunity to get some of these things that were left out of last year’s extender bill.” Senate Minority Leader Reid announced: “the inclusion of the provisions is a requirement for the legislation to move forward.”
While many Republicans opposed the addition of the renewable energy tax credits, provisions supporting investments in fuel cells, geothermal and biomass were included in the Senate negotiations. Addressing the Senate’s scramble to “settle on a cohesive strategy” regarding attaching the renewable energy tax breaks to the bill, Politico reports: “House Republicans have made it clear they’re not interested in renewing any of the expired tax provisions this year.” The bill’s coverage in Roter Daily states: “key Republicans have already warned fellow House members to oppose a deal on tax extenders if it comes out of the Senate, saying they have consistently failed to promote economic growth and create jobs.”
As we have seen with the recent demise of government-funded, green-energy projects, such tax credits and subsidies have repeatedly failed to deliver on their promises of long-term job creation and economic viability. It is for this reason that, on April 5, a coalition of more than 30 organizations sent a letter to the Senate Finance Committee expressing our deep opposition to the proposal. The letter, of which I am a signatory, states: “Congress considered the matter of expiring tax provisions less than 4 months ago. … It should also be noted that Congress extended significantly favorable tax treatment to renewable energy in omnibus appropriation legislation that accompanied the aforementioned tax extender package.”
Andrew Langer, President of the Institute for Liberty, who also signed the letter, explains his position: “In December, Congress purposefully allowed a series of tax credits for so-called ‘green’ energies to expire. This was not some mere oversight as some have alleged, but a purposeful recognition that as the energy landscape has changed, the need to extend some two dozen of these credits was unwarranted. Others were allowed to continue—but roughly $1.5 billion were not.”
If you believe, as all the signatories to the letter do, that American taxpayers shouldn’t have to prop up large, well-connected special interests through tax handouts, carve-outs, and loopholes funded by unsustainable Washington spending, please let your representatives know now. Please urge Senate offices to oppose keeping in the tax extenders, and encourage House offices to oppose adding in extenders.
With our national debt totaling more than $19 trillion, the last thing we need is more corporate welfare. But our legislators are slow to learn. Senate Republicans, like Thune, who is the lead negotiator for the Republicans, have worked with the Democrats to include the renewable energy tax credits. Thune stated: “We’re listening to them and we’re working for them.”
No wonder the electorate is angry. But Washington politicians don’t get it. While a battle rages over who will be the next president, unfazed, the establishment continues on.
Langer concludes: “the political ramifications are clear, as history has taught us. Republicans who give in to cronyism, who give in to profligate spending… they get nothing in the end. Worse, they do considerable damage to the concept that Republicans are the party of lower spending and less government. In a political cycle where the future is entirely uncertain for Republicans at all levels, those who are pushing for these tax breaks do their colleagues no great service.”
Join us in educating the “establishment” by calling them and telling them: “No more green pork!” #GreenPork
The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc., and the companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). She hosts a weekly radio program: America’s Voice for Energy—which expands on the content of her weekly column. Follow her @EnergyRabbit.
Quick: What is 17 cents out of $100,000? If you said 0.00017 percent, you win the jackpot.
That number, by sheer coincidence, is also the percentage of methane in Earth’s atmosphere. That’s a trivial amount, you say: 1.7 parts per million. There’s three times more helium and 230 times more carbon dioxide in the atmosphere. You’re absolutely right, again.
Equally relevant, only 19% of that global methane comes from oil, natural gas and coal production and use. Fully 33% comes from agriculture: 12% from rice growing and 21% from meat production. Still more comes from landfills and sewage treatment (11%) and burning wood and animal dung (8%). The remaining 29% comes from natural sources: oceans, wetlands, termites, forest fires and volcanoes.
The manmade portions are different for the USA: 39% energy use, 36% livestock, 18% landfills, and 8% sewage treatment and other sources. But it’s still a piddling contribution to a trivial amount in the air.
Of course, the Obama EPA and Climate Cataclysm Industry ignore these inconvenient facts. They insist that methane is “a far more potent greenhouse gas” than carbon dioxide, and that its emissions must be drastically reduced if we are to avoid “runaway global warming.” So EPA and other federal agencies are preparing to unleash a tsunami of new regulations to block natural gas drilling, fracking, flaring and production, while radical environmentalists orchestrate new assaults on petrochemical plants that create plastics, paints, fabrics, computer and vehicle components and countless other products for modern life.
They want us to believe that government regulators can decree Earth’s climate simply by controlling methane and carbon dioxide – regardless of what the sun, ocean circulation, recurrent planetary temperature cycles and other powerful natural forces might do. They say it’s pure coincidence that these two trace gases (CH4 and CO2) are the only climate-affecting mechanisms that are associated with the fossil fuels and industrialized economies they despise.
They also want us to believe reducing United States methane emissions will make a huge difference. But even if US manmade methane emissions are 20% of the worldwide total, fossil fuels account for just 39% of that US share, so even totally eliminating US fossil-fuel methane emissions would reduce global manmade methane output by a minuscule 7.8 percent. Under a best-case scenario, that might keep atmospheric methane below a still irrelevant 0.00020% (2.0 ppm; 20 cents out of $100,000) for a few more years.
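The column’s back-of-the-envelope numbers can be checked in a few lines. This is a minimal sketch using only the figures stated in the text; the 20% US share is the column’s own hypothetical, not a measured value.

```python
# Convert the cited methane concentration from ppm to percent,
# then compute the column's "7.8 percent" figure.

METHANE_PPM = 1.7  # atmospheric methane concentration cited in the text

# 1 ppm = 1 part in 1,000,000 = 0.0001 percent
methane_pct = METHANE_PPM / 1_000_000 * 100
print(f"{methane_pct:.5f}%")  # prints "0.00017%" -- 17 cents out of $100,000

us_share_of_global_manmade = 0.20  # column's hypothetical US share
us_fossil_fuel_share = 0.39        # fossil-fuel portion of US manmade methane

reduction = us_share_of_global_manmade * us_fossil_fuel_share
print(f"{reduction:.1%}")  # prints "7.8%" of global manmade methane output
```

The same conversion confirms the opening riddle: 17 cents out of $100,000 and 1.7 ppm are both 0.00017 percent.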
This smells like fraud. And as New York AG Eric Schneiderman so kindly reminded the climate skeptics he’s threatening with RICO, “The First Amendment does not give anyone the right to commit fraud.”
Perhaps EPA plans to go after America’s agricultural sector next. After all, as former UN Secretary General Kofi Annan intoned last year, red meat is bad for us (cancer) and for the climate (animal flatulence and manure). Moreover, “insects have a very good conversion rate from feed to meat,” there are 1,900 species of edible insects on Planet Earth, and more than a billion people already make bugs part of their diet. Perhaps the IPCC and White House will serve roasted roaches at their next state dinners?
That would reduce US methane emissions a bit more. But it gets even more deceitful, more barking mad.
The un-ratified 2015 Paris climate treaty obligates the United States, Australia, Canada and Europe to continue reducing their fossil fuel use and emissions – even though they can hardly afford to kill more millions of jobs and further roll back living standards for all but their ruling elites.
Meanwhile, developing countries will not and cannot afford to lock up their fossil fuels, shut down their economic growth, and leave billions of people mired in poverty, malnutrition and disease. Indeed, under the Paris treaty, they are not required to reduce their fossil fuel use or “greenhouse gas” emissions; they need only take voluntary steps to reduce them, when it is convenient for them to do so.
That means slashing US methane (and carbon dioxide) emissions – and the jobs, living standards, health and welfare that fossil fuels bring – will have no effect whatsoever on atmospheric greenhouse gas levels.
But that is irrelevant to Mr. Obama and his EPA. The fact is, this methane mendacity and madness has nothing to do with stabilizing Earth’s climate. It has everything to do with hogtying and bankrupting US fossil fuel companies, controlling industrial activities and people’s living standards – and mandating a costly transition to renewable energy, while rewarding the hordes of scientists, activists and industrialists who benefit from the $1.5-trillion-per-year Climate Crisis, Inc. money train.
That raises a critical question: Just where and how will we produce those “eco-friendly” biofuels?
US ethanol production alone requires all the corn grown on an area the size of Iowa (36 million acres), and it makes up only 10% of the country’s E10 gasoline blends. Replacing all gasoline with ethanol from corn, sorghum or still-illusory switchgrass would therefore require ten Iowas: 360 million acres. But there is one other critical factor: ethanol has one-third less energy per gallon than pure gasoline.
That means we would need to plant an additional 120 million acres, 480 million acres in all, just to replace gasoline. That’s equal to Alaska, California and West Virginia combined!
Replacing all the liquid petroleum we use annually (291 billion gallons) would require twice as much land – some 45% of all the land in the United States: six times more land than we currently have under cultivation for all cereal crops – plowing even marginal croplands, deserts, forests and grasslands.
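The acreage arithmetic above can be reproduced directly from the column’s stated figures. This sketch simply scales the current corn-ethanol footprint as the text does, and then applies the column’s one-third adjustment for ethanol’s lower energy content per gallon.

```python
# Land-area arithmetic for replacing all gasoline with corn ethanol,
# using the figures stated in the column.

IOWA_ACRES = 36_000_000  # acres now devoted to US ethanol corn ("one Iowa")
E10_SHARE = 0.10         # ethanol's share of current E10 gasoline blends

# Scale the current acreage from a 10% blend up to 100% gasoline replacement.
base_acres = IOWA_ACRES / E10_SHARE
print(f"{base_acres:,.0f}")  # prints "360,000,000" -- ten Iowas

# The column then adds one-third more acreage to offset ethanol's
# one-third lower energy content per gallon: 360M + 120M = 480M acres.
total_acres = base_acres + base_acres / 3
print(f"{total_acres:,.0f}")  # prints "480,000,000"
```

At 480 million acres, the total matches the column’s comparison to the combined area of Alaska, California, and West Virginia.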
We’d also need far more fuel to grow, harvest and convert those crops into “eco-friendly” fuel. That would likely mean turning southern Canada into a vast biofuel plantation – unless, of course, the ruling classes simply impose lower living standards and vehicle ownership restrictions on us commoners.
Growing biofuel crops also requires hundreds of times more water than is needed to conduct hydraulic fracturing (fracking) operations to produce the same amount of energy from oil and gas, on a tiny fraction of the acreage. Where on this water-starved planet will that precious liquid come from?
Biofuel crops also require prodigious amounts of fertilizer and pesticides. And if organic and anti-GMO factions have their way, far more land would be needed, pest control would be minimal or done by hand, and fertilizer would come from human wastes and animal manure – raising even more complex issues.
To put it bluntly, a biofuel future would be totally and disastrously unsustainable.
There’s another deep, dark secret about biofuels. Somebody needs to tell Obama, McCarthy, Clinton, Sanders and their army of “green” supporters that biofuels are hydrocarbons! They are composed of carbon and hydrogen, though in less complex molecular structures than what we pull out of the ground – which means we get less energy per gallon. And when we burn them, they release carbon dioxide!
We have at least a century of untapped oil and natural gas (and of coal) right under our feet. To lock that up, based on unproven, illusory, fabricated, fraudulent climate chaos claims, is utter insanity.
Even crazier, most anti-fossil-fuel zealots also oppose nuclear and hydroelectric power – and want future electricity generated primarily or solely with wind turbines and solar panels. To blanket our scenic, crop and wildlife lands with wind farms, solar installations and biofuel plantations – and destroy economies, jobs, living standards, health and welfare in the process – is nothing short of criminal.
President Obama and presidential candidates Clinton and Sanders assure us we can have 30% renewables by 2030, 50% by 2050, 100% by 2100 – or some similar magic, catchy, sound bite concoction.
Voters should demand to know exactly how they will make this happen. If they cannot or will not answer satisfactorily, a strong case can be made for the proposition that they are too ignorant and dishonest to hold office – and that their supporters are too stupid and anti-environment to vote.
What do Michelangelo, LeBron James, Steve Jobs, and Bernie Sanders have in common? In this episode of the weekly Budget & Tax News podcast, managing editor and research fellow Jesse Hathaway talks with Yaron Brook, the president and executive director of the Ayn Rand Institute to find out!
Equality is not fair, Brook says, and it’s not an achievable goal in the real world, because everyone has different abilities and skills. Instead of weighing down LeBron James so he’s no better at basketball than anyone else, or making Michelangelo paint left-handed so he’s no better at art than other people, we should embrace the fact that everyone has different talents.
“Income equality alarmism,” as Brook calls it, is causing people to fear that some people might have different levels of success in life, leading them to conclude that big-government policies like higher taxes and protectionist trade policies are the only way to “make America great again.” Instead of turning to government to keep everyone equal, Brook says people should embrace their inner Steve Jobs and strive for success… and government should get out of the way of the people’s right to the pursuit of happiness.
During March 22 hearings before the House Energy and Commerce Committee, under questioning by West Virginia Rep. David McKinley (R), EPA Administrator Gina McCarthy admitted (once again) the Obama administration’s climate efforts will do nothing to protect public or environmental health. McCarthy instead acknowledged the efforts are merely a symbolic attempt to get other countries’ leaders to join the Paris climate agreement.
Concerning the Clean Power Plan (CPP), McKinley asked, “If it doesn’t have an impact on climate change around the world, why are we subjecting our hardworking taxpayers and men and women in the coal fields to something that has no benefit?”
“We see it as having had enormous benefit in showing sort of domestic leadership, as well as garnering support around the country for the agreement we reached in Paris,” McCarthy responded.
There is nothing new in McCarthy’s admission. In September 2013, in response to questions from Rep. Mike Pompeo (R-KS) concerning CPP, which was then under development, McCarthy said she couldn’t show it would have any impact on the 26 “climate indicators” being tracked by the Environmental Protection Agency. McCarthy said, “[The CPP was] part of an overall strategy … positioning the U.S. for leadership in an international discussion.”
In July 2015, McCarthy testified before the House Science Committee, where she said, “The value of [the CPP] is not measured [by the amount of warming it prevents]. … I’m not disagreeing that this action in and of itself will not make all the difference we need to address climate action, but what I’m saying is that if we don’t take action domestically we will never get started.”
The domestic costs of climate symbolism are high. Regulations imposed by the Obama administration have already shuttered 265 coal-fired power units between 2009 and 2014, which took enough power offline to electrify 12.6 million homes. These closures cost 39,684 jobs at coal-fired electric power plants and thousands of other jobs at coal mines; at companies who provide services or machinery to the coal-mining, transportation, or power-generation industries; and at retailers and restaurants where the newly unemployed used to shop and dine. The North American Electric Reliability Corporation warns CPP will likely result in the closure of nearly five times more coal-fired power plants, which means hundreds of thousands of additional jobs could be lost.
Consumers will also be directly harmed by CPP. A study produced by NERA Economic Consulting shows CPP could raise electricity prices by double digits, with ratepayers in the 28 states hardest hit by CPP possibly facing price spikes greater than 20 percent.
Poor and many middle-income families, which include many minorities and those on fixed incomes, spend a higher percentage of their money on energy and commodities that are heavily dependent upon energy consumption, such as food and transportation, compared to relatively wealthy families. This means CPP and other similar policies are essentially a tax on the poor.
To some extent, the Obama administration’s symbolic climate policies have worked. They helped convince 184 nations to agree to cut or cap their emissions at the Paris climate conference in December 2015, but even these commitments are only symbolic. The United Nations has acknowledged even if all the Paris-agreement nations keep their commitments, the impact on global temperature will be minimal—around one or two-tenths of a degree Celsius, virtually too small to measure.
The costs imposed by the Paris agreement would be the most detrimental for the poorest among us. The carbon dioxide reductions agreed to in Paris would virtually guarantee continued poverty and unnecessary premature death for the most impoverished 1.5 billion people in the world.
Think about how many lives would be saved by increased fossil-fuel use in the health care industry alone. Modern hospitals cannot function without fossil fuels. Gasoline fuels emergency vehicles, and electricity keeps the lights, computers, climate controls, and refrigeration working properly. Fossil-fuel-powered electricity runs incubators that save the lives of premature babies, and respirators keep people breathing until they can breathe on their own. Electricity runs the machines that sterilize instruments, and the MRIs, X-rays, CT scans, and virtually all the other tests and technologies that enable medical professionals to predict, diagnose, and treat the many diseases and injuries humans suffer each year require significant amounts of electric power.
Electricity also delivers safe drinking water, and fossil fuels are used to make the plastics that are used in hospital blood and medicine bags, tubes, wiring, and even furniture.
How long should the poor in developing countries be required to forego the fossil-fuel dependent, lifesaving medical technologies we in the modernized Western world take for granted? Symbolic climate policies carry an unjustifiable price tag, killing people and leaving millions of families without energy and unable to raise themselves out of poverty. Nothing could be more immoral.
The industrial revolution started in Britain with inventors and entrepreneurs using coal to drive steam engines and make iron and steel. Generations have benefitted.
But Greens have started the DI (de-industrial) revolution. Their policies aim to have Britain producing no coal or steel and relying on expensive, subsidized, intermittent energy from windmills.
Consequently, heavy industry in Britain is in sharp decline, real jobs are scarce and some frosty night, lights will go out, trains will stop and “Earth Hour” will last until dawn.
The pampered green class will rejoice temporarily, but will soon join the modern pommy paupers forced back to candles, poaching and workhouses.
Race to go Green is killing UK heavy Industry:
Crippling Energy Costs cause layoffs at Tata Steel:
Energy Policy threatens British Steel:
Closure of last British Deep Coal Mine:
Greece, Spain and Italy are also infected with the DI disease. It has now spread to Australia where mining, energy and heavy industries are drowning in green ooze.
O’Sullivan’s Law, named after British journalist John O’Sullivan, holds that any organization or enterprise that is not expressly right wing will become left wing over time.
Nowhere does that law hold more true than on American college and university campuses, where surveys and campaign donation records show that leftist faculty routinely outnumber conservative faculty, and that faculty and administrators tend to donate to Democratic candidates for office over Republican candidates by ratios of close to 100 to 1.
And where faculty and administrators lead, can students be far behind?
One need look no further than the current Democratic Presidential campaign, in which Bernie Sanders, an avowed socialist whose campaign platform is indistinguishable from that of the Communist Party USA, draws much of his support from young college students lured by the unrealistic promise of “free” tuition for the education they largely aren’t receiving.
The reason college students no longer assuredly receive a genuine education is that, bowing to the political correctness of the day, true core curricula in the classics and Western civilization are largely anathema. Stanford students who marched around in the 1980s chanting “Hey, Hey, Ho, Ho, Western Civ Has Got to Go!” largely got their wish.
When the Bass brothers of Texas donated $20 million to Yale College in 1991 with the express directive that the money be used to fund core courses in Western Civilization, the University refused to go along with the brothers’ directives and gave the money back – despite an ongoing fundraising drive.
Instead of teaching the classics, American universities are busy falling all over themselves to be politically correct, imperiling free speech and academic freedom in the process. How did all this happen? Mostly through federal government intervention.
Title VII of the Civil Rights Act, as amended, prohibits discrimination in employment based on race, color, religion, sex (including pregnancy), national origin, age (40 or older), disability, or genetic information. Such discrimination also includes “harassment,” which the EEOC defines as “unwelcome conduct” directed against a person in one of these protected classes where “1) enduring the offensive conduct becomes a condition of continued employment, or 2) the conduct is severe or pervasive enough to create a work environment that a reasonable person would consider intimidating, hostile, or abusive.”
Title VII applies only to conditions of employment, which affects faculty and staff and, perhaps, graduate students and research assistants paid by the college or university, but not students themselves unless they also work for the institution. That’s where Title IX comes in.
Signed into law by President Richard Nixon in 1972, Title IX of the Education Amendments forbids educational programs or activities receiving Federal financial assistance from excluding any person from participation in, being denied the benefits of, or being the subject of discrimination under, any such program or activity on the basis of sex. That sounds simple enough: What’s sauce for the goose is sauce for the gander. And because almost all colleges accept students who take out federal loans, even private institutions are subject to government oversight.
Back in 1972, with the rare exception of hermaphrodites, the law considered the existence of only two sexes: male and female. And even at colleges and universities that admitted both sexes (some Ivy League schools did not, for example, before 1969), students were largely segregated by sex, at least officially, in their living, sleeping, and bathroom arrangements. Many state laws required that men and women use separate bathrooms, and some schools even enforced parietal rules that restricted the hours and locations in which students of the opposite sex could be in one another’s company.
This all sounds quaintly Victorian these days, but look at the result in today’s hyper-sexualized, hyper-liberal world: fights over who gets to use which locker room and a disputed but widely-touted claim that, before she graduates, one out of every five female students will be the subject of a sexual assault.
Genuine sexual assault is a serious crime and should be reported and dealt with as such. But the insistence of the federal Office of Civil Rights (OCR) that “sexual harassment” means “unwelcome conduct of a sexual nature” and includes “unwelcome sexual advances, requests for sexual favors, and other verbal, nonverbal, or physical conduct of a sexual nature” can lead to some absurd results.
As of March 2016 the OCR has investigated over 100 colleges and universities for such alleged Title IX violations as failing to respond to allegations of sexual assault until a complaint had been filed (to what does one respond absent the filing of a complaint?) and failing to consider the need for “broadly addressing the issue of sexual harassment” even after complainants requested anonymity and confidentiality.
In December 2013, a female sociology professor’s popular class on “Deviance in US Society” (no classics here!) at the University of Colorado-Boulder led to her early retirement after administrators told her that, in a “post-Sandusky” environment, the class entailed too much risk. Small loss, perhaps, but a Louisiana State University early childhood education associate professor also lost her job in 2015 for simply allegedly using “salty language” that same December.
More famously, Northwestern University students petitioned to have Professor Laura Kipnis investigated for “retaliation” under Title IX simply for writing a piece about the absurdity of it all for The Chronicle of Higher Education entitled “Sexual Paranoia Strikes Academe.” Bizarrely, the Faculty Senate President who accompanied Kipnis to sessions with investigators and the University President, who wrote a Wall Street Journal Op-ed defending academic freedom, were also targeted for alleged “retaliation.”
“For the record,” in the words of Kipnis, “this isn’t retaliation. It’s intellectual disagreement … what’s the good of having a freedom you’re afraid to use?”
The Occupational Safety and Health Administration (OSHA) recently finalized a rule reducing the amount of respirable crystalline silica workers can be exposed to during an eight-hour day. The rule, which cuts the permissible exposure limit (PEL) in half, will have broad implications for industries that use industrial sand, including hydraulic fracturing, glass making, foundries, and construction.
In this episode of The Heartland Daily Podcast, Mark Ellis, president of the National Industrial Sand Association and the National Industrial Minerals Association of North America, and Isaac Orr discuss the specific changes being made to the current silica rules and why the new changes may not be necessary to prevent new cases of silicosis, a serious but entirely preventable lung disease.
According to Mark Twain, “Everybody talks about the weather, but nobody does anything about it.” Now six state attorneys general (AGs) have banded together to do something about it by initiating governmental legal prosecution. Can a modern “Reign of Terror” be far behind?
These AGs allege a climate conspiracy, claiming investors have been kept unaware that climate change may affect corporate investments, and they have committed to using the power of the state to remedy it. In early November 2015, New York State Attorney General Eric T. Schneiderman targeted Exxon Mobil, pursuing a legal case based on claimed similarities to the tobacco companies of the 1950s and 1960s, which were eventually found guilty of suppressing their own research showing tobacco to be both harmful and addictive. Their advertising served to minimize those health risks, and they funded scientific studies to provide reassuring data. In 2006, the companies were found guilty of “a massive 50-year scheme to defraud the public.”
Last month, Virginia AG Mark Herring joined five other AGs and former Vice President Al Gore in the goal of determining “whether fossil fuel companies misled investors and the public on the impact of climate change on their businesses.” AG Herring is also a supporter of EPA’s Clean Power Plan.
When valid scientific argument fails its cause, the heavy fist of governmental legal prosecution becomes Plan B. “Attorneys General and law enforcement officials around the country have long held a vital role in ensuring that the progress we have made…” according to Gore. That becomes the “inconvenient truth” of governmental dogma.
The Minoan (3440 YBP), Roman (2400 YBP), and Medieval (1000 YBP) warming periods occurred before the current uptick in atmospheric CO2, which commenced around 1850 AD with the advent of the modern industrial age and large-scale use of fossil fuels. Global temperatures during these three earlier periods were warmer than at present even though atmospheric CO2 was much lower and fossil fuel use minuscule. Although global temperature is an imperfect metric of climate change, it is the measure that makes the alarmists’ headlines. The public is repeatedly subjected to a faux alarm over the fact that since 1850 global temperatures have risen about 0.8°C, which is less than the sunrise-to-sunset daily temperature swings mankind experiences worldwide.
Claiming disastrous climate change related to human activities, they disregard eons of natural climate variation before human existence. “Climate change” itself remains a vague and undefined term. No student of history denies that the climate changes. These legal determiners of scientific truth join the current vogue of labeling any variation from some idealized concept of an unchanging “normal” climate (the Goldilocks Climate) as a harbinger of impending disaster. The evidence is otherwise: the rate of sea level rise remains about 7 inches per century, there is no long-term drought, tornadoes are less frequent and less deadly, fewer hurricanes are hitting the U.S., even the polar bears are thriving in number, and global temperatures have not risen for 18 years even as CO2 levels have increased 10 percent (the recent El Niño caused an expected spike).
Droughts are cyclical on some time scale. From California: “Rain and snow returned in March (2016) to boost the state’s two largest reservoirs – Shasta Lake and Lake Oroville – to slightly above their historic levels for the date…”
In 2007, Georgia officials warned that Lake Lanier, a 38,000-acre reservoir that supplies more than 3 million residents of metropolitan Atlanta, Georgia with water, was less than three months from depletion. Smaller reservoirs were dropping even lower. This drought became a prime example for Gore’s inconvenient-truth traveling show, in which a gullible public is bombarded with the environmental panic du jour by a media that thrives on scare stories.
Unsurprisingly, a December 2015 report on Lake Lanier has not received the same attention from the media or environmentalists: “The Army Corps of Engineers canceled massive water releases planned for Lake Lanier Thursday because of downstream flooding. Lake Lanier is currently at 1075.3. That’s more than 4 feet above full pool. The lake is expected to peak at 1075.7 over the next week after rising more than 4 feet in the last several days.” Environmentalists, fret not: there is an environmental crisis awaiting your discovery, out there, somewhere. Move on and never look back; never say you were wrong.
Virginia’s AG Herring is committed to EPA’s Clean Power Plan proposal even though it is estimated to offset only 0.01°C of global warming. EPA Administrator Gina McCarthy, when asked whether she considers 0.01°C a significant contribution to halting climate change, said, “No…” It is the righteous feeling and thought that count, not the actual change achieved. Progressives base their actions on emotional satisfaction, not scientific analysis.
The tidewater/Hampton Roads/Norfolk coastal areas of Virginia are a constant source of justification for climate control advocates, including Virginia Governor Terry McAuliffe. Rising sea levels are targeted as the driving force for flooding concern. From there it is a short leap to global warming driving climate change and sea level rise.
Such dogmatic political posturing ignores the scientific findings of local scientists at the Virginia Institute of Marine Science, part of the College of William & Mary. Coastal marine scientist Dr. John Boon reported that “the good news is that absolute sea level in Chesapeake Bay is rising only about half as fast as the global average rise rate. The bad news is that local subsidence more than makes up for it.” Coastal flooding is a disaster, but it is the result of the land sinking as a consequence of ongoing geological forces, not climate change.
According to a well-worn piece of courtroom folklore: “1. If the facts are against you, argue the law. 2. If the law is against you, argue the facts. 3. If the facts and the law are against you, yell like hell.” To which one can now add a fourth rule: forget the facts, forget the law, and “sue into submission.”