“The Revolution devours its children,” wrote French royalist Jacques Mallet du Pan in 1793, but in the case of the American left, the children are now devouring their masters, both literally and figuratively. For the progressive war on free speech is nowhere more evident than on campus, where it has taken on sinister aspects completely apart from Title IX, about which we wrote in Part Two of this series.
Having been coddled as “special” and “unconditionally loved” (read “undisciplined”) by the village that now raises them, the children of yesterday are the woefully unprepared college students of today, too many of whom need “safe spaces” and can’t actually handle the education they supposedly attend college to receive.
Perhaps the first major example to hit the public eye was at the University of Missouri, which actually fired its football coach (see “A University the Football Team Can Be Proud Of”) in response to unsubstantiated accusations of alleged racist remarks made by people who didn’t even go to school there. But following closely behind were some of America’s allegedly elite institutions, where cowardly administrators cravenly capitulated to some of the most ridiculous demands imaginable.
Following the example of England’s Oxford and Cambridge, certain Ivy League institutions, namely Harvard and Yale, have for decades had a system that Harvard calls “houses” and that Yale prefers to call “Residential Colleges.”
Modeled after the actual colleges that exist within Oxford and Cambridge, Harvard’s Houses and Yale’s Residential Colleges are meant to be communities of scholars within the larger undergraduate institutions, each with its own library and resident faculty. One such faculty member, called the “Dean,” is responsible for each student’s academic progress. A second faculty member, known as the “Master” – as in “Master of the House” – is more broadly responsible for setting the “moral and intellectual tone of the college.”
In recent years, after decades of “affirmative action” aimed at boosting the enrollment of non-white “minority” students, it seems that the title of “Master” has become too much for some students – and indeed some faculty – to bear.
Never mind that, led by many graduates of both Harvard and Yale, this nation fought a bloody war to end slavery over 150 years ago, well before Harvard started its houses or Yale began its Residential Colleges in the 1930s. Never mind also that no student or faculty member alive today was ever a slave or a slave master. And never mind that the title of “Master” itself has nothing to do with slavery; it’s simply an accolade, British in origin, that refers to the head of the college within the college.
Never mind, too, that “Master” has manifold innocuous meanings. Schoolmasters are presiding officers of a school, and the U.S. has a Postmaster General. After receiving their baccalaureate degrees, some college students go on to pursue a Master’s degree, often on the way to a Ph.D. “Master” is a form of address, also typically British, used for boys and young men in formal correspondence, and the captain of a ship is more formally its “Master.”
Courts often appoint “Special Masters” to assist judges in fields requiring particular expertise; the martial arts have Masters; and certain peace-loving Buddhist monks and nuns are “Dharma masters.” Chess has its masters, guilds had master craftsmen, and let’s not forget the “Old Masters” of Western Renaissance painting. Then of course there are the ubiquitous hosts of events from wedding receptions to the Academy Awards, typically called the “Master of Ceremonies.” Even the American Inns of Court, dedicated to legal excellence, civility, professionalism, and ethics, have “Masters of the Bench,” again derived from British nomenclature.
In short, the false furor over “master” is as silly as the uproar over the use of the word “niggardly” (which means parsimonious or tight-fisted) that forced the resignation of a white mayoral staff member in Washington, D.C., back in 1999.
Students at Ivy League institutions really ought to know better. And if they don’t, then they should learn.
Commonwealth Edison (ComEd) is pushing for the deployment of 4,000,000 smart meters despite the fact that government agencies and the military have known for decades that Radio Frequency/microwaves can cause serious health effects.
This information is not new; it is just being brought to the forefront as a health crisis is emerging in Illinois. ComEd is using the Energy Infrastructure Modernization Act, also known as the “smart grid modernization bill” (written by ComEd lobbyists), and the Illinois Commerce Commission’s interpretation of that bill, as justification for installing millions of wireless smart meters.
The RF/microwave emissions from smart meters are listed by the World Health Organization’s International Agency for Research on Cancer (IARC) as a Class 2B (“possible”) carcinogen. That makes this the first time in history a classified carcinogen has been mandated for all homes, schools, and government buildings.
Barrie Trower, a retired British Secret Service microwave weapons specialist, states:
“The paradox is how Radio Frequency/microwave radiation can be used as a weapon to cause impairment, illness and death; and at the same time be used as a communications instrument [such as in smart meters].”
Trower continues, “By 1971 we knew everything that needed to be known.”
“A 1976 document summarizing U.S. Defense Intelligence research lists all of the health hazards caused by wireless devices and concludes: This should be kept secret to preserve industrial profit.”
Jerry Flynn is a retired Canadian Armed Forces captain with specialized training and 22 years of experience in Electronic Warfare and Signals Intelligence. Flynn has worked with U.S. and NATO armies in this specialized capacity. He writes:
“The U.S. military has known for decades that the RF/microwave frequencies most harmful to man are those within the band 900 MHz to 5 GHz. These frequencies penetrate all organs of the body, thus putting all human organ systems at risk. Smart meters emit these precise frequencies which, when combined with certain pulsed modulation characteristics and power densities, are most harmful to the brain, central nervous system, immune system, and can cause cancers. This is precisely why these frequencies are used in Microwave weapons of war.”
ComEd smart meters contain two transmitters emitting high-intensity pulsed signals every few seconds in two frequencies within the “most harmful” range mentioned by Flynn. One frequency is 900 MHz used for the wireless network that relays data from the smart meter on one house to the smart meter on another house and then on to a collector which sends the data to ComEd. The second frequency, 2.45 GHz, is used for appliances inside the house to transmit data to the smart meter.
Although ComEd claims that data is only transmitted six times a day, what it neglects to mention is that smart meters also emit high-intensity RF/microwave pulses each time they perform network management functions. According to California court documents, a single smart meter can emit these pulses an average of 10,000 to 190,000 times a day. The number of pulses depends on where in the mesh network the smart meter is located and how often it is relaying data from other neighbors’ meters.
It is these around-the-clock, high-intensity pulses within the frequency range “most harmful” to humans that make smart meters so damaging. Consider 4,000,000 ComEd smart meters blanketing Illinois with billions of pulses in these frequencies being emitted every day, forever.
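Taking the figures cited above at face value (4,000,000 meters, each pulsing 10,000 to 190,000 times a day per the California court documents), the statewide scale can be checked with simple arithmetic. A minimal back-of-envelope sketch, using only the article’s own numbers rather than any independent measurement:

```python
# Back-of-envelope check of the statewide pulse estimate,
# using only the figures cited in the article.
meters = 4_000_000                          # planned ComEd smart meters
pulses_low, pulses_high = 10_000, 190_000   # pulses per meter per day (court-filing range)

daily_low = meters * pulses_low
daily_high = meters * pulses_high

print(f"{daily_low:,} to {daily_high:,} pulses per day statewide")
```

That works out to roughly 40 billion to 760 billion pulses per day, which is the basis for the “billions of pulses” figure.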
Basis for FCC guidelines: Health or Profits?
The Federal Communications Commission (FCC) knew decades ago, for according to Gittleman, “back in the 1950’s there were growing concerns as to the dangers of these low-level microwaves, so the U.S. military had sought safety limits.”
The current FCC safety limits are based on thermal exposure alone. The FCC guidelines are ten times more lenient than what the Environmental Protection Agency (EPA) would have permitted to protect the general population from the health hazards of RF/microwave radiation.
In the late 1980’s, the EPA radiation division, staffed with practicing biologists and epidemiologists, decided on a safe limit for human exposure. Before the announcement was made, industry intervened, federal funding for that division of the EPA was cut, and the FCC was given the task of setting the RF/microwave guidelines for the public. The FCC, made up of bureaucrats and engineers, had no experience or training in setting “health related” guidelines. Therefore, from the beginning, FCC guidelines were set at a limit that was too lenient to protect the general population.
Government agencies respond to the FCC guidelines
- Environmental Protection Agency (EPA), 1990: “FCC exposure standards are seriously flawed.” In fact, 40 EPA scientists released a 393-page report titled “An Evaluation of the Potential Carcinogenicity of Electromagnetic Fields (EMFs),” which proposed classifying EMFs as a “probable” carcinogen and Radio Frequency and microwave radiation as a “possible” carcinogen.
- Food and Drug Administration (FDA), 1993: “FCC rules do not address the issue of long-term chronic exposure to Radio Frequency fields. Data strongly suggests that RF/microwaves can accelerate the development of cancer.”
- National Institute for Occupational Safety and Health (NIOSH)—a division of the Centers for Disease Control and Prevention (CDC), 1994: “FCC’s standard is inadequate because it is only based on adverse health effects caused by body tissue heating (which means thermal).”
- U.S. Consumer Affairs Commission, 1999: “Current thermal guidelines associated with Electromagnetic Radiation (EMR) are irrelevant. Cancer and Alzheimer’s are associated with non-thermal EMR effects.”
- Environmental Protection Agency, 2002: “FCC’s current Radio Frequency/microwave exposure guidelines are thermally based, and do not apply to chronic, non-thermal exposure situations.” Norbert Hankin, Director, Radiation Protection Division
Medical and legal groups respond to the FCC guidelines
Today there are more than 900 health- and environmentally-conscious groups sending comments to the FCC as part of the agency’s reassessment of the guidelines. The American Academy of Pediatrics, headquartered in Illinois, is one of these concerned medical organizations. The American Academy of Environmental Medicine and the American Association for Justice (formerly the Association of Trial Lawyers of America) are two more such groups. All of these organizations are concerned with protecting the environment and preserving public health from government-approved harmful levels of microwave radiation.
ComEd touts compliance with FCC standards to assure the public that smart meters are safe. However, FCC exposure guidelines are irrelevant since the limit set is for thermal exposure. ComEd smart meters subject the public to chronic non-thermal exposure.
Which authorities knew or should have known of RF/microwave harm?
The U.S. military and intelligence agencies: As early as the 1950’s, the military and intelligence agencies were aware of the health effects of RF/microwaves. From 1,000 classified studies, it was apparent that even low-level RF/microwaves could create bio-effects that could be used to disrupt the enemy in covert or battlefield operations. RF/microwaves could be utilized to create confusion, slow reaction time, induce nausea, and shock adversaries in the field.
NASA: This space agency has been studying the health effects for years to facilitate protection from electromagnetic radiation for astronauts traveling in space.
Government Health Departments: These departments are charged with protecting public health and have a responsibility to keep up on studies. At this time, there are thousands of peer-reviewed studies showing adverse biological and health effects.
The Department of Energy: It is the duty of this agency to investigate negative health effects before launching such an expansive national project. No health data was considered before deployment of billions of smart meters in wireless networks.
The World Health Organization: In 2011 the International Agency for Research on Cancer (IARC) categorized Radio Frequency emissions from all wireless devices as a Class 2B Carcinogen. ComEd’s wireless smart meters fall into this category. Although the IARC classification has been known for five years, the deployment of 4,000,000 ComEd smart meters is still being mandated.
The Telecom executives: Two decades ago, Dr. George Carlo, who was put in charge of the Wireless Technology Research (WTR) project in 1993, informed the Telecom executives. He reported the results of the research, which revealed an alarming increase in tumors and many other health-related problems.
Lloyd’s of London: This well-known insurance underwriter now specifically “excludes liability coverage for claims directly or indirectly resulting from electromagnetic radiation and illnesses caused by continuous, long-term, (non-thermal) radiation exposure.” ComEd’s wireless smart meters will inflict continuous, long-term, (non-thermal) radiation exposure on all life forms.
Utilities, such as ComEd: Utilities have been charged with providing safe delivery of electricity. Clearly, there has been no investigation into the safety of incorporating into the electric grid a product utilizing this dangerous technology.
What scientists recognize about “the emerging public health crisis”
The International EMF Scientist Appeal has been signed by 190 scientists from 39 nations. These scientists have collectively published over 2,000 peer-reviewed papers on the biological or health effects of non-thermal radiation and are calling upon the United Nations, World Health Organization, and UN member states to:
- Address the emerging public health crisis related to wireless devices, wireless utility meters [smart meters] and wireless infrastructure.
- Urge that UN Environmental Program initiate an assessment of current exposure standards [in order] to substantially lower human exposures to non-thermal radiation.
- Take a planetary view of potential for harm that EMF pollution presents to biology—the evolution, health, well-being and very survival of all living organisms worldwide.
Illinois politicians and members of the General Assembly: What do they know?
Members of the General Assembly who voted to pass the smart grid modernization bill (after “ComEd’s lobbyists were able to muscle the bill through,” according to the Illinois Attorney General), and/or voted to override Governor Quinn’s veto, might want to take another look at the health threat being inflicted on ComEd customers. Why would any political leader knowingly permit constituents to be forced to live with a meter on their homes that emits a Class 2B carcinogen?
With a mandate in place and no permanent opt-out option available, residents are powerless to protect their families. In order for justice to prevail, consumer choice has to be restored, and a permanent opt-out option granted to ComEd customers.
They knew, and they did not tell us. Where do we go from here?
Flynn’s summary on smart meter dangers:
“Pulsed non-thermal radiation, which is emitted by smart meters, is far more damaging at the body’s cellular level to all life forms than any other technology ever devised by man. Militaries of the world have known for more than 50 years that RF/microwaves are the perfect weapon. Today, democratic governments are knowingly and callously authorizing untested (for safety) smart meters to operate (emitting pulsed non-thermal radiation) at the most lethal frequencies known to man.”
Dr. Willie Soon is an astrophysicist in the Solar, Stellar and Planetary Sciences Division of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. He began as a post-doctoral fellow in 1991 and took his scientist position in 1997. His subsequent career is a textbook example of speaking truth to power and bravely facing the consequences.
Dr. Soon produced an important series of astrophysics papers on the sun-climate connection beginning in 1994, work that received positive discussion in the United Nations Intergovernmental Panel on Climate Change’s second and third assessment reports (1996 and 2001). In that era, the IPCC still admitted uncertainties about human influence, despite green NGO pressure and U.S. State Department insistence on finding a “smoking gun” in weak data. Even Bert Bolin, co-creator and first chairman of the IPCC (1988-1997), deplored the denial of uncertainty he saw rising. In his 2007 History of the Science and Politics of Climate Change (page 112), Bolin wrote, “It was non-governmental groups of environmentalists, supported by the mass media who were the ones exaggerating the conclusions that had been carefully formulated by the IPCC.” In 1997 Bolin went so far as to tell the Associated Press, “Global warming is not something you can ‘prove.’ You try to collect evidence and thereby a picture emerges.”
Dr. Soon’s study of solar influence on climate behavior made him a target for alarmists, but he had defenders. In 2013, the Boston Globe acknowledged his guts and sound science with a quote from iconic science leader, Freeman Dyson: “The whole point of science is to question accepted dogmas. For that reason, I respect Willie Soon as a good scientist and a courageous citizen.”
In February of 2015, Greenpeace agent Kert Davies, a vocal critic since 1997, falsely accused Dr. Soon of wrongfully taking fossil-fuel company grants by failing to disclose “conflicts of interest” to an academic journal. The journal’s editors and the Smithsonian Institution found no violation of their disclosure or conflict of interest rules. However, the Greenpeace accusation caused a clamor around the world as lazy liberal reporters repeated it for major media with no fact-checking for accuracy.
The Greenpeace ruckus brought high-level Obama administration pressure on the Harvard-Smithsonian Center to silence climate skeptics – Vice President Joe Biden is a member of Smithsonian’s Board of Regents. The Institution responded with an elaborate new Directive on Standards of Conduct that forced its employees to wade through bureaucratic rules replete with an Ethics Counselor and a “Loyalty to the Smithsonian” clause of a sort not seen since the McCarthy Red Scare.
The Institution also announced an Inspector General investigation of Soon, which combed his emails and concluded that he had broken no rules. That seriously stung the NGO-Media-Politician coalition, which launched more attacks.
Ten days apart in the spring of 2016, two outlets published stories scurrilously demonizing Dr. Soon. Both articles were long on bias and bogus claims but short on facts. The two activist/writers, David Hasemyer of the controversial Rockefeller-funded InsideClimateNews and Paul Basken of the for-profit Delaware corporation The Chronicle of Higher Education, seem to have forgotten journalistic ethics and the facts.
Basken’s March 25 item, “A Year After a Climate-Change Controversy, Smithsonian and Journals Still Seek Balance on Disclosure Rules,” bemoans the fact that last year’s load of Greenpeace false accusations hadn’t caused the Institution to impose rules harsh enough to get rid of all scientists with climate-skeptic views. If any fact-checking was done, it didn’t show.
Hasemyer’s April 5, 2016 piece, “Smithsonian Gives Nod to More ‘Dark Money’ Funding for Willie Soon,” bewails the fact that Soon’s employer didn’t follow their playbook but approved a $65,000 grant from the non-profit Donors Trust, which is despised by greens because it uses anonymous “donor-advised-funds.” Such “dark money” grants are an IRS-approved shield pioneered decades ago by the far-left Tides Foundation for its $1.1 billion worth of grants to radicals, much of it “dark,” which Hasemyer didn’t seem to recall.
Hasemyer also neglected to note that even if Donors Trust’s “dark” grant came from ExxonMobil Foundation, the fossil-fuel philanthropy also gave universities $64,674,989; museums $2,771,150; the Red Cross $2,549,434; the Conservation Fund, Nature Conservancy, and similar groups $1,210,000; Habitat for Humanity $798,000; Ducks Unlimited $402,000; and many more from 1998 to 2014, according to IRS records. Will they be demonized as shills too?
Neither Hasemyer nor Basken displayed any familiarity with what scientists have to go through in order to do science in the Harvard-Smithsonian Center for Astrophysics or how it works, which is the bedrock of sound, ethical journalism on the topic.
The Center combines the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of the universe. It is comprised of six divisions, and Dr. Soon is listed in the Solar, Stellar, and Planetary Sciences (SSP) Division.
About one-third of the Center’s scientists, including Willie Soon, are employed in what are called “Smithsonian Trust positions.” Unlike Federal civil service positions, these are held mostly by Ph.D. specialists. According to the Smithsonian Employee Handbook, Federal position paychecks are paid from the Smithsonian’s annual Federal appropriation, while Trust position paychecks are paid from the Smithsonian’s Trust Fund. Scientists in Trust positions are paid by the hour with a Smithsonian paycheck.
Scientists in Trust positions must find donors who will give the Smithsonian grants that pay for the science. An employee information document states, “Obtaining competitive funding is an important part of the scientists’ jobs and a measure of their career success.” The grants always go directly to the Smithsonian for the science project with a 30 to 40 percent cut off the top for the Institution’s management and overhead, but never go directly to the scientist. Media attacks on Dr. Soon misrepresenting his success at this duty as nefarious are either ignorant or disingenuous.
Scientists in Trust positions must follow exacting procedures in order to obtain grants for their science according to the rules in the elaborate Contract and Grant Administration document.
The prescribed steps most relevant to Dr. Soon’s position are: First, the scientists must prepare a draft of their proposed scientific project or work. The draft then goes for pre-approval to the Director’s Office, held since 2004 by distinguished astronomer Charles Alcock. The scientists must give the Director suggestions for potential funders, but all decisions are the Director’s.
If the Director approves the draft proposal, he signs it and gives it to the Grant Office, which prepares the presentation package, including a budget, the approved proposal, and a cover letter formally requesting a grant. The Director signs the cover letter and the grant officer sends it to the potential donor.
The donor replies to the Director saying yes or no. If yes, the reply may contain a pledge to be paid when invoiced by the Center or direct payment to Smithsonian, which handles all of the Center’s money. The scientist who performs the project may not know and has no need to know who gave the grant.
When scientists perform an “off the clock” (unpaid) study to be published in a peer-reviewed journal and pay for it out of personal funds, as Willie Soon has done on numerous occasions over the years, all Smithsonian approvals and checkpoints must still be passed. Claims that Dr. Soon has pocketed any off-the-clock grant money have all been shown false.
Writers who accuse Dr. Soon of wrongdoing despite firm evidence to the contrary are violating the Code of Ethics of the Society of Professional Journalists, which states, among many other points: “Ethical journalism should be accurate and fair. Journalists should examine the ways their values and experiences may shape their reporting. Journalists should support the open and civil exchange of views, even views they find repugnant.”
The hostile coverage attacking Dr. Soon could hardly be considered ethical journalism by these professional standards. The writers and publishers of such unethical journalism should be brought to account.
The Federal Communications Commission (FCC) describes itself thus: “An independent U.S. government agency overseen by Congress.” Under the Barack Obama Administration, it has been none of these things.
Sure – the instructions were actually quite overt.
And Chairman Wheeler has been a dutiful waiter – taking orders like a pro, and delivering them exactly as asked.
Authoritarian? Again, the FCC page describes itself as: “an independent U.S. government agency overseen by Congress.” The FCC is in fact a creation and a creature of Congress – and thus cannot do anything unless and until Congress first writes a law that says “Yo, FCC – do this.”
Congress never passed anything telling the FCC to do any of this. (And the President calling for it carries zero legislative weight – as we know from so many other of his unilateral, illegal executive branch actions.)
So as the power grabs pile up, the private sector is forced to grab whips and chairs to fend off the government – i.e., they call the lawyers.
And you know there will have to be even more lawsuits filed when the FCC finishes cramming through its set-top-box power grab.
Dishonest? FCC Chairman Wheeler said many of these power grabs would never happen. Then he made them happen.
FCC ‘Net Neutrality’ Plan Calls for More Power Over Broadband: “The main advantage of (Chairman Wheeler’s) hybrid proposal, as opposed to full reclassification, is that it wouldn’t require the FCC to reverse earlier decisions to deregulate broadband providers, which were made in the hopes of encouraging the adoption and deployment of high-speed broadband.”
But that plan wasn’t good enough for the White House: “In response to news of Mr. Wheeler’s plan, a senior White House official said Thursday that ‘the president has made it abundantly clear that any outcome must protect net neutrality and ban paid prioritization—and has called for all necessary steps to safeguard an open Internet.’”
So Chairman Wheeler began the bow to his master’s call for full-on power grab reclassification.
Care for some more dishonesty? There’s plenty, you know.
FCC Chairman Wheeler: There’s the Internet, Then There’s Interconnection: “Wheeler added, ‘I think one of the things that I have said along the way is that peering (interconnection) is not a net neutrality issue. We haven’t seen peering as a net neutrality issue. There is a matter of the ‘open Internet,’ and then there is a matter of interconnection among the various, disparate pathways that become the Internet.’”
The FCC is supposed to be an independent, expert agency – delivering decisions devoid of politics, based solely on the facts.
THIS FCC is none of this. It is just another partisan, hack member of this partisan, hack administration.
Congress should stop pretending it’s the former – and start seriously reining in the latter.
There have been numerous stories, rumors, and outright falsehoods reported in the media and by detractors regarding state Sen. Arthur Orr’s (R-Decatur) recently proposed welfare reform bill.
The fact is Alabama’s current welfare program is a decade behind most of the country; it earned an “F” grade in The Heartland Institute’s 2015 Welfare Reform Report Card, and since 1996, Alabama has been one of the worst states at reducing caseloads for the Temporary Assistance for Needy Families program (TANF), also known as “welfare.” Additionally, a 2013 Cato Institute study found that under the current welfare system, a family collecting welfare benefits in Alabama could receive benefits worth more than $23,310 in just one year, while a single parent with two children working a full-time minimum wage job and utilizing the Earned Income Tax Credit earns $22,628 per year.
Instead of shedding light on the significant flaws of Alabama’s welfare program, proponents of the current program are misleading Alabamans. For instance, in a story written by Amanda Marcotte for Salon, the author incorrectly claims Orr’s bill “would change the way food stamp eligibility is calculated, counting your car as a financial asset—like a savings account—and requiring you to sell it before you’re allowed to get assistance.” The truth is the assets examined for the Supplemental Nutrition Assistance Program (SNAP), commonly known as “food stamps,” would not include home equity or a primary vehicle, but bank account balances, recreational vehicles—such as snowmobiles, boats, motorcycles, jet skis, and ATVs—and other valuable assets would be considered.
Welfare and SNAP should only be available to those who truly need assistance, and asset tests would play an important role in ensuring enough resources are available for Alabama’s most impoverished citizens.
Currently, 14 states already require an asset test to receive food stamps. According to the Foundation for Government Accountability (FGA), if every state matched its asset testing for food stamp eligibility to the federal baseline, 749,000 fewer Americans would be trapped in food stamp dependence. Nationally, taxpayers would save more than $1.1 billion per year.
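As a rough cross-check of the FGA figures cited above, dividing the projected savings by the projected enrollment reduction gives the implied average benefit per removed enrollee. This is a back-of-envelope sketch using only the numbers in this article:

```python
# Rough cross-check of the cited FGA figures: $1.1 billion in annual
# savings spread across 749,000 fewer food stamp enrollees.
total_savings = 1_100_000_000   # dollars per year, nationwide
fewer_enrollees = 749_000

per_person = total_savings / fewer_enrollees
print(f"${per_person:,.0f} per person per year")
```

That works out to roughly $1,470 per person per year, or a little over $120 per month, which is broadly in line with typical individual food stamp benefits.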
Orr’s bill would also decrease lifetime limits on eligibility for TANF, preclude the department from applying for a waiver of work requirements for SNAP, and implement stricter sanctions for noncompliance, along with other common-sense reforms that would help to reduce welfare fraud.
The “T” in TANF stands for “temporary.” Polling has shown 50 percent of the public believes welfare should only be available for 12 months, and 82 percent of the public support requiring able-bodied, working-age adults with no children at home to work or train for work for at least 20 hours per week in order to receive taxpayer-funded food stamp benefits. One of the most important ways governments can help people escape poverty is by helping them obtain work. Only 2.6 percent of full-time workers are poor, as defined by the federal poverty level standards, compared with 23.9 percent of adults who do not work. Even part-time work makes a significant difference; only 15 percent of part-time workers are poor.
In 2011, only 40 percent of TANF recipients in Alabama were working, which means three out of five TANF aid recipients were not working in return for their benefits. Strengthening the sanctions regime for failure to participate in work activities will help to increase work participation in Alabama. That participation would make it more likely recipients would gain the necessary skills to earn an income sufficient for them to leave welfare rolls permanently.
A study by FGA found after three months of reinstated work requirements in Kansas, nearly 13,000 Kansans left the welfare rolls. Within a year, nearly 60 percent of these former enrollees found employment and increased their incomes by an average of 127 percent.
Instead of trapping welfare recipients in a sustained cycle of poverty, these proposed policies can help by giving poor people a hand up. Enforcing fraud prevention measures and adopting reforms, such as an asset test, strict-but-fair time limits and sanctions, and work requirements, will improve opportunities for recipients to reach self-sufficiency, give help to those people who truly need assistance, and protect taxpayers.
These reforms would also free up block grant money, which could be reinvested in transportation, workforce development, childcare, and alcohol and substance abuse programs for the truly needy.
In today’s edition of The Heartland Daily Podcast, Kyle Maichle, project manager for constitutional reform, joins the show to talk about the rules that govern a potential Article V Convention.
So far, the most popular Article V Convention application is the single-subject, balanced budget amendment. Twenty-eight states have passed an application; 34 are needed to trigger the process. As Maichle explains, even when the 34-state threshold is passed, there is still a long and complicated process ahead. Maichle discusses what a convention would look like, how delegates are chosen, and how long it may take to fully enact a balanced budget amendment.
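The thresholds come straight from Article V: two-thirds of the state legislatures must apply to call a convention, and three-fourths of the states must ratify any amendment it proposes. A minimal sketch of that arithmetic (the 28-state count is the figure cited above):

```python
import math

STATES = 50

# Article V: two-thirds of state legislatures must apply for a convention.
applications_needed = math.ceil(2 / 3 * STATES)    # 34

# Article V: three-fourths of the states must ratify any proposed amendment.
ratifications_needed = math.ceil(3 / 4 * STATES)   # 38

applications_so_far = 28   # balanced budget applications passed to date
print(applications_needed - applications_so_far, "more state applications needed")
```

So even after six more states apply and a convention is called, any resulting balanced budget amendment would still need ratification by 38 states to take effect.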
Friday, April 22, will mark the 47th Earth Day. You may think it is all about planting trees and cleaning up neighborhoods. But this year’s anniversary will be closer to its radical roots than, perhaps, any other since its founding in 1970. Considered the birth of the environmental movement, the first Earth Day took place during the height of America’s counterculture era. According to EarthDay.org, it gave voice to an “emerging consciousness, channeling the energy of the anti-war protest movement and putting environmental concerns on the front page.”
We did need to clean up our act. At that time, “littering” wasn’t part of our vocabulary. The air in the Southern California valley where I grew up was often so thick with smog we couldn’t see the surrounding mountains.
Thankfully, that has changed.
Look around your community. You’ll likely see green trees, blue skies, and bodies of water sparkling in the sunshine. With the success of the environmental movement, its supporters and the nonprofit groups it spawned had to become ever more radical to stay relevant.
Environmentalism has changed.
The morphing of the movement may be most evident in Earth Day 2016—which some are calling “the most important Earth Day in history.”
This year, on April 22, in a high-level celebration at the United Nations headquarters in New York, the Paris Climate Agreement will officially be signed. Thirty days after ratification by at least 55 countries that together represent at least 55 percent of global greenhouse gas emissions, the agreement will take effect—committing countries to establishing individual targets for emission reductions, with the expectation that they will be reviewed and updated every five years.
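The entry-into-force rule just described is a simple double threshold—a minimum number of parties and a minimum share of emissions. It can be sketched in a few lines; the ratifying-party figures below are hypothetical, purely for illustration:

```python
# Paris Agreement entry-into-force test: the deal takes effect 30 days
# after at least 55 parties, together covering at least 55 percent of
# global greenhouse gas emissions, have joined.
def enters_into_force(emission_shares):
    """emission_shares: each ratifying party's share of global emissions, in percent."""
    return len(emission_shares) >= 55 and sum(emission_shares) >= 55.0

# Hypothetical example: 60 ratifying parties, each covering 1 percent
print(enters_into_force([1.0] * 60))       # True: 60 parties and 60 percent

# 54 parties covering 70 percent in total still fails the country count
print(enters_into_force([70 / 54] * 54))   # False
```

Both conditions must hold at once, which is why a handful of large emitters could not trigger the agreement on their own, and neither could dozens of small ones.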
While news reports of Earth Day 2016 will likely depict dancing in the streets, those who can look past the headlines will see a dire picture—one of “green energy poverty,” in which households spend more than 10 percent of their income on energy costs.
To meet the non-binding commitments President Obama made last December in Paris, he is counting on, among many domestic regulations, the Clean Power Plan (CPP).
Last week, on the Senate floor, Senator Jim Inhofe (R-OK), chairman of the Senate Environment and Public Works Committee, delivered remarks in advance of Earth Day on the unattainability of the U.S. climate commitments. He said: “The Clean Power Plan is the centerpiece of the president’s promise to the international community that the U.S. will cut greenhouse gas emissions by 26 to 28 percent.” It would “cause double digit electricity price increases in 40 states” and “would prevent struggling communities from accessing reliable and affordable fuel sources, which could eventually lead to poor families choosing between putting healthy food on the table or turning their heater on in the winter.”
The Heritage Foundation has just released a report on the devastating economic costs of the Paris Climate Agreement, which it calls “a push for un-development for the industrialized world and a major obstacle for growth for the developing world.” Because global warming regulations “stifle the use of the most efficient and inexpensive forms of electricity, businesses as well as households will incur higher electricity costs.” The report concludes: “restricting energy production to meet targets like those of the Paris agreement will significantly harm the U.S. economy. Bureaucratically administered mandates, taxes, and special interest subsidies will drive family incomes down by thousands of dollars per year, drive up energy costs, and eliminate hundreds of thousands of jobs. All of these costs would be incurred to achieve only trivial and theoretical impacts on global warming.”
Real-world experience bears out both Inhofe’s observations and the Heritage Foundation’s conclusions.
Germany is one of the best examples of green energy poverty. The country has some of the most aggressive greenhouse gas reduction programs, offering generous subsidies to any company producing green energy. Based on an extensive study done by green energy believers in 2014, I addressed the program’s overall result: higher costs and higher emissions. I stated: “After reading the entire 80-page white paper, I was struck with three distinct observations. The German experiment has raised energy costs to households and business, the subsidies are unsustainable, and, as a result, without intervention, the energy supply is unstable.” At that time, I concluded: “The high prices disproportionately hurt the poor, giving birth to the new phrase: ‘energy poverty.’”
More recently, others have come to the same conclusion (read here and here). On April 13, the Wall Street Journal (WSJ) opined: “Germany’s 16-year-old Energiewende, or energy transformation, already has wrecked the country’s energy market in its quest to wean the economy off fossil fuels and nuclear power. Traditional power plants, including those that burn cleaner gas, have been closing left and right while soaring electricity prices push industries overseas and bankrupt households. Job losses run to the tens of thousands.” Meanwhile, emissions over the past seven years have increased. Last month, Michael Shellenberger, president of Environmental Progress and a Time magazine “Hero of the Environment,” tweeted: “people really want to believe good things about Germany’s energy shift, but … its emissions rose.” WSJ concludes: “The market distortions caused by overreliance on expensive but undependable power already have pushed German utilities to rely more on cheap and dirty coal-fired power plants to make up the shortfall when renewable sources can’t meet demand.”
Germany is not alone.
The U.K., according to Reuters, is facing “fuel poverty.” The report states: “The government is also under pressure to curb rising energy bills with 2.3 million of Britain’s 27 million households deemed fuel poor, meaning the cost of heating their homes leaves them with income below the poverty line.” Another account covers the U.K.’s cuts to solar subsidies, saying: “The government says the changes were necessary to protect bill payers, as the solar incentives are levied on household energy bills.”
The Netherlands, which is already behind in meeting its green energy targets, has, according to the Washington Post, had to build three new coal-fueled power plants—in part, at least, to power the high percentage of electric cars. Additionally, the country has hundreds of wind turbines that are operating at a loss and are in danger of being demolished. A report states: “Subsidies for generating wind energy are in many cases no longer cost-effective. Smaller, older windmills in particular are running at a loss, but even newer mills are struggling to be profitable with insufficient subsidies.”
Bringing it closer to home, there is über-green California—where billionaire activist Tom Steyer aggressively pushes green energy policies. Headlines tout California as having the most expensive retail gasoline market nationwide. But, according to the Institute for Energy Research, it also has some of the highest electricity prices in the country—“about 40 percent higher than the national average.” A 2012 report from the Manhattan Institute states that about one million California households were living in “energy poverty”—with Latinos and African Americans the hardest hit. With the Golden State’s headlong rush toward lower carbon-dioxide emissions and greater use of renewables, the energy poverty figure is surely much higher today.
This week, as you hear commentators celebrate “the most important Earth Day in history” and the global significance of the signing of the Paris Climate Agreement, remember the result of policies similar to CPP: green energy poverty. Use these stories (there are many more) to talk to your friends. Make this “Green Energy Poverty Week” and share it: #GEPW.
We, however, do not need to be doomed to green energy poverty. There is some good news.
First, the Paris Climate Agreement is non-binding. Even Todd Stern, U.S. climate envoy, acknowledged in the Huffington Post: “What Paris does is put in place a structure that will encourage countries to increase their targets every five years.” While the requisite number of countries will likely sign it before the election of the next president, the only enforcement mechanism is political shaming. Even if it were legally binding, as the Kyoto Protocol was, Reason magazine points out what happened to countries, like Canada and Japan, that “violated their solemn treaty obligations”—NOTHING. The Heritage report adds: “History, however, gives little confidence that such compliance will even occur. For instance, China is building 350 coal-fired power plants, and has plans for another 800.”
Then there is the legal delay to the implementation of the CPP—which, thanks to a Supreme Court decision earlier this year, will be tied up in courts for at least the next two years. Inhofe stated: “Without the central component of (Obama’s) international climate agenda, achieving the promises made in Paris are mere pipe dreams.”
“President Obama’s climate pledge is unobtainable and it stands no chance of succeeding in the United States,” Inhofe said. “For the sake of the economic well-being of America, that’s a good thing.”
The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc., and the companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). She hosts a weekly radio program: America’s Voice for Energy—which expands on the content of her weekly column. Follow her @EnergyRabbit.
In an April 5 editorial titled “Bill would ruin certificate of need program,” the News Sentinel argued legislation Tennessee lawmakers are considering could make it harder for the poor and Tennesseans living in rural communities to obtain access to high-quality, affordable health care.
The fact is rural residents, the poor, and those most in need of Tennessee’s best health care stand to gain if the General Assembly passes House Bill 1730 and Senate Bill 1842, sponsored by state Rep. Cameron Sexton, R-Crossville, and state Sen. Todd Gardenhire, R-Chattanooga, respectively.
These bills would open new pathways for Tennesseans to access innovative, cost-effective health care services, facilities and equipment. At present, pathways to improving facilities are being blocked by Tennessee’s certificate of need law. The law requires hospitals and other medical facilities to apply for and obtain special approval from the Tennessee Health Services and Development Agency before health care providers may expand patient services and make needed upgrades.
The state’s current law impedes health care improvements both small and large. A Tennessee hospital wishing to add even a single bed to one of its units must first obtain HSDA’s approval—and HSDA charges for the privilege. The nonrefundable minimum application fee is $3,000. The maximum fee — also nonrefundable — is $45,000, according to HSDA’s 23-page certificate of need application.
Tennessee’s state-certified need-based system implies current providers ought not to make improvements to facilities and services until after they have fallen into disrepair. As Sexton told Health Care News, “The [current] system doesn’t force people to improve what they have, because they know the CON prevents other people from entering their market.”
Sexton and Gardenhire’s proposed reforms would indeed guide Tennessee toward a better health care system — one based not on money or state-certified need, but on quality.
The current system discourages new providers from entering the health care marketplace and encourages current providers to grow complacent toward patients. The reform bill, by contrast, would help ensure hospitals and other medical facilities get and stay motivated to offer patients the best health care money can buy.
Scaling back certificate of need requirements would save Tennessee’s most innovative health care providers time, money and litigation — savings business-savvy providers will pass on to patients in the form of frequent facilities and equipment upgrades, lower costs and increased access to better health care services.
Reform could even save lives in rural Tennessee communities. States with certificate of need programs have on average 30 percent fewer hospitals and 13 percent fewer ambulatory surgical centers per 100,000 rural residents than states without such programs, according to a 2016 study by the Mercatus Center at George Mason University. The state’s certificate of need law places rural communities at a medical disadvantage by effectively making it easier for providers in counties with more than 200,000 people to acquire magnetic resonance imaging machines than providers in less populous counties. Reform is more likely to help rural Tennessee patients gain access to potentially life-saving radiology services.
Cost-prohibitive application fees, lack of motivation for current providers to upgrade facilities and government-created obstacles to rural health care certify Tennessee’s need for certificate of need reform.
According to the World Health Organization (WHO), Zika is, like Ebola, a Public Health Emergency of International Concern. Now the U.S. Centers for Disease Control and Prevention (CDC) has stated that it is “clear” that the Zika virus causes a serious birth defect, microcephaly (small head).
CDC publishes a scary interactive map of cases diagnosed in the U.S. The highest number as of this writing is in Florida, colored an ominous brown.
CDC is in high gear, with politically correct advice on Zika. Meanwhile, cases of dengue in Mexico have topped 10,000. Dengue is caused by a related but far more serious virus, carried by the same Aedes aegypti mosquito. And 78,000 people in Africa die every year of another relative, yellow fever. The vector was coming under good control decades ago, but is re-emerging now. Asking “why” should be the main response to Zika.
Instead the advice seems to be: “Don’t travel, don’t have a baby, don’t let a mosquito bite you, stop climate change” – and give the authorities billions of dollars for a crash vaccine development program.
What should people do about Zika?
First, don’t panic. It’s not Ebola. Ebola and Zika are alike in that they were first recognized in Africa decades ago. Also, there is no cure or vaccine for either. But while Ebola has killed thousands, Zika has likely not killed anyone. Symptoms, if any, are mild: a few days of fever, rash, joint pains and red eyes. Ebola is extremely contagious through personal contact. Zika is primarily transmitted by mosquito, although it may be carried in semen.
The main problem with Zika is that, like rubella (German measles), it can apparently cause birth defects. (Rubella can cause microcephaly.)
Zika virus has been found in the brain of a few babies born with microcephaly. But two things are very clear: MOST microcephaly is NOT caused by Zika. About 7 of 10,000 babies born in the U.S. have microcephaly – no thanks to Zika. Most (more than 90 percent) of the Brazilian babies recently confirmed to have microcephaly tested negative for Zika.
Additionally, MOST mothers who have Zika during pregnancy give birth to a normal, healthy baby. Mothers in northeastern Brazil also had a lot of other problems, including malnutrition, heavy exposure to toxic agricultural chemicals and an aggressive vaccination campaign.
Panic can cause people to do things that might make the problem worse. For example, New York State’s Zika prevention kits contain the mosquito repellent DEET, which is absorbed through the skin. It is not known to be safe in pregnancy. Some animal studies have shown nerve toxicity.
And what about vaccines? A March 22/29 JAMA article is entitled: “Pregnancy in the Time of Zika: Addressing Barriers for Developing Vaccines and Other Measures for Pregnant Women.” One barrier is “lack of a broadly accepted ethical framework” for clinical research during pregnancy. Basically, to test whether something harms a developing baby, you have to try it out on developing babies and see whether they are hurt.
A registered clinical trial that sought 50 pregnant volunteers to test Tdap (adult tetanus, diphtheria, acellular pertussis) vaccine, in Vietnam, was scheduled for completion in 2015. But Tdap had already been given to thousands of Brazilian expectant mothers some months before reports of microcephaly started to surface. There was no control group of unvaccinated women. It was not, after all, research, but a public health response to an increase in pertussis (whooping cough) cases.
A Zika vaccine would likely be a live-virus vaccine, and all live-virus vaccines are contraindicated in pregnancy – and possibly while nursing. That includes measles, mumps, rubella, chickenpox, shingles and rotavirus. There is a case report of a nursing infant who got meningoencephalitis, probably from yellow fever vaccine virus. Nursing Labrador puppies got canine distemper, a relative of measles, after their mother got a booster shot.
WHO and CDC are quick to indict Zika virus, which might eventually turn out to be an innocent bystander. But the chief culprit is known: a breakdown in vector control.
In 1972, the safest and most effective public health weapon in history – DDT – was banned by the U.S. Environmental Protection Agency. In the 1990s, Mexico agreed to abandon its DDT program as a condition of NAFTA. Mosquitoes travel.
Alarm about Zika will be a public-relations exercise, covering the waste of countless human lives and billions of dollars on ineffective or harmful campaigns, if it does not open a discussion of why diseases on their way out in the 1970s are coming back now.
Sad news tonight. Dr. William M. Gray — a friend of The Heartland Institute, and the world’s most learned expert on hurricanes — has died.
Gray was a frequent speaker at Heartland’s 11 International Conferences on Climate Change. We were honored to host him, and you can see all his presentations below, or at this link.
Bill went where the data led him, especially on the question of whether man-caused, CO2-driven global warming was increasing hurricane activity. Point of fact: Bill said Al Gore’s filmed prediction that AGW would cause more hurricanes was nonsense. Bill, not Al, was proven right.
One would not expect a glowing eulogy of Bill from the folks at the “Capital Weather Gang” at The Washington Post. They haven’t been very warm to Heartland’s climate conferences, or Bill’s views, in the past. But Jason Samenow posted a fantastic tribute on the site to Bill by Capital Weather Gang contributor Phil Klotzbach.
How to describe 16 years spent with one of the greatest minds in hurricane research of the past 60 years? I’m still having trouble coming to grips with the fact that he’s gone. There are so many things about our relationship that I’m going to miss. The daily hour-long phone calls, the tag-team conference presentations, the forecast day donuts, the chats about topics ranging from hurricanes, to climate change, to politics, to baseball, to the Civil War.
I first was introduced to the Colorado State University (CSU) seasonal hurricane forecasts and Dr. Gray when I did an undergraduate project on his research for my climatology class. I ended up doing my undergraduate Honors thesis on his research, and I was beyond excited when he called me to offer me a graduate research assistantship at CSU. One of my first interactions with him was the American Meteorological Society Hurricanes and Tropical Meteorology Conference in 2000 in Fort Lauderdale. After a brief introduction, his first question was “Who had the most RBIs in a single season, which team did he play for, and how many RBIs did he get in the season?” I knew that the answer was Hack Wilson for the 1930 Chicago Cubs with 191 RBIs. At that point, Dr. Gray said he knew I would make a good project member …
… has produced in an extraordinarily distinguished research career that spanned 60 years. The humility that he has demonstrated throughout his career is something that we would all do well to emulate.
Dr. Gray had an incredible knowledge of the way that the climate works. His development of his genesis parameters – six key ingredients necessary for tropical cyclogenesis – was a groundbreaking piece of research when it was first published in the late 1960s. He also spent many years with his graduate students studying and publishing papers in a variety of fields from tropical cyclone structure to tropical radiation.
He is best known worldwide for his seasonal hurricane predictions. He instituted these predictions when he discovered that El Nino impacted Caribbean and tropical Atlantic vertical wind shear. This was the first time that any group had issued seasonal forecasts for the Atlantic. Now, nearly two-dozen groups have followed his lead issuing these predictions. He has consistently issued these forecasts for over 30 years – a track record unparalleled for university predictions. …
Dr. Gray’s generosity with his resources was incredible. He contributed a considerable amount of his own resources to keep our project alive when research grants went dry a few years ago. He also let me stay at his house when I came back for in-person visits after relocating to California. …
Even at the end, Dr. Gray was focused on his research. He gave me very clear instructions on various projects I should be conducting over the next few years. He was still sketching clouds using his legal pad and #2 pencils and discussing the intricacies of cumulus convection when I came to see him a few days before his death. He told me several times throughout my time at CSU: “The only immortality that you have as a professor is through your graduate students.”
His graduate students, their students, and now even their students, are leaders in meteorological research around the globe. The incredible legacy left by Dr. Gray will last for generations to come. He will be sorely missed.
Anthony Watts, another colleague of Bill’s, has his own tribute at Watts Up With That. An excerpt:
I knew this was coming, as I had a heads-up from Joe D’Aleo last week that the end was nearing. I knew Dr. Bill Gray through my work in climate, and from attending conferences. He always had a good word for me, and when I encountered him in person, he was fond of parodying the “Wayne’s World” skit where they meet Alice Cooper backstage and get on their knees and chant “I’m not worthy…I’m not worthy”. He’d actually do that with me. It was endearing, yet at the same time a little bit embarrassing, because I was the one who really should be doing that in his presence. I always found myself saying “please Bill, stop that!”, to which he’d get back on his feet and give me a chuckle with that toothy grin of his.
I knew Bill Gray only through Heartland — and knew him to be a brilliant and generous man who was always kind to me and appreciative of our efforts to showcase his research. The honor was ours.
View Dr. William Gray’s six presentations at Heartland’s International Conferences on Climate Change below.
Does fracking cause housing prices to fall? The answer to that question is more difficult than it might seem. Many anti-fracking activists have claimed oil and natural gas development has led to substantial decreases in property values in areas where drilling occurs, but other places, such as North Dakota, saw property values skyrocket during the boom in oil production.
In this episode of The Heartland Daily Podcast, Kayla Harris, the director of energy and environmental policy at Ballotpedia, helps answer this question. Harris discusses with research fellow Isaac Orr the results of her recent study on the impact of fracking on property values in several Colorado counties, stating that fracking has led to increases in property values in some areas and decreases in others.
If you don’t visit Somewhat Reasonable and the Heartlander digital magazine every day, you’re missing out on some of the best news and commentary on liberty and free markets you can find. But worry not, freedom lovers! The Heartland Weekly Email is here for you every Friday with a highlight show. Subscribe to the email today, and read this week’s edition below.
Cruz, Trump Go on Record on Climate, Energy
H. Sterling Burnett, Climate Change Weekly
The two leading Republican presidential candidates gave comprehensive statements on the topics of energy and climate when they responded to a survey by the American Energy Alliance. Both Trump and Cruz are skeptical of the policies implemented by the Obama administration to prevent purported dangerous anthropogenic global warming. Both candidates reject imposing a carbon tax, complain of regulatory overreach, and promise to scale back Obama’s Clean Power Plan. While the candidates agreed on a number of subjects, they did disagree on energy subsidies and the renewable fuel mandate. READ MORE
IRS Serves Government, Not Taxpayer
Jesse Hathaway, Real Clear Policy
With Tax Day upon us, now is the perfect time to remind ourselves that the Internal Revenue Service is the government Leviathan’s enforcer, nothing more. The IRS’s disregard for public safety was recently highlighted by a new report from the Government Accountability Office. The report showed the IRS failed to make recommended improvements, leaving taxpayers’ private information at the mercy of hackers, both domestic and foreign. READ MORE
Prosecute Climate Deniers? No, First Amendment Protects Debate
H. Sterling Burnett, The Philadelphia Inquirer
The global warming debate has reached a new, and chilling, level. U.S. Attorney General Loretta Lynch reports discussing with the FBI pursuing legal action against companies, research institutions, and scientists who debate whether humans are causing catastrophic climate change. This is an unprecedented politicization of justice and attack on our First Amendment right to freedom of speech. READ MORE
Featured Podcast: Yaron Brook: America’s Misguided Fight Against Income Inequality
“Income inequality” has once again become a rallying cry for the Left. Since achieving the goal of perfect equality is impossible, politicians like Bernie Sanders can forever champion the objective of increasing equality regardless of the amount of government intervention currently in place. Yaron Brook, president and executive director of the Ayn Rand Institute, joins the Heartland Daily Podcast to explain the truth about “income equality alarmism” and why equal is, in fact, unfair. LISTEN TO MORE
Coming Next Week to Arlington Heights: The Vaping Wars!
If you love discussions about liberty, you will not want to miss the great series of events Heartland has lined up through the spring. On Wednesday, April 20, a discussion will take place about the “vaping wars” – the government’s war on e-cigarettes. On Wednesday, May 11, Ryan Yonk, executive director of Strata Policy, comes to Heartland to discuss the new book, Nature Unbound: Bureaucracy and the Environment. We hope to see you here in Arlington Heights, but if you are unable to attend in person, the events will be live-streamed and archived on Heartland’s YouTube page. SEE UPCOMING EVENTS HERE
Exelon Again Threatens to Close Nuclear Plant Unless Taxpayers Subsidize Operations
H. Sterling Burnett, The Heartlander
Exelon, the utility with the largest fleet of nuclear plants in the United States, is once again seeking corporate welfare from taxpayers. Exelon told Illinois lawmakers it will close its Clinton nuclear plant in southern Illinois if it does not receive new financial support from the state in 2016. This isn’t the first threat. In 2015, Exelon said it would close three plants in the state unless Illinois agreed to impose a special surcharge that would have provided an additional $300 million to the company. READ MORE
College Admissions Offices Adapt to Homeschooling Boom
Andy Torbett, The Heartlander
As the popularity of homeschooling continues to grow, colleges and universities are changing their admissions processes to better accommodate homeschooled children’s unique backgrounds and skills. “There are colleges that are actively recruiting homeschooled students,” said Lennie Jarratt, The Heartland Institute’s project manager for education transformation. “Just one decade ago, this didn’t happen.” Could this be a solution to the radical left’s takeover of K–12 and higher education? READ MORE
Bonus Podcast: Brian Blase: Obamacare’s Broken Promises
In 2009, the Affordable Care Act was touted as a cure-all, required to expand health insurance access and reduce health care costs nationwide. In the six years since Obamacare’s passage, the shortcomings have become evident. Brian Blase, senior research fellow at the Mercatus Center, joins the Heartland Daily Podcast to discuss the disparity between Obamacare’s promises and the reality that exists three years into its full implementation. LISTEN TO MORE
Insurers Are Abandoning Obamacare Exchanges
Justin Haskins, Consumer Power Report
One of the most ambitious aspects of the Affordable Care Act (ACA) was the creation of health insurance marketplaces, which proponents said would increase market competition and lead to lower costs for consumers and insurers. A new report by The Heritage Foundation shows ACA has actually limited consumers’ insurance options by driving health insurers out of the Obamacare marketplace. According to the report, there are now only 287 “exchange-participating insurers,” down from 395 insurers in 2013. READ MORE
Help Us Stop Wikipedia’s Lies!
Joseph L. Bast, Somewhat Reasonable
Many people rely on our profile on Wikipedia to provide an objective description of our mission, programs, and accomplishments. Alas, the profile they find there is a fake, filled with lies and libel about our funding, tactics, and the positions we take on controversial issues. Wikipedia refuses to make the changes we request. It even deletes and reverses all the changes made by others who know the profile is unreliable. We need your help! READ MORE
School Choice Fosters Racial, Economic Equity
Joy Pullmann, School Choice Weekly
A common criticism thrown at school choice advocates is that school choice programs create segregation in the education system. Reality shows quite the opposite effect. Kevin Chavous, founding board member for the American Federation for Children, responds to this frequently made claim by pointing out that school choice programs typically serve far more minority students than is representative of their states’ populations, thereby serving to desegregate public and private schools. READ MORE
Invest in the Future of Freedom! Are you considering 2016 gifts to your favorite charities? We hope The Heartland Institute is on your list. Preserving and expanding individual freedom is the surest way to advance many good and noble objectives, from feeding and clothing the poor to encouraging excellence and great achievement. Making charitable gifts to nonprofit organizations dedicated to individual freedom is the most highly leveraged investment a philanthropist can make. Click here to make a contribution online, or mail your gift to The Heartland Institute, One South Wacker Drive, Suite 2740, Chicago, IL 60606. To request a FREE wills guide or to get more information to plan your future please visit My Gift Legacy http://legacy.heartland.org/ or contact Gwen Carver at 312/377-4000 or by email at email@example.com.
The media are spreading catastrophic global warming news based on satellite temperature data through February 2016. On March 3, 2016, the University of Alabama in Huntsville (UAH) reported that the February 2016 global temperature anomaly of 0.83 degrees Celsius surpassed the previous record of 0.74 degrees Celsius, set in April 1998. These figures are departures from the 30-year average for 1981 to 2010, drawn from a data set that begins in 1979, when satellite temperature measurements were first made.
Associated Press writer Seth Borenstein wrote on March 17, 2016, “Freakishly hot February obliterates global weather records.” New York Times reporter Justin Gillis wrote on March 22, 2016, “Scientists Warn of Perilous Climate Shift Within Decades, Not Centuries.” Expect more scary stories from other writers who live off reports from the scientific community that generates climate change (global warming) information.
The University of Alabama in Huntsville (UAH) posted its latest satellite global temperature data, which runs through the end of March 2016, shown in Fig. 1.
Fig.1 Latest Global Average Tropospheric Temperatures
The March 2016 temperature fell to 0.73 degrees C., just below the previous record of 0.74 degrees from April 1998. The satellite data show a temperature rise since 1979 of 0.12 degrees C. per decade, or 1.2 degrees per century, which places the earth’s warming below the limit recommended by the 2015 Paris Climate Accord.
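The per-century extrapolation above is straight arithmetic on the satellite trend. A minimal sketch (the 2.0 degree C. figure for the Paris Accord limit is my assumption for illustration; the article cites no number):

```python
# Back-of-the-envelope check of the satellite-era warming trend cited above.
trend_per_decade = 0.12                    # degrees C per decade, UAH record
trend_per_century = round(trend_per_decade * 10, 2)
paris_limit = 2.0                          # degrees C (assumed Paris Accord limit)

print(trend_per_century)                   # 1.2 degrees C per century
print(trend_per_century < paris_limit)     # True: below the assumed limit
```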
Data over thousands of years show approximate 500-year cycles of planetary warming and cooling. We are currently in the Current Warming Period, which commenced approximately 1850; it was preceded by the Little Ice Age, from approximately 1350 to 1850. Thus continued global warming should be anticipated until after the start of the 22nd century.
SUPER EL NINO CAUSES TEMPERATURE RISE
Along the Equator, stretching from New Guinea to western South America, is a region in which prevailing winds and sea surface temperatures create weather systems that impact the planet. When temperatures are warmer, the resulting system is called El Nino; the countering cooling system is called La Nina, which normally follows an El Nino. El Nino means “the boy” or “Christ Child,” a name given because the peak of an El Nino usually occurs around Christmas. These systems have been observed for centuries and are not caused by carbon dioxide from burning fossil fuels.
An El Nino system formed in early 2015 and became a Super El Nino, causing the February 2016 global temperature to be the highest observed by satellites over their period of measurement, 1979 to the present. A La Nina is expected to follow; the questions are when it will arrive and how much cooling will take place.
An excellent explanation of the importance of El Nino to global temperatures is given in the March 18, 2016 Reuters article “How much clarity do we have on transition to La Nina?” by Karen Braun. The article shows graphs of monthly surface and subsurface temperatures measured along the Equator.
Fig. 2 illustrates monthly El Nino 3.4 sea surface temperatures for eight El Nino events since 1982/83. El Nino 3.4 designates the area from 165 degrees W to 90 degrees W and 5 degrees S to 5 degrees N. Since 1 degree spans 69 miles at the Equator, the El Nino 3.4 area is about 5,200 miles by 690 miles, or 3.5 million square miles.
On the temperature scale, anomalies above 0.5 degrees C. indicate El Nino periods and anomalies below -0.5 degrees C. indicate La Nina periods. If three-month average temperatures are above 0.5 degrees C. (or below -0.5 degrees C.), an El Nino (or La Nina) is considered in progress. Temperatures between -0.5 and 0.5 degrees are neutral.
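The area arithmetic and the plus-or-minus 0.5 degree classification rule described above can be sketched in a few lines (the 69 miles-per-degree figure is the article’s; the function is an illustration of the stated rule, not an official index calculation):

```python
MILES_PER_DEGREE = 69          # length of 1 degree at the Equator (article's figure)

lon_span = 165 - 90            # 165 W to 90 W -> 75 degrees of longitude
lat_span = 5 + 5               # 5 S to 5 N -> 10 degrees of latitude
area_sq_miles = (lon_span * MILES_PER_DEGREE) * (lat_span * MILES_PER_DEGREE)
print(area_sq_miles)           # 3570750, i.e. roughly 3.5 million square miles

def classify(three_month_avg_anomaly):
    """Classify a three-month average Nino 3.4 anomaly in degrees C."""
    if three_month_avg_anomaly > 0.5:
        return "El Nino"
    if three_month_avg_anomaly < -0.5:
        return "La Nina"
    return "neutral"

print(classify(0.83))   # El Nino
print(classify(-0.9))   # La Nina
print(classify(0.2))    # neutral
```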
Fig. 2 El Nino Sea Surface Temperatures
The lines are bolder for the strongest El Ninos over these years. As noted, the strongest El Nino prior to 2015 was the 1997/98 event. In November 2015, the current El Nino’s sea surface temperature exceeded the maximum from the 1997/98 El Nino. The current sea surface temperature has begun a slight decline similar to that shown by the 1997/98 El Nino.
As seen for the 1997/98 El Nino, rapid cooling took place after the peak temperature in early 1998, resulting in a super La Nina later that year. The 2015/16 data did not show rapid cooling by the end of February 2016. However, given the 0.11 degree C. drop in global temperature in March 2016 shown in Fig. 1, one can infer substantial decreases in sea surface temperature are taking place.
A more recent Reuters article of April 7, 2016, “Notion of a delayed La Nina might have been hasty: Braun,” shows a more rapid decline in El Nino 3.4 sea surface temperatures for March, given in Fig. 3. This trend, along with changes in prevailing winds, indicates a switch to La Nina by July.
Fig. 3 Revised El Nino Sea Surface Temperatures
Perhaps a little more insight about the future of the current El Nino may be found by examining subsurface Pacific Ocean temperatures along the Equator. Fig. 4 gives monthly measured ocean subsurface temperatures along the Equator, to a depth of 300 meters, for the eight El Nino events. These measurements span a greater length along the Equator than those shown in Fig. 2: from 130 degrees E to 80 degrees W, or 10,300 miles.
Fig. 4 El Nino Subsurface Temperatures
By the end of February, the 2015/16 El Nino subsurface temperature had not turned negative; however, the lower March temperature in Fig. 1 suggests this has now taken place. The trajectory of this Super El Nino is similar to that of 1997/98, and there is a good chance of considerable global cooling by the end of the year.
NO GLOBAL WARMING FOR 58 YEARS
Another perspective on the 2016 global temperature comes from the March 7, 2016 article “NOAA Radiosonde Data Shows No Warming For 58 Years” by Tony Heller. In its press briefing declaring 2015 the “hottest year ever,” NOAA stated it had a 58-year radiosonde (balloon) temperature record but showed only the last 37 years in a graph. Fig. 5 shows the graph released by NOAA.
Fig. 5 NOAA’s 37-YEAR SATELLITE AND RADIOSONDE TEMPERATURES
Tony Heller found the missing radiosonde data for 1958-1976 in the scientific journal article “Global Temperature Variation, Surface-100mb: An Update into 1977,” published in the June 1978 Monthly Weather Review. This data is shown in Fig. 6, which indicates global temperatures declined from 1958 until 1977.
Fig. 6 Temperature variation for World Surface-100MB and World Surface. Eruptions of Mt. Agung and volcano Fuego (Guatemala) are indicated
Mr. Heller combined Figs. 5 and 6 into Fig. 7 and added a horizontal red line showing there has been no warming in the troposphere from 1958 to 2016. Purists might complain that the 1958 to 1977 data cover a region extending from the earth’s surface to an elevation of 100 mb (54,000 ft.), while the 1979 to 2016 data are for five discrete elevations from 5,000 ft. to 40,000 ft. However, the data at the five discrete elevations are quite similar, which makes the comparison valid.
Fig. 7 Radiosonde data from 1958 to 2016
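The rough equivalence of the 100 mb pressure level and 54,000 ft. can be sanity-checked with the standard atmosphere. This sketch uses the 1976 U.S. Standard Atmosphere constants, which are my assumption and not from the article:

```python
import math

# Above 11 km the standard atmosphere is isothermal, so pressure decays
# exponentially with height; invert p = p11 * exp(-g*(h - 11000)/(R*T11)) for h.
g, R = 9.80665, 287.05           # gravity (m/s^2), gas constant for air (J/(kg K))
p11, T11 = 226.32, 216.65        # pressure (hPa) and temperature (K) at 11 km

def altitude_m(p_hpa):
    return 11000 + (R * T11 / g) * math.log(p11 / p_hpa)

alt_ft = altitude_m(100) * 3.28084
print(round(alt_ft))             # roughly 53,000 ft, close to the cited 54,000 ft
```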
The NOAA and NASA press releases claimed 2015 had the hottest global surface temperatures since measurements began in 1880. The physics of warming by the greenhouse gas carbon dioxide has the warming occurring in the atmosphere, from the earth’s surface to the stratosphere. Thus the true measure of the influence of increasing atmospheric carbon dioxide from burning fossil fuels is given by satellite or radiosonde data. Over the period 1958 to 2016, atmospheric carbon dioxide increased from 315 to 402 ppm. Since no appreciable global warming is shown by atmospheric temperatures over that period, one may infer increasing atmospheric carbon dioxide from burning fossil fuels has no significant influence on global warming.
DOES NOAA FUDGE DATA?
In his article, Tony Heller noticed NOAA had two databases for radiosonde surface-to-atmosphere temperatures: (1) a database from 1958 to 2011 and (2) a database from 1958 to 2016. The graph in Fig. 5, from the latest database, shows about 0.5 degrees C. of warming from 1979 to 2010. However, the original 2011 database shows little warming during that period.
Fig. 8 NOAA 2016 Radiosonde Database Minus 2011 Database
Fig. 8 shows the 2011 database subtracted from the 2016 database; the revisions produce greater warming from 1992 to the present and less warming from 1958 to 1992.
Changes in NOAA global temperature databases are shown by the June 4, 2015, article published in Science that eliminated the pause or “hiatus” in global surface temperatures from 1998 to 2014. NOAA’s Director Thomas Karl said, “Adding in the last two years of global surface temperature data and other improvements in the quality of the observed record provide evidence that contradict the notion of a hiatus in recent global warming trends. Our new analysis suggests that the apparent hiatus may have been largely the result of limitations in past datasets, and that the rate of warming over the first 15 years of this century has, in fact, been as fast or faster than that seen over the last half of the 20th century.” Much controversy was generated over this study and many in the scientific community claimed it was wrong.
Texas Congressman Lamar Smith is heading a committee to investigate allegations of NOAA altering global temperature databases.
U.S. CLIMATE CHANGE SCIENCE PROGRAM REPORT
The U.S. Climate Change Science Program (CCSP) published an April 2006 180-page report, “Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences.” The report showed comparisons of vertical global temperature distributions in the atmosphere computed by Parallel Climate Models (PCM) with actual radiosonde (balloon) temperature measurements. The data are displayed with a vertical axis of altitude, given on the left side as pressure in millibars and on the right side in kilometers (km). The horizontal axis is latitude from 75 degrees S to 75 degrees N.
Fig. 9 All temperature changes were calculated from monthly-mean data and are expressed as linear trends (in ºC/decade) over 1979 to 1999.
Fig. 9 shows that from altitudes of 1.5 km to 9 km, temperature changes are small over this 20-year interval. This provides some agreement with Tony Heller’s article “NOAA Radiosonde Data Shows No Warming For 58 Years.”
Fig. 10 Computer-modeled zonal atmospheric temperature changes from all forcings (greenhouse gas increase dominates)
Fig. 10 (all forcings) shows calculated global atmospheric temperatures from January 1958 to December 1999. Temperatures are given as the change over this period. Fig. 10 shows a very distinct “hot spot” at altitudes of 4 km to 16 km and latitudes from 30 degrees S to 30 degrees N. This hot spot is attributed to increased greenhouse gases (mostly carbon dioxide) over that time period. The radiosonde measurements shown in Fig. 9 show no “hot spot.” This clearly indicates the models for predicting global temperature changes are wrong in how they handle additions of greenhouse gases (predominantly carbon dioxide) to the atmosphere.
The Abstract for the CCSP report contains the following information:
“Previously reported discrepancies between the amount of warming near the surface and higher in the atmosphere have been used to challenge the reliability of climate models and the reality of human-induced global warming. Specifically, surface data showed substantial global-average warming, while early versions of satellite and radiosonde data showed little or no warming above the surface. This significant discrepancy no longer exists because errors in the satellite and radiosonde data have been identified and corrected. New data sets have also been developed that do not show such discrepancies.
….For recent decades, all current atmospheric data sets now show global-average warming that is similar to the surface warming. While these data are consistent with the results from climate models at the global scale, discrepancies in the tropics remain to be resolved. Nevertheless, the most recent observational and model evidence has increased confidence in our understanding of observed climate changes and their causes.”
The vast differences between Figs. 9 and 10 indicate the models’ calculations have little agreement with experiment. The U.S. CCSP has taken the position that if models don’t agree with experiments, then the experiments are wrong. Unfortunately, most readers never go beyond the Abstract, and the CCSP conclusions are taken as fact.
With considerable fanfare, NOAA and NASA announced 2015, and possibly 2016, are the warmest years since record keeping started in 1880. Their media supporters, such as Seth Borenstein and Justin Gillis, circulated scary articles announcing this threat to the world from carbon dioxide emitted by burning fossil fuels causing catastrophic global warming, and urging the United States to lead all nations in immediately finding alternative energy sources to replace fossil fuels, regardless of the economic cost.
The importance of the NOAA and NASA assertions is called into question by the experimental data cited in this article. It is quite likely the present Super El Nino will change to a Super La Nina that brings global temperatures back to levels seen a few years ago. Will reporters like Seth Borenstein and Justin Gillis then report the errors of their recent assertions to the public? I think not.
On October 5, 2009, President Obama issued an executive order, FEDERAL LEADERSHIP IN ENVIRONMENTAL, ENERGY, AND ECONOMIC PERFORMANCE, that set policies for reducing greenhouse gas emissions for the rest of his term in office. This order explains the motivation behind the climate policies of all government organizations over the past seven years. The vast waste of tax dollars and the impediments to fossil fuel production may be the reason for economic stagnation, in spite of the U.S. becoming the leading fossil fuel energy producer on the planet. A paper by Dr. James H. Rust, “President Obama Demands Agreement With Climate Policies,” on The Heartland Institute’s website gives examples of compliance with the executive order by government, education, and commercial organizations.
One of the sources of surface temperature data is the United States Historical Climatology Network (USHCN), which gives temperature data for the contiguous United States. Walter Dnes wrote an essay, “USHCN Monthly Temperature Adjustments,” which cites six references describing in detail the monthly adjustments to USHCN data from 1872 to the present. These adjustments made present temperatures warmer, made earlier temperatures cooler, and eliminated the 1930s period of heat waves and droughts.
A January 20, 2016 paper, “No Pause in NASA Climate Science Corruption,” shows NASA-GISS has doubled apparent global warming over the past 15 years by altering its data, while completely ignoring satellite data.
Numerous studies show NOAA and NASA made adjustments to temperature data that produce unwarranted global warming. Real Climate published a paper showing adjustments by both NOAA and NASA to U.S. and other nations’ temperature data.
The United Kingdom has been exceptional in reporting news of bogus temperature data. British journalist James Delingpole wrote the January 30, 2015 article “FORGET CLIMATEGATE: THIS ‘GLOBAL WARMING’ SCANDAL IS MUCH BIGGER,” which points out that the world’s three surface data sources for global temperatures have adjusted their raw data. The sources are NASA-GISS; NOAA, which maintains the dataset known as the Global Historical Climate Network; and the University of East Anglia Climatic Research Unit and Met Office records known as HadCRUT. Mr. Delingpole found no satisfactory reasons for the temperature adjustments.
Albert Einstein famously said, “No amount of experimentation can ever prove me right; a single experiment can prove me wrong.” The NOAA, NASA-GISS, and U.S. CCSP re-interpretation of this remark is: “computer models are always right; no amount of experiments can prove them wrong.”
Investigations by Congress are in order to clear up the year-to-year discrepancies in NOAA and NASA-GISS temperature data and the reported accuracies of climate models from groups like the CCSP.
A new report from the Government Accountability Office, a nonpartisan government agency tasked with auditing, evaluating, and investigating government affairs for Congress, faults the Internal Revenue Service for failing to properly secure taxpayer data, leaving taxpayers’ private information at the mercy of hackers, both domestic and foreign. The report, delivered to IRS chief John Koskinen on March 28, says the IRS has failed to make recommended improvements to its financial and information-technology procedures.
Unfortunately for taxpayers, the IRS has little motivation to protect the sensitive data it collects, because the agency views government, not taxpayers, as its consumer.
According to the report, the IRS “has not always effectively implemented access and other controls, including elements of its information security program, to protect the confidentiality, integrity, and availability of its financial systems and information.” Also: “These weaknesses — including both previously reported and newly identified — increase the risk that taxpayer and other sensitive information could be disclosed or modified without authorization.”
Other violations of good IT security practices cited in the report include failures to encrypt taxpayers’ vital information, weak passwords on servers containing taxpayer data, and easy-to-evade physical security. For example, the report says non-employees could plausibly sneak by security guards in some IRS data centers and gain access to secure systems.
GAO auditors can issue reports until they’ve run out of printer paper and toner ink, but until lawmakers get tough with the IRS, the taxman will have no incentive to shape up.

Living things consume resources, grow, react to stimuli, and reproduce. Government agencies such as the IRS consume taxpayer money, hire more staff, make new rules and regulations, and spin off new divisions and departments on a regular basis.
From the politically motivated “enhanced auditing” of conservative organizations committed by Lois Lerner and her employees to repeated data breaches, the government agency U.S. citizens are forced to deal with every year on April 15 has treated Americans with contempt. Why? Precisely because Americans are forced to “do business” with the IRS, there is little reason for the IRS to provide better customer service, protect private data, or apply tax laws in a neutral, nonpartisan way.
As Bruce Yandle, dean emeritus of Clemson University’s College of Business and Behavioral Science, wrote for the Foundation for Economic Education, the solution to poor service from public servants is to reduce the power government has over citizens, so there are fewer opportunities for corruption and inefficiency.
“We must take action to reduce occurrences that corrupt the political process,” Yandle wrote. “But how? First, by limiting the domain of government action. Then, when the domain is limited, by requiring transparency and regular agency reports that demonstrate choice neutrality, by encouraging competition from the loyal opposition, and by showing constant vigilance.”
Also like living things, the IRS has an instinct for self-preservation, and lawmakers must harness that desire to align the agency’s actions with the best interests of taxpayers. By reducing the IRS’ size and power, as well as the power of the federal government in general, abuse and corruption become less attractive, and agencies are forced to concentrate on their core competencies.
Just as a trainer disciplines a disobedient animal, Americans need to demand their government start working for them again, instead of the current status quo of Americans constantly working to feed the government leviathan.
In The Tank Podcast (ep34): Rich States, Poor States, Tax Freedom Day, PEV Subsidies, and Global Warming Thought Crimes
John Nothdurft returns in episode #34 of the In The Tank Podcast. This weekly podcast features (as always) interviews, debates, and roundtable discussions that explore the work of think tanks across the country. The show is available for download as part of the Heartland Daily Podcast every Friday. Today’s podcast features work from ALEC, the Tax Foundation, the Freedom Foundation of Minnesota, and the Competitive Enterprise Institute.
Featured Work of the Week
This week’s featured work is ALEC‘s Rich States, Poor States report (9th edition). The report ranks all 50 states on 15 economic policies, including various tax, regulatory, and labor policies, giving state lawmakers yearly comparisons of how policies are helping or hurting their economic outlook. John and Donny discuss the winners and losers of this report and talk about their major takeaways.
In the World of Think Tankery
Today Donny and John get set for Tax Day. They discuss Tax Freedom Day, a measurement determined by the Tax Foundation: the date “when the nation as a whole has earned enough money to pay its total tax bill for the year.” In 2016, Tax Freedom Day falls on April 24, 114 days into the year.
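The “114 days into the year” figure checks out as the number of days elapsed between January 1 and April 24 in a leap year:

```python
from datetime import date

# Days elapsed from New Year's Day to Tax Freedom Day 2016 (a leap year).
days_elapsed = (date(2016, 4, 24) - date(2016, 1, 1)).days
print(days_elapsed)   # 114
```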
Speaking of taxes, Donny and John discuss a tax credit recently proposed in Minnesota for electric vehicles. The Freedom Foundation of Minnesota published an article titled “It’s Time to Pull the Plug on Electric-Car Subsidies.” While this article focuses on the state effort to prevent this tax credit, Donny and John talk about the implications of the national tax credit.
The last topic discussed by Donny and John is the crackdown on free speech regarding global warming by a coalition of attorneys general. The Competitive Enterprise Institute recently received a subpoena demanding it turn over a decade’s worth of documents, including emails, donor information, statements, and other documents relating to climate change policy. Donny and John talk about the chilling effect this move has on the global warming debate and how it amounts to an attack on the First Amendment.
New Hampshire Gov. Maggie Hassan signed into law on April 5 House Bill 1696 to modify and renew through 2018 the state’s Medicaid expansion program under the Affordable Care Act (ACA), which state lawmakers first adopted in 2014.
Although the law is scarcely a week old, the people of New Hampshire are less than seven months away from influencing whether their elected officials renew Medicaid expansion through 2020.
Why so soon? Most of the New Hampshire lawmakers who will decide Medicaid expansion next time around will hold their offices as a result of the general election on Nov. 8, just seven months from now.
State lawmakers will vote in the spring of 2018 either to allow the Medicaid expansion program to sunset at the end of the year, as HB 1696 prescribes, or renew the program again through 2020. In that year alone, New Hampshire will spend $47 million to extend Medicaid coverage to 50,500 newly eligible enrollees, according to HB 1696’s fiscal note. If the program attracts more enrollees than projected, as other states’ Medicaid expansion programs have, New Hampshire will pay even more.
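The fiscal note’s figures imply a per-enrollee cost that is easy to work out (this division is my own back-of-the-envelope arithmetic, not a figure from HB 1696):

```python
# State spending and projected enrollment for the year cited in HB 1696's fiscal note.
spend = 47_000_000      # dollars New Hampshire will spend
enrollees = 50_500      # projected newly eligible enrollees

per_enrollee = spend / enrollees
print(round(per_enrollee))   # about $931 per enrollee for that year
```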
As the people of New Hampshire prepare to elect candidates to represent their values and interests in the General Court, voters should consider three popular Medicaid expansion myths and their rebuttals.
One myth is Medicaid expansion empowers states. In fact, the federal share of Medicaid expansion makes states more dependent on the federal government. In 2020, the federal share for New Hampshire will be $509 million. New Hampshire will grow more beholden to the federal government as it grows more dependent on federal money. Like the ACA itself, this federal overreach will further upset the balance of power between federal and state governments, known as federalism.
The Founding Fathers, in contrast to the majority of current New Hampshire lawmakers, saw value in limiting the federal government’s reach into matters the 10th Amendment reserves to the states, such as providing for the health and welfare of their citizens.
A second myth is popular among alarmists: If New Hampshire doesn’t renew its Medicaid expansion program, other states with Medicaid expansion programs will get billions of New Hampshire dollars in the form of federal shares.
The truth is, if New Hampshire allows its Medicaid expansion program to sunset, the federal share it was receiving would not go to other states. It would go unspent. “There is no magic pot of Obamacare money” in Washington, D.C., waiting to be distributed to other states, Nicolas Horton, a senior research fellow at the Foundation for Government Accountability, told Health Care News in March. “All of the money the federal government is spending on Obamacare’s Medicaid expansion is being added to the national debt.”
The national debt will reach close to $20 trillion when President Barack Obama leaves office on Jan. 20, 2017. The money New Hampshire currently receives in the form of the federal share never was New Hampshire’s, except in the sense New Hampshire shares the national debt. Whether people realize it or not, this debt belongs to every present-day American and countless Americans yet to be born into debt they did not create.
To say New Hampshire’s money could go toward expanding Medicaid in another state is misleading, because the money is not New Hampshire’s and technically does not exist.
A third myth is HB 1696’s addition of work requirements protects the program from abuse by newly eligible, able-bodied Medicaid recipients.
The Republican-controlled New Hampshire House of Representatives left HB 1696’s work requirements vulnerable to removal by the federal Centers for Medicare and Medicaid Services (CMS). A split in the House prompted Speaker Shawn Jasper to break a 181–181 tie on March 9 in favor of including a severability clause for work requirements in the bill. The adopted amendment, offered by state Rep. Karen Umberger, ensures the Medicaid expansion program will remain in effect even if CMS holds the legislation’s work requirements invalid.
On its face, the law’s inclusion of work requirements offers politicians a convenient cop-out when constituents challenge their representatives’ support of Medicaid expansion. Wise voters will recognize that a lawmaker’s consent to severing work requirements is, by definition, a vote not to require work.
The decision whether to renew New Hampshire’s Medicaid expansion program through 2020 rests not merely with lawmakers in the General Court in 2018, but with voters at the ballot box in November.
New Hampshire’s latest Medicaid expansion law increases the state’s dependence on Washington, D.C., spends money the country doesn’t have and which never belonged to New Hampshire, and does not truly require able-bodied recipients to work.
Proponent lawmakers, like Medicaid expansion myths, deserve busting.
The FCC’s AllVid proposal is déjà vu. We have seen Google-YouTube’s piracy-as-negotiating-leverage MO in action before.
Google’s puppeteering of FCC-sponsored piracy in the AllVid set-top box proposal is not the first time Google has used piracy promotion to gain an anticompetitive market advantage for YouTube’s monopsony power, i.e., its market power from being the only repository in the world where one can access a copy of almost every video created, whether legal or pirated, and where Google often promotes pirated videos near the top of its search results.
Don’t take my word for it; listen to Google executives’ own words in internal Gmails captured for posterity in the Statement of Undisputed Facts filed in the Viacom v. Google-YouTube copyright lawsuit, which was filed in 2007 and settled in 2014.
These undisputed facts/Gmails spotlighted and organized below are damning for three reasons.
First, they prove that for ten years Google has been trying to “pressure premium content providers to change their model towards free,” which strongly suggests Google is using its extraordinary political influence over the federal government and the FCC to anticompetitively extract value from companies that it could not obtain through free-market negotiation, because it demands premium video content be made available for free or near free.
Second, they prove that Google knows full well that its willful blindness to profiting from mass piracy is both anticompetitive and predatory.
Third, they help expose the FCC’s apparent willful blindness to the fact that its AllVid set-top box NPRM does not have a limited, narrow, and containable effect on competition in just the set-top box market segment; it actually has broad, uncontainable, and predictable ancillary impacts that are demonstrably anticompetitive, monopolistic, and monopsonistic toward the value of the most valuable corpus of video content in the world.
What’s at stake?
That’s because what is really at stake in AllVid is not the roughly $20b in cable set-top box revenues the FCC myopically touts to justify its proposal. What is at stake is content that generates ten times as much in annual revenue: roughly $200b in annual video content revenues. (Annual TV advertising revenues were ~$80b in 2015, per Strategy Analytics estimates, and annual multichannel video revenues were ~$120b in 2015, per SNL Kagan estimates.)
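The ten-to-one comparison is simple arithmetic on the cited estimates:

```python
# Revenue comparison from the paragraph above (figures in billions, as cited).
set_top_boxes = 20
tv_advertising = 80          # Strategy Analytics estimate, 2015
multichannel_video = 120     # SNL Kagan estimate, 2015

content = tv_advertising + multichannel_video
print(content)                     # 200 (billion) in annual content revenue
print(content / set_top_boxes)     # 10.0 -> ten times set-top box revenue
```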
Summary of the Sordid Story that Google’s Gmail Trail Tells
For perspective, I have organized the most telling undisputed quotes from Google execs that lay bare a damning legal fact predicate for Google-YouTube’s anti-competitive behavior. It shows:
(1) Prior to buying YouTube, senior Google executives were actively considering an anticompetitive strategy of extortion, i.e., threatening mass copyright infringement to extract better terms for access to valuable content.
(2) At the same time, YouTube on its own was knowingly and aggressively facilitating rampant video piracy of valuable content in order to grow its value and sell the company at the highest price.
(3) Google then knowingly bought YouTube fully aware that it was buying an Internet video distribution site dependent on piracy for its traffic, growth, and value.
(4) Just a few months after buying YouTube, Google formalized a program of effective predatory copyright infringement and willful blindness to piracy to try to sign content on more favorable terms, i.e., an extension of its original anticompetitive extortion strategy.
(5) Since then, Google has continued and perfected YouTube’s copyright arbitrage practice of openly welcoming and benefiting from copyright infringement during the period from upload to DMCA takedown. (Last week, Google reported that copyright owners requested it take down 22 million infringing URLs in just that one-week period.)
Google Execs’ incriminating Gmails that Google did not dispute in Federal Court ruling
(1) Prior to buying YouTube, senior Google executives were actively considering an anticompetitive strategy of forcing a free video model on premium content providers by threatening mass copyright infringement to extort better terms for access to others’ valuable content.
“On June 8, 2006, Google Senior Vice President of Product Management Jonathan Rosenberg emailed Google CEO Eric Schmidt and Google co-founders Larry Page and Sergey Brin a Google Video presentation that stated the following: “Pressure premium content providers to change their model towards free; Adopt ‘or else’ stance re prosecution of copyright infringement elsewhere; Set up ‘play first, deal later’ around ‘hot content.’” The presentation also stated that “[w]e may be able to coax or force access to viral premium content,” noting that Google Video could “Threaten a change in copyright policy” and “use threat to get deal sign-up.”” [Bold added for emphasis.] Viacom v. YouTube SUF #161
(2) At the same time, the revenue-less YouTube start-up knowingly aided and abetted video piracy in order to grow its traffic virally so that it could then sell the company at the highest price.
“Steal it!” … “[W]e have to keep in mind that we need to attract traffic. How much traffic will we get from personal videos?” YouTube Co-founder Steve Chen SUF #44
“If you remove the potential copyright infringements… site traffic and virality will drop to maybe 20% of what it is.” YouTube Co-founder Steve Chen SUF #55
“But we should just keep that stuff on the site. I don’t really see what will happen. What? Someone from CNN sees it? He happens to be someone with power? He happens to want to take it down right away? He get in touch with cnn legal. 2 weeks later, we get a cease & desist letter. We take the video down.” YouTube co-founder Steve Chen SUF #47
“We’re going to have a tough time defending we are not liable… when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.” YouTube Co-founder Steve Chen SUF #40
“Save your meal money for some lawsuits!” YouTube co-founder Hurley SUF #38
“…concentrate all our efforts in building up our numbers as aggressively as we can through whatever tactics, however evil.” YouTube co-founder Chen SUF #85
(3) Then Google knowingly bought YouTube aware it was buying a piracy-driven/dependent Internet video distribution site, despite substantial high-level opposition internally.
“It crosses the threshold of Don’t be Evil to facilitate distribution of other people’s intellectual property…” “It’s a cop out to resort to dist-rob-ution.” Google Video Manager Ethan Anderson SUF #164
“…is changing policy [to] increase traffic beforehand that we’ll profit from illegal downloads how we want to conduct business? Is this Googley?” Google Co-founder Sergey Brin quoted SUF #162
“I think we should beat YouTube… but not at all costs. [They are] a video Grokster.” Google’s Eun to CEO Eric Schmidt before the deal was done SUF #158, #162
(4) After buying YouTube, Google knowingly operated a piracy-tolerant Google-YouTube in accordance with its original pre-YouTube strategy of anticompetitive extortion: forcing media companies into revenue deals with Google if they wanted Google to protect their video content from mass piracy.
Audio fingerprinting systems whereby a content partner can send ‘reference’ fingerprints to Audible Magic’s database “are now live as well and are only offered to partners who enter into a revenue deal with us.” Google manager David Eun, 2-15-07, SUF #216 [emphasis added]
(5) After owning YouTube for several months, Google was aware of growing mass copyright infringement on Google-YouTube:
“…a trend we see is that people upload copyrighted videos to their private videos… and then invite large numbers of people to view the video which bypasses our copyright restrictions.” Google-YouTube employee Julie Havens in a 7-18-07 internal Google email, SUF #199
Google-YouTube’s predatory copyright infringement and willful blindness to mass piracy is exceptionally anticompetitive and profitable because it:
Generates an unbeatable cost advantage by avoiding the market cost of propertied goods for which law-abiding competitors must pay;
Creates an unfair, jump-the-gun, time-to-market advantage, by ignoring the rule of law standard of securing permission from property owners before use in the marketplace, a business practice that law-abiding competitors must respect;
Spawns and maintains a matchless online monopsony index/inventory advantage that no law-abiding competitor could hope to assemble; and
Kneecaps property-based, subscription-monetization models which compete with Google’s piracy-friendly, free advertising model.
Google’s forced video commons strategy is the ultimate predatory anti-competitive business practice, in that it unlawfully destroys the value of any copyrighted innovation and creative proprietary trade secret advantage a competitor may produce in a free market.
In short, Google-YouTube has an undisputed, demonstrable anticompetitive pattern of behavior over a decade: it seeks to predatorily extort better wholesale video pricing by threatening to devalue, debase, and destroy video programmers’ businesses via willful blindness to mass video piracy on YouTube.
It should be beneath the FCC to allow itself to be used as Google’s de facto “muscle,” extorting and forcing, via government mandates, terms that monopolist Google could not fairly negotiate by itself in the vibrantly competitive ~$200 billion pay-TV marketplace.
In today’s edition of The Heartland Daily Podcast, we listen in as Bruno Behrend, Heartland senior fellow for education policy, speaks before the Great Homeschool Convention in Cincinnati, Ohio. He discusses education choice and how homeschooling is blazing the trail.
Behrend talks about the importance of education choice and why we must embrace more of a market system, funding the child instead of the bureaucracy. He also explains his ideal plan for education and how it would foster growth and innovation in the school system.
The vaunted “97% consensus” on dangerous manmade global warming is just more malarkey
By now, virtually everyone has heard that “97% of scientists agree: Climate change is real, manmade and dangerous.” Even if you weren’t one of President Obama’s 31 million followers who received this tweet, you most assuredly have seen it repeated everywhere as scientific fact.
The correct answers to those three claims are “yes,” “some,” and “no.” Yes, climate change is real: there has never been a period in Earth’s history when the climate has not changed somewhere, in one way or another.
People can and do have some influence on our climate. For example, downtown areas are warmer than the surrounding countryside, and large-scale human development can affect air and moisture flow. But humans are by no means the only source of climate change. The Pleistocene ice ages, Little Ice Age and monster hurricanes throughout history underscore our trivial influence compared to natural forces.
As for climate change being dangerous, this is pure hype based on little fact. Mile-high rivers of ice burying half of North America and Europe were disastrous for everything in their path, as they would be today. Likewise for the plummeting global temperatures that accompanied them. An era of more frequent and intense hurricanes would also be calamitous; but actual weather records do not show this.
It would be far more deadly to implement restrictive energy policies that condemn billions to continued life without affordable electricity – or to lower living standards in developed countries – in a vain attempt to control the world’s climate. In much of Europe, electricity prices have risen 50% or more over the past decade, leaving many unable to afford proper wintertime heat, and causing thousands to die.
Moreover, consensus and votes have no place in science. History is littered with theories that were long denied by “consensus” science and politics: plate tectonics, germ theory of disease, a geocentric universe. They all underscore how wrong consensus can be.
Science is driven by facts, evidence and observations – not by consensus, especially when it is asserted by deceitful or tyrannical advocates. As Einstein said, “A single experiment can prove me wrong.”
During this election season, Americans are buffeted by polls suggesting which candidate might become each party’s nominee or win the general election. Obviously, only the November “poll” counts.
Similarly, several “polls” have attempted to quantify the supposed climate change consensus, often by using simplistic bait-and-switch tactics. “Do you believe in climate change?” they may ask.
Answering yes, as I would, places you in the President’s 97% consensus and, by illogical extension, implies you agree it is caused by humans and will be dangerous. Of course, that serves their political goal of gaining more control over energy use.
The 97% statistic has specific origins. Naomi Oreskes is a Harvard professor and author of Merchants of Doubt, which claims those who disagree with the supposed consensus are paid by Big Oil to obscure the truth. In 2004, she claimed to have examined the abstracts of 928 scientific papers and found a 100% consensus with the claim that the “Earth’s climate is being affected by human activities.”
Of course, this is probably true, as it is unlikely that any competent scientist would say humans have no impact on climate. However, she then played the bait-and-switch game to perfection – asserting that this meant “most of the observed warming of the last 50 years is likely to have been due to the increase in greenhouse gas concentrations.”
However, a single dissenter would have been enough to discredit a claim of unanimity, and what journalist would believe 100% agreement anyway? Anecdotal evidence suggested that 97% was a more credible figure. So 97% it was.
Then in 2010, William Anderegg and colleagues concluded that “97–98% of the climate researchers most actively publishing in the field support … [the view that] … anthropogenic greenhouse gases have been responsible for most of the unequivocal warming of the Earth’s average global temperature” over a recent but unspecified time period. (Emphasis in original.)
To make this extreme assertion, Anderegg et al. compiled a database of 908 climate researchers who published frequently on climate topics, and identified those who had “signed statements strongly dissenting from the views” of the UN’s Intergovernmental Panel on Climate Change. The 97–98% figure is achieved by counting those who had not signed such statements.
Silence, in Anderegg’s view, meant those scientists agreed with the extreme view that most warming was due to humans. However, nothing in their papers suggests that all those researchers believed humans had caused most of the planetary warming, or that it was dangerous.
The most recent 97% claim was posited by John Cook and colleagues in 2013. They evaluated abstracts from nearly 12,000 articles published over a 21-year period and sorted them into seven categories, ranging from “explicit, quantified endorsement” to “explicit, quantified rejection” of their alleged consensus: that recent warming was caused by human activity, not by natural variability. They concluded that “97.1% endorsed the consensus position.”
However, two-thirds of all those abstracts took no position on anthropogenic climate change. Of the remaining abstracts (not the papers or scientists), Cook and colleagues asserted that 97.1% endorsed their hypothesis that humans are the sole cause of recent global warming.
Again, the bait-and-switch was on full display. Any assertion that humans play a role was interpreted as meaning humans are the sole cause. But many of those scientists subsequently said publicly that Cook and colleagues had misclassified their papers – and Cook never tried to assess whether any of the scientists who wrote the papers actually thought the observed climate changes were dangerous.
My own colleagues and I did investigate their analysis more closely. We found that only 41 abstracts of the 11,944 papers Cook and colleagues reviewed – a whopping 0.3% – actually endorsed their supposed consensus. It turns out they had decided that any paper which did not provide an explicit, quantified rejection of their supposed consensus was in agreement with the consensus. Moreover, this decision was based solely on Cook and colleagues’ interpretation of just the abstracts, and not the articles themselves. In other words, the entire exercise was a clever sleight-of-hand trick.
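The denominator shift described above can be made concrete with some simple arithmetic. The sketch below uses only the totals stated in the text (11,944 abstracts, roughly two-thirds taking no position, 41 explicit quantified endorsements); the exact endorsing count is an assumption chosen to reproduce the reported 97.1%.

```python
# Illustrative arithmetic for the Cook et al. (2013) figures cited above.
# Hypothetical split: only the totals given in the text are grounded.

total_abstracts = 11_944
no_position = round(total_abstracts * 2 / 3)    # "two-thirds took no position"
took_position = total_abstracts - no_position   # the denominator actually used

# The 97.1% figure is computed only over abstracts that took a position...
endorsing = round(took_position * 0.971)
print(f"{endorsing} of {took_position} position-taking abstracts "
      f"= {endorsing / took_position:.1%}")

# ...while explicit, quantified endorsements were far rarer:
explicit = 41
print(f"{explicit} of {total_abstracts} abstracts "
      f"= {explicit / total_abstracts:.2%}")
```

Switching the denominator from all 11,944 abstracts to only the minority that took any position is what turns a 0.3% explicit-endorsement rate into a headline 97% figure.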
What is the real figure? We may never know. Scientists who disagree with the supposed consensus – that climate change is manmade and dangerous – find themselves under constant attack.
Harassment by Greenpeace and other environmental pressure groups, the media, federal and state government officials, and even universities (directed at their own employees, myself included) makes it difficult for many scientists to express honest opinions. Recent reports that Senator Whitehouse and Attorney General Lynch would use RICO laws to intimidate climate “deniers” further chill meaningful discussion.
Numerous government employees have told me privately that they do not agree with the supposed consensus position – but cannot speak out for fear of losing their jobs. And just last week, a George Mason University survey found that nearly one-third of American Meteorological Society members were willing to admit that at least half of the climate change we have seen can be attributed to natural variability.
Climate change alarmism has become a $1.5-trillion-a-year industry – which guarantees it is far safer and more fashionable to pretend a 97% consensus exists, than to embrace honesty and have one’s global warming or renewable energy funding go dry.
The real danger is not climate change – it is energy policies imposed in the name of climate change. It’s time to consider something else Einstein said: “The important thing is not to stop questioning.” And then go see the important new documentary film, The Climate Hustle, coming soon to a theater near you.
David R. Legates, PhD, CCM, is a Professor of Climatology at the University of Delaware in Newark, Delaware.
A new study published in Environment International indicates hydraulic fracturing, commonly called “fracking,” and the heavy truck traffic associated with it would have a negligible impact on air quality if fracking were used extensively in the United Kingdom. Interestingly, the authors of the study appear a little disappointed with their findings, which may be why they chose to emphasize maximum exposures over a shorter timeframe rather than exposures under more realistic scenarios.
Hydraulic fracturing is a technique for extracting oil and natural gas from stubborn rocks, such as shale and tight sandstone. In less than 10 years, fracking has turned the United States from an “also-ran” into an energy superpower, nearly doubling its oil production since 2008 and making it the largest producer of natural gas in the world. The technique could also boost natural gas production in the United Kingdom, but fracking there has met staunch opposition from environmental groups concerned about the potential impacts of drilling, production, and heavy truck traffic on the region.
Compared with lighter vehicles, heavy vehicles produce more noise, more road damage, and more air pollution in the form of small particulates, which form as a result of fuel combustion in all vehicles. The authors of the paper developed a traffic impact model to assess both the short-term and long-term impacts of fracking at individual sites, as well as a regional impact analysis.
According to the researchers’ model, heavy-vehicle traffic related to fracking for an individual well, a multi-well pad, or even an entire region would be negligible compared with emissions from transport in the region as a whole, or from other established industrial sectors. The researchers did suggest particle emissions could rise during the fracturing process itself, which requires hundreds of truckloads of water and sand hauled to a well site, although the study was not clear on whether pollution standards would likely be exceeded.
Particulate matter and other particle pollution can have an effect on the health of nearby residents if it exceeds the health-based safety standards, but people must generally be exposed to these levels of particulate matter for long periods for it to have negative health effects. This is why exposure to harmful particles is typically measured as a time-weighted average over a series of years (three years in the United States) to determine whether it will have adverse human health impacts. The short durations for which heavy traffic would take place during fracking would be unlikely to affect these longer-term averages.
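To see why a brief traffic spike barely moves a multi-year average, consider a back-of-the-envelope sketch. The numbers below are hypothetical assumptions (a background annual PM2.5 mean of 8 µg/m³ and a two-week fracking-traffic period that doubles local concentrations); only the three-year averaging convention comes from the text.

```python
# Minimal sketch: effect of a short pollution spike on a 3-year average.
# All concentration values are hypothetical, for illustration only.

background = 8.0      # assumed annual-mean PM2.5, µg/m³
spike_level = 16.0    # assumed concentration during heavy truck traffic
spike_days = 14       # assumed duration of the traffic-heavy period
days_per_year = 365

# Annual mean for the year containing the spike:
year_with_spike = (background * (days_per_year - spike_days)
                   + spike_level * spike_days) / days_per_year

# Three-year average, mirroring the U.S. multi-year convention:
three_year_avg = (year_with_spike + background + background) / 3

print(f"year with spike: {year_with_spike:.2f} µg/m³")
print(f"3-year average:  {three_year_avg:.2f} µg/m³")
```

Even doubling concentrations for two weeks lifts the three-year average by only about 0.1 µg/m³ in this sketch, which is the dilution effect the paragraph above describes.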
Ironically, fracking may actually have an important role in reducing particle pollution in the United Kingdom in the coming decades, because Britain has been burning large quantities of diesel fuel, which emits far more particulates into the air than natural gas, for electricity generation.
Larger supplies of affordable natural gas will be essential if the United Kingdom wants to replace coal- and diesel-powered generation systems, which produce nitrogen oxides and particulate matter at significantly higher levels than natural-gas-fired power plants. Also, burning natural gas instead of coal emits half as much carbon dioxide into the atmosphere, and natural gas emits about one-third less carbon dioxide than gasoline or diesel.
Although the authors of the study don’t seem thrilled about the results, people in Britain should be, because the study shows that, over a longer baseline (the entire operational lifetime of a pad), fracking would result in negligible increases relative to baseline traffic impacts. These findings, in addition to the environmental benefits of natural gas compared to coal or diesel, should make environmentally conscious people in the United Kingdom eager to consider the benefits of fracking.