MIT Gas Report Glosses Over Both Price and Security Risks

By Frank Clemente, Ph.D.

In June, a group of MIT researchers released a report, “The Future of Natural Gas.” The basic thrust of the report is that the production of large amounts of shale gas at very modest prices will allow for the significant expansion of gas consumption — especially for power generation in the United States. Unfortunately, the natural gas industry provided the money to write the report. While this alone doesn’t invalidate the work, it certainly suggests that any flaws might be those of omission rather than commission.

Put another way, the MIT group presented a risk-benefit analysis of natural gas but left the risk section to someone else. In an earlier issue of Energy Facts, we showed how the report gave surprisingly short shrift to the increasingly apparent environmental impacts of shale gas production. In last week’s edition we examined yet another key omission in the report — natural gas price volatility and its impact on families and businesses. This week we will focus on how the MIT group generally bypassed any substantive discussion of why increased dependence upon gas for electricity elevates the risk of both higher energy prices and inadequate supply.

No sentence is more telling in the MIT report than the following recommendation: “Coal generation displacement with NGCC generation should be pursued as a near-term option…” Both electricity and natural gas consumers should take heed. Over 90% of power plants built since 2000 have been gas-based. Unfortunately, we already know what that increased natural gas generation did to electricity prices:

More Natural Gas Generation Leads to Higher Electricity Prices

“The rising price of natural gas is one of the reasons why Southern California Edison, the largest utility in California, recently warned customers it would be requesting a sharp increase in rates.”
Christian Science Monitor (2)

But increased gas-based generation does not merely raise the cost of electricity. It also raises the price of gas for other consumers. During the past decade gas prices have not only averaged four times higher than coal prices but were far more volatile.

Further, the MIT recommendation would put electricity reliability at serious risk. The group legally mandated to assess electricity adequacy, the North American Electric Reliability Corporation (NERC), has warned “Continued high levels of dependence on natural gas for electricity generation in Florida, Texas, the Northeast, and Southern California have increased the bulk power system’s exposure to interruptions in fuel supply and delivery.” (3)

And Regarding National Energy Security:
While the MIT report briefly states that gas relates to national energy security issues, absolutely no analysis is presented as to what the consequences would be if, after developing an ever-increasing number of natural gas power plants in the United States, we simply do not have enough domestically produced gas and have to import it in the form of liquefied natural gas (LNG). The National Energy Technology Laboratory has described that very likely scenario: “… The need for more LNG will create closer links to the world oil price, setting the stage for the marginal price of electricity to be set by the whims of foreign oil/LNG suppliers, for the first time in U.S. history.”
A Five Nation Cartel Would Control Almost 60% of the World’s Natural Gas

“Russia and Qatar [are] exploring possible means of adjusting their gas sales strategies to avoid head-to-head competition that could undermine the oil-indexed pricing both still support for their baseload long-term sales. … aiming to come up with a strategy to minimize price competition out to 2025.” (Energy Intelligence Report, March 2010)

“We are creating something similar to OPEC but with gas.”
—Hugo Chavez on the formation of a gas cartel

The Globalization of Natural Gas Prices:
LNG prices will be linked to oil prices over coming decades

“Our [LNG] projects are long term [and] linked to oil prices… in the coming two or three years there will be a shortage of gas.”
—Mohammed al Sada, Qatari Minister of Energy, March 2010 (Reuters, March 26, 2010)

Poll: Voters Reject New Taxes on Oil, Natural Gas

By Jane Van Ryan

Voters in ten states oppose higher taxes on America’s oil and natural gas industry by a 2-to-1 margin, according to a new poll released today.

The poll, conducted by Harris Interactive for API in ten states, found that 64 percent of registered voters oppose an increase, including 46 percent of voters who strongly oppose.

Only 27 percent support increasing taxes. The poll was conducted via telephone between July 15 and July 18 among 6,000 registered voters.

“Voters know raising taxes on an industry that provides most of their energy and supports more than 9.2 million jobs would hurt them and damage the economy,” said API President and CEO Jack Gerard. “Raising taxes doesn’t address their major concern, which is putting people back to work.”

Both the administration and some members of Congress have proposed imposing billions of dollars in new taxes on the oil and natural gas industry. But the poll found that those surveyed believe the two most important issues for the federal government to address are the economy and job creation. National polls from Gallup, CBS News and Bloomberg have reported similar results.

“With 15 million people out of work, now is not the time to be imposing more taxes,” Jack said. “The fact that the proposals are being pushed under the guise of addressing the oil spill in the Gulf doesn’t make them any better.”

The Energy Information Administration (EIA) says the U.S. oil and natural gas industry paid almost $100 billion in federal income taxes in 2008. As this video shows, the industry’s effective tax rate is much higher than the average for all other industries.

Harris Interactive conducted the polling in Colorado, Michigan, North Carolina, North Dakota, Pennsylvania, Virginia, Maine, Missouri, Ohio and West Virginia. The individual state polling results are available here.

Industries use airwaves to attack low-carbon fuel mandate

By Darren Goode

A broad coalition of oil, trucking, airline, manufacturing and other companies will roll out a two-week advertising campaign Tuesday accusing senators of hurting consumers and jobs if they mandate lower greenhouse gas content in fuels.

“Our families are struggling, but unfortunately it’s business as usual in Washington,” according to a 60-second radio ad sponsored by the Consumer Energy Alliance (CEA) and running in four Midwestern states.

The “latest bright idea” from Congress is a low-carbon fuel standard, according to the ad. Auto companies and the autoworkers union are pushing such a standard.

CEA’s radio campaign cites unnamed studies that claim a standard would cost consumers up to $2,000 annually and increase gas prices at the pump by up to 170 percent.

The coalition funded a recent study by Charles River Associates that contended a low-carbon fuel standard starting in 2015, and reducing the carbon intensity of transportation fuels by 10 percent thereafter, would increase the cost of transportation fuels to consumers by 90 to 170 percent by 2025.

A low-carbon standard would also “further damage our ailing economy” and kill up to 1.1 million jobs and “10s of billions of dollars” in economic investment in the Midwest, according to the coalition’s ad. “Low-carbon fuel standards may sound like a good idea, but as usual Congress wants you to pay the price,” according to the ad.

Both the radio ads and accompanying cable TV ads will run for two weeks starting Tuesday in Michigan, Minnesota, Indiana and Ohio at a price tag of about $1 million. The TV ads were still being finalized late Monday.

Each of the four radio ads gives out the number for the U.S. Capitol switchboard and asks listeners to call the senators – both Democratic and Republican – in that particular state.

The low-carbon fuel standard idea has received some bipartisan backing. Michigan Democratic Sens. Debbie Stabenow and Carl Levin are looking to include it in an evolving three-tier package they are working on to limit the impact to their state’s auto industry of plans curbing transportation sector emissions. The package would be offered as part of a broader Senate climate and energy strategy expected to hit the Senate floor as early as next week. Major auto companies and the United Auto Workers are helping Stabenow and Levin draft their ideas.

Sens. Ron Wyden (D-Ore.) and Lamar Alexander (R-Tenn.) have also espoused the idea of a low-carbon fuel standard. President Obama did as well as a U.S. senator and on the 2008 presidential campaign trail.

California has adopted a standard and 11 East Coast states have pledged to model a regional plan from it.

The National Petrochemical & Refiners Association (NPRA), American Trucking Associations (ATA), and other industry groups have filed a lawsuit against California’s plan, alleging the state’s standard is an unconstitutional attempt to regulate interstate and foreign commerce. Both NPRA and ATA are members of CEA.

The ethanol industry has also filed suit against the California standard – which requires fuel providers to cut the carbon intensity of fuels sold in the state 10 percent by 2020.


Understanding E = mc2

By William Tucker
Posted on Oct. 21, 2009

Ed. note: A few weeks ago, I had the pleasure of hearing William Tucker speak at a conference in Washington, DC. His explanation of E = mc2 was the best I had ever heard. Even better, Tucker explained how Einstein’s equation applied to renewable energy sources like wind, solar, and hydro. His lecture was a revelation. It showed that the limits of renewable energy have nothing to do with politics or research dollars, but rather with simple mathematics. During a later exchange of emails with Tucker, I praised his lecture and suggested he write an article that explained E = mc2 and its corollary, E = mv2.

To my delight, he informed me that he’d already written such an essay and he agreed that we could publish it in Energy Tribune.

I love this essay. And I’m proud that Tucker has allowed us to run it.

-Robert Bryce

Prof. Albert Einstein delivers the 11th Josiah Willard Gibbs lecture at the meeting of the American Association for the Advancement of Science in the auditorium of the Carnegie Institute of Technology Little Theater at Pittsburgh, Pa., on Dec. 28, 1934. Photo by AP

E = mc2

When I was in college, I took a course in the great political philosophers. We studied them in order – Hobbes, Locke, Rousseau, Kant, John Stuart Mill and Karl Marx.

In my mind, I had placed them with the historical eras they had influenced – Hobbes and the 18th century monarchs, Locke and the American Revolution, Rousseau and 19th century Romanticism, Kant and the 19th century nation-states, Marx and 20th century Communism.

Then one day I saw a time-line illustrating when they had all lived and died. To my astonishment, each had lived a hundred years before I had placed them in history. The implication seemed clear. “It takes about a hundred years for a new idea to enter history.”

Almost exactly 100 years ago, Albert Einstein posited the equation E = mc2 in his “Special Theory of Relativity.” The equation suggested a new way of describing the origins of chemical energy and suggested another source of energy that at that point was unknown in history – nuclear energy. Nuclear power made its unfortunate debut in history 40 years later in the form of an atomic bomb. But 100 years later, Americans have not quite yet absorbed the larger implications of Einstein’s equation – a new form of energy that can provide almost unlimited amounts of power with a vanishingly small impact on the environment.

E = mc2. Who has not heard of it? Even Mariah Carey named her last album after it. “E” stands for energy, “m” for mass, and “c” is the speed of light – that’s easy enough. But what does it really mean? (The answer is not “relativity.”)

What E = mc2 says is that matter and energy are interchangeable. There is a continuum between the two. Energy can transform into matter and matter can transform into energy. They are different aspects of the same thing.

This principle of the equivalence of energy and matter was a completely unexpected departure from anything that had gone before. In the 18th century, Antoine Lavoisier, the great French chemist, established the Conservation of Matter. Performing very careful experiments, such as burning a piece of wood, he found that the weight of the resulting gases and ashes was always exactly equal to the weight of the original material. Matter is never created nor destroyed; it only changes form.

Then in the 19th century a series of brilliant scientists – Count Rumford, Sadi Carnot, Rudolf Clausius, Ludwig Boltzmann – established the same principle for energy. Energy can take many forms – heat, light, motion, potential energy – but the quantity always remains the same. Energy is never created nor destroyed either.

Now at the dawn of the 20th century, Albert Einstein posited a third principle that united the other two in a totally unexpected way. Einstein stated a Law of Conservation between matter and energy. Nothing like this had ever been imagined before. Yet the important thing is that coefficient – the speed of light squared. That is a very, very large number, on the order of one quadrillion.

We really don’t have a reference point for a factor of one quadrillion. We know what a trillion is – that’s the federal budget deficit. But a quadrillion is still a bit beyond our ken. What it means, though, is that a very, very large amount of energy transforms into a very, very small amount of matter and a very, very small amount of matter can transform into a very, very large amount of energy.
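To put the coefficient in concrete terms, here is a back-of-envelope calculation in Python. The one-gram figure and the household comparison are mine, not the essay’s:

```python
# E = m * c^2 for one gram of matter fully converted to energy.
c = 3.0e8            # speed of light, m/s
m = 1.0e-3           # one gram, in kg

energy_joules = m * c ** 2       # ~9e13 J
kwh = energy_joules / 3.6e6      # joules -> kilowatt-hours
households = kwh / 10_000        # ~10,000 kWh/yr per typical U.S. household

print(f"{energy_joules:.1e} J, enough for ~{households:.0f} households for a year")
```

A single gram, completely converted, would run on the order of a few thousand homes for a year.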

Perhaps the way to understand the significance of Einstein’s equation is to compare it to another equation, the formula for kinetic energy:

E = ½mv2

Kinetic energy is the energy of moving objects, “E” once again standing for energy, “m” indicating mass and “v” representing the velocity of the moving object. If you throw a baseball across a room, for example, its energy is calculated by multiplying one-half the mass of the ball times the square of its velocity – perhaps 50 miles per hour.
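The baseball example can be worked out directly. A sketch in Python, using the conventional one-half factor in the kinetic-energy formula and assuming a regulation 0.145 kg ball:

```python
# Kinetic energy of the thrown baseball: E = 1/2 * m * v^2.
m = 0.145           # regulation baseball mass in kg (my assumption)
v = 50 * 0.44704    # 50 mph converted to meters per second

energy = 0.5 * m * v ** 2
print(f"{energy:.1f} joules")
```

The answer lands in the tens of joules: everyday, human-scale energy.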

The two formulas are essentially identical in form. When brought into juxtaposition, two things emerge:

For any given amount of energy, mass and velocity are inversely related. For an identical amount of energy, the higher the velocity, the less mass is required, and vice versa.

When compared to the velocities of moving objects in nature – wind and water, for instance – the coefficient in Einstein’s equation is fifteen orders of magnitude larger – the same factor of one quadrillion.
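A quick Python check of that comparison, with rough flow speeds of my own choosing; the ratios land within rounding distance of the essay’s quadrillion:

```python
# Ratio of the two coefficients, c^2 versus v^2, for natural flows.
c = 3.0e8                                   # speed of light, m/s
for name, v in [("wind, ~10 m/s", 10.0),
                ("falling water, ~27 m/s", 27.0)]:
    print(f"{name}: c^2/v^2 = {c ** 2 / v ** 2:.0e}")   # ~1e14 to 1e15
```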

How is this manifested in everyday life? Most of what we are calling “renewable energy” is actually the kinetic flows of matter in nature. Wind and water are matter in motion that we harness to produce energy. Therefore they are measured by the formula for kinetic energy.

Let’s start with hydroelectricity. Water falling off a high dam reaches a speed of about 60 miles per hour, or 88 feet per second. Raising the height of the dam by 80 or more feet cannot increase the velocity by more than 20 miles per hour. The only way to increase the energy output is to increase the mass, meaning we must use more water.
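The diminishing returns follow from the free-fall relation v = sqrt(2gh): speed grows only with the square root of height, so quadrupling the height merely doubles the speed. A sketch, with drag ignored and the heights my own:

```python
import math

def fall_speed_mph(height_ft):
    """Ideal free-fall speed from height_ft: v = sqrt(2 * g * h), drag ignored."""
    g = 32.174                                      # ft/s^2
    return math.sqrt(2 * g * height_ft) / 1.46667   # ft/s -> mph

for h in (120, 240, 480):
    print(f"{h} ft: ~{fall_speed_mph(h):.0f} mph")
```

A drop of roughly 120 feet already yields the essay’s ~60 mph; quadrupling the drop to 480 feet only doubles the speed.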

The largest dams – Hoover and Glen Canyon on the Colorado River – stand more than 700 feet tall and back up a reservoir of 250 square miles. This produces 1000 megawatts, the standard candle for an electrical generating station. (Lake Powell, behind Glen Canyon, has silted up somewhat and now produces only 800 MW.)

Environmentalists began objecting to hydroelectric dams in the 1960s precisely because they occupied such vast amounts of land, drowning whole scenic valleys and historic canyons. They have not stopped objecting. The Sierra Club, which opposed construction of the Hetch-Hetchy Dam in Yosemite in 1921, is still trying to tear it down, even though it provides drinking water and 400 megawatts of electricity to San Francisco. Each year more dams are now torn down than are constructed as a result of this campaign.

Wind is less dense than water, so the land requirements are even greater. Contemporary 50-story windmills generate 1.5 MW apiece, so it takes 660 windmills to get 1000 MW. They must be spaced about half a mile apart, so a 1000-MW wind farm occupies 125 square miles. Unfortunately, the best windmills generate electricity only 30 percent of the time, so 1000 MW really means covering 375 square miles at widely dispersed locations.
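The turbine arithmetic can be retraced in Python. The essay rounds, so exact figures come out slightly higher, and the square-grid spacing model here is my own assumption:

```python
import math

# Wind-farm sizing from the essay's figures.
turbines = math.ceil(1000 / 1.5)   # 1000 MW at 1.5 MW per turbine
area = turbines * 0.5 ** 2         # half-mile grid spacing, sq mi per turbine
firm_area = area / 0.30            # 30% capacity factor -> land for a firm 1000 MW

print(turbines, round(area), round(firm_area))
```

Exact arithmetic gives 667 turbines and roughly 170 and 560 square miles; the essay rounds these to 660, 125, and 375.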

Tidal power, often suggested as another renewable resource, suffers the same problems. Water is denser than wind but the tides only move at about 5 mph. At the best locations in the world you would need 20 miles of coastline to generate 1000 MW.

What about solar energy? Solar radiation is the result of an E = mc2 transformation as the sun transforms hydrogen to helium. Unfortunately, the reaction takes place 90 million miles away. Radiation dissipates with the square of the distance, so by the time solar energy reaches the earth it is diluted by almost that same quadrillion-fold factor. Thus, the amount of solar radiation falling on one square meter is 400 watts, enough to power four 100-watt light bulbs. “Thermal solar” – large arrays of mirrors heating a fluid – can convert 30 percent of this to electricity. Photovoltaic cells are slightly less efficient, converting only about 25 percent. As a result, the amount of electricity we can draw from the sun is enough to power one 100-watt light bulb per card table.
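The per-square-meter arithmetic, using the essay’s own figures:

```python
# Usable electric watts per square meter, from the essay's numbers.
incident = 400                 # W of sunlight per square meter (essay's figure)
thermal = incident * 0.30      # thermal solar conversion -> W/m^2
pv = incident * 0.25           # photovoltaic conversion -> W/m^2

print(thermal, pv)             # PV yields about one 100 W bulb per square meter
```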

This is not an insignificant amount of electricity. If we covered every rooftop in the country with solar collectors, we could probably power our indoor lighting plus some basic household appliances – during the daytime. Solar’s great advantage is that it peaks exactly when it is needed, during hot summer afternoons when air conditioning pushes electrical consumption to its annual peaks. Meeting these peaks is a perennial problem for utilities and solar electricity can play a significant role in meeting the demand. The problem arises when solar enthusiasts try to claim solar power can provide base load power for an industrial society. There is no technology for storing commercial quantities of electricity. Until something is developed – which seems unlikely – wind and solar can serve only as intermittent, unpredictable resources.

There is only so much energy we can draw from renewable sources. They are limited, either by the velocities attained, or by the distance that solar energy must travel to reach the earth. So is there anyplace in nature where we can take advantage of that “c2” coefficient and tap transformations of matter into energy? There is one that we have used through history. It is called “chemistry.”

Chemical energy is commonly described in terms of “valences.” A sodium atom has a valence of +1, meaning it has a single electron in its outer shell that it readily gives up. Meanwhile, a chlorine atom has a valence of –1, meaning its outer shell is one electron short of being full. Together they “mate” to form sodium chloride (table salt). All chemical reactions are either “endothermic” or “exothermic,” meaning energy is either absorbed or released in the process. The Bunsen burner in chemistry class is a way of adding energy to a reaction. The other thing that can happen occasionally in chemistry lab is a sudden release of energy called an “explosion.”

The great achievement of 20th century quantum physics has been to describe chemical reactions in terms of E = mc2.

When we burn a gallon of gasoline, one-billionth of the mass of the gasoline is completely transformed into energy. This transformation occurs in the electron shells. The amount is so small that nobody has ever been able to measure it. Yet the energy release is large enough to propel a 2000-pound automobile for 30 miles – a remarkable feat when you think of it.
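The one-billionth figure can be sanity-checked against standard handbook numbers; the mass and energy-content figures below are mine, not the essay’s:

```python
# Mass converted to energy when a gallon of gasoline burns, via m = E / c^2.
c = 3.0e8
gallon_kg = 2.8          # mass of a gallon of gasoline, ~6.2 lb (my figure)
gallon_j = 1.3e8         # ~130 MJ of chemical energy per gallon (my figure)

mass_converted = gallon_j / c ** 2       # on the order of a microgram
fraction = mass_converted / gallon_kg    # order of one-billionth of the mass

print(f"{mass_converted:.1e} kg converted, fraction {fraction:.0e}")
```

The fraction comes out within a factor of two of the essay’s one-billionth, which is as close as round numbers get.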

Still, electrons make up only 0.01 percent of the mass of an atom. The other 99.99 percent is in the nucleus of the atom. And so the question arose, would it be possible to tap the much greater amount of energy stored in the nucleus the way we tap the energy in the electrons through chemistry?

For a long time many scientists doubted it could be done. Einstein himself was skeptical, saying that splitting an atom would be like “trying to hunt birds at night in a country where there aren’t many birds.” But other pioneering scientists – Enrico Fermi, George Gamow, Lise Meitner and Leo Szilard – discovered it could be done. By the late 1930s it had become clear that energy in unprecedented quantity could be obtained by splitting the unstable uranium atom.

Unfortunately, World War II pre-empted the introduction of nuclear power. This is a historical tragedy. The atom bomb stands in the same relation to nuclear energy as gunpowder stands to fire. While gunpowder has played an important role in history, fire’s role has been far more essential. Would we want to give up fire just because it led to guns? Yet the atom bomb continues to cast a shadow over the equally important discovery of nuclear energy.

The release of energy from splitting a uranium atom turns out to be 2 million times greater than breaking the carbon-hydrogen bond in coal, oil or wood. Compared to all the forms of energy ever employed by humanity, nuclear power is off the scale. Wind has less than 1/10th the energy density of wood, wood half the density of coal and coal half the density of octane. Altogether they differ by a factor of about 50. Nuclear has 2 million times the energy density of gasoline. It is hard to fathom this in light of our previous experience. Yet our energy future largely depends on grasping the significance of this differential.
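Chaining the essay’s ratios makes the gap vivid. Setting wind to 1:

```python
# Chaining the essay's energy-density ratios, relative to wind = 1.
wind = 1
wood = wind * 10          # wood ~10x wind
coal = wood * 2           # coal ~2x wood
octane = coal * 2         # octane ~2x coal; the essay calls the spread "about 50"
nuclear = 2_000_000       # nuclear vs. gasoline, per the essay

print(octane, octane * nuclear)
```

All conventional fuels span a factor of a few dozen; nuclear sits tens of millions of times above the bottom of that range.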

One elementary source of comparison is to consider what it takes to refuel a coal plant as opposed to a nuclear reactor. A 1000-MW coal plant – our standard candle – is fed by a 110-car “unit train” arriving at the plant every 30 hours – 300 times a year. Each individual coal car weighs 100 tons and produces 20 minutes of electricity. We are currently straining the capacity of the railroad system moving all this coal around the country. (In China, it has completely broken down.)
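The refueling arithmetic can be retraced from the essay’s figures; the per-car and per-train numbers are round, so they do not agree exactly:

```python
# Coal-train arithmetic for a 1000 MW plant, from the essay's figures.
hours_per_train = 110 * 20 / 60   # 110 cars x 20 min of output each
trains_per_year = 365 * 24 / 30   # one train every 30 hours
coal_tons_per_year = trains_per_year * 110 * 100   # 100 tons per car

print(round(hours_per_train, 1), round(trains_per_year),
      f"{coal_tons_per_year:,.0f} tons/yr")
```

That is roughly 292 unit trains (the essay rounds to 300) and over three million tons of coal per year for a single plant.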

A nuclear reactor, on the other hand, refuels when a fleet of six tractor-trailers arrives at the plant with a load of fuel rods once every eighteen months. The fuel rods are only mildly radioactive and can be handled with gloves. They will sit in the reactor for five years. After those five years, about six ounces of matter will be completely transformed into energy. Yet because of the power of E = mc2, the metamorphosis of six ounces of matter will be enough to power the city of San Francisco for five years.

This is what people find hard to grasp. It is almost beyond our comprehension. How can we run an entire city for five years on six ounces of matter with almost no environmental impact? It all seems so incomprehensible that we make up problems in order to make things seem normal again. A reactor is a bomb waiting to go off. The waste lasts forever; what will we ever do with it? There is something sinister about drawing power from the nucleus of the atom. The technology is beyond human capabilities.

But the technology is not beyond human capabilities. Nor is there anything sinister about nuclear power. It is just beyond anything we ever imagined before the beginning of the 20th century. In the opening years of the 21st century, it is time to start imagining it.

William Tucker is the author, most recently, of Terrestrial Energy: How Nuclear Power Will Lead the Green Revolution and End America’s Energy Odyssey.

See Energy Tribune post here.

Obama’s ‘battery story’ has yet to put a charge in the American public

By Anne E. Kornblut and Steven Mufson
Wednesday, July 14, 2010; 9:45 PM

After struggling to connect with voters on the economy over the last 17 months, President Obama is casting an unlikely hero as the new star of his narrative of redemption and recovery: the battery.

Obama is flying to Michigan on Thursday to attend the groundbreaking of an electric battery company that received $151 million in federal stimulus funding. It will mark his fourth battery-related trip as president, coming as the White House makes an aggressive push to tell what one senior official called “the battery story”: the tale of a small piece of technology that could affect daily life and spur employment if properly nurtured.

Obama’s grand vision for the battery — specifically, the advanced batteries that power plug-in hybrid and electric cars and trucks — is that it can become a new industry that both weans the United States off oil and provides a new manufacturing backbone.

The problem, however, is that the battery story has yet to occur, and might never. For now, it is just a promise. Skeptics argue that there will be insufficient demand for advanced batteries to sustain the U.S. factories now being built, and that such batteries are already being expertly produced abroad.

“The battery story is highly questionable,” said Menahem Anderman, founder and chief executive of Total Battery Consulting, who estimates that the global capacity to build car batteries in 2014 will be three times greater than the demand that year. “Basically, there’s really no proven market, neither electric vehicle nor plug-in hybrid electric vehicle. And there’s really no battery company in the United States that has a verified product.”

Meanwhile, Obama has tried telling this story before — in trips to North Carolina, Massachusetts, Missouri and California — but it has yet to capture the public imagination.

White House officials are hoping that changes this week, as they stage events in at least seven states to highlight jobs created by the electric vehicle industry, as well as a new Department of Energy report on the effectiveness of the Recovery Act, which devoted $2.4 billion to advanced battery production.

In a conference call Wednesday arranged by the White House, Michigan Gov. Jennifer Granholm (D) said that 62,000 battery industry-related jobs would be created in her state, 400 of them at the Compact Power plant in Holland, Mich., that Obama is visiting. Another 300 or so jobs in construction will be created in order to build the plant, administration officials said.

It is a compelling plot, in Obama’s view. “Just a few years ago, America had the capacity to build only 2 percent of the world’s advanced batteries for electric and hybrid cars and trucks,” the president told an audience at a recent fundraiser for Senate candidate Robin Carnahan in Missouri. “Today, thanks to our policies, thanks to a new focus on clean energy and the work taking place at plants like Smith Electric, in five years we could have as much as 40 percent of the world’s capacity to build these batteries — 40 percent. That means jobs right here in Missouri. It also means we’re developing the expertise in a sector that is going to keep building and growing and innovating far into the future.”

Yet Anderman, the battery expert, estimates that by 2015 the U.S. share of the world market will be no more than 10 percent. He notes that five Japanese and two Korean companies are years ahead of U.S. firms in manufacturing experience and research.

“The stimulus money was supposed to support something on order of 300,000 plug-in hybrid batteries a year by 2013,” Anderman said. “If GM can hold to its number, the market in the United States will be in the 40,000 range. And most of [the batteries for] it will come from an LG Chemical factory in Korea.

“I’m in the industry and want to see the industry succeed,” Anderman added. “There was just a lack of synchronization between the status of the technology and the rushing out to build plants. ” He said he fears that the drive could ultimately boomerang politically if many of the plants fail.

See Washington Post story here.


Peterson Institute’s Prediction of 203,000 Net Jobs Gained is Just More Spin

Link to Inhofe EPW Press Blog

“The first thing the intellect does with an object is to class it along with something else. But any object that is infinitely important to us and awakens our devotion feels to us also as if it must be sui generis and unique. Probably a crab would be filled with a sense of personal outrage if it could hear us class it without ado or apology as a crustacean, and thus dispose of it. ‘I am no such thing,’ it would say; ‘I am MYSELF, MYSELF alone.'” -William James, The Varieties of Religious Experience

We always eagerly await the next iteration of cap-and-trade legislation, for with it comes the inevitable refrain that “this time, it’s different.” Claims that cap-and-trade means fewer jobs, higher energy prices for consumers, a weaker economy: well, maybe for those other bills, advocates say, but not this one. The American Power Act, aka the Kerry-Lieberman bill, is deemed a special case because, according to one prominent Senate supporter, this time “we got the balance right.”

That same supporter claims that, unlike those other unbalanced cap-and-trade bills, the Kerry-Lieberman bill will actually create jobs: 203,000 jobs, in fact, according to a recent analysis by the Peterson Institute. Yet sadly for the bill’s authors, the bill is not sui generis; it’s fairly typical. Close scrutiny of the Peterson Institute study shows Kerry-Lieberman is no different than Waxman-Markey and every other failed version of cap-and-trade. Jobs will be lost and consumers will suffer.

According to the study, between 2011 and 2020, Kerry-Lieberman would actually kill 479,000 jobs. After tallying the jobs created from, among other things, “clean energy investment,” “adaptation,” and “energy efficiency,” the Institute subtracts those lost in key sectors of the economy because of Kerry-Lieberman. Consider the fossil fuel industry, which would lose 72,000 jobs because of “lower demand for fossil fuels and foregone construction of new fossil fuel power generating capacity. This includes direct, indirect, and induced jobs as well.” The study goes on: “We further subtract the jobs lost when households have less money to spend on other goods because energy has become more expensive.” The number subtracted? 305,000.

Then there’s Kerry-Lieberman’s “macroeconomic effects” caused by “changes in consumer demand.” “This includes,” the authors found, “changes in consumer demand for non-energy goods that are more expensive because of higher energy costs, reduction in investment in non-energy sectors because additional investment in power generation has pushed up interest rates, and changes in the US current account position resulting from a net increase in US investment demand.” Jobs lost: 102,000.
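The study’s arithmetic can be tallied directly from the figures quoted above:

```python
# Tallying the Peterson Institute figures quoted above.
losses = {
    "fossil fuel sector": 72_000,
    "reduced household spending": 305_000,
    "macroeconomic effects": 102_000,
}
net_gain = 203_000

total_losses = sum(losses.values())            # 479,000, matching the study
implied_gross_gains = net_gain + total_losses  # jobs created elsewhere

print(total_losses, implied_gross_gains)
```

The three loss categories sum to the study’s 479,000, implying 682,000 gross jobs created to reach the headline net of 203,000.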

As for those 203,000 net jobs created, the Peterson Institute has some interesting things to say. After the bill’s free allocation of emissions allowances phases out, “resulting in higher energy prices,” the “net effect” is that “after 2025, some of the employment gains in the first decade are clawed back, bringing the 2011-30 average back in line with business as usual.” The authors go on to note that, “While outside the window of this analysis, energy prices will likely continue to increase beyond 2030 as GHG abatement costs get higher.”

As supporters seek to advance yet another version of cap-and-trade, this time one confined to the utility sector, the Peterson Institute study seems to confirm the proverb: “the more things change, the more they stay the same.”

MIT Report Ignores Volatility in Natural Gas Prices

By Dr. Frank Clemente

In June, a group of Massachusetts Institute of Technology (MIT) researchers released a report on “The Future of Natural Gas.”(1) The basic thrust of the report is that the production of large amounts of shale gas at very modest prices will allow for the significant expansion of natural gas consumption — especially for power generation in the United States.

Unfortunately, the natural gas industry provided the money to write the report. While this alone doesn't invalidate the work, it certainly suggests that any flaws might be those of omission rather than commission. Put another way, the MIT group presented a risk-benefit analysis of natural gas but left the risk section to someone else. Last week we showed how the MIT report gave surprisingly short shrift to the increasingly apparent environmental impacts of shale gas production. In this week's edition we examine yet another key omission in their report — natural gas price volatility and how it affects families and businesses.

Volatility: It’s a Matter of Perspective

The Roller Coaster Ride of Industrial Gas Prices in Illinois over the Past Decade(4)

“The results for electricity from natural gas strengthened this conclusion: given the low and high prices of natural gas in recent years, [gas] can be one of the lowest cost – or one of the highest cost – sources of electricity.”
—National Research Council, 2009(5)

Families Bear the Brunt of Fluctuating Gas Prices

Natural gas price volatility over the past decade has severely impacted families. Not only have many lost jobs, but at the same time their energy bills have increased. Increased gas-based generation does not merely raise the cost of electricity; it also raises the price of gas for other consumers. During the past decade, gas prices not only averaged four times higher than coal prices but were far more volatile. In 2000, for example, residential gas prices started at $6.37/mcf but escalated to $13.74 in 2004. They then jumped to $20.24 in 2008 before dropping to $10.48 by the beginning of 2010.

Upwards of 60 percent of the homes in the United States are heated with natural gas, and many of these homes are concentrated in the Midwest and Mid-Atlantic states. During the winter their dependence upon natural gas is virtually total, since most lack alternative sources of heating; their need for natural gas thus rises dramatically as temperatures plummet.

States Especially Vulnerable to Swings in Natural Gas Prices

Planning a family budget under such variability becomes nearly impossible at lower income levels. The Low Income Home Energy Assistance Program (LIHEAP) and similar government programs are chronically overwhelmed with requests for help with utility bills when home heating costs surge. Millions of families cannot sustain the escalating electric rates and home heating costs caused by high and volatile natural gas prices: 21% of LIHEAP recipients are families with children under five; 37% are elderly; and 50% are disabled.(6)

Should We Go Further Down This Costly Road?

Over 90 percent of the new power plants built since 2000 depend on natural gas, and almost 50,000 additional megawatts of gas capacity will be added by 2013. The consequences of this rush to build out the natural gas infrastructure have been expensive indeed. Yet despite the adverse socioeconomic impacts of this increased dependence on natural gas for power generation, the MIT group recommends we go even further down this expensive and risky road. They conclude that "Coal generation displacement with NGCC generation should be pursued as a near-term option…" but virtually ignore the potential adverse societal, economic, and environmental impacts of deepening our dependence on gas.

It’s Not Different This Time

"… the degree of impact expanding unconventional domestic natural gas reserves [will] have on long-term gas prices and volatility is less than certain … The future of the market appears likely to include continuing periods of volatility and uncertainty." — California Energy Commission, 2009(7)


(1) MIT. The Future of Natural Gas, June, 2010
(2) Houston Chronicle
(4) Energy data based on files from EIA at
(5) America's Energy Future, NAS/NRC, 2009
(7) California Energy Commission, 2009

See Energy Weekly here.

See You Next Tyranny Day!

By Jonah Goldberg

According to New York Times columnist Thomas L. Friedman in his mega-selling book “Hot, Flat, and Crowded,” China banned plastic bags a few years ago. “Bam! Just like that — 1.3 billion people, theoretically, will stop using thin plastic bags,” he gushed. “Millions of barrels of petroleum will be saved, and mountains of garbage avoided.”

China’s got us beat, suggests Friedman, because its leaders aren’t hung up on democracy or checks and balances or any of the other dusty old impediments found in the American system. Friedman has proclaimed his envy for China’s authoritarian system countless times. It’s why he titled one of the chapters in his book “China for a Day.” The idea — he calls it his fantasy — is that if we could just be China for a day, the experts could impose by diktat what they cannot win through democratic debate.

If only the Founding Fathers had included an annual “Tyranny Day” in the Constitution. Every 364 days America could debate and scheme, pitting faction against faction, governmental branch against governmental branch, and on the 365th day the Supreme Soviet of the United States could simply “do things that are tough” and shove 10 pounds of policy awesomeness into democracy’s five-pound bag.

Now, just for the record, China hasn’t banned plastic bags. Just ask anybody who’s been to China recently. But what a strange thing to sell your soul for. What was it Thomas More said, “it profits a man nothing to give his soul for the whole world … but to ban plastic bags?”

Now, I bring all of this up for a couple reasons. The first is that I am mildly obsessed with Tom Friedman. He's easily one of the most influential columnists in America, and he routinely and blithely expresses his envy for a barbaric police state that has killed tens of millions of its own people. I think pointing that out is worth a little repetition.

But it’s also worth noting that Friedman is hardly alone. He may stretch his argument to the point of parody, but he shares a widespread view that the “experts” have all the answers and the “system” is holding them back.

Such arguments are as old as they are dangerous. And they are arrogant beyond description. People like Friedman automatically assume that their preferred policies are so obviously right, so objectively enlightened, that there’s no need to debate them or vote on them.

Such arguments are usually deployed to avoid valid criticisms, not because there are none. Indeed, the Obama White House virtually lives by such claims. All of the experts agreed that the stimulus would work; that Obama's version of healthcare reform was both necessary and popular; that weaning the U.S. from fossil fuels would create "green jobs." The evidence on all of these fronts is mixed or weak, and yet the president constantly insists that he doesn't want to hear from people who disagree with him on these issues because all the facts are in.

Such arrogance is dangerous. The literature on the unintended consequences of policies crafted by experts is at least as old as the field of economics. Frederic Bastiat, the great 19th-century economist, noted that all that separates the good economist from the bad is the ability to appreciate the possibility of the unforeseen. Nobel Prize-winning economist Friedrich Hayek demonstrated that healthy economies cannot be controlled by experts, because the experts will always have a "knowledge problem." They can never know all of the variables and never fully predict how their theories will play out in reality.

Right now Congress is debating a financial reform bill that simply commands that regulators predict when an unforeseen crisis will occur. This is like demanding regulators know when stocks will go up or down. If they knew that, they wouldn’t be regulators — they’d be billionaires.

But forget all that. Let's get back to those evil plastic bags. A new study from the University of Arizona reveals that reusable shopping bags, the enlightened replacement for plastic ones, are breeding grounds for E. coli and other dangerous bacteria. Roughly 50 percent of the bags inspected were found to contain dangerous, potentially lethal bacteria.

No, this doesn’t mean we should abandon reusable bags, let alone ban them too on next year’s Tyranny Day. People can clean the bags and solve the problem. That’s a hassle, to be sure. But that’s the point. There’s always going to be a downside to even the best policies, because the experts don’t know as much as they think they do. Sometimes, they don’t even know they’re not experts at all.

Read more here.

Goofy Green Investments Fueled ShoreBank’s Problems

A significant portion of ShoreBank Corporation's progressive vision is investment in "sustainability" and the creation of a "green" economy, which may be part of the reason the distressed lender is in need of a bailout. The lender is seeking millions of dollars from Wall Street firms so that it will then qualify for funds from the Troubled Asset Relief Program.

For example, ShoreBank has two sub-entities based in the Pacific Northwest: the FDIC-backed ShoreBank Pacific and the nonprofit ShoreBank Enterprise Cascadia. Both are institutions whose lending criteria are based upon progressively defined notions of "sustainability," and the bank itself is a partnership between ShoreBank Corp. and the environmental group Ecotrust. The bank's mission is to "profitably assist businesses, and through them their communities, to be sustainable in economic, social, and environmental practices." Here's how they explain their lending criteria:

…Unlike other banks, we are conscientious to whom we lend, and potential customers are “scored” using a program of sustainability criteria, including conservation, community development, and economy measures. Our scoring process should not be seen as a deterrent to the loan process, but rather as an active step toward improving sustainability from the “bottom up”. A healthy and sustainable business will translate into a healthy and sustainable community and economy.

The problem with "green" projects is that hardly any of them are economically viable on their own. They require huge government subsidies, tax breaks, or charitable "investment," or else they won't survive (or even be started in the first place). A recently announced manure-to-energy project in Lynden, Wash., is illustrative:

Andgar Corporation is the general contractor building the 1.5 million-gallon reinforced concrete digester. It will convert manure from 2,000 dairy cows into a methane biogas, providing enough renewable electricity to supply an estimated 500 houses. Capturing the methane also will reduce greenhouse-gas emissions by the equivalent of 7,000 metric tons of carbon dioxide annually, according to a press release from Farm Power….

The project is funded by a $1.1 million grant from the Washington State Energy Program, a $500,000 USDA Rural Development grant and a $2.4 million bank loan from Shorebank Pacific.

According to its Web site, Farm Power is a start-up company founded by Kevin and Daryl Maas "with no corporation behind us and no fancy marketing…just the two of us (and) a couple dozen local investors…." If the manure digester project were truly profitable for the "investors," it wouldn't need huge grants from state and local taxpayers in addition to a massive loan from a bank that is itself looking for a government bailout. Are judgment calls like this (and the lending guidelines behind them) what got ShoreBank Corp. in trouble in the first place?

In another dubious decision, the lender’s nonprofit organization in Cleveland found this start-up worthy of investment:

The smell of the lacquer at Access-O-Ride Technology in Tallmadge is as intense as the goals co-founders Gary Green and Alissa Harvey have for the newly opened fiberglass company that (Ohio) Gov. Ted Strickland visited July 1…

The Tacoma Avenue company opened with a few employees just days ago after receiving help from JumpStart Inc., a pilot project started by the Ohio Department of Development’s Minority Business Enterprise Division to help minority-owned businesses and firms.

After ShoreBank, a Cleveland-based non-profit organization, lent the company $100,000, Green said they could begin their business. Now they are set to produce at least 80 pieces of fiberglass products daily.

“JumpStart’s involvement gave ShoreBank a great confidence to invest,” said Strickland.

“This is the best thing that’s happened to us,” said Green. “We hope to grow and eventually have three shifts a day with 50 employees.”

So all it takes to get money from a ShoreBank institution is taxpayer subsidization, passion, and hope. Watch as ShoreBank Pacific CEO David Williams explains why they believed the company Flexcar, a vehicle sharing company, was worthy of investment:

Williams described part of ShoreBank’s lending criteria, saying “How is that wealth that’s generated recirculated within the community, rather than being extracted? It is important to us that you’re not owned by some business outside the region. The ownership is local and the money gets recirculated locally.”

So what happened? Flexcar merged and moved in with Boston-based Zipcar in late 2007 and closed its Seattle headquarters, prior to this 2009 ShoreBank video promoting local ownership! But as with the previous examples, all that matters in the progressive model of business is intentions, hope, and passion – and more government subsidies, which Flexcar enjoyed via public-private partnerships with local transit authorities.

In October 2007, The Washington Post reported the merger came "after years of losses by both companies":

“It’s a niche that wasn’t exploited by the larger traditional car-rental companies,” said Chris Brown, managing editor of Auto Rental News. “I don’t think it will ever eat into a huge percentage of the $20 billion U.S. car-rental market. It’s kind of like this little cult of users that are all in it together in this cool new system.”

Cults, hope, change…all befitting an institution worthy of a taxpayer bailout in the eyes of the Obama regime. It appears more and more every day that ShoreBank’s distress isn’t about real estate or economic downturns as much as it’s about the failure of liberal redistribution schemes.

Read more here.

The Economic Impact of an Offshore Drilling Ban

Published on July 1, 2010 by David Kreutzer, Ph.D. and John Ligon

In response to the BP oil leak, President Obama instituted a moratorium on deepwater (over 500 feet) drilling. Though a judge ruled against the moratorium, drilling has not restarted. In addition, though no official moratorium was issued for drilling in shallower water, the permitting process has slowed considerably.[1]

The President has raised questions about the long-term necessity for drilling.[2] Others would take this argument much further and ban all drilling offshore.[3]

To help policymakers evaluate the arguments for limiting or eliminating offshore drilling, this paper analyzes the economic impact of a total offshore drilling ban on the U.S. economy. The authors use a mainstream model of the U.S. economy to simulate a policy change that prevents new wells from being drilled and allows offshore production to decline as the current set of wells reach the end of their productive lives.

Nipping Expansion in the Bud

The Department of Energy’s Energy Information Administration (EIA) projects that daily petroleum production will rise 18 percent between 2010 and 2035 and that daily production from offshore wells (in the lower 48 states) will rise by over 40 percent.[4] EIA also predicts that offshore drilling will supply significant increases in natural gas production. While total natural gas production will rise 16 percent over the same period, offshore production of natural gas will rise 63 percent, at which time it will be nearly a fifth of total domestic production.[5]

The reserves of petroleum are projected to rise by 5 billion barrels—even after extracting 57 billion barrels over the period 2010–2035. This happens because improvements in technology and price increases make previously uneconomic deposits economically viable. Further, because exploration and development are costly, it makes little sense to incur the costs of finding and extracting reserves that will not be used for decades.

In short, petroleum can be a major energy source for many decades. Consequently an offshore drilling ban’s impact on the U.S. would be felt for decades.

For example, between now and 2035 an offshore drilling ban would:

Reduce GDP by $5.5 trillion;
Reduce the average consumption expenditures for a family of four by $2,381 per year (exceeding $4,000 in 2035);
Reduce job growth by more than 1 million jobs by 2015 and more than 1.5 million jobs by 2030; and
Increase the total expenditures for imported oil by nearly $737 billion.[6]

Effects on Consumer Prices

A permanent drilling ban would create a wedge between projected domestic oil production without the ban and the lower production levels with the ban in place. The lost petroleum output would have several impacts on the price of imported oil and thus consumer prices. For example, such a ban would necessitate the purchase of more imports to compensate for the lost domestic production. Because oil trades on world markets, this lost domestic production would cause world oil prices to rise—compounding the cost of the increased imports. The losses mount slowly, which means that the impact on oil prices and import costs will also mount slowly. The additional imported-oil cost exceeds $25 billion per year by 2018 and rises to over $45 billion per year by 2035.

Though, in percentage terms, the ban cuts domestic natural gas production half as much as domestic petroleum production, the price impact is greater because the natural gas market is predominantly regional, while the petroleum market is worldwide. Thus, there is less ability to buffer the domestic natural gas production cuts with additional imports. An offshore drilling ban, therefore, would likely lead to natural gas price increases of 10 percent by 2015, 23 percent by 2020, and 45 percent by 2035.

Since energy is a critical input for so many things, raising its cost will increase production costs throughout the economy. Though producers will pass most of the costs on to consumers, consumers will not be able to buy as much at these higher prices. Therefore, the higher energy prices cut the demand for all the other inputs, such as labor. As the higher costs for petroleum and natural gas ripple through the economy, there may be a few bright spots (such as suppliers of more energy-efficient capital goods), but the overall impact is decidedly negative.

An offshore drilling ban cuts domestic energy production, raises energy costs, and shrinks the nation’s economic pie. The broadest measure of economic activity, gross domestic product (GDP), drops $5.5 trillion over the period 2011–2035. Employment levels fall below those projected to occur without a ban in place. By 2020, employment would be 1.4 million jobs lower than without the ban. By 2030, the projected gap reaches 1.5 million jobs.

Of course, shrinking the economy makes families poorer. By 2020 the annual reduction in disposable income for a family of four exceeds $2,000. This lost income exceeds $3,000 per year in 2030 and is over $4,000 per year in 2035.

Pulling the Rug Out

Petroleum and natural gas play a vital role in the U.S. economy and are likely to remain critical to economic activity for decades to come. The Department of Energy expects offshore production to be a bigger supplier of the nation’s energy needs in the years ahead.

If a total ban on offshore drilling is implemented by 2011, then by 2035 Americans could expect national income (GDP) to drop by $5.5 trillion, total costs of imported oil to rise by $737 billion, total disposable income to decrease $54,000 per family of four, and job losses to exceed 1.5 million. A total ban on offshore drilling would pull the rug out from the economy’s incipient recovery.
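The per-year and cumulative income figures above are mutually consistent, which is easy to verify with a back-of-the-envelope calculation. The sketch below assumes (an assumption of this illustration, not a claim of the Heritage analysis) that the annual loss per family of four starts near zero in 2011 and ramps linearly between the anchor years the paper reports:

```python
# The text cites per-family annual disposable-income losses of roughly
# $2,000 by 2020, $3,000 by 2030, and over $4,000 by 2035, plus a
# cumulative 2011-2035 loss of $54,000.
# ASSUMPTION (ours): losses are near zero in 2011 and ramp linearly
# between the reported anchor years.
anchors = {2011: 0, 2020: 2000, 2030: 3000, 2035: 4000}  # dollars per year

years = sorted(anchors)
total = 0.0
for y0, y1 in zip(years, years[1:]):
    # Linearly interpolate the annual loss for each year in the segment
    for y in range(y0 + 1, y1 + 1):
        frac = (y - y0) / (y1 - y0)
        total += anchors[y0] + frac * (anchors[y1] - anchors[y0])

print(round(total))  # → 53500, within about 1% of the $54,000 cited
```

Under this simple linear-ramp assumption the per-year figures sum to roughly $53,500, close to the $54,000 cumulative total the study reports, so the headline numbers hang together.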

David W. Kreutzer, Ph.D., is Research Fellow in Energy Economics and Climate Change and John L. Ligon is Policy Analyst in the Center for Data Analysis at The Heritage Foundation.

See post here.