Monday, October 31, 2011

Batteries for Electric Vehicles

Advances in battery technologies over the past few years have been remarkable. When I was working on the section devoted to batteries and electric cars in 2008, nickel metal hydride (NiMH) batteries were the batteries available for cars like the Toyota Prius.  They cost about $1.00/Wh of capacity.  Manufacturers like Tesla and Fisker (the Fisker is actually a series hybrid, not an all-electric car), who were making vehicles targeted at the high-end market, were offering lithium ion batteries.  Li-ion batteries have a higher energy density and are more durable, and they were the battery of choice for many personal electronic devices such as cell phones, PDAs, and laptop computers.  However, they tended to be very expensive; I estimated their cost at about $2.50/Wh based on the information available at that time for smaller systems.  The price of Li-ion batteries has dropped dramatically in the last few years, to about $0.40/Wh, and they are now the battery of choice for electric vehicles. 
The all-electric Nissan Leaf has a 24 kWh Li-ion battery pack and a base price of $33,000 (before tax credits); for reference, the Toyota Prius has a 1.4 kWh battery pack.  Clearly, the batteries cost a lot less than $1,000/kWh.  Likewise, the new gas-electric hybrid Chevy Volt has a 16 kWh Li-ion battery and sells for $41,000.   It is sized to allow users a driving range of about 35 miles on electric power alone.  Battery lifespans have also improved significantly; the packs are expected to last 10 years and are guaranteed for 100,000 miles.  
These improvements have changed the economics of driving electric vehicles.  According to the US EPA, the Nissan Leaf consumes 34 kWh per 100 miles. At a grid electricity cost of $0.10/kWh, driving those 100 miles would cost only $3.40.  A comparably sized gasoline car getting 33 mpg would cost about $12.00 at the current price of gasoline ($4.00/gallon).  If we add the amortized cost of the battery ($9,600 over the 100,000-mile warranty), the cost of a 100-mile drive increases to about $13.00, comparable to the gasoline case, and not substantially higher as was the case a few years ago.
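The arithmetic behind this comparison can be sketched as follows, using only the figures quoted above (EPA rating, grid and gasoline prices, and the battery amortized over its 100,000-mile warranty):

```python
# Cost to drive 100 miles: Nissan Leaf (electric) vs. a 33-mpg gasoline car.
# All inputs are the figures quoted in the post.

kwh_per_100mi = 34               # EPA rating for the Leaf
electricity_cost = 0.10          # $/kWh grid electricity
electric_100mi = kwh_per_100mi * electricity_cost        # $3.40

mpg = 33
gas_price = 4.00                 # $/gallon
gasoline_100mi = 100 / mpg * gas_price                   # ~$12.12

battery_cost = 9600              # $ for the 24 kWh pack (~$400/kWh)
warranty_miles = 100_000
battery_per_100mi = battery_cost / warranty_miles * 100  # $9.60

print(f"Electricity only:        ${electric_100mi:.2f} per 100 mi")
print(f"Gasoline:                ${gasoline_100mi:.2f} per 100 mi")
print(f"Electric incl. battery:  ${electric_100mi + battery_per_100mi:.2f} per 100 mi")
```

At the old price of $2.50/Wh the same amortization term alone would have been $60 per 100 miles, which is why the economics only recently became comparable.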

Friday, October 14, 2011

Fukushima Disaster and Germany's Response

I am in Oviedo, Spain, attending the International Conference on Coal Science & Technology.  It’s good to reacquaint with friends and colleagues whom I have known for several decades.  My conversations with friends from Japan and Germany prompted me to write this post on nuclear power after the Fukushima disaster.
The massive earthquake and the awesome tsunami it engendered wreaked terrible havoc on the people of Japan, particularly those living in and around Fukushima.  About 15,000 people lost their lives to the tsunami, and hundreds of thousands were rendered homeless.  The Dai-ichi nuclear power plant shut down in response to the earthquake, but it still needed power to keep water flowing around the fuel rods.  Downed power lines meant that the plant could not receive power from plants elsewhere to keep the cooling water circulating, and since the tsunami had knocked out all the backup diesel generators, the plant had only a few hours of power from its batteries, not long enough to restore the power connection.  Bad things happen to light water reactors when the coolant is lost, and the Dai-ichi plant experienced a series of them: at least partial meltdown of the fuel rods, generation of hydrogen from steam reacting with hot metal, venting of gas pressure in the primary containment to keep it from blowing, buildup of hydrogen in the secondary containment, hydrogen explosions, and release of radioactive materials. 
The Scale of the Disaster
Each day brought news of a worsening situation.  As events unfolded, comparisons were made with previous nuclear accidents.  Initial reports placed this accident below or at the same level of severity as Three Mile Island, level 5 on the International Nuclear Event Scale (INES).  As more information became available, it was recognized that the radiation released from the Dai-ichi plant was much larger, and the incident was re-classified as a level 7 disaster, the same level as Chernobyl.  It turns out that 7 is the highest rating on the INES scale.
By placing Chernobyl and Fukushima at the same level, we lose the value of the INES scale, which relates to the radiation released and not to its effect on people or the environment. We need a different scale to rate disasters.  The situation is analogous to describing earthquake disasters using the Richter scale.  The scale measures the energy released by the ground movement, not the effect it has on structures and people.  The 1989 earthquake in San Francisco registered a magnitude of 6.8 and caused the death of 63 people and rendered about 4,000 homeless.  The Kobe earthquake of 1995, although of a similar magnitude, registering 6.8 on the Richter scale, was much more devastating.  It resulted in the death of about 6,400 people and rendered about 300,000 homeless. 
At Fukushima the primary reinforced containment did not fail, and held back most of the radioactive materials.  At Chernobyl there was no secondary containment, and when the only, un-reinforced, structure failed, particles from the core were dispersed into the environment by a large explosion. The cloud of radioactive materials was then carried by the wind over a large part of northern and western Europe.  In the Chernobyl disaster there were at least 47 fatalities from acute radiation, hundreds suffered radiation illness, and 600,000 people were exposed to low-level radiation, causing an estimated 4,000 additional cases of cancer (a lower-bound number). In the Fukushima incident there were explosions from hydrogen in the secondary containment structures, but they did not carry core particles. To date there has been no fatality from acute radiation, although a few brave workers have suffered radiation illness.  
Risk from Long-Term Exposure to Low-Level Radiation
Understandably, there is much concern about the effect of low-level radiation on the people living around Fukushima.  The fear of the unknown is palpable, and parents are mistrustful of assurances by government and industry spokespersons.  Large areas were evacuated, and radiation was detected in the food and water from the region.  A friend of mine who lives in the neighboring Ibaraki prefecture wrote to me asking for help in locating a Geiger counter so he could check the radioactivity in the food and milk his family is getting. A quick check on eBay showed many vendors selling Geiger counters and radiation dosimeters, and many listed options for "shipping to Japan."  Evidently my friend wasn’t the only one in Japan looking for this device.
I gained some perspective on the health risks of radiation while doing research for A Cubic Mile of Oil.  I hope this perspective will be of help to readers.  While there are health risks from radiation exposure, the anxiety about it also poses a health risk.  

An acute exposure of 1 Sv is barely noticeable; above that, people suffer from radiation sickness, manifesting in nausea and hair loss.  The LD50 for acute radiation is 3 Sv, and exposure to 10 Sv is generally lethal within a few days.  The workers at Fukushima were in danger of suffering from acute radiation illness; fortunately, their exposure levels were monitored and managed, and as far as I know, none of them developed symptoms associated with acute radiation poisoning.  The other concern is long-term exposure to low levels of radiation, which can lead to cancer.  This is the relevant concern for the people living in northern Japan. The consensus of health professionals is that an exposure of 0.25 Sv (or 250 mSv) increases the chance of contracting cancer by 1%.  This correlation is based on extrapolation from high-level exposures to very low levels, and does not allow for any self-correction or healing of the radiation damage by the body.  It therefore serves as an upper bound on the risk.

We are constantly bombarded by radiation from all manner of natural (and man-made) sources.  Living on earth entails exposure to about 3 to 4 mSv (millisieverts) per year, mostly from cosmic radiation and from the rocks around us.  Over a 100-year lifespan, a person would therefore be exposed to about 300 mSv of radiation, which would increase the person's chances of contracting cancer by a little more than 1%.  Now, cancer is a very common disease, afflicting about 20% of the population.  

I was looking at the radiation levels in the different prefectures of Japan on the Japan Times website.  In Fukushima the level is about 2.77 uSv (microsieverts) per hour; in Miyagi and Ibaraki it is less than 0.1 uSv/hr, and elsewhere about 0.05 uSv/hr.  A steady exposure of 0.1 uSv/hr would over a year amount to a dose of 876 uSv, or 0.88 mSv.  Over a 100-year lifespan, this dose would add up to 88 mSv.  Since 250 mSv increases the chance of contracting cancer by 1%, a dose of 88 mSv can be expected to increase the cancer risk for the individual by about 0.35%: from 20% to about 20.4%.
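The chain of arithmetic above, from an hourly dose rate to an increment in lifetime cancer risk, can be laid out explicitly. It uses the linear no-threshold assumption quoted earlier (250 mSv of exposure adds 1% to cancer risk), which, as noted, is an upper bound:

```python
# Lifetime cancer-risk increment from a steady low-level dose.
# Assumption from the post: 250 mSv of exposure -> +1% cancer risk (linear,
# no-threshold extrapolation; an upper bound on the true risk).

hours_per_year = 24 * 365                  # 8,760 hours
dose_rate_usv_per_hr = 0.1                 # reading for Miyagi/Ibaraki

annual_dose_msv = dose_rate_usv_per_hr * hours_per_year / 1000   # 0.876 mSv/yr
lifetime_dose_msv = annual_dose_msv * 100                        # 87.6 mSv over 100 yr

added_risk_pct = lifetime_dose_msv / 250   # percentage points, ~0.35

print(f"Annual dose: {annual_dose_msv:.2f} mSv; 100-yr dose: {lifetime_dose_msv:.0f} mSv")
print(f"Added lifetime cancer risk: ~{added_risk_pct:.2f}% (on a ~20% baseline)")
```

Running the same calculation with Fukushima's 2.77 uSv/hr reading, rather than the 0.1 uSv/hr used here, is left to the reader; the linearity of the model makes it a simple scaling.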

There are a lot of statistics here, and I am aware that statistics offer little solace when one is considering the health of near and dear ones.  However, I sincerely hope this information allays some of the concerns by placing the risks in perspective.  Remember that anxiety is also deleterious to one’s health, as it increases the risk of cardiac illness.  

Germany’s Response
Soon after the Fukushima disaster, Germany announced it would phase out nuclear power. I wondered where the replacement energy would come from. Would it be a serious commitment to renewables? What storage technology would they deploy to allow for that?  Germany has installed over 44 GW of wind and solar capacity, and in 2010 generated about 96 TWh of renewable power out of a total electricity generation of 621 TWh. Commentators opined that Germany would look to importing natural gas from Russia and coal from the Czech Republic to make up for the 140 TWh contributed by nuclear power.  The increased use of fossil sources would run counter to the goals of CO2 reduction.  However, Germany also exports a net of about 20 TWh of electricity annually. Could it be that between cutting back on exports and employing conservation strategies, Germany could avoid increasing its consumption of fossil fuels?
Based on the data for the first half of 2011 from Germany’s Bureau of Statistics, which show that Germany was still a net exporter of electricity, Paul Gipe (Bloomberg, Sep. 27, 2011) tried to dispel the notion that Germany will be relying on increased imports.  I think it is premature to judge the net effect of Germany’s decision to close the nuclear reactors. Compared to the first half of 2010, when Germany exported nearly 11 TWh more electricity than it imported, in the first half of 2011 it sold only 4 TWh more than it bought.  Further, during half of the first half of 2011, the nuclear reactors were still operating. We will get a better sense of the impact when the final figures for 2011 are published.  Stay tuned.

Friday, October 7, 2011

Bio-sourced parts for Ford Focus

I recently saw an announcement that Ford and BASF have teamed up to make the foam for the instrument panel of the Ford Focus from castor oil.  The material allows them to offer the industry's first seamless soft-touch instrument panel, which is stronger and better-looking.  All that is good!  However, the quote that caught my eye was "...Finding a sustainable product that saves more than 5,000 barrels of oil for every 300,000 Ford Focus models produced in North America is a very exciting solution for all of us."

Wow!  That's 0.7 gallons per car, when each car is going to consume an estimated 10,000 gallons of fuel over its life.  They are getting excited about savings of 70 ppm!  Their emphasis on the displacement of petroleum is either misplaced, or it is a ploy to attract unsuspecting customers.
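For anyone who wants to check the 70 ppm figure, the arithmetic is short (42 gallons per barrel; the 10,000-gallon lifetime fuel estimate is from the post):

```python
# Sanity check on the "5,000 barrels per 300,000 cars" savings claim.
GALLONS_PER_BARREL = 42

saved_per_car = 5000 * GALLONS_PER_BARREL / 300_000   # 0.7 gallons per car
lifetime_fuel = 10_000                                # gallons, estimated lifetime fuel use
fraction = saved_per_car / lifetime_fuel              # 7e-5

print(f"{saved_per_car:.1f} gal saved per car = {fraction * 1e6:.0f} ppm of lifetime fuel use")
```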

Wednesday, October 5, 2011

Re-framing the debate on energy supply

It is time to re-frame the debate about the future energy supply, arguably the biggest challenge we face. This challenge has often been portrayed as a tension between the moral imperative of protecting the environment on the one hand and preserving the economic interests of the energy industry on the other. This simplistic view misses the more difficult challenge we face: namely, balancing the tension between protecting the environment, which would require us to turn off the use of fossil fuels, and the equally important call for social justice to provide people around the world with sufficient and affordable energy so they can all live healthy, productive lives. Meeting the global demand for energy is going to be a daunting challenge, and the way we choose to do it, namely the energy sources we choose to employ, will have a profound effect on the lives of millions, nay billions, of people. There are choices to be made, and the public at large must get engaged in making them.

To many in the sustainability community, fossil energy is anathema. Continued use of fossil resources (oil, coal, natural gas) poses threats to the environment through the emission of pollutants and greenhouse gases.  The fact that they are a limited, exhaustible resource means that in the future we could either run out of them, or their extraction will get progressively harder to the point that it takes more energy to extract them than would be derived from their use.  Using fossil energy is clearly not sustainable, and the world will have to look to renewable resources for long-term survival.  

While it is true that increased use of coal only exacerbates the global struggle to curb carbon dioxide emissions, the moral imperative to protect the environment has to be balanced by the equally strong moral imperative of providing energy to enable people to live healthy, productive lives.  Between 1981 and 2005, China increased its use of coal four-fold, but over the same period it also lifted about 400 million people out of poverty.  

I recently spoke with Martin Wasserman, co-producer and host of Future Talk.  A ten-minute segment of that conversation can be viewed here.  

Deepwater Horizon Accident

Major news of 2010 was the accident at the Deepwater Horizon oil well in the Gulf of Mexico.  A disastrous fire engulfed the oil platform when a surge of gas issued from the well. Eleven workers died in the accident, and oil gushed out from the broken pipe at the sea floor. Video of the oil pouring out provided dramatic images that were flashed all over the media. The actual quantity of the spill was difficult to ascertain initially, and estimates ranged from 10,000 to 100,000 barrels per day. After the fact, it was determined that the maximum rate of the spill was about 62,000 barrels a day, and that over the three-month period of the spill 4.9 million barrels of oil had poured out. Over 600 miles of coastline were affected.  Fishing and tourism, major industries of the region, suffered enormous losses, and it is uncertain how long it will be before those operations regain a sense of normalcy.
Deepwater Horizon is only one of several major events in which large amounts of oil were discharged into seas and oceans. Since 1978 four major events have occurred:
·      The breaking up of Amoco Cadiz off the coast of Brittany, France in March 1978 spilling about 1.7 million barrels of oil.
·      Ixtoc 1 oil well disaster in the Bay of Campeche releasing approximately 3.4 million barrels of oil over nine months beginning June 3, 1979.
·      The running aground of Exxon Valdez in Prince William Sound on March 24, 1989 resulting in a spill of 0.23 million barrels.
·      Sabotage by retreating Iraqi forces in January 1991 resulting in a spill of over 7.1 million barrels.
Each of these incidents differs in significant ways: the amount of oil discharged, whether it was released at the surface or under sea, the local ecology, and environmental conditions such as water currents and temperature. Yet, tragic as these events have been for the people and animals directly affected, they also provide a strong testament to the resilience of the environment, which in each case recovered within three to five years.
The second of these, Ixtoc 1, initiated on June 3, 1979, is eerily similar to the Deepwater Horizon disaster: it too entailed the blowout of an exploratory well in the Gulf of Mexico, releasing oil under sea and not far from a prosperous fishing community. Many of the unsuccessful techniques employed to mitigate the disaster then were also used at Deepwater Horizon, with equally unsuccessful outcomes. The blowout preventer failed in both instances. Dispersants, skimmers, and booms were used then, as now, to reduce the impact of the oil spill. The Ixtoc 1 disaster was finally contained when a relief well was drilled; it took about nine months to complete and bring the flow of oil under control. The relief wells at Deepwater Horizon are expected to take less time because of improvements in remote undersea manipulation and drilling technologies; we are now able to drill deeper, faster, and more accurately.
While most of the oil released from Ixtoc 1 flowed toward the open ocean to be dispersed in various ways, some oil flowed along the coastline, severely damaging the environment along the Mexican shoreline and affecting the coastal fisheries. Special efforts were made to capture and relocate a small group of unique sea turtles; they survived and have since been restored to their former grounds, where they are in good shape. Surveys conducted in 1981 showed that the shore vegetation had revived. Within three to four years the fisheries had regained most of their productivity, although residues of oil can be found even today deep below the surface.
The first and third incidents, Amoco Cadiz and Exxon Valdez, were shipping accidents resulting in rapid discharges of large volumes of oil at the surface. The latter spilled a relatively smaller volume of oil, but it did so in a very sensitive area that was the location of commercial and sport salmon fisheries as well as the habitat of many varieties of wildlife. The spill covered about 1,300 miles of coastline and 11,000 square miles of sea with varying amounts of oil. Various cleanup techniques were applied in both cases, none of which were particularly successful. Perhaps 10% of the oil was recovered; the rest was carried out to sea or deposited on the channel floor.
In both these spills, high-pressure water cleaning was used. It proved effective when applied to gravel and rock surfaces, but less so where vegetation was fouled. Natural remediation in the affected marine environments seemed to work best when left alone, and the fouled areas returned to life within three to five years of the spill, with attendant resumption of the previously normal commercial and recreational activities. History suggests that if proper procedures are followed, there will be a quick recovery of the usual activities along the currently affected coasts of Louisiana, Mississippi, Alabama, and Florida.
The fourth was a deliberate act of sabotage committed by Iraqi forces as they retreated from Kuwait on January 23, 1991.  Oil was released from storage tanks and tankers.  The amount released into the sea was estimated to range from two to six million barrels. Apparently, no effort was made to stop the flow or clean up the residues. Three years later a survey found little ecological damage, but later surveys reported significant residual ecological damage. 
For perspective, we note that about 1 million barrels of oil naturally seep into the Gulf of Mexico every year. The US Gulf region, like the southern California coast, has been contaminated by tar since prehistoric times. Modern dwellers on the coast are accustomed to using kerosene or similar solvents to remove tar from their seagoing equipment and their bodies after their sea-oriented work or play.
Based on the collective experience from these earlier disasters, we can expect the Gulf of Mexico to also recover within a few years. The Food and Drug Administration has recently conducted tests on seafood from the Gulf of Mexico for contaminants and has found few problems. Recent reports of economic recovery in the region have also been encouraging, although the jury is still out on the long-term prospects.

Updating the global energy scene

In A Cubic Mile of Oil, we described the pattern of global energy use based on 2006 numbers, and those numbers were culled from many sources.  To look at short-term trends since the publication of the book, I will rely on a single source, the BP Statistical Review of World Energy, which uses a consistent set of definitions for comparing energy sources.
According to the BP Statistical Review of World Energy, total primary energy consumption in 2006 was 2.93 CMO.  It grew in the two subsequent years to 3.05 CMO, but in 2009 total energy consumption dropped to 3.01 CMO.  In 2010 it rebounded to a new high of 3.18 CMO, an 8.2% increase over the 2006 level.  The bulk of this increase came from increased consumption of coal and natural gas.  Coal use increased from 0.84 CMO to 0.94 CMO, and natural gas from 0.68 CMO to 0.76 CMO.  Over the same period, oil consumption increased from 1.04 CMO to 1.07 CMO. Production of electricity from nuclear energy remained essentially constant at 0.17 CMO, while production from wind, solar, and geothermal increased by 71%, from 0.027 CMO to 0.047 CMO.
[Table: installed capacity (GW) and % increase, including solar PV; the data did not survive in this copy.]
The reserves of conventional oil increased from 46 CMO in 2006 to 48 CMO in 2010; together with oil sands, the proved reserves currently amount to 53 CMO, with a reserves-to-production ratio of 49 years.
Oil and Gas Discoveries in Israel
Within a few months of each other, two reports of discoveries of oil and gas resources in Israel made the headlines.  In 2009, natural gas was discovered off-shore in the Tamar field of the Mediterranean Sea, near waters straddling the coasts of Israel, Lebanon, and Cyprus.  The deposit is estimated to contain 16 trillion cubic feet of natural gas and an additional 4.2 billion barrels of shale oil.  Shortly thereafter, an on-shore shale deposit was discovered in northern Israel with an estimated potential for producing 260 billion barrels of oil.  A WSJ article described these findings and wondered if Israel could become an energy giant.

The comparison with Saudi Arabian oil reserves needs to be understood in the context of the special meaning of "reserves."  Reserves are like money in the bank; unconventional resources are more like the earning potential of a high school graduate choosing the right college.  You cannot call a hydrocarbon accumulation a "reserve" unless you have a proven technology to recover that resource economically at today's prices.  This definition is part of SEC regulations. 

The article, like most in the mainstream media, switches between barrels of oil and trillion cubic feet (tcf) of gas.  Let me translate them all into my favorite unit, cubic miles of oil (CMO).  Recall that current global consumption of oil is 1 CMO/yr, and of natural gas 0.6 CMO/yr.

·      Saudi Arabia oil reserves, 250 billion barrels = 9.4 CMO
·      Israel oil shale resource, 260 billion barrels = 9.8 CMO
·      US shale gas resource, 2,500 tcf = 16.3 CMO
·      Israel's major offshore gas find, 16 tcf = 0.1 CMO
·      US shale oil resource, 1,500 billion barrels = 56.6 CMO
·      China shale oil resource = 13.4 CMO
·      Global conventional oil reserves = 46 CMO
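The conversions above can be reproduced with two factors: one CMO is roughly 26.5 billion barrels of oil, and for natural gas roughly 153 tcf carries the energy of one CMO. (These factors are back-derived from the numbers in the list, so treat them as approximations consistent with the post rather than authoritative constants.)

```python
# Converting headline resource figures into cubic miles of oil (CMO).
# Conversion factors inferred from the figures quoted in the post; approximate.

BBL_PER_CMO = 26.5e9      # barrels of oil in one cubic mile
TCF_PER_CMO = 153         # tcf of natural gas with the energy of one CMO

def barrels_to_cmo(billion_barrels):
    """Billions of barrels of oil -> CMO."""
    return billion_barrels * 1e9 / BBL_PER_CMO

def tcf_to_cmo(tcf):
    """Trillion cubic feet of gas -> energy-equivalent CMO."""
    return tcf / TCF_PER_CMO

print(f"Saudi reserves, 250 Gbbl:   {barrels_to_cmo(250):.1f} CMO")   # ~9.4
print(f"Israel oil shale, 260 Gbbl: {barrels_to_cmo(260):.1f} CMO")   # ~9.8
print(f"US shale gas, 2,500 tcf:    {tcf_to_cmo(2500):.1f} CMO")      # ~16.3
print(f"Tamar gas find, 16 tcf:     {tcf_to_cmo(16):.2f} CMO")        # ~0.10
```

Seen this way, the Tamar find, headline-grabbing though it is, amounts to about two months of global gas consumption at the 0.6 CMO/yr rate noted above.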

Global unconventional oil resources are estimated at between 200 and 400 CMO; much of it is in Venezuela and Canada (~40 CMO each), but there are also significant amounts in the US, China, Russia, Estonia, Congo, and, yes, also the Middle East/North African countries.