Featured stories in this issue…

Sustainability Redefined

The 1987 definition of sustainability went like this: “Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.” That was a fine definition, but now a new one has been proposed: “Learning to live off the sun in real time.”

Solar Grand Plan

“Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels…. By 2050 solar power could end U.S. dependence on foreign oil and slash greenhouse gas emissions.”

Silent Streams?

The U.S. Geological Survey (USGS) reports that nearly 40% of freshwater fish species in North America are now imperiled, a 92% increase since 1989.

Toddlers Absorb More Toxic Chemicals Than Mothers

A new study has found that in 19 of the 20 families, concentrations of flame retardants were significantly higher in children than in their mothers. In all, 11 different types of flame retardants were found in these children.

Secret Pentagon-sponsored Report Warned of Global Warming in 1979

A shadowy scientific elite in the Pentagon, code-named Jason, warned the U.S. about global warming 30 years ago but was sidelined for political convenience.

New German Facility Begins Testing CO2 Sequestration

For the first time ever, a coal-fired power plant (in Germany) has actually begun pumping pressurized liquid carbon dioxide into the ground, hoping it will stay there forever. The operation is tiny, experimental, expensive — and an end-of-pipe solution — but the future of the coal industry hangs on experiments such as this one being declared a “success.”

Why the Environmental Movement Cannot Prevent Catastrophe

Environmentalism is almost as compromised as the planet itself, argues Gus Speth in his new book. He faults the movement for relying on market incentives to achieve environmental ends and for the illusion that sufficient change can come from engaging the corporate sector and working “within the system,” rather than from enlisting the support of other activist constituencies.

[Rachel’s introduction: The 1987 definition of sustainability went like this: “Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.” That was a fine definition, but now a new one has been proposed: “Learning to live off the sun in real time.”]

SUSTAINABILITY REDEFINED

By Peter Montague

Chemical & Engineering News (C&EN) is the weekly voice of the American Chemical Society, which is the professional association for academic and industrial chemists. This high-quality magazine lies near the heart of the establishment and — like the Wall Street Journal — it hires some of the top writers in the business because many of its readers are elite decision-makers who need the best information available, whether it be good news or bad.

The August 18 issue of C&EN was devoted to “sustainability.” In it, editor-in-chief Rudy M. Baum pointed out that a sea change has occurred just in the past two years. He says humans passed a “tipping point” in about 2006. A “tipping point” occurs when something fundamental changes in a way that speeds up further change and/or makes change permanent.[1]

Baum writes, “In the case of humanity’s relationship to Earth, a tipping point appears to have occurred in 2006. In what seems to have been the blink of an eye, a shift in attitude occurred. On one side of the divide, people in general expressed concern, but not alarm, over the state of the environment. On the other side of the divide, past the tipping point, a consensus emerged that human actions were having a serious negative impact on the global environment. The consensus was embraced by scientists and nonscientists and, remarkably, by a large swath of corporate America.”

Community activists who struggle against toxic corporate behavior may doubt that “a large swath of corporate America” really accepts that “human actions are having a serious negative impact on the environment” — but it does seem true that important segments of the public have become convinced. This is new. This is big.

Baum continues: “What is clear is that humans need to change their relationship to Earth. No resource is infinite. There are enough of us, more than 6 billion, and we are clever enough that our activities are impacting the global environment. How is it that we can ever have imagined otherwise?”

It is as if Baum has just awakened from a pleasant dream and is realizing for the first time that we are all facing a harsh reality.

He then repeats the original definition of “sustainable development” from the 1987 “Brundtland Report,” formally titled “Our Common Future”:

“Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.” And, he says, “That is as good a definition of sustainable development as one will find.”

And he offers his own interesting new definition of sustainability: “Learning to live off the sun in real time.” He says, “Although sustainability is not only about energy, it is largely about energy.”

Then he dives into a brief history of humankind and of civilization. He points out that for aeons humans lived off the sun in real time. Then the discovery of coal, and later oil, powered the development of industrial society: “The extraordinary productivity of the past 150 years has largely been fueled by fossilized sunshine.” Then he says, “This has to change for two reasons.”

Reason No. 1: Fossil fuels are finite: “One can argue,” Baum writes, “whether we have already reached ‘peak oil’ — the point at which half of all the oil that ever will be discovered has been discovered and supply, while far from exhausted, will inevitably begin to decline — or whether we will reach it in 10, 20, or 30 years. The point is, we will reach peak oil. (Certainly,” Baum continues, “the current remarkable run-up in crude oil prices is consistent with what will occur when peak oil is reached.) Yes, there are vast reserves of petroleum locked in oil shale and tar sands, and yes, there’s enough coal out there to power society for 200 years, but extracting these resources will take a terrible toll on the landscape of Earth. At what point are we going to say, enough?”

Reason No. 2: Global warming. “The gigatons of carbon dioxide humans are pumping into the atmosphere as if it were a giant sewer are causing the climate to change. That’s no longer in dispute,” Baum writes.

But then suddenly Mr. Baum seems to slip back into the pleasant dream of yesteryear: his solution to our energy (and sustainability) problems — which he still calls “living off the sun in real time” — is nuclear power.

This is jarring because both U.S. and world supplies of uranium are finite and limited. Baum backhandedly acknowledges this by saying, “Energy efficiency and conservation will play important roles, but so will vastly expanded use of nuclear energy, including breeder reactors to enormously expand the supply of nuclear fuel.” So, uranium by itself will run out — most likely sooner than coal will run out — but we can “enormously expand the supply” of atomic fuel with breeder reactors. Mr. Baum doesn’t say so, but breeder reactors don’t breed uranium, they breed plutonium, the preferred raw material for rogue A-bombs.

Mr. Baum does acknowledge that his plan entails some difficulties — he calls them “complexities” — like “building safe breeder reactors, secure handling of plutonium, [and] responsible disposal of the remaining waste.” Complexities indeed.

Leaving aside the morally indefensible plan to bequeath tons of highly radioactive waste to our children to manage forever, humans haven’t devised a solution for the slow march of nuclear weapons across the globe — except of course to ban the manufacture of all raw materials for such weapons. This would require ending nuclear power globally, forever.

Item: Pakistan has nuclear weapons (which it developed from nuclear power reactors) and is supposedly a strong ally of the U.S. But Dexter Filkins reported this week in the New York Times Magazine that Pakistani soldiers sometimes shoot at American soldiers who are hunting fundamentalist Muslims along Pakistan’s border with Afghanistan. Filkins says “one of the more fundamental questions of the long war against Islamic militancy, and one that looms larger as the American position inside Afghanistan deteriorates [is]: Whose side is Pakistan really on?” Read the Filkins piece — an amazing feat of reporting — and you’ll see it’s a fair question.

Item: Last month President Bush authorized U.S. troops to begin military raids onto Pakistani soil — without asking Pakistan’s permission — to try to kill Taliban fundamentalists there. Announcing the President’s decision, the N.Y. Times wrote, “The new orders for the military’s Special Operations forces relax firm restrictions on conducting raids on the soil of an important ally without its permission.” The next paragraph in the Times story says, “Pakistan’s top army officer said Wednesday that his forces would not tolerate American incursions like the one that took place last week and that the army would defend the country’s sovereignty ‘at all costs.’” This is sounding more and more like the beginning of a new war — one with a nuclear-armed “ally” who also seems to be an ally of the Taliban.

The Taliban would like nothing better than to get their hands on a Pakistani A-bomb, deliver it to us on a cargo ship, and detonate it near the Statue of Liberty or beneath the Golden Gate Bridge. It would end the American experiment in democracy, almost certainly.

Item: This same week President Bush won approval from 45 nations for his plan to allow India — Pakistan’s blood enemy — to buy and sell nuclear materials on the global market, thus negating the Nuclear Non-Proliferation Treaty that had been in force for decades but which India has steadfastly refused to sign. Nuclear experts warn that Mr. Bush’s decision could lead to a nuclear arms race in Asia. Congress has yet to approve the deal, but Mr. Bush is now working to get its “fast track” approval.

Item: And this week, too, a writer in the New York Times pointed out that, “Many proliferation experts I have spoken to judge the chance of a detonation [of an A-bomb by Al Qaeda, or a Qaeda imitator on U.S. soil] to be as high as 50 percent in the next 10 years. I am an optimist, so I put the chance at 10 percent to 20 percent. Only technical complications prevent Al Qaeda from executing a nuclear attack today. The hard part is acquiring fissile material; an easier part is the smuggling itself (as the saying goes, one way to bring nuclear weapon components into America would be to hide them inside shipments of cocaine).”

Even if the optimistic view is correct — that the chance of a rogue A-bomb explosion in New York Harbor, or beneath the Golden Gate Bridge, is “only” 10% or 20% per decade — how many decades does that give us before the probability approaches 100%?
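
The arithmetic behind that question can be sketched directly, treating each decade as an independent trial with the probabilities quoted above (an illustrative simplification, not a risk model):

    # Probability of at least one detonation after N decades, assuming an
    # independent 10% or 20% chance per decade (the figures quoted above).
    for p_per_decade in (0.10, 0.20):
        for decades in (1, 3, 5, 10):
            p_cumulative = 1 - (1 - p_per_decade) ** decades
            print(f"{p_per_decade:.0%}/decade after {decades} decades: {p_cumulative:.0%}")

Even the optimistic 10% figure passes even odds within about seven decades; at 20% per decade, the cumulative chance exceeds two thirds within five.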

No, if humans are to survive, then “Learning to live off the sun in real time” cannot mean powering global civilization with plutonium-breeding nuclear reactors. It must mean really living off the sun in real time.

Luckily, that goal seems more realistic with each passing week. In this issue of Rachel’s we carry a story from Scientific American magazine that estimates we could derive 35% of our total energy (and 69% of our electricity) from sunlight by 2050 — and 90% of our total energy from the sun by 2100. And it would require a federal subsidy far smaller than we have so far committed to the Iraq war. Of course, if we felt the need were really urgent, we could get there even faster. That’s a new “tipping point” we can all work together to achieve.

==============

[1] Baum says “a tipping point occurs when some parameter reaches a value where various feedback loops come into play and further change in the parameter becomes radically more rapid and/or permanent.” He gives the example of carbon locked in the Arctic permafrost. At some point, rising temperatures in the Arctic will thaw the permafrost, releasing large amounts of carbon dioxide into the atmosphere, thus creating warmer conditions, in turn releasing more carbon from the permafrost… until?
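
Baum’s definition maps onto a simple numerical sketch (a deliberately crude illustration, not a climate model): so long as each increment of warming triggers less additional warming than itself, the total converges; once the feedback matches or exceeds the trigger, change runs away.

    # Toy positive-feedback loop: each unit of warming releases permafrost
    # carbon that causes `feedback` times as much additional warming.
    def total_warming(trigger, feedback, steps=60):
        total, increment = trigger, trigger
        for _ in range(steps):
            increment *= feedback
            total += increment
        return total

    print(total_warming(1.0, 0.5))  # converges toward 2.0 (stable)
    print(total_warming(1.0, 1.1))  # grows without bound (a tipping point)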

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Scientific American
January 1, 2008

SOLAR GRAND PLAN

[Rachel’s introduction: “Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels…. By 2050 solar power could end U.S. dependence on foreign oil and slash greenhouse gas emissions.”]

By Ken Zweibel, James Mason and Vasilis Fthenakis

High prices for gasoline and home heating oil are here to stay. The U.S. is at war in the Middle East at least in part to protect its foreign oil interests. And as China, India and other nations rapidly increase their demand for fossil fuels, future fighting over energy looms large. In the meantime, power plants that burn coal, oil and natural gas, as well as vehicles everywhere, continue to pour millions of tons of pollutants and greenhouse gases into the atmosphere annually, threatening the planet.

Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels. Our analysis convinces us that a massive switch to solar power is the logical answer.

Solar energy’s potential is off the chart. The energy in sunlight striking the earth for 40 minutes is equivalent to global energy consumption for a year. The U.S. is lucky to be endowed with a vast resource; at least 250,000 square miles of land in the Southwest alone are suitable for constructing solar power plants, and that land receives more than 4,500 quadrillion British thermal units (Btu) of solar radiation a year. Converting only 2.5 percent of that radiation into electricity would match the nation’s total energy consumption in 2006.
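
A quick sanity check of that claim, using only figures from the paragraph above plus the roughly 100 quadrillion Btu of total 2006 U.S. consumption the authors cite later:

    # 2.5% of the Southwest's annual solar radiation vs. US consumption
    solar_btu = 4_500e15            # Btu/yr striking suitable SW land
    captured = solar_btu * 0.025    # convert 2.5% to electricity
    us_2006_btu = 100e15            # ~100 quadrillion Btu consumed in 2006
    print(captured / 1e15)          # 112.5 quadrillion Btu -- exceeds demand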

To convert the country to solar power, huge tracts of land would have to be covered with photovoltaic panels and solar heating troughs. A direct-current (DC) transmission backbone would also have to be erected to send that energy efficiently across the nation.

The technology is ready. On the following pages we present a grand plan that could provide 69 percent of the U.S.’s electricity and 35 percent of its total energy (which includes transportation) with solar power by 2050. We project that this energy could be sold to consumers at rates equivalent to today’s rates for conventional power sources, about five cents per kilowatt-hour (kWh). If wind, biomass and geothermal sources were also developed, renewable energy could provide 100 percent of the nation’s electricity and 90 percent of its energy by 2100.

The federal government would have to invest more than $400 billion over the next 40 years to complete the 2050 plan. That investment is substantial, but the payoff is greater. Solar plants consume little or no fuel, saving billions of dollars year after year. The infrastructure would displace 300 large coal-fired power plants and 300 more large natural gas plants and all the fuels they consume. The plan would effectively eliminate all imported oil, fundamentally cutting U.S. trade deficits and easing political tension in the Middle East and elsewhere. Because solar technologies are almost pollution-free, the plan would also reduce greenhouse gas emissions from power plants by 1.7 billion tons a year, and another 1.9 billion tons from gasoline vehicles would be displaced by plug-in hybrids refueled by the solar power grid. In 2050 U.S. carbon dioxide emissions would be 62 percent below 2005 levels, putting a major brake on global warming.

Photovoltaic Farms

In the past few years the cost to produce photovoltaic cells and modules has dropped significantly, opening the way for large-scale deployment. Various cell types exist, but the least expensive modules today are thin films made of cadmium telluride. To provide electricity at six cents per kWh by 2020, cadmium telluride modules would have to convert sunlight into electricity with 14 percent efficiency, and systems would have to be installed at $1.20 per watt of capacity. Current modules have 10 percent efficiency and an installed system cost of about $4 per watt. Progress is clearly needed, but the technology is advancing quickly; commercial efficiencies have risen from 9 to 10 percent in the past 12 months. It is worth noting, too, that as modules improve, rooftop photovoltaics will become more cost-competitive for homeowners, reducing daytime electricity demand.
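
To see why the $1.20-per-watt target matters, here is a rough levelized-cost sketch. The capacity factor and lifetime are illustrative assumptions of ours, not the authors’, and financing, maintenance and degradation (ignored here) would plausibly account for the gap up to the six-cent figure:

    # Amortize installed cost over lifetime output (no financing or O&M)
    cost_per_watt = 1.20           # the article's 2020 installed-cost target, $/W
    capacity_factor = 0.20         # assumed average output for SW solar
    lifetime_years = 30            # assumed system lifetime
    kwh_per_watt = capacity_factor * 8760 * lifetime_years / 1000
    print(cost_per_watt / kwh_per_watt)  # ~$0.023/kWh before financing and O&M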

In our plan, by 2050 photovoltaic technology would provide almost 3,000 gigawatts (GW), or billions of watts, of power. Some 30,000 square miles of photovoltaic arrays would have to be erected. Although this area may sound enormous, installations already in place indicate that the land required for each gigawatt-hour of solar energy produced in the Southwest is less than that needed for a coal-powered plant when factoring in land for coal mining. Studies by the National Renewable Energy Laboratory in Golden, Colo., show that more than enough land in the Southwest is available without requiring use of environmentally sensitive areas, population centers or difficult terrain. Jack Lavelle, a spokesperson for Arizona’s Department of Water Conservation, has noted that more than 80 percent of his state’s land is not privately owned and that Arizona is very interested in developing its solar potential. The benign nature of photovoltaic plants (including no water consumption) should keep environmental concerns to a minimum.

The main progress required, then, is to raise module efficiency to 14 percent. Although the efficiencies of commercial modules will never reach those of solar cells in the laboratory, cadmium telluride cells at the National Renewable Energy Laboratory are now up to 16.5 percent and rising. At least one manufacturer, First Solar in Perrysburg, Ohio, increased module efficiency from 6 to 10 percent from 2005 to 2007 and is reaching for 11.5 percent by 2010.

Pressurized Caverns

The great limiting factor of solar power, of course, is that it generates little electricity when skies are cloudy and none at night. Excess power must therefore be produced during sunny hours and stored for use during dark hours. Most energy storage systems such as batteries are expensive or inefficient.

Compressed-air energy storage has emerged as a successful alternative. Electricity from photovoltaic plants compresses air and pumps it into vacant underground caverns, abandoned mines, aquifers and depleted natural gas wells. The pressurized air is released on demand to turn a turbine that generates electricity, aided by burning small amounts of natural gas. Compressed-air energy storage plants have been operating reliably in Huntorf, Germany, since 1978 and in McIntosh, Ala., since 1991. The turbines burn only 40 percent of the natural gas they would if they were fueled by natural gas alone, and better heat recovery technology would lower that figure to 30 percent.

Studies by the Electric Power Research Institute in Palo Alto, Calif., indicate that the cost of compressed-air energy storage today is about half that of lead-acid batteries. The research indicates that these facilities would add three or four cents per kWh to photovoltaic generation, bringing the total 2020 cost to eight or nine cents per kWh.

Electricity from photovoltaic farms in the Southwest would be sent over high-voltage DC transmission lines to compressed-air storage facilities throughout the country, where turbines would generate electricity year-round. The key is to find adequate sites. Mapping by the natural gas industry and the Electric Power Research Institute shows that suitable geologic formations exist in 75 percent of the country, often close to metropolitan areas. Indeed, a compressed-air energy storage system would look similar to the U.S. natural gas storage system. The industry stores eight trillion cubic feet of gas in 400 underground reservoirs. By 2050 our plan would require 535 billion cubic feet of storage, with air pressurized at 1,100 pounds per square inch. Although development will be a challenge, plenty of reservoirs are available, and it would be reasonable for the natural gas industry to invest in such a network.
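
Scaled against that existing system, the storage requirement is modest; a quick comparison using the article’s own figures:

    # The plan's 2050 CAES volume vs. existing natural gas storage
    gas_storage_bcf = 8_000    # 8 trillion cubic feet in ~400 US reservoirs
    caes_need_bcf = 535        # billion cubic feet needed by 2050
    print(caes_need_bcf / gas_storage_bcf)  # ~0.07: about 7% of that volume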

Hot Salt

Another technology that would supply perhaps one fifth of the solar energy in our vision is known as concentrated solar power. In this design, long, metallic mirrors focus sunlight onto a pipe filled with fluid, heating the fluid like a huge magnifying glass might. The hot fluid runs through a heat exchanger, producing steam that turns a turbine.

For energy storage, the pipes run into a large, insulated tank filled with molten salt, which retains heat efficiently. Heat is extracted at night, creating steam. The molten salt does slowly cool, however, so the energy stored must be tapped within a day.

Nine concentrated solar power plants with a total capacity of 354 megawatts (MW) have been generating electricity reliably for years in the U.S. A new 64-MW plant in Nevada came online in March 2007. These plants, however, do not have heat storage. The first commercial installation to incorporate it — a 50-MW plant with seven hours of molten salt storage — is being constructed in Spain, and others are being designed around the world. For our plan, 16 hours of storage would be needed so that electricity could be generated 24 hours a day.

Existing plants prove that concentrated solar power is practical, but costs must decrease. Economies of scale and continued research would help. In 2006 a report by the Solar Task Force of the Western Governors’ Association concluded that concentrated solar power could provide electricity at 10 cents per kWh or less by 2015 if 4 GW of plants were constructed. Finding ways to boost the temperature of heat exchanger fluids would raise operating efficiency, too. Engineers are also investigating how to use molten salt itself as the heat-transfer fluid, reducing heat losses as well as capital costs. Salt is corrosive, however, so more resilient piping systems are needed.

Concentrated solar power and photovoltaics represent two different technology paths. Neither is fully developed, so our plan brings them both to large-scale deployment by 2020, giving them time to mature. Various combinations of solar technologies might also evolve to meet demand economically. As installations expand, engineers and accountants can evaluate the pros and cons, and investors may decide to support one technology more than another.

Direct Current, Too

The geography of solar power is obviously different from the nation’s current supply scheme. Today coal, oil, natural gas and nuclear power plants dot the landscape, built relatively close to where power is needed. Most of the country’s solar generation would stand in the Southwest. The existing system of alternating-current (AC) power lines is not robust enough to carry power from these centers to consumers everywhere and would lose too much energy over long hauls. A new high-voltage, direct-current (HVDC) power transmission backbone would have to be built.

Studies by Oak Ridge National Laboratory indicate that long-distance HVDC lines lose far less energy than AC lines do over equivalent spans. The backbone would radiate from the Southwest toward the nation’s borders. The lines would terminate at converter stations where the power would be switched to AC and sent along existing regional transmission lines that supply customers.

The AC system is also simply out of capacity, leading to noted shortages in California and other regions; DC lines are cheaper to build and require less land area than equivalent AC lines. About 500 miles of HVDC lines operate in the U.S. today and have proved reliable and efficient. No major technical advances seem to be needed, but more experience would help refine operations. The Southwest Power Pool of Texas is designing an integrated system of DC and AC transmission to enable development of 10 GW of wind power in western Texas. And TransCanada, Inc., is proposing 2,200 miles of HVDC lines to carry wind energy from Montana and Wyoming south to Las Vegas and beyond.

Stage One: Present to 2020

We have given considerable thought to how the solar grand plan can be deployed. We foresee two distinct stages. The first, from now until 2020, must make solar competitive at the mass-production level. This stage will require the government to guarantee 30-year loans, agree to purchase power and provide price-support subsidies. The annual aid package would rise steadily from 2011 to 2020. At that time, the solar technologies would compete on their own merits. The cumulative subsidy would total $420 billion (we will explain later how to pay this bill).

About 84 GW of photovoltaics and concentrated solar power plants would be built by 2020. In parallel, the DC transmission system would be laid. It would expand via existing rights-of-way along interstate highway corridors, minimizing land-acquisition and regulatory hurdles. This backbone would reach major markets in Phoenix, Las Vegas, Los Angeles and San Diego to the west and San Antonio, Dallas, Houston, New Orleans, Birmingham, Ala., Tampa, Fla., and Atlanta to the east.

Building 1.5 GW of photovoltaics and 1.5 GW of concentrated solar power annually in the first five years would stimulate many manufacturers to scale up. In the next five years, annual construction would rise to 5 GW apiece, helping firms optimize production lines. As a result, solar electricity would fall toward six cents per kWh. This implementation schedule is realistic; more than 5 GW of nuclear power plants were built in the U.S. each year from 1972 to 1987. What is more, solar systems can be manufactured and installed at much faster rates than conventional power plants because of their straightforward design and relative lack of environmental and safety complications.
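
The build-out arithmetic is worth spelling out. A rough sketch (the article states only the two five-year rates, so reaching the full 84 GW evidently requires construction to keep accelerating late in the decade):

    # Cumulative capacity from the stated ramp rates alone
    annual_gw = [1.5 + 1.5] * 5 + [5 + 5] * 5   # PV + CSP, GW per year
    print(sum(annual_gw))   # 65 GW; the plan's 84 GW implies further ramp-up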

Stage Two: 2020 to 2050

It is paramount that major market incentives remain in effect through 2020, to set the stage for self-sustained growth thereafter. In extending our model to 2050, we have been conservative. We do not include any technological or cost improvements beyond 2020. We also assume that energy demand will grow nationally by 1 percent a year. In this scenario, by 2050 solar power plants will supply 69 percent of U.S. electricity and 35 percent of total U.S. energy. This quantity includes enough to supply all the electricity consumed by 344 million plug-in hybrid vehicles, which would displace their gasoline counterparts, key to reducing dependence on foreign oil and to mitigating greenhouse gas emissions. Some three million new domestic jobs — notably in manufacturing solar components — would be created, which is several times the number of U.S. jobs that would be lost in the then-dwindling fossil-fuel industries.

The huge reduction in imported oil would lower trade balance payments by $300 billion a year, assuming a crude oil price of $60 a barrel (average prices were higher in 2007). Once solar power plants are installed, they must be maintained and repaired, but the price of sunlight is forever free, duplicating those fuel savings year after year. Moreover, the solar investment would enhance national energy security, reduce financial burdens on the military, and greatly decrease the societal costs of pollution and global warming, from human health problems to the ruining of coastlines and farmlands.

Ironically, the solar grand plan would lower energy consumption. Even with 1 percent annual growth in demand, the 100 quadrillion Btu consumed in 2006 would fall to 93 quadrillion Btu by 2050. This unusual offset arises because a good deal of energy is consumed to extract and process fossil fuels, and more is wasted in burning them and controlling their emissions.
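
The offset is easier to see with a quick compound-growth calculation (illustrative; it assumes, as a counterfactual, that today’s fossil-heavy system simply scaled up with demand):

    # 1% annual growth compounded from 2006 to 2050 vs. the plan's projection
    consumption_2006 = 100.0                  # quadrillion Btu
    scaled_up = consumption_2006 * 1.01 ** (2050 - 2006)
    print(scaled_up)   # ~155 quad if extraction/conversion losses scaled too
    # The plan projects just 93 quad, because solar eliminates most of the
    # energy now spent extracting, processing and burning fossil fuels.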

To meet the 2050 projection, 46,000 square miles of land would be needed for photovoltaic and concentrated solar power installations. That area is large, and yet it covers just 19 percent of the suitable Southwest land. Most of that land is barren; there is no competing use value. And the land will not be polluted. We have assumed that only 10 percent of the solar capacity in 2050 will come from distributed photovoltaic installations — those on rooftops or commercial lots throughout the country. But as prices drop, these applications could play a bigger role.

2050 and Beyond

Although it is not possible to project with any exactitude 50 or more years into the future, as an exercise to demonstrate the full potential of solar energy we constructed a scenario for 2100. By that time, based on our plan, total energy demand (including transportation) is projected to be 140 quadrillion Btu, with seven times today’s electric generating capacity.

To be conservative, again, we estimated how much solar plant capacity would be needed under the historical worst-case solar radiation conditions for the Southwest, which occurred during the winter of 1982-1983 and in 1992 and 1993 following the Mount Pinatubo eruption, according to National Solar Radiation Data Base records from 1961 to 2005. And again, we did not assume any further technological and cost improvements beyond 2020, even though it is nearly certain that in 80 years ongoing research would improve solar efficiency, cost and storage.

Under these assumptions, U.S. energy demand could be fulfilled with the following capacities: 2.9 terawatts (TW) of photovoltaic power going directly to the grid and another 7.5 TW dedicated to compressed-air storage; 2.3 TW of concentrated solar power plants; and 1.3 TW of distributed photovoltaic installations. Supply would be rounded out with 1 TW of wind farms, 0.2 TW of geothermal power plants and 0.25 TW of biomass-based production for fuels. The model includes 0.5 TW of geothermal heat pumps for direct building heating and cooling. The solar systems would require 165,000 square miles of land, still less than the suitable available area in the Southwest.
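
Summing those capacities gives a sense of the 2100 system’s scale (figures copied from the paragraph above):

    # Total installed capacity in the 2100 scenario, terawatts
    tw = {"PV to grid": 2.9, "PV to compressed-air storage": 7.5,
          "concentrated solar": 2.3, "distributed PV": 1.3,
          "wind": 1.0, "geothermal": 0.2, "biomass fuels": 0.25,
          "geothermal heat pumps": 0.5}
    print(sum(tw.values()))   # ~16 TW installed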

In 2100 this renewable portfolio could generate 100 percent of all U.S. electricity and more than 90 percent of total U.S. energy. In the spring and summer, the solar infrastructure would produce enough hydrogen to meet more than 90 percent of all transportation fuel demand and would replace the small natural gas supply used to aid compressed-air turbines. Adding 48 billion gallons of biofuel would cover the rest of transportation energy. Energy-related carbon dioxide emissions would be reduced 92 percent below 2005 levels.

Who Pays?

Our model is not an austerity plan, because it includes a 1 percent annual increase in demand, which would sustain lifestyles similar to those today with expected efficiency improvements in energy generation and use. Perhaps the biggest question is how to pay for a $420-billion overhaul of the nation’s energy infrastructure. One of the most common ideas is a carbon tax. The International Energy Agency suggests that a carbon tax of $40 to $90 per ton of coal will be required to induce electricity generators to adopt carbon capture and storage systems to reduce carbon dioxide emissions. This tax is equivalent to raising the price of electricity by one to two cents per kWh. But our plan is less expensive. The $420 billion could be generated with a carbon tax of 0.5 cent per kWh. Given that electricity today generally sells for six to 10 cents per kWh, adding 0.5 cent per kWh seems reasonable.
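
The revenue arithmetic checks out under one simple assumption of ours, not the article’s: that U.S. electricity generation in the mid-2000s ran to roughly four trillion kWh a year.

    # Revenue from a 0.5 cent/kWh surcharge on US electricity
    us_kwh = 4.0e12                 # ~4 trillion kWh/yr (assumed, mid-2000s)
    annual = us_kwh * 0.005         # surcharge revenue, dollars per year
    print(annual / 1e9)             # ~$20 billion per year
    print(420e9 / annual)           # ~21 years to raise $420 billion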

Congress could establish the financial incentives by adopting a national renewable energy plan. Consider the U.S. Farm Price Support program, which has been justified in terms of national security. A solar price support program would secure the nation’s energy future, vital to the country’s long-term health. Subsidies would be gradually deployed from 2011 to 2020. With a standard 30-year payoff interval, the subsidies would end from 2041 to 2050. The HVDC transmission companies would not have to be subsidized, because they would finance construction of lines and converter stations just as they now finance AC lines, earning revenues by delivering electricity.

Although $420 billion is substantial, the annual expense would be less than the current U.S. Farm Price Support program. It is also less than the tax subsidies that have been levied to build the country’s high-speed telecommunications infrastructure over the past 35 years. And it frees the U.S. from policy and budget issues driven by international energy conflicts.

Without subsidies, the solar grand plan is impossible. Other countries have reached similar conclusions: Japan is already building a large, subsidized solar infrastructure, and Germany has embarked on a nationwide program. Although the investment is high, it is important to remember that the energy source, sunlight, is free. There are no annual fuel or pollution-control costs like those for coal, oil or nuclear power, and only a slight cost for natural gas in compressed-air systems, although hydrogen or biofuels could displace that, too. When fuel savings are factored in, the cost of solar would be a bargain in coming decades. But we cannot wait until then to begin scaling up.

Critics have raised other concerns, such as whether material constraints could stifle large-scale installation. With rapid deployment, temporary shortages are possible. But several types of cells exist that use different material combinations. Better processing and recycling are also reducing the amount of materials that cells require. And in the long term, old solar cells can largely be recycled into new solar cells, changing our energy supply picture from depletable fuels to recyclable materials.

The greatest obstacle to implementing a renewable U.S. energy system is not technology or money, however. It is the lack of public awareness that solar power is a practical alternative — and one that can fuel transportation as well. Forward-looking thinkers should try to inspire U.S. citizens, and their political and scientific leaders, about solar power’s incredible potential. Once Americans realize that potential, we believe the desire for energy self-sufficiency and the need to reduce carbon dioxide emissions will prompt them to adopt a national solar plan.

KEY CONCEPTS

** A massive switch from coal, oil, natural gas and nuclear power plants to solar power plants could supply 69 percent of the U.S.’s electricity and 35 percent of its total energy by 2050.

** A vast area of photovoltaic cells would have to be erected in the Southwest. Excess daytime energy would be stored as compressed air in underground caverns to be tapped during nighttime hours.

** Large solar concentrator power plants would be built as well.

** A new direct-current power transmission backbone would deliver solar electricity across the country.

** But $420 billion in subsidies from 2011 to 2050 would be required to fund the infrastructure and make it cost-competitive.

–The Editors

Plentiful Resource:

Solar radiation is abundant in the U.S., especially the Southwest. The 46,000 square miles of solar arrays required by the grand plan could be distributed in various ways.

PAYOFFS

** Foreign oil dependence cut from 60 to 0 percent

** Global tensions eased and military costs lowered

** Massive trade deficit reduced significantly

** Greenhouse gas emissions slashed

** Domestic jobs increased

PINCH POINTS

** Subsidies totaling $420 billion through 2050

** Political leadership needed to raise the subsidy, possibly with a carbon tax

** New high-voltage, direct-current electric transmission system built profitably by private carriers

MORE TO EXPLORE

The Terawatt Challenge for Thin Film Photovoltaic. Ken Zweibel in Thin Film Solar Cells: Fabrication, Characterization and Applications. Edited by Jef Poortmans and Vladimir Arkhipov. John Wiley & Sons, 2006.

Energy Autonomy: The Economic, Social and Technological Case for Renewable Energy. Hermann Scheer. Earthscan Publications, 2007.

Center for Life Cycle Analysis, Columbia University: http://www.clca.columbia.edu/

The National Solar Radiation Data Base. National Renewable Energy Laboratory, 2007. http://rredc.nrel.gov/solar/old_data/nsrdb

The U.S. Department of Energy Solar America Initiative: www1.eere.energy.gov/solar/solar_america

Photovoltaics

In the 2050 plan vast photovoltaic farms would cover 30,000 square miles of otherwise barren land in the Southwest. They would resemble Tucson Electric Power Company’s 4.6-megawatt plant in Springerville, Ariz., which began operating in 2000. In such farms, many photovoltaic cells are interconnected on one module, and modules are wired together to form an array. The direct current from each array flows to a transformer that sends it along high-voltage lines to the power grid. In a thin-film cell, the energy of incoming photons knocks loose electrons in the cadmium telluride layer; they cross a junction, flow to the top conductive layer and then flow around to the back conductive layer, creating current.

Underground Storage

Excess electricity produced during the day by photovoltaic farms would be sent over power lines to compressed-air energy storage sites close to cities. At night the sites would generate power for consumers. Such technology is already available; the PowerSouth Energy Cooperative’s plant in McIntosh, Ala., has operated since 1991. In these designs, incoming electricity runs motors and compressors that pressurize air and send it into vacant caverns, mines or aquifers. When the air is released, it is heated by burning small amounts of natural gas; the hot, expanding gases turn turbines that generate electricity.

Concentrated Solar

Large concentrated solar power plants would complement photovoltaic farms in the Southwest. The Kramer Junction plant in California’s Mojave Desert, using technology from Solel in Beit Shemesh, Israel, has been operating since 1989. Metallic parabolic mirrors focus sunlight on a pipe, heating fluid such as ethylene glycol inside. The mirrors rotate to track the sun. The hot pipes run alongside a second loop inside a heat exchanger that contains water, turning it to steam that drives a turbine. Future plants could also send the hot fluid through a holding tank, heating molten salt; that reservoir would retain heat that could be tapped at night for the heat exchanger.

==============

Ken Zweibel, James Mason and Vasilis Fthenakis met a decade ago while working on life-cycle studies of photovoltaics. Zweibel is president of PrimeStar Solar in Golden, Colo., and for 15 years was manager of the National Renewable Energy Laboratory’s Thin-Film PV Partnership. Mason is director of the Solar Energy Campaign and the Hydrogen Research Institute in Farmingdale, N.Y. Fthenakis is head of the Photovoltaic Environmental Research Center at Brookhaven National Laboratory and a professor at Columbia University, where he directs the Center for Life Cycle Analysis.

Copyright Scientific American

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: U.S. Geological Survey
September 9, 2008

SILENT STREAMS?

[Rachel’s introduction: The U.S. Geological Survey (USGS) reports that nearly 40% of freshwater fish species in North America are now imperiled, a 92% increase since 1989.]

Nearly 40 percent of fish species in North American streams, rivers and lakes are now in jeopardy, according to the most detailed evaluation of the conservation status of freshwater fishes in the last 20 years.

The 700 fishes now listed represent a staggering 92 percent increase over the 364 listed as “imperiled” in the previous 1989 study published by the American Fisheries Society. Researchers classified each of the 700 fishes listed as either vulnerable (230), threatened (190), or endangered (280). In addition, 61 fishes are presumed extinct.

The new report, published in Fisheries, was conducted by a U.S. Geological Survey-led team of scientists from the United States, Canada and Mexico, who examined the status of continental freshwater and diadromous (those that migrate between rivers and oceans) fish.

“Freshwater fish have continued to decline since the late 1970s, with the primary causes being habitat loss, dwindling range and introduction of non-native species,” said Mark Myers, director of the USGS. “In addition, climate change may further affect these fish.”

Most Vulnerable Groups

The groups of fish most at risk are the highly valuable salmon and trout of the Pacific Coast and western mountain regions; minnows, suckers and catfishes throughout the continent; darters in the Southeastern United States; and pupfish, livebearers, and goodeids, a large, native fish family in Mexico and the Southwestern United States.

Nearly half of the carp and minnow family and the Percidae (family of darters, perches and their relatives) are in jeopardy. Fish families important for sport or commercial fisheries also had many populations at risk. More than 60 percent of the salmon and trout had at least one population or subspecies in trouble, while 22 percent of sunfishes — which includes the well-known species such as black bass, bluegill and rock bass — were listed. Even one of the most popular game species in the United States, striped bass, has populations on the list.

Regions with the Most Troubled Fish

Regions with especially notable numbers of troubled fish include the Southeastern United States, the mid-Pacific coast, the lower Rio Grande and basins in Mexico that do not drain to the sea.

Hotspots of regional biodiversity and greatest levels of endangerment are the Tennessee (58 fishes), Mobile (57), and the southeastern Atlantic Slope river systems (34). The Pacific central valley, western Great Basin, Rio Grande and rivers of central Mexico also have high diversity and numbers of fish in peril, according to the report. Many of the troubled fish are restricted to only a single drainage. “Human populations have greatly expanded in many of these watersheds, compounding negative impacts on aquatic ecosystems,” noted Howard Jelks, a USGS researcher and the senior author of the paper.

Degree of Trouble

Of fish on the 1989 imperiled list, 89 percent are either still listed with the same conservation status or have become even more at risk. Only 11 percent improved in status or were delisted. The authors emphasized that improved public awareness and proactive management strategies are needed to protect and recover these aquatic treasures.

“Fish are not the only aquatic organisms undergoing precipitous declines,” said USGS researcher Noel Burkhead, a lead author on the report and the chair of the AFS Endangered Species Committee. “Freshwater crayfishes, snails and mussels are exhibiting similar or even greater levels of decline and extinction.”

The authors noted that the list was based on the best biological information available. “We believe this report will provide national and international resource managers, scientists and the conservation community with reliable information to establish conservation, management and recovery priorities,” said Stephen Walsh, another lead author and USGS researcher.

This is the third compilation of imperiled, freshwater and diadromous fishes of North America prepared by the American Fisheries Society’s Endangered Species Committee. Additional information is available at http://fisc.er.usgs.gov/afs/

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Oakland Tribune
September 4, 2008

TODDLERS ABSORB MORE TOXIC CHEMICALS THAN MOTHERS

[Rachel’s introduction: A new study has found that in 19 of the 20 families, concentrations of flame retardants were significantly higher in children than in their mothers. In all, 11 different types of flame retardants were found in these children.]

By Suzanne Bohan, Oakland Tribune

In a world permeated with chemicals, toddlers’ penchant for crawling on floors, chewing on assorted objects and touching everything within reach exposes their bodies to a disproportionate amount of toxic pollutants.

That’s the conclusion of a study released today by the Environmental Working Group in Oakland, which monitored 20 pairs of moms and their young children. The group reported that the children, on average, carried more than three times as much flame retardant in their blood as their mothers.

It’s only the second study to examine this chemical load in U.S. toddlers, and it breaks new ground by offering a national glimpse of the chemicals’ prevalence.

MediaNews, in its 2005 series “A Body’s Burden,” first opened researchers’ eyes to the particular perils faced by young children in a world where more than 80,000 chemicals are found in all manner of products.

This latest research, which focused on blood levels of flame retardants in samplings of mothers and toddlers across the country, dovetails with findings of the award-winning newspaper series by reporter Douglas Fischer, according to Linda Birnbaum, a senior toxicologist with the U.S. Environmental Protection Agency.

The series reported that a 20-month-old boy and his 5-year-old sister consistently bore higher levels of flame retardants and other chemicals in their bodies in comparison with their parents. The results were condensed into a journal article, published in 2006 by the National Institutes of Health, and it’s now cited in scientific literature.

“Not only does this (new) study agree with what we saw with the Fischer study,” said Birnbaum, “but it indicates that children and teenagers have (higher levels of chemicals) than adults.”

She added, “This is not something we would have predicted a few years ago.”

========================================================

Sidebar: How to Limit Your Exposure to Toxic Flame Retardants

When purchasing new electronics products, look for these brands, which have publicly committed to phasing out brominated fire retardants:

Acer, Apple, Eizo Nanao, LG Electronics, Lenovo, Matsushita, Microsoft, Nokia, Philips, Samsung, Sharp, Sony Ericsson and Toshiba.

When purchasing furniture, opt for less flammable fabrics and materials such as leather, wool and cotton.

Discard foam items with ripped covers or decaying foam. If you can’t replace them, keep the covers intact.

Use a vacuum fitted with a HEPA filter. These vacuums are more efficient at trapping small particles.

Be careful when removing old carpet. The padding may contain flame retardants.

SOURCE: Environmental Working Group

========================================================

The new study found that in 19 of the 20 families, concentrations of flame retardants were significantly higher in children than in their mothers. In all, 11 different types of flame retardants were found in these children.

Although the Centers for Disease Control and Prevention conducts periodic monitoring of blood levels for more than 140 chemicals in a cross section of adults across the United States, analyses of young children haven’t been part of that effort.

But it should be, insists Dr. Anila Jacob, a senior scientist with the Environmental Working Group’s Washington, D.C., office.

“Children are so much more vulnerable to toxic chemicals,” she said, describing animal studies linking permanent changes in growing brains with exposure to flame retardants.

Birnbaum is one of the country’s experts on the health effects of flame retardants, also called polybrominated diphenyl ethers, or PBDEs. She said research suggests that flame retardants circulating in the body damage nerve tissue, affecting learning and memory.

A 2008 report from the EPA stated that animal studies on PBDEs found that the chemicals were damaging to the kidney, thyroid and liver. One flame retardant in particular, Deca, is also a “possible human carcinogen,” the EPA report noted.

But the minute amounts of flame retardant detected in the Environmental Working Group study hardly give cause for alarm, stated John Kyle, North American director for the Bromine Science and Environmental Forum, which represents Deca manufacturers, in an email.

Flame retardants save lives, Kyle emphasized, and no one has ever reported any “illness, ailment or harm” from exposure to the chemicals, even among those who work with them, he stated.

Nonetheless, because of mounting concerns over their possible health effects, even in minute quantities, the forum supports close monitoring and analysis by scientists and regulators, Kyle added.

Charlie Auer, director of the EPA’s Office of Pollution Prevention and Toxics, said the agency will soon be asking U.S. manufacturers of flame retardants to sponsor additional studies on exposure effects in children.

Deca is the only type of flame retardant still produced in the country.

The manufacturer of two other varieties voluntarily ceased production in 2005, and the EPA has since enacted a regulation banning U.S. production or import of those two chemicals, due to health and environmental concerns. Loopholes, however, allow import of products made with these chemicals, today’s study noted.

In addition, they’re still in furniture and foam items purchased before the phase-out.

Deca is used to keep the plastics in televisions, computers, stereo equipment and other electronic gear from catching fire, as well as products like the lining of some curtains. Like other flame retardants, Deca slows the ignition and spread of fire, providing time to escape or extinguish the blaze. Flame retardants have been remarkably effective in reducing death, injury and damage from fires.

Other varieties of flame retardants are also found in furniture, carpets, couches, baby seats, pillows and other products made of foam or plastics. Some manufacturers are voluntarily phasing out these products and replacing them with other flame retardants, or redesigning their products to lessen fire danger. But there’s no way for consumers to know which flame retardant, if any, is in a product.

Minute traces of flame retardants have been detected worldwide in air, sediments, surface water, aquatic animal species and terrestrial wildlife. In the Bay Area, two pairs of nesting peregrine falcons had some of the highest levels of Deca of any living organism tested.

The most common route of exposure to flame retardants is dust in homes, or directly touching products made with them. Given their prevalence in the environment, they have also entered the food supply.

Traces were found in a variety of grocery store items tested for flame retardants in one study.

Two states, Washington and Maine, now ban the use of Deca, and legislators in 10 other states, including California, have proposed bans, according to the Environmental Working Group. The European Union has also banned the sale of products containing Deca.

Kristi Chester Vance is a San Francisco mother who participated in the Environmental Working Group study, along with her 4-year-old daughter Stella, to help advance the research. But she decided she didn’t want to know what level of flame retardant she and her daughter carry.

Stella already endured a round with lead poisoning when she was younger, and Vance wants relief from worries over environmental contaminants she has little control over.

Vance wants the government to take a far more aggressive stance in studying the thousands of industrial chemicals approved for use, usually with limited data on health effects.

But she makes efforts to keep the ubiquitous flame retardant residue out of her home: she mops regularly to get rid of dust, uses a vacuum with a fine-particle filter, and keeps her laptop computer off her lap. She and her children also wash their hands more frequently.

Beyond that, Vance figures she can’t do much more and still maintain her peace of mind.

“You just reach a point where you have to balance mental health and just enjoy these few years of childhood without looking at your kids and wondering what’s going on in their cells,” she said. “Sadly, I have to work pretty hard at it.”

Reach Suzanne Bohan at sbohan@bayareanewsgroup.com or (650) 348-4324

Copyright 2000-2008 ANG Newspapers

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: The Sunday Times (London, U.K.)
September 7, 2008

JASON AND THE SECRET CLIMATE CHANGE WAR

[Rachel’s introduction: A shadowy scientific elite in the Pentagon, code-named Jason, warned the U.S. about global warming 30 years ago but was sidelined for political convenience.]

By Naomi Oreskes and Jonathan Renouf

Today the scientific argument about the broad principles of what we are doing to the Earth’s climate is over. By releasing huge quantities of greenhouse gases such as carbon dioxide and methane into the atmosphere we are warming the world.

Since the early 1990s there has been a furious debate about global warming. So-called climate change “sceptics” have spent years disputing almost every aspect of the scientific consensus on the subject. Their arguments have successfully delayed significant political action to deal with greenhouse gas emissions. Recent research reveals how the roots of this argument stretch back to two hugely influential reports written almost 30 years ago.

These reports involve a secret organisation of American scientists reporting to the US Department of Defense. At the highest levels of the American government, officials pondered whether global warming was a significant new threat to civilisation. They turned for advice to the elite special forces of the scientific world — a shadowy organisation known as Jason. Even today few people have heard of Jason. It was established in 1960 at the height of the cold war when a group of physicists who had helped to develop the atomic bomb proposed a new organisation that would — to quote one of its founders — “inject new ideas into national defence”.

So the Jasons (as they style themselves) were born; a self-selected group of brilliant minds free to think the unthinkable in the knowledge that their work was classified. Membership was by invitation only and they are indeed the cream. Of the roughly 100 Jasons over the years, 11 have won Nobel prizes and 43 have been elected to the US National Academy of Sciences.

For years, being a Jason was just about the best job going in American science. Every summer the Jasons all moved to San Diego in California to devote six weeks to working together. They were paid well and rented houses by the beach. The kids surfed while their dads saved the world. Less James Bond, more Club Med.

Today the Jasons still meet in San Diego in a quaint postwar construction with more than a hint of Thunderbirds about it. In 1977 they got to work on global warming. There was one potential problem. Only a few of them knew anything about climatology. To get a better understanding they relocated for a few days to Boulder, Colorado, the base for NCAR — the National Center for Atmospheric Research — where they heard the latest information on climate change. Then, being physicists, they went back to first principles and decided to build a model of the climate system. Officially it was called Features of Energy-Budget Climate Models: An Example of Weather-Driven Climate Stability, but it was dubbed the Jason Model of the World.

In 1979 they produced their report: coded JSR-78-07 and entitled The Long Term Impact of Atmospheric Carbon Dioxide on Climate. Now, with the benefit of hindsight, it is remarkable how prescient it was.

Right on the first page, the Jasons predicted that carbon dioxide levels in the atmosphere would double from their preindustrial levels by about 2035. Today it’s expected this will happen by about 2050. They suggested that this doubling of carbon dioxide would lead to an average warming across the planet of 2-3C [3.6 to 5.4 degrees Fahrenheit]. Again, that’s smack in the middle of today’s predictions. They warned that polar regions would warm by much more than the average, perhaps by as much as 10C or 12C [18 to 21.6 degrees Fahrenheit]. That prediction is already coming true — last year the Arctic sea ice melted to a new record low. This year may well set another record.

Nor were the Jasons frightened of drawing the obvious conclusions for civilisation: the cause for concern was clear when one noted “the fragility of the world’s crop-producing capacity, particularly in those marginal areas where small alterations in temperature and precipitation can bring about major changes in total productivity”.

Scientific research has since added detail to the predictions but has not changed the basic forecast. The Jason report was never officially released but was read at the highest levels of the US government. At the White House Office of Science and Technology Policy, Frank Press, science adviser to President Jimmy Carter, asked the National Academy of Sciences for a second opinion. This time from climate scientists.

The academy committee, headed by Jule Charney, a meteorologist from Massachusetts Institute of Technology (MIT), backed up the Jason conclusions. The Charney report said climate change was on the way and was likely to have big impacts. So by the late 1970s scientists were already confident that they knew what rising carbon dioxide levels would mean for the future. Then politics got in the way. And with it came the birth of climate change scepticism.

In 1980 Ronald Reagan was elected president. He was pro-business and pro-America. He knew the country was already in the environmental dog house because of acid rain. If global warming turned into a big issue, there was only going to be one bad guy. The US was by far the biggest producer of greenhouse gases in the world. If the president wasn’t careful, global warming could become a stick to beat America with.

So Reagan commissioned a third report about global warming from Bill Nierenberg, who had made his name working on the Manhattan Project developing America’s atom bomb. He went on to run the Scripps Institution of Oceanography where he had built up the Climate Research Division. And he was a Jason. Nierenberg’s report was unusual in that individual chapters were written by different authors. Many of these chapters recorded mainstream scientific thinking similar to the Charney and Jason reports. But the key chapter was Nierenberg’s synthesis — which chose largely to ignore the scientific consensus.

His basic message was “calm down, everybody”. He argued that while climate change would undoubtedly pose challenges for society, this was nothing new. He highlighted the adaptability that had made humans so successful through the centuries. He argued that it would be many years before climate change became a significant problem. And he emphasised that with so much time at our disposal, there was a good chance that technological solutions would be found. “[The] knowledge we can gain in coming years should be more beneficial than a lack of action will be damaging; a programme of action without a programme for learning could be costly and ineffective. [So] our recommendations call for ‘research, monitoring, vigilance and an open mind’.”

Overall, the synthesis emphasised the positive effects of climate change over the negative, the uncertainty surrounding predictions of future change rather than the emerging consensus, and the low end of harmful impact estimates rather than the high end. Faced with this rather benign scenario, adaptation was the key.

If all this sounds familiar, it should. Similar arguments have been used by global warming sceptics ever since Nierenberg first formulated them in 1983. Global warming was duly kicked into the political long grass — a distant problem for another day. At a political level, Nierenberg had won.

But this was only the beginning of his involvement in what eventually became a movement of global warming sceptics. A year after his report came out he became a co-founder of the George C. Marshall Institute, one of the leading think tanks that would go on to challenge almost every aspect of the scientific consensus on climate change. Nierenberg hardened his position. He began to argue not just that global warming wasn’t a problem, but also that it wasn’t happening at all. There was no systematic warming trend, the climate was simply going through its normal, natural fluctuations.

The creed that Nierenberg originated all those years ago still has its dwindling band of followers. Sarah Palin, the Republican vice-presidential candidate, recently responded to a question about global warming by saying: “I’m not one who would attribute it to being man-made.”

==============

Professor Naomi Oreskes is a historian of science, researching the history of climate change. Dr Jonathan Renouf is producer of Earth: The Climate Wars, 9pm tonight on BBC2

Copyright 2008 Times Newspapers Ltd.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Spiegel Online
September 9, 2008

NEW GERMAN FACILITY BEGINS TESTING CO2 SEQUESTRATION

[Rachel’s introduction: For the first time ever, a coal-fired power plant (in Germany) has actually begun pumping pressurized liquid carbon dioxide into the ground, hoping it will stay there forever. The operation is tiny, experimental, expensive — and an end-of-pipe solution — but the future of the coal industry hangs on experiments such as this one being declared a “success.”]

By Christoph Seidler

It is complicated and expensive, but it could ultimately be good for the environment. A German pilot program has now begun testing the safety and usefulness of pumping CO2 emissions underground.

A remote spot in Brandenburg has become a popular destination for politicians in the past few months: the Schwarze Pumpe coal power station near Spremberg. Former SPD [Social Democratic Party] leader Kurt Beck visited, as did Transport Minister Wolfgang Tiefensee. Brandenburg’s Governor Matthias Platzeck has been there several times.

The attraction? Over the last two years, Swedish power supplier Vattenfall has built a pilot program to demonstrate how CO2 emissions from coal plants can be captured and pumped underground. The technology, known as Carbon Capture and Storage (CCS), could play a major role in the future of coal-fired energy sources. CCS may give the coal business, regarded as a harmful player in global warming, a much-needed green touch.

The 70-million-euro ($98.1 million) project officially kicked off on Tuesday. Top managers from Vattenfall gathered at the opening along with political bigwigs such as Thomas de Maizière, Angela Merkel’s chief of staff in the Chancellery. Governor Platzeck was there as well.

[Graphic: How carbon sequestration works]

In truth, by the time Tuesday’s mini-ceremony took place, the complex had already been running for a few days. Guests could watch as carbon emissions were loaded into a waiting tanker truck. The power plant being used to test the project’s feasibility is 350 kilometers (217 miles) away from the CO2 storage site. In the next three years, Vattenfall plans to pump 100,000 tons of CO2 from the Schwarze Pumpe site into a nearly depleted deposit of natural gas in Saxony-Anhalt.

But CCS technology still has a way to go before it can hit the markets:

** Power plant technology is still in the development stages. The current pilot program has an output of barely 30 megawatts. [At 2 gigawatts, big U.S. coal plants are about 66 times as large; see the rough calculation after this list.] In 2015, Vattenfall wants to open two model power stations in Germany and Denmark. In contrast to the current project, these will produce meaningful quantities of electricity. This technology, however, only works in new, purpose-built power stations. Refitting old ones is not a possibility. Vattenfall’s competitor RWE has announced that it will build a 450-megawatt model CCS power plant in Hürth, nine kilometers southwest of Cologne. But CCS will only be viable in the marketplace with power stations that produce upwards of 1,000 megawatts — difficult given that CCS plants are currently much less efficient than traditional plants.

** The storage technology is still being tested. The most important test facility is at the Geo-Research Center in Potsdam (GFZ). For the last two months, scientists have been injecting CO2 through 800-meter-deep boreholes into a depleted gas reservoir. The scientists want to find out how the gas behaves underground and how long it is likely to remain there. Estimates vary between 1,000 and 10,000 years when it comes to the question of how long the gas must remain in the ground to have any positive effect on the climate at all.

** It also remains unclear exactly how CO2 can be transported before storage. As of yet, there is no legal framework for the building of pipelines between power plants and storage sites. Such guidelines are currently being discussed within the European Union. But the construction of such pipelines would likely run into considerable political and public opposition.
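
[A rough back-of-envelope calculation puts the pilot’s scale in perspective. The 30-megawatt and 2-gigawatt figures come from the article above; the capacity factor and the CO2-per-kilowatt-hour figure are round assumptions made here for illustration, not numbers from Vattenfall.]

    # Rough scale comparison; the assumed values are marked below.
    pilot_mw = 30          # Schwarze Pumpe pilot (from the article)
    big_plant_mw = 2000    # large U.S. coal plant (from the article)
    print(big_plant_mw / pilot_mw)        # about 67 times as large

    capacity_factor = 0.75    # assumed fraction of the year at full power
    kg_co2_per_kwh = 0.9      # assumed emission factor for coal
    kwh_per_year = big_plant_mw * 1000 * 8760 * capacity_factor
    tonnes_per_year = kwh_per_year * kg_co2_per_kwh / 1000
    print(round(tonnes_per_year / 1e6))   # roughly 12 million tonnes of
    # CO2 per year from one big plant, versus the 100,000 tonnes
    # Vattenfall plans to store over three years.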

Still, energy companies have high hopes for the capture and storage of carbon dioxide. In the best-case scenario, the process will be ready for widespread implementation in 2020 — too late to help meet short-term climate goals. There is political pressure, though, to make the project work. Germany needs “powerful new power stations and efficient, modern coal-fired plants are part of that,” Chancellor Angela Merkel said this week.

Environmental groups complain that development costs for the new technology are too high and that the project may help climate-harming coal-fired power plants maintain a foothold. A consortium of 99 organizations calling themselves the “Climate Alliance” invited protesters to Tuesday’s opening in Brandenburg.

After his visit to the Schwarze Pumpe site three weeks ago, then-SPD head Kurt Beck seemed only moderately impressed. “One sees clearly that it is far more than just a theoretical beginning,” Beck said politely. “It is one of a number of solutions to the climate problem.” But carbon storage is certainly not a panacea.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

From: Washington Post (pg. BW4)
April 27, 2008

WHY THE ENVIRONMENTAL MOVEMENT CANNOT PREVENT CATASTROPHE.

[Rachel’s introduction: Environmentalism is almost as compromised as the planet itself, argues Gus Speth in his new book. He faults the movement for using market incentives to achieve environmental ends and for the deception that sufficient change can come from engaging the corporate sector and working “within the system” and not enlisting the support of other activist constituencies.]

By Ross Gelbspan

Review of James Gustave Speth, The Bridge at the Edge of the World: Capitalism, the Environment, and Crossing From Crisis to Sustainability (Yale University Press, 2008; 295 pp.; $28).

Contemporary capitalism and a habitable planet cannot coexist. That is the core message of The Bridge at the Edge of the World, by J. “Gus” Speth, a prominent environmentalist who, in this book, has turned sharply critical of the U.S. environmental movement.

Speth is dean of environmental studies at Yale, a founder of two major environmental groups (the Natural Resources Defense Council and the World Resources Institute), former chairman of the President’s Council on Environmental Quality (under Jimmy Carter) and a former head of the U.N. Development Program. So part of his thesis is expected: Climate change is only the leading edge of a potential cascade of ecological disasters.

“Half the world’s tropical and temperate forests are gone,” he writes. “About half the wetlands… are gone. An estimated 90 percent of large predator fish are gone…. Twenty percent of the corals are gone…. Species are disappearing at rates about a thousand times faster than normal…. Persistent toxic chemicals can now be found by the dozens in… every one of us.”

One might assume, given this setup, that Speth would argue for a revitalization of the environmental movement. He does not.

Environmentalism, in his view, is almost as compromised as the planet itself. Speth faults the movement for using market incentives to achieve environmental ends and for the deception that sufficient change can come from engaging the corporate sector and working “within the system” and not enlisting the support of other activist constituencies.

Environmentalism today is “pragmatic and incrementalist,” he notes, “awash in good proposals for sensible environmental action” — and he does not mean it as a compliment. “Working only within the system will … not succeed when what is needed is transformative change in the system itself.”

In Speth’s view, the accelerating degradation of the Earth is not simply the result of flawed or inattentive national policies. It is “a result of systemic failures of the capitalism that we have today,” which aims for perpetual economic growth and has brought us, simultaneously, to the threshold of abundance and the brink of ruination. He identifies the major driver of environmental destruction as the 60,000 multinational corporations that have emerged in the last few decades and that continually strive to increase their size and profitability while, at the same time, deflecting efforts to rein in their most destructive impacts.

“The system of modern capitalism… will generate ever-larger environmental consequences, outstripping efforts to manage them,” Speth writes. What’s more, “It is unimaginable that American politics as we know it will deliver the transformative changes needed” to save us from environmental catastrophe. “Weak, shallow, dangerous, and corrupted,” he says, “it is the best democracy that money can buy.”

Above all, Speth faults environmentalists for assuming they alone hold the key to arresting the deterioration of the planet. That task, he emphasizes, will require the involvement of activists working on campaign finance reform, corporate accountability, labor, human rights and environmental justice, to name a few. (Full disclosure: He also approvingly cites some of this reviewer’s criticisms of media coverage of environmental issues.)

Speth, of course, is hardly the first person to issue a sweeping indictment of capitalism and predict that it contains the seeds of its own demise. But he dismisses a socialist alternative, and, at its core, his prescription is more reformist than revolutionary. He implies that a more highly regulated and democratized form of capitalism could be compatible with environmental salvation if it were accompanied by a profound change in personal and collective values.

Instead of seeking ever more consumption, we need a “post-growth society” with a more rounded definition of well-being. Rather than using gross domestic product as the primary measure of a country’s economic health, we should turn to the new field of ecological accounting, which tries to factor in the costs of resource depletion and pollution.
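
[As a purely illustrative sketch of what ecological accounting does, with every figure below invented: the adjustment simply subtracts the estimated costs of resource depletion and pollution from conventional output.]

    # Invented numbers, purely to illustrate the ecological-accounting
    # idea the review describes: start from conventional GDP and deduct
    # the costs that standard accounting leaves out.
    gdp = 14_000               # conventional GDP, in billions (invented)
    resource_depletion = 400   # natural capital used up (invented)
    pollution_damage = 600     # pollution and degradation costs (invented)
    green_gdp = gdp - resource_depletion - pollution_damage
    print(green_gdp)           # 13000: a smaller, ecologically adjusted measure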

This book is an extremely probing and thoughtful diagnosis of the root causes of planetary distress. But short of a cataclysmic event — like the Great Depression or some equally profound social breakdown — Speth does not suggest how we might achieve the change in values and structural reform necessary for long-term sustainability. “People have conversion experiences and epiphanies,” he notes, asking, “Can an entire society have a conversion experience?”

==============

Ross Gelbspan is author of “The Heat Is On” and “Boiling Point.” He maintains the Web site http://www.heatisonline.org/.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Rachel’s Democracy & Health News highlights the connections between issues that are often considered separately or not at all.

The natural world is deteriorating and human health is declining because those who make the important decisions aren’t the ones who bear the brunt. Our purpose is to connect the dots between human health, the destruction of nature, the decline of community, the rise of economic insecurity and inequalities, growing stress among workers and families, and the crippling legacies of patriarchy, intolerance, and racial injustice that allow us to be divided and therefore ruled by the few.

In a democracy, there are no more fundamental questions than, “Who gets to decide?” And, “How DO the few control the many, and what might be done about it?”

Rachel’s Democracy and Health News is published as often as necessary to provide readers with up-to-date coverage of the subject.

Editors:
Peter Montague – peter@rachel.org
Tim Montague – tim@rachel.org

Environmental Research Foundation
P.O. Box 160, New Brunswick, N.J. 08903
dhn@rachel.org