Studies backing successive IPCC reports

While it is obvious that the 2007 Fourth Assessment Report (4AR) of the Intergovernmental Panel on Climate Change (IPCC) was going to be more comprehensive than the 2001 Third Assessment Report (TAR), I was surprised to see the extent and the breakdown:

Sector – Studies assessed in TAR – Studies assessed in 4AR
Cryosphere: 23 – 59
Hydrology and water resources: 23 – 49
Coastal processes and zones: 4 – 56
Aquatic biological systems: 14 – 117
Terrestrial biological systems: 46 – 178
Agriculture and forestry: 5 – 49
Human health: 5 – 51
Disasters and hazards: 3 – 18

Total: 95 – 577

While it is simplistic to equate the number of studies examined with the overall quality of the conclusions drawn, the large increase certainly reflects the amount of research being devoted to climate change issues, as well as the level of resources deemed appropriate for examining that body of scientific work.

These figures come from Cynthia Rosenzweig, a research scientist at NASA and member of the IPCC’s second working group.

Materials science and transgenic animals

Oil spill analysis equipment

One of the most interesting ongoing developments in materials science involves the borrowing of biologically originated materials and processes. This is old news for people who follow science closely, but seems worth mentioning to others.

In the first instance, there is the copying of chemical tricks that exist in nature. People have speculated about copying the wall-sticking abilities of gecko feet, for instance. By artificially producing structures similar to those on the feet, excellent non-chemical adhesives could be made. Gecko feet are sufficiently adhesive to hold several hundred times the weight of the animal. Furthermore, they can be attached and detached at will by altering the geometry of the setae that produce the adhesion through van der Waals forces.
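As a rough illustration of how such tiny forces add up, here is some back-of-envelope arithmetic. The setae count and per-seta force are approximate figures from the adhesion literature, not from anything above, and the per-seta value is a conservative fraction of the measured peak:

```python
# Back-of-envelope estimate of gecko adhesion (all inputs are rough
# illustrative assumptions, not measurements from this post).

SETAE_PER_GECKO = 6.5e6   # approximate setae across a Tokay gecko's feet
FORCE_PER_SETA = 20e-6    # newtons; conservative fraction of measured peak
GECKO_MASS = 0.05         # kilograms
G = 9.81                  # m/s^2

total_adhesion = SETAE_PER_GECKO * FORCE_PER_SETA  # newtons
weight = GECKO_MASS * G

print(f"Total adhesive force: {total_adhesion:.0f} N")
print(f"Body weight: {weight:.2f} N")
print(f"Ratio: {total_adhesion / weight:.0f} times body weight")
```

Even with these conservative inputs, the ratio comes out in the hundreds, consistent with the "several hundred times" figure.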

In the second instance, people have been exploiting biological processes to produce existing things in more effective ways. A favourite way to do this is through pharming, in which new genes are introduced into species in order to turn them into pharmaceutical factories. For instance, goats have been genetically engineered to produce an anti-clotting drug in their milk, which can then be extracted, purified, and used by humans. The drug, called ATryn, treats hereditary antithrombin deficiency: a condition that makes people especially vulnerable to deep-vein thrombosis. The principal benefits of using goats are financial, as described in The Economist:

Female goats are ideal transgenic “biofactories”, GTC claims, because they are cheap, easy to look after and can produce as much as a kilogram of human protein per year. All told, Dr Cox reckons the barn, feed, milking station and other investments required to make proteins using transgenic goats cost less than $10m—around 5% of the cost of a conventional protein-making facility. GTC estimates that it may be able to produce drugs for as little as $1-2 per gram, compared with around $150 using conventional methods.

Transgenic goats are also being used to produce spider silk on an industrial scale. That super-strong material could be used in everything from aircraft to bullet-proof vests. Different varieties of spider silk could be used to produce materials with varying strengths and elasticities.

While the former practice seems fairly unproblematic (we have been copying from nature for eons), the latter does raise some ethical issues. Certainly, it involves treating animals as a means to greater ends – though that is also an ancient activity. People have generally been more concerned about the dangers to people and the natural world from such techniques: will the drugs or materials produced be safe? Will the transgenic animals escape and breed with wild populations? These are reasonable concerns that extend well beyond the genetic or materials expertise possessed by the scientists in question.

The potential of such techniques is undeniably considerable. One can simply hope that a combination of regulation and good judgment will avoid nightmare situations of the kind described in Oryx and Crake. So far, our genetically modified creatures tend to be inferior to their natural competitors. According to Alan Weisman, virtually all of our crops and livestock would be eliminated by predation and competition in a few years, in the absence of human care and protection. It remains to be seen whether the same will be true of plants and animals that currently exist only in the imaginations of geneticists.

Cleaner coal

Coal is a witches’ brew of chemicals including hydrocarbons, sulphur, and other elements and molecules. Burning it is a dirty business, producing toxic and carcinogenic emissions including arsenic, selenium, cyanide, nitrogen oxides, particulate matter, and volatile organic compounds. Coal plants also produce large amounts of carbon dioxide, thus contributing to climate change. That said, some coal plant designs can reduce both toxic and climatically relevant emissions to a considerable extent. Given concerns about energy security – coupled with the vast coal reserves in the United States, United Kingdom, China, and elsewhere – giving some serious thought to cleaner coal technology is sensible.

Integrated Gasification Combined Cycle (IGCC) plants are the best existing option for a number of reasons. Rather than burning coal directly, they use heat to convert it into syngas, which is then burned. Such plants can also produce syngas from heavy petroleum residues (think of the oil sands) or biomass. One advantage of this approach is that it simplifies the use of carbon capture and storage (CCS) technologies, which seek to bury carbon emissions in stable geological formations. This is because the carbon can be removed from the syngas prior to combustion, rather than having to be separated from hot flue gases before they go out the smokestack.

The problems with IGCC include a higher cost (perhaps $3,593 per kilowatt, compared with less than $1,290 for conventional coal) and lower reliability than simpler designs (this diagram reveals the complexity of IGCC systems). In the absence of effective carbon sequestration, such plants will also continue to emit very high levels of greenhouse gasses. If carbon pricing policies emerge in states that make extensive use of coal for energy, both of these problems may be reduced to some extent. In the first place, having to pay for carbon emissions would reduce the relative cost of lower-emissions technologies. In the second place, such pricing would induce the development and deployment of CCS.
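To see how a carbon price changes the comparison, here is a very rough sketch using the capital costs above. The emissions intensities, capacity factor, and amortization period are my own assumptions, and fuel and operating costs are omitted entirely, so the specific dollar figures should not be taken seriously – only the direction of the effect:

```python
# Rough sketch: how a carbon price shifts the cost comparison between
# conventional coal and IGCC with CCS. Capital costs ($/kW) are from the
# text; everything else is an illustrative assumption, and fuel and
# operating costs are ignored.

HOURS_PER_YEAR = 8766
CAPACITY_FACTOR = 0.85       # assumed
AMORTIZATION_YEARS = 30      # simple straight-line, no discounting

def capital_cost_per_mwh(cost_per_kw):
    # Lifetime MWh produced per kW of capacity
    mwh_per_kw = CAPACITY_FACTOR * HOURS_PER_YEAR * AMORTIZATION_YEARS / 1000
    return cost_per_kw / mwh_per_kw

def total_cost_per_mwh(cost_per_kw, tco2_per_mwh, carbon_price):
    return capital_cost_per_mwh(cost_per_kw) + tco2_per_mwh * carbon_price

for price in (0, 50, 100):  # $ per tonne CO2
    conventional = total_cost_per_mwh(1290, 0.9, price)  # ~0.9 tCO2/MWh assumed
    igcc_ccs = total_cost_per_mwh(3593, 0.1, price)      # ~0.1 tCO2/MWh assumed
    print(f"${price}/t: conventional ${conventional:.0f}/MWh, "
          f"IGCC with CCS ${igcc_ccs:.0f}/MWh")
```

Under these assumptions the ordering flips well before the carbon price reaches $100 per tonne: the cheaper-to-build plant becomes the more expensive one to run.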

One way or another, it will eventually be necessary to leave virtually all of the carbon that is currently trapped in coal in the ground, rather than letting it accumulate in the atmosphere. Whether that is done by leaving the coal itself underground or simply returning the carbon once the energy has been extracted is not necessarily a matter of huge environmental importance (though coal mining is a hazardous business that produces lots of contamination). That said, CCS remains a somewhat speculative and unproven technology. ‘Clean coal’ advocates will be on much stronger ground if a single electricity-generating, economically viable, carbon-sequestering power plant can be constructed.

Poison-absorbing plants

A recent article in Scientific American describes the use of transgenic plants to remove toxins from contaminated sites. The plants have genes for metabolizing toxins and carcinogens (for instance, using the enzyme cytochrome P450-3A) inserted into their DNA. The technique has been tested with plants intended to address trichloroethylene, chloroform, carbon tetrachloride, vinyl chloride, and benzene contamination. Such plants have also shown promise in removing residual concentrations of the explosive RDX from soil in test ranges. At present, there is sometimes no choice but to scoop up huge amounts of contaminated soil and put it into landfills; plants able to separate the toxins from the soil could facilitate the process, as well as reduce costs.

The article is not entirely clear on whether the plants simply absorb the chemicals, becoming contaminated by them in turn, or whether they actually break them down. In the former case, they might be useful for concentrating air, water, and soil contaminants into plant matter that can then be disposed of as hazardous waste. In the latter case, they could perform remediation without the need for such careful treatment of their remains. Another question is how the plants would deal with combinations of chemicals, such as might be found in actual contaminated sites.

All told, it seems a promising potential use for biotechnology. The world is certainly well saturated with contaminated sites, and having more cost-effective means of reclaiming them could be a boon to both nature and human health. It remains to be seen whether these limited trials can be scaled up and made cost-effective for commercial or governmental use.

Soggy runways

While greenhouse gasses from air travel represent only a relatively small fraction of total emissions now, they are growing rapidly. That said, one largely unanticipated check against their long-term rise may exist, if the potential sea level rise effects of the disintegration of the Greenland or West Antarctic ice sheets become manifest.

This clever Google Maps mashup will show you what I mean:

Adding 7m to global sea levels (consistent with the melting of all of Greenland, all of West Antarctica, or half of each) would definitely drown a lot of runways. The Tokyo and London airports seem likely to be high and dry, though the cities themselves would be far from it.

Five gigatonne globe

∑ 5 Gt CO2e

[Update: 22 January 2009] Some of the information in the post below is inaccurate. Namely, it implies that some level of continuous emissions is compatible with climate stabilization. In fact, stabilizing the climate requires humanity to have zero net emissions in the long term. For more about this, see this post.

As discussed before, credible estimates of the carbon absorbing capacity of the biosphere are around five billion tonnes (five trillion kilograms) per year of carbon dioxide equivalent.

The graphic above is probably far too nerdy to have popular appeal – and it is possible the numerical figure will need revision – but it does strike me as a concise expression of what needs to be tackled.
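For scale, dividing that capacity by world population gives a per-person ‘sustainable’ budget. The population figure below is my own approximation for the period, not from the post:

```python
# Per-capita share of the biosphere's rough absorptive capacity.
# Capacity (5 Gt CO2e/year) is from the post; population is an
# approximate late-2000s figure, assumed for illustration.

CAPACITY_TONNES = 5e9   # tonnes CO2e absorbed per year
POPULATION = 6.7e9      # people (approximate)

per_capita_kg = CAPACITY_TONNES * 1000 / POPULATION
print(f"Sustainable share: about {per_capita_kg:.0f} kg CO2e per person per year")
```

That works out to well under a tonne per person per year – a small fraction of typical emissions in rich countries.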

Knowledge brokers get the Nobel

Meaghan Beattie and Milan Ilnyckyj

The hot news today is that the Intergovernmental Panel on Climate Change and Al Gore (though not Sheila Watt-Cloutier) have been awarded the Nobel Peace Prize. While some have questioned the appropriateness of awarding the prize on the basis of achievements not directly related to armed conflict, it does seem that the conflict potential connected with migration, water scarcity, and so forth makes this less of a stretch than some previous awards.

What is most notable about all this, for me, is that neither Gore nor the IPCC has actually contributed original research to climatic science. The IPCC exists to review the published academic literature on climatic science and agree upon a consensus position; Gore has acted as an effective advocate and representative, though his overall contribution has been far more in the area of information transmission than information generation.

What this shows is how vitally important the layer between scientists and policy-makers or the general public is. Scientists are looking (with great skill and detail) at the individual elements that make up the climatic system. Translating that into a comprehensive understanding of relationships and risks – of the sort that can guide policy development – is critical and challenging. As such, these Nobel prizes are well earned.

The World Without Us

Around the globe, every natural system is being affected by human behaviour: from the composition of deep oceanic sediments to mountaintop glaciers. As such, the concept behind Alan Weisman’s extraordinary book The World Without Us is both ambitious and illuminating. Using a combination of research, expert consultation, and imagination, he projects what would happen to the Earth if all 6.7 billion human inhabitants suddenly vanished. Within weeks and months, nuclear power plants would melt down; the massive petroleum refinery and chemical production complexes would burn, corrode, and explode; and nature would begin the slow process of reclaiming everything. Over the course of decades and centuries, the composition of all ecosystems would change as farmland is retaken and once-isolated patches of wildlife become reconnected. Cities would fall apart as bridges stretch and compress with the seasons and foundations fail on account of flooding. In the end, only bronze sculpture and ceramics seem likely to endure until our red giant sun singes or engulfs the planet in about five billion years. More broadly, there is reason to hope that radio waves and some interstellar space probes will endure for billions of years.

Weisman uses his central idea as a platform from which to explore everything from material science to palaeontology and ecology. The book is packed with fascinating tidbits of information – a number of which have been shamelessly plagiarized in recent entries on this blog. A few examples of especially interesting topics discussed are the former megafauna of North America, human evolution and migration, coral reef ecology, lots of organic chemistry, and the history of the Panama Canal.

In the end, Weisman concludes that the human impact upon the world is intimately linked with population size and ultimately determines our ability to endure as a species. As such, he concludes with the concise suggestion that limiting human reproduction to one child per woman would cut human numbers to 3.43 billion by 2050 and 1.6 billion by 2100. That might give us a chance to actually understand how the world works – and how human activity affects it – before we risk being overwhelmed by the half-glimpsed or entirely surprising consequences of our energetic cleverness.

Whether you accept Weisman’s prescription or not, this book seems certain to deepen your thinking about the nature of our world and our place within it. So rarely these days do I have time to re-read things. Nevertheless, I am confident that I will pick up this volume again at some point. Readers of this blog would be well rewarded for doing likewise.

[4 November 2007] I remain impressed by what Weisman wrote about the durability of bronze. If I ever have a gravestone or other monument, I want the written portion to be cast in bronze. Such a thing would far, far outlast marble or even steel.

Hot Air

Meaghan Beattie and Tristan Laing

Hot Air: Meeting Canada’s Climate Change Challenge is a concise and virtually up-to-the-minute examination of Canadian climate change policy: past, present, and future. Jeffrey Simpson, Mark Jaccard, and Nic Rivers do a good job of laying out the technical and political issues involved and, while one cannot help taking issue with some aspects of their analysis, this book is definitely a good place to start, when seeking to evaluate Canada’s climate options.

Emission pathways

Hot Air presents two possible emissions pathways: an aggressive scenario that cuts Canadian emissions from 750 Mt of CO2 equivalent in 2005 to about 400 Mt in 2050, and a less aggressive scenario that cuts them to about 600 Mt. For the sake of contrast, Canada’s Kyoto commitment (about which the authors are highly critical) is to cut Canadian emissions to 6% below 1990 levels by 2012, which would mean emissions of 563 Mt five years from now. The present government has promised to cut emissions to 20% below 2006 levels by 2020 (600 Mt) and by 60 to 70% by 2050 (225 to 300 Mt). George Monbiot’s extremely ambitious plan calls for a 90% reduction in greenhouse gas emissions by 2030 (75 Mt for Canada, though he is primarily writing about Britain).
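These targets can be reproduced from their baselines. The 750 Mt figure for 2005 is from the paragraph above; the 1990 baseline of roughly 599 Mt is inferred here from the stated Kyoto figure, so treat it as an approximation:

```python
# Reproducing the Canadian emissions targets above from their baselines.
# BASELINE_1990 is inferred from the stated Kyoto target, not a cited figure.

BASELINE_2005 = 750   # Mt CO2e (from the text)
BASELINE_1990 = 599   # Mt CO2e (approximate, inferred)

kyoto = BASELINE_1990 * (1 - 0.06)      # 6% below 1990 levels by 2012
gov_2020 = BASELINE_2005 * (1 - 0.20)   # 20% below ~2006 levels by 2020
gov_2050 = (BASELINE_2005 * 0.30, BASELINE_2005 * 0.40)  # 60-70% cuts by 2050
monbiot = BASELINE_2005 * (1 - 0.90)    # 90% cut by 2030

print(f"Kyoto target: {kyoto:.0f} Mt")                          # ~563 Mt
print(f"2020 target: {gov_2020:.0f} Mt")                        # 600 Mt
print(f"2050 range: {gov_2050[0]:.0f}-{gov_2050[1]:.0f} Mt")    # 225-300 Mt
print(f"Monbiot 2030: {monbiot:.0f} Mt")                        # 75 Mt
```

All four stated targets fall out of simple percentage cuts against one of the two baselines, which makes it easier to compare plans that quote different reference years.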

While Monbiot’s plan aims to reach stabilization by 2030, a much more conventional target date is around 2100. It is as though the book presents a five-decade plan to slow the rate at which water is leaking into the boat (greenhouse gasses accumulating in the atmosphere), but doesn’t actually specify how to plug the hole before the boat sinks (greenhouse gas concentrations overwhelm the ability of human and natural systems to adapt). While having the hole half-plugged at a set date is a big improvement, a plan that focuses only on that phase seems to lack an ultimate purpose. While Hot Air does not continue its projections that far into the future, it is plausible that the extension of the policies therein for a further 50 years would achieve that outcome, though at an unknown stabilization concentration. (See this prior discussion)

Policy prescriptions

Simpson, Jaccard, and Rivers envision the largest reductions being achieved through fuel switching (for instance, from coal to natural gas) and carbon capture and storage. Together, these account for well over 80% of the anticipated reductions in both scenarios, with energy efficiency improvements, agricultural changes, waste treatment changes, and other efforts making up the difference. As policy mechanisms, the authors support carbon pricing (through either a cap-and-trade scheme or the establishment of a carbon tax) as well as command-and-control measures including tightened mandatory efficiency standards for vehicles, renewable portfolio standards (requiring a larger proportion of energy to be renewable), carbon management standards (requiring a larger proportion of CO2 to be sequestered), and tougher building standards. They stress that information and subsidy programs are inadequate to create significant reductions in emissions. Instead, they explain that an eventual carbon price of $100 to $150 a tonne will make “zero-emissions technologies… frequently the most economic option for business and consumers.” This price would be reached by means of a gradual rise ($20 in 2015 and $60 in 2020), encouraging medium and long-term investment in low carbon technologies and capital.

Just 250 pages long, with very few references, Hot Air takes a decidedly journalistic approach. It is very optimistic about the viability and affordability of carbon capture and storage, as well as about the transition to zero emission automobiles. Air travel is completely ignored, while the potential of improved urban planning and public transportation is rather harshly derided. The plan described doesn’t extend beyond 2050 and doesn’t reach a level of Canadian emissions consistent with global stabilization of greenhouse gas concentrations (though it would put Canada on a good footing to achieve that by 2100). While the book’s overall level of detail may not satisfy the requirements of those who want extensive technical and scientific analysis, it is likely to serve admirably as an introduction for those bewildered by the whole ecosystem of past and present plans and concerned with understanding the future course of policy.

Geologic time

Autumn leaves

While the Earth is about 4.54 billion years old, all of human civilization has been compressed into a single geological epoch: the Holocene. This has been ongoing for about 11,500 years, predating the first Mesopotamian civilizations for which we have any evidence. Prior to the Holocene was the Pleistocene, which ended with the Younger Dryas cold spell. Actually, the Holocene exists more as a demarcation for the period of geologic time that has included human civilization than as an epoch with an independent definition.

Our best ice core samples extend back 650,000 years: about a third of the way into the Pleistocene, but just a tiny foray into geologic time. Pollen from Lake Tanganyika might take us through the Pliocene (Greek for ‘more new’) and into the Miocene (‘less new’). Perhaps some yet-unanticipated data source will be able to take us further still.
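The ‘about a third’ figure can be checked against the epoch boundaries in use at the time of writing, when the start of the Pleistocene was generally put at roughly 1.8 million years ago (the boundary has since been pushed back to about 2.588 million years):

```python
# How far into the Pleistocene do 650,000-year ice cores reach?
# The Pleistocene start date is the older ~1.8 Mya convention,
# assumed for consistency with the text.

ICE_CORE_YEARS = 650_000
PLEISTOCENE_START = 1_800_000   # years before present (older convention)

fraction = ICE_CORE_YEARS / PLEISTOCENE_START
print(f"Ice cores cover about {fraction:.0%} of the Pleistocene")
```

Using the newer boundary, the same cores cover only about a quarter of the epoch, which shows how much these fractions depend on where the lines are drawn.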

It is amazing what scientists are able to determine from inference and the meticulous collection of data: from the age of the universe to the evolutionary history of the planet.