Climate ethics principles

Building in Old Montreal

Last November, the United Nations Framework Convention on Climate Change convened a meeting in Nairobi. One document that resulted from that meeting was the White Paper on the Ethical Dimensions of Climate Change (PDF). On the basis of arguments similar to those I have heard from Henry Shue and Stephen Gardiner, the document lists seven things that states intending to behave ethically on the climate change problem should do:

  1. Immediately acknowledge that they have a duty to reduce their emissions as quickly as possible to their fair share of safe global emissions;
  2. Immediately agree that an international greenhouse gas atmospheric stabilization target should be set as low as possible unless those who are most vulnerable to climate change impacts have consented to be put at risk from higher levels;
  3. No longer use scientific uncertainty or cost to their economies alone as justification for refusing to reduce GHG emissions;
  4. No longer refuse to reduce GHG emissions now on the basis that new less-costly technologies will be available in the future or that not all other nations have agreed to reduce their GHG emissions;
  5. Accept national targets for assuring that atmospheric concentrations of GHG are protective of human health and the environment that are based upon ethically supportable allocation criteria;
  6. Acknowledge that nations that refuse to reduce their GHG emissions to their fair share of safe global emissions are committing human rights violations, by failing to protect those most vulnerable to climate change from loss of life, health, and well-being;
  7. Accept that those who are responsible for climate change have a duty to pay for costs of adaptation to and unavoidable damages from climate change.

Generally speaking, these are fine principles. If every major emitting state adopted them, it is entirely plausible that emissions could be brought down to sustainable levels within the next couple of decades and that the inevitable consequences of climate change from past emissions could be equitably addressed. What the list fails to consider is the inherent Nash equilibrium problem. States do not act as all states would act in an ideal world; rather, they generally act in a way that is rational given their inability to control the actions of other states. Given the Stern conclusion that mitigation is far less expensive than inducing and enduring climate change, it would be in the interest of all states to mitigate. Given how many states are proving reluctant to take that seriously, states that are serious about tackling the problem find themselves pushed towards a rationality of building up adaptive capacity instead of reducing emissions, as the sketch below illustrates.
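
To make the equilibrium problem concrete, here is a minimal sketch of the two-state mitigation game as a prisoner’s dilemma. The payoff values are illustrative assumptions of mine, not figures from the white paper or the Stern Review; the point is only that mutual pollution can be the rational outcome even when mutual mitigation is better for everyone.

```python
from itertools import product

# Illustrative payoffs (higher is better) for two states choosing to
# Mitigate or Pollute. Mutual mitigation beats mutual pollution (the
# Stern point), but each state does better by polluting unilaterally.
PAYOFFS = {
    ("Mitigate", "Mitigate"): (3, 3),
    ("Mitigate", "Pollute"): (0, 4),
    ("Pollute", "Mitigate"): (4, 0),
    ("Pollute", "Pollute"): (1, 1),
}
STRATEGIES = ("Mitigate", "Pollute")

def is_nash(a, b):
    """A strategy pair is a Nash equilibrium if neither state can gain
    by unilaterally switching its own strategy."""
    pa, pb = PAYOFFS[(a, b)]
    a_ok = all(PAYOFFS[(alt, b)][0] <= pa for alt in STRATEGIES)
    b_ok = all(PAYOFFS[(a, alt)][1] <= pb for alt in STRATEGIES)
    return a_ok and b_ok

for a, b in product(STRATEGIES, STRATEGIES):
    if is_nash(a, b):
        print(f"Nash equilibrium: {a} / {b}")  # prints: Pollute / Pollute
```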

All that said, ethicists are not meant to be pragmatists. Having a well-argued idea of what ethical behaviour in the face of climate change would be provides a cognitive platform from which to evaluate current actions. It may also help to raise the overall profile of the issue in democratic states where moral and ethical argumentation can be an important element of the political process.

Small island states under threat

Milan Ilnyckyj in helmet and sunglasses

What can really be said about climate change and small island states? Working Group I of the IPCC projects that global sea levels will rise by between 0.18 and 0.59 m by 2100, not taking into account accelerated melting of the Greenland and Antarctic ice sheets. With those elements factored in, a sea level rise of 1 m certainly seems possible, and it becomes conceivable that rises of several metres will occur if either of those ice sheets goes the way of the polar icecap.

So what happens to the really low-lying states like the Maldives? The combination of coastal erosion, sea level rise, increased vulnerability to storm surges, and contamination of freshwater aquifers may well make them simply non-viable as places that can support a population. Nauru, Vanuatu, and Tuvalu face the same vulnerabilities – just to choose a few from among many examples.

A number of more substantial islands could be seriously threatened by the aquifer issue. Malta is suffering a double effect: rising sea levels threaten its freshwater aquifers, while decreased rainfall further increases their salinity. By 2007, the salinity of the aquifer water (measured as conductivity) had doubled from 2,000 to 4,000 microsiemens, making it too salty to water trees with. Fossil fuel based desalinators are being installed to help address water shortages, though they will increase Maltese GHG emissions.

All told, there isn’t much hopeful that can be said about low-lying areas. Like the Arctic, these areas will certainly experience significant effects from climate change. The questions that remain are how serious and how sudden those effects will be.

Increasingly clever machines

It seems my mountain climbing, robot-building friend Mark has a relatively new blog. He works with autonomous robots of the kind that competed in the recent DARPA Urban Challenge.

Here is one way in which such robots see the world: as a set of laser-determined ranges.
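
For a rough sense of what that means, here is a minimal sketch (my own toy example, not Mark’s code) that converts a planar scan of laser ranges into obstacle coordinates a robot can plan around:

```python
import math

def scan_to_points(ranges, fov_degrees=180.0):
    """Convert a planar laser scan (a list of distances in metres,
    evenly spaced across the field of view) into (x, y) obstacle
    coordinates in the robot's own frame of reference."""
    points = []
    n = len(ranges)
    for i, r in enumerate(ranges):
        # Sweep the beam from -fov/2 to +fov/2 across the scan.
        angle = math.radians(-fov_degrees / 2 + i * fov_degrees / (n - 1))
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A toy five-beam scan: a wall roughly two metres ahead of the robot.
print(scan_to_points([2.8, 2.2, 2.0, 2.2, 2.8]))
```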


Death, drugs, and rock and roll

A recent study in the Journal of Epidemiology and Community Health confirms the hazards of musical stardom. The study examined the lives of 1,064 successful musicians in the rock, punk, rap, R&B, electronica, and new age genres. All became famous between 1956 and 1999 and all had records that were included in a ‘Top 1000 records of all time’ list from 2000.

It found that the median age of death for North American celebrities was an unimpressive 41.78 years. Europeans do even worse, at just 35.18. All told, successful musicians are nearly twice as likely to die early as members of the general population.

The regional breakdown by cause of death is also interesting:

Cause – % in the US – % in Europe
Suicide – 2.8% – 3.6%
Drug or alcohol overdose – 15.3% – 28.6%
Chronic drug or alcohol disorder – 9.7% – 3.6%
Drug or alcohol related accident – 2.8% – 7.1%
Cancer – 19.4% – 21.4%
Heart disease – 18.0% – 3.6%
Accidents – 13.9% – 21.4%
Violence – 6.9% – 3.6%
Other – 11.1% – 7.1%
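
A quick way to see where the two profiles diverge is to compute the transatlantic gap for each cause, using the figures in the table above:

```python
# Cause-of-death shares from the table above (percent: US, Europe).
causes = {
    "Suicide": (2.8, 3.6),
    "Drug or alcohol overdose": (15.3, 28.6),
    "Chronic drug or alcohol disorder": (9.7, 3.6),
    "Drug or alcohol related accident": (2.8, 7.1),
    "Cancer": (19.4, 21.4),
    "Heart disease": (18.0, 3.6),
    "Accidents": (13.9, 21.4),
    "Violence": (6.9, 3.6),
    "Other": (11.1, 7.1),
}

# Sort causes by the size of the US-Europe gap, largest first.
for cause, (us, eu) in sorted(causes.items(),
                              key=lambda kv: -abs(kv[1][0] - kv[1][1])):
    print(f"{cause}: {abs(us - eu):.1f} point gap "
          f"(US {us}%, Europe {eu}%)")
```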

The largest discrepancies are in deaths from heart disease (far more common among North American stars) and from drug or alcohol overdose (far more common among European ones), but lots of other significant differences exist. Neither regional profile suggests that music is a healthy profession, at least for those at the top.

Source: Bellis, M.A. et al. “Elvis to Eminem: quantifying the price of fame through early mortality of European and North American rock and pop stars.” Journal of Epidemiology and Community Health 61 (2007): 896–901.

Today’s best biofuel: Brazilian ethanol

Montreal graffiti

Many people see biofuels as a promising replacement for oil in transportation applications. Indeed, being able to replace the oil that contributes to climate change and must often be imported from nasty regimes with carbon-neutral fuels from domestic crops has a great deal of intuitive appeal. For this process to be worthwhile, however, there is a need to consider both life-cycle energy usage and net carbon emissions.

A study conducted in 2004 by Isaias de Carvalho Macedo at the University of Campinas in Brazil focused on the production of ethanol from Brazilian sugarcane. This is considered by the majority of commentators to be the most energy-efficient source of biofuel currently available, because most Brazilian sugarcane requires no irrigation and must only be ploughed up and replanted once every five years. The Macedo study found that producing a tonne of sugarcane requires 250,000 kilojoules of energy, a figure covering tractors, fertilizers, and the other inputs of modern mechanized farming. The ethanol from one tonne of sugarcane contains 2,000,000 kilojoules of energy. Furthermore, the mills that produce it burn bagasse (the pulp left over when sugarcane has the sugar squeezed out) and can contribute net electricity to the grid. Corn ethanol (the kind being heavily subsidized in the United States) takes about as much energy to grow as is ultimately contained in the fuel.
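
Put as arithmetic, those figures imply an energy return of roughly eight to one. A trivial sketch using the numbers above:

```python
# Energy figures from the Macedo study, per tonne of sugarcane (kilojoules).
energy_in = 250_000     # tractors, fertilizer, and other farming inputs
energy_out = 2_000_000  # energy content of the resulting ethanol

print(f"Energy return: {energy_out / energy_in:.0f}:1")  # about 8:1
# Corn ethanol, by contrast, is close to 1:1 on this measure.
```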

In terms of net carbon emissions, cane ethanol is also fairly good. Using ethanol instead of the amount of gasoline with the same energy content produces about 220.5 fewer kilograms of carbon dioxide per 1,000 litres of ethanol, when all aspects of production and usage are considered. Burning one litre of gasoline produces about 640 grams of carbon dioxide. Since ethanol has about 25% less energy than gasoline, 1,000 litres of ethanol corresponds to roughly 750 litres of gasoline: the gasoline would emit about 480 kilos of carbon dioxide, while the ethanol would emit 259.5 kilos.
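
Here is the same comparison as a calculation, using the post’s own figures (which, as the update below notes, are disputed):

```python
# Figures from the paragraph above (disputed; see the update at the
# end of this post).
co2_per_litre_gasoline = 0.64  # kg of CO2 from burning one litre
ethanol_volume = 1000.0        # litres of ethanol
gasoline_equivalent = 750.0    # litres of gasoline with the same energy
ethanol_emissions = 259.5      # kg CO2, over all production and use

gasoline_emissions = gasoline_equivalent * co2_per_litre_gasoline  # 480.0
print(f"Gasoline: {gasoline_emissions:.1f} kg CO2")
print(f"Ethanol:  {ethanol_emissions:.1f} kg CO2")
print(f"Saving:   {gasoline_emissions - ethanol_emissions:.1f} kg CO2")  # 220.5
```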

This is an improvement over the direct use of fossil fuels, but not a massive one. The Macedo study concludes that widespread ethanol use reduces Brazilian emissions by 25.8 million tonnes of carbon dioxide equivalent per year. Brazil’s total carbon emissions from fossil fuels are about 92 million tonnes per year – a figure that increases substantially if deforestation is included.

The conclusion to be drawn from all of this is that ethanol – even when produced in the most efficient way – is not a long-term solution. Producing 259.5 kilos of carbon dioxide is more sustainable than producing 480, but it isn’t an adequate reduction in a world that has to cut from about 27 gigatonnes of carbon dioxide equivalent per year to five. Bioethanol may become more viable with the development of cellulosic technology (a subject for another post), but is certainly no panacea at this time.

References: Macedo, I.C., Leal, M.R.L.V., and da Silva, J.E.A.R. “Assessment of greenhouse gas emissions in the production and use of fuel ethanol in Brazil.” Government of the State of São Paulo, 2004.

[Update: 8:54am] The above numbers on the carbon dioxide emissions produced by burning a litre of gasoline are disputed. If someone has an authoritative source on the matter, please pipe up.

Carbon pricing and GHG stabilization

Montreal graffiti

Virtually everyone acknowledges that the best way to reduce greenhouse gas emissions is to create a price for their production that someone has to pay. It doesn’t matter, in theory, whether that is the final consumer (the person who buys the iPod manufactured and shipped across the world), the manufacturer, or the companies that produced the raw materials. Wherever in the chain the cost is imposed, it will be addressed through the economic system just like any other cost. When the price of one input rises, people generally switch to substitutes or cut back on usage.

This all makes good sense for the transition from a world where carbon has no price at all and the atmosphere is treated as a greenhouse gas trash heap. What might become problematic is the economics of the situation when greenhouse gas emissions start to approach the point of stabilization. If global emissions must be cut to about five gigatonnes of carbon dioxide equivalent, a global population of 11 billion would get about half a tonne each.

Consider two things. Right now, Canadian emissions per person are about 24.3 tonnes of CO2 equivalent, so cutting to about 0.5 is a major change. While it may be possible to cut a large amount for a low price (carbon taxes or permits at up to $150 a tonne have been discussed), it makes sense that people will be willing to pay ever more to avoid each marginal decrease in their carbon budget. Moving from 24.3 tonnes to 20 might mean carrying out some efficiency improvements. Moving from 20 to 10 might require a re-jigging of the national energy and transportation infrastructures, carbon sequestration, and other techniques. Moving from 10 to 0.5 will almost inevitably require considerable personal sacrifice. It certainly rules out air travel.
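
As a toy illustration of that rising marginal cost, here is a sketch with an invented convex cost curve; the functional form and dollar values are assumptions of mine, chosen only to show the shape of the problem:

```python
def marginal_cost_per_tonne(remaining_budget_tonnes):
    """Illustrative convex cost curve: the price someone will pay to
    avoid losing one more tonne of their annual carbon budget rises
    steeply as that budget shrinks toward zero. The 5000 constant and
    the inverse-square form are arbitrary toy choices."""
    return 5000.0 / remaining_budget_tonnes ** 2

for budget in (24.3, 20.0, 10.0, 2.0, 0.5):
    print(f"At {budget:>4} t/person, avoiding the next tonne of cuts "
          f"is worth ~${marginal_cost_per_tonne(budget):,.0f}/t")
```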

The next factor to consider is the effect of economic inequality on all this. We can imagine many kinds of tax and trading systems. Some might be confined to individual states, and others to regions. It is possible that such a scheme would eventually be global. With a global scheme, however, you need to consider the willingness of the relatively affluent to pay thousands or tens of thousands of dollars to maintain elements of their carbon-intensive lifestyles. This could mean that people of lesser means get squeezed even more aggressively. It could also create an intractable problem of fraud. A global system that transfers thousands of dollars on the basis of largely unmeasured changes in lifestyle could be a very challenging thing to authenticate.

These kinds of problems lie in the relatively distant future. Moving to a national economy characterized by a meaningful carbon price is likely to take a decade. Moving to a world of integrated carbon trading may take even longer. All that admitted, the problems of increasing marginal value of carbon and the importance of economic inequality are elements that those pondering such pricing schemes should begin to contemplate.

Index of climate posts

Fruit bar

For the last while, my aim on this blog has been both to entertain readers and to provide some discussion of all important aspects of the climate change problem. To facilitate the latter aim, I have established an index of posts on major climate change issues. Registered users of my blog can help to update it. Alternatively, people can use comments here to suggest sections that should be added or other changes.

The index currently contains all posts since I arrived in Ottawa. I should soon expand it to cover the entire span for which this blog has existed.

Problems with fusion ITER means to solve

Building in Old Montreal

The fundamental problem with nuclear fusion as a mode of energy production is establishing a system that produces more power than it consumes. Heating and containing large volumes of deuterium-tritium plasma is an energy-intensive business. As such, the sheer size of the planned International Thermonuclear Experimental Reactor is a big advantage. Just as it is easier to keep a huge cooler full of drinks cold than to keep a single can that way, a larger volume of plasma has less surface area relative to its total energy. Bigger reactors therefore have a better chance of producing net power.
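
The cooler analogy can be put in numbers. Treating the plasma as a sphere (a crude stand-in for a tokamak torus), surface area per unit volume falls as one over the radius, so bigger machines lose relatively less energy through their surface. A minimal sketch:

```python
import math

def surface_to_volume(radius_m):
    """Surface-to-volume ratio of a sphere, which works out to 3/r:
    doubling the radius halves the relative area through which
    energy can escape."""
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return area / volume

for r in (1.0, 2.0, 6.2):  # 6.2 m is roughly ITER's major radius
    print(f"r = {r} m -> surface/volume = {surface_to_volume(r):.2f} per metre")
```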

The other big problems that scientists and engineers anticipate are as follows:

  1. No previous reactor has sustained fusion for very long. The JT-60 reactor in Japan holds the record, at 24 seconds. Because ITER is meant to operate for between seven and fifteen minutes, it will produce a larger volume of very hot helium (the product of deuterium-tritium fusion). That helium could interfere with the fusing plasma. As such, it needs to be removed from the reactor somehow. ITER plans to use a carbon-coated structure called a diverter, at the bottom of the reactor, to try to do this. It is not known how problematic the helium will be, nor how effective the diverter will prove.
  2. Both the diverter and the blanket that surrounds the reactor will need to be able to resist temperatures of 100 million degrees centigrade. They will also need to be able to survive the presence of large amounts of radiation. It is uncertain whether the planned beryllium coatings will be adequate to deal with the latter. Prior to ITER’s construction, there are plans to test the planned materials using a specially built particle accelerator at a new facility, probably to be built in Japan. This test facility could cost about $2.6 billion – one quarter of the total planned cost of ITER itself.
  3. Probably the least significant problem is converting the heat energy from the fusion reaction into electrical power. This is presumably just a matter of running pipes carrying a fluid through the blanket, then using the expansion of that fluid to drive turbines. While this should be relatively straightforward, it is worth noting that ITER will have no capacity to generate electricity, and will thus need to dissipate its planned output of about 500 megawatts of heat by other means.

None of these issues undermine the case for building ITER. Indeed, they are the primary justification for building the facility. If we already knew how to deal with these problems, we could proceed directly to building DEMO: the planned electricity-generating demonstration plant that is intended to be ITER’s successor.

The foolishness of the International Space Station

Montreal courthouse

On Tuesday, the space shuttle launched once again on a mission to add another piece to the International Space Station (ISS). As I have said before, it is a needlessly dangerous, unjustifiably expensive, and rather pointless venture. The science could be equally well done by robots, without risking human lives and without spending about $1.3 billion per launch (plus emitting all the greenhouse gases from the solid rocket boosters and related activities).

More and more, the ISS looks like a hopeless boondoggle. The lifetime cost is being estimated at $130 billion, all to serve a self-fulfilling mandate: we need to put people into space to scientifically assess what happens when we put people into space. Furthermore, the window between the completion of the ISS in about 2012 and the potential abandonment of the station as soon as 2016 is quite narrow. Robert Park may have summed up the whole enterprise best when he remarked that:

“NASA must complete the ISS so it can be dropped into the ocean on schedule in finished form.”

Normally, I am a big supporter of science. I think funding the International Thermonuclear Experimental Reactor and the Large Hadron Collider is wise; these machines will perform valuable scientific research. Likewise, I support the robotic work NASA does – especially the satellites looking down on Earth from orbit and providing valuable research and services. I support the James Webb telescope. I also support the idea that NASA should have some decent plans for dealing with an anticipated asteroid or comet impact. The ISS, by contrast, is a combination of technical fascination lacking strategic purpose and pointless subsidies to aerospace contractors.

Of course, the Bush plan to send people to Mars is an even worse idea with higher costs, more risk, and even less value.