The true price of nuclear power

Maple leaf

Several times this blog has discussed whether climate change is making nuclear power a more acceptable option (1, 2, 3). One element of the debate that bears consideration is the legacy of contamination at sites that form part of the nuclear fuel cycle: from uranium mines to post-reactor fuel processing facilities. The Rocky Flats Plant in the United States is an especially sobering example.

Insiders at the plant started tipping off the FBI about unsafe conditions sometime in 1988. Late that year, the FBI began clandestinely flying light aircraft over the area and noticed that the incinerator was apparently being used late into the night. After several months of collecting evidence, both from workers and by direct measurement, the FBI informed the DOE on June 6, 1989 that they wanted to meet about a potential terrorist threat. When the DOE officials arrived, they were served with papers. Simultaneously, the FBI raided the facilities and ordered everyone out. They found numerous violations of federal anti-pollution laws, including massive contamination of water and soil, though none of the original charges that led to the raid were substantiated.

In 1992, Rockwell was charged with minor environmental crimes and paid an $18.5 million fine.

Accidents and contamination have been a feature of facilities handling nuclear materials worldwide. Of course, this does not suffice to show that nuclear energy is a bad option. Coal mines certainly produce more than their share of industrial accidents and environmental contamination.

The trickiest thing, when it comes to evaluating the viability of nuclear power, is disentangling exactly which government subsidies exist now, have existed in the past, and will exist in the future. These subsidies are both direct (paid straight to operators) and indirect (soft loans for construction, funding for research and development). They also include liability caps that make the nuclear industry responsible for only a set amount of money in the event of a catastrophic accident, as well as the implicit cost that any contamination corporations cannot be legally forced to correct after the fact will either fester or be fixed at taxpayer expense. Plenty of sources claim to offer a comprehensive reckoning of these costs and risks, but the various analyses seem to be both contradictory and self-serving.

Before states make comprehensive plans to embrace or reject nuclear power as a climate change mitigation option, some kind of extensive, comprehensive, and impartial study of the caliber of the Stern Review would be wise.

Jeffersonian trivia

Little known facts:

  1. Former American President Thomas Jefferson was an avid amateur palaeontologist.
  2. In an attempt to mock him, his political opponents gave him the nickname “Mr. Mammoth” during the 1808 election.
  3. He is credited with the discovery of an enormous ground sloth, larger than an elephant, that inhabited North America during the late Pleistocene.
  4. The creature now bears his name: Megalonyx jeffersonii.

These and many other entertaining facts come from the marvellous recent book The World Without Us, which has leapt to the top of my reading pile. I will post a full review when I finish it.

Seafood harm reduction

For those who haven’t taken the plunge into vegetarianism or veganism, but who are concerned about the ecological consequences of fish consumption, there are some good resources online. The Monterey Bay Aquarium has printable pocket-sized seafood guides, highlighting which species are harvested in relatively sustainable ways and which should definitely be avoided. The Blue Ocean Institute also has a number of resources, including a website for looking up species and a guide that can be downloaded.

Species that are particularly threatened (as well as often caught in highly unsustainable ways) include:

  • Bluefin tuna
  • Chilean Sea Bass (this is an industry name for Patagonian Toothfish)
  • Groupers
  • Orange Roughy
  • Atlantic Cod
  • Atlantic Halibut
  • Oreos (the fish, not the cookies)
  • Rockfish
  • Sturgeon Caviar
  • Snappers
  • Atlantic Salmon (note, all Atlantic salmon in the U.S. is farmed)
  • Sharks

While it is inadequate to think about marine conservation in terms of single species, such lists do provide a reasonably accessible way for consumers to scrutinize their actions. In the long run, however, marine resources need to be thought about in terms of whole ecosystems that need to be protected from threats including over-exploitation, toxins, and climatic changes.

Sputnik at 50

Bridge on the Rideau Canal

Even the Google logo has been altered to commemorate the 50th anniversary of the launch of Sputnik 1: the first artificial satellite. As someone who spends a very considerable amount of time thinking about how things are going to be in 2050 and 2100, it is remarkable to reflect upon both how different the world is from that of 1957 and how similar it is. The big changes have often come in areas whose importance few, if any, people would have anticipated back then. Areas of great enthusiasm, such as nuclear power and space exploration, have only progressed incrementally since the 1950s and 60s.

I mentioned one Sputnik-related irony in a paper published back in 2005:

At the end of August, 1955, the Central Committee of the Communist Party approved the Soviet satellite program that would lead to Sputnik and authorized the construction of the Baikonour Cosmodrome. This facility, the largest of the three Soviet launch sites that would eventually be built, was the launching place of Sputnik I (and subsequent Sputniks), and the launch site for all Soviet manned missions…

This former stretch of Kazakhstani desert was also, fatefully, the place to which Nikifor Nikitin was exiled by the Czar in 1830 for “making seditious speeches about flying to the moon.” He might have taken cold comfort in the fact that in 1955, the Central Committee gave control of the site to the new Soviet ‘Permanent Commission for Interplanetary Travel.’

For all the drama, it remains unclear to me that manned spaceflight serves any useful scientific or practical purpose at this point in time (see previous). In that sense, perhaps Sputnik – rather than John Glenn – was the true template for humanity’s future involvement in space: an 83.6 kg ball of metal with a radio transmitter.

PS. My thesis mentions one somewhat surprising connection between Sputnik and climatic science:

A fortuitous bit of funding produced one of the most famous graphs in the climate change literature: the one tracking CO2 concentrations at Mauna Loa in Hawaii. Examining it closely, a gap can be seen in 1957, where David Keeling’s funding for the project ran out. The Soviet launch of Sputnik I on 4 October 1957 led to a marked concern in the United States that American science and technology had fallen behind. One result of the subsequent surge in funding was the resumption of the CO2 recording program, which continues to the present day.

This graph is the jagged, upward-sloping line that Al Gore devotes so much attention to near the beginning of An Inconvenient Truth.

A notable volcanic outburst

Most people probably will not have heard 1816 referred to as the Year Without a Summer, but that is exactly what the eruption of Mount Tambora in what is now Indonesia seems to have made it. That May, frost killed or ruined most of the summer crops. In June, two large snow storms produced substantial numbers of human casualties. Hungary and Italy got red snow, mixed with ash, while China experienced famine associated with sharply reduced rice production. In total, about 92,000 people died and the global mean temperature fell by 3°C.

One random yet positive consequence was the constant rain that led Lord Byron to propose a writing contest, which Mary Shelley eventually won with Frankenstein. The increased cost of oats may also have driven a German man named Karl Drais to invent the first bicycle. (He called it the ‘velocipede,’ which sounds like a fast-moving and dangerous insect.)

Such incidents are inevitable on a planet that remains geologically active, but they certainly demonstrate the degree to which natural patterns can change rapidly, as well as the degree to which human beings are dependent upon them not doing so.

Ice and pollen

Brick and electrical metres

With good reason, ice cores have been getting a lot of attention lately. Their careful analysis gives us priceless insights into the history of Earth’s climate. Using cores from Greenland, we can go back more than 100,000 years, tracking temperature, carbon dioxide concentration, and even solar activity (using beryllium isotopes). Using cores from Antarctica, it is possible to go back about 650,000 years.

Ice cores can be even more valuable when they are matched up against records of other kinds. Living trees and preserved ancient wood can be matched up, year for year, with the ice record. So can pollen deposits at the bottom of seas and lakes: arguably the richest data source of all. By looking at pollen deposits, it is possible to track the development of whole ecosystems: forests advancing and retreating with ice ages, the species mix changing in times of drought, and the unmistakable evidence of human alterations to the environment, going back tens of thousands of years.

Lake Tanganyika, in East Africa, offers an amazing opportunity. At 676 km from end to end, it is the world’s longest lake. It is also the second oldest and second deepest – after Lake Baikal in Siberia. Core samples from Tanganyika have already documented 10,000 years’ worth of pollen deposition. With better equipment and more funding, scientists say that it should be possible to collect data covering the last five to ten million years: increasing the length of our climate records massively.

I am not sure if such an undertaking is already in the works. If not, it seems like the kind of opportunity we would be fools to pass up. If no government or scientific funding body is willing to stump up the cash, perhaps a billionaire or two can be diverted from their tinkering with rockets.

Polar opposites

By now, everybody knows that the Arctic summer sea ice is at a record low. What I only learned recently is that the extent of Antarctic sea ice is the greatest since satellite observation began in 1979. At the same time, Antarctic ice shelves are undergoing “unprecedented collapses” like the much-discussed Larsen B collapse. Such realities hint at the complexities of the climate system.

Whereas Arctic sea ice has no effect on sea level when it melts, because it floats, the Antarctic ice sheet rests on land. As such, changes in its ice mass do affect the depth of the world’s oceans. Antarctica is also the continent for which the least data is available, making it hard to incorporate into global climate models. As with all complex dynamic systems, there are non-linear effects to contend with. That makes it dangerous to extrapolate from present trends, especially when it comes to local conditions.

All this makes you appreciate why scientists frequently sound less certain about the details of climate change than politicians do. The harder you look at systems like the Earth’s climate, the more inter-relationships you discover, and the more puzzles there are to occupy your attention.

The Two Mile Time Machine

Fire hose reel

Richard Alley’s The Two Mile Time Machine: Ice Cores, Abrupt Change, and Our Future provides a good, though slightly dated, explanation of the science of ice core sampling, as a means for studying the history of Earth’s climate. Alley focuses on work conducted in Greenland prior to 2000. The book combines some surprisingly informal background sections with some rather technical passages about isotopic ratios and climatic cycles. Overall, it is a book that highlights the scientific tendency to dive right into the details of one area of inquiry, while skimming over many others that actually relate closely – especially if you are trying to use the science as the basis for sound decision-making.

This book does not really warrant inclusion in the first tier of books to read on climate change, but it certainly provides some useful background for those trying to develop a comprehensive understanding of the area. Arguably, the best contribution it makes is explaining the causes and characteristics of very long climatic cycles: those stretching over millennia or millions of years, with causes including orbital variation, continental drift, and cryosphere dynamics.

Given the amount of new data and analysis that has emerged since this book was published, a new edition may well be warranted. In particular, the very tenuous conclusions of Alley’s concluding chapters should either be revised, or defended in the face of the new data.

A banking analogy for climate

[Update: 22 January 2009] Some of the information in the post below is inaccurate. Namely, it implies that some level of continuous emissions is compatible with climate stabilization. In fact, stabilizing the climate requires humanity to reach zero net emissions in the long term. For more about this, see this post.

Every day, new announcements are made about possible emission pathways (X% reduction below year A levels by year B, and so forth). A reasonable number of people, however, seem to be confused about the relationship between emissions, greenhouse gas concentrations, and climatic change. While describing the whole system would require a huge amount of writing, there is a metaphor that seems to help clarify things a bit.

Earth’s carbon bank account

Imagine the atmosphere is a bank account, denominated in megatonnes (Mt) of carbon dioxide equivalent. I realize things are already a bit tricky, but bear with me. A megatonne is just a million tonnes, or a billion kilograms. Carbon dioxide equivalent is a way of recognizing that different gasses produce different degrees of warming (by affecting how much of the energy the Earth receives from the sun is radiated back into space). You can think of this as being like different currencies. Methane produces more warming per tonne, so it is like British pounds compared to American dollars. CO2 equivalent is basically a way of expressing the values of the ‘currencies’ of different gasses in terms of the most important one, CO2.
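To make the currency conversion concrete, here is a minimal sketch of the arithmetic, assuming 100-year global warming potentials roughly in line with the IPCC’s Fourth Assessment Report (the specific values, names, and function are illustrative, not part of any official toolkit):

```python
# Rough sketch: converting emissions of different gasses into a single
# CO2-equivalent figure using 100-year global warming potentials (GWPs).
# The GWP values are approximate AR4 figures, included for illustration only.

GWP_100 = {
    "co2": 1,    # the reference 'currency'
    "ch4": 25,   # methane warms far more per tonne than CO2
    "n2o": 298,  # nitrous oxide
}

def co2_equivalent(emissions_mt):
    """Convert {gas: megatonnes emitted} into megatonnes of CO2 equivalent."""
    return sum(GWP_100[gas] * amount for gas, amount in emissions_mt.items())

# Example: 500 Mt of CO2 plus 5 Mt of methane deposited into the 'account'.
print(co2_equivalent({"co2": 500, "ch4": 5}))  # 625.0 Mt CO2e
```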

Clearly, this is a bank account where more is not always better. With no greenhouse gasses (GHGs), the Earth would be far too cold to support life. Too many and all the ice melts, the forests burn, and things change profoundly. The present configuration of life on Earth depends upon the absence of radical changes in things like temperature, precipitation, air and water currents, and other climatic factors.

Assuming we want to keep the balance of the account more or less where it has been for the history of human civilization, we need to bring deposits into the account in line with withdrawals. Withdrawals occur when natural systems remove GHGs from the atmosphere. For instance, growing forests convert CO2 to wood, while single-celled sea creatures turn it into pellets that sink to the bottom of the ocean. One estimate for the total amount of carbon absorbed each year by natural systems is 5,000 Mt. This is the figure cited in the Stern Review. For comparison’s sake, Canadian emissions are about 750 Mt.

Biology and physics therefore ‘set the budget’ for us. If we want a stable bank balance, all of humanity can collectively deposit 5,000 Mt a year. This implies very deep cuts. How those are split up is an important ethical, political, and economic concern. Right now, Canada represents about 2% of global emissions. If we imagine a world that has reached stabilization, one possible allotment for Canada is 2% of the budget, or 100 Mt. That is much higher than a per-capita division would produce, but it would still require us to cut our present emissions by about 87%. If we only got our per-capita share (based on present Canadian and world populations), our allotment would be 24.5 Mt, about 3.2% of what we currently emit. Based on estimated Canadian and world populations in 2100, our share would be 15 Mt, or about 2% of present emissions.
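For anyone who wants to check the arithmetic, here is a back-of-the-envelope sketch. The sink estimate and Canadian emissions figure come from the post itself; the population numbers are rough assumptions of my own, used only for illustration:

```python
# Back-of-the-envelope allotment arithmetic for the 'carbon bank account'.

GLOBAL_SINK_MT = 5000.0      # Mt CO2e absorbed per year (Stern Review figure)
CANADA_EMISSIONS_MT = 750.0  # approximate current Canadian emissions (from the post)

CANADA_POP = 33e6            # assumed present Canadian population
WORLD_POP = 6.6e9            # assumed present world population

# Allotment if Canada keeps roughly its current ~2% share of global emissions.
share_allotment = 0.02 * GLOBAL_SINK_MT                    # 100 Mt
cut_needed = 1 - share_allotment / CANADA_EMISSIONS_MT     # ~0.87, i.e. an ~87% cut

# Allotment on a strict per-capita basis.
per_capita_allotment = GLOBAL_SINK_MT * CANADA_POP / WORLD_POP      # ~25 Mt
fraction_of_current = per_capita_allotment / CANADA_EMISSIONS_MT    # ~3.3%

print(share_allotment, cut_needed, per_capita_allotment, fraction_of_current)
```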

Note: cutting emissions to these levels only achieves stabilization. The balance in the bank no longer changes year to year. What that balance is depends upon what happened in the years between the initial divergence between deposits and withdrawals and the time when that balance is restored. If we spend 100 years making big deposits, we are going to have a very hefty balance by the time that balance has stabilized.
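In symbols (my notation, not anything official): if D and W are the deposits and withdrawals in a given year, the balance at the time of stabilization is simply the starting balance plus everything that accumulated along the way:

```latex
% The stabilized balance: starting balance plus the accumulated gap between
% deposits and withdrawals over the whole transition period.
B_{\text{final}} = B_0 + \sum_{t} \left( D_t - W_t \right)
```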

Maintaining a balance similar to the one that has existed throughout the rise of human civilization seems prudent. Shifting to a balance far in excess of that level carries with it considerable risks of massive global change, on the scale of ice ages and ice-free periods of baking heat.

On variable withdrawals

Remember the 5,000 Mt figure? That is based on the level of biological GHG withdrawal activity going on now. It is quite possible that climate change will alter the figure. For example, more CO2 in the air could make plants grow faster, increasing the amount withdrawn from the atmosphere each year. Alternatively, it is possible that a hotter world would make forests dry out, grow more slowly, and burn more. However the global rate of withdrawal changed, our rate of deposit would have to change as well, to maintain a stable atmospheric balance.

Here’s the nightmare possibility: instead of absorbing carbon, a world full of burning forests and melting permafrost starts to release it. Now, even cutting our emissions to zero will not stop the global atmospheric balance from rising. It would be akin to being in a speeding car with no control of the steering, acceleration, or brakes. We would just carry on forward until whatever terrain in front of us stopped the motion. This could lead to a planetary equilibrium dramatically unlike anything human beings have ever inhabited. There is a reasonable chance that such runaway climate change would make civilization based on mass agriculture impossible.
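To see how the account behaves over time, here is a toy stock-and-flow sketch of the metaphor. The 5,000 Mt sink comes from the post; the feedback strength, initial balances, and deposit figures are invented purely to illustrate the three cases (stable, overshoot, runaway), not to predict anything:

```python
# Toy stock-and-flow model of the 'carbon bank account', with a crude,
# purely illustrative feedback: natural withdrawals weaken as the balance
# grows, and below zero they flip into a source of further deposits.

def simulate(years, deposits_per_year, base_withdrawal=5000.0,
             feedback_per_mt=0.001, initial_balance=0.0):
    """Return the balance (Mt CO2e above the starting point) after each year."""
    balance = initial_balance
    history = []
    for _ in range(years):
        withdrawal = base_withdrawal - feedback_per_mt * balance
        balance += deposits_per_year - withdrawal
        history.append(balance)
    return history

# Stable case: deposits matched to the sink, so the balance does not grow.
print(simulate(5, deposits_per_year=5000.0)[-1])

# Overshoot case: decades of large deposits leave a hefty balance behind.
print(simulate(50, deposits_per_year=30000.0)[-1])

# Runaway case: the balance is already so high that the 'sink' has reversed,
# so the balance keeps rising even with zero deposits from us.
print(simulate(10, deposits_per_year=0.0, initial_balance=600000.0,
               feedback_per_mt=0.01)[-1])
```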

An important caveat

In the above discussion, greenhouse gasses were the focus. They are actually only indirectly involved in changes in global temperature. What is really critical is the planetary energy balance. This is, quite simply, the difference between the amount of energy that the Earth absorbs (almost exclusively from the sun) and the amount the Earth emits back into space.

Greenhouse gasses alter this balance because they absorb some of the infrared radiation the Earth emits and re-radiate part of it back toward the surface. The more of them around, the less energy the Earth radiates out into space at a given temperature, and the hotter it becomes.

They are not, however, the only factor. Other important aspects include surface albedo, which is basically a measure of how shiny the planet is. Big bright ice-fields reflect lots of energy back into space; water and dark stone reflect much less. When ice melts, as it does in response to rising global temperatures, this induces further warming. This is one example of a climatic feedback, as are the vegetation dynamics mentioned previously.
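As a rough illustration (a textbook zero-dimensional sketch, not something from the climate models themselves), the energy balance can be written in terms of incoming solar radiation S, albedo α, the Stefan–Boltzmann constant σ, and an effective emissivity ε standing in for the strength of the greenhouse effect:

```latex
% Zero-dimensional energy balance: absorbed solar power per unit area equals
% emitted infrared power at equilibrium. A shinier planet (higher albedo) is
% cooler; a stronger greenhouse effect (lower effective emissivity) is warmer.
\frac{S}{4}\,(1 - \alpha) = \epsilon \sigma T^{4}
\quad\Longrightarrow\quad
T = \left( \frac{S\,(1 - \alpha)}{4\,\epsilon \sigma} \right)^{1/4}
```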

In the long run, factors other than greenhouse gasses that affect the energy balance certainly need to be considered. In the near term, as is well demonstrated in the various reports of the IPCC, it is changes in atmospheric greenhouse gas concentrations that are the primary factor driving changes in the energy balance. Things that alter the Earth’s energy balance are said to have a radiative forcing effect. (See page 4 of the Summary for Policymakers of the IPCC Working Group I contribution to the Fourth Assessment Report.)
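For reference, the simplified expression commonly used in the IPCC reports for the radiative forcing from a change in CO2 concentration, relative to a reference concentration, is:

```latex
% Approximate radiative forcing (W/m^2) from raising the CO2 concentration
% from a reference value C_0 to a new value C.
\Delta F \approx 5.35 \, \ln\!\left( \frac{C}{C_0} \right)
```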

What does it mean?

To get a stable atmospheric balance, we need to cut emissions (deposits) until they match withdrawals (what the planet absorbs). To keep our balance from getting much higher than it has ever been before, we need to do this relatively quickly, and on the basis of a coordinated global effort.

The folly of Apollo redux

In an earlier post, I discussed the wastefulness of manned spaceflight. In particular, plans to return to the Moon or go to Mars cannot be justified in any sensible cost-benefit analysis. The cost is high, and the main benefit seems to be national prestige. Human spaceflight is essentially defended in a circular way: we need to undertake it so that we can learn how human beings function in space.

A post on Gristmill captures it well:

Let me be clear. There is a 0 percent chance that this Moon base or anything like it will ever be built, for the following reason: the moon missions in the ’60s and early ’70s cost something like $100 billion in today’s dollars. There is no way that setting up a semipermanent lunar base will be anything other than many times more expensive. That would put the total cost at one to a few trillion dollars.

Assuming that this taxpayer money needs to be lavished on big aerospace firms like Lockheed anyhow, it would be much better spent on satellites for the study of our planet (Some comprehensive temperature data for Antarctica, perhaps? Some RADAR analysis of the Greenland icecap? Some salaries for people studying climatic feedbacks?) or on robotic missions to objects of interest in the solar system.