Fissure in the Beaufort ice pack

During the past month, a massive piece of ice has broken off west of Banks Island, in the Canadian Arctic. This picture shows the area in question, while this animation from the US National Oceanic and Atmospheric Administration shows how the fissure developed. At the same time as the fissure, there was an unusual 45 day period of open water in the Bering Strait.

For a sense of scale, here is a map showing Banks Island in relation to the rest of Canada. While one event of this kind cannot be understood without comparison to what is happening in other areas and what has happened at other times, it is a reminder of the dynamic character of the polar icecap, even in the middle of winter. According to NOAA’s 2007 Arctic Report Card, anomalously high temperatures are yielding “relatively younger, thinner ice cover” which is “intrinsically more susceptible to the effects of atmospheric and oceanic forcing.”

It will be fascinating to see what happens to the icecap next summer: specifically, how the level of ice cover will compare to the shocking minimum in the summer of 2007.

[Correction: 15 January 2008] The open water in the Bering Sea is unrelated to this fissure, though both took place at the same time. Both pieces of information are listed in this report from the Canadian Ice Service.

Oil sands report card

Drew Sexsmith with a mandolin

The Pembina Institute and the World Wildlife Fund of Canada have a new report out on the oil sands. It is available as a four-page summary or a 72-page PDF. The report is based on surveys sent to 10 different oil sands operations and focuses on the degree to which they have adopted policies to mitigate their environmental impact.

The report highlights both the greenhouse gas emissions associated with oil sands extraction and processing and the impacts upon fresh water. It also points out how the idea that land is ‘reclaimed’ after extraction is seriously faulty. Apparently, “[d]espite over 40 years of oil sands development, not a single hectare of land has been certified as reclaimed under Government of Alberta guidelines.” The permanent conversion of boreal forests ultimately belonging to the people of Alberta into fields of toxic mud is certainly cause for concern.

The report stresses possibilities for improvement, explaining how running all facilities to the best standards already achieved at existing facilities would cut greenhouse gas emissions by 66% and reduce volatile organic compound emissions by 47%. Nitrous oxide emissions could be cut by 80%, while sulphur dioxide emissions could be reduced by 47%. Adopting a proposed water efficiency standard would reduce annual water consumption by 60%. These figures are all based on facilities running at maximum capacity, as can probably be assumed with oil around $100 a barrel.

Depressingly, the report highlights that a currently proposed project has even worse standards than existing facilities. In order to mitigate the trend, three recommendations are made to government along with two to industry. The governmental suggestions are:

  1. Government needs to enforce acceptable standards of environmental performance and continuously improve regulations to reflect continuous improvement in companies’ abilities to reduce environmental impacts.
  2. Government needs to report on environmental impacts to public lands.
  3. Government must request segregated information to enable comparison of environmental performance.

The industry recommendations are:

  1. Companies need to implement best available practices and focus on developing and implementing new technologies and processes that lead to step-wise reductions in environmental impacts.
  2. Companies should make project specific oil sands environmental performance information more widely available and in a consistent format.

Overall, this approach may be a productive one. Rather than highlighting the ecological costs of oil sands extraction and demanding that the industry be scaled back, demands for all firms to meet the highest existing standards might be able to mitigate some of the harmful effects without creating as much antagonism. It’s not a comprehensive solution, but it may be a clever form of harm reduction.

Anyone interested in the state of Canada’s environment is encouraged to read at least the short summary.

Canada’s nuclear waste

Hilary McNaughton at Darma’s Kitchen

After being removed from a reactor, nuclear fuel is both too radioactive and too physically hot to be reprocessed or placed in dry storage. As such, it is kept in cooling pools for a period of five to six years. Given the absence of long-term geologic storage facilities, all of Canada’s high level waste is currently in cooling pools or on-site dry cask storage. On a per-capita basis, Canada produces more high level nuclear waste than any other state – a total of 1,300 tonnes in 2001.

Canada currently has eleven nuclear waste storage facilities. Among these, one is being decommissioned and six contain high level waste. Five sites have waste in dry storage casks: Darlington, Bruce, Pickering, Gentilly, and Point Lepreau. Other facilities include spent fuel pools. According to the Canadian Nuclear Safety Commission (CNSC), all Canadian wastes are currently in ‘storage’ defined as: “a short-term management technique that requires human intervention for maintenance and security and allows for recovery of the waste.”

In 2002, a major review of waste disposal options was undertaken by the Nuclear Waste Management Organization (NWMO). Their final report – released in November 2005 – endorsed a system of “Adaptive Phased Management” employing both interim shallow storage and deep geological storage, with the possibility of future recovery of materials. Such recovery would be motivated either by concerns about leakage potential or a desire to process the fuel into something useful. The NWMO is currently engaged in a process of site selection, intended to lead eventually to a National Nuclear Waste Repository.

The nuclear waste problem

From both an environmental and public support standpoint, the generation of nuclear waste is one of the largest drawbacks of nuclear fission as a power source. Just as the emission of greenhouse gases threatens future generations with harmful ecological outcomes, the production of nuclear wastes at all stages in the fuel cycle presents risks to those alive in the present and to those who will be alive in the future, across a span of time not generally considered by human beings.

Wastes like Plutonium-239 remain highly dangerous for tens of millennia: a span roughly equivalent to the total historical record of human civilizations. Furthermore, while most states using nuclear power have declared an intention of creating geological repositories for wastes, no state has such a facility in operation. The decades-long story of the planned Yucca Mountain repository in the United States demonstrates some of the practical, political, and legal challenges to establishing such facilities in democratic societies.

Dry cask storage is not an acceptable long-term option, as suggested by its CNSC categorization as “a short-term management technique.” When dealing with wastes dangerous for millennia, it cannot be assumed that regular maintenance and inspection will continue. Storage systems must be ‘passively safe:’ able to contain the wastes they store for the full duration of their dangerous lives, without the need for active intervention from human beings. To date, no such facilities exist.

The sex life of corn

Corn, the key species in modern industrial agriculture, is completely incapable of reproducing itself in nature. The cobs that concentrate the seeds so nicely for us are not conducive to reproduction because, if planted, the corn grows so densely it dies. As such, the continued existence of Zea mays depends upon people continuing to divide the cobs and plant a portion of the seeds.

Corn is apparently a descendant of an earless grass called teosinte. It is hard to overstate the consequences of a heavily mutated strain of teosinte finding, in humans, a species capable of closing a reproductive loop that would otherwise have remained open, leading to swift extinction.

The actual mechanics of corn reproduction are similarly odd. Male gametes are produced at the top of the plant, inside the flower-like tassel. At a certain time of year, these release the pollen that fertilizes the female gametes located in the cobs. It reaches them through single strands of silk (called styles) that run through the husk. When a grain of pollen comes into contact with one of these threads, it divides into two identical cells. One of them tunnels through the strand into the kernel, a six to eight inch distance crossed in several hours. The other fuses with an egg to form an embryo, while the digger grows into the endosperm.

Another curious aspect of corn reproduction is that, because of seed hybridization (not genetic modification), every stalk of corn in a field is a clone of every other stalk. This is because the seeds came from inbred lines: each made to self-pollinate for several generations, eventually yielding batches of genetically identical seeds that farmers buy every year. They do this because hybrid seed yields more than the mixed generation that would follow it, by enough to justify the cost of buying new seed each year.

Such hybrid corn pushed yields from twenty bushels an acre – the amount managed by both Native Americans and farmers in the 1920s – to about two hundred bushels an acre. Given the degree to which we are all constructed more from corn than from any other source of materials (most of the meat, milk, and cheese we eat is ultimately made from corn, as are tons of processed foods), these remarkable processes of reproduction and agriculture deserve further study. For my part, I am reading Michael Pollan’s The Omnivore’s Dilemma. I am only 10% into it, but it has been quite fascinating so far.

Per capita emissions and fairness

Per capita emissions by state, compared with sustainable emissions

As mentioned before, the Stern Review cites a figure of five gigatonnes of carbon dioxide equivalent as the quantity that can be sustainably absorbed by the planet each year. Given the present population of 6.6 billion people, that means our fair share is about 750kg of emissions each, per year. Right now, Canadian emissions are about 23 tonnes per person per year. They are highest in Alberta – 71 tonnes – and lowest in Quebec – 12 tonnes. Even in hydro-blessed Quebec, emissions are fifteen times too high.
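As a sanity check, the arithmetic behind these figures is easy to reproduce. This is a minimal sketch using only the population and per-capita numbers quoted above:

```python
# Back-of-the-envelope check of the per-capita figures in the text.
SUSTAINABLE_TOTAL_TONNES = 5e9   # 5 Gt CO2-equivalent absorbed per year (Stern Review)
WORLD_POPULATION = 6.6e9         # world population at the time of writing

fair_share_kg = SUSTAINABLE_TOTAL_TONNES / WORLD_POPULATION * 1000
print(f"Sustainable share: {fair_share_kg:.0f} kg per person per year")  # ~758 kg

# Multiples of the sustainable level, using the per-capita figures quoted above.
for place, tonnes in {"Canada": 23, "Alberta": 71, "Quebec": 12}.items():
    print(f"{place}: {tonnes * 1000 / fair_share_kg:.0f} times the sustainable level")
```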

Everybody knows that emissions in the developed world are too high. The average Australian emits 25.9 tonnes. For Americans it is 22.9; the nuclear-powered French emit 8.7 tonnes each. The European average is 10.6 tonnes per person, while North America weighs in at 23.1. One round-trip flight from New York to London produces the amount of greenhouse gas that one person can sustainably emit in three and a half years. These are not the kind of numbers that can be brought down with a few more wind turbines and hybrid cars; the energy basis of all states needs to be fundamentally altered, decoupling energy production and use from greenhouse gas emissions.

What is less often acknowledged is that emissions in the developing world are already too high. Chinese per capita emissions are 3.9 tonnes, while those in India are 1.8. The list of countries by per-capita greenhouse gas emissions on Wikipedia shows three states where per-capita emissions are below 750kg: Comoros, Kiribati, and Uruguay. Even the average level of emissions for sub-Saharan Africa is almost six times above the sustainable level for our current world population.

And our world population is growing.

All this raises serious questions of fairness. Obviously, people in North America and Europe have been overshooting our sustainable level of emissions for a long time. Do developing countries have a similar right to overshoot? How are their rights affected by what we now know about climate change? If they do have a right to emit more than 750kg per person, does that mean people in developed states have a corresponding duty to emit less than that? Even if we emitted nothing at all, we couldn’t provide enough space within the sustainable carbon budget for them to emit as much as we are now.

The only option is for everyone to decarbonize. The developed world needs to lead the way, in order to show that it can be done. The developing world needs to acknowledge that the right to develop does not trump other forms of legal and ethical obligation: both to those alive now and to future generations. People in both developed and developing states may also want to reconsider their assumptions about the desirability of population growth. Spending a few centuries with people voluntarily restricting their fertility below the natural rate of replacement could do a lot to limit the magnitude of the ecological challenges we will face as a species.

Carbon constrained travel

Jennifer Ellan and a yellow wall

Pondering the era of post-air travel tourism, I have been thinking about places to visit by train. There are actually quite a lot of appealing prospects:

  • Halifax: see the Maritimes for the first time, and perhaps Caity Sackeroff as well
  • Boston: visit Iason, Sheena, Loretta, and perhaps Claire
  • New York: re-visit the city five years after my first foray
  • Bennington, Vermont: visit family

Have any readers undertaken the exploration of North America’s east coast using ground-based means? Are trains generally much more expensive than buses? How do they compare, in terms of speed?

The moral unacceptability of air travel has also left me thinking more seriously about the kind of grand backpacking tours that were more common when long-haul flights were ruinously expensive. The most ambitious possibility – flitting around at the edges of imagination – is to travel by ground all the way from London to Hong Kong, seeing as much as possible between the two. Emitting a couple of tonnes of carbon, in the form of flights across the Atlantic, would be a lot more ethically acceptable than undertaking multiple hops. Of course, a truly conscientious traveler would emulate a friend of a friend of mine and book passage across on a container ship.

Advertising over-fishing

This evening, I was surprised to happen across a billboard advertisement condemning fisheries subsidies. It declared that: “Subsidies are fishing the world’s oceans to death” and “It’s time to cut the bait.” The sentiment is an accurate one, particularly when it comes to the operation of the subsidized fleets of the developed world in the waters of developing states. Still, it was interesting to see a public display about a subject that is of considerable interest to me, but seemingly ignored by most of the population. You do see a bit of lobbying through advertising in Ottawa; for instance, there are piles of backlit signs personally thanking Prime Minister Harper for supporting ethanol and biodiesel. It was good to see something advocating the protection of a common resource, rather than seeking rents for private enterprises.

I was curious who would be behind such an advertising campaign, but then I noticed the logo of the Sea Around Us Project at the bottom. They have been mentioned here fairly frequently before and do good work. Shifting Baselines – a favourite blog of mine – is run by a doctoral student associated with the project.

[Update: 13 January 2008] I finally got around to uploading the low quality photo of the ad I took on my phone.

The implied right to pollute

In today’s news, there is some talk about the new report from the National Round Table on the Environment and the Economy. Much of it has surrounded the possibility of a carbon tax as a vehicle for assisting with the reduction of Canadian greenhouse gas emissions. One comment from the CBC struck me as especially wrong-headed. In relation to a carbon tax, a person being interviewed said that it “would specifically impact western oil producers who might have to carry the brunt of such attacks.”

The fallacy here is that western oil producers have the right to emit as many greenhouse gases as they like, for free. If your neighbour was running a pulp mill in his back yard, allowing toxic chemicals to ooze throughout the neighbourhood, nobody would call it an ‘attack’ when he was made to stop. Arguments implying that industry or private individuals have the right to impose ecological harms upon others need to be challenged in terms of fairness and ethics. Otherwise, they obscure the true character of the situation and help to perpetuate the status quo.

HVDC transmission for renewable energy

Power lines in Vancouver

One limitation of renewable sources of energy is that they are often best captured in places far from where energy is used: remote bays with large tides, desert areas with bright and constant sun, and windswept ridges. In these cases, transmitting the power over standard alternating current (AC) power lines can involve very significant losses.

This is where high voltage direct current (HVDC) transmission lines come in. Originally developed in the 1930s, HVDC technology is only really suited to long-range transmission. This is because of the static inverters that must be used to convert the energy to DC for transmission. These are expensive devices, both in terms of capital cost and energy losses. With contemporary HVDC technology, energy losses can be kept to about 3% per 1000km. This makes the connection of remote generating centres much more feasible.
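To get a feel for how these losses compound over distance, here is a rough sketch. The 3% per 1000 km line-loss figure comes from the text; the converter-station loss is my own illustrative assumption, not a figure from any source:

```python
# Rough sketch of HVDC delivery efficiency, using the ~3% per 1000 km
# line-loss figure from the text. Converter-station loss is assumed.
CONVERTER_LOSS = 0.015  # assumed ~1.5% loss per converter station (illustrative)

def delivered_fraction(km, line_loss_per_1000km=0.03, stations=2):
    """Fraction of sent power delivered after line and converter losses."""
    line = (1 - line_loss_per_1000km) ** (km / 1000)
    converters = (1 - CONVERTER_LOSS) ** stations
    return line * converters

for km in (500, 1000, 3000):
    print(f"{km} km: {delivered_fraction(km):.1%} of power delivered")
```

The fixed converter losses at each end are why HVDC only pays off over long distances: for short lines they dominate, while over thousands of kilometres the lower per-kilometre line loss more than makes up for them.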

HVDC has another advantage: it can be used as a link between AC systems that are out of sync with each other. This could be different national grids running on different frequencies; it could be different grids on the same frequency with different timing; finally, it could be the multiple unsynchronized AC currents produced by something like a field of wind turbines.

Building national and international HVDC backbones is probably necessary to achieve the full potential of renewable energy. Because of their ability to stem losses, they can play a vital role in load balancing. With truly comprehensive systems, wind power from the west coast of Vancouver Island could compensate when the sun in Arizona isn’t shining. Likewise, offshore turbines in Scotland could complement solar panels in Italy and hydroelectric dams in Norway. With some storage capacity and a sufficient diversity of sources, renewables could provide all the electricity we use – including quantities sufficient for electric vehicles, which could be charged at times when demand for other things is low.

With further technological improvements, the cost of static inverters can probably be reduced. So too, perhaps, the per-kilometre energy losses. All told, investing in research on such renewable-facilitating technologies seems a lot more sensible than gambling on the eventual existence of ‘clean’ coal.

A grand solar plan for the United States

Sign in Sophie’s Cosmic Cafe

The latest issue of Scientific American features an article about a ‘grand solar plan.’ The idea is to install massive solar arrays in the American southwest, then use high voltage direct current transmission lines to transfer the energy to populated areas. The intention is to build 3,000 gigawatts of generating capacity by 2050 – a quantity that would require 30,000 square miles of photovoltaic arrays. This would cost about $400 billion and produce 69% of all American electricity and 35% of all energy used in transport (including electric cars and plug-in hybrids). The plan depends upon storing pressurized air in caverns to balance electricity supply and demand. The authors anticipate that full implementation of the plan would cut American greenhouse gas emissions to 62% below 2005 levels by 2050, even assuming a 1% annual increase in total energy usage.
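Taking the article's headline figures at face value, the implied land-use density is easy to compute. A quick sketch, using only the capacity and area numbers quoted above:

```python
# Implied density from the article's headline numbers (taken at face value).
CAPACITY_W = 3000e9        # 3,000 GW of generating capacity by 2050
AREA_SQ_MILES = 30_000     # photovoltaic array area
SQ_M_PER_SQ_MILE = 2.59e6  # standard conversion factor

watts_per_sq_m = CAPACITY_W / (AREA_SQ_MILES * SQ_M_PER_SQ_MILE)
print(f"{watts_per_sq_m:.0f} W of capacity per square metre of array land")  # ~39 W/m2
```

Roughly 39 W of nameplate capacity per square metre is broadly in line with what utility-scale photovoltaic farms achieve, which suggests the article's area figure is internally consistent.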

The authors stress that the plan requires only modest and incremental improvements in solar technology. For instance, the efficiency of solar cells must be increased from the present level of about 10% to 14%. The pressurized cavern approach must also be tested and developed, and a very extensive new system of long-distance transmission lines would need to be built. While the infrastructure requirements are daunting, the total cost anticipated by the authors seems manageable. As they stress, it would cost less per year than existing agricultural subsidy programs.

Depending on solar exclusively is probably not socially or economically optimal. The authors implicitly acknowledge this when they advocate combining the solar system with wind, biomass, and geothermal sources in order to generate 100% of American electricity needs and 90% of total energy needs by 2100. Whether this particular grand plan is technically, economically, and politically viable or not, such publications do play a useful role in establishing the parameters of the debate. Given the ongoing American election – and the potential for the next administration to strike out boldly along a new course – such ideas are especially worthy of examination and debate. It is well worth reading the entire article.