Tundra dangers

Toronto Graffiti

One of the biggest climatic dangers out there is that warming in the Arctic will melt the permafrost. The tundra is heavily laden with organic carbon, much of which would be released as carbon dioxide or methane – a far more potent greenhouse gas – if it thawed. In total, the roughly ten million square kilometres of permafrost contain about 1,000 gigatonnes of carbon (equivalent to about 3,670 gigatonnes of carbon dioxide). That is more carbon dioxide equivalent than the entire atmosphere contains at present.

If even a fraction of a percent of that gets released every year, it would blow our carbon budget. Even with enormous cuts in human emissions, the planet would keep on warming. Right now, humanity is emitting about 8 gigatonnes of carbon a year, and is on track to hit 11 gigatonnes by 2020. If we were to stabilize at that level, emitting 11 gigatonnes a year until 2100, the concentration of greenhouse gasses in the atmosphere would surpass 1,000 parts per million, creating the certainty of a vastly transformed world and a very strong possibility of the end of human civilization.
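To get a sense of the scale involved, here is a rough back-of-the-envelope sketch using the round numbers above (the 44/12 factor is just the ratio of the molar masses of carbon dioxide and carbon):

```python
# Back-of-the-envelope permafrost arithmetic, using the figures in this post.
C_TO_CO2 = 44.0 / 12.0        # molar mass ratio: CO2 (44 g/mol) to carbon (12 g/mol)

permafrost_carbon = 1000.0    # carbon stored in permafrost (GtC)
human_emissions = 8.0         # current human emissions (GtC per year)

print(f"{permafrost_carbon * C_TO_CO2:,.0f} Gt CO2")   # ~3,667 Gt CO2, as above

# Even tiny annual release fractions are large next to human emissions:
for fraction in (0.001, 0.005, 0.01):
    release = permafrost_carbon * fraction
    print(f"{fraction:.1%} per year -> {release:.0f} GtC per year, "
          f"about {release / human_emissions:.0%} of human emissions")
```

At a release rate of just one percent per year, the permafrost alone would out-emit all of humanity.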

As such, it is vital to stop climate change before the planet warms sufficiently to start melting permafrost. This is especially challenging given that warming in the Arctic is more pronounced than warming elsewhere. There is also the additional challenge of the sea-ice feedback loop, wherein the replacement of reflective ice with absorptive water increases warming.

The actions necessary to prevent that are eminently possible. Unfortunately, people have not yet developed the will to implement them to anything like the degree necessary. Hopefully, the ongoing UNFCCC process for producing a Kyoto successor will help set us along that path before it becomes fantastically more difficult and expensive to act.

[Update: 4 February 2009] Here is a post on the danger of self-amplifying, runaway climate change: Is runaway climate change possible? Hansen’s take.

[Update: 19 February 2010] See also: The threat from methane in the North.

Hofmann’s ‘problem child’

Pink flowers

As an additional offering to see readers through my canoe-induced absence, here is an interesting article from The New York Times about lysergic acid diethylamide (LSD) – the ‘problem child’ of Albert Hofmann. It includes a description of his remarkable first experiences while experimenting with the medical potential of ergot derivatives, as well as his later observations and reflections on the molecule he introduced to the world.

Hofmann, who died last week, has an obituary in The Economist. It takes a somewhat interesting position: essentially, that LSD was a promising chemical that ended up universally banned because of the excesses of Timothy Leary and company.

Polar bears ‘threatened’

As of today, the American Department of the Interior has listed the polar bear as a ‘threatened’ species, on account of the ongoing disappearance of the Arctic ice cap. In making the announcement, Secretary of the Interior Dirk Kempthorne stressed that the decision is not meant to compel the regulation of greenhouse gasses:

Listing the polar bear as threatened can reduce avoidable losses of polar bears. But it should not open the door to use of the ESA [Endangered Species Act] to regulate greenhouse gas emissions from automobiles, power plants, and other sources. That would be a wholly inappropriate use of the ESA law. The ESA is not the right tool to set U.S. climate policy.

In a sense, that is fair enough. Creating something as comprehensive as a greenhouse gas mitigation strategy in response to concern about a single species is definitely a backwards-seeming way to go about it. At the same time, one is reminded of how somewhat awkward justifications have sometimes been used in the past to secure legal outcomes: for instance, the use of the ‘interstate commerce’ clause in the US Constitution to assert federal jurisdiction, or even the indictment of Al Capone on tax evasion charges, rather than those directly associated with organized crime.

The point here is less whether concern about polar bears does or does not create a legal obligation to act on climate change. Rather, this is another demonstration of how virtually all conservation planning now requires the consideration of climate change effects. This is just one of the thousand cuts by which federal reluctance to effectively regulate greenhouse gasses will eventually be worn away.

John McCain’s carbon targets

In a speech delivered in Oregon, John McCain laid out some targets for reducing American greenhouse gas emissions:

  • 2012: Return emissions to 2005 levels (18 percent above 1990 levels)
  • 2020: Return emissions to 1990 levels (15 percent below 2005 levels)
  • 2030: 22 percent below 1990 levels (34 percent below 2005 levels)
  • 2050: 60 percent below 1990 levels (66 percent below 2005 levels)

These targets look pretty similar to the ones adopted by the present Canadian government: a peak in emissions by 2012, a reduction to 20% below 2006 levels by 2020, and a 60-70% reduction below 2006 levels by 2050.
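The parenthetical conversions in that list all follow from a single assumption: that 2005 emissions were about 18% above 1990 levels. A quick sketch to check them, with that 1.18 factor as the only input:

```python
# Converting cuts measured against a 1990 baseline into cuts measured
# against a 2005 baseline, assuming 2005 emissions = 1.18 x 1990 emissions.
BASELINE_2005 = 1.18

def below_2005(percent_below_1990):
    """Re-express a cut relative to 1990 as a cut relative to 2005."""
    level = 1 - percent_below_1990 / 100      # emissions as a fraction of 1990
    return (1 - level / BASELINE_2005) * 100  # percent below 2005

for cut in (0, 22, 60):
    print(f"{cut}% below 1990 = {below_2005(cut):.0f}% below 2005")
# 0% below 1990  -> 15% below 2005
# 22% below 1990 -> 34% below 2005
# 60% below 1990 -> 66% below 2005
```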

Stabilizing greenhouse gas concentrations below 550ppm probably requires more aggressive action. That being said, this is not a terrible place from which to begin negotiations: both between presidential candidates in the United States and between the United States and other countries. If the US were willing to commit to those targets unilaterally (and do so with a credible plan for actually achieving them), it might become a lot easier to get countries like China and India to begin making a more substantial contribution to the mitigation effort.

In exchange, the United States could adopt the kind of targets (and supplemental actions, like aid in preventing tropical deforestation) that are actually required to stabilize greenhouse gas concentrations at a level around 450ppm, thus keeping total global temperature change in the realm of two degrees Celsius.

Nanotubes and hot sauces

Emily Horn on a fire escape

Hot sauce aficionados may be familiar with the Scoville Scale, used to express the heat of a sauce or pepper. The other day, my friend Antonia sent me an article explaining that the process of determining a Scoville rating might be significantly refined, thanks to carbon nanotubes:

The well-established Scoville method – currently the industry standard – involves diluting a sample until five trained taste testers cannot detect any heat from the chilli. The number of dilutions is called the Scoville rating; the relatively mild Jalapeño ranges from around 2,500-8,000, whereas the hottest chilli in the world, the ‘Naga Jolokia’, has a rating of 1,000,000. High performance liquid chromatography (HPLC) can also be used but this requires bulky, expensive equipment and detailed analysis of the capsaicinoids.

In Compton’s method, the capsaicinoids are adsorbed onto multi-walled carbon nanotube (MWCNT) electrodes. The team measures the current change as the capsaicinoids are oxidised by an electrochemical reaction, and this reading can be translated into Scoville units. The technique is called adsorptive stripping voltammetry (ASV), and is a relatively simple electrochemical method.

The Scoville Scale is pretty easy to understand. A sauce with a rating of 1000 can be diluted 1:1 with water to produce a sauce with a rating of 500. Tabasco sauce, of the sort ubiquitous in diners, has a rating of between 2,500 and 5,000. Dave’s Insanity Sauce – the spiciest one in my kitchen – has a rating of about 180,000. Even taking the upper estimate of Tabasco’s potency, that means one tablespoon of Dave’s is equivalent to about half a litre of Tabasco.

Of course, those who truly wish for their epithelial cells to signal as much heat and abrasion as is theoretically possible can do better. Blair’s 16 Million Reserve, which consists of a little bottle of pure capsaicin crystals, weighs in at 16,000,000 Scoville heat units. One tablespoon is thus akin to 1.31 litres of Dave’s Insanity Sauce, or about 47 litres of Tabasco. Just the thing you need if you want to turn a bland chili dinner for your million-person standing army into something a bit more interesting.
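Since a Scoville rating is proportional to the number of dilutions required, all of these equivalences are simple ratios. A small sketch of the arithmetic, assuming a tablespoon of roughly 14.8 mL and the ratings quoted above:

```python
# Scoville equivalences: the rating measures required dilution, so the
# equivalent volume of a milder sauce scales linearly with the ratings.
TBSP_ML = 14.8   # approximate millilitres in one tablespoon

ratings = {
    "Tabasco (upper estimate)": 5000,
    "Dave's Insanity Sauce": 180000,
    "Blair's 16 Million Reserve": 16000000,
}

def equivalent_ml(hotter, milder):
    """Volume of the milder sauce matching one tablespoon of the hotter."""
    return TBSP_ML * ratings[hotter] / ratings[milder]

print(equivalent_ml("Dave's Insanity Sauce", "Tabasco (upper estimate)"))
# ~533 mL: about half a litre, as claimed
print(equivalent_ml("Blair's 16 Million Reserve", "Dave's Insanity Sauce"))
# ~1,316 mL: about 1.31 litres
print(equivalent_ml("Blair's 16 Million Reserve", "Tabasco (upper estimate)"))
# ~47,360 mL: about 47 litres
```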

Vehicle efficiency

Fire station on Preston Street, Ottawa

My friend Mark sent me a link to a book in progress about sustainable energy. One of the more interesting sections is on vehicle efficiency. The author stresses that, while some kinds of efficiency gains are physically possible, others are not:

Could we make a new car that consumes 100 times less energy and still goes at 70mph? No. Not if the car has the same shape. The energy is going mainly into making air swirl. Changing the materials the car is made from makes no difference to that. A miraculous improvement to the engine could perhaps boost its efficiency from 25% to 50%. But the energy consumption of a car is still going to be roughly 40 kWh per 100 km.

The story is a familiar one: efficiency can get you a long way, but there are no free rides. Another interesting part of this chapter concerns the major design differences between an efficient city car and an efficient highway car. Since the former is always stopping and starting, low weight is really important, and brakes that regenerate energy make a big difference. For a highway car that avoids major acceleration and deceleration, the most important thing is reducing drag; weight is comparatively trivial.
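To see why, compare the two loss mechanisms directly: every stop throws away the car’s kinetic energy, while cruising spends energy pushing air aside. A minimal sketch with illustrative parameters (the formulas are standard physics; none of the numbers below come from the book):

```python
# Kinetic energy lost per stop: 0.5 * m * v^2
# Drag energy over a distance:  0.5 * rho * Cd * A * v^2 * d
RHO = 1.2        # air density (kg/m^3)
CD_AREA = 0.8    # drag coefficient times frontal area (m^2), typical car
MASS = 1000.0    # car mass (kg)
J_PER_KWH = 3.6e6

def drag_kwh(speed_ms, distance_m):
    return 0.5 * RHO * CD_AREA * speed_ms**2 * distance_m / J_PER_KWH

def braking_kwh(speed_ms):
    return 0.5 * MASS * speed_ms**2 / J_PER_KWH

# Highway: ~110 km/h (31 m/s) over 100 km with no stops.
print(drag_kwh(31, 100000))       # ~12.8 kWh lost to drag alone

# City: ~50 km/h (14 m/s), say 30 stops over a 10 km trip.
print(30 * braking_kwh(14))       # ~0.8 kWh thrown away in braking
print(drag_kwh(14, 10000))        # ~0.3 kWh lost to drag on the same trip
```

In town, braking losses swamp drag (hence the importance of weight and regenerative brakes); on the motorway, the ranking reverses.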

One other interesting assertion is that the energy involved in making a car is actually pretty trivial compared to the amount used in driving it around:

The energy cost of making the raw materials for a one tonne car is thus equivalent to about 3000 km of driving; an appreciable cost, but probably only 1% of the lifetime energy-cost of the car’s fuel.

If correct, that makes it seem a lot more reasonable to upgrade from an old and inefficient vehicle to a newer and less gas-thirsty model. It also suggests that government programs to replace inefficient cars with better ones might have strong justification, in terms of climate change mitigation potential.

In order to move to a low carbon society, we need to do a slew of things. We definitely need to increase the energy efficiency with which most tasks are accomplished. We definitely need to reduce the quantity of greenhouse gas produced in generating each unit of energy. We probably need to significantly reduce total energy consumption. Finally, we need to manage the greenhouse gasses that some activities will inevitably produce. The protection and enhancement of carbon sinks (mostly forests and soils) are essential for this.

When it comes to reducing total energy usage, the chapter does make one excellent suggestion: “a cyclist at 21 km/h consumes about 30 times less energy per kilometre than a lone car-driver on the motorway: about 2.4 kWh per 100 km.” Those who cycle more slowly are likely to be even more efficient, since doubling the time it takes to travel somewhere apparently reduces energy usage by three quarters.
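That last claim is just drag physics. If air resistance were the only loss, energy per kilometre would scale with the square of speed, so halving your speed (and doubling your travel time) would cut the energy for a given trip to a quarter. A quick check against the chapter’s cyclist figure (the scaling assumption is mine, applied to the book’s baseline):

```python
# Drag-dominated estimate: energy per km scales as (v / v0)^2.
BASE_SPEED = 21.0    # km/h, from the quoted figure
BASE_ENERGY = 2.4    # kWh per 100 km, from the quoted figure

def energy_per_100km(speed_kmh):
    return BASE_ENERGY * (speed_kmh / BASE_SPEED) ** 2

for speed in (10.5, 15.0, 21.0, 30.0):
    print(f"{speed:>4} km/h -> ~{energy_per_100km(speed):.1f} kWh per 100 km")
# 10.5 km/h (double the travel time) -> ~0.6 kWh: the three-quarters reduction
```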

Remember the platypus

The Platypus is a strange and intriguing creature. Some of the odder things about it:

  1. Males can inject venom from spurs on their ankles. The venom will not kill humans, but is extremely painful and heightens overall sensitivity to pain for a period between a few days and several months.
  2. They have ten sex chromosomes, out of a total of 52. Males are ‘XYXYXYXYXY.’
  3. They swim using only their two front legs, though the back two are also webbed.
  4. Only the left ovary of females is functional.
  5. They have no visible ears.
  6. They only use their eyes while above water.
  7. Underwater, they can detect electric fields generated by muscular contractions.
  8. They lose their three teeth before they first leave their mother’s burrow.
  9. They forage for twelve hours a day.
  10. They have a body temperature five degrees lower than most placental mammals.
  11. Females lactate through pores in their skin. Milk pools in grooves located on their abdomens.
  12. The DNA of one female – named Glennie – has now been sequenced by researchers at Oxford.

I recall reading that Australia has three types of animals: the venomous, the bizarre, and sheep. The platypus scores highly on the first two counts.

The Fischer-Tropsch process

Emily Horn and the sunset

Those hoping to understand energy politics in the coming decades would be well advised to read up on the Fischer-Tropsch process. This chemical process uses catalysts to convert carbon monoxide and hydrogen into liquid hydrocarbons. Basically, it allows you to make gasoline using any of a large number of inputs as a feedstock. If the input you use is coal, the process is environmentally disastrous: it combines all the carbon emissions associated with coal burning with the extra energy use of synthetic fuel manufacture, not to mention the ecological and human health effects of coal mining. If the feedstock is biomass, it could be a relatively benign way to produce liquid fuels for transport.
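For the curious, the overall chemistry is usually summarized in idealized textbook form as hydrogen and carbon monoxide (the two components of synthesis gas) combining over a catalyst to yield alkanes and water:

```latex
% Idealized overall Fischer-Tropsch reaction producing an alkane:
(2n+1)\,\mathrm{H}_2 + n\,\mathrm{CO} \longrightarrow \mathrm{C}_n\mathrm{H}_{2n+2} + n\,\mathrm{H}_2\mathrm{O}
```

The catalyst and operating conditions determine the distribution of chain lengths; chains of roughly four to twelve carbons fall in the gasoline range.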

The process was developed in Germany during the interwar period and used to produce synthetic fuels during WWII. The fact that it can reduce military dependence on imported fuel is appealing to any state that wants to retain or enhance its military capacity, but feels threatened by the need to import hydrocarbons. The US Air Force has shown considerable interest for precisely that reason, though they are hoping to convert domestic coal or natural gas into jet fuel – an approach that has no environmental benefits. By contrast, biomass-to-liquids offers the possibility of carbon neutral fuels. All the carbon emitted by the fuel was absorbed quite recently by the plants from which it was made.

Such fuels are extremely unlikely ever to be as cheap as gasoline and kerosene – even at today’s oil prices. The fact that there are parts of the world where you can just make a hole in the ground and watch oil spray out ensures that. That said, Fischer-Tropsch fuels could play an important part in a low-carbon future, provided three conditions are met: (a) the fuels are produced from biomass, not coal or natural gas; (b) the energy used in the production process comes from sustainable low-carbon sources; and (c) the process of growing the biomass is not unacceptably harmful in other ways. If land is redirected towards growing biomass in a way that encourages deforestation or starves the poor, we will not be able to legitimately claim that synthetic fuels are a solution.

Immune system biochem

Face on a wall

It seems as though one of the coolest medical products you can make from blood is intravenous immunoglobulin (IVIG). Basically, it consists of antibodies extracted from the pooled plasma of thousands of individual blood donors. It is given to people whose own ability to produce antibodies has been compromised, and it helps their immune systems attack infections for a period of between two weeks and three months.

I remember a children’s television show where white blood cells are represented as the body’s police force. The analogy is fair enough. There are situations where the police force is lazy, so nasty gangs move in. There are situations where nasty gangs simply kill off the police force. Finally, there are situations where the police force goes haywire and starts savaging the population. Autoimmune diseases are the anatomical equivalent of the uncontrolled police force. Apparently, IVIG can help in all three circumstances, as well as in cases of inflammation.

Reading about biochemistry is an excellent way of being reminded just how absurdly complicated life is. I frequently find myself contemplating all the thousands of chemical reactions involved in performing the slightest action – tapping a key, dilating your pupil when a cloud crosses the sun – and being amazed that they can happen so quickly and consistently.

Keenlyside et al. on the next decade

As reported by the BBC, a Nature article argues that computer models suggest little global warming will occur in the next decade:

[O]ver the next decade, the current Atlantic meridional overturning circulation will weaken to its long-term mean; moreover, North Atlantic SST and European and North American surface temperatures will cool slightly, whereas tropical Pacific SST will remain almost unchanged. Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.

Climate is a naturally variable thing and, as such, it is always undergoing upward and downward oscillations. Anthropogenic greenhouse gasses definitely have a growing warming effect, but that effect is overlaid on top of the existing variations and feedbacks. As such, a natural downward tendency might drown out the human impact for a certain span of time.
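A toy model makes the point. Overlay a natural oscillation on a steady warming trend and some decades come out flat, even though the underlying trend never changes. The numbers below are invented for illustration and are not drawn from the paper:

```python
import math

TREND = 0.02       # assumed underlying warming (degrees C per year)
AMPLITUDE = 0.25   # assumed natural oscillation (degrees C)
PERIOD = 60.0      # assumed oscillation period (years)

def temperature(year):
    """Steady trend plus a smooth oscillation; no noise, to keep it simple."""
    return TREND * year + AMPLITUDE * math.sin(2 * math.pi * year / PERIOD)

for start in range(0, 60, 10):
    change = temperature(start + 10) - temperature(start)
    print(f"years {start:2d}-{start + 10:2d}: {change:+.2f} C")
# Two of the six decades show essentially no warming at all,
# while the long-term trend remains 0.2 C per decade throughout.
```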

Having relatively accurate decade-to-decade forecasts of climate change impacts could be very useful for adaptation planning. By providing guidance on things like weather conditions and extreme events, they could allow for the more intelligent selection of crops, the concentration of effort in the most threatened areas, and the general development of anticipatory policy.

While such studies are clearly important for increasing our understanding of the climate system, there is a big danger of misunderstanding them – whether wilfully or not. Plenty of people would interpret a decade of flat or falling temperatures as strong evidence that the climate change consensus is wrong. It provides new fodder for those intentionally seeking to confuse the issue, as well as new grounds for confusion among those who are genuinely trying to understand the situation. Of course, we cannot ask for science to always emerge in ways that help people deal with it appropriately. It would be pretty tragic if a brief but poorly timed deviation from the warming trend helped to undermine the case for action at the very time when we must begin the long and difficult task of building a low-carbon world.