Greenhouse gases other than CO2

Rusty metal pipes

A recent Newsweek article discussing Al Gore's new book referred to newly published work on how different gases contribute to anthropogenic climate change: Improved Attribution of Climate Forcing to Emissions, written by scientists from NASA's Goddard Institute, including Drew Shindell and Gavin Schmidt.

Two especially notable points are made. Firstly, the researchers estimate that carbon dioxide (CO2) is 'only' responsible for 43% of observed warming, once interactions between gases and aerosols are taken into account. Methane, meanwhile, accounts for 27% of warming, halocarbons 8%, black carbon 12%, and carbon monoxide and volatile organics 7%. Secondly, there are the policy implications that flow from this. Preventing CO2 emissions basically requires reducing deforestation and the burning of fossil fuels, with the latter being an especially challenging thing to do in a world as profligate with energy as ours. Reducing methane, by contrast, may be as simple as capturing and burning gases from landfills, and adopting other comparatively low-cost, low-sacrifice strategies. The authors conclude that strategies incorporating all greenhouse gases (GHGs) are "likely to be much more cost-effective than CO2-only strategies."

There are other complications involving GHGs, including atmospheric lifetime. CO2 is removed from the atmosphere by various processes, operating across very different timescales. Methane doesn't last nearly as long, but it causes much more warming than CO2 per tonne while it persists, and it eventually breaks down into CO2. Black carbon is washed out of the atmosphere quite quickly, meaning that eliminating its production could reduce radiative forcing relatively rapidly.

The greater importance of non-CO2 gases described in this study is potentially good news for climate change mitigation, given how challenging it has been to convince governments to accept even very minor costs in order to reduce the risks associated with climate change. Developing an improved understanding of exactly how much various GHGs alter the climate should also allow for more efficient carbon pricing, where the incentives to reduce the most harmful GHGs are the strongest.
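
To make the weighting idea concrete, here is a minimal sketch of how a pricing scheme might convert a mixed basket of emissions into CO2-equivalent terms using 100-year global warming potentials. The methane value is the IPCC AR4 figure; the black carbon weight, the emission quantities, and the price are purely illustrative assumptions, not figures from the study discussed above.

```python
# Hedged sketch: pricing a basket of emissions in CO2-equivalent terms.
# GWP values are 100-year figures (CO2 = 1 by definition, CH4 = 25 per IPCC AR4);
# the black carbon weight and all quantities below are purely illustrative.

GWP_100 = {
    "CO2": 1,      # by definition
    "CH4": 25,     # IPCC AR4 100-year value
    "BC": 900,     # black carbon: highly uncertain, illustrative only
}

emissions_tonnes = {"CO2": 1000.0, "CH4": 10.0, "BC": 0.5}  # hypothetical emitter
carbon_price_per_tonne_co2e = 30.0                          # hypothetical price

total_co2e = sum(GWP_100[gas] * qty for gas, qty in emissions_tonnes.items())
print(f"Total emissions: {total_co2e:.0f} t CO2e")
print(f"Carbon price owed: ${total_co2e * carbon_price_per_tonne_co2e:,.0f}")
```

Under a scheme like this, a tonne of avoided methane would be worth far more than a tonne of avoided CO2, which is exactly the incentive structure the study's attribution figures would seem to support.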

The Secret Sentry

Two red leaves

Less famous than the Central Intelligence Agency (CIA), the American National Security Agency (NSA) is actually a far larger organization. It also supplies the majority of the intelligence material provided to the president daily. Matthew Aid's The Secret Sentry: The Untold History of the National Security Agency tracks the history of the organization from the end of the Second World War to the recent past. While the book contains a fair bit of interesting information, it suffers from some significant flaws. Notably, it is very thin on technical detail, is not written from a neutral point of view, and is not always effective at putting the role of intelligence in context.

Aid's book contains virtually no technical information on the main work of the NSA: codebreaking and traffic analysis. Confusingly, it doesn't even clearly indicate that a properly implemented one-time pad (OTP) is an entirely secure method of communication, if not a very convenient one. For those hoping to gain insight into the past or present capabilities of the NSA, this book is not helpful. It does provide some historical background on when the US was and was not able to read codes employed by various governments, but does not explore the reasons why. It certainly doesn't consider the kind of non-mathematical operations that often play a crucial role in overcoming enemy cryptography: whether that is exploiting mistakes in implementation, or 'black bag' operations in which equipment and materials are stolen. On all these matters, David Kahn's book is a far superior resource. Personally, there is nothing I would rather know about the NSA than how successfully it can break public key encryption systems of the kind used in web browsers and commercial encryption software.
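
For readers wondering why the OTP deserves that description, here is a minimal sketch of the idea, assuming a key that is truly random, as long as the message, kept secret, and never reused. It illustrates the principle only; it is not a description of anything in the book.

```python
# Minimal one-time pad sketch: XOR the message with a truly random key of equal
# length. Security depends entirely on the key being random, secret, and never
# reused; this is an illustration of the principle, not production cryptography.
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

# Decryption is the same XOR operation applied again with the same key.
otp_decrypt = otp_encrypt

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # truly random, used once, then destroyed
ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message
```

Because every possible plaintext of the same length is equally consistent with a given ciphertext, the ciphertext alone reveals nothing; the practical weaknesses all lie in generating, distributing, and protecting the keys.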

The Secret Sentry consists largely of brief biographies of NSA directors interspersed with accounts of the numerous conflicts in which the NSA has been involved. The most extensively described of these are the Vietnam War and the ongoing conflicts in Afghanistan and Iraq. The information on the Gulf of Tonkin incident is quite interesting, given the way it shows how intelligence can be misused by politicians spoiling for a fight (as obviously happened again with Iraq in 2003). Indeed, some of the best information in the book concerns how intelligence can be used both well and badly. For example, it discusses how keeping sources and methods secret makes intelligence less credible in the eyes of those making choices partly based upon it. At the same time, having sources and methods revealed reduces the likelihood that current intelligence techniques will continue to work. On the politics surrounding intelligence, it was also interesting to read about how the NSA was involved in bugging UN officials and representatives during the lead-up to the 2003 invasion of Iraq. The book is also strong when it comes to providing examples of policy-makers ignoring intelligence advice that conflicts with what they want to believe – as well as explanations of why there was no prior warning before major events like the fall of the Soviet Union, the Yom Kippur War, or September 11th, 2001. In each case, it describes how the various bits of information that would have gone into such warnings were not pieced together and properly understood in time.

The book contains a number of errors and unclear statements that I was able to identify. In addition to the aforementioned matter of the cryptosecurity of the OTP, I think it is wrong to say that the 1983 Marine barracks bombing in Lebanon was the world's largest non-nuclear explosion. The Minor Scale and Misty Picture tests were larger – as was the Halifax Explosion. The term JDAM refers to a guidance kit that can be attached to regular bombs, not a kind of bunker buster. Also, GPS receivers determine their locations by measuring how long signals from satellites take to reach them – they are not devices that automatically broadcast their own location in a way that can be triangulated by others. These errors make me fairly confident that the book contains others I was not able to identify.
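
To make the GPS point concrete, here is a toy two-dimensional sketch of how a receiver can compute its own position purely from measured signal travel times, using made-up satellite positions and ideal clocks. Real GPS works in three dimensions and also solves for the receiver's clock error, so this is only an illustration of the passive principle.

```python
# Toy 2-D illustration of position-from-travel-time: the receiver converts
# measured signal delays into ranges and solves for its own coordinates.
# Satellite positions and timings are made up; nothing is broadcast back.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

satellites = np.array([[0.0, 20_000_000.0],
                       [15_000_000.0, 18_000_000.0],
                       [-12_000_000.0, 17_000_000.0]])
true_position = np.array([1_000_000.0, 2_000_000.0])

# Travel times the receiver would measure (its only input in this sketch).
travel_times = np.linalg.norm(satellites - true_position, axis=1) / C
ranges = travel_times * C

# Linearize by subtracting the first range equation from the others, then solve.
x1, y1 = satellites[0]
A = 2 * (satellites[1:] - satellites[0])
b = (np.sum(satellites[1:] ** 2, axis=1) - ranges[1:] ** 2) \
    - (x1 ** 2 + y1 ** 2 - ranges[0] ** 2)
estimate = np.linalg.lstsq(A, b, rcond=None)[0]
print(estimate)  # recovers roughly [1e6, 2e6] without transmitting anything
```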

The book also has a somewhat perplexing structure. Roughly chronological, it is written in the form of little vignettes with headings. An example of how disjointed this can seem is found in the chapter on the Reagan and Bush Senior administrations. On one page, it describes the tenure of William Odom as NSA director. It then jumps into a short description of America's signals intelligence (SIGINT) satellite technology at the time. Then, before the page is done, it jumps to the topic of Ronald Pelton selling NSA secrets to the Soviets. One sometimes gets the sense that the order of these chapter sub-units was jostled after they were written. Terms and abbreviations are sometimes explained well after their first use, and sometimes not at all. Bewilderingly, the Walker-Whitworth spy ring is mentioned only in passing, in a single sentence, and yet is included in the index.

The Secret Sentry shows a lack of objectivity that becomes more acute as it progresses, culminating in tirades against the 2003 invasion of Iraq and the NSA's controversial domestic wiretap program. While there are certainly grounds for criticizing both, it is arguably the role of a historian to provide facts and analysis, rather than moral or legal judgments. It is also a bit odd to see an attack on one American armoured vehicle described as 'tragic' while the destruction of large Iraqi military formations is discussed only in factual terms. It would also have been welcome for the book to include more information on how those outside the United States have perceived the NSA, and on the SIGINT capabilities of states not allied with the US.

Perhaps a second edition will eventually correct some of this book’s flaws. That would be welcome, since the topic is an important one. While the record of the NSA at providing useful intelligence is checkered, it is almost certainly the most capable SIGINT organization in the world today. Its future actions will have implications for both the privacy of individuals and for geopolitics and future conflicts.

Inheritance law in Europe

Wheat stalks

One thing I didn't know about continental Europe is that, in many countries there, inheritance isn't something you can freely allocate in your will. If you want to give it all to charity, tough luck: that is impossible and illegal. Instead, you are obligated to leave a set portion of your total estate to your children, divided equally among them. This is referred to as "forced heirship." There are even provisions in place to "claw back" money given away in the last few years of life, so as to prevent people from circumventing the heirship rules by donating while alive. As such, if you give a big dollop of money to a charity and die a few years later (less than two in Austria, or ten in Germany), the state might take it back and give some of it to your children.

This all strikes me as rather batty and weird. After all, the privilege of being able to decide where your wealth goes after death is a natural extension of private property rights in general (though one reasonably subject to things like inheritance taxes). Particularly in the case of very wealthy individuals, you could also argue that giving a large set share of the estate to each child will do more harm than good. This is what Bill Gates, Warren Buffett and others have argued when setting up their wills to give only a small fraction of their wealth to their children. Indeed, one of the major consolations associated with the way wealth tends to concentrate is that people who assemble truly colossal heaps of it often give a lot to charity as they age and die. Gates is certainly an example, as were Carnegie and others. The European system seems more inclined towards the establishment of dynasties. That said, it is certainly possible for people who have been given the ability to choose who will inherit their estate to make that choice poorly. There are definitely worse options than even distribution among children. Cases of people leaving their estates to their pets spring to mind.

In practical terms, there are lots of ways people could work their way around such requirements. They could hold much of their wealth in jurisdictions where the law is different. They could also convert most of their wealth into a life annuity upon retirement. It would be interesting to know what proportion of people use such mechanisms in European countries, and how they are distributed between different levels of wealth.

The Rebel XS and the 20D

Heron in Dow's Lake, Ottawa

Unfortunately, my year-old Canon Rebel XS suffered some kind of failure on Saturday: constantly reading 'busy' on the status display and being unable to take photos. Henry's is sending it back to Canon for repair, and estimates it will be away for 4-6 weeks. Quite kindly, when they heard that I was planning to take photos for the Fill the Hill event, they lent me a 20D for the weekend.

The 20D is an older camera positioned at a higher level than the Rebel XS. It is larger and sturdier, and feels more substantial. It also feels more balanced with heavy lenses like my 70-200. Two things I really like about it are the shutter release sound (which seems a lot more pleasing and professional than the Rebel XS) and the intangible sense that this camera is always eager to take photos. Pressing the shutter feels like allowing it to follow through with a restrained urge. Part of that feeling may come from the absurdly fast burst shooting speed.

I do have some complaints about the 20D. Some of the controls are very confusing. For instance, the on-off switch has three positions. In one 'on' mode, you can use the rear control wheel for exposure compensation, once you have half-depressed the shutter button. Nobody would ever guess that, and I spent a good 20 minutes trying to figure out how to undo the -1/3 correction I accidentally applied (I eventually got it back to 0 by switching from 1/3 stop increments to 1/2 stop). The screen is much smaller than the one on the Rebel XS, so it isn't really all that useful for reviewing images in the field. Also, the processor is slower, meaning that photos take longer to download.

All told, I now have a better understanding of why people buy Canon’s $1000ish cameras, when their features are mostly the same as those in their $500ish cameras. The 20D certainly looks and feels more professional than the Rebel XS. That being said, I think I will stick with my plan of saving up and eventually buying a dSLR in the much more costly category of those with full-frame sensors.

P.S. With my Rebel XS away, it may be tough to produce nice photos of the day for the next month or so. I went out and took a heap of fall photos today, to try to see me through the dry spell. If I do end up going to a family reunion in Vermont in November, I will probably rent a dSLR (and maybe the 24-70 f/2.8L lens) for the duration.

P.P.S. One other lesson from all this is that megapixels really don’t matter. Which has more, the Rebel XS or the 20D? I don’t know, and it doesn’t matter in the slightest.

[Update: 14 November 2009] The Rebel XS came back from Canon with a new flaw introduced.

[Update: 14 June 2010] Recently, the electrical system on the Rebel XS failed again. Rather than get a replacement under the Henry’s service plan, I got credit towards a 5D Mark II.

LED lighting, effectiveness and efficiency

Perhaps the only thing that will ever silence the various overblown objections to compact fluorescent lights is their replacement by solid-state lighting systems based on light emitting diodes (LEDs).

Unfortunately, as a post on BoingBoing points out, there is still a way to go before such lighting systems will be viable options for most people. For one thing, they are still expensive. For another, the light they produce may not be white to begin with, or may drift away from white over time. Worse, it is very difficult for people to distinguish between high and low quality products currently on offer.

Another problem is that LEDs aren’t unambiguously more efficient than fluorescent lighting systems: “The more lumens per watt, the better the energy efficiency. The kind of fluorescent lamps used in offices–the long, narrow ones that are called T-5 or T-8s in Technicalland–regularly get more than 100 lumens per watt. An LED T-8 lamp tested by CALiPER last year got 42.” It seems those of us pushing for more energy efficient lighting may have to continue rebutting claims about mercury and flickering for some time yet.
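
To put those efficacy figures in context, here is a quick sketch of the electrical power needed to produce the same amount of light at the two quoted efficiencies; the target lumen figure is an arbitrary assumption for illustration.

```python
# Rough comparison of power draw at the efficacies quoted above.
# The target light output is an arbitrary illustrative value.
target_lumens = 20_000          # e.g. a small office area (assumption)
fluorescent_lm_per_w = 100      # typical T-5/T-8 figure quoted above
led_t8_lm_per_w = 42            # CALiPER-tested LED T-8 figure quoted above

for name, efficacy in [("Fluorescent T-8", fluorescent_lm_per_w),
                       ("LED T-8 (tested)", led_t8_lm_per_w)]:
    watts = target_lumens / efficacy
    print(f"{name}: {watts:.0f} W to produce {target_lumens} lumens")
```

At those numbers, the tested LED tube would draw more than twice the power of the fluorescent it is meant to replace, which is the nub of the efficiency objection.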

A trillion tonnes of carbon

Previously, I described how Andrew Weaver used different estimates of how sensitive the Earth's climate is to greenhouse gases to determine what total quantity of emissions humanity can produce without causing more than 2°C of warming. The 2°C figure is commonly cited as the level of warming that is unambiguously 'dangerous' – either because of the harm it would do directly or because warming to that point would kick off positive feedbacks that would then make the planet hotter still.

A new site simplifies this analysis, arguing only that: “If we are to limit global warming caused by carbon dioxide emissions to less than 2°C, widely regarded as necessary to avoid dangerous climate change, we need to limit total cumulative emissions to less (possibly much less) than” one trillion tonnes of carbon (equivalent to 3.67 trillion tonnes of CO2). This is probably too high an estimate, given that the IPCC estimates climate sensitivity to be between 3.6°C and 4.5°C. At the low end, that means we need to cap total emissions below 0.661 trillion tonnes of carbon; at the high end, the limit would be 0.484 trillion tonnes. The website estimates that our emissions to date are around 0.555 trillion tonnes.

In the event that actual climate sensitivity is a high but possible 8°C, cumulative emissions of just 0.163 trillion tonnes of carbon would be enough to produce 2°C of warming.

Still, ‘trillionth tonne’ is an accessible concept and it is interesting to watch the numbers update in real time. One especially interesting figure is this one: “We would not release the trillionth tonne if emissions were to start falling immediately and indefinitely at…” At present, their estimate is about 2.1% per year. A higher rate of reduction is necessary if the trillion tonne figure proves overly high.
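
As a rough sanity check on that 2.1% figure, the sketch below assumes emissions decline exponentially forever, in which case cumulative future emissions equal current annual emissions divided by the decline rate. The current-emissions figure is my own approximation rather than one taken from the site, so treat the output as indicative only.

```python
# Hedged sketch: if emissions fall exponentially at rate r forever, cumulative
# future emissions are (current annual emissions) / r. Solve for the r that
# keeps the world under a given cumulative-carbon budget.
trillionth_tonne_budget = 1.0       # trillion tonnes of carbon (Tt C)
emitted_so_far = 0.555              # Tt C, the site's estimate quoted above
current_emissions = 0.0095          # Tt C per year (~9.5 Gt C/yr, my assumption)

remaining = trillionth_tonne_budget - emitted_so_far
print(f"Required annual decline: {current_emissions / remaining:.1%}")

# Under a tighter total budget (e.g. 0.661 Tt C), the required rate rises sharply.
tighter_remaining = 0.661 - emitted_so_far
print(f"Tighter budget: {current_emissions / tighter_remaining:.1%} per year")
```

Running this gives roughly 2% per year for the trillion tonne budget and closer to 9% per year for the tighter one, which illustrates just how much hangs on the climate sensitivity question.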

Smart grid skepticism

The Economist argues that the popularity of so-called 'smart' electrical grids is cause for suspicion. The fact that builders of renewable energy plants and operators of dirty coal plants are both on side suggests that the grids will not, in and of themselves, produce a push towards reduced greenhouse gas emissions. Indeed, if they reduce the amount of extra capacity required and cut energy prices through greater efficiency, they might encourage increased usage and thus increased emissions.

The point is well taken, as is the argument that legislation is required to ensure that new technologies actually lead to climate change mitigation. Without government-created incentives like carbon pricing, we cannot assume that technological advancement and voluntary action will lead to reduced emissions.

The military importance of space

Cluster of security cameras

Given that unmanned aerial vehicles (UAVs) are not yet particularly autonomous, they are generally operated remotely by people. Apparently, the transmission system and encryption used between UAV operators in Nevada and the drones they are piloting in Afghanistan and Pakistan introduce a 1.7 second delay between commands being given and responses being received. As a result, take-off and landing need to be handled by a team located within the theatre of operations, since these activities require more nimble responses. The Broad Area Maritime Surveillance system being considered by the US Navy will require much more dynamic communication capabilities, of the sort that can probably only be conveniently provided from orbit.
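
For a sense of where such delays come from, here is a back-of-the-envelope sketch of the propagation delay alone for a relay through a geostationary satellite. The published 1.7 second figure also includes terrestrial network hops, encryption, and processing, and the hop count below is an assumption, so this only bounds one component from below.

```python
# Back-of-the-envelope propagation delay for geostationary relay hops.
# Uses the satellite's altitude as the path length, so real slant ranges
# (and therefore delays) would be somewhat longer.
C = 299_792.458          # speed of light, km/s
GEO_ALTITUDE = 35_786.0  # km above the equator

one_way = GEO_ALTITUDE / C                   # ground -> satellite
ground_to_ground = 2 * one_way               # ground -> satellite -> ground
command_and_response = 2 * ground_to_ground  # out to the aircraft and back

print(f"Single hop up or down: {one_way * 1000:.0f} ms")
print(f"One ground-to-ground relay: {ground_to_ground * 1000:.0f} ms")
print(f"Command plus response via satellite: {command_and_response * 1000:.0f} ms")
```

Even this idealized path accounts for roughly half a second of the delay, before any routing, encryption, or ground processing is considered.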

This is just one example of the way in which the operation of armed forces – and especially the American armed forces – is increasingly dependent on their capabilities in space. From communications to intelligence to navigation, satellites have become essential. That, in turn, makes the capability to interfere with satellites highly strategic. The umbrage taken by the US and others at the 2007 Chinese anti-satellite missile test demonstrates this. The test also illustrates the major dangers associated with creating debris in orbit. If enough such material were ever to accumulate, it could make the use of certain orbits hazardous or impossible. The 2009 Iridium satellite collision demonstrates how debris clouds can also arise from accidental events, which will become both more common and more threatening as more and more assets are placed in orbit. That crash created about 600 large pieces of debris that remain in low Earth orbit.

In the next few decades, we will probably see a lot of development when it comes to the weaponization of space, including (quite probably) the placement of offensive weapons in orbit, the proliferation of ground-based weapons that target satellites, and the deployment of weapons intended to counter those weapons (a significant secondary purpose for ballistic missile defence technologies).

Three strikes rules for internet piracy

Charline Dequincey with her violin

The British ISP TalkTalk has been working to show why banning people from the internet, based on unproven allegations of piracy, is a bad idea. Specifically, they have highlighted how many people still use WEP to protect their wireless networks from use by strangers, despite the fact that WEP encryption is easily compromised. That means it is easy for someone to use software tools to access a nearby network and then use it for illegal purposes. My own experience with wireless networks has demonstrated that people really will use them for criminal purposes if they can gain access.

Beyond that, the idea of cutting people off on the basis of three accusations alone runs fundamentally contrary to the presumption of innocence in our system of justice. It would inevitably be abused by copyright holders, and it would inevitably lead to innocent people being cut off from the internet, an increasingly vital part of life for almost everyone. Indeed, Finland recently declared broadband access a right.

To me, the fact that laws like this may well emerge in France, the UK, and elsewhere seems like another example of just how badly broken our intellectual property (IP) systems are, and how badly skewed they are towards protecting the rights of IP owners rather than the public at large. We would be a lot better off if patents were granted more selectively, if licensing of them was mandatory, if copyright was less well defended and expired sooner, and if fair use rights were more effectively legally enshrined. Here’s hoping ‘pirate parties’ continue to proliferate, pushing back the IP laws that have become so unfairly weighted towards those who own the content.

After all, it needs to be remembered that there is nothing libertarian or natural about IP protection. Rather, content owners are having their property claims enforced by the mechanisms of the state. The justification for this is supposed to be that doing so serves the public interest; if that is no longer the case, the laws ought to be watered down or scrapped.

Supporters of 350, understand what you are proposing

Translucent leaves

The 350 movement is a group concerned about climate change that has adopted an upper limit of 350 parts per million (ppm) of carbon dioxide equivalent in the atmosphere as its target. The target is a good and extremely ambitious one, and the group is doing well in the media. That said, I worry that some 350 proponents don't understand what they are arguing for.

The carbon cycle

To understand climate change, you need to understand the carbon cycle. In a normal situation, this refers to carbon in sugars being released as CO2 when animals, bacteria, and fungi metabolize them. This adds CO2 to the atmosphere. In turn, green plants use sunlight to make sugars out of CO2, releasing oxygen. These processes happen in a balanced way, with more CO2 emission in the winter (when plants are inactive) and more CO2 absorption in the summer.

Alongside the biological processes are geological ones. Two are key. Volcanoes emit greenhouse gases, and the weathering of certain kinds of rock locks up CO2 underground. The latter process happens very slowly. It is very important to understand that this is the only long-term phenomenon that keeps on drawing CO2 out of the atmosphere. The oceans will suck it up when CO2 accumulates in the air, but only until the seas become more acidic and come into balance. CO2 likewise accumulates in the biomass of living things, but there can only be so many forests and so much plankton on Earth.

When we burn fossil fuels, we add to the concentration of greenhouse gases in the atmosphere. Before the Industrial Revolution, it was around 280 ppm. Now, it is about 383 ppm and rising by 2 ppm per year.
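
To make the arithmetic concrete, here is a small sketch projecting when various concentration thresholds would be crossed if the increase simply continued at the current rate. This is a simplification, since the rate of increase has itself been rising, and the base year is an assumption reflecting when these figures were current.

```python
# Simple projection assuming the concentration keeps rising at a constant
# 2 ppm per year from roughly 383 ppm; in reality the annual increase has
# itself been growing, so these dates are optimistic.
current_ppm = 383
rate_ppm_per_year = 2
base_year = 2009  # assumed base year for the figures quoted above

for threshold in (400, 450, 550):
    years = (threshold - current_ppm) / rate_ppm_per_year
    print(f"{threshold} ppm reached around {base_year + years:.0f}")
```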

What 350 means

It isn't impossible to get back to 350 ppm. This is because the oceans haven't caught up with the atmosphere yet. If we suddenly stopped burning coal, oil, and gas, the quantity of CO2 in the atmosphere would start to fall as more of it went into the sea. That being said, if we keep burning these fuels in the way we are now, getting back to 350 ppm will become impossible.

When you argue to cap the atmospheric concentration at 350, you are arguing to cut the net human emissions of the entire planet to zero – and to do so before we cross the point where the oceans can’t draw us back under the number. The same is true if you argue for stabilizing at a higher level, such as 450 ppm or 550 ppm; those scenarios just give us more time to keep emitting before we reach zero net emissions. When you support 350 ppm, you are committing to keeping the great majority of the carbon bound up in remaining fossil fuels underground and unused by human beings. ‘Net’ human emissions means everything that goes up into the air from burning fossil fuels, minus the trickle of CO2 into rocks (as described above) and possibly minus whatever CO2 we can suck out of the air and bury (a costly and energy-intensive procedure).

Cutting net human emissions to zero is a laudable aim. Indeed, it is the only way concentrations (and global temperatures) can ever be kept stable in perpetuity. I just hope that more 350 supporters will come to understand and accept that, and realize that achieving that ambition requires massive societal change, not just marches and savvy media campaigns.

P.S. If all that isn’t enough of a challenge, remember that there are also positive feedback effects within the climate system where, once we kick off a bit of warming, CO2 concentrations rise on their own in response. These feedbacks include melting permafrost and burning rainforests. Keeping below 350 ppm requires cutting net human emissions to zero before these positive feedbacks commit us to crossing the threshold.

Why Bury Coal? explains this in more detail.

[Update: 24 March 2011] Some of what I just added to the bottom of my Earth Hour post is also relevant here, in that symbolic acts can help environmental groups attract attention, even if the acts capturing that attention are dubious in some ways. 350.org should be commended for drawing so much general public attention to the issue.