Filling the gaps in chapter two

St Anne’s College, Oxford

The conclusion from working on my second chapter is that I have read too much general background material and not enough on my case studies. I am fairly well covered on POPs, since I have done research on them before. Naturally, adding a few more sources would be nice, though there are not really a great many out there. I am also quite well covered on current events relating to climate change, because there has been such a raft of coverage and discussion. While my intention has never been to write a blow-by-blow account of either (how could I possibly do so in 30,000 words?), it is certainly necessary to have a comprehensive understanding of the history before any important and valid analysis can be done.

As such, I need to fill in my knowledge on recent developments pertaining to POPs, which should not be hugely difficult. Then, I need to shore up my section on the early history of the climate change debate. Aside from the mandatory OUSSG dinner and talk tonight, I suspect this will fill the next 32 hours. Naturally, I am interpreting my promise to Dr. Hurrell of having a second chapter dropped off at Nuffield by Wednesday as having that chapter dropped off, by my own hand, in time for him to read it on Thursday morning.

Framing, selection, and presentation issues

Harris Manchester College, Oxford

One of the major issues that arises when examining the connections between science and policy is the way information is framed. You can say that the rate of skin cancer caused by a particular phenomenon has increased from one in ten million cases to one in a million cases. You can say that the rate has increased tenfold, or that it has gone up by 900%. Finally, you could say that an individual’s chances of getting skin cancer from this source have gone up from one tiny figure to a larger, but still tiny seeming, figure. People seem to perceive the risks involved in each presentation differently, and people pushing for one policy or another can manipulate that. This can be especially true when the situations being described are not comparably rare: having your chances of being killed through domestic violence reduced by 1% can be a much greater absolute reduction than having your chances of dying in a terrorist attack reduced by 90%.
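To make the arithmetic behind these framings concrete, here is a minimal Python sketch. The skin cancer figures are the ones from the paragraph above; the baseline risks in the domestic violence versus terrorism comparison are invented purely for illustration:

```python
# The same change in risk, framed three different ways.

old_risk = 1 / 10_000_000   # one in ten million
new_risk = 1 / 1_000_000    # one in a million

# Relative framing: a tenfold increase, i.e. up by 900%.
relative_increase = new_risk / old_risk            # ~10
percent_increase = (relative_increase - 1) * 100   # ~900

# Absolute framing: the added risk remains tiny.
absolute_increase = new_risk - old_risk            # 0.0000009

print(f"{relative_increase:.0f}x increase ({percent_increase:.0f}%)")
print(f"absolute increase: {absolute_increase:.7f}")

# Why a 1% cut in a common risk can outweigh a 90% cut in a rare one.
# (Baseline figures below are hypothetical, chosen only to illustrate.)
common_risk = 1 / 1_000       # e.g. a relatively common cause of death
rare_risk = 1 / 1_000_000     # e.g. a very rare cause of death
saved_common = common_risk * 0.01   # absolute risk removed by a 1% cut
saved_rare = rare_risk * 0.90       # absolute risk removed by a 90% cut
print(saved_common > saved_rare)    # the 1% cut removes more risk
```

The point the code makes is the same as the paragraph: relative and absolute framings of identical numbers can pull intuitions in opposite directions.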

Graphing

When talking about presentation of information, graphs are an important case. Normally, they are a great boon to understanding. A row of figures means very little to most people, but a graph provides a wealth of comprehensible information. You can see if there is a trend, what direction it is in, and approximately how strong it is. The right sort of graph, properly presented, can immediately illuminate the meaning of a dataset. Likewise, it can provide a compelling argument: at least among those who disagree more about what is going on than about how it would be appropriate to respond.

People see patterns intuitively, though sometimes they see order in chaos (the man in the moon, images of the Virgin Mary in cheese sandwiches). Even better, they have an automatic grasp of calculus. People who couldn’t tell you a thing about concavity and the second derivative can immediately see when a slope is upwards and growing ever steeper, or when something is increasing or decreasing at a decreasing rate. They can see which trends will level off, and which will explode off the scale. My post on global warming damage curves illustrates this.
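That intuitive grasp of the first and second derivative can be made explicit with finite differences. A small sketch (the example series and the classification labels are my own invention, for illustration):

```python
# Classify the shape of a trend from its first and second differences,
# mimicking what the eye does instinctively with a graph.

def trend_shape(series):
    first = [b - a for a, b in zip(series, series[1:])]   # ~ first derivative
    second = [b - a for a, b in zip(first, first[1:])]    # ~ second derivative
    rising = sum(first) > 0
    accelerating = sum(second) > 0
    if rising and accelerating:
        return "increasing ever steeper (may explode off the scale)"
    if rising and not accelerating:
        return "increasing at a decreasing rate (likely to level off)"
    if not rising and accelerating:
        return "decreasing, but flattening out"
    return "decreasing ever faster"

exploding = [1, 2, 4, 8, 16, 32]       # concave up, rising
levelling = [1, 6, 9, 11, 12, 12.5]    # concave down, rising

print(trend_shape(exploding))
print(trend_shape(levelling))
```

The code is crude (it looks only at the overall sums of the differences), but it captures the four qualitative shapes a viewer reads off a graph at a glance.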

Naturally, it is possible to use graphs in a manipulative way. You can tweak the scale, use a broken scale, or use a logarithmic scale without making clear what that means. You can position pie charts so that one part or another is emphasized, as well as abuse colour and three dimensional effects. That said, the advantages of graphs clearly outweigh the risks.

It is interesting to note how central a role one graph seems to have played in the debate about CFCs and ozone: the one showing the concentration of chlorine in the stratosphere. Since chlorine is what CFCs break down to produce, and what causes the breakdown of ozone, its concentration is clearly important. The graph clearly showing that concentrations would continue to rise, even under the original Montreal Protocol, seems to have had a big impact on the two rounds of further tightening. Perhaps the graph used so prominently by Al Gore in An Inconvenient Truth (the trends on display literally dwarfing him) will eventually have a similar effect.

Stats in recent personal experience

My six-month-old Etymotic ER6i headphones are being returned to the manufacturer tomorrow, because of the problems with the connector I reported earlier. Really not something you expect from such a premium product, but I suppose there are always going to be some defects that arise in a manufacturing process. Of course, being without good noise-isolating headphones for the time it will take them to be shipped to the US, repaired or replaced, and returned means that reading in coffee shops is not a possibility. Their advantage over libraries only exists when you are capable of excluding the great majority of outside noise and drowning the rest in suitable music.

Speaking of trends, I do wonder why so many of my electronics seem to run into problems. I think this is due to a host of selection effects. I (a) have more electronics than most people (b) use them a great deal (c) know how they are meant to work (d) know what sort of warranties they have and for how long (e) treat them so carefully that manufacturers can never claim they were abused (f) maintain a willingness to return defective products, as many times as is necessary and possible under the warranty. Given all that, it is not surprising that my own experience with electronics failing and being replaced under warranty is a lot greater than what you might estimate the background rate of such activity to be.

Two other considerations are also relevant. It is cheaper for manufacturers to rely upon consumers to test whether a particular item is defective, especially since some consumers will lose the item, abuse it, or simply not bother to return it even if defective. Secondly, it is almost always cheaper to simply replace consumer electronics than to fix them, because manufacturing enjoys economies of scale that repair does not. From one perspective, it seems wasteful. From another, it seems the more frugal option. A bit of a paradox, really.

[14 March 2007] My replacement Etymotic headphones arrived today. Reading in coffee shops is possible again, and none too soon.

The identification of environmental problems

The identification of an environmental ‘problem’ is not a single crystalline moment of transition from ignorance to understanding. Rather, it is ambiguous, contingent, and dependent upon the roles and modes of thinking of the actors involved, and the values that inform their judgments. Rather like Thomas Kuhn’s example about the discovery of oxygen (with different people accessing different aspects of the element’s nature, and understanding it in different contexts), the emergence of what is perceived as a new environmental problem occurs at the confluence of facts, roles, and existing understandings. While one or more causal connections ultimately form the core of how an environmental problem is understood, they are given comprehensibility and salience as the result of factors that are not strictly rational. From the perspective of global environmental politics and international relations, environmental problems are best understood as complexes of facts and judgments: human understandings that are subjective and dynamic, even though elements of their composition are firmly grounded in the empirical realities of the world.

POPs and climate change

Consider first the case of persistent organic pollutants (POPs). The toxicity of chemicals like dioxins was known well before any of the key events that led to the Stockholm Convention. At the time, the problem of POPs was largely understood as one of local contamination by direct application or short distance dispersal. It took the combination of the observation of these chemicals in an unexpected place, the development of an explanation for how this had transpired, and a set of moral judgments about acceptable and unacceptable human conduct to form the present characterization of the problem. That understanding in turn forms the basis for political action, the generation of international law, and the investigation of techniques and technologies for mitigating the problem as now understood. Even now, the specific chemicals chosen and the particular individuals whose interests are best represented are partly the product of political and bureaucratic factors.

If we accept former American Vice President Al Gore’s history of climate change, the form of problem identification is even more remarkable. He asserts that the discovery of rising atmospheric CO2 concentrations by Roger Revelle in the 1960s, rather than the direct observation of specific changes to the global climatic system, was what prompted the initial concern of some scientists and policy makers. This is akin to how the 1974 paper by Mario Molina and F.S. Rowland established the chemical basis for stratospheric ozone depletion by CFCs which, in turn, led to considerable action before their supposition was empirically confirmed. Gore’s characterization of the initial discovery of the climate change problem also offers glimpses into some of the heuristic mechanisms people use to evaluate key information, deciding which arguments, individuals, and organizations are trustworthy and then prioritizing ideas and actions.

Definition and initial implications

For the present moment, environmental ‘problems’ will be defined as the consequences of unintentional (though not necessarily unanticipated) side effects of human activity in the world. While mining may release heavy metals into the natural environment, this did not crystallize in people’s minds as a problem until the harm those metals caused to human beings and other biological systems became evident. While the empirical reality of heavy metal buildup may have preceded any human understanding of the issue, it could not really be understood as an environmental problem at that time. It only became so through the confluence of data about the world, a causal understanding linking actions and outcomes, and moral judgments about what is right or desirable. Likewise, while lightning storms cause harm both to humans and other biological systems, their apparent status as an integral component of nature, rather than the product of human activities, makes them something other than an environmental problem as here described. Of course, if it were shown, for example, that climate change was increasing the frequency and severity of thunderstorms (a human behaviour causing an unwanted outcome, through a comprehensible causal link), then that additional damage could be understood as an environmental problem in the sense of the term used here.

Worth noting is the possibility of a dilemma between two sets of preferences and understandings: the alleviation of one environmental problem, for instance by regulating the usage of DDT, may reduce the extent to which another problem can be addressed, such as the possibility of increased prevalence of malaria in a warmer world. It is likewise entirely possible that different groups of people could ascribe different value judgments to the same empirical phenomena. For instance, ranchers and conservationists disagree about whether or not it is desirable to have wild wolves in the western United States.

Problem identification, investigation, and the formulation of understandings about the connections between human activity and the natural world do not comprise a linear progression. This is partially the product of how human psychological processes develop and maintain understandings about the world and partly the consequence of the nature of scientific investigation and political and moral deliberation. Existing understandings can be subjected to shocks caused by either new data or new ideas. Changed understandings in one area of inquiry can prompt the identification of possible problems in another. Finally, the processes and characteristics of problem investigation are conditioned by heuristic, political, and bureaucratic factors that will be discussed at greater length below.

Problematizing the origin of environmental problems as human understandings does not simply add complexity to the debate. It generates possibilities for a more rigorous understanding of the relationship between human beings and nature (including perceptions about why the two are so often seen as distinct). It also offers the possibility of dealing with dilemmas like the example above in a more informed and effective manner.

Nicholas Stern on climate change

St Edmund Hall, Oxford

During the initial coverage of Nicholas Stern’s report on the economics of climate change, I wondered why the media was paying so much attention. After all, the man is an economist reporting on something that scores of scientists have addressed comprehensively through the IPCC process. Now that I have heard him lecture, and spoken briefly with him personally, I have a much better sense. The man is what Karen Litfin calls a ‘knowledge broker,’ translating scientific data into policy options.

His basic position is the realistic liberal optimist one:

  1. Climate change is real and potentially devastating
  2. It is essentially a massive economic externality
  3. Regulating greenhouse gas (GHG) emissions is the way to stop it
  4. This can be done at moderate cost (1% of GDP) and without a massive change in (a) the basis of economic activity within the developed world or (b) the way in which people choose to live their lives.

He acknowledges that the energy balance needs to shift dramatically. In order to be responsible, he says, we need to shift all electrical production in the rich world to carbon neutral forms (renewables, nuclear, and possibly hydrocarbons with sequestration) by 2050. By that time, land transport should also be based on power sources that do not emit GHGs, whether because they are using stored electricity, or because they use fuels that are GHG neutral. India and China need to be encouraged to sequester the CO2 emitted from their coal stations, probably at the expense of the rich world. All in all, rich states should bear 60-80% of the costs of mitigation.

He focused a great deal on atmospheric CO2 levels. His target is to stabilize between 450ppm and 550ppm. This would lead to a likely scenario where mean global temperature rises by about 2 degrees Celsius (though by much more at the poles, given the nature of the climatic system). On the basis of a ‘business as usual’ projection, we will hit 450ppm in eight to ten years. To stabilize at 450ppm, we would need to slow the rate of growth in GHG emissions immediately, having it peak in 2010. Then, we would need to reduce emissions by about 6-10% a year thereafter. If we delayed the peak to 2020, we would likely end up at the 550ppm end of the range: an outcome that the German head of climate change policy expressed grave concern about during the question session. Stern himself said that 550ppm is the “absolute upper bound” which it would be “outrageous” to exceed.
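The cost of delaying the emissions peak can be illustrated with a toy trajectory model. To be clear, this is my own back-of-envelope sketch, not Stern’s model: the starting emissions level, growth rate, and time horizon are all invented, and cumulative emissions are only a rough proxy for where concentrations end up.

```python
# Toy illustration of why delaying the emissions peak forces a higher
# stabilization level. All figures are invented; this is not Stern's model.

def cumulative_emissions(peak_year, annual_cut, start=2007, end=2100,
                         start_emissions=100.0, growth=0.02):
    """Sum emissions (arbitrary units) over the period: growth of `growth`
    per year until `peak_year`, then a cut of `annual_cut` per year after."""
    total, e = 0.0, start_emissions
    for year in range(start, end + 1):
        total += e
        e *= (1 + growth) if year < peak_year else (1 - annual_cut)
    return total

# Same 6%/year post-peak cuts; only the peak year differs.
early = cumulative_emissions(peak_year=2010, annual_cut=0.06)
late = cumulative_emissions(peak_year=2020, annual_cut=0.06)

print(f"peak 2010: {early:.0f}   peak 2020: {late:.0f}")
```

Even in this crude model, a decade’s delay adds substantially to the cumulative total, which is the shape of the argument behind the 450ppm versus 550ppm outcomes.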

As for his very controversial decision about discounting rates, I think he defended himself admirably. He broke it into two bits: the possibility there will be no future generations beyond date X (they ascribed a 0.1% chance a year to an event like a comet or gamma ray burst that would simply snuff humanity out) and the strong likelihood that people in the future will be richer. The latter means that it may be economically efficient to delay some of the costs of dealing with climate change, especially given the probability that new technology will emerge.

I need to move on to other work, though I could discuss his comments for many thousands of words. I will transfer my handwritten notes to the wiki later this evening and link them here: notes from Nicholas Stern’s 21 February 2007 address to Oxford University.

PS. A few weeks ago, my default thesis music was Jason Mraz’s superb album “Live at Java Joe.” Now, I am listening to Enter The Haggis’s frantic song “Lannigan’s Ball” from their album Aerials over and over again.

The road to Kyoto plus, lessons from ozone

A lot of people seem to despair about the possibility of effective regulation of greenhouse gas emissions around the world, but the more I read about the cases of persistent organic pollutants and CFCs, the more plausible it seems, provided that a few specific and important progressions take place.

The first is the process of scaling upwards in policy levels, as seen very distinctly with CFCs. The Rowland and Molina paper that first suggested that CFCs cause stratospheric ozone degradation was published in 1974. By 1975, two US states had already banned their use as aerosol propellants (Oregon and New York). Hopefully, the progression from there to national and international regulation is one that can be emulated. Already, lots of American cities and states have signaled that they are serious about climate change, and willing to use regulation to combat it.

The second important dynamic has to do with industry expectations. Six years before CFCs became an issue in environmental regulation, DuPont – the largest manufacturer – canceled its program for developing alternatives. When it became clear that regulation was forthcoming, they were able to field some alternatives within six months, and a comprehensive range within a few years. Up to the point where regulation seemed inevitable, they continued to claim that alternatives could not be easily developed. The point here is twofold. First, it shows that the existence of solutions to environmental problems is not independent of regulation and industry expectations about future regulation. Secondly, industries that anticipate national legislation (as they began to in the US in the mid-1980s on the CFC issue) become a powerful lobby pushing government towards completing an international agreement. It is far worse for American industry to be at a disadvantage because local rules are tougher than global ones than it is simply to deal with some new rules.

Thus, an American administration that takes up the baton from the many states that have initiated their own efforts to deal with climate change might be able to create the same kind of expectations in industry. Some are already asking for regulation to “guide the market,” specifically decisions about which technologies and forms of capital to invest in. From there, it is at least possible that the US could play a key role in negotiating a successor treaty to Kyoto that begins the process of stabilizing and reducing greenhouse gas emissions.

A related point has to do with the extent to which environmental issues are heavily influenced by images and symbols. According to Karen Litfin, the Antarctic ozone hole was one of the major factors that led to the Montreal Protocol. She calls it an ‘anomaly,’ unpredicted by the atmospheric science that had been done up to that point, and thus capable of making scientists and politicians more aware of the possibility of unanticipated risks.

At his talk yesterday, Henry Shue said he is hoping for some iconic moment in climate change to play a similar galvanizing role (a bare-topped Kilimanjaro, the Larsen B collapse, drowning polar bears, and Hurricane Katrina don’t seem to have done it yet, though the connection between climate change and the last of those is not entirely established). Some spectacular and distressing (but hopefully non-lethal) demonstration of the profound effects human greenhouse gas emissions are having may be necessary to generate an urgent and powerful drive towards effective responses.

Intelligence, effort, and success

A couple of articles I came across today may be of interest to fellow students. The first, from New York Magazine, discusses possible perverse effects upon learning that arise due to how people understand intelligence. Specifically, people who believe themselves to be intelligent are more likely to choose easy tasks and less likely to apply themselves. Another article, on the website of the Association for Psychological Science, discusses ‘The Myth of Prodigy.’ It is about Malcolm Gladwell, author of The Tipping Point.

While I cannot really comment on the validity of the experimental results posited, the general idea does have the ring of truth to it. Intelligence, I think, is generally more likely to be a source of insecurity than confidence. It can always be proven hollow, or outdone by someone else. When that happens, the trend is likely to sustain itself. This I have seen in both friends and myself. It may have a lot to do with why I never learned to drive or dance, and am rather hesitant to display my ineptitude at either.

PS. As with so many other items of interest, I first found this on Metafilter. On one hand, I feel bad for just grabbing their content. On the other, I recognize that it is a very efficient way of finding interesting material. Furthermore, I am driving traffic in their direction.

Richard Branson’s $25M atmosphere challenge

Arches in brick wall

With Richard Branson offering US$25 million to someone who can come up with a system to remove greenhouse gases (most importantly, CO2) from the atmosphere, a lot of people are probably wondering whether it is a pipe dream. Aside from the obvious option of growing more plants, I would be inclined to think so. In order to separate CO2 from air, then sequester it somewhere, it seems likely that you would need a lot more capital and energy than would be required to simply switch away from fossil fuels. It’s like turning on your air conditioning because your oven is making the house too hot. I don’t doubt that it is possible, but I doubt that it is a sensible solution.
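The intuition that separating CO2 from air is energy-hungry can be checked with a standard thermodynamic lower bound: the minimum work to extract a gas present at mole fraction x from a mixture is roughly RT·ln(1/x) per mole. The sketch below is my own back-of-envelope calculation, not anything from Branson’s challenge, and real capture systems need several times this idealized minimum, before even counting compression and sequestration.

```python
import math

# Idealized thermodynamic minimum for pulling CO2 out of ambient air.

R = 8.314          # molar gas constant, J/(mol*K)
T = 298.0          # ambient temperature, K
x_co2 = 385e-6     # approximate atmospheric CO2 mole fraction, ca. 2007

# Minimum separation work per mole of CO2: RT * ln(1/x) -> ~19.5 kJ/mol
w_per_mol = R * T * math.log(1 / x_co2)

# Scale up to one tonne of CO2 (molar mass ~44 g/mol) -> ~120 kWh/tonne
mols_per_tonne = 1e6 / 44.0
kwh_per_tonne = w_per_mol * mols_per_tonne / 3.6e6

print(f"minimum work: {w_per_mol / 1000:.1f} kJ/mol, "
      f"about {kwh_per_tonne:.0f} kWh per tonne of CO2")
```

Even at this unattainable bound, capturing emissions on the scale of tens of gigatonnes a year would demand thousands of terawatt-hours, which is exactly the air-conditioning-against-the-oven problem: enormous energy spent undoing what could more cheaply not be emitted.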

That said, finding a technical solution to the greenhouse gas problem would please a very great many people. Though less likely to actually mitigate climate change, the ‘separate and sequester’ plan seems a lot more sensible than the sulfate injection plan, discussed previously. While it may be unlikely that someone will actually claim his prize (and it might distract research attention from more promising options like making more efficient solar panels), that is not to say it would be a bad thing if someone did.

Richard Casement internship

Canal in North Oxford

As one more project for the next couple of weeks, I am going to prepare a submission for the Richard Casement internship at The Economist. Since about ten people a day are finding my site by searching for that term, I am not going to give any hints about what I might write my 600 word article about. That said, I am told that such applications generally succeed through the combination of a good submission with a fortuitous personal connection with someone already inside the organization. Furthermore, their stated “aim is more to discover writing talent in a science student or scientist than scientific aptitude in a budding journalist” and I am neither of those things.

That said, I can hardly imagine a better way to spend the first three months after finishing here than writing about science in New York or London. Hopefully, my application this year will go better than the ones I submitted in past recruiting cycles.

The Resolution of Revolutions

Chapter XII of Kuhn’s Structure of Scientific Revolutions is a brilliant and highly convincing account of the historical nature of changed thinking in scientific communities, on matters fundamental enough to define paradigms. While he doesn’t use the analogy, it strikes me as being very similar to the processes of natural selection.

The first adopters of a new paradigm strike upon it for a complex combination of reasons. Included among them are vague aesthetic senses, personal prejudices, and the like. Because of the comprehensive nature of ‘normal’ scientific investigation within the existing paradigm, such meanderings are generally unlikely to be rewarded. That said, if they can win over a few people and develop to the point where they become evidently useful, they have the chance to win over the scientific community as a whole. Naturally, this is easiest to do in times of crisis: especially when the new paradigm seems to help resolve the questions that lie at the core. Kuhn rightly identifies how theories that do an especially good job of predicting effects not observed until after they were predicted are unusually good at winning converts.

Consider the development of any novel biological phenomenon. The earliest creatures to undergo a significant mutation probably get eradicated as a result. Only once an alteration is at least benign and at best somewhat useful can we expect any number of beings to be found in the world with it. One can only imagine how many trillions of bacteria snuffed themselves out in the course of random variations that eventually led to things like more efficient cellular respiration, or the development of motion by flagella, or the existence of symbiotic modes of living.

Of course, I like the analogy because it serves my earlier arguments that it is practical usefulness that permits us to argue that one scientific perspective is better than another. Technology, in particular, lets us separate fruitless theory from the fruitful sort, as well as comprehend when seemingly incompatible views are just complex reflections of one another.

The current argument about whether string theory is ‘science’ or not strikes at this directly. String theory might be seen as the evolution of a new limb that hasn’t quite proved to be terribly useful yet. Driven by the kind of aesthetic sense that made Brian Greene call his book about it “The Elegant Universe,” string theorists are engaged in the kind of development that might eventually lead to a resolution, as described by Kuhn.

PS. Part of the reason natural selection is so frequently useful for understanding what is going on in the world is because it is predicated upon an illuminating tautology: namely, that arrangements which are stable in a particular environment will perpetuate themselves, whereas those which are unstable will not. This applies to everything from virtual particle formation at the sub-atomic scale to the success and failure of businesses. That said, it should be noted that the ‘system’ in which businesses actually operate is distinctly different from the ideal form envisioned by the most vocal advocates of free markets. Crime, deceit, and exploitation may be important aspects of that system, in addition to innovation and individual acumen.

POPs and climate change as ‘anomalies’

Now nearly finished with Kuhn’s Structure of Scientific Revolutions, I am pondering how to apply it to my thesis case studies. Basically, what Kuhn has done is sketch out a theory about how scientists interact with the world and each other, generating new scientific ways of understanding the world. You start with one paradigm (say, Newtonian physics). Then, scientists begin to notice anomalies – places where the theory cannot explain what they perceive to be going on. If such anomalies are of the right sort and sufficiently numerous, they may provoke a crisis within the paradigm. At that point, the scope of science broadens a bit, to examine bigger questions and alternative possibilities. In Kuhn’s terminology, the practice of ‘normal science’ is interrupted. The crisis is resolved either through the modification of the previous paradigm or through the emergence of a new one, such as relativistic physics.

From the perspective of my thesis, the relevant discoveries are the rising global mean temperature and rising concentrations of POPs in the Arctic. Both were novel developments in our awareness and understanding of what is going on in the world, and both are the unintended products of modern economic activity. In the first case, the emission of greenhouse gases seems to be the primary cause of the change; in the second, pesticide use, industrial chemicals, and garbage burning seem to be the culprits. While scientists knew that these things were going on before the first research on POPs and climate change was done, these specific consequences were not anticipated. Their precise magnitude remains contested and uncertain.

While neither discovery induced a crisis in science (both are largely explicable using science that has existed for a long time), they did progress into general acceptance by following a pattern that is in some ways similar to that of paradigmatic development in the sciences. The researchers who first looked at POP concentrations in human blood and breast milk from the Arctic thought that the samples must have been contaminated, because they could imagine no reason for which people living in such an isolated environment would be so saturated with toxic chemicals. The establishment and operation of the Northern Contaminants Program thus involves both ‘normal science’ and the kind of thinking through which new paradigms are established. Because of such similarities, I am hoping that some of Kuhn’s insights into the ways scientists think, and especially the ways in which they make up their own minds and try to make up those of their colleagues, can be applied to the understanding of scientific perspectives on these particular environmental problems.

The biggest difference is probably how wider policy implications tend to arise from environmental discoveries in a way not parallel to the consequences of other sorts of discovery. Quantum mechanics may allow us to do new things, but it doesn’t really compel us to behave very differently. Learning about global warming, by contrast, interacts with our pre-existing notions about appropriate action by human beings in the world to suggest potentially radical changes in behaviour. While I am not saying that there is a direct or linear connection between scientific discoveries about the environment and specific policy choices, it seems valid to say that our understanding of the environment, informed by science, profoundly affects the ways in which we feel we can and should act in relation to the physical world.

On a related note, I would strongly suggest that any physicist working on string theory give Kuhn’s SoSR a careful read. The crisis in physics generated by apparent contradictions between relativity and quantum mechanics seems very much like those he describes, with similar implications in terms of how scientists are thinking and what they are doing.