The Salmon of Doubt

One more promising bit of academic news, from the MIT International Review:

Your paper is indeed still being considered (congratulations!), having made it through a particularly rigorous selection process. You will receive a more formal note to this effect in the forthcoming days.

This is, of course, the eternal fish paper, still passing through journal selection processes on its way to eternity. So much time has now passed since I wrote that paper that it feels like a familiar alien life-form that has been observing me continuously, but which I can only properly recognize when it glances at me in a certain way. Needless to say, this is an odd relationship to have with a piece of your own work.

I am very cautiously optimistic. If the paper gets through to publication, it will be my first published work in a journal not run by the University of British Columbia.

Research design essay blasted

I just got the feedback on my research design essay, and it is enormously less positive than I had hoped. The grade is a low pass and there are two written statements included: one that is fairly short and reasonably positive, the other longer and far more scathing. It opens with “[t]his research design is not well thought out.” Both comments discuss the Stockholm Convention and Kyoto Protocol as though they are the real focus of the thesis; by contrast, they were meant to be illustrative cases through which broader questions about science and policy could be approached.

The shorter comment (both are anonymous) says that “the general idea behind the research is an interesting one” while the longer comment calls the cases “well-selected… [with] fruitful looking similarities and differences.” The big criticisms made in the longer comment are:

  1. The nuclear disarmament and Lomborg cases are unnecessary and irrelevant.
  2. I haven’t selected which key bits of the Kyoto negotiations to look at.
  3. My philosophy of science bibliography is not yet developed.
  4. Not enough sources on Kyoto or Stockholm are listed. Too many are scientific reports.

It blasts me for not yet having a sufficiently comprehensive bibliography, and for the irrelevance the commenter sees in the nuclear weapons and Lomborg examples. The whole point of those is to address the question of what roles scientists can legitimately take, and how the policy and scientific communities see the role of science within global environmental policy making. The point is definitely not, as the comment seems to assume, to compare those cases with Stockholm and Kyoto. Taken all in all, this is hands-down the most critical response to anything important I have written for quite a number of years.

To me, it seems like the major criticism is that the thesis has not been written yet. I mention being interested in the philosophy of science, insofar as it applies, but have not yet surveyed the literature to the extent that seems expected. The same goes for having not yet selected the three “instances or junctures” in the Kyoto negotiations that I am to focus on.

As is often the case when I see something I was quite confident about properly blasted, I am feeling rather anxious about the whole affair – to the point, even, of feeling physically ill. I always knew there was a lot more work to be done – a big part of why I have decided to stay in Oxford over the summer – but I expected that the general concepts behind the thesis plan were clear enough. The long comment definitely indicates that not to be the case. I can take some solace in what Dr. Hurrell has said. He has more experience with environmental issues than probably anyone else in the department and has also had the most exposure to the plotting out of my particular project. Of it, he has said: “[the] Research Design Essay represent[s] an excellent start in developing the project and narrowing down a viable set of questions to be addressed.” Still, I would be much happier if the examiners had said likewise.

The major lesson from all this is to buckle down, do the research, and prove them wrong for doubting the potential and coherence of this project. The issue is an important one, even if it is more theoretical and amorphous than many of the theses they will receive. A simple comparison of Kyoto and Stockholm would be enormously less interesting.

Potentially misleading statistics

How frequently do you see in the headlines that scientists have discovered that tomato juice reduces the chances of Parkinson’s disease, that red wine does or does not reduce the risk of heart disease, or that salmon is good for your brain? While statements like these may well be true, they tend to come together as a random collection of disconnected datasets assessed using standard statistical tools.

Of course, therein lies at least one major rub inherent to this piecemeal approach. If I come up with twenty newsworthy illnesses and then run a clinical trial on each, testing whether some substance helps to fight it, I am quite likely to come up with at least one statistically significant result. This is in fact true even if the substance I am providing does absolutely nothing. While the placebo effect could account for part of this, the more important reason is much more basic:

Statistical evaluation in clinical trials is done using a method called hypothesis testing. Let’s say I want to evaluate the effect of pomegranate juice on memory. I come up with two groups of volunteers and some kind of memory test, then give the juice to half the volunteers and an indistinguishable placebo to the others. Then, I give out the tests and collect scores. Now, it is possible that – entirely by chance – one group will outperform the other, even if they are both randomly selected and all the trials are done double-blind. As such, what statisticians do is start with the hypothesis that pomegranate juice does nothing: this is called the null hypothesis. Then, you look at the data and ask how likely they would be to arise if pomegranate juice really did nothing. The less plausible your data are under the null hypothesis, the stronger your grounds for rejecting it and concluding that the juice has some effect.
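The chance element here is easy to see in a simulation. The sketch below (in Python, with hypothetical group sizes and score distributions of my own choosing) draws two groups from exactly the same population and shows that individual trials still produce nonzero differences between them:

```python
import random
import statistics

def trial(n_per_group=50, effect=0.0, seed=None):
    """Simulate one memory trial. Both groups draw test scores from the
    same bell curve; the juice group gets an extra 'effect' points
    (zero under the null hypothesis)."""
    rng = random.Random(seed)
    placebo = [rng.gauss(100, 15) for _ in range(n_per_group)]
    juice = [rng.gauss(100 + effect, 15) for _ in range(n_per_group)]
    return statistics.mean(juice) - statistics.mean(placebo)

# Even with effect=0.0, single trials show differences between the
# groups; only on average, over many trials, do they cancel out.
diffs = [trial(effect=0.0, seed=i) for i in range(1000)]
print(round(statistics.mean(diffs), 2))      # near zero on average
print(round(max(abs(d) for d in diffs), 2))  # but single trials can stray by several points
```

Hypothesis testing exists precisely because of this scatter: it asks whether an observed difference is bigger than the kind that chance alone routinely produces.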

If, for instance, we gave this test to two million people, all randomly selected, and the ones who got the pomegranate juice did twice as well in almost every case, it would seem very unlikely that pomegranate juice has no effect. The question, then, is where to set the boundary between data that is consistent with the null hypothesis and data that allows us to reject it. For largely arbitrary reasons, the confidence level is usually set at 95%. That means we only reject the null hypothesis – and credit pomegranate juice with an effect – when there is a 5% or smaller chance that data at least this extreme would arise if the juice actually did nothing.

More simply, let’s imagine that we are rolling a die and trying to evaluate whether it is fair or not. If we roll it twice and get two sixes, we might be a little bit suspicious. If we roll it one hundred times and get all sixes, we will become increasingly convinced the die is rigged. It’s always possible that we keep getting sixes by random chance, but the probability falls with each additional piece of data we collect that indicates otherwise. The number of trials we do before we decide that the die is rigged is the basis for our confidence level.[1]
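The die example can be made concrete. The chance of rolling k sixes in a row with a fair die is (1/6) to the power k, so each additional six makes the “fair die” explanation six times less plausible:

```python
from fractions import Fraction

# Probability of k consecutive sixes from a fair die: (1/6)**k.
for k in (1, 2, 5, 10):
    p = Fraction(1, 6) ** k
    print(f"{k} six(es) in a row: p = {float(p):.2e}")

# Note that even two sixes in a row (p = 1/36, about 0.028) already
# falls below the conventional 5% threshold, which shows how low a
# bar that threshold can be.
```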

The upshot of this, going back to my twenty diseases, is that if you do these kinds of studies over and over again, you will incorrectly identify a statistically significant effect about 5% of the time. Because that’s the confidence level you have chosen, you will on average get that many false positives (instances where you identify an effect that doesn’t actually exist). You could set the confidence level higher, but that requires larger and more expensive studies: moving from 95% confidence to 99% or higher can require a substantially larger sample size. That is cheap enough when you’re rolling dice, but it gets extremely costly when you have hundreds of people being experimented upon.
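A quick simulation makes the false-positive rate visible. The sketch below (hypothetical numbers; a simple z-test on the difference of group means) runs many studies of a substance with zero real effect and counts how often one looks “significant” at the 95% level:

```python
import random
import statistics

def null_study(n=30, rng=random):
    """One placebo-controlled study of a substance with no real effect.
    Both groups are drawn from the same distribution (mean 0, sd 1)."""
    a = [rng.gauss(0, 1) for _ in range(n)]
    b = [rng.gauss(0, 1) for _ in range(n)]
    se = (2 / n) ** 0.5  # standard error of the difference in means
    # Two-sided z-test at the 95% level: 'significant' beyond 1.96 SEs.
    return abs(statistics.mean(a) - statistics.mean(b)) > 1.96 * se

rng = random.Random(42)
runs = 2000
rate = sum(null_study(rng=rng) for _ in range(runs)) / runs
print(rate)  # hovers around 0.05: roughly one 'discovery' in twenty, from nothing
```

Run twenty such studies on twenty diseases and the odds are good that at least one produces a headline, even though the substance does nothing at all.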

My response to all of this is to demand the presence of some comprehensible causal mechanism. If we test twenty different kinds of crystals to see if adhering one to a person’s forehead helps their memory, we should expect about one in twenty to show a ‘significant’ effect at the 95% confidence level. That said, we don’t have any reasonable scientific explanation of why crystals should aid memory. If we have a statistically established correlation but no causal understanding, we should be cautious indeed. Of course, it’s difficult to learn these kinds of things from the sort of news story I was describing at the outset.


[1] If you’re interested in the mathematics behind all of this, just take a look at the first couple of chapters of any undergraduate statistics book. As soon as I broke out any math here, I’d be liable to scare off the kind of people who I am trying to teach this to – people absolutely clever enough to understand these concepts, but who feel intimidated by them.

Lomborg on fish

I just re-read the short section on world fisheries in Bjorn Lomborg’s Skeptical Environmentalist, and noted that the level of analysis shown there is low enough to cast doubt on the rest of the book. He basically argues that:

  1. The global fish catch is increasing.
  2. We can always farm our way out of trouble.
  3. Fish aren’t that important anyhow (only 1% of human calories, 6% of protein).

He is seriously wrong on all three counts. The overall catch is a misleading figure, because it doesn’t take into account the effort involved in catching the fish. You could be catching more because you’re building more ships, using more fuel, and so on. As long as subsidy structures like those in the EU and Japan remain, this is inevitable. Such technological advances can conceal the depletion of fish stocks, but the depletion remains. If we’re fishing above the rate at which a fishery can replenish itself, it doesn’t matter whether our catches are increasing or not. Or rather, it matters only insofar as it helps to determine how long it will be before the fishery collapses, as the cod fisheries of Newfoundland and the North Sea already have. Fisheries are also complex things. Catching X fish and waiting Y time doesn’t necessarily mean that you will have X fish to catch again. Much depends on the structure of food webs, and thus energy flows within the ecosystem.

The idea that farming can be the answer is also seriously misleading. First and foremost, farmed fish are almost exclusively carnivorous. That means they need to be fed uglier, less tasty fish in order to grow. Since they aren’t 100% efficient at turning food into flesh, there is an automatic loss there. More importantly, if we begin fishing other stocks into decline in order to feed farmed fish, we will just have spread the problem around, not created any kind of sustainable solution. As I have written about here before, serious pressure already exists on a number of species that are ground into meal for fish-farming. There is also the matter of how fish farms produce large amounts of waste that then leaches out into the sea: biological wastes from the fish, leftover hormones and antibiotics from the flood of both used to make the fish grow faster and get sick less often in such tight proximity, and the occasional seriously diseased or genetically damaged fish escaping to join the gene pool.

I can only assume that Lomborg is right to say that “fish constitutes a vanishingly small part of our total calorie consumption – less than 1 percent – and only 6 percent of our protein intake.” Even so, that doesn’t mean that losing fisheries as a viable source of calories and protein would not be a terrible event. Humanity overall may not be terribly dependent, but certain groups of individuals are critically dependent. Moreover, the “it’s not all that important a resource anyway, so who cares if it goes?” attitude that is implied in Lomborg’s assessment fails to consider the ramifications that continuing to fish as we are could have for marine ecosystems in general and the future welfare of humanity.

One last item to identify is the fallacious nature of the 100 million tons a year of fish we can “harvest for free.” This is his estimate of the sustainable catch, and he then notes that we are only catching 90 million tons. He goes on to say that “we would love to get our hands on that extra 10 million tons.” First off, the distribution here matters. If the sustainable catch for salmon is five million tons and we are catching twenty, the overall figure doesn’t reflect the fact that salmon stocks will be rapidly destroyed. If we’re burning our way through, species by species (look at the wide variety of fish now served as ‘cod’ in the UK), then even a total catch below the aggregated potential sustainable yield could be doing irreparable harm. Secondly, we have shown no capacity for restraint as a species. Just looking at what Canada has done within its own territorial waters demonstrates that even rich governments with good scientists can make ruinous policy choices for political or other kinds of reasons.

All in all, Lomborg’s analysis is seriously misleading and shows little comprehension of the dynamics that underlie marine ecology and humanity’s interaction with it. While my research project for the thesis partly involves examining the controversy surrounding Lomborg, I am not planning to critique his statements directly in the thesis. With passages like this included, I may be tempted.

The science of complex systems

While walking with Bilyana this morning, we took to discussing complex dynamic systems, and the capability of present-day science to address them. Such systems are distinguished by the existence of complex interactions and interdependencies within them. You can’t look at the behaviour of a few neurons and understand the functioning of a brain; likewise, you can’t look at a few ocean currents or a few cubic miles of atmosphere and understand the climatic system. These systems resist being understood by breaking them down and studying them piece by piece, which is why they pose such a challenge to a scientific method generally based on doing exactly that.

Murray Gell-Mann, the physicist who proposed the existence of quarks while at Caltech, extensively discusses complex dynamic systems in his excellent book The Quark and the Jaguar. Among the most interesting aspects of that book is the discussion of the difficulty of categorizing things as simple or complex: that is, establishing what constitutes complexity. Some kinds of problems, for instance, are extremely difficult for human beings – taking the sixth root of some large number, for instance – but trivial for computers. That said, computers have a terrible time trying to perform some tasks that people perform without difficulty. The comparison of human and machine capability is appropriate because of the difficulties involved in trying to understand something like the climatic system and determine the effects that anthropogenic climate change will have upon it. Increasingly, our approach to studying such things is based on computer modelling.

Whether studying an economy, the cognitive processes of a cricket, or the dynamics of a thunderstorm, modelling is an essential tool for understanding complex systems. At the same time, a level of abstraction is introduced that complicates the status of such understanding. First of all, it is likely to be highly probabilistic: we can work out about how many bolts of lightning a storm with certain characteristics might produce, but cannot predict with exactitude the behaviour of a particular storm. Secondly, we might not understand why the behaviour we predict takes place. Some modern aircraft use neural networks and evolutionary algorithms to dampen turbulence along their wings, through the use of arrays of actuators. Because the behaviour is learned rather than programmed, it doesn’t reflect understanding of the fluid dynamics involved in the classical sense of the word ‘understanding.’

I predict that the most significant scientific advancements of the next hundred years or so will relate to complex dynamic systems. They exist in such important places – all the chemical reactions surrounding DNA and protein synthesis, for example – and they are so imperfectly understood at present. It will be interesting to watch.

Theorems and conjectures

As strongly evidenced by how I finished it in a few sessions within a single 24-hour period, Simon Singh’s Fermat’s Last Theorem is an exciting book. When you are kept up for a good part of the night, reading a book about mathematics, you can generally tell that some very good writing has taken place. Alongside quick biographies of some of history’s greatest mathematicians – very odd characters, almost to a one – it includes a great deal of the kind of interesting historical and mathematical information that one might relate to an interested friend during a long walk.

x^n + y^n = z^n

The idea that the above equation has no whole number solutions (i.e. 1, 2, 3, 4, …) for x, y, and z when n is greater than two is the conjecture that Fermat’s Last Theorem supposedly proved. Of course, since Fermat didn’t actually include his reasoning in the brief marginal comment that made the ‘theorem’ famous, it could only be considered a conjecture until it was proven, across more than 100 pages, by British mathematician Andrew Wiles in 1995.
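Though no finite search could ever prove the theorem, the pattern is easy to see computationally. Here is a toy brute-force check (in Python, with an arbitrary search limit of my own choosing): for n = 2 there are plenty of solutions, the familiar Pythagorean triples, while for n = 3 or n = 4 none turn up:

```python
def solutions(n, limit):
    """Return all pairs (x, y) with 1 <= x <= y < limit such that
    x**n + y**n is itself a perfect n-th power."""
    # Every candidate z satisfies z**n <= 2 * (limit - 1)**n, so
    # checking k up to 2 * limit covers all possible n-th powers.
    nth_powers = {k ** n for k in range(1, 2 * limit)}
    return [(x, y)
            for x in range(1, limit)
            for y in range(x, limit)
            if x ** n + y ** n in nth_powers]

print(solutions(2, 20))  # Pythagorean triples: (3, 4), (5, 12), (6, 8), ...
print(solutions(3, 20))  # empty, as Fermat conjectured
print(solutions(4, 20))  # also empty
```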

While the above conjecture may not seem incredibly interesting or important on its own, it ties into whole branches of mathematics in ways that Singh describes in terms that even those lacking mathematical experience can appreciate. Even the more technical appendices should be accessible to anyone who has completed high school mathematics, not including calculus or any advanced statistics. A crucial point quite unknown to me before is that a proof of the Taniyama–Shimura conjecture (now also called a theorem) automatically yields a proof of Fermat’s Last Theorem, and that is exactly the route Wiles took. Since mathematicians had been assuming Taniyama–Shimura to be true for decades, Wiles’ proof of it was a really important contribution to the further development of number theory and mathematics in general.

Despite Singh’s ability to convey the importance of math, one overriding lesson of the book is not to become a mathematician: if you manage to live beyond the age of thirty, which seems to be surprisingly rare among the great ones, you will probably do no important work beyond that point. Mathematics, it seems, is a discipline where experience counts for less than the kind of energy and insight that are the territory of the young.

A better idea, for the mathematically interested, might be to read this book.

On caffeine

Caffeine – a molecule I first discovered as an important and psychoactive component of Coca Cola – is a drug with which I’ve had a great deal of experience over the last twelve years or so. By 7th grade, the last year of elementary school, I had already started to enjoy mochas and chocolate covered coffee beans. When I was in 12th grade, the last year of high school, I began consuming large amounts of Earl Grey tea, in aid of paper writing and exam prep. During my first year at UBC, I started drinking coffee. At first, it was a matter of alternating between coffee itself and something sweet and delicious, like Ponderosa Cake. By my fourth year, I was drinking more than 1L a day of black coffee: passing from French press to mug to bloodstream in accompaniment to the reading of The Economist.

Unfortunately, coffee doesn’t seem to work quite right in Oxford. My theory is that it’s a function of the dissolved mineral content in the water, which is dramatically higher than that in Vancouver.

As I understand it, caffeine has a relatively straightforward method of operation. After entering the body through the stomach and small intestine, it enters the bloodstream and then binds to adenosine receptors on the surface of cells without activating them. This eventually induces higher levels of epinephrine release, and hence physiological effects such as increased alertness. Much more extensive information is on Wikipedia.

From delicious chocolate covered coffee beans used to aid wakefulness during the LIFEboat flotillas to dozens of iced cappuccinos at Tim Horton’s with Fernando while planning the NASCA trip, I’ve probably consumed nearly one kilogram of pure caffeine during the last decade or so. After the two remaining weeks of this term – and thus this academic year – have come to a close, my tight embrace with the molecule will probably loosen a bit.

Draft RDE complete

Two hours before my self-imposed deadline (to be brutally enforced by Claire), I finished a solid first draft of my research design essay, including two appendices. Weighing in at about 5000 words, sans appendices, it is right in the middle of the range from minimum to maximum length, leaving me some space to correct errors that my two much appreciated peer-editors point out before Sunday.

Many thanks to Meghan and Claire for throwing themselves in front of that bullet.

If you feel left out for not getting a copy, download one here (PDF). Please leave me comments ranging from “this word is spelled incorrectly” to “the entire methodological construction of this project is hopeless, for the following intelligent and well-articulated reasons.” The linked PDF doesn’t include the appendices because they are separate Word files and I don’t have software to merge PDF files with me. They really shouldn’t be necessary, anyhow.

[Update: 27 May 2006] I have a slightly revised version up, based on my own editing. Still waiting for comprehensive responses from external readers.

Unintentional auto-satire

For a while, I was planning to simply ignore these videos, produced by the ‘Competitive Enterprise Institute,’ but they have now been sent to me enough times to indicate that this hopelessly disingenuous message is getting out. Let’s go through them, one by one:

Energy

Nobody in their right mind denies that carbon dioxide is “essential to life” or that “we breathe it out.” What any competent scientist will tell you is that releasing masses of it affects the way in which the atmosphere deals with the radiant energy from the sun. Higher concentrations of certain gases (CO2, methane, etc.) in the atmosphere cause the planet to absorb and retain more solar energy. That raises the mean global temperature and reduces the ratio of frozen to liquid water on earth. CO2 isn’t a pollutant in the toxic sense, but it does change how the earth responds to the sun.

Regarding the issue of whether fuels that emit CO2 have “freed us from a world of backbreaking labour,” they probably have. That said, that doesn’t mean they are the only way we can avoid such suffering, nor does it mean that such alleviation comes without a cost.

Glaciers

Producing two scientific papers that show that specific ice sheets are growing or increasing in density doesn’t mean that the world overall isn’t experiencing global warming. While there is plenty of dispute about how bad global warming would be and how much it would cost to stop, to deny that it is happening on the basis of such a flimsy argument is worse than irresponsible.

It’s almost astonishing that anyone would be driven to respond to such absolute malarkey. Likewise, I can’t believe that anyone who participated in the creation of these videos did so in earnest. They are absurd at the level of the “Amendment Song” from The Simpsons or many Monty Python sketches. If such things actually have the power to shape public opinion, we are in even worse shape than I thought.

Do you think these people are on crack? Whether you do or don’t, send an email to Myron Ebell, their Director of Energy and Global Warming Policy. It seems that messages to him need to go through this email address.

Animal testing in Oxford

For about an hour today, I spoke with Lee Jones while he was handing out Pro-Test leaflets on Cornmarket Street. For those outside Oxford – or those who have spent the last few months in a local cave, with fingers in their ears – Pro-Test is a group which promotes the use of animal testing in medical research, in opposition to groups like SPEAK and the Animal Liberation Front who have been agitating against the animal lab that is under construction near Rhodes House. Along with legitimate protests and demonstrations, some anti-testing groups have threatened construction workers and members of the university, as part of their campaign to stop the lab from being built. Similar protests in Cambridge led to the cancellation of an animal lab project there.

I do believe that animals are morally considerable, to a certain extent. That’s part of why I refrain from eating them. I don’t think there’s a rational basis for a harsh divide between humans and other animals. That said, there is a balance of competing moral claims. We need new antibiotics to deal with resistant bacteria. We need vaccines for HIV/AIDS and malaria. Oxford is the only organization in the world presently conducting second-stage clinical trials on vaccines for both malaria and HIV/AIDS, as well as new treatments for tuberculosis. We need new surgical procedures and drugs to limit the harm caused to people around the world by infectious disease: a far more lethal phenomenon than war and terrorism put together. Developing all of these things fundamentally requires limited usage of animal testing. No computer models are adequate for dealing with the sophistication of animal biochemistry; likewise, it is irresponsible to test drugs and procedures on human beings, even volunteers, before basic toxicological and side effect screenings have been completed.

Protections for laboratory animals in the UK are already extremely strong: far, far more robust than sanitary and ethical guidelines in the factory farming industry (which should be the real target for those concerned about animal cruelty). While alternatives to animal testing should be investigated, and employed where appropriate, the moral imperative to lessen the suffering caused by disease requires the continued development and use of facilities such as that under construction at Oxford.

Those interested in hearing Pro-Test’s side of the story should consider attending an open public meeting on Monday the 22nd. It is happening from 7:00 to 9:00pm at the Oxford town hall and will include presentations from scientists, a Member of Parliament, and members of Pro-Test. They are also holding a demonstration on Saturday, June 3rd – starting at 11:45am on Parks Road.