Nixon and Gorbachev

Here’s a bit of Cold War role reversal for you:

Which US president cancelled America’s offensive biological weapons program? Richard Nixon, in 1969, three years before the Biological Weapons Convention.

Which Soviet leader ordered Biopreparat, the Soviet bioweapon program, to weaponize smallpox? Mikhail Gorbachev, in the Five Year Plan launched in December 1987. He also ordered the development of mobile production facilities for biological weapons, intended to preserve offensive capabilities despite inspections of suspected bioweapon sites.

For me, at least, this sits awkwardly with my general perceptions of the two men: Nixon the amoral schemer and Gorbachev the unintentional architect of the end of the Communist system.

Cold War weapons and perceptions of risk

It seems a natural human intuition to think the world is going down the tubes. We look back across our lives and identify what seems more worrisome now than when we were born. We then worry about what sort of world future generations will inhabit. Written accounts demonstrate that such concerns go back at least to the classical world.

There is certainly some validity to that perspective, especially when it comes to cumulative threats like climate change. That said, there do seem to be many cases in which anxieties proved unjustified – such as when wave after wave of immigrants ended up successfully integrated into North American and Western European cultures, despite fears that they would create all manner of entrenched problems.

I started thinking about all this earlier today, while reading Ken Alibek’s account of the Soviet biological weapons program. Until I was nine years old, the Russians were still conducting open-air testing of biological weapons on Vozrozhdeniya Island. That reminded me of two probable cognitive failures. Firstly, we are less aware of the dangers that existed in previous times, which weakens the comparison underlying our apprehension that the future will be worse. Secondly, there can be real improvements in the state of the world. While there are certainly still risks associated with Cold War era weapons, at least the spectre of their intentional use is less haunting now than it was in previous decades.

Facebook and data mining

I have written before about privacy and Facebook, expressing the view that people should treat whatever they put on Facebook in the same way as they would treat something posted on a completely public website like this one. It may be wise to give people more granular control over who can see what, but it isn’t sensible for Facebook users to assume that the site’s privacy controls will always be adequate and that their information will stay safe.

In the wake of the latest Facebook privacy debacle, I have realized that there is an element to the situation that I hadn’t considered before. Especially now that Facebook is working to put everybody’s ‘Interests’ into a standardized format, there is a real difference between how information on Facebook can be used, compared to the wider web.

A person with some time and interest could scan through my blog, work out roughly how old I am, learn what sort of books I read, discover my political views, and so on. It would be rather tricky to write an automated computer program that would achieve the same result. Blogs are non-standardized and consist of human-generated text. By contrast, information on Facebook is increasingly organized in a manner that is easily machine readable. If I want to reach 25-27 year olds who enjoy reading Carl Sagan books and live in Ottawa, it is easy to do via the information on Facebook, but hard to do with information from the general web. That seems to constitute a different sort of privacy violation and/or data mining.
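The difference is easy to illustrate. Once profile data is standardized, the targeting query described above amounts to a few lines of filtering code; the records and field names below are invented for illustration, not an actual advertising API:

```python
# Hypothetical, simplified profile records in the standardized,
# machine-readable form a social network might expose to advertisers.
profiles = [
    {"age": 26, "city": "Ottawa", "interests": {"Carl Sagan", "astronomy"}},
    {"age": 31, "city": "Toronto", "interests": {"Carl Sagan"}},
    {"age": 25, "city": "Ottawa", "interests": {"cycling"}},
]

# Targeting "25-27 year olds in Ottawa who like Carl Sagan" is trivial
# against structured data, whereas the same query over free-form blog
# text would require difficult natural-language analysis.
matches = [
    p for p in profiles
    if 25 <= p["age"] <= 27
    and p["city"] == "Ottawa"
    and "Carl Sagan" in p["interests"]
]

print(len(matches))  # prints 1: only the first profile matches
```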

In response, I have stripped my Facebook account of everything that might be of interest to advertisers, at least where it is easily machine-readable: hometown, current location, music and films appreciated, etc. A determined human user could still learn a lot about me from Facebook, for instance by looking at status updates and communication with others, but this will at least make it a bit trickier for machines.

On smallpox

In 1977, the last naturally occurring case of smallpox was recorded, the result of a massive global eradication effort. Rather than completely eliminating the virus, it was decided that the United States and the Soviet Union would each keep a sample. Part of the reasoning was that pox viruses are common in the animal world, and could potentially jump between species. Having samples of human smallpox could be useful in the event that such a thing occurred.

Unfortunately – and rather threateningly – the Russian smallpox sample didn’t sit idly in a freezer. Smallpox is a highly contagious, highly lethal disease and yet Biopreparat, the Soviet Union’s biological weapons agency, made some twenty tonnes of the stuff, tested it on animals, and developed mechanisms to use it as a weapon, including delivery via warheads on intercontinental missiles. This was done at the State Research Institute of Virology and Biotechnology (also called Vector), outside the city of Novosibirsk, in Siberia, as well as at a more secret facility in Sergiyev Posad. It was also tested on Vozrozhdeniya Island. The Soviets made so much that it couldn’t all be accounted for. Quite possibly, some found its way into biological weapons programs in other states, such as China, India, Pakistan, Israel, North Korea, Iraq, Iran, Cuba, and Serbia.

Whereas human beings once had two major forms of protection from smallpox – immunity resulting from exposure to the virus, and vaccination campaigns – the former is now absent and the latter defunct and potentially difficult to restore. A single case, perhaps arising from some accident, could directly infect hundreds of people and kick off an escalating series of waves of infection, spaced fourteen days apart, as people go through the incubation period and become infectious. Such a global outbreak could kill a massive number of people.
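The wave arithmetic above is easy to sketch. Assuming, purely for illustration, that each case goes on to infect five others (real transmission would depend on contact patterns, residual immunity, and any control measures), the cumulative case count grows geometrically with each fourteen-day wave:

```python
# Illustrative only: geometric growth of cases by wave, assuming each
# case infects r_0 = 5 others and waves are spaced one 14-day
# incubation period apart. Actual outbreaks would be shaped by
# contact patterns, vaccination, and quarantine.
r_0 = 5       # assumed new infections per case
initial = 1   # a single accidental case

def cumulative_cases(waves: int) -> int:
    """Total cases after the given number of 14-day waves."""
    return sum(initial * r_0 ** n for n in range(waves + 1))

for w in range(6):
    print(f"wave {w} (day {14 * w}): {cumulative_cases(w)} cases total")
# After five waves (ten weeks), the single case has become
# 1 + 5 + 25 + 125 + 625 + 3125 = 3906 cases in total.
```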

The idea of an accidental release is not fanciful. In 1978, medical photographer Janet Parker became one of the two last people to contract smallpox, while working in the anatomy department of the University of Birmingham Medical School. It seems entirely plausible that accidental exposure could occur at some shady biological weapons lab in Cuba, Pakistan, or North Korea.

If anything like that ever happens, people may come to regard the decision not to keep just a single frozen sample of smallpox as the worst thing the Soviet Union ever did. Hopefully, all the concern and money expended on security since 2001 has at least left the world in a better position to launch a mass vaccination campaign, should the need ever arise.

Our imperfect memories

Slate has produced a good series highlighting the limitations of human memory, particularly how easily it can be manipulated and people can be made to remember things that never took place.

The imperfect nature of human memory has important consequences, including in situations like criminal proceedings and psychotherapy. It is also discussed in this Paul Bloom lecture:

It turns out that the same sort of experiments and the same sort of research has been done with considerable success in implanting false memories in adults. There are dramatic cases of people remembering terrible crimes and confessing to them when actually, they didn’t commit them. And this is not because they are lying. It’s not even because they’re, in some obvious sense, deranged or schizophrenic or delusional. Rather, they have persuaded themselves, or more often been persuaded by others, that these things have actually happened.

Psychologists have studied in the laboratory how one could do this, how one can implant memories in other people. And some things are sort of standard. Suppose I was to tell you a story about a trip I took to the dentist or a visit I took to–or a time when I ate out at a restaurant and I’m to omit certain details. I omit the fact that I paid the bill in a restaurant, let’s say or I finished the meal and then I went home. Still, you will tend to fill in the blanks. You’ll tend to fill in the blanks with things you know. So, you might remember this later saying, “Okay. He told me he finished eating, paid the bill and left,” because paying the bill is what you do in a restaurant.

This is benign enough. You fill in the blanks. You also can integrate suppositions made by others. And the clearest case of this is eyewitness testimony. And the best research on this has been done by Elizabeth Loftus who has done a series of studies, some discussed in the textbook, showing how people’s memories can be swayed by leading questions. And it can be extremely subtle. In one experiment, the person was just asked in the course of a series of questions–shown a scene where there’s a car accident and asked either, “Did you see a broken headlight?” or “Did you see the broken headlight?” The ‘the’ presupposes that there was a broken headlight and in fact, the people told–asked, “Did you see the broken headlight?” later on are more likely to remember one. It creates an image and they fill it in.

It is always troubling to be reminded that we cannot entirely trust our own minds. That said, it is far better to be aware of the limitation and suffer from its troubling implications than it is to ignorantly assume that our memories are an accurate record of past events that cannot be altered.

The cost of prison

Apparently, imprisoning someone in Canada costs over $100,000 a year. Right off the bat, that is clearly a substantial investment of resources. It gets even worse when you consider a few further aspects.

Firstly, it seems highly dubious that prisons play a rehabilitative role. Those who are incarcerated will probably deal with a lengthy stigma afterward, perhaps for the rest of their lives. This will worsen their employment prospects and reduce the welfare of their family members. It is also plausible that having a record of incarceration increases the relative appeal of crime as a means of financial subsistence. Before you have such a record, you have a lot to lose from a criminal conviction; afterward, you have fewer legitimate job opportunities and less to lose from a longer record.

Secondly, it seems clear that the government could spend that sum of money in a great many more productive ways. You could probably finance someone’s entire undergraduate degree for that amount, or provide an apprenticeship program for a trade. You could do a lot of preventative medicine, or invest a fair bit in deploying improvements in energy efficiency or renewable energy generation.

It seems particularly absurd to imprison people with a non-violent involvement in the drug trade. It is a normal characteristic of human beings to want to experience altered states of consciousness. It is one that we positively encourage in some cases, such as the thrill from athletic exertion or Hollywood movies, and tolerate and regulate in others, such as with alcohol and tobacco. It seems utterly foolish to imprison those who seek to alter their mental state in unauthorized ways, or assist other people in doing so, when that choice is costly to everyone in terms of lost opportunities, and especially costly to the person being punished, in terms of future prospects.

Perverse effects from police statistics

An article in the Village Voice describes how police officers in one New York precinct routinely downgraded crime reports in order to make their statistics look more favourable. A whistle-blowing police officer revealed this, with evidence from covert audio recordings.

Indeed, the whole situation is deeply reminiscent of police work as portrayed on the television show The Wire. In particular, it matches up with two quotes from that series:

  • “But the stat games? That lie? It’s what ruined this department. Shining up shit and calling it gold so majors become colonels and mayors become governors.”
  • “Robberies become larcenies oh so easily. And rapes, well they just disappear.”

It’s a tricky problem to deal with. I have defended standardized tests as protection against grade inflation, but they can clearly create similar perverse incentives. When people start chasing a number that is intended as a proxy for a good outcome, they can begin to produce worse outcomes in ways that flatter the particular figure being tracked.

Allowing discretion while maintaining high standards is not an easy problem to solve. Clearly, part of any statistics-based system must be an audit and oversight capacity that retains a sense of the importance of the real outcomes being sought, along with a level of independence that prevents it from becoming just another political tool. Of course, the same political pressures that seem capable of turning police forces into factories for dodgy statistics apply just as strongly to any such oversight bodies. They also make it highly likely that whistleblowers will be ostracized, with everything possible being done to discredit them.

How useful are spies?

Malcolm Gladwell recently wrote a very interesting piece for The New Yorker about the extreme difficulty of interpreting information from spies properly. You can never really know whether a promising nugget of information is actually that, or whether it was cleverly planted by an enemy. In the end, both intelligence agencies and those who rely on them must remain simultaneously aware of the possibility that actionable intelligence is genuine and accurate, and of the possibility that it is intentionally erroneous. As Gladwell concludes: “the proper function of spies is to remind those who rely on spies that the kinds of thing found out by spies can’t be trusted.”

The funniest bit of the story describes the plot of Peter Ustinov’s 1956 play, “Romanoff and Juliet”:

a crafty general is the head of a tiny European country being squabbled over by the United States and the Soviet Union, and is determined to play one off against the other. He tells the U.S. Ambassador that the Soviets have broken the Americans’ secret code. “We know they know our code,” the Ambassador, Moulsworth, replies, beaming. “We only give them things we want them to know.” The general pauses, during which, the play’s stage directions say, “he tries to make head or tail of this intelligence.” Then he crosses the street to the Russian Embassy, where he tells the Soviet Ambassador, Romanoff, “They know you know their code.” Romanoff is unfazed: “We have known for some time that they knew we knew their code. We have acted accordingly—by pretending to be duped.” The general returns to the American Embassy and confronts Moulsworth: “They know you know they know you know.” Moulsworth (genuinely alarmed): “What? Are you sure?”

This reminds me of a short story I once read, but whose name I cannot remember. It concerned an American spy who was undercover in the Soviet Union. He was preparing for retirement, and genuinely confused about which side he had really been working for. Each had reason to suspect he was a spy, and so each had reason to feed him misleading information for the other side (or accurate information that they wouldn’t trust, given what they thought about him). He was left unable to remember whether his proper retirement reward was a gold Rolex from the CIA or a dacha from the KGB.

Doctors and conditional probabilities

While it is not surprising, it is worrisome that doctors have trouble with statistics, particularly conditional probabilities. Twenty-five German doctors were asked about the following situation. It is clearly a tricky question, but it is surely a type of question that doctors are exposed to constantly:

The probability that one of these women has breast cancer is 0.8 percent. If a woman has breast cancer, the probability is 90 percent that she will have a positive mammogram. If a woman does not have breast cancer, the probability is 7 percent that she will still have a positive mammogram. Imagine a woman who has a positive mammogram. What is the probability that she actually has breast cancer?

The results of this small trial were not encouraging:

[The] estimates whipsawed from 1 percent to 90 percent. Eight of them thought the chances were 10 percent or less, 8 more said 90 percent, and the remaining 8 guessed somewhere between 50 and 80 percent. Imagine how upsetting it would be as a patient to hear such divergent opinions.

As for the American doctors, 95 out of 100 estimated the woman’s probability of having breast cancer to be somewhere around 75 percent.

The right answer is 9 percent.
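Working through Bayes’ Theorem directly shows where the 9 percent comes from, using exactly the figures given in the question above:

```python
# Figures from the question above.
p_cancer = 0.008            # prior: 0.8% of women have breast cancer
p_pos_given_cancer = 0.90   # positive mammogram given cancer
p_pos_given_healthy = 0.07  # false positive: positive mammogram, no cancer

# Total probability of a positive mammogram (law of total probability).
p_pos = (p_cancer * p_pos_given_cancer
         + (1 - p_cancer) * p_pos_given_healthy)

# Bayes' Theorem: probability of cancer given a positive mammogram.
p_cancer_given_pos = p_cancer * p_pos_given_cancer / p_pos

print(f"{p_cancer_given_pos:.0%}")  # prints "9%"
```

The intuition the doctors missed is that genuine positives (0.8% of women, 90% detected) are swamped by false positives (7% of the remaining 99.2%), so most positive mammograms come from healthy women.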

You would think that this sort of quantitative analysis would play an important role in the medical profession. I am certain that a great many people around the world have received inappropriate treatment or taken unnecessary risks because doctors have failed to properly apply Bayes’ Theorem. Indeed, false positives in medical tests are a very commonly used example of where medical statistics can be confusing. It is also a problem for biometric security protocols, useful for filtering spam email, and a common source of general statistical errors.

The proper remedy for this is probably to provide doctors with simple-to-use tools that allow them to go from data of the kind in the original question to a correct analysis of probabilities. The first linked article also provides a good example of a more intuitive way to think about conditional probabilities.

Back up genes from endangered species

Out in Svalbard there is a seed bank, buried in the permafrost. The idea is that it will serve as a refuge for plant species that may vanish elsewhere, perhaps because industrial monocrops (fields where only a single species is intentionally cultivated by industrial means) continue to expand as the key element of modern agriculture.

Perhaps there should be a scientific and conservational project to collect just the genes of some of the great many species our species is putting into peril: everything from primates to mycorrhizal fungi to marine bacteria. The data could be stored, and maybe put to use at some distant point where humanity at large decides that it is better to carefully revive species than to indifferently exterminate them.

For many creatures, the genes alone won’t really be enough, regardless of how good at cloning we become. An elephant or a chimp grown from cells alone would never really become an elephant or chimp as they exist today. Whether those alive now are socialized in a natural or an artificial environment, they will have had some context-sensitive socialization, which subsequently shaped their mental lives. It is plausible that elephants or chimps raised among their peers, living as they did thousands of years ago, develop mentally in a manner profoundly different from elephants or chimps in captivity today, much less from solitary cloned beings in the future. Those beings would be weird social misfit representatives of their species.

Still, it is better to have misfits than nothing at all. If there is anything human beings should really devote themselves to backing up with a cautious eye turned towards an uncertain future, it seems far more likely to be the genes of species our descendants may not be fortunate enough to know than the Hollywood movies that probably account for a significant proportion of all the world’s hard drives.