Our generalization defect

One big surprise from Michael Leinbach and Jonathan Ward’s Bringing Columbia Home: The Untold Story of a Lost Space Shuttle and Her Crew is the claim that people at NASA hadn’t anticipated the catastrophic loss of a Shuttle during reentry. This despite the delicacy of the thermal protection tiles and the fatal consequences expected from their failure, the lack of engines and thus of any way to salvage a failed landing attempt, and the long period of commitment to a particular landing site from deorbit burn to touchdown. The lesson drawn from Challenger, in spite of all the training of all the people at NASA in statistics and rigour, was that ‘these things fail during liftoff’: a bit of a mad generalization for a vehicle built to take its crew through a series of environments, each of which would almost immediately kill them without technological protection, from the extreme aerodynamic stress at max-Q during liftoff, through the fatal vacuum of space, through fiery reentry in plasma above Mach 18, to an unpowered glider landing. Assuming that the first thing to go fatally wrong would be the standard failure mode is like getting on the back of a charging bull covered in poison-tipped spines in the middle of a minefield and thinking: “This is the one that kills you by stomping on your head”.

One small quibble: the subtitle of this book is misleading. I expected it to be much more about Columbia’s crew and final mission, whereas the bulk of it is about the debris recovery efforts across the country after the shuttle disintegrated.

Additively printed and magnetically bound

Today I received Bathsheba’s Tetrabox: a 3D-printed steel sculpture which is also a puzzle held together with magnets.

At a minimum, it has what I think of as ‘tips forward’ and ‘tips around’ solutions. For the first, all three of the asymmetrical pieces have their extended tips pointing toward the one symmetrical piece, allowing the sculpture to rest on them. For the second, the tips circle around the symmetrical piece.

Previously, I got Bathsheba’s hemoglobin laser crystal for Amanda, and later Myshka got me the DNA polymerase crystal as a very generous birthday gift.

De-anonymization

De-anonymization is an important topic for anyone working with sensitive data, whether in the context of academic research, IT system design, or otherwise.

I remember a talk during a Massey Grand Rounds panel where a medical researcher explained how she could pick herself out from an ‘anonymous’ database of Ontarians, on the basis that her salary was public as an exact dollar figure, only people with her specific job had it, and she was the only woman in that position.

The more general idea is that, by putting pieces of information together, you may be able to identify somebody whom others have made some effort to keep anonymous.

It’s a challenge when doing academic research and writing on social movements, when some subjects choose to be anonymous in publications. That means not just withholding their names, but withholding any information that could be used to identify them. That gets hard when you think about adversaries who might have access to other information (in an extreme case, governments with access to masses of information) or even just ordinary people who can combine information from multiple sources logically. The date of an event described in an anonymous quote might allow someone to look up where it happened online. Another quote in which a third party’s actions are described could be used to determine that the de-anonymization target wasn’t that person. And so on, like the logical games on the LSAT or the intricacies of mole hunting.
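To make the combination logic concrete, here is a minimal sketch in Python of the salary-database story above; the records, jobs, and figures are entirely invented for illustration:

```python
# A minimal sketch of quasi-identifier linkage: every record and number
# here is hypothetical. An 'anonymized' salary table omits names, but an
# adversary who knows a few public facts can still isolate one row.
anonymized = [
    {"job": "surgeon",   "gender": "F", "salary": 412_117},
    {"job": "surgeon",   "gender": "M", "salary": 389_500},
    {"job": "professor", "gender": "F", "salary": 171_220},
]

# Publicly known facts about the target: a unique job title, and that
# she was the only woman in that position.
known = {"job": "surgeon", "gender": "F"}

candidates = [row for row in anonymized
              if all(row[k] == v for k, v in known.items())]

# One matching row means the 'anonymous' record is fully identified.
print(len(candidates), candidates[0]["salary"])  # → 1 412117
```

The same filtering logic scales to any number of partial facts: each added attribute shrinks the candidate set, and anonymity fails the moment it reaches one.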

Lee Ann Fujii wrote smart stuff about this, and about subject protection in research generally.

Subject-specific databases

One of my main strategies for organizing information is to create databases for subjects of interest. I’m using the term in the broad Wikipedia sense of “an organized collection of data, stored and accessed electronically” here, and it includes everything from a single folder where PDF versions of all the references cited in a particular monograph of mine are stored to financial tracking spreadsheets, records of my weight, and sets of original RAW files for my photoshoots.

So far for my PhD research I have set up a few:

  • A spreadsheet of all accredited Canadian universities, with pertinent information about each divestment campaign I have identified
  • A master timeline for significant events in all campaigns, as well as events relevant to university divestment that happened in other institutions, like municipalities
  • A list of all scholarly work about university divestment campaigns, including which school(s) the authors looked at
  • A spreadsheet with titles and links to common document types at many campaigns, including detailed petitions like our ‘brief’, recommendations from university-appointed committees, and formal justification for university decisions
  • The consent database specified in my ethics protocol, which has also been useful for keeping tabs on people who I’m awaiting responses from
  • (Somewhat embarrassingly) A Google sheet where I manually tally how long each MS Word chapter draft is at midnight each day

For my earlier pipeline resistance project I had started putting together a link chart of relevant organizations and individuals, as well as a glossary and timeline.

I would love to have more formal training (and ideally coding ability) for working with more flexible kinds of databases than spreadsheets. That would be useful for debugging WordPress MySQL issues, but more importantly for more fundamental data manipulation and analysis. I haven’t really coded (aside from HTML and LaTeX) since long-past days of tinkering with QBASIC and Pascal in my Vancouver youth. It seems like it would make a lot of sense to learn Python as a means of building and playing around with my own SQL databases…
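As a sketch of what that could look like (the table, schema, and rows here are invented for illustration, not my actual campaign data), Python’s standard-library sqlite3 module can stand up a small SQL database with no server setup at all:

```python
import sqlite3

# A hypothetical miniature of the campaign spreadsheet as a real SQL
# table; the schema and rows are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE campaigns (
    university TEXT, province TEXT, launched INTEGER, outcome TEXT)""")
con.executemany(
    "INSERT INTO campaigns VALUES (?, ?, ?, ?)",
    [("U of T",  "ON", 2012, "partial"),
     ("UBC",     "BC", 2013, "rejected"),
     ("McGill",  "QC", 2012, "rejected")])

# A query replaces the manual filtering a spreadsheet would need.
rows = con.execute(
    "SELECT university FROM campaigns WHERE launched = 2012"
    " ORDER BY university").fetchall()
print([u for (u,) in rows])  # → ['McGill', 'U of T']
```

Swapping `":memory:"` for a filename makes the database persistent, and the same queries work unchanged against a much larger dataset.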

Bright or invisible rocket exhaust

LOX and RP-1 never burn absolutely clean, and there is always a bit of free carbon in the exhaust, which produces a luminous flame. So when you’re looking at TV and see a liftoff from Cape Kennedy—or from Baikonur for that matter—and the exhaust flame is very bright, you can be sure the propellants are Lox and RP-1 or the equivalent. If the flame is nearly invisible, and you can see the shock diamonds in the exhaust, you’re probably watching a Titan II booster burning N2O4 and 50–50.

Clark, John D. Ignition! An Informal History of Liquid Rocket Propellants. Rutgers University Press Classics, 2017. p. 96

One-way rocketry

Finally somebody in authority sat down and thought the problem through. The specifications of JP-4 [jet fuel] were as sloppy as they were to insure a large supply of the stuff under all circumstances. But Jupiter and Thor [ballistic missiles] were designed and intended to carry nuclear warheads, and it dawned upon the thinker that you don’t need a large and continuing supply of fuel for an arsenal of such missiles. Each missile is fired, if at all, just once, and after a few dozen of them have been lobbed by the contending parties, the problem of fuel for later salvoes becomes academic, because everybody interested is dead. So the only consideration is that the missile works right the first time—and you can make your fuel specifications just as tight as you like. Your first load of fuel is the only one you’ll ever need.

Clark, John D. Ignition! An Informal History of Liquid Rocket Propellants. Rutgers University Press Classics, 2017. p. 95–6

Pre-computer rocket propellant chemistry calculations

[Calculating rocket fuel performance mathematically] gets worse exponentially as the number of different elements and the number of possible species [of reaction products] increases. With a system containing carbon, hydrogen, oxygen, and nitrogen, you may have to consider fifteen species or more. And if you toss in boron, say, or aluminum, and perhaps a little chlorine and fluorine—the mind boggles.

But you’re stuck with it (remember, I didn’t ask you to do this!) and proceed—or did in the unhappy days before computers. First, you make a guess at the chamber temperature. (Experience helps a lot here!) You then look up the relevant equilibrium constants for your chosen temperature. Devoted and masochistic savants have spent years in determining and compiling these. Your equations are now before you, waiting to be solved. It is rarely possible to do this directly. So you guess at the partial pressures of what you think will be the major constituents of the mixture (again, experience is a great help) and calculate the others from them. You add them all up, and see if they agree with your predetermined chamber pressure. They don’t, of course, so you go back and readjust your first guess, and try again. And again. And eventually all your species are in equilibrium and you have the right ratio of hydrogen to oxygen and so on, and they add up to the right chamber pressure.

Next, you calculate the amount of heat which would have been evolved in the formation of these species from your propellants, and compare that figure with the heat that would be needed to warm the combustion products up to your chosen chamber temperature. (The same devoted savants have included the necessary heats of formation and heat capacities in their compilations.) And, of course, the two figures disagree, so you’re back to square one to guess another chamber temperature. And so on.

Clark, John D. Ignition! An Informal History of Liquid Rocket Propellants. Rutgers University Press Classics, 2017. p. 84 (italics in original)
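The guess-and-check routine Clark describes is, in modern terms, an iterative root-finding problem: adjust the guessed chamber temperature until the heat released by forming the products matches the heat needed to warm them to that temperature. A toy sketch in Python, with invented numbers standing in for the real thermochemical tables:

```python
# A toy version of the hand iteration Clark describes. The
# 'thermochemistry' here is invented for illustration, not real data.

def heat_released(T):
    # Heat evolved forming the products, falling at high T as
    # dissociation eats into it (toy linear model).
    return 13_000.0 - 0.9 * T

def heat_required(T):
    # Heat to warm the products from 298 K to T (toy heat capacity).
    return 2.1 * (T - 298.0)

def chamber_temperature(lo=1000.0, hi=5000.0, tol=0.5):
    # Bisection stands in for the chemist's 'readjust your first
    # guess and try again': the balance falls with T, so we can
    # home in on the temperature where the two figures agree.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if heat_released(mid) > heat_required(mid):
            lo = mid  # too cool: more heat supplied than needed
        else:
            hi = mid  # too hot: not enough heat to get there
    return (lo + hi) / 2

T = chamber_temperature()
print(round(T))  # → 4542
```

The real hand calculation was far worse, because each temperature guess also contained the inner partial-pressure iteration Clark describes, but the outer loop has exactly this shape.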

What it’s like to hear but not see the Toronto Air Show

A tweet of mine, written in a moment of irritability aggravated by the sound of jets roaring overhead, has gotten some attention by virtue of being incorporated into some news articles about social media commentary on the Toronto Air Show.

In addition to my standard gripes about the wastefulness of jet engine use, the undesirability of unwanted background noise, and the militarism embodied in combat aircraft development, I suggested that there are people in Toronto who find the experience of being a bystander to the noise troubling, or a reminder of trauma, having heard military jets operating at close quarters during any number of recent conflicts, from Gaza to Afghanistan to Yemen, or during interception flights carried out by domestic air forces.

A disturbing amount of the response on Twitter expressed anti-immigrant sentiment, particularly the assertion that (a) people who have experienced conflict and now live in Toronto enjoy preferable life circumstances and therefore (b) they owe certain moral obligations to people who previously lived in the place they now inhabit; namely, lumping it and not complaining while these acrobatic displays are put on. To some extent my interpretation of the comments was inevitably coloured by Twitter’s reputation as an especially hostile and personal platform, but I think even when viewed with as much objectivity as can be mustered they brought unnecessary hostility to a discussion ultimately about public policy, specifically whether such spectacles should continue.

It’s entirely fair to criticize me for assuming what somebody else’s life experience would mean, in terms of their experience of these noises. That being said, the basic parameters of something like post-traumatic stress disorder are publicly known, and it seems plausible to me that anyone who has traumatic memories of being close to combat in which jets operated (whether as a soldier or a civilian) would have some chance of being triggered by the sound of an air show. Given the population of Toronto, it’s plausible that hundreds of people with PTSD are within earshot of each loud noise made by flying aircraft. It’s much more speculative, but I have also wondered how many people might find a stimulus like jet engine noise capable of triggering a panic attack because of associations made through fiction, specifically quasi-realistic military computer games and films which realistically depict violence, like Saving Private Ryan. Statistically very few people, even during times of mass conscription, faced intense combat of the kind depicted in the film, but probably a majority of the adult population has now seen multiple detailed immersive representations, whether through films like Spielberg’s or depictions like HBO’s Generation Kill or Band of Brothers.

I don’t want to suggest that it’s the same thing at all, but I have my own negative associations with hearing but not seeing military jets at low altitude nearby, as I lived in North Oxford within earshot of at least some of the approaches to RAF Brize Norton and we used to listen to Vickers refuelling aircraft and B-52s flying in at all times of day and night (familiar eventually in their shrieks and rumbles) and speculate about whether they were coming back from Iraq or from Afghanistan, maybe carrying coffins.

I don’t think social media griping is going to lead to the abolition of the air show, but I do think it’s a good thing to have a public dialogue about what people in the city are going through in terms of their mental health and the choices we make together affecting it.

Those with any opinions on the matter are invited to comment, anonymously if you like.

Security vulnerabilities in computer hardware

Why is trustworthy computer security impossible for ordinary users? In part because the system has multiple levels at which failure can occur, from hardware to operating systems and software.

Spectre and Meltdown show that no matter how careful you are about the operating system and software you run, you can still be attacked through the underlying hardware. Another bug, included in at least some VIA C3 x86 processors, has similar ramifications.

These kinds of problems will be much worse with the “Internet of Things”, since bugs like Heartbleed will go unpatched, or even be unpatchable, in a lot of embedded computing applications for consumers.

Word versus LaTeX for academic publishing

There are some good discussions online about the relative merits of different types of software for writing long scholarly documents like a PhD thesis. For instance, Amrys O. Williams’ “Why you should LaTeX your dissertation; or, why you don’t have to write your dissertation in Word”.

I’ve seen the plusses and minuses of using LaTeX in academic and activism contexts first-hand and the dominant set of considerations for me concern collaboration. Theoretically, as a free and open source typesetting system LaTeX ought to be ideal for preparing complex documents. Unfortunately, whether they are university professors or student activists, it’s likely that few or none of your potential collaborators will already be familiar with LaTeX syntax or comfortable providing comments on a document in the format of LaTeX source code.

For my dissertation I have decided to write the whole thing as chapters in Microsoft Word files, for the ease of my committee members. The drafts won’t have full citations, just the unique identifiers and any other details which I will eventually need to produce citations in LaTeX. This way, my committee members can provide comments on Word documents and, once I have everything nailed down, I can spend a few days moving all the text into LaTeX for the preparation of the final dissertation. Committee members also won’t be distracted by a need to minutely copy edit formatting and other trivialities, since each chapter explains that it’s just a draft for review with precise formatting to be done later.
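That final migration step can be partly automated. As a hedged sketch (the [[key]] placeholder convention here is hypothetical, not the identifiers I actually use), a few lines of Python can convert draft citation markers into LaTeX \cite commands:

```python
import re

# Hypothetical draft convention: citations appear in the Word text as
# [[key]] placeholders, e.g. [[leinbach2018]]. This rewrites them as
# LaTeX \cite{...} commands; each key must match a .bib file entry.
def placeholders_to_cites(text: str) -> str:
    return re.sub(r"\[\[([A-Za-z0-9:_-]+)\]\]", r"\\cite{\1}", text)

draft = "The recovery effort is described in detail [[leinbach2018]]."
print(placeholders_to_cites(draft))
# → The recovery effort is described in detail \cite{leinbach2018}.
```

Any consistent placeholder format would do; the point is that a mechanical convention in the Word drafts keeps the eventual conversion to a one-pass find-and-replace rather than days of hand editing.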

I would rather just write the whole thing as LaTeX code in TextMate, avoiding the need to use Word at all, but a central necessity of writing a doctoral thesis is soliciting and incorporating input from committee members, so all told the approach of writing in Word and later typesetting in LaTeX seems to have the most to recommend it.