A suggestion to Google

One cool feature of Google is that it performs unit conversions. It makes it easy to learn that 1000 rods is the same as 2750 fathoms. One useful addition would be the calculation of carbon dioxide equivalents: you could plunk in “250 tonnes of methane in CO2 equivalent” and have it generate the appropriate output, based on the methodology of the IPCC. Beyond methane, the calculator should also handle nitrous oxide, SF6, HCFCs, HFCs, CFCs, and PFCs.
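For illustration, here is a minimal sketch in Python of the arithmetic such a calculator would perform: multiplying a mass of gas by its 100-year global warming potential (GWP). The GWP figures below are approximately the IPCC Fourth Assessment Report 100-year values, and the function name is my own invention; a real tool would take its factors directly from the current assessment report.

    # Minimal sketch of a CO2-equivalent conversion using 100-year global
    # warming potentials (GWPs). Values are approximate IPCC AR4 figures,
    # included here only for illustration.
    GWP_100 = {
        "co2": 1,
        "methane": 25,
        "nitrous oxide": 298,
        "sf6": 22800,
    }

    def co2_equivalent(tonnes, gas):
        """Convert tonnes of a greenhouse gas into tonnes of CO2 equivalent."""
        return tonnes * GWP_100[gas.lower()]

    print(co2_equivalent(250, "methane"))  # 250 t of methane -> 6250 t CO2e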

Sure, this feature would only be useful to fewer than one in a million people, but Google has often shown itself willing to cater to the needs of techie minorities.

The true price of nuclear power


Several times this blog has discussed whether climate change is making nuclear power a more acceptable option (1, 2, 3). One element of the debate that bears consideration is the legacy of contamination at sites that form part of the nuclear fuel cycle: from uranium mines to post-reactor fuel processing facilities. The Rocky Flats Plant in the United States is an especially sobering example.

Insiders at the plant started tipping off the FBI about the unsafe conditions sometime in 1988. Late that year the FBI started clandestinely flying light aircraft over the area and noticed that the incinerator was apparently being used late into the night. After several months of collecting evidence both from workers and by direct measurement, they informed the DOE on June 6, 1989 that they wanted to meet about a potential terrorist threat. When the DOE officers arrived, they were served with papers. Simultaneously, the FBI raided the facilities and ordered everyone out. They found numerous violations of federal anti-pollution laws, including massive contamination of water and soil, though none of the original charges that led to the raid were substantiated.

In 1992, Rockwell was charged with minor environmental crimes and paid an $18.5 million fine.

Accidents and contamination have been a feature of facilities handling nuclear materials worldwide. Of course, this does not suffice to show that nuclear energy is a bad option. Coal mines certainly produce more than their share of industrial accidents and environmental contamination.

The trickiest thing, when it comes to evaluating the viability of nuclear power, is disentangling exactly what governmental subsidies exist, have existed, and will exist. These subsidies are both direct (paid straight to operators) and indirect (soft loans for construction, funding for research and development). They also include guarantees that the nuclear industry is only liable for a set amount of money in the event of a catastrophic accident, as well as the implicit cost that any contamination corporations cannot be legally forced to correct after the fact will either fester or be fixed at taxpayer expense. Plenty of sources claim to have a comprehensive reckoning of these costs and risks, but the various analyses seem to be both contradictory and self-serving.

Before states make comprehensive plans to embrace or reject nuclear power as a climate change mitigation option, an extensive, impartial study of the caliber of the Stern Review would be wise.

The Storm Worm

The Storm Worm is scary for a number of good reasons. It acts patiently, slowly creating a massive network of drone machines and control systems, communicating through peer-to-peer protocols. It gives little evidence that a particular machine has been compromised. Finally, it creates a malicious network that is particularly hard (maybe impossible, at this time) to map or shut down.

This is no mere spam-spread annoyance. If it takes over very large numbers of computers and remains in the control of its creators, it could be quite a computational force. The only question is what they (or someone who rents the botnet) will choose to use it for, and whether such attacks can be foiled by technical or law-enforcement means. Hopefully, this code will prove a clever exception to the norm, rather than a preview of what the malware of the future will resemble.

Normally, I don’t worry too much about viruses. I use a Mac, run anti-virus software, use other protective programs, make frequent backups, and use the internet cautiously. While those things are likely to keep my own system free of malware, I remain vulnerable to infections on other people’s machines: that is where most spam comes from. There is also the danger that a network of malicious computers will crash or blackmail some website or service that I use. With distributed systems like Storm, the protection of an individual machine isn’t adequate to prevent harm.


Dr. Strangelove in a nuclear bunker


After today’s orientation, I went with some friends to see Dr. Strangelove in the Diefenbunker – the infamous Canadian nuclear shelter, built to protect top Canadian military and civilian leadership in the event of nuclear war. Diefenbunker is actually a general term for shelters of the type: the one near Ottawa is called CFS Carp. Apparently, there is also one in Nanaimo, B.C. One odd thing is that the shelter has a multi-room suite for the Governor General. Presumably, Canada would not have much need for a local representative of the Queen, after the actual Queen’s entire realm is reduced to a burnt, radioactive plain.

Tonight’s film was followed by pho with three fellow employees of the federal government. It was all a distinct social step forward, and Ashley Thorvaldson deserves credit for organizing the expedition.

You can read about the Cold War movie events on the website of the Diefenbunker Museum.

Liability and computer security

One of the major points of intersection between law and economics is liability. By setting the rules about who can sue brake manufacturers, in what circumstances, and to what extent, lawmakers help to set the incentives for quality control within that industry. By establishing what constitutes negligence in different areas, the law tries to balance efficiency (encouraging cost-effective mitigation on the part of whoever can do it most cheaply) with equity.

I wonder whether this could be used, to some extent, to combat the botnets that have helped to make the internet such a dangerous place. In brief, a botnet consists of ordinary computers that have been taken over by a virus. While they do not seem altered from the perspective of their users, they can be maliciously employed by remote control to send spam, attack websites, carry out illegal transactions, and so forth. There are millions of such computers, largely because so many unprotected PCs with incautious and ignorant users are constantly connected to broadband.

As it stands, there is some chance that an individual computer owner will face legal consequences if their machine is used maliciously in this way. It would be a lot more efficient to pass part of the responsibility to Internet Service Providers (ISPs): firms whose networks transmit spam or viruses outwards could be sued by those harmed as a result. These firms have the staff, expertise, and network control. Given the right incentives, they could require users to run up-to-date antivirus software that the ISP would provide. They could also screen incoming and outgoing network traffic for viruses and botnet control signals. They could, in short, become more like the IT department at an office. ISPs with such obligations would then lean on the makers of software and operating systems, forcing them to build more secure products.
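As one illustration of the kind of screening this liability would encourage, here is a toy Python sketch of an ISP flagging customers whose machines suddenly make an abnormal number of outbound mail connections, a common symptom of a spam-sending bot. The threshold, function name, and log format are invented for the example, not drawn from any real ISP’s systems.

    # Toy sketch: flag customers making an unusually high number of outbound
    # SMTP (port 25) connections in an hour. Threshold and data format are
    # illustrative assumptions, not a real product.
    from collections import Counter

    SMTP_PORT = 25
    HOURLY_THRESHOLD = 500  # arbitrary example threshold

    def flag_likely_bots(connection_log):
        """connection_log: iterable of (customer_id, destination_port) tuples
        covering one hour of traffic. Returns customers worth investigating."""
        smtp_counts = Counter(
            customer for customer, port in connection_log if port == SMTP_PORT
        )
        return [c for c, n in smtp_counts.items() if n > HOURLY_THRESHOLD]

A customer whose machine opens thousands of outbound mail connections per hour is far more likely to be compromised than to be a person writing email, which is why even a crude filter like this can be useful as a first pass.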

As Bruce Schneier has repeatedly argued, hoping to educate users as a means of creating overall security is probably doomed. People don’t have the interest or the incentives to learn, and the technology and threats change too quickly. To do a better job of combating those threats, our strategies should change as well.

Oryx and Crake


Margaret Atwood’s novel, which was short-listed for the Booker Prize, portrays a future characterized by the massive expansion of human capabilities in genetic engineering and biotechnology. As such, it bears some resemblance to Neal Stephenson’s The Diamond Age, which ponders what massive advances in material science could do, and posits similar stratification by class. Of course, biotechnology is an area more likely to raise ethical hackles and engage with the intuitions people have about what constitutes the ethical use of science.

Atwood does her best to provoke many such thoughts: bringing up food ethics, corporate ethics, reproductive ethics, and survivor ethics (the last time period depicted is essentially post-apocalyptic). The degree to which the catastrophe is brought about by a combination of simple greed, logic limited by one’s own circumstances, and unintended consequences certainly has a plausible feel to it.

The book is well constructed and compelling, obviously the work of an experienced storyteller. From a technical angle, it is also more plausible than most science fiction. It is difficult to identify any element that humanity is highly unlikely ever to be able to do, should it so desire. That, of course, contributes to the chilling effect, as the consequences of some such actions unfold.

All in all, I don’t think the book has a straightforwardly anti-technological bent. It is more a cautionary tale about what can occur in the absence of moral consideration and concomitant regulation. Given that the regulation of biotechnology is such a contemporary issue (stem cells, hybrid embryos, genetic discrimination, etc.), Atwood has written something that speaks to some of the more important ethical discussions occurring today.

I recommend the book without reservation, with the warning that readers may find themselves disturbed by how possible it all seems.

Unlocking cars with computers

Back in the day when the original Palm Pilot was a hot new piece of technology, I remember that BMW and a number of other car companies started selling cars with a keyless entry system based on an infrared transmitter in a key fob, just like a television remote control. Unfortunately, whatever protocol the system used for authentication was quickly undermined, and the Palm Pilot’s infrared transmitter suddenly became a key to all manner of expensive new automobiles.

Something similar has happened again. The KeeLoq system, used in the keyless entry systems of most car manufacturers, has been cracked by computer security researchers. A PDF of their research paper is online. The attack requires about one hour of radio communication with the key, which could be done surreptitiously while the owner is in an office or restaurant. The cryptographic analysis involved takes about a day and produces a ‘master key’ that can actually open a number of different cars. Having collected a large number of such master keys, it would be possible to intercept a single transmission between a key and a car (say, when someone is parking), identify the correct master key, and open the door in seconds. While this will not start the car – and there are certainly other methods available for breaking into one – it does create a risk of theft of objects from inside cars that leaves no sign of forced entry. In many such cases, claiming insurance compensation is difficult.
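To illustrate why a recovered master key is so damaging, here is a toy Python sketch (emphatically not the actual KeeLoq cipher or key-derivation scheme) of the general idea: each fob’s key is derived from a manufacturer-level secret plus the fob’s serial number. The hash-based derivation and all of the names are assumptions made purely for illustration.

    # Toy illustration only: not the KeeLoq algorithm. It shows why a
    # manufacturer-level "master key" is so valuable to an attacker when each
    # fob's key is derived from it and from the fob's public serial number.
    import hashlib

    def derive_fob_key(master_key: bytes, serial_number: bytes) -> bytes:
        # Hypothetical derivation for illustration; real systems use a
        # manufacturer-specific scheme.
        return hashlib.sha256(master_key + serial_number).digest()

    master = b"manufacturer master key"       # recovered once, via cryptanalysis
    eavesdropped_serial = b"fob-serial-0451"  # broadcast in the clear by the fob

    # With the master key in hand, an attacker can recompute any fob's key
    # from a single intercepted transmission and open the door in seconds.
    cloned_key = derive_fob_key(master, eavesdropped_serial)
    print(cloned_key.hex())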

Of course, mechanical locks also have their failings. One important difference has to do with relative costs. Making a physical, key-based access control system more secure probably increases the cost for every single unit appreciably. By contrast, improving the cryptography for a system based on an infrared or radio frequency transmission probably involves a one-off software development cost, with negligible additional costs per unit. As such, it is especially surprising that the KeeLoq system is so weak.

Quantum computers and cryptography

Public key cryptography is probably the most significant cryptographic advance since the discovery of the monoalphabetic substitution cipher thousands of years ago. In short, it provides an elegant solution to the problem of key distribution. Normally, two people wishing to exchange encrypted messages must exchange both the message and the key to decrypt it. Sending both over an insecure connection is obviously unsafe and, if you have a safe connection, there is little need for encryption. Based on some fancy math, public key encryption systems let Person A encrypt messages for Person B using only information that Person B can make publicly available (a public key, like mine).
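As a concrete illustration of the idea, here is a toy RSA-style example in Python, using absurdly small numbers. Real keys involve numbers hundreds of digits long, and real systems add padding and other safeguards that this sketch omits.

    # Toy RSA example with tiny numbers (requires Python 3.8+ for pow(e, -1, phi)).
    p, q = 61, 53                # Person B's secret primes
    n = p * q                    # 3233, published
    e = 17                       # public exponent, published
    phi = (p - 1) * (q - 1)      # 3120, kept secret
    d = pow(e, -1, phi)          # 2753, Person B's private key

    message = 65                            # Person A only needs n and e...
    ciphertext = pow(message, e, n)         # ...to encrypt
    decrypted = pow(ciphertext, d, n)       # only d (i.e. knowing p and q) decrypts
    assert decrypted == message

The security of the scheme rests on the difficulty of factoring n back into p and q, which is exactly the step that Shor’s algorithm on a quantum computer would make easy.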

Now, quantum computers running Shor’s algorithm threaten to ruin the party. Two groups claim to have achieved some success. If they manage the trick, the consequences will be very significant, and not just for PGP-using privacy junkies. Public key encryption is also the basis for all the ‘https’ websites where we so happily shop with credit cards. If a fellow in a van outside can sniff the traffic from your wireless network and later decrypt it, buying stuff from eBay and Amazon suddenly becomes a lot less appealing.

Thankfully, quantum computers continue to prove very difficult to build. Of course, some well-funded and sophisticated organization may have been quietly using them for years. After all, the critical WWII codebreaking work at Bletchley Park was only made known publicly about 30 years after the war.

For those who want to learn more, I very much recommend Simon Singh’s The Code Book.

Precaution and bats

The ‘precautionary principle’ is frequently invoked in arguments about both security and the environment, but remains enduringly controversial. No matter how it is formulated, it has to do with probabilities and thresholds for action. Sometimes, it is taken to mean that there need not be proof that something is harmful before it is restricted: for instance, in the case of genetically modified foods. Sometimes, it is taken to mean that there need not be proof that something is beneficial before it is done: for example, with organic foods. Sometimes, it has to do with who gets the benefit of the doubt, in the face of inconclusive or inadequate scientific data.

This article from Orion Magazine provides some interesting discussion of how it pertains to health threats generally, with an anecdote about rabid bats as an illustrative example.

I am not sure there is much of a take-home message – other than that people behave inconsistently when presented with risks that might seem similar in simple cost-benefit terms – but the article is an interesting one.

Peering into metal with muons

When cosmic rays collide with molecules in the upper atmosphere, they produce particles called muons. About 10,000 of these strike every square metre of the earth’s surface each minute. These particles are able to penetrate several tens of metres through most materials, but are scattered to an unusual extent by atoms that include large numbers of protons in their nuclei. Since this includes uranium and plutonium, muons could have valuable security applications.

Muon tomography is a form of imaging that can be used to pick out fissile materials, even when they are embedded in dense masses. For instance, a tunnel-sized scanner could examine entire semi trucks or shipping containers in a short time. Such tunnels would be lined with gas-filled tubes, each containing a thin wire capable of detecting muons on the basis of a characteristic ionization trail. It is estimated that scans would take 20 to 60 seconds, with less time needed for vehicles and objects of a known configuration.
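As a rough back-of-the-envelope check on those figures, here is a short Python calculation of how many muons would pass through a shipping-container-sized scanner during a scan, using the flux quoted above. The container dimensions are my own assumption, included only for illustration.

    # Back-of-the-envelope estimate using the flux figure from the text.
    flux_per_m2_per_min = 10_000      # muons striking each square metre per minute

    length_m, width_m = 12.0, 2.4     # roughly a 40-foot container's footprint (assumed)
    top_area_m2 = length_m * width_m  # ~29 m^2 presented to the downward flux

    for scan_seconds in (20, 60):
        muons = flux_per_m2_per_min * top_area_m2 * scan_seconds / 60
        print(f"{scan_seconds} s scan: ~{muons:,.0f} muons traverse the container")

Even a 20-second scan involves on the order of a hundred thousand muons passing through the cargo, which gives a sense of why such short scan times are plausible.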

Muons have also been used in more peaceful applications, such as looking for undiscovered chambers in the Pyramids of Giza and examining the interior of Mount Asama, in Japan.