Increasingly clever machines

It seems my mountain-climbing, robot-building friend Mark has a relatively new blog. He works with autonomous robots of the kind that competed in the recent DARPA Urban Challenge.

Here is one way in which such robots see the world: as a set of laser-determined ranges.

Death, drugs, and rock and roll

A recent study in the Journal of Epidemiology and Community Health confirms the hazards of musical stardom. The study examined the lives of 1,064 successful musicians in the rock, punk, rap, R&B, electronica, and new age genres. All became famous between 1956 and 1999 and all had records that were included in a ‘Top 1000 records of all time’ list from 2000.

It found that the median age of death for North American music celebrities was an unimpressive 41.78 years. Europeans do even worse, at just 35.18. All told, successful musicians are nearly twice as likely to die early as members of the general population.

The regional breakdown by cause of death is also interesting:

Cause – % in the US – % in Europe
Suicide – 2.8% – 3.6%
Drug or alcohol overdose – 15.3% – 28.6%
Chronic drug or alcohol disorder – 9.7% – 3.6%
Drug or alcohol related accident – 2.8% – 7.1%
Cancer – 19.4% – 21.4%
Heart disease – 18.0% – 3.6%
Accidents – 13.9% – 21.4%
Violence – 6.9% – 3.6%
Other – 11.1% – 7.1%

The largest single discrepancy is the probability of dying of a drug overdose, but lots of other significant differences exist. Neither regional profile suggests that music is a healthy profession, at least for those at the top.

Today’s best biofuel: Brazilian ethanol

[Photo: Montreal graffiti]

Many people see biofuels as a promising replacement for oil in transportation applications. Indeed, being able to replace oil that contributes to climate change and must often be imported from nasty regimes with carbon-neutral fuels from domestic crops has a great deal of intuitive appeal. For this substitution to be worthwhile, however, both life-cycle energy use and net carbon emissions need to be considered.

A study conducted in 2004 by Isaias de Carvalho Macedo at the State University of Campinas (Unicamp) in Brazil focused on the production of ethanol from Brazilian sugarcane, which most commentators consider the most energy-efficient source of biofuel currently available. That is because most Brazilian sugarcane requires no irrigation and must only be ploughed up and replanted once every five years. The Macedo study found that producing a tonne of sugarcane requires 250,000 kilojoules of energy, a figure that accounts for tractors, fertilizers, and other elements of modern mechanical farming. The ethanol from that tonne of sugarcane contains 2,000,000 kilojoules of energy: roughly eight times the input. Furthermore, the mills that produce it burn bagasse (the pulp left over once the sugar has been squeezed out of the cane) and can contribute net electricity to the grid. Corn ethanol (the kind being heavily subsidized in the United States), by contrast, takes about as much energy to grow as is ultimately contained in the fuel.
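This energy-return arithmetic is simple enough to check. A minimal sketch in Python, using only the two figures from the Macedo study quoted above:

```python
# Back-of-the-envelope energy return for Brazilian cane ethanol,
# using the figures from the Macedo study cited above.

energy_input_kj = 250_000     # energy needed to produce one tonne of sugarcane
energy_output_kj = 2_000_000  # energy in the ethanol made from that tonne

energy_return = energy_output_kj / energy_input_kj
print(f"Energy return: {energy_return:.0f} units out per unit in")  # ~8
```

Note that this ratio excludes any credit for bagasse-fired electricity, which would make the balance look even better.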

In terms of net carbon emissions, cane ethanol is also fairly good. Since ethanol contains about 25% less energy than gasoline, the relevant comparison is between 1,000 kilograms of ethanol and 750 kilograms of gasoline. When all aspects of production and use are considered, the gasoline would emit about 460 kilograms of carbon dioxide, while the ethanol would emit about 259.5 kilograms: a saving of roughly 200 kilograms. (For reference, burning one litre of gasoline produces about 640 grams of carbon dioxide.)
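A quick sketch working through that comparison with the stated figures (which, as the update at the end of this post notes, are disputed):

```python
# Gasoline-versus-ethanol comparison using the figures as stated above.
# These numbers are disputed (see the update at the end of this post);
# the sketch simply works through the stated arithmetic.

ethanol_mass_kg = 1_000   # one tonne of ethanol
gasoline_mass_kg = 750    # gasoline with the same energy content

gasoline_co2_kg = 460.0   # stated life-cycle emissions for the gasoline
ethanol_co2_kg = 259.5    # stated life-cycle emissions for the ethanol

saving_kg = gasoline_co2_kg - ethanol_co2_kg
print(f"CO2 avoided per tonne of ethanol: {saving_kg:.1f} kg")   # 200.5 kg
print(f"Relative reduction: {saving_kg / gasoline_co2_kg:.0%}")  # ~44%
```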

This is an improvement over the direct use of fossil fuels, but not a massive one. The Macedo study concludes that widespread ethanol use reduces Brazilian emissions by 25.8 million tonnes of carbon dioxide equivalent per year. Brazil's total carbon emissions from fossil fuels are about 92 million tonnes per year – a figure that increases substantially if deforestation is included.

The conclusion to be drawn from all of this is that ethanol – even when produced in the most efficient way – is not a long-term solution. Emitting 259.5 kilograms of carbon dioxide is more sustainable than emitting 460, but it isn't an adequate reduction in a world that has to cut from about 27 gigatonnes of carbon dioxide equivalent per year to five. Bioethanol may become more viable with the development of cellulosic technology (a subject for another post), but it is certainly no panacea at this time.

[Update: 8:54am] The above numbers on the carbon dioxide emissions produced by gasoline are disputed. If someone has an authoritative source on the matter, please pipe up.

Carbon pricing and GHG stabilization

[Photo: Montreal graffiti]

Virtually everyone acknowledges that the best way to reduce greenhouse gas emissions is to create a price for their production that someone has to pay. It doesn’t matter, in theory, whether that is the final consumer (the person who buys the iPod manufactured and shipped across the world), the manufacturer, or the companies that produced the raw materials. Wherever in the chain the cost is imposed, it will be addressed through the economic system just like any other cost. When something rises in price, people generally switch to substitutes or cut back their usage.

This all makes good sense for the transition from a world where carbon has no price at all and the atmosphere is treated as a greenhouse gas trash heap. What might become problematic is the economics of the situation as greenhouse gas emissions approach the point of stabilization. If the world collectively gets 5 gigatonnes of emissions per year, a global population of 11 billion will get about half a tonne each.

Consider two things. The first is the sheer scale of the cuts required: right now, Canadian emissions per person are about 24.3 tonnes of CO2 equivalent, so cutting to about 0.5 is a major change. While it may be possible to cut a large amount for a low price (carbon taxes or permits at up to $150 a tonne have been discussed), it makes sense that people will be willing to pay ever more to avoid each marginal decrease in their carbon budget. Moving from 24.3 tonnes to 20 might mean carrying out some efficiency improvements. Moving from 20 to 10 might require a re-jigging of the national energy and transportation infrastructures, carbon sequestration, and other techniques. Moving from 10 to 0.5 may inevitably require considerable personal sacrifice. It certainly rules out air travel.
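The scale of the required cut is easy to verify with a couple of lines of arithmetic, using the figures above:

```python
# Per-capita carbon budget arithmetic, using the figures given above.

global_budget_t = 5e9   # ~5 Gt of CO2 equivalent per year at stabilization
population = 11e9       # assumed eventual global population

per_capita_t = global_budget_t / population
print(f"Per-capita budget: {per_capita_t:.2f} tonnes")  # ~0.45 t each

canadian_now_t = 24.3   # current Canadian per-person emissions (t CO2e)
cut_needed = 1 - per_capita_t / canadian_now_t
print(f"Required Canadian cut: {cut_needed:.0%}")       # ~98%
```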

The second factor to consider is the effect of economic inequality on all this. We can imagine many kinds of tax and trading systems. Some might be confined to individual states, and others to regions. It is possible that such a scheme would eventually be global. With a global scheme, however, you need to consider the willingness of the relatively affluent to pay thousands or tens of thousands of dollars to maintain elements of their carbon-intensive lifestyles. This could mean that people of lesser means get squeezed even more aggressively. It could also create an intractable problem of fraud. A global system that transfers thousands of dollars on the basis of largely unmeasured changes in lifestyle could be a very challenging thing to authenticate.

These kinds of problems lie in the relatively distant future. Moving to a national economy characterized by a meaningful carbon price is likely to take a decade. Moving to a world of integrated carbon trading may take even longer. All that admitted, the rising marginal value of carbon and the effects of economic inequality are things that those pondering such pricing schemes should begin to contemplate.

Index of climate posts

[Photo: Fruit bar]

For the last while, my aim on this blog has been both to entertain readers and to provide some discussion of all the important aspects of the climate change problem. To help with the latter aim, I have established an index of posts on major climate change issues. Registered users of my blog can help to update it. Alternatively, people can use comments here to suggest sections that should be added or other changes.

The index currently contains all posts since I arrived in Ottawa. I should soon expand it to cover the entire span for which this blog has existed.

Problems with fusion that ITER is meant to solve

[Photo: Building in Old Montreal]

The fundamental problem with nuclear fusion as a mode of energy production is establishing a system that produces more power than it consumes. Heating and containing large volumes of deuterium-tritium plasma is an energy-intensive business. As such, the sheer size of the planned International Thermonuclear Experimental Reactor (ITER) is a big advantage. Just as it is easier to keep a huge cooler full of drinks cold than to keep a single can that way, a larger volume of plasma has less surface area relative to its total energy, so bigger reactors have a better chance of producing net power.
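A toy calculation makes the geometry point concrete. This sketch uses a sphere as a stand-in for the tokamak's actual toroidal plasma (an assumption made purely for simplicity; the scaling argument is the same):

```python
# Surface-to-volume ratio of a sphere, which falls as 3/r with size.
# A sphere stands in for the real toroidal plasma for simplicity.

import math

for radius_m in (1, 2, 4, 8):
    surface = 4 * math.pi * radius_m ** 2       # m^2
    volume = (4 / 3) * math.pi * radius_m ** 3  # m^3
    print(f"r = {radius_m} m: surface/volume = {surface / volume:.2f} per metre")
```

Relative heat loss scales roughly with this ratio, which is why a bigger plasma volume is so helpful.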

The other big problems that scientists and engineers anticipate are as follows:

  1. No previous reactor has sustained fusion for very long. The JT-60 reactor in Japan holds the record, at 24 seconds. Because ITER is meant to operate for between seven and fifteen minutes, it will produce a higher volume of very hot helium (the product of deuterium-tritium fusion). That helium could interfere with the fusing plasma. As such, it needs to be removed from the reactor somehow. ITER plans to use a carbon-coated structure called a divertor, at the bottom of the reactor, to try to do this. It is not known how problematic the helium will be, nor how effective the divertor will prove.
  2. Both the divertor and the blanket that surrounds the reactor will need to withstand the heat from plasma at temperatures of 100 million degrees centigrade. They will also need to survive large amounts of radiation. It is uncertain whether the planned beryllium coatings will be adequate to deal with the latter. Prior to ITER’s construction, there are plans to test the planned materials using a specially built particle accelerator at a new facility, probably to be built in Japan. This test facility could cost about $2.6 billion – one quarter of the total planned cost of ITER itself.
  3. Probably the least significant problem is converting the heat from the fusion reaction into electrical power. This is presumably just a matter of running fluid-carrying pipes through the blanket, then using the expansion of that fluid to drive turbines. While this should be relatively straightforward, it is worth noting that ITER will have no capacity to generate electricity, and will thus need to dissipate its planned thermal output of about 500 megawatts by other means.

None of these issues undermine the case for building ITER. Indeed, they are the primary justification for building the facility. If we already knew how to deal with these problems, we could proceed directly to building DEMO: the planned electricity-generating demonstration plant that is intended to be ITER’s successor.

The foolishness of the International Space Station

[Photo: Montreal courthouse]

On Tuesday, the space shuttle launched once again on a mission to add another piece to the International Space Station (ISS). As I have said before, this is a needlessly dangerous, unjustifiably expensive, and rather pointless venture. The science could be equally well done by robots, without risking human lives, and without spending about $1.3 billion per launch (plus emitting all the greenhouse gases from the solid rocket boosters and related activities).

More and more, the ISS looks like a hopeless boondoggle. The lifetime cost is being estimated at $130 billion, all to serve a self-fulfilling mandate: we need to put people into space to scientifically assess what happens when we put people into space. Furthermore, the window between the completion of the ISS in about 2012 and the potential abandonment of the station as soon as 2016 is quite narrow. Robert Park may have summed up the whole enterprise best when he remarked:

“NASA must complete the ISS so it can be dropped into the ocean on schedule in finished form.”

Normally, I am a big supporter of science. I think funding the International Thermonuclear Experimental Reactor and the Large Hadron Collider is wise; these machines will perform valuable scientific research. Likewise, I support the robotic work NASA does – especially satellites looking down on Earth from orbit and providing valuable research and services. I support the James Webb Space Telescope. I also support the idea that NASA should have some decent plans for dealing with an anticipated asteroid or comet impact. The ISS, by contrast, is a combination of technical fascination lacking strategic purpose and pointless subsidies to aerospace contractors.

Of course, the Bush plan to send people to Mars is an even worse idea with higher costs, more risk, and even less value.

Studies backing successive IPCC reports

While it is obvious that the 2007 Fourth Assessment Report (4AR) of the Intergovernmental Panel on Climate Change (IPCC) was going to be more comprehensive than the 2001 Third Assessment Report (TAR), I was surprised to see the extent and the breakdown:

Sector – Studies assessed in TAR – Studies assessed in 4AR
Cryosphere – 23 – 59
Hydrology and water resources – 23 – 49
Coastal processes and zones – 4 – 56
Aquatic biological systems – 14 – 117
Terrestrial biological systems – 46 – 178
Agriculture and forestry – 5 – 49
Human health – 5 – 51
Disasters and hazards – 3 – 18

Total – 95 – 577
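The sector-by-sector growth is easy to compute from the table; a quick sketch, using the counts exactly as given above:

```python
# Growth in the number of studies assessed per sector, TAR to 4AR,
# using the counts from the table above.

studies = {
    "Cryosphere": (23, 59),
    "Hydrology and water resources": (23, 49),
    "Coastal processes and zones": (4, 56),
    "Aquatic biological systems": (14, 117),
    "Terrestrial biological systems": (46, 178),
    "Agriculture and forestry": (5, 49),
    "Human health": (5, 51),
    "Disasters and hazards": (3, 18),
}

for sector, (tar, ar4) in studies.items():
    print(f"{sector}: {ar4 / tar:.1f}x")  # e.g. coastal studies grew 14-fold
```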

While it is simplistic to equate the number of studies examined with the overall quality of the conclusions drawn, the large increase certainly reflects the amount of research now being devoted to climate change, as well as the level of resources deemed appropriate for assessing that body of scientific work.

These figures come from Cynthia Rosenzweig, a research scientist at NASA and a member of the IPCC’s Working Group II.

Materials science and transgenic animals

[Photo: Oil spill analysis equipment]

One of the most interesting ongoing developments in materials science involves borrowing biologically originated materials and processes. This is old news for people who follow science closely, but it seems worth mentioning to others.

In the first instance, there is the copying of tricks that exist in nature. People have speculated about copying the wall-sticking abilities of gecko feet, for instance. By artificially producing structures similar to those on the feet, excellent non-chemical adhesives could be made. Gecko feet are sufficiently adhesive to hold several hundred times the weight of the animal. Furthermore, they can be attached and detached at will by altering the geometry of the setae that produce the adhesion through van der Waals forces.

In the second instance, people have been exploiting biological processes to produce existing things in more effective ways. A favourite way to do this is through pharming, in which new genes are introduced into a species in order to turn it into a pharmaceutical factory. For instance, goats have been genetically engineered to produce an anti-clotting drug in their milk, which can then be extracted, purified, and used by humans. The drug, called ATryn, treats hereditary antithrombin deficiency: a condition that makes people especially vulnerable to deep-vein thrombosis. The principal benefits of using goats are financial, as described in The Economist:

Female goats are ideal transgenic “biofactories”, GTC claims, because they are cheap, easy to look after and can produce as much as a kilogram of human protein per year. All told, Dr Cox reckons the barn, feed, milking station and other investments required to make proteins using transgenic goats cost less than $10m—around 5% of the cost of a conventional protein-making facility. GTC estimates that it may be able to produce drugs for as little as $1-2 per gram, compared with around $150 using conventional methods.

Transgenic goats are also being used to produce spider silk on an industrial scale. That super-strong material could be used in everything from aircraft to bullet-proof vests. Different varieties of spider silk could be used to produce materials with varying strengths and elasticities.

While the former behaviour seems fairly unproblematic (we have been copying from nature for eons), the latter does raise some ethical issues. Certainly, it involves treating animals as a means to greater ends – though that is also an ancient activity. People have generally been more concerned about the dangers to people and the natural world from such techniques: will the drugs or materials produced be safe? Will the transgenic animals escape and breed with wild populations? These are reasonable concerns that extend well beyond the genetic or materials expertise possessed by the scientists in question.

The potential of such techniques is undeniably considerable. One can only hope that a combination of regulation and good judgment will avoid nightmare scenarios of the kind described in Oryx and Crake. So far, our genetically modified creatures tend to be inferior to their natural competitors. According to Alan Weisman, virtually all of our crops and livestock would be eliminated by predation and competition within a few years, in the absence of human care and protection. It remains to be seen whether the same will be true of plants and animals that currently exist only in the imaginations of geneticists.