Provincial exams are very useful

Shoes and map

Speaking with people involved in the Ontario school system, I was surprised to learn that they do not have provincial exams for courses in the last years of high school. To me, this seems like a mistake. Provincial exams provide vital information to university admissions offices: specifically, they let you gauge how much grade inflation is happening at any particular school. A school where the mean class mark is 90%, but where the mean provincial mark is 60%, is clearly inflating grades. Conversely, a school where the mean class mark is 60% but where students average 90% on the provincial exam clearly has high standards.
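As a sketch, the signal an admissions office could extract is just the gap between the two means. The school names and marks below are invented for illustration:

```python
# Toy illustration: the gap between class marks and provincial exam marks
# as a rough grade-inflation index. All figures are invented.

schools = {
    "School A": {"class_mean": 90, "provincial_mean": 60},
    "School B": {"class_mean": 60, "provincial_mean": 90},
}

for name, marks in schools.items():
    gap = marks["class_mean"] - marks["provincial_mean"]
    if gap > 0:
        verdict = f"inflating grades by roughly {gap} points"
    else:
        verdict = f"grading about {-gap} points harder than the provincial exam"
    print(f"{name}: {verdict}")
```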

Given that individual schools and teachers have a strong incentive to inflate the grades of their students, provincial (or federal) exams seem to be a key mechanism for keeping them honest. Otherwise, there is simply far too much opportunity to call mediocrity excellence, without anyone else being any the wiser.

I very much hope B.C. retains the provincial exam system, and that it becomes universal across Canada.

DIY waste heat capture

We have discussed the issue of waste heat before, in the context of both incandescent lightbulbs and the cogeneration of heat and power. For those interested in a more hands-on treatment of the subject, there are instructions for building a thermoelectric unit which allows you to charge electronics using waste heat from appliances. The same page also shows how to make a LEGO car powered by electricity produced using the heat from a small tea candle.

Using such a system while heating your house doesn’t make much sense, since the ‘waste’ heat is already doing useful work, but similar devices may have some practical value inside buildings that are being cooled, or outdoors. Of course, the cost and complexity of the thermoelectric unit also demonstrates why so much waste heat goes uncaptured: it is usually cheaper to use more electricity or fuel than to improve system efficiency.
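For a sense of scale, here is a rough matched-load power estimate for a small thermoelectric generator. The Seebeck coefficient and internal resistance below are assumed, plausible values for a bismuth-telluride module, not figures from the linked instructions:

```python
# Rough maximum-power estimate for a thermoelectric generator (TEG).
# An open-circuit voltage V = S * dT develops across the module; with a
# matched resistive load, delivered power peaks at P = V**2 / (4 * R).
# All parameter values below are illustrative assumptions.

S = 0.05          # effective Seebeck coefficient of the module, V/K
R_internal = 2.0  # internal resistance of the module, ohms
dT = 60.0         # temperature difference across the module, K

V_open = S * dT                          # open-circuit voltage, volts
P_max = V_open ** 2 / (4 * R_internal)   # matched-load power, watts

print(f"Open-circuit voltage: {V_open:.1f} V")
print(f"Max power into a matched load: {P_max:.2f} W")
```

About a watt: enough to trickle-charge small electronics, and a hint of why nobody bothers capturing most waste heat streams.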

Counting greenhouse gas emissions

Wood frame in a garden

Greenhouse gas emissions figures, as dealt with in the realm of public policy, are often a step or two removed from reality.

For instance, governments wanting to flatter the results of their mitigation efforts often express emission reductions relative to a ‘business-as-usual’ scenario. That means, instead of saying that emissions are X% up from last year, you say that they are Y% down from where they would have been in the absence of government action. Since the latter number rests on two hypotheticals (what emissions would have been, and what reductions arose from policy), it is harder to criticize and, arguably, less meaningful.
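A bit of arithmetic shows how the two framings can point in opposite directions. All figures below are invented:

```python
# How a cut "relative to business-as-usual" can coexist with an absolute
# increase in emissions. All numbers are invented for illustration.

last_year = 100.0        # emissions last year, Mt CO2e
bau_projection = 115.0   # hypothetical emissions this year with no policy
actual = 105.0           # actual emissions this year, Mt CO2e

vs_last_year = (actual - last_year) / last_year * 100
vs_bau = (bau_projection - actual) / bau_projection * 100

print(f"Emissions are up {vs_last_year:.1f}% from last year,")
print(f"yet 'down' {vs_bau:.1f}% relative to the BAU projection.")
```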

Of course, the climate system doesn’t care about business-as-usual (BAU) projections. It simply responds to changes in the composition of the atmosphere, as well as the feedback effects those changes induce.

The second major disconnect is between governments’ relentless focus on emissions directly produced by humans, as opposed to all the emissions that affect the climate. For example, drying out rainforests makes them less biologically productive, leading to more greenhouse gases in the atmosphere. Similarly, when permafrost thaws, it releases methane, which is a powerful greenhouse gas. It is understandable why governments don’t generally think about these secondary emissions, largely because of the international political difficulties that would arise if they did. Can Canada miss its greenhouse gas mitigation targets because of thawing permafrost? Who is responsible for that thawing: Canada, or everyone who has ever emitted greenhouse gases? Or only those who have emitted them since we learned they are dangerous?

While the politics of the situation drive us to focus on emissions caused by voluntary human activities (including deforestation), we need to remain aware that the thermodynamic balance of the planet cares only about physics and chemistry – not borders and intentionality. When it comes to “avoiding dangerous anthropogenic interference in the climate system,” we need to focus both on our absolute level of emissions (not their relation to a BAU estimate) and on the secondary effects our emissions produce. Doing otherwise risks setting our emission reduction targets too low, and thus allowing climate change damage at an intolerable level.

Collecting bike statistics

Given that I am the kind of person who can be motivated by numbers, I decided to pick up a bike computer today – the simplest waterproof model available at MEC. After installing it, I wanted to make sure I had selected the correct wheel size (I think it’s 2174mm for the 700x32c wheels of my Trek 7.3FX). A few kilometres of cycling let me check both its speed and distance readings against my GPS receiver (a marine unit too big and cumbersome for cycling).
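The arithmetic the computer does is simple enough to sketch. Only the 2174mm circumference comes from my setup; the revolution figures are made up:

```python
# How a bike computer turns wheel revolutions (magnet passes at the sensor)
# into distance and speed. 2174 mm is the circumference I entered for my
# 700x32c wheels; the example inputs are invented.

circumference_m = 2.174   # wheel circumference in metres

def distance_km(revolutions):
    """Total distance from a cumulative revolution count."""
    return revolutions * circumference_m / 1000

def speed_kmh(revs_per_second):
    """Instantaneous speed from the magnet's pass rate."""
    return revs_per_second * circumference_m * 3.6

# e.g. the magnet passing the sensor three times per second:
print(f"{speed_kmh(3):.1f} km/h after {distance_km(1000):.2f} km")
```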

Unfortunately, it also confirmed that the little rare earth magnet that the sensor detects shifts around quite easily on my spoke, and it needs to be very carefully aligned to work. First, I tried gaffer tape, but it really wasn’t right for the job. Then, I tried the blogosphere, which suggested superglue. Now that it is glued in place, I hope that magnet isn’t going anywhere while I rack up the kilometres over the coming months.

For those keeping track, the trip out to get the computer, return home with it, and calibrate it amounted to 17.8km.

Fighting malaria with fungus

Strawberry cheesecake

All the chemicals that human beings use to kill living things (weeds, insects, bacteria) are subject to the same basic problem of resistance. A chemical that fails to kill a few individuals leaves them with a huge opportunity to reproduce without competition. As such, all pesticides, herbicides, and antibiotics are likely to become less effective with time. Andrew Read, a Professor of Biology and Entomology at Penn State, is working on an approach to controlling malaria that circumvents this difficulty.

The mosquitoes that spread malaria are not born with the disease. Rather, they must bite someone who is infected. It then takes 10-14 days for the malaria parasites to develop inside a female mosquito, after which they reach her salivary glands and the insect becomes infectious. Read’s idea is to deploy a fungus that becomes lethal to mosquitoes after 10-12 days. As such, the fungus would kill only those mosquitoes old enough to be capable of infecting people with malaria. The brilliant aspect of this is that the females will already have reproduced before being killed, which makes it far harder for fungus-resistant genes to emerge and proliferate within the gene pool. That could make it an especially valuable tool in the fight against malaria – an illness that kills about one million people a year.
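A toy model makes the evolutionary logic concrete. Assume a single resistance gene and an agent that kills 90% of susceptible mosquitoes; every parameter here is invented for illustration:

```python
# Toy model of why a late-acting fungus should breed resistance far more
# slowly than a conventional insecticide. We track the frequency p of a
# resistance gene. A conventional spray kills susceptible mosquitoes BEFORE
# they reproduce; the late-acting fungus kills them AFTER. Parameters are
# invented for illustration.

def next_gen_conventional(p, kill_rate):
    """Susceptibles die before reproducing, so resistance is enriched."""
    survivors_resistant = p
    survivors_susceptible = (1 - p) * (1 - kill_rate)
    return survivors_resistant / (survivors_resistant + survivors_susceptible)

def next_gen_late_acting(p, kill_rate):
    """Everyone reproduces before dying: the gene pool is unchanged,
    regardless of how lethal the fungus is."""
    return p

p = 0.01  # initial resistance gene frequency
for _ in range(20):
    p = next_gen_conventional(p, kill_rate=0.9)
print(f"Conventional spray, 20 generations later: {p:.3f}")

p = 0.01
for _ in range(20):
    p = next_gen_late_acting(p, kill_rate=0.9)
print(f"Late-acting fungus, 20 generations later: {p:.3f}")
```

In this caricature the conventional spray drives the resistance gene to near fixation within a handful of generations, while the late-acting agent exerts essentially no selection pressure at all.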

The idea is similar in some ways to the insect-killing fungi described in Paul Stamets’ book, though his colony-exterminating approach seems likely to eventually breed resistance in a way that killing only older female mosquitoes hopefully would not.

The ROM and evolution

Kensington Avenue sign

Wandering through Toronto’s Royal Ontario Museum (ROM) is an enjoyable way both to experience the diversity of life and to appreciate the degree to which its history has become comprehensible to human beings. From the grand displays of ancient bones to the more abstract explanations of taxonomy and evolutionary history, the place is a monument to the scientific understanding of the world. Given the power of that discourse – derived from the exceedingly strong evidence provided by physical remains, genetics, and the study of living creatures – it is all the more astonishing that anybody out there believes that the Earth is 6,000 years old, that all the creatures on it were created simultaneously, and that evolution is not a powerful ongoing process that explains our biological origins.

Over and above matters of scientific understanding, the story told by the ROM is also enormously more compelling than the story of creation by an omnipotent god. The latter may have fireworks, but the former has a lot more power and beauty. It makes the creation story look like a bad Hollywood film that happens to star someone famous: the Waterworld of theories.

Space-based solar power

Dark bird on a fence

The Pacific Gas and Electric Company is seeking regulatory approval for a space-based solar power system. The plan is for a 200 megawatt (MW) facility that will generate electricity from sunlight in orbit and beam it to a ground receiving station using radio waves. Older gamers may recall this technology as the basis of the ‘microwave’ power plants in SimCity 2000. Unfortunately, while the SimCity plants cost just $30,000 and produced 14,000 MW of power, the 200 MW PG&E facility is expected to cost several billion dollars – far more than ground-based facilities with comparable output. The one real perk of space-based systems in geosynchronous orbit is that they are exposed to the sun at nearly all times, largely eliminating the need for storage or load balancing. Some have even speculated that the technology might eventually be able to direct beams of energy straight to facilities (perhaps even vehicles) that require it, reducing the need for transmission and energy storage infrastructure.
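To see why the price draws criticism, compare capital cost per watt of capacity. The $2 billion figure and the ground-based cost below are my own assumptions for illustration, not PG&E’s numbers:

```python
# Back-of-the-envelope capital cost per watt of capacity. The space plant
# cost and the ground-based comparison figure are assumptions, standing in
# for "several billion dollars" and contemporary ground-based solar costs.

space_cost_usd = 2e9       # assumed total project cost, USD
space_capacity_w = 200e6   # 200 MW

ground_cost_per_w = 4.0    # assumed USD per watt for ground-based solar

space_cost_per_w = space_cost_usd / space_capacity_w
print(f"Space-based: ${space_cost_per_w:.0f}/W "
      f"vs ground-based: ~${ground_cost_per_w:.0f}/W")
```

Round-the-clock sunlight narrows the gap in delivered energy somewhat, but not by enough to close a cost difference of this size.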

I am not sure how to feel about such initiatives. On the one hand, it is possible that space-based solar power will eventually be a commercially and ecologically viable source of energy. On the other, it may be a distraction from the urgent changes that need to occur in the near-term. There are also issues with the emissions associated with space launches, as well as the limited number of slots for satellites in geosynchronous orbit and ‘optical aperture’ issues. For now, it really doesn’t seem like a viable technology. That being said, if a private group can convince regulators that it is safe and environmentally effective, and investors that it is viable, I don’t see any reason to interfere with the attempt.

GMOs not providing yield or climate change benefits

White tree in archway

The Union of Concerned Scientists has a new report (PDF) out, arguing that the genetic modification of crops has so far failed to increase yields or improve resilience to climate change. The study covers the past fifteen years, during which GM crops have been widely deployed commercially in the United States and elsewhere. It focuses on corn and soybeans, since they are the most commonly grown GM crops: 90% of American corn is GM, as are 64% of soybeans.

The report also highlights how GM crops are heavy users of nitrogen-rich synthetic fertilizers, whose application generates nitrous oxide – a powerful greenhouse gas – in the soil. Producing that fertilizer also requires energy, and generally uses natural gas as a feedstock.

The report concludes that GM is receiving too much investment, relative to conventional breeding techniques and approaches that minimize the use of external inputs. I have argued in the past that genetic modification could be one tool for helping to adapt to a changed climate, and I think that is still true. What this study shows is the importance of rigorous evaluation, as well as somewhat tempered enthusiasm about the ability of new technologies to yield strong, rapid changes in outcomes.

Book club, month two

The first month of the non-fiction book club is coming to an end, and I will be posting my review of Easterly’s The White Man’s Burden on Wednesday. As such, it is time to start choosing a second book. I have the following nominees:

1) Speth, Gus. The Bridge at the Edge of the World: Capitalism, the Environment, and Crossing from Crisis to Sustainability.

2) Jaccard, Mark. Sustainable Fossil Fuels: The Unusual Suspect in the Quest for Clean and Enduring Energy.

3) Cherry-Garrard, Apsley. The Worst Journey in the World. (About a failed Antarctic expedition)

What else would people consider reading?