Liquid lenses for low-cost eyeglasses

Joshua Silver – a retired Oxford professor – has developed a kind of eyeglasses that can be easily ‘tuned’ for a particular individual in the field. This is possible because the glasses contain sacs of liquid silicone and have syringes attached, allowing fluid to be added or removed. Changing the quantity of fluid adjusts the corrective power of the lenses, allowing them to address any degree of short- or long-sightedness.

10,000 pairs have already been distributed in Ghana, and there are plans to distribute 1,000,000 in India in 2009. Silver ultimately aims to produce enough glasses for 100 million people a year.

The science section at the Rideau Chapters

Icicles on green wood

The science section at the Rideau Centre Chapters always depresses me. It is often the most disorganized section of the store – tucked, as it is, in the very back corner. Books have frequently been relocated by customers and not re-shelved by staff, and the organizational system is deeply flawed even when properly implemented. For one thing, it has too many confusing sub-sections. It hardly makes sense to have a single shelf set aside for ‘physics’ books, when it is almost impossible to guess whether a specific tome will be in ‘physics,’ ‘mathematics,’ or the catch-all ‘science’ category. To top it all off, the catch-all category has been alphabetized in a bewildering serpent pattern, twisted back against itself and interrupted with random intrusions.

My two final gripes are that the science section is mysteriously co-mingled with the section on pet care (our most sophisticated form of understanding about the universe, lumped in with poodle grooming) and that it contains so many books of very dubious scientific merit, such as paranoid and groundless exposés on how MMR vaccines supposedly cause autism (they don’t; indeed, vaccines have saved countless infant lives).

While commercial pressures may legitimately dictate that the pilates section be more accessible, better organized, and more heavily trafficked than the physics or biology sections, the state of affairs is nonetheless saddening.

Torture, psychology, and the law

Morty wants a treat

For the darkest day of the year, a couple of torture-related items seem appropriate. First, there is this New York Times piece, which argues that senior officials from the Bush administration should be charged with war crimes for authorizing and enabling torture. The editorial argues that there is no chance prosecutions will be sought under an Obama administration, but that Obama ought to clarify the obligation of the United States and its agents to uphold the Geneva Conventions, as well as reverse executive orders that “eroded civil liberties and the rule of law.”

The prospect of high-level American decision-makers being put on trial for authorizing torture is so unlikely that it is a bit difficult to even form an opinion about it. At the same time, it is likely that nobody thirty years ago would have anticipated the trials at the International Criminal Tribunal for Rwanda (ICTR), International Criminal Tribunal for the former Yugoslavia (ICTY), or International Criminal Court (ICC). There is no clear reason why high political office should be an impediment to being tried for war crimes, but it is very unclear how any such prosecutions would fare in the United States. They would certainly be seen as ‘political’ acts, and any connections with international law would likely be targets for special criticism and scorn from some quarters.

The other story worth mentioning is an experiment conducted by Dr Jerry Burger, of Santa Clara University: a less intense re-creation of Milgram’s famous experiment on obedience to authority. Like Milgram, Burger found that a startling proportion of the population is willing to torture a fellow human being as part of a scientific experiment – and that is when the only pressure placed upon the subject is the authority of the actor pretending to conduct it. That naturally makes one nervous about what people would be willing to do when they felt an urgent and important issue justified it, or when far stronger sanctions could be brought against them if they did not proceed.

The nature and future of wind power

This Economist article discusses the history, technology, and future of wind power. It includes a fair bit of useful information, particularly about integrating wind into the broader energy system:

In addition, the power grid must become more flexible, though some progress has already been made. “Although wind is variable, it is also very predictable,” explains Andrew Garrad, the boss of Garrad Hassan, a consultancy in Bristol, England. Wind availability can now be forecast over a 24-hour period with a reasonable degree of accuracy, making it possible to schedule wind power, much like conventional power sources.

Still, unlike electricity from traditional sources, wind power is not always available on demand. As a result, grid operators must ensure that reserve sources are available in case the wind refuses to blow. But because wind-power generation and electricity demand both vary, the extra power reserves needed for a 20% share of wind are actually fairly small—and would equal only a few percent of the installed wind capacity, says Edgar DeMeo, co-chair of the 20% wind advisory group for America’s Department of Energy. These reserves could come from existing power stations, and perhaps some extra gas-fired plants, which can quickly ramp up or down as needed, he says. A 20% share of wind power is expected to raise costs for America’s power industry by 2%, or 50 cents per household per month, from now until 2030.

In 2007, 34% of the new electricity generation capacity that came online in the United States was in the form of wind turbines, and China has doubled its capacity every year since 2004. Wind already provides 20% of Danish electricity, along with 10% in Spain and 7% in Germany. Given aggressive construction plans in Asia, North America, and Europe, wind power definitely looks like a technology with a big future.

Energy usage and the US Department of Defence

This article on space solar power (collecting energy from sunlight using one or more satellites in geostationary orbit, then beaming it down to Earth using microwaves) contains some interesting information on American military logistics in Iraq:

The armed forces are America’s single greatest consumer of oil. The Department of Defence delivers 1.6m gallons (7.3m litres) of fuel a day—accounting for 70% by weight of all supplies delivered—to its forces in Iraq alone, at a delivered cost per gallon of $5-20. It also spends over $1 per kWh on electric power (ten times the domestic civilian price) in battle zones, because electricity must often be provided using generators that run on fossil fuels.

This helps explain why militaries have such a keen interest in new energy generation and efficiency technologies.

The information on space solar power is also quite interesting. It actually seems to be a bit less infeasible than I thought, though the launching costs remain a very significant barrier.

Evidence of a positive climate change feedback in the Arctic

Piano at Raw Sugar

A study being presented at the annual meeting of the American Geophysical Union claims to have found the first concrete evidence of ‘arctic amplification’ – the phenomenon in which the loss of sea ice exposes water that reflects less sunlight than the ice did, thus causing further warming:

Climate-change researchers have found that air temperatures in the region are higher than would be normally expected during the autumn because the increased melting of the summer Arctic sea ice is accumulating heat in the ocean. The phenomenon, known as Arctic amplification, was not expected to be seen for at least another 10 or 15 years and the findings will further raise concerns that the Arctic has already passed the climatic tipping-point towards ice-free summers, beyond which it may not recover.

As with many of the other things happening in the Arctic, the phenomenon is not unexpected but the timing is. Partly, that reflects the imperfect (or totally absent) integration of feedback effects into climatic models.
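The amplification mechanism described above is a classic positive feedback loop. As a toy numerical illustration (my own sketch, not a climate model, with entirely made-up numbers): if each unit of warming triggers melt that produces some further fraction of warming, the rounds of response sum as a geometric series.

```python
# Toy illustration of a positive feedback loop like ice-albedo
# amplification: direct warming melts some ice, the darker open water
# absorbs more sunlight, and that extra absorption causes further
# warming. All numbers are invented for illustration only.

def amplified_warming(direct_warming: float, feedback_factor: float,
                      rounds: int = 50) -> float:
    """Sum successive feedback responses: each round of warming
    produces feedback_factor times as much additional warming,
    i.e. the geometric series direct * (1 + f + f^2 + ...)."""
    total = 0.0
    increment = direct_warming
    for _ in range(rounds):
        total += increment
        increment *= feedback_factor  # each round adds a shrinking extra amount
    return total

# With a (hypothetical) feedback factor of 0.4, 1.0 degree of direct
# warming ends up as roughly 1.67 degrees in total.
print(round(amplified_warming(1.0, 0.4), 2))
```

The series converges only while the feedback factor is below one; the worry about ‘tipping points’ is precisely that some feedbacks may become strong enough to be self-sustaining.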

As with so much other Arctic news, one can only hope that this will be a reminder of the urgency of mitigating greenhouse gas emissions. It is possible that doing so is more urgent than addressing the ongoing financial crisis and, from a long-term perspective, it is certainly a lot more important.

Fatih Birol on peak oil

In an interview with British journalist George Monbiot, Fatih Birol, the chief economist of the International Energy Agency, made the following predictions about when peak oil output for non-OPEC and OPEC states would be reached:

“In terms of non-OPEC [countries outside the big oil producers’ cartel]”, he replied, “we are expecting that in three, four years’ time the production of conventional oil will come to a plateau, and start to decline. … In terms of the global picture, assuming that OPEC will invest in a timely manner, global conventional oil can still continue, but we still expect that it will come around 2020 to a plateau as well, which is of course not good news from a global oil supply point of view.”

Coming from a representative of this particular organization, that is quite a surprising statement. Traditionally, the IEA has downplayed any suggestion that global oil output could peak before 2030. A peak in 2020 suggests that we have a lot less time than most firms and governments have been expecting to transition to a post-oil, post-gasoline, post-jet fuel future.

An early peak in oil output could have an enormous effect on both the development of the global economy and climate change. What effect it will have depends on many factors: three crucial ones being the timing of the peak, the severity of the drop-off in output afterwards, and the investment decisions made by states and firms. If we want to continue to produce enough energy to run a global industrialized society, and we also want to avoid the worst effects of climate change, we need to ensure that renewables (and perhaps nuclear) are the energy sources of the future, and that efficient means of energy storage are developed for vehicles.

Level and rate targets for greenhouse gas mitigation

When greenhouse gas mitigation commitments are made, the standard form is to ‘reduce by a certain percentage below the level in a base year by a target year’ – for example, 5% below 1990 levels by 2020. This can be easily converted into a target in absolute emissions: say, cutting from 1,000 megatonnes (MT) in 1990 to 950 MT in 2020.
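The conversion is simple arithmetic; a minimal sketch, using the hypothetical 1,000 MT baseline and 5% cut from the text:

```python
# Convert a 'X% below base-year levels' commitment into an absolute
# emissions target. Figures are the hypothetical ones from the text.

def absolute_target(base_year_emissions_mt: float, percent_cut: float) -> float:
    """Return the absolute emissions level (in megatonnes) implied by
    a commitment to cut percent_cut% below base-year emissions."""
    return base_year_emissions_mt * (1 - percent_cut / 100)

print(absolute_target(1000, 5))  # prints 950.0
```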

I have criticized the process of target-setting before, arguing that the ability of organizations to set targets that look ambitious can obscure the absence of plans to actually achieve those reductions. In the end, it makes sense to focus our efforts on cutting emissions, rather than haggling over whether to cut by 65% or 70% by 2050.

Given that targets won’t be vanishing any time soon, I do have a proposal for improving one aspect of them. Rather than expressing targets as just an absolute level of emissions at a set date, they should be expressed as both an absolute level and a rate of reduction to be achieved by a target date. A financial equivalent would be to say: by 2010, I will have paid off 50% of my mortgage, and will be paying off more at a rate of $10,000 per year. What this avoids is the theoretical situation in which a state or other entity limps across the finish line, meeting a 2020 target with no new ideas or initiatives for reaching its 2050 target. This would be akin to a pharmaceutical company having all its blockbuster drugs go off-patent simultaneously, while having no promising new ones in the pipeline (not a hypothetical scenario for a significant number of drug companies right now).
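The dual-target idea can be sketched in a few lines (my own illustrative formulation, with invented numbers): a commitment is met only when emissions are at or below the target level and are still falling at least as fast as the required rate.

```python
# Sketch of a level-and-rate mitigation target check. A hypothetical
# entity meets its commitment only if emissions are at or below the
# target level AND are still declining at the required annual rate.

def meets_dual_target(emissions_now: float, emissions_prev_year: float,
                      target_level: float, required_annual_cut: float) -> bool:
    level_ok = emissions_now <= target_level
    rate_ok = (emissions_prev_year - emissions_now) >= required_annual_cut
    return level_ok and rate_ok

# An entity that 'limps across the finish line': it hits a 950 MT level
# target but has nearly stopped cutting, so it fails the rate component.
print(meets_dual_target(950, 951, 950, 10))  # False
# One still cutting 15 MT/year as it arrives passes both components.
print(meets_dual_target(950, 965, 950, 10))  # True
```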

Having a double rather than a single target doesn’t eliminate the gap between commitments and achievements, but it may help foster the kind of mindset required to build a low-carbon society.

Obama and manned spaceflight

Apparently, Barack Obama is thinking of curtailing NASA’s future manned spaceflight activities. Specifically, there has been talk of canceling the Ares I rocket and scaling back the Orion Crew Exploration Vehicle. If true, the news is welcome. There is very little evidence that ongoing manned programs – including the Space Shuttle and International Space Station – are generating useful science or providing other benefits. There is even greater doubt about the usefulness of returning to the moon.

Space exploration is an activity best undertaken by robots. They are cheaper to send up than humans and more capable. Given the very limited value provided by sending live people into space, it is something the United States should discontinue. At the very least, it is something that should be sharply scaled back while the government works to address America’s severe debts and other problems.

Ranking energy technologies, from wind turbines to corn ethanol

Mark Z. Jacobson, a professor of civil and environmental engineering at Stanford, headed up a study to quantitatively evaluate different electricity generation options, taking into consideration their impacts on climate, health, energy security, water supply, land use, wildlife, and more:

The raw energy sources that Jacobson found to be the most promising are, in order, wind, concentrated solar (the use of mirrors to heat a fluid), geothermal, tidal, solar photovoltaics (rooftop solar panels), wave and hydroelectric. He recommends against nuclear, coal with carbon capture and sequestration, corn ethanol and cellulosic ethanol, which is made of prairie grass. In fact, he found cellulosic ethanol was worse than corn ethanol because it results in more air pollution, requires more land to produce and causes more damage to wildlife.

It is naturally very difficult to assess the validity of any particular research methodology, given uncertainties about matters like the future development of technologies, the evolution of the global economy, the availability of fossil fuels, and so on. Nonetheless, it is good to see serious work being done on comparing the overall appropriateness of different energy technologies. Given the unwillingness of many states to impose serious carbon pricing, and the tendency of governments to ‘pick winners’ when it comes to technologies being subsidized, the more high-quality data available, the better.

While I haven’t looked over the study in detail, it does seem like the strongest objections raised against nuclear (which is ranked very badly) aren’t really about the environment or economics. The risk Jacobson highlights most is that of nuclear proliferation, and the dangers associated with making fissile material more widely available. Proponents of a nuclear renaissance probably won’t be keen to see discussion of “the emissions from the burning of cities resulting from nuclear weapons explosions potentially resulting from nuclear energy expansion.”

The entire study was published in Energy & Environmental Science, and can be accessed online.