Hale on why climate stability advocates are often confounded

The combination of uncertainty and low salience, in turn, enables obstructionism, the ability of interests tied to the status quo to maintain their interests. Consider the hurdles a policy entrepreneur would have to overcome to create and implement a policy addressing a problem with distant effects like climate change. First, that policy entrepreneur would have to herself see value in pursuing an obscure issue, one that is unlikely to garner her a quick win and the associated political benefits. Few will have incentives to pursue such causes. Second, she would have to mobilize a sufficient coalition of interests to be able to influence policy. This would require each of those interests choosing to focus on a distant topic over their more urgent priorities. Third, this interest coalition would need to force the issue onto the broader political agenda, competing for limited space with numerous immediate priorities. Fourth, the coalition would need to somehow overcome, compensate, or neutralize political opponents.

To the extent those opponents are worried about the short-term costs of action, everything that is hard for the long-sighted policy entrepreneur will be easy for them. Opposing long-sighted policy—that is, promoting short-term outcomes—will give them the opportunity for quick wins on issues that are relatively easy to mobilize interests around. And even if the long-term-oriented policy entrepreneur wins a battle, she must preserve and maintain those gains permanently, as opponents will seek to reverse any defeats they face. A one-off victory may be important, but long problems often require sustained policies over time, while it only takes one victory by opponents to block them. The longer a problem’s effects reach into the future, the more friction the policy entrepreneur will face at every stage, and, should she get a win, the more enduring her victories will need to be.

Hale, Thomas. Long Problems: Climate Change and the Challenge of Governing Across Time. Princeton University Press, 2024.

Related:

Shrugging our way through the breakdown of a stable world

Lately, in observing our politics and dealing with our society, I feel like a time traveller who has been sent back to before the forthcoming collapse. There is no success to be had in warning people, though. They sense and feel that the collapse is coming, yet they are unwilling to make the changes that might avoid it. It’s not that people don’t believe the warning; they do. Apocalypse has become the leitmotif of our culture. People are just too corrupted by self-interest and too pessimistic about the ability of our society to solve problems to believe that anything can be done.

Kahneman on risks from excess confidence and optimism

Organizations that take the word of overconfident experts can expect costly consequences. The study of CFOs showed that those who were most confident and optimistic about the S&P index were also overconfident and optimistic about the prospects of their own firm, which went on to take more risk than others. As Nassim Taleb has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. One of the lessons of the financial crisis that led to the Great Recession is that there are periods in which competition, among experts and among organizations, creates powerful forces that favor a collective blindness to risk and uncertainty.

The social and economic pressures that favor overconfidence are not restricted to financial forecasting. Other professionals must deal with the fact that an expert worthy of the name is expected to display high confidence. Philip Tetlock observed that the most overconfident experts were the most likely to be invited to strut their stuff in news shows. Overconfidence also appears to be endemic in medicine. A study of patients who died in the ICU compared autopsy results with the diagnosis that physicians had provided when the patients were still alive. Physicians also reported their confidence. The result: “clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.” Here again, expert overconfidence is encouraged by their clients: “Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.” Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.

Kahneman, Daniel. Thinking, Fast and Slow. Random House Canada, 2011. pp. 262–3

The nuclear razor’s edge

I listened to the audiobook of Annie Jacobsen’s Nuclear War: A Scenario. Having followed the subject and read a lot about it over the years, I nonetheless found a lot of new information inside a compellingly presented, plausible, and chilling story.

Our whole world can end in a couple of hours; live life accordingly.

Rate matching in personal communication

One element of human interaction which I have always found perplexing and frustrating is when people lie with the expectation that you will understand that they are lying and also what they are really trying to say. For example, there is a kind of upper class reflex to say something like: “You will have to come visit the house sometime!” when they mean: “We will never see each other again, but please keep treating me like an aristocrat”.

One place where people frequently try this “I’ll be dishonest but assume they’ll understand what I mean” trick is with regard to volume of communication. For whatever reason, people are often dishonest about hearing from you too little or too much, and will even lie about it when directly asked, even if the volume of communication is really annoying them.

What I have learned to do in this arena is to ignore what people explicitly say and focus on rate matching. If someone responds to me promptly, I speed up the pace of my messages to keep the average time between my messages similar to the average time between theirs. Similarly, if someone is slow to respond to me (or never responds), I regulate down the frequency of my communication to be more closely matched. For example, if I send a text and get an immediate response, it’s OK to write back immediately. If it takes an hour, or five hours, or a day, or three days to get a response — it’s best to copy the length of the delay when responding.
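The rule above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the function name and example timestamps are my own, not from the original text): if the other person took some delay to reply, wait roughly the same delay before your next message.

```python
from datetime import datetime

def suggested_send_time(my_last_sent: datetime, their_reply: datetime) -> datetime:
    """Rate matching: mirror the other person's response delay.

    If they took `delay` to reply to my last message, wait roughly
    the same `delay` after their reply before sending my next one.
    """
    delay = their_reply - my_last_sent
    return their_reply + delay

# Example: I texted at 09:00 and got a reply at 14:00 (a 5-hour delay),
# so a matched response would go out around 19:00.
sent = datetime(2024, 3, 1, 9, 0)
reply = datetime(2024, 3, 1, 14, 0)
print(suggested_send_time(sent, reply))  # 2024-03-01 19:00:00
```

A fancier version might average the last few delays rather than copying only the most recent one, which is closer to the "average time between messages" framing above.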

The system doesn’t cover everything. One notable consideration is the division of labour. Perhaps because I am a lot keener than most to stay in touch with most people, I tend to be the person to establish and maintain communication. Once that behaviour has become a norm in the relationship, it can produce a dynamic where they rarely or never initiate contact because they now expect me to do it. Still, perhaps even here it would be sensible to rate match; if someone never ever reaches out to you, it’s probably an unspoken sign that they prefer to do other things with their time.

Whose agenda are you devoted to?

I have never seen George Monbiot’s career advice bettered, though following it will not lead to an easy life. For instance:

What the corporate or institutional world wants you to do is the opposite of what you want to do. It wants a reliable tool, someone who can think, but not for herself: who can think instead for the institution. You can do what you believe only if that belief happens to coincide with the aims of the corporation, not just once, but consistently, across the years.

Also:

How many times have I heard students about to start work for a corporation claim that they will spend just two or three years earning the money they need, then leave and pursue the career of their choice? How many times have I caught up with those people several years later, to discover that they have acquired a lifestyle, a car and a mortgage to match their salary, and that their initial ideals have faded to the haziest of memories, which they now dismiss as a post-adolescent fantasy? How many times have I watched free people give up their freedom?

What he cheers for and takes satisfaction from is inspiring too:

Most countries have a number of small alternative papers and broadcasters, run voluntarily by people making their living by other means: part time jobs, grants or social security. These are, on the whole, people of tremendous courage and determination, who have placed their beliefs ahead of their comforts. To work with them can be a privilege and inspiration, for the simple reason that they – and, by implication, you – are free while others are not. All the money, all the prestige in the world will never make up for the loss of your freedom.

Autonomy, not authority, is the only way to escape the many traps of the status quo.

The affect heuristic

The dominance of conclusions over arguments is most pronounced where emotions are involved. The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world. Your political preference determines the arguments that you find compelling. If you like the current health policy, you believe its benefits are substantial and its costs more manageable than the costs of alternatives. If you are a hawk in your attitude toward other nations, you probably think they are relatively weak and likely to submit to your country’s will. If you are a dove, you probably think they are strong and will not be easily coerced. Your emotional attitude to such things as irradiated food, red meat, nuclear power, tattoos, or motorcycles drives your beliefs about their benefits and their risks. If you dislike any of these things, you probably believe its risks are high and its benefits negligible.

Kahneman, Daniel. Thinking, Fast and Slow. Random House Canada, 2011. p. 103

What if we never respond adaptively to climate change?

A central assumption of many climate change activists and advocates for climate stability is that once people experience how destructive and painful climate change will be, they will become more willing to take actions to limit its severity – chiefly by forgoing fossil fuel production and use.

The Economist reports on how this assumption may not be justified, in discussing the threat of sea level rise to The Netherlands:

The longer-term issue, of course, is climate change. The North Sea has risen about 19cm since 1900, and the rate has increased from about 1.7mm per year to about 2.7mm since the 1990s. This makes it ever harder for riverwater to flow into the sea. With a quarter of their country lying below sea level, one might think that Dutch voters would be exceptionally worried by global warming and choose parties that strive to end carbon emissions. Yet in a general election last November they gave first place to a hard-right candidate, Geert Wilders, who wants to put global climate accords “through the shredder”. Mr Wilders’s party got 23.5% of the vote; a combined Green-Labour list got just 16%.

All across Europe this winter, as the effects of climate change grow starker, the parties that want to do something about it are getting hammered. In Germany, where the floodwaters hit first, the Green party’s popularity has plunged. Portugal’s Algarve is parched by drought, but with elections due on March 10th polls show the green-friendly left running well behind the centre- and far right. Southern Spain has declared a drought emergency, yet the pro-green Socialist-led government is teetering. Snowless ski resorts in Italy have done nothing for the fortunes of environmentalist parties; Italy’s Green party is polling at around 4%. In winter the Swiss Alps appear on heat-anomaly maps of Europe as a streak of red, 3°C above historical averages. But the hard-right Swiss People’s Party (SVP), the biggest in parliament, won even more seats in an election last autumn, while the Greens shrank.

I feel like the norm in human civilizations is that we are incredibly badly governed. People are easy to fool and deeply divided into tribes, and that provides ample opportunities for political leaders to claim credit and avoid blame.

If politics as usual is the self-serving and incompetent ruling in their own interests while putting together enough of a story to sustain public support, politics when the world is coming to an end promises to be even more dysfunctional and incapable of resolving problems.

Pfeffer on the limitations of intelligence as a path to power

Furthermore, intelligence, particularly beyond a certain level, may lead to behaviors that make acquiring or holding on to influence less likely. People who are exceptionally smart think they can do everything on their own and do it better than everyone else. Consequently, they may fail to bring others along with them, leaving their potential allies in the dark about their plans and thinking. Being recognized as exceptionally smart can cause overconfidence and even arrogance, which, as we will see in more detail later, can lead to the loss of power. And smart people may think that because of their great intelligence they can afford to be less sensitive to others’ needs and feelings. Many of the people who seem to me to have the most difficulty putting themselves in the other’s place are people who are so smart they can’t understand why others don’t get it. Lastly, intelligence can be intimidating. Although intimidation can work for a while, it is not a strategy that brings much enduring loyalty.

Pfeffer, Jeffrey. Power: Why Some People Have It — and Others Don’t. HarperCollins, 2010. p. 56

Taleb on the domain dependence of knowledge

I used to attend a health club in the middle of the day and chat with an interesting Eastern European fellow with two Ph.D. degrees, one in physics (statistical no less), the other in finance. He worked for a trading house and was obsessed with the anecdotal aspects of the markets. He once asked me doggedly what I thought the stock market would do that day. Clearly I gave him a social answer of the kind “I don’t know, perhaps lower” – quite possibly the opposite answer to what I would have given him had he asked me an hour earlier. The next day he showed great alarm upon seeing me. He went on and on discussing my credibility and wondering how I could be so wrong in my “predictions,” since the market went up subsequently. Now, if I went to the phone and called him and disguised my voice and said, “Hello, this is Doktorr Talebski from the Academy of Lodz and I have an interrresting prrroblem,” then presented the issue as a statistical puzzle, he would laugh at me. “Doktorr Talevski, did you get your degree in a fortune cookie?” Why is it so?

Clearly there are two problems. First, the quant did not use his statistical brain when making the inference, but a different one. Second, he made the mistake of overstating the importance of small samples (in this case just one single observation, the worst possible inferential mistake a person can make). Mathematicians tend to make egregious mathematical mistakes outside of their theoretical habitat. When Tversky and Kahneman sampled mathematical psychologists, some of whom were authors of statistical textbooks, they were puzzled by their errors. “Respondents put too much confidence in the result of small samples and their statistical judgments showed little sensitivity to sample size.” The puzzling aspect is that not only should they have known better, “they did know better.” And yet…

Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. Random House, 2007. pp. 194–5 (italics in original)