The November 2nd Economist included an article with some interesting claims about lies, politics, and identifying deceit:
But even in daily life, without the particular pressures of politics, people find it hard to spot liars. Tim Levine of the University of Alabama at Birmingham has spent decades running tests that allow participants (apparently unobserved) to cheat. He then asks them on camera if they have played fair. He asks others to look at the recordings and decide who is being forthright about cheating and who is covering it up. In 300 such tests people got it wrong about half of the time, no better than a random coin toss. Few people can detect a liar. Even those whose job is to conduct interviews to dig out hidden truths, such as police officers or intelligence agents, are no better than ordinary folk.
Evolution may explain credulity. In a forthcoming book, “Duped”, Mr Levine argues that evolutionary pressures have adapted people to assume that others are telling the truth. Most communication by most people is truthful most of the time, so a presumption of honesty is usually justified and is necessary to keep communication efficient. If you checked everything you were told from first principles, it would become impossible to talk. Humans are hard-wired to assume that what they hear is true—and therefore, says Mr Levine, “hard-wired to be duped”.
…
In politics, however, these explanations cannot be the whole story. At the heart of the lying-politician paradox is an uncomfortable fact: voters appear to support liars more than they believe them. Mr Trump’s approval rating is 11 points higher than the share of people who trust him to tell the truth. A third of British voters view Mr Johnson favourably but only a fifth think he is honest. Voters believe in their leaders even if they do not believe them. Why?
The answer starts with the primacy of intuitive decision-making. In 2004 Drew Westen of Emory University in Atlanta put partisan Republicans and Democrats into a magnetic-resonance-imaging scanner and found that lying or hypocrisy by the other side lit up areas of the brain associated with rewards; lies by their own side lit up areas associated with dislike and negative emotions. At no point did the parts of the brain associated with reason show any response at all. If voters’ judgments are rooted in emotion and intuition, facts and evidence are likely to be secondary.
…
A new version of confirmation bias is “identity-protective cognition”, argues Dan Kahan of Yale Law School. This says that people process information in a way that protects their self-image and the image they think others have of them. For example, those who live surrounded by climate-change sceptics may avoid saying anything that suggests humankind is altering the climate, simply to avoid becoming an outcast. A climate sceptic encircled by members of Extinction Rebellion might do the same thing in reverse. As people become more partisan, more issues are being taken as markers of the kind of person you are: in Britain, the country’s membership of the European Union; in America, guns, trade, even American football. All give rise to the acceptance of bias.
…
Thomas Gilovich of Cornell shows how fake news, cognitive bias and the assumption that people are telling the truth interact to make it easier to believe lies. If you want to believe something (say, a claim that supports your preconceived ideas), he argues, you ask yourself: “Can I believe it?” A single study or comment online is usually enough to give you permission to hold that belief, even if it is bogus. But if you do not want to believe something (because it contradicts your settled opinions), you are more likely to ask: “Must I believe it?” Then a single apparently reputable statement on the other side is enough to let you dismiss it. That may be why so many climate sceptics manage to cling to their beliefs in the teeth of overwhelming evidence to the contrary. Activists point out that 99% of scientists believe the Earth is warming because of human actions. But people who doubt the reality of climate change listen to the other 1%.
There does seem to be good reason to believe that people often have powerful psychological impulses to protect their existing worldview, rather than to accept the most accurate available information or the most plausible explanation of events.
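As a rough sanity check on the Levine result quoted above, here is a minimal Python sketch of the binomial arithmetic behind “no better than a random coin toss”. The trial count of 300 comes from the quote; the split of exactly 150 correct judgments is an illustrative assumption, not a figure from the study.

```python
# Back-of-the-envelope check: is "about half right" over 300 trials
# distinguishable from coin flipping?
from math import sqrt

n = 300        # number of judgments, per the quoted Levine result
correct = 150  # illustrative assumption for "about half of the time"

p_hat = correct / n
se = sqrt(0.5 * 0.5 / n)  # standard error of a proportion if p = 0.5
z = (p_hat - 0.5) / se    # z-score against the coin-toss hypothesis

print(f"observed accuracy: {p_hat:.2f}")
print(f"standard error under chance: {se:.3f}")  # about 0.029
print(f"z-score vs. coin toss: {z:.2f}")

# Accuracy would need to exceed roughly 0.5 + 1.96 * 0.029, i.e. about
# 0.557, before it differed from chance at the conventional 5% level.
```

The point of the arithmetic is simply that “about half” over a few hundred trials is genuinely consistent with chance: even an accuracy in the low fifties would not clear the usual significance threshold.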
Related: