“The process of relaying intelligence can distort its meaning. Content can be altered unconsciously in transmission. Garbled data are made to appear more coherent when relayed in conversation, allowing actual disjunctions between facts to be replaced by false connections; lengthy information can be made shorter; details are suppressed subconsciously if they are not consistent with the rest of the relayer’s message; and transmission errors tend to make the message sound like what the person transmitting it had been expecting to hear. Subordinates also tend to bias messages so as to minimize distress to their superiors; transmitting individuals tend toward ‘closure’ of incomplete images and ‘confabulating detail where gaps are conspicuous’; long periods of time are reported as shorter; and short ones as longer. Early on the morning the Yom Kippur War began, a trusted source warned Israel that the Arabs would attack that day. Somewhere in the communication chain the time of six o’clock was added erroneously to the warning. The Arabs struck over four hours sooner.”
Betts, R. K. “Surprise despite warning: Why sudden attacks succeed” in Andrew, Christopher, et al., eds. Secret Intelligence: A Reader. London: Routledge, 2009. p. 94 (paperback)
[Update: 13 May 2013] More on the same topic:
“When a consumer is faced with data he prefers not to believe, he can fall back on four psychological mechanisms.
First, he can be more attentive to reassuring data. The threshold at which evidence confirming the individual’s assumptions is recognized comes well before the threshold for contradictory evidence. Information that challenges reigning expectations or wishes ‘is often required, in effect, to meet higher standards of evidence and to pass stricter tests to gain acceptance than new information that supports existing expectations and hypotheses.’ The consumer can also challenge the credibility of the source. An analyst or agency that has been chronically wrong in the past can be dismissed. Some political leaders also tend to be skeptical of advice from military sources and suspicious that professional soldiers manipulate information in order to gain authorization for desired changes in posture. A consumer’s belief that the person giving him information has an ideological axe to grind, or a vested interest in changing policy, will tend to discredit the information. Third, the decision maker can appreciate the warnings, but suspend judgment and order stepped-up intelligence collection to confirm the threat, hoping to find contradictory evidence that reduces the probability the enemy will strike. Finally, the consumer can rationalize. He may focus on the remaining ambiguity of the evidence rather than on the balance between threatening and reassuring data, letting his wish become father to his thought. He can explain away mounting but inconclusive threats by considering other elements of the context, or believing that enemy mobilization is precautionary and defensive. In many cases such reasoning is quite correct. The likelihood a responsible policymaker will let himself think this way varies directly with the severity of the specific costs involved in response to the warning and with the availability of reassuring evidence. There are always some data to dampen alarm. Such data can also be fabricated.” p. 99 (paperback), emphasis in original
A lot of this seems quite applicable to policy-makers deciding whether or not to take serious action in response to climate change.
Related: How useful are spies?