Some conspiracies are, to a certain extent, fun. Take, for instance, the absurd claims that Justin Trudeau, Canada’s prime minister, is an illegitimate son of Fidel Castro. The story went viral in 2018. It was quite inconsequential, and it seems fair to say that most of the people sharing it didn’t take it too seriously. They just had a good laugh.
Other conspiracy theories are more serious. They are usually based on the belief that some piece of information is being kept secret, hidden from the public: the US government was behind the 9/11 attacks; the moon landing never happened; reptilians govern us. There is a somewhat messianic aspect to these theories: those who believe must share, must bring the hidden truth to light. But who decides what is a conspiracy theory and what is healthy scepticism? Analogously, who can draw the line between crime and civil disobedience?
To start, let’s take a look at the following examples. Can you tell which of the four is true?
- US military leaders had a plan to kill innocent people and blame it all on Cuba
- The US Navy fired on North Vietnamese torpedo boats that weren’t even there. They pretended to be under attack to escalate the war
- There was a human zoo in Belgium in 1958
- The CIA conducted “mind control” experiments on unwitting US and Canadian citizens, some of which were lethal
A particularly well-informed reader might know that all four are in fact true. References are listed at the end of this article. These examples sound arguably as absurd as some conspiracy theories. But what is absurdity? For something to be absurd, we must first have an established view of the world against which new information is compared, evaluated, and ultimately rejected. An initial impression of absurdity can nonetheless be overcome by sufficient evidence. A process of accommodation ensues, in which the person tries to find a new coherence between ideas, which may lead to the rejection of old ones.
Daniel Kahneman, in his book “Thinking, Fast and Slow”, suggests that one of the heuristics we use when processing new information and evaluating probabilities is so-called “associative coherence”. In short, we tend to be unreasonably confident in any new piece of information that is coherent with our prior knowledge.
Quickly answer the question: how many animals of each species did Moses take on the Ark? The example is well known to anyone familiar with behavioural economics, but people hearing the question for the first time are often tempted to answer “two”. The biblical story, of course, is about Noah, not Moses. As Kahneman himself points out, had the question named George W. Bush instead, the swap would be noticed immediately. Accepting the question as valid therefore depends on the coherence of the name Moses within the context of a biblical story.
Similarly, we tend to accept new information that is consistent with our view of the world. This differs from simple confirmation bias: individuals not only filter for information that favours their existing opinions, they also more readily accept new opinions that “add up” with what they already believe. This suggests that a person with a “truthful” stock of knowledge will be more prone to accept new true information, whereas conspiracists may remain trapped in lies as long as they are fed a consistent story.
“Innocent” fake news and biased content consumption may then lay the groundwork, establishing the consistency on which bigger conspiracy theories can later build. Social network platforms have come under scrutiny from the media and academia for favouring this process. Algorithms that maximize screen time can reasonably be expected to work best by first introducing “primer” content, testing the viewer’s openness to radicalism (I use the term to mean the strength of belief in a conspiracy theory). Depending on the response, more extreme content is put forward. This is known as a “rabbit hole”.
Radicalization is, in this example, additive. It depends not on a sudden change but on a continuous intensification of one’s views. However, change also plays a role in believing conspiracy theories. Most of us can relate to the rush of coming across surprising information, so incredible that we can’t help but bring it up constantly. For something to be unexpected, it must, to some degree, break with our usual beliefs. Here, it is useful to note that, in Kahneman’s framework, associative coherence applies especially to effortless, automatic decision-making and learning. It does not rule out changing your mind.
Most of the literature on persuasion stresses that humans are stubborn animals. We do not like having our worldview questioned, and we tend to react strongly and emotionally to opposing opinions. Yet religious conversions and switches in political allegiance are widespread phenomena, and they too are emotionally charged. A conspiracy theory might therefore feel like a revelation, a rupture with the past. It can carry the feeling of inclusion in a select group of people privy to special knowledge. Conspiracy theories are, in this sense, processes of socialization.
However, understanding the processes through which people may end up believing falsehoods does not mean that everything which is labelled a conspiracy theory is wrong. Twenty years ago, someone claiming half of the extent of mass surveillance exposed by Snowden and Assange would sound like a lunatic. And yet they would be right. The label “conspiracy” is broad. It includes plain falsehoods and unreasonable claims. It is also instrumentalized by powerful groups to discredit their opponents.
This is not to say that denialism, the systematic denial of reality and truth, does no damage. Climate-change deniers, sponsored by oil companies, are a major impediment to the necessary, serious, and gigantic measures that must be taken to contain the damage of human greenhouse gas emissions. Misinformation led policymakers in South Africa to prevent thousands of HIV-positive mothers from getting antiretroviral medicine, and they passed the virus on to their children. More recently, Brazil’s president, Jair Bolsonaro, denied the effectiveness of social distancing and famously stated that “if you take the vaccine and you become an alligator, I have nothing to do with it”. A study from the Federal University of Rio de Janeiro showed that cities with a higher concentration of Bolsonaro voters have disproportionately more covid cases. In the last week of March, Brazil accounted for roughly a third of all new covid deaths in the world.
We must be aware that “conspiracy” and “denialism” are politically charged terms. They are labels we impose on views we strongly reject or find absurd. On certain topics, like vaccines and the Holocaust, a hard stance against denialists is warranted. On others, more careful analysis may be necessary.
Nonetheless, conspiracists can never be right. When they are, they are not conspiracists any longer.
References:
Pascal Diethelm and Martin McKee, “Denialism: what is it and how should scientists respond?”, European Journal of Public Health, Volume 19, Issue 1, January 2009, pp. 2–4.
Daniel Kahneman, “Thinking, Fast and Slow”. New York: Farrar, Straus and Giroux, 2011.