11 Conspiracy Theories

A Little More Logical | Brendan Shea, PhD

In this chapter, you will learn about the dangers of conspiracy theories and how to avoid falling prey to them. You’ll start by exploring Hume’s views on miracles and how they relate to the concept of belief. Then, you’ll delve into the topic of heuristics and biases, and how they can lead us astray. You’ll learn about the representativeness heuristic and how it can cause us to draw false conclusions based on incomplete information. You’ll also examine prospect theory, which helps us understand how we make decisions under uncertainty. You’ll then consider whether it is possible to avoid making mistakes when it really counts, drawing on the work of Daniel Kahneman and Amos Tversky, two influential psychologists who studied the ways in which our minds can deceive us. Finally, we’ll turn to an extended example of a dangerous, real-life conspiracy theory—Holocaust denial. By the end of this chapter, you should have a better understanding of the psychological pitfalls that can lead to the embrace of conspiracy theories, and how to avoid them.

Learning Outcomes: By the end of this chapter, you will be able to:

  1. Define conspiracy theories and explain their common features and psychological appeal.
  2. Understand David Hume’s argument against belief in miracles and its relevance to the evaluation of conspiracy theories.
  3. Identify and explain key heuristics and biases that contribute to belief in conspiracy theories, such as the representativeness heuristic, prospect theory, and confirmation bias.
  4. Analyze real-world examples of conspiracy theories, such as Holocaust denial, using the concepts and frameworks discussed in the chapter.
  5. Evaluate the evidence for and against specific conspiracy theories, distinguishing between legitimate inquiry and pseudohistory or pseudoscience.
  6. Recognize the psychological and social factors that make conspiracy theories persistent and difficult to counter, even in the face of contradictory evidence.
  7. Apply critical thinking strategies and an understanding of probabilistic reasoning to avoid falling prey to conspiracy theories and other forms of misinformation.

Keywords: Conspiracy theory, Inductive reasoning, Heuristics, Biases, Representativeness heuristic, Base rate, Prospect theory, Loss aversion, David Hume, Miracles, Testimony, Holocaust denial, Final Solution, Confirmation bias, Availability heuristic, Conjunction fallacy, Halo effect, Outcome bias, Amos Tversky, Daniel Kahneman, Pseudohistory, Cherry-picking, Illusion of explanatory depth, Antisemitism, Protocols of the Elders of Zion

However, before reading further, I need to warn you about an emerging threat to us all—geese!

Geese Gone Wild: The Bioengineered Terror of Rochester’s Skies

Dear Editor,

As a concerned citizen and truth seeker, I am writing to expose the insidious plot being carried out by the Mayo Clinic. For years, I have been observing the alarming increase in the local geese population, and after extensive research, I can no longer remain silent.

The Mayo Clinic, a so-called “reputable” medical institution, has been secretly conducting genetic experiments on these geese, creating a breed of super-intelligent birds capable of understanding complex commands and carrying out sinister tasks. I have personally witnessed these geese engaging in highly organized activities, such as forming intricate formations and communicating with each other using advanced vocalizations that no ordinary goose could produce.

Furthermore, I have uncovered a pattern of suspicious behavior from the Mayo Clinic’s employees. On numerous occasions, I have seen them releasing these genetically altered geese into the wild under the cover of darkness. When confronted, they claim to be “tagging” the geese for “research purposes,” but I know better. This is merely a cover-up for their true intentions: to create an army of geese that will help them seize control of the world’s governments and establish a New World Order.

The evidence is overwhelming, yet the mainstream media and the authorities refuse to acknowledge the truth. They dismiss my well-researched findings as mere “conspiracy theories,” claiming that the increase in the geese population is a result of “conservation efforts” and “favorable breeding conditions.” How convenient for them to ignore the clear signs of genetic manipulation and the Mayo Clinic’s involvement!

Moreover, when I attempted to share my findings on social media, my accounts were suddenly suspended, and my posts were removed. This is a clear indication that the Mayo Clinic’s influence extends far beyond the medical realm and into the world of technology and censorship. They will stop at nothing to silence those who dare to expose their nefarious agenda.

The public must wake up and demand transparency from the Mayo Clinic. We cannot allow this powerful institution to continue its genetic experimentation unchecked. The geese are just the beginning – who knows what other species they may be manipulating in their labs? We must put an end to their sinister plot before it’s too late and we find ourselves living in a world controlled by genetically enhanced creatures.

I urge all concerned citizens to join me in exposing the truth and holding the Mayo Clinic accountable for its actions. Together, we can shine a light on this dark conspiracy and protect the future of our planet.

 

Sincerely,

 

Irving Quackenbush

Concerned Citizen, Truth Seeker, and Defender of Humanity

Questions

  1. What evidence does Quackenbush present to support his claim that the geese in Rochester are genetically engineered bioweapons? Is this evidence convincing? Why or why not?
  2. How does Quackenbush’s belief in a conspiracy involving the geese in Rochester serve his own psychological needs or desires?
  3. How might Quackenbush’s belief in this conspiracy theory affect his behavior and decision-making, such as in terms of how he interacts with the geese or how he votes in local elections?
  4. How might Quackenbush’s belief in this conspiracy theory be challenged or debunked by evidence or logical argument?

Introduction

“In our reasonings concerning matter of fact, there are all imaginable degrees of assurance, from the highest certainty to the lowest species of moral evidence. A wise man, therefore, proportions his belief to the evidence.”—David Hume[1]

“The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”—Daniel Kahneman[2]

Conspiracy theories are beliefs that events or situations are caused by secret, often sinister, groups or individuals working together to achieve a specific goal. These theories often involve allegations of cover-ups or attempts to mislead the public.  Conspiracy theories vary widely in their content, the individuals and groups who believe in them, and in their effects on the behavior of these believers. For this reason, it may be difficult or impossible to come up with a completely general definition of conspiracy theory that captures all and only those theories that fit under this general label. Nevertheless, there are a significant number of conspiracy theories that share something like the following form:

There exists a certain small group of people that share a certain characteristic such as race, religion, occupation, or nationality. They have secretly undertaken actions that have harmed, or are intended to harm, me and people like me. The fact that these actions have not generally been recognized is due to the conspirators’ ability to conceal evidence of this.

Within the general scheme, there is plenty of room for variation. For example, the conspirators may be anonymous figures living otherwise unremarkable lives, or they may be well-known and powerful political, religious, or media elites. Similarly, some purported conspirators actively wish harm upon the believer and others—such as conspiracies positing “traitors” or “spies” working to ensure their own country loses some conflict—while others are held to have much more mundane motives, such as the desire for money or power. In this latter case, the harm in question may simply be an especially unpleasant side effect, though one that was foreseen by the conspirators. Finally, the harms attributed to the conspirators’ actions come in a number of forms. So, for example, it may be that the actions of the conspirators have led (or will lead) to the deaths of particular individuals, financial crises or crashes, military defeats, outbreaks of disease or illness, the overthrow of the government, and so on.

Examples of such conspiracy theories include:

  • Holocaust denial is a conspiracy theory that denies the reality of the systematic mass murder of millions of Jews and other minority groups by the Nazi regime during World War II. Despite overwhelming evidence to the contrary, Holocaust deniers claim that the Holocaust did not occur, or that it was significantly exaggerated.
  • QAnon is a far-right conspiracy theory that emerged in 2017. It alleges that there is a secret cabal of elites, often referred to as the “deep state,” that is working to undermine President Donald Trump and his supporters. QAnon followers believe that this cabal is involved in a variety of nefarious activities, including human trafficking and the production of child pornography. In the years since, QAnon has expanded to include theories about COVID vaccines, the war in Ukraine, and other events that its adherents attribute to the actions of the “deep state.”
  • The Illuminati conspiracy theory holds that a secret society of powerful individuals controls world events and seeks to establish a New World Order. The Illuminati is often depicted as a shadowy organization that uses its influence to manipulate world events for its own benefit. Some people believe that the Illuminati is responsible for a variety of historical events, including revolutions and wars, and that it continues to exert influence on world affairs to this day.

Conspiracy theories of this type all crucially involve failures of what philosophers often call inductive reasoning, which involves using our available evidence to determine what is probable or likely to be true. Inductive reasoning is usually contrasted with deductive reasoning, which involves attempts to prove with 100% certainty that a conclusion follows. As it turns out, inductive reasoning makes up a huge part of our day-to-day lives. We reason inductively, for example, when we try to determine what was the cause of some event that we just observed, or when we try to figure out what the effects of this same event might be. We also reason inductively any time we make predictions about the future, or decide whether to trust what we’ve read or heard, or make generalizations about a large population based on the smaller sample that we are familiar with.

For this reason, conspiracy theories, and the errors of inductive reasoning that they exemplify, should be of interest to all of us. After all, if it turns out that many of the crucial errors committed by conspiracy theorists are ones that we ourselves are prone to, this will provide a strong reason for thinking hard about our own beliefs, and the process by which we have arrived at them.

Questions

  1. What is a conspiracy theory and how does it differ from other types of belief systems?
  2. What are some examples of conspiracy theories and how do they vary in content and effects on believers?
  3. What psychological and cognitive factors contribute to the appeal and persistence of conspiracy theories?
  4. How have conspiracy theories impacted society and individuals, both historically and in the present day?

Don’t Believe Everything You’re Told: Hume on Miracles

Conspiracy theories often serve as simple, attractive rivals to other, more complex theories about politics, history, or science. So, for example, where political scientists may offer theories that tie the outcome of a particular election to factors such as economic conditions, demographic shifts, incumbency bias, and the relative appeal of the candidates’ platforms and personae, conspiracy theorists often see the hidden hand of conspirators as being responsible for unwelcome outcomes. Similarly, where mainstream medical and scientific research suggests that conditions such as autism, drug addiction, or obesity have complex causal backgrounds, conspiracy theorists might reply that these bad things are actually due to the hidden side effects of vaccines, the clandestine activities of the CIA, or the machinations of “Big Ag.”

One way in which conspiracy theories are distinguished from their mainstream rivals is their method of origin and spread, which is often outside traditional scientific and academic channels. In the modern era, for example, conspiracy theories often begin in the so-called “dark corners” of the internet, as opposed to in peer-reviewed journal articles. They then spread, via both alternative media sources and social media, to larger and larger audiences. To what extent should this sort of difference in origin matter to the credibility of the theories in question?

The Scottish philosopher David Hume (1711-76) takes up a very similar question in the “Of Miracles” section of his Enquiry Concerning Human Understanding. Hume was among the first to clearly distinguish between inductive and deductive reasoning, and his account of the problems inherent in inductive reasoning has influenced (and often troubled) scholars studying inductive reasoning ever since. In “Of Miracles”, Hume considers whether or not one should ever believe peoples’ accounts of miracles. His answer is a resounding “No!”, and many of the reasons he provides are applicable to conspiracy theories as well.

Hume recognizes that the reasons people believe in miracles—because they hear or read about them from sources that they normally trust—are based in the same sort of inductive inference that underpins many of the things we believe. For example, nearly all of our beliefs about history, scientific theories, current events, and even the lives of our closest friends and family are, of necessity, based on what textbooks, teachers, newspapers, and other people tell us about these things. Because of the probabilistic nature of inductive inference, this means that it is always possible that these sources are incorrect. However, we don’t normally take this possibility as grounds for dismissing everything we hear or read. So, what makes reports of miracles (or conspiracy theories) any different?

Hume provides a number of considerations for treating reports of miracles differently than other sorts of “testimony,” many of which are applicable to conspiracy theories. First, the chain of testimony supporting miracles often looks quite different than that of ordinary events. Miracles are almost universally said to have occurred long ago and/or in places far away, and under conditions that would have made it difficult or impossible for any skeptic to check on the truth of the claim. In conspiracy theories, by comparison, it is often held that the conspiracy is happening “right now!” or “under our noses!”. However, just as in the miracle case, it is a central part of the theory that there can be no independent record or confirmation of the conspiracy, since the conspirators have prevented this (perhaps by murdering witnesses or manipulating the media). The fact that reports of miracles and conspiracy theories haven’t been, and can’t be, checked out by skeptical listeners doesn’t mean that they are necessarily false, of course. What it does mean, however, is that these reports lack the sort of safeguard that comes with most testimony regarding strange or unlikely events—that is, if they were false, we would likely have some evidence of this.
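
Hume’s point can be given a rough probabilistic gloss. The sketch below (a minimal illustration in Python, with entirely made-up numbers) applies Bayes’ theorem to testimony: the same quality of report that settles an ordinary claim establishes almost nothing when the reported event is antecedently very improbable. None of the specific probabilities come from Hume; they are assumptions chosen only to make the pattern visible.

```python
# A rough Bayesian gloss on Hume's maxim, with made-up illustrative numbers:
# accept testimony of an extraordinary event only if the falsehood of the
# testimony would be more surprising than the event itself.

def posterior(p_event, p_testify_if_true, p_testify_if_false):
    """P(event | testimony), computed via Bayes' theorem."""
    true_report = p_testify_if_true * p_event
    false_report = p_testify_if_false * (1 - p_event)
    return true_report / (true_report + false_report)

# Ordinary claim: a friend reports that it rained yesterday.
print(posterior(p_event=0.30,              # rain is common
                p_testify_if_true=0.95,    # rain is usually reported accurately
                p_testify_if_false=0.05))  # and rarely invented
# ~0.89: the testimony more or less settles the matter.

# Extraordinary claim: a report of a miracle (or a vast, undetectable conspiracy).
print(posterior(p_event=1e-7,              # antecedently very improbable
                p_testify_if_true=0.95,
                p_testify_if_false=0.01))  # even a 1% rate of error or deception
# ~0.00001: the same quality of testimony now establishes almost nothing.
```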

A second key difference Hume notes relates to the motivations of those who talk about miracles. After all, one reason that miracles matter so much is that they can serve as evidence for the truth of certain religious views. This provides a strong motivation for people who already hold these religious views to believe in such reports (after all, we all like being shown right!), and it also provides motivation for them to spread these tales, even if they don’t fully believe in them. After all, telling tales of miracles might win converts for the faith, or signal your “loyalty to the cause” to other members of the group. Something quite similar can be said of many conspiracy theories—insofar as belief in these theories is closely linked to membership in some group, we have good reason to doubt the impartiality of those telling tales of conspiracies. Finally, Hume observes that, while one might think that the sheer strangeness and outlandishness of miracles would make people less likely to believe and repeat them, experience shows that something like the opposite is often the case—people seem to enjoy believing and repeating stories about events that are utterly unlike anything they have experienced themselves. This, again, has close analogues with conspiracy theories. Odd as it may seem, the very claims of a conspiracy theory that seem the furthest detached from evidence and ordinary experience may be the claims that most encourage its spread.

Questions

  1. How do conspiracy theories differ from more mainstream theories in terms of their origin and spread?
  2. What are some of the reasons Hume gives for treating reports of miracles differently than other types of testimony? How do these reasons apply to conspiracy theories?
  3. How do factors such as the distance in time and space, the difficulty of verifying the claims, and the credibility of the sources impact the credibility of accounts of miracles and conspiracy theories?

Making Mistakes: Heuristics and Biases

In the generations since Hume first wrote, scholars in disciplines ranging from philosophy to economics to statistics to psychology have studied the nature of inductive reasoning from a variety of perspectives. While many of these investigations have aimed at uncovering better methods for inductive reasoning, others have aimed at figuring out how good ordinary humans are at inductive reasoning in a variety of contexts.  Most of us do well enough when the conclusions of inductive reasoning concern our immediate experience, for example—we learn quickly to avoid hot stoves, or to avoid drinking bottles labeled “poison,” but it is much less clear how successful we are when it comes to dealing with big-picture issues regarding statistical or causal reasoning in areas such as economics, science, or politics. These, of course, are precisely the areas where conspiracy theorists are most prone to get things wrong. So, why might this be? And just how common are these errors?

Starting in the late 1960s, two Israeli psychologists—Amos Tversky and Daniel Kahneman—began investigating just these sorts of questions. In a series of influential articles[3], they argued that humans are not intuitively “good statisticians,” and that they make a number of systematic mistakes when engaging in inductive reasoning. Tversky and Kahneman’s research has had an impact far beyond psychology, and in particular caused considerable problems for the view (once common in both economics and some areas of philosophy) that humans generally act rationally.[4] While Kahneman and Tversky don’t explicitly consider the problem of belief in conspiracy theories, their work provides a helpful framework for identifying and classifying many of the major inductive mistakes that conspiracy theorists make.

A foundational concept of Kahneman and Tversky’s approach is that we make many decisions using intuitive heuristics, or simple rules for making inductive decisions. In particular, they suggest that, when we are faced with making a complex decision, we often (without realizing it) “substitute” a simpler, easier-to-answer question, and answer that instead. And while this may work well enough in many day-to-day cases, it can also easily lead to fallacious reasoning of the sort exemplified in conspiracy theories. Some notable examples of such heuristics and biases include:

  • representativeness heuristic: the tendency to judge the likelihood of an event based on how similar it is to a prototypical example, without taking into account relevant base rates or statistical information
  • anchoring bias: the tendency to rely heavily on the first piece of information encountered when making a decision, and to adjust insufficiently from that initial anchor
  • availability heuristic: the tendency to judge the likelihood of an event based on the ease with which examples come to mind
  • confirmation bias: the tendency to seek out and pay more attention to information that confirms one’s preexisting beliefs and to disregard or downplay information that challenges them
  • sunk cost fallacy: the tendency to continue investing time, money, or other resources into a project or decision because of the time, money, or other resources already invested, even if the current costs outweigh the benefits
  • hindsight bias: the tendency to see events as being more predictable than they actually were, after learning the outcome
  • overconfidence bias: the tendency to be more confident in one’s beliefs and judgments than is warranted by the evidence

In the rest of this chapter, we’ll explore a number of these biases in more detail, and show how they can lead to belief in conspiracy theories.

Graphic: Reasons People Believe Conspiracy Theories

 

The Story Just “Fits”: The Representativeness Heuristic

Conspiracy theories often begin with the intuition that some bad event—a recession, an outbreak of a disease in the local community, or a school shooting—cannot be adequately explained by any combination of normal causal processes discussed by scientists, public health officials, or psychologists and sociologists. They then conclude that this event must have been caused by a carefully planned process (instigated in secret by the conspirators!) that was designed to result in just this sort of outcome. This way of reasoning exemplifies what Kahneman and Tversky label the representativeness heuristic, in which the probability of a certain process P causing event E is judged solely by the “resemblance” between P and E and NOT by any careful consideration of how probable it was that P actually occurred, of the potential alternatives to P, or even of how good our evidence for P happens to be.

In the case of conspiracy theories, the representativeness heuristic might explain several inductive failures. First, it accounts for the way conspiracy theorists often seem to ignore the comparative base rates of “bad things caused by a combination of ordinary factors” versus “bad things caused by powerful organizations working in secret to cause just this sort of harm in each and every gory detail.” While the representativeness heuristic pushes us toward the conspiracy story (since it better “resembles” the bad thing in question), this is a bad inference. After all, the vast, vast majority of the harms that we incur in life are NOT the result of explicit conspiracies intended to cause this exact outcome, but instead are the result of perfectly mundane causal factors acting in combination (that is, plain old “bad luck”).
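
A toy calculation can make this base-rate point concrete. The sketch below (with invented numbers, chosen only for illustration) separates how well a hypothesis “fits” an event from how probable that hypothesis is once base rates are taken into account.

```python
# Base-rate neglect, with invented numbers. The representativeness heuristic
# judges a hypothesis by "fit" alone: P(event | hypothesis). Probability
# theory says to weigh that fit by how common the hypothesis is to begin with.

p_conspiracy = 0.001   # assumed base rate: deliberate plots of this kind are rare
p_mundane = 0.999      # ordinary bad luck is the norm

fit_conspiracy = 0.99  # a plot designed to cause the event "fits" it almost perfectly
fit_mundane = 0.02     # mundane causes fit this exact event only loosely

posterior = (fit_conspiracy * p_conspiracy) / (
    fit_conspiracy * p_conspiracy + fit_mundane * p_mundane)
print(round(posterior, 3))  # ~0.047: despite the near-perfect "fit," the
                            # conspiracy remains unlikely, because its base
                            # rate is so low
```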

For similar reasons, the representativeness heuristic can plausibly account for conspiracy theorists’ tendency to posit highly specific causes for events that are better explained by appeal to statistics. Small samples, for example, are more variable than large samples, and so we should be very careful in drawing conclusions based on what we have observed in small samples, even if the sample in question seems odd to us. If two people in a small office of ten each have a heart attack during the same month, this might seem unusual, but it doesn’t provide strong evidence that the office coffee has secretly been poisoned by management seeking to save money on future pensions. By contrast, if 200 people in an office of 1,000 suffer such attacks in a month (the same percentage, but a much larger sample), this really does suggest something out of the ordinary is going on. However, in practice, conspiracy theorists (along with the rest of us) systematically overlook this difference in sample size, and too often jump to conclusions on the basis of small samples.
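
A short simulation can make the sample-size point vivid. The sketch below assumes, purely for illustration, a small monthly heart-attack risk per person, and then counts how many ten-person offices out of 100,000 would show the “suspicious” two-in-one-month cluster by chance alone.

```python
# Small samples are noisier than large ones. Assume (purely for illustration)
# that each person runs a 0.17% risk of a heart attack in any given month.
import random

def offices_with_cluster(n_offices, office_size, p_month=0.0017):
    """Count simulated offices in which 2+ people have a heart attack in one month."""
    count = 0
    for _ in range(n_offices):
        cases = sum(random.random() < p_month for _ in range(office_size))
        if cases >= 2:
            count += 1
    return count

random.seed(42)
# Out of 100,000 ten-person offices, how many show the "suspicious" cluster?
print(offices_with_cluster(100_000, 10))  # on the order of a dozen: rare in
                                          # any one office, inevitable somewhere

# An office of 1,000 with 200 cases (the same 20% rate, much larger sample) is
# another matter: chance alone predicts about 1000 * 0.0017 = 1.7 cases, so
# 200 would be astronomically improbable without some common cause.
```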

For similar reasons, the confidence we have in our conclusions about the causes of events ought to reflect the strength and variety of evidence that we have seen—after all, it is surely better to read ten high-quality journal articles and one moderately plausible social media post about a conspiracy theory than just the moderately plausible social media post. However, the representativeness heuristic (which ignores the quantity and quality of evidence and cares only about its “fit” with a theory) can lead us to ignore this and, in some cases, to feel more confident in our conspiracy theory after reading just the social media post, since there are no additional sources to interfere with the nice clean fit between this story and our believing in the truth of the theory it describes. Basically, once we decide to give the social media post any credence whatsoever—as opposed to simply dismissing it out of hand—it can be very difficult not to overweight its value as evidence.

Examples of the representativeness heuristic might include:

  1. Believing that a vaccine is dangerous or ineffective because it was developed by a pharmaceutical company rather than by scientists working in the public interest. This belief may be based on the similarity between the vaccine and the idea of a profit-driven pharmaceutical company, rather than considering the probability of such a motive or the evidence for it.
  2. Believing that a political candidate is corrupt or untrustworthy because they are a member of a particular party or demographic group. This belief may be based on the similarity between the candidate and the stereotype of a corrupt or untrustworthy person, rather than considering the probability of this stereotype being true or the evidence for it.
  3. Believing that a financial crisis was caused by a secret group of bankers or financiers rather than by complex economic factors. This belief may be based on the similarity between the crisis and the idea of a secret group manipulating the economy, rather than considering the probability of such a group existing or the evidence for it.
  4. Believing that a natural disaster was caused by a secret government experiment or cover-up rather than by natural causes. This belief may be based on the similarity between the disaster and the idea of a secret government experiment or cover-up, rather than considering the probability of such an experiment or cover-up occurring or the evidence for it.

Questions

  1. How does the representativeness heuristic contribute to the appeal of conspiracy theories?
  2. Can you think of any examples of the representativeness heuristic at work in your own beliefs or decision-making processes?
  3. In what ways do conspiracy theories differ from more mainstream explanations for events or phenomena, and how might these differences influence the way we evaluate their credibility?
  4. How does the idea of a “chain of testimony” relate to the spread of conspiracy theories, and why might this be problematic in terms of evaluating their credibility?

Problems with Probabilities: Prospect Theory

The decision to adopt a conspiracy theory can be thought of as a sort of “bet” about the way the world will turn out, and about what the “winning strategy” for living in such a world will be. So, for example, if I suspect there is a good chance that the members of the US Federal Reserve Board are an evil cabal intent on crashing the world economy to enhance the wealth of their corporate masters, I might buy gold and bury it in my backyard to hedge against this. If I assign a significant probability to the claim that pharmaceutical companies have hidden evidence of vaccines causing autism, I might not vaccinate my children. Finally, if I believe it likely that some suspect group of people is up to no good, I might take action against them, potentially including violence.

Most of us would like to think that we are good at making such bets, since they are crucial to making decisions about how we invest our money, vote, and generally lead our lives. So, for example, it seems obvious that a 1% risk of a bad outcome is different than a 5% chance, which is in turn different from a 50% chance or 95% chance, and our choices and actions should reflect this difference. Unfortunately, according to Kahneman and Tversky, this is not how we actually make these sorts of decisions. Instead, we get things wrong in a number of ways.

First, we tend to focus not on the relative merits of a set of outcomes, but on how we think of ourselves as having arrived at these outcomes, and whether we view them as “gains” or “losses” from a psychological baseline. As it turns out, we care much more about potential losses than we do about potential gains, and simultaneously don’t care as much about the relative size of these gains or losses as we should. Conspiracy theorists offer excellent examples of this. First, in cases where they weigh large potential benefits from a change against (much smaller) potential losses, they can be highly risk-averse, for example when they reject the large potential benefits of vaccines or GMO foods on the grounds that there might be hidden health risks associated with these. Second, in cases where conspiracy theorists already feel that they are below some psychological baseline, they can instead become risk-seeking, and adopt conspiracy theories that lead to highly risky actions in a last-ditch attempt to put themselves back over the baseline, even though the most probable outcome of such behavior would be to put them even further under this baseline than they already feel themselves to be. So, for example, if the members of a certain group worry they are “losing control of their country” to their political rivals, they might respond by abandoning democratic norms or engaging in violence, even though these actions are, on balance, likely to lead to even greater losses.
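
This asymmetry can be captured in prospect theory’s “value function.” The sketch below uses the median parameter estimates Tversky and Kahneman reported in 1992 (curvature of roughly 0.88 and a loss-aversion coefficient of roughly 2.25); the dollar amounts are invented, and probability weighting is set aside for simplicity.

```python
# A sketch of prospect theory's value function, using the median parameter
# estimates Tversky and Kahneman reported in 1992. Outcomes are valued as
# gains or losses from a reference point, and losses loom larger than gains.

ALPHA = 0.88   # curvature: diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss aversion: losses weigh ~2.25x as much as equal gains

def value(x):
    """Subjective value of a gain or loss x, relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

print(value(100))   # ~57.5: the pull of gaining $100
print(value(-100))  # ~-129.5: the much larger sting of losing $100

# Below the reference point, the same asymmetry rewards gambling to "get back
# to even" (probability weighting is ignored here for simplicity):
sure_loss = value(-100)
gamble = 0.5 * value(-210) + 0.5 * value(0)
print(sure_loss < gamble)  # True: the risky option feels less bad than the sure loss
```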

Prospect theory also suggests that we systematically underweight the probabilities of some events while overweighting others. In particular, while we sometimes treat extremely unlikely but possible events as having a probability of 0, we quickly inflate the probabilities of unlikely events once we begin to treat them as genuinely possible, no matter how “objectively” unlikely they might be. In the case of conspiracy theories, this might plausibly explain the simultaneous urge to (1) dismiss out of hand the possibility that the harms that have occurred to them are due to statistical “chance”, and (2) vastly inflate the probability that these harms are caused by the secret actions of conspirators.
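
Kahneman and Tversky modeled this pattern with a “probability weighting” function. A minimal sketch, using the functional form and the rough parameter value (gamma near 0.61 for gains) they estimated in 1992:

```python
# A sketch of prospect theory's probability-weighting function. Once a tiny
# probability is treated as "live," it is overweighted; moderate and large
# probabilities are underweighted.

GAMMA = 0.61

def weight(p):
    """Decision weight attached to a stated probability p (0 < p < 1)."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

for p in (0.001, 0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"stated p = {p:<5}  felt weight = {weight(p):.3f}")
# stated p = 0.001 -> felt weight ~0.014 (a 1-in-1,000 chance feels ~14x larger)
# stated p = 0.5   -> felt weight ~0.421 (an even chance gets less than even weight)
# stated p = 0.99  -> felt weight ~0.912 (near-certainty is discounted)
```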

Some potential examples of these flawed ways of reasoning include:

  1. Believing that the government is hiding evidence of extraterrestrial life because this would be a “bigger” and more exciting event than the alternative explanation that no such evidence exists.
  2. Believing in a conspiracy theory about a powerful group secretly controlling world events because it gives a sense of control and agency in a chaotic world.
  3. Believing that a natural disaster was caused by a secret group or individual, rather than accepting that it was a random act of nature, in order to avoid feeling powerless and vulnerable.
  4. Falling for a conspiracy theory about a medical treatment or procedure being dangerous or ineffective because the potential consequences of accepting the mainstream explanation are perceived as more negative than the potential consequences of the conspiracy theory.
  5. Believing in a conspiracy theory about a historical event being distorted or covered up in order to protect one’s cultural or personal identity, rather than accepting a more nuanced or uncomfortable explanation.

Questions

  1. What is the role of probability in decision-making and how does prospect theory challenge the way we traditionally understand probability?
  2. How does our perception of potential gains and losses impact the way we make decisions and how does this relate to conspiracy theories?
  3. How does prospect theory explain the tendency to underweight or overweight the probability of certain events, particularly in the context of conspiracy theories?
  4. Can you provide examples of how prospect theory might influence belief in specific conspiracy theories?
  5. In what ways might an understanding of prospect theory help us to better understand and address the appeal of conspiracy theories?

Can We Avoid Mistakes When It Counts?

So, what’s the take-away from all of this? It might be summarized as follows: conspiracy theorists, like the rest of us, notice bad things happening in the world around them. They (again, like the rest of us) are convinced that there must be a cause for these events. However, when they begin to consider what sort of cause this might be, they are led astray by the representativeness heuristic, which predisposes them toward a causal story (the conspiracy theory) that most closely “resembles” the limited samples they are familiar with, and the limited, biased evidence they have reviewed. This completely ignores the possibility that the events in question are simply the result of statistical “chance.” These errors are compounded by the failure to deal with probabilities and “risky decisions” properly, as described by prospect theory. Conspiracy theorists are often attached to some (perhaps imaginary) baseline about the way things “used to be” or the way “nature intended things,” and are willing to take risks to avoid accepting losses from this baseline. Simultaneously, they improperly dismiss the possibility of some unlikely events (such as the sorts of chancy processes that often explain strange-looking results in small samples) and inflate the probability of others (such as the conspiracy theory they’ve heard so much about on talk radio).

In Thinking, Fast and Slow, Kahneman argues there are other heuristics and biases waiting to trip us up, beyond those described here. The halo effect, for example, predisposes us to (without any evidence!) assign good qualities to people and things we already believe are good in other respects, and bad qualities to those we already dislike or distrust. Outcome bias, meanwhile, presents us with a false view of the past, whereby we assume that the things that did happen (for good or bad) were predictable all along. This conveniently allows us to withhold credit from decision makers whose good choices now seem “obvious,” while blaming them for reasonable choices that happened to turn out badly. These sorts of processes plausibly lend fuel to the fire of conspiracy theorists’ tendency to blame any and all bad outcomes on the actions of the purported conspirators (who, not coincidentally, tend to belong to groups the theory’s proponents already hold in ill regard). Finally, and perhaps most concerning, our intuitive sense of how likely a given outcome is can be strongly affected by the level of detail in which we have imagined or described this outcome. So, the mere act of talking or reading about a conspiracy theory in detail might well serve to inflate our sense of how probable this sort of thing really is.

All of this generally happens without our even noticing, and it can happen to even smart, knowledgeable people, since inductive fallacies don’t present themselves as defective means of reasoning. Instead, these processes present themselves as a strong feeling that certain theories or ideas are correct, and invite us to adopt and defend these ideas as our own with all of the intellectual creativity and rigor that we can muster. This suggests that vulnerability to conspiracy theories may be linked to neither ignorance nor stupidity. Rather, it might be that conspiracy theorists are mentally “lazy” in the ways that many of us are lazy, and it is this laziness that undercuts their ability to make cogent inductive inferences. In particular, belief in a conspiracy theory allows one to avoid all sorts of uncomfortable thoughts, such as fully grappling with the role of chance in events, or the poverty and bias of the news we consume, or the systematic ways in which our sense of what’s possible misleads us about what is actually probable. Conspiracy theories reassure us that the bad guys really are all bad, and that, if we stop them next time, we can ensure things will turn out well.

If correct, this suggests that there can be significant value in reflecting on the inductive failures of conspiracy theorists, even for those who feel quite confident that they themselves could never fall into the trap of believing in such a theory. Such confidence, as it turns out, may be a poor guide to one’s actual vulnerability. However, it may be that we can partially inoculate ourselves against conspiracy theories by paying close attention to the specific ways in which they exemplify bad inductive reasoning. This, in turn, might make it at least somewhat easier to catch our own errors, and to become better, more careful inductive reasoners[5].

Questions

  1. In what ways do our cognitive biases and heuristics, such as the representativeness heuristic and prospect theory, contribute to the belief in conspiracy theories?
  2. How does our desire to maintain a psychological baseline and avoid losses affect our likelihood of believing in conspiracy theories?
  3. How does the detail in which we imagine or describe an event influence our perception of its likelihood?
  4. How do our cognitive biases and heuristics contribute to the way we evaluate evidence for or against conspiracy theories?
  5. In what ways do conspiracy theories offer reassurance or a sense of control in the face of uncertainty or discomfort?
  6. How can we be more aware of and guard against our cognitive biases and heuristics in order to make more accurate inductive inferences?

Case Study: Holocaust Denial

Throughout history, conspiracy theories have often been used to target and scapegoat certain groups, particularly religious and ethnic minorities. One of the most pernicious and persistent examples of this is the various antisemitic conspiracy theories that have been used to justify discrimination against and persecution of Jewish people. These conspiracy theories have taken many forms over the centuries, from the medieval blood libel (which accused Jews of murdering Christian children to use their blood in religious rituals) to the notorious forgery The Protocols of the Elders of Zion (which purported to reveal a Jewish plan for global domination). In the modern era, one of the most troubling manifestations of antisemitic conspiracy thinking is Holocaust denial.

As we have seen, conspiracy theories often serve as simple, psychologically appealing alternatives to mainstream explanations for disturbing events. In the case of the Holocaust, the sheer scale and horror of the event can be difficult for people to comprehend or accept. Denying that it happened, or minimizing its severity, may provide a sense of reassurance or control in the face of such an overwhelming tragedy. Additionally, for those already predisposed to antisemitic views, Holocaust denial can serve as a way to undermine the moral legitimacy of the Jewish people and the state of Israel.

Understanding how and why people come to believe in Holocaust denial can provide valuable insights into the psychological and social factors that contribute to conspiracy thinking more broadly. By examining the flawed reasoning and rhetorical tactics used by Holocaust deniers, we can better understand how conspiracy theories spread and persist, even in the face of overwhelming evidence to the contrary.

The Mainstream Hypothesis

The mainstream historical understanding of the Holocaust, based on extensive documentary evidence, survivor and eyewitness testimony, and physical evidence, is that during World War II, the Nazi regime in Germany and its collaborators systematically persecuted and murdered approximately six million European Jews, as well as millions of other victims, including Roma people, disabled people, Slavic peoples, political opponents, and gay men.

This genocide was carried out through a network of concentration camps, death camps, and mass shooting operations. Jews and other targeted groups were subjected to forced labor, starvation, disease, medical experimentation, and mass murder in gas chambers. This systematic campaign of extermination, known as the “Final Solution,” was a central goal of the Nazi regime and was carried out with ruthless efficiency.

The motivations behind the Holocaust were rooted in Nazi ideology, which held that Germans were a superior “Aryan” race and that Jews were a subhuman race that posed a threat to German purity and survival. This ideology drew on centuries of antisemitic stereotypes and conspiracy theories, but took them to genocidal extremes under the cover of war and the totalitarian power of the Nazi state.

Evidence for the Mainstream Hypothesis

The evidence for the mainstream understanding of the Holocaust is overwhelming and comes from a wide range of sources. Some of the key pieces of evidence include:

  1. Documents: There are extensive surviving Nazi documents detailing the planning and implementation of the Holocaust, including records of mass deportations, lists of victims, and orders for extermination. One key example is the Wannsee Protocol, which records the minutes of a 1942 meeting of senior Nazi officials discussing the implementation of the “Final Solution.”
  2. Eyewitness testimony: Thousands of survivors, perpetrators, and bystanders have provided detailed accounts of their experiences during the Holocaust. These accounts come from Jews who survived the camps, Nazi officials who participated in the genocide, and local civilians who witnessed atrocities.
  3. Physical evidence: The remains of concentration camps, gas chambers, and mass graves provide physical testament to the reality of the Holocaust. Forensic analysis of these sites has yielded significant corroborating evidence, such as traces of poisonous gas in the ruins of gas chambers.
  4. Perpetrator confessions: Many high-ranking Nazi officials, including Adolf Eichmann and Rudolf Höss, provided detailed confessions of their roles in the Holocaust during post-war trials and interrogations.
  5. Documentary evidence: Contemporaneous news reports, diplomatic communications, and personal journals from the period all reference the ongoing persecution and mass murder of Jews and other groups.

This evidence has been thoroughly examined and verified by generations of scholars, and the mainstream historical understanding of the Holocaust is considered one of the most well-documented and incontrovertible events in modern history. Denial of the basic facts of the Holocaust is not a matter of legitimate historical debate, but an ideologically motivated distortion of the historical record.

The Conspiracy Theory Hypothesis

Holocaust deniers propose an alternative conspiracy theory that contradicts the overwhelming historical evidence. They claim that the Holocaust did not happen or that it was greatly exaggerated. According to this view, the Nazis did not systematically murder millions of Jews, and the gas chambers were not used for mass extermination. Instead, Holocaust deniers claim that the Holocaust is a myth created by the Allies, the Soviet Union, and the Jews themselves for various nefarious purposes, such as to extract reparations from Germany, to justify the creation of Israel, or to unfairly vilify the Nazi regime.

Holocaust deniers often claim that the accepted history of the Holocaust is based on fabricated or exaggerated evidence, coerced confessions, and a deliberate suppression of counter-evidence. They may argue that the gas chambers were actually used for delousing clothes, that the crematoria were not capable of disposing of millions of bodies, or that the Jewish population in Europe after the war was higher than what would be expected if six million had been killed.

Examining the “Evidence” for Holocaust Denial

Holocaust deniers often present their arguments as if they are engaging in legitimate historical inquiry or scientific investigation. However, their methods and arguments are fundamentally flawed and rest on misunderstandings about how historical research and scientific evidence work.

For example, Holocaust deniers often focus on minor discrepancies or ambiguities in survivor testimonies in an attempt to discredit the entire body of evidence. This fails to recognize that eyewitness accounts, especially of traumatic events, are rarely perfectly consistent, and that minor inconsistencies do not negate the overall validity of the testimony. In fact, the vast number of survivor accounts, despite some minor variations, is powerful evidence for the reality of the Holocaust.

Similarly, Holocaust deniers may present scientific or technical arguments about the capacity of gas chambers or crematoria to argue that mass extermination was impossible. However, these arguments are often based on flawed assumptions, selective evidence, or a misunderstanding of the actual conditions in the death camps. When subjected to rigorous scientific scrutiny, these arguments fall apart.

Holocaust deniers also engage in what is known as “cherry-picking” evidence: focusing only on pieces of information that seem to support their view while ignoring the vast body of evidence that contradicts it. Legitimate historical and scientific research, by contrast, must consider all available evidence and reach conclusions based on the overall weight of that evidence.

Psychological Mechanisms Behind Holocaust Denial

Several of the psychological mechanisms discussed earlier in this chapter can help explain the persistence of Holocaust denial despite the overwhelming evidence against it.

The representativeness heuristic may lead people to give undue credence to Holocaust denial arguments because they seem to offer a coherent, though false, narrative that “fits” with certain antisemitic stereotypes or political ideologies.

Prospect theory suggests that people are more willing to accept arguments that minimize or deny the Holocaust because the alternative – accepting the full reality of this genocidal atrocity – is so psychologically disturbing and uncomfortable.

The halo effect may lead people who are already predisposed to antisemitic views to more readily accept other anti-Jewish claims, including Holocaust denial.

Confirmation bias can lead people to seek out and focus on information that seems to support Holocaust denial while ignoring or dismissing the vast evidence that refutes it.

Additionally, the “illusion of explanatory depth” – people’s tendency to believe they understand complex phenomena better than they really do – may lead people to underestimate the complexity of the historical evidence and overestimate the plausibility of simplistic conspiracy theories.

The Continued Relevance of Holocaust Denial

Understanding the flaws in Holocaust denial arguments and the psychological mechanisms that can lead people to believe in such conspiracy theories remains deeply relevant today.

Firstly, Holocaust denial is a form of antisemitism and hate speech that continues to cause real harm to Jewish communities. It undermines the memory of the victims, distorts the historical record, and provides legitimacy to those who still harbor genocidal attitudes towards Jews.

Moreover, studying Holocaust denial can provide insights into how other conspiracy theories and forms of science denial operate. Climate change denial, for example, also relies on strategies of cherry-picking evidence, focusing on minor uncertainties, and proposing conspiratorial explanations.

When faced with psychologically uncomfortable realities – whether about the depths of human evil or the catastrophic impacts of environmental destruction – humans seem all too willing to embrace simplistic alternative narratives that absolve them of facing these hard truths.

By deconstructing the errors of Holocaust deniers and illuminating the psychological mechanisms that make such beliefs attractive, we can develop strategies to counter not just this specific conspiracy theory, but the broader tendencies of human reasoning that give rise to conspiratorial thinking. Promoting critical thinking, probabilistic reasoning, and a respect for expertise and evidence are crucial for navigating a world rife with misinformation and conspiracy theories. The lessons we learn from confronting Holocaust denial can strengthen our collective commitment to truth and our vigilance against all forms of dangerous pseudohistory and pseudoscience.

Discussion Questions

  1. What are the key differences between the mainstream historical understanding of the Holocaust and the claims made by Holocaust deniers? How does the evidence support the mainstream view and refute the deniers’ claims?
  2. How do the psychological mechanisms discussed in this chapter (such as the representativeness heuristic, prospect theory, and confirmation bias) contribute to the persistence of Holocaust denial and other conspiracy theories?
  3. Holocaust denial is often framed as a matter of “free speech” or “historical debate.” Why is this framing misleading? How can we balance the value of free expression with the need to combat hate speech and misinformation?
  4. What strategies can be used to counter Holocaust denial and other forms of pseudohistory or pseudoscience? How can we promote critical thinking and respect for evidence-based reasoning?
  5. Holocaust denial is a specific form of antisemitism. How does it relate to broader patterns of antisemitic conspiracy theories throughout history? What role do these conspiracy theories play in promoting hatred and violence against Jews?


 

Glossary

Conspiracy Theory: A belief that events or situations are caused by secret, often sinister, groups or individuals working together to achieve a specific goal, often involving allegations of cover-ups or attempts to mislead the public.

Inductive Reasoning: The process of using available evidence to determine what is probable or likely to be true, as opposed to deductive reasoning, which seeks to prove conclusions with certainty.

Heuristic (Kahneman & Tversky): A simple, efficient rule or cognitive shortcut used to make decisions, often by substituting a more difficult question with an easier one. While useful, heuristics can lead to systematic errors or cognitive biases.

Representativeness Heuristic: The tendency to judge the probability of an event based on how similar it is to a prototypical example, while neglecting relevant base rates or statistical information.

Base Rate: The prevalence or frequency of a characteristic or event in a population, which should be considered when making probability judgments but is often neglected due to the representativeness heuristic.

Prospect Theory (Kahneman & Tversky): A model of decision-making under risk, which proposes that people evaluate potential losses and gains differently, and are more averse to losses than they are attracted to equivalent gains.

Loss Aversion: The tendency to prefer avoiding losses to acquiring equivalent gains, often leading to risk aversion when considering potential gains but risk-seeking behavior when faced with potential losses.

Miracle: An event that appears inexplicable by scientific or natural laws and is often attributed to supernatural causes. Hume argued that one should never believe testimony about miracles.

Holocaust Denial: The conspiracy theory that denies the historical reality of the systematic mass murder of Jews and other groups by the Nazi regime during World War II, despite overwhelming evidence to the contrary.

Final Solution: The Nazi plan for the systematic extermination of the Jewish people during World War II, culminating in the Holocaust.

Confirmation Bias: The tendency to seek out, interpret, and recall information in a way that confirms one’s preexisting beliefs, while giving less attention to information that contradicts them.

Availability Heuristic: The tendency to judge the likelihood or frequency of an event based on how easily examples come to mind, which can lead to overestimating the probability of vivid or emotionally charged events.

Conjunction Fallacy: The error of judging a specific scenario as more probable than a more general one that includes it, which violates the basic rules of probability.

Halo Effect: The tendency for an individual’s positive or negative traits in one area to influence one’s perception of them in other areas.

Outcome Bias: The tendency to judge a decision or action based on its outcome rather than on the quality of the decision at the time it was made, given the information available then.

Theory-Ladenness of Observation: The idea that observations and perceptions are influenced by the theoretical beliefs, assumptions, and expectations of the observer.

Pseudohistory: A type of pseudoscholarship that presents a distorted or fabricated version of history, often for ideological purposes or to promote conspiracy theories.

Protocols of the Elders of Zion: An infamous antisemitic forgery purporting to reveal a Jewish plan for world domination, which has fueled antisemitic conspiracy theories and persecution.

Cherry-Picking: The act of selectively choosing data or evidence that supports a particular position, while ignoring or dismissing evidence that contradicts it.

Illusion of Explanatory Depth: The tendency for people to believe they understand complex phenomena better than they actually do, often leading to overconfidence in simplistic explanations or conspiracy theories.

 

[1] David Hume, An Enquiry Concerning Human Understanding, ed. Eric Steinberg, 2nd ed. (Indianapolis: Hackett Publishing, 2011), sec. 10.

[2] Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), 88.

[3] See especially “Judgment under Uncertainty: Heuristics and Biases,” Science 185, no. 4157 (1974): 1124–1131; “Prospect Theory: An Analysis of Decision under Risk,” Econometrica 47, no. 2 (1979): 263–292. A good summary of both their work and related research is provided in Kahneman’s Thinking, Fast and Slow (2011).

[4] In 2002, Kahneman won the Nobel Prize in Economics for this work. Unfortunately, Tversky died in 1996.

[5] I’d like to thank Todd Kukla for his helpful comments.

License


A Little More Logical Copyright © 2024 by Brendan Shea is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
