1. Introduction
There are several authors who claim that there is a similarity between conspiracy theories and religious worldviews. Karl Popper (Reference Popper2020, 306) argued that conspiracy theories are secular forms of religious superstition. More recently, among others, Brian Keeley (Reference Keeley1999; Reference Keeley2007), Michael Wood and Karen Douglas (Reference Wood, Douglas, Dyrendal, Robertson and Asprem2018), Bradley Franks, Adrian Bangerter, and Martin Bauer (Reference Franks, Bangerter and Bauer2013), and Glenn Bezalel (Reference Bezalel2021) have claimed that there are important analogies between the structure of religious worldviews and conspiracy theories. This paper focuses on two things: First, we discuss whether there are indeed substantial and interesting similarities between conspiracy theories and religious worldviews. We will argue that an element shared by theistic religions and conspiracy theories is an emphasis on holistic thinking and personal explanations.Footnote 1 Following that, we focus on the consequences of the analogy between conspiracy theories and religious worldviews: we first highlight the Generalism vs. Particularism debate about conspiracy theories. According to the generalist view, all conspiracy theories share certain deficiencies that justify a prima facie (or pro tanto) suspicion toward them. We will argue that the generalist view is not convincing, at least if a non-evaluative definition of “conspiracy theory” is assumed. We claim instead that it can be epistemically vicious to have a dismissive attitude toward conspiracy theories. This, in turn, will enable us to infer that, contrary to a widespread impression, the analogy with conspiracy theories does not validate a general suspicion about religious worldviews. At the end of the paper, we will critically discuss Bezalel’s suggestion that both conspiracy theories and religious worldviews can be explained by the theory of bliks, which was introduced by R. M. Hare (Flew, Hare, and Mitchell Reference Flew, Hare, Mitchell, Flew and MacIntyre1955, 99–103). Before we begin with the discussion of the relation between conspiracy theories and religious worldviews, we will delineate what we mean by “conspiracy theory.”
2. What is a conspiracy theory?
As a definition of “conspiracy theory,” we propose the following: A conspiracy theory is an explanation of one or more events in terms of the significant causal agency of a group of persons – acting in secret and systematically trying to conceal their influence on the respective event(s).
This is very close to Brian Keeley’s (Reference Keeley1999, 116) definition. One difference is the last part about the intention to systematically conceal the influence on the respective events. This seems to us to be an important addition, since without it, every meeting that is not open to the public would qualify as a conspiracy. It is not crucial for a conspiracy that there are attempts to conceal the existence of the conspiring group.Footnote 2 The CIA or the Masons are secretive, but not secret, organizations. Yet their leaderships are clearly not per definitionem unable to conspire. Rather, what is necessary for a conspiracy is the intention of the respective group to keep some kind of activity secret.Footnote 3
Compared to other proposals in the literature, our definition is rather parsimonious with respect to the criteria that constitute a conspiracy theory. First, we do not hold that a group of conspirators has to be relatively small.Footnote 4 When one day it dawned on Truman Burbank that everything and everyone he knew was part of a gigantic film set, he was about to posit a conspiracy theory that involved almost every adult human being on Earth. Not only did thousands of producers, actors, and extras actively conceal the true character of Truman’s surroundings and interactions, but billions of other people were in on it, too, and in most cases, complicit in his ordeal.Footnote 5
Neither do we think that the conspirators need to have nefarious motives or aims.Footnote 6 There may be a moral presumption against acting in secret and, accordingly, a special moral obligation to justify such behavior.Footnote 7 However, this obviously does not mean that such a justification cannot sometimes be obtained. The 20 July plot to assassinate Hitler, e.g., is routinely referred to as a “conspiracy” by scholars and laypersons alike, though very few would argue that the plot was ethically unwarranted, let alone based on nefarious motives.Footnote 8
Finally, we do not think that conspiracy theories must always run counter to the official story.Footnote 9 What counts as the official story differs strongly across places and times. If this criterion were added to our definition, the claim that Alexander Litvinenko was poisoned and killed by FSB agents in 2006 would, being contrary to the official story in Russia, constitute a conspiracy theory there, whereas it wouldn’t constitute one in the Western hemisphere. The conspiracy theory that in June 1972 the Democratic National Committee headquarters was burglarized on behalf of Richard Nixon and/or his senior staff would have ceased to be a conspiracy theory with Nixon’s resignation in August 1974 (or some time before), and so on.
We do not believe that such terminological fluctuations are desirable outcomes of defining “conspiracy theory.”Footnote 10 Incorporating opposition to the official narrative as a part of this definition also appears to rely on selective examples. Today, conspiracy theorists are often seen as members of small groups holding unconventional beliefs. Michael Butter argues that the pathologizing of conspiracy theories, particularly in intellectual circles, developed from the 1930s through the 1960s, peaking with Richard Hofstadter’s “The Paranoid Style in American Politics,” which continues to influence research on conspiracy theories. According to Butter, before the mid-20th century, political and intellectual authorities did not generally pathologize conspiratorial narratives.Footnote 11 Therefore, opposition to the official story may merely be a contingent aspect of many of the most topical contemporary conspiracy theories.
3. Conspiracy theories and religious worldviews
3.1. Analogies between conspiracy theories and religious worldviews
There are many ways in which conspiracy theories and religion can be related. There are, first, conspiracy theories within religions. Members of religions may disproportionately adhere to certain conspiracy theories, for example, about the COVID-19 pandemic. Sometimes their religion itself will be involved in such theories, for example, when believers complain about secretly orchestrated smear campaigns against their faith. Second, there are conspiracy theories about religions. Most radically, a conspiracy may be seen as the founding event of a religion: According to the German Enlightenment philosopher Hermann Samuel Reimarus, the Christian apostles “enacted a spiritual coup d’etat” “by fabricating the resurrection of Jesus and his soon-to-be-expected return from heaven.”Footnote 12 A religion may also be viewed as being clandestinely hijacked by deceitful leaders. For instance, Dostoyevsky’s Grand Inquisitor confessed: “We are [no longer] with you [Jesus], but with him [Satan], there is our secret! […] Quietly they will die, quietly they will fade away in your name, and beyond the tomb will find only death. But we shall preserve the secret and for the sake of their happiness will lure them with a heavenly and eternal reward.” (Dostoyevsky Reference Dostoyevsky1993, 335; 338).
In the following, we focus neither on conspiracy theories that occur within religions nor on those about religions. Instead, we analyze in what sense religions, or more specifically, religious worldviews show analogies with conspiracy theories. Since we are interested in epistemological questions, we will focus on epistemologically relevant analogies. Hence, we will not take into account possible analogies between the motivations for holding religious beliefs and conspiracy theories.Footnote 13 As was already mentioned in the introduction, Popper not only initiated the debate about conspiracy theories, he also directly connected them to religious worldviews. According to Popper, Homer portrayed the gods as acting and plotting in secret and as being the ones who are responsible for the course of events. Today, conspiracy theorists consider powerful groups of people acting in secret to be in charge of what happens. For Popper, the “conspiracy theory of society” is a secular form of religious superstition. People who postulate such a conspiracy theory about society think that historic events or social practices are the direct result of intentional actions of individuals or groups. This is, according to Popper, a naïve idea that shows a lack of understanding of the way the social world works because many of the things that happen in the social world are just unintended consequences of actions or political measures (see Popper Reference Popper2020, 306f.).
Keeley (Reference Keeley1999, 123–126) argues in a similar fashion to Popper when he claims that conspiracy theorists are some of the last believers in an ordered universe. In an earlier time, this order was seen as established by God or other supernatural agents.Footnote 14 Today, nefarious agents and groups that act in secret are purportedly responsible for the (dis)order of the world we experience. For Keeley, conspiracy theorists rely on the false assumption that the events of the social world are capable of being controlled. However, there are simply too many free agents involved to exercise the control that conspiracy theorists assume. Keeley does not claim that, if we give up conspiratorial thinking, we have to see the world as a random, chaotic place. Rather, what is missing is (personal) meaning and significance. Hence, by “order,” Keeley does not refer to just any kind of causal order but to an intentionally established order that has a purpose. We should, according to Keeley, be careful not to over-rationalize the world and seek purposive explanations where they do not exist. The opposite of conspiratorial (and religious) thinking is, for Keeley, absurdism or nihilism, according to which the world is essentially meaningless and irrational.
Keeley remarks that the problem of conspiratorial thinking might be a “psychological one of not recognizing when to stop searching for hidden causes” (Reference Keeley1999, 126). This is in line with what several psychologists and cognitive scientists think. Wood and Douglas (Reference Wood, Douglas, Dyrendal, Robertson and Asprem2018), for example, claim, among other things, that religious beliefs and conspiracy theories both share the tendency to see patterns of agency where they do not exist. This phenomenon is called “agenticity” (Shermer Reference Shermer2009) or “hypersensitive agency detection” (Guthrie Reference Guthrie1993; Barrett Reference Barrett2007, 67–70; Douglas et al. Reference Douglas, Sutton, Callan, Dawtry and Harvey2016, 59–61). As examples of this tendency in both conspiratorial and religious thinking, Wood and Douglas (Reference Wood, Douglas, Dyrendal, Robertson and Asprem2018, 88f.) discuss interpretations of the tsunami that resulted from an earthquake in the Indian Ocean in 2004. Some people claimed that the tsunami was the punishment of a vengeful God. Others speculated that it was produced by the Israeli, US, or Indian governments, who detonated nuclear weapons underwater. Similar explanations from both a religious and a conspiratorial perspective were produced after the 2011 tsunami in Japan. Both the religious and the conspiratorial explanations rest on the mistake of postulating personal explanations where these are inappropriate. Adherents of such explanations do not want to accept non-intentional explanations of the tsunamis. They seem to have problems admitting that those events were not caused by an agent but by the movement of tectonic plates. Justin Barrett (Reference Barrett2007, 68) explains such hypersensitive agency detection by its advantage for evolutionary adaptation. For survival, it is better to assume agency where none exists than to miss agency where it is present. Michael Shermer (Reference Shermer2009) argues along these lines to explain our tendency to overattribute patterns in general. He claims that false positives are less evolutionarily damaging than false negatives. If one interprets the rustle of the grass as a predator approaching when, in fact, it is just the wind, this does not have many negative consequences. If, on the other hand, one misinterprets the rustling as resulting from the wind when, in fact, it comes from a predator approaching, this can easily be fatal.
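The asymmetry behind this error-management argument can be made explicit with a simple expected-cost comparison (our own schematic illustration; neither Barrett nor Shermer presents it in exactly these terms). Let $p$ be the probability that a given rustle is caused by a predator, $C_{FP}$ the cost of a false positive (needlessly fleeing from the wind), and $C_{FN}$ the cost of a false negative (ignoring an actual predator). A policy of always assuming agency has an expected cost of $(1-p)\,C_{FP}$, whereas a policy of never assuming agency has an expected cost of $p\,C_{FN}$. The first policy is preferable whenever
\[
(1-p)\,C_{FP} < p\,C_{FN}, \quad \text{i.e. whenever } p > \frac{C_{FP}}{C_{FP}+C_{FN}},
\]
and since the cost of ignoring a predator vastly exceeds the cost of a wasted flight response, this threshold is tiny. Under these assumptions, a bias toward detecting agency is the evolutionarily better policy even when predators are rare.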
Similarly, van Prooijen et al. (Reference van Prooijen, Douglas and De Inocencio2018) tried to show that both people who are prone to believe in conspiracies and people who believe in the supernatural have a tendency toward illusory pattern perception. This means that they attribute patterns to randomly generated stimuli. The authors write:
Illusory pattern perception occurs when people mistakenly perceive randomly generated stimuli as causally determined through a nonrandom process, and hence as diagnostic for what future stimuli to expect (van Prooijen et al. Reference van Prooijen, Douglas and De Inocencio2018, 321).
The outcome of their studies supported the assumption that “illusionary pattern perception is a basic cognitive aspect of the conspiracy and supernatural beliefs under investigation here” (van Prooijen et al. Reference van Prooijen, Douglas and De Inocencio2018, 331). Still, the empirical evidence for a correlation between pattern perception and conspiracy belief is not straightforward. An earlier study (Dieguez et al. Reference Dieguez, Wagner-Egger and Gauvrit2015), for example, found no significant correlations between pattern perception and conspiracy beliefs. Moreover, the description of illusory pattern perception by van Prooijen et al. is problematic. If we ignore quantum mechanics and stay at the macrophysical level, every physical process is assumed to be causally determined. The physical or geological explanations of the tsunamis from 2004 and 2011 do not construe them as random events. Even if we, now, cannot reliably predict events like these, they are, nonetheless, the results of structured and nonrandom processes. Looking for patterns in the empirical data is what every scientist does. It is true that we should be careful about which patterns we use for explanations. Still, the decision is not between causally determined processes and mere randomness, but between different candidates for a causal explanation. Hence, it seems mistaken to take increased pattern perception in general as a candidate that separates adherents of conspiracy theories from, for example, adherents of generally accepted scientific explanations. A more promising candidate is what was discussed under “agenticity” or “hypersensitive agency detection.” Adherents of conspiracy theories typically assume an intentionally established order and not just an order established by the laws of physics. Plausibly, this feature is shared by theistic worldviews.
Another candidate for an analogy between conspiratorial and religious thinking is the overemphasis on holistic thinking at the expense of analytic thinking. People who tend to reason holistically focus more on the “big picture” and try to relate several events to a common cause. In analytic thinking, by contrast, the individual parts of a larger whole are carefully examined. The overemphasis on holistic over analytic thinking is, according to some studies, positively correlated with conspiratorial thinking (see, e.g., Swami et al. Reference Swami, Martin Voracek, Tran and Furnham2014) and religious beliefs (see, e.g., Pennycook et al. Reference Pennycook, James Allan Cheyne, Koehler and Fugelsang2012). Furthermore, some psychologists found evidence for the hypothesis that belief in conspiracy theories is negatively correlated with performance in probabilistic reasoning. When somebody has problems with probabilistic reasoning, a conjunction of events is too readily taken to be more than a coincidence and hence to demand a common cause. The chances that events co-occur without there being a causal connection are systematically underestimated in these cases. Wood and Douglas (Reference Wood, Douglas, Dyrendal, Robertson and Asprem2018, 93) see an analogy to religious beliefs, as there are studies that found a negative correlation between religiosity and general reasoning ability (see, e.g., Hergovich and Arendasy Reference Hergovich and Arendasy2005).
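The point about probabilistic reasoning can be made concrete with a toy calculation (the numbers are our own and purely illustrative). Suppose two independent events each occur on a given day with probability $p = 0.01$. The probability that they coincide on that particular day is $p^2 = 10^{-4}$, which looks like a striking conjunction crying out for a common cause. Yet over $n = 3{,}650$ days, i.e. roughly a decade, the probability that such a coincidence occurs at least once is
\[
1 - \left(1 - 10^{-4}\right)^{3650} \approx 0.31,
\]
so an apparently one-in-ten-thousand conjunction has about a 30% chance of happening at least once by chance alone. Whoever underestimates such base rates will find common-cause explanations, including conspiratorial ones, more compelling than the evidence warrants.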
There are, of course, also important disanalogies between religious worldviews and conspiracy theories. Wood and Douglas (Reference Wood, Douglas, Dyrendal, Robertson and Asprem2018), as well as Franks et al. (Reference Franks, Bangerter and Bauer2013), for example, stress that in contrast to many beliefs in conspiracy theories, religious beliefs do lead to more civic participation and prosocial behavior. Furthermore, at least for monotheistic religions, it does not seem apt to speak of conspiratorial beliefs or doctrines. As Keeley (Reference Keeley2007, 140f.)Footnote 15 stressed correctly, it is conceptually questionable to use the term “conspiracy” if only one entity such as a god is acting in secret.Footnote 16
In sum, we think that the epistemically relevant and most plausible candidates for an analogy between religious worldviews and conspiracy theories are assumptions of agency, insistence on personal explanations, and an emphasis on holistic reasoning. Of course, a focus on agency in the explanations is only present within theistic religions. A tendency for holistic thinking, however, might also be attributable to non-theistic religions such as certain strands of Buddhism. Now, the epistemically relevant analogies stressed by philosophers and psychologists are all portrayed as epistemic vices. Agenticity and hypersensitive agency detection are defined as biases to assume purposeful activity where there is none. Also, the emphasis on holistic thinking is portrayed as a deficiency in analytic thinking as well as probabilistic reasoning. Most authors look for common epistemological biases or vices that are shared by both adherents of religious worldviews and conspiracy theories. Both forms of explanations are considered mistaken, in principle, or at least epistemologically suspicious. In the following, we want to challenge this assumption. To do so, we first look at the debate about Generalism and Particularism concerning conspiracy theories.
3.2. Generalism and particularism about conspiracy theories
Joel Buenting and Jason Taylor (Reference Buenting and Taylor2010, 568f.) introduced the terminological distinction between the generalist and the particularist view about conspiracy theories. According to the generalist view, believing in conspiracy theories can be classified as irrational in general, without assessing the claims of particular conspiracy theories. On the particularist view, in contrast, every conspiracy theory has to be assessed individually in order to decide whether believing it is irrational or not. Hence, for adherents of the particularist view, we are never warranted in presuming that believing a conspiracy theory is irrational just because it is a conspiracy theory.
Now, we think that Buenting and Taylor’s characterization of the generalist view is too strong. Only a few authors claim that belief in conspiracy theories is always irrational simply in virtue of its subject matter, i.e. conspiracies. Most authors make the concession that there are also some conspiracy theories that turned out to be true and some conspiracy theorists who do not commit fallacies and cannot be accused of intellectual vices.Footnote 17 Unless you engage in conceptual engineering and define conspiracy theories as epistemically vicious,Footnote 18 it is hard to deny that there are some conspiracy theories that are at least not outright nonsense. Still, we think that the distinction between Generalism and Particularism has explanatory value if we use a weaker version of Generalism. With Matthew Dentith (Reference Dentith2021, 9901), we understand Generalism as the position that one is justified in being prima facie suspicious of and skeptical about conspiracy theories. Hence, in our terminology, Generalists advocate a presumption that conspiracy theories are unwarranted; i.e. they assume that it is rational to dismiss conspiracy theories out of hand because, as Quassim Cassam (Reference Cassam2015) puts it, they are “mostly bunkum.” The few exceptional conspiracy theories that turned out to be true do not defeat such a Generalism. As long as they are only rare exceptions, the general presumption against the rationality of conspiracy theories remains warranted. Thus, conspiracy theories are understood as (epistemically) guilty until proven innocent. Particularists, in contrast, deny that such a presumption against conspiracy theories is justified. In the following, we present some arguments that have been used to promote a general skepticism or suspicion towards conspiracy theories, i.e. Generalism in our terminology.
Karl Popper’s (Reference Popper2020) characterization of conspiracy theories was already delineated above. He claims that conspiracy theories (or, as he writes, “the conspiracy theory of society”) can in general be refuted because they all rest on the mistaken presupposition that a small group of people could control the social world. Although conspiracies occur, it is very rare that they turn out to be successful: the conspirators simply cannot exercise the control over the social world that they intend to. Hence, believing conspiracy theories appears to be almost always unjustified.
Steve Clarke (Reference Clarke2002) suggested that adherents of conspiracy theories can be accused of the “fundamental attribution error” since they overemphasize dispositional at the expense of situational explanations. Dispositional explanations appeal to the dispositions of particular persons in order to explain the respective phenomena, whereas situational explanations focus on factors that influence the situation but are not part of any person’s dispositions. Some psychologists claim that the overestimation of the importance of dispositional factors is widespread in human reasoning.Footnote 19 According to Clarke, the fundamental attribution error is especially characteristic of conspiracy theorizing.Footnote 20
Neil Levy (Reference Levy2007) argues that it is irrational to reject the official stories since, in most cases, we do not have enough expertise to adequately evaluate the bigger picture. We often fall victim to an “illusion of explanatory depth.” We think we can explain phenomena such as a rainbow or how a car works but are in fact unable to offer precise and compelling explanations for these phenomena. According to Levy, “we constantly underestimate the extent to which our knowledge depends upon our location in the socially distributed network of epistemic authorities” (Levy Reference Levy2007, 190). We should trust these epistemic authorities simply because we have neither the ability nor the resources to come to better and more plausible conclusions than they do. Since conspiracy theories very often run counter to the official story,Footnote 21 it follows, according to Levy, that it is in most cases irrational to believe them.
Cassam (Reference Cassam2015; Reference Cassam2016) argued that typical adherents of conspiracy theories (he discusses a fictional character called “Oliver” who believes that 9/11 was an inside job) can be accused of intellectual vices such as gullibility, cynicism, carelessness, and closed-mindedness. They are, according to Cassam, gullible concerning the sources they build their conspiracy theories from and cynical about more legitimate sources of information. They lack the intellectual virtues of discernment, humility, and carefulness. He also argues that the general propensity to think in conspiracist terms is itself an intellectual vice (see Cassam Reference Cassam2016, 172).Footnote 22
Keith Harris (Reference Harris2018, 248–252) accuses adherents of conspiracy theories of, among other things, a fallacious probabilistic use of modus tollens. The classical inference of modus tollens says that if (p → q) is true, and q is false, we can infer that p is false, too. The probabilistic counterpart starts, for example, with “if p is true, then q is highly improbable”. Now, if q happens to be true, we may be tempted to infer that p is highly improbable. However, this inference is not valid. Harris illustrates this with a lottery example. If the lottery is fair, it is highly improbable for any given participant to win. Still, if a specific person wins, that does not give us a good reason to think that the lottery was probably not fair. Analogously, Harris argues that adherents of conspiracy theories use errant data, i.e. data that are improbable given the official account,Footnote 23 to infer that the official account is (probably) false. According to Harris, this kind of fallacy is characteristic of conspiracy theorists.Footnote 24
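Why the probabilistic analogue of modus tollens fails can be spelled out in a schematic Bayesian version of the lottery case (the rendering and the numbers are ours, not Harris’s). With $N = 1{,}000{,}000$ tickets, $P(\text{Smith wins} \mid \text{fair}) = 10^{-6}$. But a rigged lottery would presumably be rigged in favor of some particular beneficiary, so, absent any independent reason to suspect Smith, $P(\text{Smith wins} \mid \text{rigged})$ is of the same tiny order. By Bayes’ theorem,
\[
P(\text{rigged} \mid \text{Smith wins}) = \frac{P(\text{Smith wins} \mid \text{rigged}) \, P(\text{rigged})}{P(\text{Smith wins})} \approx P(\text{rigged}),
\]
so the improbable outcome barely shifts the posterior. Errant data tell against the official account only if they are markedly more probable under the conspiratorial alternative, not merely improbable under the official story.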
Particularists, who seem to have become the majority in the philosophical debate about conspiracy theories, criticize both the indiscriminate dismissal of and the epistemic presumption against conspiracy theories.Footnote 25 We cannot discuss all features that are candidates for disqualifying conspiracy theories in general. In fact, the plausibility of particularism does not primarily rest on the refutation of all these candidates. Instead, the main argument of particularists consists in the overwhelming number of examples of conspiracy theories that have been rationally justified and turned out to be true. Matthew R.X. Dentith (Reference Dentith2021, 9900–9902) points out that there is a danger in choosing one-sided examples of conspiracy theories. If we mainly focus on obviously unjustified and false conspiracy theories, such as the ones put forward by David Icke or Alex Jones, then this affects our attitude toward conspiracy theories in general. We are then in danger of making false generalizations. Considering a wide range of examples of conspiracy theories, Charles Pigden (Reference Pigden, Loukola and Donskis2022; Reference Pigden, Lippert-Rasmussen, Brownlee and Coady2017) claimed that every historically or politically literate person is a conspiracy theorist because she believes in several conspiracies that are reported throughout history and the media. It is, according to Pigden, intellectually vicious to be a systematic skeptic about conspiracy theories. He illustrates this with several cases of conspiracies that turned out to be true or are at least quite plausible. An often-used example of a true conspiracy theory is the Watergate affair. But there are plenty of others. Just a few examples: Operation Menu, from 1969 until 1970, was a covert bombing campaign against Cambodia by the US military. It was illegal since Cambodia was protected as a neutral country by the 1954 Geneva Conference. The bombing, which was supposed to hit fighters from the North Vietnamese Army and the Viet Cong, was kept secret from Congress, the press, and parts of the military itself. Hence, there was obviously a conspiracy in place here. The CIA also tried to hide its involvement in the overthrow of the Chilean President Salvador Allende in 1973, who was replaced by General Augusto Pinochet. Between 1981 and 1986, the US government secretly, and against a decision of Congress, used money it earned from illegally selling weapons to Iran to support the Contras, a rebel group in Nicaragua. More recently, it transpired that Exxon had covered up its own research, which had confirmed the reality of global warming since the 1970s, and that Volkswagen systematically and, of course, secretly and illegally installed software in its cars that reduced their pollutant emissions during tests. Furthermore, that Iran conspired with Hamas and trained some of its fighters for the attack against Israel on October 7th, 2023, is not a far-fetched hypothesis. Similarly, that Alexei Navalny was poisoned by Russian agents in 2020 appears to be a very plausible conspiracy theory. Pigden (Reference Pigden, Lippert-Rasmussen, Brownlee and Coady2017, 130f.) points out that most assassinations, terrorist operations, corruption of politicians, tax avoidance, insider trading, and other illegal activities involve conspiracies of some sort. Hence, if we assume that such things happen, we also assume many conspiracies. Pigden concludes from this that a skeptic about conspiracies, i.e.
a person who entertains a strong presumption against belief in conspiracy theories, commits intellectual suicide since she would render large parts of the social realm unintelligible. She would see only the effects but would, in several cases, block quite salient inferences about their causes. This would, according to Pigden, be epistemically vicious.
Now, the attitude toward conspiracy theories rests not only on the examples discussed but also on the definition of “conspiracy theory” that is being used. In section 2, we proposed a non-evaluative definition according to which a conspiracy theory is an explanation of an event in terms of the significant causal agency of a group of persons – acting in secret and systematically trying to conceal their influence on the respective event. Hence, according to our definition, Pigden is right that you cannot be a politically and historically literate person without believing in several conspiracy theories.
Such a definition, according to which theories about conspiracies are conspiracy theories, was recently criticized by M. Giulia Napolitano (Reference Napolitano, Bernecker, Flowerree and Grundmann2021). She claims that such a non-evaluative definition is revisionary and a form of conceptual re-engineering, i.e. an intentional change of the meaning of the term “conspiracy theory” that is neither warranted nor fruitful. It is revisionary because in ordinary language “conspiracy theory” has, according to Napolitano, a negative valence that is lost in this dispassionate definition. Furthermore, she argues, among other things, that a non-evaluative definition leads to misunderstandings among researchers. Cognitive scientists, sociologists, and psychologists are mainly interested in the irrational forms of belief in conspiracies. Adherents of the broad definition accuse scholars from disciplines other than philosophy of pathologizing belief in conspiracies, which, according to Napolitano, has in turn led to a hostile intellectual climate.Footnote 26 Therefore, she thinks we should look for another definition and suggests that we identify a conspiracy theory with “a distinctive way of holding the belief in the existence of a conspiracy, namely, one that is self-insulated” (Napolitano Reference Napolitano, Bernecker, Flowerree and Grundmann2021, 86). This definition is narrower than the non-evaluative one, as not all theories about or beliefs in conspiracies would count as conspiracy theories. Furthermore, the definition has a negative valence built into it because self-insulation is epistemically vicious. This form of re-engineering is, according to Napolitano, more fruitful than the non-evaluative definition since it captures what we have in mind when we talk about conspiracy theories in ordinary language and avoids misunderstandings between the different disciplines researching the phenomenon of conspiracy theorizing.
It is, from a methodological point of view, of course legitimate in principle to conceptually re-engineer “conspiracy theory” as Napolitano suggested. Whether her definition is indeed more fruitful for research, however, is up for debate. Additionally, it is not obvious to us that our non-evaluative definition is itself, as Napolitano claims, a form of re-engineering. If negative valence is just a common association with conspiracy theories, then a non-evaluative definition is not a change of meaning. Consider cases where people are hostile toward scientific disciplines because they challenge their worldview or because they enable human beings to develop things such as nuclear weapons or clones. It seems wrong that a dispassionate definition of, say, nuclear physics or genomics that does not involve a negative valence would have to be classified as conceptual re-engineering in these cases. Pointing out positive aspects of said disciplines, such as massive progress in medicine, might encourage these people to re-evaluate their attitudes toward them. No change of meaning would be involved, however. Similarly, nowadays for many people, the term “religion” has, for several reasons, a negative valence. Would pointing out that there are also examples of religions that are not in tension with science, and that promote general well-being and a form of emancipation, amount to an attempt at conceptual re-engineering? It seems not. Again, changing the evaluation of a term’s referent is not necessarily a change in the term’s meaning.Footnote 27 Finally, Napolitano’s suggestion seems to be odd for semantic and syntactic reasons. It would follow from her re-engineering that a conspiracy theory is not a theory about a conspiracy but a self-insulated belief in a conspiracy, which would be in tension with the compositional structure of the expression “conspiracy theory.” Furthermore, it would also follow from her definition that it is ill-formed to speak of belief in a conspiracy theory since a conspiracy theory is already a form of belief. Hence, believing a conspiracy theory would be believing a belief, which is odd. Therefore, her definition has the strange consequence that we should not say that somebody believes a conspiracy theory. Of course, Napolitano is entitled to define terms as she sees fit. However, we remain skeptical about how fruitful her proposal is and stick with the non-evaluative definition delineated in section 2.
In sum, we agree with the particularist view that there is nothing suspicious or epistemically vicious about conspiracy theories as such. Still, we also agree with generalists (and particularists) that there are many unjustified conspiracy theories. Believing, for example, that Bill Gates intentionally caused the COVID-19 pandemic to reduce the world population or that the condensation trails left in the sky by aircraft consist of chemical substances meant to poison the population is obviously highly irrational. People who believe in these conspiracy theories have self-insulated beliefs that are largely immune to revision. It is correct to attribute to them several epistemic vices such as gullibility, carelessness, or closed-mindedness. Still, the fact that there is a class of irrational conspiracy theorists does not allow us to infer that belief in conspiracy theories should in general be presumed irrational and baseless.Footnote 28
Focusing only on epistemically vicious forms of believing conspiracy theories is also in danger of concealing that it can be epistemically vicious, too, to presume the falsity of conspiracy theories. For example, if investigative journalists had simply dismissed the (then unproven) claims that Volkswagen was manipulating its cars’ emissions because they constituted a conspiracy theory, then the fraud might never have come to light. Likewise, the Watergate conspiracy might have remained secret if the journalists had not given some credibility to the theory that Nixon and his team were conspiring to monitor the Democratic party. Treating conspiracy theorizing as irrational or epistemically vicious until proven innocent would discourage civic vigilance over activities that are intended to remain hidden from the public and may, therefore, ultimately deprive us of an important resource to expose illegal and immoral behavior. Basham and Dentith (Reference Basham and Dentith2016, 13) even claim that conspiracy theorizing is essential to the functioning of democracies and ethically responsible societies. Seeing too many patterns and intentional actions in the world is mistaken. But it is also mistaken to deny all patterns of intentional action in favor of mere coincidence. David Coady introduces the term “Coincidence theorist”:
Coincidence theorists are people who fail, as it were, to connect the dots; who fail to see any significance in even the most striking correlations (Coady Reference Coady2012, 127).
Coincidence theorists are irrational because they do not allow for inferences to intentional action where there are patterns that strongly speak against a mere coincidence of events. Hence, there is both the danger of postulating conspiracies where in fact there are none and the danger of overlooking conspiracies that do in fact occur. It is not clear that one of these dangers is significantly greater than the other.Footnote 29
3.3. Consequences for the analogy between religious worldviews and conspiracy theories
As was argued in section 3.1 above, there are epistemically relevant analogies between religions, in particular theistic worldviews, and conspiracy theories. In both cases, there is a focus on personal explanations that give events in the world meaning, significance, and purpose. Additionally, religious worldviews certainly emphasize holistic thinking since they address the meaning of life and of the world in general. Now, both analogies, a focus on explanations involving intentions as well as an emphasis on holistic thinking, were delineated by authors who consider conspiracy theories and religious worldviews as something at least dubious if not outright epistemically vicious. As was mentioned above, hypersensitive agency detection is defined as the tendency to postulate agency where none exists. Adherents of religious worldviews and conspiracy theorists are characterized as people who over-rationalize the world by seeking personal and holistic explanations where they are clearly inapt.
Taking into account the results of our discussion about Generalism and Particularism, we should be more careful about which conclusions we draw from the analogies between religious worldviews and conspiracy theories. If we assume the particularist view, which, as we have argued above, appears to be the most plausible one, then both aspects, i.e. looking for personal explanations and an emphasis on holistic thinking, could also be epistemically virtuous. Or, the other way round, it can be epistemically vicious to reject personal explanations, i.e. the meaning and significance of events, out of hand. The Coincidence theorists described above are blind to patterns that it would be rational to consider seriously. In analogy to conspiracy theories, we argue for a Particularism concerning religious worldviews.Footnote 30 We think that adopting religious worldviews is not inherently irrational and that belief in theories that imply purpose, meaning, and significance can also be epistemically virtuous.
Of course, this does not mean that religious worldviews are generally epistemologically sound. There certainly are plenty of religious worldviews that are irrational because they are in direct and problematic tension with the evidence and isolate themselves from any possibility of critical reflection. Examples are religious worldviews that deny or explain away the evidence for evolution and/or archeological and geological data just to hold on to a literalist understanding of religious texts. Here we find a similar form of self-insulation, gullibility, and closed-mindedness as in some problematic forms of conspiracy theories. But just as there are conspiracy theories that are not epistemically vicious, so there are religious worldviews that are not. Many people working in philosophy of religion and theology have tried to spell out forms of religious worldviews that give meaning and significance to the world and, at the same time, account for the evidence established by the sciences. Some of these authors are adherents of rather traditional religious worldviews; some of them, such as John Schellenberg (Reference Schellenberg, Buckareff and Nagasawa2016), John Bishop and Ken Perszyk (Reference Bishop and Perszyk2023), or Philip Goff (Reference Goff2023), promote revisionary religious worldviews that crucially depart from the traditional ones. This does, of course, not show that any of these religious worldviews is true. But it seems difficult to defend the presumption that they are all irrational and can be dismissed out of hand. It is not obvious that treating the existence of the universe as a brute fact is the only rational option.Footnote 31 The available evidence underdetermines the matter here. Certainly, some options are rendered irrational by the evidence and the results of science. But other rational options beyond atheistic naturalism, at least prima facie,Footnote 32 still seem to be available. It is important to note that our argument does not depend on whether specific theistic or religious worldviews are rational, let alone true. Instead, we only put forward the much more moderate claim that the analogy between religious worldviews and conspiracy theories does not itself justify a dismissive attitude towards religious worldviews as such.
3.4. Bezalel’s theory of conspiracy theories and religious worldviews as bliks
Glenn Bezalel (Reference Bezalel2021) recently argued that the concept of “bliks,” which was introduced by Richard M. Hare, can help us better understand conspiracy theories and religious worldviews. Hare (Flew, Hare, and Mitchell Reference Flew, Hare, Mitchell, Flew and MacIntyre1955, 99–103) used the neologism blik to capture the unfalsifiable attitudes that ground our worldviews. The context of Hare’s arguments is the debate on how the positivist theory of meaning can be applied to religious language. Hare aimed to defend, in opposition to Antony Flew, the view that religious language can be meaningful even if it cannot be translated into statements that are falsifiable based on empirical evidence. To illustrate the notion of blik, Hare uses the example of a lunatic who thinks that all university teachers at Oxford want to murder him. To convince him that this belief is false, a friend of the lunatic introduces him to several dons at Oxford who are friendly and warm toward him. But this strategy does not work. The lunatic takes the behavior of the dons to be staged with the intent to deceive him: it is their “diabolic cunning” that makes them act in a friendly way towards him. The lunatic continues to think that they are all plotting against him no matter how many friendly and mild dons the friend shows him. Hare argues that the blik that all dons want to kill him is at the same time unfalsifiable and meaningful. There is, on the one hand, nothing that could happen that would convince the lunatic that he is mistaken. On the other hand, his blik certainly does make a difference in the lunatic’s life. Hence, the blik that all dons conspired to kill him is, despite not being falsifiable, a momentous thing to have. Now, Hare claims that religious attitudes are bliks, too, i.e. they are unfalsifiable and shape our worldview. Furthermore, he thinks that all people, religious or not, have bliks as the basis for their worldview. Therefore, Hare argues that Flew and other positivists are mistaken to analyze religious claims in analogy to scientific explanations. The former are, in contrast to scientific explanations, frames that ground our explanations, i.e. they form the basis for our judgments on what counts as a good explanation and what does not.
Bezalel agrees with Hare’s theory of bliks and understands them as unfalsifiable beliefs.Footnote 33 He argues that it is, among other things, also in accordance with Wittgenstein’s theory of religious beliefs. Wittgenstein claims that these are unshakeable and not the result of critical reflection or the evaluation of evidence. Instead, religious beliefs are more akin to ways of living.Footnote 34 Now, Bezalel (Reference Bezalel2021, 685–688) argues that conspiracy-theorizing worldviews should, like religious worldviews, be analyzed as bliks. Although they are not as deeply rooted as religious beliefs, beliefs in conspiracy theories can profoundly shape the way people live their lives. Permanent conspiracy theorizing creates forms of life that differ from those that are more skeptical towards conspiracy theorizing. Bezalel infers from his analysis that most of the contemporary debate about conspiracy theories is misguided because “it misses the underlying blik or foundational knowledge which shapes a person’s worldview surrounding conspiracies, whether as theoriser or sceptic” (Bezalel Reference Bezalel2021, 689). The focus of the debate should, according to Bezalel (Reference Bezalel2021, 689f.), not be on how to justify or debunk conspiracy theories because arguments and reasoning come second and follow the underlying intuitions, which he identifies with bliks. In most cases, reasoning and argumentation cannot change the underlying bliks (which are unfalsifiable). Indeed, counter-arguments often only reinforce them because the available evidence can be accommodated to the respective blik. Therefore, Bezalel thinks that we should concentrate on understanding the worldviews that surround conspiracy theories. The bliks underlying religious and conspiratorial worldviews are not direct objects of rational debate but ground the way people reason. With Hume, Bezalel claims that reason follows intuition and affection – not the other way around.
We think that there are several problems with Bezalel’s argumentation. First, there is a certain ambiguity about what he takes to be the bliks in the case of conspiracy theories. The title of his paper is “Conspiracy Theories and Religion. Reframing Conspiracy Theories as Bliks.” This suggests that he wants to argue that conspiracy theories are bliks. This is also in line with his comparisons to Hare’s example of the lunatic. In this case, the blik is the conspiracy theory that every don at Oxford wants to kill him. But further down in the paper, Bezalel argues that bliks are more general frames than specific conspiracy theories. Drawing on Coady (Reference Coady2007), he differentiates between conspiracy theorists, coincidence theorists, and institutional theorists. Conspiracy theorists tend to see conspiracies behind events, whereas coincidence theorists are unwilling to connect the dots and discern patterns in the events. Institutional theorists concede that there are patterns in the events but relate them to institutions such as markets or governments.Footnote 35 Now, Bezalel argues that all these worldviews or styles of thinking are bliks (see Bezalel Reference Bezalel2021, 685). We think that both readings, the general and the specific one, face problems. If we assume the specific reading and treat individual conspiracy theories themselves as bliks, it is not clear how Bezalel’s theory can deal with what he himself calls the “ambiguity” of conspiracy theories.Footnote 36 As was mentioned above, there are, obviously, many conspiracy theories that appear to be epistemically deeply flawed. Believing them seems to be epistemically vicious. If we assume the specific reading, this conclusion appears to be misguided. Conspiracy theories are then just unfalsifiable bliks that differ from the bliks “normal” people have. Hence, even adherents of conspiracy theories that are almost universally regarded as preposterous make no mistakes in their reasoning. They just frame the world in a different way. Therefore, we should, according to Bezalel’s approach, rather try to understand their worldviews than look for faults in their reasoning. We think that this consequence is difficult to defend. People who believe that the COVID-19 pandemic was just a hoax to control the world population or that the moon landing was faked seem to make epistemic mistakes. They can be accused of gullibility, narrow-mindedness, and inconsistency in the way they evaluate the information they gather. They are skeptical and hypercritical toward relevant epistemic authorities and naïve about dubious sources they find, for example, on the internet. Additionally, they must assume an unrealistic degree of control exerted by a small group of people over many individuals in huge organizations. In short, it seems clear that it is perfectly meaningful to discuss the epistemological flaws in the reasoning of some conspiracy theorists. If their conspiracy theories were bliks, such a discussion would be misplaced. The different conspiracy theories would be, as Bezalel has put it with reference to Wittgenstein, different language games or “forms of life” with no shared ground on which to dispute the rationality of their respective approaches (see Bezalel Reference Bezalel2021, 687).
Furthermore, Bezalel seems to confuse the psychological or sociological question of how to convince people that their conspiracy theories are wrong with normative epistemological questions. That we can only seldom convince conspiracy theorists of the problematic sort by explaining their epistemic mistakes to them is indeed a frustrating experience. It is correct that direct confrontation can even lead to a reinforcement of their position. But this does not alter the fact that their reasoning is flawed. Bezalel uses the example of a climate change denier. He claims with Kahan et al. (Reference Kahan, Ellen Peters, Paul Slovic, Braman and Mandel2012) that accusing the denier of epistemic vices would not help to convince her. The person is embedded in a community with group values and will interpret the data of science in a way that fits the group values. This might be correct but does not change the fact that the way the respective persons deal with epistemic authorities and probabilistic reasoning can be critically and normatively evaluated. Understanding the mistakes somebody makes does not imply that we also have tools to convince her that she makes these mistakes. These are different matters that should be kept apart.
What about the second reading of bliks as general patterns of reasoning? Our problem with this reading is that bliks understood this way are too vague and broad to have significant explanatory value. There are few if any people who are strict coincidence, conspiracy, or institutional theorists. Most people are something in between and decide which explanations fit best on a case-by-case basis. It is highly speculative to infer that a person, in general and due to a character trait, posits conspiracies and personal explanations rather than coincidences or the “agency” of institutions. It seems to be wrong to replace the debate about adequate and epistemically virtuous reasoning with speculative psychological questions about general character traits of people. Furthermore, even if we could detect such character traits relevant to the way a person reasons, we do not think that this is sufficient to explain why she believes or disbelieves specific conspiracy theories. For that, we would have to take into account several other factors such as the peer group the person is part of, the education she had, the values she shares, or her political orientation. Hence, even if we engage in the more psychological enterprise of trying to understand why people come to believe conspiracy theories, such general bliks as they were delineated by Bezalel have only very limited explanatory value. Finally, the general reading seems to be inconsistent with precisely the consideration of Coady that Bezalel claims to illuminate with the concept of blik. Coady (Reference Coady2007, 196–198; 203) introduces the different styles of thinking to show that the extremes are all irrational. A radical conspiracy theorist is as irrational as someone who tries to explain all events by the impersonal forces of institutions or dismisses them as mere coincidences. However, for Bezalel, it seems to be mistaken to classify bliks as rational or irrational.Footnote 37 They constitute different language games or forms of life and can, therefore, not be evaluated according to an external standard such as rationality. This is in direct tension with Coady’s argumentation.
Interestingly, Bezalel (Reference Bezalel2021, 687f.) introduces a distinction between pure and impure bliks that was made by Howard Horsburgh (Reference Horsburgh1956), which might resolve the tension between the two readings. Pure bliks are general and broad frameworks for understanding how the world functions. Impure bliks are narrow claims about specific phenomena. The latter are “artificially unfalsifiable” since the believer twists the facts and works with auxiliary hypotheses to avoid falsification. Now, Bezalel describes the lunatic as having an impure blik: the lunatic simply protects his belief with all kinds of auxiliary hypotheses. We are skeptical about how much this distinction helps Bezalel and whether it is even consistent with his argumentation. Applying this distinction, many beliefs in conspiracy theories would come out as impure bliks, i.e. beliefs that are made unfalsifiable in an epistemically problematic way, involving a form of epistemic self-insulation. But this implies that debates about whether specific conspiracy theories are epistemically virtuous or vicious are appropriate after all. The epistemically vicious conspiracy theorist just treats her beliefs as bliks, although they are not. The pure bliks, on the other hand, suffer from the problems we attributed to the general reading of bliks as styles of reasoning. They are too vague to have significant explanatory value. Unfortunately, after introducing it, Bezalel never uses the distinction between pure and impure bliks again. In sum, it remains unsettled how exactly he wants to apply the concept of blik to conspiracy theories. And no matter how we explicate this application, it is confronted with severe problems.
What does this imply for religious worldviews? We agree with Bezalel (and Wittgenstein) that religious worldviews are often deeply rooted in our thinking and shape the way we live our lives. Hence, it is not easy to convince somebody to give up her religious worldview. Before this happens, people usually try to defend it with several auxiliary hypotheses. But contra Wittgenstein and Bezalel, we do not think that this makes a religious worldview a language game or a form of life that cannot be evaluated on the basis of the usual standards of reasoning. The analogy to conspiracy theories shows the opposite. Just as there are obviously irrational forms of conspiracy theories, there are religious worldviews that are glaringly misguided from an epistemological point of view. The fact that, for psychological or sociological reasons, it is difficult to change even obviously preposterous forms of religious worldviews does not make the normative epistemological debates about these worldviews superfluous or ill-placed.
4. Conclusions
We have argued that there are substantial analogies between conspiracy theories and religious worldviews. In both cases, there is an emphasis on holistic thinking and on personal explanations of events. Both conspiracy theorists and religious people thereby attribute meaning and significance to events in the world. The world is not just seen as the result of coincidence and chance. We further claimed that the similarities with conspiracy theories do not support the view that religious worldviews are inherently irrational or suspicious. Referring to the epistemological debate about conspiracy theories, we argued that Particularism, i.e. the view that conspiracy theories must not as such be presumed to be irrational, is convincing. Indeed, there are many cases where it is epistemically vicious to refrain from conspiracy theorizing. In analogy to this debate, we argued for a Particularism about religious worldviews. They cannot be dismissed simply by virtue of being religious worldviews. Religious claims merit serious investigation and must not simply be presumed to be irrational. Finally, we argued against the view that religious worldviews and conspiracy theories should be understood as bliks that constitute a-rational frames for language games or forms of life. Although worldviews and theories might be deeply rooted in the thinking of people, they are, at least in part, susceptible to critical evaluation. If such an evaluation is blocked, this is itself an epistemic vice that should be overcome.
Funding
Jacob Hesse: Research for this article is funded by a grant from the German Research Foundation (DFG). Grant number: 490996084.