
The role of conspiracy mentality in denial of science and susceptibility to viral deception about science

Published online by Cambridge University Press:  13 August 2019

Asheley R. Landrum
Affiliation:
College of Media & Communication, Texas Tech University
Alex Olshansky
Affiliation:
College of Media & Communication, Texas Tech University

Abstract

Members of the public can disagree with scientists in at least two ways: people can reject well-established scientific theories and they can believe fabricated, deceptive claims about science to be true. Scholars examining the reasons for these disagreements find that some individuals are more likely than others to diverge from scientists because of individual factors such as their science literacy, political ideology, and religiosity. This study builds on this literature by examining the role of conspiracy mentality in these two phenomena. Participants were recruited from a national online panel (N = 513) and in person from the first annual Flat Earth International Conference (N = 21). We found that conspiracy mentality and science literacy both play important roles in believing viral and deceptive claims about science, but evidence for the importance of conspiracy mentality in the rejection of science is much more mixed.

Type
Article
Copyright
© Association for Politics and the Life Sciences 2019

Science denialism permeates society. Though adamant anti-vaxxers and resolute flat Earthers may be small in number, many more people in the United States deny climate change and/or evolution (at least 50% and 33%, respectively1). And while scientists face public denial of well-supported theories, popular culture celebrates pseudoscience: Olympic athletes engage in cupping,Reference Carter2 “gluten-free” is trending (even among those without disorders like celiac diseaseReference Doheny3), and unsubstantiated alternative medicine methods flourish with support from cultural icons like Oprah.Reference Gorski4 Governments face furious opposition to fluoridated water (which is added to prevent tooth decay5), and popular restaurant chains, like Chipotle, proudly tout their opposition to genetically modified organisms (GMOs) (see https://www.chipotle.com/gmo; scientists stress that the focus should be on the risks and benefits of each specific product rather than on global acceptance or rejection based on the processes used to make them).Reference Tagliabue6

Moreover, the emergence of social media has provided a broad forum for the famous, not famous, and infamous alike to share and crowdsource opinions and even target misinformation to those who are most vulnerable.Reference Valdez7 This allows so-called fake news to go viral.Reference Kata8 Yet who is most susceptible to denying science and/or believing misinformation? In the current study, we consider the extent to which conspiracy mentality leads people to (a) reject well-supported scientific theories and (b) accept viral and deceptive claims (commonly referred to as fake news) about science, two ways in which publics disagree with scientists.

Scientists versus publics

Why are there such gaps between what scientists have shown and what lay publics believe? One of the original models attempting to answer this question, the public deficit model,Reference Bauer, Allum and Miller9 posits that science denialism is fueled by a lack of science knowledge. In other words, if people simply understood the science, then they would accept the science. This model, however, oversimplifies a complex problem: despite the modest gains in acceptance that occur with scientific literacy, the relationship is often conditional on individuals’ prior beliefs, attitudes, values, and worldviews (e.g., their “priors”; note that we are using the term “priors” colloquially—we do not intend to refer to Bayesian priors).10 While greater scientific knowledge can increase the likelihood of accepting scientific results for some, it can increase the likelihood of rejecting those results for others—the opposite of what the deficit model envisages. For example, compared to Republicans with less science knowledge, Republicans with greater science knowledge are even more likely to reject climate change.Reference Kahan, Peters, Wittlin, Slovic, Ouellette, Braman and Mandel11 In such cases, people likely are using their knowledge and reasoning abilities to be even better at conforming evidence to fit their existing schemas,Reference Kahan, Peters, Dawson and Slovic12 a process that is part of a cluster of phenomena commonly referred to as motivated reasoning.Reference Kunda13

Conspiracy theorizing: A method of motivated reasoning

Concocting and/or endorsing conspiracy theories—that is, conspiracy theorizing—can function as a method of motivated reasoning.Reference Douglas, Sutton and Cichocka14 Conspiracy theories are explanations for situations that involve powerful agents secretly working together to pursue hidden goals that are usually unlawful or malevolent.Reference Clarke15 Although believing conspiracies is often discussed in the literature as a pathological behavior,Reference Abalakina-Paap, Stephan, Craig and Gregory16, Reference Green and Douglas17 such conspiracy theorizing can be normal.Reference Aupers18, Reference Butler, Koopman and Zimbardo19, Reference Jensen20, Reference Hofstadter21 That is, anyone might believe a conspiracy theory under the right set of circumstances. For example, Radnitz and UnderwoodReference Radnitz and Underwood22 exposed participants in an experiment to a fictional vignette that was interpreted as a conspiracy theory conditional on participants’ political views. Although the vignette contained no explicit partisan content, participants could extrapolate political cues from whether the “villain” of the vignette was described as a government institution (conservative partisan cue) or a corporate one (liberal partisan cue). As expected, when the vignette implicated a corporation, liberals were more likely to perceive a conspiracy, and when the government was implicated, conservatives were more likely to perceive a conspiracy.Reference Radnitz and Underwood22

A signature feature of conspiracy theorizing is impugning experts, elites, or other authorities or powerful institutions with corrupt motives. This is also true of science-relevant conspiracies. Some skeptics of GMOs, for instance, dismiss proponents of agricultural biotechnology as “Monsanto shills,”Reference Haspel23 and some vaccine skeptics depict vaccine advocates as “poisoning children to benefit Big Pharma.”Reference Blaskiewicz24 Similarly, some describe climate change as a conspiracy among scientists to sustain grant fundingReference Lewandowsky, Oberauer and Gignac25 or as a left-wing conspiracy to harm the U.S. economy.Reference Sussman26, Reference Uscinski and Olivella27 Questioning authorities’ motivations is rooted in heuristic processing.Reference Petty and Cacioppo28 Without having adequate knowledge to evaluate the scientific claims, nonexpert publics instead tend to evaluate the competence and motivations of expert communicators.Reference Landrum, Eaves and Shafto29 By impugning these experts with self-serving or even malevolent motivations,Reference Kahan, Jenkins-Smith and Braman30, Reference Miller, Saunders and Farhart31 individuals may justify their rejection of otherwise credible scientific evidence and resolve any cognitive dissonance.

Conspiracy mentality: A political worldview

Although there is evidence that anyone can conspiracy theorize under the right circumstances (e.g., conditional conspiracy thinkingReference Uscinski and Olivella27), there still may be a unique worldview captured by broad endorsement of conspiracy theories, or conspiracy mentality.Reference Bruder, Haffke, Neave, Nouripanah and Imhoff32, Reference Darwin, Neave and Holmes33 Conspiracy mentality (which is also sometimes called conspiracy ideationReference Lewandowsky, Oberauer and Gignac25) has been described as a political worldview consisting of general feelings of distrust or paranoia toward government services and institutions, feelings of political powerlessness and cynicism, and a general defiance of authority.Reference Hofstadter21 Traditionally, this construct has been measured by asking individuals to evaluate a collection of unrelated conspiracy theories (e.g., alternative explanations surrounding the death of Princess Diana, the origins of diseases such as HIV, and the “truth” behind the September 11 attacks on the World Trade Center) and/or generic ones (e.g., “I think that many very important things happen in the world which the public is never informed about”) and measuring either total or average agreement that the claims are likely to be true.Reference Uscinski and Olivella27, Reference Darwin, Neave and Holmes33, Reference Bruder and Manstead34 Researchers have found that the best predictor of belief in one conspiracy often is belief in other conspiracies,Reference Wood, Douglas and Sutton35, Reference Swami, Coles, Stieger, Pietsching, Furnham, Rehim and Voracek36 lending support to the view that conspiracy mentality acts as a generalized political attitude or worldview.Reference Imhoff and Bruder37, Reference Koerth-Baker38 Conspiracy mentality has been associated with a series of traits and worldviews such as an active imagination,Reference Swami, Chamorro‐Premuzic and Furnham39 paranoia and schizotypal tendencies,Reference Bruder, Haffke, Neave, Nouripanah and Imhoff32, Reference Darwin, Neave and Holmes33 high levels of anomie, and low levels of self-esteem.Reference Abalakina-Paap, Stephan, Craig and Gregory16, Reference Goertzel40

Study aims

As discussed earlier, conspiracy theories can be involved in disagreements between scientists and publics at two levels that are not mutually exclusive: (1) as a method of motivated reasoning—doubting the communicator’s credibility and suggesting a conspiracy justifies rejecting otherwise credible scientific evidence (i.e., conspiracy theorizing), and (2) as a monological belief system or political worldview in which subscribing individuals find all authorities and institutions, including scientific ones, inherently deceitful (i.e., conspiracy mentality). Our first aim focuses primarily on the latter, while our second aim focuses on both.

First, we aim to determine whether and, if so, to what extent conspiracy mentality predicts the rejection of well-supported scientific theories (i.e., anthropogenic climate change and human evolution) above and beyond the more well-studied priors of science literacy, political ideology, and religiosity. Second, we aim to examine to what extent conspiracy mentality and the aforementioned priors predict acceptance of inaccurate, deceptive, and, in some cases, conspiratorial science claims (i.e., viral deception about science). That is, who is most susceptible to accepting this type of viral deception about science?

To examine these questions, we analyze data from two samples collected as part of a broader study on alternative beliefs. The first sample consists of 513 individuals recruited by Research Now/SSI, an online digital data collection company, under a request to match census demographics. Because endorsement of conspiracies is often low in nationally representative populations, we felt that it was also important to actively recruit individuals who would be more likely to have higher conspiracy mentality scores. To that end, we recruited 21 individuals in person at the first annual Flat Earth International Conference to take our survey. Although these individuals were recruited in person, they completed the survey in the same online format as the sample from Research Now.

Aim 1: Examining who rejects well-supported scientific theories

Regarding our first aim, a handful of other researchers have examined potential links between conspiracy theorizing and science denial, specifically in the domain of climate change. Lewandowsky and colleagues surveyed climate-blog visitorsReference Lewandowsky, Oberauer and Gignac25 and an online panelReference Lewandowsky, Gignac and Oberauer41 and found that conspiracy mentality predicted the rejection of climate change and other sciences. Yet their findings have been challenged by others who state that the conclusions are not supported by the dataReference Dixon and Jones42 (also see Lewandowsky and colleagues’ response to this challengeReference Lewandowsky, Gignac and Oberauer43). Similarly, Uscinski and OlivellaReference Uscinski and Olivella27 found that the relationship between conspiracy thinking and climate change attitudes is much stronger than previously suggested and that it is contingent on people’s political party affiliation and, thus, non-monotonic. We seek to examine this question and offer the following hypotheses:

  1. H1a: Conspiracy mentality will predict rejection of well-supported scientific theories.

  2. H1b: Conspiracy mentality will predict rejection of well-supported scientific theories conditional on political party affiliation.

Aim 2: Examining who accepts viral deception about science

Regarding our second aim, we examine participants’ evaluations of viral deception (or fake news) relevant to science. Fake news has been defined as bogus or fabricated information that resembles news media content but does not adhere to journalistic norms and is promoted on social media.Reference Lazer, Baum, Benkler, Berinsky, Greenhill, Menczer and Metzger44 Kathleen Hall Jamieson, Elizabeth Ware Packard Professor of Communication at the University of Pennsylvania and director of the Annenberg Public Policy Center, argues that we should not use the term “fake news”; instead, we should use the term “viral deception,” with the acronym VD, to purposefully associate it with venereal disease. During an interview with Brian Stelter on CNN’s Reliable Sources, Jamieson explained this:

We don’t want to get venereal disease. If you find someone who’s got it, you want to quarantine them and cure them. You don’t want to transmit it. By virtue of saying “fake news,” we ask the question, well, what is real news—and you invite people to label everything they disapprove of “fake news.” As a result, it’s not a useful concept. What are we really concerned about? Deception. And deception of a certain sort that goes viral.45

Therefore, in this study, we use the term “viral deception” in place of “fake news” and refer to viral deception about science specifically. We define viral deception about science as bogus or fabricated science or science-relevant claims from sources known for spreading misinformation and propaganda on social media (e.g., NaturalNews.com, RT.com, and FoodBabe.com). The stories on these sites frequently explain away noncongenial scientific evidence by assigning corrupt motives to scientific authorities, research organizations, and regulatory agencies.

Not everyone will be susceptible to viral deception about science when exposed. The differential susceptibility to media effects model,Reference Valkenburg and Peter46 for example, proposes that individual difference variables can moderate or modify the direction or strength of media use effects. Because many of the claims made are conspiracy oriented—that is, they offer a narrative that helps individuals dismiss credible evidence by impugning the experts with malicious motives—we hypothesize the following:

  1. H2: Conspiracy mentality will predict evaluating viral deception about science as likely to be true.

However, other individual differences might also contribute to susceptibility to viral deception about science. For instance, recent work examining individuals’ susceptibility to fake political news found that lower cognitive reflectionReference Pennycook and Rand47 and religious fundamentalism,Reference Bronstein, Pennycook, Bear, Rand and Cannon48 among other reasoning-type measures (e.g., dogmatism), predicted susceptibility. Therefore, we examine whether and, if so, to what extent science literacy (which includes the reasoning-type measures of cognitive reflection and numeracy), political affiliation, and religiosity predict evaluating deceptive science claims as likely to be true. We hypothesize the following:

  1. H3a-b: Science literacy (a) and religiosity (b) will predict evaluating viral deception about science as likely to be true.

Moreover, given previous literature showing conditional effects of science knowledge and conspiracy theorizing by political party, we also include these interactions in our analysis.

Method

Samples and Data Collection

Sample 1.

Participants for the first sample were part of an online national consumer panel recruited by Research Now/SSI (referred to as the national sample) and were surveyed during the fall of 2017. To compensate panel participants, Research Now/SSI uses an incentive scale based on the length of the survey and the panelists’ profiles. Panel participants who are considered “time-poor/money-rich” are paid significantly higher incentives per completed survey than the average panelist so that participating is attractive enough to be perceived as worth the time investment. The incentive options allow panelists to redeem from a range of options such as gift cards, point programs, and partner products and services.

We requested 500 participants, sampled to be approximately nationally representative based on census numbers. Anticipating about a 50% completion rate, Research Now/SSI sampled over 1,000 participants on their online consumer panel, and we paid the company for the participants who qualified as “complete.” To qualify as complete, participants had to (a) be at least 18 years old, (b) reside in the United States, (c) correctly answer the attention check item (i.e., “if you are reading this, choose ‘likely false’”), (d) finish and submit the survey (participants could still submit the survey without having answered questions they preferred to skip), and (e) have taken at least 5 minutes to complete the approximately 20-minute survey.

Our sample of participants (N = 513) was 56% female and ranged from 18 to 80 years old (M = 48.98, Median = 50, SD = 14.97). About 5% of participants reported being black or African American, 5% Asian or Asian American, and 11.5% Hispanic/Latinx. The median level of education attained was an associate’s degree (coded to equal 14 years of school), with an average of 15.49 years of schooling (SD = 3.13).

Sample 2.

The second sample consisted of 21 individuals who attended the Flat Earth International Conference in Raleigh, North Carolina, in November 2017 (referred to as the FE sample). About 60 conference attendees provided us with their email addresses, so that we could send them the link to the survey, but only 21 individuals began and submitted the survey. Participants who submitted the survey received a $5 Amazon gift card. Of these individuals, nine were male, seven were female, and five declined to provide their gender. Most of the participants were white, but one reported being black or African American, one reported being Hispanic/Latinx, and three declined to report their race/ethnicity. Regarding the highest level of education attained, two reported high school, four reported some college, one reported having a two-year degree (e.g., associate’s degree), seven reported having bachelor’s degrees, two reported having graduate degrees, and five declined to report their education. The average age of this group was 38.62 years (Median = 36.5, SD = 12.91), though five declined to provide information on age.

Data collection.

All participants were emailed a link to the survey, which was hosted on Qualtrics.com. Participants completed several measures; those used in this study are described next. For more information on the survey, including a full list of questions asked, please see our project page on the Open Science Framework at https://osf.io/9x5gm/.

Measures

Well-supported scientific theories.

We asked participants whether climate change is real and human caused (response options: a. true, b. false because it is not human caused, c. false because it is not happening, or d. prefer not to answer) and whether humans evolved from earlier species of animal (response options: a. true, b. false, or c. prefer not to answer). Items were coded so that responses that align with scientific consensus (i.e., true) were scored as 0, or accepting the fact, and those against the consensus (i.e., false or prefer not to answer) were scored as 1, or as rejecting the fact. These items were embedded in the science literacy section of the survey. A greater proportion of the FE sample rejected anthropogenic climate change and human evolution than in the national sample. All 21 of the participants in the FE sample rejected anthropogenic climate change, whereas only 36% of the national sample did, χ2(1) = 34.26, p < .001. Moreover, all 21 of the FE sample rejected human evolution compared to only 37% of the national sample, χ2(1) = 33.73, p < .001.
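The binary coding and between-sample comparison described above can be sketched as follows. The cell counts are illustrative reconstructions from the reported percentages (all 21 FE participants rejecting, roughly 36% of the 513 national respondents), not the authors' raw data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative counts: all 21 FE participants rejected anthropogenic
# climate change; about 36% of the 513 national respondents (~185) did.
counts = np.array([
    [21, 0],     # FE sample:       [reject, accept]
    [185, 328],  # national sample: [reject, accept]
])

chi2_stat, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2_stat:.2f}, p = {p:.3g}")
```

Note that the exact statistic depends on whether Yates' continuity correction is applied (scipy applies it by default for 2 × 2 tables), so the value need not match the paper's to the second decimal.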

Viral deception about science.

In addition to rejection of well-supported scientific theories, we also measured whether participants evaluated viral deception about science (i.e., inaccurate and misleading claims from social media about GMOs, a cure for cancer, the Zika virus, and vaccination) as likely to be true or false. For each of these items, participants were asked whether they thought the statement was definitely true (4), likely true (3), likely false (2) or definitely false (1). These items were embedded in the “beliefs” section of the survey, which also included the conspiracy theory items. Each of the statements used for this study come from deceptive claims made by viral campaigns, typically from NaturalNews.com or other dubious websites. Two of these claims featured a conspiracy, and two made inaccurate causal claims.

VD claims of conspiracy. One item stated that “a cure for most types of cancer has already been found, but medical circles prefer to keep getting research funding from governments and keep their findings secret.” There are many myths surrounding cancer,Reference Childs49 and this one in particular combines the myth that there is a miracle cure for cancer out there and the myth that researchers, particularly those at pharmaceutical companies and government agencies, are suppressing it. The FE sample (M = 3.42, Median = likely true, SD = 0.61) more strongly endorsed this claim as true than the national sample (M = 2.09, Median = likely false, SD = 1), t(21.95) = 9.12, p < .001, Cohen’s d = 2.13, 95% CI [1.65, 2.61].

A second item stated that “agricultural biotechnology companies like Monsanto are trying to cover up the fact that genetically modified organisms (GMOs) cause cancer.” This item comes from the website thetruthaboutcancer.com, in which Jeffrey Smith, a self-described expert on genetically modified foods, charges Monsanto with covering up the “fact” that there are “two deadly poisonous ingredients found in GMOs based on proven research that causes [sic] cancerous tumors to form in rats.”Reference Bronstein, Pennycook, Bear, Rand and Cannon48 This article, which includes a video, has been shared over 31,700 times on social media. Similarly, Vani Hari, who is known as the “Food Babe,” accused Monsanto of conspiring with the Environmental Protection Agency to bury evidence that its weed killer causes cancer.Reference Childs49 That GMOs cause cancer is also “fake news”: a review by the National Academies of Sciences, Engineering, and Medicine found “no substantiated evidence of a difference in risks to human health between currently commercialized genetically engineered (GE) crops and conventionally bred crops,”50 and the Society of Toxicology51 reported that “data to date have identified no evidence of adverse health effects from commercially available GE crops or the foods obtained by them.” The FE sample (M = 3.40, Median = likely true, SD = 0.82) more strongly endorsed this headline than the national sample (M = 2.56, Median = likely true, SD = 0.87), t(22.18) = 6.01, p < .001, Cohen’s d = 1.37, 95% CI [0.92, 1.83].

VD claims about causation. In addition to the two viral and deceptive claims about science conspiracies, we examined two inaccurate causal claims. Our third item stated that “the Zika virus was caused by the genetically modified mosquito.” This claim comes from a 2016 article posted on NaturalNews.com,Reference Adams52 which can be traced back to an article posted on RT.com.53 This theory of how Zika came about is inaccurate: FactCheck.org debunked the claim one month after it first appeared.Reference Schipani54 The FE sample (M = 2.88, Median = likely true, SD = 0.72) more strongly endorsed this headline than the national sample (M = 2.09, Median = likely false, SD = 0.87), t(16.52) = 4.27, p < .001, Cohen’s d = 1.09, 95% CI [0.58, 1.59].

Lastly, we asked participants about the claim that “childhood vaccinations are unsafe and cause disorders like autism.” Despite being debunked overReference DeStefano55 and over,Reference Kalkbrenner, Schmidt and Penlesky56 many deceptive sites, including NaturalNews.com,Reference Adams57, Reference Lilley58 continue to propagate this misinformation. The FE sample (M = 2.88, Median = likely true, SD = 0.72) more strongly endorsed this claim than the national sample (M = 2.09, Median = likely false, SD = 0.87), t(19.33) = 9.02, p < .001, Cohen’s d = 2.11, 95% CI [1.63, 2.59].

Science literacy.

Science literacy was measured using a shortened version of the Ordinary Science Intelligence (OSI 2.0) scale.Reference Kahan59 Our shortened version of the OSI included six items that were chosen based on their difficulty and discriminatory power from a previous item response theory analysis with a nationally representative population. Items were scored so that correct answers received 1 point and incorrect answers (and no response) received 0 points. On average, participants answered 2.54 questions out of 6 correctly (FE sample: M = 2.24, Median = 3; national sample: M = 2.56, Median = 2). Consistent with prior research, the scale was evaluated and scored using item response theory (a 2PL model). Then, scores were centered so that the mean was 0 (SD = 0.75); the scores ranged from –1.26 to 1.35. There was no significant difference between the mean science literacy scores of the two samples, t(21.05) = 0.40, p = .696, Cohen’s d = 0.09, 95% CI [–0.35, 0.53]. Because the distributions of scores are not normal (see the supplementary materials), we also conducted the nonparametric Wilcoxon rank sum test, which similarly showed no significant difference between the two samples, W = 5423, p = 0.959.
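The two-sample comparison reported above (Welch's t-test plus the Wilcoxon rank-sum check) can be sketched as follows, with simulated centered scores standing in for the actual IRT-based scores:

```python
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu

rng = np.random.default_rng(1)
# Simulated stand-ins for the centered science-literacy scores
national = rng.normal(loc=0.0, scale=0.75, size=513)
fe = rng.normal(loc=0.0, scale=0.75, size=21)

# Welch's t-test does not assume equal variances; with only n = 21 in
# one group, it produces the small fractional df seen in the paper
t_stat, p_t = ttest_ind(fe, national, equal_var=False)

# Wilcoxon rank-sum (Mann-Whitney U) as the nonparametric check,
# appropriate because the observed score distributions were not normal
u_stat, p_u = mannwhitneyu(fe, national, alternative="two-sided")
print(p_t, p_u)
```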

Political party affiliation and religiosity.

As stated in the introduction, scientific literacy does not account for all of the variance in public acceptance (or rejection) of science: people’s values and worldviews are extremely influential in their acceptance or rejection of scientific findings. Therefore, we also asked about political party affiliation and religiosity.

To capture political party affiliation (i.e., party), we asked participants, “generally speaking, do you consider yourself a …” with the following response options: strong Democrat, Democrat, independent, Republican, strong Republican, other, and “I choose not to answer.” Because many of the flat Earth conference attendees, whom we interviewed in person for a separate study, vociferously rejected affiliating with any political party, we realized the importance of including unaffiliated (or no answer and refusal to answer) as a possible response option, particularly when sampling conspiracy-minded individuals who are suspicious of institutions like political parties. It is very common for research studies to use listwise deletion, analyzing only participants for whom they have complete data. However, we believe that this would lead to a loss of many of the participants with the strongest conspiracy mentality who refuse to answer the political party question. Therefore, we treated party as a categorical variable.

To reduce the number of comparison groups, we combined strong Democrat and Democrat into one response level, combined strong Republican and Republican into one response level, kept independent as one response level, and combined other (n = 30) and prefer not to answer (n = 44, including people who left the item blank) into one response level. The resulting variable was categorical with four levels: Democrat, independent, Republican, and unaffiliated/other. Among the national sample, 31% were coded as Democrat (n = 158), 33% were coded as independent (n = 172), 25% were coded as Republican (n = 130), and 11% were coded as unaffiliated/other (n = 66). Among the FE sample, 5% (n = 1) were coded as Democrat, 14% (n = 3) were coded as independent, 5% (n = 1) were coded as Republican, and 76% (n = 16) were coded as unaffiliated/other. A chi-square test showed that the distribution of party affiliation differed between the two samples, χ2(3) = 91.47, p < .001.
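The four-level recoding could be implemented as a simple lookup. The raw response labels below are assumptions about the survey's exact wording:

```python
# Assumed raw labels from the party item; None stands for a blank response
FOUR_LEVEL = {
    "strong Democrat": "Democrat",
    "Democrat": "Democrat",
    "independent": "independent",
    "Republican": "Republican",
    "strong Republican": "Republican",
    "other": "unaffiliated/other",
    "I choose not to answer": "unaffiliated/other",
    None: "unaffiliated/other",
}

def recode_party(raw):
    # Unrecognized or missing answers fall into unaffiliated/other,
    # rather than being dropped via listwise deletion
    return FOUR_LEVEL.get(raw, "unaffiliated/other")

print(recode_party("strong Republican"))  # Republican
```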

Religiosity was assessed by asking participants how much guidance faith or religion provides in their day-to-day lives (0 = not religious, 1 = none at all, 2 = a little, 3 = a moderate amount, 4 = a lot, 5 = a great deal). The median religiosity response for the FE sample was “a lot” (FE sample: M = 3.0, SD = 1.89), whereas the national sample’s median was “a moderate amount” (national sample: M = 2.61, SD = 1.69). An independent-samples nonparametric test suggests that the two samples did not differ statistically in their religiosity (W = 4276, p = .362).

Conspiracy mentality.

Lastly, to measure conspiracy mentality, we used a modified version of the Conspiracy Theory Questionnaire.Reference Bruder and Manstead34 Our version of the scale consisted of seven conspiracy theories ranging from prototypical conspiracies (e.g., the Apollo program never landed on the moon) to more recent ones (e.g., Barack Obama was not born in the United States). Participants were asked to rate each item on a four-point scale (1 = definitely false, 2 = likely false, 3 = likely true, 4 = definitely true). The seven items were internally consistent (Cronbach’s alpha = 0.69, 95% CI [0.65, 0.73]). On average, the national sample rated items around “likely false” (M = 2.31, SD = 0.46) and the FE sample rated items around “likely true” (M = 3.36, SD = 0.30). We used a graded response model (using the ltm package in RReference Rizopoulos60) to calculate participants’ scores, and then we centered them. Scores ranged from –2.21 to 2.56 (M = 0, SD = 0.87), with higher numbers indicating stronger conspiracy mentality. As anticipated, the FE sample (M = 1.54, SD = 0.59) scored much higher on this measure of conspiracy mentality than the national sample (M = –0.06, SD = 0.79), t(23.06) = 11.99, p < .001, Cohen’s d = 2.67, 95% CI [2.20, 3.13].
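The internal-consistency check reported above can be reproduced with a standard Cronbach's alpha computation. The simulated 1–4 ratings below are illustrative, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Simulate seven 4-point conspiracy items driven by one latent trait
rng = np.random.default_rng(7)
latent = rng.normal(size=500)
ratings = np.clip(np.round(2.5 + latent[:, None] + rng.normal(size=(500, 7))), 1, 4)
print(round(cronbach_alpha(ratings), 2))
```

Note that alpha is only the reliability check; the paper scored the scale itself with a graded response model (the ltm package in R), not a sum score.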

Analysis plan

To test our hypotheses, we merged the data from the two samples. Then we conducted general linear model (GLM) analyses (controlling for sample: FE versus national) and report significance based on type III tests. For more details about the analysis, please see the supplementary materials.

Data availability

The data sets and coding used for this article are available as a component on our project page on the Open Science Framework at https://osf.io/4pa96/.

Results

Rejection of well-supported scientific theories

Much of the literature on rejection of science has highlighted and found interactions between science literacy (or other types of reasoning abilities such as “actively open-minded thinking” or “need for cognition”) and worldviews such as political ideology for denial of climate change and religiosity for denial of human evolution. Thus, to examine the potential influence of conspiracy mentality on rejection of scientific facts, we incorporated the following as predictors for the base model: science literacy, party (referent = Democrat), religiosity, and two interactions, one between science literacy and party and one between science literacy and religiosity. Then, we ran the model a second time, adding conspiracy mentality and an interaction between conspiracy mentality and party (to test the conditional effect found by Uscinski and OlivellaReference Uscinski and Olivella27). We report the deviance between the two models and the effects from the second model (see also the supplementary materials for more detailed results).

Rejection of human-caused climate change.

As stated earlier, none of the 21 participants in the FE sample believed that climate change is real and human caused—100% rejected it. In contrast, 64% of the online panel (n = 326 of 513) accepted anthropogenic climate change, whereas only 36% rejected it. Given that the FE sample had a stronger conspiracy mentality, these frequencies appear to support findings from prior literature that conspiracy mentality is positively related to climate change denial.Reference Lewandowsky, Oberauer and Gignac25, Reference Lewandowsky, Gignac and Oberauer41 However, when the two samples were combined, conspiracy mentality did not significantly predict climate change rejection (b = 0.13, χ2 = 0.16, p = .686), and adding conspiracy mentality to the model only marginally improved its fit (deviance = 9.39, p = .052).

Although conspiracy mentality did not predict rejection of climate change when controlling for the other variables in the model, sample (FE versus national) did, b = –19.18, χ2 = 35.28, p < .001. We should note, however, that although there was a significant difference between Democrats and the unaffiliated/other in how conspiracy mentality relates to the rejection of climate change (b = –1.56, p = .018), simple effects tests with Bonferroni correction (adjusting the significance threshold, or cutoff p value, to 0.013 for the four comparisons) showed no significant relationship between conspiracy mentality and rejecting climate change for Democrats or for unaffiliated/other (see Figure 1).

Figure 1. Predicted probability of rejecting climate change based on conspiracy mentality by political party. Post hoc simple effects tests with Bonferroni correction (adjusting the cutoff alpha to .013) suggest that the effect of conspiracy mentality on rejecting climate change is not statistically significant for any of the party affiliations, despite what might appear to be positive relationships depicted in the figure (Democrats: b = 0.51, p = .067; Republicans: b = 0.06, p = .813; independent: b = 0.01, p = .964; unaffiliated/other: b = 0.19, p = .537). Shaded regions represent 95% confidence intervals.

The GLM analysis found the expected robust interaction between science literacy and party, consistent with prior research.Reference Kahan, Peters, Wittlin, Slovic, Ouellette, Braman and Mandel11 For Democrats, the probability of rejecting anthropogenic climate change decreased with increasing science literacy, whereas the opposite was true for Republicans (b = 1.25, p = .008). The odds of rejecting climate change for a Republican who scored lower on science literacy (when science literacy = –1) was 249% greater than for a Democrat with the same science literacy score. Polarization between the two political parties on climate change was even larger, however, among partisans with higher science literacy: the odds of rejecting climate change for a Republican who scored higher on science literacy (when science literacy = +1) was 4105% greater than for a Democrat with the same science literacy score (see Figure 2).
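The percentage figures above are odds ratios restated as “% greater odds.” The conversion, and the consistency of the two reported figures with the science literacy by party interaction coefficient (b = 1.25), can be sketched as follows (Python; mapping 249% and 4105% back to odds ratios of 3.49 and 42.05 is our inference):

```python
from math import exp, log

def pct_greater_odds(log_odds_ratio):
    """Translate a log odds ratio into '% greater odds'."""
    return (exp(log_odds_ratio) - 1) * 100.0

# The reported figures imply Republican vs. Democrat odds ratios of
# about 3.49 (249% greater) and 42.05 (4105% greater)
low_or, high_or = 1 + 249 / 100, 1 + 4105 / 100

# Moving science literacy from -1 to +1 multiplies the party odds
# ratio by exp(2 * b) for the interaction coefficient b = 1.25
print(round(pct_greater_odds(log(low_or))))  # back to 249
print(round(exp(2 * 1.25), 1))               # 12.2
print(round(high_or / low_or, 1))            # 12.0, consistent up to rounding
```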

Figure 2. Party affiliation by science literacy interaction effect on the predicted probability of rejecting climate change. Whereas Democrats (b = –0.71, p = .025) and Independents (b = –0.52, p = .026) are less likely to reject climate change with increasing science literacy, Republicans and the unaffiliated/other are more likely to do so. Shaded regions represent 95% confidence intervals.

Rejection of human evolution.

As with climate change, the effect of conspiracy mentality on the rejection of evolution did not reach statistical significance (b = –0.09, χ2 = 0.10, p = .076); however, this relationship was conditional on political party (χ2 = 10.33, p = .016), and the variable’s addition to the model significantly improved the model fit (deviance = 13.27, p = .010). Simple effects tests with Bonferroni correction (adjusting the cutoff p value to .013) suggest that the effect of conspiracy mentality on rejecting evolution was marginal for Republicans (b = 0.57, p = .018) and significant for unaffiliated/other (b = 1.66, p < .001), but it was not significant for Democrats (b = 0.41, p = .067) or independents (b = 0.19, p = .350). Specifically, among those in our sample with lower conspiracy mentality (when conspiracy mentality = –1), the odds of Republicans rejecting evolution were 33% greater than those of Democrats. In contrast, among those in our sample with higher conspiracy mentality (when conspiracy mentality = +1), the odds of Republicans rejecting evolution were 856% greater than Democrats (see Table 1 and Figure 3).
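Both reported deviance drops (9.39 for the climate change model and 13.27 for the evolution model) are consistent with a likelihood-ratio chi-square test on four added parameters (conspiracy mentality plus its three party-interaction terms; the degrees of freedom are our inference, not stated in the text). A quick check in Python using the closed-form chi-square survival function for even degrees of freedom:

```python
from math import exp, factorial

def chi2_sf_even_df(x, df):
    """Chi-square survival function P(X > x); closed form valid for even df."""
    half = x / 2.0
    return exp(-half) * sum(half ** k / factorial(k) for k in range(df // 2))

# Deviance drop from adding conspiracy mentality and its party interaction
print(round(chi2_sf_even_df(9.39, 4), 3))   # 0.052 (climate change model)
print(round(chi2_sf_even_df(13.27, 4), 3))  # 0.01 (evolution model)
```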

Table 1. Results of GLM predicting rejection of well-supported scientific theories when controlling for sample.

Notes: Party is treated as a categorical variable with Democrat as the referent. Response level significance for this variable is reported based on summary output from the GLM, whereas variable level significance is reported based on type III tests. Asterisks mark statistical significance. Coefficients (b) are not standardized.

*** p < .001;

** p < .01;

* p < .05;

† p < .10.

Figure 3. Interaction effect of party and conspiracy mentality on the rejection of human evolution. Simple effects tests with Bonferroni correction (adjusting the cutoff p value to .013) suggest that the effect of conspiracy mentality on rejecting evolution is marginally significant for Republicans (b = 0.57, p = .018) and for unaffiliated/other (b = 1.66, p < .001), but it is not significant for Democrats (b = 0.41, p = .067) or independents (b = 0.19, p = .350).

Susceptibility to viral deception about science

Our second aim focused on who is susceptible to viral deception about science. We hypothesized that conspiracy mentality would predict the extent to which people endorsed the deceptive claims as true and that individuals’ priors (science literacy, political party affiliation, and religiosity) would predict endorsement of these claims after accounting for conspiracy mentality. The claims were scored like the conspiracy theories (1 = definitely false to 4 = definitely true). The results of the regression analyses are reported in Table 2. For a more detailed table (with exact p-values and sums of squares), see the supplementary materials. In addition, after reporting the GLM analyses results, we report the results of a test of relative importance using a method devised by Lindeman, Merenda, and Gold,Reference Lindeman, Merenda and Gold61 which averages the sequential sums of squares (type 1) across all of the orderings of the regressors with the relaimpo packageReference Grömping62 in R. This analysis allows us to determine which factors, or their interactions, explain the largest proportions of response variance or are the most “important” relative to the other predictors in the model.

Table 2. Results from the regression analyses predicting the four deceptive claims.

Notes: Party is treated as a categorical variable with Democrat as the referent. Response level significance for this variable is reported based on summary output from the GLM, whereas variable level significance is reported based on type III tests. Coefficients (b) are not standardized.

*** p < .001;

** p < .01;

* p < .05;

† p < .10.

Claim that the cure for cancer is being suppressed.

A common deceptive claim is that a cure for most types of cancer has already been found but is being kept secret so that medical institutions can continue to receive government research funding. Greater conspiracy mentality and lower science literacy predicted endorsing this claim as more likely to be true. There was also a marginal interaction effect of science literacy and political party: the relationship between science literacy and evaluating the claim as true was conditional on political party, with Democrats being marginally different from Republicans and significantly different from the unaffiliated/other. Follow-up simple effects tests show that for Democrats and independents, greater science literacy predicted endorsing the claim as more likely to be false (Democrats: b = –0.66, p < .001; independents: b = –0.50, p < .001). In contrast, science literacy did not significantly predict endorsement of the claim for the unaffiliated/other (b = 0.04, p = .809), and for Republicans, the negative relationship was marginal (b = –0.25, p = .062).

Claim that GMOs cause cancer and corporations are covering it up.

Another common deceptive claim propagated by untrustworthy websites is that GMOs cause cancer and agricultural biotechnology corporations, such as Monsanto, are covering it up. For this item, conspiracy mentality indeed predicted evaluating this claim as likely true. Moreover, there was a significant interaction of conspiracy mentality and science literacy. Among those with lower conspiracy mentality, higher science literacy predicted evaluating the claims as more likely to be false. In contrast, among those with higher conspiracy mentality, higher science literacy predicted evaluating the claims as more likely to be true (see Figure 4).

Figure 4. Predicting endorsement of the claim that GMOs cause cancer and corporations are covering this up on a scale from definitely false (1) to definitely true (4). There was a significant interaction between conspiracy mentality and science literacy. Among people with lower conspiracy mentality (scores less than –1), higher science literacy predicted evaluating the claim as more likely to be false. Among people with higher conspiracy mentality (scores greater than 1), higher science literacy means evaluating the claim as more likely to be true.

Claim that the Zika virus was caused by a genetically modified mosquito.

Another deceptive claim contends that the genetically modified mosquito, which was developed at least in part to help curb the spread of diseases like Zika (see https://www.oxitec.com/friendly-mosquitoes/), is actually the underlying cause of the Zika virus. As expected, higher conspiracy mentality and lower science literacy strongly predicted believing the claim that the Zika virus is caused by the genetically modified mosquito. No other effects or interactions were significant (see Table 2).

Claim that childhood vaccines are unsafe and cause disorders like autism.

One of the most common assertions about vaccination on deceptive websites is that childhood vaccinations are unsafe and cause disorders such as autism. As with the previous items, greater conspiracy mentality and lower science literacy significantly predicted evaluations that this claim is likely to be true. In addition, people who reported stronger religiosity were more likely to evaluate this claim as true, which is consistent with prior work.63 This was also the only claim for which sample remained a significant predictor after accounting for other effects (see Table 2).

Relative importance of the factors.

Using the results of the GLM analyses for each of the claims, we conducted a test of relative importance of the variables. As we stated earlier, we used the relaimpo package in R and report the results of the lmg analyses, which average the sequential sums of squares over all orders of the regressors. This analysis produces a value that represents the proportion of response variance for which each of the factors accounts. Graphing these results, we can see the relative importance of each of the factors included in the model. This analysis and Figure 5 more clearly illustrate our findings from the GLM analyses: conspiracy mentality and science literacy were the most important predictors (relative to the others included in the model). For a table with the exact values from the lmg analysis, see the supplementary materials.
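The lmg averaging can be illustrated with a two-predictor toy example (Python; the correlations are illustrative values, not the study’s data). Each predictor’s importance is its sequential R² gain averaged over all orderings, and the shares sum to the full-model R²:

```python
from itertools import permutations

def subset_r2(subset, r_y, r12):
    """R^2 for a subset of two standardized predictors, from correlations."""
    if len(subset) == 0:
        return 0.0
    if len(subset) == 1:
        return r_y[subset[0]] ** 2
    r1, r2 = r_y
    # standard two-predictor formula for R^2 in terms of correlations
    return (r1 ** 2 + r2 ** 2 - 2 * r1 * r2 * r12) / (1 - r12 ** 2)

def lmg_shares(r_y, r12):
    """Average each predictor's sequential R^2 gain over all orderings (lmg)."""
    shares = [0.0, 0.0]
    orders = list(permutations(range(2)))
    for order in orders:
        seen = ()
        for p in order:
            with_p = tuple(sorted(seen + (p,)))
            shares[p] += subset_r2(with_p, r_y, r12) - subset_r2(seen, r_y, r12)
            seen = with_p
    return [s / len(orders) for s in shares]

# Toy inputs: outcome correlations of .6 and .4, inter-predictor correlation .3
shares = lmg_shares([0.6, 0.4], 0.3)
print([round(s, 3) for s in shares])  # [0.307, 0.107]
print(round(sum(shares), 3))          # 0.413, the full-model R^2
```

The relaimpo package generalizes this averaging to any number of regressors; with k predictors, the average runs over all k! orderings.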

Figure 5. Relative importance of the factors predicting susceptibility to each deceptive claim. Conspiracy mentality and science literacy were the two factors that accounted for the most response variance.

Discussion

The role of conspiracy mentality

This study primarily set out to examine the potential role of conspiracy mentality in predicting two phenomena: the rejection of well-supported scientific theories and the acceptance of viral deception about science.

Rejecting science.

We found mixed evidence that conspiracy mentality predicts rejection of science. Although conspiracy mentality was influential in rejection of evolution contingent on political party affiliation (e.g., the relationship was positive and marginal for Republicans and significant for the unaffiliated/other category), it did not meaningfully predict rejection of climate change. Two things are important to note here, however. First, although we found a significant interaction indicating that the degree (and/or direction) of the relationship between conspiracy mentality and rejection of climate change differs by political party affiliation (i.e., it is conditional on political party, somewhat consistent with work by Uscinski and OlivellaReference Uscinski and Olivella27), simple effects tests suggest that the relationship between conspiracy mentality and rejection of climate change is not significant for any of the party affiliations (see Figure 1).

Second, none of the Flat Earth International Conference attendees, who scored significantly higher on conspiracy mentality than the national sample, endorsed human-caused climate change as true. Thus, it is inaccurate to say that our findings are completely inconsistent with prior work that has shown relationships between conspiracy mentality and rejection of climate change. Instead, we question the robustness of the findings from prior work; certain changes to the way in which conspiracy mentality or climate change beliefs are measured may alter the strength and existence of the relationship.

Related to this point, one strength of our study is that we included a subsample of individuals with high conspiracy mentality, or so-called conspiracy theorists. Although we recognize that our subsample is not representative of all conspiracy theorists, especially as these participants are subscribers to flat Earth ideology and not all conspiracy theorists are flat Earthers, we did find that all the flat Earthers surveyed rejected the existence of anthropogenic climate change (and human evolution). It is possible, for instance, that the relationship between conspiracy mentality and climate change rejection (when measured as a continuous variable) is not linear. Future research should continue to test this hypothesis using samples of individuals with strong conspiracy mentalities (i.e., among populations of conspiracy theorists) and test whether a relationship between conspiracy mentality and rejection of climate change is a continuous relationship or one that, for the most part, appears only after crossing a certain threshold.

There are other reasons, too, why our results may differ from previous studies examining the relationship between conspiracy mentality and science denial. For one, our one-item measurement of climate change acceptance is not sensitive and does not allow for much variance in views about climate change. However, it can be argued that a particularly robust effect of conspiracy mentality on the denial of climate change ought to be present when simply asking participants whether they believe that climate change is occurring. For example, the robust interaction of knowledge and political ideology persists across different measurements (or operationalizations) of climate change views, science knowledge, and political party and ideology. To be clear, we do not doubt the existence of an effect of conspiracy mentality on climate change denial; we simply question the strength and persistence of such an effect.

Believing viral deception.

Our second aim examined susceptibility to believing viral deception about science. Our hypothesis that conspiracy mentality would predict endorsement of these claims was supported, and conspiracy mentality was the most important predictor of susceptibility in our model (see Figure 5). However, we were also interested in whether individuals’ prior values and beliefs predicted acceptance of the deceptive claims even after accounting for conspiracy mentality. Indeed, even though the number of individuals with pathological levels of conspiracy mentality is arguably small, viral fake news campaigns are dangerous because even people who are not especially conspiracy oriented may be predisposed to accept conspiracies that support their worldviews.

What makes these viral deceptive claims different from typical conspiracy theories is the number of people who believe them. On average, very few people endorse most conspiracy theories (with notable exceptions like the conspiracy theories surrounding the assassination of President John F. KennedyReference Koerth-Baker38). On the other hand, many of our participants believed the deceptive claims about GMOs and Zika. About 56% of our national sample said it is likely or definitely true that Monsanto is covering up for the fact that GMOs cause cancer, and 32% of our national sample said that it is likely or definitely true that the Zika virus is caused by the genetically modified mosquito. Future research should measure whether believing viral deception leads to later rejection of science communication about those topics and related policy efforts, such as blocking the release of a new Food and Drug Administration–approved genetically modified food product or protesting the release of transgenic mosquitoes in areas at high risk of Zika, dengue, or malaria.

The role of science literacy

Aside from conspiracy mentality, only one other individual factor was consistently relevant to predicting rejection of well-supported scientific theories and accepting viral deception about science: science literacy. First, we found additional evidence for the robust interaction effect between science literacy (measured here using a shortened version of Kahan’s OSI scale) and political ideology on the rejection/acceptance of anthropogenic climate change. In contrast to other work that has treated political ideology as a continuous variable, we looked at political party affiliation as a categorical one so as not to lose participants who choose not to affiliate. As expected, the relationship of science literacy and acceptance of the scientific consensus on climate change and evolution was conditional on political party affiliation. That is, Democrats and Republicans polarized along science literacy: with increasing science literacy, Democrats were more likely (and Republicans were less likely) to accept that human-caused climate change is a real phenomenon. Interestingly, people who refused to answer the political affiliation item (or said that they do not affiliate with the listed political parties) showed a similar pattern to Republicans, and those who reported being independent showed a similar pattern to Democrats (see Figure 2).

Moreover, when it came to predicting evaluations of the deceptive claims as likely to be true, science literacy was the only factor in our model besides conspiracy mentality that appeared to meaningfully predict each of the four deceptive claims. Unlike when predicting rejection of science, however, we did not consistently find conditional effects of the relationship of science literacy on acceptance of the deceptive claims. While there was a marginal interaction effect of science literacy and political party affiliation on evaluating the cancer cure suppression item, simple effects tests showed that greater science literacy predicted evaluating the claim as more likely to be false among Democrats, independents, and Republicans (though the effect for Republicans was marginal). There was no relationship between science literacy and evaluating the claim for unaffiliated/others. For the “GMOs cause cancer” item, there was an interaction effect of science literacy and conspiracy mentality. Among those with lower conspiracy mentality, having greater science literacy led to evaluating the claims as more likely to be false. In contrast, among those with higher conspiracy mentality, having greater science literacy led to evaluating the claims as more likely to be true. The effect of science literacy on evaluation of the other two deceptive claims (about Zika and about vaccines) was not conditional on another value or identity factor that we measured. Thus, implementing interventions to increase science literacy may be influential in preventing proliferation of viral deception about science, at least among the majority of the population.

Limitations

This study, like many others, has limitations that ought to be taken into consideration when interpreting its findings. First, although the sample is much more diverse than typical convenience samples (e.g., undergraduate students) or samples using Amazon’s Mechanical Turk, it is not nationally representative or probabilistic. Also, because this was a secondary analysis of a larger study aiming to examine the relationship between conspiracy mentality and science curiosity, the survey included only a few questions about science-related beliefs and acceptance/rejection of scientific facts, and the analysis was exploratory. These points aside, we were still able to examine some of the issues that are more prevalent in today’s media environment, climate change and evolution, and we were able to examine fake news headlines that have “gone viral” on social media. Future studies should aim to replicate our findings here with different samples and should consider asking participants about a broader array of scientific beliefs, including controversial and noncontroversial issues.

Conclusion

The proliferation of deceptive claims on social media has done much to normalize conspiracy theories and, to some extent, conspiratorial worldviews. We can try to dismiss conspiracy theorizing as something undertaken only by a foil-hat-wearing fringe; however, when our friends and neighbors (and sometimes we ourselves) begin to believe and share conspiracy theories on social media, we must acknowledge that conspiracy theorizing is much more widespread. And when it becomes commonplace to project conspiratorial motives onto scientific institutions (and not just corporate or governmental ones) merely because their information disagrees with our worldviews, we are in danger of entering a space where knowledge becomes almost completely relative, where we cannot engage in rational discussion with those with whom we disagree, and where the division of cognitive labor on which our society relies breaks down completely. Although we should not be gullible—after all, there are real conspiracies—we must learn how to balance skepticism with trust.

Acknowledgments

We would like to thank the organizers of the Flat Earth International Conference for allowing us to interview conference attendees and request email addresses from attendees to participate in our online survey. In addition, we are grateful to Tim Linksvayer, Deena Weisberg, Michael Weisberg, Stephan Lewandowsky, Cam Stone, and Rosalynn Vasquez for providing feedback on earlier versions of the manuscript. Finally, we would like to thank the Science Communication and Cognition Lab team members for their help and support.

Supplementary Materials

To view supplementary material for this article, please visit http://dx.doi.org/10.1017/pls.2019.9.

References

Pew Research Center, “Public and scientists’ views on science and society,” January 29, 2015, http://www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society/, accessed June 20, 2019.Google Scholar
Carter, K., “Does ‘cupping’ do Olympic athletes any good—and does it matter if it doesn’t?,” Guardian, August 8, 2016, https://www.theguardian.com/lifeandstyle/shortcuts/2016/aug/08/does-cupping-do-olympic-athletes-any-good, accessed June 20, 2019.Google Scholar
Doheny, K., “What’s behind the gluten-free trend?,” WebMD, September 16, 2016, https://www.webmd.com/digestive-disorders/celiac-disease/news/20160916/whats-behind-gluten-free-trend#1, accessed June 20, 2019.Google Scholar
Gorski, D., “The Oprah-fication of medicine,” Science-Based Medicine, June 1, 2009, https://sciencebasedmedicine.org/the-oprah-fication-of-medicine/, accessed June 20, 2019.Google Scholar
U.S. Department of Health and Human Services, “U.S. public health service recommendation for fluoride concentration in drinking water for the prevention of dental caries,” Public Health Reports, 2015, 130(4): 318–331.Google Scholar
Tagliabue, G., “The necessary ‘GMO’ denialism and scientific consensus,” Journal of Science Communication, 2016, 15(4): Y01, https://doi.org/10.22323/2.15040401.Google Scholar
Valdez, A., “Everything you need to know about Facebook and Cambridge Analytica,” Wired, March 23, 2018, https://www.wired.com/story/wired-facebook-cambridge-analytica-coverage/, accessed June 20, 2019.Google Scholar
Kata, A., “Anti-vaccine activists, Web 2.0, and the postmodern paradigm—An overview of tactics and tropes used online by the anti-vaccination movement,” Vaccine, 2012, 30(25): 3778–3789, https://doi.org/10.1016/j.vaccine.2011.11.112.Google Scholar
Bauer, M., Allum, N., and Miller, S., “What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda,” Public Understanding of Science, 2007, 16: 79–95, https://doi.org/10.1177/0963662506071287.Google Scholar
National Academies of Science, Engineering, and Medicine, Science Literacy: Concepts, Contexts, and Consequences (Washington, DC: National Academies Press, 2016), https://doi.org/10.17226/23595.Google Scholar
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., and Mandel, G., “The polarizing impact of science literacy and numeracy on perceived climate change risks,” Nature Climate Change, 2012, 2: 732735, https://doi.org/10.1038/nclimate1547.Google Scholar
Kahan, D. M., Peters, E., Dawson, E. C., and Slovic, P., “Motivated numeracy and enlightened self-government,” Behavioural Public Policy, 2017, 1(1): 54–86, https://doi.org/10.1017/bpp.2016.2.Google Scholar
Kunda, Z., “The case for motivated reasoning,” Psychological Bulletin, 1990, 108(3): 480–498, https://doi.org/10.1037/0033-2909.108.3.480.Google Scholar
Douglas, K. M., Sutton, R. M., and Cichocka, A., “The psychology of conspiracy theories,” Current Directions in Psychological Science, 2017, 26(6): 538–542, https://doi.org/10.1177/0963721417718261.Google Scholar
Clarke, S., “Conspiracy theories and conspiracy theorizing,” Philosophy of the Social Sciences, 2002, 32(2): 131–150, https://doi.org/10.1177/004931032002001.Google Scholar
Abalakina-Paap, M., Stephan, W. G., Craig, T., and Gregory, W. L., “Beliefs in conspiracies,” Political Psychology, 1999, 20(2): 637–647, https://doi.org/10.1111/0162-895X.00160.Google Scholar
Green, R. and Douglas, K. M., “Anxious attachment and belief in conspiracies,” Personality and Individual Differences, 2018, 125: 30–37, https://doi.org/10.1016/j.paid.2017.12.023.Google Scholar
Aupers, S., “‘Trust no one’: Modernization, paranoia and conspiracy culture,” European Journal of Communication, 2012, 27(1): 22–34, https://doi.org/10.1177/0267323111433566.Google Scholar
Butler, L. D., Koopman, C., and Zimbardo, P., “The psychological impact of viewing the film ‘JFK’: Emotions, beliefs, and political behavioral intentions,” Political Psychology, 1995, 16(2): 237–257, https://doi.org/10.2307/3791831.Google Scholar
Jensen, T., “Democrats and Republicans differ on conspiracy beliefs,” Public Policy Polling, April 2, 2013, https://www.publicpolicypolling.com/polls/democrats-and-republicans-differ-on-conspiracy-theory-beliefs/, accessed June 20, 2019.Google Scholar
Hofstadter, R., The Paranoid Style in American Politics: And Other Essays (New York: Knopf, 1965).Google Scholar
Radnitz, S. and Underwood, P., “Is belief in conspiracy theories pathological? A survey experiment on the cognitive roots of extreme suspicion,” British Journal of Political Science, 2017, 47(1): 113–129, https://doi.org/10.1017/S0007123414000556.Google Scholar
Haspel, T., “Genetically modified foods: What is and isn’t true,” Washington Post, October 15, 2013, https://www.washingtonpost.com/lifestyle/food/genetically-modified-foods-what-is-and-isnt-true/2013/10/15/40e4fd58-3132-11e3-8627-c5d7de0a046b_story.html?utm_term=.be6e5899d1e6, accessed June 20, 2019.Google Scholar
Blaskiewicz, R., “The Big Pharma conspiracy theory,” Medical Writing, 2013, 22(4): 259–261, http://doi.org/10.1179/2047480613Z.000000000142.Google Scholar
Lewandowsky, S., Oberauer, K., and Gignac, G. E., “NASA faked the moon landing—Therefore, (climate) science is a hoax: An anatomy of the motivated rejection of science,” Psychological Science, 2013, 24(5): 622–633, https://doi.org/10.1177/0956797612457686.Google Scholar
Sussman, B., Climategate: A Veteran Meteorologist Exposes the Global Warming Scam (Washington, DC: WND Books, 2010).Google Scholar
Uscinski, J. E. and Olivella, S., “The conditional effect of conspiracy thinking on attitudes toward climate change,” Research & Politics, 2017, 4(4): 1–9, https://doi.org/10.1177/2053168017743105.Google Scholar
Petty, R. E. and Cacioppo, J. T., “The elaboration likelihood model of persuasion,” Advances in Experimental Social Psychology, 1986, 19: 123–205, https://doi.org/10.1016/S0065-2601(08)60214-2.Google Scholar
Landrum, A. R., Eaves, B., and Shafto, P., “Learning to trust and trusting to learn: A theoretical framework,” Trends in Cognitive Sciences, 2015, 19(3): 109–111, https://doi.org/10.1016/j.tics.2014.12.007.Google Scholar
Kahan, D. M., Jenkins-Smith, H., and Braman, D., “Cultural cognition of scientific consensus,” Journal of Risk Research, 2011, 14(2): 147–174, https://doi.org/10.1080/13669877.2010.511246.Google Scholar
Miller, J. M., Saunders, K. L., and Farhart, C. E., “Conspiracy endorsement as motivated reasoning: The moderating roles of political knowledge and trust,” American Journal of Political Science, 2016, 60(4): 824–844, https://doi.org/10.1111/ajps.12234.Google Scholar
Bruder, M., Haffke, P., Neave, N., Nouripanah, N., and Imhoff, R., “Measuring individual differences in generic beliefs in conspiracy theories across cultures: Conspiracy Mentality Questionnaire,” Frontiers in Psychology, 2013, 4: Article ID 225, https://doi.org/10.3389/fpsyg.2013.00225.Google Scholar
Darwin, H., Neave, N., and Holmes, J., “Belief in conspiracy theories: The role of paranormal belief, paranoid ideation and schizotypy,” Personality and Individual Differences, 2011, 50(8): 1289–1293, https://doi.org/10.1016/j.paid.2011.02.027.Google Scholar
Bruder, M. and Manstead, A. S. R., “Questionnaire on conspiracy theories,” 2009, http://www.conspiracytheory.martinbruder.com/en/, accessed June 20, 2019.Google Scholar
Wood, M. J., Douglas, K. M., and Sutton, R. M., “Dead and alive: Beliefs in contradictory conspiracy theories,” Social Psychological and Personality Science, 2012, 3(6): 767773, https://doi.org/10.1177/1948550611434786.Google Scholar
Swami, V., Coles, R., Stieger, S., Pietsching, J., Furnham, A., Rehim, S., and Voracek, M., “Conspiracist ideation in Britain and Austria: Evidence of a monological belief system and associations between individual psychological differences and real-world and fictitious conspiracy theories,” British Journal of Psychology, 2011, 102(3): 443463, https://doi.org/10.1111/j.2044-8295.2010.02004.x.Google Scholar
Imhoff, R. and Bruder, M., “Speaking (un-)truth to power: Conspiracy mentality as a generalised political attitude,” European Journal of Personality, 2014, 28(1): 25–43, https://doi.org/10.1002/per.1930.
Koerth-Baker, M., “Why rational people buy into conspiracy theories,” New York Times Magazine, May 21, 2013, https://www.nytimes.com/2013/05/26/magazine/why-rational-people-buy-into-conspiracy-theories.html, accessed June 20, 2019.
Swami, V., Chamorro‐Premuzic, T., and Furnham, A., “Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs,” Applied Cognitive Psychology, 2010, 24(6): 749–761, https://doi.org/10.1002/acp.1583.
Goertzel, T., “Belief in conspiracy theories,” Political Psychology, 1994, 15(4): 731–742, https://doi.org/10.2307/3791630.
Lewandowsky, S., Gignac, G. E., and Oberauer, K., “The role of conspiracist ideation and worldviews in predicting rejection of science,” PLOS ONE, 2013, 8(10): e75637, https://doi.org/10.1371/journal.pone.0075637.
Dixon, R. M. and Jones, J. A., “Conspiracist ideation as a predictor of climate-science rejection: An alternative analysis,” Psychological Science, 2015, 26(5): 664–666, https://doi.org/10.1177/0956797614566469.
Lewandowsky, S., Gignac, G. E., and Oberauer, K., “The robust relationship between conspiracism and denial of (climate) science,” Psychological Science, 2015, 26(5): 667–670, https://doi.org/10.1177/0956797614568432.
Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., et al., “The science of fake news: Addressing fake news requires a multidisciplinary effort,” Science, 2018, 359(6380): 1094–1096, https://doi.org/10.1126/science.aao2998.
Annenberg Public Policy Center, “Jamieson offers new name for fake news: ‘Viral deception’ or VD,” March 6, 2017, https://www.annenbergpublicpolicycenter.org/on-cnn-jamieson-offers-new-name-for-fake-news-viral-deception-or-v-d/, accessed June 20, 2019.
Valkenburg, P. M. and Peter, J., “The differential susceptibility to media effects model,” Journal of Communication, 2013, 63: 221–243.
Pennycook, G. and Rand, D. G., “Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning,” Cognition, 2019, 188: 39–50, https://doi.org/10.1016/j.cognition.2018.06.011.
Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., and Cannon, T. D., “Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism and reduced analytical thinking,” Journal of Applied Research in Memory and Cognition, 2019, 8(1): 108–117, https://doi.org/10.1016/j.jarmac.2018.09.005.
Childs, O., “Don’t believe the hype—10 persistent cancer myths debunked,” Cancer Research UK, March 24, 2014, https://scienceblog.cancerresearchuk.org/2014/03/24/dont-believe-the-hype-10-persistent-cancer-myths-debunked/#Big-Pharma, accessed June 21, 2019.
National Academies of Sciences, Engineering, and Medicine, Genetically Engineered Crops: Experiences and Prospects (Washington, DC: National Academies Press, 2016), https://doi.org/10.17226/23596.
Society of Toxicology, “SOT issue statement: Food and feed safety of genetically engineered food crops,” November 2017, https://www.toxicology.org/pubs/statements/SOT_Safety_of_GE_Food_Crops_Issue_Statement_FINAL.pdf, accessed June 21, 2019.
Adams, M., “Zika virus outbreak linked to release of genetically engineered mosquitoes…disastrous unintended consequences now threaten life across the Americas,” Natural News, February 1, 2016, https://www.naturalnews.com/052824_Zika_virus_genetically_engineered_mosquitoes_unintended_consequences.html, accessed June 21, 2019.
RT, “GMO mosquitoes could be cause of Zika, critics say,” January 30, 2016, https://www.rt.com/news/330728-gmo-mosquitoes-zika-virus/, accessed June 21, 2019.
Schipani, V., “GMOs didn’t cause Zika outbreak,” FactCheck.org, February 23, 2016, https://www.factcheck.org/2016/02/gmos-didnt-cause-zika-outbreak/, accessed June 21, 2019.
DeStefano, F., “Vaccines and autism: Evidence does not support a causal association,” Clinical Pharmacology & Therapeutics, 2007, 82(6): 756–759, https://doi.org/10.1038/sj.clpt.6100407.
Kalkbrenner, A., Schmidt, R. J., and Penlesky, A. C., “Environmental chemical exposures and autism spectrum disorders: A review of the epidemiological evidence,” Current Problems in Pediatric and Adolescent Health Care, 2014, 44(10): 277–318, https://doi.org/10.1016/j.cppeds.2014.06.001.
Adams, M., “Autism, mercury, thimerosal and vaccines: Natural News releases large collection of scientific knowledge that’s been suppressed by the FDA, CDC, and pharma-controlled media,” Natural News, March 5, 2017, https://www.naturalnews.com/2017-03-05-autism-mercury-thimerosal-and-vaccines-natural-news-releases-collection-of-scientific-knowledge-thats-been-suppressed.html, accessed June 21, 2019.
Lilley, J., “Vaccines cause autism, says confidential document from corrupt drug company,” Natural News, April 23, 2015, https://www.naturalnews.com/049458_autism_infanrix_vaccine_glaxosmithkline.html, accessed June 21, 2019.
Kahan, D. M., “‘Ordinary science intelligence’: A science-comprehension measure for study of risk and science communication, with notes on evolution and climate change,” Journal of Risk Research, 2017, 20(8): 995–1016, https://doi.org/10.1080/13669877.2016.1148067.
Rizopoulos, D., “ltm: A package for latent variable modeling and item response analysis,” Journal of Statistical Software, 2006, 17(5): 1–25, http://www.jstatsoft.org/v17/i05/.
Lindeman, R. H., Merenda, P. F., and Gold, R. Z., Introduction to Bivariate and Multivariate Analysis (Glenview, IL: Scott, Foresman, 1980).
Grömping, U., “Relative importance for linear regression in R: The package relaimpo,” Journal of Statistical Software, 2006, 17(1): 1–27, https://doi.org/10.18637/jss.v017.i01.
Figure 1. Predicted probability of rejecting climate change based on conspiracy mentality by political party. Post hoc simple effects tests with Bonferroni correction (adjusting the cutoff alpha to .016) suggest that the effect of conspiracy mentality on rejecting climate change is not statistically significant for any of the party affiliations, despite what might appear to be positive relationships depicted in the figure (Democrats: b = 0.51, p = .067; Republicans: b = 0.06, p = .813; independents: b = 0.01, p = .964; unaffiliated/other: b = 0.19, p = .537). Shaded regions represent 95% confidence intervals.
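The predicted probabilities plotted in Figures 1 and 2 come from a GLM with a logit link: the linear predictor is passed through the inverse-logit transformation to yield a probability. A minimal sketch of that transformation (the intercept and slope values below are illustrative placeholders, not the fitted coefficients from this study):

```python
import math

def predicted_probability(intercept: float, slope: float, x: float) -> float:
    """Inverse-logit: converts a GLM linear predictor into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * x)))

# Illustrative only: a slope of 0.51 (the Democrats' simple effect reported in
# the Figure 1 caption) applied across standardized conspiracy mentality scores,
# with a hypothetical intercept of -1.0.
for score in (-2, 0, 2):
    print(score, round(predicted_probability(-1.0, 0.51, score), 3))
```

A positive slope shifts the curve upward with increasing conspiracy mentality, which is why the panels can look like rising trends even when the slope is not statistically distinguishable from zero.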

Figure 2. Party affiliation by science literacy interaction effect on the predicted probability of rejecting climate change. Whereas Democrats (b = –0.71, p = .025) and Independents (b = –0.52, p = .026) are less likely to reject climate change with increasing science literacy, Republicans and the unaffiliated/other are more likely to do so. Shaded regions represent 95% confidence intervals.

Table 1. Results of GLM predicting rejection of well-supported scientific theories when controlling for sample.

Figure 3. Interaction effect of party and conspiracy mentality on the rejection of human evolution. Simple effects tests with Bonferroni correction (adjusting the cutoff p value to .013) suggest that the effect of conspiracy mentality on rejecting evolution is marginally significant for Republicans (b = 0.57, p = .018) and for unaffiliated/other (b = 1.66, p < .001), but it is not significant for Democrats (b = 0.41, p = .067) or independents (b = 0.19, p = .350).
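The Bonferroni correction described in this caption simply divides the nominal alpha by the number of simple-effects tests: with four party groups, .05 / 4 = .0125, reported as .013. A minimal sketch of the decision rule (function names are illustrative, not taken from the authors' analysis code):

```python
def bonferroni_alpha(alpha: float, n_tests: int) -> float:
    """Per-test significance cutoff after Bonferroni correction."""
    return alpha / n_tests

def is_significant(p: float, alpha: float = 0.05, n_tests: int = 1) -> bool:
    """Compare a p value against the Bonferroni-adjusted cutoff."""
    return p < bonferroni_alpha(alpha, n_tests)

# p values reported in the Figure 3 caption, tested against .05 / 4 = .0125:
print(is_significant(0.018, n_tests=4))  # Republicans: p = .018 exceeds the corrected cutoff (the caption calls this marginal)
print(is_significant(0.001, n_tests=4))  # unaffiliated/other (reported as p < .001): passes
```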

Table 2. Results from the regression analyses predicting the four deceptive claims.

Figure 4. Predicting endorsement of the claim that GMOs cause cancer and corporations are covering this up on a scale from definitely false (1) to definitely true (4). There was a significant interaction between conspiracy mentality and science literacy. Among people with lower conspiracy mentality (scores less than –1), higher science literacy predicted evaluating the claim as more likely to be false. Among people with higher conspiracy mentality (scores greater than 1), higher science literacy predicted evaluating the claim as more likely to be true.

Figure 5. Relative importance of the factors predicting susceptibility to each deceptive claim. Conspiracy mentality and science literacy were the two factors that accounted for the most response variance.

Supplementary material: Landrum and Olshansky supplementary material (File, 1.3 MB).