Abstract
Conspiracy theories can be encountered repeatedly, which raises the issue of the effect of repeated exposure on beliefs. Earlier studies found that repetition increases truth judgments of factual statements, whether these statements are uncertain, highly implausible, or even fake news. Would this "truth effect" be observed with conspiracy statements? If so, is the effect size smaller than the typical truth effect, and is it associated with individual differences such as cognitive style and conspiracy mentality? In the present preregistered study, we addressed these three issues. We asked participants to provide binary truth judgments on conspiracy and factual statements that had either been displayed in an exposure phase (an interest judgment task) or were new (displayed only in the truth judgment task). We measured participants' cognitive style with the three-item Cognitive Reflection Test (CRT) and conspiracy mentality with the Conspiracy Mentality Questionnaire (CMQ). Importantly, we found that repetition increased truth judgments of conspiracy theories, unmoderated by cognitive style and conspiracy mentality. Additionally, we found that the truth effect was smaller with conspiracy theories than with uncertain factual statements, and we suggest explanations for this difference. The results suggest that repetition may be a simple way to increase belief in conspiracy theories. Whether repetition increases conspiracy beliefs in natural settings, and how it contributes to conspiracism compared to other factors, are important questions for future research.
Introduction
Repetition typically increases truth judgments of statements regardless of their actual truth (for meta-analysis, see Dechêne et al., 2010; see also Brashier & Marsh, 2020; Pillai & Fazio, 2021; Unkelbach et al., 2019). This "truth effect" is commonly explained by processing fluency (e.g., Reber & Schwarz, 1999; Unkelbach & Greifeneder, 2013) and familiarity (e.g., Begg et al., 1992). Repetition makes statements easier to process and more familiar than new ones, which are used as cues for truth (e.g., Brashier & Marsh, 2020; Ecker et al., 2022; see, e.g., Unkelbach & Rom, 2017, for a referential account).
The bulk of studies on the truth effect used uncertain factual statements (Henderson et al., 2022), often assuming that the truth ambiguity of statements is necessary to observe the truth effect (e.g., Dechêne et al., 2010; Unkelbach & Stahl, 2009). Some recent studies used more diverse statements, some of which challenged this assumption. For instance, the truth effect has been observed with true COVID-19 statements (Unkelbach & Speckmann, 2021), political opinions (Arkes et al., 1989), rumors (DiFonzo et al., 2016), fake news (Pennycook et al., 2018), emotional statements (Moritz et al., 2012), and statements that contradict prior knowledge (Fazio, 2020; Fazio et al., 2015), sometimes blatantly so (Fazio et al., 2019; Lacassagne et al., 2022).
In the present study, we investigated whether repetition increases belief in conspiracy theories (hereafter, conspiracism). For the present purpose, we confine ourselves to defining conspiracism as "a belief that two or more actors have coordinated in secret to achieve an outcome and that their conspiracy is of public interest but not public knowledge" (Douglas & Sutton, 2023, p. 282; see also, e.g., Douglas et al., 2019; Keeley, 1999; Nera & Schöpfer, 2023). This definition is agnostic to the truth of conspiracy theories (some may be true, and others false). However, conspiracy theories are "epistemically risky" (Douglas & Sutton, 2023), meaning that they are typically implausible and prone to falsity – as a result, conspiracy theories are often considered a form of false and misleading information (Pennycook & Rand, 2021).
With the Internet, conspiracy theories can spread broadly, raising questions such as the antecedents and consequences of conspiracism (e.g., van Prooijen & van Vugt, 2018). Conspiracism is assumed to be rooted in individual differences and predispositions. For instance, intuitive (analytic) thinking has been associated with increased (decreased) conspiracism (e.g., Swami et al., 2014; van Prooijen, 2017). Other individual differences such as motivations to believe (Biddlestone et al., 2022; Douglas et al., 2019; Douglas et al., 2017), belief in finalism (Wagner-Egger et al., 2018), paranoia (Brotherton & Eser, 2015), other personality traits (Goreis & Voracek, 2019), and demographic factors (e.g., Freeman & Bentall, 2017) have also been associated with conspiracism (see, e.g., Douglas & Sutton, 2023, for an overview).
Research also investigated the consequences of exposure to conspiracy theories on behavior, behavioral intentions, and prejudice (Jolley & Douglas, 2014a, 2014b; Jolley, Meleady, & Douglas, 2020a; van der Linden, 2015; for a review, see Jolley, Mari, & Douglas, 2020b). The findings are consistent with the possibility of an increase in belief due to participants being merely exposed to a conspiracy theory. Of importance, these studies did not collect measures of belief in the presented conspiracy statements (e.g., adhesion, truth judgments), or these measures were not collected as a function of repeated exposure. In addition, such studies typically displayed only one overarching conspiracy theory – conspiracy statements that are thematically related (e.g., conspiracy theories of Princess Diana's death; Douglas & Sutton, 2008; Jolley & Douglas, 2014a). To test the effect of prior exposure on belief, one needs to measure belief in conspiracy theories when they were presented before and when they are new – in other words when exposure to the conspiracy theories is repeated or not.
To our knowledge, no experiment has investigated the effects of (repeated) exposure to conspiracy theories on their believability. As endorsing conspiracy theories may be key to influencing behavior, it is critical to directly address the causal role of repetition in truth judgments of conspiracy theories. Relatedly, Muirhead and Rosenblum (2019) proposed the concept of "new conspiracism," which refers to the phenomenon whereby repetition, not evidence, is commonly used to validate conspiracy theories. Such conspiratorial thinking, Muirhead and Rosenblum reasoned, dispenses with the burden of explanation (which is necessary to uncover real conspiracies, e.g., through journalistic investigation) and imposes its reality through repetition (exemplified by the catchphrase "a lot of people are saying"), which is amplified by social media. Although this account, developed in political science, assigns repetition a major role, this role had yet to be tested.
Here, we ask whether the truth effect extends to conspiracy statements.
In an earlier investigation, Béna et al. (2019) found initial evidence in line with the hypothesis that repetition might increase the perceived truth of conspiracy statements. Béna et al. reanalyzed large-scale surveys that used representative samples of the French population (Institut Français d'Opinion Publique (IFOP), 2017, 2019). In these surveys, respondents indicated whether they had already seen ten conspiracy statements corresponding to popular conspiracy theories (e.g., that NASA faked the moon landing) and to what extent they agreed with them. The re-analyses showed that participants agreed more with conspiracy statements they recognized than with those they did not recognize. Although Béna et al. were not in a position to analyze agreement as a function of actual repetition, only as a function of perceived repetition, their results align with studies finding that recognized statements were believed more than statements deemed new, whether the statements were actually old or not (Bacon, 1979).
In the present high-powered preregistered experiment, we manipulated repeated exposure to conspiracy statements and uncertain factual statements (trivia statements; Footnote 1). Based on the range of statements with which the truth effect has been found and on the initial results from Béna et al. (2019), we hypothesized that repeated exposure would increase truth judgments of conspiracy statements. We included trivia statements as a reference point (Footnote 2), allowing us to compare the magnitude of the truth effect for conspiracy statements with its magnitude for trivia statements (Footnote 3). Finding the truth effect with conspiracy statements would be informative, as we would learn that repeated exposure is a possible antecedent of conspiracism.
By experimentally repeating statements only once, manipulating materials within participants, and administering a true/false truth judgment task, we conducted a conservative test of the truth effect with conspiracy statements. For instance, the truth effect was not found with highly implausible statements (e.g., "Elephants run faster than cheetahs") when only one repetition and scales with few response points were used (Pennycook et al., 2018), but it occurred when more repetitions and a sensitive scale were involved (Lacassagne et al., 2022).
In addition to assessing the causal effect of repetition on truth judgments of conspiracy and trivia statements, we also probed participants' cognitive style and conspiracy mentality (two widely studied individual differences in the context of conspiracism). As mentioned above, conspiracism is associated with several individual differences, including cognitive style. In contrast, truth effect research found little evidence for correlations between the truth effect and individual differences, including cognitive style (de Keersmaecker et al., 2020; but see Newman et al., 2020, for a correlation with the need for cognition). If we find a truth effect with conspiracy statements, we can ask whether it depends on individual differences, such as cognitive style and conspiracy mentality. On this matter, no straightforward prediction can be derived from the null results involving individual differences in the truth effect literature, nor from results involving individual differences in conspiracism research. As a result, these analyses were exploratory.
In addition, as we included trivia statements, we tested whether the size of the truth effect with trivia statements depends on cognitive style (conceptually replicating previous research, e.g., de Keersmaecker et al., 2020; conspiracy mentality is less relevant on this matter).
Methods
We report how we determined our sample size, criteria for data exclusion, all manipulations, and all measures in the study. The preregistration, experiment program, data, and analyses are publicly available at https://osf.io/edzac.
Participants and design
We used a 2 (Repetition: repeated vs. new) × 2 (Materials: conspiracy or trivia) design, with the two factors manipulated within participants. Trivia statements were either factually true or false, which is a nested manipulation inside the trivia statements condition.
We collected complete data from a total of 374 participants online. After data exclusion (Footnote 4), there were 299 participants in the final sample (Mage = 28.59 years, SDage = 11.43, 82.6% women, 57.53% students). An a priori power analysis (conducted with G*Power 3.1.9.7; Faul et al., 2007) showed that we needed 282 participants to detect an effect of Repetition on proportions of "true" judgments in the conspiracy statements condition (the critical effect we are interested in) as small as Cohen's d = 0.2 (in a two-tailed paired samples t-test with α = .05/4 = .0125; 1-β = .8).
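The reported sample size can be checked with a standard power routine; the sketch below uses Python's statsmodels rather than G*Power (an assumption of this illustration, not the authors' tool), treating the paired-samples test as a one-sample t-test on the difference scores.

```python
# Sketch of the a priori power analysis: smallest effect of interest
# d = 0.2, Bonferroni-corrected alpha = .05/4, power = .80, two-tailed.
# A paired-samples t-test is equivalent to a one-sample t-test on the
# per-participant difference scores, hence TTestPower.
from math import ceil

from statsmodels.stats.power import TTestPower

n = TTestPower().solve_power(
    effect_size=0.2,          # Cohen's d = 0.2
    alpha=0.05 / 4,           # alpha = .0125
    power=0.8,                # 1 - beta = .80
    alternative="two-sided",
)
print(ceil(n))  # required number of participants
```

The result should be close to the 282 participants reported above (small discrepancies can arise from the approximation used by each tool).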
Materials
Conspiracy statement selection
To operationalize conspiracy theories, we used 20 existing and widespread conspiracy statements (e.g., NASA faked the moon landing; Lady Diana's accident was a disguised murder). We used 18 conspiracy statements from IFOP (2017, 2019, 2020; see also Wagner-Egger et al., 2018) and created two additional conspiracy statements (one on hydroxychloroquine, the other on climate change). The 20 conspiracy statements we used are available in French at https://osf.io/dtn9q.
Trivia statement selection
To use statements whose truth was uncertain on average, we selected 20 factual statements (e.g., "There are no domestic snakes in Scotland and Greenland") about a variety of topics (science, arts, history) from a larger pool of statements selected to be uncertain (including French translations of statements from Unkelbach & Rom, 2017, and Silva, 2014). Ten statements were factually true, and ten were factually false. The 20 trivia statements we used are available in French at https://osf.io/dtn9q.
Statement presentation
For each participant, 40 statements (20 conspiracy statements; ten true factual uncertain statements; ten false factual uncertain statements) were randomly allocated to either the repeated or new condition. In each Repetition condition, there were 20 statements (half conspiracy statements, half trivia statements).
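The per-participant allocation described above can be sketched as follows (a minimal Python illustration with placeholder statement labels, not the authors' JavaScript code): each material category is split evenly between the repeated and new conditions.

```python
# Minimal sketch of the random allocation: within each category, shuffle
# the statements and assign half to the "repeated" condition and half to
# the "new" condition, so both conditions are balanced on materials.
import random

def allocate(statements_by_type):
    """statements_by_type: dict mapping a category name to its statements."""
    repeated, new = [], []
    for statements in statements_by_type.values():
        shuffled = random.sample(statements, len(statements))
        half = len(shuffled) // 2
        repeated += shuffled[:half]
        new += shuffled[half:]
    return repeated, new

# Placeholder labels standing in for the actual French statements.
pool = {
    "conspiracy": [f"conspiracy_{i}" for i in range(20)],
    "trivia_true": [f"trivia_true_{i}" for i in range(10)],
    "trivia_false": [f"trivia_false_{i}" for i in range(10)],
}
repeated, new = allocate(pool)
```

Each call yields 20 repeated and 20 new statements, with ten conspiracy and ten trivia statements (five true, five false) in each condition.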
Cognitive style
We used a French version of the original three-item Cognitive Reflection Test (CRT; Frederick, 2005) to probe participants' cognitive style. The CRT is intended to probe individual differences in the tendency to override intuitive but incorrect responses (e.g., "In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?" [French translation in the current study: “Un lac est recouvert de nénuphars dont l'étendue double chaque jour. Si les nénuphars mettent 48 jours à couvrir toute la surface du lac, en combien de temps en couvriraient-ils la moitié ?”]). We computed the number of problems correctly solved (M = 1.4; SD = 1.15). No or few problems solved are associated with intuitive thinking, while more solved problems are associated with analytic thinking.
Conspiracy mentality
We administered the Conspiracy Mentality Questionnaire (CMQ; Bruder et al., 2013, translated into French by Lantian et al., 2016; Footnote 5). The CMQ consists of five items aimed at probing individuals’ general susceptibility to conspiracy explanations (e.g., “I think that events which superficially seem to lack a connection are often the result of secret activities” [French translation: “Je pense que des événements qui, en apparence, ne semblent pas avoir de lien sont souvent le résultat d’activités secrètes”]). Participants indicated how likely they thought the five statements were on a 5-point Likert scale ("Certainly not, 0%", "25%", "50%", "75%", "Certainly, 100%"). For each participant, we computed the mean response (Cronbach's α = .82; M = 3.16; SD = 0.85), with higher scores indicating a higher conspiracy mentality.
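For readers unfamiliar with the reliability index reported above, Cronbach's alpha can be computed from the item variances and the variance of the total score; the sketch below uses simulated responses, not the study data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
# Illustrative sketch with made-up continuous responses, not the CMQ data.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: participants x items matrix of responses."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=200)                         # latent disposition
items = trait[:, None] + rng.normal(size=(200, 5))   # five noisy indicators
alpha = cronbach_alpha(items)
```

With five items that each share about half of their variance with the latent disposition, alpha comes out in the same region as the .82 reported for the CMQ.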
Procedure
After ethical committee approval, we ran the study online with the Qualtrics online survey tool (Qualtrics, Provo, UT) between 5 October 2020 and 12 January 2021. We wrote JavaScript code to randomize, for each participant, the allocation of statements to each repetition condition and their order of presentation. One of the co-authors distributed the study to various French-speaking Facebook groups related (e.g., undergraduate student groups from several majors) and unrelated (e.g., news groups from several French cities) to our university. As a result, the researcher and his interest in the truth effect were unlikely to be known to the participants overall, and the final sample is unlikely to mainly reflect the researcher's own network. The post indicated that the study was about the evaluation of information, without further details. Participants were strongly encouraged to take part on a computer in a quiet room.
The study was conducted online in French. After the display of the consent form and the collection of their agreement, participants gave demographic information (sex, age, professional situation, mother tongue, level on the Common European Framework of Reference scale for French if the mother tongue was other than French).
Instructions then indicated that statements, some true and some false, would be displayed without a time limit with the task to rate their interest (as frequently done in truth effect studies, see, e.g., Henderson et al., 2022) on a 5-point Likert scale (1 – "Not interesting at all"; 5 – "Extremely interesting"). Participants then rated the interest of 20 statements (ten conspiracy statements; five trivia false; five trivia true) displayed in a random order one by one in the center of the screen.
Immediately after this task, participants were introduced to the true/false truth judgment task. In this task, the 20 statements from the interest judgment task were mixed with 20 new ones (ten conspiracy statements; five trivia false; five trivia true) and displayed in a random order one by one in the center of the screen without a time limit. The instructions stressed that it was important to answer even if some statements seemed odd or if the participants were uncertain. Participants were in addition asked not to look for information about the statements during the task.
Once the truth judgment task was completed, we administered the three-item CRT and the CMQ. The CRT and CMQ order was counterbalanced between participants. In the CRT, participants were asked to solve three short problems displayed individually in a random order, without time limit. Participants had to give their response in an open numerical format. In the CMQ, we told participants that we were interested in their personal opinion and that they would indicate the extent to which they thought the five items, displayed on the same screen, were true.
Finally, we asked participants (1) whether they looked for information about the statements or the problems during the study (yes/no answer), (2) whether they happened to answer without reading the displayed statements (yes/no answer), and (3) after reading the study objectives, whether they allow us to use their data in our analyses (yes/no answer). We used responses to these three questions as exclusion criteria (see the Participants and design section above). Participants were then thanked and debriefed in a concluding text.
Results
To conduct the statistical analyses, we used R (R Core Team, 2021) and the packages afex (Singmann et al., 2021, version 1.0-1), emmeans (Lenth, 2020, version 1.5.2-1), and stats (in base R). We calculated Cohen’s d with effsize (Torchiano, 2020, version 0.8.1). We made the raincloud plots (Allen et al., 2021) with scripts from Allen et al. and ggplot2 (Wickham, 2016, version 3.3.5); we made the regression plots with interactions (Long, 2019, version 1.1.0) and ggpubr (Kassambara, 2020, version 0.4.0).
A truth effect with trivia and conspiracy statements
We conducted the preregistered 2 (Repetition: repeated or new) × 2 (Materials: conspiracy or trivia statements) repeated-measures ANOVA on the proportions of "true" responses (see Fig. 1). The main effect of Repetition was statistically significant, F(1, 298) = 119.45, p < .001, η2G = .041. Overall, repeated statements were more often judged as true (M = .51; SD = .13) than new ones (M = .42; SD = .13). The main effect of Materials was also significant, F(1, 298) = 877.19, p < .001, η2G = .599. Trivia statements were more often judged as true (M = .72; SD = .19) than conspiracy statements (M = .21; SD = .19). Critically, these main effects were qualified by a two-way interaction, F(1, 298) = 42.7, p < .001, η2G= .015 (see Fig. 1).
To interpret the two-way interaction between Repetition and Materials, we conducted pairwise comparisons based on the full model in each Materials condition. For trivia statements, "true" responses were more frequent when the statements were repeated (M = .79; SD = .21) than when they were new (M = .65; SD = .22) – the typical truth effect, t(298) = 11.43, p < .0001, Cohen’s d = 0.649, 95%CId = [0.526; 0.772]. This effect of repetition was also significant for conspiracy statements: "true" responses were more frequent for repeated (M = .22; SD = .22) than new statements (M = .19; SD = .19), t(298) = 3.45, p = .0006, d = 0.169, 95%CId = [0.072; 0.266]. The truth effect was significant for both trivia and conspiracy statements, but larger for trivia statements (as indicated by the non-overlapping 95%CI of the Cohen's ds and the significant interaction between Repetition and Materials in the ANOVA).
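The pairwise test and the within-subject effect size reported above can be reproduced in outline as follows (synthetic data with proportions roughly matching those reported; the data and seed are illustrative assumptions, not the study data).

```python
# Sketch of the critical pairwise comparison: per-participant proportions
# of "true" responses for repeated vs. new conspiracy statements, a paired
# t-test, and a within-subject Cohen's d (mean of the difference scores
# divided by their SD). Synthetic data, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 299
# Each participant judged ten new and ten repeated conspiracy statements.
p_new = rng.binomial(10, 0.19, size=n) / 10
p_repeated = rng.binomial(10, 0.22, size=n) / 10

t, p = stats.ttest_rel(p_repeated, p_new)
diff = p_repeated - p_new
d = diff.mean() / diff.std(ddof=1)
```

Note the identity t = d × √n for this within-subject d, which is a useful sanity check against reported statistics.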
Truth effect scores are unmoderated by CMQ and CRT scores
We conducted the preregistered multiple regression model with "true" responses proportions as the dependent variable and Repetition, Materials (both dummy-coded), CMQ scores, and the number of correct responses in the CRT (both standardized) as factors (participants as a random variable).
Similar to the ANOVA reported above, we found a main effect of Repetition, F(1, 885) = 66.84, p < .001, a main effect of Materials, F(1, 885) = 2211.9, p < .001, and a significant two-way interaction between Repetition and Materials, F(1, 885) = 22.29, p < .001. No other interaction involving Repetition was statistically significant (Footnote 7), indicating that the size of the truth effect was not significantly moderated by CMQ and CRT scores, with either trivia statements or conspiracy statements.
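A model of the kind described above can be sketched as a linear mixed model with a random intercept per participant; the variable names and simulated data below are illustrative assumptions, not the authors' code or data.

```python
# Sketch of the moderation analysis: proportions of "true" responses
# regressed on dummy-coded Repetition and Materials, standardized CMQ and
# CRT scores, and their interactions, with a per-participant random
# intercept. Data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for pid in range(300):
    cmq_z, crt_z = rng.normal(), rng.normal()
    pid_intercept = rng.normal(scale=0.05)           # random intercept
    for repeated in (0, 1):
        for conspiracy in (0, 1):
            base = 0.65 if conspiracy == 0 else 0.20 + 0.08 * cmq_z
            prop = base + 0.10 * repeated - 0.06 * repeated * conspiracy
            rows.append(dict(pid=pid, repeated=repeated, conspiracy=conspiracy,
                             cmq_z=cmq_z, crt_z=crt_z,
                             prop_true=prop + pid_intercept
                                       + rng.normal(scale=0.1)))

data = pd.DataFrame(rows)
model = smf.mixedlm("prop_true ~ repeated * conspiracy * (cmq_z + crt_z)",
                    data, groups=data["pid"]).fit()
print(model.summary())
```

The formula expands to all main effects and interactions among Repetition, Materials, and the two individual-difference scores, mirroring the structure of the preregistered model.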
We found a main effect of CMQ scores on the proportions of "true" responses, F(1, 295) = 68.91, p < .001: Higher CMQ scores were associated with larger proportions of "true" responses. We found a significant two-way interaction between CMQ scores and Materials, F(1, 885) = 139.32, p < .001. For trivia statements, proportions of "true" responses did not vary as a function of CMQ scores (see Fig. 2a). We tested this effect in a non-preregistered multiple regression similar to the analysis reported above, except that we removed the Materials factor and restricted the analysis to the trivia (or conspiracy) statements. The effect of CMQ scores was not significant, F(1, 295) = 2.12, p = .146. In contrast, for conspiracy statements, higher CMQ scores were associated with larger proportions of "true" responses, F(1, 295) = 192.98, p < .001. The latter result aligns with the notion that CMQ scores capture a general tendency to believe in various conspiracy theories.
Back to the full model, another statistically significant effect was a two-way interaction effect between Materials and CRT scores, F(1, 885) = 35.43, p < .001 (see Fig. 2b). Similar to the non-preregistered analyses conducted to decompose the interaction involving CMQ scores, we decomposed the interaction between Materials and CRT scores. For trivia statements, higher CRT scores were associated with larger proportions of "true" responses, F(1, 295) = 9.76, p = .002. In contrast, for conspiracy statements, higher CRT scores were associated with smaller proportions of "true" responses, F(1, 295) = 14.82, p < .001.
Discussion
Repetition increases truth judgments of false, implausible, and misleading information. Although conspiracy theories can be seen as such statements, whether repetition increases truth judgments of conspiracy theories had yet to be investigated. It has recently been noted that exposure to conspiracism is rarely experimentally varied (Douglas & Sutton, 2023), despite the relevance of such manipulation for both truth effect and conspiracism research (see below). In the present experiment, we manipulated repeated exposure to conspiracy and trivia statements before asking participants to judge the truth of repeated and new statements. We also assessed participants' conspiracy mentality and cognitive style (intuitive vs. analytic thinking).
We found that repetition increased truth judgments of trivia statements (replicating the truth effect with the typical materials, e.g., Dechêne et al., 2010; Unkelbach et al., 2019) and conspiracy statements (extending the demonstration of the truth effect to another category of statements). This extension dovetails with findings that repetition increases the perceived truth of statements, even implausible and misleading ones (Fazio et al., 2019; Pillai & Fazio, 2021; see below). While this was not our main goal, the present study addresses one limitation of truth effect research, namely the need for more diverse materials, particularly those related to health and politics (Henderson et al., 2022). Regarding conspiracism, we provide empirical support for a causal effect of repetition on conspiracism, whereas (repeated) exposure is rarely varied in conspiracism research (Douglas & Sutton, 2023) and its effect on conspiracy beliefs is rarely assessed. Finding the truth effect with conspiracy statements suggests that situational factors, in addition to individual factors (e.g., personality; motivation), are central to explaining conspiracism (Brashier, 2023; Douglas et al., 2017).
Of note, we did not find associations between conspiracy mentality or cognitive style and the size of the truth effect, whether with trivia or conspiracy statements. Failing to find a relationship between cognitive style and the truth effect with trivia statements aligns with previous research that also failed to do so (de Keersmaecker et al., 2020) and with the general difficulty in finding associations between quantitative individual differences and the truth effect (for an exception, see, e.g., Newman et al., 2020). Turning to conspiracy statements, the absence of associations between the truth effect and conspiracy mentality or cognitive style may be surprising if one assumes that beliefs in conspiracy theories are mainly rooted in individual differences such as conspiratorial or intuitive thinking (Bago et al., 2022). Beyond suggesting that situational factors may prove important to understanding conspiracism, these null results suggest that such situational factors operate independently of some individual ones.
Consistent with Swami et al. (2014), we found that analytic thinking was negatively associated with conspiracy statements' overall level of truth judgments. We also found results consistent with conspiracy mentality capturing a general propensity towards conspiratorial thinking (e.g., Imhoff & Bruder, 2014): truth judgments of conspiracy (but not trivia) statements were positively associated with conspiracy mentality, regardless of repetition.
Overall, the present study suggests that repeated exposure may be a simple way to increase conspiracism. Although the effect size we found was relatively small (d = 0.169; 95%CId = [0.072; 0.266]) and smaller than the truth effect with trivia statements (d = 0.649; 95%CId = [0.526; 0.772]), the present study was a rather conservative test: Conspiracy statements were experimentally repeated only once, and we used a binary truth judgment task. Our results suggest that one repetition was enough to make some conspiracy statements believed more, to the point of being judged true rather than false. As more repetitions have been shown to increase the size of the truth effect (e.g., Fazio et al., 2022; Hassan & Barber, 2021), real-world settings – in which the same information may be repeated more than once – may lead to even larger effects of repetition on conspiracism.
One interesting question is why the truth effect was smaller for conspiracy than for trivia statements. Two possible explanations are implausibility and exposure rates. Conspiracy theories are less likely than trivia statements to be perceived as true regardless of repetition (as was found here – because they are epistemically risky; see the Introduction, and Douglas & Sutton, 2023). This relative implausibility makes it likely that statements initially perceived as false remain perceived as false even if repetition increases their perceived truth. As a result, the truth effect is less likely to be observed for implausible (including conspiracy) statements than for relatively plausible statements, even if repetition increases perceived truth regardless of a statement's plausibility (see Fazio et al., 2019, for a model and empirical support; see Lacassagne et al., 2022, for small increases in truth judgments of implausible statements).
The second explanation we consider is exposure rates. We used widespread conspiracy theories (such as "The Americans have never been to the Moon and NASA faked evidence and images of the Apollo mission's landing on the Moon," which 63% of a representative sample of the French population declared having already heard before participating in a survey; IFOP, 2019). As a result, it is possible that we compared one additional exposure to conspiracy statements and one single exposure to trivia statements. If so, and because there is evidence for a logarithmically shaped effect of repetition on truth judgments (Fazio et al., 2022; Hassan & Barber, 2021; the repetition-induced increase in truth judgment is larger for initial than subsequent repetitions), the repetition-induced increase in truth judgments for already-heard conspiracy theories is likely to be smaller than for unknown trivia statements. Future research may build on the present study design to orthogonally manipulate factors of interest beyond materials, such as statements' plausibility or experimental exposure rates.
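The logarithmic account can be made concrete with a toy calculation: under a log-shaped exposure-truth relation, the gain from one additional exposure shrinks as prior exposures accumulate (the coefficients below are arbitrary, chosen for illustration only).

```python
# Toy illustration of a logarithmic exposure-truth relation (shape as in
# Fazio et al., 2022; intercept a and slope b are arbitrary here): the
# increment from the n-th to the (n+1)-th exposure decreases with n.
import math

def perceived_truth(exposures: int, a: float = 0.40, b: float = 0.10) -> float:
    return a + b * math.log(1 + exposures)

first_gain = perceived_truth(1) - perceived_truth(0)   # unknown statement: 0 -> 1
later_gain = perceived_truth(6) - perceived_truth(5)   # familiar statement: 5 -> 6
```

Here the first exposure yields a gain of b·ln(2), several times larger than the gain from a sixth exposure, consistent with a smaller truth effect for already-heard conspiracy theories than for unknown trivia statements.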
Through analyses of two large-scale surveys (IFOP, 2017, 2019), Béna et al. (2019) found that perceived prior exposure could increase conspiracism. However, even if perceived exposure is associated with actual exposure, evidence for a causal effect of repeated exposure on conspiracy beliefs has been lacking. The present experiment provides such support in showing repetition-induced perceived truth of conspiracy statements.
We recommend exercising caution regarding the generalizability of the current findings to richer, real-world contexts. To determine the causal role of repetition on conspiracism, we used a truth effect paradigm, which is particularly suited to study how truth judgments depend on statements' repeated exposure. In the present experiment, statements were displayed without context or source information. In real-world contexts, statements come with various additional information, such as a source that can be more or less credible, be familiar or unknown, belong to one's own social group or to another one, to name a few. On social media, pictures often go together with titles of news articles, and comments and reactions appear next to the statements. Whether valid or not, possible sources of truth cues are various, and repeated exposure is only one of them. Whether repeated exposure increases conspiracism in natural settings is an open empirical question. Of interest, Nadarevic et al. (2020) found that participants rely on multiple cues to judge the truth of statements related to education, health, and politics on simulated social media posts. Testing whether repetition increases conspiracism in such settings would be informative to help identify when repetition delivers cues for truth judgments.
If repetition increases conspiracism beyond the procedure we used, a challenge is to reduce this effect. The truth effect with trivia statements is robust, and reducing it to non-significance is difficult. For instance, asking participants to avoid the truth effect reduced it but not to the point of canceling it (Calio et al., 2020; Nadarevic & Aßfalg, 2016). This result suggests that repetition-induced conspiracism may be difficult to cancel, too, although empirical evidence is still lacking.
Interestingly, research has found that repetition increases "has been used as fake news on social media" judgments – a "fakeness-by-repetition" effect (Corneille et al., 2020; see also Béna et al., 2022). This effect suggests that repetition may sometimes help fight misinformation effects rather than consistently being an issue to overcome. More research on the fakeness-by-repetition effect with consequential statements such as conspiracy theories and other types of misinformation would help identify judgment contexts where repetition can be used to fight belief in misinformation. Other interventions, such as orienting information processing on statements' truth right from the exposure phase, may help reduce the truth effect (e.g., Brashier et al., 2020; Nadarevic & Erdfelder, 2014; Smelter & Calvillo, 2020; see the "accuracy focus" to reduce the spread of misinformation, e.g., Pennycook et al., 2020, 2021; Roozenbeek et al., 2021). Whether such manipulations limit the effect of repetition on conspiracism is an important question for future research.
Conclusion
Repetition may be a simple way to increase conspiracism. The present experiment showed that the effect of repetition on truth judgments extends to conspiracy statements, regardless of cognitive style and conspiracy mentality. As we were interested in the causal role of repetition on conspiracism, we relied on a truth effect paradigm with minimal contextual information. Future research may test whether repetition increases conspiracism when other and possibly more diagnostic information is available. If this is the case, identifying ways to reduce repetition-induced conspiracism may contribute to fighting conspiracism as a whole.
Notes
Testing the truth effect with conspiracy statements cannot simply amount to testing it with trivia statements (whether factually true or false) because of four critical differences between conspiracy and trivia statements: First, conspiracy theories are specifically about actors (at least two) who work together in secret; hence, conspiracy theories are socially oriented, which is not necessarily the case for trivia statements. Second, the actors typically have malevolent intentions (e.g., spreading viruses such as AIDS or coronaviruses; see, e.g., Frenken & Imhoff, 2022); hence, conspiracy theories are emotionally (negatively) valenced, while trivia statements are typically of neutral valence. Third, conspiracy theories are often seen as false, misleading, and implausible (see above; although their truth status may be difficult to test and definitions need not assume conspiracy theories' truth status), while trivia statements used in truth effect studies are factually true or false and moderately plausible. Fourth and finally, believing conspiracy theories is consequential (e.g., Sassenberg et al., 2023), while belief in neutral trivia statements is typically harmless.
This is because failing to find a truth effect with conspiracy statements is more informative when a truth effect is simultaneously found with trivia statements, as this rules out the possibility of a general failed replication.
Although we did compare the size of the truth effect for conspiracy versus trivia statements, interpreting such a difference is complicated by the possible covariation of several attributes (e.g., conspiracy theories are less plausible than the trivia statements; conspiracy theories, but not the trivia statements, are likely to have been heard before the experiment – we discuss these possibilities in the Discussion). We stress that the present experiment was not primarily aimed at probing why the truth effect might differ between conspiracy and trivia statements; trivia statements were mainly included as a reference point. Although we see reasons why the truth effect might be smaller for conspiracy than for trivia statements, this aspect of the experiment was exploratory because the study was not designed to specifically test this difference (nor to test explanations of such an as yet untested difference).
Preregistered data exclusion criteria were as follows (a participant could meet more than one criterion): less than 5% of statements judged as true or false (n = 0); insufficient fluency in French among non-native speakers (a self-disclosed French level below B2) (n = 1); declaring having searched for information about the statements or the problems while performing the study (n = 38); declaring having responded to items without having read them (n = 34); declaring not wanting the data to be analyzed after having read a debriefing of the study objectives (n = 10).
In the preregistration, we stated that we were interested in two individual differences: cognitive style and skepticism. Note that the reference to "skepticism" in the preregistration is an error: we were interested in conspiracy mentality, not skepticism.
The non-significant effects involving Repetition were as follows: Repetition × CMQ scores: F(1, 885) = 0.21, p = .649; Repetition × CRT scores: F(1, 885) < 0.001, p = .956; Repetition × CMQ scores × CRT scores: F(1, 885) = 1.28, p = .258; Repetition × Materials × CMQ scores: F(1, 885) = 0.70, p = .403; Repetition × Materials × CRT scores: F(1, 885) = 0.12, p = .727; Repetition × Materials × CMQ scores × CRT scores: F(1, 885) < 0.001, p = .981.
References
Allen, M., Poggiali, D., Whitaker, K., Marshall, T. R., van Langen, J., & Kievit, R. A. (2021). Raincloud plots: A multi-platform tool for robust data visualization. Wellcome Open Research, 4, 63.
Arkes, H. R., Hackett, C., & Boehm, L. (1989). The generality of the relation between familiarity and judged validity. Journal of Behavioral Decision Making, 2(2), 81–94.
Bacon, F. T. (1979). Credibility of repeated statements: Memory for trivia. Journal of Experimental Psychology: Human Learning & Memory, 5(3), 241–252.
Bago, B., Rand, D. G., & Pennycook, G. (2022). Does deliberation decrease belief in conspiracies? Journal of Experimental Social Psychology, 103, 104395.
Begg, I. M., Anas, A., & Farinacci, S. (1992). Dissociation of processes in belief: Source recollection, statement familiarity, and the illusion of truth. Journal of Experimental Psychology: General, 121(4), 446–458.
Béna, J., Carreras, O., & Terrier, P. (2019). On believing conspiracy theories we remember: Analyses of two large-scale surveys of conspiracism in the French general public. Preprint. https://doi.org/10.31234/osf.io/tf76n
Béna, J., Corneille, O., Mierop, A., & Unkelbach, C. (2022). Robustness tests provide further support for an ecological account of the truth and fake news by repetition effects. International Review of Social Psychology, 35(1), 19.
Biddlestone, M., Green, R., Cichocka, A., Douglas, K., & Sutton, R. M. (2022). A systematic review and meta-analytic synthesis of the motives associated with conspiracy beliefs. Preprint. https://doi.org/10.31234/osf.io/rxjqc
Brashier, N. M. (2023). Do conspiracy theorists think too much or too little? Current Opinion in Psychology, 49, 101504.
Brashier, N. M., & Marsh, E. J. (2020). Judging truth. Annual Review of Psychology, 71(1), 499–515.
Brashier, N. M., Eliseev, E. D., & Marsh, E. J. (2020). An initial accuracy focus prevents illusory truth. Cognition, 194, 104054. https://doi.org/10.1016/j.cognition.2019.104054
Brotherton, R., & Eser, S. (2015). Bored to fears: Boredom proneness, paranoia, and conspiracy theories. Personality and Individual Differences, 80, 1–5.
Bruder, M., Haffke, P., Neave, N., Nouripanah, N., & Imhoff, R. (2013). Measuring individual differences in generic beliefs in conspiracy theories across cultures: Conspiracy mentality questionnaire. Frontiers in Psychology, 4, 225.
Calio, F., Nadarevic, L., & Musch, J. (2020). How explicit warnings reduce the truth effect: A multinomial modeling approach. Acta Psychologica, 211, 103185.
Corneille, O., Mierop, A., & Unkelbach, C. (2020). Repetition increases both the perceived truth and fakeness of information: An ecological account. Cognition, 205, 104470.
Dechêne, A., Stahl, C., Hansen, J., & Wänke, M. (2010). The truth about the truth: A meta-analytic review of the truth effect. Personality and Social Psychology Review, 14(2), 238–257.
de Keersmaecker, J., Dunning, D., Pennycook, G., Rand, D. G., Sanchez, C., Unkelbach, C., & Roets, A. (2020). Investigating the robustness of the illusory truth effect across individual differences in cognitive ability, need for cognitive closure, and cognitive style. Personality and Social Psychology Bulletin, 46(2), 204–215. https://doi.org/10.1177/0146167219853844
DiFonzo, N., Beckstead, J. W., Stupak, N., & Walders, K. (2016). Validity judgments of rumors heard multiple times: The shape of the truth effect. Social Influence, 11(1), 22–39.
Douglas, K. M., & Sutton, R. M. (2008). The hidden impact of conspiracy theories: Perceived and actual influence of theories surrounding the death of Princess Diana. The Journal of Social Psychology, 148, 210–222.
Douglas, K. M., & Sutton, R. M. (2023). What are conspiracy theories? A definitional approach to their correlates, consequences, and communication. Annual Review of Psychology, 74(1), 271–298.
Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. Current Directions in Psychological Science, 26(6), 538–542.
Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding conspiracy theories. Political Psychology, 40(S1), 3–35.
Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13–29.
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191.
Fazio, L. K. (2020). Repetition increases perceived truth even for known falsehoods. Collabra: Psychology, 6(1), 38.
Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.
Fazio, L., Rand, D. G., & Pennycook, G. (2019). Repetition increases perceived truth equally for plausible and implausible statements. Psychonomic Bulletin & Review, 26, 1705–1710.
Fazio, L. K., Pillai, R. M., & Patel, D. (2022). The effects of repetition on belief in naturalistic settings. Journal of Experimental Psychology: General, 151(10), 2604–2613.
Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19(4), 25–42.
Freeman, D., & Bentall, R. P. (2017). The concomitants of conspiracy concerns. Social Psychiatry and Psychiatric Epidemiology, 52(5), 595–604.
Frenken, M., & Imhoff, R. (2022). Malevolent intentions and secret coordination. Dissecting cognitive processes in conspiracy beliefs via diffusion modeling. Journal of Experimental Social Psychology, 103, 104383.
Goreis, A., & Voracek, M. (2019). A systematic review and meta-analysis of psychological research on conspiracy beliefs: Field characteristics, measurement instruments, and associations with personality traits. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00205
Hassan, A., & Barber, S. J. (2021). The effects of repetition frequency on the illusory truth effect. Cognitive Research: Principles and Applications, 6, 38.
Henderson, E. L., Westwood, S. J., & Simons, D. J. (2022). A reproducible systematic map of research on the illusory truth effect. Psychonomic Bulletin & Review, 29, 1065–1088.
IFOP (2017). Enquête sur le complotisme, Décembre 2017. IFOP pour la Fondation Jean-Jaurès et Conspiracy Watch. Retrieved from https://jeanjaures.org/sites/default/files/redac/commun/productions/2018/0108/115158_-_rapport_02.01.2017.pdf
IFOP (2019). Enquête sur le complotisme, vague 2. IFOP pour la Fondation Jean-Jaurès et Conspiracy Watch. Retrieved from https://www.ifop.com/wp-content/uploads/2019/02/115960-Pr%C3%A9sentation-version-publi%C3%A9e.pdf
IFOP (2020). L’origine perçue du Covid19. IFOP pour la Fondation Jean-Jaurès et Conspiracy Watch. Retrieved from https://www.jean-jaures.org/wp-content/uploads/drupal_fjj/redac/commun/productions/2020/2803/117275_rapport_covid_19.pdf
Imhoff, R., & Bruder, M. (2014). Speaking (un–)truth to power: Conspiracy mentality as a generalised political attitude. European Journal of Personality, 28(1), 25–43.
Jolley, D., & Douglas, K. M. (2014a). The social consequences of conspiracism: Exposure to conspiracy theories decreases the intention to engage in politics and to reduce one's carbon footprint. British Journal of Psychology, 105, 35–56.
Jolley, D., & Douglas, K. M. (2014b). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLOS ONE, 9(2), e89177.
Jolley, D., Meleady, R., & Douglas, K. M. (2020a). Exposure to intergroup conspiracy theories promotes prejudice which spreads across groups. British Journal of Psychology, 111(1), 17–35.
Jolley, D., Mari, S., & Douglas, K. M. (2020b). Consequences of conspiracy theories. In M. Butter & P. Knight (Eds.), Routledge Handbook of Conspiracy Theories (pp. 231–241). Routledge.
Kassambara, A. (2020). ggpubr: 'ggplot2' based publication ready plots. R package version 0.4.0. https://CRAN.R-project.org/package=ggpubr
Keeley, B. L. (1999). Of Conspiracy Theories. The Journal of Philosophy, 96(3), 109–126.
Lacassagne, D., Béna, J., & Corneille, O. (2022). Is Earth a perfect square? Repetition increases the perceived truth of highly implausible statements. Cognition, 223, 105052.
Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2016). Measuring belief in conspiracy theories: Validation of a French and English single-item scale. International Review of Social Psychology, 29(1), 1–14.
Lenth, R. (2020). emmeans: Estimated marginal means, aka least-squares means. R package version 1.5.2-1. https://CRAN.R-project.org/package=emmeans
Long, J. A. (2019). interactions: Comprehensive, user-friendly toolkit for probing interactions. R package version 1.1.0, https://cran.r-project.org/package=interactions
Moritz, S., Köther, U., Woodward, T. S., Veckenstedt, R., Dechêne, A., & Stahl, C. (2012). Repetition is good? An Internet trial on the illusory truth effect in schizophrenia and nonclinical participants. Journal of Behavior Therapy and Experimental Psychiatry, 43(4), 1058–1063.
Muirhead, R., & Rosenblum, N. L. (2019). A lot of people are saying: The new conspiracism and the assault on democracy. Princeton University Press.
Nadarevic, L., & Aßfalg, A. (2016). Unveiling the truth: Warnings reduce the repetition-based truth effect. Psychological Research, 81(4), 814–826.
Nadarevic, L., & Erdfelder, E. (2014). Initial judgment task and delay of the final validity-rating task moderate the truth effect. Consciousness and Cognition, 23, 74–84.
Nadarevic, L., Reber, R., Helmecke, A. J., & Köse, D. (2020). Perceived truth of statements and simulated social media postings: An experimental investigation of source credibility, repeated exposure, and presentation format. Cognitive Research: Principles and Implications, 5(1). https://doi.org/10.1186/s41235-020-00251-4
Nera, K., & Schöpfer, C. (2023). What is so special about conspiracy theories? Conceptually distinguishing beliefs in conspiracy theories from conspiracy beliefs in psychological research. Theory & Psychology. https://doi.org/10.1177/09593543231155891
Newman, E. J., Jalbert, M. C., Schwarz, N., & Ly, D. P. (2020). Truthiness, the illusory truth effect, and the role of need for cognition. Consciousness and Cognition, 78, 102866.
Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402.
Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880.
Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770–780.
Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590–595.
Pillai, R. M., & Fazio, L. K. (2021). The effects of repeating false and misleading information on belief. WIREs Cognitive Science, 12(6), e1573.
R Core Team. (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing https://www.R-project.org/
Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8(3), 338–342.
Roozenbeek, J., Freeman, A. L. J., & van der Linden, S. (2021). How accurate are accuracy-nudge interventions? A preregistered direct replication of Pennycook et al. (2020). Psychological Science, 32(7), 1169–1178.
Sassenberg, K., Bertin, P., Douglas, K. M., & Hornsey, M. J. (2023). Engaging with conspiracy theories: Causes and consequences. Journal of Experimental Social Psychology, 105, 104425.
Silva, R. R. (2014). "The truth is never pure and rarely simple": Understanding the role of repetition and processing fluency on the illusion of truth effect. (Unpublished doctoral dissertation). Instituto Universitário, Lisbon, Portugal. Retrieved from http://repositorio.ispa.pt/handle/10400.12/3187
Singmann, H., Bolker, B., Westfall, J., Aust, F. & Ben-Shachar, M. S. (2021). afex: Analysis of factorial experiments. R package version 1.0-1. https://CRAN.R-project.org/package=afex
Smelter, T. J., & Calvillo, D. P. (2020). Pictures and repeated exposure increase perceived accuracy of news headlines. Applied Cognitive Psychology, 34(5), 1061–1071.
Swami, V., Voracek, M., Stieger, S., Tran, U. S., & Furnham, A. (2014). Analytic thinking reduces belief in conspiracy theories. Cognition, 133(3), 572–585.
Torchiano, M. (2020). effsize: Efficient effect size computation. R package version 0.8.1. https://CRAN.R-project.org/package=effsize
Unkelbach, C., & Greifeneder, R. (2013). A general model of fluency effects in judgment and decision making. In C. Unkelbach & R. Greifeneder (Eds.), The experience of thinking: How the fluency of mental processes influences cognition and behaviour (pp. 11–32). Psychology Press.
Unkelbach, C., & Rom, S. C. (2017). A referential theory of the repetition-induced truth effect. Cognition, 160, 110–126.
Unkelbach, C., & Speckmann, F. (2021). Mere repetition increases belief in factually true COVID-19-related information. Journal of Applied Research in Memory and Cognition, 10(2), 241–247.
Unkelbach, C., & Stahl, C. (2009). A multinomial modeling approach to dissociate different components of the truth effect. Consciousness and Cognition, 18(1), 22–38.
Unkelbach, C., Koch, A., Silva, R. R., & Garcia-Marques, T. (2019). Truth by repetition: Explanations and implications. Current Directions in Psychological Science, 28(3), 247–253.
van der Linden, S. (2015). The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences, 87, 171–173.
van Prooijen, J. W. (2017). Why education predicts decreased belief in conspiracy theories. Applied Cognitive Psychology, 31(1), 50–58. https://doi.org/10.1002/acp.3301
van Prooijen, J. W., & van Vugt, M. (2018). Conspiracy theories: Evolved functions and psychological mechanisms. Perspectives on Psychological Science, 13(6), 770–788.
Wagner-Egger, P., Delouvée, S., Gauvrit, N., & Dieguez, S. (2018). Creationism and conspiracism share a common teleological bias. Current Biology, 28(16), R867–R868.
Wickham, H. (2016). ggplot2: Elegant graphics for data analysis. Springer-Verlag.
Ethics declarations
Conflict of interest
The authors declare no conflict of interest.
Open practices statement
The preregistration, experiment program, data, and analysis scripts are available at https://osf.io/edzac.
Cite this article
Béna, J., Rihet, M., Carreras, O. et al. Repetition could increase the perceived truth of conspiracy theories. Psychon Bull Rev 30, 2397–2406 (2023). https://doi.org/10.3758/s13423-023-02276-4