Introduction

The focus of this review paper is on how an understanding of cognitive illusions of thinking might contribute to understanding the origin and maintenance of obsessional thoughts. Broadly speaking, the term cognitive illusion, originally proposed by Johnson-Laird and Savary (1999), applies to any reasoning in which the person arrives at an incorrect or distorted conclusion owing to bias introduced through procedures or psychological factors that override normative reasoning rules. In the following review we consider cognitive illusions following the classification of Pohl (2004) and comment on their potential relevance to obsessive–compulsive disorder (OCD). Cognitive illusions include: (1) thinking illusions (conjunction fallacy, confirmation bias, illusory correlation, illusions of control, faulty deductive and inductive reasoning, and threat biases); (2) judgement illusions (availability and representativeness heuristics, anchoring and validity effects); and (3) memory illusions (associative memory, labelling and misinformation effects).

In the first part we review current studies pointing to cognitive biases and/or deficits in OCD. We then review the nature of cognitive illusions and how they relate to OCD thinking. Finally, we conclude with some clinical implications and suggestions for further research.

Background

A growing body of evidence suggests that OCD could be associated with cognitive failings and deficits (Otto 1992; Tallis 1995, 1997; Cabrera et al. 2001; Deckersbach et al. 2000; Savage et al. 1996; Savage 1998; Savage et al. 1999, 2000). People with OCD show mnestic and executive dysfunctions when performing neuropsychological tests such as the Rey-Osterrieth complex figure test, the Wisconsin card-sorting test and others (Cabrera et al. 2001; Deckersbach et al. 2000; Savage 1998; Savage et al. 1996, 1999, 2000; Zielinski et al. 1991; Kim et al. 2002; Park et al. 2003). Neuroimaging data show dysfunctions in fronto-striatal structures in OCD and support the hypothesis that executive dysfunction may be a primary factor in the mnestic deficit (Shin et al. 2004a, b; Kang et al. 2003).

Such cognitive deficits could inform us directly about symptomatic aspects of OCD functioning. Alterations in executive functioning could create problems in the everyday life of people with OCD. When confronted with a problematic situation, people with OCD may tend to focus their attention on irrelevant details instead of considering the global picture in order to find an effective solution to the problem (Savage 1998). Whilst the skill of people with OCD in grouping verbal information and semantic categories seems within normal limits (Park et al. 2003), ineffective organization strategies could mediate the mnestic alterations seen in OCD (Savage 1998; Savage et al. 1999, 2000; Park et al. 2003). For example, if people with OCD have difficulty globally organizing their memories, this might account for why their mnestic structures are localized and fragmented (Savage 1998). Such organizational failure could consequently lead to abnormal levels of doubting and uncertainty about life events (Greisberg and McKay 2003) and account for why, after solving a problem, people with OCD seem not to remember whether they really have succeeded (Enright 1995; Rubin and Harris 1999). There also seems to be a lack of cognitive flexibility in problem-solving (Chamberlain et al. 2006, 2007), which could encourage hesitancy. A recent study by Burdick et al. (2008) showed a profile in people with OCD characterized by an overall neurocognitive deficit of 0.5 standard deviations relative to healthy volunteers in the motor and processing-speed domains.

The question is how best to relate neurocognitive findings in OCD, on the one hand, to observable OCD thinking and behaviour and, on the other, to remediation or rehabilitation options. Recent contributions (Park et al. 2006) have underlined the benefits of cognitive rehabilitation aimed at improving, for example, organizing strategies in OCD, significantly reducing OC symptoms in comparison with an untreated control group.

The above studies seem to link OCD symptomatology to deficits in classic cognitive information-processing functions (attention, memory, visuo-motor performance), but do such deficits contribute causally to the origin of the disorder, or are they an epiphenomenon of it? One important factor which has been overlooked, and which can give insight into cognitive processes, is reasoning theory and in particular cognitive illusions. Cognitive illusions could be an important causal or mediating factor interacting with other relevant cognitive components. In this view, OCD originates from a peculiar, identifiable reasoning style which, through the use of cognitive illusions, maintains the disorder and its cognitive dysfunctions. Hence, as we will show, ineffective organisation strategies, attention and even memory deficits may result from reasoning strategies rather than from structural cognitive deficits.

Cognitive illusions which may be experienced in the general population seem present to an excessive degree in OCD, yet so far there has been little scientific discussion of such illusions. Understanding the presence of such illusions, separately and in combination, may inform us further about the origin and maintenance of the disorder.

Thinking Illusions

Thinking illusions usually involve the application of a certain rule (like Bayes’ theorem, hypothesis testing, or syllogistic reasoning), derived from normative models (like probability theory, falsification principle, or logic). There is evidence that people with OCD do show characteristic reasoning styles and that these styles are pertinent to OCD thinking and behaviour.

For example, in decision-making and in Bayesian probabilistic reasoning (Milner et al. 1971; Volans 1976; Fear and Healy 1997), research has demonstrated a “data-gathering excess” in OCD, whereby much more evidence is required to reach a decision than in non-clinical controls. Other authors have highlighted different cognitive distortions in reasoning in OCD, such as inferential confusion (O’Connor 2002; Aardema et al. 2005), where an imagined possibility is confused with a real probability.
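As a purely illustrative sketch (the task and the numbers below are our assumptions, not those of the cited studies), normative Bayesian updating shows how quickly evidence should settle a simple two-hypothesis decision; the “data-gathering excess” consists in continuing to request information well beyond this point:

$$ P(H \mid D) = \frac{P(D \mid H)\,P(H)}{P(D \mid H)\,P(H) + P(D \mid \neg H)\,P(\neg H)}. $$

For instance, with two equally likely jars containing 85% and 15% red beads respectively, two consecutive red draws already give posterior odds of $(0.85/0.15)^2 \approx 32$, i.e. a posterior probability above 0.95 in favour of the mostly-red jar; requesting many further draws before deciding would illustrate the data-gathering excess.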

In a similar vein, Dèttore (2003a) proposed that at least some OCD could originate from and/or be maintained by a disconnection between a syntactic module (aimed at generating new possible imagined developments of a given situation) and a semantic module (with the function of evaluating the reality and probability of the syntactic module’s elaborations). Anxiety disorders, and in particular OCD, would be the consequence of excessive functioning of the syntactic module (generating too many possible models of a given situation and, above all, too many negative ones) and/or inadequate functioning of the semantic one. This hypothesis was partially supported by a preliminary study with clinical clients (some with OCD and some with panic disorder) and controls (Dèttore and Castelli 2010).

Conjunction Fallacy

The conjunction fallacy arises when individuals assign probabilities to conjunctive events that exceed the probabilities assigned to the component events that comprise the conjunction. The literature shows, for example, that in this type of judgement, where the conjunction combines a likely event with an unlikely one, the proportion of individuals committing the fallacy can be very high, often exceeding 90% (Fisk and Pidgeon 1996; Gavanski and Roskos-Ewoldsen 1991; Tversky and Kahneman 1983; Yates and Carlson 1986).

The presence of such a fallacy in OCD could account for the bizarre associations which frequently occur, as in the following example: if I fear contamination with HIV (Human Immunodeficiency Virus) while walking in the street, I am worried about trampling on something; I fear that this something could be a syringe and that this syringe will be an HIV-infected one. The event “trampling on a syringe while walking in the street” appears very probable, even though the conjunction is necessarily no more probable than each individual event (stepping on something while walking; the something being a syringe). The probability of the conjunction can be formally expressed:

$$ P(\text{trampling on something and this something is a syringe}) = P(\text{trampling on something}) \times P(\text{trampling on a syringe} \mid \text{I trampled on something}). $$

A person with OCD contamination fears may go further and fear that the syringe is also an infected one, and so he/she rates the conjunctive event as highly probable, so that:

$$ P(\text{trampling on an infected syringe}) = P(\text{trampling on a syringe}) \times P(\text{trampling on an infected syringe} \mid \text{I trampled on a syringe}). $$

In such cases, the conjunction of two events is less probable than each single event, but it seems nevertheless highly probable.
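To make the arithmetic concrete, suppose, purely for illustration, that stepping on something while walking has probability 0.30 and that, given one has stepped on something, the probability that it is a syringe is 0.01; then:

$$ P(\text{trampling on a syringe}) = 0.30 \times 0.01 = 0.003, $$

so the conjunction is a hundred times less probable than stepping on something at all, yet to the person it may feel at least as probable as either component.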

Confirmation Bias

Information is searched for, interpreted, and remembered in such a way that it systematically impedes the possibility that the hypothesis could be rejected, so fostering the immunity of the hypothesis. The classical experiments conducted by Wason (1960) demonstrated that humans do not try to test their hypotheses critically but rather to confirm them. In Wason’s (1960) original task, people tested a rule by selecting evidence that could either confirm or disconfirm it. The majority of respondents chose confirmation over disconfirmation. Several variants of the Wason task have shown that, even in non-clinical populations, while prompts and task demands can lead a person to ‘disconfirm’, spontaneously choosing disconfirmation over confirmation is unusual. Such confirmation bias has also been reported in OCD (Fear and Healy 1997; Mancini et al. 2007) and may account for the frequent distrust of the senses and of other sources of information.

Confirmation bias is paired in OCD with another logical error originally described by Aristotle: the “fallacy of the consequent” or “affirming the consequent”. In this form of reasoning error, we infer the existence of a cause from the affirmation of an effect: “if it is raining then the streets are wet; the streets are wet, therefore it is raining”. Arntz et al. (1995) identified this fallacy in anxiety disorders, considering it a peculiar form of “emotional reasoning” and naming it “ex consequentia reasoning”. This error inverts the correct direction of causal reasoning applied to the emotions: I feel disgusted, therefore there must be dirt (the reverse is the right causal sequence); I feel worried, therefore necessarily there is some danger (again, the reverse is the correct causal chain). O’Connor and Robillard (1995) identified a similar fallacy as “inverse inference” and highlighted its importance: “…the OCD client, rather than revising the hypothesis in the face of evidence, revises the evidence in the face of the hypothesis” (p. 890). For example, instead of inferring from the real presence of dirt the hypothesis that someone entered the room (a correct inference from an observed effect to its cause), a person with OCD considers the fact that someone entered the room as confirmation of the presence of dirt, even if it is not visible (an incorrect inference, since someone could have entered the room without bringing in dirt). In OCD the reversed causality is peculiarly linked to the presence of frequent preventive compulsions. In other anxiety disorders this logical error can also depend on whether the person feels anxious or worried (Engelhard et al. 2001).
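The contrast can be stated compactly (the assignment A = “there is dirt”, B = “I feel disgusted” is purely illustrative):

$$ \text{Modus ponens (valid): } ((A \rightarrow B) \wedge A) \vdash B \qquad\qquad \text{Affirming the consequent (invalid): } ((A \rightarrow B) \wedge B) \nvdash A. $$

Ex consequentia reasoning instantiates the invalid form: from the consequent (the feeling of disgust) the person infers the antecedent (the presence of dirt).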

Biases in Deductive and Causal Reasoning

Research into inductive and deductive reasoning (Pélissier and O’Connor 2002; Simpson et al. 2007; Pélissier et al. 2009) has shown that people with OCD need more information and postpone final decision-making. According to Pélissier et al. (2009), following Johnson-Laird’s mental model theory (1994a, b), “these findings were due to an excessive production of alternative mental models in the people with OCD which may have slowed down the process of generating inferences as well as created excessive doubting by multiplying cognitive loading on the inductive reasoning process” (p. 89). This doubting paradigm involved: (a) presenting participants with premises and a conclusion; (b) asking participants to rate their confidence in the conclusion; and (c) presenting participants with alternative possibilities.

Premise: John is organizing a garden party.

Premise: The forecast is rain.

Conclusion: John cancels the party.

Alternatives: John puts up a tent. John moves the party inside the house.

Participants then rescore their confidence in the original conclusion in the light of the alternatives. Both people with and without OCD doubt their original score and modify their confidence level when given alternatives. However, people with OCD doubt more. The implication is that people with OCD are more prone to considering alternative possibilities and may be less discriminating in according them importance. Hence therapy might consider using alternative mental models to weaken convictions. These reasoning insights were indeed integrated into an inference-based therapy (O’Connor and Aardema 2012).

Another influence on deductive reasoning research is belief bias (Evans et al. 1983). This is typically viewed as a tendency to endorse arguments whose conclusions a person believes, regardless of whether the arguments are formally and logically valid.

In OCD such a bias can be detected in typical contamination and checking fears. By substituting OCD-relevant content into classical propositional forms, it is possible to produce characteristic OCD conclusions which are able to activate fears. In the following syllogisms, we first give the abstract forms and then substitute the symbols with classic OCD themes to obtain potential OCD thoughts.

Hypothetical Syllogisms

(a) Modus ponens:

    • If A ⊃ B and A, then B

    • If dirt implies danger and there is dirt, then there is danger.

    • If an open door implies danger and the door is possibly open, then there is danger.

(b) Modus tollens:

• If A ⊃ B and ¬B, then ¬A

    • If HIV infected blood requires a syringe and there isn’t a syringe, then there is probably no HIV infected blood.

    • If an excessively full washbasin implies an open tap and there isn’t an open tap, then there isn’t an excessively full washbasin.

Pure Hypothetical Syllogism

(c) If A ⊃ B and B ⊃ C, then A ⊃ C

    • If dirt implies danger and danger implies infections, then dirt implies infections.

    • If an open door implies danger and danger implies a thief in the house, then an open door implies a thief in the house.

Here, the syllogisms are formally (syntactically) valid but not semantically sound (their conclusions are not necessarily true); if the subject cannot discriminate this difference, the error could facilitate the onset and maintenance of obsessional reasoning. The weak link in the chain is the major premise of the first hypothetical syllogism, which is not absolute but only true in some circumstances (dirt is not always dangerous, and the same can perhaps be said about an open door).
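A worked rendering of the distinction, using the contamination example above:

$$ \frac{\text{dirt} \rightarrow \text{danger} \qquad \text{danger} \rightarrow \text{infection}}{\text{dirt} \rightarrow \text{infection}} \quad (\text{formally valid}), $$

yet the argument is sound only if the major premise “dirt → danger” is actually true; since dirt is only sometimes dangerous, the premise holds at best probabilistically, and the felt certainty of the conclusion is not warranted.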

An important aspect of belief bias is that people accept invalid arguments because they are faced with readily believable conclusions. Ways of overcoming belief bias in control populations include augmenting instructions to make people aware of the influence of prior knowledge and of the selective search for conclusions. The content of the reasoning also influences belief bias, and replacing content can lead to more accurate deduction. These mechanisms are reinforced in OCD by the thinking error already described, ex consequentia reasoning (Arntz et al. 1995). A vicious circle is then produced (Dèttore 2003b): the person with OCD sees him/herself engaged in preventive compulsions and consequently generates possible models of the world which lead to OCD deductive reasoning. The reasoning is formally valid but not necessarily true.

We can give some examples again using syllogisms.

Hypothetical Syllogisms

(a) Modus ponens:

    • If A ⊃ B and A, then B

    • If washing hands implies dirty hands and I am washing my hands, then my hands are really dirty.

    • If checking a door implies that the door is perhaps open and I am checking the door, then the door can really be open.

(b) Modus tollens:

• If A ⊃ B and ¬B, then ¬A

    • If an object being clean implies that I can touch it, and I don’t touch it, then it is really dirty.

    • If a syringe which is surely not blood infected implies that I don’t need to check it, and I check it, then the syringe is really blood infected.

Pure Hypothetical Syllogism

(c) If A ⊃ B and B ⊃ C, then A ⊃ C

    • If washing hands implies dirty hands and dirty hands imply that I touched something dirty, then washing my hands implies that I touched something dirty.

    • If checking a door implies it could be potentially open and a potentially open door implies the risk of a thief in the house, then checking the door implies there is the risk of a thief in house.

As a consequence of such reasoning, overestimation of threat is reinforced, fears acquire an immediate reality, the feared dangers are confirmed and amplified, the emotional responses are activated, and the necessity to produce reassuring compulsions is increased. Subsequently, the performance of the ritual or compulsion strengthens the ex consequentia reasoning, confirming the doubt and the worries and negatively reinforcing the compulsion itself. The person thus ends up in the well-known self-sustaining cycle of OCD.

Of relevance here is a study by Deacon and Maack (2008), in which engagement in OCD safety behaviours such as taking precautions and avoidance moderately increased fear of contamination in a student sample. The authors suggest that the safety behaviours may have maintained the importance of contamination fears and directed increased attention to the objects. The results could also be interpreted as a form of ex consequentia reasoning: “I’m acting as though this is a risk to my health, therefore it must be a danger.”

Illusory Correlation

As organisms learn to predict and control their environment through serial observations, they assess the correlations that exist between important stimulus events. If a correlation is perceived that is not really there, this is an “illusory correlation”. More generally, the term applies not only to overestimations of a zero correlation but to all kinds of systematic deviations or biases in the subjective assessment of association.

This effect includes the “sample-size effect” (Fiedler 1996), whereby two or more behaviours seem to co-occur in the majority of cases rather than the minority simply as a function of differing sample sizes. There is also “positive testing” (Klayman and Ha 1987), according to which, in hypothesis testing, people usually focus on the occurrence rather than the non-occurrence of the critical event. This illusory correlation can be detected in OCD.

If I am a person with OCD contamination fears, I want to test the hypothesis that washing my hands (after touching a possibly contaminated object) produces the condition of “no disease”. Consequently, according to positive testing, I will focus above all on cases in which I wash my hands, producing the following pattern:

I touch and I wash my hands ⇒ No disease: 1,000 cases

I touch and I don’t wash my hands ⇒ No disease: 50 cases

I touch and I wash my hands ⇒ Disease: 20 cases

I touch and I don’t wash my hands ⇒ Disease: 1 case

Thus, although the proportion of “No disease” is the same across washing conditions, the sample size is larger for the hand-washing condition, owing to positive testing. As a consequence, even though the correlation is zero, washing hands seems more strongly associated with “No disease”. This effect often appears in the non-clinical population, but it is likely to be more evident in a clinical population of people who continuously wash their hands.
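Using the illustrative frequencies above, the conditional proportions are in fact identical:

$$ P(\text{No disease} \mid \text{wash}) = \frac{1{,}000}{1{,}020} = \frac{50}{51} = P(\text{No disease} \mid \text{no wash}) \approx 0.98, $$

so the association is nil; only the far larger number of observed washing episodes, produced by positive testing, makes washing feel protective.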

As far as we know, this effect of illusory correlation in OCD has been studied only by Gloster et al. (2008), who examined accuracy in the recall of the ‘covariation’ of OCD with associated states. Participants consistently overestimated the correlation of OCD frequency and duration with stress, anxiety and distress following interpersonal conflict, compared with data collected through self-monitoring at the time of OCD occurrence. The results are largely consistent with the research on illusory correlations (Kahneman et al. 1982), producing the following pattern:

Presence of OCD symptoms → I am stressed and/or anxious: 100 cases

Presence of OCD symptoms → I am neither stressed nor anxious: 20 cases

No OCD symptoms → I am stressed and/or anxious: 10 cases

No OCD symptoms → I am neither stressed nor anxious: 2 cases

Thus, although the proportion of “stress/anxiety” is the same across the symptomatic and non-symptomatic conditions, the sample size is larger for the condition in which OCD symptoms are present, owing to positive testing. As a consequence, even though there is no correlation, the presence of OCD symptoms seems more strongly associated with “stress/anxiety”.
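Again, the illustrative frequencies yield equal conditional proportions:

$$ P(\text{stress/anxiety} \mid \text{OCD symptoms}) = \frac{100}{120} = \frac{10}{12} = P(\text{stress/anxiety} \mid \text{no symptoms}) \approx 0.83, $$

so the perceived covariation reflects the unequal numbers of observed cases rather than any real association.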

Illusion of Control

An illusion of control occurs when individuals overestimate their personal influence over an outcome. Starting from the pioneering study by Langer (1975), such an effect has been confirmed by several authors (e.g., Alloy and Abramson 1979; McKenna 1993) in non-clinical populations.

This illusion can be explained by the “control heuristic” (Thompson et al. 1998), a shortcut that people use to evaluate the extent of their personal influence. The control heuristic involves two elements: the intention to achieve the outcome, and the perceived connection between one’s action and the desired outcome. When a person acts with the clear intention of obtaining a particular outcome and there is a relation (temporal, common meaning, or predictive) between his/her action and the outcome, that person reasons there is control over the outcome.

In OCD we find a paradoxical situation: on the one hand, there is a strong illusion about the power of thought control, shown in thought-action fusion (TAF, the belief that having a thought about an event increases the probability that the event will really occur, and that consequently the individual can be held morally responsible for its negative consequences); on the other hand, people with OCD constantly doubt their control over events, yet often aim, unrealistically, to attain an ideal form of control (for example, controlling all thoughts). Illusory correlation is relevant here, and a person with OCD may judge control exclusively by the ‘positive testing’ of a large number of selectively confirmatory cases.

Moulding et al. (2008) demonstrated that individuals with OCD show a higher desire for control and a lower sense of control relative to community controls, and a higher desire for control than a clinical group with anxiety disorder. In another study the same researchers (Moulding et al. 2009) showed that higher levels of desire for control and a lower sense of control were associated with more frequent OCD-related beliefs and symptoms. Belayachi and Van der Linden (2010) observed in subjects with OCD (checkers) an undermined sense of self-agency. These data are congruent with the hypothesis by Moulding and Kyrios (2006) according to which extreme discrepancies between desire for control and sense of control could produce elevations in magical thinking (TAF), specific to OCD. The pertinent literature on OCD and illusion of control (with peculiar attention to superstitious obsessions) has been reviewed by Moulding and Kyrios (2006).

The “covariation bias” (de Jong et al. 1995, 1997, 1998; Smeets et al. 2000) could be linked to the illusions of correlation and control. In covariation bias, phobic subjects tend to overestimate the association between fear-relevant stimuli and aversive outcomes, and such a process would enhance fear. Approaches to control differ according to whether intentions aim to avoid danger or to ensure safety. The research of de Jong and collaborators demonstrated that non-clinical participants in a mental state of threat tend to confirm hypotheses of danger rather than try to invalidate them, whereas they do attempt to invalidate hypotheses of safety: better safe than sorry. “Positive testing” and the “control heuristic” could explain such abnormal selective checking.

Judgement Illusions

If people are asked to subjectively rate a specific aspect of a given stimulus (e.g., its pleasantness, frequency, veracity or danger), specific features of the situation may bias their judgement in a certain direction. Such biases thrive in judgements under uncertainty: the person has no knowledge of the correct solution and so relies on subjective impressions. Whereas thinking illusions interfere with tasks such as estimating a probability, elaborating or verifying a logical conclusion, or inferring a rule, and are thus directly related to formal reasoning, judgement illusions pertain to a subjective evaluation that is distorted, for example, “by feelings of familiarity or confidence, the subjective experience of searching one’s memory, or the selective activation of one’s memory contents” (Pohl 2004, p. 4).

Availability and Representativeness

In many everyday situations, especially those characterized by anxiety and uncertainty, reasoning may follow ‘cognitive heuristics’ (Tversky and Kahneman 1974) instead of “algorithms”, logical problem-solving procedures that are highly formalized, computational, and expressible in strings of symbols and instructions.

Heuristics are rapid judgements made in uncertain conditions or when the information is insufficient; they are shortcuts based upon intuitive and not logical reasoning. They involve a simplified information selection and filtering that can produce dysfunctions in discursive reasoning.

Because the human brain has limited information processing capacity, the use of heuristics is very convenient; they are less complex and require far less data than algorithms. In some cases, above all when the information available is incomplete, heuristics allow the subject to decide and avoid behavioural blocks.

The presence of such heuristics in reasoning is found in both clinical and non-clinical groups; the difference between normal and pathological processing probably lies in the level, the frequency and the type of bias or heuristic preferentially employed.

The availability heuristic depends on the ease with which relevant instances of a class come to mind: people estimate the frequency or probability of an event by bringing to mind the most readily available examples of its class. Instances of large classes of events are recalled more quickly than infrequent ones. In other words, according to the availability heuristic, familiarity yields erroneous decisions, e.g., I read about a garage door springing open, so mine could spring open. In areas of concern (for example, AIDS obsessions) the ready availability of scenarios probably contributes to the overestimation of the probability of the feared event and to the start of the OC cycle.

Several authors (Carroll 1978; Gregory et al. 1982; Sherman et al. 1985) have demonstrated in non-clinical settings that simply imagining a future event can increase the individual’s perception of the probability that the imagined event will occur. They interpreted such results in terms of the availability heuristic: the easier it is to imagine or mentally explain an event, the more its subjective probability increases. Consequently, the availability heuristic could be linked to OCD by means of a mechanism similar to the one well described by Muse et al. (2010) for hypochondriasis: “… the recurrent, future-oriented intrusive images along themes of illness and death serve to maintain anxiety about health by increasing participants’ estimation of the likelihood of these events occurring” (p. 796). Keen et al. (2008) also noted that people with OCD were better able to imagine future negative scenarios.

The representativeness heuristic can be considered a procedure for estimating probabilities through judgements of similarity or typicality, which may be accurate but can still lead to biased estimates. For example, a person relies on estimates of the degree to which A resembles B, or how much A is representative of B, e.g., the two people I met going to the party were superficial, so all the people invited to the party will be superficial. While this may be true, it is not always the case, because base rates are not considered when the representativeness heuristic filters judgement. In OCD it is frequently present: an occasional event (such as forgetting something unimportant) makes, in the mind of the person with OCD, forgetfulness across the entire class of cases, including important material, seem more possible and probable, so inducing and maintaining the obsessional doubt. If an individual frequently imagines this thought, its subjective probability will increase via the availability heuristic, and we obtain another self-sustaining cycle that maintains OCD symptoms.

Anchoring Effect

This robust and ubiquitous effect (Tversky and Kahneman 1974) is present when a numeric estimate is compared to a previously considered standard (the anchor). The ‘anchoring effect’ occurs in two stages (Mussweiler et al. 2004): (1) the selection of a judgemental anchor (the standard); and (2) its subsequent comparison with the target. Mussweiler et al. (2004) explain the anchoring effect by a “selective accessibility model” of anchoring: “… judges compare the target with the anchor by testing the possibility that the target’s value is equal to the anchor value… To do so, they selectively retrieve knowledge from memory that is consistent with this assumption” (pp. 191–192); consequently, they produce an estimate that is heavily influenced by the anchor-consistent knowledge processed previously.

At least three mechanisms may influence the initial stage of standard selection. (1) A particular value may be selected as an anchor because conversational sources suggest it as relevant. For example, a person with OCD may hear or read a probability value relating to a particular risky event; he/she could then use that value as an anchor to estimate the probability of similar events that, in reality, are much less probable than the original one, so that the probability is overvalued. Thus, a person with OCD who reads that in Africa there is a high probability of becoming seropositive could accord a high value to his or her estimate of becoming seropositive in his or her own country, even if the situation is different and the risk far lower. (2) A value may be selected as an anchor because it is easily accessible and comes to mind during the evaluation of the target. In OCD, there may be an overvaluation of the risk of the feared event (as a consequence of the previously described biases and illusions), so the anchor against which the target is compared is high; hence the estimated probability of similar events will also be high. (3) An anchor may be self-generated via an insufficient correction process. Even if a person with OCD is provided with an implausible anchor (for example, an excessively high probability value), this value will be used as a starting point to generate a more plausible value, but this last estimate will still be too high, via the anchoring effect.

Although, as far as we know, there is no study linking OCD to the anchoring effect, Bodenhausen et al. (2000) and Englich and Soder (2009) found that participants in a sad mood showed greater susceptibility to anchoring effects than neutral or happy controls. Since a sad mood induces more thorough information processing (Englich and Soder 2009), Furnham and Boo (2011) affirm that sad people, according to the selective accessibility model, will retrieve more anchor-consistent information. These studies are relevant to OCD, as a depressed mood is frequently associated with this disorder (Dèttore 2003b).

Validity Effect

In this effect (Hasher et al. 1977), if information has been heard previously, people are likely to ascribe more truth or validity to it on repetition than if they are hearing it for the first time. The effect occurs regardless of the type of information and regardless of whether it was originally believed to be true or false.

People with OCD frequently substantiate their excessive fears by citing information found in newspapers, books and other mass media, or heard from another, above all external, source (as demonstrated in a non-clinical setting by Arkes et al. 1989). In addition, they may rely excessively on their own rules to the exclusion of facts. Pélissier et al. (2009) also reported that people with OCD were more influenced by possible information given by the experimenter.

Illusions of Memory

Illusions of memory refer to fallacies in the recall or recognition of earlier encoded material. People can remember material they have not seen, remember illusory covariations between materials, and distrust and distort what they have remembered.

This issue of memory was highlighted by Sher, Frost and Otto (1983), who were the first authors to explicitly address the link between OCD (above all the checking type) and memory deficits. They hypothesized three categories of dysfunction: (1) people with OCD have poor memory and consequently need to check repetitively; (2) they are obliged to check since they fail to distinguish between actual memories and merely imagined events; and (3) they lack confidence in their memory and so feel the urge to check frequently. We briefly analyse each point.

Many neuropsychological studies have examined possible deficits in OCD in verbal, non-verbal, visual and autobiographical memory (for reviews see Greisberg and McKay 2003; Cuttler and Graf 2009; Harkin and Kessler 2011). This literature shows several types of mnestic deficit in OCD, but it is difficult to ascribe the failure in recall to a neurological anomaly, since neuropsychological indexes are not correlated with symptom severity (Cox et al. 1989; Boone et al. 1991); nor can it be ascribed to the mediating effects of anxiety or mood, since neuropsychological scores do not correlate with anxiety or mood measures (Sher et al. 1984; Zielinski et al. 1991). Cuttler and Graf (2009) prudently conclude their review of memory in OCD checkers and OCD non-checkers by stating that data “from our review argue against the theoretical claims that memory deficits or meta-memory deficits contribute to the compulsion to check” (p. 404). More recently, the extensive review by Harkin and Kessler (2011) on memory in checking OCD underlined problems in executive functioning (above all attention) as the primary cause of a secondary memory deficit, without, however, establishing whether these problems are structural or the consequence of other processes.

The second and third points raised by Sher et al. (1983) are interrelated. The important study by van den Hout and Kindt (2003), using a simulated gas stove, demonstrated in non-clinical subjects that repeated checking not only reduces confidence in memory (though not the accuracy of memory) but also weakens the vividness and detail of memory for the checking behaviour. The authors explained this result as a consequence of familiarity induced by repetition: as the stimulus becomes more familiar, perceptual processing is reduced to a more automatic, conceptual level, diminishing memory vividness and detail and consequently damaging confidence in that memory; subjects check to reassure themselves, their memory becomes less vivid, and confidence is reduced, in a vicious circle. The study was replicated with a real stove by Radomsky et al. (2006), again with a non-clinical group and with the same results; Boschen and Vuksanovic (2007) obtained the same results under a condition of high perceived responsibility in a group of subjects with OCD; more recently, Fowle and Boschen (2011) partially confirmed these data in a non-clinical group, for the first time with a repeated cleaning task.

These experimental results seem to demonstrate that the reduced confidence in memory could be the consequence of repeated checking/cleaning influencing executive processes, and not the effect of a structural mnestic deficit. Moreover, if the repetition of reassuring behaviours is able to reduce the vividness and detail of memory, it becomes more difficult for the subject to distinguish between a less vivid and detailed actual memory and a merely imagined cognitive content, so creating the obsessive doubt: did I really live that moment, or did I simply think I lived it? In other words, illusions of memory, rather than cognitive deficit, may account for apparent representation problems in OCD.

Associative Memory Illusions

People can falsely remember non-presented events that are associated with events that did occur. This effect was demonstrated exhaustively by means of word lists (Roediger and McDermott 1995), but, as underlined by Roediger and Gallo (2004), similar processes occur whenever people try to comprehend the world around them.

In OCD, drawing inferences, making suppositions, and creating possible future scenarios can distort the retrieval of memories. For example, if a person treads on a little stone while walking in the street, the retrieval of this event can become confused with fearful associations. The recall is then primed by associative processes and idiosyncratic sensibilities: the trodden-on stone becomes, in memory, a syringe, and the touched object transforms into something dirty. Aardema and O’Connor (2003) have described how people with OCD ruminations can retain representations of what did not occur. For instance, the initial meta-cognitive thought that one might have the impulse to harm someone is experienced as confused with an actual impulse. The result of this confusion could trigger a whole scenario of harm, with all of the accompanying emotions and images, as if a particular thought or impulse were actually present. Once the distinction between thinking about the thought and actually having the thought has become lost, the ‘lived’ character of the obsession or inference may be further exacerbated by confirmatory strategies.

This distinction between experiencing a thought and remembering that one might have experienced the thought applies equally to sacrilegious and pornographic ruminations. People experiencing repugnant, ego-alien thoughts may be reacting to the thought about the possibility of experiencing such thoughts, rather than to the actual thought itself. Over time the two events may become fused, and the person erroneously remembers thinking the content of the thought (Aardema and O’Connor 2003).

Wilhelm et al. (1996) showed that people with OCD, compared with non-clinical controls, were unable to follow an instruction to forget specific items (positive, negative or neutral words), and this inability to forget applied only to negative words. Radomsky and Rachman (1999) demonstrated that subjects with contamination OCD had a better memory for contaminated objects than for clean ones; this bias was not present in subjects with anxiety disorders or in non-clinical controls. Such results were confirmed in a subsequent study by Radomsky et al. (2001) with participants with checking OCD, but only if a sense of responsibility was induced in them. We could interpret all these effects on memory as the result of validity effects and familiarity (availability heuristic) relating to words/objects that are highly important with respect to subjects’ goals.

Labelling and Misinformation Effects

Labelling and misinformation effects are similar. In the first case, a specific label is affixed to a stimulus and exerts a distorting influence on subsequent judgement or recall (Carmichael et al. 1932); in the second, the effect occurs when a person receives post-event information that interferes with the ability to recall the original event accurately (Loftus and Palmer 1974; Loftus 1975, 2005; Loftus et al. 1978).

A person with OCD, for example, may be told (or may read) that one can lose memories following a traumatic event; his/her mind will then generate the doubt that he/she might not remember some bad action possibly committed in the recent past. Consequently, he/she will feel forced to check in order to verify whether he/she really committed some misdeed. After starting to check, the person may label him/herself as forgetful at times and could begin to believe that bad actions are likely to be forgotten. Such labelling will distort his/her mnestic retrieval, prompting further doubting. In addition, the frequent generation of checking compulsions could be a source of confusion in the retrieval of memories, thus reinforcing the doubts about memory.

A peculiar aspect of the misinformation effect is so-called “imagination inflation”, originally reported by Garry et al. (1996): when adult subjects imagined events that could have occurred in their childhood, they were subsequently more likely to judge that these imagined events, rather than non-imagined events, had occurred. The effect was subsequently demonstrated for events and/or actions imagined to have occurred not only in childhood but also in the more recent past (e.g., Thomas and Loftus 2002). In other words, imagining past events can implant false memories of what was imagined.

It is easy to understand how imagination inflation can maintain and aggravate obsessive doubts, as Tryon and McKay (2009) describe: “… rumination over … a partial memory may result in imagination inflation thereby creating a stronger more complete and embellished gestalt over time… foster[ing] additional rumination that presents clinically as an obsession. This sets the occasion for compensatory behavior [the compulsion]” (p. 553). If an individual with OCD contamination fears passes a badly dressed person when out walking, after some time he/she could imagine that the passer-by was a drug addict and seropositive, so inflating the memory in a negative direction. Such a narrative becomes subjectively more and more probable with repetition and obliges the individual to perform purifying and preventive compulsions.

Discussion

As noted earlier, many of these cognitive illusions occur frequently in the non-pathological population, but within an obsessional context, and when the fallacies occur in combination, they may exert a powerful hold over OCD thinking. A first point in favour of identifying such illusions is precisely their transparency and accessibility. Correcting the fallacy and revealing the reasoning behind it is a logical step once the OCD thinking is exposed. Once the cognitive illusions are clearly detected, changing the logic involves changing the sequence leading to the fallacy. Such change involves education in the reasoning process and does not involve cognitive confrontation or any challenge or threat to the person’s values and schemas.

How these illusions of thinking could contribute to the origin of OCD, as well as forming powerful maintaining and potentiating factors, is illustrated in the following sequence. (a) A possible or potential risk or danger is highlighted by a casual external event or by a thought association. (b) A doubt about the possibility of such a risk is primed by distorted reasoning, via the above-described cognitive illusions, so elevating a subjectively perceived possibility to the status of a real probability (O’Connor et al. 2005, 2009; Grenier et al. 2010). (c) The presence of such a risk is evaluated according to the person’s value system; if this system does not discriminate between acts of omission and commission (lack of omission bias) (Siev et al. 2010), the person’s attention will be focused on the perceived risk, with an increased probability of initiating preventive and/or corrective behaviours. (d) These preventive/corrective behaviours are reinforced by the lessening of the negative emotions associated with the perception of risk, and consequently they become more frequent and probable in similar situations. (e) These behaviours prime reasoning processes that further strengthen and consolidate the subjective risk perception by ex consequentia reasoning, inverse inference and other illusions. At the same time, repeated behavioural or mental checking causes distrust in memory and in perception (van den Hout et al. 2008; Dek et al. 2010; Radomsky and Alcolado 2010). Both factors induce a more powerful motivation to check or to produce preventive/corrective behaviours. (f) Finally, we have a self-sustaining vicious circle that maintains and aggravates OCD.

Cognitive illusions are not necessarily the product of cognitive deficits and as underlined in the above review, such illusions are present in the non-clinical population. But such cognitive illusions in combination with other risk factors could kindle and maintain OCD without the presence of altered cognitive processing, linked to primary structural deficits. In the same vein, cognitive illusions are not symptoms or epiphenomena of OCD, but are part of the nature of thinking in clinical and non-clinical populations alike.

In our model, the disorder is primed and maintained by the interaction between the ubiquitous presence of such cognitive illusions and a peculiar cognitive vulnerability and motivational set linked to idiosyncratic evaluative processes. Such a cognitive vulnerability could derive from a person’s negative self and interpersonal schemata, which induce the subject to choose strategies aimed at presenting him/herself as a correct and honest person who does not cause harm to self or others (Dèttore 2003b, c). In addition, cognitive appraisals aimed at over-control (Dèttore 2003b) and at over-estimation of threat-related and phobic themes (Dèttore 2003c), leading to inflated responsibility and guilt fears (Dèttore 2003c; Mancini 2003), push the person with OCD to perform compulsions that are preventive and reparative and aimed at avoiding the feared harm. The compulsions are instrumental in preserving such a self-image and lead the individual to focus on specific events or possibilities to the exclusion of others, so facilitating cognitive illusions. In other words, reliance on cognitive illusions is a function of motivational set, not deficit, and serves a purpose for the person.

We can therefore draw some clinical implications and suggestions. Exposure and response prevention (ERP) is currently a technique of choice in treating OCD and works by interrupting the self-sustaining cycle which maintains compulsions through negative reinforcement (reducing anxiety and other negative affects, such as disgust and guilt). But ERP may also block ex consequentia reasoning and inverse inference, and may confront confirmation bias, illusory correlation and the illusion of control, since it constitutes a powerful invalidation process for those illusions.

Cognitive techniques can enhance awareness of illusions of thinking and help invalidate them during treatment. A central goal of the cognitive intervention is the acceptance of risk as an intrinsic part of existence: it is impossible to eliminate a feared risk from our life completely and to be perfectly sure that our actions will have no negative consequence. Equally relevant is critical discussion of the so-called “better safe than sorry” strategy (Gilbert 2004), a pessimistic point of view that can sometimes give rise to excessively prudent behaviours. Psychoeducation can stress that such a strategy can be useful and rewarding in a risky and unpredictable environment (Leahy 2002), but becomes counterproductive if it is activated blindly, frequently producing more damage than the risks it avoids.

It is also necessary to differentiate between deductive and inductive reasoning: we cannot search for the perfect algorithm (always-true conclusions), but we can adopt functional heuristics (conclusions that are probably, but not always, true).

The person with OCD learns to systematically discriminate obsessive thoughts, not to accept them as a body of fact, but rather to subject them, like any other thought, to critical semantic analysis. All the cognitive illusions can be addressed in therapy by revealing their hidden functioning and illustrating them through clear examples selected from the person’s experience.

The person could then be led to discriminate between “impressions or feelings” and a “body of fact”. For example, “to have the feeling of being dirty” and “to be dirty” are semantically very different; in such a case it is important to arrive at a shared definition of “dirt”, based on objective criteria and not on “feelings”. Here it can be helpful to unmask ex consequentia reasoning and inverse inference and to underline the correct causal sequence for factual inferences.

An analogous distinction can be made regarding the frequent confusion between the “memory of an event” (usually feared) and the “actively imagined representation of an event”, as a consequence of which a person, as a product of his/her fears and the over-activation of negative models of the world, erroneously thinks that a mental image is the memory of an actually lived event rather than merely a product of his/her imagination. Examples can be presented to the person as a simple way to conceptualise his/her dysfunctional reasoning, following recent proposals for the cognitive treatment of OCD (e.g., O’Connor et al. 2005).