1 Introduction

Contemporary models of human drug addiction emphasize neuropsychological and neurobiological dysfunction of complex processes within the brain (Everitt and Robbins 2005; Koob 2006; Robinson and Berridge 2008). In these models, cognitive factors, such as a diminished capacity to control one’s own behavior, in conjunction with a strong motivation to consume a drug, are considered critical. Decades of research have demonstrated the powerful reinforcing properties of addictive drugs via their influence on the neurotransmitter dopamine within the mesocorticolimbic system of the brain (Volkow et al. 1999). However, this attribute alone does not explain the maintenance of drug-taking behavior, particularly when it is likely to result in serious adverse consequences. Recent work has argued that executive control deficits also play a critical role in the development and maintenance of drug addiction (Jentsch and Taylor 1999; Goldstein and Volkow 2002; Lubman et al. 2004; Garavan and Stout 2005; Yucel et al. 2007a). Current research indicates that executive control processes are fundamental for successfully inhibiting the immediate pursuit of pleasurable stimuli, and for the development of adaptive patterns of behavior, both key factors in drug addiction (Kalivas and Volkow 2005). The aim of this review is to outline the evidence that compromised executive control processes, and the neural mechanisms that underlie them, contribute to prolonged drug consumption. The review examines evidence of executive control dysfunction in dependent drug users and in drug-naïve “at-risk” populations, as well as its predictive value for identifying those individuals who transition from use to dependence. Throughout this review, the term “drug” is used to encompass all psychoactive substances (including alcohol) that are abused.

2 Executive Control Processes and Their Constituent Neural Network

All goal-directed behavior, however trivial, might be said to require executive involvement: Cowan’s operational definition of executive functions includes all processes that can be influenced by instructions or incentives (Cowan 2001). In general, executive functions serve the same explanatory roles as control processes (Atkinson and Shiffrin 1968; Schneider and Shiffrin 1977; Shiffrin and Schneider 1977); that is, those nonroutinized, attentionally-demanding, consciously-available, volitional processes that initiate a certain action or interrupt and adjust ongoing actions. Measuring individual differences in these processes has typically involved cognitive tasks that increase demands for specific aspects of control, such as inhibition, selective attention, or task switching. Examples of tests used to examine executive processes include dual-task performance, Stroop, Wisconsin Card Sorting, Tower of London, delayed alternation, and assorted working memory (WM) tasks. Such paradigms have proved a reliable method for demonstrating executive control deficits across a range of clinical conditions (Dalrymple-Alford et al. 1994; Diamond 1996; Baddeley et al. 1997; Barkley 1997; Diamond et al. 1997; Konrad et al. 2000; Baddeley et al. 2001; Bennetto et al. 2001; Gilotty et al. 2002; Sharma and Antonova 2003; Simon et al. 2003), with recent work also indicating a strong relationship between executive control deficits on laboratory tasks and real-world behavioral problems (Burgess et al. 1998; Kibby et al. 1998; Moriyama et al. 2002; Kalechstein et al. 2003a; Odhuba et al. 2005; Chaytor et al. 2006).
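To make concrete how such tasks yield quantitative indices of control, the following is a minimal sketch (in Python, with entirely hypothetical trial data and variable names, not drawn from any study cited here) of how a Stroop-type interference score might be computed from raw reaction times:

```python
# Minimal sketch: scoring interference control from a Stroop-type task.
# All trial data below are hypothetical and purely illustrative.
from statistics import mean

# Each trial: (condition, reaction_time_ms, responded_correctly)
trials = [
    ("congruent", 610, True), ("incongruent", 742, True),
    ("congruent", 598, True), ("incongruent", 790, True),
    ("congruent", 655, False), ("incongruent", 820, True),
]

def interference_score(trials):
    """Mean RT on correct incongruent trials minus mean RT on correct
    congruent trials; larger scores suggest poorer interference control."""
    rts = lambda cond: [rt for c, rt, ok in trials if c == cond and ok]
    return mean(rts("incongruent")) - mean(rts("congruent"))

print(f"Stroop interference: {interference_score(trials):.0f} ms")  # 180 ms here
```

In practice such scores are aggregated over many trials and compared across groups or correlated with other measures; the sketch simply illustrates the difference-score logic that underlies many of the paradigms listed above.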

3 Neuroanatomy of Executive Control Processes

In attempting to identify their anatomical loci, cognitive neuroimaging experiments have operationalized executive functions in various ways, including dual-task coordination (D’Esposito et al. 1995), task switching (Dove et al. 2000; Sohn et al. 2000), memory updating (Salmon et al. 1996), and response sequencing, monitoring and manipulation (Owen et al. 1996). A consensus implicating the dorsolateral prefrontal cortex as critical for executive functioning has emerged, as this region has been activated in a number of studies using a range of different tasks (Owen et al. 1996; Smith and Jonides 1999; Owen et al. 2000; Petrides 2000; Postle et al. 2000; Bor et al. 2001; Szameitat et al. 2002; Sylvester et al. 2003). This consensus is also consistent with the human lesion literature, which implicates the frontal lobes in organizing, regulating, and producing coherent behavior (Luria 1973; Stuss and Benson 1987). Patients with frontal lobe lesions appear to lose important aspects of autonomous executive control, as evidenced by behavioral control being ceded to environmental contingencies. Classic examples of such behavior include capture errors (automatically following cues with prepotent responses) and utilization behaviors (reaching out and using objects in the environment in an automatic manner) (Lhermitte 1986).

However, it is clear that executive functions are not located solely in prefrontal regions (Andres 2003). Those neuroimaging studies that have localized executive functions to the dorsolateral prefrontal cortex have also observed extensive parietal, premotor, cingulate, occipital, and cerebellar activation. Consistent with these findings, functional imaging studies of “classic” executive tasks such as the Tower of London, the Wisconsin Card Sorting Task, and Stroop Test reveal extensive activation in the frontal lobes, as well as in temporal, parietal, occipital, and cerebellar regions (Berman et al. 1995; Prabhakaran et al. 1997; Monchi et al. 2001; Newman et al. 2003). Other investigators have argued for distinct or interacting prefrontal and anterior cingulate contributions to executive processes (Gehring and Knight 2000; MacDonald et al. 2000), or have suggested that regions underlying executive functions may contribute to many other cognitive processes, such that executive functions are accomplished by distributed networks of activated areas (Carpenter et al. 2000; Miller and Cohen 2001). Despite the challenge that this may present for localizing executive functions, one should still be able to identify the underlying neuroanatomical circuitry, although a more sophisticated level of description may be required; the hallmark of executive functions may not be a particular gyrus or gyri, but may be reflected in dynamic patterns of activation within an entire task-related circuit.

4 Executive Control Dysfunction in Addicted Drug Users

Significant impairments on clinical neuropsychological (e.g., Stroop test, Wisconsin Card Sorting Test) and experimental measures of executive control (e.g., Go/No-go task, Eriksen flanker task, Simon task) have been identified in a range of dependent drug-using groups (Hoff et al. 1996; Bolla et al. 1999, 2000; Simon et al. 2000; Rosselli et al. 2001; Fillmore and Rush 2002; Salo et al. 2002; Simon et al. 2002; Solowij et al. 2002; Kalechstein et al. 2003b; Goldstein et al. 2004; Lundqvist 2005; Li et al. 2006; Verdejo-Garcia et al. 2006; McHale and Hunt 2008). Neuroimaging studies have identified an association between these executive control deficits and dysfunction in prefrontal (particularly dorsolateral and inferior frontal), anterior cingulate, and orbitofrontal regions (Bolla et al. 2001, 2003, 2004; Goldstein et al. 2001; Franklin et al. 2002; Paulus et al. 2002; Kaufman et al. 2003; Hester and Garavan 2004; Gruber and Yurgelun-Todd 2005; London et al. 2005; Tapert et al. 2007; Paulus et al. 2008). Individual studies have also identified changes within subcortical (thalamus and basal ganglia), parietal, temporal, and cerebellar regions, although these findings are less consistent. The variability in brain regions implicated across studies is partly related to differences in the task demands of the cognitive paradigms administered. Similarly, the characteristics of the sample (e.g., demographics, education, premorbid intelligence, comorbid psychiatric history), the duration and frequency of drug use, the neuroimaging technique [e.g., positron emission tomography (PET), electroencephalography (EEG), functional magnetic resonance imaging (fMRI)], and the type of drug used all appear to subtly influence the pattern of behavioral and neural executive control deficits observed (see also Yucel et al. 2007a).

It is important to point out that many studies that observe differences in functional brain activity between drug-using groups and matched control participants do not observe a significant difference in executive control performance (Goldstein et al. 2001; Tapert et al. 2007). Indeed, some studies manipulate task difficulty to equate performance between drug users and controls, so that the differences in brain function observed are not unduly influenced by performance differences, or by factors that potentially contribute to performance differences (e.g., fatigue, frustration, effort) (Kaufman et al. 2003; Yucel et al. 2007b). The interpretation of findings from such studies has generally focused on identifying brain regions or brain networks that behave differently in drug users. For example, studies have found that equivalent executive control performance in drug users is typically associated with higher levels of activity within brain regions related to the task (e.g., prefrontal regions; Gruber and Yurgelun-Todd 2005; Tapert et al. 2007), or with the recruitment of additional, analogous brain regions (Desmond et al. 2003; Yucel et al. 2007b), suggesting compensatory patterns of activity.

The identification of executive control deficits in addicted drug users has typically involved a comparison between an actively using addicted group, or those who have recently become abstinent, and a nondrug-using control group. Little research has examined the trajectory of executive control deficits during sustained abstinence, or longitudinally examined the impact of relapse or continued use on performance. Simon et al. (2004) found that methamphetamine users who had relapsed by 3-month follow-up had significantly poorer executive control performance than demographically comparable participants who had remained abstinent. However, the “relapse” group (and for some tests the abstinent group) also had significantly poorer executive control performance than a group of comparable users who had not attempted abstinence and continued to use methamphetamine. The results of this study highlight the critical need for longitudinal research examining how cognitive performance, and in particular executive control, is influenced by continued drug use, abstinence, and relapse.

5 Attentional Bias for Drug-Related Stimuli

One suggested mechanism by which executive control dysfunction influences further drug consumption is via specific attentional biases to drug-related stimuli (e.g., drug paraphernalia). Human drug addiction is a complex multifactorial phenomenon that features, with remarkable consistency, a difficulty in directing attention away from salient drug-related stimuli. Behavioral studies have shown that processing a nonsalient stimulus in the presence of a salient drug-related stimulus presents a significant difficulty for those dependent on cocaine (Copersino et al. 2004; Hester et al. 2006), alcohol (Sharma et al. 2001; Ryan 2002; Cox et al. 2003; Duka and Townshend 2004a, b), cannabis (Field et al. 2004a), nicotine (Wertz and Sayette 2001; Powell et al. 2002; Waters et al. 2003; Bradley et al. 2004; Field et al. 2004b), or heroin (Lubman et al. 2000; Franken et al. 2003). Similarly, electrophysiological studies, which can quantify the allocation of processing resources to specific stimuli independently of conscious awareness, demonstrate enhanced event-related potential (ERP) responses to drug-related stimuli compared to nonsalient stimuli across a range of addicted populations (Warren and McDonough 1999; Herrmann et al. 2000, 2001; Franken et al. 2003; van de Laar et al. 2004; Lubman et al. 2007b, 2008). Together, these studies provide evidence that drug-related stimuli capture processing resources and influence behavior.
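By way of illustration, attentional bias of this kind is commonly quantified behaviorally as a reaction-time difference between trials containing drug-related versus matched neutral stimuli; the sketch below uses hypothetical values (not data from any cited study) to show the basic computation:

```python
# Illustrative attentional bias index of the kind used in drug-Stroop or
# dot-probe studies; all reaction times below are hypothetical.
from statistics import mean

rt_drug_related = [702, 715, 690, 731]  # RTs (ms) on trials with drug-related stimuli
rt_neutral      = [655, 640, 668, 649]  # RTs (ms) on matched neutral trials

# A positive index indicates slower responding in the presence of drug-related
# stimuli, i.e., attention is being captured by the salient cue.
attentional_bias_ms = mean(rt_drug_related) - mean(rt_neutral)
print(f"Attentional bias: {attentional_bias_ms:.1f} ms")  # 56.5 ms here
```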

The basis of this attentional bias in addicted users may relate to the reinforcing properties of drugs and their influence on the mesocorticolimbic “reward” network, and consequently, the influence of the limbic system on attention and executive control. The mesocorticolimbic neural circuit, which includes the nucleus accumbens, amygdala and hippocampus, has been associated with the acute reinforcing properties of addictive drugs (Everitt et al. 1999). Repeated administration of a drug alters the responsiveness of these brain regions, such that they become sensitized to the association between the drug, its many related stimuli (e.g., the context and surroundings in which it is taken), and the euphoria that accompanies intoxication. Indeed, studies of drug craving where drug-related stimuli are presented to either active or abstinent users have demonstrated significant activation in regions such as the amygdala, nucleus accumbens, and hippocampus (Grant et al. 1996; Maas et al. 1998; Childress et al. 1999; Garavan et al. 2000; Ciccocioppo et al. 2001; Kilts et al. 2001; Bonson et al. 2002; Brody et al. 2002; Tapert et al. 2003; Franken et al. 2004a). This type of conditioned associative learning is typically found with other reinforcing stimuli (e.g., food, pain), and stimuli conditioned in this way come to be registered as salient to the individual (Berridge and Robinson 1998).

The salience of a stimulus determines its capacity to hold attention, and to an extent, to direct attention. Learning the salience of stimuli and, in turn, allowing salience to reflexively direct our attention (particularly visual attention) appears to have a logical and evolutionary advantage. Thus, when navigating a complex multistimulus environment, our attention is captured by those items which we find rewarding (e.g., food) or that could harm us (e.g., predators). As salience directs attention relatively automatically (Pessoa and Ungerleider 2004), a greater level of executive control must be imposed to ignore a salient cue in order to focus on a less salient stimulus. Exerting such cognitive control during selective attention paradigms such as the Stroop test is associated with activation in the anterior cingulate cortex (ACC), dorsolateral prefrontal cortex, and inferior parietal regions (Kerns 2006).

The strong attentional bias that chronic users typically demonstrate for drug-related stimuli highlights its potential role in maintaining addictive behavior. If a user’s attentional system is primed to direct attention toward drug-related stimuli in the environment, re-encountering these stimuli will cue attention and, consequently, craving. Indeed, several studies have reported a correlation between craving and drug cue-elicited ERP responses (Franken et al. 2003, 2004b; Namkoong et al. 2004; Lubman et al. 2008). While the relationship between craving and relapse during abstinence is complex, users typically report that cravings occur prior to and during the period of highly ritualized and automatic drug-taking behavior that follows an impulsive urge to use (Miller and Gold 1994). Recent studies have demonstrated that the extent of an individual user’s attentional bias for drug-related stimuli can robustly predict the likelihood of successfully ceasing cigarette smoking (Waters et al. 2003), or of remaining abstinent during treatment for alcohol (Cox et al. 2002), cocaine (Carpenter et al. 2005), and heroin (Marissen et al. 2006) dependence. The study by Cox et al. (2002) measured attentional bias for alcohol-related stimuli at two time-points and demonstrated that levels of bias increased prior to relapse.

Recently, Lubman et al. (2009) utilized a multimethod approach to examine hedonic responses to natural reinforcers and drug cues among heroin users on opiate substitution treatment. Across a range of response measures (i.e., self-report, expressive, reflex modulation, and cortical/attentional), they consistently found altered processing of drug and pleasant pictures in opiate-dependent individuals relative to controls. The opiate-dependent group demonstrated enhanced attentional processing of drug-related stimuli as well as reduced responsiveness to natural reinforcers, and subjective valence ratings of pleasant pictures consistently predicted regular (at least weekly) heroin use at 6-month follow-up, even after controlling for baseline craving scores and heroin use. While few other addiction studies have included a nondrug-related emotionally salient class of stimuli (e.g., sexual imagery, highly aversive images) in their study design, these results support the notion that the hedonic balance between drug cues and natural reinforcers is abnormal in heroin users, with drug-related stimuli capturing relatively more attentional and hedonic resources than natural rewards.

Research with other clinical populations (e.g., major depression, PTSD) suggests that the neural mechanisms underlying attentional biases for emotionally salient information may be related to a reciprocal suppression effect (Bush et al. 2000). In these studies, the processing of nonsalient incongruent Stroop stimuli resulted in a pattern of increased dorsal ACC and dorsolateral prefrontal cortex activity, while the processing of evocative or emotionally salient words activated limbic areas such as the rostral anterior cingulate cortex (rACC), insula, and amygdala (Mayberg et al. 1999). Interestingly, during the latter condition, dorsal ACC and dorsolateral prefrontal cortex regions demonstrated decreased activation (compared to the incongruent Stroop condition), further supporting the notion of a reciprocal suppression effect, whereby emotionally salient words engage limbic regions and are associated with decreased activity in executive control regions.

In general, research examining the neural bases of attentional biases in drug users has been limited. Goldstein et al. (2007) administered an emotional Stroop task that presented drug-related words to participants and required them to ignore the evocative content of the stimuli while performing a cognitive operation (responding to the word’s ink color). The design of their task prevented the detection of an attentional bias; however, brain activation in response to drug-related words (relative to neutral words) in their sample of dependent cocaine users revealed significant activity in ACC and mesial orbitofrontal regions, with individual differences in performance correlating with activity in these regions.

Recent work has also demonstrated that cognitive measures of executive control dysfunction, in the absence of evocative drug-related stimuli, are capable of predicting treatment outcomes. Bowden-Jones et al. (2005) demonstrated that decision-making deficits (e.g., the inability to inhibit the selection of immediately rewarding stimuli associated with poorer outcomes over the longer term) predicted which alcohol-dependent patients would relapse in the 3 months following completion of a 21-day inpatient program. Passetti et al. (2008) have recently demonstrated a similar relationship, using performance on the Cambridge Gamble Task and the Iowa Gambling Task to predict relapse rates in dependent opiate users. This study highlighted that impulsiveness for reward, rather than impulsiveness per se (measured by tasks such as the Go/No-go), predicted relapse rates. Streeter et al. (2008) also found that baseline Stroop task performance predicted which patients failed to complete an outpatient treatment trial for cocaine dependence. These results are consistent with previous studies demonstrating a relationship between cognitive task performance and treatment completion (Aharonovich et al. 2003, 2006), although they utilized measures of treatment compliance rather than relapse to drug taking. However, the relationship between treatment completion and cognitive function may be mediated by the treatment approach. For example, compliance rates for cognitively demanding treatments (e.g., cognitive behavior therapy) may be more influenced by individual differences in cognitive ability than compliance rates for less demanding forms of treatment (e.g., medication trials).

To date, Paulus et al. (2005) have conducted the only study to demonstrate that neural activation patterns (measured by fMRI) during a decision-making task can also be used to predict relapse risk. Treatment-seeking methamphetamine-dependent patients were administered a two-choice prediction task 3–4 weeks after starting an abstinence-based treatment program. The patients were followed up at 12 months post-discharge and assessed for drug-taking behavior in the intervening period. After categorizing participants as “relapsers” or “nonrelapsers,” the fMRI data analysis indicated a network of regions that differentiated the two groups, including significantly lower levels of activity in dorsolateral prefrontal, insula, parietal, and temporal cortex regions. Activity in the right insula, right posterior cingulate, and right middle temporal cortex best differentiated relapsers from nonrelapsers, correctly predicting 17 of 18 relapsers and 19 of 22 nonrelapsers (94% sensitivity, 86% specificity). These data highlight the potential for neuroimaging studies of executive control to play a role in predicting which patients are at risk of relapsing during the early stages of treatment.
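For reference, the sensitivity and specificity quoted follow directly from the classification counts reported above; a brief check of the arithmetic:

```python
# Sensitivity and specificity recomputed from the counts reported by
# Paulus et al. (2005): 17 of 18 relapsers and 19 of 22 nonrelapsers
# were correctly classified.
true_positives, total_relapsers = 17, 18
true_negatives, total_nonrelapsers = 19, 22

sensitivity = true_positives / total_relapsers       # 17/18 ≈ 0.94
specificity = true_negatives / total_nonrelapsers    # 19/22 ≈ 0.86

print(f"Sensitivity: {sensitivity:.0%}, Specificity: {specificity:.0%}")
```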

6 Executive Control Dysfunction in “At-Risk” Individuals

The consistent finding of executive control deficits across cross-sectional studies of drug-addicted populations raises the question of whether such deficits relate directly to the addictive process or, to some extent, represent premorbid vulnerabilities. While animal research has robustly demonstrated the neurotoxic effects of chronic drug abuse on cortical regions critical to executive control, few human studies have explored whether such neural and behavioral deficits relate to pre-existing vulnerabilities that may be further exacerbated by chronic drug consumption (Lubman et al. 2007a). Using a longitudinal approach, Tarter and colleagues (Tarter et al. 2003, 2004; Habeych et al. 2005; McNamee et al. 2008) have examined which cognitive factors predict the later development of drug dependence in children (from the age of 10) who have a parent with a diagnosed substance use disorder (SUD). Their data demonstrate a strong predictive relationship between “neurobehavioral disinhibition,” a composite index of personality and neuropsychological tests that measure executive control, and the development of SUD by the age of 19. Individual differences in neurobehavioral disinhibition have also been associated with fMRI activity in prefrontal regions of adolescents at risk of SUD (McNamee et al. 2008).

Studies of high-risk populations (e.g., those with a family history of alcoholism) suggest that impairments in frontal functioning are apparent prior to drug use exposure (Monti et al. 2005; Schweinsburg et al. 2005) and can predict later substance use (Deckel and Hesselbrock 1996; see also Ivanov et al. 2008 for a review). Schweinsburg et al. (2005) demonstrated that, on a Go/No-go fMRI paradigm, adolescents with a positive family history of alcoholism showed less inhibitory frontal response than those with no family history, despite similar task performance between groups. Deckel and Hesselbrock (1996) examined the ability of neuropsychological and behavioral tests of anterior brain functioning to predict changes in adolescent alcohol-related behaviors 3 years after the initial assessment. Among subjects with a positive family history of alcoholism, tests of executive functioning were the only measures to predict later alcohol consumption.

Other populations at risk of developing drug dependence include children with diagnoses of oppositional defiant disorder, conduct disorder, and attention deficit hyperactivity disorder (Myers et al. 1995; Zoccolillo et al. 1997; Riggs 1998; Whitmore et al. 2000; Finn et al. 2005), as well as young people with psychiatric disorders such as schizophrenia and bipolar disorder (Dixon 1999; Batel 2000; Crome 2000; Soyka 2000; Chambers et al. 2001; Altamura 2007; Thoma et al. 2007). These disorders are consistently associated with impairments in executive control as well as disruptions to frontal brain circuitry, highlighting the role that executive deficits play in increasing the risk for drug addiction.

Recent advances in developmental neuroscience have highlighted that frontal brain regions do not fully mature until midway through the third decade of life (Paus 2005), and appear to be affected by episodes of developmental trauma as well as by exposure to psychoactive drugs (Lubman and Yucel 2008). Indeed, there is growing evidence that psychoactive substances impact differentially on both behavior and brain function during adolescence, with the adolescent brain appearing to be more sensitive to the neurotoxic effects of a broad range of psychoactive substances (Lubman and Yucel 2008). In addition, childhood maltreatment, which is common among drug users, has also been shown to affect brain development (Teicher et al. 2003). Thus, vulnerable adolescents (i.e., those with early trauma and/or a family history of addiction) may develop further deficits in frontal functioning following early sustained drug exposure, thereby substantially increasing their risk of transitioning from drug use to drug dependence. However, longitudinal studies that examine the impact of psychoactive drugs on the developing human brain have yet to be conducted.

7 Future Studies

Together, these findings highlight the need for prospective studies across addicted populations that systematically examine structural, functional, and cognitive changes within frontal brain networks, both pre- and post-treatment. Longitudinal research documenting the development of executive and hedonic functioning during adolescence (prior to the onset of drug use) in high-risk populations is also required, so as to determine how early drug use impacts upon developmental trajectories. Such studies would improve our understanding of neurobiological risk for addictive disorders and of the development of related neuropsychological and neurobiological impairments, as well as identify potential prognostic markers for treatment and recovery.

To date, cognitive and neurobiological research in the addiction field has tended to focus on identifying factors that increase risk for later drug dependence. More recently, however, there has been growing interest in the role of protective factors (such as executive skills), which improve resilience. Studies that examine both risk and protective factors, as well as environmental variables that promote them, have the potential to foster a greater understanding of brain–behavior relationships as well as pathways into (or away from) addictive disorders.

There is little evidence regarding how, and to what extent, the brain recovers following detoxification and protracted abstinence, or the specific role of treatment in the recovery of affected neurobiological systems. Such data would be particularly salient for rehabilitation settings, and would also provide critical information regarding prognostic outcomes. For instance, the degree to which identified executive control impairments recover with abstinence remains unclear. There is some evidence to suggest that the functional impairments observed in the executive control network are exacerbated during the early stages of withdrawal. Thus, at the time when inhibitory control and decision-making abilities are most needed, it appears that the neural systems underlying them are most impaired (Copersino et al. 2004; Jacobsen et al. 2007). This has clear treatment implications, including the need for interventions that bolster executive control during periods of increased risk, as well as the potential for utilizing neuropsychological paradigms to predict early relapse. Clearly, more research is required in this domain.

Finally, while current diagnostic criteria emphasize the physiological features of drug dependence (i.e., tolerance and withdrawal), it is arguably the neuropsychological component (i.e., impaired control over one’s behavior) that has the most significant impact on affected individuals and the wider community. Although there is growing recognition that deficits in executive control are a key feature of addictive disorders, we hope that recent advances and future studies in the neuropsychological and neuroimaging fields will further inform diagnostic conceptualizations. This would facilitate recognition of the clinical need to incorporate the management of such deficits within standard care, as well as promote the development of interventions (both pharmacological and psychological) that reduce the impact of attentional biases, enhance executive skills (i.e., improved decision-making and inhibitory control), and improve hedonic responding to prosocial relationships and activities.