Introduction to Cognitive Reserve

The idea of reserve against brain damage stems from the repeated observation that there is not a direct relationship between degree of brain pathology or damage and the clinical manifestation of that damage. For example, Katzman and colleagues described ten cases of cognitively normal elderly women who were discovered to have advanced Alzheimer’s disease (AD) pathology in their brains at death [1]. In more recent cohort studies, it has been estimated that approximately 25% of individuals who have postmortem neuropathological evidence of AD are not demented during their lives [2]. This discrepancy raises the question of how brain function and structure become decoupled and whether certain person-specific variables provide reserve against the clinical effects of pathological brain changes. Several theoretical models have been put forth to address this issue.

The cognitive reserve (CR) model suggests that the brain actively attempts to cope with brain damage by using preexisting cognitive processing approaches or by enlisting compensatory approaches [3, 4]. Individuals with high CR would be more successful at coping with the same amount of brain damage than those with low CR. In this scenario, brain function rather than brain size is the relevant variable. This characteristic distinguishes the CR model from the brain reserve model in which reserve derives from brain size or neuronal count [5]. According to the CR model, the same amount of brain damage or pathology will have different effects on different people, even when brain size is held constant.

Epidemiological studies have helped to shape our understanding of the nature of cognitive reserve and the person-specific variables which appear to enhance reserve. Many studies have demonstrated the beneficial effects of education [6], occupation [7], leisure [8, 9], and intellectual ability [10] on dementia incidence. In 1994, Stern and colleagues reported incident dementia data from a follow-up study of 593 community-based, non-demented individuals aged 60 years or older [7]. After 1–4 years of follow-up, 106 became demented, with all but 5 meeting research criteria for AD. The risk of dementia was increased in subjects with low education: the relative risk (RR) of developing dementia over the follow-up period was 2.2 for individuals with fewer than 8 years of education compared with those with more schooling. Similarly, risk of incident dementia was increased in those with low lifetime occupational attainment (RR = 2.25) and was greatest for subjects with both low education and low lifetime occupational attainment (RR = 2.87).
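For reference, the relative risk reported in such studies is simply the ratio of incidence proportions in the two exposure groups; the generic form of this ratio (not the actual counts from the study, which are not reproduced here) is:

```latex
\mathrm{RR} \;=\; \frac{\text{cases}_{\text{low-education}} / N_{\text{low-education}}}
                        {\text{cases}_{\text{higher-education}} / N_{\text{higher-education}}}
```

An RR of 2.2 therefore indicates that the incidence of dementia in the low-education group was 2.2 times that in the higher-education group over the follow-up period.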

To the extent that aspects of educational and occupational attainment reflect lifetime exposures that would increase CR, it would be logical to expect that environmental exposures later in life would also be beneficial. In a subsequent study, the same group assessed participation in a variety of leisure activities characterized as intellectual (e.g., reading, playing games, going to classes) or social (e.g., visiting with friends or relatives) in a population sample of non-demented elderly in New York [9]. During follow-up, subjects who engaged in more of these activities had 38% less risk of developing dementia. Interestingly, specific classifications of leisure activity (such as purely intellectual activities) did not provide better prediction than a simple summation of all the considered activities.

A meta-analysis examining cohort studies of the effects of education, occupation, premorbid IQ, and mental activities on dementia risk over approximately 7 years revealed that 25 of 33 datasets demonstrated a significant protective effect of these variables [11]. The summary overall risk of incident dementia for individuals with high levels of the protective variable as compared to low was 0.54, a decreased risk of 46%. There is also evidence for the role of education in age-related cognitive decline, with many studies of normal aging reporting slower cognitive and functional decline in individuals with higher educational attainment [12,13,14,15,16,17,18,19]. These studies suggest that the same factors that delay the onset of dementia also allow individuals to cope more effectively with brain changes encountered in normal aging. The concept of CR provides a ready explanation for the manner in which intellectual functioning, education, and other life experiences may allow individuals to sustain greater burdens of brain pathology or age-related changes before demonstrating cognitive and functional deficits.

Neuroimaging studies have also provided evidence in support of cognitive reserve and have contributed to our conceptualization of this phenomenon. Our original functional imaging study found that in patients matched for overall severity of dementia (i.e., clinical expression of disease), the parietotemporal cerebral flow deficit was greater in those with more years of education [20]. This observation was confirmed in a later PET study in which higher education correlated negatively with cerebral metabolism in prefrontal, premotor, and left superior parietal association areas after controlling for clinical dementia severity [21]. Similar observations have been made for occupational attainment [22] and leisure activities [23] and across multiple markers of pathology including white matter abnormalities [24] and amyloid deposition [25]. These findings are consistent with the CR hypothesis’ prediction that, at any given level of clinical disease severity, those with higher CR should harbor greater pathology (see Fig. 2.1).

Fig. 2.1 [Line graph: memory performance over time for individuals with high versus low cognitive reserve; full caption below]

Effect of cognitive reserve on dementia onset and course. Note: Fig. 2.1 illustrates the way in which cognitive reserve may mediate the relationship between AD pathology and its clinical expression. We assume that AD pathology slowly increases over time, and this is graphed on the x-axis. The y-axis represents cognitive function, in this case memory performance. AD pathology begins to develop many years before the disease is expressed clinically and slowly becomes more severe. At some point, this developing pathology will begin to produce the initial cognitive changes associated with dementia. This is labeled as the point of inflection in the figure. The pathology will subsequently result in symptoms of sufficient severity to allow the clinical diagnosis of AD (indicated by the dotted line labeled Incident Dementia). The cognitive reserve (CR) model predicts that because there are individual differences in reserve capacity, there will be individual differences in the amount of pathology required for the initial expression of clinical symptoms and the subsequent diagnosis of disease. Because people with higher cognitive reserve can tolerate more AD pathology, memory function will begin to be affected later in time, after more pathology has accumulated, pushing back the “point of inflection.” Therefore, all other things being equal, dementia should emerge later in people with higher cognitive reserve. This leads to the prediction that the rate of incident dementia should be lower in individuals with higher cognitive reserve. An assumption of this model is that at some point, AD pathology must become too severe to support the processes that mediate either cognitive reserve or memory function. The timing of this final common endpoint will be the same in all patients, regardless of their level of cognitive reserve. It then follows that the time between the point of inflection and this common endpoint will be shorter in patients with higher cognitive reserve. This leads to the prediction that memory decline after the inflection point must be more rapid in patients with higher cognitive reserve. Although this trajectory might appear counterintuitive at first, its theoretical basis is illustrated in this figure, and it has been supported by multiple epidemiological studies

Results and interpretations of these studies have been further supported by prospective projects with subsequent neuropathological analysis. Specifically, education has been found to modify the association between AD pathology and levels of cognitive function. With brain pathology held constant, higher education was associated with better cognitive function [26] and less likelihood of having received a clinical diagnosis of dementia in life [27]. These studies converge nicely with epidemiological evidence that supports that higher levels of education, occupational attainment, and leisure activity reduce dementia incidence and suggest that these variables influence dementia risk by enhancing cognitive reserve.

Theoretical Issues

Despite the wealth of information that has accumulated in support of the concept of cognitive reserve, there are many aspects of this construct that have yet to be fully elaborated. It is important to highlight these issues prior to discussing the various means of characterizing reserve and considering the clinical implications of cognitive reserve. The intent of the current chapter is not to fully explore these theoretical issues but simply to raise the reader’s awareness of the unanswered questions surrounding the construct of cognitive reserve.

First, the precise manner in which cognitive reserve affords protection from pathology is not understood. As discussed above, we know that across individuals, there is a discrepancy between brain changes or pathology and cognitive change such that in some individuals, cognitive function remains relatively preserved in the face of pathological markers. As such, individuals with high cognitive reserve are not necessarily protected from developing pathology but rather are spared the clinical effects of such pathology. Thus, when we refer to the preservation of a cognitive function such as memory in the sections below, we are in fact talking only about memory itself and not the integrity of the brain areas underlying that cognitive function (e.g., hippocampus). Indeed, the concept of cognitive reserve only applies when considering variability in cognitive functioning (e.g., memory) in the face of changes in brain integrity (e.g., hippocampal volume).

This raises one of the puzzling questions surrounding reserve: memory and hippocampal integrity are intimately related, and the mechanisms underlying the decoupling of structure and function are not clear. From a strict point of view, the differences in cognitive processing envisioned by the CR model must also have a physiologic basis, in that the brain must ultimately mediate all cognitive function. The difference is in terms of the level of analysis. Presumably, the physiologic variability subsumed by cognitive reserve is at the level of variability in synaptic organization or in relative utilization of specific brain regions. Thus, cognitive reserve implies anatomic variability at the level of brain networks, while brain reserve implies differences in the quantity of available neural substrate.

Moreover, it has more recently been recognized that life exposures that are associated with reserve also affect brain structure or brain pathology and not simply cognitive properties. This has been referred to as brain maintenance [28]. Recent studies that support this concept include one that found a reduced rate of hippocampal atrophy over 3 years in individuals with higher levels of complex mental activity across the life span [29] and another that found microstructural differences in the hippocampus as a function of education [30]. Additionally, the child developmental literature suggests that not only do individuals with higher IQ have larger brain volume [31, 32] but that cognitively stimulating aspects of life experience may also be associated with increased brain volume. It is also now clear that stimulating environments and exercise promote neurogenesis in the dentate gyrus [33, 34]. Both exercise and cognitive stimulation regulate factors that increase neuronal plasticity (such as brain-derived neurotrophic factor) and resistance to cell death. Finally, there is some evidence to suggest that environmental enrichment might act directly to prevent or slow the accumulation of AD pathology [35]. All of these considerations lead to the conclusion that brain maintenance acts to help preserve the brain over time. In this regard, we can consider brain reserve the current state of the brain as shaped by brain maintenance.

In sum, there appears to be growing evidence that the experiences that provide cognitive reserve may indeed reflect not only a cognitive advantage but a structural advantage as well. Thus, brain reserve and cognitive reserve concepts are not mutually exclusive, and it is likely that both are involved in providing reserve against brain damage. A complete model of cognitive reserve will have to integrate the complex interactions between genetics, the environmental influences on brain reserve and pathology, and the ability to actively compensate for the effects of pathology.

Setting aside the question of brain integrity, and considering cognitive reserve only, we return to the question of why insult to brain structure does not invariably affect cognition. We have observed that individuals with higher cognitive reserve (defined using a literacy measure) have less rapid memory decline over time than those with lower literacy levels [36]. However, the manner in which this memory advantage is conferred is unknown. It may be that preserved memory reflects preservation of the memory networks per se or use of alternative and supportive skills such as enhanced organizational strategies [37]. Stern and colleagues have described these two potential neural implementations of cognitive reserve as neural reserve and neural compensation [4, 38, 39]. The idea behind neural reserve is that there is natural interindividual variability in the brain networks or cognitive processes that underlie the performance of any task. This variability could be in the form of differing efficiency or capacity of these networks or in greater flexibility in the networks that can be invoked to perform a task. While healthy individuals may invoke these networks when coping with increased task demands, the networks could also help an individual cope with brain pathology. An individual whose networks are more efficient, have greater capacity, or are more flexible might be more capable of coping with the challenges imposed by brain pathology. In contrast, neural compensation refers to the process by which individuals suffering from brain pathology use brain structures or networks (and thus cognitive strategies) not normally used by individuals with intact brains in order to compensate for brain damage. The term compensation is reserved for a situation where it can be demonstrated that the more impaired group is using a different network than the unimpaired group.

It is not yet clear whether or when each of these forms of reserve comes into play. The answer to this question has several implications, one of which pertains to the applicability of cognitive reserve under various conditions. Specifically, if the benefits of cognitive reserve are attributable to the flexible application of alternative strategies for completing a task (compensation), specific aspects of brain function may receive less assistance from cognitive reserve than others. It may be that a cognitive skill such as verbal recall can be accomplished in a number of ways that differentially employ serial rehearsal, semantic processing, or working memory. In contrast, there may be fewer cognitive routes to reproduce a complex figure or detect a subtle visual detail amid a complex scene. In this scenario, a compensatory reserve mechanism might be less applicable to spatial skills than to verbal memory. However, it is also possible that the critical issue is not task specific but, rather, person specific. That is, based on life experience, one person may have multiple ways of approaching a spatial task but less flexibility for a verbal task, whereas the opposite pattern may exist in another individual. If the crux of cognitive reserve is the ability to apply alternative approaches to accomplish tasks, then the benefit of reserve may be linked directly to the flexibility of the task (and corresponding skill) itself or to a person’s premorbid cognitive style.

One final question is whether or not deterioration of specific cognitive functions can directly affect cognitive reserve. For example, if cognitive reserve is closely aligned or even overlaps with executive abilities [40], is it the case that cognitive reserve is less able (or unable) to stave off executive deficits as opposed to declines in other domains such as memory or language? That is, is cognitive reserve itself vulnerable to a particular presentation of disease? Or, is cognitive reserve a construct that is “immune” to the regional distribution of pathology, independent of the cognitive abilities that may be affected, functioning universally under a wide variety of lesions? While the answer to this question is not entirely clear, recent studies examining the effects of reserve on information processing efficiency in individuals with multiple sclerosis may shed light on the issue [41,42,43,44]. For example, Sumowski and colleagues showed that the negative effect of brain atrophy on rapid information processing was attenuated in individuals with higher levels of reserve [42], suggesting that reserve confers benefits to cognitive functions whose nature is quite similar to some conceptualizations of reserve. That is, the information processing measure was composed of the Symbol Digit Modalities Test [45] and the Paced Auditory Serial Addition Test [46], tasks which require mental flexibility and fluidity. Similarly, although speculative, one perspective of cognitive reserve is that it represents the mental flexibility to develop alternative strategies in the face of pathology and to fluidly apply such strategies to the task at hand. The reported benefits of reserve on information processing and efficiency in the above studies are interesting and raise many questions for future work. For the time being, such studies may offer preliminary evidence either that (1) reserve is immune to the distribution of pathology or (2) reserve is fundamentally different than the cognitive skills assessed in these studies.

Estimating Cognitive Reserve

A practical question for the clinician is how to account for cognitive reserve in the diagnostic process. In this section, we review the advantages and disadvantages of several approaches including the following: (1) measurement of individual characteristics (demographic and lifestyle), (2) consideration of cumulative life experiences, (3) estimation of intellectual functioning, (4) implementation of statistical approaches (use of latent or residual variables), and (5) derivation of brain network patterns. Prior to discussing these approaches, it is also important to consider that although epidemiological work has led to the conceptualization of reserve as a reflection of important lifetime experiences, the cognitive advantage which manifests as reserve might also have played an important role early in life, affording individuals the desire and ability to pursue certain life experiences such as graduate school. Thus, the effects of lifetime experiences are not necessarily separate from early life factors. Although certain work has suggested that reserve is a cumulative process built on both early life and late life experiences [47], the causal pathway of cognitive reserve has not been fully delineated. As the reader considers the clinical implications of cognitive reserve and the various methods for measuring reserve, it is important to be aware of the larger questions surrounding its origins and characteristics.

Individual Characteristics

One of the most commonly used methods of characterizing reserve involves quantifying individual characteristics that have been associated with reduced risk of dementia including education, occupation, intellectual functioning, leisure activity, and social engagement. The advantage of this approach is that these variables are relatively easy to acquire and quantify and, at face value, are generally plausible proxies for reserve. A disadvantage is that these variables may be singular representations of a multidimensional mechanism such that characterization of education in isolation, for example, might account for a relatively small proportion of the variance in overall cognitive reserve. Moreover, these variables are rather agnostic with regard to the source and nature of cognitive reserve and may confound multiple other factors with “true” reserve (e.g., education may impart greater knowledge and access to health care which in turn may promote health-related behaviors and enhance cognitive functioning). As such, use of variables such as those listed above, although convenient, should not be the sole indicators of CR.

Cumulative Life Experiences

A second approach for characterizing cognitive reserve is one in which multiple or cumulative life experiences are synthesized to develop a more comprehensive estimation of an individual’s reserve. The purported benefit of this approach is that it synthesizes numerous experiences, all of which have been shown through epidemiological work to confer protection against the development of dementia. The consideration of comprehensive life experiences offers the opportunity to capture a wide array of factors that may uniquely contribute to reserve, if indeed reserve is created through a cumulative process. Valenzuela and Sachdev [48] developed the Lifetime of Experiences Questionnaire (LEQ) as a means of capturing and quantifying various social, academic, occupational, and leisure activities spanning young to late adulthood. The questionnaire showed good reliability and validity and was useful in predicting which individuals would demonstrate cognitive decline over an 18-month period.

While this appears to be a powerful method of capturing a myriad of experiences relevant to the construct of cognitive reserve, there are several issues to consider. It is possible that the summation of experiences within this questionnaire may not be more predictive than any individual variable, and compiling these experiences may even obscure the effect of the most relevant variable. For example, Hall and colleagues found that the effect of education on cognitive decline prior to dementia diagnosis was negligible after accounting for cognitively stimulating leisure activities later in life [49], suggesting one of two possible scenarios raised by the authors. First, it could be that the effects of education were mediated by mental activities late in life or second, that education influenced reserve directly with no additional benefit conferred by later life mental stimulation. Researchers must carefully consider these issues; however, a lifetime approach to characterizing reserve for clinical purposes is certainly useful in that it comprehensively quantifies important experiences that may delay cognitive decline in the face of advancing pathology.

Intellectual Function

A third and very different means of characterizing reserve is the assessment of intellectual functioning, typically via a single-word reading test, such as the Wechsler Test of Adult Reading [50] or the North American Adult Reading Test [51], or a subtest of the Wechsler Adult Intelligence Scales such as Vocabulary or Information [52]. Word reading measures evaluate an individual’s ability to pronounce a series of phonologically regular and irregular words ranging in difficulty and are based on the idea that correct pronunciation of the more difficult items requires previous exposure to such words. Like vocabulary and fund of information, this ability is generally spared early in the course of dementia, reflecting its reliance on long-term, crystallized knowledge versus the more fluid abilities affected early in disease [53,54,55,56,57].

The characterization of IQ is believed to offer a thumbnail sketch of an individual’s lifetime intellectual achievement, highly related to, though not necessarily synonymous with, the concept of cognitive reserve. An advantage of using IQ to characterize cognitive reserve is that in contrast to an external exposure variable such as education or occupation, an internal and broadly stable capability such as IQ is presumably more closely associated with the cognitive and neural representation of reserve. Unfortunately, a corresponding disadvantage is that IQ scores do change in the course of disease and therefore can be contaminated by the disease process itself (unlike education or occupation). Moreover, while reading scores are fairly stable in the very early stages of degenerative illnesses, they are certainly not valid estimates of premorbid IQ in a language predominant illness, nor are they valid estimates in nonnative English speakers.

Despite the differences in applying IQ versus an exposure variable such as education, there is statistical evidence that the two share common variance that is distinct from cognitive functions more broadly [40]. The presence of both convergent and discriminant validity in this context provides support for both of these variables as independent proxies for reserve, as well as evidence for the construct validity of reserve. This is an important finding because the coherence of cognitive reserve as a construct remains under question, leading several groups to argue that latent variables derived through structural equation modeling may be the most appropriate way to capture the essence of reserve [58, 59]. Although the details of these models are beyond the scope of this chapter, the idea is that through statistical data reduction, we can boil down the overgeneralized concept of reserve into its core elements and identify those variables that are central to its construct versus those that may be extraneous. A necessary drawback, however, is that representation of cognitive reserve through shared variance may not reflect aspects of reserve potentially captured selectively by each unique variable.

Statistical Approaches

A statistical approach to identifying reserve has recently been proposed by Reed and colleagues [60] by decomposing the variance of a specific cognitive skill such as episodic memory. Specifically, the authors partitioned the variance explained by demographic variables (education, sex, and ethnicity), structural brain imaging variables, and a third residual component. By definition, this residual component approximates the concept of cognitive reserve as it represents the unexplained variance in cognitive performance after accounting for brain structure and, in this case, demographics. Interestingly, the authors included education as part of the demographics variable to isolate a component that would be uncontaminated by the indirect effects of education on brain integrity (e.g., access to health care and knowledge of health-promoting behaviors). Results showed that residual scores correlated with another measure of reserve (word reading), modified rates of conversion from mild cognitive impairment to dementia over time, and modified rates of decline in executive function. Finally, baseline brain status had less of an effect on cognitive decline over time in individuals with high residual scores than low residual scores.
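To make the logic of this decomposition concrete, the following is a minimal sketch in Python using ordinary least squares on simulated data; the variable names, coefficients, and sample values are placeholders for illustration and do not correspond to the actual measures or models used by Reed and colleagues.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: demographic covariates, structural brain measures,
# and an episodic memory composite for n individuals.
education_years = rng.normal(14, 3, n)
sex = rng.integers(0, 2, n)
hippocampal_volume = rng.normal(3.5, 0.4, n)   # arbitrary units
total_gray_matter = rng.normal(600, 50, n)     # arbitrary units
memory_composite = (0.1 * education_years + 0.8 * hippocampal_volume
                    + 0.005 * total_gray_matter + rng.normal(0, 1, n))

# Predict memory from demographics plus brain structure; the residual
# (observed minus predicted) serves as the reserve estimate.
X = np.column_stack([education_years, sex, hippocampal_volume, total_gray_matter])
model = LinearRegression().fit(X, memory_composite)
reserve_residual = memory_composite - model.predict(X)

# Higher residuals = better memory than brain structure and demographics
# would predict, i.e., a proxy for cognitive reserve in this framework.
print(reserve_residual[:5])
```

In this framing, the residual can then be carried forward as a predictor of outcomes such as conversion from mild cognitive impairment to dementia, paralleling the analyses described above.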

In addition to providing an operational measure of reserve that is quantitative, continuous, and specific to the individual, the residual approach to characterizing reserve allows the estimate of cognitive reserve to change over time. This fluid characteristic may or may not be appealing to individual researchers and clinicians, depending on the particular question or task at hand. The authors also note that a potential problem with this approach is that, depending on the specific brain and cognitive variables used to define reserve, different measures of reserve will be applicable to a person at any given time. Practically speaking, a primary drawback to using residual scores is that it is currently not feasible for the clinician to apply such scores on an individual basis. This may change in the future with greater access to imaging technologies and availability of normative or group data with which to derive an individual’s residual score.

Brain Network Patterns

A future goal for representing reserve is through an identifiable brain network or series of networks. Such networks might be derived using functional imaging techniques that capture the neural signature of cognitive reserve. For example, Stern and colleagues examined whether or not a common neural network, whose expression varied as a function of cognitive reserve, could be detected across verbal and spatial delayed match-to-sample tasks [61]. Indeed, in the group of young adults, such a network was identified, and expression of this network was entirely independent of task performance. The invocation of this network on divergent tasks was uniquely related to cognitive reserve, as assessed with a composite of vocabulary and word reading, suggesting that the network may represent a generalized neural instantiation of reserve.
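The sketch below is only a toy illustration of the idea of "network expression": it derives candidate covariance patterns with a generic PCA on simulated activation data and asks whether a pattern's subject-level expression tracks a hypothetical reserve composite consistently across two tasks. It is not the analytic method used in the cited study, and all variables are simulated.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_subjects, n_regions = 40, 90

# Hypothetical regional activation values (subjects x brain regions)
# from two tasks, and a reserve composite (e.g., vocabulary + reading).
activation_task_a = rng.normal(size=(n_subjects, n_regions))
activation_task_b = rng.normal(size=(n_subjects, n_regions))
reserve_composite = rng.normal(size=n_subjects)

# Decompose the combined task data into covariance patterns; each subject
# receives an "expression" score for each pattern on each task.
pca = PCA(n_components=5)
expression = pca.fit_transform(np.vstack([activation_task_a, activation_task_b]))

# Test whether expression of a given pattern tracks the reserve composite
# consistently across both tasks, as the CR network account would predict.
for k in range(5):
    r_a, p_a = pearsonr(expression[:n_subjects, k], reserve_composite)
    r_b, p_b = pearsonr(expression[n_subjects:, k], reserve_composite)
    print(f"pattern {k}: task A r={r_a:.2f} (p={p_a:.2f}), task B r={r_b:.2f} (p={p_b:.2f})")
```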

The utility of a brain network for capturing cognitive reserve is multifold. First, to the extent that reserve truly has a neural signature, the identification of a brain network that “behaves” like cognitive reserve (e.g., correlates with traditional reserve variables, persists across divergent task demands, and interacts with task performance in the expected way) would be a more direct way to measure the construct. Second, a brain network would be a nonbiased characterization of reserve that could be used universally in a manner that tests such as vocabulary or single-word reading cannot, due to their influences from culture and language. Third, a brain network is malleable in a way that fixed life experiences are not and thus lends itself to examination in the context of a longitudinal study. For example, interventional studies aimed at increasing reserve could use a brain network to measure reserve both pre- and post-intervention, and unlike cognitive testing, this network would be resistant to practice effects.

Application of Cognitive Reserve in Clinical Practice

While the concept of cognitive reserve is on the one hand intuitive, it is also easily misunderstood and conducive to misapplication in part due to the thorny theoretical and methodological issues discussed above. However, there is nothing magical about the concept of reserve, and most clinicians generally consider the role of reserve in their assessment and case conceptualization (even if not explicitly). In this section, we provide concrete suggestions for the consideration and application of cognitive reserve in clinical practice.

First, when assessing cognition as part of a diagnostic evaluation, it is important to take into account the most appropriate and valid indicator of cognitive reserve for a given patient. In the event that an individual’s level of education is not believed to be a good representation of his or her optimal cognitive functioning, assessment of IQ or consideration of occupation may provide a more accurate estimate. Alternatively, for a nonnative English speaker, education may be a better representation of optimal functioning than an IQ estimate derived from single-word reading. It should be noted, however, that the availability of such tests in other languages, including Spanish [62], French [63], Japanese [64], and Swedish [65], is increasing. Application of a non-English assessment tool would be appropriate only in circumstances when the remainder of the neuropsychological battery can also be validly administered in the same language, as direct comparisons of IQ and neuropsychological scores would be otherwise impossible.

Integration of the most appropriate and valid measure of cognitive reserve into the diagnostic formulation is critical. Individuals with high reserve, by definition, will not demonstrate clinical symptoms as early as individuals with low levels of reserve. On the one hand, this issue could partially be a problem with instrumentation, such that (1) more challenging tests with higher ceilings may better detect changes in individuals with very high levels of functioning, (2) tests that are more pathologically specific (e.g., associative learning tasks for the hippocampus) may have greater sensitivity in high reserve individuals, or (3) better normative data may allow for better detection of impairment in individuals with high levels of intellectual functioning. Indeed, quantitative consideration of IQ scores appears to improve the sensitivity of cognitive testing for detecting pathology. Rentz and colleagues [66] found that when memory scores in a group of cognitively “normal” individuals were adjusted based on IQ, the adjusted memory scores correlated with cerebral perfusion in areas vulnerable to the early stages of AD pathology. That is, those with higher IQ (i.e., reserve) had greater pathology despite similar cognitive performance, and these individuals showed greater cognitive decline over the following 3 years than the individuals whose IQ-adjusted memory scores were intact [66].

In theory, there would still be a period of time during which even the most sensitive measures would fail to detect change in those with high reserve given the apparent “lag” between pathological changes and their cognitive sequelae. Therefore, from a clinical standpoint, neuropsychological testing will be less sensitive to the presence of early pathology in those with high reserve even when we consider current test scores in the context of a person’s optimal level of functioning (e.g., IQ, education). As such, the only action to be taken by clinicians is to be aware of this conundrum and to appreciate that intact cognition in individuals with high levels of reserve does not preclude the presence of disease.

The standard and generally useful approach taken by neuropsychologists is to formally adjust cognitive scores for education, a procedure which, in theory, allows for the interpretation of current cognitive performance in the context of an individual’s expected performance. For example, we know that there are baseline differences in cognitive performance such that in the absence of pathology, a 70-year-old with 8 years of education might recall fewer words over the course of a list learning test than a 70-year-old with 19 years of education. The corollary of this phenomenon is that the patient with 19 years of education would have had to sustain a greater degree of neuropathology to reach a certain score than the individual with 8 years of education, all other things being equal. However, this observation does not, in and of itself, reflect cognitive reserve. Rather, reserve accounts for the ability of the individual with 19 years of education to maintain baseline cognitive functioning for a longer period of time than the individual with 8 years of education in the face of advancing pathology.
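A minimal sketch of this kind of regression-based adjustment appears below; the normative intercept, slopes, and residual spread are made-up values for illustration, not published norms, and the word-recall example is hypothetical.

```python
import numpy as np

# Hypothetical regression-based norms for a word-list recall test:
# expected_score = intercept + b_age * age + b_edu * education (made-up values).
INTERCEPT, B_AGE, B_EDU = 30.0, -0.15, 0.40
RESIDUAL_SD = 4.0  # spread of scores around the normative prediction

def adjusted_z(raw_score: float, age: float, education: float) -> float:
    """Return a z-score relative to demographically expected performance."""
    expected = INTERCEPT + B_AGE * age + B_EDU * education
    return (raw_score - expected) / RESIDUAL_SD

# The same raw score of 22 words looks very different for a 70-year-old
# with 8 versus 19 years of education.
print(adjusted_z(22, age=70, education=8))   # near expectation
print(adjusted_z(22, age=70, education=19))  # well below expectation
```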

Information regarding brain integrity should be integrated with cognitive data for diagnostic purposes, whenever possible. Of course, this process is done regularly in most clinical settings and adds important information and greater clarity to the overall clinical picture. In this context, however, the focus is on the relevance of neuroimaging as a means to understand the influence of cognitive reserve on the clinical presentation. Neuroimaging tools have the potential, particularly in individuals with high reserve who maintain cognitive functioning for an extended period of time, to detect pathological changes when impairment on neuropsychological testing is absent or subtle. For example, at a given level of clinical severity, AD patients with higher education have a more severe pattern of AD-related changes on PET scan than those with lower education [67, 68].

More recently, the sensitivity of a variety of imaging tools for detecting pathological changes prior to cognitive change has been demonstrated on structural MRI [69] and functional MRI (fMRI) [70], as well as through examination of activity level in the default network on resting fMRI [71]. Moving forward, in vivo amyloid imaging, although not currently used in clinical practice, will certainly play an important role in identifying neuropathological changes in asymptomatic individuals as the field moves toward earlier identification of disease. While these various technologies enable the consideration of cognitive reserve as a factor influencing the clinical presentation and diagnosis of a patient, a current challenge to integrating imaging information is applying results from group studies to individual patients. Ideally, research studies might generate a cutoff value so that performance scores below this cutoff would raise concern for the presence of pathological changes. Such a value would be selected based on its utility in distinguishing between cognitively normal individuals who go on to develop cognitive impairment and other clinical endpoints versus those who remain cognitively healthy. This type of value has been identified for the purposes of distinguishing healthy elders from those diagnosed with AD [72, 73], and future work will aim to make this distinction at earlier time points.
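One common way to derive such a cutoff from group data is to choose the threshold on an ROC curve that maximizes sensitivity plus specificity (Youden's J). The sketch below uses simulated values for a hypothetical imaging marker (a cognitive score could be substituted) and is purely illustrative of the procedure, not of the cited studies.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)

# Simulated imaging marker (e.g., hippocampal volume in arbitrary units):
# lower values in those who later develop cognitive impairment (label 1).
stable = rng.normal(3.6, 0.4, 150)
decliner = rng.normal(3.1, 0.4, 50)
marker = np.concatenate([stable, decliner])
outcome = np.concatenate([np.zeros(150), np.ones(50)])

# roc_curve expects higher scores to indicate the positive class,
# so negate the marker (lower volume -> higher risk score).
fpr, tpr, thresholds = roc_curve(outcome, -marker)

# Youden's J = sensitivity + specificity - 1; pick the threshold that maximizes it.
j = tpr - fpr
best = np.argmax(j)
cutoff = -thresholds[best]  # undo the sign flip to express as a volume
print(f"cutoff volume: {cutoff:.2f}, "
      f"sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
```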

Another recommendation for applying the concept of cognitive reserve to clinical practice is to consider it as a factor that will influence rate of cognitive decline following diagnosis. Although cognitive reserve delays the manifestation of cognitive deficits, symptoms progress fairly rapidly once evident (see Fig. 2.1). In fact, decline is more rapid in individuals with high reserve than those with low reserve, even when accounting for a multitude of other factors that may contribute to the disease course [74,75,76]. This counterintuitive acceleration in rate of change is believed to reflect the increasingly high pathological burden that the brain can no longer tolerate. Certainly, this has practical implications for the patient, family, and health-care providers. It may also have direct relevance for the effectiveness of treatment.

Cognitive reserve may influence an individual’s response to treatment with currently available medications as well as future drug therapies. The treatment of degenerative diseases such as Alzheimer’s disease is certain to be most effective when done preventatively, when the burden of pathology in the brain is very low or absent altogether. Thus, in order to develop reasonable expectations about a medication’s effectiveness, it will be important to have knowledge of three variables: cognitive performance, cognitive reserve, and pathological burden. As we have reinforced throughout this chapter, it is the combination of these three variables that enables an accurate understanding of disease severity. From a clinical standpoint, treatment in an individual with mildly impaired cognition and high cognitive reserve may be more or less effective depending on the status of the third variable, pathological burden. With little to no evidence of pathology, an individual with these characteristics would be an ideal candidate for therapy. In contrast, in the context of significant pathology, disease-delaying agents may be entirely ineffective, and this possibility should be anticipated by the clinician.

A final insight for clinicians is that while a wide range of evidence exists from epidemiological studies linking certain life experiences and individual characteristics to lower rates of dementia, this evidence is not sufficient to determine definitively whether or not such experiences directly prevent or delay dementia. As mentioned earlier, there may be a separate unidentified variable accounting for the observed relationship between specific experiences (e.g., completing crossword puzzles) and dementia risk. As such, intervention studies are needed to firmly establish causal links between life experiences, individual characteristics, and cognitive reserve, and such studies are underway. Therefore, while recommending that patients engage in certain activities such as mental enrichment and physical fitness is unlikely to be harmful and may in fact have numerous positive effects, clinicians should be careful not to present these activities as established treatments or fully proven preventative strategies against dementia.

Clinical Pearls

  • When formulating clinical impressions, apply the most appropriate and valid indicator of cognitive reserve for each individual patient. This may be an individual characteristic such as level of education; a representation of cumulative life experiences spanning social, academic, occupational, and leisure activities; or a measure of intellectual functioning. Moving forward, statistically and neuroanatomically derived measures of cognitive reserve may also become valuable for clinical purposes.

  • Integrate neuroimaging tools to complement cognitive data for diagnostic purposes.

  • Consider cognitive reserve as a factor that may affect rate of decline. The apparent yet counterintuitive acceleration of decline associated with higher cognitive reserve may reflect a state of increasingly high pathological burden that the brain can no longer tolerate.

  • Appreciate that cognitive reserve may be a factor that influences response to treatment.

  • Be aware that epidemiological studies linking life experiences to reduced dementia risk are observational, and intervention studies are needed to determine definitively if specific experiences and activities enhance reserve and lower dementia risk.