Abstract
Memory comprises many integrated components and functions. Decades ago, Ray Kesner presented an attribute model of memory that provided a functional architecture for memory organization in the brain. Many of the features of the model have been consistently supported by new data, and new discoveries about memory function have often been readily incorporated into the model. Current challenges, then, are to understand what makes each brain area so unique that it mediates a different type of memory, and to determine how the different brain areas that process mnemonic information work together in a continuous and seemingly automatic way. The literature shows that these special mnemonic contributions cannot be accounted for by unique types of neural representation in different areas of the brain. In fact, most memory-related structures show movement-, reward-, and spatial-related neural discharge, although to varying degrees. Also, emerging evidence suggests that the functional consequence of the intrinsic computations of memory structures may be comparable: each may generate a prediction error signal, albeit for different types of information. Task-dependent co-modulation of the population efferent codes of distant brain areas (e.g., striatum and hippocampus), however, may importantly determine strategic, memory-driven control over decisions that impact the future selection of responses. The automatic nature of memory-driven strategy switches may depend on a self-regulatory homeostatic system that allows integrative structures like the prefrontal cortex to continuously monitor and control the excitability state of neurons in different memory prediction areas of the brain, and in this way enable appropriate control over future decisions.
Introduction
Our memories reflect the accumulation of our past experiences. They shape future decisions and determine what and how we learn over time. It should not be surprising, then, that many fundamental elements of memory (e.g., different associative algorithms, motivation, sensory, motor, attention, memory updating, and response selection) must work together to continuously guide experience-dependent and adaptive behaviors, regardless of the type of currently active memory. Study of a variety of amnesic populations has illustrated not only that many regions of the brain play these important roles in memory, but also that different brain areas do so for different reasons. Temporal lobe patients (the most famous of whom is patient H. M.) show severe but selective anterograde episodic memory impairment while procedural memory remains intact (Bayley et al. 2005; Milner 2005). Patients suffering from basal ganglia dysfunction show selective impairment in habit learning and procedural memory (Knowlton et al. 1996; Yin and Knowlton 2006). Amygdala damage results in poor emotional regulation of memory (Adolphs et al. 2005; Paz and Pare 2013). Frontal patients suffer from inadequate working memory (Baddeley and Della Sala 1996; Goldman-Rakic 1996). These classic distinctions among the mnemonic consequences of damage to different brain areas in humans have been replicated in rodents by many, and the Kesner laboratory has been particularly successful at demonstrating not only double, but often triple, dissociations of function among structures like the hippocampus, striatum, amygdala, and prefrontal cortex (e.g., Chiba et al. 2002; Gilbert and Kesner 2002; Kametani and Kesner 1989; Kesner et al. 1993; Kesner et al. 1989; Kesner and Williams 1995).
Moreover, the often clever behavioral paradigms created in the Kesner laboratory over the years have been inspirational for generating more specific hypotheses about memory function that could be tested in human subjects (e.g., Hopkins et al. 1995, 2004; Kesner and Hopkins 2001, 2006).
Decades ago, Ray Kesner was perhaps the first to develop a broad theoretical model of memory that sought to explain many of its complexities. His attribute model of memory posits the existence of three basic types of memory (event, knowledge, and rule-based memories), each of which incorporates similar and fundamental memory operations to establish and use its particular type of memory. Kesner has written many elegant reviews of his work (Kesner 1980, 1998, 2009; Kesner and DiMattia 1987; Kesner and Rogers 2004), and readers, if they have not done so already, should seek out those reviews to gain an appreciation for his impressive, timely, and innovative program of research. Kesner initially proposed his attribute model at a time when most studies of the neurobiology of memory focused on simpler memory functions of a small number of brain regions. More recently, however, the development of new behavioral and neuroscience technologies has sparked the current, widespread, and strong interest in studying the multiple neural systems of the brain during complex memory functions that involve learning and decision-making mechanisms. Thus, it is clear that the attribute model was, and continues to be, a visionary theoretical framework for studying brain mechanisms of memory, learning, and decision-making. Indeed, it is now generally accepted that, as espoused by the attribute model, hierarchical sets of parallel and distributed neural networks mediate the complex and dynamic processes of simple and complex memories in the brain. Current challenges are to figure out how different networks interact, how behaviors come to guide memory operations, and how existing memories guide future learning and decisions.
Neurophysiological investigations of these memory-related brain regions both confirmed and challenged the multiple memory system view. Spatial and conjunctive context-dependent coding were identified in the hippocampus (O’Keefe and Dostrovsky 1971; O’Keefe and Nadel 1978), and this was consistent with the view that hippocampus mediates episodic memory (Tulving 2002). Response-related codes were found in the striatum (Eschenko and Mizumori 2007; Jog et al. 1999; Yeshenko et al. 2004), supporting the hypothesis that the striatum mediates habit or response learning (Knowlton et al. 1996). Frontal cortical neurons remain active during delay periods (Goldman-Rakic 1995), a finding that one might expect from a brain region that is importantly involved in working memory (Fuster 2006, 2008, 2009). With time, however, additional studies began to show that these striking neural correlates of behavior were not so unique to the hippocampus, striatum, and frontal cortex. Egocentric movement-related firing by hippocampal interneurons was reported long ago (e.g., Vanderwolf 1969) but was largely unstudied until recently in favor of what were at the time the more intriguing place cells. Parietal cortical neurons also showed strong representations of behavioral responses (e.g., Fogassi et al. 2005; McNaughton et al. 1994). Delay cells were found in regions of the cortex other than the prefrontal cortex, for example, in somatosensory cortex (Meftah et al. 2009), parietal cortex (Snyder et al. 1997), frontal eye fields (Curtis et al. 2004), and, less so, in temporal cortex (Kurkin et al. 2011). The fact that the single-unit evidence did not align directly with the lesion literature suggested that many regions of the brain use similar types of information during their mnemonic computations.
However, since the single-unit data came from studies of rodents and primates that used different recording methods while subjects performed a diverse set of tasks, it became important to record from multiple memory-related brain structures as an animal performed a single task that required animals to switch between different memories to continue adaptive decision-making.
In the following section, we describe our efforts to address the issue of whether different brain regions mediate different memories because they represent different kinds of information. The last section of this chapter explores the hypothesis that the relative contribution of different brain areas to memory is driven by homeostatic neural mechanisms that ensure proper self-regulation of a behavioral adaptation system that depends on the memory functions described in the attribute model.
Memory Specialization and the Brain
Given that different memory capacities exist across different brain structures, a major challenge has been to understand why those different brain areas make such specialized contributions to memory. The following describes investigations that tested the hypothesis that different brain areas represent different types of information and that this is responsible for their different memory capacities. The specific focus here is a comparison of hippocampal and striatal neural representations as rats performed a hippocampal-dependent (spatial) task or a striatal-dependent (response) task. Importantly, these tasks show dissociable mnemonic involvement of the hippocampus and striatum in lesion and clinical investigations. Also, it is noted that the principal conclusions from these results should apply more broadly to an understanding of the relationship between other memory-related brain areas, such as the amygdala and prefrontal cortex.
Are Memory Specializations Due to the Nature of the Information Represented by Neurons?
Dorsal striatal and dorsal hippocampal single-unit activity were recorded as rats performed a T-maze task (Yeshenko et al. 2004; Eschenko and Mizumori 2007). One group of rats was trained to solve the first 10 trials of a recording session according to a spatial strategy and then the next 10 trials according to a response strategy. According to a spatial strategy, rats seek a location that had been previously associated with reward. A response strategy, on the other hand, requires rats to use the same egocentric response (i.e., a right turn from the start location) to find food. Another group of rats ran 10 response trials followed by 10 spatial trials. Since these 20 trials were performed within one recording session, the same striatal and hippocampal neurons were recorded before, during, and after an experimenter-controlled switch in cognitive strategy (or memory). Also, since striatal neurons were often recorded simultaneously with hippocampal neurons, it was possible to directly compare activity in the two brain structures relative to the currently active memory, the accuracy of the choices made, and the type of cognitive switch. Such switches included not only changes from a spatial to a response strategy (or vice versa), but also changes from one spatial memory (e.g., food is in the north location) to another spatial memory (e.g., food is in the south location), or from one response strategy (always turn right to find food) to another (always turn left to find food). Also, in some tests, the visual cues were altered to produce another type of strategy (or memory) shift. Together, the inclusion of these different types of manipulations made it possible to determine whether neural responses in hippocampus or striatum were specific to a particular type of activated memory or cognitive change, or to cognitive change in general.
As had been reported in previous studies (e.g., McNaughton et al. 1983; O’Keefe and Dostrovsky 1971; Olton et al. 1978; Muller and Kubie 1987; Ranck 1973; Redish 1999; summarized in Mizumori 2008b; O’Keefe and Nadel 1978), hippocampal pyramidal cell discharge showed strong correlations with the location of an animal on the maze, while hippocampal (presumed) interneurons showed firing that was correlated with an animal’s movement velocity (Eschenko and Mizumori 2007; Yeshenko et al. 2004). As expected from the results of striatal lesion studies, dorsal striatal neurons showed strong correlations with behavioral response parameters such as the rat’s movement velocity and acceleration. Unexpectedly, location-selective neurons were also found in both medial and lateral sectors of dorsal striatum. (This pattern contrasts with an earlier report that a different type of neural representation of space, a rat’s directional heading, is found only in the medial, not lateral, dorsal striatum; Ragozzino et al. 2001.) Most properties of both the movement and location correlates of hippocampal and striatal neurons did not differ as a function of whether the rat was solving the task with a (hippocampal-dependent) spatial or a (striatal-dependent) response strategy. This result indicates that both structures continuously and actively process similar types of information (although some of the details may vary) regardless of whether hippocampal-dependent or striatal-dependent memories are engaged.
Given that both spatial and response-related information are represented in hippocampus and striatum, it is possible that the distinct mnemonic contributions of these areas derive from differential sensitivities to changes in memory or contextual information. This hypothesis was also tested by Eschenko and Mizumori (2007) and Yeshenko et al. (2004), and the clear result was that both hippocampal and striatal spatial and movement neural codes were sensitive to changes in the memory demands of the task, regardless of how the change in memory was brought about (e.g., by changing choice strategies or the available cues, or by switching from a spatial to a response strategy, or vice versa). For example, velocity-correlated firing by either striatal or hippocampal neurons showed significant changes in the magnitude of the velocity correlation after the cognitive demands of a task shifted. This was the case even though the actual behaviors and velocities exhibited by the rats were consistent in all phases of the test session. Interestingly, hippocampal movement codes responded most dramatically when rats shifted from one type of spatial strategy to another, whereas striatal movement codes responded similarly across all types of context shifts. Thus, details of a given context shift may differentially impact particular hippocampal neurons, whereas the same could not be said for striatal networks. Perhaps striatum responds more generally than hippocampus to any type of context change, a conclusion that is consistent with the view that striatum is primarily responsive to changes in reinforcement conditions. Striatal and hippocampal velocity codes per se, then, are not solely determined by the ongoing behavior of the animals; rather, they are determined, or gated, by memory.
Place-specific firing by striatum and hippocampus also showed sensitivity to changes in cognitive strategy (or activated memory), as evidenced by significant changes in place field location or in-field firing rates. In the future, then, it would be of interest to determine whether the context-regulation of striatal neural codes derives from hippocampus, by disabling hippocampus while testing striatal neural responses to a context change. Since the hippocampus is hypothesized to detect changes in context (e.g., Mizumori et al. 1999; Mizumori et al. 2007a; Mizumori and Jo 2013), one prediction is that striatal neurons will not respond to context manipulations without a properly functioning hippocampus. If, however, striatal context-sensitivity is not affected when hippocampal input goes off-line, then the context-dependent striatal codes may emerge from neocortical memory systems.
Do the Memory Specializations Within the Brain Reflect Compensation After Brain Damage?
Another consideration that can explain the different memory consequences of hippocampal and striatal lesions is that their effects reflect the extent to which the remaining brain areas can compensate for a particular type of lesion. If the intrinsic processing by the structure of interest is unique and sufficient for a given type of learning to take place, then no behavioral impairment should be observed when other neural circuits are compromised. Indeed, there is abundant evidence that under most conditions, stimulus-response learning is not impaired following hippocampal lesions, presumably because striatal computations are sufficient to support such learning (e.g., Knowlton et al. 1996; McDonald and White 1993; Packard et al. 1989; Packard and McGaugh 1996; Yin and Knowlton 2006). This does not mean that the hippocampus does not normally play a role in stimulus-response performance; if hippocampus defines the context of the learning so that animals can quickly adapt when test conditions change, deficits in stimulus-response performance may be observed after hippocampal damage only if a context change is involved. In contrast, context-dependent learning is by definition complex and dynamic—a situation that the hippocampus seems uniquely qualified to handle. Thus, as shown in the literature, hippocampal, but not striatal, lesions result in selective context memory deficits.
Are Memory Specializations in the Brain Defined by the Coordination of Neural Networks Across Brain Structures?
It is possible that brain structures play different roles in memory because of task-dependent co-modulated neural activity across different brain structures at strategic times during task performance. Although much work remains to thoroughly test this hypothesis, there are initial indications that support this view. Ragozzino et al. (2001) recorded striatal head direction cells simultaneously with hippocampal place cells, and then compared their responses to different types of context shifts. It was found that when familiar cues were shifted, head directional preferences and place field locations shifted in a comparable fashion. In contrast, when rats were placed in new environments, the shift in head direction preferences and place fields appeared random relative to each other. This result suggests that memory (i.e., past experiences) can bias striatal and hippocampal neural responses in a coordinated fashion, and that without such memory guidance, the two structures function more independently.
Numerous laboratories report that specific neural population-based rhythms can be detected both within and between memory structures such as the hippocampus, striatum, and prefrontal cortex (DeCoteau et al. 2007a; Engel et al. 2001; Fell et al. 2001; Siapas et al. 2005; Tabuchi et al. 2000; Varela et al. 2001; Womelsdorf et al. 2007). Hippocampal theta activity seems to coordinate the timing of spatial coding by hippocampal neurons (Gengler et al. 2005; O’Keefe and Recce 1993). Striatal theta oscillations have been shown to become entrained to the hippocampal theta rhythm (Berke et al. 2004; DeCoteau et al. 2007a) in a behavior-dependent fashion. Also, directly stimulating the striatum can induce high-frequency hippocampal theta activity (Sabatino et al. 1985). When neural activity is disrupted in the striatum via D2 receptor antagonism, striatal modulation of high-frequency hippocampal theta activity is reduced, motor and spatial/contextual information is not integrated, and task performance is impaired (Gengler et al. 2005). This is consistent with the idea that theta is important for sensory-motor integration (Hallworth and Bland 2004). It appears, then, that during goal-directed navigation, hippocampal and striatal activity becomes increasingly coherent, and this pattern is dopamine-dependent. Given its putative role in assessing the value of behavioral outcomes (e.g., Schultz and Dickinson 2000), dopamine may play an important role in biasing the relative strengths of hippocampal and striatal output signals according to the most effective mnemonic strategy (e.g., Mizumori et al. 2004).
Particularly intriguing is a finding common to both the hippocampus and striatum, namely, that synchronous neural activity occurs in specific task-relevant ways (e.g., Hyman et al. 2005; Jones and Wilson 2005), especially when rats engage in decision-making (e.g., Benchenane et al. 2010). For example, striatal theta is modified over the course of learning an egocentric T-maze task, increasing as the rat comes to regularly choose and initiate turn behavior (DeCoteau et al. 2007a, 2007b). Rats that learned the task developed an antiphase relationship between hippocampal and striatal theta oscillations, while rats that did not learn the task did not show this coherent theta relationship. This coherence has also been observed during striatal-dependent classical conditioning (Kropf and Kuschinsky 1993).
The possibility that dopamine contributes to the regulation of memory efficiency and strategies by coordinating ensemble neural activity in distant brain structures is intriguing given its role in decision-making: Coherent theta oscillations across distant brain structures can be enhanced with application of dopamine (Benchenane et al. 2010) and impaired by dopamine antagonism (Gengler et al. 2005). Functionally, this type of control by dopamine suggests that information about the saliency of reward may determine which brain systems become synchronized (and desynchronized), and this in turn informs the decision process about what information is used to update memories and which behaviors are selected.
The functional importance of neural oscillations in the gamma range (30–80 Hz) remains debated. However, since gamma oscillations tend to occur intermittently (i.e., “gamma bursts” of about 150–250 ms are followed by periods of desynchronized activity), information carried by the cells that participate in a gamma burst effectively becomes a clear and punctate signal against a background of disorganized neural activity. For this reason, it has been suggested that gamma bursts represent a fundamental mechanism by which information becomes segmented and/or filtered within a structure, as well as a way to coordinate information across structures (Buzsaki 2006). Although theta and gamma frequencies are quite different (perhaps reflecting the type of information that each rhythm coordinates), there are many common physiological and behavioral relationships that suggest they are components of a coordinated, larger-scale oscillatory network. For example, similar to theta rhythms, single-unit responses that are recorded simultaneously with gamma oscillations have been found to have specific phase relationships to the gamma rhythm (e.g., Berke 2009; Kalenscher et al. 2010; van der Meer and Redish 2009). Also, it is hypothesized that gamma oscillations may effectively select salient information that can come to impact decisions, learning, and behavioral responses (e.g., Kalenscher et al. 2010; van der Meer and Redish 2009), since they often appear in relation to task-relevant events. Another similarity with the theta system is that the occurrence of gamma oscillations appears to be at least in part regulated by the dopamine system (Berke 2009).
Are Memory Specializations in the Brain Defined by the Functional Significance of the Output Messages of Populations of Cells?
From the above discussion it is clear that there are widespread neural codes for spatial and response aspects of task performance across different brain areas, and that these are context- (or memory-) dependent. It is possible that neuromodulators such as dopamine bias the strengths of the output messages according to recent behavioral success. What then is the meaning of the efferent neural code at the population level? What neural mechanisms control this meaning, and how are the outputs of different memory processing areas of the brain coordinated to result in continuously adaptive behaviors? These are some of the big challenges that remain to be addressed before we can fully understand the neural systems basis of multiple memory function. What follows is a suggested approach to future investigations of these big challenges.
Predictive Memories and Adaptive Decisions
Known patterns of intrinsic neural connectivity indicate that each memory-related brain structure undoubtedly processes information in a somewhat unique way, and yet it is unclear whether these differences are sufficiently unique to account for the documented specialized memory capacities of each brain area. There is, however, growing evidence that the output of different brain structures has a common goal for different kinds of information: to relay the extent to which experience-based predictions relevant to optimal task performance are borne out. In fact, an emerging view is that the brain evolved in large part to allow organisms to accurately predict the outcomes of events and behaviors (e.g., Buzsaki 2013; Buzsaki and Moser 2013; Llinas and Roy 2009; Mizumori and Jo 2013). In this way, organisms have been able to adapt to environments and societies of increasing complexity—a condition that required sophisticated mechanisms to make decisions and predictions in a dynamic and conditional environment. According to this view, the underlying neural mechanisms of predictions (and the assessment of their accuracy) are likely to be highly conserved across species (Adams et al. 2013; Watson and Platt 2008). This includes the ability to retain information over varying timescales depending on the desired goal. Indeed, different brain areas are known to generate and retain sequences of information, an ability that can be accounted for by state-dependent changes in network dynamics (Mauk and Buonomano 2004), internally-generated oscillatory activity (Pastalkova et al. 2008), and/or dedicated “time cells” (Kraus et al. 2013).
Hippocampal Evaluation of the Accuracy of Predictions About Contextual Information
A context discrimination hypothesis (CDH) postulates that single hippocampal neurons provide multidimensional (context-defining) data to population-based network computations that ultimately determine whether expected contextual features of a situation have changed (e.g., Mizumori et al. 1999, 2000a, 2007a; Mizumori 2008a, b; Smith and Mizumori 2006a, b). Specifically, these hippocampal representations of spatial context information (O’Keefe and Nadel 1978; Nadel and Payne 2002; Nadel and Wilner 1980) may contribute to a match-mismatch type of analysis that evaluates the present context according to how similar it is to the context that an animal expects to encounter based on past experiences (e.g., Anderson and Jeffery 2003; Gray 1982, 2000; Hasselmo 2005b; Hasselmo et al. 2002; Jeffery et al. 2004; Lisman and Otmakhova 2001; Manns et al. 2007a; Mizumori et al. 1999, 2000a; Smith and Mizumori 2006a, b; Nadel 2008; Vinogradova 1995). Detected mismatches can be used to identify novel situations, initiate learning-related neural plasticity mechanisms, and distinguish different contexts in memory—functions that are necessary to define significant events or episodes. When a match is computed, the effect of hippocampal output could be to strengthen currently active memory networks located elsewhere in the brain (e.g., neocortex). Thus, hippocampus may play different mnemonic roles depending on whether or not contexts change.
In support of the CDH, disconnecting hippocampus by fornix lesions impairs context discrimination (Smith et al. 2004), and hippocampal lesions reduce animals’ ability to respond to changes in a familiar environment (Good and Honey 1991; Save et al. 1992a, 1992b). Spatial novelty detection corresponds to selective elevation of the immediate early gene c-fos in hippocampus, and not in surrounding parahippocampal cortical regions (Jenkins et al. 2004). Also, as described above, hippocampal neurons show significantly altered firing patterns when rats experience spatial or nonspatial changes in a familiar environment (Eschenko and Mizumori 2007; Ferbinteanu and Shapiro 2003; Fyhn et al. 2002; Leutgeb et al. 2005a, 2005b; Moita et al. 2004; Muller and Kubie 1987; O’Keefe 1976; Puryear et al. 2006; Smith and Mizumori 2006b; Wood et al. 1999; Yeshenko et al. 2004). As an example, Smith and Mizumori (2006b) showed that hippocampal neurons develop context-specific responses, but only when rats were required to discriminate contexts. Discriminating neural responses were not observed when rats were allowed to randomly forage for the same amount of time. Further, Manns et al. (2007b) demonstrated that relative to match trials in an odor cue or object recognition task, CA1 neurons discharged preferentially when animals experienced a nonmatch situation in these same tasks. Also consistent with the CDH, neuroimaging studies of human performance show that hippocampus becomes differentially active during match and mismatch trials (Chen et al. 2011; Dickerson et al. 2011; Duncan et al. 2012a; Duncan et al. 2012b; Foerde and Shohamy 2011; Kuhl et al. 2010; Kumaran and Maguire 2007).
The detection of changes in context is fundamentally important for the continual selection of appropriate behaviors that optimize performance and learning in a variety of tasks (e.g., navigation-based learning, instrumental conditioning, or classical conditioning). Context discrimination engages and prepares cellular mechanisms for rapid new learning at potentially important times (Paulsen and Moser 1998), as it is generally known that novelty detection increases attention and exploratory behaviors in a variety of tasks. Interestingly, hippocampal cell firing tends to occur during the “encoding phase” of the ongoing theta rhythm (Hasselmo 2005a), which is increased during exploratory and investigatory behaviors (Vanderwolf 1969). Thus, detection of a nonmatch situation can change the relationship between cell discharge and the local theta rhythm such that encoding functions are enhanced. Detection of matches, on the other hand, does not cause changes in the hippocampal neural activity profile, resulting in efferent messages that continue to retrieve/utilize the currently active memory network that drove the execution of recently successful responses. Context discrimination, then, can be viewed as critical for the formation of new episodic memories because it leads to the separation, in time and space, of one meaningful event from the next. Such division of memories could facilitate long-term information storage according to memory schemas (Bethus et al. 2010; Tse et al. 2007).
Context discrimination, or the detection of a mismatch between expected and experienced context-specific information, can be considered an example of an error in predicting the contextual details of the current situation, referred to as a context prediction error. Transmission of a context prediction error signal from hippocampus should inform distal brain areas that a change in the context has occurred. In this case, upon receipt of the context prediction error message, efferent midbrain structures may respond with changes in excitation or inhibition that are needed to evaluate the subjective value of the context prediction error signal (e.g., Humphries and Prescott 2010; Lisman and Grace 2005; Mizumori et al. 2004; Penner and Mizumori 2012a). On the other hand, a hippocampal signal indicating that there was no prediction error may enable plasticity mechanisms that facilitate the incorporation of new information into existing memory schemas (e.g., Bethus et al. 2010; Mizumori et al. 2007a, b; Tse et al. 2007). Thus, hippocampal context analysis becomes critical for the formation of new episodic memories not only because prediction signals provide a mechanism that separates in time and space one meaningful event from the next, but also because the outcome of the prediction error computation engages appropriate neuroplasticity mechanisms in efferent structures that promote subsequent adaptive decisions and memory.
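The match-mismatch logic at the heart of the CDH can be caricatured in a few lines of code. The sketch below is purely illustrative: the feature names, weights, and the decision rule are hypothetical choices for this example, not parameters taken from the studies cited above.

```python
def context_prediction_error(expected, observed, weights=None):
    """Compare an expected context (feature -> value) with the currently
    observed one and return a scalar mismatch signal in [0, 1].
    A value of 0.0 means the observed context matched the prediction."""
    weights = weights or {}
    mismatches = [weights.get(feature, 1.0)
                  for feature, value in expected.items()
                  if observed.get(feature) != value]
    return sum(mismatches) / max(len(expected), 1)

# Hypothetical context features for a maze session.
expected = {"cue": "north_light", "reward": "north_arm", "odor": "vanilla"}
observed = {"cue": "north_light", "reward": "south_arm", "odor": "vanilla"}

error = context_prediction_error(expected, observed)
# A nonzero error would signal a context change to efferent structures,
# favoring encoding; a zero error would favor retrieval of the current
# memory network (the match case described in the text).
mode = "encode" if error > 0 else "retrieve"
```

Here a single moved reward yields a partial mismatch (one of three expected features changed), which in the CDH framing would bias downstream structures toward new learning rather than continued retrieval.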
Striatal Evaluation of the Accuracy of Predictions About Response Outcomes
Analogous to hippocampus, the midbrain dopaminergic system also generates prediction error signals, but in this case the focus is on whether the outcomes of goal-directed behaviors occur as predicted based on past experience (Bayer and Glimcher 2005; Hollerman et al. 1998; Hollerman and Schultz 1998; Mizumori et al. 2009; Stalnaker et al. 2012). In particular, it is thought that dopamine neurons transmit information about the subjective value of rewards in terms of reward prediction error (RPE) signals. RPEs are thought to initiate three distinct and parallel loops of information processing between striatum and neocortex as new associations become learned sufficiently to habitually drive behaviors (e.g., Alexander et al. 1986; Alexander and Crutcher 1990a, b; Haber 2003). Penner and Mizumori (2012b) recently summarized this vast literature: Information within the limbic loop flows between the ventromedial prefrontal cortex and the ventral striatum (Alexander and Crutcher 1990a, b; Graybiel 2008; Graybiel et al. 1994; Pennartz et al. 2009; Voorn et al. 2004; Yin and Knowlton 2006) to mediate learning about the significance of previously neutral stimuli (i.e., as occurs in Pavlovian learning). The associative loop involves the medial prefrontal cortex and the dorsomedial striatum to support action-outcome learning. The sensorimotor loop involves transmission between somatosensory and motor cortical areas and the dorsolateral striatum. The latter loop is suited for the incremental sensory-motor learning that happens when new procedural memories are formed. It is hypothesized that the transformation of newly learned behaviors into habits occurs as a result of multiple iterations of information flow through these three loops, starting with the limbic loop, then the associative loop, and finally the sensorimotor loop.
Importantly, information flow through these systems is thought to be continually informed about the expected values of goals via dopamine signaling from the ventral tegmental area (VTA) and/or the substantia nigra (SN; Horvitz 2002; Nicola et al. 2004; Schultz 2010). When performing well-learned habits, the striatum is particularly suitable to rapidly control behavior or to provide feedback about behaviors that led to prediction errors (Stalnaker et al. 2012) because of its rather unique pattern of reciprocal connections with sensory and motor cortical regions (Alexander and Crutcher 1990a; Groenewegen et al. 1999; Haber 2003), and because striatum can receive immediate feedback when goal outcomes are not what was expected. In this way, midbrain signals of errors in predicting rewards may initiate adjustments to future planned behaviors (Penner and Mizumori 2012b).
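The RPE computation discussed in this section is commonly formalized as a temporal-difference (TD) error, delta = r + gamma * V(s') - V(s). The following is a minimal sketch of that standard formalization, not an implementation taken from the studies cited here; the state names, learning rate, and discount factor are illustrative assumptions.

```python
# Minimal temporal-difference sketch of a reward prediction error (RPE):
#   rpe = reward + gamma * V(next_state) - V(state)
# State names and parameter values are hypothetical.

def td_update(V, state, next_state, reward, alpha=0.1, gamma=0.9):
    """Compute the RPE for one transition and nudge the value estimate."""
    rpe = reward + gamma * V.get(next_state, 0.0) - V.get(state, 0.0)
    V[state] = V.get(state, 0.0) + alpha * rpe
    return rpe

V = {}
# Repeated rewarded transitions make the reward progressively less surprising:
rpes = [td_update(V, "goal_approach", "goal", reward=1.0) for _ in range(20)]
# rpes[0] is maximal (a fully unexpected reward); later RPEs decay toward
# zero, mirroring the classic dopamine response to well-predicted rewards.
```

The decaying error series is the formal counterpart of the empirical observation that dopamine neurons stop responding to rewards once those rewards are fully predicted.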
Sensory and Motor Predictions
In addition to hippocampus and striatum, various sensory-motor cortical and cerebellar areas have been reported to generate prediction errors when sensory or motor-related input does not match expectations (e.g., Scheidt et al. 2012; Tanaka et al. 2009). This sort of feedback permits temporally and spatially precise behavioral adjustments based on past outcomes. Also, information about expected sensory and motor events can be used to plan future sensory expectations and specific anticipatory movements (e.g., Duhamel et al. 1992). Such prediction error mechanisms are thought to fine-tune actions to optimize the chances of securing a desired goal.
Summary: Error Signaling in the Brain
The above description illustrates that the generation of neural responses that signal times when actual events or information do not match those expected based on past experiences (i.e., prediction error signals) is often observed across many brain areas. In fact, it has been suggested that the ability to predict behavioral outcomes has essentially driven the evolution of the entire brain (Llinas and Roy 2009). Such error signals allow organisms to appropriately refine movements and choices relative to their perceived utility or value, and thus ultimately determine future decisions and behavior (e.g., Doll et al. 2012; Schultz and Dickinson 2000; Walsh and Anderson 2012). At a cellular level, prediction error signals may elevate the level of excitability of efferent neurons such that they become more responsive to outcome signals. This greater neural responsiveness may enhance the temporal and spatial resolution of future neural responses, and this in turn may ultimately result in improved accuracy of future predictions. For example, if hippocampus detects a mismatch between expected and actual contextual features, it may generate an error signal that “alerts” striatal efferent structures so that they become more responsive to future rewards (Lisman and Grace 2005; Mizumori et al. 2000a, 2004, 2009; Penner and Mizumori 2012a, b; Schultz and Dickinson 2000). Midbrain-generated reward prediction error signals may destabilize cortical neural (memory) networks so that they become more readily updated with new information (Mizumori 2008a; Penner and Mizumori 2012b). The updated memory information can then be passed on to hippocampus in the form of the most up-to-date context expectations. This view of how error signals can inform future processing in other prediction regions of the brain is consistent with the view that there is a high degree of interdependence across mnemonic structures regardless of the task (Mizumori et al. 2004; Yeshenko et al. 2004).
Homeostatic Regulation of Predictive Memory Systems
An interesting and often described feature of memory function is the rapid and seemingly automatic nature of its processes, or of changes in processing, when a significant feature of a memory task changes. A challenge for future research, then, is to understand the neural mechanisms of the apparent automaticity and high accuracy of prediction analyses. An intriguing possibility is that the seemingly automatic nature derives from self-regulatory, intrinsic synaptic mechanisms rather than (only) responses to external information. Such mechanisms may align with principles of homeostasis similar to those described for self-regulation at synaptic and neural circuit levels (e.g., Marder and Goaillard 2006; Marder and Prinz 2003; Mizumori and Jo 2013; Turrigiano 1999, 2008, 2011; Turrigiano and Nelson 2004). That is, homeostatic regulation could drive the automatic and continuous maintenance of the balance between stable and flexible processing that neural networks need to retain learned (stable, or expectancy) information that can be (flexibly and adaptively) updated as needed.
Marder and Goaillard (2006) suggested that homeostatic neural plasticity may be nested: Calcium sensors may monitor neural firing rates, then up- or downregulate the availability of glutamate receptors to shift firing rates toward an optimal set point. Groups of neurons or neural networks may sense changes in firing collectively to regulate experience-dependent population activity levels and patterns of activation. In this way homeostatic synaptic plasticity enables groups of neural circuits to find a balance between flexible and stable processing, as needed to learn from experiences and to remain responsive to future changed inputs. While details of how networks of cells or their connections engage in homeostatic regulation remain to be discovered, it is worth noting that homeostatic regulation at the neural systems level indeed occurs, as is evident from studies of brain development, as well as from studies of reactive or compensatory neuroplasticity mechanisms that occur in response to experience (e.g., sensorimotor learning; Froemke et al. 2007) or brain injury (e.g., brain trauma or addiction; Nudo 2011; Robinson and Kolb 2004). Although homeostatic neural plasticity mechanisms have not been used to account for complex learning, current theories of reinforcement- and context-based learning and memory commonly rely on the autoregulation of feedback loops between systems that assess the outcomes of choices and existing (episodic) memory systems.
If homeostatic regulation pertains to neural networks that underlie adaptive memory, it should be possible to identify key components, including variables that are monitored by sensors and then regulated by controllers. For complex memory systems, such a model likely contains multiple hierarchically organized loops of control, similar to what was described by Buzsaki (2013). For example, iterative updating via interactions between hippocampus, the dopamine-striatal system, and memory networks would allow context prediction errors to guide future adaptive behaviors. This is somewhat similar to the micro- and macro-agent distinction recently described by Kurth-Nelson and Redish (2009). This process may be enabled by sensors that monitor cell excitability within each structure. In this case, changes in calcium flux appear to be an important part of the sensing system that determines the current level of firing. A change in firing rates (either higher or lower than the optimal level) is taken as an indicator of a mismatch between optimal and actual rates, and a controller mechanism becomes engaged to bring the firing rates back to optimal levels. Such a natural restorative mechanism that responds to cellular detection of errors in prediction (as reflected in firing rate deviations) seems essential, since it would be unhealthy for neurons to exist in a chronically excited or inhibited state.
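The sensor/controller loop described in the paragraph above can be sketched as a toy control system. This is a deliberately simplified, assumption-laden caricature rather than a biophysical model: firing rate is modeled as synaptic gain times a fixed drive, and a multiplicative gain adjustment stands in for synaptic scaling; the set point and adjustment rate are arbitrary.

```python
# Toy homeostatic loop (illustrative assumptions throughout, not a
# biophysical model): a "sensor" reads the current firing rate and a
# "controller" multiplicatively scales synaptic gain to pull the rate
# back toward a set point, in the spirit of synaptic scaling.

def homeostatic_step(firing_rate, gain, set_point=5.0, adjust=0.05):
    """One control step: scale gain down when the rate is too high, up when too low."""
    error = firing_rate - set_point                   # mismatch detected by the sensor
    return gain * (1.0 - adjust * error / set_point)  # controller's multiplicative correction

drive = 10.0   # fixed excitatory input; rate = gain * drive in this toy
gain = 1.0     # the neuron starts out driven well above the 5 Hz set point
for _ in range(200):
    gain = homeostatic_step(gain * drive, gain)
final_rate = gain * drive
# After many iterations the rate has relaxed back toward the set point,
# without any external instruction telling the cell what to do.
```

The self-correcting behavior of the loop, with no supervisory signal required, is the sense in which such regulation could appear "automatic" at the systems level.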
It has been suggested that the prefrontal cortex serves in a controller role that maintains the optimal excitatory level in prediction regions of the brain (more details below). In particular prefrontal cortex is strategically situated to enable mechanisms that restore afferent firing rates to a predetermined ‘set point’ via its detailed excitatory and inhibitory extrinsic connections (as reviewed in Arnsten et al. 2012). In this way, prefrontal cortex may orchestrate and coordinate the level of neural excitability in different prediction error brain areas (e.g., hippocampus and striatum) according to homeostatic principles. Thus, prefrontal cortex biases the nature of the outputs of connected brain areas according to experience and recent outcomes of decisions (Fig. 9.1).
It should be noted that although it is reasonable to assume that the prefrontal cortex is a major controller of the impact of prediction error signaling in the brain, other sources of control of cell excitability may arise via direct interconnections between the multiple prediction detection areas of the brain. For example, a prediction error from the hippocampus could be transmitted to midbrain–striatal neurons along pathways that do not necessarily include the prefrontal cortex. Indirect support for this idea comes from observations that conditions that produce error messages in the hippocampus change reward responses of dopamine neurons (Jo et al. 2013; Puryear et al. 2010), that phasic theta comodulation is observed between hippocampus and striatum during decision tasks (DeCoteau et al. 2007a), and that comodulation of neural activity has been reported between prefrontal cortex and parietal cortex (Diwadkar et al. 2000). Perhaps neuromodulators such as dopamine strengthen associations that led to the last correct decision or behavior, and weaken those that led to an incorrect one.
In sum, homeostatic regulatory processes may contribute to the automatic and continuous self-regulatory nature of prediction error analysis, decision-making, and learning. Such a naturally adaptive mechanism may optimize the relative contribution of different types of prediction error signals to future decisions and actions according to the pattern of recent successes and failures in prediction.
Setting the Baseline from Which Prediction Error Signals Emerge: A Role for Hypothalamus, Amygdala, and the Prefrontal Cortex
Individual neurons face a continual barrage of excitatory inputs across tens of thousands of synaptic connections. Yet, neurons cannot maintain high levels of excitability and remain viable in the long term. Fortunately, individual neurons appear to be able to naturally and automatically engage mechanisms that control their level of excitability. This may occur by sensing and controlling the flow of various ions across cell membranes (e.g., Burrone et al. 2002; Turrigiano 2008; Turrigiano et al. 1998; see more detailed description below). Optimal levels of neuronal activity can also be maintained by achieving a relatively constant balance of excitatory and inhibitory synaptic input (e.g., Burrone et al. 2002). Together these factors define the baseline level of tonic activity from which phasic error signals are generated. Interestingly, the tonic level of cell excitability can be set according to the motivational state of an animal (Cagniard et al. 2006; Pecina et al. 2003; Puryear et al. 2010), suggesting that one's motivational state may play a significant role in determining the threshold for phasic neural and behavioral responding to unexpected events.
Motivational state information (e.g., signals of hunger or thirst) may arrive in prediction error structures (e.g., the hippocampus or striatum) and/or their controller (prefrontal cortex) via hypothalamic afferent systems. For example, lateral hypothalamic signals of hunger that reach brain areas that evaluate predictions may increase the subsequent reward-responsiveness of efferent target neurons. Elevated responses to reward may reflect higher subjective values of the reward, an interpretation that is consistent with the biological needs of an animal. The amygdala, on the other hand, is thought to mediate a different motivational variable: the emotional state of animals (Johansen et al. 2011). A message reflecting the current emotional state may emerge from the amygdala's role in associating cues with their aversive consequences (e.g., Chau and Galvez 2012; Paz and Pare 2013). The amygdala likely alters its neural activity in response to fear (Ciocchi et al. 2010; Haubensak et al. 2010; Li et al. 2013). Since the amygdala has direct excitatory effects on SN or VTA neurons (Lee et al. 2005; Zahm et al. 2011), fear-induced amygdala activation may increase the likelihood that dopamine neurons transition to a more excitable "up-state" (Wilson 1993; Wilson and Kawaguchi 1996) when hippocampal messages arrive. In this way, in urgent situations, animals can more readily assess the value of a changed context, since dopamine cells in an "up-state" can respond more quickly to their inputs and responses can be implemented sooner.
In addition to generally biasing the levels of neural excitability (which may translate to biasing the threshold for prediction error signaling), the amygdala may modulate prediction error-based learning efficiency on a trial-by-trial basis. For example, it is known that there is increased attention to cues or rewards that are unexpected or surprising based on past experiences (Pearce and Hall 1980; Rescorla and Wagner 1972). The dopamine system clearly plays a role in surprise-induced enhancement of learning (e.g., Schultz et al. 1997; Schultz and Dickinson 2000), and this may relate to transient influences of the amygdala on dopamine neurons, since amygdala disruption abolishes this prediction error-based learning effect (Holland and Gallagher 2006) without affecting the subsequent expression of surprise-induced enhanced learning (Lee et al. 2008). The amygdala and hypothalamus, then, may orchestrate information processing circuits/systems by ultimately setting the threshold for future error detection via direct connections to prediction error structures such as the hippocampus, striatum, sensory and motor cortex, and the cerebellum.
The prefrontal cortex can also be thought of as regulating the excitability state of neurons in predictive centers of the brain, but for different reasons than both amygdala and hypothalamus. The prefrontal cortex is commonly thought to be important for holding information on-line in a working memory buffer (e.g., Arnsten et al. 2012; Fuster 2008). This function is considered essential for selecting appropriate responses and/or for switching behavioral strategies (Ragozzino et al. 1999a, b; Young and Shapiro 2009), and this interpretation is consistent with findings that transient functional connections exist between the prefrontal cortex and the hippocampus or striatum, especially when working memory is helpful for optimal behaviors. For example, hippocampal and prefrontal theta become co-modulated at times when animals make choices (e.g., Hyman et al. 2005; Shirvalkar et al. 2010), but not at other times during task performance. Co-activation of striatal and prefrontal activity has also been observed when working memory is required for accurate response selection (Levy et al. 1997; Scimeca and Badre 2012). Thus, the functional connections between striatum and prefrontal cortex, or between hippocampus and prefrontal cortex, can vary in strength and impact depending on the current task demands. Presumably this variation reflects the phasic, task-dependent coordination of patterns of excitation and inhibition between prefrontal cortex and its efferent targets. Since the prefrontal cortex is thought to play a role in prediction analysis (e.g., Holroyd et al. 2002), we suggest the possibility that its major contribution is to regulate efferent cell excitability according to recent behavioral outcomes. Indeed, Karlsson et al. (2012) recently showed that prefrontal cortical representations switch states of stability when conditions of greater uncertainty arise, that is, when response outcomes do not occur as predicted. Also, Merchant et al. (2011) suggest that prefrontal cortex exerts "top-down" control over parietal cortical responses in a match-to-sample task.
Prefrontal modulation is made possible by the rather complex pattern of inhibitory and excitatory control over multiple types of neurons (i.e., both interneurons and projection neurons) in efferent prediction brain areas (as reviewed in Arnsten 2011; Arnsten et al. 2012; Khan and Muly 2011), neurons that could then return information to prefrontal cortex about their current activity state. Neocortex has indeed been shown to regulate the excitability states of subcortical neurons (e.g., Calhoon and O'Donnell 2013; Plenz and Arnsten 1996; Plenz and Kitai 1998). During baseline conditions, prefrontal cortex in particular may continually receive information about the current level of neural activity in target regions, and then use these afferent signals to determine the extent and type of excitatory and inhibitory control needed to achieve optimal tonic activity within each of the multiple efferent prediction error systems. If the tonic activity becomes too low, for example at times when there are no prediction error signals, prefrontal cortex may elevate the state of neural excitability so that the prediction cells are more responsive to future error signals, a feature that should increase the speed and accuracy of error signaling. If, on the other hand, the baseline activity of a target region is higher than optimal for the detection of prediction errors, further increasing the excitability of the cells may be detrimental to the cells' health and their ability to produce clear error signals. In this case, it would be most adaptive if the prefrontal cortex lowered the level of excitability of its target cells so that optimal responsivity can be restored in the target region.
Recurrent neurocircuitry within the prefrontal cortex is thought to contribute to its working memory capacity (e.g., Arnsten et al. 2012), and as such this circuit is a clear candidate system to not only integrate error signals arriving from the different prediction error brain regions, but to also bias the thresholds and strengths of subsequent error-related signals from the brain regions that originally produced the error signal. The particular constellation of excitatory and inhibitory biases presumably will result in the most desired behavioral outcome.
In summary, at specific times when working memory is needed, the intrinsic recurrent neural circuits of the prefrontal cortex (Arnsten et al. 2012) may selectively and strategically exploit (differentially or in concert) its rich array of excitatory and inhibitory efferent connections to regulate the probability of neural firing in different prediction areas of the brain such that the relative responsiveness of different prediction brain regions changes in task-dependent ways. When prediction errors are detected and firing rates change, the prefrontal cortex may not only integrate the signal within its recurrent intrinsic circuitry, but it may have a key restorative function in efferent structures such that the firing rates return to a baseline tonic level that optimizes subsequent responsiveness to input. Thus, the prefrontal cortex may bias efferent neurons’ ability to engage in, or efficiently use, prediction error analysis and hence their ability to adaptively guide future behaviors. This process may be a key factor that determines which prediction error-generating (i.e., memory) system ultimately controls the selection of future responses.
Final Comments
Memory research continues to evolve in complexity as new technologies become available. Ray Kesner's conceptualization of the cognitive and neurobiological underpinnings of memory was visionary in that it laid out a multidimensional neural systems view of memory that has provided guidance for a number of decades. Current challenges are to understand why different brain structures have select roles in memory, the nature of the interactions between brain structures, the control mechanisms for those interactions, and the mechanisms by which memory function, in all of its complexities, appears to be self-regulated and automatic. We offer a new hypothesis that the different memory regions of the brain make special contributions to memory because they assess the validity of different types of predictions that are needed for one to continually make adaptive choices and engage in adaptive, goal-directed behaviors. A homeostatic model of memory regulation is described to at least in part account for the seemingly automatic nature of memory function. Thank you, Ray, for guiding the field to this very important and crucial time in memory research, one that promises exponential growth in the near future.
References
Adams, R. A., Shipp, S., & Friston, K. J. (2013). Predictions not commands: Active inference in the motor system. Brain Structure and Function, 218, 611–643.
Adolphs, R., Tranel, D., & Buchanan T. W. (2005). Amygdala damage impairs emotional memory for gist but not details of complex stimuli. Nature Neuroscience, 8, 512–518.
Alexander, G. E., & Crutcher, M. D. (1990a). Functional architecture of basal ganglia circuits: Neural substrates of parallel processing. Trends in Neuroscience, 13, 266–271.
Alexander, G. E., & Crutcher, M. D. (1990b). Preparation for movement: Neural representations of intended direction in three motor areas of the monkey. Journal of Neurophysiology, 64, 133–150.
Alexander, G. E., DeLong, M. R., & Strick, P. L. (1986). Parallel organization of functionally segregated circuits linking basal ganglia and cortex. Annual Review of Neuroscience, 9, 357–381.
Anderson, M. I., & Jeffery, K. J. (2003). Heterogeneous modulation of place cell firing by changes in context. Journal of Neuroscience, 23, 8827–8835.
Arnsten, A. F. T. (2011). Prefrontal cortical network connections: Key site of vulnerability in stress and schizophrenia. International Journal of Developmental Neuroscience, 29, 215–223.
Arnsten, A. F. T., Wang, M. J., & Paspalas, C. D. (2012). Neuromodulation of thought: Flexibilities and vulnerabilities in prefrontal cortical network synapses. Neuron, 76, 223–239.
Baddeley, A., & Della Sala, S. (1996). Working memory and executive control. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 351, 1397–1403.
Bayer, H. M., & Glimcher, P. W. (2005). Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129–141.
Bayley, P.J., Frascino, J.C., & Squire, L.R. (2005). Robust habit learning in the absence of awareness and independent of the medial temporal lobe. Nature, 436, 550–553.
Benchenane, K., Peyrache, A., Khamassi, M., Tierney, P. L., Gioanni, Y., Battaglia, F. P., & Wiener, S. I. (2010). Coherent theta oscillations and reorganization of spike timing in the hippocampal- prefrontal network upon learning. Neuron, 66, 921–936.
Berke, J. D. (2009). Fast oscillations in cortical-striatal networks switch frequency following rewarding events and stimulant drugs. European Journal of Neuroscience, 30, 848–859.
Berke, J. D., Okatan, M., Skurski, J., & Eichenbaum, H. B. (2004). Oscillatory entrainment of striatal neurons in freely moving rats. Neuron, 43, 883–896.
Bethus, I., Tse, D., & Morris, R. G. (2010). Dopamine and memory: modulation of the persistence of memory for novel hippocampal NMDA receptor-dependent paired associates. Journal of Neuroscience, 30, 1610–1618.
Burrone, J., O'Byrne, M., & Murthy, V. N. (2002). Multiple forms of synaptic plasticity triggered by selective suppression of activity in individual neurons. Nature, 420, 414–419.
Buzsaki, G. (2006). Rhythms of the brain. New York: Oxford Press.
Buzsaki, G. (2013). Time, space and memory. Nature, 497, 568–569.
Buzsaki, G., & Moser, E. I. (2013). Memory, navigation and theta rhythm in the hippocampal-entorhinal system. Nature Neuroscience, 16, 130–138.
Cagniard, B., Beeler, J. A., Britt, J. P., McGehee, D. S., Marinelli, M., & Zhuang, X. (2006). Dopamine scales performance in the absence of new learning. Neuron, 51, 541–547.
Calhoon, G. G., & O’Donnell, P. (2013). Closing the gate in the limbic striatum: Prefrontal suppression of hippocampal and thalamic inputs. Neuron, 78, 181–190.
Chau, L. S., & Galvez, R. (2012). Amygdala's involvement in facilitating associative learning-induced plasticity: A promiscuous role for the amygdala in memory acquisition. Frontiers in Integrative Neuroscience, 6, 92.
Chen, J., Olsen, R. K., Preston, A. R., Glover, G. H., & Wagner A. D. (2011). Associative retrieval processes in the human medial temporal lobe: Hippocampal retrieval success and CA1 mismatch detection. Learning and Memory, 18, 523–528.
Chiba, A. A., Kesner, R. P., & Jackson, P. (2002). Two forms of spatial memory: A double dissociation between the parietal cortex and the hippocampus in the rat. Behavioral Neuroscience, 116, 874–883.
Ciocchi, S., Herry, C., Grenier, F., Wolff, S. B., Letzkus, J. J., Vlachos, I., Ehrlich, I., Sprengel, R., Deisseroth, K. I., Stadler, M. B., Muller, C., & Luthi, A. (2010). Encoding of conditioned fear in central amygdala inhibitory circuits. Nature, 468, 277–282.
Curtis, C. E., Rao, V. Y., & D'Esposito, M. (2004). Maintenance of spatial and motor codes during oculomotor delayed response tasks. Journal of Neuroscience, 24, 3944–3952.
DeCoteau, W. E., Thorn, C., Gibson, D. J., Courtemanche, R., Mitra, P., Kubota, Y., & Graybiel, A. M. (2007a). Learning-related coordination of striatal and hippocampal theta rhythms during acquisition of a procedural maze task. Proceedings of the National Academy of Science U S A, 104, 5644–5649.
DeCoteau, W. E., Thorn, C., Gibson, D. J., Courtemanche, R., Mitra, P., Kubota, Y., & Graybiel, A. M. (2007b). Oscillations of local field potentials in the rat dorsal striatum during spontaneous and instructed behaviors. Journal of Neurophysiology, 97, 3800–3805.
Dickerson, K. C., Li, J., & Delgado, M. R. (2011). Parallel contributions of distinct human memory systems during probabilistic learning. Neuroimage, 55, 266–276.
Diwadkar, V. A., Carpenter, P. A., & Just, M. A. (2000). Collaborative activity between parietal and dorso-lateral prefrontal cortex in dynamic spatial working memory revealed by fMRI. Neuroimage, 12, 85–99.
Doll, B. B., Simon, D. A., & Daw, N. D. (2012). The ubiquity of model-based reinforcement learning. Current Opinion in Neurobiology, 22, 1075–1081.
Duhamel, J. R., Colby, C. L., & Goldberg, M. E. (1992). The updating of the representation of visual space in parietal cortex by intended eye movements. Science, 255, 90–92.
Duncan, K., Ketz, N., Inati, S. J., & Davachi, L. (2012a). Evidence for area CA1 as a match/mismatch detector: A high-resolution fMRI study of the human hippocampus. Hippocampus, 22, 389–398.
Duncan, K., Sadanand, A., & Davachi, L. (2012b). Memory’s penumbra: Episodic memory decisions induce lingering mnemonic biases. Science, 337, 485–487.
Engel, A. K., Fries, P., & Singer, W. (2001). Dynamic predictions: Oscillations and synchrony in top-down processing. Nature Reviews Neuroscience, 2, 704–716.
Eschenko, O., & Mizumori, S. J.Y. (2007). Memory influences on hippocampal and striatal neural codes: Effects of a shift between task rules. Neurobiology of Learning and Memory, 87, 495–509.
Fell, J., Klaver, P., Lehnertz, K., Grunwald, T., Schaller, C., Elger, C. E., & Fernandez, G. (2001). Human memory formation is accompanied by rhinal-hippocampal coupling and decoupling. Nature Neuroscience, 4, 1259–1264.
Ferbinteanu, J., & Shapiro, M. L. (2003). Prospective and retrospective memory coding in the hippocampus. Neuron, 40, 1227–1239.
Foerde, K., & Shohamy, D. (2011). Feedback timing modulates brain systems for learning in humans. Journal of Neuroscience, 31, 13157–13167.
Fogassi, L., Ferrari, P. F., Gesierich, B., Rozzi, S., Chersi, F., & Rizzolatti, G. (2005). Parietal lobe: From action organization to intention understanding. Science, 308, 662–667.
Froemke, R. C., Merzenich, M. M., & Schreiner, C. E. (2007). A synaptic memory trace for cortical receptive field plasticity. Nature, 450, 425–429.
Fuster, J. M. (2006). The cognit: A network model of cortical representation. International Journal of Psychophysiology, 60, 125–132.
Fuster, J. M. (2008). The prefrontal cortex (4th ed.). London: Academic Press.
Fuster, J. M. (2009). Cortex and memory: Emergence of a new paradigm. Journal of Cognitive Neuroscience, 21, 2047–2072.
Fyhn, M., Molden, S., Hollup, S., Moser, M. B., & Moser E. (2002). Hippocampal neurons responding to first-time dislocation of a target object. Neuron, 35, 555–566.
Gengler, S., Mallot, H. A., & Holscher C. (2005). Inactivation of the rat dorsal striatum impairs performance in spatial tasks and alters hippocampal theta in the freely moving rat. Behavioral Brain Research, 164, 73–82.
Gilbert, P. E., & Kesner, R. P. (2002). The amygdala but not the hippocampus is involved in pattern separation based on reward value. Neurobiology of Learning and Memory, 77, 338–353.
Goldman-Rakic, P. S. (1995). Cellular basis of working memory. Neuron, 14, 477–485.
Goldman-Rakic, P. S. (1996). The prefrontal landscape: Implications of functional architecture for understanding human mentation and the central executive. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 351, 1445–1453.
Good, M., & Honey, R. C. (1991). Conditioning and contextual retrieval in hippocampal rats. Behavioral Neuroscience, 105, 499–509.
Gray, J. A. (1982). The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system. Oxford: Oxford University Press.
Gray, J. A. (2000). The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system (2nd ed.). Oxford: Oxford University Press.
Graybiel, A. M. (2008). Habits, rituals, and the evaluative brain. Annual Review of Neuroscience, 31, 359–387.
Graybiel, A. M., Aosaki, T., Flaherty, A. W., & Kimura, M. (1994). The basal ganglia and adaptive motor control. Science, 265, 1826–1831.
Groenewegen, H. J., Galis-de Graaf, Y., & Smeets, W. J. (1999). Integration and segregation of limbic cortico-striatal loops at the thalamic level: An experimental tracing study in rats. Journal of Chemical Neuroanatomy, 16, 167–185.
Haber, S. N. (2003). The primate basal ganglia: Parallel and integrative networks. Journal of Chemical Neuroanatomy, 26, 317–330.
Hallworth, N. E., & Bland, B. H. (2004). Basal ganglia—hippocampal interactions support the role of the hippocampal formation in sensorimotor integration. Experimental Neurology, 188, 430–443.
Hasselmo, M. E. (2005a). What is the function of hippocampal theta rhythm?–Linking behavioral data to phasic properties of field potential and unit recording data. Hippocampus, 15, 936–949.
Hasselmo, M. E. (2005b). The role of hippocampal regions CA3 and CA1 in matching entorhinal input with retrieval of associations between objects and context: Theoretical comment on Lee et al. (2005). Behavioral Neuroscience, 119, 342–345.
Hasselmo, M. E., Hay, J., Ilyn, M., & Gorchetchnikov, A. (2002). Neuromodulation, theta rhythm and rat spatial navigation. Neural Networks, 15, 689–707.
Haubensak, W., Kunwar, P. S., Cai, H., Ciocchi, S., Wall, N. R., Ponnusamy, R., Biag, J., Dong, H. W., Deisseroth, K., Callaway, E. M., Fanselow, M. S., Lüthi, A., & Anderson, D. J. (2010). Genetic dissection of an amygdala microcircuit that gates conditioned fear. Nature, 468, 270–276
Holland, P. C., & Gallagher, M. (2006). Different roles for amygdala central nucleus and substantia innominata in the surprise-induced enhancement of learning. Journal of Neuroscience, 26, 3791–3797.
Hollerman, J. R., & Schultz, W. (1998). Dopamine neurons report an error in the temporal prediction of reward during learning. Nature Neuroscience, 1, 304–309.
Hollerman, J. R., Tremblay, L., & Schultz, W. (1998) Influence of reward expectation on behavior-related neuronal activity in primate striatum. Journal of Neurophysiology, 80, 947–963.
Holroyd, C. B., Coles, M. G., & Nieuwenhuis, S. (2002). Medial prefrontal cortex and error potentials. Science, 296, 1610–1611.
Hopkins, R. O., Kesner, R. P., & Goldstein, M. (1995). Memory for novel and familiar temporal spatial and linguistic temporal distance information in hypoxic subjects. International Journal of Neuropsychology, 1, 454–468.
Hopkins, R. O., Waldram, K., & Kesner, R. P. (2004). Sequences assessed by declarative and procedural tests of memory in amnesic patients with hippocampal damage. Neuropsychologia, 42, 1877–1886.
Horvitz, J. C. (2002). Dopamine gating of glutamatergic sensorimotor and incentive motivational input signals to the striatum. Behavioral Brain Research, 137, 65–74.
Humphries, M. D., & Prescott, T. J. (2010). The ventral basal ganglia, a selection mechanism at the crossroads of space, strategy, and reward. Progress in Neurobiology, 90, 385–417.
Hyman, J. M., Zilli, E. A., Paley, A. M., & Hasselmo, M. E. (2005). Medial prefrontal cortex cells show dynamic modulation with the hippocampal theta rhythm dependent on behavior. Hippocampus, 15, 739–749.
Jeffery, K. J., Anderson, M. I., Hayman, R., & Chakraborty, S. (2004). A proposed architecture for the neural representation of spatial context. Neuroscience and Biobehavioral Reviews, 28, 201–218.
Jenkins, T. A., Amin, E., Pearce, J. M., Brown, M. W., & Aggleton, J. P. (2004). Novel spatial arrangements of familiar visual stimuli promote activity in the rat hippocampal formation but not the parahippocampal cortices: A c-fos expression study. Neuroscience, 124, 43–52.
Jo, Y. S., Lee, J., & Mizumori, S. J. Y. (2013). Effects of prefrontal cortical inactivation on neural activity in the ventral tegmental area. Journal of Neuroscience, 33, 8159–8171.
Jog, M. S., Kubota, Y., Connolly, C. I., Hillegaart, V., & Graybiel, A. M. (1999). Building neural representations of habits. Science, 286, 1745–1749.
Johansen, J. P., Cain, C. K., Ostroff, L. E., & LeDoux, J. E. (2011). Molecular mechanisms of fear learning and memory. Cell, 147, 509–524.
Jones, M. W., & Wilson, M. A. (2005). Theta rhythms coordinate hippocampal-prefrontal interactions in a spatial memory task. PLoS Biology, 3, e402.
Khan, Z. U., & Muly, E. C. (2011). Molecular mechanisms of working memory. Behavioral and Brain Research, 219, 329–341.
Kalenscher, T., Lansink, C. S., Lankelma, J. V., & Pennartz, C. M. (2010). Reward-associated gamma oscillations in ventral striatum are regionally differentiated and modulate local firing activity. Journal of Neurophysiology, 103, 1658–1672.
Kametani, H., & Kesner, R. P. (1989). Retrospective and prospective coding of information: Dissociation of parietal cortex and hippocampal formation. Behavioral Neuroscience, 103, 84–89.
Karlsson, M. P., Tervo, D. G. R., & Karpova, A. Y. (2012). Network resets in medial prefrontal cortex mark the onset of behavioral uncertainty. Science, 338, 135–139.
Kesner, R. P. (1980). An attribute analysis of memory: The role of the hippocampus. Physiological Psychology, 8, 189–197.
Kesner, R. P. (1989). Retrospective and prospective coding of information: Role of the medial prefrontal cortex. Experimental Brain Research, 74, 163–167.
Kesner, R. P. (1998). Neurobiological views of memory. In J. L. Martinez & R. P. Kesner (Eds.), The neurobiology of learning and memory (pp. 361–416). San Diego: Academic.
Kesner, R. P. (2009). Tapestry of memory. Behavioral Neuroscience, 123, 1–13.
Kesner, R. P., & DiMattia, B. V. (1987). Neurobiology of an attribute model of memory. In A. R. Morrison & A. N. Epstein (Eds.), Progress in psychobiology and physiological psychology (pp. 207–277). New York: Academic.
Kesner, R. P., & Hopkins, R. O. (2001). Short-term memory for duration and distance in humans: Role of the hippocampus. Neuropsychology, 15, 58–68.
Kesner, R. P., & Hopkins, R. O. (2006). Mnemonic functions of the hippocampus: A comparison between animals and humans. Biological Psychology, 73, 3–18.
Kesner, R. P., & Rogers, L. (2004). An analysis of independence and interactions of brain substrates that subserve multiple attributes, memory systems, and underlying processes. Neurobiology of Learning and Memory, 82, 199–215.
Kesner, R. P., & Williams, J. M. (1995). Memory for magnitude of reinforcement: Dissociation between the amygdala and hippocampus. Neurobiology of Learning and Memory, 64, 237–244.
Kesner, R. P., Farnsworth, G., & DiMattia, B. V. (1989). Double dissociation of egocentric and allocentric space following medial prefrontal and parietal cortex lesions in the rat. Behavioral Neuroscience, 103, 956–961.
Kesner, R. P., Bolland, B., & Dakis, M. (1993). Memory for spatial locations, motor responses, and objects: Triple dissociations among the hippocampus, caudate nucleus and extrastriate visual cortex. Experimental Brain Research, 93, 462–470.
Knowlton, B. J., Mangels, J. A., & Squire, L.R. (1996). A neostriatal habit learning system in humans. Science, 273, 1399–1402.
Kraus, B. J., Robinson II, R. J., White, J. A., Eichenbaum, H., & Hasselmo, M. E. (2013). Hippocampal “time cells”: Time versus path integration. Neuron, 78, 1–12.
Kropf, W., & Kuschinsky, K. (1993). Conditioned effects of apomorphine are manifest in regional EEG of rats both in hippocampus and in striatum. Naunyn-Schmiedeberg’s Archives of Pharmacology, 347, 487–493.
Kuhl, B. A., Shah, A. T., DuBrow, S., & Wagner AD. (2010). Resistance to forgetting associated with hippocampus-mediated reactivation during new learning. Nature Neuroscience, 13, 501–506.
Kumaran, D., & Maguire, E. A. (2007). Match-mismatch processes underlie human hippocampal responses to associative novelty. Journal of Neuroscience, 27, 8517–8524.
Kurkin, S., Akao, T., Shichinohe, N., Fukushima, J., & Fukushima, K. (2011). Neuronal activity in medial superior temporal area (MST) during memory-based smooth pursuit eye movements in monkeys. Experimental Brain Research, 214, 293–301.
Kurth-Nelson, Z., & Redish, A. D. (2009). Temporal-difference reinforcement learning with distributed representations. PLoS One, 4, e7362.
Lee, H. J., Groshek, F., Petrovich, G. D., Cantalini, J. P., Gallagher, M., & Holland, P. C. (2005). Role of amygdalo-nigral circuitry in conditioning of a visual stimulus paired with food. Journal of Neuroscience, 25, 3881–3888.
Lee, H. J., Youn, J. M., Gallagher, M., & Holland, P. C. (2008). Temporally limited role of substantia nigra-central amygdala connections in surprise-induced enhancement of learning. European Journal of Neuroscience, 27, 3043–3049.
Leutgeb, J. K., Leutgeb, S., Treves, A., Meyer, R., Barnes, C. A., McNaughton, B. L., Moser, M. B., & Moser, E. I. (2005a). Progressive transformation of hippocampal neuronal representations in “morphed” environments. Neuron, 48, 345–358.
Leutgeb, S., Leutgeb, J. K., Barnes, C. A., Moser, E. I., McNaughton, B. L., & Moser, M. B. (2005b). Independent codes for spatial and episodic memory in hippocampal neuronal ensembles. Science, 309, 619–623.
Levy, R., Friedman, H. R., Davachi, L., & Goldman-Rakic, P. S. (1997). Differential activation of the caudate nucleus in primates performing spatial and nonspatial working memory tasks. Journal of Neuroscience, 17, 3870–3882.
Li, H., Penzo, M. A., Taniguchi, H., Kopec, C. D., Huang, Z. J., & Li, B. (2013). Experience-dependent modification of a central amygdala fear circuit. Nature Neuroscience, 16, 332–339.
Lisman, J. E., & Grace, A. A. (2005). The hippocampal-VTA loop: Controlling the entry of information into long-term memory. Neuron, 46, 703–713.
Lisman, J. E., & Otmakhova, N. A. (2001). Storage, recall, and novelty detection of sequences by the hippocampus: Elaborating on the SOCRATIC model to account for normal and aberrant effects of dopamine. Hippocampus, 11, 551–568.
Llinas, R. R., & Roy, S. (2009). The ‘predictive imperative’ as the basis for self-awareness. Philosophical Transactions of the Royal Society B Biological Sciences, 364, 1301–1307.
Manns, J. R., Howard, M. W., & Eichenbaum, H. (2007a). Gradual changes in hippocampal activity support remembering the order of events. Neuron, 56, 530–540.
Manns, J. R., Zilli, E. A., Ong, K. C., Hasselmo, M. E., & Eichenbaum, H. (2007b). Hippocampal CA1 spiking during encoding and retrieval: Relation to theta phase. Neurobiology of Learning and Memory, 87, 9–20.
Marder, E., & Goaillard, J. M. (2006). Variability, compensation and homeostasis in neuron and network function. Nature Reviews Neuroscience, 7, 563–574.
Marder, E., & Prinz, A. A. (2003). Current compensation in neuronal homeostasis. Neuron, 37, 2–4.
Mauk, M. D., & Buonomano, D. V. (2004). The neural basis of temporal processing. Annual Review of Neuroscience, 27, 307–340.
McDonald, R. J., & White, N. M. (1993). A triple dissociation of memory systems: Hippocampus, amygdala and dorsal striatum. Behavioral Neuroscience, 107, 371–388.
McNaughton, B. L., Barnes, C. A., & O’Keefe, J. (1983). The contributions of position, direction, and velocity to single unit activity in the hippocampus of freely-moving rats. Experimental Brain Research, 52, 41–49.
McNaughton, B. L., Mizumori, S. J., Barnes, C. A., Leonard, B. J., Marquis, M., & Green, E. J. (1994). Cortical representation of motion during unrestrained spatial navigation in the rat. Cerebral Cortex, 4, 26–39.
Meftah, E.-M., Bourgeon, S., & Chapman, C. E. (2009). Instructed delay discharge in primary and secondary somatosensory cortex within the context of a selective attention task. Journal of Neurophysiology, 101, 2649–2667.
Merchant H., Crowe, D. A., Robertson, M. S., Fortes, A. F., & Georgopoulos, A. P. (2011). Top-down spatial categorization signal from prefrontal to posterior parietal cortex in the primate. Frontiers in Systems Neuroscience, 5, 1–9.
Milner, B. (2005). The medial temporal-lobe amnesic syndrome. Psychiatric Clinics of North America, 28, 599–611.
Mizumori, S. J. Y. (2008a). A context for hippocampal place cells during learning. In S. J. Y. Mizumori (Ed.), Hippocampal place fields: Relevance to learning and memory (pp. 16–43). New York: Oxford University Press.
Mizumori, S. J. Y. (2008b). Hippocampal place fields: Relevance to learning and memory. New York: Oxford University Press.
Mizumori, S. J. Y., & Jo, Y. S. (2013). Homeostatic regulation of memory systems and adaptive decisions. Hippocampus, 23(11), 1103–1124.
Mizumori, S. J. Y., Ragozzino, K. E., Cooper, B. G., & Leutgeb, S. (1999). Hippocampal representational organization and spatial context. Hippocampus, 9, 444–451.
Mizumori, S. J. Y., Cooper, B. G., Leutgeb, S., & Pratt, W. E. (2000a). A neural systems analysis of adaptive navigation. Molecular Neurobiology, 21, 57–82.
Mizumori, S. J. Y., Ragozzino, K. E., & Cooper, B. G. (2000b). Location and head direction representation in the dorsal striatum of rats. Psychobiology, 28, 441–462.
Mizumori, S. J. Y., Yeshenko, O., Gill, K., & Davis, D. (2004). Parallel processing across neural systems: Implications for a multiple memory systems hypothesis. Neurobiology of Learning and Memory, 82, 278–298.
Mizumori, S. J. Y., Smith, D.M., & Puryear, C.B. (2007a). Hippocampal and neocortical interactions during context discrimination: electrophysiological evidence from the rat. Hippocampus, 17, 851–862.
Mizumori, S. J. Y., Smith, D. M., & Puryear, C. B. (2007b). A role for place representation in episodic memory. In J. L. Martinez & R. P. Kesner (Eds.), Neurobiology of learning and memory (2nd ed., pp. 155–189). Burlington: Academic.
Mizumori S. J. Y., Puryear, C. B., & Martig, A. K. (2009). Basal ganglia contributions to adaptive navigation. Behavioural Brain Research, 199, 32–42.
Moita, M. A., Rosis, S., Zhou, Y., LeDoux, J. E., & Blair, H. T. (2004). Putting fear in its place: Remapping of hippocampal place cells during fear conditioning. Journal of Neuroscience, 24, 7015–7023.
Muller, R. U., & Kubie, J. L. (1987). The effects of changes in the environment on the spatial firing of hippocampal complex-spike cells. Journal of Neuroscience, 7, 1951–1968.
Nadel, L. (2008). The hippocampus and context revisited. In S. J. Y. Mizumori (Ed.), Hippocampal place fields: Relevance to learning and memory (pp. 3–15). New York: Oxford University Press.
Nadel, L., & Payne, J. D. (2002). The hippocampus, wayfinding and episodic memory. In P. E. Sharp (Ed.), The neural basis of navigation: Evidence from single cell recording (pp. 235–248). Boston: Kluwer.
Nadel, L., & Wilner, J. (1980). Context and conditioning: a place for space. Physiological Psychology, 8, 218–228.
Nicola, S. M., Yun, I. A., Wakabayashi, K. T., & Fields, H. L. (2004). Firing of nucleus accumbens neurons during the consummatory phase of a discriminative stimulus task depends on previous reward predictive cues. Journal of Neurophysiology, 91, 1866–1882.
Nudo, R. J. (2011). Neural bases of recovery after brain injury. Journal of Communication Disorders, 44, 515–520.
O’Keefe, J. (1976). Place units in the hippocampus of the freely moving rat. Experimental Neurology, 51, 78–109.
O’Keefe, J., & Dostrovsky, J. (1971). The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat. Brain Research, 34, 171–175.
O’Keefe, J., & Nadel, L. (1978). The hippocampus as a cognitive map. Oxford: Oxford University Press.
O’Keefe, J., & Recce, M. L. (1993). Phase relationship between hippocampal place units and the EEG theta rhythm. Hippocampus, 3, 317–330.
Olton, D. S., Branch, M., & Best, P. J. (1978). Spatial correlates of hippocampal unit activity. Experimental Neurology, 58, 387–409.
Packard, M. G., & McGaugh, J. L. (1996). Inactivation of hippocampus or caudate nucleus with lidocaine differentially affects expression of place and response learning. Neurobiology of Learning and Memory, 65, 65–72.
Packard, M. G., Hirsh, R., & White, N. M. (1989). Differential effects of fornix and caudate nucleus lesions on two radial maze tasks: Evidence for multiple memory systems. Journal of Neuroscience, 9, 1465–1472.
Pastalkova, E., Itskov, V., Amarasingham, A., & Buzsaki, G. (2008). Internally generated cell assembly sequences in the rat hippocampus. Science, 321, 1322–1327.
Paulsen, O., & Moser, E. I. (1998). A model of hippocampal memory encoding and retrieval: GABAergic control of synaptic plasticity. Trends in Neuroscience, 21, 273–278.
Paz, R., & Pare, D. (2013). Physiological basis for emotional modulation of memory circuits by the amygdala. Current Opinion in Neurobiology, 23, 1–6.
Pearce, J. M., & Hall. G. (1980). A model for Pavlovian learning: Variations in the effectiveness of conditioned but not of unconditioned stimuli. Psychological Review, 87, 532–552.
Pecina, S., Cagniard, B., Berridge, K. C., Aldridge J. W., & Zhuang, X. (2003). Hyperdopaminergic mutant mice have higher ‘wanting’ but not ‘liking’ for sweet rewards. Journal of Neuroscience, 23, 9395–9402.
Pennartz, C. M., Berke, J. D., Graybiel, A. M., Ito, R., Lansink, C. S., van der Meer, M., Redish, A. D., Smith, K. S., & Voorn, P. (2009). Corticostriatal interactions during learning, memory processing, and decision making. Journal of Neuroscience, 29, 12831–12838.
Penner, M. R., & Mizumori, S. J. Y. (2012a). Age-associated changes in the hippocampal-ventral striatum-ventral tegmental loop that impact learning, prediction and context discrimination. Frontiers in Aging Neuroscience, 4, 1–12.
Penner, M. R., & Mizumori, S. J. Y. (2012b). Neural systems analysis of decision making during goal-directed navigation. Progress in Neurobiology, 96, 96–135.
Plenz, D., & Aertsen, A. (1996). Neural dynamics in cortex–striatum cocultures. I. Anatomy and electrophysiology of neuronal cell types. Neuroscience, 70, 861–891.
Plenz, D., & Kitai, S. T. (1998). Up and down states in striatal medium spiny neurons simultaneously recorded with spontaneous activity in fast-spiking interneurons studied in cortex-striatum-substantia nigra organotypic cultures. Journal of Neuroscience, 18, 266–283.
Puryear, C. B., King, M., & Mizumori, S. J. Y. (2006). Specific changes in hippocampal spatial codes predict spatial working memory performance. Behavioral Brain Research, 169, 168–175.
Puryear, C. B., Kim, M. J., & Mizumori, S. J. Y. (2010). Conjunctive encoding of reward and movement by ventral tegmental area neurons: Contextual control during adaptive spatial navigation. Behavioral Neuroscience, 124, 234–247. (Erratum in: Behav Neurosci. 2010 Jun;124(3):336).
Ragozzino, M. E., Wilcox, C., Raso, M., & Kesner, R. P. (1999a). Involvement of rodent prefrontal cortex subregions in strategy switching. Behavioral Neuroscience, 113, 32–41.
Ragozzino, M. E., Detrick, S., & Kesner, R. P. (1999b). Involvement of the prelimbic infralimbic areas of the rodent prefrontal cortex in behavioral flexibility for place and response learning. Journal of Neuroscience, 19, 4585–4594.
Ragozzino, K. E., Leutgeb, S., & Mizumori, S. J. Y. (2001). Conditional coupling of dorsal striatal head direction and hippocampal place representations during spatial navigation. Experimental Brain Research, 139, 372–376.
Ragozzino, M. E., Ragozzino, K. E., Mizumori, S. J. Y., & Kesner, R. P. (2002). The role of the dorsomedial striatum in behavioral flexibility for response and visual cue discrimination learning. Behavioral Neuroscience, 116, 105–115.
Ranck, J. B., Jr. (1973). Studies on single neurons in dorsal hippocampal formation and septum in unrestrained rats. I. Behavioral correlates and firing repertoires. Experimental Neurology, 41, 461–531.
Redish, A. D. (1999). Beyond the cognitive map: From place cells to episodic memory. Cambridge: MIT Press.
Rescorla, R. A., & Wagner, A. R. (1972). A theory of Pavlovian conditioning: Variations in the effectiveness of reinforcement and nonreinforcement. In A. H. Black & W. F. Prokasy (Eds.), Classical conditioning II: Current research and theory (pp. 64–99). New York: Appleton Century Crofts.
Robinson, T. E., & Kolb, B. (2004). Structural plasticity associated with exposure to drugs of abuse. Neuropharmacology, 47(Suppl 1), 33–46.
Sabatino, M., Ferraro, G., Liberti, G., Vella, N., & La Grutta, V. (1985). Striatal and septal influence on hippocampal theta and spikes in the cat. Neuroscience Letters, 61, 55–59.
Save, E., Buhot, M. C., Foreman, N., & Thinus-Blanc, C. (1992a). Exploratory activity and response to a spatial change in rats with hippocampal or posterior parietal cortical lesions. Behavioral Brain Research, 47, 113–127
Save, E., Poucet, B., Foreman, N., & Buhot, M. C. (1992b). Object exploration and reactions to spatial and nonspatial changes in hooded rats following damage to parietal cortex or hippocampal formation. Behavioral Neuroscience, 106, 447–456.
Scheidt, R. A., Zimbelman, J. L., Salowitz, N. M. G., Suminski, A. J., Mosier, K. M., Houk, J., & Simo, L. (2012). Remembering forward: Neural correlates of memory and prediction in human motor adaptation. Neuroimage, 59, 582–600.
Schultz, W. (2010). Dopamine signals for reward value and risk: Basic and recent data. Behavioral and Brain Functions, 6, 24.
Schultz, W., & Dickinson, A. (2000). Neuronal coding of prediction errors. Annual Review of Neuroscience, 23, 473–500.
Schultz, W., Dayan, P., & Montague, P. R. (1997). A neural substrate of prediction and reward. Science, 275, 1593–1599.
Scimeca, J. M., & Badre, D. (2012). Striatal contributions to declarative memory retrieval. Neuron, 75, 380–392.
Shirvalkar, P. R., Rapp, P. R., & Shapiro, M. L. (2010). Bidirectional changes to hippocampal theta-gamma comodulation predict memory for recent spatial episodes. Proceedings of the National Academy of Sciences U S A, 107, 7054–7059.
Siapas, A. G., Lubenov, E. V., & Wilson, M. A. (2005). Prefrontal phase locking to hippocampal theta oscillations. Neuron, 46, 141–151.
Smith, D. M., & Mizumori, S. J. Y. (2006a). Hippocampal place cells, context and episodic memory. Hippocampus, 16, 716–729.
Smith, D. M., & Mizumori, S. J. Y. (2006b). Learning-related development of context-specific neuronal responses to places and events: The hippocampal role in context processing. Journal of Neuroscience, 26, 3154–3163.
Smith, D. M., Wakeman, D., Patel, J., & Gabriel, M. (2004). Fornix lesions impair context-related cingulothalamic neuronal patterns and concurrent discrimination learning. Behavioral Neuroscience, 118, 1225–1239.
Snyder, L. H., Batista, A. P., & Andersen, R. A. (1997). Coding of intention in the posterior parietal cortex. Nature, 386, 167–170.
Stalnaker, T. A., Calhoon, G. G., Ogawa, M., Roesch, M. T., & Schoenbaum, G. (2012). Reward prediction error signaling in posterior dorsomedial striatum is action specific. Journal of Neuroscience, 32, 10296–10305.
Tabuchi, E. T., Mulder, A. B., & Wiener, S. I. (2000). Position and behavioral modulation of synchronization of hippocampal and accumbens neuronal discharges in freely moving rats. Hippocampus, 10, 717–728.
Tanaka, H., Sejnowski, T. J., & Krakauer, J. W. (2009). Adaptation to visuomotor rotation through interaction between posterior parietal and motor cortical areas. Journal of Neurophysiology, 102, 2921–2932.
Tse, D., Langston, R. F., Kakeyama, M., Bethus, I., Spooner, P. A., Wood, E. R., Witter, M. P., & Morris, R. G. (2007). Schemas and memory consolidation. Science, 316, 76–82.
Tulving, E. (2002). Episodic memory: From mind to brain. Annual Review of Psychology, 53, 1–25.
Turrigiano, G. G. (1999). Homeostatic plasticity in neuronal networks: The more things change, the more they stay the same. Trends in Neuroscience, 22, 221–227.
Turrigiano, G. G. (2008). The self-tuning neuron: Synaptic scaling of excitatory synapses. Cell, 135, 422–435.
Turrigiano, G. G. (2011). Homeostatic synaptic plasticity: Local and global mechanisms for stabilizing neuronal function. Cold Spring Harbor Perspectives in Biology, 4, a005736.
Turrigiano, G. G., & Nelson, S. B. (2004). Homeostatic plasticity in the developing nervous system. Nature Reviews Neuroscience, 5, 97–107.
Turrigiano, G. G., Leslie, K. R., Desai, N. S., Rutherford, L. C., & Nelson, S. B. (1998). Activity-dependent scaling of quantal amplitude in neocortical neurons. Nature, 391, 892–896.
van der Meer, M. A., & Redish, A. D. (2009). Low and high gamma oscillations in rat ventral striatum have distinct relationships to behavior, reward, and spiking activity on a learned spatial decision task. Frontiers in Integrative Neuroscience, 3, 9.
Vanderwolf, C. H. (1969). Hippocampal electrical activity and voluntary movement in the rat. Electroencephalography and Clinical Neurophysiology, 26, 407–418.
Varela, F., Lachaux, J. P., Rodriguez, E., & Martinerie, J. (2001). The brainweb: Phase synchronization and large-scale integration. Nature Reviews Neuroscience, 2, 229–239.
Vinogradova, O. S. (1995). Expression, control, and probably functional significance of the neuronal theta-rhythm. Progress in Neurobiology, 45, 523–583.
Voorn, P., Vanderschuren, L. J., Groenewegen, H. J., Robbins, T. W., & Pennartz, C. M. (2004). Putting a spin on the dorsal-ventral divide of the striatum. Trends in Neuroscience, 27, 468–474.
Walsh, M. M., & Anderson, J. R. (2012). Learning from experience: Event-related potential correlates of reward processing, neural adaptation, and behavioral choice. Neuroscience and Biobehavioral Reviews, 36, 1870–1884.
Watson, K. K., & Platt, M. L. (2008). Neuroethology of reward and decision making. Philosophical Transactions of the Royal Society of London B, 363, 3825–3835.
Wilson, C. J. (1993). The generation of natural firing patterns in neostriatal neurons. Progress in Brain Research, 99, 277–298.
Wilson, C. J., & Kawaguchi, Y. (1996). The origins of two-state spontaneous membrane potential fluctuations of neostriatal spiny neurons. Journal of Neuroscience, 16, 2397–2410.
Womelsdorf, T., Schoffelen, J. M., Oostenveld, R., Singer, W., Desimone, R., Engel, A. K., & Fries, P. (2007). Modulation of neuronal interactions through neuronal synchronization. Science, 316, 1609–1612.
Wood, E. R., Dudchenko, P. A., & Eichenbaum H. (1999). The global record of memory in hippocampal neural activity. Nature, 397, 613–616.
Yeshenko, O., Guazzelli, A., & Mizumori, S. J. (2004). Context-dependent reorganization of spatial and movement representations by simultaneously recorded hippocampal and striatal neurons during performance of allocentric and egocentric tasks. Behavioral Neuroscience, 118, 751–769.
Yin, H. H., & Knowlton, B. J. (2006). The role of the basal ganglia in habit formation. Nature Reviews Neuroscience, 7, 464–476.
Young, J. J., & Shapiro, M. L. (2009). Double dissociation and hierarchical organization of strategy switches and reversals in the rat PFC. Behavioral Neuroscience, 123, 1028–1035.
Zahm, D. S., Cheng, A. Y., Lee, T. J., Ghobadi, C. W., Schwartz, Z. M., Geisler, S., Parsely, K. P., Gruber, C., & Veh, R. W. (2011). Inputs to the midbrain dopaminergic complex in the rat, with emphasis on extended amygdala-recipient sectors. Journal of Comparative Neurology, 519, 3159–3188.
Acknowledgements
The data described in this chapter were generated with grant support over the past 20 years (most recently NIH grant MH58755). The current conceptual framework evolved over that time, and it was based on discussions with a number of graduate students and postdoctoral students. Most recently, these included Wambura Fobbs, Katy Gill, Yong Sang Jo, Adria Martig, Sujean Oh, Corey Puryear, David Smith, Valerie Tryon, and Oxana Yeshenko.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this chapter
Mizumori, S. (2016). Self Regulation of Memory Processing Centers of the Brain. In: Jackson, P., Chiba, A., Berman, R., Ragozzino, M. (eds) The Neurobiological Basis of Memory. Springer, Cham. https://doi.org/10.1007/978-3-319-15759-7_9
Print ISBN: 978-3-319-15758-0
Online ISBN: 978-3-319-15759-7