Core Message

Despite significant advances in HIV/AIDS research, the disease still affects millions of people worldwide. The psychosocial environment of the patient plays an important role in disease progression. Psychological stress, mental health problems and lack of social support contribute to a poor prognosis, particularly in patients with prior exposure to these risk factors. Early life stress is known to affect mental health and to modulate neuroendocrine and immune function long term, influencing an individual's vulnerability to adult stress and compromising later health status. This increased susceptibility to the adverse effects of stress may in turn accelerate HIV disease progression. Understanding the possible interactions between the early life experiences of an infected individual and their ability to cope with the diagnosis and health consequences of HIV infection may shed light on the underlying biological mechanisms contributing to disease progression and thus help improve current therapeutic strategies.

10.1 Introduction

The pathogenesis of human immunodeficiency virus (HIV)-1 is highly variable, and individual differences exist in the progression of the disease. Following primary infection and the spread of HIV-1 among CD4+ cells, replication of the virus is controlled by the cytotoxic T-cell response [1] and HIV-1-specific antibodies [2, 3]. Other immune responses against HIV-1 include activation of natural killer (NK) cells that control viral replication [4] and enhance the cytotoxic T-cell response [5]. The T helper (Th) 1 response is also essential in the control of viral spread via the production of cytokines, such as interleukin (IL)-2, IL-12 and interferon (IFN)-γ, which are essential for the development of cytotoxic T cells and promote cellular immunity [6]. These initial antiviral responses inhibit replication but do not deplete the infected cells. After the acute antiviral response, which lasts between 2 and 4 weeks, the viral load set point is established; that is, the replication of the virus and its clearance reach an equilibrium [7]. Higher plasma viral RNA levels typically predict more rapid progression to acquired immunodeficiency syndrome (AIDS) [8, 9]. After the set point is established, a clinical latency of the disease, during which an infected individual remains asymptomatic, may last for years, with a median of 10 years, until the onset of the clinical conditions that define AIDS [10], the most prominent of which is the decline in the absolute number of CD4+ T cells below ~200/μl of peripheral blood [11]. A rapid decline in CD4+ T-cell counts is often followed by increased plasma viral load and a decline in HIV antibody production [1, 12]. Other immunological parameters affected by the progression of the disease include diminished NK cell levels and cytotoxicity [5], as well as a critical alteration in the cytokine balance and a shift towards Th2 immune responses, inhibiting the production of Th1 cytokines and cellular immunity [6]. Prior to the introduction of antiretroviral therapy, life expectancy after the diagnosis of AIDS was, on average, up to 2 years [13]. The development of antiretroviral therapies has substantially prolonged the life expectancy of patients and dramatically decreased the incidence of AIDS [14]. However, even the most effective treatment cannot fully eradicate the virus, which continues to replicate in lymphoid organs and the central nervous system. Over time, the virus undergoes mutation, and any treatment may eventually be rendered ineffective [15].
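As an illustrative aside, and not an analysis drawn from the studies cited above, the notion of a set point as an equilibrium between viral replication and clearance can be formalised with the standard target-cell-limited model of viral dynamics, in which uninfected CD4+ target cells (T), productively infected cells (I) and free virus (V) interact; the parameter symbols below follow the usual textbook convention and are introduced here only for illustration:

\frac{dT}{dt} = \lambda - d_T T - \beta T V, \qquad \frac{dI}{dt} = \beta T V - \delta I, \qquad \frac{dV}{dt} = p I - c V

Here λ is the production rate of target cells, d_T their death rate, β the infection rate constant, δ the death rate of infected cells, p the virion production rate per infected cell and c the clearance rate of free virus. Setting the derivatives to zero gives the steady-state (set point) viral load

V^{*} = \frac{\lambda p}{c\,\delta} - \frac{d_T}{\beta},

the level at which production and clearance of virions balance; in this caricature, higher set points correspond to the higher steady-state plasma RNA levels that, as noted above, predict more rapid progression to AIDS.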

Although individual differences in the viral load set point play an important role in the rate of disease progression, the progression rate is not solely dependent on the established set point. Other factors may influence it, such as the psychosocial environment and its impact on the immune status of the patient. An HIV/AIDS diagnosis is undoubtedly a traumatic event. Like any other chronic illness, it places a constant emotional and economic burden on the individual. But in addition to this expected stressful impact, an HIV/AIDS diagnosis is often accompanied by a negative social stigma. The infected individual now has to deal not only with the illness but also with potentially judgemental attitudes from friends, family and even strangers. Even with the current increased awareness of the origins and transmission routes of the disease, HIV-infected individuals still face a constant threat of social isolation, strongly negative attitudes, and even discrimination and the denial of basic civil rights [16,17,18]. As a result, the psychological quality of life of patients may be substantially reduced.

There is a large and continually growing literature demonstrating that the psychosocial environment can play a vital role in the health status of HIV-1 patients. Stressful events, lack of positive social support and the onset of disorders such as anxiety and depression may impair the already vulnerable immune status and impact upon HIV-1 disease progression, even in the presence of antiretroviral therapies [19, 20]. Herein, we review the existing literature on several psychosocial factors that have been associated with the pathogenesis of HIV and discuss their potential biological mediators. We also aim to introduce a new model of HIV/AIDS disease progression, which takes into account preexisting conditions that may affect the progression rate and health consequences of the disease. These are early life factors that alter the internal milieu of an organism and may contribute to the interrelationship that exists among the immune system, mood and behaviour, and ultimately modulate disease progression.

10.2 Psychosocial Determinants of HIV Progression

10.2.1 Stressful Life Events and Coping Strategies

There is growing evidence demonstrating the deleterious impact of stress on health status in a variety of diseases [21, 22]. In the context of HIV, significantly increased depletion of CD4+ T cells and the emergence of other AIDS-defining conditions were found in several longitudinal studies assessing the impact of stressful and traumatic life events among infected individuals, such as job loss, relationship breakup and financial difficulties [19, 23]. Other studies have shown that high levels of distress were associated with alterations in other immunological parameters in HIV-infected individuals, including decreased numbers of memory T helper cells and B cells [24] and impaired NK cell activity [25], which are indicative of the decreased cellular immunity associated with accelerated HIV progression [6].

It appears that, rather than exposure to the stressful event itself, it is the subjective perception of the event and the coping strategies utilised that produce the most profound detrimental impact on physiological responses [26]. A meta-analytic review of 36 articles investigating the impact of the psychosocial environment on HIV/AIDS disease progression revealed that personality types or coping strategies and psychological distress were more strongly associated with accelerated HIV disease progression than the stressor per se. All of the health-related outcomes, such as CD4+ T-cell levels, the diagnosis of AIDS, the stage of the disease and HIV/AIDS symptoms, were significantly affected by these adverse psychosocial factors [26]. Negative or pessimistic perceptions of the stressful event were found to be correlated with an accelerated decline in CD4+ cell levels [27], increased plasma viral load [28] and decreased NK cell cytotoxicity, as well as decreased levels of cytotoxic T cells [29]. In contrast, a positive, proactive approach has been associated with slower disease progression [30]. Unrealistic optimism, however, reflecting denial and a passive coping style, was found to be harmful and has been associated with decreased CD4+ T-cell levels [31]. Another study indicated that a direct and active approach to coping with HIV/AIDS-related stress is associated with better psychological adjustment [32].

10.2.2 Social Support and Relationships

Social support and meaningful relationships can mitigate the detrimental effects of stress by providing emotional, physical or financial support [33]. As such, social support has been shown to be important in the context of HIV disease, with the lack of a positive social network generally being associated with an accelerated rate of HIV progression [34]. Several early studies reported this phenomenon, demonstrating that greater satisfaction with social support was associated with slower progression to AIDS, as indicated by CD4+ lymphocyte counts [19, 23]. Perceived social support has been reported to be associated with greater adjustment and coping, as well as with greater self-perceived quality of life, among HIV-infected individuals [35], implicating the stress-buffering role of social support in coping with chronic illness.

There is, however, an important distinction between the impact of social support in the context of HIV infection and in other chronic illnesses (e.g. cancer). Provision of social support requires the infected individual to disclose their HIV status, which may ultimately lead to rejection and a potential loss of support, owing to the negative reactions associated with the stigma of the disease [36]. This often results in a lower rate of disclosure of HIV infection than of other diseases. Nevertheless, it appears that individuals with HIV/AIDS who choose to disclose their HIV status to family and friends receive greater social support from these meaningful relationships and perceive less emotional distress [18]. The stage of the disease is likely to be a valuable indicator of the perceived benefit of social support, with larger social networks and greater support predicting longevity among individuals at a later stage of HIV disease [37].

Along with the beneficial impact of positive social factors on health status, there are possible detrimental effects of a negative social environment. For instance, the nature of social relationships has been found to predict survival: support from some social networks may encourage sexual, drug-related and other HIV risk behaviours and therefore have a negative impact [38]. Other factors, such as sexual orientation, may play an important role in predicting the benefits of social relationships and support. Disclosure of homosexual identity was found to be associated with high CD4+ T lymphocyte counts only when HIV-seropositive gay and bisexual men received high levels of social support [39]. Concealment of sexual identity, reflective of psychosocial inhibition and the use of passive coping strategies, has been reported to have a negative impact on HIV-1 disease progression and to increase the risk of other infectious diseases [40,41,42]. Stability of social relationships is yet another important factor. Studies of rhesus macaques experimentally infected with the simian immunodeficiency virus (SIV) indicated a complex impact of social relationships on SIV pathogenesis, with random and unstable social interactions, as opposed to stable social hierarchies, having a negative influence on disease progression and survival [43].

10.2.3 Mental Illness and Cognitive Decline

Unsurprisingly, depression is a common occurrence among HIV-1-positive individuals. Unfortunately, the development of major depression or depressed mood can negatively impact the progression of HIV in patients. A number of studies have reported that depressive states in HIV patients correlate with an increased risk of rapid disease progression [19, 44, 45]. Chronic depression is typically associated with accelerated HIV disease progression, including increased morbidity and mortality [20], underscoring the benefits of early detection of depression in these patients. Even with the use of highly active antiretroviral therapy (HAART), chronic depression was associated with rapid progression of HIV [46]. Although antidepressants have been shown to affect immune function in patients with depression [47], tricyclic antidepressants and selective serotonin reuptake inhibitors (SSRIs) do not appear to have an impact on CD4+ T-cell counts or plasma viral load [48, 49]. These drugs may, however, improve adherence to antiretroviral therapy [50].

Recent preclinical studies have provided support for the link between HIV-1 and depression. Increased activation of indoleamine 2,3-dioxygenase (IDO), the rate-limiting enzyme of the kynurenine pathway of tryptophan metabolism, which diverts tryptophan away from serotonin synthesis and yields metabolites such as the excitotoxic NMDA receptor agonist quinolinic acid, has been implicated in inflammation-induced depression [51, 52] and is now considered to be a potential contributing factor to the onset of depression associated with HIV infection [53]. Depressive-like behaviour has been reported in HIV-1 Tat protein-treated mice; increased expression of IDO was found in the hippocampus of these animals, mediated via activation of the p38 mitogen-activated protein kinase (MAPK) inflammatory pathway [54]. Several studies have shown that the MAPK signalling pathway can positively regulate replication of HIV-1 [55, 56]. Activation of IDO also inhibits T-cell proliferation via catabolism of tryptophan into its metabolite kynurenine. In the context of HIV, increased IDO-mediated tryptophan catabolism has been reported [57]. In rodents, inhibition of HIV-induced IDO activation has been shown to result in an increased cytotoxic T-cell response and depletion of infected cells [58]. In vitro studies have indicated that increased expression of IDO mRNA is present in peripheral blood mononuclear cells of HIV-infected patients, and inhibition of IDO in these cells resulted in increased proliferation of CD4+ lymphocytes [59]. Given the robust immunosuppressive activity of IDO pathways, further investigation of therapeutic approaches aimed at controlling this mechanism in the context of HIV infection is required.

Despite the apparent lack of impact of commonly used antidepressants on the biological indices of HIV disease progression [48, 49], there is evidence that psychological interventions aimed at reducing psychological distress may improve depression and anxiety symptoms as well as HIV-related health outcomes [46, 60]. These effects imply that the serotonergic pathways most prominently targeted by antidepressants are not the only pathways affected in HIV-infected and depressed individuals, and that other biobehavioural mediators of psychosocial effects on HIV disease progression might be involved.

In addition to mental illness, individuals living with HIV tend to exhibit cognitive impairments. Before antiretroviral therapies became available, a majority of HIV-infected individuals with advanced infection suffered from HIV-associated dementia (HAD), also known as Neuro-AIDS. HAD is a severe neurological impairment that includes motor dysfunction and loss of dexterity and coordination [61]. The introduction of HAART dramatically decreased the prevalence of HAD. However, although HAD is now rare, HIV-associated mild cognitive impairment remains common, leading to significant neurocognitive decline in otherwise virologically stable HIV-infected patients [62, 63]. With the significantly increased survival rate and extended life span of HIV-infected individuals in the era of antiretroviral therapies, age is also a significant contributor to HIV-associated cognitive disturbances. The incidence of cognitive decline is three times greater in HIV-infected individuals over the age of 50 [64], further accelerating cognitive ageing [65].

Despite significant advances in knowledge, the cellular and molecular mechanisms underlying HIV-associated neurocognitive decline have not been fully elucidated. However, several possible pathways have been proposed. First, it is evident that HIV itself can invade the brain, causing neurodegeneration, synaptic and dendritic damage, as well as pathological activation of astrocytes and microglia [66, 67]. HIV entry into the central nervous system (CNS) is mediated by infected lymphocytes and monocytes or through trafficking of cell-free virus (reviewed in [68]). HIV infection of the CNS leads to the loss of pericytes, affecting endothelial integrity and inducing breakdown of the blood–brain barrier (BBB) [69, 70]. These processes further facilitate the ability of cell-free HIV to cross the BBB and enter the brain [71]. CNS infiltration initiates neuroinflammatory responses, leading to robust astrocytosis and microgliosis, even in the presence of antiretroviral therapy, primarily in brain regions particularly associated with cognitive function, such as the hippocampus [72, 73]. In addition, activated astrocytes and microglia release proinflammatory cytokines and chemokines, contributing to neuronal damage and loss of myelin integrity [74]. Although the presence of HIV in the CNS is essential for the development of HIV-associated cognitive decline, it is still unknown whether increasing amounts of HIV in the CNS are positively correlated with the severity of cognitive decline [75,76,77]. Moreover, the effectiveness of CNS penetration by antiretroviral agents does not appear to be associated with improved neurocognitive outcomes in HIV-infected patients [78, 79]. Cognitive impairments, psychological distress and depression in HIV patients further contribute to rapid HIV disease progression [45, 78]. More research into effective measures to prevent and treat CNS HIV infection and the associated neurocognitive decline and mental illness is therefore imperative.

10.3 Physiological Systems Underlying Stress and Psychosocial Well-Being

The psychosocial factors discussed above have been demonstrated to impact on disease progression in HIV-1 patients. This is due to the underlying physiological systems involved in regulating, interpreting and processing stressful situations, social interactions and other psychosocial factors. The physiological signalling systems that mediate psychosocial processes have become the focus of several lines of research, not least because they are known to interact closely with the immune system. In particular, the neural–endocrine–immune relationship has attracted great interest owing to its capacity to modulate the antiviral response. The two major systems involved in stress responsiveness and regulation are the hypothalamic–pituitary–adrenal (HPA) axis and the autonomic nervous system (ANS). In this section, we outline how these biological processes can impact HIV-1 status.

10.3.1 The Role of the HPA Axis in the Disease Progression

Exposure to physiological or psychological stress activates the HPA axis: corticotropin-releasing hormone (CRH) and arginine vasopressin (AVP) are released from the paraventricular nucleus (PVN) of the hypothalamus, stimulating CRH and AVP receptors in the anterior pituitary and the subsequent release of adrenocorticotropic hormone (ACTH), which in turn stimulates the production and release of glucocorticoids (cortisol in humans and corticosterone in rodents) from the adrenal cortex. Inhibition of the HPA axis response to stress is induced via the binding of glucocorticoids to glucocorticoid and mineralocorticoid receptors (GRs/MRs) in various brain regions, primarily the hippocampus, hypothalamus and pituitary. This activates a negative feedback loop and a return to homeostasis [80]. In this way, the HPA axis provides an effective stress-control mechanism by stimulating a relatively brief release of glucocorticoids. The HPA axis is also activated upon exposure to immunological stress. Immune activation initiates the HPA axis response, primarily mediated by increased production of proinflammatory cytokines (such as tumour necrosis factor (TNF)-α, IL-1α and β, and IL-6) [81]. The resulting acute increase in glucocorticoid levels has adaptive anti-inflammatory properties that alleviate the immune activation. However, if the exposure to stress is prolonged, or if the adaptive physiological response fails, sustained HPA axis activity initiates a complex neuroimmune response resulting in increased circulating IL-6 and C-reactive protein (CRP), promoting CRH synthesis and further HPA axis activity. Prolonged elevations in glucocorticoid levels are, therefore, detrimental to normal physiological functioning. Glucocorticoids are known to exert immunoregulatory functions via induction of a shift in immune responses from a cellular (Th1) towards a humoral (Th2) response [82]. As discussed above, the shift towards Th2 responses is also evident during the course of HIV infection and is associated with a decline in antiviral activity and an accelerated progression rate of the disease [6]. The HPA axis, therefore, represents one possible mechanism by which changes in the psychosocial environment may interfere with disease progression. Thus far, however, clinical studies have indicated only a correlative link between altered cortisol levels and the severity of the disease. Increased cortisol levels have been found to correspond with faster progression to AIDS [19]. On the other hand, low waking cortisol in patients at early stages of HIV infection has been shown to be associated with greater T-cell immune activation, a known risk factor for rapid disease progression [83]. Because the progression of HIV disease itself can lead to changes in cortisol production [84], it is difficult to determine a causal link between altered HPA axis activity and disease progression. In vitro studies suggest that low glucocorticoid concentrations enhance HIV-1 replication by stimulating the HIV-1 long terminal repeat [85] or by arresting infected T cells in the G2 phase of the cell cycle, when expression of the viral genome is optimal [86]. However, further empirical evidence is required to establish this plausible hypothesis in vivo.
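As a purely illustrative sketch, and not a model used in any of the studies cited here, the negative feedback described above can be summarised in a single toy equation in which the circulating glucocorticoid concentration C rises with a stress drive S(t) and suppresses its own release once it engages GRs/MRs (all symbols are introduced here for illustration only):

\frac{dC}{dt} = \frac{k\,S(t)}{1 + (C/K)^{n}} - \gamma\,C

where k scales stress-driven release, K is the feedback threshold set by receptor binding, n the steepness of the feedback and γ the clearance rate of glucocorticoids. In this caricature, intact feedback (low K, high n) keeps glucocorticoid exposure brief, whereas weakened feedback (e.g. reduced GR/MR availability, corresponding to a higher K) yields the prolonged glucocorticoid elevations described above.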

Interestingly, experimental studies of SIV infection revealed lower cortisol levels at baseline and in response to an acute stressor in animals assigned to unstable social conditions [43]. This pattern of cortisol response corresponds with that found in individuals suffering from posttraumatic stress disorder (PTSD), who often present with hypocortisolism in the presence of a stressful situation [87]. The blunted cortisol response in SIV-infected animals assigned to unstable social conditions was nevertheless associated with increased humoral immune responses and increased plasma SIV RNA levels [43]. Despite the inconsistency of these data with the hypothesis that increased release of glucocorticoids leads to a shift in T helper cell function towards Th2 immunity [82], data from PTSD research indicate an inverse relationship between cortisol levels and glucocorticoid receptor numbers on lymphocytes, together with enhanced negative feedback sensitivity of the HPA axis [88]. Because behavioural differences during the formation of the social groups predicted survival in SIV-infected animals even without alteration of HPA axis activity, these data suggest that other biological pathways are involved [43]. Although subpopulations of PTSD patients have been demonstrated to exhibit a blunted cortisol profile, they are also known to exhibit prolonged ANS activation (adrenaline release), reflective of an inability to switch from the acute fight/flight mechanism of the ANS to the energy mobilisation mechanism of the HPA axis [89, 90]. The above studies describing blunted HPA function in correlation with increased HIV progression did not assess ANS indicators, which may also be contributing factors. Therefore, given that increased noradrenergic responsiveness to stress has been previously reported in PTSD patients in the absence of elevated HPA activity [91], another plausible mediator of viral progression is activation of the ANS.

10.3.2 The Role of the ANS in the Disease Progression

The ANS and the HPA axis work in a coordinated fashion to modulate the stress response. ANS functioning is, however, regulated by different neurobiological pathways (reviewed in [92]). The ANS provides the most immediate response to stress via activation of its sympathetic and parasympathetic compartments, resulting in rapid physiological changes mediated by innervation of end organs. The sympathetic nervous system (SNS), for instance, can rapidly induce changes in heart rate and blood pressure as well as the release of catecholamines [93]. Catecholamines are released from the adrenal medulla and from sympathetic neurons in lymphoid organs in response to psychological or immune stress. Increased production of catecholamines can modulate immune function by driving a Th2 shift [94], inhibiting thymopoiesis [95] and inhibiting the activity of NK cells, cytotoxic T cells and macrophages [96]. Activation of the parasympathetic division of the ANS via both the afferent and efferent vagal nerves, which is classically associated with the counterbalance of the sympathetic response, has also been noted to suppress inflammatory signals [97, 98]. Increased release of acetylcholine, the major parasympathetic neurotransmitter, leads to deactivation of macrophages, inhibiting the production of proinflammatory cytokines, such as TNF-α, IL-1 and IL-18, but not anti-inflammatory cytokines, such as IL-10 [99]. Activation of the ANS therefore has an important role in the control of immune function, shifting immune activity towards Th2 immunity and inhibiting proinflammatory responses [100]. In the context of HIV, there is a more consistent association between increased ANS activity and HIV-1 pathogenesis than in the case of the HPA axis. Importantly, given that, unlike HPA axis activity, ANS activity does not change substantially after the onset of AIDS, stronger causal inferences can be drawn. Natural history studies have indicated that socially inhibited HIV patients demonstrated elevated ANS activity, together with increased plasma viral load and a poorer response to HAART [101, 102]. A cognitive–behavioural stress management intervention in HIV-positive patients was found to decrease anxiety and self-perceived stress, along with a decrease in urinary catecholamine levels and an increase in the number of cytotoxic T cells, as compared with a group of HIV-positive individuals who did not receive a psychological intervention and exhibited high levels of ANS activity [103]. Another study that directly assessed ANS activity found that HIV-seropositive men with high levels of ANS activity before initiation of highly active antiretroviral therapy demonstrated poorer adjustment to the therapy, with elevated plasma viral load and decreased CD4+ T-cell levels in response to the treatment [101]. Animal studies have also revealed that SIV-infected rhesus macaques subjected to social stress had an increased density of sympathetic innervation within the lymph node parenchyma. Exposure to stress in these animals also increased viral replication, which was attributed specifically to the elevated density of catecholaminergic varicosities within the parenchymal tissue of the secondary lymphoid organs [104]. In vitro studies have shown that catecholamines can accelerate HIV-1 replication via several molecular pathways [105]. One such pathway is the cAMP/PKA signalling pathway, which mediates the catecholamine response via activation of β-adrenergic receptors, resulting in suppression of Th1 responses [106].
Activation of the cAMP/PKA pathway has been shown to increase HIV-1 plasma viral load, whereas inhibitors of PKA were able to normalise HIV-1 biomarkers [105]. Catecholamines were also found to upregulate expression of the HIV-1 co-receptors CXCR4 and CCR5 [107] and to enhance transcription of HIV-1 genes, an effect abrogated by blockade of β-adrenergic receptors and by inhibition of PKA activity [101, 105]. To date, however, no studies have assessed the impact of inhibition of ANS activity per se on HIV disease progression.

Despite the complexity of the current research on the biobehavioural determinants of HIV progression, it is clear that individual differences drive susceptibility to these factors and may therefore regulate health consequences. The question addressed below is whether these individual differences in susceptibility to the influences of the psychosocial and physiological environments can be identified and, importantly, whether exposure to environmental changes early in life can predict predisposition to these alterations.

10.4 The Role of Perinatal Programming in Health and Disease

A large body of research has provided evidence for the reciprocal interactions of the neural, endocrine and immune systems. The establishment of these interactions begins during ontogeny, when the physiological systems are extremely sensitive to environmental impacts, a process that has been referred to as developmental plasticity [108]. During this sensitive period of development, the neuroendocrine systems exert not only a regulatory but also a morphogenetic role [109,110,111]. The plasticity of physiological systems in perinatal development allows environmental factors to alter the functionality of an organism, providing for foetal adaptation to adverse conditions. However, any adversity experienced during this time may interfere with the development or formation of physiological interactions and negatively influence physiological functioning in later life.

Over the last two decades, a new field of research has emerged to investigate the ability of early life adversity to alter the normal course of development and predispose to pathologies in later life. The process by which the early life environment can have a permanent effect on physiological systems has been described as perinatal programming [112]. This concept was originally explored by the epidemiologist Professor David Barker, who proposed that disease states in adulthood may have their origins in the early developmental period [113]. His studies demonstrated a link between low birth weight and a greater risk of developing coronary heart disease in adulthood [114]. His later studies suggested that low birth weight is also associated with an increased risk of hypertension, stroke and type 2 diabetes [115], and led to the establishment of the Developmental Origins of Health and Disease (DOHaD) hypothesis, which associates perinatal experience with disease susceptibility in later life [116]. Many lines of research have extended the DOHaD hypothesis, investigating how different aspects of early growth and development may determine susceptibility to other health conditions such as osteoporosis [117] and cancer [118], and may even predict life expectancy [119]. Low birth weight has also been associated with psychopathological outcomes, including depression and suicide [120].

The DOHaD hypothesis incorporates the concept of perinatal programming, whereby changes in the environment (e.g. nutrition) may alter the functionality of physiological systems and predispose to later health adversity. The process of perinatal programming can also be regarded as phenotypic induction [121], as programming integrates genetic predisposition with environmental determinants of foetal development. Phenotypic induction allows for an adaptive response to changes in environmental conditions, and health consequences are, therefore, dependent on the ability to develop an adequate response to such changes. While generally beneficial, under certain conditions this response may become maladaptive, in particular when there is a mismatch between the phenotypic adaptation made during development and the actual conditions of later life. The extent to which this mismatch between environmental demands impacts health outcomes depends on whether the underlying genetic predisposition is more vulnerable or more resilient [122].

Many lines of research have addressed the impact of perinatal programming on a variety of physiological as well as psychological outcomes. Epidemiological and experimental evidence indicates that discrepancies between the early and later life environments are not limited to nutritional factors but extend to other potential influences, such as immune and hormonal status, as well as mental state. In particular, research has demonstrated how exposure to physiological or psychological stress factors during critical periods of foetal and neonatal development, which involves vigorous activation of neuroendocrine and immune responses, disturbs the internal milieu of the developing organism and is associated with an increased risk of pathologies and psychopathologies. This evidence is reviewed in the following sections and incorporated into the discussion of the plasticity of the neural, endocrine and immune systems during early development.

The HPA axis and the ANS are the major stress response systems, controlled by the CNS, which is known to exhibit an enormous degree of plasticity during perinatal development and is therefore particularly sensitive to environmental stimuli. The critical development of most brain regions and systems occurs in utero or in early neonatal life, as does the development of neuroendocrine control of stress responsivity. The HPA axis and the ANS are also involved in the systemic response to immunological inputs, and the immune system, in turn, has an active role in the development of neural and neuroendocrine responses to a variety of stress factors [123]. Even though variability in the developmental timeline exists among species, the functional activity of the neural, endocrine and immune systems is significantly lower in perinatal life than in adulthood. The early life period is therefore crucial for the definitive development of an organism, and disturbances in the normal ontogeny of any of these physiological systems may lead to long-term alterations in the functioning of the others. The ability of environmental stimuli to produce robust programming effects depends on the developmental stage of the organism: the same environmental stressor may cause no lasting effects in a mature organism but can be detrimental if experienced during critical periods of development. The extent to which the developmental trajectories of the HPA axis, the ANS and the immune system can be affected by environmental stimuli, and how these alterations can be manifested in physiological and behavioural abnormalities, is described below.

10.5 Early Life Plasticity of Neuroendocrine-Immune Interactions

10.5.1 Programming of the HPA Axis

Optimal exposure to glucocorticoids is important for normal brain development; however, increased glucocorticoid exposure may alter the developmental trajectory of brain maturation and function [124]. The HPA axis is known to be particularly sensitive to environmental influences early in life, and exposure to stress during the critical period of HPA axis development may alter stress responsiveness long term [97]. Excess perinatal glucocorticoid exposure has been shown to down-regulate the expression of MRs and GRs [125, 126], resulting in an increased release of glucocorticoids in response to later aversive stimuli [127, 128]. However, this profile of glucocorticoid release has been found to be largely dependent on the timing, strength, type and duration of exposure to not only the initial stressor but also a secondary stimulus. For instance, chronic stress has typically been shown to produce a blunted glucocorticoid output, which can be maladaptive at times of stress owing to the inability of the HPA axis to regulate its activity [129].

In rodents, brief periods of handling typically alter maternal behaviour, such that handled pups receive increased maternal attention upon their return to the nest. This enhanced maternal care has been shown to lead to decreased corticosterone levels and increased hippocampal GR expression in adulthood in animals that were handled as neonates [130]. Increased hippocampal GR expression then leads to inhibition of CRH synthesis and reduced levels of ACTH and corticosterone in response to stress when compared with non-handled animals. Other animal studies have shown that low maternal care during the first week of life in rats affects methylation of the GR gene, resulting in hypermethylation within the exon 1₇ GR promoter and reduced histone acetylation and transcription factor (NGFI-A) binding to the GR promoter in the hippocampus of adult offspring [126]; such hypermethylation is typically associated with reduced DNA binding and thereby reduced transcriptional activity [131]. These effects, however, can be reversed by cross-fostering the pups to dams that provide higher levels of maternal care [126, 132], suggesting a causal link between differences in maternal care and programming of gene expression. Thus, the sensitivity of the HPA axis is directly programmed by the quality of maternal care and the degree of stress exposure during development.

Similarly, in non-human primates and in humans, parental care mediates and programs stress responsivity later in life [133]. Parental abuse and maltreatment tend to produce initial elevations in cortisol levels, followed by lower than normal cortisol release [134] and elevated ACTH response to psychological stressors [135]. In rhesus macaques, the deleterious effects of early life adversity appear to be so robust that even 3 years of normal social life could not reverse the decline in cortisol levels and abnormal behavioural patterns observed following exposure to maternal separation at birth [136]. Although maternal separation is an intense manipulation in infant monkeys, these findings provide convincing evidence for the long-term programming effects of the early adversity on brain development, particularly the HPA axis.

The lower cortisol secretion and suppressed ability to respond to stress associated with PTSD may be transmitted across generations through epigenetic programming, as has been shown in studies of the offspring of Holocaust survivors and of other traumatic events, including children born to mothers who were pregnant during the 9/11 attacks (reviewed in Yehuda and Bierer, 2008 [137]). These observations may have important implications for the consideration of factors influencing susceptibility and resilience to HIV disease progression, because severe life stress and PTSD have been linked with the severity of HIV disease [138, 139].

The variable and long-term outcomes of perinatal stress on adult stress responsiveness illustrate the complexity of HPA axis programming. The timing, extent and origin of stress exposure all influence the long-term sensitivity and efficacy of the HPA axis in mediating an appropriate stress response. Ongoing research is still endeavouring to elucidate the critical determinants of HPA axis programming. There is also a growing focus on the impact of early life events on the ANS and its subsequent role in stress regulation.

10.5.2 Programming of the ANS

While the majority of the literature exploring the effects of early life stress focuses on programming of the HPA axis, the ANS is also susceptible to long-term functional alterations following exposure to a variety of stressors in early life. As with other physiological systems, certain environmental exposures may program beneficial adaptation to environmental challenges, but under some circumstances these changes may prove maladaptive in adulthood and as such provide a basis for the developmental origins of pathological states [140].

Due to the complexity of ANS structure and function, each compartment is likely to respond to different environmental factors, generating differing programming outcomes. For instance, thermoregulation, which is predominantly controlled by the SNS, is susceptible to environmental modification in early life: exposure to a cold environment during early development improves adaptation to subsequent exposure to cold, and exposure to elevated temperatures enhances tolerance to heat later in life, in both animals and humans [140]. Another important and extensively studied developmental factor is maternal and neonatal nutrition, which contributes to the development of ANS structure and function. In rodents, rearing in small litters results in a permanent increase in body weight and fat mass. Assessment of sympathetic function in these animals revealed that, although no consistent differences were observed in adrenergic innervation of peripheral organs, sucrose-induced activation of cardiac sympathetic activity was diminished [141]. Neonatal handling in rat pups, which has been shown to induce a long-lasting reduction in HPA responses to stress [130, 142], induces an increased autonomic response, as demonstrated by an increase in catecholamine concentrations in response to fasting in adulthood [143]. An acute neonatal immune challenge in rat pups has also been implicated in programming increased autonomic arousal and anxiety-like behaviours long term [144]. Furthermore, exposure to various modalities of prenatal stress in rats has been shown to exaggerate stress responsivity in later life, as demonstrated by enhanced cardiovascular activity in response to restraint stress in adult offspring [145]. Maternal separation stress in rodents exacerbates responses to sympathetic stimulation by inducing sensitisation of the renal and systemic sympathetic systems, thus impairing blood pressure regulation in adulthood [146].

In children, early experience of neglect and the quality of the relationship with their current caregivers have been shown to predict ANS reactivity. Children with a background of neglect and disordered attachment exhibited increased sympathetic reactivity compared with children with secure attachment [147], indicative of a potential reversal of the detrimental effects of early life adversity. Further encouraging evidence of plasticity in HPA axis and ANS development, even after extreme psychosocial deprivation in early life, comes from the Bucharest Early Intervention Project. Children placed in foster care, after being raised in the deprived institutional setting of Romanian orphanages, exhibited normalisation of their HPA axis and ANS responsiveness compared with those who remained in institutional care [148]. Timing of placement in foster care, however, had a significant effect on the degree of reversal of the negative influences of psychosocial deprivation: positive intervention effects of foster care on the HPA axis and the ANS were evident for children placed in foster care before 18 months and 2 years of age, respectively. These findings highlight a sensitive period during which the stress response systems, such as the HPA axis and the ANS, are most strongly influenced by environmental challenges [148].

Although a short window of developmental plasticity exists, hyper-reactivity of the ANS and HPA axis is a typical consequence of childhood maltreatment and abuse, contributing to adult psychopathology [149]. Prolonged activation of the sympathetic compartment of the ANS inhibits the activity of the innate immune system [150] and therefore has implications for the progression of immune-related diseases, including HIV [101]. Early life stress can also independently alter the development of the immune system, leading to lasting immune consequences.

10.5.3 Programming of the Immune System

The development of the immune system is dependent on the immune, autonomic and endocrine signals that it receives early in life [151,152,153]. The neonatal immune system is typically considered to be functionally immature, leading to an increased risk of infections during this period of life [154]. Inadequate exposure to immune stimuli in early life may disrupt the developmental trajectory of immune maturation and has been associated with a number of chronic immune-related conditions, including increased susceptibility to allergic and autoimmune diseases later in life [155].

Although the neonatal peripheral immune system generates a lower response when compared with that of the adult [156], the expression of many cytokines in the developing brain is significantly increased, even in the absence of an immune stimulus, coinciding with the appearance of “active” amoeboid microglial morphology. This increased central inflammatory activity in the developing brain stimulates neurogenesis, neuronal and glial cell migration, proliferation, differentiation, and synaptic maturation and pruning. The heightened central cytokine expression and “active” microglial morphology are likely to be indicative of the increased sensitivity of the developing brain to its environment. When adversity is experienced, this may lead to permanent alterations of major developmental processes and long-term programming of neuroimmune function (reviewed in [157]).

Microglia play important roles not only in the developing brain but also in the adult brain. Microglial activation contributes significantly to pathogenesis in neurodegenerative disorders, such as Alzheimer's and Parkinson's disease [158], as well as to HIV-associated neurocognitive impairments [67]. Early life psychological, immune and nutritional stressors have been associated with lasting proinflammatory changes and microglial activation in the adult brain [159,160,161], suggesting that early life stress may contribute to the development of neurodegenerative disorders later in life. These detrimental effects of early life adversity on neuroinflammatory markers therefore have potentially significant implications for HIV-associated cognitive impairments and Neuro-AIDS.

Reciprocal interactions between the neural, immune and endocrine systems are established during the perinatal period and together mediate organismal development. Therefore, not only may altered maturation of individual systems affect later life functioning, but disrupted interactions between these systems at critical periods of development may also lead to pathology. The effects of early life stress on the innate immunity of the offspring are generally inhibitory. In rodents, exposure to prenatal psychological stress has been shown to decrease the cytotoxicity of natural killer (NK) cells and resistance to experimentally induced tumours [162, 163]. Similar findings have been demonstrated in rats exposed to brief periods of maternal separation and subsequently to restraint stress in adulthood [164]. These data suggest that perinatal stress may alter the ontogeny of the neonatal immune system, leading to increased susceptibility to disease and altered immune responsiveness to stressful stimuli.

In primates, prenatal stress and exposure to IL-1β during the juvenile period have led to a blunted inflammatory response, with reduced plasma and cerebrospinal fluid levels of IL-6 and a reduced febrile response to IL-1β [165]. Exposure to an acute psychological stressor during pregnancy has also been shown to diminish the cellular cytokine response to in vitro stimulation with lipopolysaccharide (LPS) [166]. Prenatal stress also impairs T cell-mediated antigen recognition in monkeys whose mothers were exposed to stress during gestation; these effects, however, were dependent on the timing of exposure to prenatal stress [167]. The suppressive effects of prenatal stress on immune function in non-human primates appear to be similar to those observed in animals subjected to psychosocial stress in adulthood and SIV infection [43, 104], and may thus be potentially implicated in HIV disease progression.

Although the link between perinatal stress and immune function in the offspring has been investigated in animal models, limited information on this association exists in humans so far. Parental stress has been suggested to have implications for the development of allergic and atopic diseases in children, influencing polarisation of T cell-mediated immunity towards Th2 cell dominance and contributing to the burden of childhood respiratory illness [168, 169]. On the other hand, additional epidemiological evidence has indicated a link between early life stress and an enhanced inflammatory profile. Specifically, patients with childhood abuse-related PTSD have been shown to display increased levels of inflammatory cytokines and decreased sensitivity of monocytes to glucocorticoids, indicative of increased inflammation [170]. Similarly, depressed adults with a history of childhood maltreatment were found to exhibit increased inflammation, as indicated by higher levels of C-reactive protein [171]. Stressful early life events have also been found to be associated with the reactivation of herpes viruses, telomere shortening via increased T-cell proliferation, and immune dysregulation of the tumour microenvironment in different types of cancer [151].

These diverse outcomes of early life stress on immune function highlight important gaps in current knowledge regarding the specific influences of the timing and extent of traumatic events, and how these factors mediate susceptibility to chronic diseases later in life. The biological consequences of early life adversity therefore require further investigation, particularly in human populations. The experience of early life trauma and its long-term programming effects on neuroendocrine and neuroimmune responses are inevitably associated with a variety of behavioural consequences and an increased incidence of mental health problems.

10.5.4 Behavioural Programming

The foetal and neonatal brain is characterised by an extensive network of developing neuronal connections and is especially vulnerable to the consequences of stress. Alterations of these developmental processes may affect cognitive function and behaviour, with the manifestation of these programming effects appearing later in life [172, 173]. Although data from human studies can provide only correlational links, animal models of perinatal stress have been widely used to establish causal relationships and to provide in-depth investigation of the underlying mechanisms of behavioural programming. Multiple studies in rodents have demonstrated the long-term effects of early life psychological, physical and immune stressors on the emergence of anxiety-like and depressive behaviours, with the behavioural phenotype typically triggered in response to a secondary stressor in adulthood [174,175,176]. This unique phenotypic consequence of early life stress has received particular attention in the double-hit hypothesis of schizophrenia. Originally, the double-hit hypothesis proposed that genetic susceptibility combined with a distinct developmental insult, the 'first hit', can predispose an individual such that a later traumatic event, the 'second hit', leads to the onset of psychopathology [177]. This hypothesis has since been extended to include infection and inflammatory insults as a 'first hit' that sets up this increased vulnerability [178,179,180] and has been implicated in psychopathologies including anxiety and depression [175, 181,182,183].

Epidemiological evidence has also indicated a link between maternal stress and an increased risk of developing affective disorders. Specifically, it has been reported that low birth weight and preterm birth resulting from gestational stress contribute to developmental impairments and motor dysfunction, which are common risk factors for schizophrenia [184]. Other research has reported that maternal stress during pregnancy influences developmental delay, emotional status and learning skills in childhood [185,186,187]. These findings in human populations are, however, confounded by the continuing influence of maternal anxiety on the quality of maternal care, which can itself affect development and behaviour [188]. Retrospective studies have reported a higher incidence of schizophrenia in adults born to mothers exposed to severe stress during pregnancy, such as the stress of war, famine or a natural disaster [185, 189, 190]. Moreover, postnatal abuse, maltreatment and neglect have been linked to a greater risk of psychiatric disorders in later life, including PTSD [88], anxiety disorders [191], schizophrenia [190] and mood disorders [192].

Cumulatively, both animal models and human studies of perinatal programming provide evidence that exposure to early life stress from various sources may alter the maturation of the HPA axis, the ANS and the immune system. These changes may in turn increase susceptibility to dysregulated stress responsiveness and pathological outcomes in adulthood, the nature and intensity of which depend on the genetic vulnerability of the individual, the severity of perinatal stress, the timing of exposure and the quality of the postnatal environment.

10.6 Perinatal Programming of Vulnerability or Resilience: Implications for HIV Disease Progression

Despite incredible advancements in HIV/AIDS research, the disease still affects millions of people worldwide. In 2015, there were 36.7 million people infected with HIV worldwide, with the rate of new infections having fallen by only 6% over the previous 6 years. Continuous antiretroviral therapy is currently the most effective way to reduce the risk of disease progression; however, only half of those infected worldwide currently have access to these life-saving drugs [193]. Treatment interruptions are also common in clinical practice, reducing treatment efficacy [194]. Importantly, the progression of HIV and the development of AIDS are not limited to physical and immunological symptoms. The disease is often accompanied by psychopathology and substance abuse, worsening disease outcomes even in the presence of antiretroviral therapy [101, 195].

Psychosocial stress is an inevitable component of every chronic disease, including HIV/AIDS. This relationship is influenced by the nature and extent of the stress, as well as by individual vulnerability to its effects. Here, we propose that this vulnerability may be at least in part shaped by an individual's early life experiences, which may eventually lead to more rapid disease progression in the context of HIV (Fig. 10.1). It appears that some outcomes of early life adversity are similar to those associated with a poor prognosis of HIV infection, such as a blunted cortisol profile, overreactive sympathetic responses and diminished parasympathetic activity, as well as symptoms of PTSD, anxiety and depression [19, 46, 83, 102]. Therefore, these individuals may potentially be at a higher risk of adverse disease outcomes upon HIV infection.

Fig. 10.1

Framework proposal: In the context of HIV infection, early life adversity may influence the progression of the disease and ultimately affect survival by predisposing the individual to an increased risk of psychopathology, blunted stress responsiveness and altered immune function. These parameters are further exacerbated by HIV infection and may in turn accelerate the progression of the disease

HIV/AIDS research has suffered from a decline in investment in recent years. The newly established National Institutes of Health (NIH) HIV/AIDS research priorities exclude behavioural and social science research activities [196]. Behavioural research is essential to optimise medication adherence and to improve the management of the psychosocial disorders that can greatly worsen HIV health outcomes. In combination with biomedical therapies, behavioural interventions have been shown to optimise the efficacy of treatment [197], to increase treatment adherence [198] and to increase knowledge of HIV and transmission risks among populations at risk [199]. Such activities are vital for the continued improvement of HIV treatment and prevention strategies. More research is required into the potential for identifying biomarkers of early life adversity and altered stress responsiveness in HIV-infected individuals, which may predict accelerated disease progression. Together with behavioural interventions, this approach may assist in the early detection of reduced resilience and provide these individuals with timely and appropriate psychosocial support.

It is important to note that in this chapter we have not discussed the important issue of HIV infection in newborns and children. Although the rate of HIV infection among children has declined by 50% since 2010 [193], the challenges in diagnosing and treating the disease at such young ages remain of utmost importance for healthcare providers and researchers. The interactions between the developing immune system and HIV are complex, leading to a different pathogenesis in children, characterised by more rapid disease progression compared with adults (reviewed in [200]). While such differences in disease progression and outcome are beyond the scope of the current chapter, further research into the particular susceptibility of the immune system, from foetal life through adolescence, to infectious diseases such as HIV is essential to reduce paediatric HIV infection and improve the quality of care.

Conflict of interest

The authors report no conflicts of interest.