1 Introduction

According to the Centers for Disease Control and Prevention (CDC), Human Immunodeficiency Virus (HIV) disease may be categorized as (1) asymptomatic, i.e., absence of serious illness and T-cell counts above 200, (2) symptomatic, i.e., presenting with non-AIDS-defining illnesses and T-cell counts above 200, or (3) acquired immune deficiency syndrome (AIDS), i.e., having an AIDS-defining opportunistic infection and/or a current or prior T-cell count below 200 (Centers for Disease Control and Prevention, 1992). Observations of the neurological sequelae of HIV from the beginning of the epidemic to the late 1990s revealed stark declines in motor functioning, executive skills, and information processing speed across the disease spectrum (Reger, Welsh, Razani, Martin, & Boone, 2002). Toward the end of the second decade of the epidemic came a paradigm shift in its course: the widespread availability of antiretroviral therapy (ART). Historically, the cognitive domains in which ART most commonly shows an effect are attention, psychomotor speed, verbal learning and memory, and global cognitive function. However, early evidence for the effectiveness of ART on executive functions (EF) is inconsistent, suggesting disease progression may not be the only contributor to dysexecutive syndromes in persons living with HIV (PLWH) (Cohen et al., 2001; Sacktor et al., 2003; Tozzi et al., 1993, 1999). Nonetheless, the emergence of combination ART (cART) with enhanced features such as central nervous system penetrability has once again changed the course of neurocognitive decline in PLWH, albeit not equally among the aforementioned cognitive domains (Cysique & Brew, 2009; Heaton et al., 2011). Indeed, rates of neurocognitive impairment before and after widespread cART availability reveal some concerning trends.
Notably, rates of mild neurocognitive impairment have risen across the entire disease spectrum; meanwhile, dysfunction in the executive and learning/memory domains is being reported at its highest rates (Heaton et al., 2011), particularly in individuals with late-stage HIV disease (Goodkin et al., 2017). Although these trends may be partially attributed to higher rates of survival into older age (Cohen, Seider, & Navia, 2015; Hardy & Vance, 2009; Valcour, Shikuma, Watters, & Sacktor, 2004; Wendelken & Valcour, 2012), other factors are increasingly being shown to contribute to rates of executive dysfunction in HIV. What follows is a discussion of the various manifestations of executive dysfunction in HIV, their measurement and levels of severity, and their known associations with central and peripheral biomarkers as well as psychosocial and behavioral risk factors.

2 Assessing Executive Dysfunction in HIV

Early efforts to establish the etiology of neurocognitive impairment in HIV focused on the polar ends of the spectrum through the identification of primarily asymptomatic individuals with minor cognitive motor disorder (MCMD) and HIV-associated dementia (HAD). Based upon recommendations from a National Institutes of Health (NIH) working group, the nosology for what is now known as HIV-associated neurocognitive disorder (HAND) was revisited and subsequently amended based on performance in at least five of the domains of neurocognitive functioning typically impacted over the course of HIV infection, i.e., attention/working memory, EF, episodic memory, speed of information processing, motor skills, language, and sensory perception (Antinori et al., 2007). Depending on the level of impairment and disease course, HAND may be further subdivided into the symptomatic forms of mild neurocognitive disorder (MNCD) and HAD or the asymptomatic form, asymptomatic neurocognitive impairment (ANI). These subtleties pose a unique challenge, as patients with ANI often do not experience or report symptoms, yet are at higher risk of developing symptomatic cognitive impairments (Grant et al., 2014). Although the study of HIV-related executive dysfunction has blossomed in recent years, the literature is convoluted, with multiple abstractions of EF being reported as outcome measures across studies. While there are several approaches to characterizing EF into core domains such as updating and working memory, set-shifting, and inhibition (Jurado & Rosselli, 2007; Miyake et al., 2000; Stuss & Alexander, 2000), it is widely accepted that there is no unitary EF but rather evidence of varying degrees of frontal lobe involvement with the striatum and limbic system during cognitive task performance (Bechara, Damasio, Tranel, & Anderson, 1998; Lezak, Howieson, Loring, & Fischer, 2004).
The EF domains receiving considerable attention in recent years include inhibition, updating/working memory, set-shifting and mental flexibility, and decision making (Walker & Brown, 2018).

Inhibition, or the ability to withhold or attenuate an action or thought, is one of the core functions contributing to performance on attention and complex EF tasks (Hofmann, Schmeichel, & Baddeley, 2012). Because inhibition is tied to many health-related behaviors, it has been studied extensively in PLWH. Perhaps the most widely utilized measure of inhibitory control within this population has been the Stroop task (Hinkin, Castellon, Hardy, Granholm, & Siegle, 1999; Martin et al., 1992). Stroop task indicators of inhibitory control vary widely across studies, from vocal reaction time (RT) to time to completion on interference trials (Chang et al., 2002; Cohen et al., 2011; Maki et al., 2015; Martin et al., 2004a). When verbal report is not permissible, such as during an fMRI experiment, the Stroop Match-to-Sample task involving perceptual cueing and repetition has been incorporated to assess PLWH for performance decrements due to cognitive interference (Schulte, Müller-Oehring, Sullivan, & Pfefferbaum, 2011). Also utilized in HIV research is the stop-signal anticipation task as an index of frontostriatal function during inhibitory control (du Plessis et al., 2015).
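Whatever the specific indicator, Stroop-style interference scores are conventionally derived by contrasting performance on incongruent versus congruent trials. A minimal sketch of this scoring convention (with hypothetical reaction times, not data from any cited study):

```python
from statistics import mean

def stroop_interference(congruent_rts, incongruent_rts):
    """Interference score: mean RT on incongruent trials (ink color
    conflicts with the color word) minus mean RT on congruent trials.
    Larger values suggest weaker inhibitory control."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical reaction times in milliseconds
congruent = [620, 640, 610, 655]
incongruent = [780, 820, 795, 805]
print(stroop_interference(congruent, incongruent))  # 168.75
```

Studies differ on whether raw differences, ratios, or error-adjusted scores are reported, which is one source of the cross-study variability noted above.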

Updating and working memory constitute a composite of EFs frequently assessed in PLWH. The most commonly used task design to demonstrate deficits in updating and working memory within PLWH has been the n-back task, wherein the two-back condition is most often reported. Performance outcomes are typically reported for verbal/auditory and visual stimuli (Chang et al., 2008; Hinkin et al., 2002). However, not all post-cART studies have reported HIV-related effects within this domain, perhaps due to the presence of medical comorbidities such as Hepatitis C Virus (HCV) (Caldwell et al., 2014; Ernst, Chang, Jovicich, Ames, & Arnold, 2002). It is important to note that while the two-back task provides strong support for HIV-related deficits in working memory, modified digit and letter span tasks provide corresponding support in the visual and auditory domains (Farinpour et al., 2000; Hinkin et al., 2002; Martin et al., 2001).
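For readers unfamiliar with the paradigm, an n-back trial is a target when its stimulus matches the one presented n positions earlier, and performance is typically summarized from hits and false alarms. A generic scoring sketch (illustrative only; specific studies differ in stimuli and derived indices):

```python
def score_n_back(stimuli, responses, n=2):
    """Score an n-back run. responses[i] is True if the participant
    flagged trial i as a match. A trial is a target when its stimulus
    equals the stimulus n positions earlier."""
    counts = {"hits": 0, "misses": 0,
              "false_alarms": 0, "correct_rejections": 0}
    for i, resp in enumerate(responses):
        is_target = i >= n and stimuli[i] == stimuli[i - n]
        if is_target and resp:
            counts["hits"] += 1
        elif is_target:
            counts["misses"] += 1
        elif resp:
            counts["false_alarms"] += 1
        else:
            counts["correct_rejections"] += 1
    return counts

# Hypothetical 7-trial letter run with the participant's match responses
letters = ["A", "B", "A", "C", "A", "C", "C"]
flags = [False, False, True, False, True, False, False]
print(score_n_back(letters, flags))
```

From these counts, hit and false-alarm rates (and signal-detection indices such as d') can be derived as needed.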

Set-shifting has been described as an individual’s ability to reorient attention between different elements of the same task and may be tapped, in part or in whole, by complex or resource-dependent EF tasks. Among the tasks most commonly used to assess the ability to shift attention is Part B of the Trail Making Test (TMT-B). When speed of visual processing and fine motor function are controlled for, this test provides a proxy of the mental set-shifting that is integral to EF (Reitan, 1958). Most of the evidence supporting HIV-related effects on set-shifting ability has come from older adult cohorts and substance-abusing PLWH (Bousman et al., 2010a; Chang et al., 2011; Fama, Sullivan, Sassoon, Pfefferbaum, & Zahr, 2016; Kesby et al., 2015; Tang et al., 2015). Nevertheless, some negative findings have been reported in studies comparing performance between PLWH and HIV-negative adults (Manly et al., 2011; Rippeth et al., 2004). The Wisconsin Card Sorting Test (WCST) is another frequently used complex EF measure that has yielded inconsistent results, potentially confounded by age and SES, regarding deficits in set-shifting ability among PLWH (Corrêa et al., 2016; Rippeth et al., 2004). Moreover, composite measures combining the WCST and TMT-B suggest HIV-related difficulty in set-shifting (Carter et al., 2003; Reger, Welsh, Razani, Martin, & Boone, 2002). Seldom-used measures of set-shifting further contributing to this heterogeneity in findings among PLWH include intra/extra-dimensional shifting tasks and other custom-designed task-switching paradigms (Bousman et al., 2010b; Byrd et al., 2013; Fama et al., 2016; Jiang, Barasky, Olsen, Riesenhuber, & Magnus, 2016; Kesby et al., 2015; Spies, Fennema-Notestine, Cherner, & Seedat, 2017; Tang et al., 2015).
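One common way to control for basic visuomotor and scanning speed when interpreting TMT-B is to derive difference (B − A) or ratio (B / A) scores against Part A. A minimal sketch of these derived scores (a general convention, not the scoring of any specific study above):

```python
def tmt_shift_scores(time_a, time_b):
    """Derived Trail Making Test scores (completion times in seconds).
    B - A subtracts out basic visuomotor/scanning speed measured by
    Part A; B / A expresses the set-shifting cost relative to that
    speed, making it less sensitive to overall slowing."""
    return {"difference": time_b - time_a, "ratio": time_b / time_a}

# Hypothetical completion times: Part A in 30 s, Part B in 75 s
print(tmt_shift_scores(30.0, 75.0))  # {'difference': 45.0, 'ratio': 2.5}
```

Ratio scores are sometimes preferred in populations with motor slowing, since the same absolute B − A difference can mean different things at different baseline speeds.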

Although not considered by some to be a pure measure of EF, decision-making ability is heavily dependent upon frontal lobe function and is an integral component of behavioral health and disease management in PLWH (Doyle et al., 2016). In the cART era, the Iowa Gambling Task (IGT; Bechara, 2007) and the Cambridge Gambling Task (Sahakian & Owen, 1992) have emerged as the most frequently reported assessments of decision making in PLWH (Iudicello et al., 2013). The vast majority of these studies reveal HIV-related deficits (Hardy, Hinkin, Levine, Castellon, & Lam, 2006; Iudicello et al., 2013; Thames et al., 2012), yet some reports remain inconclusive (Gonzalez et al., 2005; Paydary et al., 2016). IGT performance in PLWH is strongly moderated by levels of depression, implicating affect dysregulation in the etiology of decision-making deficits in this population. Despite this strong tie to affect-related processes such as punishment and reward, IGT performance is shown to be strongly predictive of global EF in PLWH (Thames et al., 2012). Decision making under ambiguity, i.e., when risk and reward contingencies are not explicitly known, appears to be particularly impaired among PLWH compared to decision making when risks are explicitly known (Martin et al., 2013). Decision making under explicit risk is less frequently assessed in HIV but is evident in allied tasks such as the Game of Dice Task, where peripheral biomarkers such as nadir CD4+ T-cell count, substance abuse behavior, and psychiatric comorbidities are highly predictive of performance (Gomez, Power, Gill, & Fujiwara, 2017). In another study comparing HIV patients with HIV-negative controls, impaired performance on the Game of Dice Task was found, suggesting explicit-risk decision-making impairments in HIV-positive individuals are characterized by less advantageous choices and more random choice strategies (Fujiwara, Tomlinson, Purdon, Gill, & Power, 2015).
HIV-related deficits on other decision-making assessments such as the balloon analogue risk task are evident (Paydary et al., 2016); however, findings from the intertemporal choice task are null (Meade et al., 2016). Decision-making deficits in PLWH may also be attributable to poor working memory and hence part of a broader syndrome of EF-related deficits; however, a study examining IGT performance in HIV-positive and HIV-negative adults with a past or current history of substance dependence found that performance on a delayed non-match-to-sample task did not relate to decision-making ability (Martin et al., 2004b).
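IGT results like those above are conventionally summarized with a net score: advantageous-deck selections (decks C and D, small gains but smaller long-run losses) minus disadvantageous-deck selections (decks A and B), usually computed per block of 20 trials to track learning. A sketch of this common scoring convention (hypothetical choice sequence; not the analysis of any cited study):

```python
def igt_net_scores(choices, block_size=20):
    """Net score per block: advantageous (decks C, D) minus
    disadvantageous (decks A, B) selections. Negative scores indicate
    a persistent preference for the risky decks."""
    scores = []
    for start in range(0, len(choices), block_size):
        block = choices[start:start + block_size]
        advantageous = sum(c in ("C", "D") for c in block)
        disadvantageous = sum(c in ("A", "B") for c in block)
        scores.append(advantageous - disadvantageous)
    return scores

# Hypothetical 40-trial run: early exploration of A/B, later shift to C/D
run = ["A", "B"] * 7 + ["C", "D"] * 3 + ["C", "D"] * 8 + ["A"] * 2 + ["C", "D"]
print(igt_net_scores(run))  # [-8, 16]
```

A rising net score across blocks is the typical signature of intact contingency learning; flat or negative trajectories resemble the impaired profiles reported in PLWH.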

3 Peripheral Biomarkers of Executive Dysfunction in HIV

HIV-1 viral RNA and CD4+ counts are among the most commonly used clinical biomarkers of disease progression and are used in conjunction with neurocognitive assessment in the diagnosis of HAND. Nonetheless, studies examining the neurocognitive correlates of plasma viral RNA yield inconsistent support for an effect of viral replication on executive dysfunction (McGuire, Gill, Douglas, & Kolson, 2015; Reger et al., 2005). These and other studies suggest that despite suppression of systemic HIV infection and viral RNA proliferation through cART, HIV-related cognitive impairment in the executive domain remains elevated. Allied measures of viral DNA levels, however, provide some insight into the sequelae of executive dysfunction in PLWH in the absence of unhindered viral replication. For example, elevated circulating levels of HIV DNA were linked to poorer performance on a composite measure of EF in older, but not younger, virally suppressed PLWH, suggesting an interaction of HIV and age on EF (de Oliveira et al., 2015). Although fraught with inconsistencies, this and other studies inform a paradigm shift in our understanding of the peripheral biomarkers predictive of executive dysfunction in PLWH. Of particular interest are peripheral and central biomarkers of inflammation and monocyte activation. One of the main observations made in older adults and individuals with chronic immune suppression is that of chronic inflammation and immune activation, i.e., inflamm-aging (Nasi et al., 2017). In the last several years, peripheral monocytes have emerged as a primary reservoir for the HIV DNA most closely linked to neurocognitive impairment (Cysique et al., 2015; Kusao et al., 2012; Shiramizu, Williams, Shikuma, & Valcour, 2009; Valcour et al., 2013). Evidence of EF deficits related to peripheral inflammation is common within the literature.
For example, along with learning deficits, lower performance on a battery of EF tests normed by the HIV Neurobehavioral Research Center (Heaton et al., 2011) was inversely associated with plasma sCD163, a monocyte/macrophage-specific scavenger receptor shed during the pro-inflammatory response (Burdo et al., 2013).

A host of peripheral biomarkers of inflammation have been found to be coincident with inhibitory deficits in PLWH. Reduced performance on the Stroop task has been associated with elevated MIP-1β, while evidence of reduced IL-18, MCP-1, and TNF-α concentrations suggests that imbalances in pro- to anti-inflammatory cytokine expression may explain inhibitory deficits in PLWH, particularly those coinfected with other chronic inflammatory immune conditions such as HCV (Cohen et al., 2011). In the Women’s Interagency HIV Study (WIHS), higher levels of the monocyte activation markers soluble CD163 and soluble CD14 were related to EF deficits indexed by Stroop and TMT-B performance. However, circulating IL-6 and another marker of gut microbial translocation were unrelated, suggesting specificity in the effect of inflammatory immune responses from different cellular, and hence viral, reservoirs on EF in PLWH (Imp et al., 2016).

Monocyte activation has also been linked to composite measures of working memory performance consisting of the Letter-Number Sequencing (LNS) Test and Paced Auditory Serial Addition Test (PASAT-50) (Lyons et al., 2011; Woods et al., 2004). Peripheral inflammation, indexed by elevated levels of pro-inflammatory cytokines and reductions in several anti-inflammatory cytokines, is the primary outcome of activated transcription factors such as nuclear factor kappa B in monocyte/macrophage populations; these factors interact in ways that enhance viral replication in HIV (Thieblemont et al., 1995). Reduced levels of two particular anti-inflammatory cytokine markers, i.e., IL-10 and TRAIL (thought to be involved in the activation of cell death/survival signals), were retained among other immunoassayed cytokines as significant predictors of LNS performance in a small cohort of HIV-positive and HIV-negative adults (Cohen et al., 2011a). A recent double-blind, placebo-controlled, crossover study examining the time-dependent effects of a single low-dose administration of oral hydrocortisone in 36 HIV-positive women revealed enhanced working memory performance, indexed by the LNS task, 30 min after administration (Rubin, Phan, Keating, & Maki, 2018b). Interestingly, these hydrocortisone-related gains in working memory were accompanied by plasma reductions in tumor necrosis factor (TNF)-α, soluble receptor for TNF type II (TNFRII), MCP-1, MMP-9, and sCD14 (Rubin et al., 2018b). Another trial examining the effect of cenicriviroc, a dual C-C chemokine receptor type 2 (CCR2) and type 5 (CCR5) antagonist, showed increased working memory performance (indexed by Digit Span Backward, California Verbal Learning Test B, and LNS) associated with decreased monocyte activation (indexed by sCD163, sCD14, and neopterin) (D’Antoni et al., 2018).
Among a cohort of low-SES midlife women followed longitudinally, TNF-α and IL-6 predicted greater global neurocognitive impairment across HIV-positive and HIV-negative individuals, while C-reactive protein (CRP) specifically predicted lower attention/working memory indexed by the LNS task (Rubin et al., 2018a). Adjacent to these studies, Woods and colleagues explored the relationship of peripheral and central biomarkers of inflammation to performance on a prospective memory task. Prospective memory is a form of memory for future intentions that involves the executive capacities of set-shifting, inhibition, and updating in conjunction with intent and planning (Kliegel et al., 2002; Schnitzspahn, Stahl, Zeintl, Kaller, & Kliegel, 2013). After controlling for HIV disease and treatment-related factors, higher levels of TNFRII and MCP-1 in plasma and of tau in cerebrospinal fluid were associated with greater decrements in prospective memory performance (Woods et al., 2006).

In contrast to this working memory literature, evidence of peripheral biomarkers linked explicitly to the set-shifting or mental flexibility domain is more seldom found. In a cross-sectional study of HIV+/HCV+ adults and HIV-seronegative controls, reduced performance on TMT-B was associated with elevated IL-6 and reduced IL-10 concentrations (Cohen et al., 2011). Genetic sequencing for biomarkers of neurodegeneration derived from peripheral blood yields interesting associations with markers of set-shifting ability. For example, the APOE ε4 allele has been associated with poorer Trails B and Stroop interference performance within HIV-positive but not HIV-negative individuals (Chang et al., 2011, 2014).

To date, very few studies examining decision-making ability in PLWH have compared these indices to peripheral biomarkers. Lower current CD4 cell count has been associated with greater activation of prefrontal brain regions during a decision-making task, indirectly suggesting that immune suppression is predictive of altered allocation of neural resources during such tasks (Connolly et al., 2014).

4 Central Biomarkers of Executive Dysfunction in HIV

Akin to advancements made in our understanding of the relationship of the peripheral immune system to executive functioning in HIV, there is now extensive support for the role of CNS immune surveillance in the manifestation of neurocognitive impairment among PLWH (Spudich, 2016). CNS metabolites such as glutamate are increasingly linked with HAND and may reflect the indirect damage of neuronal processes via neurotoxic secretions from surveillance cells such as microglia (Dickens et al., 2015; Erdmann et al., 2007; Jiang et al., 2001; Zheng et al., 2001). The damage to neurons in HIV appears to be secondary to the shedding of viral proteins such as Tat and gp120 (Behnisch, Francesconi, & Sanna, 2004; Maragos et al., 2003) or indirect, due to the production of neurotoxic substances by activated astrocytes, monocytes/macrophages, and resident microglia (Benos, McPherson, Hahn, Chaikin, & Benveniste, 1994; Brack-Werner, 1999; Chen et al., 2002; Patton, Zhou, Bubien, Benveniste, & Benos, 2000). Indeed, neurons are not directly infected by HIV; rather, greater evidence supports the direct infection and activation of these neuroglia and resident immune cells (Epstein & Gendelman, 1993; Trillo-Pazos et al., 2003). This work has traditionally featured pathology of hippocampal regions in the context of HAD; however, exploration of HIV viral proteins and indirect HIV neurotoxicity in midbrain, striatal, and cortical dopaminergic neurons yields insight into frontostriatal executive control systems (Bennett et al., 1995; Everall et al., 1999; Itoh, Mehraein, & Weis, 2000; Nath et al., 2000).

There are several magnetic resonance imaging (MRI) modalities used to investigate HIV-related changes in the brain. Magnetic resonance spectroscopy (MRS), for instance, has been used to measure subclinical biochemical abnormalities in the brain that correspond with changes in the metabolic function of underlying cells (Cysique et al., 2013). MRS studies commonly find abnormal brain metabolite levels in white matter areas underlying executive control (Chang et al., 2002; Ernst et al., 2003). These findings are also consistent with white matter changes detectable by diffusion tensor imaging (DTI) tractography, an MRI method that measures white matter abnormalities in vivo. In PLWH, the directional coherence of water diffusion through white matter, indexed by global fractional anisotropy (FA), has also been associated with impaired executive function (Tate et al., 2010). The association between EF and HIV extends to the structure and function of gray matter. Structural MRI, which is used to measure the cortical thickness of the brain, consistently reveals HIV-related loss in cortical volume (Chang et al., 2011; Pfefferbaum et al., 2012). Functional magnetic resonance imaging (fMRI) is also emerging as a sensitive tool for detecting cerebral pathology related to HIV (Ernst et al., 2002). Blood oxygen level-dependent (BOLD) fMRI measures the brain’s hemodynamic response, i.e., the delivery of oxygenated blood in response to metabolic activity among neural ensembles. BOLD fMRI studies often investigate changes in functional connectivity (FC), the synchronous activation of spatially distributed brain regions (Biswal, Yetkin, Haughton, & Hyde, 1995). Functional neuroimaging studies repeatedly show that HIV-associated cognitive function is related to the FC of the frontostriatal circuits involved with attention and working memory (Chang et al., 2001; Ernst et al., 2002).
A meta-analysis of fMRI studies concluded that the frontostriatal circuit is generally hyperactive in HIV-positive individuals (Plessis et al., 2014, 2015). Dysfunctional FC between the basal ganglia and frontal regions of the brain has also been observed (Ipser et al., 2015; Melrose, Tinaz, Castelo, Courtney, & Stern, 2008; Ortega, Brier, & Ances, 2015; Thomas et al., 2013). In sum, generalizable disturbances can be observed in HIV-positive subjects, compared to seronegative controls, in areas involved with EF (Hakkers et al., 2017).
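For reference, the FA metric discussed above is computed from the three eigenvalues of the diffusion tensor at each voxel: it is 0 for perfectly isotropic diffusion and approaches 1 as diffusion becomes confined to a single axis, e.g., along coherent axon bundles. A minimal sketch of the standard formula (illustrative eigenvalues, not measured data):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion tensor's eigenvalues (l1, l2, l3):
    FA = sqrt(1/2) * sqrt((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2)
                   / sqrt(l1^2 + l2^2 + l3^2)"""
    numerator = math.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    denominator = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(0.5) * numerator / denominator

print(round(fractional_anisotropy(1.0, 1.0, 1.0), 3))  # 0.0 (isotropic)
print(round(fractional_anisotropy(1.7, 0.3, 0.3), 3))  # 0.799 (anisotropic)
```

Global FA summaries, such as the one associated with EF in Tate et al. (2010), average such voxelwise values across white matter.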

MRS markers of glial functioning (i.e., myo-inositol and creatine) within the frontal white matter have also been found to correlate with reduced inhibitory control, suggesting a possible etiological basis for striatal abnormalities (Chang et al., 2002). As previously mentioned, striatal brain regions, in conjunction with the frontal lobes, are important contributors to EF ability in HIV. In a study comparing treatment-naïve PLWH to HIV-negative controls, functional hypoactivation of the right and left putamen was observed while participants performed the Stop Signal Anticipation Task (SSAT), suggesting poorer inhibitory control (du Plessis et al., 2015). In a follow-up study, reduced functional activity of the left putamen during the modified SSAT was also associated with reduced cortical thickness of the right superior frontal gyrus (du Plessis et al., 2016).

Central biomarkers of neuronal metabolic activity, measured by MRS, also relate to working memory deficits in PLWH. For example, working memory deficits have been linked to abnormal choline and myo-inositol in frontal white matter and the basal ganglia (Ernst et al., 2003). The neural network most closely implicated in working memory processes is the frontoparietal attention network. This network has at its core the posterior parietal cortex (PPC) and the prefrontal cortex (PFC), and it consistently becomes active during working memory processes (Murray, Jaramillo, & Wang, 2017). Functional abnormalities in frontostriatal and frontoparietal circuitry, which are common in HIV, may also contribute to deficits in working memory (Chang et al., 2001, 2002; Tomasi, Chang, De Castro Caparelli, Telang, & Ernst, 2006). For instance, hyperactivation, particularly in the lateral PFC, has been found within and outside the frontoparietal network during four distinct working memory tasks, perhaps suggesting compensatory challenges for PLWH during working memory task performance (Chang et al., 2001; Ernst et al., 2002). HIV-related deficits in working memory also correlate with BOLD activation in the cerebellum and superior PFC (Tomasi et al., 2006). It should be noted that although most HIV studies report significant differences in cortical activation during working memory tasks, group differences in behavioral performance are not consistently noted, suggesting these functional biomarkers are sensitive to compensatory differences in working memory among PLWH relative to HIV-negative controls (Caldwell et al., 2014).

Neuroimaging studies in PLWH also suggest that poorer set-shifting performance correlates with structural and functional abnormalities in frontostriatal circuits. HIV-associated changes can be observed in the metabolic activity of global white matter regions of the brain, as evidenced by the association of glutamate concentrations with performance on TMT-B (Trillo-Pazos et al., 2003). In another study, performance on the TMT-B and a verbal interference task was found to correlate with global FA (Tate et al., 2010). Lower performance on the WCST also corresponds with lower volume of the bilateral caudate, left accumbens, right putamen, and globus pallidus in HIV-positive individuals (Corrêa et al., 2016). Support for set-shifting deficits also extends from the structural literature into the functional domain of neuroimaging studies. In an fMRI study using a task-switching paradigm wherein individuals performed a face-gender or word-semantic task, HIV-related failure to adapt to change was associated with a paucity of activation within the anterior cingulate cortex (ACC) compared to HIV-negative controls (Jiang et al., 2016a). Overall, findings across a host of neuroimaging modalities support aberrant brain activity in PLWH during set-shifting tasks.

HIV-related performance decrements on gambling and decision-making tasks are associated with abnormal activation in several brain regions known to support decision making, including the medial PFC, dorsal striatum, ventral striatum, and insula (Connolly et al., 2014; Meade et al., 2016). In one fMRI task there were no behavioral differences in the rate at which participants chose between smaller, more immediate rewards and larger delayed rewards; however, there was an HIV-group effect for activation of the left parietal and bilateral prefrontal cortex during easy trials and solely within the prefrontal cortex during more difficult trials, suggesting aberrant allocation of cognitive resources during decision-making tasks (Connolly et al., 2014). Despite the absence of a group effect on behavioral performance, the magnitude of HIV-related PFC activation corresponded with lower nadir CD4 cell count and risk-taking propensity. In another decision-making fMRI task, no HIV-related differences were noted in behavioral performance; however, greater activation of the basal ganglia, ACC, insula, and dorsolateral PFC was found while making risky choices. Additionally, functional neuroimaging has provided evidence of a unique interaction for poly-substance use among primary cocaine abusers in HIV-positive individuals, which resulted in aberrant activation in the bilateral PFC and cerebellum during an intertemporal decision-making task (Meade et al., 2017). This area of research suggests an adaptive functional response of the frontostriatal loop during decision-making processes that may be dysfunctional in HIV despite comparable performance on behavioral indices. Moreover, behavioral factors such as substance abuse can further debilitate the capacity of these networks to compensate for HIV-related deficits in EF.

5 Behavioral Factors Linked to Executive Dysfunction in HIV

Since the beginning of the epidemic, several behavioral factors have aligned with executive difficulties in PLWH, the most intuitive being adherence to ART and the presumed impact of viral suppression on neurocognitive outcomes. Considering the context of this chapter and other texts that have covered these relationships in detail (Cysique & Brew, 2009), we turn our focus to other pertinent health behaviors tied to HIV disease management, namely the salient and syndemic effects of substance abuse in this population. Despite the widespread availability of cART and lower HIV-related mortality, the number of individuals reporting a history of substance abuse has been on the rise (Centers for Disease Control and Prevention, 2005). It is estimated that in the USA over 61.4% of adults under care for HIV report use of substance abuse or mental health services (Burnam et al., 2001). Historically, research in substance-abusing PLWH has focused on intravenous drug users; however, poly-substance research on alcohol, psychostimulant, and cannabis use continues to show a rise in usage (Chander, Himelhoch, & Moore, 2006; Hinkin et al., 2004). Recreational and medicinal marijuana use is among the most common forms of substance use in HIV-positive persons (Fogarty et al., 2007). Although still controversial, the largest effects of chronic marijuana use on cognitive functioning in PLWH appear to be within the memory domain (Cristiani, Pukay-Martin, & Bornstein, 2004; Gonzalez, Schuster, Vassileva, & Martin, 2011). Alcohol abuse has a substantial impact on most domains of cognitive functioning among PLWH, with effects lingering years after sobriety (Gongvatana et al., 2014; Stavro, Pelletier, & Potvin, 2013).
In the presence of alcohol abuse, most forms of cognitive functioning show declining trends, and alcohol abuse is associated with decreased viral resistance, exacerbated metabolic injury, and negative neuropathological, immune, and other treatment outcomes (Azar, Springer, Meyer, & Altice, 2010; Baum et al., 2010; Braithwaite et al., 2007; Conigliaro et al., 2006; Persidsky et al., 2011). Studies examining the effects of stimulant use on neurocognition in HIV have become more expansive, with cocaine and methamphetamine use sharing the spotlight in terms of the neurobehavioral sequelae of psychostimulant abuse among vulnerable populations (Goodkin et al., 1998; Meade, Conn, Skalski, & Safren, 2011; Morgan, Iudicello, Weber, & Woods, 2016; Napier, 2017; Soontornniyomkij et al., 2016).

As our understanding of the neurocognitive sequelae of HIV expands, it has also become evident that levels of exercise and physical activity may be important predictors of neurocognitive outcomes in the executive domain (Vancampfort et al., 2018b; Weber, Blackstone, & Woods, 2013). Physical activity and exercise levels in PLWH are among the lowest reported in chronic disease populations due to various factors, including HIV-disease symptomatology such as fatigue, medical comorbidities, and socioeconomic barriers to healthy living (Rehm & Konkle-Parker, 2016; Vancampfort et al., 2017, 2018a; Webel et al., 2016). Studies examining the effects of moderate aerobic exercise on health outcomes in PLWH suggest these interventions may be effective in increasing quality of life through enhanced fitness, reduced depression, lower inflammation, and reduced cardiometabolic risk (Chaparro et al., 2018; Cutrono et al., 2016; O’Brien, Tynan, Nixon, & Glazier, 2016; Vancampfort et al., 2016). In healthy populations, aerobic, resistance, and multimodal exercise training are all shown to have a positive impact on the preservation of EFs in older age (Barha, Davis, Falck, Nagamatsu, & Liu-Ambrose, 2017). In older adult PLWH, higher levels of moderate physical activity have been associated with greater EF (Fazeli et al., 2015). In a longitudinal sample of nearly 300 HIV-infected adults, individuals endorsing consistent physical activity, indexed by any activity in which the heart beats rapidly, showed significantly less neurocognitive decline in the broad domain of EF over 30 months than individuals reporting no physical activity (Dufour et al., 2018).

Inhibitory control deficits are found to occur in HIV-positive individuals with co-occurring alcohol (Schulte et al., 2011) or stimulant (Rippeth et al., 2004) use disorders. Notably, Pfefferbaum and collaborators utilized an emotional Stroop task, and their findings suggest that frontoparietal attention and frontosubcortical emotion systems may be easily taxed in HIV-positive individuals with alcohol abuse (Schulte et al., 2011). It appears that a wide range of drugs have a deleterious effect on cognitive interference in HIV-positive individuals. A recent fMRI study of Stroop task performance found that signal change within a frontoinsular cortex cluster during cognitive interference correlated positively with cumulative years of regular marijuana use (Meade et al., 2018). Physical activity proxies are also found to relate to inhibitory control in PLWH. In a small sample of HIV-positive older adults, higher VO2 max (mL/kg) was associated with greater executive control indexed by the Delis-Kaplan EF System (DKEFS) Color-Word Interference Test (Mapstone et al., 2013). Cross-sectional data from a cohort of over 200 HIV-positive adults from the Multicenter AIDS Cohort Study revealed that, compared to individuals reporting low physical activity (indexed by the International Physical Activity Questionnaire (IPAQ)), those reporting high, but not moderate, activity evinced a lower likelihood of executive dysfunction indexed by Stroop interference and Trails B performance (Monroe et al., 2017).

Working memory performance has also been linked with substance abuse and other health behaviors in PLWH. Poly-drug use is associated with poorer delayed non-match-to-sample performance (Martin et al., 2003). Moderate exercise is among the most highly predictive health behaviors in the modification of neurocognitive function in both healthy and chronic disease populations, reducing oxidative stress and other neuroinflammatory markers while enhancing angiogenesis, neurogenesis, and synaptogenesis (Ahlskog, Geda, Graff-Radford, & Petersen, 2011; Smith et al., 2010). Self-reported exercise was associated with better working memory (indexed by the PASAT and the WMS-III spatial span test) within a cohort of over 300 community-dwelling HIV-positive adults (Dufour et al., 2013). This coincides with an earlier study showing that, in addition to fine-motor control, current exercise was associated with PASAT performance (Honn, Para, Whitacre, & Bornstein, 1999).

Poorer set-shifting ability is also found to co-occur with stimulant use disorders (Bousman et al., 2010b; Rippeth et al., 2004). Meanwhile, activities of daily living, particularly engagement in employment-related activities, have been linked with Trails B performance. In a study of over 150 adults living with HIV, Trails B performance, along with physical limitations, differentiated employed from unemployed participants (van Gorp, Baerwald, Ferrando, Mcelhiney, & Rabkin, 1999). Although the direction of this effect is not clear, greater engagement in activities of daily living may be supported by preserved EF, while job-related activities may in turn support, and hence attenuate decline in, EF.

Deficits in decision making are present in co-occurring or recent substance dependence, including poly-substance dependence, in males living with HIV (Martin et al., 2004b). Similarly, a history of crack-cocaine and/or heroin use among HIV-infected women is associated with reduced loss aversion (Vassileva et al., 2013). This work raises the question of whether gender differences are present in decision making among persons living with HIV. Indeed, the Game of Dice Task has been used to show that decision making is significantly more impaired amongst women compared to men living with HIV (Martin et al., 2016).

6 Psychosocial Factors Linked to Executive Dysfunction in HIV

HIV is a syndemic disease, with those infected being disproportionately exposed to life stressors. With an ever-increasing proportion of the HIV population belonging to female or sexual minority groups, a history of trauma has become a common feature of the HIV syndemic of substance abuse and violence (Israelski et al., 2007; Machtinger, Wilson, Haberer, & Weiss, 2012; McIntosh & Rosselli, 2012; Meyer, Springer, & Altice, 2011; Sullivan, Messer, & Quinlivan, 2015; Whetten et al., 2006). Chronic exposure to stress is increasingly shown to have a negative impact on cognitive performance across the life span (Juster, McEwen, & Lupien, 2010; McEwen, 2008). Indeed, many of the pathophysiological processes implicated in the effects of childhood adversity on neurocognition, e.g., HPA-axis dysregulation, neuroinflammation, and oxidative stress, have been implicated in the multifaceted psychopathology of HAND (Thames et al., 2018). The cascade of neuroendocrine and neurotransmitter activity that occurs in response to repeated and prolonged stress in otherwise healthy individuals may be accelerated in HIV disease. For instance, with regard to the domain of verbal learning and memory, effects of probable PTSD on cognitive impairment in a large group of HIV-positive women and controls varied depending on whether the PTSD exposure was linked to sexual abuse and/or violence (Rubin et al., 2016a). Moreover, among women living with HIV, the prefrontal region of the brain appears to be particularly vulnerable to stress and as a result may provide a pathway to impaired learning and memory processes in this population (Rubin et al., 2016b). Social adversity is a salient psychosocial risk factor. In a mixed sample of HIV-positive and HIV-negative men and women, social adversity predicted reductions in left hippocampal volume, which in turn corresponded with deficits in EF, including working memory (Thames et al., 2018).
In a study using a composite scale indicating history of trauma, economic hardship (food insecurity and low socioeconomic status), and perceived psychological stress, total adversity was correlated with a composite measure of EF (Watson et al., 2019). Perceived stress was also associated with poorer EF indexed by the color-word (interference) condition of the Stroop, Trails B, which measures mental flexibility, and the working memory condition of letter-number sequencing (LNS). Moreover, in predominantly white HIV-infected men who have sex with men, acute stressful life events are associated with an index of worse global EF (Pukay-Martin, Cristiani, Saveanu, & Bornstein, 2003).

Inhibitory deficits, although prevalent, are not readily associated with psychosocial function in persons living with HIV. However, apathy and irritability ratings were negatively related to inhibitory performance on the Stroop interference task as well as on a dual task (Castellon, Hinkin, & Myers, 2000). Apathy is a neuropsychiatric syndrome that taps the frontostriatal circuitry underpinning executive function and is discussed in detail elsewhere (McIntosh, Rosselli, Uddin, & Antoni, 2015).

Among a cohort of HIV-positive individuals, a combined index of trauma, economic hardship (food insecurity and low socioeconomic status), and stress was related to lower levels of working memory (Watson et al., 2019). Within a cohort of midlife women with and without early life stress (ELS) and HIV infection, greater self-reported levels of ELS were related to atrophy-linked attention/working memory deficits, evidenced by an association between left frontal lobe volume and Trail Making Test A performance (Spies, Ahmed-Leitao, Fennema-Notestine, Cherner, & Seedat, 2016). Poorer digit span performance correlated with greater levels of alexithymia, an index of difficulty identifying and describing feelings closely tied to psychological distress, in a small group of asymptomatic HIV+ individuals (Bogdanova, Díaz-Santos, & Cronin-Golomb, 2010). A dual-task paradigm consisting of a visual tracking task interspersed with a digit span task showed performance in 189 HIV-positive adults to covary with an index of apathy and irritability derived from the Neuropsychiatric Inventory (NPI) modified for use with HIV-positive individuals (Cole et al., 2007). Interestingly, apathy, but not depression, was found to correlate with working memory performance on a digit span task in HIV-positive individuals across the disease spectrum (Castellon, Hinkin, Wood, & Yarema, 1998). Meanwhile, in another HIV-positive cohort, family-related stress was negatively correlated with performance on the Paced Auditory Serial Addition Test after controlling for age, education, anxiety, and depression (Pukay-Martin et al., 2003).

As it pertains to set-shifting, a longitudinal cohort of HIV-positive women showed that over the course of 1 year there was an interactive effect of HIV and childhood trauma on Wisconsin Card Sorting Test (WCST) performance (Spies et al., 2017). Importantly, greater mood disturbance and medical symptoms are associated with cognitive complaints in HIV-positive individuals but not with neurocognitive performance (Carter et al., 2003). Greater levels of alexithymia were also found to be associated with Trails B performance, controlling for time of completion on Trails A (McIntosh et al., 2014). A similar finding was made in a small cohort of asymptomatic individuals and HIV-negative controls, albeit only the externally oriented thinking subscale was associated with longer Trails B completion times (Bogdanova et al., 2010). Studies from international cohorts are also contributing to our understanding of the neurocognitive presentation of HIV in EF. For instance, in a cohort of more clinically advanced HIV-positive women and healthy HIV-negative controls from South Africa, a history of early life stress was associated with greater decline in WCST performance (Spies et al., 2017). However, among a small sample of HIV-positive and HIV-negative Iranian men, EF indexed by the WCST and the Tower of London (ToL) task did not relate to PTSD symptoms indexed by the Impact of Event Scale (Moradi, Miraghaei, Parhon, Jabbari, & Jobson, 2013).

7 Conclusion

As predictors of HIV-associated executive dysfunction are gradually established in the Western hemisphere, it is apparent that efforts must be replicated to elucidate factors contributing to executive dysfunction within the context of the high viral genetic diversity found in other parts of the world. For instance, nearly all of the nine major subtypes of HIV-1 Group M (A–D, F–H, J, and K), in addition to strains of HIV-1 Groups N and O, and HIV-2, are found on the continent of Africa. Roughly 66% of the estimated more than 30 million people living with the virus reside in Africa (Hemelaar, Gouws, Ghys, & Osmanov, 2006). The subtype B HIV-1 strain is most commonly found among US individuals; however, those in Cameroon feature Groups O and N and unique recombinant forms. A recent study showed executive dysfunction to vary as a function of viral genotype (Kanmogne et al., 2018). Thus, well-validated measures of EF will need to be made available for comparison studies. It is also clear that functional brain abnormalities associated with EF may persist in PLWH even when behavioral differences are seldom detected. This suggests promise for functional imaging modalities in the detection of changes in neural substrate that predate cognitive impairment in the EF domain. Accordingly, other indices of neurocognitive impairment, such as those derived from functional magnetic resonance imaging (fMRI), may emerge as more sensitive screening tools for the diagnosis of HAND and its various domains (Hakkers et al., 2017). As fMRI research continues to expand, the field is tasked with the challenge of developing sophisticated tools and greater insight to establish the factors related to the neurocognitive sequelae of HIV in the executive domain.