Introduction

Traumatic brain injury (TBI) is defined as an alteration in brain function caused by an external force and is nowadays a major public health problem [1]. Neuroendocrine dysfunction following TBI may occur with a much higher prevalence than previously suspected [1]. Similarly, subarachnoid hemorrhage (SAH) and ischemic stroke (IS) may lead to pituitary dysfunction, although the literature in this field has developed only recently, as survival rates, mainly from SAH, have improved, enabling pituitary evaluation to be performed on large cohorts of long-term SAH survivors [2, 3].

Pituitary disorders are a frequently overlooked complication of TBI and SAH. Many symptoms of TBI and SAH survivors, such as fatigue, concentration difficulties, and depressive symptoms, are nonspecific and overlap with those of patients with pituitary disorders from other causes, so a large proportion of hypopituitarism related to brain damage remains undiagnosed [2, 4]. This may explain why the diagnosis of hypopituitarism is often missed or delayed after these conditions, with potentially serious and sometimes life-threatening consequences for the affected patients. Therefore, these patients should undergo routine screening for hypopituitarism, and each pituitary hormone must be evaluated separately, since the pattern of hormone deficiency varies among patients with TBI-induced hypopituitarism [1].

Epidemiology and pathophysiology

The impetus to diagnose hypopituitarism is the suspicion that the secretion of one or more pituitary hormones may be subnormal. This suspicion arises from an appropriate clinical context in which hypopituitarism may be present, or from symptoms known to be caused by hypopituitarism [1]. The knowledge that a patient has sustained a TBI or SAH is by itself sufficient grounds to evaluate for hypopituitarism, because some patients with hypopituitarism have no symptoms [13]. On the other hand, considering the high number of subjects with TBI, it is crucial to define strictly, on a cost/benefit basis, which patients should be investigated for hypopituitarism [4]. Evidence shows that pituitary function is impaired in at least 20–30 % of patients following moderate-to-severe TBI; thus, patients with clinical signs or symptoms associated with hypopituitarism should have their pituitary function screened [5].

There is a wide variation in the reported frequencies of hormone deficits following TBI and SAH [13]. The prevalence of TBI-induced hypopituitarism differs considerably among studies, ranging from 5.4 to 90 % [6]. Nevertheless, approximately one-third of TBI patients will develop at least one anterior pituitary disorder [7]. In particular, a meta-analysis including patients with TBI reported a prevalence of anterior hypopituitarism of 15 to 68 % and of posterior pituitary dysfunction of about 6.9 % [8]. On the other hand, the evidence for pituitary dysfunction following SAH and IS is more limited than for TBI [8–10]. Among long-term SAH survivors, the reported prevalence of anterior and posterior dysfunction ranged from 37.5 to 55 % and from 0 to 2.8 %, respectively [8]. Similarly, hypopituitarism was detected in 23.5 to 37.5 % of stroke patients [3, 10]. This variability may be attributed to differences in the timing of evaluation, the severity of the trauma or stroke, the diagnostic criteria, and the methods and assays used to test endocrine function [7]. Indeed, the timing of hormonal assessment of pituitary disorders varied widely across studies [1]. Transient hypopituitarism has been reported almost exclusively in the first 6 months after TBI, as hormone alterations mimicking pituitary insufficiency can be present in the acute phase after trauma [6]. An early assessment of pituitary function after the event could therefore lead to an overestimation of the prevalence of pituitary dysfunction in these patients, especially if patients with more severe trauma are included. Indeed, the likelihood of developing hypopituitarism is directly related to the severity of the trauma. A rational approach to endocrine testing could be an assessment of HPA axis reserve in the acute phase, followed by a more comprehensive assessment of pituitary function during the chronic recovery phase. Moreover, physiologic hormonal changes that can mimic pituitary dysfunction are often observed in the early posttraumatic period. The physiologic response to acute and critical illness comprises hormonal changes resembling GH deficiency, central hypogonadism, and central hypothyroidism. Furthermore, the metabolism of hormone-binding proteins can be altered by acute illness or by drugs frequently used in severe disease, resulting in altered circulating levels and, consequently, false deficiencies [9].

The reliability of the methodological tools used to assess pituitary function may be another important factor [1]. Evaluation of the GH and HPA axes requires dynamic stimulation tests in order to clearly separate normal from deficient responses, and normative cut-offs should be defined using healthy volunteers in each local laboratory [1, 11, 12]. Therefore, differences in the reported frequency may be due to more stringent diagnostic criteria applied by some studies but not by others [1, 13–16]. Moreover, the cut-off levels used for the tests and differences in hormone assays may have another important influence on the reported frequency of hypopituitarism [1, 11, 17, 18]. Thus, although the robustness of the methodologies used to diagnose hypopituitarism varies between studies, there is nevertheless broad agreement that hypopituitarism is a common complication of TBI [1–11].

The pathophysiology of hypopituitarism following brain damage is not completely understood, and several factors have been suggested as implicated in its development. Different types of lesions may occur, ranging from damage to the pituitary capsule to injury to the anterior and posterior lobes and the pituitary stalk, in the form of hemorrhage, necrosis, and fibrosis [8, 19]. Moreover, vascular damage to the pituitary gland can also be caused by venous infarction of the hypophyseal portal vessels or by pituitary stalk transection [1, 8]. Finally, recent research has indicated a possible interaction between autoimmunity and the development of hypopituitarism after TBI. Antipituitary and antihypothalamic antibodies (Abs) have been demonstrated in patients with TBI-induced pituitary dysfunction and persist even 5 years after diagnosis [20, 21]. Antibodies may form after TBI because disruption of the blood–brain barrier allows brain proteins to leak into the circulation, evoking an immune response [20]. Moreover, in patients with higher titers of pituitary antibodies the development of pituitary deficiencies is more frequent, whereas recovery of pituitary function is associated with negative antibody titers [22].

Several studies have highlighted the relationship between hypopituitarism and TBI-associated morbidity [1, 23]. Serious and life-threatening adrenal crises secondary to acute ACTH deficiency in patients with TBI have been described in the literature, with dramatic improvement following glucocorticoid replacement [24, 25]. Increased neuropsychiatric morbidity has also been reported in patients with TBI and GH insufficiency [26]. Hypopituitarism is further expected to significantly impede recovery and rehabilitation after TBI [9]. In fact, posttraumatic hypopituitarism has been independently associated with poor quality of life, abnormal body composition, and an adverse metabolic profile [27]. In patients with SAH, preliminary data indicate that neuroendocrine disturbances contribute to impaired quality of life, depression, and sleep disturbances [28]. These findings indicate that hypopituitarism after both TBI and SAH is associated with poor recovery and worse outcome (Table 1).

Table 1 Diagnostic processes of pituitary function in adult TBI/SAH and IS patients

Diagnostic process

Hypopituitarism is diagnosed on the basis of a combination of low peripheral hormone levels and inappropriately normal or low pituitary hormone levels. However, measurement of basal hormones alone may not be conclusive, owing to the pulsatile, circadian, or situational secretion of some hormones.

Hypothalamic–pituitary–adrenal (HPA) axis

The basal secretion of ACTH must be sufficient to maintain the serum cortisol concentration within the normal range. During physical stress, ACTH secretion must increase in order to raise serum cortisol concentrations adequately. Thus, normal basal cortisol levels do not exclude an impaired ability of ACTH secretion to respond adequately to stress. ACTH and cortisol secretion follow a circadian rhythm, with the highest levels in the early morning hours and the lowest levels around midnight; basal ACTH and serum cortisol should therefore be measured at 8 to 9 AM.

In the appropriate clinical context, secondary adrenal insufficiency can be excluded when morning cortisol levels are greater than 500 nmol/L (>18 mcg/dL) and is indicated by levels below 83–100 nmol/L (<3 mcg/dL). Intermediate levels require a stimulation test [1, 29–31].
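As a purely illustrative aid (not a clinical tool), the thresholds above can be expressed as a simple screening rule; the minimal Python sketch below assumes the morning-cortisol cut-offs quoted in the text and an approximate conversion factor of 1 mcg/dL ≈ 27.6 nmol/L, and the function names are hypothetical.

# Illustrative sketch of the basal morning cortisol screening rule described above.
# Thresholds (>500 nmol/L to exclude, <83 nmol/L to indicate secondary adrenal
# insufficiency) are taken from the text; this code is illustrative only and is
# not a substitute for clinical judgment.

NMOL_PER_MCG_DL = 27.6  # approximate conversion factor for cortisol

def mcg_dl_to_nmol_l(cortisol_mcg_dl: float) -> float:
    """Convert a cortisol value from mcg/dL to nmol/L."""
    return cortisol_mcg_dl * NMOL_PER_MCG_DL

def interpret_morning_cortisol(cortisol_nmol_l: float) -> str:
    """Classify an 8-9 AM serum cortisol value using the thresholds in the text."""
    if cortisol_nmol_l > 500:   # > ~18 mcg/dL
        return "secondary adrenal insufficiency excluded"
    if cortisol_nmol_l < 83:    # < ~3 mcg/dL
        return "secondary adrenal insufficiency indicated"
    return "indeterminate: dynamic stimulation test required"

# Example: a morning cortisol of 10 mcg/dL (~276 nmol/L) falls in the
# indeterminate range and requires a stimulation test.
print(interpret_morning_cortisol(mcg_dl_to_nmol_l(10)))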

Hypoglycemia (blood glucose <2.2 mmol/L, 40 mg/dL) induced by the insulin tolerance test (ITT; 0.05 or 0.1–0.2 IU insulin per kg body weight given iv as a bolus) is considered the gold standard for assessment of the HPA axis. A peak cortisol level greater than 500 nmol/L (18 mcg/dL) indicates normal integrity of the axis. The test has unpleasant side effects such as sweating, trembling, fatigue, and hunger, and is contraindicated in elderly patients and in those with cardiac or cerebrovascular disease or epileptic seizures. In the population under study, this test has been used by some authors [11, 13, 32, 33] without adverse effects, whereas others have used alternative tests, citing the potential risks of the ITT [5, 16]. It should be conducted only under close supervision in experienced centers [1, 5, 17, 34].

Corticotropin-releasing hormone (CRH, 100 µg as a bolus), given as a stimulus of the pituitary ACTH reserve, has proven to be no more predictive of adrenal function than morning cortisol levels [1, 35]. It is therefore of limited value for the diagnosis of ACTH deficiency.

The overnight metyrapone test (MET) has so far been considered a convenient and sensitive method, correlating well with the ITT [36]. MET (30 mg/kg administered orally with a snack at midnight) inhibits adrenal 11β-hydroxylase and thereby the conversion of 11-deoxycortisol (11-DOC) to cortisol, with a possible risk of precipitating an adrenal crisis. Shortcomings of the MET test are the limited availability of reliable 11-DOC assays and the fact that the drug itself is not widely available owing to local restrictions [36].

ACTH deficiency causes adrenal atrophy and ACTH receptor down-regulation, which is the rationale for administering exogenous ACTH [1, 29, 30]. Thus, the 250 µg 1-24 ACTH test can be used to establish secondary adrenal insufficiency if performed at least 4 weeks after the onset of ACTH deficiency [1, 29, 31]. ACTH is administered intramuscularly or intravenously, and serum cortisol is measured 30 and 60 min later. A serum cortisol concentration ≥500 nmol/L (>18 mcg/dL) is considered a normal response, whereas stimulated cortisol levels below 500 nmol/L indicate ACTH deficiency [1, 31, 37].

It has been proposed that the low-dose (1 µg) corticotrophin test represents a more physiological stimulus for maximal adrenal stimulation than the 250 µg dose [1, 29, 31]. Some studies have suggested a superior sensitivity for the 1 µg corticotrophin test, whereas a meta-analysis reported comparable diagnostic accuracy for both tests [38]. The low-dose test has technical disadvantages (the need for dilution in particular) and requires repetitive blood sampling, which makes the standard test more practical and easier to perform. Moreover, in a retrospective study in which 148 patients with a low-normal cortisol response to the 250 µg test were followed for a median of 4.2 years, only two patients developed clear-cut adrenal insufficiency, another two remained diagnostically uncertain, and seven had adrenal insufficiency after subsequent pituitary surgery or irradiation [39]. Thus, the 250 µg ACTH test seems an appropriate diagnostic tool to exclude clinically significant hypoadrenalism, even though a definitive conclusion, including a comparison with the ITT, would be desirable.

Meanwhile, the glucagon stimulation test (GST, 1 mg glucagon s.c.) has come into more frequent use in the evaluation of the HPA axis [40, 41]. Blood samples for cortisol measurement are obtained 90, 120, 150, 180, 210, and 240 min after glucagon injection. The cut-off cortisol level of ≥500 nmol/L accepted as a sufficient response to the ITT appears to have low specificity when applied to the GST [42]. Several studies have shown that the minimum peak cortisol response of a healthy individual to the GST may be as low as 200, 250, or 295 nmol/L [43, 44], although the majority of healthy adults have peak cortisol responses ≥500 nmol/L to the GST [45]. The 500 nmol/L threshold is therefore probably too high and will misclassify many healthy adults as adrenally insufficient. Accordingly, as suggested by Berg et al., a lower cut-off is likely required to avoid a large number of false positives regarding cortisol status during the glucagon test, with a threshold of 277 nmol/L being more specific and sensitive [42]. The reproducibility of the GST in the evaluation of the HPA axis appears to be about 88 %, and the GST may be a good alternative. In a recent study by Simsek et al. comparing the ITT, the low-dose ACTH test, and the GST, the peak cortisol responses to all three tests were well correlated with each other. The concordance of the three tests was only about 44–50 % when the 500 nmol/L cut-off level was used, whereas higher concordance was found when new cut-off levels for the peak cortisol response were applied. Consequently, any of these tests can be used in the evaluation of the HPA axis, but cut-off levels for HPA axis insufficiency should be individualized for each test [40]. Additionally, it must be borne in mind that no test, including the ITT, classifies all subjects correctly, and even healthy individuals may show abnormal values [1, 29, 46–49]. Thus, in borderline cases, clinical judgment and careful follow-up are crucial in the evaluation of the HPA axis.
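To illustrate why the cut-off must be matched to the stimulation test used, the brief sketch below applies the ITT threshold quoted in the text (500 nmol/L) and the lower GST threshold proposed by Berg et al. (277 nmol/L) to the same peak cortisol value; the dictionary and function are hypothetical illustrations, not validated decision rules.

# Illustration of the test-specific peak cortisol cut-offs discussed above.
# Values are taken from the text (ITT: 500 nmol/L; GST: 277 nmol/L as proposed
# by Berg et al.); this is a didactic sketch only.

PEAK_CORTISOL_CUTOFF_NMOL_L = {
    "ITT": 500,   # insulin tolerance test
    "GST": 277,   # glucagon stimulation test (lower cut-off to limit false positives)
}

def hpa_axis_sufficient(test: str, peak_cortisol_nmol_l: float) -> bool:
    """Return True if the peak cortisol meets the cut-off for the given test."""
    return peak_cortisol_nmol_l >= PEAK_CORTISOL_CUTOFF_NMOL_L[test]

# A peak of 350 nmol/L fails the ITT criterion but passes the GST criterion,
# showing how a single universal cut-off could misclassify healthy subjects.
print(hpa_axis_sufficient("ITT", 350), hpa_axis_sufficient("GST", 350))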

TRH-TSH-thyroid axis

In the appropriate clinical context, central hypothyroidism is easily diagnosed when free T4 levels are decreased and TSH levels are low or, more often, normal. Dynamic testing is generally not necessary, as it does not add diagnostic reliability. T3 remains normal in most cases [50–53].
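The biochemical pattern described above can be summarized in a short sketch; the reference limits are assay-dependent and are passed in here as assumed parameters, and the example values are hypothetical.

# Minimal sketch of the pattern for central hypothyroidism described above:
# low free T4 with a low or inappropriately normal TSH. Illustrative only.

def thyroid_axis_pattern(free_t4: float, tsh: float,
                         ft4_low: float, tsh_high: float) -> str:
    """Classify the free T4/TSH pattern against assay-specific reference limits."""
    if free_t4 < ft4_low and tsh <= tsh_high:
        return "pattern consistent with central hypothyroidism"
    if free_t4 < ft4_low and tsh > tsh_high:
        return "pattern consistent with primary hypothyroidism"
    return "no biochemical hypothyroidism"

# Example with hypothetical reference limits (free T4 lower limit 12 pmol/L,
# TSH upper limit 4.0 mIU/L):
print(thyroid_axis_pattern(free_t4=9.0, tsh=1.2, ft4_low=12.0, tsh_high=4.0))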

GnRH-LH/FSH-gonadal axis

In both sexes, before evaluating the GnRH-LH/FSH-gonadal axis, prolactin (PRL) excess should be excluded, as it may be present owing to disturbed hypothalamic inhibition of prolactin release [1, 5, 30, 31].

Hypogonadism in pre- and peripubertal children does not produce specific clinical symptoms until the expected onset of puberty, when it usually presents as delayed or absent puberty [1, 53, 54].

In a premenopausal woman who has pituitary or hypothalamic disease such as TBI/SAH but normal menses, no tests of LH or FSH secretion are needed, because a normal menstrual cycle is a more sensitive indicator of intact pituitary–gonadal function than any biochemical test. If amenorrhea or oligomenorrhea is present with low estradiol levels and inappropriately low or normal LH and FSH levels, secondary hypogonadism in premenopausal women is diagnosed. In peri- and postmenopausal women, low LH/FSH levels indicate central hypogonadism (Fig. 1) [1, 5, 30, 31].

Fig. 1 Screening recommendations for pituitary function after traumatic brain injury/stroke

In men, LH deficiency is best detected by measuring the serum testosterone concentration. If it is repeatedly low at 8 to 10 AM and the LH concentration is not elevated, the patient has secondary hypogonadism. When the serum testosterone concentration is low, the serum LH concentration is usually within the normal range, which is low compared with the elevated values seen in primary hypogonadism. If fertility is an issue, the sperm count should be determined [1, 5, 30, 31].
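A minimal sketch of the pattern described above for men follows; the reference limits for testosterone and LH are assay-dependent and are supplied here as assumed parameters, and the example values are hypothetical.

# Sketch of the male gonadal axis pattern described above: a repeatedly low
# 8-10 AM serum testosterone with an LH that is not elevated suggests secondary
# (central) hypogonadism, whereas an elevated LH points to primary hypogonadism.

def male_gonadal_axis_pattern(testosterone: float, lh: float,
                              testosterone_low: float, lh_high: float) -> str:
    """Classify morning testosterone/LH values against assay-specific limits."""
    if testosterone >= testosterone_low:
        return "eugonadal pattern"
    if lh > lh_high:
        return "pattern consistent with primary hypogonadism"
    return "pattern consistent with secondary (central) hypogonadism"

# Example with hypothetical limits (testosterone lower limit 10 nmol/L,
# LH upper limit 9 IU/L):
print(male_gonadal_axis_pattern(testosterone=6.5, lh=3.0,
                                testosterone_low=10.0, lh_high=9.0))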

Dynamic testing, the GnRH test in particular, is generally not necessary in adults because it does not add diagnostic accuracy. The GnRH test may be useful in adolescents when the differential diagnosis between hypogonadism and pubertal delay is difficult [1, 30, 31].

Somatotroph axis

The somatotroph (GH-IGF-I) axis needs to be evaluated by a stimulation test, unless all other pituitary axes are deficient and insulin-like growth factor-1 (IGF-1) is low [1, 55, 56]. Some studies show that a low age-related IGF-I level may be diagnostic per se only if all other anterior pituitary hormones are already compromised [57, 58]. Measurement of the basal serum growth hormone concentration does not reliably distinguish between normal and subnormal growth hormone secretion in adults [1, 58–61].

For the diagnosis of GH deficiency, the ITT and the GHRH + arginine test are now considered the tests of choice, with similar accuracy [58].

For the ITT, a peak GH response lower than 3 µg/L indicates severe GH deficiency in adults (provided adequate hypoglycemia is reached) [1, 40, 58].

The GH-releasing hormone plus arginine test (GHRH + ARG; 1 µg/kg GHRH iv as a bolus plus 30 g arginine infused over 30 min) is easy to perform, well tolerated, and has been shown to reliably detect severe GH deficiency in a lean adult population when a cut-off of 9 µg/L is used [62, 63]. However, the GH response to GHRH + ARG, as to all known GH provocative stimuli, declines significantly with increasing body mass index (BMI) in adults [64, 65]; thus, the use of cut-offs not related to BMI in obese subjects causes a high percentage of false positive results [29]. The study by Biller and coworkers suggested that, for the GHRH + ARG test, a cut-off level of 4.2 µg/L is indicative of GH deficiency in obese subjects [66]. Similarly, our group [67, 68], using ROC analysis, defined BMI-dependent cut-off levels both in adult and in transition-age populations. The cut-off levels suggested for lean (BMI < 25 kg/m²), overweight (BMI 25 to <30 kg/m²), and obese (BMI ≥ 30 kg/m²) adult subjects are 11.5, 8.0, and 4.1 µg/L, respectively [68]. A GH peak below these limits is indicative of GH deficiency in the appropriate clinical context. Similar results, based on waist circumference instead of BMI, were obtained by Colao et al. in a recent collaborative study [69]. Moreover, the pituitary GH response to multiple individual secretagogues is known to be decreased in aging and with elevated thyroid hormone concentrations. However, administration of arginine enhances GHRH's stimulatory effect on GH release in the elderly by fourfold and acutely evokes normal GH secretion [70, 71].
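The BMI-dependent cut-offs quoted above for the GHRH + ARG test lend themselves to a simple lookup; the sketch below uses the values cited from reference [68] and is purely didactic, with hypothetical function names.

# Illustrative lookup for the BMI-dependent GH peak cut-offs for the GHRH + ARG
# test quoted above (11.5, 8.0 and 4.1 µg/L for lean, overweight and obese
# adults, respectively [68]).

def ghrh_arg_cutoff_ug_l(bmi: float) -> float:
    """Return the GH peak cut-off (µg/L) for the GHRH + arginine test by BMI."""
    if bmi < 25:
        return 11.5   # lean
    if bmi < 30:
        return 8.0    # overweight
    return 4.1        # obese

def gh_deficient_ghrh_arg(peak_gh_ug_l: float, bmi: float) -> bool:
    """A GH peak below the BMI-adjusted cut-off suggests GH deficiency."""
    return peak_gh_ug_l < ghrh_arg_cutoff_ug_l(bmi)

# Example: a peak GH of 6 µg/L is below the cut-off in a lean subject (11.5)
# but above it in an obese subject (4.1).
print(gh_deficient_ghrh_arg(6.0, bmi=22), gh_deficient_ghrh_arg(6.0, bmi=33))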

Furthermore, there is evidence that in patients receiving chronic glucocorticoid treatment arginine can normalize the GH response, indicating that the stimulatory action of arginine and the inhibitory action of glucocorticoids on GH secretion are mediated by opposite effects on hypothalamic somatostatin tone [72]. Another potent and validated provocative test is the GHRH + GH-releasing peptide-6 (GHRP-6) test [41, 63]. It has BMI-dependent cut-offs (15 and 5 µg/L for lean subjects and for obese subjects with BMI > 35 kg/m², respectively) and shows great accuracy in distinguishing normal subjects from patients with GH deficiency [73, 74].

Unfortunately, GHRH is no longer available in the United States. Other classical stimuli such as arginine alone, clonidine, L-DOPA, and the combination of arginine and L-DOPA are much weaker, are therefore more likely to give false positive results, and are of little use in adult patients [58]. The glucagon test (GST; 1 mg glucagon im, with GH measured every 30 min up to 240 min after administration) has been proposed as an alternative because it separates GH-deficient from healthy subjects with a sensitivity and specificity of 100 % when a peak GH cut-off of 3 µg/L is used [56, 75]. The GST is a good alternative when GHRH is unavailable or the ITT is contraindicated. However, like the other tests, it is age- and BMI-dependent [70] and more time-consuming than other stimulation tests [52], although revised cut-offs may need to be considered for the diagnosis of GHD [76].

Overall, for the diagnosis of adult GHD in TBI patients, it must be considered that no test for the assessment of GH deficiency is totally reliable. The likelihood of GH deficiency in TBI patients increases with the number of additional pituitary axis deficiencies [77].

Posterior pituitary

Diabetes insipidus (ADH deficiency) causes polyuria and polydipsia. Diabetes mellitus, a common cause of polyuria, should be excluded. Diabetes insipidus is likely if polyuria (>40 mL/kg body weight/day, i.e., >2500–3000 mL/day) is present in combination with urine osmolality <300 mOsm/kg and hypernatremia. It may be clearly manifest in the acute phase and then requires acute treatment. If plasma sodium levels are normal, a water deprivation test is necessary, but it should be performed later, once the patient is clinically stable and has recovered. The test should be done in an experienced center, and signs of exsiccosis should be monitored closely. In general, diabetes insipidus can be diagnosed if there is no clear increase in urine osmolality (maximum <700 mOsm/kg) or if the ratio of peak urine to plasma osmolality is <2. Glucocorticoids may suppress ADH secretion, and diabetes insipidus may therefore be precipitated by glucocorticoid replacement [1, 29, 77–80].
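The initial screening criteria above can be summarized as a simple rule; the sketch below assumes the thresholds quoted in the text plus a hypothetical hypernatremia threshold of 145 mmol/L, and is illustrative only (diabetes mellitus must be excluded separately).

# Sketch of the initial screening pattern for diabetes insipidus described above:
# polyuria (>40 mL/kg/day) together with urine osmolality <300 mOsm/kg and
# hypernatremia. The sodium threshold is an assumed reference limit.

def diabetes_insipidus_likely(urine_volume_ml_day: float,
                              body_weight_kg: float,
                              urine_osmolality_mosm_kg: float,
                              plasma_sodium_mmol_l: float) -> bool:
    """Return True if the basic screening pattern for diabetes insipidus is met."""
    polyuria = urine_volume_ml_day / body_weight_kg > 40
    dilute_urine = urine_osmolality_mosm_kg < 300
    hypernatremia = plasma_sodium_mmol_l > 145  # assumed upper reference limit
    return polyuria and dilute_urine and hypernatremia

# Example: 3500 mL/day in a 70 kg patient (50 mL/kg/day), urine osmolality
# 180 mOsm/kg, sodium 148 mmol/L -> screening pattern positive.
print(diabetes_insipidus_likely(3500, 70, 180, 148))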

Conclusions

In conclusion, hypopituitarism is a common, potentially serious but treatable complication of TBI, and may also occur in a smaller but significant number of patients affected by SAH and IS. An increased level of awareness among physicians of all disciplines involved in the care of patients with TBI, SAH, and IS is required to identify affected cases and provide appropriate and timely hormone replacement therapy, which has the potential to improve recovery, rehabilitation, and quality of life for these patients.