Introduction

Hepatocellular carcinoma (HCC) is the fifth most frequently diagnosed cancer and the third leading cause of cancer-related mortality worldwide, and liver transplantation (LT) has been accepted as an effective therapy for HCC [1, 2]. According to the widely used Milan criteria, HCC patients with up to three lesions each no larger than 3 cm, or with a solitary lesion no greater than 5 cm in diameter, are candidates for LT [3, 4]. However, the Milan criteria may be too strict; some patients with HCC who exceed the criteria but could be successfully treated with LT may be unnecessarily excluded. In recent years, many extensions of the Milan criteria have been proposed, and these extended criteria have been shown to achieve outcomes similar to those of the Milan criteria [3,4,5,6]. Because tumor size and number alone are insufficient to reflect the biological features of HCC and to predict mortality, biological markers such as α-fetoprotein (AFP), des-gamma-carboxy prothrombin (DCP), and the neutrophil-to-lymphocyte ratio (NLR) have been used to select LT candidates beyond the Milan criteria. According to the Kyushu University criteria, for example, HCC patients with tumor size ≤5 cm or serum DCP ≤300 mAU/mL are candidates for LT [3, 4].

Sharma et al. [7] recently demonstrated that low bone mineral density (BMD), a surrogate marker of bone loss, was independently associated with post-LT mortality, and suggested that bone loss may be an early marker of deconditioning that precedes sarcopenia [7]. They analyzed BMD by measuring the average pixel density of trabecular bone in the thoracic vertebrae on preoperative enhanced computed tomography (CT). We previously reported that sarcopenia is a predictor of poor prognosis after living donor liver transplantation (LDLT) in patients with end-stage liver disease [8]; however, in patients with HCC, we did not find any correlation between sarcopenia and prognosis after LDLT. Moreover, the relationship between sarcopenia and osteopenia, both of which may be common in recipients with cirrhosis, has not yet been clarified.

To the best of our knowledge, little evidence has been published about the prognostic value of osteopenia in HCC patients undergoing LDLT, in contrast to the abundant information on previously reported biological markers such as AFP, DCP, and NLR. Therefore, the aim of the present study was to clarify the prognostic impact of osteopenia as a risk factor for post-LDLT mortality in patients with HCC.

Patients and methods

Patient characteristics

A total of 522 adult patients who underwent LDLT at Kyushu University Hospital (Fukuoka, Japan) between January 1998 and December 2015 were enrolled in the study. Among them, 193 recipients underwent LDLT for end-stage liver disease with HCC between 2001 and 2015. Our selection criteria for LDLT in HCC patients were as follows: (1) no treatment modality other than LDLT available to cure the patient; (2) no extrahepatic metastasis; and (3) no major vascular infiltration [4]. There were no restrictions on tumor size, number of tumors, or pretransplant treatment. Since defining the Kyushu University criteria, we have not performed LDLT in HCC patients with both tumor size greater than 5 cm and DCP levels greater than 300 mAU/mL [3]. Pretransplant imaging was used to estimate the maximum tumor size, number of tumors, and tumor locations [9]. Biological surrogate markers, such as AFP, DCP, and NLR, as well as sarcopenia, as an indicator of patient status, were assessed before LDLT. Grafts were selected according to previous reports [10, 11]. Briefly, a right lobe graft was procured if the graft volume to recipient standard liver volume (GV/SLV) ratio of the extended left lobe with caudate lobe was ≤35% and the preoperatively predicted remnant liver volume of the donor was ≥35%.

All LDLT procedures were performed after obtaining full informed consent from all patients and approval by the Liver Transplantation Committee of Kyushu University. The study protocol conformed to the ethical guidelines of the 1975 Declaration of Helsinki, and all procedures involving human participants were performed in accordance with the ethical standards of the institutional review board/ethics committee.

Measurement of bone mineral density and definition of osteopenia

BMD was measured in trabecular bone by calculating the average pixel density within a circle placed in the midvertebral core at the inferior level of the 11th thoracic vertebra (Th11) on preoperative enhanced CT, as previously described [7] (Fig. 1a). BMD was significantly inversely correlated with age (r = −0.435, P < 0.001) (Fig. 1b); therefore, osteopenia was defined as an actual BMD below the age-adjusted standard BMD, calculated according to a previous report [12] as 308.82 − 2.49 × age in men and 311.84 − 2.41 × age in women.
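
As a minimal illustration of this definition, the age- and sex-specific standard BMD and the resulting osteopenia classification could be computed as in the following R sketch; the function and variable names are ours and purely illustrative.

```r
# Illustrative sketch of the age-adjusted standard BMD and the osteopenia
# classification described above; names are ours, not from the original analysis.
standard_bmd <- function(age, sex) {
  # Standard BMD (HU) from the cited formulas [12]
  ifelse(sex == "male", 308.82 - 2.49 * age, 311.84 - 2.41 * age)
}

is_osteopenic <- function(measured_bmd_hu, age, sex) {
  # Osteopenia: measured trabecular BMD at Th11 below the age/sex-specific standard
  measured_bmd_hu < standard_bmd(age, sex)
}

# Example: a 60-year-old man with a measured BMD of 140 HU
is_osteopenic(140, 60, "male")  # TRUE (standard BMD ~159.4 HU)
```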

Fig. 1
figure 1

a Measurement of trabecular bone mineral density (BMD) within a circle placed in the midvertebral core at the inferior level of the 11th thoracic vertebra (Th11). b Relationship between measured BMD and age in the recipients (n = 193). c Delta BMD values (actual BMD minus standard BMD) in each patient, including 151 patients who survived (gray) and 42 who died (black) during the follow-up period. d Overall survival after LDLT for hepatocellular carcinoma in recipients with osteopenia (n = 103) or without osteopenia (n = 90). BMD bone mineral density; HU Hounsfield units

Definition of sarcopenia

Skeletal muscle area was measured as previously described [13, 14]. In brief, the cross-sectional areas (cm2) of the skeletal muscles at the third lumbar vertebra (L3) level were measured by manual outlining on CT images. The formulas for the standard skeletal muscle area were 126.9 × body surface area (BSA) − 66.2 in men and 125.6 × BSA − 81.1 in women [13]. Using these formulas, sarcopenia was defined as an actual skeletal muscle area below 80% of the calculated standard skeletal muscle area.
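
A corresponding sketch of the sarcopenia definition, again with illustrative names and assuming BSA in m2 and muscle area in cm2, is shown below.

```r
# Illustrative sketch of the sarcopenia definition described above;
# function and variable names are ours. BSA in m2, muscle area in cm2.
standard_muscle_area <- function(bsa, sex) {
  # Standard L3 skeletal muscle area (cm2) from the cited formulas [13]
  ifelse(sex == "male", 126.9 * bsa - 66.2, 125.6 * bsa - 81.1)
}

is_sarcopenic <- function(measured_area, bsa, sex) {
  # Sarcopenia: measured area below 80% of the standard area
  measured_area < 0.80 * standard_muscle_area(bsa, sex)
}

# Example: a man with BSA 1.7 m2 and a measured L3 muscle area of 110 cm2
is_sarcopenic(110, 1.7, "male")  # TRUE (cutoff ~119.6 cm2)
```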

Postoperative management

The graft harvesting technique, recipient surgery, and perioperative management of recipients, including immunosuppression regimens, have been previously described [15]. Immunosuppression was initiated with a protocol based on tacrolimus (Prograf; Astellas Pharma Inc., Tokyo, Japan) or cyclosporine A (Neoral; Novartis Pharma K.K., Tokyo, Japan), combined with steroid and/or mycophenolate mofetil (MMF; Chugai Pharmaceutical, Tokyo, Japan) [16]. The target trough level for tacrolimus was 10 ng/mL for the first 3 months after LDLT and 5–10 ng/mL thereafter. The target trough level for cyclosporine A was 250 ng/mL for the first 3 months after LDLT and 150–200 ng/mL thereafter. Methylprednisolone was initiated on the day of LDLT, then tapered and converted to prednisolone 7 days after LDLT. Prednisolone was tapered and discontinued 6 months after LDLT. MMF was used in 167 recipients (86.5%); it was started at 1000 mg/day on the day after LDLT, then tapered and discontinued by 6 months after LDLT. All patients had monthly follow-up visits. The median follow-up period was 2518 days (25th–75th percentiles, 976–3828 days).

Post-LDLT mortality and risk factors

The primary endpoint of this study was recipient death after LDLT. All patients underwent abdominal CT every 3 months, and chest CT and bone scintigraphy every 6 months, for 5 years after LDLT. Overall survival (OS) was defined as the period between LDLT and death. Univariate and multivariate analyses were performed to identify factors associated with OS after LDLT.

Statistical analysis

All statistical analyses were performed with JMP statistical software version 13 (SAS Institute Inc., Cary, NC, USA) and R (version 3.2.1). Continuous variables were expressed as mean ± standard deviation and compared using the nonparametric Wilcoxon test for independent samples. The chi-square test was used to compare categorical variables. A Cox proportional hazards regression model was applied for the multivariate analyses [4, 16]. Survival was calculated with the Kaplan–Meier product-limit method, and differences in survival between groups were compared with the log-rank test. A nomogram was generated from the coefficients of the multivariable Cox regression model using the rms package in R [17]. Calibration curves were constructed by plotting the observed survival rates against the nomogram-predicted probabilities, and the concordance index (C-index) was calculated by a bootstrap method with 1000 resamples. P values < 0.05 were considered significant.
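
For illustration, the Kaplan–Meier, log-rank, and Cox analyses described above could be run in R with the survival package roughly as follows; the data frame d and its column names are hypothetical.

```r
# Illustrative sketch of the survival analyses described above, using the
# survival package. The data frame `d` and its columns (os_days, death,
# osteopenia, tumors_ge5, dcp_high, nlr_high) are hypothetical.
library(survival)

# Kaplan-Meier estimate and log-rank test by osteopenia status
fit_km <- survfit(Surv(os_days, death) ~ osteopenia, data = d)
survdiff(Surv(os_days, death) ~ osteopenia, data = d)

# Multivariable Cox proportional hazards model
fit_cox <- coxph(Surv(os_days, death) ~ osteopenia + tumors_ge5 +
                   dcp_high + nlr_high, data = d)
summary(fit_cox)  # hazard ratios with 95% confidence intervals
```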

Results

Characteristics of recipients, donors, and tumors

We focused only on data available before transplantation, because decisions on transplant candidacy must be made using pretransplant data. The characteristics of the recipients and donors in this study are shown in Table 1. The median BMD was 163.6 Hounsfield units (HU), and osteopenia was diagnosed in 103 recipients (53.4%) according to the calculated standard BMD values. BMD values in patients with osteopenia were significantly lower than those in patients without osteopenia (126.3 ± 32.5 vs. 206.4 ± 30.5 HU, P = 0.0001). Between the two groups, there were no significant differences in recipient, donor, or tumor-related factors, except for the rate of sarcopenia, which was significantly higher in patients with osteopenia than in those without (34.3% vs. 21.1%, P = 0.031).

Table 1 Baseline characteristics of recipients and donors (n = 193)

Of the 193 patients, 54 (28.0%) had major postoperative complications within 1 month after LDLT, including sepsis (n = 8), graft dysfunction/small-for-size syndrome (n = 6), postoperative bleeding requiring surgery (n = 6), bile leakage (n = 4), pancreatic fistula due to splenectomy (n = 7), acute rejection (n = 4), portal vein thrombosis (n = 2), and others (n = 17). The rate of postoperative complications in patients with osteopenia was significantly higher than that in patients without osteopenia (35.0% vs. 20.0%, P = 0.020).

According to ROC analysis for mortality after LDLT, the cutoff value for NLR was determined to be 3.01, and 66 of 193 recipients (34.2%) had NLR ≥ 3.01. The cutoff values for AFP and DCP were set at 500 ng/mL and 200 mAU/mL, respectively, in the same manner [4]; 26 of 193 recipients (13.5%) had AFP > 500 ng/mL and 37 (19.3%) had DCP > 200 mAU/mL.
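
As an illustration of how such a cutoff can be derived, the following R sketch uses the pROC package and the Youden index; both are assumptions on our part, since the text does not specify the software or optimality criterion used.

```r
# Illustrative sketch of deriving the NLR cutoff by ROC analysis for post-LDLT
# mortality. pROC and the Youden index are assumptions; the original analysis
# may have used different software or criteria. Columns of `d` are hypothetical.
library(pROC)

roc_nlr <- roc(response = d$death, predictor = d$nlr)
coords(roc_nlr, x = "best", best.method = "youden",
       ret = c("threshold", "sensitivity", "specificity"))
```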

Impact of osteopenia for post-LDLT survival

The 1-, 5-, and 10-year OS rates of the enrolled recipients with HCC were 91.7%, 81.3%, and 76.8%, respectively. Among the 193 recipients, 42 died after LDLT (Fig. 1c). The OS of recipients with osteopenia was significantly lower than that of recipients without osteopenia (HR 2.297, 95% CI 1.218–4.578, P = 0.010; 5-year survival, 74.1% vs. 88.6%) (Fig. 1d). Collectively, these results indicate that the presence of osteopenia, along with that of sarcopenia, is significantly associated with poor prognosis after LDLT for HCC.

Independent risk factors for post-LDLT mortality

Univariate Cox proportional hazards regression analysis for postoperative mortality identified the following risk factors after LDLT: use of cyclosporine A (P = 0.019), osteopenia (P = 0.010), number of nodules ≥5 (P = 0.001), bilobar tumor distribution (P = 0.011), AFP > 500 ng/mL (P = 0.045), DCP > 200 mAU/mL (P = 0.001), and NLR ≥ 3.01 (P = 0.039) (Table 2). Multivariate Cox proportional hazards regression analysis using these seven factors revealed that osteopenia [hazard ratio (HR) 2.106, 95% CI 1.102–4.241, P = 0.024] and three tumor-related factors, namely number of tumors ≥5 (HR 2.521, 95% CI 1.113–5.743, P = 0.027), DCP > 200 mAU/mL (HR 2.678, 95% CI 1.342–5.168, P = 0.006), and NLR ≥ 3.01 (HR 2.068, 95% CI 1.099–3.868, P = 0.025), were independent risk factors for mortality after LDLT (Table 2). These findings indicate that osteopenia is a novel prognostic biomarker for mortality after LDLT in HCC patients.

Table 2 Risk factors for postoperative mortality (n = 193)

Clinical significance of osteopenia for post-LDLT mortality

Using these four independent risk factors, the cohort was classified into two groups. The OS of patients who met two or more factors (n = 75) was significantly lower than that of patients who met one factor or none (n = 118) (HR 5.382, 95% CI 2.565–10.489, P < 0.001). Similarly, the OS of patients who met three or more factors (n = 17) was significantly lower than that of the remaining patients (n = 176) (HR 5.878, 95% CI 2.994–12.613, P < 0.001) (Fig. 2).
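
The stratification itself is simple arithmetic; a sketch of the risk-factor count and the corresponding log-rank comparisons, with hypothetical 0/1 indicator columns and the survival package loaded as above, is given below.

```r
# Illustrative sketch of the risk-factor count used to stratify the cohort;
# the 0/1 indicator columns for the four independent factors are hypothetical.
d$risk_count <- with(d, osteopenia + tumors_ge5 + dcp_high + nlr_high)

# Overall survival by number of factors met (>= 2 vs. 0-1, and >= 3 vs. 0-2)
d$two_or_more   <- d$risk_count >= 2
d$three_or_more <- d$risk_count >= 3
survdiff(Surv(os_days, death) ~ two_or_more, data = d)
survdiff(Surv(os_days, death) ~ three_or_more, data = d)
```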

Fig. 2
figure 2

a Comparison of overall survival between patients who met two or more factors (n = 75) and patients who met one factor or none (n = 118). b Comparison of overall survival between patients who met three or more factors (n = 17) and patients who met two factors or fewer (n = 176). CI confidence interval; HR hazard ratio

From a clinical standpoint, a nomogram to predict OS was generated from the coefficients of the multivariable Cox regression model (Fig. 3a). The predictors were the four factors identified above: DCP > 200 mAU/mL, NLR ≥ 3.01, number of tumors ≥5, and osteopenia. The calibration plot showed that 5-year OS was predicted well, with a C-index of 0.746 (95% CI 0.707–0.785) (Fig. 3b). Collectively, these data demonstrate that osteopenia, together with the other tumor-related variables (high DCP, high NLR, and a large number of tumors), can be used clinically to predict mortality after LDLT.
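
For reference, a nomogram and calibration curve of this kind could be generated with the rms package roughly as follows; the data frame d, its column names, and the parameter choices (for example, the group size m = 40) are illustrative rather than a record of the actual analysis.

```r
# Illustrative sketch of the nomogram and calibration described above, using
# the rms package; the data frame `d` and its columns are hypothetical.
library(rms)
dd <- datadist(d); options(datadist = "dd")

fit <- cph(Surv(os_days, death) ~ osteopenia + tumors_ge5 + dcp_high + nlr_high,
           data = d, x = TRUE, y = TRUE, surv = TRUE, time.inc = 365 * 5)

# Nomogram expressing the linear predictor as 5-year overall survival
surv5 <- Survival(fit)
nom <- nomogram(fit, fun = function(lp) surv5(365 * 5, lp),
                funlabel = "5-year overall survival")
plot(nom)

# Calibration curve at 5 years with 1000 bootstrap resamples
cal <- calibrate(fit, u = 365 * 5, cmethod = "KM", m = 40, B = 1000)
plot(cal)

# Optimism-corrected discrimination: C-index = Dxy/2 + 0.5
validate(fit, B = 1000)
```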

Fig. 3
figure 3

a Nomogram to predict overall survival after LDLT, based on the multivariable Cox regression model using the four predictors: DCP, NLR, number of tumors, and osteopenia. b Calibration curves of the nomogram to predict overall survival after LDLT. Actual overall survival is plotted on the y axis; nomogram-predicted probability is plotted on the x axis. DCP des-gamma-carboxy prothrombin; NLR neutrophil-to-lymphocyte ratio; LDLT living donor liver transplantation

Discussion

This is the first report to demonstrate the impact of osteopenia on post-LDLT mortality and to compare this risk factor with other preoperative predictors of prognosis, including AFP, DCP, NLR, and sarcopenia as a measure of patient status. In multivariate analysis using a Cox proportional hazards regression model in this large cohort, osteopenia was independently associated with poor prognosis after LDLT for HCC. Moreover, a risk score and a nomogram with calibration curves were developed to confirm the clinical usefulness of osteopenia in post-LDLT patients.

Osteopenia and osteoporosis are generally defined by dual-energy X-ray absorptiometry (DEXA) of the spine or proximal femur, the standard method; these values correlate with fracture risk and treatment efficacy [18, 19]. Pickhardt et al. [20] compared CT-derived BMD assessment with DEXA screening and reported that a threshold of 160 HU or less on CT had 90% sensitivity, and a threshold of 110 HU had greater than 90% specificity, for differentiating osteoporosis from osteopenia and normal BMD. In our cohort of 193 Japanese cirrhotic patients, there were no significant differences in post-LDLT survival using a cutoff value of 110 HU (n = 30, 15.5%) or 160 HU (n = 93, 48.2%) for osteopenia. However, with the age-specific cutoff value [12], recipients meeting the newly defined criteria for osteopenia (n = 103, 53.4%) had a poorer prognosis after LDLT in the present study. BMD naturally correlates strongly with age [21]. Other parameters that generally influence osteopenia include sex; medications; comorbidities, including liver disease; heredity; occupation; and lifestyle, including diet [22]. Therefore, adjusting the standard BMD for age is appropriate for patients who have other diseases associated with osteopenia, such as HCC recipients with cirrhosis. Considering the correlation between osteopenia and poor prognosis, early interventions such as rehabilitation and medical therapy to improve osteopenia or to prevent further bone loss may improve postoperative survival after LDLT.

An interesting point is that there was no significant difference in recurrence-free survival between recipients with and without osteopenia (Supplementary Fig. S1A). Sharma et al. [7] likewise demonstrated that osteopenia was independently associated with post-LT mortality in HCC patients but not with recurrence, consistent with the present study. They also pointed out that, among patients with osteopenia, fewer of those who died had received locoregional therapy than of those who survived after LT. Given that osteopenia is significantly related to mortality but not to recurrence after LT, the number of effective treatments, such as locoregional HCC therapy or hepatectomy, that patients can undergo may be related to osteopenia; this should be examined precisely in future studies.

In patients with HCV-related cirrhosis (n = 135), the OS of recipients with osteopenia was significantly lower than that of recipients without osteopenia (HR 2.196, 95% CI 1.005–4.800, P = 0.040) (Supplementary Fig. S1B); however, in recipients with HBV-related cirrhosis or cryptogenic cirrhosis, there was no significant difference in OS between recipients with and without osteopenia. Regarding the relationship between osteopenia and mortality stratified by Model for End-Stage Liver Disease (MELD) score, there was no significant difference in OS between recipients with and without osteopenia among patients with high MELD scores (P = N.S.); however, among patients with low MELD scores, the OS of recipients with osteopenia was significantly lower than that of recipients without osteopenia (HR 2.391, 95% CI 1.214–4.712, P = 0.012) (Supplementary Fig. S1C). Taken together, these results suggest that osteopenia is a feasible predictor of post-LDLT mortality in patients with low MELD scores, whereas prognostic factors other than osteopenia have a stronger impact in patients with an aggravated general condition, such as those with high MELD scores.

It is not clear whether bone loss, as reflected by osteopenia, promotes cancer development, or whether stable bone density prevents cancer invasion; the underlying biological mechanisms have not been investigated [23, 24]. A key point is to predict precisely before LT which patients with osteopenia will have a good prognosis and which patients without osteopenia will have a poor prognosis. Therefore, as reported recently [25, 26], it is necessary to investigate whether osteopenia-related biomarkers, such as circulating serum microRNAs, are feasible prognostic predictors even in LT recipients with severe liver dysfunction.

Sarcopenia is common among cirrhotic recipients and is correlated with poor prognosis after LDLT [8, 27]. In this cohort of 193 recipients with HCC, osteopenia had a significant association with post-LDLT mortality, whereas sarcopenia alone did not. Pereira et al. [28] proposed that the bone loss of osteopenia may become apparent before the loss of muscle mass, suggesting that a decrease in BMD may be an early marker of the muscle loss of sarcopenia. LDLT candidates with HCC have better liver function and lower MELD scores than cirrhotic recipients without HCC, suggesting relatively preserved daily activity and muscle strength. In contrast, both osteopenia and sarcopenia are prominent in patients with decompensated end-stage liver disease. Therefore, although not conclusive, osteopenia may have a closer association with poor prognosis than sarcopenia in HCC recipients, as observed in the present cohort. The combination of osteopenia and sarcopenia ("double penia", n = 34) was not significantly correlated with mortality after LDLT for HCC, possibly because of the small number of such patients. However, recipients who had either osteopenia or sarcopenia (n = 120) showed a significant association with post-LDLT mortality (HR 2.006, 95% CI 1.007–3.998, P = 0.038) (Supplementary Fig. S1D). Taken together, both osteopenia and sarcopenia may be predictors of post-LDLT mortality in recipients with end-stage liver disease, with or without HCC. Hanada et al. [29] demonstrated in a phase-II single-blinded randomized controlled trial that neuromuscular electrical stimulation of the quadriceps muscle in recipients maintained quadriceps muscle thickness after surgery. Improving sarcopenia may also help reverse osteopenia. Therefore, we are now examining whether this electrical stimulation improves not only sarcopenia but also osteopenia. Larger-scale studies are needed to evaluate the effect of neuromuscular electrical stimulation and to determine how to incorporate this intervention into a comprehensive physical therapy program.

In conclusion, in addition to a high tumor burden, as indicated by high DCP levels, a large number of tumors, and a high NLR, osteopenia before LDLT is an important predictor of post-LDLT mortality in recipients with end-stage liver disease and HCC. Improving osteopenia with preoperative rehabilitation or medical therapy may improve post-LDLT survival. This subject warrants further research.