Introduction

Acute lymphoblastic leukemia (ALL) is a hematologic malignancy characterized by impaired differentiation, proliferation, and accumulation of leukemic cells in the bone marrow and/or extramedullary sites [1]. Although the majority of adult ALL patients (80%) achieve complete remission (CR) with standard chemotherapy, long-term survival is approximately 40%. Relapse has been attributed to residual leukemic cells that persist after morphologic CR but escape detection by conventional morphologic assessment, termed minimal residual disease (MRD) [2, 3•].

Over the last decades, the concept of high-risk disease in adults with ALL has evolved. Some classical high-risk features, such as age, gender, white blood cell count, and central nervous system disease, are now considered less relevant or irrelevant, whereas genetic and molecular characteristics and MRD status are considered more powerful predictors of disease-free survival (DFS) and overall survival (OS) [4]. MRD evaluated at various time points during treatment serves as a prognostic factor in the therapy decision-making process. Patients who achieve CR1 but remain MRD positive are considered to have high-risk disease and can benefit from intensified therapy [5, 6••, 7]. On the other hand, MRD positivity after transplantation identifies, with high sensitivity and specificity, a group with the highest incidence of relapse and dismal long-term outcomes [8]. However, many questions and challenges remain regarding the best intervention in these settings, including the therapeutic role of allogeneic hematopoietic stem cell transplantation (allo-HSCT). The purpose of this review is to discuss the potential role of allo-HSCT for Philadelphia chromosome-negative (Ph-negative) adult ALL in CR1 in the era of MRD.

MRD Definition and Techniques

Minimal residual disease is defined as a level of disease that is undetectable by conventional cytomorphologic techniques and is not accompanied by clinical symptoms [9]. The application of MRD diagnostics in acute leukemia has expanded worldwide and, although the evidence is stronger in children, most adult ALL patients are now monitored with MRD techniques to assess treatment effectiveness and to define MRD-based risk groups according to their risk of relapse [10]. Assessment of MRD can be performed by three methods: (a) multiparameter flow cytometry (MFC), (b) real-time quantitative polymerase chain reaction (RQ-PCR), and (c) next-generation sequencing (NGS) [7].

In more than 90% of patients, leukemia-associated immunophenotypes can be detected by MFC, derived from the presence of aberrant and/or asynchronous markers on malignant cells. The sensitivity of this technique is 0.1–0.001% (10−3–10−5), depending on the characteristics of the flow cytometer (lower sensitivity with four- to six-color approaches) [7, 11, 12]. An advantage of MFC over RQ-PCR is that it is widely available and more affordable; its main limitation is that an experienced operator is required to provide an accurate result [13, 14, 15•].

RQ-PCR detects specific fusion genes (e.g., BCR-ABL) in a minority of patients, but can generate sensitive probes in more than 90% of ALL patients by detecting clonal rearrangements of immunoglobulin and T cell receptor genes, with a sensitivity of 0.01–0.001% (10−4–10−5) [13, 14]. The use of this technique is supported by its extraordinary sensitivity and its extensively optimized and standardized methodology; unfortunately, it is costly and time-consuming, as it requires sequencing of diagnostic DNA, identification of suitable rearrangements (often more than one), synthesis of corresponding primers, and development of optimal PCR conditions for each rearrangement [14].

Some groups have recently focused on the development of NGS-based MRD assays, which show high sensitivity (10−6) for disease detection when an adequate number of cells is analyzed and offer the advantage of being less laborious and time-consuming than RQ-PCR. However, standardization and validation of NGS-based MRD are still in progress [9].

The current recommendation is to measure MRD by either MFC or RQ-PCR with a sensitivity of at least 0.01% (10−4). Bone marrow is the preferred sample source because, in B-cell ALL, MRD results in peripheral blood do not correlate well with those in bone marrow [10].
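
The sensitivity figures quoted for these assays convert directly between percentage notation and cell fractions; the following worked conversion (provided for orientation only, with the assay annotations drawn from the ranges cited above) makes the correspondence explicit:

\[
\begin{aligned}
10^{-3} &= 0.1\%   &&\approx 1 \text{ leukemic cell per } 10^{3} \text{ nucleated cells (four- to six-color MFC)}\\
10^{-4} &= 0.01\%  &&\approx 1 \text{ per } 10^{4} \text{ (recommended minimum for MFC or RQ-PCR)}\\
10^{-5} &= 0.001\% &&\approx 1 \text{ per } 10^{5} \text{ (high-sensitivity end of MFC and RQ-PCR)}\\
10^{-6} &= 0.0001\% &&\approx 1 \text{ per } 10^{6} \text{ (NGS-based assays)}
\end{aligned}
\]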

Allo-HSCT in the Era of MRD

Improvement in long-term survival of adults with ALL in recent years has been achieved by moving induction chemotherapy toward pediatric-inspired protocols [12, 16,17,18]. However, in a well-selected group of patients, allo-HSCT remains the consolidation therapy of choice to achieve better OS by preventing relapse [19,20,21]. The historically prohibitive transplant-related mortality (TRM) of allo-HSCT in adults has been mitigated by the use of non-myeloablative and reduced-intensity conditioning regimens, and by the use of MRD as a decision-making factor to determine whether therapy intensity should be reduced or escalated, even up to allo-HSCT, in a given patient [3•, 22].

The advantage of allo-HSCT over conventional post-induction chemotherapy resides in its additional immunotherapeutic effect, mediated by allogeneic effectors that can overcome the chemoresistance intrinsic to some leukemic clones. Although the graft-versus-leukemia (GVL) effect in ALL has long been an area of controversy, studies over the last 20 years support that it plays a major role in reducing the risk of relapse. An exact estimate of the GVL effect is difficult to obtain; however, the lower relapse rates observed in patients undergoing allo-HSCT compared with autologous HSCT and/or intensive chemotherapy, and in patients developing chronic graft-versus-host disease (cGVHD), suggest that these mechanisms provide additional protection against relapse (Table 1) [23,24,25].

Table 1 Effect of cGVHD on relapse and survival of ALL patients undergoing allo-HSCT

Currently, MRD measurement is one of the most important parameters for decision-making in adult ALL patients in CR1 referred for allo-HSCT evaluation. The time points for MRD testing vary and are population and protocol dependent. Most adult protocols agree on three determining time points according to their prognostic impact: (a) after induction or early consolidation (weeks 12–22), to select high-risk patients who should undergo allo-HSCT; (b) at the time of allo-HSCT, since MRD-positive (MRD+) patients have a higher risk of relapse; and (c) after allo-HSCT, to define a population in which an immediate intervention to prevent relapse should be incorporated [26].

MRD and Allo-HSCT Choice

It is still not well defined how MRD status should be used to guide decisions regarding allo-HSCT for ALL. While MRD-negative (MRD−) status after consolidation chemotherapy alone has been associated with good long-term results, high-risk ALL patients, defined as having persistent MRD+ (≥ 10−4) at week 16, have a high risk of relapse and a limited likelihood of achieving molecular CR with conventional treatment, and therefore benefit from more intensive therapies such as transplantation. Several treatment protocols, including GMALL 06/99, GMALL 07/03, GRAALL, PETHEMA ALL-AR 03, and NILG ALL, have reported data suggesting that allo-HSCT may override the adverse prognostic impact of MRD status, as illustrated by the sketch below and by the studies that follow.
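
To make the decision logic concrete, the following minimal Python sketch is purely illustrative: the function name and structure are hypothetical, and the single ≥ 10−4 week-16 threshold cited above stands in for the full, protocol-specific criteria used by the groups discussed below.

# Illustrative sketch only: classifies a post-consolidation MRD result against the
# >= 1e-4 (0.01%) week-16 threshold discussed above. Real protocols (GMALL, GRAALL,
# PETHEMA, NILG) use additional time points, thresholds, and clinical criteria.

def classify_week16_mrd(mrd_fraction: float, assay_sensitivity: float = 1e-4) -> str:
    """Return a coarse risk label from a week-16 MRD measurement.

    mrd_fraction: residual leukemic cells as a fraction of nucleated cells
                  (e.g., 0.0005 for 5 x 10^-4).
    assay_sensitivity: detection limit of the assay (default 10^-4, the
                       recommended minimum for MFC or RQ-PCR).
    """
    if mrd_fraction >= 1e-4:
        # Persistent MRD >= 10^-4 at week 16: high relapse risk; the studies cited
        # here suggest such patients may benefit from intensification such as allo-HSCT.
        return "high risk (MRD+)"
    if mrd_fraction < assay_sensitivity:
        # Below the assay's detection limit: reported as MRD-negative.
        return "MRD-negative at assay sensitivity"
    return "low-level MRD below 10^-4"

# Example: an MRD level of 5 x 10^-4 at week 16 would be flagged as high risk.
print(classify_week16_mrd(5e-4))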

The GMALL 06/99 study identified distinct risk groups according to MRD status at different time points (N = 148). Patients categorized as low risk, with MRD < 0.01% on days 11 and 24 (10%), had 3-year OS and DFS of 100%. In contrast, patients in the high-risk group, with MRD ≥ 0.01% persisting through week 16 (23%), had a 3-year DFS of 6% and a 3-year OS of 45% [13]. The prospective GMALL 06/99 and 07/03 studies (N = 580) evaluated the potential advantage of intensifying treatment, including allo-HSCT, based on post-consolidation MRD status. MRD− status after consolidation, compared with MRD+ status, was associated with higher 5-year DFS (67 versus 25%; p < .0001) and 5-year OS (80 versus 42%; p < .0001). Patients with molecular failure despite morphologic CR who underwent allo-HSCT (n = 57) had better 5-year DFS (63 versus 44%; p < .0001) and showed a trend toward higher 5-year OS (54 versus 33%; p = .06). Among patients in the non-allo-HSCT group, the median time from MRD detection to clinical relapse was 8 months [5].

The PETHEMA ALL-AR 03 trial (N = 326) evaluated treatment of high-risk Ph-negative ALL in adolescents and adults (15–60 years) according to early cytological response and MRD by MFC. Good early cytological response was defined as < 10% blasts on day 14 of induction, and MRD negativity was defined as < 5 × 10−4 at the end of early consolidation (weeks 16–18). One hundred seventy-nine patients (76%) achieved CR1, completed early consolidation, and were assigned by intention-to-treat to receive allo-HSCT (n = 71) or to continue conventional chemotherapy (n = 108). Five-year DFS and OS probabilities were 37 and 35% for the whole group. Five-year DFS and OS for patients assigned to allo-HSCT versus those assigned to chemotherapy were 32 versus 55% (p = .002) and 37 versus 59% (p = .002), respectively. However, the multivariable analysis showed that poor MRD clearance (> 1 × 10−3 after induction and > 5 × 10−4 after early consolidation) was an adverse prognostic factor for DFS (HR 4.49, 95% CI 1.67–12.03; p = .003) and OS (HR 4.95, 95% CI 1.82–13.40; p = .002). These results suggest that the prognosis of adolescent and adult patients with Ph-negative ALL who show a good early cytologic response and low MRD levels after early consolidation is quite favorable, with conventional chemotherapy being the consolidation therapy of choice [27].

The Italian NILG ALL 09/00 trial included a total of 304 patients with a median age of 34 years, of whom 200 patients in CR had sensitive molecular MRD probes available. The primary objective was to determine whether different post-induction MRD levels were predictive of post-transplantation outcome in MRD+ patients (> 10−3). At 6 years, DFS was improved after allo-HSCT in MRD+ patients compared with auto-HSCT (42 versus 18%; p = .035) [28].

A post hoc analysis of the GRAALL-2003/2005 trials (N = 522) included young adults (15 to 55 years) with at least one conventional high-risk feature, treated with pediatric-inspired intensive chemotherapy and planned to proceed to allo-HSCT in first remission if a donor was available. Post-induction MRD was available for 259 patients, and 282 patients underwent allo-HSCT. Among allo-HSCT recipients, MRD− patients had better DFS (HR 0.40, 95% CI 0.23–0.69; p = .001) and OS (HR 0.41, 95% CI 0.23–0.74; p = .003) than MRD+ patients [6••].

MRD at Time of Allo-HSCT

It is widely known that failure to achieve morphologic remission before HSCT is the most important factor predicting post-transplant relapse, and reinduction is therefore recommended to achieve CR before proceeding to HSCT. Recently, in the pediatric population, it has been demonstrated that persistent MRD positivity at the time of allo-HSCT is associated with a higher risk of relapse [29].

Detection of ALL MRD in adults before transplant has been less extensively analyzed. The impact of MRD on outcomes varies among reports of protocols that included adolescents and adults (Table 2). However, the two most recent studies, which included the largest cohorts of patients, confirmed that detection of residual leukemic cells by MFC just before allo-HSCT is the most significant adverse factor for OS, DFS, and relapse rate (RR) [33, 34].

Table 2 Prognostic significance of minimal residual disease at time and after allo-HSCT

MRD after Allo-HSCT

Traditionally, chimerism analysis has been the method of choice for monitoring patients after transplantation. However, recent studies suggest that MRD may perform better for detecting early relapse, as it identifies not only autologous hematopoiesis but also residual and re-emerging leukemic clones, with greater sensitivity and specificity than chimerism [37, 38]. In 2014, a comparative study of the two methods reported that patients with MRD positivity after allo-HSCT had the highest incidence of relapse, 86% (95% CI 63–100%), compared with 7% (95% CI 1–44%) in patients who remained MRD− (p = .0035); MRD positivity was also an independent prognostic factor for relapse and OS in the multivariate analysis (relapse: HR 24.64, 95% CI 1.58–384.19, p = .022; OS: HR 9.67, 95% CI 1.93–48.50, p = .006). In addition, detection of relapse by MRD had a sensitivity of 86% (95% CI 49–97%) and a specificity of 95% (95% CI 70–99%), with a median time to morphologic relapse of 173 days, longer than the intervals reported for chimerism (25 and 116 days when performed on peripheral blood and bone marrow, respectively) [8].
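
For reference, the sensitivity and specificity quoted above follow the standard definitions, applied here with post-transplant MRD positivity as the test and subsequent morphologic relapse as the outcome (a general formulation, not taken from the cited study's methods):

\[
\text{sensitivity} = \frac{\text{relapsing patients who were MRD-positive}}{\text{all relapsing patients}}, \qquad
\text{specificity} = \frac{\text{non-relapsing patients who were MRD-negative}}{\text{all non-relapsing patients}}
\]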

Relapse after allo-HSCT remains the leading cause of death in this group of ALL patients. Decisions about whether any aspect of the transplant procedure can be modified to enhance the GVL effect and/or which interventions can be added to improve outcomes can be guided by MRD status after allo-HSCT. Several studies support that MRD positivity after allo-HSCT is associated with an increased risk of relapse (Table 2), and some have reported that activation of immunological surveillance by donor T cells targeting neoplastic cell antigens or minor HLA antigens may be responsible for durable remissions after donor lymphocyte infusion, modulation of immunosuppressive therapy, and/or development of GVHD [30, 39, 40].

Conclusions

Until new targeted immune therapies demonstrate efficacy and gain approval as first-line treatment for adult patients with ALL, allo-HSCT will remain the therapy of choice to achieve lasting anti-ALL immune responses. Pre- and post-transplant measurement of MRD is widely applicable and predictive of outcomes. Minimal residual disease positivity identifies a high-risk group of patients, as it reflects chemoresistant clones that confer an increased risk of relapse. Further well-controlled studies are needed to better categorize risk groups and to establish pathways to treat ALL more effectively and improve long-term outcomes.