Hematopoietic stem cells (HSCs) may be obtained by collection of bone marrow, mobilization and collection of peripheral blood stem cells (PBSCs), or collection of umbilical cord blood (UCB). The choice of HSC product depends upon the disease being treated, the availability of a suitable donor, and, to a certain extent, donor size. Donors of either PBSC or marrow rarely develop serious complications, although there are qualitative differences in the types of adverse events associated with either procedure (Rowley et al. 2001).

2.1 Autologous Hematopoietic Stem Cells

Autologous HSCs provide a readily obtainable reservoir of cells for reconstitution of hematopoietic function after high-dose therapy without risk of graft-versus-host disease (GVHD). The indications for autologous hematopoietic cell transplant (HCT) are limited to high-risk lymphoma or certain solid tumors, such as high-risk neuroblastoma. Autologous HSCs may also serve as the source of cells for gene transfer for correction of single-gene defects, such as X-linked severe combined immunodeficiency (SCID) or Fanconi anemia (Cavazzana-Calvo et al. 2000; Tolar et al. 2011). Autologous grafts have the potential disadvantage of tumor cell contamination and lack immune-mediated graft-versus-tumor effects, both of which may contribute to relapse of malignancy after transplant (Rill et al. 1994). In the late 1980s, the procedure of autologous HCT was advanced significantly by the use of hematopoietic growth factors to mobilize large numbers of HSCs into the circulation, where they could be collected from the peripheral blood. PBSC has largely replaced marrow as the preferred product for reconstitution of autologous hematopoiesis because recovery of peripheral blood counts is more rapid. Consequently, compared to marrow recipients, PBSC recipients require fewer platelet and red cell transfusions, fewer days of antibiotic use, and fewer days in hospital (Theilgaard-Monch et al. 1999; Schmitz et al. 1996). Marrow and PBSC differ quantitatively and qualitatively in the number of CD34+ cells as well as other cell subsets, including an approximately tenfold higher number of CD3+ cells in PBSC. CD34+ and CD3+ cells obtained by granulocyte colony-stimulating factor (G-CSF) mobilization may be functionally different, and together with differences in types of accessory cells, the two products may not be equivalent in the types of immune cells reinfused or the kinetics of immune reconstitution (Arpinati et al. 2000).

Private banking of autologous UCB has increased over the past 10 years, with approximately 2,000,000 units stored worldwide. To date, few of these units have been used for autologous HCT. One reason is that indications for autologous HCT are limited, and in these settings the goal is generally rapid recovery of hematopoiesis after high-dose conditioning; the potential benefit of avoiding tumor cell contamination in the cord unit may therefore be counterbalanced by the slow engraftment kinetics of UCB. A second reason is that the limited cell dose of most UCB units restricts the option to young children. Nonetheless, there have been several case reports of successful autologous HCT. Therefore, if an autologous unit is available, it could be considered, particularly in cases in which a graft-versus-leukemia effect is not required (Hayani et al. 2007; Rosenthal et al. 2011).

2.2 HLA-Identical Related Donors

Allogeneic HCT requires availability of a suitable donor, determined by human leukocyte antigen (HLA) compatibility and physical fitness for the procedure. The inheritance pattern of HLA haplotypes results in the potential for matching HLA antigens at the genetic level among full siblings (HLA genotypic identity). Donor-recipient HLA genotypic identity confers the lowest risk for the immunologically mediated complications, graft rejection and GVHD (Beatty et al. 1991; Anasetti et al. 1989). When an HLA genotypically identical donor is available, the choice of marrow or PBSC depends upon the patient's disorder, as well as the suitability of the donor for the procedure. Randomized trials have found faster recovery of peripheral blood counts without a significant increase in the incidence of acute GVHD among recipients of PBSC compared to marrow (Schmitz et al. 1998; Bensinger et al. 2001). Allogeneic PBSCs are associated with a lower risk for relapse and a higher probability of relapse-free survival, suggesting that the higher dose of T cells may contribute an important graft-versus-leukemia (GVL) effect. The superiority of PBSC in treatment of patients with hematologic malignancies cannot be extrapolated to patients with nonmalignant conditions, as the risk of chronic GVHD is higher with PBSC grafts (Flowers et al. 2002).

2.3 Alternative Donors

Most patients referred for allogeneic HCT lack an HLA-matched sibling; thus, an alternative donor must be identified. Possible sources for an alternative donor include unrelated volunteer donors (URD), unrelated cord blood units, or extended family members. The suitability of each donor source depends upon the disease being treated, the urgency of the transplant procedure, and the available protocols. To date, there have been no randomized studies comparing outcome of the various donor sources that could guide selection of an alternative donor. The best possible alternative donor will be HLA matched with the recipient; however, a less well-matched donor may be appropriate for patients with aggressive malignancies in the interest of shortening the time to HCT.

Large studies comparing outcome of HCT using HLA-matched sibling donor (MSD) grafts compared to other donor sources have necessarily been retrospective analyses. Therefore, caution should be taken in the interpretation of these studies, particularly those with small numbers, as the results may be affected by selection bias or other confounding factors. As discussed in more detail below, high-resolution typing has improved the ability to select donors matched for HLA alleles. Although matching for HLA-A, HLA-B, HLA-C, and DRB1 (8/8) alleles has been shown to improve outcome of unrelated HCT, it is not clear whether this level of matching can be viewed as equivalent to an HLA-MSD (Petersdorf et al. 2004; Lee et al. 2007). A prospective, genetically randomized trial conducted by the French Society of Bone Marrow Graft Transplantation and Cell Therapy (SFGM-TC) found that disease-free survival (DFS) was not statistically different for patients given unrelated donor grafts matched for HLA-A, HLA-B, HLA-C, DRB1, and DQB1 (n = 55) compared to patients given HLA-MSD grafts (n = 181) (Yakoub-Agha et al. 2006; Hansen et al. 1998). However, larger, albeit retrospective, studies have shown that even very well-matched unrelated donor (MUD) grafts are not the equivalent of MSD grafts. In a study of patients with chronic myelogenous leukemia (CML) given myeloablative conditioning, Weisdorf and colleagues from the Center for International Blood and Marrow Transplant Research (CIBMTR) found that DFS was superior for those given MSD grafts (n = 450) compared to 8/8 HLA-MUD grafts (n = 667) (hazard ratio (HR) 1.89, 95 % confidence interval (CI) 1.59–2.25, p < 0.0001) (Weisdorf et al. 2009). Another CIBMTR study of adults with hematologic malignancies found that DFS was lower for 8/8 MUD recipients with acute myeloid leukemia (AML, n = 340) compared to MSD recipients (n = 1,271), although no difference was observed for patients with acute lymphoblastic leukemia (ALL, n = 483 MSD versus 189 MUD) or CML (n = 1,401 MSD versus 412 MUD) (Petersdorf et al. 1998; Ringdén et al. 2009). In this analysis, the significantly greater incidence of acute or chronic GVHD among MUD recipients did not appear to result in a reciprocal significant reduction in relapse.

These studies primarily or exclusively include patients with chronic phase CML, not commonly diagnosed in pediatric patients, and now considered treatable with tyrosine kinase inhibitors, such that HCT is no longer considered frontline therapy. For this reason, a single-center retrospective study was conducted in 1,448 patients with advanced hematologic malignancies to determine whether the outcome of HCT with very well-matched (10/10) MUD grafts could approach that of MSD (Woolfrey et al. 2010). The risk for mortality and relapse was similar between the two groups; however, patients given 10/10 MUD had a significantly higher risk for acute GVHD grades 2–4 (odds ratio (OR) 1.77, 95 % CI 1.33–2.36, p = 0.0001) and for clinical extensive chronic GVHD (adjusted HR 1.34, 95 % CI 1.12–1.60, p = 0.001). Despite a higher incidence of chronic GVHD, there was no significant difference in the performance scores, suggesting that quality of life was not appreciably different. There was, however, an effect of cell source that was apparent among patients with intermediate-risk disease, defined as acute leukemia in remission, CML in accelerated phase, or refractory anemia with excess blasts (RAEB). Specifically, patients given PBSC grafts from 10/10 MUD had significantly higher risk for mortality compared to patients given MSD grafts or MUD marrow grafts (HR 1.62, 95 % CI 1.21–2.17, p = 0.001).

Taken together, these retrospective studies support several concepts. First, despite matching for HLA by high-resolution typing at 8 or 10 alleles, MSD grafts remain the "gold standard" and therefore should be preferred over MUD when available. Second, if a suitable MSD is not available, an 8/8 or 10/10 MUD graft will result in a nearly equivalent outcome. Third, the effect of using an alternative donor is seen mainly in patients with low-risk disease; there is little difference in outcome for those with more advanced leukemia. Finally, the source of MUD cells (i.e., peripheral blood or marrow) may have an effect on outcome, particularly for patients with less advanced disease.

2.4 Selection of Unrelated Donors

Several factors should be considered in selection of the optimal URD in order to reduce transplant-related mortality (TRM), the most important of which is the degree of HLA match. Within the past decade, high-resolution typing techniques have been developed that allow identification of the polymorphic alleles of the class I HLA-A, HLA-B, and HLA-C antigens and the class II DRB1 and DQB1 antigens. Retrospective studies have shown that only half of patient-donor pairs otherwise matched for HLA-A and HLA-B by serologic typing, and matched for DRB1 alleles, will be matched at the allele level for all five loci (HLA-A, HLA-B, HLA-C, DRB1, DQB1), and approximately 25 % will be mismatched for multiple alleles (Petersdorf et al. 1998). The ability to distinguish allele-level mismatches has allowed investigation of the relevance of patient-donor mismatching. These studies show that the impact of mismatching depends on the disease being treated and, within disease risk groups, upon the degree and locus of the HLA mismatch.

Initial studies of HLA matching based on retrospective high-resolution typing suggested that both the number and the location of the allelic mismatch were associated with outcome. The Seattle group found an increased risk for graft failure when donors had multiple mismatches that involved at least one class I allele, but the highest risk for severe acute GVHD was observed with multiple mismatches involving at least one class II allele (Petersdorf et al. 1998; Petersdorf et al. 1997; Sasazuki et al. 1998; Petersdorf et al. 2001). The effect of HLA mismatching appeared to be greater for patients with low-risk diseases, such as CML, compared to those with more aggressive leukemias (Petersdorf et al. 2004). An important limitation of the Seattle studies was that patients were mainly of Caucasian ethnicity, so results may not be transferable to other ethnic populations. For example, studies conducted with the Japan Marrow Donor Program (JMDP) found that mismatching of HLA-A and HLA-B, but not class II HLA, decreased survival (Sasazuki et al. 1998).

Retrospective HLA-allele typing of patient and donor pairs performed by the National Marrow Donor Program (NMDP) has allowed analysis of larger donor-recipient cohorts, which has helped to distinguish the contribution of both number and locus of the HLA mismatch to outcome. Of most use for clinicians is an understanding of the effects of HLA mismatching on overall mortality. The initial study by Flomenberg and colleagues employed multivariate modeling to determine the independent effects of HLA mismatching detected by high-resolution typing of 1,874 donor-recipient pairs (Flomenberg et al. 2004). Donor-recipient disparity of the class I loci HLA-A, HLA-B, and HLA-C was found to be independently associated with an increase in the risk for mortality. In this study, class I HLA mismatches that could be detected only with high-resolution typing (allele-level mismatches) did not appear to increase the risk for poor outcome, nor did mismatches at HLA-DQ. The subsequent 2007 NMDP/CIBMTR study included 3,857 patients and incorporated subset analyses in order to determine whether there were specific HLA locus effects (Lee et al. 2007). In this large cohort of patients with hematologic malignancies given myeloablative conditioning, mortality increased proportionately with the number of mismatches involving HLA-A, HLA-B, HLA-C, or DRB1, but again not HLA-DQ. Furthermore, the effect of an allele-level mismatch appeared to be similar to that of an antigen-level mismatch. The risk of mortality was 1.25-fold higher for patients given a single HLA-mismatched (7/8) graft and 1.65-fold higher for those given a doubly mismatched (6/8) graft compared to a fully matched (8/8) graft. Mismatches at HLA-A and DRB1 appeared to have a greater negative effect on mortality compared to mismatches at HLA-B or HLA-C. In these studies, the negative effects of HLA mismatching on survival were due to a higher incidence of both acute and chronic GVHD; negative effects on relapse and graft rejection were not discerned.
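
For readers who want to make the match-grade arithmetic concrete, the sketch below counts matched alleles over HLA-A, HLA-B, HLA-C, and DRB1 (two alleles per locus, hence grades such as 8/8, 7/8, and 6/8). It is a minimal illustration with hypothetical allele strings, not a substitute for standardized high-resolution typing and nomenclature-aware matching.

```python
from collections import Counter

# Loci counted in the 8/8 match grade; each maps to a pair of alleles.
LOCI_8 = ("A", "B", "C", "DRB1")

def match_grade(recipient: dict, donor: dict) -> str:
    """Count matched alleles across the 8 alleles of HLA-A, -B, -C, and DRB1."""
    matched = 0
    for locus in LOCI_8:
        # Multiset intersection handles homozygous loci correctly.
        matched += sum((Counter(recipient[locus]) & Counter(donor[locus])).values())
    return f"{matched}/8"

recipient = {"A": ["02:01", "24:02"], "B": ["07:02", "08:01"],
             "C": ["07:01", "07:02"], "DRB1": ["15:01", "03:01"]}
donor = {"A": ["02:01", "24:02"], "B": ["07:02", "44:02"],
         "C": ["07:01", "07:02"], "DRB1": ["15:01", "03:01"]}

print(match_grade(recipient, donor))  # "7/8": a single HLA-B allele mismatch
```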

Although marrow is more commonly used as the graft source for pediatric patients, PBSC grafts have become the predominant source for adult patients. Because the previous studies were confined almost entirely to recipients of marrow grafts, the CIBMTR conducted a separate analysis of HLA matching in patients given PBSC grafts for treatment of hematologic malignancies (Woolfrey et al. 2011). Similar to the Lee study, matching for HLA-A, HLA-B, HLA-C, and DRB1 alleles (8/8 match) was associated with better survival at 1 year (56 % versus 47 %) compared with 7/8 HLA-matched pairs. Mismatches involving the HLA-C antigen were associated with increased mortality. In the PBSC dataset, neither allele-level mismatches nor mismatches at antigens other than HLA-C had a significant effect on mortality. The increase in mortality associated with HLA-C mismatching held for recipients given either myeloablative or reduced-intensity conditioning. The analysis also indicated that in the case of an HLA-C antigen mismatch, switching from PBSC to marrow as the source did not mitigate the negative effect.

The studies discussed above were confined to patients treated for hematologic malignancies; therefore, the results may not be valid for patients with nonmalignant disorders. A recent CIBMTR study addressed this question by analysis of a separate cohort of 663 patients with nonmalignant disorders, including aplastic anemia (which comprised around half of the cohort) (Horan et al. 2012). Again, survival was not affected by mismatching at either HLA-DQ or HLA-DP. Higher mortality was associated with a single HLA antigen mismatch or with two mismatches, but not with a mismatch at a single HLA allele. In contrast to the findings in patients with hematologic malignancies, in this study HLA mismatches were not associated with acute or chronic GVHD, but were strongly associated with graft failure, with a two- to fourfold increased risk of graft failure depending on the number of mismatched loci. Most likely the lack of association with GVHD reflects the fact that almost all patients in this cohort were given anti-T-cell antibody, such as antithymocyte globulin (ATG), and many were given T-cell-depleted grafts.

Taken together, these studies support donor identification strategies that limit HLA mismatch. Because the numbers of donor-recipient pairs in the PBSC study and the nonmalignant disease study were smaller than in the 2007 NMDP study, the absence of an allele-level effect in those analyses may reflect limited statistical power; allele-level mismatches should therefore be identified by high-resolution typing and avoided if possible. These studies also suggest that, when a mismatch is unavoidable, a tolerable mismatch will depend upon the ethnicity of the recipient, the type of graft (PBSC or marrow), and the disease (Table 2.1). Among Caucasian recipients, mismatch at HLA-DQB1 appears best tolerated, followed by mismatch at HLA-B. In contrast, among Japanese recipients, HLA-A and HLA-B mismatches are the least well tolerated. These data do not define tolerable mismatches for other ethnic groups, owing to insufficient patient numbers and diverse HLA haplotypes.

Table 2.1 General guidelines for selection of an unrelated donor

Consideration of other donor-related factors is justified when more than one donor of equivalent HLA match has been identified. The source of the cell product has not been considered to affect outcome, and as noted above, PBSC has become the predominant source of unrelated hematopoietic stem cells. The recently completed NMDP/CIBMTR randomized study of unrelated marrow versus PBSC, which included pediatric patients, found no significant difference in mortality between recipients of PBSC and marrow. However, the incidence of chronic GVHD was approximately 55 % in PBSC recipients compared to 40 % in marrow recipients (p < 0.014) (C. Anasetti, personal communication). Based on these results, marrow should be the preferred source for pediatric patients, unless infectious comorbidities exist that would benefit from the faster neutrophil recovery associated with PBSC grafts.

The contribution of donor age or gender to TRM may be important for certain diseases. The 2001 NMDP analysis, which included 6,978 patients, found that both male donor sex and younger donor age were independently associated with lower risk for GVHD, and younger donor age with improved survival (Kollman et al. 2001). Importantly, the study suggested that these factors may be more important when the donor is HLA mismatched. In subsequent retrospective studies focused on the impact of high-resolution mismatches, donor age and gender were not found to be independently associated with survival; however, donor age and ABO incompatibility have been found to be associated with risk for mortality in a recent analysis of a larger cohort of patients (C. Kollman, personal communication). Earlier studies supported matching patients and donors for cytomegalovirus (CMV) serostatus (Bowden et al. 1993; Ljungman et al. 2003); in the current era, with early polymerase chain reaction (PCR) detection methods and effective drugs for treatment of CMV reactivation, the CMV status of the donor does not affect mortality (M. Boeckh, personal communication) (Lee et al. 2007). Several other variables have been associated with outcome, such as cell dose or the time between collection and infusion; however, these factors are typically not under the control of the physician caring for the patient (Collins et al. 2010; Lazarus et al. 2009).

Selection of the optimal URD must also consider whether the recipient has been sensitized to HLA. Screening recipient serum for panel-reactive antibody (PRA) will detect potential anti-HLA antibodies (Anasetti 1991). Crossmatch studies, which detect antibodies in recipient serum directed against proteins expressed by donor cells, are used to determine whether the antibody is specific or nonspecific. The importance of a positive crossmatch was demonstrated in a study of 522 patients in which there was a ninefold greater incidence of graft rejection among crossmatch-positive compared to crossmatch-negative recipients (Anasetti et al. 1989). More recent technology uses solid surfaces or beads coated with purified HLA molecules to detect donor-specific antibodies (DSA), resulting in enhanced sensitivity and specificity compared to cell-based assays. Bead-based technology used in a prospective study of 604 patients given 8/8 or 7/8 matched URD grafts detected DSA in 1.4 % of patients, primarily directed against donor HLA-DPB1 (Ciurea et al. 2011). The presence of DSA was correlated with the risk for graft rejection (p = 0.0014). Recommendations for HLA-antibody studies are shown in Table 2.2. Determination of the HLA specificity of the antibody is important, as avoiding a donor bearing the sensitizing HLA may require typing of HLA-DPB1.
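
As a minimal sketch of the DSA concept (not of the bead assay itself), the function below intersects hypothetical recipient antibody specificities with a candidate donor's HLA typing; any overlap constitutes a donor-specific antibody. The allele names are illustrative assumptions.

```python
def donor_specific_antibodies(recipient_hla_abs: set, donor_typing: set) -> set:
    """Return recipient anti-HLA antibody specificities directed at the donor."""
    return recipient_hla_abs & donor_typing

# Hypothetical inputs: antibody specificities from recipient serum screening,
# and the candidate donor's HLA typing (note DPB1 is included, per the text).
recipient_abs = {"A*23:01", "B*44:02", "DPB1*04:01"}
donor_typing = {"A*02:01", "A*24:02", "B*07:02", "B*44:02",
                "C*07:02", "DRB1*15:01", "DQB1*06:02", "DPB1*04:01"}

dsa = donor_specific_antibodies(recipient_abs, donor_typing)
if dsa:
    print("DSA present against", sorted(dsa), "- consider another donor")
```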

Table 2.2 Indications for donor-specific antibody assay

Significant advances have been made in understanding the role of natural killer (NK) cell activity after marrow grafting; however, the value of selecting an NK-alloreactive donor has not been established. Killer immunoglobulin-like receptors (KIR) present on NK cells interact with specific HLA class I molecules, particularly HLA-C, to regulate NK cell activation. Several studies have evaluated the impact of KIR ligand mismatching, defined as the absence in the recipient of a class I KIR ligand present in the donor. A study of 130 patients treated at three European centers with myeloablative conditioning and thymoglobulin found a significant decrease in mortality (p = 0.0006) among patients given a KIR ligand mismatched graft. In contrast, retrospective analyses from Japan and Minnesota evaluated 1,449 and 175 URD transplants, respectively, and found no benefit for KIR ligand incompatibility (Davies et al. 2002; Morishima et al. 2003). Although most of these latter patients did not receive T-cell depletion, which is thought necessary for promoting NK alloreactivity, another study of 190 patients, most of whom received ATG, also showed no benefit; in fact, survival was lower and TRM higher among recipients of KIR ligand mismatched grafts (Schaffer et al. 2004). Thus, the value of KIR ligand mismatching in URD selection remains undetermined. More promising results have come from studies of KIR genotyping and categorization of donors as "favorable" if they possess ≥2 B gene motifs (Cooley et al. 2009). In a retrospective study that included 1,409 patients, those with AML who received their graft from a donor with a "favorable" KIR genotype had significantly lower mortality and lower relapse (Cooley et al. 2010). A prospective study is currently under way to test the hypothesis that donor KIR haplotype has an independent effect on mortality.

2.5 Selection of Umbilical Cord Blood Units

Umbilical cord blood (UCB) characteristically differs from marrow in a number of ways. The median doses of total nucleated cells (TNC), CD34+ cells, and CD3+ cells in a UCB unit are approximately ten times lower than those of a bone marrow graft (Moscardo et al. 2004; Barker and Wagner 2003). Reduced cell numbers may be offset by a higher capacity for replication, as indicated by higher cell cycle rates and longer telomeres in UCB progenitor cells (Lewis and Verfaillie 2000; Mayani and Lansdorp 1998). Immune mediator cells in UCB have been characterized as relatively immature compared to marrow cells, including less mature T- and B-cell phenotypes, reduced response to alloantigen, and lower capacity to generate inflammatory cytokines (Mayani and Lansdorp 1998; Garderet et al. 1998; Risdon et al. 1994; Risdon et al. 1995; Bradley and Cairo 2005). These biologic differences have a significant effect on outcome following UCB transplant and also influence selection of a UCB unit.

Diminution of immunologic activity has allowed greater freedom to transplant HLA-mismatched UCB units. Conventionally, the degree of HLA match between recipient and UCB unit has been determined according to serologic typing at HLA-A and HLA-B along with high-resolution typing to distinguish DRB1 alleles. Hence, a UCB unit matched by serologic typing at HLA-A and HLA-B and matched for DRB1 alleles has been considered a full, or 6 of 6, match. This definition of matching ignores mismatches at HLA-C and DQB1 as well as allele-level mismatches at the HLA-A and HLA-B loci. Not surprisingly, around one-third of conventionally typed units will have at least one additional undetected HLA mismatch at HLA-A, HLA-B, HLA-C, DRB1, or DQB1 when retyped by high-resolution methods (Cornetta et al. 2005; Kogler et al. 2005).

Retrospective studies of HLA matching have shown an association with the risks for graft failure, TRM, and GVHD. Initial observations reported for 65 UCB transplants by the Eurocord registry in 1997 and 562 transplants by the New York Blood Center (NYBC) in 1998 found HLA mismatch to be associated with lower probability of neutrophil and platelet recovery and, in the latter study, a higher probability of acute GVHD and lower probability of survival (Gluckman et al. 1997; Rubinstein et al. 1998). The subsequent NYBC analysis of 607 UCB transplants found the degree of HLA mismatch correlated directly with the probability of TRM (Abstracts and summary of the 4th Annual International Umbilical Cord Blood Transplantation Symposium 2006). Particularly among patients who did not develop acute GVHD, HLA mismatch was associated with high risk of death from infection, implying a potential negative effect on immune reconstitution. A Minnesota group study of 152 UCB transplants showed that survival after transplant of a UCB unit matched at 4 of 6 loci was significantly worse compared to those matched at 5 or 6 loci (Wagner et al. 2002). In contrast, the subsequent Eurocord report, which analyzed 550 patients, confirmed the association of HLA mismatch with probability of neutrophil engraftment and acute GVHD grades III–IV, but showed an association with lower probability of relapse and thus no apparent effect on survival (Gluckman et al. 2004). More recently the relevance of matching the UCB unit based on high-resolution HLA typing was analyzed in a retrospective analysis of 803 recipients of UCB transplants registered at Eurocord-European Group for Blood and Marrow Transplantation, Netcord, and the CIBMTR (Eapen et al. 2011). Addition of mismatch at HLA-C to a 6/6 or 5/6 conventionally matched unit was found to significantly increase the risk for TRM (p = 0.018 and 0.029, respectively). Taken together, these studies indicate that HLA matching is an important factor in reducing the risk for TRM and improving outcome after UCB transplants. Although not all UCB units have been typed for HLA-C, if the information is available, it may help in selection of the optimal unit.

The cell dose of a UCB unit is a critical factor in determining success; in the first large studies, cell dose correlated with the probability of neutrophil engraftment and platelet recovery (Gluckman et al. 1997; Rubinstein et al. 1998). Based on the Eurocord results, the lower limit of an acceptable unit has generally been considered approximately 2 × 10⁷ total nucleated cells (TNC) per kilogram recipient weight, as determined by the TNC in the unit before cryopreservation (Migliaccio et al. 2000). In acknowledgement of the importance of cell dose, UCB banks subsequently made efforts to improve UCB volume at collection. However, until recently the problem of cell dose has limited UCB transplants to smaller patients; hence, most of the subsequent analyses have been performed in pediatric patients. These studies confirmed the association of cell dose and engraftment, and in the most recent Eurocord analysis, which included 550 UCB transplants, TNC was found to be associated with risk for acute GVHD (Gluckman et al. 2004). Together these studies support a minimum TNC dose of around 2 × 10⁷ per kilogram recipient weight. Furthermore, stepwise increases in TNC dose appear to correlate with reduction in TRM, and there does not appear to be an upper limit above which TNC dose becomes detrimental (Michel et al. 2003).

The best method to measure UCB progenitor cell dose for unit selection has not been established. Several studies show superior predictive value for CD34+ cell dose compared to TNC (Wagner et al. 2002; Laughlin et al. 2001). In the Minnesota studies, the number of CD34+ cells per kilogram recipient weight was associated not only with graft recovery but also with TRM and survival. The doses of TNC and CD34+ cells measured after thawing may also have superior predictive value compared to values obtained before cryopreservation, although post-thaw assays have no practical value for unit selection. The lower limit of CD34+ cell dose has not been firmly established; however, a unit with less than 1.7 × 10⁵ CD34+ cells per kilogram recipient weight has generally been considered inadequate (Wagner et al. 2002; Laughlin et al. 2001). UCB graft progenitor cell content, measured by colony-forming cell assay, has also been shown to correlate with engraftment; however, no association with survival has been demonstrated (Migliaccio et al. 2000). Current data support the use of either TNC or CD34+ cell dose as a measure of unit suitability.
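
A minimal sketch of the dose arithmetic described above, using the approximate minima cited in the text (2 × 10⁷ TNC/kg and 1.7 × 10⁵ CD34+ cells/kg, measured before cryopreservation); the unit values and function names are hypothetical.

```python
MIN_TNC_PER_KG = 2.0e7    # total nucleated cells per kg recipient weight
MIN_CD34_PER_KG = 1.7e5   # CD34+ cells per kg recipient weight

def unit_dose_adequate(tnc_total: float, cd34_total: float, recipient_kg: float) -> bool:
    """True if both pre-cryopreservation doses per kilogram meet the minima."""
    return (tnc_total / recipient_kg >= MIN_TNC_PER_KG
            and cd34_total / recipient_kg >= MIN_CD34_PER_KG)

# A unit with 120 x 10^7 TNC and 30 x 10^5 CD34+ cells for a 40 kg recipient:
# 3.0 x 10^7 TNC/kg passes, but 0.75 x 10^5 CD34+/kg falls short.
print(unit_dose_adequate(120e7, 30e5, 40.0))  # False
```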

The selection of an optimal UCB unit must take into account both cell dose and HLA match, and there continues to be debate about which factor, if either, should be considered more important. To address this question, Barker et al. analyzed 1,061 UCB recipients treated for hematologic malignancies (Barker et al. 2010). Similar to previous studies, lower TNC and greater HLA mismatch were independently associated with mortality. Importantly, the analysis was able to elucidate interactions between cell dose and HLA match. Specifically, when the unit contained a TNC of at least 2.5 × 10⁷ per kilogram recipient weight, HLA matching became the more significant determinant of outcome. Thus, if a 5/6 matched unit has a TNC above the threshold (e.g., 2.8 × 10⁷/kg), selection of a 4/6 matched unit with a higher TNC does not appear to improve outcome. In contrast, within a TNC range of 2.5–4.9 × 10⁷/kg, survival appears to be better with a 5/6 than with a 4/6 matched unit. To continue the example, if the same 5/6 matched unit has a TNC below the threshold (e.g., 2.0 × 10⁷/kg), then selection of a 4/6 matched unit with a TNC above the threshold appears to improve survival.
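
The interaction reported by Barker et al. can be expressed as a simple ranking rule, sketched below with hypothetical unit records: clearing the ~2.5 × 10⁷ TNC/kg threshold dominates, then degree of HLA match, then dose as a tiebreaker. This is an illustration of the published observation, not a validated selection algorithm.

```python
TNC_THRESHOLD = 2.5e7  # TNC per kg recipient weight, per Barker et al. (2010)

def prefer_unit(units: list, recipient_kg: float) -> dict:
    """Rank candidate units: threshold first, then HLA match (x/6), then TNC/kg."""
    def rank(unit):
        tnc_per_kg = unit["tnc"] / recipient_kg
        return (tnc_per_kg >= TNC_THRESHOLD, unit["hla_match"], tnc_per_kg)
    return max(units, key=rank)

units = [
    {"id": "A", "hla_match": 5, "tnc": 4.0e8},  # 2.0e7/kg for 20 kg: below threshold
    {"id": "B", "hla_match": 4, "tnc": 6.0e8},  # 3.0e7/kg for 20 kg: above threshold
]
# The 4/6 unit above the dose threshold is preferred over the underdosed 5/6 unit.
print(prefer_unit(units, recipient_kg=20.0)["id"])  # "B"
```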

Consideration of cell dose in addition to HLA match has the most profound implications for adult patients. While it is now possible to identify a 4/6 matched UCB unit for most pediatric patients, the number of usable units decreases markedly when cell dose is considered (Stevens et al. 2005). A prospective multicenter study of single-unit UCB transplants found <10 % survival in the adult arm of the protocol, in part because the median TNC was 2.3 × 10⁷ per kilogram recipient weight (Cornetta et al. 2005). These difficulties have prompted exploration of methods to enhance the cell dose given to adult patients, such as transplant of more than one UCB unit or expansion of UCB progenitor cells. Addition of a second unit is a practical method to increase the overall TNC given to the patient, as demonstrated in a study of 21 adult recipients in which the median TNC after two units was 4.0 × 10⁷/kg (Ballen et al. 2007; Barker et al. 2005). Single-center studies of double UCB transplants in adults also report comparatively improved outcome. The observation that only one of the two CB units engrafts long term suggests a supportive role for the additional unit. Algorithms to aid in multiple-unit selection have been devised, based upon the HLA match and cell dose of each unit. One reasonable algorithm for selection of two CB units is shown in Table 2.3. Without large numbers of patients to analyze, ambiguity remains regarding the allowable minimum cell dose, HLA mismatch with the recipient, and unit-to-unit HLA matching. Reports of supporting single CB transplants with third-party G-CSF-mobilized PBSC suggest that HLA matching between cell products may not be relevant (Magro et al. 2006). The degree of HLA match of the recipient to each UCB unit appears to be much more important, since either may become the engrafting unit (Barker et al. 2005).

Table 2.3 General algorithm for umbilical cord blood selection

Recipient sensitization to alloantigen is more difficult to assess prior to UCB transplant, since donor cells are not available for crossmatch assays. A reasonable approach is to screen UCB recipients for HLA antibodies by PRA assay, as discussed in the previous section. Recipients with a positive PRA can be tested further for HLA-antibody specificity to determine whether DSA is present. A recent single-center analysis of 73 recipients of double UCB transplants found that the presence of DSA was significantly associated with graft rejection, delayed neutrophil engraftment, and mortality. Patients at greatest risk for graft rejection were those with DSA to both units. Thus, if DSA is detected in the recipient, it is prudent to avoid units bearing the identified HLA whenever possible (Cutler et al. 2011).

Aside from HLA matching and cell dose, recent studies have suggested other potential factors to consider in selection of the optimal UCB unit(s). A retrospective study that included 218 recipients of UCB grafts from the Eurocord group suggested that NK alloreactivity may play a role in GVL effects (Willemze et al. 2009). NK cells are essential effector cells of the innate immune system that, without prior activation, recognize and lyse target cells. NK cell cytolytic activity is regulated by the balance of inhibitory and activating signals generated by binding of NK cell surface receptors, including KIRs. Negative regulation occurs when inhibitory KIR bind to specific HLA class I molecules; hence, target cells expressing the appropriate HLA class I molecules are protected from NK cell cytolysis (a mechanism termed "missing self") (Ljunggren and Kärre 1990; Lanier 1998). In the setting of allogeneic HCT, NK cell alloreactivity can occur when the recipient lacks the inhibitory ligand for donor KIR. Class I HLA epitopes involved in NK cell allorecognition include the Bw4 epitope, present on approximately 40 % of HLA-B alleles, and the allelic HLA-C1 and HLA-C2 epitopes, one of which is present on all HLA haplotypes and which have approximately equal frequencies (Biassoni et al. 1995; Colonna and Samaridis 1995). In the Eurocord study, KIR ligand incompatibility was defined as absent expression in the recipient of a predicted KIR ligand for the donor (i.e., absence of the HLA-C group 1, HLA-C group 2, or HLA-Bw4 allele group), which would correlate with NK alloreactivity. KIR ligand incompatible UCB grafts were found to confer improved leukemia-free survival, in particular for patients with AML.

Investigators have speculated that exposure to noninherited alloantigen during pregnancy might induce a level of tolerance that could be exploited in selection of UCB grafts as well as haploidentical donors. Indeed, long-term presence of very small numbers of fetal cells can be detected in about 80 % of mothers and maternal cells in about 65 % of offspring, consistent with transference of maternofetal tolerance. Several groups have investigated the role of matching for noninherited maternal antigens (NIMA). Theoretically, placental blood cells could develop tolerance to noninherited maternal HLA, which might translate to tolerance of mismatched HLA of the recipient (van Rood et al. 2009). An analysis of 1,121 recipients of single UCB grafts found that mortality was significantly lower among recipients of grafts with a NIMA match. NIMA match also was correlated with improved neutrophil recovery and reduced risk for relapse. Rocha and colleagues have also found NIMA match to be associated with lower mortality and improved leukemia-free survival (V. Rocha, personal communication). Taken together, the studies above suggest that, in the future, the selection of CB units will require consideration of factors other than HLA match or TNC.

2.6 Selection of Haploidentical Donors

A haploidentical donor is defined as sharing one distinct inherited haplotype (genetically identical) with the patient; the unshared haplotype may be HLA matched (phenotypically identical) or mismatched at one or more HLA loci. Most patients have an HLA-haploidentical donor available, typically a parent or sibling. At the current time, guidance on the selection of the optimal HLA-haploidentical donor is not as clear as that for URD or UCB donors. The rapid development of novel regimens, posttransplant immune suppression strategies, and cellular manipulation has outpaced the ability to perform large retrospective analyses to evaluate factors important in HLA-haploidentical donor selection. Furthermore, previously published retrospective studies of HLA-haploidentical graft recipients did not evaluate high-resolution HLA typing; therefore, the limit of tolerable HLA mismatches has not been defined with certainty.

Published studies have examined outcome risks associated with HLA mismatch defined by serologic typing for HLA-A and HLA-B plus identity for DRB1 alleles. The contribution of mismatching at HLA-C or DQB1, or of allele-level mismatching at HLA-A or HLA-B, is thus unknown. Nonetheless, these studies provide some guidance for selection of haploidentical donors. An analysis of 1,199 recipients of marrow grafts found a sixfold increase in graft failure among recipients of grafts from HLA-haploidentical relatives mismatched for 0 to 3 HLA-A, HLA-B, and DRB1 antigens of the nonshared haplotype compared to those who received grafts from an HLA-identical sibling (genetically HLA identical) (Anasetti et al. 1989). The relative disparity between donor and host histocompatibility antigens, determined by the vectors of HLA incompatibility, also affects engraftment (Fig. 2.1). Specifically, recipients who were homozygous at one or more mismatched loci had a threefold increase in the risk for graft failure compared to heterozygous recipients. Thus, when selecting a haploidentical donor, it is desirable to avoid the situation in which HLA mismatches in the host-versus-graft direction are not counterbalanced by an equal number of mismatches in the graft-versus-host direction (Woolfrey and Anasetti 1999). The historic analyses also showed that disparity at multiple HLA loci resulted in a prohibitive incidence of GVHD when the graft was not depleted of T cells (Ash et al. 1991; Tomonari et al. 2002; Speiser et al. 1997). These studies, together with knowledge gained from analyses of URD transplants, support the use of high-resolution typing to discriminate haploidentical donors potentially mismatched at the HLA-A, HLA-B, HLA-C, DRB1, or DQB1 loci.

Fig. 2.1 Alloreactivity vectors. The vector for graft-versus-host disease (GVHD) or graft rejection depends upon whether the recipient or donor is homozygous at the locus of mismatch. HLA typing is shown for a potential recipient (left side) and for three potential donors (right side). In the top panel, both recipient and donor are heterozygous at the mismatched HLA-DRB1 locus; therefore, the mismatch will generate alloreactivity in both the GVHD and the rejection vectors. In the middle panel, the recipient is homozygous at the mismatched HLA-B locus; therefore, the mismatch will generate alloreactivity only in the direction of rejection. In the bottom panel, the donor is homozygous at the mismatched HLA-A locus; therefore, the mismatch will generate alloreactivity only in the direction of GVHD.
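
The vector logic of Fig. 2.1 can be stated compactly in code. The sketch below, with hypothetical allele sets, returns which alloreactivity vectors a single mismatched locus generates: donor alleles foreign to the host drive rejection (host-versus-graft), and host alleles foreign to the graft drive GVHD (graft-versus-host).

```python
def mismatch_vectors(recipient_alleles: set, donor_alleles: set) -> list:
    """Alloreactivity vectors generated at one HLA locus (Fig. 2.1 logic)."""
    vectors = []
    if donor_alleles - recipient_alleles:    # donor antigen absent in the host
        vectors.append("rejection (host-versus-graft)")
    if recipient_alleles - donor_alleles:    # host antigen absent in the graft
        vectors.append("GVHD (graft-versus-host)")
    return vectors

# Middle panel: recipient homozygous at the mismatched HLA-B locus (illustrative alleles).
print(mismatch_vectors({"B*07:02"}, {"B*07:02", "B*44:02"}))
# ['rejection (host-versus-graft)']

# Bottom panel: donor homozygous at the mismatched HLA-A locus.
print(mismatch_vectors({"A*02:01", "A*24:02"}, {"A*02:01"}))
# ['GVHD (graft-versus-host)']
```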

Our current understanding of the important role of hematopoietic stem cell dose in HLA-haploidentical grafts is based on animal models. Early murine models demonstrated that engraftment of allogeneic marrow required a tenfold higher number of cells compared to syngeneic marrow (Gengozian et al. 1969). Similar models showed that host alloreactivity conferred by the presence of residual (or experimentally added-back) host T cells could be overcome by increasing the number of donor cells, presumably by shifting competition for marrow space in favor of the donor stem cells (Lapidot et al. 1989). In HLA-histoincompatible models, engraftment of T-cell depleted marrow was shown to require greater numbers of donor marrow cells compared to non-T-cell depleted grafts (Lapidot et al. 1990; Lebkowski et al. 1990). Stem cell dose also appeared to be the critical factor in determining engraftment of T-cell depleted HLA-histoincompatible marrow when the intensity of immunosuppression was held constant (Bachar-Lustig et al. 1995; Rachamim et al. 1998). The benefit of mobilized PBSC for attaining a high cell dose was reported by Aversa and colleagues in studies of HLA-haploidentical grafts for treatment of hematologic malignancies (Aversa et al. 1994; Aversa et al. 1998). After 2–4 apheresis procedures followed by CD34+ selection or T-cell depletion, the PBSC graft contained a median CD34+ cell dose of 13.9–16 × 10⁶ per kilogram recipient weight and a median CD3+ cell dose of 0.27–1.43 × 10⁵ per kilogram recipient weight, resulting in a 95 % engraftment rate. Correlation of CD34+ cell dose with risk for graft failure has been confirmed by other groups after different conditioning regimens (Peters et al. 1999). The requirement for a maximal CD34+ cell dose guides donor selection toward a preference for adult donors able to tolerate multiple apheresis procedures. In contrast, if a marrow graft instead of PBSC is planned, studies suggest that younger donor age may be preferable (Godder et al. 2000).

Studies by the Perugia group suggest that, in the absence of T cells, NK cell alloreactivity may play an important role in the outcome of haploidentical grafts (Biassoni et al. 1995). The existence of potential donor alloreactive NK cells can be deduced through comparison of donor and recipient HLA class I types. This "missing self" model of NK alloreactivity presumes that NK clones exist in the donor that are capable of activation provided the recipient lacks the inhibitory ligand. The "missing self" model is an oversimplification of NK cell receptor-ligand biology, as some individuals do not have the inhibitory KIR gene anticipated based on HLA typing. Velardi and colleagues screened the KIR genotype of 162 patients and found that prediction of NK alloreactivity based solely on KIR ligand incompatibility would be invalid for the 3 % of donors who do not possess the KIR2DL1 receptor for group 2 HLA-C alleles and the 6 % of donors who lack the gene for the HLA-Bw4 inhibitory receptor KIR3DL1 (Ruggeri et al. 1999; Ruggeri et al. 2002; Ruggeri et al. 2004a; Velardi et al. 2003; Ruggeri et al. 2004b). This group also showed that direct identification of NK alloreactive clones in the donor was useful for optimal donor selection. Potentially relevant to selection of haploidentical donors is the "missing ligand" model of donor NK activity, which takes into consideration that alloreactive donor NK clones may develop after transplant, provided the recipient lacks at least one KIR ligand. In contrast to the "missing self" model, there is no requirement for a mismatch of the class I HLA ligand between donor and recipient. Hence, the "missing ligand" model would potentially encompass the roughly two-thirds of recipients who lack one or more of the class I HLA ligands for KIR. In this model, KIR genotyping is essential in order to identify a donor with the potential to express an alloreactive KIR.
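
The distinction between the two models can be illustrated with a short sketch over the three classical KIR ligand groups (HLA-C group 1, HLA-C group 2, and Bw4); the typing inputs are hypothetical, and in practice KIR genotyping must confirm that the donor actually carries the corresponding inhibitory KIR (e.g., KIR2DL1, KIR3DL1).

```python
KIR_LIGANDS = {"C1", "C2", "Bw4"}  # classical class I KIR ligand groups

def missing_self(recipient_ligands: set, donor_ligands: set) -> set:
    """'Missing self': donor KIR ligands absent from the recipient."""
    return donor_ligands - recipient_ligands

def missing_ligand(recipient_ligands: set) -> set:
    """'Missing ligand': any KIR ligand absent from the recipient, donor-independent."""
    return KIR_LIGANDS - recipient_ligands

recipient = {"C1", "Bw4"}        # hypothetical: recipient lacks the C2 group
donor = {"C1", "C2", "Bw4"}

print(missing_self(recipient, donor))  # {'C2'}: alloreactivity predicted by both models
print(missing_ligand(recipient))       # {'C2'}: predicted even for a C2-negative donor
```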

The Perugia group demonstrated the potential role for NK alloreactivity in protecting against relapse after T-cell-depleted HLA-haploidentical grafts for patients with myeloid malignancies (Ruggeri et al. 1999; Ruggeri et al. 2007). In an analysis of 112 patients with AML, NK alloreactive clones were detected before transplant in all KIR epitope mismatched donors whose recipients did not express HLA-C group ligands and in two-thirds of those whose recipients did not express HLA-Bw4 alleles, whereas none were detected in donors whose recipients expressed the class I HLA groups present in the donor. Multivariate analysis confirmed that donor-versus-host KIR ligand mismatch was an independent factor for survival, associated with a twofold reduction in death or relapse among all patients (p < 0.001), including those with relapsed disease at the time of HCT (DFS 30 % versus 6 %, p = 0.04). In contrast, when the analysis took into consideration patients who lacked expression of at least one KIR ligand, for whom there was potential for NK alloreactivity according to the "missing ligand" model, no survival advantage was discerned. Although these results have not been confirmed by others (Bishara et al. 2004), they suggest that the search for HLA-haploidentical donors should be extended beyond immediate family members, guided by KIR genotyping (Ruggeri et al. 2005).

2.7 Selection of the Optimal Donor

Optimization of the donor graft, whether URD, UCB, or HLA haploidentical, takes into consideration data derived from multivariate analyses of large numbers of transplants. In contrast, there are no randomized studies that address the question of selection among the various donor sources. Studies that seek to compare outcomes between donor types, whether HLA-matched sibling, HLA-matched URD, UCB, or haploidentical donors, have been retrospective; therefore, consideration of the results must take into account the problem of selection bias, wherein poor-risk patients die before HCT can be performed. In general, the time lapse between the decision to undergo HCT and donor identification is greater for recipients of alternative donor grafts, increasing the probability that poor-risk patients will not be included in these groups. In counterbalance, perception of an increased risk associated with alternative donor HCT may drive physicians to withhold referral of patients until the disease has progressed to an advanced stage.

Registry studies have provided comparative information about outcome among different alternative donor groups. The Eurocord group reported outcomes separately for pediatric and adult patients with acute leukemia given unrelated UCB compared to URD marrow transplants. The first large study reported by the Eurocord group analyzed outcome of pediatric patients given UCB or URD grafts between 1994 and 1998 (Rocha et al. 2001). The adjusted analysis showed lower DFS and a twofold increase in TRM among the UCB recipients compared to URD recipients (p < 0.01), with most of the mortality risk within the first 100 days. In contrast, no difference in survival was reported for adult patients in the Eurocord comparison of 98 single UCB unit recipients to 584 recipients of 6 of 6 HLA-matched URD marrow grafts reported to the registry between 1998 and 2002 (Rocha et al. 2004). An International Bone Marrow Transplant Registry study reported around the same time found that recipients of UCB or mismatched URD marrow had a higher risk for death from any cause (HR 1.66 and 1.53, respectively; p < 0.001) compared to recipients of HLA-matched URD marrow (Laughlin et al. 2004). The increasing awareness of cell dose and the advent of the double UCB unit transplant procedure have improved outcome in adult patients, and two recent retrospective studies that compared double UCB grafts and URD grafts in adult patients with hematologic malignancies found no significant difference in outcome. In patients with hematologic malignancy given myeloablative conditioning, the source of the donor graft (i.e., double UCB, HLA-matched URD, HLA-mismatched URD, or HLA-identical sibling) was not found to be significantly associated with mortality (Brunstein et al. 2010). Donor source was also not found to be associated with outcome after reduced-intensity conditioning (Brunstein et al. 2012).

Most importantly, all comparative studies of URD, haploidentical donor, and mismatched cord blood grafts have shown that phase of disease at the time of transplant is the most significant predictor of survival (Gluckman et al. 2004; Speiser et al. 1997; Sierra et al. 1997; Aversa et al. 2005; Lu et al. 2006). Therefore, the most important variable to consider at the start of the donor search is the urgency of the transplant procedure. Optimal donor selection balances the risk of disease progression against the time required to identify the best donor (Woolfrey et al. 2002). Pragmatically, the time to disease progression or relapse (which depends upon the available therapies, which presumably improve over time) should be estimated, and donor identification should proceed accordingly. Advances in determining the biologic markers of disease progression should improve the ability to decide in favor of HCT with an alternative donor (Bruggemann et al. 2006; Zhou et al. 2007). Table 2.4 shows a useful guideline, used in our center, for selecting the "optimal" donor based on the predicted urgency of the need for transplant. By employing a strategy that identifies an acceptable donor in the shortest period of time, and that subsequently refines the search to optimize donor characteristics, we can move to transplant at the time most appropriate for the patient, knowing that the best donor has been identified within the appropriate time frame.

Table 2.4 General algorithm for donor selection