Introduction

Heart transplantation has become the preferred therapy for eligible patients with end-stage heart failure, yet the supply of suitable donors has always been insufficient to meet the expanding demand. Heart allografts must function reasonably well in the immediate post-operative period. There are multiple ways to evaluate the donor heart, including echocardiography and coronary angiography, all of which are subject to interpretation and can be subjective, in contrast to kidney transplantation, where laboratory values and, in some cases, biopsy results are available to guide decisions. The net result is that relatively few (about a third) of potential heart donors are actually used for transplantation, while the waiting list grows ever longer. The purpose of the current manuscript is to update the reader concerning recent criteria for donor heart selection. We have divided the paper into distinct sections.

Graft Abnormalities

Left Ventricular Dysfunction

Decreased left ventricular ejection fraction (LVEF) on echocardiographic evaluation of the donor’s heart is one of the main reasons for not using the graft (the normal range for LVEF is 55 to 70%) [1, 2]. Acute severe left ventricular dysfunction associated with acute brain injury is reported in more than 40% of patients [3]. The etiology of ventricular dysfunction caused by brain death is not well understood and is likely similar to that of Takotsubo cardiomyopathy, an acute decrease in LVEF in response to extreme emotional stress [4]. It has been proposed and widely accepted that this process is due to an abrupt increase in endogenous catecholamines, resulting in microvascular coronary vasoconstriction, myocardial stunning, and a characteristic histopathological lesion of contraction band necrosis. LVEF may improve with supportive care of the donor [5, 6•]; recovery may take hours to days with appropriate donor management. Madan et al. compared donors with initially normal ejection fraction (EF) to 427 donors with LVEF < 40% that subsequently improved to > 50%. They noted no difference in mortality, primary graft failure, or cardiac allograft vasculopathy (CAV) in the improved hearts compared to those with initially normal LVEF (> 50%) [7]. With marginally reduced LVEF, protocols using stress echocardiography have been reported to predict improvement in donor heart function following transplantation [3, 8, 9]. In potential donors with suspected transient left ventricular dysfunction, institution of donor management protocols, waiting for hours to days, and repeat echocardiographic evaluation may result in better left ventricular (LV) function and use of the heart as a cardiac graft.

Donor Characteristics

Age

Age of the donor is a significant factor when assessing the suitability of a heart for transplantation (HT). Data from the ISHLT Registry demonstrated worse outcomes at 1, 5, and 10 years after HT with increasing donor age, likely due to a higher probability of acquired comorbidities such as diabetes, hypertension, hyperlipidemia, coronary artery disease (CAD), and heart failure [10]. In addition, the combination of advanced age and a donor heart ischemic time of greater than 4 h was a strong predictor of mortality.

Concern that increasing donor age leads to poor outcomes has been documented in Europe. A donor score index utilizing Eurotransplant data showed that, compared to a donor younger than 45 years, a donor aged 45–54 years carried a twofold risk of donor nonacceptance, which increased to a 4.7-fold risk of nonacceptance for a donor between 55 and 59 years [11]. Regarding survival, the RADIAL score data showed that donor age greater than 30 years predicted primary graft dysfunction (PGD), and donor age greater than 40 years was a risk factor for poor recovery from PGD and higher in-hospital mortality [12]. In summary, increasing donor age, especially in combination with increasing ischemic time, predicts higher post-cardiac transplant mortality.

Ischemic Time

Ischemic time has long been recognized as one of the most important determinants of survival post-transplant. Its effect is particularly noted on early survival, and it also interacts with donor age [13]. Recent changes to the US transplant allocation scheme emphasize broader sharing of organs, which has led to longer ischemic times for many patients. Whether this will lead to significant changes in the post-transplant mortality observed in the USA is unclear [14,15,16,17,18,19,20]. An exciting development is the recent FDA approval of the TransMedics Organ Care System to transport donor hearts where the ischemic time is forecast to be prohibitive. The specific wording of the approval is “The OCS Heart System is indicated for the preservation of brain death (DBD) donor hearts deemed unsuitable for procurement and transplantation at initial evaluation due to limitations of prolonged cold static cardioplegic preservation (e.g., > 4 h of cross-clamp time).” This system adds expense and staffing issues to consider, but does offer the possibility of increased organ utilization in the future [21].

Impact of Donor Size and Gender

Traditionally, size matching has been based on attempts to match weight and height between the donor and the recipient. The ISHLT Guidelines (from 2010) recommend limiting donor-recipient weight differences to < 30%, and to < 20% if there is a female donor for a male recipient [22]. However, recent data suggest that weight is not an accurate marker of cardiac size, especially in obese recipients, and donor-recipient weight differences may not impact outcomes [23].
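
To make the traditional weight-based rule concrete, a minimal Python sketch follows; the function names and example values are ours for illustration, and the thresholds simply restate the < 30% and < 20% limits above rather than any validated decision tool.

```python
def weight_mismatch_percent(donor_kg: float, recipient_kg: float) -> float:
    """Percent weight difference between donor and recipient, relative to the recipient."""
    return (recipient_kg - donor_kg) / recipient_kg * 100


def within_traditional_limits(donor_kg: float, recipient_kg: float,
                              female_donor_male_recipient: bool) -> bool:
    """Check the traditional ISHLT 2010 weight-difference limits quoted above:
    < 30% in general, < 20% for a female donor to a male recipient.
    Illustrative sketch only, not a clinical decision tool."""
    limit = 20.0 if female_donor_male_recipient else 30.0
    return abs(weight_mismatch_percent(donor_kg, recipient_kg)) < limit


# Example: a 60 kg female donor for an 85 kg male recipient (about 29% undersized)
print(within_traditional_limits(60, 85, female_donor_male_recipient=True))  # -> False
```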

“Gender mismatch” is thought by many to be a surrogate for size mismatch and may lead to lower survival rates than in gender-matched recipients. An analysis of 60,584 patients in the UNOS registry studying the effect of donor-recipient sex mismatch on heart transplant outcomes showed that male recipients of female donor hearts had a 10% higher mortality rate than male recipients of male donor hearts [24]. Kaczmarek et al. showed in 67,855 patients that the male recipient/female donor combination carried a higher risk of early mortality [25]. One-year survival was highest in male recipients of male donor hearts (84%) and lowest in male recipients of female donor hearts (79%). Survival after 1 year was comparable, suggesting that gender influenced mainly short-term outcomes.

However, Reed and colleagues used UNOS registry data to examine size matching with a predicted heart mass (pHM) ratio derived from equations integrating donor size, age, and gender. They defined the pHM difference as [(pHM_recipient − pHM_donor) / pHM_recipient] × 100. They reported that a suitably sized heart from a female donor results in outcomes similar to those with a male donor heart [26]. Bergenfeldt et al. showed no difference in mortality between appropriately and inappropriately weight-matched donor-recipient pairs (≤ 30% vs. > 30% weight difference) [27]. Inappropriate weight matching was a risk factor for short- and long-term mortality in non-obese recipients, but not in obese recipients. Kransdorf and colleagues examined pHM in a more contemporary UNOS registry cohort and divided matches into septiles [28••]. They showed that the pHM ratio is a robust predictor of mortality, and its use has become widespread.
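
For readers who want to see the arithmetic, the sketch below computes donor and recipient pHM and the percent difference defined above. The regression coefficients are the MESA-derived predicted left and right ventricular mass equations as commonly reproduced alongside the Reed and Kransdorf analyses; they, together with the function names and example values, are included as an assumption-laden illustration and should be checked against the original publications before any use.

```python
def predicted_heart_mass(age_years: float, height_m: float,
                         weight_kg: float, is_female: bool) -> float:
    """Predicted heart mass (g) = predicted LV mass + predicted RV mass.
    Coefficients are the MESA-derived values commonly cited in the pHM
    literature and are reproduced here for illustration only."""
    lv_coeff = 6.82 if is_female else 8.25
    rv_coeff = 10.59 if is_female else 11.25
    lv_mass = lv_coeff * (height_m ** 0.54) * (weight_kg ** 0.61)
    rv_mass = rv_coeff * (age_years ** -0.32) * (height_m ** 1.135) * (weight_kg ** 0.315)
    return lv_mass + rv_mass


def phm_difference_percent(donor_phm: float, recipient_phm: float) -> float:
    """Percent difference between recipient and donor pHM, relative to the
    recipient, mirroring the formula quoted in the text."""
    return (recipient_phm - donor_phm) / recipient_phm * 100


# Hypothetical example: a 30-year-old, 1.65 m, 60 kg female donor for a
# 55-year-old, 1.80 m, 90 kg male recipient
donor = predicted_heart_mass(30, 1.65, 60, is_female=True)
recipient = predicted_heart_mass(55, 1.80, 90, is_female=False)
print(f"pHM difference = {phm_difference_percent(donor, recipient):.1f}%")
```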

In summary, gender mismatch is a misnomer. Gender is not the determinant of outcome in gender mismatch heart transplantation. Cardiac mass matching eliminates the concern.

Hypertension and Left Ventricular Hypertrophy

Hypertension is a risk factor for comorbidities such as coronary artery disease and left ventricular hypertrophy (LVH). Donor heart LVH confers an increased risk of CAV and death [29]. Donors with LVH are more often older, male, and have a history of hypertension [29,30,31]. Some studies have noted that LVH is no longer present 12 to 24 months after transplantation [30]. LVH alone is not a risk factor for mortality, though the risk of death is increased when LVH is combined with donor age older than 55 years or an ischemic time exceeding 4 h.

Coronary Artery Disease in the Donor

Earlier studies suggest that the presence of coronary artery disease in the donor increases the risk of disease progression within the first year after transplant, particularly if the donor is older than 50 years [29]. Donor hearts with mild or moderate CAD and preserved ejection fraction can be transplanted, but recipients may require revascularization or percutaneous intervention. A coronary angiogram is the most accurate way to assess donor CAD and is recommended for donors with risk factors for CAD (i.e., hypertension, diabetes, age greater than 45 years, smoking, and a history of cocaine use) [22]. More recently, Lechiancole and colleagues reported a series of 289 heart transplants from a single European center. Fifty-seven of the donors had moderate CAD (less than 50% stenosis in one or more coronary vessels). There was no difference in 10-year survival nor in the development of ISHLT CAV grade 2 or higher [32•]. This provocative work suggests that the risk of using donors with established CAD may not be as high as previously considered.

Diabetes

Diabetes mellitus is a well-known risk factor for the development of CAD and is therefore considered carefully when evaluating a donor. Survival is not different with selected diabetic donors, though evaluation including coronary angiography is common [33]. Recent data suggest that CAV may be more common with a diabetic donor [15, 34].

Malignancy

Donors with a history of malignancy represent a dilemma for transplant teams, with evidence of transmission based on case reports and small series. Ultimately, the balance between risk and benefit is weighed closely, as with most transplant-related decisions [35, 36]. A recent report from the UNOS Disease Transmission Advisory Committee listed 70 cases of possible transmission of malignancy over the period 2008–2017 [37].

Toxic Substances and the Heart Donor

A variety of substances may lead to cardiac injury, including tobacco, alcohol, cocaine, amphetamines, and opiates. The United Network for Organ Sharing (UNOS) registry specifically captures the use of cocaine, alcohol, and unspecified illicit drugs for each donor, which has allowed analyses of the survival associated with such donors. Jayarajan and colleagues analyzed the UNOS registry from 2000–2010 and identified 2274 donors with cocaine use from a pool of 19,636 total donors [38]. Heart transplant recipient survival was similar for donors with prior, current, or no use of cocaine.

Vieira et al. reported an analysis of the International Society for Heart and Lung Transplantation (ISHLT) Thoracic Transplant database [39], a worldwide registry. Data for 24,430 adult de novo transplants between 2000 and 2013 was gathered, yielding 3246 transplants (13.3%) where the donor had a history of cocaine use. Of these, 1477 (45.5%) were classified as current users. The authors reported that there was no decrease in survival with these donors. Also, there was no increase in allograft rejection or the occurrence of cardiac allograft vasculopathy.

Amphetamines, particularly methamphetamine, are well known to cause cardiotoxic effects [40,41,42,43]. Since all donors are screened with echocardiography, whether an apparently normal donor heart with a history of amphetamine exposure can be used safely has remained controversial. Until recently, the only published information was limited to case reports and series published in abstract form.

Baran and colleagues analyzed 23,748 donors from the UNOS registry between 2007 and 2017 and specifically examined the donor toxicology data field, classifying the use of more than 20 drugs [44••]. In addition, they examined the UNOS drug-use fields that have been analyzed in the past by other groups, and they analyzed combinations of toxic drugs to assess the effect of multiple drugs on recipient outcomes. They concluded that no single drug was associated with worsened survival, and that even combinations of multiple drugs were associated with survival similar to that of recipients of drug-free donors over 10 years post-transplant.

Alcohol is commonly found in donors, and acute intoxication is less of an issue than chronic use. The 2010 ISHLT Guidelines for the Care of Heart Transplant Recipients state, “In light of current information, the use of hearts from donors with a history of ‘alcohol abuse’ remains uncertain, but it should probably be considered unwise” [22]. However, a 2015 report from the UNOS registry examined nearly 15,000 transplants from 2005–2012 (15.2% with heavy donor alcohol use) and found no difference in survival between recipients of hearts from heavy-alcohol-use donors and others [45]. The more recent report using donor toxicology data concluded that donor alcohol use was not associated with increased mortality [44••].

Tobacco use is relatively common among donors, but there are no guideline recommendations specific to donor tobacco use in heart transplantation [22, 46]. Recently, Hussain and colleagues reported an analysis of the ISHLT Thoracic registry covering 26,390 heart transplants from 2005–2016, specifically focusing on tobacco use [47]. The authors incorporated propensity matching to account for other differences between donors with and without a smoking history. Recipients of hearts from donors with a smoking history had decreased 5-year survival compared with recipients of hearts from non-smoking donors (73.7% vs. 78.1%; p < 0.001). Graft failure was more common (27.1% vs. 22.5%; p < 0.001), and the incidences of CAV (35.5% vs. 28.6%; p < 0.001) and acute rejection (44.9% vs. 41.8%; p = 0.002) at 5 years were higher in the donor-smoking cohort than in the non-smoking cohort [47]. The propensity-matched cohort consisted of 4572 transplant recipients; in this group, survival and graft failure were worse with donors who smoked, but CAV and rejection were not different. The limitations are that smoking is not quantitated in the registry, nor did the authors report whether donors had undergone coronary angiography prior to donation.

Recently, Loupy and colleagues reported a novel analysis of 1301 heart transplant patients across three European centers and one large US center [48]. Using latent class mixed model statistical analysis, the authors were able to derive and validate several trajectory groups for the development of CAV. Importantly, donor tobacco use was a strong correlate of trajectories with more rapid development of CAV. From these recent reports, a donor history of heavy chronic smoking should prompt very careful examination of such a potential donor.

Expansion of the Donor Pool by Utilizing Hepatitis C-infected Donors

With the advent of direct-acting antiviral therapies, the use of hepatitis C-infected donors is changing from a rarity to a commonplace event. Protocols vary, but many centers treat recipients following documented infection and determination of the hepatitis C genotype [49,50,51,52,53,54]. Issues with this approach have been the cost of direct-acting antiviral therapy and concerns about drug interactions. Over time, it is likely that hepatitis C-infected hearts will be used routinely given the positive results to date, though surveillance for long-term complications remains important [55, 56].

Donation Following Circulatory Determination of Death: Novel Donor Pool

Following the initial report in 2014 of successful heart transplantation from donors after circulatory determination of death (DCD donors) using machine perfusion to both resuscitate and transport the donor heart [57], there has been a rapid growth of this form of heart transplantation. Successful programs have now been established in multiple centers across Europe and North America.

Compared with the donor after brain death (DBD), the DCD donor presents some unique challenges. The potential DCD donor is still alive at the time of referral. Applying the principle of “primum non nocere,” this places limits on what ante-mortem investigations and interventions are permissible. While simple non-invasive tests such as transthoracic echocardiography are allowed in most jurisdictions, more invasive ante-mortem investigations such as coronary angiography and procedures (e.g., the placement of perfusion cannulas for the institution of normothermic regional perfusion) may not be.

Following the withdrawal of life support, there is uncertainty whether the donor will progress to death within a timeframe that allows recovery of a viable donor heart. In contrast to preclinical studies of DCD donation, in which progression to circulatory arrest after withdrawal of life support (WLS) is typically rapid [58, 59], clinical experience has shown that progression to circulatory arrest in the human DCD donor is far less predictable [60]. In addition, there is ongoing uncertainty and debate regarding the timing of the onset of myocardial ischemia following WLS. While some donors progress rapidly, others maintain a period of hemodynamic stability or undergo a hypertensive phase (similar to the Cushing’s reflex seen during brain death) before progression to circulatory arrest.

Several prediction scores have been developed with the aim of determining in advance whether a DCD donor will progress to circulatory arrest following WLS within a timeframe that permits recovery of viable organs for transplantation [61]. At present, they have limited utility for heart donation because they predict the time from WLS to circulatory arrest rather than the functional warm ischemic time (FWIT), which begins with the critical hemodynamic change that marks the onset of myocardial ischemia. Currently, the variables affecting the donor heart’s ischemic time need to be considered on a case-by-case basis before deciding whether to send a retrieval team to the donor hospital.

Two hemodynamic measures have been utilized by clinical programs to mark the onset of myocardial ischemia: systolic blood pressure and systemic arterial oxygen saturation. Based on their preclinical studies, the Sydney program has used a systolic BP of < 90 mmHg to mark the onset of myocardial ischemia after WLS [58]. The Papworth program and most others have used a systolic BP of < 50 mmHg, which usually occurs within 1–2 min of the systolic BP falling below 90 mmHg [62]. Some US centers have used an arterial saturation of < 70% to mark the onset of ischemia; however, the accuracy of this measurement after WLS has been questioned [60]. The subsequent progression to circulatory arrest, the mandated stand-off period, transfer to the operating room, and retrieval determine the FWIT. The FWIT ends either with the flush of the donor’s heart with cold preservation fluid in the case of direct procurement (DP) or with the recommencement of the circulation in the case of normothermic regional perfusion. With both protocols, the DCD heart is usually fully recoverable if the FWIT is 30 min or less.
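
As a simple illustration of this FWIT bookkeeping, the Python sketch below (the function names, timestamps, and fixed 30-min limit are illustrative assumptions) measures the interval from a program-specified ischemia-onset marker to cold flush or reperfusion and flags whether it falls within the roughly 30-min window described above.

```python
from datetime import datetime, timedelta


def functional_warm_ischemic_time(ischemia_onset: datetime,
                                  end_of_warm_ischemia: datetime) -> timedelta:
    """FWIT: from the hemodynamic marker of ischemia onset (e.g., systolic BP
    falling below the program's 90 or 50 mmHg threshold after WLS) to the end
    of warm ischemia (cold flush for direct procurement, or recommencement of
    the circulation for normothermic regional perfusion)."""
    return end_of_warm_ischemia - ischemia_onset


def within_tolerable_window(fwit: timedelta,
                            limit: timedelta = timedelta(minutes=30)) -> bool:
    """Illustrative check against the ~30-min window cited in the text;
    the acceptable limit is program-specific, not a fixed rule."""
    return fwit <= limit


# Hypothetical timeline
onset = datetime(2021, 6, 1, 10, 5)        # systolic BP crosses the program threshold
cold_flush = datetime(2021, 6, 1, 10, 31)  # cold preservation flush (direct procurement)
fwit = functional_warm_ischemic_time(onset, cold_flush)
print(fwit, within_tolerable_window(fwit))  # 0:26:00 True
```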

DCD donor selection criteria vary between institutions. The upper age limit for the Sydney program increased from 40 years at the commencement of the program in 2014 to 55 years in 2018, with the oldest donor in this program being 54 years of age [63••, 64]. The upper age limit for the Papworth program is currently 50 years, although the oldest donor transplanted was 57 years of age (Stephen Large MD, personal communication) [62]. Preclinical studies suggest that there is an adverse interaction between increasing DCD donor age and functional warm ischemic time, with the aged heart being more susceptible to ischemia/reperfusion injury [65]; however, at present, there is insufficient clinical experience to determine whether the tolerable FWIT decreases with age.

An echocardiogram should be obtained in all prospective DCD donors. Echocardiographic criteria for heart donation are essentially the same as for the DBD donor with regard to left ventricular size, function and hypertrophy, and any structural abnormalities [66]. Any degree of left ventricular systolic dysfunction in the most recent echocardiogram taken prior to WLS should be regarded as a contra-indication to DCD heart donation.

Troponin and other biomarkers are often elevated in potential DCD donors, particularly those who have attempted suicide by hanging. At present, there is no evidence that elevated troponins predict early graft dysfunction or failure after transplantation.

Donor comorbidities such as hypertension and diabetes should be approached in the same way as they are for DBD donors.

Conclusions

More than 50 years after the first human heart transplantation, we still have much to learn about the selection of donors. Heart transplantation remains a field where information is limited, topics are controversial, and opinions are widely held in the absence of definitive evidence. The selection of heart donors encompasses a complex calculus involving many variables, and we have not included all of them. Transplant centers use donor data to optimize selection and thus the chance of survival for their potential recipients. There is no substitute for compulsive scrutiny of all data available on each potential donor, keeping in mind that any single risk factor may be misleading, and that registry as well as single-institution data provide guidelines that must be used with care. One cannot assume that all young donors, all donors using cocaine, or all donors with LVEF > 50%, etc., will provide hearts that function well. As we all have seen, one may no longer assume that female donors should not be used for male recipients. Gender is not an important factor; heart size is. The donor must be reviewed and assessed in relation to each recipient, and the severity of illness of each recipient must be evaluated each time a donor heart is available. The surgeon and/or cardiologist must ultimately balance the risks and benefits in favor of the specific recipient based upon contemporary data.