Introduction

Since ancient times blood has been thought to possess mystical and healing properties, and the notion of changing personal characteristics by means of a transfusion of blood from another species, or from a person with desirable attributes, was acted upon repeatedly, perhaps as early as the sixteenth century. The seventeenth century brought a better appreciation of the circulation of blood and of the idea that blood lost through hemorrhage could be replaced by transfusion. William Harvey’s revolutionary experiments and the publication of “Exercitatio Anatomica De Motu Cordis” in Frankfurt in 1628 introduced the concepts of experimentation and direct observation, initiating the scientific approach to medicine. Animal-to-human and human-to-human transfusions followed, the first in 1666 and 1818, respectively. Not all these attempts were successful: a French physician and naturalist was tried for murder after some unsuccessful animal-to-human transfusion attempts, and transfusion attempts were subsequently prohibited in both England and France [1,2,3]. For much of the nineteenth century blood transfusion was not accepted as a safe medical procedure, the notable exception being the work of James Blundell, a prominent London obstetrician, who recognized that certain circumstances necessitated human transfusion. He developed devices for collecting and administering blood to treat obstetrical hemorrhage and established a donor base. More widespread use of transfusion was hindered by a multitude of “technical” barriers: the absence of methods for sterilizing devices and of appropriate anticoagulants and preservative media. Despite the carnage of the American Civil War and of European wars in the second half of the nineteenth century, the use of transfusions was insignificant. The introduction of saline infusion in 1884 improved the treatment of hemorrhage and dehydration [1].

The use of transfusions in the early decades of the twentieth century was helped by the discovery of the major blood groups. The outbreak of World War I did not see extensive use of transfusions. With the outbreak of World War II, transfusions of blood, plasma, and albumin became strategic endeavors [4, 5]. Soldiers in the German SS had their blood group tattooed in their armpits, but battlefield transfusions were rare.

The approach taken in this chapter is not a conventional chronological narrative of the history. Rather, it highlights milestones in the surgical and critical care use of erythrocyte transfusions only, and refers to these simply as “transfusion”. Blood products and components and the technological aspects of blood banking are not included. The overriding theme of this chapter is blood as a scarce and expensive resource that is handled with a view to risk management, whereby expected benefits and hazards are balanced. It must be emphasized that, for most of this history, compelling clinical-trial evidence weighing the benefit of transfusion against its known risks was not available.

Milestones in Erythrocyte Transfusion

Karl Landsteiner and Discovery of Major Blood Groups

The “coming of age” of blood transfusion began with the revolutionary contribution of the Austrian-born, American physician and immunologist, Karl Landsteiner.

The Early Years of Transfusion: The First Milestone

Karl Landsteiner

Karl Landsteiner (1868–1943) is celebrated for his landmark discovery of the ABO blood groups in 1901 and, together with Alexander S. Wiener, for the discovery of the Rhesus factor in 1937. Landsteiner received the Nobel Prize in Physiology or Medicine in 1930 [6,7,8] (Fig. 1.1).

Fig. 1.1 Portrait of Karl Landsteiner on an Austrian Postal Service commemorative stamp issued on the 100th anniversary of his birth

These discoveries made it possible to infuse another person’s blood into someone in great need of it. Landsteiner discovered the ABO blood group system by testing samples of erythrocytes against samples of serum from other individuals, using the methods of immunology in which he had been trained. Some serum samples caused the blood cells to clump, or agglutinate, while others did not. By repeated testing, he inferred that some element, an antibody, in certain serum samples reacted with an antigen on the surface of the red blood cells, causing agglutination, whereas a person’s own serum never contained an antibody against the antigen carried on that person’s own cells; in other words, a person’s serum contains antibodies only against the blood group antigens absent from their own erythrocytes. He categorized these blood types as A, B and C. Erythrocytes of type A were agglutinated by serum from a type B person, whose serum contains anti-A antibodies, but not by serum from another type A person. The third type, which he named type C, carried neither the A nor the B antigen: its erythrocytes were not agglutinated by serum from either type A or type B persons, although its own serum agglutinated cells of both. Because type C erythrocytes provoke no such reaction, blood from a type C donor could be given to either a type A or a type B recipient. Type C was later renamed Type O (or zero, from the German word “Ohne”, without).

Ironically, the revolutionary observation was first reported by Landsteiner in a footnote in a paper (1900) on pathologic anatomy, describing the agglutination occurring when blood of one person is in contact with that of another person [7, 9]. The actual description of the discovery of the ABC blood groups was published a year later, in 1901. Landsteiner, at first, did not appreciate the importance of his discovery, writing that “I hope that this will be of some use to mankind” [7].

In 1922 he accepted the invitation by Simon Flexner to join the staff of the Rockefeller Institute where he continued to make major discoveries [8].

The discovery of the Rh, or rhesus, factor came about through a case described by Bodner and McKie [10]. The obstetrical patient’s physician was Dr. Philip Levine, who had been an assistant to Landsteiner for several years.

The patient’s first pregnancy had been normal, but her second ended in the loss of her baby, and she suffered a massive hemorrhage. Since both she and her husband had Type O blood, Dr. Levine decided to transfuse her with her husband’s blood. To his dismay, she had a violent transfusion reaction. Dr. Levine reasoned that an antibody against some other blood group antigen must have been involved in the reaction. Indeed, when the patient’s serum was tested against her husband’s erythrocytes, agglutination occurred. Moreover, the loss of the baby was due to an antigen-antibody reaction: the mother’s antibody had leaked across the placenta into the fetal circulation and caused massive lysis of the fetal erythrocytes, which carried an antigen inherited from the father that the mother lacked. This single case was reported by Philip Levine and Rufus Stetson in 1939 in the Journal of the American Medical Association. They noted the similarity of this first detected case to the then few reported cases of iso-immunization after repeated transfusions [11].

Since the mother’s serum caused agglutination of the erythrocytes of rhesus monkeys, as well as those of other animal species, the antibody became known as the rhesus, or Rh, factor, subsequently renamed anti-D. A D-negative mother carrying a D-positive fetus in her first pregnancy has not yet developed antibodies to the fetal antigen but will do so when D-positive fetal cells leak across the placental barrier during delivery. Subsequent pregnancies may then be complicated by the mother’s anti-D antibodies entering the fetal circulation. The consequence of maternal anti-D antibody attacking the D antigen on fetal erythrocytes, intrauterine hemolysis, became known as the Hemolytic Disease of the Fetus and Newborn (HDFN) [8].

Landsteiner’s further contributions included the characterization of similar patterns of reaction with rhesus blood. In 1940 he and Wiener immunized rabbits and guinea pigs with erythrocytes of rhesus monkeys. The resulting anti-rhesus (anti-Rh) serum reacted with the erythrocytes of 85% of humans, indicating the frequency of the Rh-positive phenotype. It is now known that the type D designation encompasses many other agglutinin subtypes detected by cross-matching and phenotyping. After the original discovery of the major blood groups, Landsteiner, his coworkers and many followers discovered at least 36 other systems of minor subgroup types with weaker isoreactions [8].

In addition to these important discoveries, Landsteiner also made many others, including the recognition of the viral origin of poliomyelitis, and the diagnostic test for paroxysmal cold hemoglobinuria [6, 7].

Reuben Ottenberg (1882–1959) performed the earliest form of pretransfusion cross-matching in 1907, recognizing the clinical significance of avoiding hemolytic transfusion reactions. Rigorous typing and cross-matching contributed greatly to the safety of early transfusions; nevertheless, transfusion remained cumbersome and little used because of the lack of adequate anticoagulation and storage methods, so that most transfusions were direct, donor-to-recipient.

The history of the development of anticoagulant and storage technologies, of blood banks and donor bases, and of the introduction of component separation is beyond the scope of this chapter.

The next, second, milestone in this narrative is the recognition that, parallel to the risks of anemia, the expected benefits of transfusion must be balanced against the recognition that risks are also inherent in transfusion itself.

Balancing the Risks and Benefits of Anemia and Transfusion: The Second Milestone

The immunologic investigation of blood group types and antigens accelerated in the 1940s as testing technologies improved and became more routine in blood banks. As a result, the use of transfusions of whole blood, and later of red cell units and components, expanded, both in cases of acute blood loss (surgery and trauma) and in “chronic” cases (postoperative anemia and “medical” anemia, such as that of malignant disease).

With increasing use and availability of blood for transfusion, the prescribing of transfusion became a more common medical treatment, the decision resting on the expectation of benefit to the patient from increased oxygen-carrying capacity and transport. However, there was little objective evidence supporting the expected benefit, especially in the case of single-unit transfusions, which generally produce only a 10 g/L (1 g/dL) increase in Hb concentration.

Because transfusion had long been in use before clinical trials for establishing efficacy and safety were introduced, blood transfusion was never subjected to rigorous trials evaluating its efficacy; it remains one of the few medical interventions without rigorous safety and efficacy testing by clinical trials. More recently, questions have been raised about when a transfusion is appropriate, and the notion of balancing the risks and benefits of both transfusion and anemia has become a dominant consideration, although without evidence-based support. The balance is not simple: transfusion is expected to provide a medical benefit, whereas severe anemia offers none; on the other hand, both carry risks.

The Risks of Anemia

There are no known benefits of severe anemia; its risks need to be considered first.

In an anemic subject oxygen delivery may be impaired, depending on the severity of the anemia, to an extent that physiological functions deteriorate, activity is limited, and organ dysfunction may supervene. This may be explained by the concept of supply dependence: when the supply of oxygen is so limited that a substantial mass of body cells is hypoxic, oxygen consumption falls [12]. Occasional individuals have been observed to survive hemoglobin concentrations as low as 10–20 g/L. However, retrospective aggregated data from Jehovah’s Witnesses, who refuse transfusion on religious grounds, reveal how dangerous severe anemia is. At a persistent hemoglobin concentration of 11 g/L, in-hospital mortality at 30 days was 100%. For every 10 g/L reduction of hemoglobin concentration below 50 g/L, the probability of adverse outcomes, such as myocardial infarction and respiratory and renal failure, doubled [13,14,15].

Thus, the threat to life represented by severe anemia compromising oxygen delivery was thought to mandate medical intervention intended to prevent such hypoxia if possible (e.g., in a case of continuing blood loss). If prevention is not feasible, amelioration is required as soon as possible, and a transfusion would be prescribed in the absence of other effective interventions. The expectation of benefit was tempered only by the then-recognized danger of transfusion reactions (see below). This desired goal is hampered by the lack of objectively definable, universally applicable thresholds to facilitate a rational clinical decision [12, 16].

The Target Organs of Anemia-Induced Injury

The organs most vulnerable to hypoxia are those of obligate aerobic metabolism, the brain and heart. Healthy volunteers subjected to isovolumic hemodilution to a hemoglobin concentration of 50 g/L exhibited reversible cognitive and memory impairment that was improved by oxygen breathing, indicating the mechanism to be hypoxia [17, 18]. Clinical studies have identified cerebral injury in anemic perioperative patients [16, 19]. Jehovah’s Witness patients who refuse transfusion even in the face of severe anemia and/or continuing blood loss (hemoglobin concentration < 80 g/L) were found in an 11-year review to suffer an all-cause mortality rate of 19.8%, and at a hemoglobin concentration ([Hb]) < 50 g/L they are very likely to die [15]. These findings strongly suggest that in patients who may have underlying coronary artery disease, severe anemia represents a real threat. In view of this threat, transfusion has been used in the expectation of benefit and the avoidance of harm.

An excellent experimental study in rats has shown that anemia-induced tissue hypoxia occurs at different levels of [Hb] in different vital organs [20]. The study subjected rats to isovolemic hemodilution to [Hb] concentrations of 90, 70, or 50 g/L, compared with a baseline of 130 g/L. Tissue hypoxia was indicated by increases in hypoxia-inducible factor (HIF)-1α luciferase-reporter activity and nitric oxide synthase (NOS) expression. Whole-body HIF activity increased progressively as the [Hb] was decreased, indicating the presence of tissue hypoxia somewhere in the body even at a [Hb] of 90 g/L. In the kidney, HIF activity was similar to baseline at [Hb] of both 90 and 70 g/L but increased significantly at [Hb] of 50 g/L, suggesting a relative tolerance of modest hypoxia. In contrast, the liver exhibited increased HIF expression already at [Hb] of 70 g/L, indicating that it becomes hypoxic at a higher [Hb].

Parallel to the risks of anemia, the benefits of transfusion must also be weighed against the recognition that substantial risks attend transfusion itself; these risks are considered next.

Benefits and Risks of Erythrocyte Transfusion

The Benefits of Transfusion

How do transfusions benefit a patient facing the risks of anemia?

Transfusion is intended to prevent or ameliorate the signs and symptoms of anemia severe enough to interfere with an oxygen supply sufficient for the physiological demands of effective functioning. The “physiological benefits” of a two-unit transfusion were described in a study of ICU patients undergoing invasive hemodynamic monitoring [21]. The transfusion’s effects included a rise in hematocrit, from 0.22 ± 0.02 to 0.28 ± 0.03, and in [Hb], from 76 ± 8 to 94 ± 9 g/L. It is not clear whether the average pre-transfusion [Hb] of 76 g/L was low enough to be associated with a need for increased oxygen-carrying capacity to ameliorate critical organ hypoxia. There was also significant improvement in hemodynamic variables and oxygen flux and a reduction in heart rate. However, it is not clear whether the documented improvements reflected relief of a physiologically significant degree of tissue hypoxia or whether an improvement in blood volume also contributed. The study did not provide definitive evidence of efficacy.

A related aspect of transfusion’s efficacy is the timing of the benefit. Banked erythrocyte units are well documented to have properties different from those of native erythrocytes: the well-known phenomenon of the “storage lesion” [22]. This consists of changed biomechanical properties of the erythrocyte that significantly impair perfusion in the microcirculation [23,24,25]. Animal experiments have shown that the adherence of stored erythrocytes to capillary walls and their rigidity impair flow and represent a clinical risk [23, 24, 26]. Finally, the breakdown of cells and the release of their fragments and hemoglobin interfere with NO-mediated vasodilator regulation [22]. These effects are reversible within about 24 hours, after which the transfused erythrocytes become functional, but their circulating half-life is shortened.

Effectively demonstrating the benefits of transfusion in individual cases is also subject to uncertainties. Not every patient with a given [Hb] is the same as every other patient with the same [Hb]. This is due to variability in individuals’ physiological adaptation to anemia, which includes:

  • Duration of anemia: chronic vs acute. Physiological adaptations developed to anemia.

  • Increase in cardiac output. Potential redistribution of available blood flow.

  • Modification of erythrocytic 2,3 diphospho-glycerate (2,3 DPG) modulating oxygen unloading.

  • Presence of comorbidities that may affect or limit the physiological adaptations.

The search for objective markers of tissue hypoxia led Hare and colleagues [27] to the kidney as a vulnerable organ during cardiopulmonary bypass. Acidosis and increased plasma lactate concentration were indicative of some degree of tissue hypoxia. Direct measurement of renal medullary pO2 with polarographic electrodes in animal experiments has shown the presence of tissue hypoxia during cardiopulmonary bypass [28]. Erythropoietin (EPO) is released into the plasma in the presence of hypoxic injury to the kidney. A rise in this hormone correlated with the onset and severity of anemia, suggesting that EPO could be a potential biomarker of the need for transfusion to avoid hypoxic injury to the kidney during cardiopulmonary bypass [27]; this potential requires further exploration.

Recognizing the need for objective, evidence-based markers of the need for transfusion, significant efforts have been directed at developing clinical trial-based guidance on the expected benefit of transfusion. Physiological indicators, rather than [Hb]-based ones, have been used as surrogates (e.g., heart rate, ECG changes, mixed venous oxygen saturation, plasma lactate) [29]. A transfusion-attributed 10 g/L increase in [Hb] resulted in a reduction of lactate clearance by >10% and an increase in central venous oxygen saturation by >5% in a third of the subjects [29]. Thus, there were putative, physiologically meaningful benefits in some but not all subjects. Three conclusions may be drawn from this study: first, that objective physiological indicators can be applied to assess transfusion “efficacy”; second, that a 10 g/L increment in a subject’s [Hb] may offer a marginal benefit; and lastly, that not all individuals are alike in their responses and hypoxia tolerance.

The expectation of benefit and of efficacy contributed to a somewhat chaotic and individualistic approach to the use of transfusions, especially in surgical settings. Transfusion practices varied among specialties and institutions, as well as within institutions. Many transfusions were prescribed on the basis of practitioners’ personal values and expectations, as the true magnitude of the hazards of transfusion itself was not fully appreciated.

The Risks of Transfusion

Transfusion Reactions

Transfusion reactions are the first category of risk to consider: adverse outcomes of a specified nature that have long been well recognized.

(Chapter 6 of Part I of this book offers discussion of the nature, frequency, and clinical significance of transfusion reactions directly attributable to an incompatible transfusion.)

Transfusion reactions are identified post facto and their frequency, severity and their putative causes are monitored by national hemovigilance programs in most countries.

Transfusion reactions include [2, 3]:

  • Incompatibility reactions to major or minor antigen mismatch, with or without hemolysis.

  • Anaphylactic or allergic reactions [30].

  • Accidental mismatch or preventable errors: wrong unit given to wrong patient.

  • Transfusion Mediated Immune Modulation (TRIM) [31].

  • Transfusion-Related Acute Lung Injury (TRALI) [32] and Transfusion-Associated Circulatory Overload (TACO) [33].

  • Adverse reactions initiated by inflammatory mediators potentially derived from residual white cells remaining in transfused erythrocyte units.

  • Febrile non-hemolytic transfusion reaction. Delayed serologic reaction.

  • Post-transfusion purpura.

  • Transfusion-Associated Graft vs Host reaction (T-A GVH) most likely affecting immunocompromised patients [34].

Fatal transfusion-related events occurring in the USA and reported to the FDA in the 5 years between 2012 and 2016 totaled sixty-five, of which one-half were hemolytic transfusion reactions. Although fatal transfusion reactions are relatively uncommon and should theoretically be preventable, they do happen and are a cause for concern [35]. The prevalence per 100,000 units transfused is reported yearly. Such reporting helps both the prescriber and the patient weigh the statistical probability of harm when assessing risk tolerance.

Transfusion reactions such as TRALI, TRIM and T-A GVH are rare but serious complications of transfusion.

In the surgical setting the immune suppression due to transfusion may be aggravated by immune suppression due to tissue injury. In such cases the compelling argument favoring a transfusion is the consequence of the blood loss. Immune modulation is a well-known contributing risk factor for nosocomial infections in postoperative patients, and avoiding transfusion, when feasible, is one way to ameliorate this immune suppression. Thus, balancing expected benefits against known and anticipatable risks is the sine qua non of a transfusion decision.

Residual leukocytes in erythrocyte units are thought to be a contributing risk factor in the pathogenesis of TRIM. Hence, increasing attention is directed at producing leukoreduced erythrocyte units. Comparison of transfusions of leukoreduced and non-leukoreduced units has shown superiority of the former [36,37,38,39]. The ongoing universal implementation of leukoreduction and the introduction of other specialized erythrocyte units (e.g., CMV-free units) ameliorate these risk factors.

Transfusion Transmitted Infectious (TTI) Pathogens

In addition to the risk of transfusion reactions, another category of risk is the transmission of infectious pathogens present in donors, since blood cannot be sterilized [40]. The first infectious disease recognized to be transmissible by transfusion was syphilis, and the Serologic Test for Syphilis (STS) was introduced into blood testing in 1935. The actual usefulness of this test is questionable, but it has remained in use. Because refrigeration kills T. pallidum, the clinical risk of syphilis has been overtaken, beginning in 1965, by the more prevalent hepatitis viruses, which are transmissible by both blood and blood products and are not affected by refrigeration [1, 41].

The salient events in this regard in the USA were [1]:

  • Hepatitis B surface antigen (HBsAg) discovered in 1965.

  • Testing of blood donors for Hepatitis B surface antigen introduced in 1972.

  • Transfusion-Associated Acquired Immunodeficiency Syndrome recognized in 1982.

  • Donors deemed to have high-risk behaviors were excluded in 1983.

  • Human Immunodeficiency Virus (HIV) identified in 1984.

  • HIV antibody testing introduced in 1985.

  • Surrogate testing for hepatitis (the liver enzyme alanine transaminase, ALT) and hepatitis B antibody testing introduced in 1987.

  • HTLV antibody testing introduced and hepatitis C virus identified in 1989.

  • Hepatitis C testing introduced in 1990.

  • HIV 2 testing introduced in 1992.

  • Nucleic acid testing and an increasing number of rigorous virus tests introduced in the years following [1].

By the 1970s, the risks to the blood supply from potentially infectious paid donors were recognized, just as the demand for transfusions was increasing with the soaring number of coronary artery bypass graft (CABG) operations performed from the 1960s onward. Many countries shifted from paid to volunteer, unpaid donors to protect the safety of their blood supply.

The HIV/AIDS Catastrophe

The arrival of the Human Immunodeficiency Virus (HIV) and its presence in the blood supply became the third milestone event.

The safety of the blood supply, of erythrocyte units, and of blood products came into question, provoking a panicked response among the public in most countries. People were unwilling to accept transfusions, and circulating untrue rumors caused donations to plummet.

The tragic toll of potentially preventable illness and death became the stimulus for many countries to undertake rigorous and wide-ranging examination of the causes and the failures of national policy and response to the tragedy.

The Response to the AIDS Crisis in Blood

The tragic toll exacted by the HIV and hepatitis viruses in the blood supply focused a searchlight on Transfusion Transmissible Infections (TTI) [41, 42], as a transfusion risk, distinct from transfusion reactions.

In the USA in the early 1980s, 10,000 hemophiliacs and 12,000 other patients were infected with HIV through blood and blood products, and about 300,000 additional persons were infected with HCV. To quote [43]: “The lessons from these tragedies compel greater vigilance and higher regulatory standards to protect the Nation’s blood supply from emerging infectious agents and blood borne pathogens” [43,44,45]. Several policy recommendations were made to establish, by statute, sub-Cabinet-level committees and agencies responsible for protecting the safety of the blood supply. The introduction of new safety measures and policies to safeguard the blood supply was soon justified by the challenge of the emergence of a novel infectious agent, the Zika virus [46]. As of 2018, all donors are screened by a high-performance Nucleic Acid Amplification Test (NAT). This newly emerged threat emphasized the importance of vigilance and horizon scanning to prevent the recurrence of an HIV-like crisis [47].

The introduction of new technologies for rigorous screening of blood to maximize its safety, e.g., NAT testing, also added to the cost of providing blood for transfusion.

NAT testing was introduced into the screening of donated units as a sensitive and specific identifier of viral RNA or DNA during the “window period”, before infection can be detected by serological tests. The tests can be performed either on single samples or on pools (“multipacks”) combining multiple samples. NAT testing was first introduced in Germany in 1997, followed by the Netherlands in 2000; approximately 33 countries now use NAT testing on their blood collections.

NAT tests performed on pooled samples have the advantage of lower cost, since fewer tests are needed than when every donation is tested individually. Their disadvantage is that once a pool is identified as positive, all units contributing to it must be quarantined until further testing identifies the positive unit and allows the rest to be released [48]. The alternative, testing single units, increases sensitivity and avoids delays in releasing non-reactive units, but multiplies the total cost. A simple illustration of this trade-off is sketched below.
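The following sketch is a simplified, Dorfman-style pooling model; the pool sizes and the prevalence value are hypothetical and are chosen only to illustrate why pooling reduces the number of tests when positives are rare. It is not the screening protocol of any actual blood service.

    # Simplified Dorfman pooling model (illustrative only).
    # Assumption: a pool is tested once; if it is reactive, every unit in it is retested individually.
    def expected_tests_per_donation(pool_size: int, prevalence: float) -> float:
        if pool_size == 1:
            return 1.0  # individual testing: one test per donation, no retest step
        p_pool_reactive = 1.0 - (1.0 - prevalence) ** pool_size
        # One pooled test shared by pool_size donations, plus one retest per donation
        # whenever the pool turns out to be reactive.
        return 1.0 / pool_size + p_pool_reactive

    prevalence = 1e-4  # hypothetical rate of NAT-detectable, serologically silent units
    for pool_size in (1, 6, 16, 96):
        tests = expected_tests_per_donation(pool_size, prevalence)
        print(f"pool size {pool_size:>2}: about {tests:.3f} tests per donation")

At such low prevalence the pooled test is almost never reactive, so the expected number of tests per donation approaches 1 divided by the pool size; the saving is bought at the price of quarantining an entire pool whenever one of its units is reactive.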

Cost-effectiveness in pharmacoeconomic analysis uses the metric of the incremental cost per Quality-Adjusted Life Year (QALY) gained. Pharmaco-economic analyses were performed in several countries following the implementation of NAT testing. In the United States [49], a study found that NAT testing performed on minipool samples, added to the serological screening then in place, would avoid an estimated 37, 128, and 8 cases of HBV, HCV, and HIV, respectively, and would add 53 years of life and 102 QALYs, at a net cost of $154 million. For relative scale, note that approximately eight million units are transfused annually in the USA. The incremental cost ratio was estimated at $1.5 million per QALY gained. The authors concluded that the cost-effectiveness of adding NAT screening to the US blood system would be outside the typical range for most health care interventions, although not for established blood safety measures.
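For orientation, the incremental cost-effectiveness ratio can be reproduced from the rounded figures quoted above:

    net cost ÷ QALYs gained = $154,000,000 ÷ 102 ≈ $1.5 million per QALY,

which is the ratio reported by the authors.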

Following the introduction of NAT testing in Germany in 1997, the German Red Cross reviewed its experience with NAT testing of the German blood supply [50]. In the eight-year period 1997–2005, 30.5 million donations (about 80% of the total blood collected) were tested. A total of 27 HCV, seven HIV-1, and 43 HBV positive donations were detected that would have been missed by serological methods alone. With NAT testing covering the “window period”, the residual risk per unit transfused was estimated at 1 in 10.88 million units for HCV, 1 in 4.3 million for HIV-1, and 1 in 360,000 for HBV. The authors concluded that the risk avoided by the addition of NAT testing was “very low”, and achieved at substantial cost.

A third study, conducted in Zimbabwe, shows how extreme inequality between high- and low-income countries affects these policy decisions [51]. The addition of NAT testing was estimated to prevent 25 HBV, six HCV, and nine HIV infections, respectively. The incremental cost was estimated at US $17,774 for each QALY gained: three times the gross per-capita income in Zimbabwe, failing any test of reasonable cost.

Thus, the mandate to maximize the safety of the blood supply in high-income countries comes at a high cost that is nevertheless felt to be within their national priorities; it is clearly an impossibility in countries with low incomes and failed economies.

Canada’s blood system was severely impacted by the HIV crisis, and by the failure to respond in a timely manner by introducing surrogate testing of collected blood for hepatitis before the identification of HIV; the panic was aggravated and the tragedy amplified. Criminal charges were filed against several individuals deemed responsible for the delays in recognizing the threat and for failing to act in a timely manner; at trial, those charged were not convicted. Those responsible, however, were confronted by many of the victims, and the participation of victims in the review of the events gave those affected an opportunity to express their grief.

A wide-ranging and clear-eyed examination of all the factors was undertaken by a Royal Commission under Mr. Justice Horace Krever over 3 years and costing CDN $ 17 million [52]. The three volumes and appendix take an enormously expansive look at all aspects of the provision of all blood products and components and the means available for reducing contamination. The policy recommendations were far reaching. Before the crisis, the Canadian Red Cross managed all aspects of donor recruitment, donations and processing of blood and components, except for apheresis collection and processing blood products.

The Commission recommended a complete reorganization of all aspects of Canada’s blood system, and all its recommendations were implemented by statute. The Canadian Red Cross lost all participation in managing the blood system. Blood, blood products and components were to be treated not as commodities but as taxpayer-funded public goods. A completely new organization, Canadian Blood Services (CBS), was set up on a nationwide scale to become the overall manager of blood collection (from volunteer donors only) and of all aspects of processing and supply, and to include organ transplantation and stem cells under its aegis as well. Cord blood collection remained in private hands. No blood or blood components are imported into Canada, and blood products are imported only after heat treatment. CBS encompasses all Canadian provinces and territories except Quebec, where a similar organization, Héma-Québec, fulfills a similar mandate. CBS’s global budget is funded from provincial and federal contributions on an annual basis and is overseen by a council of all Health Ministers; Héma-Québec receives its funding from the Province and the federal government. CBS provides hospital blood banks and other blood users with all blood and components free of charge, and recipients are not charged for any services. Blood products for hemophiliacs are provided free by the provincial health insurance agencies.

CBS screens all blood collected with NAT testing for HIV-1 and 2, HBV and HCV, as well as for West Nile Virus during the summer season and for Chagas’ disease (T. cruzi) in travelers.

Volume 3 of the Krever Report provides an exhaustive review of international events and national blood systems, including those of the USA, and comparisons made between systems.

Ten years following the Report, an appraisal concluded that the reform of the Canadian blood system had been successful: the public has been kept safe from transfusion-transmissible infectious threats by rigorous screening, deferral of potential high-risk donors, and an all-volunteer, loyal donor base [53].

Two non-fiction books by Canadian journalists tell the story of those affected in Canada [54, 55].

The World Health Organization (WHO) publishes periodic reports on blood safety and availability in most countries [56].

Thus, the milestone event of the HIV crisis focused attention on transfusion transmissible infections (TTI). TTIs thereby came to be recognized as the second category of serious transfusion risk, in addition to transfusion reactions. In many countries, national policies were introduced mandating maximal efforts to safeguard this scarce and precious resource, and the costly public effort to restore trust in the safety of blood has been successful in many of them.

As if to reinforce that the emergence of novel and rare infectious disease threats requires continued vigilance and rapid response, the Zika virus, which arose in Micronesia, is thought to have been carried to Brazil by travelers from French Polynesia. Sporadically reported from Africa and Asia before, this mosquito-borne virus attacked an immunologically naïve population in Brazil and caused the birth of thousands of microcephalic infants. The arrival of the virus in 2015 caused an international public health emergency [47]. Infected adults are viremic, but 80 percent are asymptomatic, spreading the virus widely [57]. Potentially viremic blood donors without symptoms would threaten the blood supply if sensitive testing were not introduced promptly. While a NAT-based test became available in Brazil, not all blood centers were required to introduce it universally. A few cases of transfusion-transmitted infection have apparently been reported, although the overwhelming majority of infections did not enter the blood supply. According to AABB criteria, the virus should be classified as a high-risk infectious agent [46]. Whereas most infections cause no symptoms, the virus is also implicated in rare cases of Guillain-Barré syndrome; hence, recipients of infected units are at risk of serious but rare complications. The Zika virus is thus another infectious agent that threatens the blood supply in endemic areas and poses challenges to blood collection [58].

In the USA, the FDA issued guidance in August 2016 recommending universal NAT testing of blood donors for Zika. By then, more than 4000 travel-related Zika infections had been reported to the Centers for Disease Control and Prevention (CDC) [57].

As health care costs escalate in most countries, the distribution of scarce resources, including financial ones, becomes an important consideration, and pharmaco-economic analysis is being applied to aid decision-making about resource allocation. The mandate to assure the best available safety of the blood supply is likewise constrained by the escalating costs of newly mandated tests and more expensive technologies. Economic analyses have been applied to blood processing and transfusion-associated costs [59]. Because the effectiveness of transfusion has often been overestimated while its risks have been underestimated, the cost-effectiveness of transfusion as a frequent medical-surgical intervention needs to be examined [59].

Transfusion-Attributable Adverse Outcomes

The fourth milestone event is the recognition that those receiving transfusions are at risk of adverse outcomes that occur more frequently than in those not exposed to transfusion. These adverse outcomes are recognized, on presumptive evidence, as the third category of risk affecting transfusion recipients, in addition to transfusion reactions and TTIs.

Jehovah’s Witness patients undergoing cardiac surgery are an instructive cohort to compare with patients undergoing similar procedures who receive transfusion. Cardiac surgery patients are good examples because they are at high risk of needing transfusion owing to uncontrolled bleeding, anticoagulant use, and coagulation defects. A statistical tool, “propensity matching”, enables the selection, from a large cohort, of patients who are closely comparable to a smaller cohort when the two cohorts differ in a single attribute, namely exposure to transfusion. A study from the Cleveland Clinic [14] retrospectively reviewed 87,775 consecutive cases undergoing Coronary Artery Bypass Grafting (CABG) over a seven-year period. Of this population, 56% (48,986) received transfusion(s). Using propensity matching, the study selected 322 transfused patients matched to 322 untransfused Jehovah’s Witness patients. The matching created two cohorts of equal size, comprising patients who resembled each other with respect to many preoperative and operative characteristics but differed with respect to transfusion exposure. During the 30-day postoperative period there were 14 deaths among the transfused and 10 among the untransfused Witness patients (14/322, 4.3%, vs 10/322, 3.1%; not significantly different). However, significantly more adverse events (myocardial infarction, respiratory failure and reoperation) occurred in the transfused cohort. Indicative of the severity and frequency of adverse outcomes, longer ICU and hospital lengths of stay (LOS) were also seen in the transfused patients. Long-term survival among those followed up also favored the Witness patients.
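To make the matching idea concrete, the following is a minimal, hypothetical sketch of propensity-score matching on synthetic data; the variables, model, and numbers are invented for illustration and are not the method or code of the cited study.

    # Minimal propensity-score matching sketch on synthetic data (illustrative only).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)

    # Hypothetical cohort: age, preoperative hemoglobin, and a transfusion indicator.
    n = 5000
    age = rng.normal(65, 10, n)
    hb = rng.normal(130, 15, n)
    # Assume older, more anemic patients are more likely to be transfused.
    logit = 0.03 * (age - 65) - 0.04 * (hb - 130) - 0.5
    transfused = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    # Step 1: estimate each patient's propensity score, P(transfused | covariates).
    X = np.column_stack([age, hb])
    ps = LogisticRegression(max_iter=1000).fit(X, transfused).predict_proba(X)[:, 1]

    # Step 2: pair each untransfused patient with the transfused patient whose
    # propensity score is closest (nearest-neighbor matching with replacement).
    nn = NearestNeighbors(n_neighbors=1).fit(ps[transfused].reshape(-1, 1))
    distances, matches = nn.kneighbors(ps[~transfused].reshape(-1, 1))

    print(f"{(~transfused).sum()} untransfused patients matched; "
          f"median propensity-score distance {np.median(distances):.4f}")

In the actual studies, matching was performed on many preoperative and operative covariates; the principle, however, is the same: outcomes are compared only between patients who were similarly likely to be transfused.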

A study deploying similar methodology found differences in adverse outcomes between transfused and untransfused patients similar to those in the Cleveland Clinic study above [60]. The study population comprised 857 matched pairs. More of the transfused patients experienced myocardial infarction, respiratory and renal failure, reoperation, and longer ICU and hospital LOS. These comparative studies, while not definitive, do suggest that exposure to transfusion may be a contributing risk factor for more frequent adverse outcomes. This introduces the concept that transfusion avoidance may be a desirable clinical goal, sparing patients some of the excess risks that transfusion recipients may experience, and leads to the fifth milestone.

Transfusion Avoidance and Blood Conservation

From the foregoing narrative it is evident that there are benefits and risks to be considered when a transfusion decision is made. Thus far, both have been considered in the abstract, without regard to the severity of the patient’s anemia and its risks. From the consideration of the risks of anemia above, it is evident that a [Hb] below 50–60 g/L is a significant threat to survival; even that threshold may depend on the patient’s physiological reserves and resilience.

To summarize the intertwined risks and benefits of anemia and transfusion:

  • SEVERE ANEMIA:

    • If untreated, is a threat to health and survival.

    • Is a predictor of the need for transfusion.

    • Is potentially improved by transfusion, which averts the threats of anemia.

  • TRANSFUSION:

    • Has inherent risks: transfusion reactions, transmitted infections, transfusion-attributable enhanced risk of adverse outcomes.

    • There is benefit in avoiding transfusion: avoid above risks to individual.

    • Benefits to community: Conserve scarce resources: blood, financial.

The predictive importance of preoperative anemia was evaluated in a cohort of 33,411 patients undergoing elective cardiac surgery [61]. Thirty-one percent (n = 10,357) of these patients received transfusion(s), indicating how frequently transfusions are prescribed in these circumstances. The likelihood of transfusion correlated with preoperative anemia, and higher adjusted mortality and a greater number of adverse outcomes correlated with receipt of transfusion, indicating that transfusion is an independent risk factor for additional adverse outcomes [62, 63].

In each case, an additional consideration of cost differences may also apply [64, 65] (see below).

Following the recognition of transfusion-associated risks to patients, the guidelines then available (1997) for the use of erythrocyte transfusion were reviewed [66]. The review found no expert consensus-based guideline or practice recommendation offering objective guidance for erythrocyte transfusion.

In 2011 a paper appeared in the British Journal of Anaesthesia with the provocative title, “What is really dangerous: anaemia or transfusion?” [67] Shander and colleagues reviewed the physiological mechanisms available to protect from hypoxic tissue and organ injury and called for further research to characterize these risks to better enable rational transfusion decisions that minimize risks and maximize benefits.

The Search for an Objective Transfusion “Trigger”

The fifth milestone:

The Transfusion Requirements in Critical Care (TRICC) Study

The first randomized controlled clinical trial intended to find objectively definable transfusion “triggers” in ICU patients appeared in the New England Journal of Medicine on February 11, 1999 [68]. The study was designed to test non-inferiority between two groups of critical care patients in 22 tertiary care and three community hospitals in Canada and the USA. The two groups were randomized to transfusion strategies maintaining their [Hb] either within the range of 70–90 g/L (the so-called restrictive cohort) or within the range of 100–120 g/L (the so-called liberal cohort).

Carefully selected inclusion/exclusion criteria and characterization of each subject’s pre-randomization profile (using measures such as the Multiple Organ Dysfunction Score (MODS)) [69] were recorded to allow clinical comparison of the two cohorts, as were daily measures during the trial to compare outcomes. The enrolled population was randomized one-to-one into the restrictive or the liberal transfusion cohort (n = 418 and n = 420, respectively). The primary outcome measures were mortality at various time points.

The subjects were successfully maintained in their assigned [Hb] ranges (85 ± 7 and 107 ± 7 g/L, p < 0.01). Mortality rates in the ICU and at 30 days were lower in the restrictive than in the liberal group. The mean number of transfusions received differed significantly between the groups: 2.6 ± 4.1 vs 5.6 ± 5.3 units per subject in the restrictive and liberal groups, respectively. The difference was also evident in the number of transfusions avoided: 138 of the 418 subjects in the restrictive group (33%) avoided transfusion entirely, whereas all subjects in the liberal group received at least one transfusion. Overall, the restrictive strategy roughly halved the total number of units transfused (the restrictive group received, on average, only 46% as many units per subject as the liberal group). The clinical severity scores at entry to the trial also showed that less severely ill subjects and those under 55 years of age tolerated relative anemia better and were less likely to experience adverse outcomes.

The demonstration of non-inferiority of the two treatment strategies indicated that an objective transfusion “trigger” can be defined for transfusion decisions in critical care patients. Moreover, patients transfused at triggers in the [Hb] range of 70–90 g/L did not experience more severe outcomes than those transfused at triggers in the range of 100–120 g/L, and in the former group the avoidance of transfusions did not increase the risk of anemia-related adverse outcomes. Thus, using an objective “diagnostic indicator” of the need for transfusion can contribute to blood conservation without apparently sacrificing patient safety. Finally, the study showed that it is not necessary to restore patients’ [Hb] to the reference “normal” range of 140–150 g/L; the interval between the “breakpoint” of ≤50 g/L, predictive of severe risk as noted above, and the tolerable level of ≥70 g/L defines a relatively narrow band of risk tolerance in intensive care.

Of course, the TRICC study may not be fully generalizable to all critical care patients. Those suffering from coronary artery disease may be especially vulnerable to the risks of anemia [15]. Indeed, a subgroup analysis of the TRICC study [70] found that in subjects with severe cardiovascular disease a more liberal threshold may be prudent and beneficial.

The avoidance of unnecessary transfusions and of transfusion-related adverse outcomes may have salutary financial benefits as well. A theoretical model published in 2007 estimated that, with universal adoption of the restrictive transfusion threshold, the then-current 3.070 million units transfused annually in critical care patients in the USA would be reduced to 1.778 million units, a reduction of 42%. The model also estimated the avoidance of 1624 severe transfusion-attributable adverse outcomes, a reduction of 69%. Using an average unit cost of US $634, the estimated annual cost saving would be US $821 million, or 42%, if all ICU transfusion decisions were based on the restrictive threshold [71]. While these numbers are far out of date, their message is significant: substantial savings could be achieved by restricted use of transfusion, and patients could avoid a significant number of transfusion-attributable adverse outcomes, all while conserving blood and financial resources.
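For transparency, the savings estimate follows from the model’s own rounded figures: 3.070 − 1.778 = 1.292 million units avoided (42% of 3.070 million), and 1.292 million units × $634 per unit ≈ $819 million, essentially the $821 million reported; the small difference presumably reflects rounding of the published unit counts.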

The findings of the TRICC study have been confirmed in similar large-scale randomized trials in Europe: no significant differences in mortality rates were found between restrictive and liberal transfusion strategies, and significant blood sparing was demonstrated [72]. Meta-analyses of published trials comparing lower and higher transfusion thresholds have also confirmed these general conclusions [73,74,75,76,77,78,79]. A more recent meta-analysis paying particular attention to patients with cardiovascular disease recommended a more cautious approach in this population [80]. Longer-range (6-month) outcomes after hospital discharge of anemic patients were also assessed and found not to differ between recipients of transfusions triggered by the two strategies [81, 82].

A systematic review of available meta-analyses provided an overview of all reviews comparing mortality at restrictive and liberal [Hb]-based thresholds [83]. This review comprised 33 meta-analyses of variable quality. Among the 16 analyses of good or moderate quality, mortality was lower among subjects assigned to restrictive transfusion thresholds. Thus, a large and diverse set of subjects from many institutions who were transfused at restrictive thresholds did not experience greater mortality than those transfused at liberal thresholds.

Guidelines recommend that transfusions be used sparingly in critical care units, that excessive phlebotomy be avoided, and that alternatives to transfusion, such as erythropoietin, be used [84, 85]. Separate guidelines have been published on transfusion support for CABG patients, recommending more frequent use of preoperative autologous donation and blood salvage, and the establishment of a multidisciplinary approach using interventions that avoid allogeneic transfusion [86].

A major review evaluated whether the two transfusion thresholds are associated with different risks of health care-associated infection [87]. The pooled risk of infection acquired in the health care setting was 10.6% in the restrictive and 12.7% in the liberal transfusion-threshold cohorts. The relative risk (RR) for all infections with the restrictive threshold was 0.92 (95% confidence interval, CI, 0.82–1.04), not significantly different between thresholds. The RR for serious infections was 0.84 (95% CI 0.73–0.96), i.e., the risk of serious infection was significantly higher with the liberal threshold. No difference was found between leukocyte-reduced and non-leukocyte-reduced units transfused.

An attempt was made to review all available studies comparing restrictive and liberal transfusion thresholds in surgical and critical care settings [88]. The review comprised 31 trials involving 12,587 participating subjects. The studies used a [Hb] of 70 g/L as the restrictive trigger and 80–90 g/L as the liberal trigger, and the cohorts assigned to the two thresholds were approximately evenly matched. Use of the restrictive threshold reduced the probability of receiving a transfusion by 43%, while neither increasing nor significantly decreasing 30-day mortality or any other adverse outcome assessed, compared with the liberal cohort. The authors concluded that the restrictive threshold reduced the risk of receiving a transfusion without significantly altering the subjects’ other risks.

Restrictive vs Liberal Transfusion in Cardiac Surgery

Coronary Artery Bypass Grafting (CABG) surgery is the numerically largest subset of cardiac surgery and accounts for significant intra- and postoperative transfusions, cumulatively a large proportion of surgical transfusions overall.

A large prospective trial compared outcomes in 4860 subjects undergoing cardiac surgery in Canada, Australia, and New Zealand (the TRICS III study) [76]. The subjects were randomized one-to-one into either restrictive or liberal transfusion arms, with transfusion thresholds of [Hb] < 75 g/L in the former and [Hb] < 95 g/L in the latter. The two groups were comparable with respect to their preoperative demographic and clinical profiles and surgical procedures. The composite outcome measures were death, stroke, myocardial infarction, and new renal failure. All primary outcomes were comparable between the two arms. One thousand one hundred and fifty-nine subjects (48%) avoided transfusion in the restrictive arm, as opposed to only 663 in the liberal arm; thus, roughly twice as many liberal-arm as restrictive-arm subjects received intraoperative transfusions. Significantly more liberal-arm subjects were also exposed to postoperative transfusions (52% vs 36%). In the total population of 4860 subjects, 8987 erythrocyte units were consumed, with a markedly uneven distribution: 3486 units in the restrictive arm and 5501 units in the liberal arm; over 2000 units were saved in the restrictive arm, whose subjects did not suffer significantly worse outcomes. The only substantive difference observed was a longer aggregate ICU LOS (by 9.7%) in the restrictive-arm subjects; the excess ICU costs in this cohort may be offset by the saving of over 2000 transfusion units. Long-range outcomes in anemic patients discharged from hospital were also found to be similar [63, 82].

A systematic review and meta-analysis of 13 similar randomized controlled trials, including the trial described above, followed [89]. The review comprised 9092 patients undergoing cardiac surgery. The adjusted risk ratios for mortality, myocardial infarction, stroke, and arrhythmia were all similar between restrictive- and liberal-transfusion-treated subjects. Unlike in the TRICS III study above, both aggregated ICU and hospital LOS were similar. While all other observed risk ratios were similar, the risk of receiving an erythrocyte transfusion favored the subjects in the restrictively treated cohort [90].

In summary, numerous randomized controlled trials and meta-analyses support a restrictive approach to transfusion at thresholds of [Hb] 70–75 g/L, which permits transfusion exposure to be avoided or minimized among cardiac and other surgical patients. Thus, rational, evidence-based avoidance of unnecessary erythrocyte transfusion minimizes transfusion-attributable risks, spares blood resources, and does not expose patients to excessive anemia-related risks.

Systematic reviews have also been published on cardiac surgery in children with congenital heart disease [91] and on neurocritical care in adults [92].

Another systematic review and meta-analysis also found consistently similar outcomes between subjects treated with restrictive and liberal transfusion strategies [93]. One exception was two small trials (n = 154 subjects) in acute myocardial infarction, in which the liberal transfusion strategy appeared more favorable [80]. Similar caution, favoring more liberal transfusion thresholds, is recommended for patients with cardiovascular disease undergoing non-cardiac surgery [80].

Such clinical trials conducted on intensive care subjects have had their critics, largely because of the diversity of the patients studied [94,95,96,97,98].

The alternative to transfusion in surgical practice is bloodless surgery. This practice utilizes meticulous hemostasis and attention to coagulation, and offers the advantage of avoiding transfusion and the adverse outcomes associated with it [99].

This provides the transition to the sixth and last milestone:

Patient Blood Management

This transfusion landscape convinced Shander and colleagues to propose an entirely new paradigm for the use of transfusion [100]. They proposed that instead of treating the Hb concentration of an anemic patient, the patient with anemia should be treated, and that prudent use of transfusion, “this lifesaving, costly, limited and dangerous resource”, was required [100].

Shander and colleagues posited that “the vast majority of transfusions in surgical patients can be attributed to low preoperative hemoglobin levels, excessive surgical blood loss and /or inappropriate transfusion practices” [99].

The multimodal Patient Blood Management (PBM) program is conceived as resting on three pillars [100,101,102]:

  • Optimizing hematopoiesis.

  • Minimizing operative and other blood losses.

  • Harnessing and optimizing physiological tolerance of anemia.

The rationale for transfusion to treat anemia is based on the determinants of oxygen delivery, i.e., cardiac output and the oxygen content of the blood, the latter determined largely by the [Hb]. Of these two, only the cardiovascular adjustments can be altered quickly, whereas raising the hemoglobin concentration by erythropoiesis is slow; thus, the first pillar demands preoperative attention to [Hb]. The first and third pillars can be manipulated by the clinician’s pharmacological armamentarium. The second pillar demands meticulous operative hemostasis and immediate correction of coagulation disorders. When the three pillars do not offer sufficient relief, recourse to transfusion rests on an evidence-based assessment of the attendant risks, taking into consideration the anemia tolerance of the patient in question.
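For orientation, the underlying relationship is the standard physiological one, quoted here with its conventional constants rather than taken from the cited papers:

    oxygen delivery (DO2) = cardiac output × arterial oxygen content (CaO2)
    CaO2 ≈ (1.34 × [Hb] × SaO2) + (0.003 × PaO2)

with [Hb] in g/dL, SaO2 expressed as a fraction, PaO2 in mmHg, and CaO2 in mL of oxygen per dL of blood. At a [Hb] of 14 g/dL (140 g/L) and an SaO2 of 0.98, CaO2 is roughly 18–19 mL/dL; halving the [Hb] roughly halves the arterial oxygen content, a deficit the circulation can offset only by raising cardiac output or by extracting more of the delivered oxygen.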

The proposal is based on the recognition that hidden anemia is common, especially in the elderly and in many disadvantaged groups. When anemic patients require surgical or critical care, they have often received transfusions. The benefits of the new program lie in minimizing risks, conserving scarce resources and controlling some of the costs in anemic patients who subsequently require surgical or critical care.

The scientific basis for Patient Blood Management (PBM) was laid out in 2015 [103]. The concept is defined as comprising “measures to avoid transfusion such as anemia management without transfusion, cell salvage and the use of anti-fibrinolytic drugs, to reduce bleeding as well as restrictive transfusion” only if needed. “It ensures that patients receive the optimal treatment, and that avoidable, inappropriate use of blood and components is reduced.” The concept has become widely implemented in Europe [104].

Thus, the prevention of anemia is now a desirable objective of medical care. It requires a high degree of cooperation among many disciplines to ensure that all therapeutic modalities are brought to bear to achieve the best outcomes for the patient. To promote the acceptance and implementation of PBM’s principles and practices, a new institute was formed: The Institute for Patient Blood Management and Bloodless Medicine [99, 101].

Several publications have assessed the PBM program, and these have been subjected to systematic review [105]. The review comprised a total of 235,779 surgical patients reported in 17 studies published between 2008 and 2017, comparing 100,886 patients before the implementation of PBM (pre-PBM) and 134,893 after PBM was implemented (post-PBM). In the post-PBM population, transfusion rates were 39 percent lower than in the pre-PBM cohort, a mean of 0.43 fewer erythrocyte units per subject was used, and hospital LOS was shortened by 0.45 days per patient. The total number of in-hospital days was reduced by 40% after PBM introduction, while the total number of complications was reduced by 20% and the mortality rate decreased by 11%. The participating institutions used their own institutional transfusion thresholds, but the “before-and-after” comparisons favor the conclusion that PBM introduced at many institutions resulted in cost and blood savings and improved patient outcomes, without disadvantaging those patients hospitalized post-PBM. It is admittedly possible that practices other than PBM also contributed to the observed differences between pre-PBM and post-PBM patients’ outcomes. For example, hospital practices may have changed during the nine-year interval, favoring earlier discharge to prevent nosocomial infections. Other factors may also have contributed. Nevertheless, in a large and diverse patient population from many institutions, PBM practices were at least an important contributing factor.

If we accept that PBM played an important role in the data above, then the observed average changes in outcomes can be seen in a different light. The reduction of 0.43 erythrocyte units per patient may seem trivial, but across the aggregated post-PBM cohort of nearly 135,000 patients it represents a total of about 58,000 units saved. If this argument were extended to the total 216,657 patients, the savings would have exceeded 93,000 units, somewhat less than 1% of the total annual use of erythrocyte units. Likewise, the mean reduction in hospital LOS of 0.45 days per patient, applied across the 100,886 pre-PBM patients, would aggregate to a total of 45,398 hospital days saved. These findings show clear benefits that can be attributed to the PBM program, without a substantial excess in adverse outcomes.
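As a rough illustration of how these aggregate figures follow from the per-patient means, the arithmetic can be sketched as follows (a back-of-the-envelope calculation using the numbers quoted above; the variable names are illustrative and not taken from the cited review):

```python
# Back-of-the-envelope check of the aggregate savings quoted above.
# All input figures are taken from the text; names are illustrative only.

post_pbm_patients = 134_893        # patients treated after PBM implementation
pre_pbm_patients = 100_886         # patients treated before PBM implementation
units_saved_per_patient = 0.43     # mean reduction in erythrocyte units per patient
los_saved_per_patient = 0.45       # mean reduction in hospital LOS (days per patient)

units_saved = post_pbm_patients * units_saved_per_patient   # ~58,000 units
days_saved = pre_pbm_patients * los_saved_per_patient       # ~45,400 hospital days

print(f"Erythrocyte units saved (post-PBM cohort): {units_saved:,.0f}")
print(f"Hospital days saved: {days_saved:,.0f}")
```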

Lastly, the review also identified the surgical specialties in which the greatest benefits could be expected. Orthopedic patients experienced the greatest reductions in transfusion exposure (55%) and mortality (27%). Cardiac surgical patients experienced the greatest reductions in number of units transfused (0.87 units per patient) and in-hospital LOS (1.34 days per patient).

The cost containment afforded by appropriate management of surgical transfusions, which also avoids the risks contributed by inappropriate transfusions, has major implications for the management of scarce health care resources. Pre-empting inappropriate transfusions through the PBM program is clearly indicated [59]. Orthopedic surgery accounts for 45% of surgical patients exposed to transfusion(s), accounting for about 10% of all erythrocyte units transfused. Preoperative transfusions in anemic patients have not been shown to be beneficial, as postoperative complications are not reduced. The implementation of PBM has clearly been shown to be efficacious and has been successful in the USA. Western Australia’s government is a leader in promoting efficient blood utilization, with good results [106]. PBM is also implemented in most European countries. A review of its status in individual countries is given in [59, 104].

A direct case for cost containment can be made by reducing transfusion exposure and its attendant risks. A retrospective analysis of hospitalized patients in Australia compared costs incurred by transfused and untransfused patients [64]. Costs for a total of 89,996 acute care hospitalized patients were analyzed and subjected to multiple regression to eliminate confounding variables. Four thousand eight hundred and five patients (5.3%) were transfused. This cohort incurred mean costs 83% greater than the mean in untransfused patients. The study’s specific findings may be questioned, as the receipt of transfusion(s) may be a surrogate marker for greater acuity, but the statistical analysis attempted to account for this. The total transfusion-associated excess cost was equivalent to US $72 million, or about $15,000 per transfused patient, far exceeding the direct cost of the transfusion(s) per se. Direct hospital costs of allogeneic, autologous, and perioperative transfusions were analyzed in Sweden [65]. The average direct cost of a two-unit transfusion was equivalent to approximately US $678.
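The per-patient figure follows directly from the reported totals (our arithmetic rather than the study’s):

```python
# Simple check of the per-patient excess cost implied by the reported totals.
transfused_patients = 4805            # transfused patients in the Australian cohort
total_excess_cost_usd = 72_000_000    # reported transfusion-associated excess cost

per_patient = total_excess_cost_usd / transfused_patients
print(f"Excess cost per transfused patient: ${per_patient:,.0f}")
# ≈ $14,984, consistent with the quoted figure of about $15,000
```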

The benefits of PBM have been recognized in numerous publications over the past 5 years. It represents a fundamental shift from a product-centered to a patient-centered approach to transfusion [99, 100, 102, 107,108,109,110,111]. International conferences in 2017 [112] and 2019 [113] have published guidelines for the implementation and practice of PBM.

The PBM program is consistent with the principles of sound risk management [114]. This publication reviewed the direct and indirect hazards associated with transfusions and provided medico-legal considerations for clinical risk management. It posited that PBM represents the current state of the art, serving not only to avoid adverse outcomes for patients but also to minimize the risk of litigation for practitioners and institutions. The authors noted that:

  • Blood transfusion is now clearly known to have hazards that are avoidable, and PBM is the program currently most capable of minimizing those hazards.

  • It is recognized that PBM is now the state of the art of surgical transfusion practice.

  • Failure to follow current state of the art practice regarding transfusion may have deleterious clinical consequences that may also expose practitioners and institutions to enhanced litigation risk.

It is important to note that transfusion avoidance is not the withholding of necessary medical treatment. Transfusion is still necessary in many circumstances, including continuing uncontrolled bleeding, when fluid resuscitation results in critically low [Hb], in chronic severe anemia due to bone marrow failure or chemotherapy, in the prevention of strokes in children with sickle cell disease, and in hemoglobinopathies. But avoidance is an ethically justifiable treatment decision when the risks and consequences of transfusion(s) outweigh the low risk of death from anemia. The clinical trials showing non-inferiority of mortality outcomes in subjects with low [Hb] indicate that the risk of death from avoiding transfusion is acceptably low at [Hb] greater than 70–80 g/L, except in cases of severe coronary artery disease. A study in 2012 found that transfused patients with a discharge [Hb] of 100 g/L, or even 90 g/L, “had received excessive transfusion” [115, 116]. The review of the risks of anemia and transfusion by Shander and colleagues [67, 78, 100, 117] led them to conclude that in specific circumstances transfusion avoidance is medically and morally justified. The benefits also include blood resource conservation, but that reason alone does not morally justify avoiding transfusion. The question of whether a liberal transfusion threshold in elderly, non-cardiac surgical patients can avoid ischemic events will be answered more definitively when the results of the LIBERAL trial become available [118].

Summary and Conclusions

The discovery of the major blood group antigens made it possible to choose donor blood for individual recipients in a way that minimized the likelihood of severe transfusion reactions. In the early decades of the twentieth century, blood transfusion, like most other medical interventions, was used in the expectation of benefit to the recipient, since anemia and bleeding were clearly viewed as threats to survival. As transfusion reactions became better understood, the consideration of expected benefit became tempered by consideration of the attendant risks.

The six milestones in the discussion above were the signal achievements that made the transfusion of erythrocytes safer and more effective. These milestones stand out in the development of transfusion safety, as the three categories of transfusion risk – reactions, transmitted pathogens, and transfusion-attributable adverse outcomes – were identified. Managing these risks became part of the evidence-guided use of erythrocyte transfusions. Maintaining the safety of blood resources after the HIV/AIDS epidemic became a national priority in many countries, justifying large resource allocations. Recognition of transfusion-associated risks mandated that the prescribing of transfusions be subject to consideration of risks and benefits. Randomized controlled clinical trials defined objective, [Hb]-based “quantitative” thresholds, which have been evaluated and found to offer relative safety from excess adverse outcomes at [Hb] below “normal” levels. This facilitates safe, complete or maximal possible avoidance of transfusion exposure. The most recent development, the new paradigm of the Patient Blood Management (PBM) program, incentivizes optimal patient outcomes and the management of transfusions in populations previously at high risk of surgical or critical care transfusion because of chronic anemia. This large, vulnerable population can be safely managed using the three pillars of PBM, namely hematopoietic management, minimizing blood loss and optimizing physiological adaptations to the presence of anemia. The past decade has seen increasing acceptance of the practices embodied in PBM, driven by repeated demonstrations of its effectiveness in minimizing adverse outcomes, reducing transfusion exposure, saving scarce blood resources, and controlling costs. Sound management of all medical risks, including legal ones, suggests that these principles be accepted as the current state of the art.

Key Points

  • The transfusion of erythrocytes remains the standard of care for the correction of critically severe anemia and the prevention of hypoxic end-organ injury.

  • Since its introduction, blood transfusion has not been subjected to critical evaluation of its efficacy, while its safety has come under increasing scrutiny during the past century.

  • Three categories of safety risks have been identified: transfusion reactions directly attributable to a transfusion, risks attributable to the transmission of infectious pathogens present in the transfused blood, and the more frequently occurring adverse outcomes indirectly attributable to transfusion exposure.

  • The decision whether a transfusion of banked erythrocytes is clinically advisable has become a matter of risk management, whereby the benefit of avoiding the predictable risks of exposure is balanced against the risk of hypoxic end-organ injury.

  • Erythrocyte transfusion remains the key treatment for continuing uncontrolled blood loss, erythropoietic failure of various causes, sickle cell disease and hemoglobinopathies. Surgical and critical care transfusions are guided by the principal tenets of Patient Blood Management, under which avoiding transfusion exposure is ethically justifiable when the risks and consequences of avoidance are low, while physiological means of improving oxygen supply are maintained.