The Birth of Transfusion

William Harvey first described the circulation of blood in 1628 as “the beat of the heart and arteries, the transfer of blood from veins to arteries, and its distribution to the body.” Through experimentation with animal vein ligation, he noted “two kinds of death, failure from a lack [of blood flow] and suffocation from excess” [1]. This led to scattered reports of experiments involving blood transfusion in subsequent years, including those of Richard Lower, who performed experiments on dogs with exsanguinating hemorrhage at Oxford, England, in 1666 [2]. However, it would be nearly two centuries before the understanding of circulation resulted in the first recorded human blood transfusion and, with it, the potential to save people from exsanguinating hemorrhage.

In 1817, during a time when bloodletting to cure disease was rampant, John Henry Leacock questioned, “what is there repugnant to the idea of trying to cure diseases arising from an opposite cause [as exsanguinating hemorrhage] by an opposite remedy, to wit by transfusion?” [3, 4]. A year later, the hope of survival in severe hemorrhage was introduced by the British obstetrician James Blundell. After experimenting with blood transfusion in dogs, he performed a series of human blood transfusions for women who bled during childbirth, with a 50% success rate [5, 6]. Over the next half century, reports of transfusion were scarce, with only a handful appearing in the literature. These included transfusions by Samuel Choppin at Charity Hospital in 1854, Daniel Brainard in Chicago in 1860, and US Civil War doctors who transfused at least four Union soldiers [3]. Blood transfusion as the treatment for exsanguinating hemorrhage remained rare due to limited donors, the inability to store blood, and the lack of an effective transfusion method. As a result, saline intravenous fluids became the standard of care in severe hemorrhage [7].

There was a resurgence of interest in blood transfusions in the 1870s and 1880s with the development of multiple transfusion instruments; however, it was not until the discovery of blood groups by Karl Landsteiner and the introduction of pretransfusion compatibility testing by Reuben Ottenberg early in the twentieth century that reduced transfusion reactions and decreased morbidity and mortality became possible [3, 6, 8, 9]. In 1915, Richard Lewisohn discovered that sodium citrate with dextrose was a safe preservative, thus enabling blood storage [2, 6].

The Era of the World Wars

The outbreak of World War I (WWI) in 1914 produced vast numbers of severely injured combatants, and the shortcomings of crystalloid resuscitation for hemorrhagic shock became evident. Crystalloid was shown to produce only a transient response and was ultimately found to be detrimental. Instead, blood transfusions were discovered to be the “most effective means of dealing with cases of continued low blood pressure, whether due to hemorrhage or shock” [10]. The British Expeditionary Force, as well as some American Expeditionary Force hospitals, used whole blood transfusions for resuscitation, and it is estimated that several tens of thousands of transfusions were performed in 1918 alone, saving many lives [6, 8]. At the end of WWI, the Royal Army Medical Corps deemed advances in blood transfusion the “most important medical advance of the war” [11].

Following the Great War, the first civilian blood donor service was established in London by the British Red Cross in 1921, enlisting volunteers who would promptly donate fresh blood when contacted [2]. Subsequently, Bernard Fantus at Cook County Hospital in Chicago established the first blood bank in 1937, with whole blood collected in bottles and stored in a refrigerator for up to 10 days [12]. The use of plasma began in 1936, when John Elliott devised a method to separate plasma from red blood cells (RBCs), hailing plasma as a blood substitute for hemorrhagic shock [13, 14]. In WWII, whole blood as well as plasma was used as the primary treatment of hemorrhagic shock [2]. In 1945, Henry Beecher published his recommendations for the continued treatment of traumatic hemorrhage based on experience in WWII. He declared whole blood the superior treatment and advocated administering plasma to “temporarily sustain blood pressure at a level compatible with life for a limited time” in order to allow “more time to get whole blood into the patient.” He also cited a resuscitation goal of warm skin and “good color,” which often occurred at a systolic blood pressure around 85 mmHg, noting increased blood loss with resuscitation to higher blood pressures [15].

Following WWII, the military-built national blood program in the United States collapsed. As the use of crystalloid resuscitation in traumatic shock increased, Beecher published an editorial to reiterate the lessons learned from the World Wars, stating, “for the sake of both civil and military medicine it will be well to review recent practices and their foundations. It will be tragic if the medical historians can look back on the WWII period and write of it as a time when so much was learned and so little remembered” [8, 16].

Component Therapy and an Abundance of Crystalloid

In 1951, Edwin Joseph Cohn introduced a “blood cell separator” that employed centrifugal force to separate whole blood into layers of RBCs, plasma, and platelets. Promoting his work and coining the term component therapy, Cohn popularized blood fractionation and the transfusion of blood product components in lieu of whole blood [2]. Component therapy was hailed as economical because it allowed maximal use of a limited resource, enabling multiple patients to be treated with each unit of whole blood. In addition, the transmission of hepatitis through plasma was becoming increasingly evident [14, 17]. Component transfusion would soon largely replace whole blood transfusion, despite a lack of clinical outcome data supporting its use.

In the late 1950s and early 1960s, studies performed in elective surgery patients noted extracellular fluid loss and edema at sites of tissue injury in addition to acute blood loss [18, 19]. To further evaluate these fluid shifts in traumatic hemorrhagic shock, Tom Shires et al. performed a study in dogs, concluding that whole blood should remain the replacement for lost blood in hemorrhagic shock. However, the investigators also stated that a “judicious amount” of adjunctive lactated Ringer’s solution (LR) could be of value to replace additional fluid losses [20]. This spawned a resurgence of crystalloid use in surgery and trauma patients in an attempt to replace these losses.

In response to the increased use of crystalloid in hemorrhagic shock, Francis Moore and Shires wrote an editorial calling for a return to “moderation” in the use of crystalloid, stating, “no conceivable interpretation of these data would justify the use of such excessive balanced salt solution for early replacement in hemorrhage. Neither is the use of saline solutions meant to be a substitute for whole blood. Whole blood is still the primary therapy for blood loss shock” [21]. In spite of these preeminent surgeons’ continued emphasis that acute blood loss should be replaced with whole blood, and despite a continued lack of clinical data supporting the use of fractionated blood, in 1970 the American Medical Association recommended that whole blood no longer be routinely used, stating that patients in hemorrhagic shock should be transfused with RBCs supplemented by balanced salt solutions or plasma expanders; this became the standard of care [22]. Practice thus shifted to primarily RBC transfusion and an increased use of crystalloid for hemorrhagic shock. Large-volume crystalloid resuscitation became increasingly common during the Vietnam War and subsequently spread to civilian trauma centers, despite a paucity of studies validating its safety and efficacy in treating exsanguinating hemorrhage [23, 24].

In 1976, C. James Carrico and Shires outlined a therapeutic plan for traumatic hemorrhage, including an infusion of 1–2 l of LR until whole blood was available for transfusion [25]. Like Moore and Shires’ 1967 editorial, their recommendations were interpreted as an endorsement of crystalloid resuscitation in acute blood loss and further popularized large-volume crystalloid resuscitation in the treatment of hemorrhagic shock [23]. Defending the resuscitation strategy of RBCs and large volumes of crystalloid, Shackford et al. performed a study comparing RBC and crystalloid resuscitation to whole blood and crystalloid, finding that RBCs and LR could be used (with an average of 6.5 units of RBCs) without producing coagulopathy [26]. However, others remained hesitant to adopt this strategy, noting that those with severe trauma were often thrombocytopenic and coagulopathic. They expressed concern that RBC and crystalloid resuscitation would produce a further dilutional coagulopathy and continued to propose that those in hemorrhagic shock be given whole blood [27].

In the late 1970s and 1980s, studies touting the benefit of “supratherapeutic resuscitation” appeared in the surgical literature. These papers proposed maintaining normal values of systolic blood pressure (SBP), urine output (UOP), base deficit, hemoglobin, and cardiac index in an attempt to optimize tissue perfusion [28,29,30,31,32,33]. This exacerbated the trend toward large-volume crystalloid administration and moved practice further from the concepts expressed by Cannon and Beecher based on their experiences during the World Wars [15, 34]. However, when supratherapeutic resuscitation was studied decades later in a prospective, randomized setting, it was shown to be of no benefit and resulted in increased isotonic fluid infusion, diminished intestinal perfusion, and a greater frequency of abdominal compartment syndrome, multiple organ failure, and mortality [35,36,37].

As studies were recognizing the adverse effects of large-volume crystalloid infusion, interest in traumatic coagulopathy resurged. In 1982, the term “bloody vicious cycle,” now known as the “lethal triad,” was coined, describing a frequent downward spiral in severe trauma involving coagulopathy, acidosis, and hypothermia (Fig. 27.1) [38]. While a large percentage of patients who arrived in the emergency department after severe trauma were already coagulopathic, portending a poor outcome [39,40,41,42], large-volume crystalloid exacerbated all aspects of this deadly triad [38]. Crystalloid volumes greater than 1.5 l result in increased mortality [43]. Hypothermia, exacerbated by the administration of unwarmed fluids, also reduces platelet function, coagulation factor activity, and fibrinogen synthesis [44,45,46]. Crystalloids with a high chloride content cause acidosis, and as a result, significantly more blood products are required to achieve hemodynamically normal parameters. Acidosis also impairs platelet aggregation and significantly reduces coagulation factor activity [46, 47]. As surgeons recognized the lethality of the bloody vicious cycle, ways to prevent and treat it were investigated and adopted.

Fig. 27.1 The bloody vicious cycle. (Adapted from Kashuk et al. [38])

Damage Control Resuscitation

The next watershed event in the treatment of severe trauma was the concept of damage control resuscitation (DCR). Returning to the principles of hypotensive and hemostatic resuscitation advocated and practiced in the eras of WWI and WWII, DCR focuses on rapidly controlling hemorrhage, limiting crystalloid use, and infusing blood products to a safe, but lower than normal, blood pressure until operative control of bleeding is established. The protective effect of permissive hypotension prior to hemorrhage control was subsequently confirmed in randomized controlled trials, which revealed that control of bleeding prior to resuscitation to a normal blood pressure resulted in improved outcomes in penetrating and blunt trauma patients [48, 49]. As institutions adopted these strategies, studies investigating DCR protocols revealed decreased crystalloid and RBC use, a lower incidence of the “lethal triad,” and improved 24-hour and 30-day survival [50,51,52].

The Change in Component Ratios: Reinitiation of Plasma and Platelet Resuscitation

In addition to a return to hypotensive resuscitation and prompt surgical control of bleeding, transfusion protocols shifted to include early administration of platelets and plasma alongside RBCs. This shift was sparked by computer-based simulation and animal studies revealing that higher ratios of plasma and platelets to RBCs were necessary to prevent and correct coagulopathy in traumatic hemorrhage [53,54,55,56,57]. Moreover, additional beneficial effects of plasma in hemorrhagic shock were being discovered, such as the promotion of vascular stability and restoration of the endothelial glycocalyx, which decrease extracellular fluid loss in shock [58,59,60].

In 2004, US combat hospitals shifted to an initial massive transfusion ratio of 1:1:1 for plasma, platelets, and RBCs. This spawned a retrospective review at a US Army combat hospital in Iraq, which revealed a mortality reduction as the plasma to RBC ratio increased: mortality fell by more than 50% in those who received a 1:1.4 ratio of plasma to RBCs compared with those who received 1:8. This mortality benefit was primarily due to decreased early death from hemorrhage [61]. Subsequent retrospective and cohort studies in both military and civilian trauma confirmed improved outcomes in those who received an increased ratio of plasma (Table 27.1) [40, 62,63,64,65,66,67,68,69,70] and platelet transfusions (Table 27.2) [23, 40, 62, 64, 71,72,73]. In addition, these studies demonstrated the importance of prompt, effective resuscitation, revealing that most deaths from uncontrolled hemorrhage occur within 6 hours of injury [52, 61, 64, 74,75,76].

Table 27.1 Plasma to red blood cell ratios
Table 27.2 Platelet to red blood cell ratios

While it became established that increased plasma and platelet transfusion was associated with improved survival, the exact ratio driving this mortality effect remained unknown. Varying ratios of blood products were used in these studies, and they did not agree on whether increased administration of plasma and platelets resulted in an increased incidence of complications, such as transfusion-related acute lung injury (TRALI), acute respiratory distress syndrome (ARDS), and multiple organ dysfunction syndrome (MODS). A further limitation of these retrospective studies was highlighted by Snyder et al., who focused on the time it took to obtain and administer the various blood components. Plasma takes time to prepare, and platelets must be constantly agitated during storage; therefore, these products are available only to those who survive long enough to receive them from the blood bank. Noting this survival bias inherent in retrospective analysis, their work indicated that the likelihood of receiving a high ratio of plasma to RBCs increased with survival time [68].

As a result, the Prospective, Observational, Multicenter, Major Trauma Transfusion (PROMMTT) study was designed to further evaluate the resuscitation ratios used at high-volume trauma centers and to monitor outcomes. As in the previous retrospective studies, hemorrhagic deaths occurred quickly, with 60% occurring within the first 3 hours of admission. In spite of clinicians attempting to transfuse a constant ratio, neither plasma nor platelet ratios were consistent across the first 24 hours, and the timing of transfusions varied. Thirty minutes after admission, 67% of patients had not received plasma and 99% had not received platelets. Notably, 3 hours after admission, beyond the window in which most hemorrhagic deaths occur, 10% had not received plasma and 28% had not received platelets. The study nonetheless confirmed a survival advantage at 24 hours for higher plasma and platelet ratios early in resuscitation [77].

Subsequently, the Pragmatic, Randomized Optimal Platelet and Plasma Ratios (PROPPR) trial was designed to further evaluate the outcomes of frequently used resuscitation ratios. While it did not show an improvement in overall 24-hour survival for a balanced ratio of 1:1:1 (plasma to platelets to packed RBCs) compared with 1:1:2, it revealed decreased mortality at 3 hours (the timeframe in which most deaths from hemorrhage occur), as well as increased hemostasis and decreased death due to hemorrhage at 24 hours (9.2% with 1:1:1 versus 14.6% with 1:1:2). Allaying some concern about increased transfusion complications with greater plasma and platelet administration, the study found no difference in the development of ARDS, MODS, venous thromboembolism, or sepsis between the ratios [78]. Further reducing apprehension, recent data indicate that the incidence of TRALI has decreased with current blood banking practices; while it still occurs, the rate remains low despite increased use of plasma and platelets in traumatic hemorrhagic shock (1 case of TRALI per 20,000 units of plasma) [79]. Providing further evidence of the value of a return to early platelet transfusion in hemorrhage, a subsequent subanalysis of the PROPPR trial revealed that those who received platelets were more likely to achieve hemostasis (94.9% vs. 73.4%) and had significantly decreased 24-hour (5.8% vs. 16.9%) and 30-day mortality (9.5% vs. 20.2%) [80]. As a result of these studies, balanced ratios approximating whole blood were largely adopted as the standard of care, and by 2015, more than 80% of American College of Surgeons Trauma Quality Improvement Program (ACS TQIP) trauma centers targeted a balanced ratio in their massive transfusion protocols [81,82,83].

Massive Transfusion Protocol Initiation and Practice

Over the years, a variety of predictors have been proposed to identify those in need of massive transfusion. Blood transfusion carries risks, including transfusion reactions [84], pulmonary and renal dysfunction [85], and multiorgan failure [86]. Additionally, studies have revealed that transfusion does not benefit those who are hemodynamically stable, even in the critically ill and trauma population [87, 88]. In WWII, Beecher used vital signs and “cool skin” as indicators of the need for transfusion [15], and this method persisted as studies repeatedly confirmed that physical exam and radial pulse could reliably predict the need for intervention [89]. Since that time, a variety of massive transfusion protocol (MTP) prediction scores have been developed, including the Field Triage Score [90], the Shock Index [91], and the Assessment of Blood Consumption (ABC) score [92]. While these and other proposed algorithms remain important tools [93,94,95,96], there is no universally accepted method for the initiation of massive transfusion, and most MTP activations are based on the provider’s experience and an “overall clinical gestalt” to facilitate quick decisions and prompt care [97] (see Chap. 15).
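To make two of these scores concrete, the minimal sketch below computes the Shock Index and the ABC score, using the thresholds commonly cited from the original publications [91, 92]. The patient values are hypothetical, and the snippet is illustrative only, not a validated clinical tool.

```python
# Minimal sketch of two bedside massive transfusion predictors.
# Thresholds follow the commonly published definitions of the
# Shock Index [91] and ABC score [92]; illustrative only.

def shock_index(heart_rate: float, systolic_bp: float) -> float:
    """Shock Index = HR / SBP; values above roughly 0.9 are commonly
    cited as predictive of the need for massive transfusion."""
    return heart_rate / systolic_bp

def abc_score(penetrating: bool, sbp: float, hr: float,
              fast_positive: bool) -> int:
    """Assessment of Blood Consumption score: one point each for a
    penetrating mechanism, ED SBP <= 90 mmHg, ED HR >= 120 bpm, and a
    positive FAST exam; a score >= 2 is the usual activation trigger."""
    return sum([penetrating, sbp <= 90, hr >= 120, fast_positive])

# Example with hypothetical values for a penetrating-trauma patient:
si = shock_index(heart_rate=125, systolic_bp=88)                        # ~1.42
abc = abc_score(penetrating=True, sbp=88, hr=125, fast_positive=False)  # 3
print(f"Shock Index: {si:.2f}, ABC score: {abc} (>=2 suggests MTP activation)")
```

As the surrounding text notes, such scores supplement rather than replace clinical gestalt; in practice they serve as quick, reproducible triggers for protocol activation.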

In order to facilitate prompt initiation of blood product resuscitation and decrease preventable death from hemorrhagic shock, massive transfusion protocols and DCR techniques have been extended to the prehospital arena. When employed, prehospital plasma and RBC transfusion in lieu of early crystalloid resuscitation has been shown to prevent coagulopathy and improve early outcomes through earlier initiation of lifesaving transfusion [98,99,100]. Sparking this important change in the prehospital care of severely injured trauma patients, Sperry and associates performed the Prehospital Air Medical Plasma (PAMPer) trial in 2018. While most major trauma centers had transitioned to DCR techniques and early balanced resuscitation, prehospital care lagged behind, with continued crystalloid “standard-care” resuscitation (see Chap. 30). This landmark trial examined the efficacy and safety of prehospital thawed plasma in those at risk for hemorrhagic shock (defined as at least one episode of SBP <90 mmHg with HR >108 bpm, or SBP <70 mmHg), revealing a lower 30-day mortality (23.2% vs. 33%) and increased hemostasis when plasma was given prior to medical center arrival [99]. A secondary analysis published in 2019 revealed an even larger mortality benefit in those given both RBCs and plasma, setting the stage for a continued move toward prehospital balanced resuscitation and whole blood transfusion [100].

A Return to Whole Blood

As the benefits of transfusing balanced component ratios became evident in recent years, there has been a push to return to whole blood transfusion for severe traumatic hemorrhage. As Beecher reported in 1945, “men who have lost whole blood need whole blood replacement” [15]. Since 2002, the US military has transfused thousands of units of fresh whole blood (FWB), and published data suggest that FWB is superior to component therapy, even in a balanced ratio [23, 101,102,103]. Fresh whole blood resuscitation has been associated with improved survival compared to component therapy and is known to be a more concentrated, functional product [104]. When component products are recombined in a 1:1:1 ratio of plasma, platelets, and packed RBCs, the resulting fluid has a hematocrit of approximately 29%, a viable platelet concentration of approximately 8.8 × 10¹⁰ per liter (around half the platelet count of whole blood), and plasma coagulation activity of 65% (Fig. 27.2) [107]. “Thus, during massive transfusion with only blood components given in the optimal 1:1:1 ratio, the blood concentrations for each of the components lead to an anemic, thrombocytopenic and coagulopathic state near the transfusion triggers for each of the components, and administration of any one component in excess only results in dilution of the other two” [4].

Fig. 27.2 Component ratios. (Adapted from D’Angelo and Dutton [105] and Hess and Holcomb [106])
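The dilution arithmetic behind these numbers can be sketched directly. Assuming rounded, representative unit volumes (an RBC unit of roughly 335 mL at a hematocrit of about 55%, a plasma unit of roughly 275 mL, and a platelet unit of roughly 50 mL; these are illustrative values that vary by blood bank and are not drawn from the chapter), the hematocrit of the recombined product works out to approximately the quoted 29%:

```latex
% Illustrative 1:1:1 recombination arithmetic (assumed, rounded unit
% volumes; actual values vary by blood bank).
\[
\mathrm{Hct}_{1:1:1} \approx
\frac{V_{\mathrm{RBC}} \cdot \mathrm{Hct}_{\mathrm{RBC}}}
     {V_{\mathrm{RBC}} + V_{\mathrm{plasma}} + V_{\mathrm{plt}}}
= \frac{335\ \mathrm{mL} \times 0.55}{(335 + 275 + 50)\ \mathrm{mL}}
\approx 0.28
\]
```

The same reasoning applies to platelets and coagulation factors: the active component of each unit is diluted by the full volume of the other two, which is why the recombined product sits near the transfusion trigger for every component at once.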

In addition, current techniques for the separation and storage of blood components have been shown to be detrimental and do not prevent a decline in functionality over time. Plasma coagulation factor activity declines significantly during storage. Thawed plasma can be stored in liquid form for a maximum of only 5 days, because the activity of the “labile factors” V and VIII becomes significantly impaired. Liquid plasma exhibits only approximately 35% of factor V activity and 10% of factor VIII activity at a storage time of 26 days [108, 109]. Storage temperatures and added preservatives exacerbate hypothermia and acidosis. The longer blood is banked (especially beyond 21 days), the worse the resultant acidosis, with a typical 21-day-old unit of RBCs having a pH of 6.3 [110]. In addition, a recent study has indicated that current component separation and blood banking practices release cellular debris into plasma in quantities sufficient to elicit a consequential pro-inflammatory response [111].

Whole blood resuscitation, as used in the World Wars, was historically the primary treatment for hemorrhagic shock. The development of blood separation techniques and component therapy was intended to improve blood availability and patient outcomes. However, these advances, in conjunction with a misguided theory that extracellular fluid losses needed replacement, resulted in the overuse of crystalloid and RBC resuscitation, to the detriment of many patients for decades. In recent years, there has been a return to the proven strategies of the past, employing permissive hypotension, prompt surgical control of bleeding, and blood product resuscitation with component ratios approximating the composition of whole blood. The recent conflicts in Iraq and Afghanistan reignited interest in whole blood transfusion, and in 2014, the US Tactical Combat Casualty Care Committee recommended that whole blood return as the treatment for hemorrhagic shock [102, 112]. Military medicine has subsequently been transitioning away from blood components and back to warm fresh whole blood (WFWB) and cold-stored low-titer group O whole blood (LTOWB) [113,114,115].

In the United States, blood banking remains a fractionated, for-profit system that largely produces blood components, and for 40 years whole blood was generally unavailable to the civilian population. Most institutions have a massive transfusion protocol in place that employs a 1:1:1 or 1:1:2 ratio of component transfusion. However, the availability of LTOWB is increasing, and it is now being used at more than 20 institutions and in prehospital settings across the country [102, 116, 117]. In 2011, the Trauma Hemostasis and Oxygenation Research (THOR) Network was established as an international community of experts in traumatic hemorrhage, promoting continued investigation and improvement in the treatment of hemorrhagic shock. In recent years, there have been multiple advances in the field, including improved storage methods such as cold-stored platelets, the reinstitution of liquid and dried plasma, and protocol development for the implementation of whole blood transfusion in both prehospital and hospital environments [118]. Does the future lie in the past [119, 120]? Future studies are needed to determine the efficacy and safety of this pendulum swing, but while many questions remain unanswered, inquiry and innovation continue, along with the hope of improved survival in traumatic hemorrhage.