Hospital-Acquired Infections

Hospital-acquired infections (HAIs) have been plaguing us since the inception of the hospital in the twelfth century.

During this medieval period, hospitals were among the most hazardous of places, with death rates as high as 70% [1]. When a sick person entered the hospital, his or her property was disposed of, and in some regions a requiem mass was held, as if the person had already died [2]. One must remember that this was a time when ground rabbit fur and mummy powder (the ground remains of mummies) were among the most popular wound dressings, and attempts at antisepsis were crude. Medicines of the time consisted of ingredients such as snake flesh, laurel berries, sheep dung, lye, cow kidney, antimony, alum, and earthworms, mixed with various herbs and taken orally or by enema [3]. Improvements to hospital infection prevention came only slowly over the next several centuries.

By 1800 hospital mortality was still considerable, with rates commonly above 25%. According to a report on an American military hospital of the time, thousands of young men who had been admitted with slight injuries or venereal diseases died from serious hospital-acquired infections during their stay. It was said that “a soldier entering a great battle was in less danger than one entering the hospital” [2, 4]. It was not until the Progressive Era of the late 1800s and early 1900s that major breakthroughs were made in our understanding of the invisible realm of microbes. During the late 1800s, the new field of bacteriology was introduced to the world, bolstered by the work of Pasteur, Koch, and Lister, and our success in combating these invisible enemies grew rapidly as a result.

Interestingly, even today, we are still fighting that age-old battle against nosocomial infections. Despite advances in medical knowledge, in pharmaceuticals, and in hospital infection prevention as a whole, we still lose patients every year to infections acquired while they are being treated for unrelated and very curable conditions, even though these infections are largely preventable. According to the CDC, around 5–10% of hospitalized patients in the USA are affected by HAIs. This equates to approximately 1.7 million HAIs in US hospitals every year, resulting in 99,000 deaths and an estimated $20 billion in healthcare costs [5, 6]. Of these infections, 32% are urinary tract infections (resulting from the use of urinary catheters), 22% are surgical site infections, 15% are hospital-acquired pneumonia, and 14% are bloodstream infections, primarily associated with the insertion of a vascular access device [7, 8].
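To put these cited figures into perspective, the short calculation below is an illustrative sketch only, using the numbers quoted above; the category percentages are approximate, and the remaining share corresponds to other infection types.

```python
# Rough, illustrative arithmetic using only the figures cited above; not new data.
total_hais_per_year = 1_700_000   # approximate HAIs in US hospitals each year
deaths_per_year = 99_000          # approximate HAI-associated deaths each year
annual_cost_usd = 20e9            # estimated annual healthcare cost of HAIs

category_share = {
    "urinary tract infections": 0.32,
    "surgical site infections": 0.22,
    "hospital-acquired pneumonia": 0.15,
    "bloodstream infections": 0.14,
}

for name, share in category_share.items():
    print(f"{name}: ~{share * total_hais_per_year:,.0f} cases/year")

other_share = 1 - sum(category_share.values())  # the remaining ~17%
print(f"other infection types: ~{other_share * total_hais_per_year:,.0f} cases/year")
print(f"implied average cost per HAI: ~${annual_cost_usd / total_hais_per_year:,.0f}")
print(f"implied deaths per 100 HAIs: ~{deaths_per_year / total_hais_per_year * 100:.1f}")
```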

Why are we still consistently losing the battle against microbes? Why are we sending our loved ones to the hospital to be treated for curable diseases, only to lose them to a hospital-acquired pneumonia or bloodstream infection? For almost a century and a half we have been working under the assumption that we know and understand the invisible enemy we are fighting. But perhaps the error lies in our fundamental understanding of the microbes themselves. Perhaps we need to accept that it is time to progress into the next chapter of understanding the microbial world. We need the medical community, the Food and Drug Administration (FDA), and medical device and pharmaceutical companies to catch up with academia and advance into the paradigm of biofilms.

Biofilms

Biofilms were first described at length by Costerton et al. in 1978. Since then, biofilms have become increasingly recognized and studied as a compelling field impacting medical and industrial settings alike. Microbial biofilms are communities of microorganisms that produce extracellular polymers, which they use to adhere to a surface. This extracellular polymer, known as extracellular polysaccharide (EPS), functions as a scaffold or matrix that provides structure and protection within the biofilm.

Unfortunately, in the medical community today, this is a commonly misunderstood and ignored subject. The word “biofilm” exists merely as a buzzword: though commonly used, its meaning is frequently misunderstood. Perhaps this is a branding problem. The word biofilm conjures an image of a slimy byproduct of microorganisms, and this is precisely the root of the confusion. In an article recently published in the online publication Science Daily, an author attempts to introduce the subject of biofilms by saying:

Have you ever heard of biofilms? They are slimy, glue-like membranes that are produced by microbes, like bacteria and fungi, in order to colonize surfaces. They can grow on animal and plant tissues, and even inside the human body on medical devices such as catheters, heart valves, or artificial hips. Biofilms protect microbes from the body’s immune system and increase their resistance to antibiotics. They represent one of the biggest threats to patients in hospital settings.

This is, of course, a true statement, but it gives the reader the impression that the word “biofilm” refers only to the slime produced by the bacteria and not to the bacteria themselves. The slime is only part of the story. Microorganisms such as bacteria exist in two main phenotypes: planktonic cells, which are free-floating, and biofilms, which are aggregations of cells that have adhered to a surface. Biofilms are not unique to bacteria; fungal organisms such as yeast (Candida sp.) are well known for forming biofilms. A biofilm in which a single species dominates is known as a monomicrobial biofilm. In nature, however, biofilms rarely exist in this manner; rather, they exist as polymicrobial biofilms, in which more than one species of microorganism is distributed throughout the community, at times creating symbiotic relationships.

Biofilm development is a complex process that can be condensed into five major steps (captured schematically in the brief sketch after this list).

  • Stage 1—surface adherence: within minutes microorganisms can begin to colonize a surface.

  • Stage 2—aggregation: microcolonies form and begin to excrete EPS components, i.e., slime.

  • Stage 3—biofilm is formed: the community begins to mature into multilayered clusters.

  • Stage 4—three-dimensional growth: maturation advances to include physical pathways (water channels) that shuttle nutrients and waste products, and the biofilm begins to be protected from host defense mechanisms and antibiotics.

  • Stage 5—critical mass is reached: planktonic cells can escape the community and colonize other surfaces.
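Purely as an organizational aid, and not as a biological model, these five stages can be captured in a simple data structure. The stage names and descriptions below are paraphrased from the list above; the code itself is only an illustrative sketch.

```python
from enum import Enum
from typing import Optional


class BiofilmStage(Enum):
    """The five major stages of biofilm development summarized above."""
    SURFACE_ADHERENCE = 1         # microorganisms begin to colonize a surface within minutes
    AGGREGATION = 2               # microcolonies form and begin to excrete EPS ("slime")
    BIOFILM_FORMED = 3            # the community matures into multilayered clusters
    THREE_DIMENSIONAL_GROWTH = 4  # water channels shuttle nutrients and waste; protection develops
    CRITICAL_MASS = 5             # planktonic cells escape and colonize other surfaces


def next_stage(stage: BiofilmStage) -> Optional[BiofilmStage]:
    """Return the next stage in the progression, or None once critical mass is reached."""
    if stage.value < BiofilmStage.CRITICAL_MASS.value:
        return BiofilmStage(stage.value + 1)
    return None


# Walk through the progression in order.
stage: Optional[BiofilmStage] = BiofilmStage.SURFACE_ADHERENCE
while stage is not None:
    print(stage.name)
    stage = next_stage(stage)
```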

An oxygen gradient can also exist within a biofilm. Organisms such as Staphylococcus aureus and Pseudomonas aeruginosa are well-known biofilm formers and are facultative anaerobes. The ability to shift their physiology from an aerobic state, where oxygen is available, to an anaerobic state, where oxygen is scarce, is advantageous. It is in these regions of limited oxygen that you will find cells with a much lower metabolic rate that live in a state of dormancy. These cells have been referred to as persisters [9]. This is problematic for antibiotics whose mechanism of action is at the ribosome, for example: because persister cells are not metabolically active, the drug has difficulty gaining access to the bacterial cell through its usual metabolic channels and is rendered ineffective. The phenomenon of persister cells helps explain the tolerance that biofilms demonstrate to multiple classes of antibiotics that must get inside the cell to carry out their mechanism of action. Biofilms have been found to be 1000 times more tolerant of antibiotics than their planktonic counterparts, a difference to which persisters may contribute [10]. Cell-to-cell communication, referred to as quorum sensing, also contributes to the pathogenicity of biofilms. Quorum sensing involves signals known as autoinducers that respond to cell density and other environmental stresses and can contribute to the expression of virulence factors.

Biofilms and Bloodstream Catheters

Of all the medical procedures performed in the hospital setting, few are more common than the insertion of a vascular access device. Reliable venous access is established for nearly every patient in the hospital regardless of their healthcare needs, and a device is often placed upon entry to the emergency department whether it is needed or not. Consequently, the insertion of an intravenous (IV) catheter is often the first procedure performed upon entering the hospital, and removal of an IV catheter is commonly the last.

The peripheral IV is by far the most common vascular access device utilized in hospitals around the world and is considered indispensable in modern-day medical practice [8]. These devices consist of a small flexible tube that is inserted into a peripheral vein for intravenous therapies, such as the administration of medications and fluids, and are also often used for blood draws. Today, up to 90% of patients admitted to the hospital receive a peripheral IV, and over 1 billion peripheral IVs are placed globally each year [11].

For emergent and critically ill patients, obtaining vascular access is a critical and time-sensitive management step [12]. Prompt vascular access for these patients allows for rapid laboratory testing and the administration of life-saving therapies. Often the vascular access needs of these patients exceed the capability of a small peripheral IV catheter. For this reason, a variety of vascular access devices exist, varying in size, utility, and invasiveness to the patient. All of them, however, pose a risk of infection because they are percutaneous devices. As such, a risk of contamination exists from the time they are inserted to the moment they are removed.

Although catheter-related bloodstream infection (CRBSI) is the subject of extensive surveillance and research, most of these efforts have been limited to the study of central venous CRBSI, while CRBSI related to peripheral IVs (PIVs) has received much less focus. Catheter-related infection is a problem that deserves attention no matter the placement or location. The reported rate of infection related to PIVs is lower than that of central venous catheters (CVCs); however, with more than 200 million PIVs placed in the USA each year, the number of infections related to PIVs is actually greater than that associated with central lines [8].

CVC insertion has become an indispensable procedure in a variety of situations throughout the hospital and in home health settings. Over the past half century, the technical and technological achievements leading to the development of safe short-term, long-term, and chronic vascular access have saved or prolonged the lives of countless patients. The many applications for CVCs include fluid resuscitation, hemodynamic monitoring, parenteral nutritional support, dialysis, and the administration of chemotherapy and other caustic or harmful medications that cannot be administered peripherally.

Generally, central lines are of two main types. The first is the tunneled catheter, which is implanted surgically by creating a subcutaneous track prior to entering a central vein, such as the internal jugular, subclavian, or femoral vein, for long-term (weeks to months) access. These catheters are designed for chronic use, with indications that include therapies such as chemotherapy and hemodialysis. The second type is the “nontunneled” or acute central line. These catheters are inserted percutaneously, are the most common type of central line, and account for the majority of reported central-line-associated bloodstream infections (CLABSIs) [13].

Acute central line placement has long been regarded as a dangerous procedure by practitioners, catheter manufacturers, and the FDA [14]. More than three million CVCs are placed annually, and 3–25% of those procedures are reported to involve complications [15, 16]. Common complications include inadvertent arterial injury, air embolism, pneumothorax, and CLABSI.

A CLABSI is defined as a laboratory-confirmed bloodstream infection, not related to an infection at another site, that develops in a patient who has had a central line in place within the 48 hours preceding the onset of infection. Of all the healthcare-associated infections, CLABSIs are the most costly, accounting for approximately $46,000 per case [13]. CLABSIs lead to prolonged hospital stays and increased mortality rates. Nosocomial bloodstream infections are reported to be the eighth-leading cause of death in the USA [17]. It is estimated that more than 250,000 cases occur annually in the USA alone, with a fatality rate of approximately 23.8% [18, 19]. These infections are costly, deadly, and largely preventable. The US Department of Health and Human Services’ Action Plan to Prevent Healthcare-Associated Infections is focusing attention on the need to dramatically reduce them [20, 21].
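Taken at face value, the figures cited above imply a striking annual toll. The quick calculation below is only a rough sketch that combines the quoted estimates, which come from different sources, purely to convey a sense of scale.

```python
# Rough, illustrative arithmetic based solely on the estimates cited above.
cases_per_year = 250_000      # estimated nosocomial bloodstream infection cases in the USA each year
cost_per_case_usd = 46_000    # approximate cost attributed to each CLABSI
fatality_rate = 0.238         # approximate case-fatality rate

implied_annual_cost = cases_per_year * cost_per_case_usd   # ~$11.5 billion
implied_annual_deaths = cases_per_year * fatality_rate     # ~59,500 deaths

print(f"implied annual cost:   ~${implied_annual_cost / 1e9:.1f} billion")
print(f"implied annual deaths: ~{implied_annual_deaths:,.0f}")
```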

Starting in 2008, CLABSIs were classified as a “never event,” forcing hospitals to track and document all incidents. This increased awareness of the issue and made hospital infection rates public knowledge, increasing hospitals’ incentive to address the problem and eliminate CLABSIs. This has more recently been reinforced by the introduction of the Affordable Care Act, which has brought about even greater awareness and monitoring.

The prevention of CLABSI is an extremely challenging and complicated issue. In 2011 the CDC took this challenge head-on when it published the “Guidelines for the Prevention of Intravascular Catheter-Related Infections.” This document comprises 83 pages of evidence-based guidelines and instructions for the proper care and maintenance of these devices and serves as the standard for healthcare personnel who insert intravascular catheters, those who are responsible for using and maintaining them, and those who are responsible for the surveillance and control of infections in the hospital (infection preventionists) [22].

Although it is extremely difficult to track and confirm the source of a central line infection, there are some general perceptions about the most common causes. The incidence of catheter-related infection is directly influenced by the duration of catheter dwell time. Longer dwell times result in more manipulations at the catheter hub, which, in turn, can increase the risk of intraluminal contamination. If an infection develops within 48 hours of catheter placement, it is commonly perceived to be the result of contamination during the insertion procedure. Central line insertions are sterile procedures. Much as in a surgical procedure, during a central line insertion the patient is draped from head to foot in a sterile barrier. The insertion site is prepared with a surgical antiseptic, typically chlorhexidine gluconate (CHG). The most common insertion sites include the internal jugular vein, the subclavian vein, a deep vessel in the upper arm, and the femoral vein. The clinician dons a sterile gown, mask, cap, and sterile gloves. As with other sterile procedures, the risk of introducing contamination depends on the sterile technique of the clinician performing the procedure.

It is commonly understood that within 7–10 days of CVC placement, bacteria on the surface of the skin can migrate along the surface of the catheter from the insertion site towards the intravascular space. The absence of a subcutaneous tunnel places nontunneled devices at higher risk for CLABSIs. Research shows that CLABSIs occurring beyond 10 days are typically the result of contamination of the intraluminal portion of the catheter hub, commonly caused by a healthcare provider’s contaminated hands, often due to a breach of standard aseptic procedure while accessing the catheter. Less common mechanisms of contamination include hematogenous seeding of bacteria from another source or a contaminated infusate [23, 24].

Host factors that increase the risk of CLABSI include chronic illnesses (hemodialysis, malignancy, gastrointestinal tract disorders, pulmonary hypertension), immune-compromised states (bone marrow transplant, end-stage renal disease, diabetes mellitus), malnutrition, total parenteral nutrition (TPN), extremes of age, loss of skin integrity (burns), prolonged hospitalization before line insertion, catheter type, catheter location (the femoral site carries the highest risk, followed by the internal jugular, then the subclavian), conditions of insertion (emergent versus elective, maximal barrier precautions versus limited), catheter site care, and the skill of the catheter inserter. Pseudomonas is commonly seen in association with neutropenia, severe illness, or known prior colonization. Candida is associated with other risk factors, namely femoral catheterization, TPN, prolonged administration of broad-spectrum antibiotics, hematologic malignancy, and solid organ or hematopoietic stem cell transplantation. Certain organisms, such as staphylococci, Pseudomonas, and Candida, produce biofilms, which favor increased virulence, adherence to catheter surfaces, and tolerance of antimicrobial therapy [23].

Antimicrobial Strategies and Catheters

Antimicrobial-coated or antimicrobial-impregnated central catheters were first introduced to clinical practice circa 1990 and quickly grew in popularity and clinical use in the acute setting. The two most common catheter coatings consist of either chlorhexidine and silver sulfadiazine, or minocycline and rifampin. For approximately six decades, chlorhexidine has been used in clinical practice as a skin antiseptic and disinfectant for a number of sterile procedures [25]. These technologies have remained unchanged for almost 30 years. Imagine how much technology has changed since that time; antimicrobial catheters were introduced 17 years before the first iPhone. Currently, more than 75% of the acute central lines placed in the United States utilize these same antiquated coatings.

It is challenging to assess the efficacy of these technologies. Since their introduction, several studies have been published evaluating their ability to reduce the incidence of CLABSI, many touting extremely positive results. In an effort to assess these claims, McConnell and colleagues published a paper in 2003 critically analyzing 11 of these studies. They assessed study methodology, patient characteristics, and the presence of flaws and found that many of the studies contained inconsistent definitions of CRBSI, failed to account for confounding variables, contained suboptimal statistical analyses, and lacked clinically relevant endpoints [26]. In the end, the authors concluded that although the use of impregnated catheters may decrease catheter colonization, more reliable studies are needed to definitively determine whether these technologies can decrease the incidence of catheter-related infection [26]. But whether or not they decrease the incidence of infection, the problem of bloodstream infections is apparent. Antimicrobial catheter coatings have been in use for almost 30 years; why are we not doing a better job of preventing this avoidable problem?

But perhaps the question is not whether the idea of coating a catheter with antimicrobials is a valid one, but what assumptions about microbes were made in the creation and optimization of the coatings themselves. Biofilm researchers have long argued that many of the methods established to test antimicrobial efficacy rest on a number of incorrect and outdated assumptions about bacteria. From a medical device development standpoint, how can we do a better job of designing antimicrobial technologies that address the actual clinical scenario with a more complete understanding of how microbes function? As mentioned previously in this chapter, perhaps the fault lies with those who first coined the term biofilm. The term leads the reader to believe that it refers merely to something that microbial life creates. In reality, knowledge of biofilms is knowledge of how microbes actually behave and what they truly are.

Knowledge is of no value unless we use it for change. Many of us who work in the medical device field do so because we believe we can make a difference in the lives of patients by elevating the technologies used to treat those who need them most. Throughout history, advancements in knowledge have led to advancements in technology and practice, which in turn have led to vast improvements in clinical care. Is it possible that a simple conceptual hang-up is preventing us from entering a new era of medical advancement? Is it possible that by simply viewing the microbial world through the biofilm lens, we might finally overcome the hurdles that are holding us back? It is my hope that a more complete understanding of microbial biofilms will allow us to overcome these hurdles, inspire the creation of new and exciting medical technologies, and finally guide us into a world where hospital-acquired infections are a thing of the past.