Introduction

Clostridioides (formerly Clostridium) difficile [1] is a Gram-positive, spore-forming anaerobe involved in the development of pseudomembranous colitis, a condition generally referred to as C. difficile infection (CDI) [2]. It is the most common pathogen causing hospital-acquired infections (HAI) in the USA [3, 4]. The increase in the number and severity of CDI cases has been attributed to multiple factors, including the emergence of the hypervirulent BI/NAP1/027 strain, which has increased production of toxins A and B, the presence of a binary toxin, and resistance to antibiotics including fluoroquinolones [5,6,7]. BI/NAP1/027 is the predominant strain involved in the rising number of healthcare-related cases in the USA [8, 9]. The emergence of this strain has been associated with increased mortality, hospital length of stay, and healthcare costs [4, 10,11,12,13], but the relationship between this particular strain and the severity of CDI is still under debate [14, 15]. The main clinical presentation is diarrhea, which can be severe enough to cause toxic megacolon, fulminant colitis, and death [16]. Common risk factors associated with this infection include prior antibiotic use [17], proton-pump inhibitors (PPIs) [18,19,20], and certain types of chemotherapy [21]. Transmission occurs by the fecal–oral route, with an estimated colonization rate in healthy individuals of up to 3%, and higher rates in older individuals and patients admitted to healthcare facilities [22]. Even though CDI is a major problem in the hospital setting, cases in individuals without evident contact with the healthcare system (community-acquired CDI, or CA-CDI) have been on the rise [23].

Guidelines for the treatment and prevention of CDI have been published [24••, 25••, 26, 27]. Multiple interventions recommended in these guidelines focus mainly on environmental control and antibiotic stewardship, with simultaneous interventions (“bundles”) advocated to control nosocomial outbreaks, especially those involving hypervirulent strains [28]; however, it is challenging to determine which interventions have the highest impact on controlling CDI [29, 30]. Even though these interventions are usually perceived as the cornerstones of the CDI prevention bundle, we believe that a better understanding of the interactions between C. difficile and the gut microbiota is needed [31, 32], along with a better grasp of the impact of C. difficile-colonized patients on both horizontal transmission of C. difficile and the incidence of CDI [33, 34]. These latter concepts will have an increasingly important role in the development of novel infection prevention strategies. The impact of CDI on infection prevention efforts is such that three of the five recommendations made by the Society for Healthcare Epidemiology of America for the Choosing Wisely campaign, sponsored by the American Board of Internal Medicine Foundation, are directly related to CDI: avoidance of prolonged antibiotic use without evidence of infection, avoidance of C. difficile testing in the absence of signs or symptoms suggestive of CDI, and avoidance of antibiotic therapy in patients with recent CDI without convincing evidence of infection [35].

Interventions

Antimicrobial stewardship

Antimicrobial stewardship is a set of coordinated strategies to improve the use of antibiotics, with the goals of improving patient outcomes, reducing antimicrobial-related adverse effects (including the development of antimicrobial resistance), and decreasing unnecessary costs [36]. It is highly recommended that all healthcare facilities develop a core antimicrobial stewardship program. Antimicrobial stewardship has become a critical aspect of CDI prevention, given the evidence that a significant proportion of hospital-acquired CDI (HA-CDI) cases result from the transition from an asymptomatic carrier state to active CDI [33, 34]. In theory, antimicrobial stewardship could decrease the transition from C. difficile colonization to CDI by avoiding inappropriate exposure to antibiotics and other medications [37]. In addition to targeted antibiotic restriction (usually involving fluoroquinolones, clindamycin, and cephalosporins), evaluation of the appropriateness of PPIs is becoming a key intervention. PPIs have been implicated as a risk factor for CDI, including increased severity [38,39,40], especially when used for 48 h or more [41]. With a deeper understanding of the gut microbiota, PPIs have been shown to directly affect the gut microbiome, increasing local inflammatory reactions in a mouse model [42] and decreasing microbiome diversity in a way that can predispose to CDI [43]. A recent review describing the evidence supporting the positive relationship between HA-CDI and PPIs has been published [44].

Testing stewardship and clinical decision support

Even though early detection of CDI is important for starting adequate therapy and implementing infection prevention measures that help decrease transmission in the hospital setting, inappropriate testing is associated with unnecessary antibiotic exposure, with an increased risk of side effects and antibiotic resistance, and with the implementation of contact precautions, which can have deleterious effects on patient care (feelings of isolation, deterioration in quality of care). Testing stewardship, in a manner similar to antibiotic stewardship, can be defined as a set of strategies to improve the use of diagnostic tests, with the expectation of improving detection of true-positive CDI cases, decreasing the rate of false-positive cases (distinguishing colonization from true infection), allowing prompt discontinuation of contact isolation measures, and decreasing antibiotic exposure. These strategies have to be balanced against the advantages of making an early diagnosis, especially in facilities with diagnostic algorithms that can be initiated without a physician order (nurse-driven protocols). Over-testing is associated with higher numbers of HA-CDI LabID events, as defined by the National Healthcare Safety Network (NHSN), which can trigger unnecessary interventions (exposure to oral vancomycin, implementation of contact isolation precautions). We recommend establishing an algorithm for C. difficile testing (an example is given in Fig. 1), an approach that has been shown to decrease LabID events through a combination of standardization of laboratory processes, electronic health record clinical decision support, and real-time monitoring [45]. Each facility has to establish rules to evaluate the pre-test probability of CDI prior to ordering stool testing, based on clinical presentation, laboratory data, medical interventions, and recent C. difficile testing. In general, the likelihood of CDI is low when patients have not been exposed to antibiotics, have a normal white blood cell count, do not have fever or abdominal pain, have had a recent negative C. difficile stool test (within the last seven days), and are taking medications or undergoing other interventions that might explain the diarrhea. Evaluation for exposure to laxatives or other interventions that might affect stool consistency before testing is highly recommended [46, 47].

Fig. 1

An example of a local algorithm to improve early diagnosis of Clostridioides difficile infection (CDI) and implementation of isolation, while decreasing false-positive diagnoses and late detection (classified as hospital-acquired CDI [HA-CDI] if diagnosed more than 96 h after admission). WBC, white blood cells; NAAT, nucleic acid amplification test.
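
The pre-test probability rules described above lend themselves to implementation as an electronic health record order-entry check. The following is a minimal sketch of such a rule, for illustration only; the field names, thresholds, and messages are hypothetical and would have to be adapted to each facility's own algorithm and data model.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional, Tuple

@dataclass
class PatientSnapshot:
    # Hypothetical fields pulled from the electronic health record
    liquid_stools_last_24h: int
    on_laxatives: bool
    antibiotics_last_90_days: bool
    wbc_per_ul: float
    fever_or_abdominal_pain: bool
    last_negative_cdiff_test: Optional[datetime]

def allow_cdiff_order(p: PatientSnapshot, now: datetime) -> Tuple[bool, str]:
    """Return (allow, message) for a C. difficile stool test order.

    Encodes the kind of pre-test probability screen described in the text:
    hold testing when diarrhea is absent or explained by laxatives, when a
    negative test was obtained within the last 7 days, or when no risk
    factors (recent antibiotics, leukocytosis, fever/abdominal pain) exist.
    """
    if p.liquid_stools_last_24h < 3:
        return False, "Fewer than 3 unformed stools in 24 h: testing not indicated."
    if p.on_laxatives:
        return False, "Patient on laxatives: hold laxatives and reassess stool pattern."
    if p.last_negative_cdiff_test is not None and now - p.last_negative_cdiff_test < timedelta(days=7):
        return False, "Negative C. difficile test within the last 7 days: repeat testing discouraged."
    low_pretest_probability = (not p.antibiotics_last_90_days
                               and p.wbc_per_ul < 15000
                               and not p.fever_or_abdominal_pain)
    if low_pretest_probability:
        return False, "Low pre-test probability: review alternative causes of diarrhea first."
    return True, "Order allowed; consider contact precautions pending results."

A rule of this kind would typically run as a soft stop, allowing the ordering clinician to override it with documentation, so that clinical judgment is preserved.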

The laboratory method used for the diagnosis of CDI has important implications for patient diagnosis and for the reporting of CDI rates. Nucleic acid amplification testing (NAAT) is commonly used for diagnosis in the USA, but toxin production might be a better predictor of true infection and severity [48]. NAAT-only testing has been associated with a substantial increase in CDI rates [49], and switching from a NAAT-only strategy to two-step testing (NAAT, followed by a toxin assay if the NAAT is positive) has immediate effects on CDI rates [50]. If a two-step diagnostic strategy is established, interventions regarding the management of patients considered to be asymptomatic carriers (NAAT-positive, toxin enzyme immunoassay [EIA]-negative) have to be defined before implementation, including whether such patients require contact isolation and tailored antimicrobial stewardship interventions.
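
For illustration, the interpretation of two-step results can be expressed as a simple mapping; the category labels below are ours (a sketch only, not a validated reporting scheme), and the handling of discordant results must follow local policy.

def interpret_two_step(naat_positive: bool, toxin_eia_positive: bool) -> str:
    """Map two-step C. difficile results (NAAT, then toxin EIA) to a working category."""
    if not naat_positive:
        return "Negative: CDI unlikely; look for alternative causes of diarrhea."
    if toxin_eia_positive:
        return "NAAT-positive/toxin-positive: consistent with CDI; treat and isolate."
    # NAAT-positive but toxin-negative: likely colonization, but clinical
    # correlation is required and local policy dictates isolation and treatment.
    return "NAAT-positive/toxin-negative: possible asymptomatic carriage; apply local policy."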

Clinical decision support and prediction scoring systems have been studied to predict the development of a first episode of HA-CDI [51,52,53], to predict recurrent CDI (within certain limitations) [54,55,56], and to develop a comorbidity index after controlling for antibiotic use, age, PPI use, and histamine blocker use [57]. Targeted testing based on clinical criteria (admission to a medical institution in the preceding 90 days, administration of antibiotics in the preceding 90 days, or a history of CDI) has been shown to improve the timing and appropriateness of testing [58], decrease unnecessary inpatient testing [59], improve timely discontinuation of laxatives (with a non-significant increase in the proportion of patients with C. difficile-related complications) [60], and reduce the time to implementation of contact isolation measures [61]. Healthcare workers confronted with these electronic tools reported acceptance because of standardization and error reduction, although they also reported a perceived loss of autonomy and clinical judgment [62].
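
The targeted-testing criteria above translate directly into a screening rule. The sketch below is illustrative only (the function and parameter names are hypothetical), and facilities may weight or combine criteria differently.

def meets_targeted_testing_criteria(admitted_within_90_days: bool,
                                    antibiotics_within_90_days: bool,
                                    history_of_cdi: bool) -> bool:
    """Return True if any targeted-testing criterion from the text is met:
    recent healthcare admission, recent antibiotic exposure, or prior CDI.
    In this sketch, a single criterion is sufficient to trigger testing."""
    return admitted_within_90_days or antibiotics_within_90_days or history_of_cdi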

Infection control

Environmental contamination with pathogens is an important source of healthcare-associated infections [63,64,65], and CDI is no exception. C. difficile forms spores, which are resistant to the bactericidal effects of alcohol and other commonly used hospital disinfectants. General recommendations to help decrease CDI in the hospital setting include the use of private rooms or cohorting; the rapid institution of contact precautions (especially when results of stool testing are not readily available), continued for at least 48 h after diarrhea has resolved or until discharge in outbreak or hyperendemic situations; and the use of disposable equipment when feasible, or thorough cleaning and disinfection with a sporicidal agent [24••, 25••].

Hand hygiene has been advocated as an important method to control CDI [66], even in settings with an already high rate of hand hygiene compliance [67]. Handwashing with soap and water removes C. difficile colony-forming units (CFUs) more effectively than alcohol-based rub [68, 69], and it is recommended during CDI outbreaks, in other hyperendemic settings, and when there is visible fecal contamination. The impact of a structured versus non-structured handwashing technique has been evaluated by Deschenes et al. [70], showing a decline in non-toxigenic C. difficile CFUs with structured handwashing, but these findings should be extrapolated with caution. A study by Edmonds et al. showed that surrogate organisms were not predictive of C. difficile spore removal [71]. A study in a teaching hospital in Italy showed an inverse correlation between CDI incidence and hand hygiene compliance [72], but it is unclear what the real impact of hand hygiene is when compared with other “bundled” interventions [29]. An economic evaluation based on an agent-based model showed that hand hygiene compliance, environmental decontamination, and empiric isolation and treatment were the interventions with the greatest impact [73]. Patient hand hygiene could be another option to control CDI incidence [66].

Terminal room cleaning with a sporicidal agent should be considered in conjunction with other measures to prevent CDI during periods of high endemic rates or outbreaks, or if there is evidence of repeated CDI cases in the same room. Cleaning effectiveness needs to be measured to ensure its quality. Hydrogen peroxide used for environmental disinfection has been shown to be effective in eradicating C. difficile spores from the hospital environment [74,75,76] and in decreasing CDI rates when used as a terminal disinfection method [77]. Ultraviolet-C light (UV-C) disinfection of C. difficile and other hospital-acquired infection pathogens with mobile automated devices has been advocated to decrease surface contamination [78,79,80,81]. Exposure time and organic load are the main factors that influence its effectiveness [79]. UV-C emission time has been decreased by the use of reflective wall coating [82]. UV light has been compared against hydrogen peroxide vapor (HPV), with evidence showing greater decontamination with the latter method [83]. Xenon UV-light devices have been associated with substantial declines in HA-CDI rates, along with decreases in the number of deaths and colectomies attributed to severe CDI [84]. Pulsed xenon-UV effectiveness appears to be similar to that of UV-C [85] and at least non-inferior to sodium hypochlorite [86]. This effect was not reproduced in a recent study in a burn intensive care unit [87]. These methods should be used in conjunction with other measures to control CDI [88], but the added impact of no-touch technology over terminal cleaning with bleach is still under debate [89], and it is generally recommended to perform terminal cleaning with bleach before UV-C decontamination [90]. Targeted decontamination of rooms from which a patient infected or colonized with C. difficile, methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), or multidrug-resistant Acinetobacter spp. had been discharged was associated with a hospital-wide decline in CDI incidence [90]. UV-C terminal cleaning affected not only CDI rates but also other hospital-acquired infections [91]. A recent systematic review and meta-analysis suggests that the impact of no-touch technologies is most robust against CDI and VRE [92]. However, routine use of automated methods for terminal disinfection is not recommended [25••]. Copper alloys (65–100% copper content) have been associated with sporicidal activity against C. difficile [93, 94], but their impact on HAI rates and their cost-effectiveness have not been extensively studied [95]. Studies of copper-impregnated surfaces and linens and their impact on CDI and other hospital-acquired infection rates have yielded conflicting results [96, 97].

Colonization and microbiome manipulation

The identification and isolation of patients colonized with C. difficile remain controversial [98, 99]. A study by Longtin et al. [100] showed that implementing universal screening for C. difficile upon hospital admission and placing colonized patients on contact precautions successfully decreased CDI rates compared with other hospitals in the same region of Canada. However, the study had a quasi-experimental design, and CDI rates had already started to decline by the time the intervention was put in place.

Given that the biological progression of C. difficile entails the transition from a non-colonized state to colonization and then to infection, future interventions should aim at directly manipulating the microbiome to prevent this progression [31, 32]. Prebiotics (e.g., non-absorbable oligosaccharides), probiotics (e.g., communities of beneficial enteric organisms), and synbiotics (prebiotics combined with probiotics) will become more relevant in future infection control interventions [32]. In the next decade, 16S rRNA sequencing will probably become readily available in clinical laboratories, opening the door to prevention strategies tailored to each patient's needs.

Conclusions

Prevention of CDI should include interventions aimed at decreasing the exposure of non-colonized patients to C. difficile (e.g., handwashing, environmental disinfection), decreasing microbiome disruption (e.g., antimicrobial stewardship interventions), decreasing misdiagnosis (e.g., laboratory stewardship), and improving diagnostic testing specificity (e.g., NAAT followed by toxin EIA). In the future, additional infection control interventions aimed at direct manipulation of the intestinal microbiome will be individualized to each patient's needs.