Introduction

Antibiotic stewardship programs (ASPs) aim to optimize antibiotic prescribing so as to achieve the best clinical outcomes in the treatment or prevention of infections, with minimal toxicity to patients and minimal impact on subsequent resistance [1, 2]. ASPs have been increasingly implemented in various healthcare and community settings and have been shown to reduce antibiotic consumption, the length of hospital stay (LOS), and Clostridioides difficile infections (CDIs) [3,4,5]. An important assumption underlying a successful ASP is the concept of a “multidisciplinary approach.” The Infectious Diseases Society of America (IDSA) and Society for Healthcare Epidemiology of America (SHEA) guidelines state that the core members of an ASP team are an infectious disease (ID) physician and a clinical pharmacist with ID training; the presence of clinical microbiologists, infection control professionals, and hospital epidemiologists is deemed optimal [6]. The Centers for Disease Control and Prevention guidelines identify seven core elements of successful ASP implementation and point out that ASPs should actively collaborate with clinical microbiology [7]. At the patient level, the main task of the microbiology laboratory is to support therapeutic decisions by detecting and identifying microorganisms and performing antibiotic susceptibility testing (AST) [8••].

In the last two decades, the explosion in the development of rapid diagnostic techniques (RDTs) has placed the microbiology laboratory in a crucial position along the diagnostic workflow. RDTs can substantially shorten the time to identification of microorganisms compared with conventional phenotypic methods and can provide susceptibility information more rapidly by detecting resistance markers or genes [9]. These advantages allow rapid initiation of the most effective targeted antibiotic treatment, reducing or even avoiding exposure to broad-spectrum antibiotics. The prospect of speeding up the diagnostic process is tempting for clinicians; however, having access to such a wide range of diagnostic tools may increase the risk of their inappropriate use. It has been estimated, in fact, that roughly one-fifth of currently available tests are overused, with even more being underused [10]. The urgent need for regulatory policies governing microbiologic diagnostics has spurred antibiotic stewardship experts to introduce the new concept of “diagnostic stewardship” into clinical practice to promote evidence-based utilization of diagnostic tests, with the primary goals of improving value and quality of care and safely reducing costs [11].

This review introduces the specific role of diagnostic and antibiotic stewardship in the implementation of RDTs and describes the most relevant clinical applications in CDIs, bloodstream infections (BSIs), and respiratory infections.

Implementation of RDTs: role of diagnostic and antibiotic stewardship

The field of RDTs is constantly expanding, and the advancement of RDTs is one of the key tasks set out in the National Action Plan for Combating Antibiotic-Resistant Bacteria [12]. RDTs employ a variety of technologies able to identify a pathogen, or a group of pathogens, and/or specific resistance markers. However, RDTs cannot replace conventional diagnostic methods; rather, they function as tools that return valuable and reliable information with a substantially shorter turnaround time (TAT) [13]. As a salient example, antibiotic susceptibility information based on the detection of a resistance mechanism is only qualitative, since no minimum inhibitory concentration is provided, and does not necessarily mirror the actual phenotypic pattern, since the presence of a resistance determinant does not imply expression of a resistant phenotype [14]. Hence, implementing RDTs along the diagnostic workflow requires that clinicians know how to interpret the results and fully understand the strengths and weaknesses of the various RDTs. In this context, the concept of “diagnostic stewardship” has emerged to support the rational use of microbiologic diagnostics by addressing three essential questions: “Which RDTs should be used? How and when should they be used? Are they worth paying for?” Implementation strategies for diagnostic stewardship act at different stages along the diagnostic pathway: the pre-analytic stage, including test-related decision-making and sample collection; the analytic stage, relating to laboratory processing; and the post-analytic stage, referring to the reporting and interpretation of test results [11, 15]. Diagnostic stewardship approaches take many forms, including prior-authorization laboratory policies (e.g., refusing to test samples that were wrongly collected or handled), targeted provider education on appropriate test ordering and interpretation [16], and diagnostic algorithms or clinical decision support tools that guide clinicians towards the most appropriate test for the specific clinical scenario [17].
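To make the pre-analytic stage concrete, the sketch below shows how a simple order-screening rule might be encoded in an order-entry or laboratory information system. It is a minimal illustration under assumed criteria (a hypothetical 7-day repeat-test window and a sample-quality flag), not an implementation of any specific institutional policy.

```python
# Minimal sketch of a pre-analytic diagnostic stewardship check.
# The criteria (7-day repeat-test window, sample-quality flag) are
# illustrative assumptions, not taken from any guideline.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class TestOrder:
    patient_id: str
    test_code: str            # e.g., "CDIFF_NAAT" (hypothetical code)
    ordered_at: datetime
    sample_acceptable: bool   # set by the laboratory at specimen receipt

def screen_order(order: TestOrder, prior_orders: list,
                 repeat_window_days: int = 7) -> tuple:
    """Return (accept, reason) for a new test order."""
    if not order.sample_acceptable:
        return False, "Sample rejected: wrongly collected or handled"
    for prior in prior_orders:
        if (prior.patient_id == order.patient_id
                and prior.test_code == order.test_code
                and order.ordered_at - prior.ordered_at
                    < timedelta(days=repeat_window_days)):
            return False, "Duplicate order within the repeat-test window"
    return True, "Order accepted"
```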

A fruitful implementation of any diagnostic stewardship strategy requires a functioning collaborative partnership between the laboratory and clinical sectors. This is even more true for RDTs, whose clinical relevance is closely tied to TAT. Even assuming that each stage of the diagnostic pathway has run appropriately, RDT results cannot affect therapeutic decision-making if they are not communicated promptly, interpreted correctly, and applied to clinical practice [8••]. A positive RDT on a given specimen does not necessarily mean that “detection equals infection,” and for clinicians who do not usually deal with infectious diseases, correct interpretation of RDT results may be challenging. The antibiotic stewardship team can therefore facilitate the interpretation of results and promptly provide clinically useful information directly to prescribers. Pooled data published by Buehler et al. in 2016 revealed that, for patients with BSIs, only RDTs coupled with direct communication led to significant reductions in the time to appropriate antibiotic therapy [18]. Selective reporting [19], templated comments [20], and especially ASP interventions through a real-time audit-and-feedback approach have been shown to be valuable tools for more rapid treatment optimization [21] and may even improve clinical outcomes [22,23,24]. Key steps of diagnostic and antibiotic stewardship are displayed in Fig. 1.

Fig. 1 Key steps of diagnostic and antibiotic stewardship.
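As an illustration of the post-analytic supports mentioned above (selective reporting and templated interpretive comments), the sketch below pairs an RDT result with a canned comment and an ASP notification flag. The organisms, markers, and wording are hypothetical examples, not a validated reporting template.

```python
# Hedged sketch of post-analytic reporting supports: templated interpretive
# comments attached to an RDT result, plus a flag to page the ASP team.
# The marker names and comment text are illustrative assumptions only.
TEMPLATED_COMMENTS = {
    ("Staphylococcus aureus", "mecA_detected"):
        "mecA detected: isolate is presumptively methicillin-resistant; "
        "most beta-lactams are not recommended.",
    ("Staphylococcus aureus", "mecA_not_detected"):
        "mecA not detected: consider de-escalation to an anti-staphylococcal "
        "beta-lactam if clinically compatible.",
}

def report_result(organism: str, marker_status: str,
                  notify_asp: bool = True) -> dict:
    """Build a result message with an interpretive comment and an ASP page."""
    comment = TEMPLATED_COMMENTS.get(
        (organism, marker_status),
        "Contact the antibiotic stewardship team for interpretation.")
    return {"organism": organism, "marker": marker_status,
            "comment": comment, "page_asp_team": notify_asp}
```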

Embracing RDTs and stewardship in clinical practice: implementation successes and challenges

Clostridioides difficile infections

C. difficile is the most important infective cause of healthcare-associated diarrhea and one of the most important healthcare-associated pathogens in both Europe and the USA, leading to significant morbidity, mortality, and cost [25, 26]. Even in the RDT era, the diagnosis of CDI remains a relevant clinical challenge. CDI is very commonly diagnosed through molecular RDTs, which unfortunately do not reliably distinguish colonization from infection [27]. They can therefore misdiagnose colonized patients as having CDI, particularly when used in patients with a low likelihood of CDI, leading to unnecessary antibiotic treatment and additional costs. Several studies have evaluated the best strategy for guiding clinicians on appropriate CDI testing, and some of them showed a significant reduction in testing and in CDI events [28••]. Diagnostic options include tests for the C. difficile organism (e.g., the glutamate dehydrogenase enzyme immunoassay, GDH EIA), for toxin antigen or toxin genes (e.g., toxin A/B EIA and nucleic acid amplification tests, NAAT), or algorithm-based combinations of these tests. The optimal diagnostic procedure is still a matter of debate. In an attempt to improve uniformity in CDI diagnosis, the European Society of Clinical Microbiology and Infectious Diseases updated its guidance by systematically incorporating new evidence on CDI diagnosis with the newer RDTs. The resulting diagnostic guidance document, published in 2016, underlined the need to combine tests to achieve optimal accuracy and proposed two distinct multi-step diagnostic algorithms, based on combinations of RDTs, to be applied interchangeably in clinical practice [29]. Of note, although the authors pointed out that therapeutic decisions rely mostly on patients’ clinical features, no specific clinical parameters were included in the algorithms. The 2018 IDSA/SHEA guidelines tried to address this issue by recommending two distinct approaches to C. difficile testing (a single-test or a multi-step diagnostic algorithm) depending on the presence of “pre-agreed institutional criteria for patient stool submission” [30]. The pre-agreed criteria restrict stool submission to patients with unexplained, new-onset unformed stools in the previous 24 h. If pre-agreed institutional criteria are fulfilled, NAAT alone may be used; otherwise, a multi-step approach incorporating two tests, coupled with strategies to reduce unnecessary stool testing (staff education, restrictive criteria for test ordering), should be followed. This diagnostic algorithm has been shown to maximize test accuracy and avoid unnecessary testing; however, its implications for patient safety remain largely unclear. Even with these limitations, this is the first guideline that addresses the concept of diagnostic stewardship and provides two different recommendations depending on whether diagnostic stewardship practices are in place at the institutional level [28••].
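The decision logic above can be summarized in a short sketch. The following is a minimal, hedged rendering of a multi-step testing cascade (sensitive screen, toxin confirmation, arbitration) and of the institutional-level choice between the two IDSA/SHEA approaches; the result labels and the choice of NAAT as the arbitrating test are illustrative simplifications, not a verbatim transcription of the published algorithms.

```python
# Sketch of C. difficile testing logic: institutional choice of approach and
# interpretation of a GDH -> toxin EIA -> NAAT cascade. Labels are simplified.
from typing import Optional

def institutional_testing_approach(preagreed_stool_criteria: bool) -> str:
    """Choose the testing approach depending on pre-agreed submission criteria."""
    if preagreed_stool_criteria:
        return "NAAT alone (or a multi-step algorithm) on submitted stools"
    return "Multi-step algorithm plus measures to limit unnecessary testing"

def classify_sample(gdh_positive: bool,
                    toxin_eia_positive: Optional[bool] = None,
                    naat_positive: Optional[bool] = None) -> str:
    """Interpret one stool sample run through the multi-step cascade."""
    if not gdh_positive:
        return "CDI unlikely"                 # sensitive screening test negative
    if toxin_eia_positive:
        return "CDI likely"                   # free toxin detected in stool
    if naat_positive:
        return "Toxigenic strain present: CDI or colonization - assess clinically"
    return "CDI unlikely"                     # GDH positive but not confirmed
```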

Bloodstream infections

BSIs are among the leading causes of infection-related mortality; estimates suggest that nearly two million episodes and a quarter of a million deaths from BSIs occur every year in North America and Europe [31]. The ability to promptly optimize antibiotic treatment through early detection of organisms and/or resistance markers via RDTs is of utmost importance in BSIs, since a delay in administering appropriate therapy may affect mortality [32]. There is ample evidence that the benefits of RDTs are maximized when paired with real-time ASPs. In light of this evidence, the 2016 IDSA guidelines on ASP implementation recommend the use of RDTs for the diagnosis of BSIs and respiratory infections, in addition to conventional microbiological methods [2]. In the randomized controlled trial (RCT) of Banerjee et al., 617 patients with BSIs were randomized to three arms: standard blood culture processing with matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) identification and rapid penicillin-binding protein 2a testing; rapid multiplex polymerase chain reaction (PCR) with templated comments; and rapid multiplex PCR with templated comments plus daytime prospective audit and feedback. RDTs were included in all intervention arms; however, an improved time to antibiotic de-escalation (34 h vs 38 h, p < 0.001) was observed exclusively in the real-time ASP arm [20]. In addition, data from observational studies show that the reduction in time to appropriate antibiotic treatment translates into improved patient outcomes. In a quasi-experimental study, Huang et al. evaluated MALDI-TOF implementation combined with a real-time ASP in adult patients with BSIs admitted to a large academic center. The five-member ASP team provided daily evidence-based antibiotic recommendations after receiving real-time notification of organism identification and antibiotic susceptibilities. Compared with the pre-intervention period, the ASP intervention significantly improved the time to effective antibiotic therapy (20.4 vs 30.1 h, p = 0.021) and reduced all-cause mortality (20.3% vs 12.7%, p = 0.021), LOS (8.3 vs 14.9 days, p = 0.014), and 30-day BSI relapse (2% vs 5.9%, p = 0.038) [22]. A similar approach was used by Wenzler et al. to evaluate clinical outcomes specifically in Acinetobacter baumannii BSIs or pneumonia. The ASP design was based on providing recommendations after a retrospective drug review by the ASP team. The intervention-related shortening of the time to appropriate treatment led to an increase in clinical cure at 7 days (34% vs 14%, p = 0.16) and a significant 48-h decrease in infection-attributable LOS (p = 0.021) [24]. Of note, the rate of multi-drug resistant (MDR) A. baumannii in that institution was more than 60%. In the study of Perez et al., standard care was compared with a multi-faceted ASP approach including early MALDI-TOF species identification, AST for MDR Gram-negative bacteria, and real-time communication of results and intervention by pharmacists. This approach led to a significant decrease in LOS (15.3 vs 23.3 days, p = 0.0001) and 30-day mortality (8.9% vs 21%, p = 0.01), and a reduction in per-patient costs (from $70,991 to $52,693, p = 0.002) [23].

In BSIs due to Gram-positive bacteria, rapid detection of methicillin or vancomycin resistance markers can affect clinicians’ therapeutic decisions. Bauer et al. evaluated the clinical and economic outcomes of rapid PCR combined with an infectious diseases pharmacist’s intervention in patients with Staphylococcus aureus BSIs. In patients with BSIs due to methicillin-susceptible S. aureus (MSSA), a significant shortening of the time to switch from empiric vancomycin to targeted nafcillin or cefazolin (by 1.7 days, p = 0.002) and a downward trend in LOS and costs were observed [33]. The Verigene platform, which enables the simultaneous identification of species and resistance markers (vanA, vanB, and mecA), has been successfully implemented with ASP in patients with Gram-positive BSIs. Sango et al. tested the Verigene Gram-positive blood culture panel in addition to real-time ASP in patients with enterococcal BSIs. Besides a shorter time to appropriate antibiotic therapy in patients with vancomycin-resistant enterococcal BSIs (21.6 h shorter, p < 0.0001), a significant reduction in LOS (21.7 days shorter, p = 0.048) and in mean hospital costs ($60,729 lower, p = 0.02) was observed in the post-intervention group compared with the pre-intervention group [34].

Enhanced communication between the laboratory and clinicians also facilitates reductions in unnecessary antibiotic use for false-positive blood cultures. The RCT of Ly et al. assessed the implementation of peptide nucleic acid fluorescent in situ hybridization (PNA-FISH) for S. aureus and coagulase-negative staphylococci (CoNS) coupled with a real-time notification program to the ASP team. Compared with patients receiving PNA-FISH alone, those randomized to PNA-FISH coupled with ASP had a significantly shorter median antibiotic duration for CoNS (2.5 days shorter, p = 0.01) and lower mortality (8% vs 18%, p = 0.05) [35]. A pre-post intervention study demonstrated that the use of PNA-FISH for rapid CoNS identification within an established ASP benefitted cancer patients: the intervention led to a shorter median antibiotic duration, a higher proportion of patients in whom vancomycin was avoided (50% vs 31%, p = 0.002), and less monitoring of plasma vancomycin levels (31% vs 52%, p = 0.009) [36].

The implementation of RDTs for MDR Gram-negative bacteria (GNB) is more challenging because of the complexity of Gram-negative resistance, which involves several mechanisms. Even when RDTs are implemented in the right clinical context and applied to the right patient, interpretation issues remain, especially in the case of a negative test. For example, if a hospital has a high rate of extended-spectrum beta-lactamase (ESBL)–producing GNB and the RDT shows a Klebsiella pneumoniae isolate that is CTX-M negative, the stewardship team may be reluctant to de-escalate empiric therapy to a third-generation cephalosporin (or to continue ceftriaxone rather than escalate to a carbapenem), since resistance could still be present through mechanisms not detectable by the RDT. This often means that antibiotic de-escalation or modification occurs only once AST data are available, exposing the patient to unnecessary broad-spectrum antibiotic therapy and consequently limiting the clinical usefulness of the RDT [37]. Pogue et al. addressed this issue by assessing the performance of the Verigene Gram-negative blood culture panel (capable of detecting nine species and six resistance markers) in predicting susceptibility in two geographically distinct high-resistance settings. Among 1046 GNB isolates from BSIs, the absence of resistance determinants on the RDT largely predicted susceptibility to the targeted antibiotics, with a negative predictive value (NPV) above 90% for resistance to third-generation cephalosporins in Escherichia coli and K. pneumoniae, but lower for Pseudomonas aeruginosa, likely owing to the more complex nature of its resistance [37].
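The dependence of NPV on local epidemiology can be made explicit with a small worked example. The sketch below applies the standard predictive-value formula; the sensitivity, specificity, and prevalence figures are illustrative assumptions, not data from the cited studies.

```python
# Minimal sketch of how local epidemiology shapes the negative predictive
# value (NPV) of a genotypic resistance marker. All numbers are illustrative.
def npv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """NPV of a marker for phenotypic resistance at a given local prevalence."""
    true_neg = specificity * (1.0 - prevalence)       # susceptible, marker absent
    false_neg = (1.0 - sensitivity) * prevalence      # resistant, marker absent
    return true_neg / (true_neg + false_neg)

# A marker that captures most of the resistance mechanisms circulating locally
# keeps a high NPV even at moderate resistance prevalence ...
print(round(npv(sensitivity=0.95, specificity=0.99, prevalence=0.30), 3))  # ~0.98
# ... whereas an organism whose resistance is often mediated by mechanisms the
# panel does not detect behaves like a low-sensitivity marker, and NPV drops.
print(round(npv(sensitivity=0.50, specificity=0.99, prevalence=0.30), 3))  # ~0.82
```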

These studies show that ASPs, by enhancing the communication and interpretation of RDT results, provide a significant benefit in terms of de-escalation from broad-spectrum agents, more rapid administration of appropriate therapy, and discontinuation of unnecessary treatment. That said, no “one-size-fits-all” approach exists; diagnostic stewardship should consider that proper integration of RDTs into the diagnostic workflow requires knowledge of the local resistance epidemiology, of the type and distribution of resistance mechanisms, and of the usual empiric treatment coverage.

Cost-effectiveness is another important aspect to consider when conducting stewardship for RDTs. While conventional microbiologic diagnostics are quite inexpensive, some newer technologies can be very costly, reinforcing the need to address value [38]. Several single-center studies have shown that the use of RDTs can be a cost-effective strategy, especially when coupled with ASPs. In some studies, the cost saving was achieved through lower antibiotic use secondary to rapid streamlining [39], while others concluded that the main driver of cost saving was likely the shortening of LOS [23, 33, 40••].

A comprehensive cost-effectiveness analysis published in 2018 assessed the economics of seven molecular RDTs, alone or in conjunction with ASP support. The authors concluded that RDTs were in general cost-effective and that the greatest healthcare cost savings occurred in the presence of an ASP [41].

Applying a decision model, Brown et al. evaluated the impact of rapid PCR identification of methicillin-resistant S. aureus (MRSA) in blood on clinical and economic outcomes. The findings showed that rapid PCR testing had the potential to decrease mortality while being less costly than empiric therapy across a wide range of MRSA prevalence rates and PCR test costs [42]. A global financial assessment, considering both direct and indirect costs, revealed that despite the additional costs of implementing MALDI-TOF and of dedicating pharmacy stewardship personnel time to interventions, total hospital costs decreased by $2439 per BSI [39]. Even the more expensive platforms proved cost saving: the cost-effectiveness analysis of Pardo et al. found that the implementation of the FilmArray Blood Culture Identification Panel coupled with ASP resulted in a significantly shorter post-culture length of stay and saved approximately $30,000 per 100 patients tested [43].
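The structure of such a decision model can be illustrated with a minimal expected-cost sketch comparing a test-and-target strategy with universal empiric MRSA coverage. All costs and probabilities below are placeholders, test misclassification is ignored for brevity, and none of the figures come from the cited analyses.

```python
# Hedged sketch of an expected-cost decision model for rapid MRSA PCR versus
# empiric MRSA coverage. Costs and prevalences are illustrative placeholders;
# a real model would also account for test sensitivity/specificity and outcomes.
def expected_cost_pcr(prevalence_mrsa: float, pcr_cost: float,
                      mrsa_treatment_cost: float,
                      mssa_treatment_cost: float) -> float:
    """Expected per-patient cost when therapy is targeted on the PCR result."""
    return (pcr_cost
            + prevalence_mrsa * mrsa_treatment_cost
            + (1.0 - prevalence_mrsa) * mssa_treatment_cost)

def expected_cost_empiric(mrsa_treatment_cost: float) -> float:
    """Expected per-patient cost when every patient receives empiric MRSA cover."""
    return mrsa_treatment_cost

# Sweep prevalence and test cost to see where the PCR strategy dominates.
for prev in (0.1, 0.3, 0.5):
    for pcr_cost in (50.0, 150.0):
        pcr = expected_cost_pcr(prev, pcr_cost,
                                mrsa_treatment_cost=4000.0,
                                mssa_treatment_cost=2500.0)
        emp = expected_cost_empiric(mrsa_treatment_cost=4000.0)
        print(f"prevalence={prev:.0%}, PCR cost=${pcr_cost:.0f}: "
              f"PCR ${pcr:.0f} vs empiric ${emp:.0f}")
```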

On the basis of this evidence, integrating RDTs and ASPs appears to be a cost-effective strategy to improve patient care. Hence, an ideal application of RDTs should combine clinical effectiveness with economic efficiency.

Respiratory infections

Acute respiratory infections are common among hospitalized patients and are one of the leading causes of morbidity and mortality worldwide [44]. A diagnosis based on clinical and radiologic findings is often not sufficient to distinguish with certainty viral infections, for which antibiotic treatment should be avoided, from bacterial infections, for which antibiotics are required. RDTs may therefore play a crucial role in increasing diagnostic certainty and decreasing the overprescription of antibiotics [45]. While the collaborative relationship between active ASPs and RDTs in optimizing clinical outcomes has been well described for BSIs, evidence for respiratory infections remains scarce. The 2016 IDSA guidelines on ASP implementation recommend the use of RDTs to reduce inappropriate antibiotic use; however, these recommendations are weak and based on poor-quality evidence [2], largely from pediatric studies. Similarly, the 2019 American Thoracic Society and IDSA Community-Acquired Pneumonia Guidelines recognize the potential of these RDTs to improve antibiotic stewardship but do not specifically support their use beyond testing for influenza during the influenza season [46, 47••].

A good example of how judicious RDT implementation can have stewardship implications is the MRSA nasal PCR test. Pooled data from a well-conducted diagnostic meta-analysis revealed that PCR nares screening for MRSA has excellent specificity and NPV for ruling out MRSA pneumonia and could therefore be a valuable tool for streamlining empiric antibiotic therapy [48••]. Dunaway et al. assessed the effect of a pharmacy-driven protocol incorporating nasal MRSA PCR testing on the duration of antibiotic therapy and on clinical outcomes in patients with suspected pneumonia. The median duration of vancomycin therapy was significantly shortened, by approximately 31 h per patient; however, no significant benefits on clinical outcomes were observed [49•].
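A rule of this kind can be expressed compactly, as in the hedged sketch below of a pharmacist-facing de-escalation prompt built around the high NPV of nares screening. The decision criteria and wording are hypothetical and are not the protocol used by Dunaway et al.

```python
# Minimal sketch of a pharmacy-driven de-escalation prompt based on the high
# NPV of MRSA nares PCR for MRSA pneumonia. Criteria and wording are
# illustrative assumptions, not a published protocol.
def vancomycin_review(nares_pcr_positive: bool, on_vancomycin: bool,
                      other_mrsa_source_suspected: bool) -> str:
    if not on_vancomycin:
        return "No anti-MRSA agent to review"
    if nares_pcr_positive or other_mrsa_source_suspected:
        return "Continue anti-MRSA coverage; reassess with culture results"
    return ("Negative nares PCR: recommend discontinuing vancomycin for "
            "suspected pneumonia unless MRSA is recovered from cultures")
```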

Multiplex respiratory RDT panels are also potentially valuable tools for optimizing stewardship-related outcomes. So far, only a few stewardship implementation efforts around these RDTs have been carried out, and they provide conflicting results [50, 51, 52•]. A recent meta-analysis of 56 diagnostic test accuracy studies and 15 clinical studies concluded that, despite excellent pooled diagnostic accuracy (sensitivity and specificity > 90%), the real clinical implications remain largely unclear owing to very high inter-study heterogeneity [53]. Based on the so-called “syndromic approach,” these RDT panels are designed to detect an extremely wide range of pathogens potentially responsible for a given respiratory infectious syndrome. It is clear, therefore, that using these tests solely on the strength of their analytic performance does not represent good clinical practice. Algorithmic strategies integrating clinical assessment, epidemiological data, patient risk factors, and therapeutic options would allow rational use of these RDTs [54]. Active, collaborative diagnostic and antibiotic stewardship strategies are therefore needed to identify evidence-based approaches that maximize the clinical utility of respiratory RDT panels [46].
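As one illustration of what such an algorithmic strategy might look like, the sketch below encodes a hypothetical ordering pathway for a multiplex respiratory panel. The criteria (immunocompromise, ICU admission, seasonal influenza testing, and whether the result would change management) are assumed examples of factors a local pathway could weigh, not a published algorithm.

```python
# Illustrative sketch of an ordering algorithm for a multiplex respiratory
# panel; every criterion below is a hypothetical example, not guideline text.
def order_respiratory_panel(immunocompromised: bool, icu_admission: bool,
                            influenza_season: bool,
                            result_will_change_management: bool) -> str:
    if not result_will_change_management:
        return "Do not order: result would not alter isolation or therapy"
    if immunocompromised or icu_admission:
        return "Order full multiplex panel"
    if influenza_season:
        return "Order targeted influenza/RSV testing first"
    return "Defer panel; reassess after clinical and radiologic evaluation"
```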

Conclusions

With the development and spread of RDTs, the key concept of a “multidisciplinary approach” advocated by the antibiotic stewardship guidelines is of utmost importance. By shortening the time to appropriate therapy, RDTs appear to influence clinical decision-making and patient outcomes. A functional partnership between the microbiology laboratory and the ASP team is essential to maximize the performance of RDTs by ensuring that the most appropriate RDTs are selected and implemented and that results are interpreted and communicated correctly and efficiently. In this sense, the goals of diagnostic stewardship and antibiotic stewardship in the implementation of RDTs are intertwined, insofar as laboratory results guide antibiotic decision-making. While ASPs act primarily on interpreting test results correctly and applying them to treatment decisions, thus ensuring proper use of antibiotics, diagnostic stewardship should promote the appropriate use of RDTs by positioning them rationally along the diagnostic workflow and by developing dedicated diagnostic algorithms and guidance. As with antibiotic stewardship, the concept of diagnostic stewardship needs to be fully recognized and embedded within routine clinical practice.