Introduction

Over the past decade, direct oral anticoagulants (DOAC) have been introduced and are now increasingly used for the treatment and prophylaxis of venous thromboembolism, and for the prevention of stroke and systemic embolism in patients with non-valvular atrial fibrillation [1]. DOAC have several advantages over vitamin K antagonists (VKA). They possess high bioavailability, and their anticoagulant effect is more predictable than that of VKA; hence, they do not require routine dose adjustment based on laboratory testing. In addition, the anticoagulant effect of DOAC is not affected by diet and only relatively little by other drugs. Furthermore, compared with VKA, DOAC are fast acting and have a relatively short half-life: they reach peak plasma levels approximately 2 h after ingestion, and trough plasma concentrations at 24 or 12 h, depending on whether they are administered once- or twice-daily, respectively.

However, unlike VKA, DOAC are eliminated (though not exclusively) from the circulation through the kidney. This is a potential drawback, as many patients on oral anticoagulants (about 50% of the whole anticoagulated population) currently take them for the prevention of stroke and systemic embolism due to non-valvular atrial fibrillation. These patients are predominantly elderly and are, therefore, at risk of renal failure. According to current practice [2,3,4,5], guidelines [6,7,8,9] and drug technical annexes [10,11,12,13], patients with severe chronic kidney disease (CKD) [creatinine clearance (CrCl) less than 30 mL/min] should not receive DOAC, or should receive them with caution. Patients with CrCl from 30 to 50 mL/min can be given DOAC, but their renal function should be closely monitored.

Although DOAC do not need routine dose adjustment based on laboratory testing, the measurement of their anticoagulant effect may be useful in special situations. These have been discussed elsewhere [14,15,16,17] and include the following.

  i. Before initiation of anticoagulation [a complete blood cell count including platelets, and the basic coagulation tests, prothrombin time and activated partial thromboplastin time (PT, APTT), should be carried out].

  ii. At the time of adverse events (hemorrhage or thrombosis).

  iii. Before antidote administration.

  iv. To make decisions on thrombolytic therapy in patients with ischemic stroke.

  v. Whenever a drug–drug interaction is suspected.

  vi. In patients with extreme body weight.

One situation that is still debated is whether testing is required for patients on DOAC when temporary discontinuation of treatment is needed for surgery or invasive procedures. This review article discusses the pros and cons of measuring (or not measuring) DOAC levels before surgery/invasive procedures, as seen by a multidisciplinary team of experts with different backgrounds, including the thrombosis laboratory, clinical thrombosis, internal medicine, cardiology and nephrology.

The problem

Patients on oral anticoagulants (whichever drug is used) are likely to undergo surgical or invasive procedures at some point in their lives. It has been estimated that about one in ten anticoagulated patients requires temporary interruption of treatment because of surgery or invasive procedures [18]; their management is, therefore, rather challenging. Discontinuing anticoagulant treatment before procedures is an important medical decision, as it may potentially affect patient health. Most of the experience accumulated so far comes from patients on VKA [19], for whom temporary discontinuation of anticoagulation is considered mandatory only for major surgery, or for invasive procedures for which the hemorrhagic risk is deemed relatively high (see Table 1 for more details). The protocol adopted for patients on VKA who need to undergo surgery or invasive procedures requires discontinuation of anticoagulation for a few days beforehand, and measurement of the international normalized ratio (INR) immediately before the procedure to make sure no residual VKA effect is present [19].

Table 1 Bleeding risk assessment in patients on DOAC who are awaiting surgery or invasive procedures

Information on DOAC is scanty and comes mostly from post hoc analyses of data from clinical registration trials, a few observational studies and one prospective study of patients on dabigatran [20], in which, however, testing for DOAC was not included in the protocol. Many scientific societies have issued their own management guidelines, which are based on expert opinion or on the application of standardized interruption/resumption protocols built on the evaluation of CrCl, patient characteristics and bleeding risk. Studies randomizing patients to undergo procedures after standardized DOAC interruption protocols without testing vs interruption protocols with testing are not available. Furthermore, no studies on DOAC measurement have been undertaken to show whether the application of a standardized interruption protocol results in residual DOAC concentrations that are potentially harmful for patients.

Recently, Douketis et al. [21] tried to fill this gap and carried out an observational study designed to assess whether management of the perioperative period would benefit from testing for dabigatran. Patients who were to undergo surgical or invasive procedures received a standardized interruption protocol, and blood for the measurement of dabigatran was taken preoperatively. However, the results of testing were not used to decide whether procedures could or could not be safely carried out. Post hoc analyses of dabigatran concentrations showed that the standardized interruption protocol resulted in “normal” test values in about 80% of patients. These results should be interpreted with caution. On the one hand, the standardized interruption protocol devised by Douketis et al. [21] could be considered safe, as the majority of patients had relatively low residual circulating dabigatran levels. On the other hand, the fact that a non-negligible proportion of patients (20%) had relatively high dabigatran levels should be considered a cause for concern.

To obtain data on the safety of interruption, the Perioperative Anticoagulant Use for Surgery Evaluation (PAUSE) study has been organized and is being carried out at a multicenter level [22]. The study is validating a perioperative management strategy based on a standardized interruption protocol, which includes local DOAC testing. The primary aim of the study is to demonstrate the safety of the PAUSE interruption protocol, and the secondary aim is to determine the effect of the pre-procedural interruption on the levels of residual circulating DOAC [22]. Assuming that the PAUSE study eventually achieves its goal of showing that the interruption protocol is safe and that no pre-procedural testing is needed, this strategy will (hopefully) be valid for the majority of patients, but it is reasonable to believe that there will be an (unpredictable) proportion of patients for whom it will not be safe. The crucial question is whether we should be concerned only with the majority and forget the minority of patients. Testing preoperatively would increase patient safety without excessively increasing costs.

Why testing

It is well known that DOAC possess favorable pharmacokinetic properties: they reach peak plasma levels within 2 h of administration, and trough levels 12 or 24 h later, according to whether they are administered twice- or once-daily, respectively. These favorable pharmacokinetics cast doubt on the need to measure the levels of DOAC before surgery or invasive procedures to rule in or out the presence of clinically relevant residual DOAC concentrations that would increase the risk of peri- or post-procedural bleeding. This issue has been the focus of guidelines and debates in scientific journals [23,24,25], but clinicians are still uncertain about what to do in their practice.

The authors who argue against testing before surgery or invasive procedures maintain that, under normal conditions, stopping anticoagulation 2 or 3 days before the procedure is relatively safe, as this period is sufficient for DOAC to be cleared from the circulation, provided that renal function is normal [24].
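This argument rests on first-order elimination kinetics. As a purely illustrative calculation (assuming a representative DOAC elimination half-life $t_{1/2}$ of about 12 h with normal renal function), the residual fraction of drug after an interruption of duration $t$ is

$$\frac{C(t)}{C_0} = 2^{-t/t_{1/2}},$$

so a 48-h interruption leaves $2^{-48/12} \approx 6\%$ of the peak concentration. If renal impairment doubles the half-life to 24 h, the same 48-h interruption leaves $2^{-2} = 25\%$, which illustrates why the "2 or 3 days" rule depends critically on the assumption of normal renal function.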

Glomerular filtration rate (GFR) is the fundamental parameter of renal function, and is often estimated by various equations, all of which include serum creatinine concentration as the key variable [26,27,28]. The use of these equations is growing in clinical practice because of the increasing incidence of kidney dysfunction [29]. The most frequently used is the Cockcroft–Gault equation [26], which estimates CrCl, while the others estimate GFR. It should be realized that the estimate of renal function depends on the equation used, and that there may be over-estimation in some cases. Although international guidelines [29] indicate that, in adults, renal function should be estimated using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation [28], for DOAC it has been pragmatically decided that the Cockcroft–Gault method is sufficient to obtain meaningful information. Hence, this equation has been used in clinical trials and in clinical practice, even though renal function assessed by this equation is over-estimated compared with GFR [30, 31].
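For reference, the Cockcroft–Gault equation [26] estimates CrCl as

$$\mathrm{CrCl\ (mL/min)} = \frac{(140 - \mathrm{age\ [years]}) \times \mathrm{weight\ [kg]}}{72 \times \mathrm{serum\ creatinine\ [mg/dL]}} \times 0.85\ \text{(if female)}.$$

As a worked example (a hypothetical patient), an 80-year-old woman weighing 60 kg with a serum creatinine of 1.2 mg/dL has an estimated CrCl of $(140 - 80) \times 60 / (72 \times 1.2) \times 0.85 \approx 35$ mL/min, i.e., within the 30–50 mL/min range for which tight control of renal function is recommended.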

As mentioned, most guidelines and DOAC technical annexes state that patients with CrCl lower than 30 mL/min should not be treated with DOAC (or should be treated with caution). DOAC are in fact eliminated (though not exclusively) by the kidney, and low CrCl, therefore, carries the risk of considerable drug accumulation within the circulation, thus increasing the risk of bleeding during and after procedures. Clinical guidelines and DOAC technical annexes, therefore, warn of the need to assess CrCl before DOAC prescription, and periodically thereafter, to detect any variation that may occur over time. Accordingly, it might seem justified not to measure DOAC shortly before surgery or invasive procedures if renal function is within normal limits. However, it is well known that CrCl may vary over time, sometimes abruptly, especially in the elderly [29, 31, 32]. In addition, renal function decreases progressively with age [33]. Therefore, the assessment of CrCl, even at 6-month intervals as recommended by many clinical guidelines, cannot guarantee that renal function at the time of surgery or invasive procedure is still adequate to ensure elimination of residual drug from the circulation. Hence, to be on the safe side, CrCl should be checked shortly before surgery or invasive procedures. This would, however, be a surrogate solution, and one may wonder whether the measurement of DOAC concentration is not the most obvious and direct solution for patient safety. Moreover, there are other reasons for testing DOAC before surgery or invasive procedures.

  i. DOAC are cleared from the circulation by the kidney, but also by the liver. Therefore, measuring CrCl alone is not sufficient, and liver function should also be checked.

  ii. DOAC plasma concentrations correlate poorly with CrCl [34]. Therefore, using CrCl as a standalone parameter to make decisions on renal drug clearance would not be justified.

  iii. Relatively high inter-individual variability of DOAC plasma concentrations has been observed in subjects taking the same dose [34, 35].

  iv. DOAC elimination also depends on genetic variants of the enzymes involved in their catabolism [36]. Therefore, it would be necessary to know whether patients carry any of these variants, but this is not commonly determined.

  v. Last but not least, drug clearance from the circulation depends on the time elapsed from the last administration to the procedure. It is often assumed that patients have understood and correctly followed the interruption schedule, but this cannot be taken for granted (see the illustrative sketch below). The abovementioned arguments make measurement of the drug level a more direct and safer approach than relying on such assumptions.
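To illustrate how sensitive the residual drug level is to these assumptions, the following minimal Python sketch (a purely illustrative calculation assuming simple first-order elimination; the half-life values are assumptions, not a validated clinical tool) computes the residual fraction of the peak concentration for different interruption times and half-lives:

    # Purely illustrative sketch (not a validated clinical tool): residual DOAC
    # level as a fraction of the peak concentration, assuming simple first-order
    # elimination. The half-life values below are assumptions for illustration.

    def residual_fraction(hours_since_last_dose: float, half_life_h: float) -> float:
        """Fraction of the peak drug concentration remaining after a given time."""
        return 0.5 ** (hours_since_last_dose / half_life_h)

    for half_life in (12.0, 24.0):   # assumed: normal vs impaired renal function
        for hours in (48.0, 72.0):   # typical 2- or 3-day interruption
            fraction = residual_fraction(hours, half_life)
            print(f"t1/2 = {half_life:.0f} h, {hours:.0f} h since last dose: "
                  f"{fraction:.1%} of peak remains")

With these assumed values, a 48-h interruption leaves about 6% of the peak concentration when the half-life is 12 h, but 25% when the half-life is prolonged to 24 h; if the patient actually took the last dose later than instructed, the residual fraction is higher still.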

Why not testing

Arguments against testing are listed and discussed in the following paragraphs.

  i. Test availability. The argument that tests for DOAC are not available in clinical laboratories does not necessarily hold true, as dedicated tests for DOAC are available from, and marketed by, many manufacturers involved in hemostasis testing. The fact that they are not used largely depends on a vicious circle, whereby laboratories do not set up methods because clinicians do not prescribe testing, and clinicians do not prescribe testing because laboratories do not set up the relevant methods. Furthermore, testing for DOAC is perceived as useless because DOAC were developed to overcome the main disadvantage of VKA (i.e., frequent testing for dose adjustment). In reality, there is still confusion between two distinct concepts: “monitoring”, which means dose adjustment based on laboratory testing (as applies to VKA and heparins), as opposed to “measuring”, which means laboratory testing to assess the level of anticoagulation achieved by DOAC in special situations [37].

  ii. Prompt availability of results. It is argued that results are not promptly available in an emergency. This does not hold true, as the total turnaround time for obtaining results is around 30 min; results can, therefore, be promptly available even in an emergency, provided that laboratory operators organize their work appropriately.

  iii. Difficult tests that need expertise. It is argued that dedicated tests for DOAC are difficult to set up and run in a general clinical laboratory [38, 39]. In fact, methods for DOAC (in spite of their intimidating names) are relatively simple to set up and run on the regular coagulometers available in any average clinical laboratory. The level of expertise required is that needed to run general coagulation tests such as the PT, the APTT or the measurement of antithrombin by chromogenic substrate technology. Very few physicians realize that the dilute thrombin time used for dabigatran is as simple to run as the PT, and that the anti-factor Xa assay used for rivaroxaban, apixaban or edoxaban is essentially the same as that used for antithrombin. Both PT and antithrombin measurements are widely used, and no one casts doubt on the reliability of their results.

  iv. Inter-laboratory variability of results. It is (erroneously) believed that tests for DOAC are prone to relatively large inter-laboratory variability (i.e., that results produced in different laboratories using different methods/reagents do not agree). Recent nationwide proficiency surveys involved nearly 100 Italian laboratories with average expertise; the laboratories were provided with a common set of freeze-dried plasmas and asked to test for DOAC. The surveys showed that the inter-laboratory variability of measurements for dabigatran, rivaroxaban and apixaban is relatively small and compares favorably with that observed for the INR used to monitor patients on VKA (coefficients of variation of 9 vs 11%) [40].

  v. Lack of cut-off values. It is argued that there are no cut-off values for DOAC concentration beyond which one should be concerned. The lowest DOAC concentration that can be considered safe before surgery or invasive procedures is indeed not yet well established, as clinical experience with these drugs is still limited. However, it can be assumed that relatively small concentrations, such as those below 30 ng/mL [7], or values below the quantitation limit of the local method, can be considered a safe threshold for decision making, at least until more reliable cut-off values from ad hoc clinical studies become available (see the illustrative sketch at the end of this list). It may also be argued that a decision based on dichotomization (i.e., test results below or above a cut-off) is inadequate, and that the patient’s individual risk should be evaluated [6]. In this respect, it is relevant to ask who should be responsible for such an evaluation. Perhaps patients on DOAC who are awaiting surgical procedures and need temporary discontinuation of treatment should be referred to the expert professionals operating in anticoagulation clinics [41, 42]. Failure to do so, combined with no testing, will presumably increase the rate of misclassification and the inherent risk of bleeding during surgical procedures.

  vi. Poor definition of laboratory tests. There is a general perception that the laboratory tests to be used for DOAC measurement are not well defined. This does not hold true, as there is wide consensus among experts (referenced in [43]; see also Table 2 and Figs. 1, 2) that the tests of choice for dabigatran are the dilute thrombin time or the ecarin clotting (or chromogenic) assays, and the test of choice for the anti-factor Xa drugs (i.e., rivaroxaban, apixaban or edoxaban) is the anti-factor Xa activity assay. Results obtained with these tests compare favorably with those obtained by high-performance liquid chromatography or mass spectrometry [44], which are considered the gold standards but are not available in most laboratories. Each of the abovementioned tests is clot-based (dilute thrombin time and ecarin clotting assays, Fig. 1) or chromogenic (anti-factor Xa or ecarin chromogenic assays, Fig. 2), and is calibrated against specific standards, which are commercially available and certified for drug concentration (dabigatran, rivaroxaban, apixaban or edoxaban); results are expressed as ng/mL (see the illustrative calibration sketch at the end of this list). It should be realized that no “universal standard” (i.e., valid for all DOAC) is available; hence, clinicians must inform the laboratory which drug is being taken by the patient under investigation. Therapeutic intervals for DOAC concentrations are not yet available. Laboratories should report patient results, and clinicians should refer to the expected values for each DOAC reported in the technical annexes issued by the European Medicines Agency (for more details, see reference [43]).

    Table 2 Tests of choice to measure plasma concentrations of direct oral anticoagulants (DOAC)
    Fig. 1

    Schematic representation of the dilute thrombin time. Patient plasma diluted in pooled normal plasma (or a standard) is mixed with purified thrombin. Dabigatran inhibits thrombin activity, and the residual thrombin catalyzes the fibrinogen-to-fibrin conversion; the higher the dabigatran concentration, the smaller the residual thrombin and the longer the clotting time (a). A series of standards at certified dabigatran concentrations is tested along with patient plasma to construct a calibration curve. Interpolation of the patient clotting time on the calibration curve yields the dabigatran concentration (b). In the related ecarin assay, ecarin (a snake venom extract) catalyzes the prothrombin-to-(meizo)thrombin conversion; (meizo)thrombin (a thrombin variant) is inhibited by dabigatran, and the residual amount is in turn determined by clotting or chromogenic assays; the higher the residual (meizo)thrombin, the smaller the dabigatran concentration

    Fig. 2

    Schematic representation of the chromogenic anti-factor Xa (FXa) assay. Diluted patient plasma (or a standard) is mixed with an excess of purified FXa. Anti-FXa drugs (rivaroxaban, apixaban or edoxaban) inhibit FXa activity, and the residual FXa catalyzes the conversion of the chromogenic substrate. The amount of converted chromogenic substrate is determined photometrically as optical density (OD); the smaller the OD, the lower the residual FXa and, consequently, the greater the anti-FXa drug concentration (a). A series of standards at certified anti-FXa drug (rivaroxaban, apixaban or edoxaban) concentrations is tested along with patient plasma to construct a calibration curve. Interpolation of the patient OD on the calibration curve yields the drug concentration (b)

  vii. Conventional tests of coagulation can be used instead of dedicated tests. Most clinicians are more familiar with the conventional tests of coagulation, such as the PT, APTT or thrombin time (TT), and would prefer to use them instead of the dedicated tests to assess the level of anticoagulation achieved by DOAC. This approach is not valid, for the following reasons. Conventional tests of coagulation (PT, APTT, TT) are affected (i.e., their clotting times are prolonged) to some extent by DOAC, but the degree of prolongation does not reliably reflect the DOAC plasma concentration [45]. The sensitivity of the PT and APTT to DOAC is in fact largely dependent on the composition of the reagent used for testing. For instance, some commercial thromboplastins used for the PT are relatively sensitive to DOAC, whereas others are not [45,46,47,48]. Therefore, concluding that there are no clinically relevant circulating DOAC levels based solely on a normal PT or APTT could be misleading and dangerous for patient safety if the reagent used is insensitive to DOAC [45,46,47,48]. Furthermore, the PT and APTT are global tests that may be prolonged beyond the upper limit of the normal range for reasons other than DOAC, e.g., small but sizeable deviations from normality due to liver disease or other comorbidities [48]. For these reasons, basic coagulation tests should not be used to draw conclusions on DOAC levels, with the possible exception of the TT: owing to its very high sensitivity to dabigatran, a TT within normal limits in patients taking this drug can rule out clinically relevant plasma concentrations.

  viii. Postponement of procedures. It might be argued that some procedures could occasionally be postponed because of borderline DOAC results, or results that are difficult to interpret. Although such situations are likely to occur, they do not justify a decision of “no testing”. Patient safety should be the first concern whenever dubious circulating DOAC levels may increase the risk of peri- or post-procedural bleeding.
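As a purely illustrative example of the calibration-and-interpolation principle shown in Figs. 1 and 2 (item vi), combined with the dichotomous decision rule discussed in item v, the following minimal Python sketch interpolates a hypothetical patient readout on a hypothetical calibration curve and compares the estimated concentration with the 30 ng/mL threshold [7]. All numerical values are invented for illustration; real assays use the manufacturer’s certified calibrators and the laboratory’s validated procedure.

    # Purely illustrative: interpolate a drug concentration (ng/mL) from a
    # calibration curve, then apply the dichotomous 30 ng/mL rule. All numbers
    # below are invented; they are not real calibrator or patient values.

    # Calibrators: (certified concentration in ng/mL, assay readout). The readout
    # is a clotting time (s) for the dilute thrombin time, or an OD for anti-FXa.
    CALIBRATORS = [(0.0, 35.0), (50.0, 52.0), (100.0, 70.0), (200.0, 105.0)]

    SAFE_THRESHOLD_NG_ML = 30.0  # pre-procedural threshold suggested in [7]

    def interpolate_concentration(readout: float) -> float:
        """Linearly interpolate the concentration between bracketing calibrators."""
        points = sorted(CALIBRATORS, key=lambda p: p[1])
        for (c_lo, r_lo), (c_hi, r_hi) in zip(points, points[1:]):
            if r_lo <= readout <= r_hi:
                fraction = (readout - r_lo) / (r_hi - r_lo)
                return c_lo + fraction * (c_hi - c_lo)
        raise ValueError("Readout outside calibration range: dilute and retest.")

    patient_readout = 41.0  # hypothetical patient clotting time (s)
    concentration = interpolate_concentration(patient_readout)
    verdict = "below" if concentration < SAFE_THRESHOLD_NG_ML else "at or above"
    print(f"Estimated concentration: {concentration:.0f} ng/mL "
          f"({verdict} the {SAFE_THRESHOLD_NG_ML:.0f} ng/mL threshold)")

With the invented values above, the hypothetical readout interpolates to about 18 ng/mL, i.e., below the assumed 30 ng/mL threshold; in practice, the threshold, the calibration model and any dilution steps are defined by the assay manufacturer and the local laboratory.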

Concluding remarks

In conclusion, measuring DOAC with dedicated tests before surgical or invasive procedures is of paramount importance for patient safety. It provides the best and most direct evidence to rule in or out clinically relevant concentrations of residual drug. Regulatory agencies should urgently approve the use of these tests in clinical practice, hospital administrators should make them available, and clinical laboratories should set up the relevant methods and offer them to clinicians [49].