The purported benefits of the newer oral anticoagulants over warfarin include predictable pharmacokinetics, fewer food and drug interactions, no need for routine laboratory coagulation monitoring, and a quicker onset and ‘offset’ of action.
The claim that NOACs have predictable pharmacokinetics is misleading. For example, for a given dose of dabigatran, the 10th to 90th centiles of observed steady-state concentrations spanned a five-fold range of values.18 This degree of variability is typical of most drugs.19 It is therefore remarkable that clinical outcomes with fixed-dose NOACs have proved non-inferior to those with INR-targeted warfarin. These non-inferiority trial findings are supported by observational studies of real-world use, especially for dabigatran,20,21 albeit not entirely.22
The lack of an established need for routine coagulation monitoring with NOACs may be convenient for patients who do not have ready access to INR testing for warfarin therapy. However, it makes assessing adherence and managing thrombotic events more difficult.13 Also, although NOACs have fewer food and drug interactions than warfarin,2 prescribers' relative unfamiliarity with these interactions, combined with the absence of routine monitoring, means that important interactions may be missed.
The quicker onset and ‘offset’ of action with NOACs is both a positive and a negative. On the one hand, the need for bridging with parenteral anticoagulants may be obviated with NOACs. On the other hand, missing even a single dose could leave the patient with a period of minimal anticoagulation12 (see the Table for half-lives).
A limitation of both the interventional and observational data so far is the relative lack of longitudinal information. The best available evidence is for dabigatran in atrial fibrillation, where the rates of major thrombotic and bleeding events were comparable to those with warfarin over five years.23 Similar data for the other oral anticoagulants, and for ‘indefinite’ use in venous thromboembolism, are awaited.