Phlebologie 2014; 43(05): 232-237
DOI: 10.12687/phleb2227-5-2014
Review Article
Schattauer GmbH

Direct oral anticoagulants – laboratory monitoring

Article in several languages: English | German
M. Spannagl
1   Hämostaseologie, Klinikum der Universität München, Germany

Publication History

Received: 31 July 2014

Accepted: 08 August 2014

Publication Date:
04 January 2018 (online)


Summary

The pharmacokinetics of direct oral anticoagulants (DOACs), which are similar to those of low-molecular-weight heparin (LMWH), allow these substances to be substituted for each other in routine clinical practice while maintaining the same frequency of administration. Parenteral LMWH administration is mainly performed if oral DOAC administration is either unsafe or impossible. The required waiting period prior to interventions or surgery can be precisely determined for both applications. Where necessary, both substance classes are monitored using similar laboratory tests.
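As an illustration of how a waiting period before an intervention relates to the elimination half-life, the following sketch applies a simplified first-order elimination model. The half-life value and the residual-fraction target are assumptions chosen purely for illustration, not clinical recommendations, and the function names are hypothetical.

```python
import math

# Illustrative sketch only: simplified first-order elimination showing how the
# residual fraction of an anticoagulant falls with time after the last dose.
# Half-life and target fraction below are assumed values, not clinical guidance.

def residual_fraction(hours_since_last_dose, half_life_h):
    """Fraction of the peak drug level remaining after first-order elimination."""
    return 0.5 ** (hours_since_last_dose / half_life_h)

def waiting_time_for_fraction(target_fraction, half_life_h):
    """Hours until the residual level falls to the given fraction of the peak."""
    return half_life_h * math.log(target_fraction) / math.log(0.5)

if __name__ == "__main__":
    assumed_half_life_h = 12.0  # assumed elimination half-life in hours
    print(residual_fraction(24, assumed_half_life_h))            # ~0.25 after two half-lives
    print(waiting_time_for_fraction(0.06, assumed_half_life_h))  # ~49 h, roughly four half-lives
```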

General concomitant coagulation monitoring, as has been practised for many decades with vitamin K antagonists, is not necessary during treatment with the new Xa and thrombin inhibitors. If specific clinical situations arise in patients being treated with a DOAC (e.g. emergency surgery or interventions, acute haemorrhage, acute organ failure), information about the patient's plasma drug level can help the physician assess the haemorrhage risk. Because the DOACs act at a central position in the coagulation system, they interfere with global coagulation tests such as the prothrombin time (PT; Quick's time/INR), the activated partial thromboplastin time (aPTT) and the thrombin time (TT; thrombin inhibitors only), as well as with specific physiologically based coagulation tests. The changes in coagulation diagnostics depend on the DOAC's mode of action and half-life, the time of tablet intake, the dosage and the test system/reagent used in the coagulation laboratory. To interpret the results of haemostaseological measurements, both the clinician and the laboratory physician need to know which DOAC was taken and at what time.

In addition, the sensitivity (dose-response curve) of each test system used must be considered. Under DOAC treatment, global coagulation assays provide only semi-quantitative estimates: if the prothrombin time (Quick's time/INR) is within the normal range during rivaroxaban intake (provided that a rivaroxaban-sensitive reagent, e.g. Neoplastin Plus, was used in the laboratory), a clinically relevant residual effect of rivaroxaban in the patient's plasma is unlikely. During dabigatran intake, a trough aPTT >80 sec indicates an increased risk of haemorrhage, while a TT within the normal range indicates the absence of dabigatran from the plasma.
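The semi-quantitative interpretation rules summarised above can be written out as a simple decision aid. The following is a minimal sketch: the function name and structure are hypothetical, the thresholds (normal PT with a rivaroxaban-sensitive reagent, trough aPTT >80 sec, normal TT) follow the text, and test sensitivity is reagent-dependent and must be verified with the local laboratory.

```python
# Minimal sketch of the semi-quantitative interpretation rules for global assays
# under DOAC treatment. Illustrative only; not a substitute for local laboratory
# agreement on reagent sensitivity.

def interpret_global_assays(doac, pt_normal=None, trough_aptt_sec=None, tt_normal=None):
    """Map global coagulation test results to a qualitative statement."""
    if doac == "rivaroxaban":
        if pt_normal:  # requires a rivaroxaban-sensitive reagent, e.g. Neoplastin Plus
            return "Clinically relevant residual rivaroxaban effect unlikely"
        return "Residual rivaroxaban effect cannot be excluded"
    if doac == "dabigatran":
        if tt_normal:
            return "No dabigatran detectable in plasma"
        if trough_aptt_sec is not None and trough_aptt_sec > 80:
            return "Trough aPTT > 80 sec: increased haemorrhage risk"
        return "Dabigatran effect present; quantify with calibrated diluted TT if needed"
    return "No rule defined for this substance"

print(interpret_global_assays("dabigatran", trough_aptt_sec=95, tt_normal=False))
```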

For quantification of the plasma concentration of direct factor Xa inhibitors, chromogenic anti-Xa tests are available that are specifically calibrated to the substance being measured. For quantification of the plasma dabigatran concentration, the calibrated diluted thrombin time (Hemoclot® test) can be used. To date, however, measurement of drug concentrations has not formed part of routine clinical practice, as the plasma concentrations measured cannot currently be interpreted clinically for the individual patient. Overall, the use of haemostaseological laboratory methods for detecting DOACs must be planned according to the local situation and repeatedly agreed between laboratory and clinic (e.g. whenever the reagent is changed).
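The common principle behind these drug-calibrated assays is a substance-specific calibration curve that converts the measured assay response into a plasma concentration. The sketch below illustrates this with a simple linear fit; the calibrator values are invented for illustration, and real assays use manufacturer-supplied calibrators and validated (often non-linear, assay-specific) curve fits on the analyser.

```python
import numpy as np

# Hypothetical calibrators: known drug concentrations (ng/ml) vs. measured assay
# response. Values are invented for illustration; the direction and shape of the
# real calibration curve depend on the assay (chromogenic anti-Xa vs. diluted TT).
calib_conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
calib_response = np.array([0.02, 0.11, 0.21, 0.40, 0.79])

# Simple linear fit of response against concentration (illustration only).
slope, intercept = np.polyfit(calib_conc, calib_response, 1)

def response_to_concentration(response):
    """Convert a measured assay response to an estimated drug concentration."""
    return (response - intercept) / slope

patient_response = 0.30
print(f"Estimated drug level: {response_to_concentration(patient_response):.0f} ng/ml")
```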