Hello, my name is Giuseppe Lippi. I am Director of the Laboratory of Clinical Chemistry and Hematology at the Academic Hospital of Parma, Italy, and Associate Professor of Clinical Biochemistry. Welcome to this Pearl of Laboratory Medicine on “Pre-analytical Issues Specific to Coagulation Testing.”
Errors in laboratory diagnostics typically develop throughout the so-called “brain-to-brain turnaround time loop,” which starts in the physician’s brain with the test order and ends at the same point, in the physician’s brain, with the interpretation of test results. Throughout this cycle, as in many other human activities, there is a latent chance of error. Although it is frequently perceived that most laboratory errors emerge from the analytical phase, several lines of evidence attest that this does not hold true; conversely, the leading source of diagnostic errors, accounting for up to two-thirds, is the pre-analytical phase, where most manually intensive activities are clustered.
Coagulation testing is no exception to this rule. In a survey completed at my academic hospital, the frequency of pre-analytical errors was estimated to be as high as 5% of all blood collections for coagulation testing, with notable variations across wards: the highest rate was observed in pediatric departments. The type of pre-analytical problem also differed substantially across healthcare settings, with spurious hemolysis prevailing in the emergency department and clotted plasma samples prevailing in pediatric wards.
Pre-analytical variability can hence be considered a sort of “Trojan horse” of coagulation testing, from which a large number of latent errors may arise. The most frequent sources of pre-analytical error are summarized in this slide. Besides general causes such as identification errors and unsuitable transportation, which do not differ noticeably from other areas of diagnostic testing, those highlighted in red, bold font are more critical for, or even specific to, coagulation testing. These include the blood drawing technique, venous stasis, the blood collection system, the use (or not) of a discard tube, the effect of the anticoagulant, sample mixing, sample preparation, and overall sample quality.
As in any other area of diagnostic testing, an appropriate venipuncture is indeed the foremost requirement for obtaining a quality specimen for coagulation testing. Although each phlebotomist typically uses a technique that he or she is comfortable with, developed over a large number of blood draws, there are some basic criteria that should be fulfilled. These characteristically entail appropriate education and training of phlebotomists; preferential collection of blood from the larger and most easily accessible median cubital and cephalic veins; and accurate cleansing of the venipuncture site with 70% isopropyl alcohol, followed by careful removal of the disinfectant. It is also noteworthy that the procedure should be stopped immediately when the attempt is unsuccessful, in order to prevent harm to the patient and the collection of poor-quality specimens caused by repeated injury to the corpuscular elements of blood and to endothelial cells.
There is a very good reason for the previous suggestion of collecting blood from the veins of the forearm. As this figure clearly shows, the chance of obtaining hemolyzed specimens, where hemolysis is mirrored by the concentration of cell-free hemoglobin in plasma or serum, is remarkably higher when blood is drawn from veins distal to the forearm and, especially, from those of the metacarpal plexus. In that case, the chance of collecting mildly or grossly hemolyzed samples is enormously magnified, and up to 70% of samples collected from this site may ultimately be unsuitable for most coagulation assays.
Venous stasis is another important but frequently overlooked issue. The tourniquet is usually applied to help identify a suitable vein to puncture and, according to most guidelines, it should be released immediately after the vein has been located. All of us would agree, however, that this rarely happens in daily practice, and the tourniquet is instead left in place for much longer than necessary. There is now good evidence that the venous stasis caused by prolonged tourniquet application causes a large number of problems, mostly attributable to hemoconcentration and spurious activation of both the coagulation and fibrinolytic pathways. According to data we obtained in a study specifically designed to assess the influence of tourniquet application on coagulation testing, most tests were analytically and clinically biased after 3 minutes of application. Therefore, venous stasis during blood drawing should never be prolonged beyond 2 minutes. When it is necessary to maintain the tourniquet to prevent collapse of the vein, it should nevertheless be released after 2 minutes and retightened only when strictly necessary.
Needle size is typically expressed in gauge and, characteristically, the larger the gauge number, the smaller the caliber of the needle. In a previous study performed to assess the smallest needle suitable for drawing blood for coagulation testing, we found that needles of 23 gauge or wider bore are consistently fit for this purpose. We also observed, however, that the use of very small needles, such as those of 25 gauge or finer bore, may introduce a significant bias, especially in D-dimer values and in the platelet count, two separate effects that probably share an identical trigger, namely the activation of hemostasis when blood is forced through very fine needles.
At variance with the studies on needle bore size, we have found that the collection of blood with butterfly devices, when appropriately used, is not a significant source of bias in coagulation testing. These devices are frequently used when drawing blood from children, from patients whose veins are not easily accessible, and sometimes even from anxious patients. In all these circumstances, according to our data, the butterfly device may also be suitably used for collecting blood for coagulation testing, provided that the air in the tubing is discarded and not introduced into the primary blood tube, where it would cause underfilling.
The use of catheters, however, is a more debated issue. These devices, developed for infusing fluids and not for drawing blood, have two leading features that can cause problems during sample collection. The first is the presence of valves inside the catheter, which obstruct the flow of blood in the direction opposite to that for which they were designed. The second is the development of a remarkable pressure gradient between the evacuated blood tube and the vein. This inevitably perturbs blood flow, with potential damage and/or activation of erythrocytes, leukocytes, and platelets. The combination of these two elements explains the much higher degree of spurious hemolysis observed when blood is drawn from catheters directly into evacuated blood tubes. A potential solution to this problem is the use of primary blood tubes that can be operated as a syringe; this would reduce the chance of obtaining unsuitable coagulation samples while maintaining the high degree of operator safety common to all “closed” collection systems.
In daily practice, especially in short-stay units and in some internal wards such as the oncology department, blood samples may be collected from intravenous lines, that is, central venous or peripheral catheters. Although this commonplace practice is somewhat acceptable as a means of avoiding a second venipuncture, it carries some latent risk of producing unsuitable specimens. The foremost problem, discussed earlier, is the risk of injuring the corpuscular elements of blood as it flows through the catheter. Another important risk, however, is sample contamination with infusion fluids and, specifically for coagulation testing, with heparin.
Contamination of blood with saline or glucose solutions leads to a spurious dilution of coagulation factors that is roughly linearly dependent upon the volume of diluting fluid. Contamination of blood with heparin, however, causes a more pronounced bias in coagulation testing, since this compound is the physiologic cofactor of antithrombin, the major inhibitor of blood coagulation. Although most thromboplastins contain a heparin neutralizer, contamination of blood samples with heparin causes a spurious prolongation of the prothrombin time and, especially, of the activated partial thromboplastin time, with partial or no correction after mixing with normal plasma; a dramatic prolongation, up to complete failure to clot, of the thrombin time and of lupus anticoagulant tests; and activities of clotting factors VIII, IX, XI, and XII that can be spuriously lower than 1%, depending on the degree of heparin contamination. The latter effect may, however, be diluted out at the higher sample dilutions used in factor assays.
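The roughly linear dilution caused by infusion fluids can be illustrated with a small back-of-envelope calculation. This is an illustrative sketch, not a validated correction formula; the function name and volumes are hypothetical.

```python
def diluted_activity(true_activity_pct: float,
                     blood_volume_ml: float,
                     contaminant_volume_ml: float) -> float:
    """Rough estimate of the factor activity measured after a sample
    is contaminated with an inert infusion fluid (e.g., saline).

    Assumes a simple linear volume dilution; a back-of-envelope model
    only, not a validated correction formula.
    """
    total_volume = blood_volume_ml + contaminant_volume_ml
    return true_activity_pct * blood_volume_ml / total_volume

# Example: a draw of 2.7 mL of blood contaminated with 0.3 mL of
# saline dilutes a true 100% factor activity down to 90%.
print(round(diluted_activity(100.0, 2.7, 0.3), 1))  # 90.0
```

Note that this simple model covers only inert diluents; heparin contamination adds a pharmacologic interference on top of dilution, as described above.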
The sample matrix is critically important for coagulation testing. It is typically plasma collected in blood tubes containing 3.2% buffered citrate. This anticoagulant is a reversible calcium chelator and, at variance with EDTA or heparin, allows recalcification and coagulation of blood during in vitro testing once a fixed concentration of calcium is reintroduced into the test sample. To ensure an appropriate blood-to-anticoagulant ratio, the final sample should contain 9 parts of blood to 1 part of anticoagulant. It is thereby understandable that the use of serum or of the wrong additive or, even more improperly, the transfer of blood from a different collection tube into a citrate tube, would introduce a dramatic bias and ultimately make test results unreliable.
This slide summarizes the major influence of a wrong sample matrix on routine and specialized coagulation testing. In samples collected in EDTA, clotting times are dramatically prolonged and the activity of most coagulation factors and inhibitors is substantially reduced, up to 10 times lower than the value in appropriately collected samples. The influence of heparin, already discussed, is even worse: clotting times are dramatically prolonged, up to complete failure to clot, while the activities of clotting factors and inhibitors are considerably depressed. Coagulation testing in serum is also typically biased, with most tests failing to clot and clotting factor activities substantially depressed.
Whether or not to draw a so-called “discard tube” before collecting a coagulation sample has been a matter of long debate over the past decades. Several lines of evidence, also acknowledged by the Clinical and Laboratory Standards Institute in its practical recommendations, now attest that the use of a discard tube is meaningless and uneconomical during straight-needle venipunctures, since results of coagulation testing are reliable even when blood is drawn directly into the first citrated tube. The most notable exception to this rule is when collecting blood from intravenous lines or butterfly devices. In these circumstances, a discard tube may be advisable to prevent sample contamination with intravenous fluids and heparin, or to prevent underfilling of blood tubes caused by carryover of the air contained in the tubing into the primary blood tube.
The collection of blood tubes for platelet function testing is another important exception that may require the use of a discard tube.
As already mentioned, blood tubes for either specialized or routine coagulation testing should be filled to their nominal volume in order to fulfill the strict requirement of a 9:1 blood-to-anticoagulant ratio. When this ratio is impaired, most often because a lower volume of blood is collected into the tube, the relative excess of anticoagulant consistently biases test results. Some examples are summarized in this slide, from which it can be clearly seen that blood tubes should be filled to no less than 90% of their nominal volume to prevent spurious prolongation of clotting tests, with the APTT being the most sensitive to underfilling.
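The 90% fill rule above can be sketched as a simple acceptance check. This is an illustrative sketch only; the function name and tube volumes are hypothetical, and the 90% threshold is the figure cited above.

```python
def check_citrate_tube(nominal_volume_ml: float,
                       actual_fill_ml: float) -> tuple[float, bool]:
    """Flag citrate tubes filled below 90% of their nominal volume.

    The tube is manufactured for a fixed 9:1 blood-to-anticoagulant
    ratio, so underfilling leaves a relative excess of citrate that
    spuriously prolongs clotting times, the APTT above all.
    """
    fill_pct = 100.0 * actual_fill_ml / nominal_volume_ml
    return fill_pct, fill_pct >= 90.0

# A nominal 3.0 mL tube filled with only 2.5 mL of blood is ~83% full
# and should be rejected:
pct, acceptable = check_citrate_tube(3.0, 2.5)
print(round(pct, 1), acceptable)  # 83.3 False
```

In practice such a rule would be enforced visually against the fill mark or by an automated pre-analytical check, not by hand calculation; the code only makes the arithmetic explicit.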
Blood tubes must necessarily be centrifuged to separate blood cells (and platelets) from plasma. For coagulation testing, there are strict requirements that must be fulfilled to obtain a sample matrix suitable for testing. Basically, the Clinical and Laboratory Standards Institute (CLSI) currently recommends that coagulation tubes be centrifuged at 1500 × g for 15 minutes at room temperature, to obtain a platelet count lower than 10,000 platelets per microliter of plasma.
Although this recommendation should be strictly followed when performing specialized coagulation testing or when specimens are to be frozen for delayed testing, we have provided good evidence that routine coagulation testing may still be reliable in samples centrifuged for less than 15 minutes, typically between 5 and 10 minutes. This is an important means of reducing the turnaround time of urgent coagulation testing, which may be critical in patients with stroke, in whom the window for effective thrombolysis is limited to the first few hours after the onset of symptoms.
The use of quality specimens is of foremost importance in any area of in vitro diagnostics, and coagulation testing is no exception to this rule. The three major interfering substances that may bias test results are cell-free hemoglobin (as in hemolyzed specimens), turbidity (mostly attributable to lipids), and bilirubin (as in icteric specimens). While turbidity and icterus principally cause varying degrees of optical interference, which can be overcome by testing at wavelengths other than those of maximum interference, the problems caused by spurious hemolysis of blood specimens are expectedly worse. Not only does hemolysis entail a spectrophotometric interference due to the absorbance peak of cell-free hemoglobin, but it also involves a biological interference due to the release of phospholipids and thromboplastin-like substances, which ultimately trigger a spurious activation of both primary and secondary hemostasis.
The effect of spurious blood cell injury during venipuncture is shown in this slide. The breakdown of as few as 0.5-0.6% of erythrocytes, which roughly corresponds to 0.7-0.8 g/L of cell-free hemoglobin in plasma, in fact causes a spurious alteration of all routine clotting tests and of the D-dimer.
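The correspondence above can be checked with rough arithmetic. Assuming a typical whole-blood hemoglobin of about 150 g/L (an illustrative value not stated in the slide) and ignoring the plasma-fraction correction, lysis of 0.5-0.6% of erythrocytes releases on the order of 0.75-0.9 g/L of hemoglobin, in line with the 0.7-0.8 g/L figure quoted.

```python
TOTAL_HB_G_PER_L = 150.0  # assumed typical whole-blood hemoglobin

def free_hb_after_lysis(lysed_fraction: float) -> float:
    """Back-of-envelope estimate of the hemoglobin (g/L) released when
    a given fraction of erythrocytes is lysed.

    Ignores the plasma-fraction correction, so treat the result as an
    order-of-magnitude estimate only.
    """
    return TOTAL_HB_G_PER_L * lysed_fraction

# Lysis of 0.5% and 0.6% of erythrocytes:
for fraction in (0.005, 0.006):
    print(round(free_hb_after_lysis(fraction), 2))  # 0.75 then 0.9
```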
The most logical implication of this evidence is that systematic assessment of plasma indices should be strongly encouraged in all clinical laboratories, to overcome the inherent limitations of visual inspection and safeguard the quality of testing. We have recently evaluated the analytical performance, along with the organizational impact, of systematic and automatic assessment of plasma indices in all samples on one widespread coagulation analyzer, and we have concluded that this approach is technically suitable and does not significantly affect turnaround time.
Thank you for joining me on this Pearl of Laboratory Medicine on “Pre-analytical issues specific to coagulation testing.” I am Giuseppe Lippi.