November 25, 2013 – Combining genetic data with clinical information to determine the initial dosage of the blood thinner warfarin, which is used to prevent blood clots in the circulatory system, was no more effective in achieving stable anticoagulation than using clinical information alone, according to a National Institutes of Health-funded clinical trial. In addition, the study found that among African-Americans, anticoagulation control was poorer with the genetics-based approach than with the clinically based method.
The results of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial, supported by the NIH's National Heart, Lung, and Blood Institute (NHLBI), were presented today at the American Heart Association (AHA) Scientific Sessions in Dallas. The study was published simultaneously in the New England Journal of Medicine.
“The use of genetic data holds great promise for predicting disease risk or determining optimal therapies, but it must be put to the test through clinical trials like this one to determine how to best use that information,” said Gary H. Gibbons, M.D., director of the NHLBI. “This is especially true for complex drugs like warfarin, whose action in our bodies is influenced by a variety of genetic, clinical, and environmental factors.”
Warfarin is the most commonly prescribed drug for preventing blood clots in conditions such as atrial fibrillation, deep vein thrombosis, and pulmonary embolism. Though warfarin is an effective therapy for many people with cardiovascular problems, the drug poses risks if improperly dosed. If the dose is too high, warfarin can increase the risk of bleeding; if too low, it can increase the risk of blood clots. Proper dosing is complicated because the drug interacts with many other common medications as well as some foods. When determining an initial dose, doctors often start with a standard dose and adjust it based on certain clinical indicators, including age, body size, smoking status, and use of certain medications. During the initial weeks of therapy, warfarin activity is monitored closely through blood tests, and the dose is adjusted as needed.
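The press release does not publish the clinical algorithm COAG used, but a clinically based starting-dose calculation of the kind described above might look like the following sketch. The baseline dose and every adjustment factor here are illustrative placeholders, not the trial's formula.

```python
def clinical_starting_dose_mg(age_years: float, weight_kg: float,
                              smoker: bool, on_amiodarone: bool) -> float:
    """Hypothetical clinically based warfarin starting dose (mg/day).

    Starts from a standard daily dose and applies illustrative
    adjustments for the clinical factors named in the text: age,
    body size, smoking status, and interacting medications. The
    multipliers are placeholders, not the COAG clinical formula.
    """
    dose = 5.0  # a common standard starting dose, in mg/day
    if age_years >= 70:
        dose *= 0.8   # older patients often need less warfarin
    if weight_kg >= 100:
        dose *= 1.15  # larger body size may call for more
    if smoker:
        dose *= 1.1   # smoking can increase warfarin clearance
    if on_amiodarone:
        dose *= 0.7   # amiodarone slows warfarin metabolism
    return round(dose, 1)

# Example: an older patient on amiodarone starts well below 5 mg/day.
print(clinical_starting_dose_mg(75, 80, smoker=False, on_amiodarone=True))
```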
Recent research has suggested that variants of two genes, CYP2C9 and VKORC1, may be important in selecting the warfarin dose needed by individual patients. Based on these studies, dosing formulas have been developed that combine a person's genetic profile with his or her clinical characteristics to better predict the proper dose of warfarin, an approach known as pharmacogenetics.
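A pharmacogenetic formula extends a clinical calculation like the sketch above with genotype terms. Published algorithms, such as the one from the International Warfarin Pharmacogenetics Consortium, typically fit a linear model to the square root of the weekly dose; the sketch below mimics that structure, but its coefficients are placeholders rather than any published values.

```python
# Illustrative genotype terms, in sqrt(weekly-dose) units. Real
# algorithms fit these coefficients to patient data; the numbers
# below are placeholders chosen only to show the structure.
VKORC1_TERMS = {"GG": 0.0, "AG": -0.9, "AA": -1.7}   # -1639 G>A genotype
CYP2C9_TERMS = {"*1/*1": 0.0, "*1/*2": -0.5, "*1/*3": -1.0,
                "*2/*2": -1.1, "*2/*3": -1.6, "*3/*3": -2.2}

def pharmacogenetic_weekly_dose_mg(age_years, height_cm, weight_kg,
                                   vkorc1, cyp2c9):
    """Hypothetical IWPC-style linear model on sqrt(weekly dose)."""
    sqrt_dose = (5.6                        # intercept (placeholder)
                 - 0.26 * (age_years / 10)  # dose falls with age
                 + 0.009 * height_cm        # and rises with body size
                 + 0.013 * weight_kg
                 + VKORC1_TERMS[vkorc1]     # variant carriers need less
                 + CYP2C9_TERMS[cyp2c9])
    return round(sqrt_dose ** 2, 1)

# A patient carrying both sensitizing variants gets a much lower dose:
print(pharmacogenetic_weekly_dose_mg(60, 170, 75, "AA", "*1/*3"))  # ~14.8
print(pharmacogenetic_weekly_dose_mg(60, 170, 75, "GG", "*1/*1"))  # ~42.8
```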
However, the evidence supporting pharmacogenetic dosing of warfarin has not been definitive; small clinical studies and some observational data have produced conflicting results. In addition, the accuracy of these dosing formulas has been found to differ across patient groups. In particular, the formulas tend to be somewhat less accurate in African-Americans.
“Given the lack of definitive information on whether or not pharmacogenetics can improve the care of patients and the need to study a broad range of patients being treated with warfarin, we needed a large clinical trial like COAG to help resolve this important question,” said Stephen Kimmel, M.D., of the Perelman School of Medicine at the University of Pennsylvania and principal investigator of the COAG trial.
COAG enrolled 1,015 patients beginning warfarin therapy and randomly assigned them to one of two dosing strategies. During the first five days of therapy, participants had their dosages determined and adjusted by either a clinical formula or a pharmacogenetic formula. Participants were then monitored for 23 additional days, with dosage changes made using a standard approach. Both participants and their treating physicians were blinded to the dosing strategy and the dose of warfarin.
Study investigators compared how much time patients spent within the therapeutic anticoagulation range during the 28-day monitoring period. Among all patients, the clinical and pharmacogenetic groups were virtually identical, at 45.4 percent and 45.2 percent of time in the therapeutic range, respectively.
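The release does not say how time in therapeutic range was calculated. A standard approach in anticoagulation studies is the Rosendaal linear-interpolation method, sketched below under the assumption of a typical INR target range of 2.0 to 3.0; the measurement days and INR values in the example are made up.

```python
def frac_in_range(inr0, inr1, low=2.0, high=3.0):
    """Fraction of one interval spent in [low, high], assuming the
    INR moves linearly from inr0 to inr1 over that interval."""
    lo, hi = min(inr0, inr1), max(inr0, inr1)
    if hi == lo:  # flat segment: either fully in range or fully out
        return 1.0 if low <= lo <= high else 0.0
    overlap = max(0.0, min(hi, high) - max(lo, low))
    return overlap / (hi - lo)

def time_in_therapeutic_range(days, inrs, low=2.0, high=3.0):
    """Percent of time in range, Rosendaal-style linear
    interpolation between consecutive INR measurements."""
    pairs = zip(zip(days, inrs), zip(days[1:], inrs[1:]))
    in_range = sum((d1 - d0) * frac_in_range(i0, i1, low, high)
                   for (d0, i0), (d1, i1) in pairs)
    return 100.0 * in_range / (days[-1] - days[0])

# Hypothetical INR measurements on days 0, 7, 14, 21, and 28:
print(time_in_therapeutic_range([0, 7, 14, 21, 28],
                                [1.5, 2.4, 3.4, 2.8, 2.5]))  # ~59.4
```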
Among the 255 African-American participants, the pharmacogenetic formula yielded only 35.2 percent of time in the therapeutic range, compared with 43.5 percent for the clinical formula. African-Americans in the pharmacogenetic group also generally took longer to reach a stable dose: 70 percent reached their therapeutic range by day 14, compared with 87 percent in the clinical group. However, African-Americans in the pharmacogenetic group did not experience more health problems such as bleeding or clotting. There were also no differences in adverse events between the two dosing groups as a whole, and the total number of adverse events was low.
“These findings highlight the importance of developing and evaluating pharmacogenetic testing in patients from diverse racial and ethnic backgrounds,” Gibbons said. “We are optimistic about the prospects of personalized, precision medicine, but we must make sure that we put these approaches through the same type of rigorous testing as any other prognostic test or clinical treatment strategy.”
The COAG study was supported by NHLBI contract HHSN268200800003C and carried out at 18 hospitals and medical centers across the country. Yves Rosenberg, M.D., M.P.H., was the NHLBI project officer for COAG and also served on the executive and steering committees.
For more information: www.nhlbi.nih.gov