Assessing Drug Safety Post Approval: Lessons from Vioxx, Avandia, and Meridia – Part 1

A panel discussion on May 14, 2011, at the American Heart Association Quality of Care and Outcomes Research in Cardiovascular Disease and Stroke conference addressed lessons from three drugs — rofecoxib (Vioxx), rosiglitazone (Avandia) and sibutramine (Meridia) — that were found post-approval to increase cardiovascular risk and were subsequently either withdrawn or severely restricted. The panelists were Steven Nissen of the Cleveland Clinic, Milton Packer of the University of Texas Southwestern Medical Center, Dean Follmann of the National Institute of Allergy and Infectious Diseases, NIH, and Ellis Unger of the Center for Drug Evaluation and Research, FDA. The moderator was Sanjay Kaul of Cedars-Sinai Medical Center.
 
In his introductory remarks, Kaul emphasized the asymmetry in the evidence base for assessing efficacy and safety in drug approval. Efficacy is assessed pre-approval by randomized controlled trials (RCTs) with prespecified, adjudicated endpoints. When a safety signal emerges in these efficacy trials, however, the adverse events are typically not prespecified endpoints, and the studies are rarely powered to determine risk reliably. Few RCTs are conducted to assess safety pre-approval. Safety is therefore assessed post-approval with meta-analyses, observational databases, the FDA’s Adverse Event Reporting System (AERS), or RCTs that involve limited exposure and/or narrow populations. As a result, the evidence for determining efficacy is usually far stronger than the evidence for determining safety.
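To make this asymmetry concrete, here is a rough power calculation (my own sketch, not from the panel; the event rates, trial sizes, and the function name are hypothetical). It illustrates why a trial sized to show efficacy has little chance of detecting even a doubling of a rare adverse-event rate:

```python
# A rough sketch of why efficacy-sized trials are underpowered for rare
# adverse events. All numbers are hypothetical round figures for illustration.
from math import sqrt
from scipy.stats import norm

def power_two_proportions(p_control, p_drug, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided z-test comparing two event rates."""
    z_a = norm.ppf(1 - alpha / 2)
    p_bar = (p_control + p_drug) / 2
    se_null = sqrt(2 * p_bar * (1 - p_bar) / n_per_arm)        # SE under no difference
    se_alt = sqrt(p_control * (1 - p_control) / n_per_arm
                  + p_drug * (1 - p_drug) / n_per_arm)          # SE under the true rates
    return norm.cdf((abs(p_drug - p_control) - z_a * se_null) / se_alt)

# Hypothetical example: a 0.4% event rate doubling to 0.8% on the drug.
# Power stays low until enrollment reaches many thousands of patients per arm.
for n in (1_000, 3_000, 10_000):
    print(n, round(power_two_proportions(0.004, 0.008, n), 2))   # ~0.21, ~0.52, ~0.96
```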
 
Steven Nissen outlined “critical lessons learned” from the rofecoxib and rosiglitazone experiences. First, post-approval studies and spontaneous AE reporting are ineffective at detecting an increased risk of common sources of morbidity and mortality, such as cardiovascular disease. Second, for the cardiovascular hazards of rofecoxib and rosiglitazone, strong signals suggesting harm appeared early but were missed or actively concealed. Third, dedicated post-approval safety studies take many years and are vulnerable to manipulation, mischief, and flaws in study design or conduct. Thus, the last line of defense against unsafe drugs is often the drug approval process, because “once the genie gets out of the bottle, it is very hard to put it back.”
 
In the case of rofecoxib, which was approved in 1999, Nissen said the safety signal emerged the following year when the VIGOR trial was published. Buried in the “General Safety” section was a sentence stating that “Myocardial infarctions were less common in the naproxen group than in the rofecoxib group (0.1 percent vs. 0.4 percent; 95% confidence interval for the difference, 0.1 to 0.6 percent; relative risk 0.2; 95% confidence interval 0.1 to 0.7)” (emphasis added). Nissen called this phrasing “diabolical” and noted that “no one saw this.” Moreover, the table and Kaplan-Meier curves for thrombotic events were omitted from the manuscript. After the table showing the number of events and the Kaplan-Meier curves were made available in connection with an FDA advisory committee meeting, Nissen and colleagues published the data in JAMA, “creating a furor and lots of slings and arrows, but it didn’t do a thing,” according to Nissen. Vioxx sales continued to grow, and ultimately a total of 105 million prescriptions were written, exposing 20 million Americans to the drug. In 2004, the APPROVe study was stopped by the Data Safety Monitoring Board when an excess of thrombotic events became evident, leading to the drug’s withdrawal. Nissen described how the APPROVe study, too, was published in a misleading way, making it appear that there was an 18-month delay before the excess risk became evident. Documents disclosed in litigation revealed a previously undisclosed intention-to-treat (ITT) analysis, which showed an early hazard with no 18-month delay. In Nissen’s view, these misleading trial publications demonstrate that the medical community simply cannot trust that industry-sponsored clinical trial data will be published in a way that is not misleading.
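For readers puzzled by those numbers, inverting the quoted comparison makes Nissen’s point plain: the same data, framed with rofecoxib rather than naproxen in the numerator, describe a roughly four- to five-fold excess of myocardial infarctions. A back-of-the-envelope check using the rounded percentages from the quote (my illustration, not part of the presentation):

```python
# The VIGOR sentence framed naproxen as the comparison group, yielding a
# relative risk of about 0.2. Flipping the ratio shows the same data as a
# roughly 4- to 5-fold excess of MIs on rofecoxib (the rounded 0.1%/0.4%
# percentages give 4; the published 0.2 inverts to 5).
mi_rate_rofecoxib = 0.4   # percent, from the quoted sentence
mi_rate_naproxen = 0.1    # percent, from the quoted sentence

rr_naproxen_vs_rofecoxib = mi_rate_naproxen / mi_rate_rofecoxib   # 0.25
rr_rofecoxib_vs_naproxen = mi_rate_rofecoxib / mi_rate_naproxen   # 4.0

print(rr_naproxen_vs_rofecoxib, rr_rofecoxib_vs_naproxen)
```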
 
Nissen conducted a similar analysis of the history of rosiglitazone, arguing that a safety signal was evident in the pre-approval trials, in which there was an excess of ischemic myocardial events for rosiglitazone, as well as a worrisome 18.6% increase in LDL. Nissen also described corporate misconduct, including the intimidation of a leading diabetes researcher, buried data, and meta-analyses conducted by GlaxoSmithKline (GSK) in 2005 and 2006, before the 2007 Nissen/Wolski meta-analysis. The GSK meta-analyses showed an increased risk of ischemic myocardial events and were shared with the FDA but not with physicians or patients. (For a detailed rosiglitazone chronology, see Nissen’s 2010 editorial, “The rise and fall of rosiglitazone,” as well as Nissen’s slides from the July 2010 FDA advisory committee meeting on rosiglitazone.) Of note, the Nissen/Wolski meta-analysis was only made possible because a settlement with the New York attorney general’s office had required GSK to make all its clinical trial data available. Nissen and Wolski found the data on a GSK website and published their meta-analysis. Without the disclosure required by the settlement with New York state, the meta-analysis would not have been possible, as 35 of the 42 clinical trials were unpublished. As for the RECORD trial, Nissen described it as a textbook example of “how not to perform a safety study.” The trial was completely unblinded to patients and physicians, and treatment codes were freely available to the contract research organization and GSK. In addition, the study leadership removed silent heart attacks (10 vs. 5, rosiglitazone vs. control) from the database after analyzing the data. Nonetheless, Nissen believes, based on the reanalysis of the RECORD data by the FDA’s Thomas Marciniak (see Marciniak’s slides), that RECORD didn’t show, as GSK argued, that rosiglitazone was safe; “it demonstrated that the drug was unsafe.” The lesson Nissen believes we should learn from Vioxx, Avandia and Meridia is that “you’ve got to stop these things at the approval process, and when early safety signals are seen, it requires aggressive regulatory action, at the very least demanding that well-conducted safety trials be done. In these three cases, that didn’t happen. Drugs stayed on the market too long and too many people were harmed.”
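The 2007 Nissen/Wolski analysis pooled adverse events across dozens of trials, most of them small, with only a handful of events each. As a rough illustration of how that kind of sparse-event pooling works, here is a minimal sketch using the Peto odds ratio, a common choice for rare events (my own illustration with made-up trial data, not the authors’ code or the rosiglitazone numbers):

```python
# A minimal sketch of pooling sparse adverse-event counts across trials with
# the Peto odds ratio. The trial data below are invented for illustration.
from math import exp, sqrt

def peto_pooled_or(trials):
    """trials: list of (events_drug, n_drug, events_control, n_control)."""
    sum_o_minus_e, sum_v = 0.0, 0.0
    for a, n_drug, c, n_ctrl in trials:
        n, events = n_drug + n_ctrl, a + c
        expected = events * n_drug / n                      # expected drug-arm events under H0
        variance = events * (n_drug / n) * (n_ctrl / n) * (n - events) / (n - 1)
        sum_o_minus_e += a - expected
        sum_v += variance
    log_or = sum_o_minus_e / sum_v
    se = 1 / sqrt(sum_v)
    ci = (exp(log_or - 1.96 * se), exp(log_or + 1.96 * se))
    return exp(log_or), ci

# Four hypothetical trials, each contributing only a few events.
example = [(3, 350, 1, 350), (2, 500, 1, 250), (4, 800, 2, 400), (1, 300, 0, 300)]
print(peto_pooled_or(example))   # pooled OR with a wide confidence interval
```

Even pooled, a handful of events per trial yields a wide confidence interval, which is part of why such meta-analyses were so hotly debated.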
 
Milton Packer gave a presentation that was a greatly condensed version of one he gave in February 2005 at the FDA advisory committee meeting on Cox-2s (slides here; transcript here). He emphasized the difficulty of interpreting observed differences in the frequency of events when the number of events is small. The problem is that an efficacy trial is sized for efficacy, not safety. When the number of events is small, the point estimates will be extremely imprecise and the confidence intervals will be wide. Even if the result is statistically significant and the effect is biologically plausible, it is often not possible to be certain the effect is real. Packer gave the example of a Vioxx meta-analysis conducted by Peter Juni and colleagues and published in The Lancet in December 2004, after the withdrawal of Vioxx. Based on a cumulative meta-analysis, the authors concluded that by the end of 2000 the relative risk was 2.30 (95% CI 1.22-4.33, p=0.01) and that Vioxx should have been withdrawn at that time. Packer pointed out that this analysis was based on only 52 events, which he believes was not enough to support reliable conclusions. As evidence, he cited small pilot trials whose results were not confirmed when larger, definitive trials were done. Packer does not disagree with the actions ultimately taken with respect to Vioxx, Avandia and Meridia; he simply questions whether it was possible to know what the risks were early on.
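Packer’s point about imprecision is easy to demonstrate with a small simulation (my own sketch, not Packer’s analysis; the event count of 52 is taken from the Juni example, everything else is assumed). It shows how widely the observed event ratio bounces around when it rests on only a few dozen events, even when there is no true excess risk at all:

```python
# A rough simulation of how unstable a risk-ratio estimate is when it is
# based on only a few dozen events split between two equal-sized arms.
import random

def simulate_event_ratio(true_rr=1.0, events_total=52, n_sims=10_000, seed=0):
    """Return the 2.5th, 50th and 97.5th percentiles of the observed
    drug-to-control event ratio across simulated trials."""
    random.seed(seed)
    p_drug = true_rr / (1 + true_rr)   # chance any given event falls in the drug arm
    ratios = []
    for _ in range(n_sims):
        drug_events = sum(random.random() < p_drug for _ in range(events_total))
        control_events = events_total - drug_events
        if control_events:
            ratios.append(drug_events / control_events)
    ratios.sort()
    k = len(ratios)
    return ratios[k // 40], ratios[k // 2], ratios[-(k // 40)]

# With no true excess risk (true_rr = 1), 52 events can easily produce
# observed ratios anywhere from roughly 0.6 to 1.7.
print(simulate_event_ratio(true_rr=1.0))
```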
 
In Part 2 of this post, I will discuss the presentations by Dean Follmann and Ellis Unger.
 

Posted on June 12, 2011, in cardiology, drug safety.
