Pharma: breaking the law in broad daylight?
Ben Goldacre has rather famously described the clinical trial reporting requirements in the Food and Drug Administration Amendments Act of 2007 (FDAAA) as a “fake fix” that was being thoroughly “ignored” by the pharmaceutical industry.
Although the FDA, which has access to more data, says that only a tiny fraction of studies are potentially noncompliant, Goldacre's frequently repeated claim that the law is being ignored seems to have caught on in the general run of journalistic and academic discussions of FDAAA.
And now there appears to be additional support for the idea that a large percentage of studies are noncompliant with FDAAA results reporting requirements, in the form of a new study in the Journal of Clinical Oncology: "Public Availability of Results of Trials Assessing Cancer Drugs in the United States" by Thi-Anh-Hoa Nguyen et al. In it, the authors report even lower levels of FDAAA compliance – a mere 20% of randomized clinical trials met the requirement of posting results on ClinicalTrials.gov within one year.
Unsurprisingly, the JCO results were immediately picked up and circulated uncritically by the usual suspects.
I have to admit not knowing much about pure academic and cooperative group trial operations, but I do know a lot about industry-run trials – simply put, I find the data as presented in the JCO study impossible to believe. Everyone I work with in pharma trials is painfully aware of the regulatory environment they work in. FDAAA compliance is a given, a no-brainer: large internal legal and compliance teams are everywhere, ensuring that the letter of the law is followed in clinical trial conduct. If anything, pharma sponsors are twitchily over-compliant with these kinds of regulations (for example, most still adhere to 100% verification of source documentation – sending monitors to physically examine every single record of every single enrolled patient – even after the FDA explicitly told them they didn't have to).
I realize that’s anecdotal evidence, but when compliant behavior is so pervasive, it’s difficult to buy into data that says it isn’t happening at all. The idea that all pharmaceutical companies are ignoring a highly visible law that’s been on the books for six years is extraordinary. Are they really so brazenly breaking the rules? And is the FDA abetting them by disseminating incorrect information?
Those are extraordinary claims, and they would seem to require extraordinary evidence. The earlier BMJ study had clear limitations that left its implications entirely unclear. Is the JCO article any better?
Some Issues
In fact, there appear to be at least two major issues that may have seriously compromised the JCO findings:
1. Studies that were certified as being eligible for delayed reporting requirements, but do not have their certification date listed.
The study authors make what I believe to be a completely unwarranted assumption:
In trials for approval of new drugs or approval for a new indication, a certification [permitting delayed results reporting] should be posted within 1 year and should be publicly available.
It’s unclear to me why the authors think the certifications “should be” publicly available. In re-reading FDAAA section 801, I don’t see any reference to that being a requirement. I suppose I could have missed it, but the authors provide a citation to a page that clearly does not list any such requirement.
But their methodology assumes that all trials that have a certification will have it posted:
If no results were posted at ClinicalTrials.gov, we determined whether the responsible party submitted a certification. In this case, we recorded the date of submission of the certification to ClinicalTrials.gov.
If a sponsor gets approval from FDA to delay reporting (as is routine for all drugs that are either not approved for any indication, or being studied for a new indication – i.e., the overwhelming majority of pharma drug trials), but doesn't post that approval on the registry, the JCO authors deem that trial “noncompliant”. This is not warranted: the company may have simply chosen not to post the certification despite being entirely FDAAA compliant.
2. Studies that were previously certified for delayed reporting and subsequently reported results
It is hard to tell how the authors treated this rather substantial category of trials. If a trial was certified for delayed results reporting but then subsequently posted results, the certification date becomes difficult to find. Indeed, it appears that in cases where results were posted, the authors simply looked at the time from study completion to results posting. In effect, this would re-classify almost every single one of these trials from compliant to noncompliant. Consider this example trial:
- Phase 3 trial completes January 2010
- Certification of delayed results obtained December 2010 (compliant)
- FDA approval June 2013
- Results posted July 2013 (compliant)
In looking at the JCO paper's methods section, it really appears that this trial would be classified as reporting results 3.5 years after completion, and therefore be considered noncompliant with FDAAA. In fact, this trial is entirely kosher, and would be extremely typical for many phase 2 and 3 trials in industry.
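To make the contrast concrete, here is a minimal sketch (my own illustration, not code from the JCO paper) of how the two readings of the reporting rule classify the example trial above. The exact dates, the function names, and the simple 365-day window are all assumptions for illustration only.

```python
from datetime import date

# Illustrative dates matching the example timeline above
completion = date(2010, 1, 31)        # phase 3 trial completes
certification = date(2010, 12, 15)    # certification of delayed reporting submitted
results_posted = date(2013, 7, 15)    # results posted after FDA approval

ONE_YEAR = 365  # days; simplified stand-in for FDAAA's 12-month window

def naive_compliant(completion, results_posted):
    """Apparent JCO reading: compliant only if results appear
    within one year of study completion."""
    return (results_posted - completion).days <= ONE_YEAR

def certification_aware_compliant(completion, certification, results_posted):
    """Alternative reading: a certification submitted within one year
    lawfully defers the results deadline, so the trial stays compliant."""
    if certification is not None and (certification - completion).days <= ONE_YEAR:
        return True
    return (results_posted - completion).days <= ONE_YEAR

print(naive_compliant(completion, results_posted))                               # False -> "noncompliant"
print(certification_aware_compliant(completion, certification, results_posted))  # True  -> compliant
```

Under the first reading this trial counts against the sponsor; under the second it is fully compliant. That difference is the entire disagreement at issue.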
Time for Some Data Transparency
The above two concerns may, in fact, be non-issues. They certainly appear to be implied by the JCO paper's wording, but the methods section isn't terribly detailed and could easily be giving me the wrong impression.
However, if either or both of these issues are real, they may affect the vast majority of "noncompliant" trials in this study. Given that most clinical trials are studying either new drugs or new indications for already-approved drugs, these two issues may entirely explain the gap between the JCO study and the unequivocal FDA statements that contradict it.
I hope that, given the importance of transparency in research, the authors will be willing to post their data set publicly so that others can review their assumptions and independently verify their conclusions. It would be more than a bit ironic otherwise.
[Image credit: "Shameless lawlessness" via Flickr user willytronics.]
Comments:
There is a clear need for routine and fully open public audit of compliance with all transparency legislation. One way to achieve this would be for the FDA to post all correspondence relating to applications for delay in results reporting onto ClinicalTrials.gov, immediately after it is received. It would also be appropriate for ClinicalTrials.gov to have a data field denoting whether a trial is required to be compliant with FDAAA 2007, since this is clearly a matter that is contested by individual trialists. I understand that industry has lobbied heavily against this measure. Meanwhile, we do at least have access to these audits published in the academic literature. They are the best currently available evidence.
I am surprised that you make no comment on the key issue: doctors and patients need all the results of all the trials that have been conducted on a treatment, to make informed decisions about which is best. That's not an unreasonable request, and it is not currently met. This is a serious failing that undermines our best efforts to practice evidence based medicine, and it has been well-documented - with solutions proposed but never adequately implemented - for at least three decades.
I hope you and your readers will join the many other ethical professionals in industry and sign up to the AllTrials campaign, calling for all trials to be registered and all results reported, at www.alltrials.net
Thank you for your comments.
For the record, I signed the AllTrials petition back in early February - just before GSK announced their signing on. You could look it up: my name is not terribly common, and the list is not terribly long.
Even so, I'm not sure why you're "surprised" that I "make no comment" on my support for general principles of transparency in this post. I was reviewing a much-publicized study that I felt contained, potentially, severe methodological flaws that might entirely compromise its findings. I did not pause to re-pledge my support for AllTrials, just as I would not expect my physician to preface a review of my medical tests with a solemn recitation of the Hippocratic Oath.
It's patently false that these two academic studies of FDAAA compliance represent "the best currently available evidence". We have the announced results of the FDA's internal review. While those results were not published in the BMJ, they do have the benefit of being the only evidence based upon direct inspection of the actual data. The BMJ and JCO studies used unreliable proxy measures, producing (predictably enough) unreliable results. I would encourage you not to cite either of those papers with the same unequivocal certitude as you have recently.
If you will indulge me, I have a question for you:
At the end of my post, I call on the JCO authors to publicly post their data. It would seem to me that if their methods are sound, then their data would allow us all to clearly see which companies are out of FDAAA compliance. I couldn't help but notice that you ignored that part. Will you join me in urging the authors to make their entire dataset public?
Paul, you make a good case that the pharma companies are not "brazenly" ignoring the FDAAA requirements. But that is kind of arguing the fine print. The fact remains that, overwhelmingly, clinical trials listed on ClinicalTrials.gov do NOT contain any results. Sure, some trials are eventually published in the literature, but the average Joe or Jane patient definitely does not feel like they have access to much of the clinical data that has been collected.
Andrew,
Thanks for your comments. I think you probably have a good point that more results can and should be posted. In fairness, I think this would require an expansion of FDAAA requirements: that's an issue worth more substantive discussion, and I will try to cover it in a post very soon.
However, I do not think that this is "arguing the fine print". This is the second study essentially claiming that companies are routinely breaking the law, and that claim is being picked up as a talking point by those who make a living attacking the research world (see multiple links, above). That's why I chose to review this study's methods: we've had prior experience of an even weaker study being stripped of all qualifications and put forward as clear proof of pharma wrongdoing.
Hi Paul,
In the interest of following up, I contacted the NIH for clarification on the posting requirements cited in the study. The NIH reply appears in the form of my own comment here...
http://www.pharmalive.com/fighting-cancer-may-be-harder-as-many-drug-trials-are-undisclosedhed#comment-110884
Hope this helps,
Ed Silverman
your usual suspect at Pharmalot
Hi Ed,
Thanks for the update. I'm going to try one more time with the authors, and if that fails will produce and post an extract from ClinicalTrials.gov in order to (try to) replicate their methods.
The conversations I've had in the past weeks with pharma clients have further solidified my opinion that there is a high rate of compliance with FDAAA reporting requirements. There seems to be no way to reconcile the issues apart from public review of the actual data.
Paul