Data Dump: U.S. Trials Often Fail to Report Results

‘Poor performance and noticeable variation’ in dissemination of clinical trial findings

by Sarah Wickline Wallan | Staff Writer, MedPage Today

Most clinical trials conducted by researchers at U.S. academic medical centers, completed between 2007 and 2010, did not report or publish results within 2 years of completion, researchers reported.

Out of 4,347 interventional trials led by investigators at 51 trial-experienced U.S. academic medical centers, only 36% reported or published results within 24 months of study completion (range 16% to 55% across individual centers), and 34% never reported or published results at all in the years that followed, according to Harlan M. Krumholz, MD, of Yale School of Medicine, and colleagues, in The BMJ.

“We were aware that this was a big problem; we’ve published studies [showing] that researchers are slow to publish research on human experiments,” Krumholz said in a phone interview with MedPage Today. “We were dismayed that the response to this has been slow in the academic community, and we thought this ‘report card’ would be useful.”

Krumholz said his team thought they would find the institutions that were doing well and then disseminate identifiable best practices.

“But we were surprised to find that no one is doing well,” he added. “The fact that it’s so pervasive suggests it’s not about bad individuals, it’s about a culture that allows for reporting to be discretionary rather than mandatory.”

“This is a human subjects’ violation,” Krumholz explained. “People have agreed to be part of our studies and we routinely don’t report [results]. All studies should be completed and reported, but these in particular, are human studies. These aren’t studies that have fallen off the tracks. These are studies that were successfully completed. This should alarm everyone.”

Krumholz’s group looked at trials run by a lead investigator who was affiliated with an academic medical center that had 40 or more completed interventional trials registered through ClinicalTrials.gov.

A total of 4,347 interventional clinical trials originating from 51 academic medical centers that were set to complete from 2007 to 2010 were identified. Exactly half were phase II through IV, 23% had more than 100 patients enrolled, and 28% were double-blinded.

Trial results were ultimately published or reported for only 66% of the trials, and only 35.9% of the trials reported results within 24 months of the trial’s completion.

Publication of trial results in a peer-reviewed journal within 24 months of completion ranged from 11% to 40% across individual centers (29% overall). Reporting of results on ClinicalTrials.gov within 24 months of trial end ranged from 2% to 41% (13% overall).

Rates of publishing or reporting results at all ranged from 46% to 77% across institutions. But no academic center published more than 40% of its completed trials’ results, or reported more than 41% to ClinicalTrials.gov.

The endpoint classifications for 76% of the trials were safety and/or efficacy, and 34% of the trials involved cancer and other neoplasms.

Study limitations included excluding from the time-to-report analysis any trials that published or reported results before their primary completion date, in case those dates reflected data entry errors. However, these trials were included in the overall calculation.

According to Krumholz, the impact of this lack of data reporting means only a biased slice of research information has been influencing medicine and future research.

“We talked to a lot of people about it and we failed to find any single problem,” Krumholz said. “Some people [weren’t] excited about the results, others got busy or distracted, many were small and maybe they use the [data] to inform their next research project.”

Krumholz told MedPage Today that some researchers stated “Maybe the study was so bad we shouldn’t report the results,” implying that they avoided sharing or publishing results because of potential embarrassment.

But he stressed that if participants consented, and experiments were conducted, then the data “need to see the light of day… if it’s good enough to be consented, it’s good enough to share the results.”

Krumholz disclosed relevant relationships with Janssen, Medtronic, the FDA, and UnitedHealth. Some co-authors disclosed relevant relationships with Janssen, Medtronic, and the FDA.