Clinical-trial reporting is on the rise
When Malcolm MacLeod made it his personal mission to update the results of every clinical trial ever registered at the University of Edinburgh, UK, he didn’t realize that the effort would take years. Eager to improve on what he refers to as a “shockingly poor” reporting rate — only 54% of clinical trials run at the institution had ever reported results — he started chasing principal investigators, master’s students and trial collaborators.
“I’ve stalked people on Twitter, I’ve had them dig up old logins,” says MacLeod, a neurologist. “But often we weren’t able to fix it, and the records are stuck in aspic.”
Why put scientists through all this work to correct omissions in the first place? The reasons are manifold and depend on who you talk to. In the European Union, around 15% of clinical-trial results never get published or recorded. This can not only lead to research duplication, but also create a false picture of a drug’s potency and effectiveness, especially if negative results do not make it into the public sphere.
This has a knock-on effect for patients. In Berlin, where the Federal Joint Committee works to ensure the best possible public health care, getting clinical-trial results published is a growing priority for the group of agencies.
“Drugs can have bad side effects,” explains Susanne Teupen, team leader of the committee’s patient participation unit. “If trial results are not published, we will never know what happened. Was there a problem? Was a trial cancelled? Patients have a right to know, especially participants in trials.”
The World Health Organization defines the timely reporting of trial results as publication in an open-access repository within 24 months of trial completion. The European Union is stricter, introducing guidance in 2012 that every trial registered on the EU Clinical Trials Register (EUCTR) should publish results within 12 months of the study’s completion. The United States followed suit in 2016 with a ‘final rule’ — an addition to the 2007 Food and Drug Administration Amendments Act — that imposes a 12-month publication deadline for all trials, including international ones, registered on its tracking portal clinicaltrials.gov.
After nearly a decade of effort, the latest raw data suggest that reporting rates are improving. The website EU Trials Tracker, which records the reporting of trials, shows that 83.4% of all trials that were due to report at the time Nature went to press had published their results. The United States, which started to monitor and enforce clinical-trial-results reporting at around the same time as the EU, is catching up, too. According to the latest US Food and Drug Administration (FDA) data, 77.4% of trials registered on clinicaltrials.gov that are due to report have published their results.
Results on the rise
In 2016, when the EU clinical-trials regulation came into force, many European countries had abysmal reporting records — with most not keeping track at all. In Germany, only 39% of clinical trials run between 2009 and 2013 had published results in any kind of database1. Europe-wide data were not gathered until 2018, when a team of concerned academics led by data researcher Nick DeVito and Ben Goldacre, who studies evidence-based medicine, both at the University of Oxford, UK, began tracking clinical-trial results using data from the EUCTR. The numbers were concerning — only around half of all trials registered on the database had reported results in the time required, and just 11% of publicly funded and academic trials had met the one-year deadline2.
Since then, however, the picture has improved, says DeVito. His team now maintains the EU Trials Tracker and its US-based equivalent, trialstracker.net. A paper3 published by DeVito and his colleagues in January found that 53% of the trials due to have reported results by December 2020 had updated records on the EUCTR database. Results were also posted faster there than on other databases — within an average of 1,142 days of registration, compared with 3,321 days for the US database clinicaltrials.gov. However, around one-quarter of the total number of trials in the study had published results elsewhere and failed to update their entry in the database.
“In Europe, things have certainly improved quite substantially from 2018 until now,” DeVito says. “This could be due to a couple of reasons, mainly the UK getting its act together and advocacy work on the continent, especially in Germany. It comes down to institutions educating their principal investigators.”
The public has also been alerted to the problem, putting pressure on politicians and trial sponsors to ensure that results get out in the open. Teupen and Germany’s Federal Joint Committee advocate making even more information mandatory, with the hope of improving transparency on issues such as funding, failed drugs and trials involving non-pharmaceutical interventions, including cosmetic implants or talk therapy. Teupen says that the full publication of all trial information, regardless of outcome, is an ethical duty for scientists and funders alike. “Patients are getting wiser, and many will only participate in trials that will publicize all results,” she says. “There is a moral reason that should motivate all scientists: not knowing is always bad.”
Although the reporting data seem promising at first glance, they mask one large caveat. Compliance rates vary wildly between countries, especially when it comes to publicly funded trials. A 2022 report by nine transparency and patient-lobby groups, including Transparency International in Berlin, Cochrane Austria and UK-based TranspariMED, found that Italy, the Netherlands, Spain and France were responsible for two-thirds of the then nearly 5,500 clinical trials without results (see go.nature.com/4ex27p1). As of August, the number of EUCTR trials missing results is closer to 3,500.
Data gaps
Of the missing trials in the 2022 report, 1,299, or nearly one-quarter, were in Italy. That made the country the worst performer in absolute numbers when it came to reporting trial results: 76% of its completed trials on the EUCTR were still awaiting results publication in 2022.
This is in stark contrast to Germany, which has seen marked improvement in its results-reporting rates. In 2020, around 44% of German trials that were due to report results had done so; this increased to 66% as of July 2022, according to the 2022 report. Till Bruckner, a political scientist based in Sweden and the founder of TranspariMED, told Nature that highlighting poor reporting rates created media interest in Germany, which put enough pressure on national regulators to start e-mailing reminders to principal investigators and institutions, asking them to update their records.
“That made a huge difference,” he says. “The big German funders also strengthened their rules on reporting and are now starting to monitor whether people stick to the rules.” There was also pressure from the inside, Bruckner adds. “Medical researchers were saying we should act ethically; we owe it to patients to have all our results made public.”
“In Italy, by contrast, impact was limited,” he says. “The major clinical trials funder, the Italian ministry of health, has no policies whatsoever on registering clinical trials, let alone on reporting results.”
Getting any results into the public domain is still a distant goal for many countries, including Canada. Although trial registration with the Canadian Institutes of Health Research has grown significantly, from 35% in 2009 to 73% in 2019, the publication rate of results has not changed. Only around one-third of all trial results are put in the agency’s clinical trials database, according to one study4.
Kelly Cobey, a public-health scientist at the University of Ottawa in Canada and an author of the study, says that Canada’s efforts to monitor compliance are stymied by a lack of researcher education, an over-reliance on time-intensive manual input and tracking, and a lack of resources to tackle a backlog of historical trials that have not reported results.
“In my view, this approach to monitoring is not effective,” she says, adding that the improvements in registration shown in the study are probably down to journals increasingly requiring a trial registration number to publish results.
Institutional action
UK universities lead the public institutions that register trials in the EUCTR database, with several, including the University of Oxford, the University of Birmingham, the University of Dundee and Imperial College London, reporting results for 100% of their trials. The University of Edinburgh (together with local health-care provider NHS Lothian) had 66 trials registered on the EUCTR, with a reporting rate of 97.7%.
MacLeod has been at the heart of the institution’s efforts to reach a 100% reporting rate. “When I started, our compliance was at 54%, really shockingly poor,” he says, describing how his team spent months tracking down every trial on record that had not reported results. “Some of these trials had been so long ago that the principal investigator had left the institution. Sometimes, our research professionals had to go into the system on their behalf with their materials, and update the record.”
Another step the university took was to train principal investigators on the importance of trial registration and results publication, and ensure that any research group it sponsored had enough resources to register and update its clinical trial records. But most importantly, MacLeod says, the university tries to instil in its researchers the benefits of results publication.
“We need to use registries for the purpose they should serve: a record of the trial having been conducted,” he says. “There’s a whole load of good reasons, not just the record being tidy, but the missed opportunities, if it is not, for people with the condition in question to receive treatment and cure.”
Fines are another way of securing compliance. The FDA has the right to impose penalties of up to US$11,569 a day for the late publication of results. The EU has allowed member states to decide whether and how much to charge institutions — and seven states have threatened to impose fines of up to €250,000 ($274,000) a day on latecomers. The FDA has so far issued notices of non-compliance to five companies that failed to submit trial results on time. However, Nature has been unable to obtain any records showing whether the EU or the FDA has imposed a fine.
Bruckner thinks that fines will take time to implement, because the laws that empower agencies to collect them can be vague or unclear about who should enforce the rules. “A regulator should be able to just go out and issue fines like a parking ticket, and some regulators would love to do that,” he says. “But by law they are required to go through a lengthy process of warning letters, ultimatums and legal steps. It’s easier for funding agencies, who can simply withhold grants.”
“Fines are a can of worms no agency is interested in opening,” says DeVito. “But it would work. In the US context at least, if they fined even one person, you would see a lot of people shape up.”
However, DeVito adds, improving reporting rates is only the first step in a lengthy journey to better health outcomes.
“We are looking very much at a binary, whether trials report or not. But the second element is, are we getting all the information we need, how well has the study been conducted, has it been reported correctly?” he says. “While we are getting results out there, how useful these results are is another matter.”