Data-sharing and re-analysis for main studies assessed by the EMA: our first registered report!
Transparency and reproducibility are expected to be normative practices in clinical trials used for decision-making on marketing authorisations for new medicines.
In October 2014, the European Medicines Agency (EMA) released Policy 0070 on the “publication of clinical data for medicinal products for human use”. The policy describes a two-step approach, the second step of which includes the publication of individual patient data (IPD). A date for implementing this step has yet to be fixed, and as a result of Brexit and the EMA’s relocation to the Netherlands, further development of the policy is on hold for the moment. Efforts are therefore still needed to reach full transparency at the EMA.
Umbrella groups of biopharmaceutical companies (e.g. the European Federation of Pharmaceutical Industries and Associations) endorsed a 2013 commitment “to [...] responsible sharing of clinical trial data”. Despite this commitment, an audit found that data were available for only 9/61 industry-sponsored clinical trials on medicines first published between July and December 2015 in the top 10 medical journals. If similarly low rates of data-sharing were observed for main trials (i.e. pivotal studies), it would undermine any effort towards reproducibility for these very important studies. Our registered report, published in BMC Medicine, therefore introduces a cross-sectional study assessing inferential reproducibility for main trials assessed by the EMA.
Two heroic researchers (Max and Jeanne) identified all studies on new medicines, biosimilars and orphan medicines approved by the European Commission between January 2017 and December 2019 that were categorised as ‘main studies’ in the European Public Assessment Reports (EPARs). Sixty-two of these studies were randomly sampled. One researcher retrieved the IPD for these studies and prepared a dossier for each study, containing the IPD, the protocol and information on the conduct of the study.
A second researcher, who had no access to the study reports, used the dossier to run an independent re-analysis of each trial. The results of each re-analysis were reported in terms of the study’s conclusions, p-values, effect sizes and any changes from the initial protocol. A team of two researchers not involved in the re-analyses compared their results with the published results of each trial.
Two hundred and ninety-two main studies in 173 EPARs were identified. Among the 62 randomly sampled studies, we received IPD for 10 trials. For the remaining 52 studies, the reasons for unavailability were heterogeneous. The most common was a restriction due to study status, i.e. ongoing extension studies (13/52; 25%). Other reasons included confidentiality (9/52; 17.3%) or a claimed “lack of scientific merit” (notably, as a registered report, our study had been independently peer-reviewed for scientific merit before the research was conducted). Of the 52 studies for which IPD was not shared, 40 (77%) belonged to companies that had a data-sharing policy. The median number of days between data request and data receipt was 253 (interquartile range 182–469).
For the ten trials with available data, we identified 23 distinct primary outcomes, and the conclusions were reproduced in all re-analyses. We found no selective reporting of the studies’ primary outcomes and no change from the original study protocol regarding the primary outcome in any of these ten studies. Overall, 10/62 trials (16% [95% confidence interval 8% to 28%]) were reproduced, as the 52 studies without available data were considered non-reproducible. Spin was observed in the report of one study, a trial of esketamine. Coincidentally, we have another ongoing registered report at BMC Medicine about this drug, and we received the data from YODA a few days ago. We will keep in touch about this project.
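For readers who want to check the headline proportion, the 16% figure and its confidence interval can be recomputed from the raw counts (10 of 62). Below is a minimal, dependency-free Python sketch assuming an exact (Clopper–Pearson) binomial interval; the report does not state which interval method was used, so the choice of method here is an assumption.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed from the exact mass function."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion,
    found by bisection on the binomial tail probabilities."""
    def solve(condition):
        lo, hi = 0.0, 1.0
        for _ in range(100):  # bisect until the bracket is negligibly small
            mid = (lo + hi) / 2
            if condition(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # Lower bound: largest p with P(X >= k | p) <= alpha/2
    lower = 0.0 if k == 0 else solve(lambda p: 1 - binom_cdf(k - 1, n, p) <= alpha / 2)
    # Upper bound: smallest p with P(X <= k | p) <= alpha/2
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(10, 62)
print(f"10/62 = {10/62:.0%}, 95% CI {lo:.0%} to {hi:.0%}")
```

The exact interval is used rather than a normal approximation because, with only 10 reproduced trials, a normal approximation would be unreliable this close to the boundary.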
Despite their results supporting decisions that affect the health of millions of people across the European Union, most main studies used in EPARs lack transparency, and their results cannot be reproduced by external researchers because the data are not available. Re-analyses of the few trials with available data showed very good inferential reproducibility.