Compliance with International Committee of Medical Journal Editors policy on individual participant data sharing in clinical trial registries: An audit.
Project description:

Importance: The benefits of responsible sharing of individual-participant data (IPD) from clinical studies are well recognized, but stakeholders often disagree on how to align those benefits with privacy risks, costs, and incentives for clinical trialists and sponsors. The International Committee of Medical Journal Editors (ICMJE) has required a data sharing statement (DSS) from submissions reporting clinical trials since July 1, 2018. The required DSSs provide a window into current data sharing rates, practices, and norms among trialists and sponsors.

Objective: To evaluate the implementation of the ICMJE DSS requirement in 3 leading medical journals: JAMA, Lancet, and New England Journal of Medicine (NEJM).

Design, setting, and participants: This cross-sectional study examined clinical trial reports published as articles in JAMA, Lancet, and NEJM between July 1, 2018, and April 4, 2020. Articles not eligible for a DSS, including observational studies and letters or correspondence, were excluded. A MEDLINE/PubMed search identified 487 eligible clinical trials in JAMA (112 trials), Lancet (147 trials), and NEJM (228 trials). Two reviewers evaluated each of the 487 articles independently.

Exposure: Publication of clinical trial reports in an ICMJE medical journal requiring a DSS.

Main outcomes and measures: The primary outcomes were declared data availability and actual data availability in repositories. Other captured outcomes were data type, access mechanism, and conditions and reasons for data availability or unavailability. Associations with funding sources were examined.

Results: A total of 334 of 487 articles (68.6%; 95% CI, 64%-73%) declared data sharing, with nonindustry, NIH-funded trials exhibiting the highest rate of declared data sharing (89%; 95% CI, 80%-98%) and industry-funded trials the lowest (61%; 95% CI, 54%-68%). However, only 2 IPD sets (0.6%; 95% CI, 0.0%-1.5%) were actually deidentified and publicly available as of April 10, 2020. The remaining data sets were reportedly accessible via request to the authors (143 of 334 articles [42.8%]), a repository (89 of 334 articles [26.6%]), or a company (78 of 334 articles [23.4%]). Among the 89 articles declaring that IPD would be stored in repositories, only 17 (19.1%) had deposited data, with the shortfall mostly attributed to embargoes and pending regulatory approval. An embargo was set in 47.3% of data-sharing articles (158 of 334), and in half of these the period exceeded 1 year or was unspecified.

Conclusions and relevance: Most trials published in JAMA, Lancet, and NEJM after implementation of the ICMJE policy declared an intent to make clinical data available. However, a wide gap exists between declared and actual data sharing. To improve transparency and data reuse, journals should promote the use of unique pointers to data set locations and standardized choices for embargo periods and access requirements.
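As a sanity check, the interval reported for the overall declared-sharing proportion (334 of 487 articles) can be reproduced with a normal-approximation (Wald) confidence interval; this is a minimal sketch assuming that method was used, and the function name is illustrative, not from the study:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

# 334 of 487 articles declared data sharing
p, lo, hi = wald_ci(334, 487)
print(f"{p:.1%} (95% CI, {lo:.0%}-{hi:.0%})")  # → 68.6% (95% CI, 64%-73%)
```

Rounded to whole percentages, this matches the 64%-73% interval given in the abstract.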
Project description:

Objective: To determine whether two specific criteria in the Uniform Requirements for Manuscripts (URM) created by the International Committee of Medical Journal Editors (ICMJE), namely inclusion of the trial registration ID within manuscripts and timely registration of trials, are being followed.

Materials and methods: Observational study using computerized analysis of publicly available Medline article data and clinical trial registry data. We analyzed a purposive set of five ICMJE founding journals, looking at all trial articles published in those journals during 2010-2011, together with data from the ClinicalTrials.gov (CTG) trial registry. We measured adherence to the trial ID inclusion policy as the percentage of trial journal articles that contained a valid trial ID within the article (journal-based sample). Adherence to timely registration was measured as the percentage of trials registered before enrollment of the first participant, allowing a 60-day grace period. We also examined timely registration rates by year for all phase II and higher interventional trials in CTG (registry-based sample).

Results: To determine trial ID inclusion, we analyzed 698 clinical trial articles in the five journals. A total of 95.8% (661/690) of trial journal articles included the trial ID. In 88.3%, the trial-article link is stored within a structured Medline field. To evaluate timely registration, we analyzed trials referenced by 451 articles from the selected five journals. A total of 60% (272/451) of articles reported trials registered in a timely manner, with an improving trend for trials initiated in later years (eg, 89% of trials that began in 2008 were registered in a timely manner). In the registry-based sample, timely registration rates ranged from 56% for trials registered in 2006 to 72% for trials registered in 2011.

Discussion: Adherence to URM requirements for registration and trial ID inclusion increases the utility of PubMed and links it in an important way to clinical trial repositories. This new integrated knowledge source can facilitate research prioritization, clinical guideline creation, and precision medicine.

Conclusions: The five selected journals adhere well to the policy of mandatory trial registration and also outperform the registry in adherence to timely registration. The ICMJE's URM policy represents a unique international mandate that may be providing a powerful incentive for sponsors and investigators to document clinical trials and trial result publications, and thus to fulfill important obligations to trial participants and society.
Project description: Authorship represents a critical element of scientific research. This study evaluated the perceptions, attitudes, and practices of Jordanian researchers toward the International Committee of Medical Journal Editors (ICMJE) authorship criteria. An anonymous questionnaire was distributed to health sciences faculty (n = 986), with 272 participants completing it. Only 27.2% reported awareness of the ICMJE guidelines, yet 76.8% agreed that all ICMJE criteria must be met for authorship, and 55.9% believed that the guidelines are easy to apply. Unethical authorship practices were reported by 16.5% to 31.3% of participants. A majority (73%) agreed that violation of authorship criteria constitutes scientific misconduct. Well-defined criteria for authorship need to be disseminated and emphasized in less developed countries through training, to avoid authorship disputes and unethical conduct.
Project description:

Background: Registration of clinical trials is critical for promoting transparency and integrity in medical research; however, trials must be registered prospectively to deter unaccounted-for protocol modifications or selection of alternate outcomes that may enhance the favorability of reported findings. We assessed adherence to the International Committee of Medical Journal Editors' (ICMJE) prospective registration policy and identified the frequency of registrations occurring after potential observation of primary outcome data among trials published in the highest-impact journals associated with US professional medical societies. Additionally, we examined whether trials that were unregistered or registered after potential observation of primary outcome data were more likely to report favorable findings.

Methods: We conducted a retrospective, cross-sectional analysis of the 50 most recently published clinical trials that reported primary results in each of the ten highest-impact US medical specialty society journals between 1 January 2010 and 31 December 2015. We used descriptive statistics to characterize the proportions of trials that were registered; registered retrospectively; registered retrospectively potentially after initial ascertainment of primary outcomes; and reporting favorable findings, overall and stratified by journal and trial characteristics. Chi-squared analyses were performed to assess differences in registration by journal and trial characteristics.

Results: We reviewed 6869 original research reports published between 1 January 2010 and 31 December 2015 to identify a total of 486 trials across 472 publications. Of these 486 trials, 47 (10%) were unregistered. Among the 439 registered trials, 340 (77%) were registered prospectively and 99 (23%) retrospectively. Sixty-seven (68%) of these 99 retrospectively registered trials, or 15% of all 439 registered trials, were registered after potential observation of primary outcome data ascertained among participants enrolled at inception. Industry-funded trials, those with enrollment sites in the US, and those assessing FDA-regulated interventions each had lower rates of retrospective registration. Unregistered trials were more likely than registered trials to report favorable findings (89% vs. 64%; relative risk (RR) = 1.38, 95% confidence interval (CI) = 1.20-1.58; p = 0.004), irrespective of registration timing.

Conclusions: Adherence to the ICMJE's prospective registration policy remains substandard, even in the highest-impact journals associated with US professional medical societies. These journals frequently published unregistered trials and trials registered after potential observation of primary outcome data.
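The reported effect size can be approximately reconstructed from the rounded proportions in the abstract; the exact RR of 1.38 was presumably computed from the underlying counts, which the abstract does not give. A minimal sketch, with an illustrative function name:

```python
def relative_risk(p_group1, p_group2):
    """Relative risk: ratio of the event proportion in one group to another."""
    return p_group1 / p_group2

# Favorable findings: 89% of unregistered vs. 64% of registered trials
rr = relative_risk(0.89, 0.64)
print(round(rr, 2))  # → 1.39, close to the reported 1.38 from exact counts
```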
Project description: Although selective reporting of clinical trial results introduces bias into evidence-based clinical decision making, the extent of publication bias in pediatric epilepsy is currently unknown. Because there is considerable ambiguity in the treatment of an important and common clinical problem, pediatric seizures, we assessed the public availability of results of phase 3 clinical trials evaluating treatments for seizures in children and adolescents as a surrogate for the extent of publication bias in pediatric epilepsy. We determined the proportion of published and unpublished results of phase 3 clinical trials registered as completed on ClinicalTrials.gov. We searched ClinicalTrials.gov, PubMed, and Google Scholar for publications and contacted principal investigators or sponsors. The analysis was performed according to STROBE criteria. Considering studies completed before 2014 (N = 99), 75 (76%) pediatric phase 3 clinical trials were published but 24 (24%) remained unpublished. The unpublished studies concealed evidence from 4,437 patients. Mean time to publication was 25 ± 15.6 (SD) months, more than twice as long as mandated. Ten years after the ICMJE's clinical trials registration initiative, there is still a considerable amount of selective reporting and delay of publication that potentially distorts the body of evidence in the treatment of pediatric seizures.
Project description:

Background: The aim of the study was to develop quality indicators that can be used for quality assessment of registries of occupational diseases in relation to preventive policy at a national level. The research questions were: 1. Which indicators determine the quality of national registries of occupational diseases with respect to their ability to provide appropriate information for preventive policy? 2. What are the criteria that can distinguish low quality from high quality?

Methods: First, we performed a literature search to assess which output of registries can be considered appropriate for preventive policy and to develop a set of preliminary indicators and criteria. Second, final indicators and criteria were assessed and their content validity was tested in a Delphi study, to which experts from the 25 EU Member States were invited.

Results: The literature search revealed two different types of information output to be appropriate for preventive policy: monitor and alert information. For the evaluation of the quality of the monitor and alert functions we developed ten indicators and criteria. Sixteen of the twenty-five experts responded in the first round of the Delphi study, and eleven in the second round. Based on their comments, we assessed the final nine indicators: completeness of the notification form, coverage of registration, guidelines or criteria for notification, education and training of reporting physicians, completeness of registration, statistical methods used, investigation of special cases, presentation of monitor information, and presentation of alert information. Except for the indicator "coverage of registration" for the alert function, all indicators met the preset requirements of content validity.

Conclusion: We have developed quality indicators and criteria to evaluate registries of occupational diseases on their ability to provide appropriate information for preventive policy at a national level. Together, these indicators form a tool that can be used for quality improvement of registries of occupational diseases.
Project description:

Background: The work of journal editors is essential to producing high-quality literature, and editing can be a very rewarding career; however, the profession may not be immune to the gender pay gaps found in many professions and industries, including academia and clinical medicine. Our study aimed to quantify remuneration for journal editors at core clinical journals, determine whether a gender pay gap exists, and assess whether remuneration differs across publishing models and journal characteristics.

Methods: We conducted an online survey of journal editors with substantial editing roles, including section editors and editors-in-chief, identified from the Abridged Index Medicus "Core Clinical" journals in MEDLINE. We analyzed information on demographics, editing income, and journal characteristics using a multivariable partial proportional odds model for ordinal logistic regression.

Results: There were 166 survey respondents (response rate of 9%), representing editors from 69 of 111 journals (62%). A total of 140 fully completed surveys were analyzed (95 males and 45 females); 50 editors (36%) did not receive remuneration for editorial work. No gender pay gap, and no difference in remuneration between editors at subscription-based vs. open access journals, was detected. Editors who were not primarily health care providers were more likely to have higher editing incomes (adjusted odds ratio [OR] 2.96, 95% confidence interval [CI] 1.18-7.46). Editors who spent more than 10 hours per week editing earned more than those who spent 10 hours or less (adjusted OR 16.7, 95% CI 7.02-39.76).

Conclusions: We were unable to detect a gender pay gap or a difference in remuneration between editors at subscription-based and open access journals. More than one third of the editors surveyed from core clinical journals were not remunerated for their editing work.