Vazquez Enrique, Ross Joseph S, Gross Cary P, Childers Karla, Bamford Stephen, Ritchie Jessica D, Waldstreicher Joanne, Krumholz Harlan M, Wallach Joshua D
Yale University, New Haven, CT, USA.
Section of General Internal Medicine, Yale School of Medicine, New Haven, CT, USA.
Clin Trials. 2025 Jun;22(3):279-288. doi: 10.1177/17407745241304355. Epub 2024 Dec 29.
Background/Aims
The reuse of clinical trial data available through data-sharing platforms has grown over the past decade. Several prominent clinical data-sharing platforms require researchers to submit formal research proposals before granting data access, providing an opportunity to evaluate how published analyses compare with the initially proposed aims. We evaluated the concordance between researchers' clinical trial data use request proposals to four clinical data-sharing platforms and their corresponding publications with respect to the trials included, study objectives, endpoints, and statistical methods.

Methods
We identified all unique data request proposals with at least one corresponding peer-reviewed publication as of 31 March 2023 on four prominent clinical trial data-sharing platforms (Vivli, ClinicalStudyDataRequest.com, the Yale Open Data Access Project, and Supporting Open Access to Researchers-Bristol Myers Squibb). When a data request had multiple publications, we treated each publication-request pair as a unit. For each pair, the trials requested and analyzed were classified as fully concordant, discordant, or unclear, whereas the study objectives, primary and secondary endpoints, and statistical methods were classified as fully concordant, partially concordant, discordant, or unclear. For Vivli, ClinicalStudyDataRequest.com, and Supporting Open Access to Researchers-Bristol Myers Squibb, endpoints of publication-request pairs were not compared because the data request proposals on these platforms do not consistently report this information.

Results
Of 117 Vivli publication-request pairs, 76 (65.0%) were fully concordant for the trials requested and analyzed, 61 (52.1%) for study objectives, and 57 (48.7%) for statistical methods; 35 (29.9%) pairs were fully concordant across the three characteristics reported by all platforms. Of 106 ClinicalStudyDataRequest.com publication-request pairs, 66 (62.3%) were fully concordant for the trials requested and analyzed, 41 (38.7%) for study objectives, and 35 (33.0%) for statistical methods; 20 (18.9%) pairs were fully concordant across the three characteristics. Of 65 Yale Open Data Access Project publication-request pairs, 35 (53.8%) were fully concordant for the trials requested and analyzed, 44 (67.7%) for primary study objectives, and 25 (38.5%) for statistical methods; 15 (23.1%) pairs were fully concordant across the three characteristics. In addition, 26 (40.0%) and 2 (3.1%) Yale Open Data Access Project publication-request pairs were concordant for primary and secondary endpoints, respectively, such that only one (1.5%) Yale Open Data Access Project publication-request pair was fully concordant across all five characteristics reported. Of three Supporting Open Access to Researchers-Bristol Myers Squibb publication-request pairs, one (33.3%) was fully concordant for the trials requested and analyzed, two (66.6%) for primary study objectives, and two (66.6%) for statistical methods; one (33.3%) pair was fully concordant across the three characteristics reported by all platforms.

Conclusion
Across four clinical data-sharing platforms, data request proposals were often discordant with their corresponding publications: only 25% of publication-request pairs were fully concordant across the three key proposal characteristics reported by all platforms. Opportunities exist for investigators to describe any deviations from their data-sharing request proposals in their publications and for platforms to enhance the reporting of key study characteristic specifications.