School of Continuous Professional Development, Mayo Clinic College of Medicine and Science, Rochester, MN, USA.
Division of General Internal Medicine, Mayo Clinic, Rochester, MN, USA.
Perspect Med Educ. 2022 Jun;11(3):156-164. doi: 10.1007/s40037-022-00705-z. Epub 2022 Mar 31.
We sought to evaluate the reporting and methodological quality of cost evaluations of physician continuing professional development (CPD).
We conducted a systematic review, searching MEDLINE, Embase, PsycINFO, and the Cochrane Database for studies comparing the cost of physician CPD (last update 23 April 2020). Two reviewers, working independently, screened all articles for inclusion. Two reviewers extracted information on reporting quality using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS), and on methodological quality using the Medical Education Research Study Quality Instrument (MERSQI) and a published reference case.
Of 3338 potentially eligible studies, 62 were included. Operational definitions of the methodological and reporting quality elements were iteratively revised. Articles reported a mean (SD) of 43% (20%) of CHEERS elements for the Title/Abstract, 56% (34%) for the Introduction, 66% (19%) for the Methods, 61% (17%) for the Results, and 66% (30%) for the Discussion, for an overall reporting index of 292 (83) out of a maximum of 500. Valuation methods were reported infrequently (resource selection 10 of 62 [16%], resource quantitation 10 [16%], pricing 26 [42%]), as were descriptions or discussion of the physicians trained (42 [68%]), the training setting (42 [68%]), the training intervention (40 [65%]), sensitivity analyses of uncertainty (9 [15%]), and generalizability (30 [48%]). MERSQI scores ranged from 6.0 to 16.0 (mean 11.2 [2.4]). Changes over time in the reporting index (initial 241 [105], final 321 [52]) and MERSQI scores (initial 9.8 [2.7], final 11.9 [1.9]) were not statistically significant (p ≥ 0.08).
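A minimal sketch of how the overall reporting index appears to aggregate, assuming it is the unweighted sum of the five CHEERS section percentages (an assumption not stated explicitly here, though the reported means are consistent with it: 43 + 56 + 66 + 61 + 66 = 292). The function name and data layout below are illustrative, not the authors' code.

```python
def reporting_index(section_pct):
    """Sum per-section CHEERS completion percentages (0-100 each).

    section_pct: dict mapping CHEERS section name -> percent of items reported.
    Returns a value between 0 and 500 across the five abstract sections.
    """
    sections = ["Title/Abstract", "Introduction", "Methods", "Results", "Discussion"]
    return sum(section_pct[s] for s in sections)


# Using the reported mean section percentages reproduces the reported mean index.
mean_section_pct = {
    "Title/Abstract": 43,
    "Introduction": 56,
    "Methods": 66,
    "Results": 61,
    "Discussion": 66,
}
print(reporting_index(mean_section_pct))  # 292, matching the overall index above
```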
The methods and reporting of health professions education (HPE) cost evaluations fall short of current standards. Gaps exist in the valuation, analysis, and contextualization of cost outcomes.