Department of Orthopaedic Surgery and Rehabilitation, University of Mississippi Medical Center, Jackson, MS.
J Orthop Trauma. 2018 Apr;32(4):e139-e144. doi: 10.1097/BOT.0000000000001106.
The mission of any academic orthopaedic training program can be divided into 3 general areas of focus: clinical care, academic performance, and research. Clinical care is evaluated on clinical volume, patient outcomes, and patient satisfaction, and is increasingly focused on data-driven quality metrics. Academic performance of a department can be used to motivate individual surgeons, but objective measures are used to define a residency program: annual in-service examinations serve as a marker of resident knowledge base, and board pass rates are closely scrutinized. Research productivity, however, has proven harder to quantify objectively. In an effort to improve transparency and better account for conflicts of interest, bias, and self-citation, multiple bibliometric measures have been developed.

Rather than using individuals' research productivity as a surrogate for departmental research, we sought to establish an objective methodology to better assess a residency program's ability to conduct meaningful research. In this study, we describe a process to assess the number and quality of publications produced by an orthopaedic residency department. This would allow chairmen and program directors to benchmark their current production and set measurable goals for future research investment. The main goal of the benchmarking system is to create an "h-index" for residency programs.

To do this, we first needed to create a list of relevant articles in the orthopaedic literature. We used the Journal Citation Reports, which lists all orthopaedic journals assigned an impact factor rating each year. When we accessed the Journal Citation Reports database, 72 journals were included in the orthopaedic literature section. To ensure only relevant, impactful journals were included, we selected journals with an impact factor greater than 0.95 and an Eigenfactor Score greater than 0.00095.
After excluding journals not meeting these criteria, we were left with 45 journals. We performed a Scopus search of these journals over a 10-year period and created a database of articles and their affiliated institutions. We performed several iterations of this search to maximize the capture of articles attributed to institutions with multiple names. Based on this extensive database, we were able to analyze all allopathic US residency programs by their quality research productivity. We believe this to be a novel methodology that creates a system by which residency program chairmen and directors can assess progress over time and accurately compare their program with others.
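The two quantitative steps described above, screening journals against the impact factor and Eigenfactor thresholds and then computing an h-index from a program's per-article citation counts, can be sketched as follows. This is an illustrative sketch only: the journal records and citation counts shown are hypothetical placeholders, not data from the study.

```python
# Thresholds reported in the study for including a journal.
IMPACT_FACTOR_MIN = 0.95
EIGENFACTOR_MIN = 0.00095

def screen_journals(journals):
    """Keep only journals exceeding both inclusion thresholds.

    Each journal is a dict with 'impact_factor' and 'eigenfactor' keys
    (field names are assumed for illustration).
    """
    return [
        j for j in journals
        if j["impact_factor"] > IMPACT_FACTOR_MIN
        and j["eigenfactor"] > EIGENFACTOR_MIN
    ]

def h_index(citations):
    """Largest h such that the program has h articles with >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical usage: a program with articles cited 10, 8, 5, 4, and 3 times
# has an h-index of 4 (four articles each cited at least 4 times).
program_citations = [10, 8, 5, 4, 3]
print(h_index(program_citations))  # prints 4
```

In practice, the citation counts per article and the institutional affiliations would come from the Scopus database described above; the sketch only makes the arithmetic of the benchmarking index explicit.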