Butler Kathryn L, Hirsh David A, Petrusa Emil R, Yeh D Dante, Stearns Dana, Sloane David E, Linder Jeffrey A, Basu Gaurab, Thompson Lisa A, de Moya Marc A
Department of Surgery, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts.
Cambridge Health Alliance, Cambridge, Massachusetts; Department of Medicine, Harvard Medical School, Boston, Massachusetts.
J Surg Educ. 2017 Mar-Apr;74(2):286-294. doi: 10.1016/j.jsurg.2016.08.018. Epub 2016 Sep 28.
Optimal methods for medical student assessment in surgery remain elusive. Faculty- and housestaff-written evaluations constitute the chief means of student assessment in medical education. However, numerous studies show that this approach has poor specificity and a high degree of subjectivity. We hypothesized that an objective structured clinical examination (OSCE) in the surgery clerkship would provide additional data on student performance that would confirm or augment other measures of assessment.
We retrospectively reviewed data from OSCEs, National Board of Medical Examiners shelf examinations, oral presentations, and written evaluations for 51 third-year Harvard Medical School students rotating in surgery at Massachusetts General Hospital from 2014 to 2015. We expressed correlations between numeric variables as Pearson coefficients, assessed differences between rater groups with one-way analysis of variance, and compared percentages with two-sample t-tests. We examined commentary from both OSCE and clinical written evaluations through textual analysis and summarized the results as percentages.
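The core correlation analysis described above can be sketched in a few lines; the helper function and the sample scores below are hypothetical illustrations for exposition, not the study's actual data or analysis code.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (invented) OSCE and clinical-evaluation scores for five students
osce = [72.0, 85.0, 64.0, 90.0, 78.0]
clinical = [85.0, 80.0, 88.0, 86.0, 79.0]

r = pearson_r(osce, clinical)
print(f"Pearson r = {r:.3f}")
```

A coefficient near zero for a pair of measures, as with the invented scores here, is the pattern the abstract reports between OSCE scores and clinical evaluation scores.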
OSCE scores and clinical evaluation scores correlated poorly with each other, as well as with shelf examination scores and oral presentation grades. Textual analysis of clinical evaluation comments revealed a heavy emphasis on motivational factors and praise, whereas OSCE written comments focused on cognitive processes, patient management, and methods to improve performance.
In this single-center study, an OSCE provided clinical skills data that were not captured elsewhere in the surgery clerkship. Textual analysis of faculty evaluations reflected an emphasis on interpersonal skills, rather than appraisal of clinical acumen. These findings suggest complementary roles of faculty evaluations and OSCEs in medical student assessment.