Lu Yang, Miranda Ruth, Quach Chi, Girgis Mark, Lewis Catherine E, Tillou Areti, Chen Formosa
Department of Surgery, Ronald Reagan UCLA Medical Center, Los Angeles, California.
David Geffen UCLA School of Medicine, Los Angeles, California.
J Surg Educ. 2020 Nov-Dec;77(6):1568-1576. doi: 10.1016/j.jsurg.2020.05.015. Epub 2020 Jun 4.
Mock oral examinations (MOE) are used to prepare residents for, and assess their readiness for, the American Board of Surgery Certifying Exam (ABSCE). Delivery of MOEs varies by institution, and previous studies have demonstrated significant implementation barriers, such as the availability of faculty examiners and exam scenarios.
To assess the value and participant satisfaction of a standardized multi-institutional MOE for general surgery residents.
Thirty-three general surgery residents and 37 faculty members from 3 institutions participated in a regional MOE. Residents were examined in three 20-minute sessions. Faculty examiners were given a wide selection of prescripted exam scenarios and instructed to use standardized grading rubrics during a brief orientation on the day of the exam. All participants were surveyed on their overall experience.
Of 33 participating residents, 26 (79%) passed the MOE (92% of R5, 91% of R4, and 50% of R3). Response rates were 91% for residents and 57% for faculty members. Most respondents were satisfied with the overall exam experience (88%), standardized question quality (86%), and question variety (82%). A total of 92% of respondents agreed that the time, effort, and cost of the MOE were justified by its educational value to residents. Higher medical knowledge ratings assigned by faculty examiners correlated with stronger trainee performance (β = 0.48; 95% confidence interval [CI] 0.29-0.66), while patient care and interpersonal communication skill ratings were not associated with trainee performance. The standardized grading rubric achieved moderate inter-rater reliability among examiner pairs, with 70.6% agreement (Kappa 0.47).
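The inter-rater reliability figures above pair raw percent agreement (70.6%) with Cohen's kappa (0.47), which corrects agreement for chance. As a minimal illustration of how kappa is derived from two examiners' ratings (the data below are hypothetical, not the study's actual ratings):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the agreement expected from each rater's marginal
    label frequencies.
    """
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: fraction of items both raters labeled the same.
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal label proportions.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum((c1[lab] / n) * (c2[lab] / n) for lab in set(c1) | set(c2))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical pass/fail ratings from one examiner pair.
r1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
r2 = ["pass", "fail", "fail", "pass", "pass", "pass"]
print(round(cohens_kappa(r1, r2), 2))  # → 0.25
```

This illustrates why the reported kappa (0.47) is lower than raw agreement (70.6%): some of the observed agreement is expected by chance alone.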
General Surgery residents and faculty perceived the standardized multi-institutional MOE to be a highly satisfactory educational experience and valuable assessment tool. Developing a repertoire of scripted exam scenarios made it feasible to recruit sufficient faculty participants, and standardizing grading rubrics allowed for a consistent exam experience with moderate inter-rater reliability.