Taylor David R, Park Yoon Soo, Egan Rylan, Chan Ming-Ka, Karpinski Jolanta, Touchie Claire, Snell Linda S, Tekian Ara
D.R. Taylor is associate professor, Department of Medicine, Queen's University School of Medicine, Kingston, Ontario, Canada. Y.S. Park is assistant professor, Department of Medical Education, University of Illinois at Chicago, Chicago, Illinois. R. Egan is assistant professor and director, Office of Health Science Education, Queen's University School of Medicine, Kingston, Ontario, Canada. M.-K. Chan is associate professor, Department of Pediatrics and Child Health, University of Manitoba, Winnipeg, Manitoba, Canada, and clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada. J. Karpinski is associate professor, Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada, and associate director, Specialties Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada. C. Touchie is associate professor, Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada, and chief medical education advisor, Medical Council of Canada, Ottawa, Ontario, Canada. L.S. Snell is professor, Department of Medicine, McGill University, Montreal, Quebec, Canada, and senior clinician educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada. A. Tekian is professor, Department of Medical Education, University of Illinois at Chicago, Chicago, Illinois.
Acad Med. 2017 Nov;92(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 56th Annual Research in Medical Education Sessions):S110-S117. doi: 10.1097/ACM.0000000000001908.
Entrustable professional activities (EPAs) have become a cornerstone of assessment in competency-based medical education (CBME). Increasingly, however, EPAs that do not conform to published EPA standards are being adopted. This study aimed to develop and validate a scoring rubric to evaluate the alignment of EPAs with their intended purpose and to identify substandard EPAs.
The EQual rubric was developed and revised by a team of education scholars with expertise in EPAs. It was then applied by four residency program directors/CBME leads (PDs) and four nonclinician support staff to 31 stage-specific EPAs developed for internal medicine in the Royal College of Physicians and Surgeons of Canada's Competency by Design framework. Results were analyzed using a generalizability study to evaluate overall reliability, with the EPAs as the object of measurement. Item-level analysis was performed to determine reliability and discrimination value for each item. Scores from the PDs were also compared with decisions about revisions made independently by the education scholars group.
The EQual rubric demonstrated high reliability in the G-study when applied by the PDs (phi coefficient = 0.84) and moderate reliability when applied by the support staff (0.67). Item-level analysis identified three items that performed poorly, with low item discrimination and low interrater reliability indices. Support staff scores correlated only moderately with PD scores. Using the preestablished cut score, the PDs identified 9 of the 10 EPAs that the education scholars group had deemed to require major revision.
EQual rubric scores reliably measured the alignment of EPAs with standards described in the literature. Further, applying the rubric accurately identified EPAs requiring major revisions.