Roark Rick M, Schaefer Steven D, Yu Guo-Pei, Branovan Daniel I, Peterson Stephen J, Lee Wei-Nchih
Department of Otolaryngology, The New York Eye & Ear Infirmary, Manhattan, New York 10003, USA.
Laryngoscope. 2006 May;116(5):682-95. doi: 10.1097/01.mlg.0000205148.14269.09.
The objectives of this study were to: 1) implement web-based instruments for assessing and documenting the general competencies of otolaryngology resident education, as outlined by the Accreditation Council for Graduate Medical Education (ACGME); and 2) examine the benefit and validity of this online system for measuring educational outcomes and for identifying insufficiencies in the training program as they occur.
We developed an online assessment system for a surgical postgraduate education program and examined its feasibility, usability, and validity. Evaluations of behaviors, skills, and attitudes of 26 residents were completed online by faculty, peers, and nonphysician professionals during a 3-year period. Analyses included calculation and evaluation of total average performance scores of each resident by different evaluators. Evaluations were also compared with American Board of Otolaryngology-administered in-service examination (ISE) scores for each resident. Convergent validity was examined statistically by comparing ratings among the different evaluator types.
The questionnaires and software were found to be simple to use and efficient in collecting essential information. From July 2002 to June 2005, 1,336 evaluation forms were available for analysis. The average score assigned by faculty was 4.31, significantly lower than that assigned by nonphysician professionals (4.66) and by residents evaluating peers (4.63) (P < .001), whereas scores were similar between nonphysician professionals and resident peers. Average scores from the faculty and nonphysician groups correlated on the constructs of communication and relationship with patients, but not on professionalism and documentation. Between the faculty and resident peer groups, correlation was observed on respect for patients but not on medical knowledge. Resident ISE scores improved in the third year of the study and correlated strongly with faculty perceptions of medical knowledge (r = 0.65, P = .007).
Compliance with form completion was 97%. The system facilitated the educational management of our training program along multiple dimensions. The small perceptual differences among a highly selected group of residents made unambiguous validation of the system challenging. The instruments and approach warrant further study. Improvements are likely best achieved through broad consultation with other otolaryngology programs.