Department of Medical Education, St. Mary Medical Center and David Geffen School of Medicine at UCLA, Long Beach, California 90813, USA.
Clin Cardiol. 2010 Dec;33(12):738-45. doi: 10.1002/clc.20851.
Many published studies of medical trainees and physicians have demonstrated major deficiencies in correctly identifying heart sounds and murmurs, but cardiologists had not been tested. We previously confirmed these deficiencies using a 50-question multimedia cardiac examination (CE) test featuring video vignettes of patients with auscultatory and visible manifestations of cardiovascular pathology (virtual cardiac patients). Previous testing of 62 internal medicine faculty yielded scores no better than those of medical students and residents.
In this study, we tested whether cardiologists outperformed other physicians in cardiac examination skills, and whether years in practice correlated with test performance.
To overcome cardiologists' reluctance to be tested, the CE test was installed at 19 US teaching centers for confidential testing. Test scores and demographic data (training level, subspecialty, and years in practice) were uploaded to a secure database.
The 520 tests yielded the following mean scores (out of a maximum of 100, ± 95% confidence interval), in descending order: 10 cardiology volunteer faculty (86.3 ± 8.0), 57 full-time cardiologists (82.0 ± 3.3), 4 private-practice cardiologists (77.0 ± 6.8), and 19 noncardiology faculty (67.3 ± 8.8). Trainees' mean scores, in descending order, were: 150 cardiology fellows (77.3 ± 2.1), 78 medical students (63.7 ± 3.5), 95 internal medicine residents (62.7 ± 3.2), and 107 family medicine residents (59.2 ± 3.2). Faculty who had trained earlier and had longer practice experience scored higher.
Academic and volunteer cardiologists outperformed other medical faculty, as did cardiology fellows. More recently trained faculty had lower scores. Remote testing yielded scores similar to those from proctored tests in comparable groups studied previously. No significant improvement over medical school scores was seen with residency training.