Collins Jannette, Herring William, Kwakwa Francis, Tarver Robert D, Blinder Russell A, Gray-Leithe Linda, Wood Beverly
Department of Radiology, University of Wisconsin Medical School, E3/311 Clinical Science Center, 600 Highland Avenue, Madison WI 53792-3252, USA.
Acad Radiol. 2004 Jul;11(7):787-94. doi: 10.1016/j.acra.2004.04.005.
We surveyed program directors to determine current radiology program practices in evaluating their residents, faculty, and program.
In January 2003, a 52-item Web-based survey was made available to program directors of accredited core radiology programs. Responses to the items were tabulated to determine relative frequency distributions. Two-tailed Pearson chi-square tests were used to compare proportions and assess the association between variables.
A total of 99 (52%) of 192 program directors responded. Programs were largely in compliance with Accreditation Council for Graduate Medical Education (ACGME) requirements. Noncompliance involved the requirement to evaluate residents at least four times per year (at least 22 [22.2%] of 99 programs) and the requirement to evaluate the program annually (20 [20.2%] of 99 programs). New program directors (<1-year tenure) were less likely than those with ≥1-year tenure to be using the Association of Program Directors in Radiology Education Committee global rating form (41.2% versus 68.8%, P = .03). Programs that used this form were more likely than those that did not to evaluate resident competence in systems-based practice (88.5% versus 44.0%, P = .001). Program directors with 1 or more years of tenure were more likely than those with less than 1 year to use a computerized evaluation system (35.8% versus 11.8%, P = .05).
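The reported between-group comparisons can be approximately reproduced from the percentages given. Below is a minimal sketch of a two-tailed Pearson chi-square test (without continuity correction) on a 2×2 table; the group counts (7/17 new directors versus 55/80 longer-tenured directors using the global rating form) are reconstructed from the reported percentages and are an assumption, not figures stated in the abstract:

```python
import math

def pearson_chi2_2x2(table):
    """Two-tailed Pearson chi-square test (no continuity correction)
    for a 2x2 contingency table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / n
            chi2 += (obs - expected) ** 2 / expected
    # For df = 1, the chi-square survival function is erfc(sqrt(x / 2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts reconstructed from the abstract's percentages:
# 7/17 (41.2%) new directors vs 55/80 (68.8%) established directors
# reported using the global rating form.
chi2, p = pearson_chi2_2x2([[7, 10], [55, 25]])
print(round(chi2, 2), round(p, 3))  # chi2 close to 4.6, p close to .03
```

Under these assumed counts the test yields P ≈ .03, consistent with the value reported in the abstract; the other comparisons can be checked the same way.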
In general, radiology programs show a high degree of compliance with ACGME evaluation requirements. However, some programs do not meet the requirements for frequency of resident evaluation or for annual program evaluation. The percentage of new program directors is high, and new directors are less likely to use or know about useful evaluation resources. Use of computerized evaluation systems, which could decrease the work associated with evaluations and provide more reliable data, is minimal.