Department of Diagnostic Radiology and Nuclear Medicine, Division of Vascular and Interventional Radiology, University of Maryland School of Medicine, 22 S. Greene St. (N2W74), Baltimore, MD 21201.
J Vasc Interv Radiol. 2013 Oct;24(10):1481-6.e1. doi: 10.1016/j.jvir.2013.07.001.
Existing diagnostic radiology peer-review systems do not address the specificities of interventional radiology (IR) practice. The purpose of this study was to assess the feasibility of a specifically developed interventional peer review method, IR Peer.
A retrospective review of a prospectively encoded pilot database was performed to demonstrate the feasibility of IR Peer in a multiphysician practice. The scoring system used morning peer review of selected IR cases from the previous day, based on a five-item questionnaire with an ordinal answer scale that grades reviewers' agreement with imaging findings, procedural/technical management, early outcomes, and follow-up plan. Patient lists from IR Peer and morbidity and mortality (M&M) conferences were compared to evaluate the degree of overlap and the ability of IR Peer to help detect adverse events (AEs).
A total of 417 consecutive reviews of IR attending physician cases by peers were performed in 163 consecutive patients over 18 months, and 94% of cases were reviewed by two or three IR attending physicians. Each question was answered 99%-100% of the time. Answers showed disagreement in 10% of cases (2% by a single reviewer, 8% by several), most related to procedural technique. Overall AE incidence was 1.8%. IR Peer contributed 10.7% of cases to the M&M list.
IR Peer is feasible, relevant, and easy to implement in a multiphysician IR practice. When used along with other quality-assurance processes, it might help in the detection of AEs for M&M review, although this capability will require further confirmatory research.