Department of Child Health, School of Medicine, Cardiff University, UK.
Clin Radiol. 2012 Jul;67(7):664-8. doi: 10.1016/j.crad.2011.12.003. Epub 2012 Feb 15.
To compare levels of agreement amongst paediatric clinicians with those amongst consultant paediatric radiologists when interpreting chest radiographs (CXRs).
Four paediatric radiologists used picture archiving and communication system (PACS) workstations to evaluate independently, in each of 30 CXRs, the presence of five radiological features of infection. The radiographs were obtained over 1 year (2008) from children aged 6 months to <16 years with fever and signs of respiratory distress. The same CXRs were interpreted a second time by the paediatric radiologists, and by 21 clinicians with varying levels of experience, using the Web 1000 viewing system and a projector. Intra- and interobserver agreement within groups, split by grade and specialty, was analysed using free-marginal multi-rater kappa.
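The agreement statistic named in the methods, free-marginal multi-rater kappa (Randolph's kappa), compares observed pairwise agreement across raters with the agreement expected when each category is equally likely. The following Python sketch illustrates the standard formula only; the function name and the ratings matrix are hypothetical and are not taken from the study's data or analysis code.

```python
def free_marginal_kappa(counts, n_raters):
    """Free-marginal multi-rater kappa (Randolph's kappa).

    counts: list of per-item lists, each giving how many raters chose
    each category for that item (each row sums to n_raters).
    """
    n_items = len(counts)
    n_categories = len(counts[0])

    # Observed agreement: average proportion of agreeing rater pairs per item.
    p_o = sum(
        sum(c * (c - 1) for c in item) / (n_raters * (n_raters - 1))
        for item in counts
    ) / n_items

    # Expected agreement under free marginals: 1 / number of categories.
    p_e = 1.0 / n_categories

    return (p_o - p_e) / (1 - p_e)


# Hypothetical example: 4 raters scoring 3 radiographs for one feature
# as 'present' vs 'absent'.
ratings = [
    [4, 0],  # all 4 raters agree: present
    [3, 1],  # 3 present, 1 absent
    [0, 4],  # all 4 raters agree: absent
]
print(round(free_marginal_kappa(ratings, n_raters=4), 2))  # 0.67
```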
Normal CXRs were identified consistently by all 25 participants. The four paediatric radiologists showed high levels of intraobserver agreement between the two viewing methods (kappa scores between 0.53 and 1.00) and high interobserver agreement for each method (kappa scores between 0.67 and 0.96 for PACS assessment). The 21 clinicians showed varying levels of agreement, with kappa scores ranging from 0.21 to 0.89.
Paediatric radiologists showed high levels of agreement for all features. In general, the clinicians had lower levels of agreement than the radiologists. This study highlights the need for improved training for clinicians in interpreting CXRs, and for timely reporting of CXRs by radiologists, to allow appropriate patient management.