
Inter-observer agreement in audit of quality of radiology requests and reports.

Author information

Stavem K, Foss T, Botnmark O, Andersen O K, Erikssen J

Affiliation

Department of Radiology, Akershus University Hospital, Nordbyhagen, Norway.

Publication information

Clin Radiol. 2004 Nov;59(11):1018-24. doi: 10.1016/j.crad.2004.04.002.

Abstract

AIMS

To assess the quality of imaging procedure requests and radiologists' reports using an auditing tool, and to assess agreement between different observers on the quality parameters.

MATERIALS AND METHODS

In an audit using a standardized scoring system, three observers reviewed the request forms for 296 consecutive radiological examinations, and two observers reviewed a random sample of 150 of the corresponding radiologists' reports. We present descriptive statistics from the audit and pairwise inter-observer agreement, using proportion agreement and kappa statistics.
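As background for the agreement figures reported below, the following is a minimal sketch of how pairwise proportion agreement and Cohen's kappa can be computed for one audit item between two observers. This is not the authors' analysis code; the scoring scale (-1/0/+1) is inferred from the abstract, and the example ratings are hypothetical.

from collections import Counter

def proportion_agreement(rater_a, rater_b):
    # Fraction of cases on which the two observers gave the same score.
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    # Cohen's kappa: observed agreement corrected for the agreement
    # expected by chance from each observer's marginal score frequencies.
    n = len(rater_a)
    p_o = proportion_agreement(rater_a, rater_b)
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical item scores for 10 requests (-1 = unacceptable, 0/+1 = acceptable).
observer_1 = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
observer_2 = [1, 1, 1, 1, 1, 1, 1, 1, 0, 1]
print(proportion_agreement(observer_1, observer_2))        # 0.9
print(round(cohens_kappa(observer_1, observer_2), 2))      # 0.62 despite 90% agreement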

RESULTS

The proportion of acceptable item scores (0 or +1) was above 70% for all items except the requesting physician's bleep or extension number, the legibility of the physician's name, and details about previous investigations. For pairs of observers, inter-observer agreement was generally high; however, the corresponding kappa values were consistently low, with only 14 of 90 ratings >0.60 and 6 >0.80 on the requests/reports. For the items on the quality of the clinical information, the appropriateness of the request, and the requested priority/timing of the investigation, the mean percentage agreement ranged from 67 to 76%, and the corresponding kappa values ranged from 0.08 to 0.24.

CONCLUSION

The inter-observer reliability of scores on the different items showed a high degree of agreement, although the kappa values were low, which is a well-known paradox. Current routines for requesting radiology examinations appeared satisfactory, although several problem areas were identified.
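The paradox referred to here can be illustrated with hypothetical numbers (not taken from this study): when nearly all cases fall into the "acceptable" category for both observers, the agreement expected by chance is already very high, so kappa can be near zero, or even negative, despite high observed agreement. A minimal Python sketch:

# Hypothetical 2x2 table for one item scored by two observers in 100 cases.
n = 100
both_acceptable     = 94   # both observers score 0 or +1
only_a_unacceptable = 3    # observer A scores -1, observer B acceptable
only_b_unacceptable = 3    # observer B scores -1, observer A acceptable
both_unacceptable   = 0

p_o = (both_acceptable + both_unacceptable) / n          # observed agreement = 0.94
a_acc = (both_acceptable + only_b_unacceptable) / n      # A's marginal "acceptable" rate = 0.97
b_acc = (both_acceptable + only_a_unacceptable) / n      # B's marginal "acceptable" rate = 0.97
p_e = a_acc * b_acc + (1 - a_acc) * (1 - b_acc)          # chance agreement ≈ 0.94
kappa = (p_o - p_e) / (1 - p_e)                          # ≈ -0.03 despite 94% agreement
print(p_o, round(p_e, 4), round(kappa, 2))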

