Weighted specific-category kappa measure of interobserver agreement.

Author Information

Kvålseth Tarald O

Affiliation

Department of Mechanical Engineering, University of Minnesota, Minneapolis 55455, USA.

Publication Information

Psychol Rep. 2003 Dec;93(3 Pt 2):1283-90. doi: 10.2466/pr0.2003.93.3f.1283.

Abstract

When two observers classify a sample of items using the same categorical scale, and when different disagreements are differentially weighted, Cohen's weighted kappa (Kw) may serve as a measure of interobserver agreement. We propose a kappa-based weighted measure, K(ws), of agreement on a specific category s, with Kw being a weighted average of all the K(ws). Therefore, while Cohen's Kw is a summary measure of overall agreement, the proposed K(ws) measures the extent to which the observers agree on each specific category; because of the weights used, both measures are suitable for ordinal categories. Statistical inference for K(ws) and its unweighted counterpart is also discussed. A numerical example is provided.
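As background for the abstract, Cohen's weighted kappa Kw can be sketched as below. This is a minimal NumPy sketch using standard quadratic disagreement weights (i − j)², common for ordinal scales; it does not reproduce the paper's specific-category measure K(ws), whose exact formula is not given in the abstract.

```python
import numpy as np

def weighted_kappa(confusion, weights=None):
    """Cohen's weighted kappa Kw from an r x r contingency table.

    `weights` are disagreement weights w[i, j] (0 on the diagonal).
    If None, quadratic weights (i - j)^2 are used, a common choice
    for ordinal categories.
    """
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p = confusion / n                                  # joint proportions p_ij
    r = confusion.shape[0]
    if weights is None:
        idx = np.arange(r)
        weights = (idx[:, None] - idx[None, :]) ** 2   # quadratic disagreement
    row = p.sum(axis=1)                                # marginal p_i.
    col = p.sum(axis=0)                                # marginal p_.j
    observed = (weights * p).sum()                     # observed weighted disagreement
    expected = (weights * np.outer(row, col)).sum()    # chance-expected disagreement
    return 1.0 - observed / expected

# Example: two raters, two categories, 90/100 items rated identically.
table = np.array([[45, 5],
                  [5, 45]])
print(weighted_kappa(table))  # → 0.8
```

For two categories, quadratically weighted kappa coincides with the familiar unweighted Cohen's kappa, which is why the example above gives the textbook value 0.8.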

