Department of Hand and Wrist Surgery, Maasstad Ziekenhuis, Rotterdam, the Netherlands.
Department of Surgery, Academic Medical Center Amsterdam, Amsterdam, the Netherlands.
J Hand Surg Am. 2024 May;49(5):482-485. doi: 10.1016/j.jhsa.2024.01.006. Epub 2024 Feb 17.
Observer reliability studies for fracture classification systems evaluate agreement using Cohen's κ and absolute agreement as outcome measures. Cohen's κ is a chance-corrected measure of agreement and can range from 0 (agreement no better than chance) to 1 (perfect agreement). Absolute agreement is the percentage of cases in which observers agree on the item they are asked to rate. Some studies report high absolute agreement but a relatively low κ value, which is counterintuitive. This phenomenon is referred to as the Kappa Paradox. The objective of this article was to explain the statistical phenomenon of the Kappa Paradox and to help readers and researchers recognize and prevent it.
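For illustration, the paradox can be reproduced with a small worked example (a minimal sketch with hypothetical counts, not taken from the article). Cohen's κ is computed as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed (absolute) agreement and p_e the agreement expected by chance from the observers' marginal rating frequencies. When nearly all cases fall into one category, p_e is high, so κ can be low even though p_o is high.

```python
# Minimal sketch (hypothetical counts) showing how a skewed category
# distribution can pair high absolute agreement with a low Cohen's kappa.

def cohens_kappa_2x2(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table.
    a = both observers rate 'simple', d = both rate 'complex',
    b, c = the two kinds of disagreement."""
    n = a + b + c + d
    p_o = (a + d) / n                 # observed (absolute) agreement
    p_simple_1 = (a + b) / n          # marginal: observer 1 rates 'simple'
    p_simple_2 = (a + c) / n          # marginal: observer 2 rates 'simple'
    p_e = p_simple_1 * p_simple_2 + (1 - p_simple_1) * (1 - p_simple_2)
    return p_o, (p_o - p_e) / (1 - p_e)   # chance-corrected agreement

# Hypothetical example: 100 fractures, almost all rated 'simple' by both observers.
p_o, kappa = cohens_kappa_2x2(a=90, b=5, c=4, d=1)
print(f"absolute agreement = {p_o:.0%}, kappa = {kappa:.2f}")
# -> absolute agreement = 91%, kappa = 0.13  (the Kappa Paradox)
```

In this sketch the observers agree on 91 of 100 cases, yet κ ≈ 0.13 because the skewed marginals make chance agreement itself very high.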