Department of Methodology and Statistics, University of Maastricht, The Netherlands.
Stat Med. 2012 Dec 10;31(28):3667-80. doi: 10.1002/sim.5424. Epub 2012 Jun 26.
Kappa-like agreement indexes are often used to assess agreement among examiners on a categorical scale; their distinguishing feature is that they correct the observed level of agreement for agreement expected by chance. In this paper, we first define two agreement indexes of this family in a hierarchical context, considering both a random and a fixed set of examiners. We then develop a method to evaluate the influence of factors on these indexes: the agreement indexes are related directly to a set of covariates through a hierarchical model, and we obtain the posterior distribution of the model parameters in a Bayesian framework. We apply the proposed approach to dental data and compare it with the generalized estimating equations approach.
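For reference, the prototypical member of this family is Cohen's kappa, whose chance correction takes the standard form (a general definition, not the paper's hierarchical index):

\kappa = \frac{p_o - p_e}{1 - p_e},

where p_o is the observed proportion of agreement between examiners and p_e is the proportion of agreement expected by chance; kappa-like indexes generalize this form to other numbers of examiners and study designs.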
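As a minimal illustrative sketch, assuming two examiners and hypothetical ratings (this computes plain Cohen's kappa, not the hierarchical Bayesian index of the paper):

    import numpy as np

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two examiners on a categorical scale."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        # Observed proportion of agreement
        p_o = np.mean(r1 == r2)
        # Agreement expected by chance, from the marginal rating frequencies
        p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical ratings of 10 teeth by two examiners (0 = sound, 1 = caries)
    examiner_a = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    examiner_b = [0, 1, 0, 0, 0, 1, 0, 1, 1, 1]
    print(cohens_kappa(examiner_a, examiner_b))  # 0.6: agreement well above chance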