Department of Educational Psychology, The University of Georgia, 323 Aderhold Hall, Athens, GA 30602, USA.
Psychometrika. 2018 Sep;83(3):696-732. doi: 10.1007/s11336-018-9626-9. Epub 2018 Jun 15.
This paper proposes a model-based family of detection and quantification statistics to evaluate response bias in item bundles of any size. Compensatory (CDRF) and non-compensatory (NCDRF) response bias measures are proposed, along with their sample realizations and large-sample variability when models are fitted using multiple-group estimation. Based on their underlying connection to item response theory estimation methodology, it is argued that these new statistics provide a powerful and flexible approach to studying response bias for categorical response data, over and above methods that have previously appeared in the literature. To evaluate their practical utility, CDRF and NCDRF are compared to the closely related SIBTEST family of statistics and to likelihood-based detection methods through a series of Monte Carlo simulations. Results indicate that the new statistics provide better effect size estimates of marginal response bias than the SIBTEST family, are competitive with a selection of likelihood-based methods when studying item-level bias, and perform best when studying differential bundle and test bias.