Maxwell Hong, Jeffrey T. Steedle, Ying Cheng
University of Notre Dame, Notre Dame, IN, USA.
ACT, Iowa City, IA, USA.
Educ Psychol Meas. 2020 Apr;80(2):312-345. doi: 10.1177/0013164419865316. Epub 2019 Jul 31.
Insufficient effort responding (IER) affects many forms of assessment in both educational and psychological contexts. Much research has examined different types of IER, IER's impact on the psychometric properties of test scores, and preprocessing procedures used to detect IER. However, the literature offers little practical advice for applied researchers and psychometricians on evaluating multiple sources of IER evidence, including the best strategy or combination of strategies for preprocessing data. In this study, we demonstrate how the use of different IER detection methods may affect psychometric properties such as predictive validity and reliability. Moreover, we evaluate how different data cleansing procedures can detect different types of IER. We provide evidence via simulation studies and an applied analysis using ACT's Engage assessment as a motivating example. Based on the findings of the study, we provide recommendations and future research directions for those who suspect their data may contain responses reflecting careless, random, or biased responding.