Legacy Emanuel Medical Center, 2801 N. Gantenbein Ave. MOB No. 130, Portland, OR 97227, USA.
Am J Surg. 2013 May;205(5):552-6; discussion 556. doi: 10.1016/j.amjsurg.2013.01.021.
Little is known about the reliability of data collected by abstractors without professional medical training. This investigation sought to determine the level of agreement among untrained volunteer abstractors as part of a study evaluating venous thromboembolism risk assessment in trauma patients.
Forty-nine paper charts were chosen randomly from a volunteer-reviewed cohort of 2,339, and the volunteers' abstractions were compared with those of a single experienced abstractor. Inter-rater agreement was assessed using percent agreement, Cohen's kappa, and the prevalence-adjusted, bias-adjusted kappa (PABAK).
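For context, these are the standard definitions rather than formulas reproduced from the paper: Cohen's kappa corrects observed agreement for the chance agreement implied by the raters' marginal distributions, whereas PABAK substitutes the chance agreement of a balanced, unbiased table, reducing to a linear transform of observed agreement:

\[
\kappa = \frac{p_o - p_e}{1 - p_e},
\qquad
\mathrm{PABAK} = \frac{k\,p_o - 1}{k - 1} = 2p_o - 1 \ \text{for } k = 2,
\]

where \(p_o\) is the observed proportion of agreement, \(p_e\) the chance-expected agreement, and \(k\) the number of rating categories.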
Of the 71 data points, 28 showed perfect agreement, and the average agreement across all charts was 97%. Data points with imperfect agreement had kappa values between .27 and .96 (mean, .75); one additional data point had a kappa of zero despite 94% raw agreement. PABAK values ranged from .67 to .98 (mean, .91), an average increase of .17 over the corresponding kappa values.
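To illustrate how a kappa near zero can coexist with 94% raw agreement, a minimal Python sketch follows; the 2x2 counts are hypothetical, chosen only to reproduce the arithmetic of the paradox, and are not the study's data.

    # Hypothetical 2x2 agreement table [[a, b], [c, d]] for a rare finding:
    # a = both raters "present", d = both "absent", b/c = disagreements.

    def cohens_kappa(table):
        """Cohen's kappa: observed agreement corrected for chance."""
        (a, b), (c, d) = table
        n = a + b + c + d
        p_o = (a + d) / n                      # observed agreement
        p_yes = ((a + b) / n) * ((a + c) / n)  # chance agreement on "present"
        p_no = ((c + d) / n) * ((b + d) / n)   # chance agreement on "absent"
        p_e = p_yes + p_no
        return (p_o - p_e) / (1 - p_e)

    def pabak(table, k=2):
        """Prevalence-adjusted, bias-adjusted kappa: (k*p_o - 1) / (k - 1)."""
        (a, b), (c, d) = table
        p_o = (a + d) / (a + b + c + d)
        return (k * p_o - 1) / (k - 1)

    # 100 charts; both raters almost always record the finding as absent.
    table = [[0, 3], [3, 94]]       # 94/100 = 94% raw agreement
    print(cohens_kappa(table))      # about -0.03, i.e., essentially zero
    print(pabak(table))             # 0.88, tracking the raw agreement

Because nearly all of the chance-expected agreement comes from the dominant "absent" category, p_e is close to p_o and kappa collapses toward zero; PABAK removes this prevalence effect.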
Volunteer abstractors demonstrated outstanding inter-rater reliability; however, limitations in the interpretation of reliability statistics can influence the reported results.