
Inter-observer agreement and reliability assessment for observational studies of clinical work.

Affiliations

Centre for Health Systems and Safety Research, Australian Institute of Health Innovation, Faculty of Medicine and Health Sciences, Macquarie University, 75 Talavera Road, North Ryde, NSW 2113, Australia.

Department of Statistics, School of Mathematics and Statistics, University of New South Wales, The Red Centre, UNSW Australia, Sydney, NSW 2052, Australia.

Publication information

J Biomed Inform. 2019 Dec;100:103317. doi: 10.1016/j.jbi.2019.103317. Epub 2019 Oct 22.

Abstract

Inter-observer agreement (IOA) is a key aspect of data quality in time-and-motion studies of clinical work. To date, such studies have used simple and ad hoc approaches for IOA assessment, often with minimal reporting of methodological details. The main methodological issues are how to align time-stamped task intervals that rarely have agreeing start and end times, and how to assess IOA for multiple nominal variables. We present a combination of methods that simultaneously addresses both these issues and provides a more appropriate measure by which to assess IOA for time-and-motion studies. The issue of alignment is addressed by converting task-level data into small time windows then aligning data from different observers by time. A method applicable to multivariate nominal data, the iota score, is then applied to the time-aligned data. We illustrate our approach by comparing iota scores to the mean of univariate Cohen's kappa scores through application of these measures to existing data from an observational study of emergency department physicians. While the two scores generated very similar results under certain conditions, iota was more resilient to sparse data issues. Our results suggest that iota applied to time windows considerably improves on previous methods used for IOA assessment in time-and-motion studies, and that Cohen's kappa and other univariate measures should not be considered the gold standard. Rather, there is an urgent need for ongoing explicit discussion of methodological issues and solutions to improve the ways in which data quality is assessed in time-and-motion studies in order to ensure the conclusions drawn from such studies are robust.
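The pipeline described above — exploding task intervals into small time windows, aligning observers by window, then scoring agreement — can be sketched as follows. This is a minimal illustration with hypothetical data and function names, not the authors' implementation; the iota computation is a simplified two-observer nominal version in the spirit of Janson and Olsson's multivariate measure, with expected disagreement estimated from the observers' marginal label frequencies.

```python
from collections import Counter

def to_windows(tasks, window=1):
    """Explode (start, end, label) task intervals into fixed-width
    time-window labels. Times are in seconds; a task [s, e) covers
    every window whose start lies in [s, e)."""
    out = {}
    for start, end, label in tasks:
        t = start
        while t < end:
            out[t // window] = label
            t += window
    return out

def align(win_a, win_b):
    """Keep only the windows recorded by both observers, in time order."""
    common = sorted(set(win_a) & set(win_b))
    return [win_a[t] for t in common], [win_b[t] for t in common]

def cohen_kappa(a, b):
    """Univariate Cohen's kappa for two aligned label sequences."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n            # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

def iota(seqs_a, seqs_b):
    """Iota-style multivariate agreement for nominal data, two observers:
    chance-corrected disagreement pooled over all variables, so sparse
    categories in one variable do not each get their own unstable score."""
    n = len(seqs_a[0])
    v = len(seqs_a)
    # observed disagreement: mean mismatch rate over variables and windows
    d_obs = sum(x != y for a, b in zip(seqs_a, seqs_b)
                for x, y in zip(a, b)) / (n * v)
    # expected disagreement under independence, from per-variable marginals
    d_exp = 0.0
    for a, b in zip(seqs_a, seqs_b):
        ca, cb = Counter(a), Counter(b)
        p_match = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
        d_exp += 1 - p_match
    d_exp /= v
    return 1 - d_obs / d_exp if d_exp > 0 else 1.0
```

With a single variable, this iota reduces to Cohen's kappa on the same time-aligned sequence, which is why the abstract can compare iota directly against the mean of univariate kappa scores; the measures diverge only as more variables (and sparser categories) are pooled.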

