Research and Clinical Education, Canadian Memorial Chiropractic College, 6100 Leslie Street, Toronto, Ontario, M2H 3J1, Canada.
Graduate Education and Research Programs, Canadian Memorial Chiropractic College, Toronto, Ontario, Canada.
BMC Health Serv Res. 2021 Jul 28;21(1):750. doi: 10.1186/s12913-021-06745-1.
There is a dearth of information about health education clinical file audits in the context of record completeness and the demonstration of program-wide competency achievement. We report on the reliability of an audit instrument used for electronic health record (EHR) audits in the clinics of a chiropractic college in Canada.
The instrument is a checklist built within an electronic software application designed to pull data automatically from the EHR. It consists of 61 elements, 20 objective and 41 subjective, representing the domains of standards of practice, accreditation, and in-house educational standards. Trained auditors provide responses to the elements and the software yields a score indicating the quality of the clinical record in each file. A convenience sample of 24 files, drawn randomly from the roster of 22 clinicians, was divided into three groups of eight, each to be completed by one of three auditors within 1 week, at the end of which the files were transferred to another auditor. There were four audit cycles; audits from cycles 1 and 4 were used to assess intra-rater (test-retest) reliability, and audits from cycles 1, 2 and 3 were used to assess inter-rater reliability. Percent agreement (PA) and Kappa statistics (K) were used as outcomes. Scatter plots and intraclass correlation coefficients (ICC) were used to assess standards of practice, accreditation, and overall audit scores.
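The agreement statistics named above can be illustrated with a minimal sketch, not drawn from the study's data: given two auditors' responses to the same checklist elements, percent agreement is the fraction of matching responses, and Cohen's kappa corrects that figure for agreement expected by chance. The rating vectors below are hypothetical binary (present/absent) element responses invented for illustration.

```python
# Sketch of percent agreement (PA) and Cohen's kappa (K) between two
# auditors rating the same checklist elements. Ratings are hypothetical.
from collections import Counter

def percent_agreement(a, b):
    """Fraction of elements on which both auditors gave the same response."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement: kappa = (po - pe) / (1 - pe)."""
    n = len(a)
    po = percent_agreement(a, b)                      # observed agreement
    ca, cb = Counter(a), Counter(b)
    # Expected agreement if auditors rated independently at their base rates.
    pe = sum((ca[k] / n) * (cb[k] / n) for k in set(a) | set(b))
    return (po - pe) / (1 - pe)

# Hypothetical yes/no (1/0) responses from two auditors on ten elements.
auditor1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
auditor2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(percent_agreement(auditor1, auditor2), 2))  # 0.8
print(round(cohens_kappa(auditor1, auditor2), 2))       # 0.52
```

Note that kappa (0.52) is lower than raw agreement (0.80) because both auditors answer "yes" most of the time, so much of the raw agreement is expected by chance; this is why the study reports K alongside PA.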
Across all 3 auditors, test-retest reliability for objective items was PA 89% and K 0.75, and for subjective items PA 82% and K 0.63. In contrast, inter-rater reliability was moderate, at PA 82% and K 0.59 for objective items and PA 70% and K 0.44 for subjective items. Element-level analysis indicated a wide range of PA and K values, with the inter-rater reliability of many elements rated as poor. ICC calculations indicated moderate reliability for the domains of standards of practice, accreditation, and overall file scores.
The file audit process has substantial test-retest reliability and moderate inter-rater reliability. Recommendations are made to improve reliability outcomes, including modifying the audit checklist to improve the clarity of its elements, and enhancing the uniformity of auditor responses through increased training supported by the preparation of an audit guidebook.