Guise Jeanne-Marie, Deering Shad H, Kanki Barbara G, Osterweil Patricia, Li Hong, Mori Motomi, Lowe Nancy K
Medical Informatics and Clinical Epidemiology, and Public Health & Preventive Medicine, Oregon Health & Science University, Portland, OR 97239-3098, USA.
Simul Healthc. 2008 Winter;3(4):217-23. doi: 10.1097/SIH.0b013e31816fdd0a.
Human factors and teamwork are major contributors to sentinel events. A major barrier to improving human factors and teamwork is the paucity of objective, validated measurement tools. Our goal was to develop a brief tool that could be used to objectively evaluate teamwork in the field during short clinical team simulations and in everyday clinical care.
A pilot validation study. Standardized videos were created demonstrating poor, average, and excellent teamwork by an obstetric team in a common clinical scenario (shoulder dystocia). Three evaluators, all trained in Crew Resource Management and unaware of the assigned teamwork level, independently reviewed the videos and evaluated teamwork using the Clinical Teamwork Scale (CTS). Statistical analysis included calculation of the kappa statistic and Kendall coefficient of concordance to evaluate agreement and score concordance among raters, and the intraclass correlation coefficient (ICC) to evaluate interrater reliability. The reliability of the tool was further evaluated by estimating the variance of each component of the tool based on generalizability theory.
There was substantial agreement (kappa 0.78) and score concordance (Kendall coefficient 0.95) among raters, and excellent interrater reliability (intraclass correlation coefficient 0.98). The largest share of score variance among raters was attributable to the rater/item interaction.
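As an illustration, the three agreement statistics reported above can be sketched in Python. The abstract does not specify which variants were used, so the choices below are assumptions: Fleiss' kappa (a common kappa generalization for three raters), Kendall's W without tie correction, and the two-way random-effects single-measure ICC(2,1); all function names and example data are hypothetical.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa from an (n_subjects x n_categories) matrix of
    rating counts; each row sums to the number of raters."""
    counts = np.asarray(counts, dtype=float)
    n = counts.shape[0]
    m = counts[0].sum()                        # raters per subject
    p_j = counts.sum(axis=0) / (n * m)         # overall category proportions
    P_i = (np.square(counts).sum(axis=1) - m) / (m * (m - 1))
    P_bar, P_e = P_i.mean(), np.square(p_j).sum()
    return (P_bar - P_e) / (1 - P_e)

def kendalls_w(ranks):
    """Kendall's coefficient of concordance W from an
    (m_raters x n_items) matrix of ranks, assuming no ties."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    R = ranks.sum(axis=0)                      # rank sums per item
    S = np.square(R - R.mean()).sum()
    return 12 * S / (m**2 * (n**3 - n))

def icc2_1(scores):
    """Two-way random-effects, single-measure ICC(2,1) from an
    (n_subjects x k_raters) score matrix, via two-way ANOVA mean squares."""
    Y = np.asarray(scores, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    MSR = k * np.square(Y.mean(axis=1) - grand).sum() / (n - 1)   # subjects
    MSC = n * np.square(Y.mean(axis=0) - grand).sum() / (k - 1)   # raters
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
    MSE = np.square(resid).sum() / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
```

With perfect agreement among raters, each statistic equals 1; values near the abstract's results (kappa 0.78, W 0.95, ICC 0.98) would indicate substantial to excellent agreement.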
The CTS was developed to efficiently measure key clinical teamwork skills during simulation exercises and in everyday clinical care. It contains 15 questions in 5 clinical teamwork domains (communication, situational awareness, decision-making, role responsibility, and patient friendliness). It is easy to use and has construct validity, with median ratings consistently corresponding to the intended teamwork level. The CTS is a brief, straightforward, valid, reliable, and easy-to-use tool for measuring key factors of teamwork in simulated and clinical settings.