Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts; MGH Learning Laboratory, Massachusetts General Hospital, Boston, Massachusetts.
MGH Learning Laboratory, Massachusetts General Hospital, Boston, Massachusetts; Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Boston, Massachusetts.
J Surg Res. 2014 Jul;190(1):22-8. doi: 10.1016/j.jss.2014.04.024. Epub 2014 Apr 18.
High-quality teamwork among operating room (OR) professionals is key to efficient and safe practice. Quantification of teamwork facilitates feedback, assessment, and improvement. Several valid and reliable instruments are available for assessing separate OR disciplines and teams. We sought to determine the most feasible approach for routine documentation of teamwork in in situ OR simulations. We compared rater agreement, hypothetical training costs, and feasibility ratings from five clinicians and two nonclinicians using instruments for assessment of separate OR groups and of whole teams.
Five teams of anesthesia or surgery residents and OR nurses (RNs) or surgical technicians were videotaped in simulations of an epigastric hernia repair in which the patient develops malignant hyperthermia. Two anesthesiologists, one OR clinical RN specialist, one educational psychologist, one simulation specialist, and one general surgeon discussed and then independently completed the Anesthesiologists' Non-Technical Skills (ANTS), Non-Technical Skills for Surgeons (NOTSS), Scrub Practitioners' List of Intraoperative Non-Technical Skills (SPLINTS), and Observational Teamwork Assessment for Surgery (OTAS) forms to rate the nontechnical performance of anesthesiologists, surgeons, nurses, technicians, and the whole team.
Intraclass correlations of agreement ranged from 0.17 to 0.85. Clinicians' agreement did not differ from nonclinicians'. Published rater training times were 4 h for Anesthesiologists' Non-Technical Skills and for the Scrub Practitioners' List of Intraoperative Non-Technical Skills, 2.5 h for Non-Technical Skills for Surgeons, and 15.5 h for Observational Teamwork Assessment for Surgery. Estimated costs to train one rater to use all instruments ranged from $442 for a simulation specialist to $6006 for a general surgeon.
Additional training is needed to achieve higher levels of agreement; however, its costs may be prohibitive. The most cost-effective model for real-time OR teamwork assessment may be to pair a simulation technician with one clinical rater, allowing complete documentation of all participants.