Rittenberger Jon C, Martin Jacqueline R, Kelly Lori J, Roth Ronald N, Hostler David, Callaway Clifton W
Department of Emergency Medicine, University of Pittsburgh, 230 McKee Place, Suite 400, Pittsburgh, PA 15213, USA.
Resuscitation. 2006 Sep;70(3):410-5. doi: 10.1016/j.resuscitation.2005.12.015. Epub 2006 Jun 27.
Witnessed collapse and bystander CPR are the variables most frequently associated with good outcome from out-of-hospital cardiac arrest (OOHCA). The reliability of abstracting witnessed collapse and bystander CPR from prehospital Emergency Medical Services (EMS) patient care records (PCRs) is not known. We sought to determine the inter-rater reliability for different methods of ascertaining and defining witnessed collapse and performance of bystander CPR.
A sample of 100 PCRs for patients with OOHCA was selected at random from a pool of 325 PCRs completed between May 2003 and January 2005. Paramedics used a drop-down menu to indicate witnessed collapse and bystander CPR, and completed a narrative description of the event. An on-scene EMS physician also completed a data sheet. Each PCR was examined independently by two evaluators to determine the presence of witnessed collapse and bystander CPR. A consensus rating was reached by three other reviewers using all available data sources. Inter-rater agreement was quantified using the unweighted kappa statistic.
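As a rough illustration of how this kind of agreement can be quantified, the sketch below computes an unweighted Cohen's kappa for two raters' binary judgments (for example, witnessed collapse recorded yes/no). The ratings, variable names, and the use of scikit-learn's cohen_kappa_score are illustrative assumptions, not data or methods from the study.

    # Minimal sketch: unweighted Cohen's kappa for two raters' binary ratings.
    # The ratings below are made-up examples, not data from the study.
    from sklearn.metrics import cohen_kappa_score

    # 1 = witnessed collapse recorded, 0 = not recorded (hypothetical labels)
    evaluator_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    evaluator_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

    # weights=None yields the unweighted kappa:
    #   kappa = (p_o - p_e) / (1 - p_e)
    # where p_o is the observed agreement and p_e is the agreement expected by chance.
    kappa = cohen_kappa_score(evaluator_a, evaluator_b, weights=None)
    print(f"unweighted kappa = {kappa:.2f}")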
For witnessed collapse, there was substantial agreement between the following: the individual evaluators (kappa=0.76, S.D.=0.07), the individual evaluators and the consensus group (kappa=0.61, S.D.=0.07 and 0.66, S.D.=0.07), and the physician and the consensus group (kappa=0.68, S.D.=0.08). Agreement between the individual evaluators and the physician was fair to moderate (kappa=0.38, S.D.=0.07 and 0.44, S.D.=0.07). Agreement between the individual evaluators, the physician, the consensus group and the PCR drop-down menu was fair to moderate (kappa range 0.33, S.D.=0.09 to 0.54, S.D.=0.09). For bystander CPR, there was substantial agreement between the individual evaluators and the consensus group (kappa=0.64, S.D.=0.07 and 0.63, S.D.=0.06) and between the physician and the consensus group (kappa=0.61, S.D.=0.08). Agreement between the two individual evaluators was moderate (kappa=0.59, S.D.=0.07). Agreement between the physician and the individual evaluators was fair (kappa=0.36, S.D.=0.07 and 0.38, S.D.=0.07). The PCR drop-down menu had moderate to substantial agreement with the individual evaluators, physician, and consensus group (kappa range 0.50, S.D.=0.09 to 0.75, S.D.=0.09).
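The descriptors used above (fair, moderate, substantial) are consistent with the benchmarks commonly attributed to Landis and Koch, although the abstract does not state which convention was applied. The small helper below is a sketch under that assumed convention, showing how a kappa value maps to such a label.

    def agreement_label(kappa: float) -> str:
        """Map a kappa value to a Landis-Koch style descriptor (assumed convention)."""
        if kappa < 0.00:
            return "poor"
        if kappa <= 0.20:
            return "slight"
        if kappa <= 0.40:
            return "fair"
        if kappa <= 0.60:
            return "moderate"
        if kappa <= 0.80:
            return "substantial"
        return "almost perfect"

    print(agreement_label(0.76))  # "substantial", as reported for the two evaluators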
Determination of witnessed collapse and bystander CPR during OOHCA may be less reliable than previously thought, and differences between methods of rating could influence study results.