Gardner Andrew J, Levi Christopher R, Iverson Grant L
Centre for Stroke and Brain Injury, School of Medicine and Public Health, University of Newcastle, Callaghan, New South Wales, Australia.
Hunter New England Local Health District Sports Concussion Program, John Hunter Hospital, Newcastle, New South Wales, Australia.
Sports Med Open. 2017 Dec;3(1):26. doi: 10.1186/s40798-017-0093-0. Epub 2017 Jul 14.
Several professional contact and collision sports have recently introduced sideline video review for club medical staff to help identify and manage concussions. As a result, reviewing video footage on the sideline is increasingly relied upon to improve the identification of possible injury. However, no standardized method for reviewing such video footage in rugby league has yet been published. The aim of this study was to evaluate whether independent raters reliably agreed on injury characterization when using a standardized observational instrument to rate video footage of National Rugby League (NRL) concussions.
Video footage of 25 concussions was randomly selected from a pool of 80 medically diagnosed concussions from the 2013-2014 NRL seasons. Four raters (two naïve and two expert) independently viewed the footage and completed the Observational Review and Analysis of Concussion form for this inter-rater reliability study. Inter-rater reliability was calculated using Cohen's kappa (κ) and intraclass correlation coefficient (ICC) statistics. The two naïve raters and the two expert raters were compared with one another separately.
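The abstract does not state which software computed the reliability statistics; as a minimal sketch, Cohen's κ for two raters scoring the same items (assuming categorical, e.g. present/absent, ratings per sign) can be computed from observed versus chance-expected agreement:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from each rater's
    marginal category frequencies.
    """
    assert len(r1) == len(r2) and len(r1) > 0
    n = len(r1)
    categories = set(r1) | set(r2)
    # Observed proportion of items on which the raters agree
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from the product of marginal proportions
    p_e = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Illustrative (hypothetical) ratings of one sign across ten clips:
rater_a = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0]
rater_b = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(rater_a, rater_b), 2))  # prints 0.6
```

In practice a library implementation (e.g. `sklearn.metrics.cohen_kappa_score`) would typically be used; the ICC for continuous or ordinal elements requires a separate variance-components model not sketched here.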
A considerable number of components showed almost perfect agreement (κ or ICC ≥ 0.9): 9 of 22 (41%) components for naïve raters and 21 of 22 (95%) for expert raters. For the concussion signs, however, agreement was mostly moderate (κ 0.6-0.79); both the naïve and the expert raters had moderate agreement on 4 of 6 (67%) concussion signs. The sign that was most difficult to agree on was blank or vacant stare, which had weak agreement (κ 0.4-0.59) for both naïve and expert raters.
The new Observational Review and Analysis of Concussion (ORAC) Form appears to have value when used by expert raters, but less so for naïve raters. The ORAC Form has high inter-rater agreement for most data elements and can be used by expert raters evaluating video footage of possible concussion in the NRL.