Kriek Jacobus J, Govender Shunmugam
Orthopaedics, University of KwaZulu-Natal, Durban, KwaZulu-Natal, South Africa.
Eur Spine J. 2006 Aug;15(8):1239-46. doi: 10.1007/s00586-005-0002-y. Epub 2005 Dec 21.
This study was designed to assess the inter-observer reliability and intra-observer reproducibility of standard radiographic evaluation of 150 thoraco-lumbar fractures using the AO classification. The influence of clinical information on agreement levels was also evaluated. Six observers (two junior and four senior residents) evaluated the radiographic images. Each observer classified the injuries as type A, B, or C according to the AO classification system, and the levels of agreement were documented. After 3 months the injuries were classified again, this time with the addition of each patient's clinical findings, and the level of agreement was re-evaluated. Agreement was measured using Cohen's kappa statistic. The overall inter-observer agreement was rated as fair (0.291) in the first session and moderate (0.403) in the second. Intra-observer values ranged from slight (0.181) to moderate (0.488). The increased agreement in the second session was attributed to the value of the additional clinical information, the learning curve of the junior residents, and the simplicity of the classification.
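For readers unfamiliar with the agreement statistic used above, the following is a minimal sketch of how Cohen's kappa is computed for two raters classifying cases as type A, B, or C. The ratings here are hypothetical, not the study's data, and the function is a plain implementation of the standard formula kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance.

```python
from collections import Counter

def cohen_kappa(ratings1, ratings2):
    """Cohen's kappa for two raters over the same cases.

    ratings1, ratings2: equal-length sequences of category labels
    (e.g. AO types "A", "B", "C").
    """
    n = len(ratings1)
    assert n == len(ratings2) and n > 0

    # Observed proportion of cases where the raters agree.
    p_o = sum(a == b for a, b in zip(ratings1, ratings2)) / n

    # Chance agreement: for each category, the product of the two
    # raters' marginal proportions, summed over categories.
    c1, c2 = Counter(ratings1), Counter(ratings2)
    p_e = sum((c1[cat] / n) * (c2[cat] / n) for cat in c1.keys() | c2.keys())

    return (p_o - p_e) / (1 - p_e)

# Hypothetical classifications of 10 fractures by two observers.
rater1 = ["A", "A", "B", "B", "C", "A", "C", "B", "A", "A"]
rater2 = ["A", "B", "B", "B", "C", "A", "C", "A", "A", "A"]
kappa = cohen_kappa(rater1, rater2)  # ~0.677 for these ratings
```

On the conventional Landis-and-Koch scale used in the abstract (slight, fair, moderate, substantial, almost perfect), a kappa of 0.403 falls in the moderate band and 0.291 in the fair band.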