Schock Justus, Truhn Daniel, Abrar Daniel B, Merhof Dorit, Conrad Stefan, Post Manuel, Mittelstrass Felix, Kuhl Christiane, Nebelung Sven
Department of Diagnostic and Interventional Radiology, University Hospital Düsseldorf, Düsseldorf, Germany (J.S., D.B.A., S.N.); Institute of Computer Vision and Imaging, RWTH University Aachen, Pauwelsstrasse 30, 52072 Aachen, Germany (J.S., D.M.); Department of Diagnostic and Interventional Radiology, University Hospital Aachen, Aachen, Germany (D.T., M.P., F.M., C.K., S.N.); and Faculty of Mathematics and Natural Sciences, Institute of Informatics, Heinrich Heine University Düsseldorf, Düsseldorf, Germany (S.C.).
Radiol Artif Intell. 2020 Dec 23;3(2):e200198. doi: 10.1148/ryai.2020200198. eCollection 2021 Mar.
To develop and validate a deep learning-based method for automatic quantitative analysis of lower-extremity alignment.
In this retrospective study, bilateral long-leg radiographs (LLRs) obtained between January and September 2018 from 255 patients were included. For the training data (n = 109), a U-Net convolutional neural network was trained to segment the femur and tibia, with manual segmentation as the reference. For the validation data (n = 40), model parameters were optimized. After identification of anatomic landmarks, anatomic and mechanical axes were derived and used to quantify alignment through the hip-knee-ankle angle (HKAA) and femoral anatomic-mechanical angle (AMA). For the testing data (n = 106), algorithm-based angle measurements were compared with reference measurements by two radiologists. Angles and measurement times for 30 random radiographs were compared by using repeated-measures analysis of variance and one-way analysis of variance, whereas correlations were quantified by using Pearson and intraclass correlation coefficients.
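The HKAA described above is defined by three landmarks along the mechanical axes: the hip, knee, and ankle joint centers. As a minimal sketch of how such an angle can be computed from 2D landmark coordinates (the function name, landmark inputs, and the convention of reporting the angle relative to 180° neutral alignment are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def hka_angle(hip, knee, ankle):
    """Hip-knee-ankle angle in degrees, measured between the femoral
    mechanical axis (hip center -> knee center) and the tibial mechanical
    axis (knee center -> ankle center). 180 deg corresponds to neutral
    alignment; smaller/larger values indicate varus/valgus deviation."""
    femoral = np.asarray(knee, float) - np.asarray(hip, float)
    tibial = np.asarray(ankle, float) - np.asarray(knee, float)
    cos = np.dot(femoral, tibial) / (np.linalg.norm(femoral) * np.linalg.norm(tibial))
    # Clip guards against floating-point values just outside [-1, 1].
    return 180.0 - np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Collinear landmarks yield a neutral 180-degree angle.
print(round(hka_angle((0, 0), (0, 100), (0, 200)), 1))  # 180.0
```

The AMA would follow the same vector-angle pattern, using the femoral anatomic axis (shaft line) against the femoral mechanical axis.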
Bilateral LLRs of 255 patients (mean age, 26 years ± 23 [standard deviation]; range, 0-88 years; 157 male patients) were included. Mean Sørensen-Dice coefficients for segmentation were 0.97 ± 0.09 for the femur and 0.96 ± 0.11 for the tibia. Mean HKAAs and AMAs as measured by the readers and the algorithm ranged from 0.05° to 0.11° (P = .5) and from 4.82° to 5.43° (P < .001). Interreader correlation coefficients ranged from 0.918 to 0.995 (all P < .001), and agreement was almost perfect (intraclass correlation coefficient range, 0.87-0.99). Automatic analysis was faster than the two radiologists' manual measurements (3 vs 36 vs 35 seconds, P < .001).
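The Sørensen-Dice coefficients reported above measure overlap between the predicted and manual segmentation masks. A minimal sketch of the standard definition, 2|A∩B| / (|A| + |B|) on binary masks (the function name and the convention of returning 1.0 for two empty masks are assumptions, not taken from the paper):

```python
import numpy as np

def dice(pred, truth):
    """Sørensen-Dice coefficient between two binary masks:
    2 * |intersection| / (|pred| + |truth|), in [0, 1]."""
    pred = np.asarray(pred, bool)
    truth = np.asarray(truth, bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Two masks sharing 2 of their 3 foreground pixels each: 2*2 / (3+3) = 2/3.
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
print(dice(a, b))  # 0.6666666666666666
```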
Fully automated analysis of LLRs yielded accurate results across a wide range of clinical and pathologic indications and is fast enough to enhance and accelerate clinical workflows. © RSNA, 2020. See also the commentary by Andreisek in this issue.