
Convolutional Neural Networks for Automated Fracture Detection and Localization on Wrist Radiographs.

Author Information

Thian Yee Liang, Li Yiting, Jagmohan Pooja, Sia David, Chan Vincent Ern Yao, Tan Robby T

Affiliations

Department of Diagnostic Imaging (Y.L.T., P.J., D.S., V.E.Y.C.) and Department of Electrical and Computer Engineering (Y.L., R.T.T.), National University of Singapore, 5 Lower Kent Ridge Rd, Singapore 119074; and Science Division, Yale-NUS College, Singapore (R.T.T.).

Publication Information

Radiol Artif Intell. 2019 Jan 30;1(1):e180001. doi: 10.1148/ryai.2019180001. eCollection 2019 Jan.

Abstract

PURPOSE

To demonstrate the feasibility and performance of an object detection convolutional neural network (CNN) for fracture detection and localization on wrist radiographs.

MATERIALS AND METHODS

Institutional review board approval was obtained with waiver of consent for this retrospective study. A total of 7356 wrist radiographic studies were extracted from a hospital picture archiving and communication system. Radiologists annotated all radius and ulna fractures with bounding boxes. The dataset was split into training (90%) and validation (10%) sets and used to train fracture localization models for frontal and lateral images. Inception-ResNet Faster R-CNN architecture was implemented as a deep learning model. The models were tested on an unseen test set of 524 consecutive emergency department wrist radiographic studies with two radiologists in consensus as the reference standard. Per-fracture, per-image (ie, per-view), and per-study sensitivity and specificity were determined. Area under the receiver operating characteristic curve (AUC) analysis was performed.
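The evaluation pipeline described above rests on two simple building blocks: a 90%/10% study-level split and per-case sensitivity/specificity. A minimal stdlib-only Python sketch of both is below; the function names and the fixed seed are illustrative assumptions, not from the paper, and the actual study split may have been stratified differently.

```python
import random

def split_studies(study_ids, train_frac=0.90, seed=42):
    """Shuffle and split study IDs into training and validation sets,
    mirroring the 90%/10% split described in the abstract.
    Splitting at the study level keeps all views of one patient together."""
    ids = list(study_ids)
    random.Random(seed).shuffle(ids)
    cut = int(len(ids) * train_frac)
    return ids[:cut], ids[cut:]

def sensitivity(tp, fn):
    """Sensitivity (recall): fraction of fracture-positive cases flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Specificity: fraction of fracture-free cases correctly cleared."""
    return tn / (tn + fp)

# 7356 studies -> 6620 training, 736 validation under a plain 90/10 cut.
train, val = split_studies(range(7356))
print(len(train), len(val))
```

Per-image and per-study metrics follow the same formulas; only the unit of counting (view vs. whole study) changes.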

RESULTS

The model detected and correctly localized 310 (91.2%) of 340 and 236 (96.3%) of 245 of all radius and ulna fractures on the frontal and lateral views, respectively. The per-image sensitivity, specificity, and AUC were 95.7% (95% confidence interval [CI]: 92.4%, 97.8%), 82.5% (95% CI: 77.4%, 86.8%), and 0.918 (95% CI: 0.894, 0.941), respectively, for the frontal view and 96.7% (95% CI: 93.6%, 98.6%), 86.4% (95% CI: 81.9%, 90.2%), and 0.933 (95% CI: 0.912, 0.954), respectively, for the lateral view. The per-study sensitivity, specificity, and AUC were 98.1% (95% CI: 95.6%, 99.4%), 72.9% (95% CI: 67.1%, 78.2%), and 0.895 (95% CI: 0.870, 0.920), respectively.

CONCLUSION

The ability of an object detection CNN to detect and localize radius and ulna fractures on wrist radiographs with high sensitivity and specificity was demonstrated. © RSNA, 2019.


