Application of Deep Learning to Reduce the Rate of Malignancy Among BI-RADS 4A Breast Lesions Based on Ultrasonography.

Affiliations

Department of Medical Ultrasound, Fudan University Shanghai Cancer Center, Shanghai, China; Department of Oncology, Shanghai Medical College, Fudan University, Shanghai, China.

Department of Applied Mathematics, School of Science, Xi'an Jiaotong-Liverpool University, Suzhou, China.

Publication Information

Ultrasound Med Biol. 2022 Nov;48(11):2267-2275. doi: 10.1016/j.ultrasmedbio.2022.06.019. Epub 2022 Aug 30.

Abstract

The aim of the work described here was to develop an ultrasound (US) image-based deep learning model to reduce the rate of malignancy among breast lesions diagnosed as category 4A of the Breast Imaging-Reporting and Data System (BI-RADS) during pre-operative US examination. A total of 479 breast lesions diagnosed as BI-RADS 4A on pre-operative US were enrolled. Postoperative pathology confirmed 362 benign and 117 malignant lesions, a malignancy rate of 24.4%. US images were collected from the database server and then randomly divided into training and testing cohorts at a ratio of 4:1. To classify malignant and benign tumors diagnosed as BI-RADS 4A on US, four deep learning models were developed: MobileNet, DenseNet121, Xception and Inception V3. The performance of the deep learning models was compared using the area under the receiver operating characteristic curve (AUROC), accuracy, sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). The robustness of the models was also evaluated by five-fold cross-validation. Among the four models, MobileNet proved to be the optimal model, with the best performance in classifying benign and malignant lesions among BI-RADS 4A breast lesions. The AUROC, accuracy, sensitivity, specificity, PPV and NPV of the optimal model in the testing cohort were 0.897, 0.913, 0.926, 0.899, 0.958 and 0.784, respectively. About 14.4% of patients were expected to be upgraded to BI-RADS 4B on US with the assistance of the MobileNet model. The MobileNet deep learning model can help to reduce the rate of malignancy among BI-RADS 4A breast lesions in pre-operative US examinations, which is valuable to clinicians in tailoring treatment for suspicious breast lesions identified on US.
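The abstract describes the modeling pipeline only at a high level. The following is a minimal sketch, assuming a TensorFlow/Keras implementation, of how an ImageNet-pretrained MobileNet backbone could be fine-tuned for the binary benign-vs-malignant task and scored with the same metrics (AUROC, accuracy, sensitivity, specificity, PPV and NPV). The image size, optimizer, classification threshold and training settings are illustrative assumptions, not values reported in the paper.

```python
# Hedged sketch: MobileNet-based binary classifier for BI-RADS 4A ultrasound
# images, following the study's general setup (pretrained backbone, 4:1
# train/test split, benign-vs-malignant output). Hyperparameters are assumed.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.metrics import roc_auc_score, confusion_matrix

IMG_SIZE = (224, 224)   # assumed input size for the pretrained backbone
BATCH_SIZE = 32         # assumed batch size

def build_model():
    """MobileNet backbone with a small binary classification head."""
    base = tf.keras.applications.MobileNet(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dropout(0.5)(x)
    out = layers.Dense(1, activation="sigmoid")(x)   # predicted P(malignant)
    model = models.Model(base.input, out)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auroc")])
    return model

def evaluate(y_true, y_prob, threshold=0.5):
    """AUROC, accuracy, sensitivity, specificity, PPV and NPV, as reported in the paper."""
    y_prob = np.asarray(y_prob)
    y_pred = (y_prob >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "AUROC": roc_auc_score(y_true, y_prob),
        "Accuracy": (tp + tn) / (tp + tn + fp + fn),
        "Sensitivity": tp / (tp + fn),
        "Specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Example usage with pre-loaded image arrays X and labels y (the curated US
# image database is not reproduced here); test_size=0.2 matches the 4:1 split.
# from sklearn.model_selection import train_test_split
# x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y)
# model = build_model()
# model.fit(x_train, y_train, batch_size=BATCH_SIZE, epochs=30, validation_split=0.1)
# metrics = evaluate(y_test, model.predict(x_test).ravel())
```

The five-fold cross-validation used in the study to check robustness could be layered on top of build_model with a stratified k-fold splitter; it is omitted here for brevity.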
