Department of Biomedical Engineering, School of Precision Instrument and Opto-electronics Engineering, Tianjin University, Tianjin, 300072, China.
Department of Radiotherapy, Yantai Yuhuangding Hospital, No. 20 Yuhuangding East Road, Yantai, 264000, Shandong, China.
Eur Radiol. 2024 Feb;34(2):917-927. doi: 10.1007/s00330-023-10170-9. Epub 2023 Aug 23.
To develop an end-to-end deep neural network for the classification of contrast-enhanced mammography (CEM) images to facilitate breast cancer diagnosis in the clinic.
In this retrospective single-center study, patients who underwent CEM examinations from January 2019 to August 2021 were enrolled. A multi-feature fusion network combining low-energy (LE) and dual-energy subtracted (DES) images with dual-view and bilateral information was trained and tested for breast lesion classification on a large CEM dataset covering a diversity of breast tumors. Its generalization performance was further evaluated on two external datasets. Results were reported using AUC, accuracy, sensitivity, and specificity.
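The abstract does not specify the fusion architecture in detail, but the described design (per-view features from LE and DES images of both views and both breasts, combined before a single malignancy head) can be sketched as a late-fusion classifier. The code below is a minimal, hypothetical NumPy sketch: `extract_features` stands in for a CNN backbone, and all names and dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(image, dim=8):
    # Stand-in for a CNN backbone (hypothetical; the paper's actual
    # backbone is not described in the abstract): a fixed random
    # projection of the flattened image to a feature vector.
    proj = rng.normal(size=(image.size, dim)) / np.sqrt(image.size)
    return image.ravel() @ proj

def fuse_and_classify(views, weights, bias):
    # Late fusion: concatenate per-view features across
    # {LE, DES} x {CC, MLO} x {left, right}, then apply a
    # logistic head to produce a malignancy probability.
    fused = np.concatenate([extract_features(v) for v in views])
    logit = fused @ weights + bias
    return 1.0 / (1.0 + np.exp(-logit))

# Eight hypothetical input views: 2 image types x 2 views x 2 sides.
views = [rng.normal(size=(16, 16)) for _ in range(8)]
w = rng.normal(size=8 * len(views))
p = fuse_and_classify(views, w, 0.0)
```

A single-input ("no-fusion") baseline, as compared in the results, would simply pass one view to the same head instead of the concatenated feature vector.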
A total of 2496 patients (mean age, 53 years ± 12 (standard deviation)) were included and divided into a training set (1718), a validation set (255), and a testing set (523). The proposed CEM-based multi-feature fusion network achieved the best diagnostic performance, with an AUC of 0.96 (95% confidence interval (CI): 0.95, 0.97), compared with the no-fusion model, the left-right fusion model, and the multi-feature fusion network with only LE image inputs. Our models reached an AUC of 0.90 (95% CI: 0.85, 0.94) on a full-field digital mammography (FFDM) external dataset (86 patients) and an AUC of 0.92 (95% CI: 0.89, 0.95) on a CEM external dataset (193 patients).
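The 95% CIs reported for the AUCs are typically obtained by bootstrapping over patients, though the abstract does not state the method used. A minimal sketch of that common approach, with synthetic labels and scores (illustrative only, not the study's data):

```python
import numpy as np

def auc(y_true, y_score):
    # AUC via the Mann-Whitney formulation: the probability that a
    # random positive case scores above a random negative case.
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

def bootstrap_ci(y_true, y_score, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap over patients for a 95% CI on the AUC.
    rng = np.random.default_rng(seed)
    stats = []
    n = len(y_true)
    while len(stats) < n_boot:
        idx = rng.integers(0, n, size=n)
        if y_true[idx].min() == y_true[idx].max():
            continue  # resample must contain both classes
        stats.append(auc(y_true[idx], y_score[idx]))
    return tuple(np.quantile(stats, [alpha / 2, 1 - alpha / 2]))

# Toy example with synthetic data (not the study's results).
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=200)
s = y * 0.8 + rng.normal(scale=0.5, size=200)
point = auc(y, s)
lo, hi = bootstrap_ci(y, s)
```

The percentile bootstrap makes no distributional assumption; resampling at the patient level keeps each patient's label and score paired.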
The developed multi-feature fusion neural network achieved high performance in CEM image classification and was able to facilitate CEM-based breast cancer diagnosis.
Compared with low-energy images, CEM images have greater sensitivity and similar specificity in malignant breast lesion detection. The multi-feature fusion neural network is a promising computer-aided diagnostic tool for the clinical diagnosis of breast cancer.
• Deep convolutional neural networks have the potential to facilitate contrast-enhanced mammography-based breast cancer diagnosis.
• The multi-feature fusion neural network reaches high accuracies in the classification of contrast-enhanced mammography images.
• The developed model is a promising diagnostic tool to facilitate clinical breast cancer diagnosis.