Liu Baoqin, Liu Shouyao, Cao Zijian, Zhang Junning, Pu Xiaoqi, Yu Junjie
Department of TCM gynecology, China-Japan Friendship Hospital, Beijing, China.
Department of TCM surgery, China-Japan Friendship Hospital, Beijing, China.
Front Bioeng Biotechnol. 2025 Jun 25;13:1526260. doi: 10.3389/fbioe.2025.1526260. eCollection 2025.
Breast cancer is the most common malignant tumor in women worldwide, and early detection is crucial to improving patient prognosis. However, conventional ultrasound examination relies heavily on physician judgment, and diagnostic results are easily influenced by individual experience, leading to frequent misdiagnoses or missed diagnoses. There is therefore a pressing need for an automated, highly accurate diagnostic method to support the detection and classification of breast cancer. This study aims to build a reliable deep learning model for benign-malignant classification of breast ultrasound images and thereby improve the accuracy and consistency of diagnosis.
This study proposed an innovative deep learning model, RcdNet. RcdNet combines depthwise separable convolutions with the Convolutional Block Attention Module (CBAM) to strengthen the identification of key lesion regions in ultrasound images. The model was internally validated and externally tested on an independent set, and was compared with commonly used models including ResNet, MobileNet, RegNet, ViT, and ResNeXt to verify its performance advantage in the benign-malignant classification task. In addition, the model's attention regions were analyzed by heat map visualization to evaluate its clinical interpretability.
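The abstract does not disclose RcdNet's internal configuration. As a minimal illustration of the two components it names, the PyTorch sketch below pairs a depthwise separable convolution with a CBAM module; the channel counts, layer ordering, and the RcdBlock wrapper are illustrative assumptions, not the published architecture.

# Minimal PyTorch sketch of the two components named in the abstract:
# a depthwise separable convolution and a CBAM attention module.
# Layer sizes and block layout are illustrative assumptions only.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depthwise (per-channel) conv followed by a 1x1 pointwise conv."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, padding=1,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention (Woo et al., 2018)."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        # shared MLP for channel attention
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        self.spatial = nn.Conv2d(2, 1, spatial_kernel,
                                 padding=spatial_kernel // 2, bias=False)

    def forward(self, x):
        # channel attention: avg- and max-pooled descriptors share the MLP
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # spatial attention: channel-wise mean and max maps through a 7x7 conv
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))


class RcdBlock(nn.Module):
    """Hypothetical block pairing the two components, for illustration only."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = DepthwiseSeparableConv(in_ch, out_ch)
        self.cbam = CBAM(out_ch)

    def forward(self, x):
        return self.cbam(self.conv(x))


if __name__ == "__main__":
    block = RcdBlock(32, 64)
    print(block(torch.randn(1, 32, 112, 112)).shape)  # torch.Size([1, 64, 112, 112])

Depthwise separable convolutions reduce parameter count relative to standard convolutions, while CBAM reweights feature maps along the channel and spatial dimensions, which is consistent with the abstract's emphasis on highlighting key lesion regions.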
The experimental results show that RcdNet outperforms other mainstream deep learning models, including ResNet, MobileNet, and ResNeXt, across all key evaluation metrics. On the external test set, RcdNet achieved an accuracy of 0.9351, a precision of 0.9168, a recall of 0.9495, and an F1-score of 0.9290, demonstrating superior classification performance and strong generalization ability. Furthermore, heat map visualizations confirm that RcdNet accurately attends to clinically relevant features such as tumor edges and irregular structures, aligning well with radiologists' diagnostic focus and enhancing the interpretability and credibility of the model in clinical applications.
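The abstract reports the four summary metrics without the underlying confusion-matrix counts. For reference, they follow the standard binary-classification definitions, with malignant treated as the positive class; the generic Python sketch below shows how they are computed.

def classification_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard metrics from confusion-matrix counts
    (malignant = positive class)."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}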
The RcdNet model proposed in this study performs well in classifying benign and malignant breast ultrasound images, with high classification accuracy, strong generalization ability, and good interpretability. RcdNet can serve as an auxiliary diagnostic tool to help physicians screen for breast cancer quickly and accurately, improve the consistency and reliability of diagnosis, and provide strong support for early detection and precise diagnosis and treatment of breast cancer. Future work will focus on integrating RcdNet into real-time ultrasound diagnostic systems and exploring its potential in multi-modal imaging workflows.