
Effective BCDNet-based breast cancer classification model using hybrid deep learning with VGG16-based optimal feature extraction.

Authors

P Meenakshi Devi, A Muna, Ali Yasser, V Sumanth

Affiliations

Department of Information Technology, K.S.R. College of Engineering, Tiruchengode, Tamilnadu, 637215, India.

Centre for Research Impact & Outcome, Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, Punjab, 140401, India.

Publication

BMC Med Imaging. 2025 Jan 8;25(1):12. doi: 10.1186/s12880-024-01538-4.

DOI: 10.1186/s12880-024-01538-4
PMID: 39780045
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11707918/
Abstract

PROBLEM

Breast cancer is a leading cause of death among women, and early detection is crucial for improving survival rates. Manual breast cancer diagnosis is time-consuming and subjective. In addition, earlier computer-aided diagnosis (CAD) models mostly depend on handcrafted visual features that are difficult to generalize across ultrasound images acquired with different techniques. Previous works have used other imaging modalities, such as mammography and MRI, but these are costly and less portable than ultrasound; ultrasound imaging is also a non-invasive method commonly used for breast cancer screening. Hence, this paper presents a novel deep learning model, BCDNet, for classifying breast tumors as benign or malignant using ultrasound images.

AIM

The primary aim of the study is to design an effective breast cancer diagnosis model that can accurately classify tumors at an early stage, thereby reducing mortality. The model optimizes its weights and parameters with the RPAOSM-ESO algorithm to enhance accuracy and minimize the false-negative rate.

METHODS

The BCDNet model utilizes transfer learning from a pre-trained VGG16 network for feature extraction and employs an AHDNAM classification approach, which includes ASPP, DTCN, 1DCNN, and an attention mechanism. The RPAOSM-ESO algorithm is used to fine-tune the weights and parameters.
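As a rough, self-contained illustration of one ingredient of such a classification head, the sketch below applies additive attention pooling to a flattened VGG16-style feature map (a 7×7×512 tensor flattened to 49 spatial positions). The random scoring vector stands in for learned attention weights, and all shapes are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def attention_pool(features, scoring_vector):
    """Additive attention pooling over spatial positions.

    features: (positions, channels) array, e.g. a VGG16 7x7x512
    feature map flattened to (49, 512).
    scoring_vector: (channels,) vector scoring each spatial position.
    Returns a (channels,) attention-weighted feature descriptor.
    """
    scores = features @ scoring_vector             # (positions,)
    scores = scores - scores.max()                 # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax attention weights
    return alpha @ features                        # weighted sum over positions

rng = np.random.default_rng(0)
feat = rng.standard_normal((49, 512))   # stand-in for extracted VGG16 features
w = rng.standard_normal(512)            # stand-in for learned scoring weights
pooled = attention_pool(feat, w)
print(pooled.shape)  # (512,)
```

In a trained network the scoring vector would be learned jointly with the classifier, so positions containing tumor-relevant texture receive larger softmax weights than background positions.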

RESULTS

The RPAOSM-ESO-BCDNet-based breast cancer diagnosis model achieved an accuracy of 94.5%. This is higher than previous models such as DTCN (88.2%), 1DCNN (89.6%), MobileNet (91.3%), and ASPP-DTC-1DCNN-AM (93.8%), indicating that the designed RPAOSM-ESO-BCDNet classifies breast tumors more accurately than these baselines.
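Read in relative-error terms, the reported gain is larger than the raw percentages suggest: moving from the best baseline (93.8%) to 94.5% removes roughly 11% of the remaining errors. The figures below are taken directly from the abstract; only the arithmetic is added.

```python
# Reported accuracies from the abstract, in percent.
accuracies = {
    "DTCN": 88.2,
    "1DCNN": 89.6,
    "MobileNet": 91.3,
    "ASPP-DTC-1DCNN-AM": 93.8,
    "RPAOSM-ESO-BCDNet": 94.5,
}

# Best competing baseline vs. the proposed model.
best_baseline = max(a for m, a in accuracies.items() if m != "RPAOSM-ESO-BCDNet")
err_before = 100.0 - best_baseline                    # 6.2% error
err_after = 100.0 - accuracies["RPAOSM-ESO-BCDNet"]   # 5.5% error
reduction = (err_before - err_after) / err_before
print(f"relative error reduction: {reduction:.1%}")   # → 11.3%
```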

CONCLUSION

The BCDNet model, with its sophisticated feature extraction and classification techniques optimized by the RPAOSM-ESO algorithm, shows promise in accurately classifying breast tumors using ultrasound images. The study suggests that the model could be a valuable tool in the early detection of breast cancer, potentially saving lives and reducing the burden on healthcare systems.


Figures 1–12 (PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/b4012e537649/12880_2024_1538_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/b7a95f91aee3/12880_2024_1538_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/ebca62bc4f63/12880_2024_1538_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/453d6bdacad9/12880_2024_1538_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/764ce1a1a0f6/12880_2024_1538_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/0bda5f6ccfb2/12880_2024_1538_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/ac1666370e34/12880_2024_1538_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/0aa8740d37a0/12880_2024_1538_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/3ac8d9876e41/12880_2024_1538_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/2b7c3c25d211/12880_2024_1538_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/888be94c5e05/12880_2024_1538_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b7/11707918/ea949771936a/12880_2024_1538_Fig12_HTML.jpg

Similar Articles

1. Effective BCDNet-based breast cancer classification model using hybrid deep learning with VGG16-based optimal feature extraction.
BMC Med Imaging. 2025 Jan 8;25(1):12. doi: 10.1186/s12880-024-01538-4.
2. Brain tumor segmentation and detection in MRI using convolutional neural networks and VGG16.
Cancer Biomark. 2025 Mar;42(3):18758592241311184. doi: 10.1177/18758592241311184. Epub 2025 Apr 4.
3. BCDnet: Parallel heterogeneous eight-class classification model of breast pathology.
PLoS One. 2021 Jul 12;16(7):e0253764. doi: 10.1371/journal.pone.0253764. eCollection 2021.
4. ViT-MAENB7: An innovative breast cancer diagnosis model from 3D mammograms using advanced segmentation and classification process.
Comput Methods Programs Biomed. 2024 Dec;257:108373. doi: 10.1016/j.cmpb.2024.108373. Epub 2024 Aug 23.
5. Advancing breast ultrasound diagnostics through hybrid deep learning models.
Comput Biol Med. 2024 Sep;180:108962. doi: 10.1016/j.compbiomed.2024.108962. Epub 2024 Aug 13.
6. Role of inter- and extra-lesion tissue, transfer learning, and fine-tuning in the robust classification of breast lesions.
Sci Rep. 2024 Oct 1;14(1):22754. doi: 10.1038/s41598-024-74316-5.
7. Efficient hybrid heuristic adopted deep learning framework for diagnosing breast cancer using thermography images.
Sci Rep. 2025 Apr 19;15(1):13605. doi: 10.1038/s41598-025-96827-5.
8. Variational mode directed deep learning framework for breast lesion classification using ultrasound imaging.
Sci Rep. 2025 Apr 24;15(1):14300. doi: 10.1038/s41598-025-99009-5.
9. Breast ultrasound tumor image classification using image decomposition and fusion based on adaptive multi-model spatial feature fusion.
Comput Methods Programs Biomed. 2021 Sep;208:106221. doi: 10.1016/j.cmpb.2021.106221. Epub 2021 Jun 3.
10. Advanced feature learning and classification of microscopic breast abnormalities using a robust deep transfer learning technique.
Microsc Res Tech. 2024 Aug;87(8):1862-1888. doi: 10.1002/jemt.24557. Epub 2024 Mar 30.
