


Breast Cancer Classification in Automated Breast Ultrasound Using Multiview Convolutional Neural Network with Transfer Learning.

Author Affiliations

Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, Canada.

Department of Radiology, Research Institute of Clinical Medicine of Jeonbuk National University-Biomedical Research Institute of Jeonbuk National University Hospital, Jeonbuk National University Medical School, Jeonju City, Jeollabuk-Do, South Korea.

Publication Information

Ultrasound Med Biol. 2020 May;46(5):1119-1132. doi: 10.1016/j.ultrasmedbio.2020.01.001. Epub 2020 Feb 12.

DOI: 10.1016/j.ultrasmedbio.2020.01.001
PMID: 32059918
Abstract

To assist radiologists in breast cancer classification in automated breast ultrasound (ABUS) imaging, we propose a computer-aided diagnosis based on a convolutional neural network (CNN) that classifies breast lesions as benign or malignant. The proposed CNN adopts a modified Inception-v3 architecture to provide efficient feature extraction in ABUS imaging. Because ABUS images can be visualized in transverse and coronal views, the proposed CNN provides an efficient way to extract multiview features from both views. The proposed CNN was trained and evaluated on 316 breast lesions (135 malignant and 181 benign). An observer performance test was conducted to compare five human reviewers' diagnostic performance before and after referring to the predictions of the proposed CNN. Our method achieved an area under the curve (AUC) of 0.9468 with five-fold cross-validation, for which the sensitivity and specificity were 0.886 and 0.876, respectively. Compared with conventional machine learning-based feature extraction schemes, particularly principal component analysis (PCA) and histogram of oriented gradients (HOG), our method achieved a significant improvement in classification performance: the proposed CNN's AUC was more than 10% higher than those of PCA and HOG. During the observer performance test, all human reviewers' AUCs and sensitivities increased after referring to the classification results of the proposed CNN, and four of the five reviewers' AUCs improved significantly. The proposed CNN employing a multiview strategy shows promise for the diagnosis of breast cancer and could be used as a second reviewer to increase diagnostic reliability.
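The multiview idea in the abstract — extract features separately from the transverse and coronal ABUS views, then fuse them before a shared benign/malignant head — can be sketched as below. This is an illustrative reconstruction, not the authors' code: `TinyBackbone` is a hypothetical stand-in for the modified Inception-v3 described in the paper, which in practice would be a pretrained network fine-tuned via transfer learning.

```python
# Hedged sketch of a two-view (transverse + coronal) fusion classifier.
# TinyBackbone stands in for the paper's modified Inception-v3; the
# architecture details here are assumptions for illustration only.
import torch
import torch.nn as nn

class TinyBackbone(nn.Module):
    """Stand-in per-view feature extractor (replace with pretrained Inception-v3)."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 32, 1, 1)
        )
        self.fc = nn.Linear(32, feat_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

class MultiviewClassifier(nn.Module):
    """One backbone per view; concatenated features feed a shared head."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.transverse_net = TinyBackbone(feat_dim)
        self.coronal_net = TinyBackbone(feat_dim)
        self.head = nn.Linear(2 * feat_dim, 2)  # benign vs. malignant logits

    def forward(self, transverse, coronal):
        feats = torch.cat([self.transverse_net(transverse),
                           self.coronal_net(coronal)], dim=1)
        return self.head(feats)

model = MultiviewClassifier()
t = torch.randn(4, 1, 128, 128)  # batch of transverse-view patches (random demo data)
c = torch.randn(4, 1, 128, 128)  # matching coronal-view patches
logits = model(t, c)
print(logits.shape)  # torch.Size([4, 2])
```

Concatenation is only one plausible fusion strategy; the paper's exact fusion point and head design may differ.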

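The evaluation protocol reported above — five-fold cross-validation with AUC, sensitivity, and specificity on 316 lesions (135 malignant, 181 benign) — can be sketched as follows. This runs on synthetic scores with a trivial threshold model; the numbers it prints are not the paper's results, and the fold logic simply marks where training would occur.

```python
# Hedged sketch of five-fold cross-validation with AUC / sensitivity /
# specificity, using synthetic data shaped like the study's cohort.
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # fraction of (malignant, benign) pairs ranked correctly; ties count 0.5
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)
labels = np.array([1] * 135 + [0] * 181)  # 316 lesions, as in the study
# synthetic "CNN scores": malignant lesions tend to score higher
scores = rng.normal(loc=labels.astype(float), scale=0.7)

idx = rng.permutation(len(labels))
folds = np.array_split(idx, 5)  # five-fold split
aucs = []
for k in range(5):
    test = folds[k]
    # (training on the other four folds would happen here)
    aucs.append(auc(scores[test], labels[test]))

pred = (scores >= 0.5).astype(int)  # operating-point threshold (assumed)
sensitivity = (pred[labels == 1] == 1).mean()
specificity = (pred[labels == 0] == 0).mean()
print(f"mean AUC over 5 folds: {np.mean(aucs):.3f}")
print(f"sensitivity: {sensitivity:.3f}, specificity: {specificity:.3f}")
```

In the paper these metrics come from the CNN's predicted probabilities on held-out folds; only the bookkeeping, not the model, is shown here.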

Similar Articles

1
Breast Cancer Classification in Automated Breast Ultrasound Using Multiview Convolutional Neural Network with Transfer Learning.
Ultrasound Med Biol. 2020 May;46(5):1119-1132. doi: 10.1016/j.ultrasmedbio.2020.01.001. Epub 2020 Feb 12.
2
Distinction between benign and malignant breast masses at breast ultrasound using deep learning method with convolutional neural network.
Jpn J Radiol. 2019 Jun;37(6):466-472. doi: 10.1007/s11604-019-00831-5. Epub 2019 Mar 19.
3
Computer-aided diagnosis system for breast ultrasound images using deep learning.
Phys Med Biol. 2019 Dec 5;64(23):235013. doi: 10.1088/1361-6560/ab5093.
4
Classification of Breast Masses on Ultrasound Shear Wave Elastography using Convolutional Neural Networks.
Ultrason Imaging. 2020 Jul-Sep;42(4-5):213-220. doi: 10.1177/0161734620932609. Epub 2020 Jun 5.
5
Computer-aided diagnosis of breast ultrasound images using ensemble learning from convolutional neural networks.
Comput Methods Programs Biomed. 2020 Jul;190:105361. doi: 10.1016/j.cmpb.2020.105361. Epub 2020 Jan 25.
6
Role of inter- and extra-lesion tissue, transfer learning, and fine-tuning in the robust classification of breast lesions.
Sci Rep. 2024 Oct 1;14(1):22754. doi: 10.1038/s41598-024-74316-5.
7
Classification of Mammogram Images Using Multiscale all Convolutional Neural Network (MA-CNN).
J Med Syst. 2019 Dec 14;44(1):30. doi: 10.1007/s10916-019-1494-z.
8
Lymph Node Metastasis Prediction from Primary Breast Cancer US Images Using Deep Learning.
Radiology. 2020 Jan;294(1):19-28. doi: 10.1148/radiol.2019190372. Epub 2019 Nov 19.
9
Fus2Net: a novel Convolutional Neural Network for classification of benign and malignant breast tumor in ultrasound images.
Biomed Eng Online. 2021 Nov 18;20(1):112. doi: 10.1186/s12938-021-00950-z.
10
Detection and classification the breast tumors using mask R-CNN on sonograms.
Medicine (Baltimore). 2019 May;98(19):e15200. doi: 10.1097/MD.0000000000015200.

Cited By

1
Unlocking the power of L1 regularization: A novel approach to taming overfitting in CNN for image classification.
PLoS One. 2025 Sep 5;20(9):e0327985. doi: 10.1371/journal.pone.0327985. eCollection 2025.
2
Artificial intelligence-based automated breast ultrasound radiomics for breast tumor diagnosis and treatment: a narrative review.
Front Oncol. 2025 May 8;15:1578991. doi: 10.3389/fonc.2025.1578991. eCollection 2025.
3
Diagnosing Ankylosing Spondylitis via Architecture-Modified ResNet and Combined Conventional Magnetic Resonance Imagery.
J Imaging Inform Med. 2025 Mar 3. doi: 10.1007/s10278-025-01427-4.
4
Refining breast cancer classification: Customized attention integration approaches with dense and residual networks for enhanced detection.
Digit Health. 2025 Jan 6;11:20552076241309947. doi: 10.1177/20552076241309947. eCollection 2025 Jan-Dec.
5
Breast Cancer Detection on Dual-View Sonography via Data-Centric Deep Learning.
IEEE Open J Eng Med Biol. 2024 Sep 5;6:100-106. doi: 10.1109/OJEMB.2024.3454958. eCollection 2025.
6
Using the GoogLeNet deep-learning model to distinguish between benign and malignant breast masses based on conventional ultrasound: a systematic review and meta-analysis.
Quant Imaging Med Surg. 2024 Oct 1;14(10):7111-7127. doi: 10.21037/qims-24-679. Epub 2024 Sep 26.
7
Multiview deep learning networks based on automated breast volume scanner images for identifying breast cancer in BI-RADS 4.
Front Oncol. 2024 Sep 6;14:1399296. doi: 10.3389/fonc.2024.1399296. eCollection 2024.
8
Neural Network Pattern Recognition of Ultrasound Image Gray Scale Intensity Histograms of Breast Lesions to Differentiate Between Benign and Malignant Lesions: Analytical Study.
JMIR Biomed Eng. 2021 Jun 2;6(2):e23808. doi: 10.2196/23808.
9
Explainable DCNN Decision Framework for Breast Lesion Classification from Ultrasound Images Based on Cancer Characteristics.
Bioengineering (Basel). 2024 May 2;11(5):453. doi: 10.3390/bioengineering11050453.
10
Machine learning and new insights for breast cancer diagnosis.
J Int Med Res. 2024 Apr;52(4):3000605241237867. doi: 10.1177/03000605241237867.