Suppr 超能文献



Joint Localization and Classification of Breast Cancer in B-Mode Ultrasound Imaging via Collaborative Learning With Elastography.

Publication Information

IEEE J Biomed Health Inform. 2022 Sep;26(9):4474-4485. doi: 10.1109/JBHI.2022.3186933. Epub 2022 Sep 9.

DOI: 10.1109/JBHI.2022.3186933
PMID: 35763467
Abstract

Convolutional neural networks (CNNs) have been successfully applied in computer-aided ultrasound diagnosis for breast cancer. Up to now, several CNN-based methods have been proposed. However, most of them consider tumor localization and classification as two separate steps, rather than performing them simultaneously. Besides, they suffer from the limited diagnosis information in B-mode ultrasound (BUS) images. In this study, we develop a novel network, ResNet-GAP, that incorporates both localization and classification into a unified procedure. To enhance the performance of ResNet-GAP, we leverage stiffness information in the elastography ultrasound (EUS) modality by collaborative learning in the training stage. Specifically, a dual-channel ResNet-GAP network is developed, one channel for BUS and the other for EUS. In each channel, multiple class activation maps (CAMs) are generated using a series of convolutional kernels of different sizes. The multi-scale consistency of the CAMs in both channels is further considered in network optimization. Experiments on 264 patients in this study show that the newly developed ResNet-GAP achieves an accuracy of 88.6%, a sensitivity of 95.3%, a specificity of 84.6%, and an AUC of 93.6% on the classification task, and a 1.0NLF of 87.9% on the localization task, which is better than some state-of-the-art approaches.
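The class activation maps at the heart of this approach are straightforward to compute: each CAM is a weighted sum of the final convolutional feature maps, using the classifier weights that follow global average pooling (GAP). The sketch below illustrates the general CAM computation in numpy; the shapes, weights, and class labels are illustrative placeholders, not values from the paper, and the paper's dual-channel, multi-kernel design is not reproduced here.

```python
import numpy as np

def class_activation_map(features, fc_weights, class_idx):
    """Compute a class activation map (CAM).

    features:   (C, H, W) feature maps from the last conv layer
    fc_weights: (num_classes, C) weights of the linear layer fed by GAP
    class_idx:  class to visualize (e.g. 0 = benign, 1 = malignant)
    """
    # Weighted sum over the channel axis: contract (C,) with (C, H, W) -> (H, W)
    cam = np.tensordot(fc_weights[class_idx], features, axes=1)
    cam -= cam.min()          # shift so the minimum is 0
    if cam.max() > 0:
        cam /= cam.max()      # normalize to [0, 1] for visualization
    return cam

# Toy example: 4 feature maps of size 8x8, 2 classes.
rng = np.random.default_rng(0)
feats = rng.random((4, 8, 8))
weights = rng.random((2, 4))
cam = class_activation_map(feats, weights, class_idx=1)
print(cam.shape)  # (8, 8)
```

In the paper's setting, CAMs like this would be produced at several kernel sizes in both the BUS and EUS channels, and a consistency term over the paired maps would be added to the training loss; the GAP step is what lets a classification network localize without box annotations.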


Similar Articles

1. Joint Localization and Classification of Breast Cancer in B-Mode Ultrasound Imaging via Collaborative Learning With Elastography.
   IEEE J Biomed Health Inform. 2022 Sep;26(9):4474-4485. doi: 10.1109/JBHI.2022.3186933. Epub 2022 Sep 9.
2. CAM-QUS guided self-tuning modular CNNs with multi-loss functions for fully automated breast lesion classification in ultrasound images.
   Phys Med Biol. 2023 Dec 26;69(1). doi: 10.1088/1361-6560/ad1319.
3. Gray-to-color image conversion in the classification of breast lesions on ultrasound using pre-trained deep neural networks.
   Med Biol Eng Comput. 2023 Dec;61(12):3193-3207. doi: 10.1007/s11517-023-02928-6. Epub 2023 Sep 15.
4. Classification of Breast Masses on Ultrasound Shear Wave Elastography using Convolutional Neural Networks.
   Ultrason Imaging. 2020 Jul-Sep;42(4-5):213-220. doi: 10.1177/0161734620932609. Epub 2020 Jun 5.
5. Deep Learning Networks for Breast Lesion Classification in Ultrasound Images: A Comparative Study.
   Annu Int Conf IEEE Eng Med Biol Soc. 2023 Jul;2023:1-4. doi: 10.1109/EMBC40787.2023.10340293.
6. Using BI-RADS Stratifications as Auxiliary Information for Breast Masses Classification in Ultrasound Images.
   IEEE J Biomed Health Inform. 2021 Jun;25(6):2058-2070. doi: 10.1109/JBHI.2020.3034804. Epub 2021 Jun 3.
7. Breast Cancer Classification in Automated Breast Ultrasound Using Multiview Convolutional Neural Network with Transfer Learning.
   Ultrasound Med Biol. 2020 May;46(5):1119-1132. doi: 10.1016/j.ultrasmedbio.2020.01.001. Epub 2020 Feb 12.
8. An Efficient Multi-Scale Convolutional Neural Network Based Multi-Class Brain MRI Classification for SaMD.
   Tomography. 2022 Jul 26;8(4):1905-1927. doi: 10.3390/tomography8040161.
9. Bi-Modal Transfer Learning for Classifying Breast Cancers via Combined B-Mode and Ultrasound Strain Imaging.
   IEEE Trans Ultrason Ferroelectr Freq Control. 2022 Jan;69(1):222-232. doi: 10.1109/TUFFC.2021.3119251. Epub 2021 Dec 31.
10. A comparative study of pre-trained convolutional neural networks for semantic segmentation of breast tumors in ultrasound.
    Comput Biol Med. 2020 Nov;126:104036. doi: 10.1016/j.compbiomed.2020.104036. Epub 2020 Oct 8.

Cited By

1. U-Net and Its Variants Based Automatic Tracking of Radial Artery in Ultrasonic Short-Axis Views: A Pilot Study.
   Diagnostics (Basel). 2024 Oct 23;14(21):2358. doi: 10.3390/diagnostics14212358.
2. Joint localization and classification of breast masses on ultrasound images using an auxiliary attention-based framework.
   Med Image Anal. 2023 Dec;90:102960. doi: 10.1016/j.media.2023.102960. Epub 2023 Sep 14.