FMRNet: A fused network of multiple tumoral regions for breast tumor classification with ultrasound images.

Affiliations

Institute of Biomedical Engineering, School of Communication and Information Engineering, Shanghai University, Shanghai, China.

Medical Imaging Department, Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China.

Publication Information

Med Phys. 2022 Jan;49(1):144-157. doi: 10.1002/mp.15341. Epub 2021 Nov 29.

DOI: 10.1002/mp.15341
PMID: 34766623
Abstract

PURPOSE

Recent studies have illustrated that the peritumoral regions of medical images have value for clinical diagnosis. However, existing approaches using peritumoral regions mainly focus on the diagnostic capability of a single region and ignore the advantages of effectively fusing the intratumoral and peritumoral regions. In addition, these methods need accurate segmentation masks at the testing stage, which is tedious and inconvenient in clinical applications. To address these issues, we construct FMRNet, a deep convolutional neural network that adaptively fuses the information of multiple tumoral regions, for breast tumor classification using ultrasound (US) images without segmentation masks at the testing stage.

METHODS

To sufficiently exploit the potential relationships among regions, we design a fused network and two independent modules to extract and fuse the features of multiple regions simultaneously. First, we introduce two enhanced combined-tumoral (EC) region modules, aiming to enhance the combined-tumoral features gradually. Then, we design a three-branch module for extracting and fusing the features of the intratumoral, peritumoral, and combined-tumoral regions, denoted as the intratumoral, peritumoral, and combined-tumoral module. In particular, we design a novel fusion module that introduces a channel attention module to adaptively fuse the features of the three regions. The model is evaluated on two public breast tumor ultrasound datasets, UDIAT and BUSI. Two independent groups of experiments are performed on the respective datasets using a fivefold stratified cross-validation strategy. Finally, we conduct cross-dataset ablation experiments in which BUSI is used as the training set and UDIAT as the testing set.
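
The abstract does not specify the exact form of the channel attention module, so the following is only a minimal illustrative sketch of a squeeze-and-excitation-style channel gate applied to three region feature maps (intratumoral, peritumoral, combined-tumoral); the function name, NumPy formulation, and bottleneck MLP weights are all hypothetical, not the authors' implementation:

```python
import numpy as np

def channel_attention_fuse(feature_maps, w1, w2):
    """Adaptively fuse region feature maps with a channel-attention gate.

    feature_maps: list of arrays, each of shape (C, H, W), e.g. the
        intratumoral, peritumoral, and combined-tumoral features.
    w1, w2: weights of a small bottleneck MLP, shapes (C, C//r) and (C//r, C),
        where r is a channel-reduction ratio (hypothetical choice here).
    """
    fused = np.zeros_like(feature_maps[0])
    for f in feature_maps:
        # Squeeze: global average pooling over spatial dimensions -> (C,)
        squeezed = f.mean(axis=(1, 2))
        # Excite: bottleneck MLP (ReLU) followed by a sigmoid gate -> (C,)
        hidden = np.maximum(squeezed @ w1, 0.0)
        gate = 1.0 / (1.0 + np.exp(-(hidden @ w2)))
        # Re-weight each region's channels by its gate and accumulate
        fused += f * gate[:, None, None]
    return fused
```

In this sketch each region contributes in proportion to its learned per-channel gate, which is one common way a channel attention module can weigh complementary feature sources before classification.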

RESULTS

We conduct detailed ablation experiments on the two proposed modules and comparative experiments with representative existing methods. The experimental results show that the proposed method yields state-of-the-art performance on both datasets. In particular, on the UDIAT dataset the proposed FMRNet achieves an accuracy of 0.945 and a specificity of 0.945. Moreover, on the BUSI dataset the precision (PRE = 0.909) improves by 21.6% over the best existing result.

CONCLUSION

The proposed FMRNet shows good performance in breast tumor classification with US images, demonstrating its capability of exploiting and fusing the information of multiple tumoral regions. Furthermore, FMRNet has potential value for classifying other types of cancers using multiple tumoral regions of other kinds of medical images.


Similar Articles

1
FMRNet: A fused network of multiple tumoral regions for breast tumor classification with ultrasound images.
Med Phys. 2022 Jan;49(1):144-157. doi: 10.1002/mp.15341. Epub 2021 Nov 29.
2
A deep learning-based method for the detection and segmentation of breast masses in ultrasound images.
Phys Med Biol. 2024 Jul 26;69(15). doi: 10.1088/1361-6560/ad61b6.
3
BIRADS features-oriented semi-supervised deep learning for breast ultrasound computer-aided diagnosis.
Phys Med Biol. 2020 Jun 12;65(12):125005. doi: 10.1088/1361-6560/ab7e7d.
4
DAU-Net: Dual attention-aided U-Net for segmenting tumor in breast ultrasound images.
PLoS One. 2024 May 31;19(5):e0303670. doi: 10.1371/journal.pone.0303670. eCollection 2024.
5
Improved breast ultrasound tumor classification using dual-input CNN with GAP-guided attention loss.
Math Biosci Eng. 2023 Jul 20;20(8):15244-15264. doi: 10.3934/mbe.2023682.
6
Using BI-RADS Stratifications as Auxiliary Information for Breast Masses Classification in Ultrasound Images.
IEEE J Biomed Health Inform. 2021 Jun;25(6):2058-2070. doi: 10.1109/JBHI.2020.3034804. Epub 2021 Jun 3.
7
CTG-Net: Cross-task guided network for breast ultrasound diagnosis.
PLoS One. 2022 Aug 11;17(8):e0271106. doi: 10.1371/journal.pone.0271106. eCollection 2022.
8
A hybrid attentional guidance network for tumors segmentation of breast ultrasound images.
Int J Comput Assist Radiol Surg. 2023 Aug;18(8):1489-1500. doi: 10.1007/s11548-023-02849-7. Epub 2023 Feb 28.
9
CAM-QUS guided self-tuning modular CNNs with multi-loss functions for fully automated breast lesion classification in ultrasound images.
Phys Med Biol. 2023 Dec 26;69(1). doi: 10.1088/1361-6560/ad1319.
10
Automatic tumor segmentation in breast ultrasound images using a dilated fully convolutional network combined with an active contour model.
Med Phys. 2019 Jan;46(1):215-228. doi: 10.1002/mp.13268. Epub 2018 Nov 28.

Cited By

1
BUSClean: Open-source software for breast ultrasound image pre-processing and knowledge extraction for medical AI.
PLoS One. 2024 Dec 11;19(12):e0315434. doi: 10.1371/journal.pone.0315434. eCollection 2024.
2
An updated overview of radiomics-based artificial intelligence (AI) methods in breast cancer screening and diagnosis.
Radiol Phys Technol. 2024 Dec;17(4):795-818. doi: 10.1007/s12194-024-00842-6. Epub 2024 Sep 16.
3
HBMD-Net: Feature Fusion Based Breast Cancer Classification with Class Imbalance Resolution.
J Imaging Inform Med. 2024 Aug;37(4):1440-1457. doi: 10.1007/s10278-024-01046-5. Epub 2024 Feb 26.
4
Combining radiomics and deep learning features of intra-tumoral and peri-tumoral regions for the classification of breast cancer lung metastasis and primary lung cancer with low-dose CT.
J Cancer Res Clin Oncol. 2023 Nov;149(17):15469-15478. doi: 10.1007/s00432-023-05329-2. Epub 2023 Aug 29.