
Fine-Grained Hierarchical Progressive Modal-Aware Network for Brain Tumor Segmentation.

Author Information

Lu Chenggang, Zhang Jianwei, Zhang Dan, Mou Lei, Yuan Jinli, Xia Kewen, Guo Zhitao, Zhang Jiong

Publication Information

IEEE J Biomed Health Inform. 2025 May 22;PP. doi: 10.1109/JBHI.2025.3572301.

DOI: 10.1109/JBHI.2025.3572301
PMID: 40402698
Abstract

Brain tumors are highly lethal and debilitating pathological changes that require timely diagnosis and treatment. Magnetic resonance imaging (MRI), a non-invasive diagnostic tool, provides complementary multi-modal information crucial for accurate tumor detection and delineation. However, existing methods struggle to effectively fuse multi-modal information from MRI sequences and often fail to perform modality-specific feature extraction, which hinders accurate tumor segmentation. Furthermore, the inherent challenges posed by the blurred boundaries and complex morphological characteristics of tumor structures present additional substantial obstacles to achieving precise segmentation. To address these issues, we propose FiHam, a fine-grained hierarchical progressive modal-aware network that introduces a novel multi-modal fusion strategy and an advanced feature extraction mechanism. Specifically, FiHam employs a progressive fusion strategy that extracts modality-specific features at lower levels and integrates multi-modal features at higher levels to effectively leverage complementary information from tumor images. Additionally, we design a gated cross-attention modal-fusion module that adaptively selects and integrates dual-modal features using cross-attention mechanisms to enhance modality fusion. To further refine segmentation accuracy, we incorporate a tiny U-Net into the encoder to capture boundary features and complex tumor morphology. Extensive experiments on three large-scale, multi-modal brain tumor datasets demonstrate that FiHam achieves state-of-the-art performance, delivering significant improvements in segmentation accuracy and generalizability across diverse MRI modalities.
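The paper's implementation is not reproduced here, but the gated cross-attention fusion idea the abstract describes can be illustrated with a minimal NumPy sketch: queries from one modality attend to keys/values from another, and a learned sigmoid gate decides how much of the attended cross-modal features to mix back in. All weight shapes, the single-head formulation, and the gating form below are assumptions for illustration, not the authors' architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_cross_attention_fusion(feat_a, feat_b, w_q, w_k, w_v, w_g):
    """Fuse two modality feature maps (N tokens x D channels).
    Queries come from modality A, keys/values from modality B;
    a sigmoid gate blends the attended features into modality A.
    This is a hypothetical single-head sketch, not the paper's module."""
    q = feat_a @ w_q                      # queries from modality A
    k = feat_b @ w_k                      # keys from modality B
    v = feat_b @ w_v                      # values from modality B
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    attended = attn @ v                   # modality-B info aligned to A's tokens
    # Gate conditioned on both the original and the attended features.
    gate = 1.0 / (1.0 + np.exp(-(np.concatenate([feat_a, attended], axis=-1) @ w_g)))
    return gate * attended + (1.0 - gate) * feat_a

rng = np.random.default_rng(0)
n, d = 16, 32                             # 16 spatial tokens, 32 channels
fa = rng.standard_normal((n, d))          # e.g. T1 features
fb = rng.standard_normal((n, d))          # e.g. FLAIR features
wq, wk, wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
wg = rng.standard_normal((2 * d, d)) * 0.1
fused = gated_cross_attention_fusion(fa, fb, wq, wk, wv, wg)
print(fused.shape)                        # (16, 32)
```

In the paper's progressive design, a module of this kind would sit at the higher encoder levels, after modality-specific features have been extracted separately at the lower levels.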

Similar Articles

1
A 3D hierarchical cross-modality interaction network using transformers and convolutions for brain glioma segmentation in MR images.
Med Phys. 2024 Nov;51(11):8371-8389. doi: 10.1002/mp.17354. Epub 2024 Aug 13.
2
CMAF-Net: a cross-modal attention fusion-based deep neural network for incomplete multi-modal brain tumor segmentation.
Quant Imaging Med Surg. 2024 Jul 1;14(7):4579-4604. doi: 10.21037/qims-24-9. Epub 2024 Jun 27.
3
A modality-collaborative convolution and transformer hybrid network for unpaired multi-modal medical image segmentation with limited annotations.
Med Phys. 2023 Sep;50(9):5460-5478. doi: 10.1002/mp.16338. Epub 2023 Mar 15.
4
Flexible Fusion Network for Multi-Modal Brain Tumor Segmentation.
IEEE J Biomed Health Inform. 2023 Jul;27(7):3349-3359. doi: 10.1109/JBHI.2023.3271808. Epub 2023 Jun 30.
5
Multi-modality self-attention aware deep network for 3D biomedical segmentation.
BMC Med Inform Decis Mak. 2020 Jul 9;20(Suppl 3):119. doi: 10.1186/s12911-020-1109-0.
6
IMIIN: An inter-modality information interaction network for 3D multi-modal breast tumor segmentation.
Comput Med Imaging Graph. 2022 Jan;95:102021. doi: 10.1016/j.compmedimag.2021.102021. Epub 2021 Nov 29.
7
Self-Supervised Multi-Modal Hybrid Fusion Network for Brain Tumor Segmentation.
IEEE J Biomed Health Inform. 2022 Nov;26(11):5310-5320. doi: 10.1109/JBHI.2021.3109301. Epub 2022 Nov 10.
8
Attention-guided multi-scale context aggregation network for multi-modal brain glioma segmentation.
Med Phys. 2023 Dec;50(12):7629-7640. doi: 10.1002/mp.16452. Epub 2023 May 7.
9
A Neighbor-Sensitive Multi-Modal Flexible Learning Framework for Improved Prostate Tumor Segmentation in Anisotropic MR Images.
IEEE Trans Biomed Eng. 2025 Apr 21;PP. doi: 10.1109/TBME.2025.3562766.