

DMAeEDNet: Dense Multiplicative Attention Enhanced Encoder Decoder Network for Ultrasound-Based Automated Breast Lesion Segmentation.

Authors

Saini Manali, Afrin Humayra, Sotoudehnia Setayesh, Fatemi Mostafa, Alizad Azra

Affiliations

Department of Radiology, Mayo Clinic College of Medicine and Science, Rochester, MN 55905, USA.

Department of Physiology and Biomedical Engineering, Mayo Clinic College of Medicine and Science, Rochester, MN 55905, USA.

Publication Information

IEEE Access. 2024;12:60541-60555. doi: 10.1109/access.2024.3394808. Epub 2024 Apr 29.

DOI:10.1109/access.2024.3394808
PMID:39553390
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11566434/
Abstract

Automated and precise segmentation of breast lesions can facilitate early diagnosis of breast cancer. Recent studies employ deep learning for automatic segmentation of breast lesions in ultrasound imaging. Many of them introduce somewhat complex modifications to the well-adapted segmentation network U-Net to improve segmentation, but at the expense of increased computational time. To this end, this study presents a low-complexity deep learning network, the dense multiplicative attention enhanced encoder-decoder network, for effective breast lesion segmentation in ultrasound images. For the first time in this context, two dense multiplicative attention components are used in the encoding layer and the output layer of an encoder-decoder network with depthwise separable convolutions, to selectively enhance the relevant features. A rigorous performance evaluation on two public datasets demonstrates that the proposed network achieves Dice coefficients of 0.83 and 0.86, respectively, with an average segmentation latency of 19. Further, a noise-robustness study on an in-clinic recorded dataset without pre-processing indicates that the proposed network achieves a Dice coefficient of 0.72. Exhaustive comparison with several commonly used networks indicates its adeptness, with low time and computational complexity, demonstrating its feasibility for real-time use.
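The two building blocks the abstract names, depthwise separable convolutions and multiplicative attention gating, can be illustrated with a minimal NumPy sketch. The parameter counts and the sigmoid gate below are generic textbook formulations, not the paper's exact DMA block:

```python
import numpy as np

def conv_params(c_in, c_out, k):
    """Weight count of a standard k x k convolution (no bias)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k conv (one filter per input channel) followed by
    a 1 x 1 pointwise conv -- the factorization that cuts parameters."""
    return c_in * k * k + c_in * c_out

def multiplicative_attention_gate(features, scores):
    """Element-wise multiplicative gating: features are re-weighted by
    sigmoid-normalized attention scores, suppressing irrelevant regions."""
    weights = 1.0 / (1.0 + np.exp(-scores))  # sigmoid, values in (0, 1)
    return features * weights

# Parameter savings for a typical 3x3 layer with 64 -> 64 channels.
standard = conv_params(64, 64, 3)                   # 64*64*9 = 36864
separable = depthwise_separable_params(64, 64, 3)   # 576 + 4096 = 4672

# Gating a toy 2-channel 4x4 feature map.
rng = np.random.default_rng(0)
feat = rng.standard_normal((2, 4, 4))
scores = rng.standard_normal((2, 4, 4))
gated = multiplicative_attention_gate(feat, scores)
```

Because the gate's weights lie in (0, 1), gated activations never exceed the original magnitudes; relevant features are passed through nearly unchanged while low-scoring ones are attenuated.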

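The reported Dice coefficients measure the overlap between a predicted lesion mask and the ground-truth mask. A minimal sketch of the metric (a generic formulation, not the paper's evaluation code):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity between two binary masks: 2*|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Toy 4x4 masks: the prediction recovers 3 of 4 target pixels
# and adds 1 false positive -> Dice = 2*3 / (4 + 4) = 0.75.
target = np.zeros((4, 4), dtype=int)
target[1:3, 1:3] = 1   # 4 target pixels
pred = np.zeros((4, 4), dtype=int)
pred[1, 1:3] = 1       # 2 true positives
pred[2, 1] = 1         # 1 true positive
pred[0, 0] = 1         # 1 false positive
score = dice_coefficient(pred, target)
```

A Dice of 1.0 means perfect overlap and 0.0 means none, so the 0.83 and 0.86 scores above indicate substantial agreement with the ground truth.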

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/9d14f59cafb5/nihms-1991124-f0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/1168c83e3818/nihms-1991124-f0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/baf4cc0c2aa2/nihms-1991124-f0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/b46f7b603f07/nihms-1991124-f0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/51bba27314a8/nihms-1991124-f0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/2b0d475b3f1c/nihms-1991124-f0012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/6a69e79913b5/nihms-1991124-f0013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/01a1ecfa1326/nihms-1991124-f0014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/1bd87856e2fd/nihms-1991124-f0015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/cf087a4b361d/nihms-1991124-f0016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/55eed1a32c09/nihms-1991124-f0017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/f78814e88d75/nihms-1991124-f0018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/db28/11566434/708bbe380f2a/nihms-1991124-f0019.jpg

Similar Articles

1
DMAeEDNet: Dense Multiplicative Attention Enhanced Encoder Decoder Network for Ultrasound-Based Automated Breast Lesion Segmentation.
IEEE Access. 2024;12:60541-60555. doi: 10.1109/access.2024.3394808. Epub 2024 Apr 29.
2
A multiple-channel and atrous convolution network for ultrasound image segmentation.
Med Phys. 2020 Dec;47(12):6270-6285. doi: 10.1002/mp.14512. Epub 2020 Oct 18.
3
Feature-guided attention network for medical image segmentation.
Med Phys. 2023 Aug;50(8):4871-4886. doi: 10.1002/mp.16253. Epub 2023 Feb 16.
4
Multi-scale input layers and dense decoder aggregation network for COVID-19 lesion segmentation from CT scans.
Sci Rep. 2024 Oct 10;14(1):23729. doi: 10.1038/s41598-024-74701-0.
5
ADU-Net: An Attention Dense U-Net based deep supervised DNN for automated lesion segmentation of COVID-19 from chest CT images.
Biomed Signal Process Control. 2023 Aug;85:104974. doi: 10.1016/j.bspc.2023.104974. Epub 2023 Apr 21.
6
Breast ultrasound image segmentation: A coarse-to-fine fusion convolutional neural network.
Med Phys. 2021 Aug;48(8):4262-4278. doi: 10.1002/mp.15006. Epub 2021 Jul 29.
7
A dense multi-path decoder for tissue segmentation in histopathology images.
Comput Methods Programs Biomed. 2019 May;173:119-129. doi: 10.1016/j.cmpb.2019.03.007. Epub 2019 Mar 14.
8
PolypSegNet: A modified encoder-decoder architecture for automated polyp segmentation from colonoscopy images.
Comput Biol Med. 2021 Jan;128:104119. doi: 10.1016/j.compbiomed.2020.104119. Epub 2020 Nov 13.
9
HMA-Net: A deep U-shaped network combined with HarDNet and multi-attention mechanism for medical image segmentation.
Med Phys. 2023 Mar;50(3):1635-1646. doi: 10.1002/mp.16065. Epub 2022 Nov 3.
10
MADR-Net: multi-level attention dilated residual neural network for segmentation of medical images.
Sci Rep. 2024 Jun 3;14(1):12699. doi: 10.1038/s41598-024-63538-2.

Cited By

1
Variational mode directed deep learning framework for breast lesion classification using ultrasound imaging.
Sci Rep. 2025 Apr 24;15(1):14300. doi: 10.1038/s41598-025-99009-5.

References

1
AIPs-SnTCN: Predicting Anti-Inflammatory Peptides Using fastText and Transformer Encoder-Based Hybrid Word Embedding with Self-Normalized Temporal Convolutional Networks.
J Chem Inf Model. 2023 Nov 13;63(21):6537-6554. doi: 10.1021/acs.jcim.3c01563. Epub 2023 Oct 31.
2
Boundary-Guided and Region-Aware Network With Global Scale-Adaptive for Accurate Segmentation of Breast Tumors in Ultrasound Images.
IEEE J Biomed Health Inform. 2023 Sep;27(9):4421-4432. doi: 10.1109/JBHI.2023.3285789. Epub 2023 Sep 6.
3
Attention guided neural ODE network for breast tumor segmentation in medical images.
Comput Biol Med. 2023 Jun;159:106884. doi: 10.1016/j.compbiomed.2023.106884. Epub 2023 Apr 3.
4
DeepMiCa: Automatic segmentation and classification of breast MIcroCAlcifications from mammograms.
Comput Methods Programs Biomed. 2023 Jun;235:107483. doi: 10.1016/j.cmpb.2023.107483. Epub 2023 Mar 31.
5
MsGoF: Breast lesion classification on ultrasound images by multi-scale gradational-order fusion framework.
Comput Methods Programs Biomed. 2023 Mar;230:107346. doi: 10.1016/j.cmpb.2023.107346. Epub 2023 Jan 19.
6
ATFE-Net: Axial Transformer and Feature Enhancement-based CNN for ultrasound breast mass segmentation.
Comput Biol Med. 2023 Feb;153:106533. doi: 10.1016/j.compbiomed.2022.106533. Epub 2023 Jan 3.
7
Breast tumor localization and segmentation using machine learning techniques: Overview of datasets, findings, and methods.
Comput Biol Med. 2023 Jan;152:106443. doi: 10.1016/j.compbiomed.2022.106443. Epub 2022 Dec 19.
8
AAU-Net: An Adaptive Attention U-Net for Breast Lesions Segmentation in Ultrasound Images.
IEEE Trans Med Imaging. 2023 May;42(5):1289-1300. doi: 10.1109/TMI.2022.3226268. Epub 2023 May 2.
9
ESTAN: Enhanced Small Tumor-Aware Network for Breast Ultrasound Image Segmentation.
Healthcare (Basel). 2022 Nov 11;10(11):2262. doi: 10.3390/healthcare10112262.
10
MLNet: Metaheuristics-Based Lightweight Deep Learning Network for Cervical Cancer Diagnosis.
IEEE J Biomed Health Inform. 2023 Oct;27(10):5004-5014. doi: 10.1109/JBHI.2022.3223127. Epub 2023 Oct 5.