

Convolutional Blur Attention Network for Cell Nuclei Segmentation.

Affiliations

Department of Biomedical Sciences and Engineering, National Central University, Taoyuan 320317, Taiwan.

Faculty of Digital Technology, University of Technology and Education-The University of Danang, Danang 550000, Vietnam.

Publication Information

Sensors (Basel). 2022 Feb 18;22(4):1586. doi: 10.3390/s22041586.

DOI: 10.3390/s22041586
PMID: 35214488
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8878074/
Abstract

Accurately segmented nuclei are important not only for cancer classification but also for predicting treatment effectiveness and other biomedical applications. However, the diversity of cell types, various external factors, and illumination conditions make nucleus segmentation a challenging task. In this work, we present a new deep learning-based method for cell nucleus segmentation. The proposed convolutional blur attention (CBA) network consists of downsampling and upsampling procedures. A blur attention module and a blur pooling operation are used to retain feature salience and avoid noise generation in the downsampling procedure. A pyramid blur pooling (PBP) module is proposed to capture multi-scale information in the upsampling procedure. The proposed method was compared with several prior segmentation models, namely U-Net, ENet, SegNet, LinkNet, and Mask RCNN, on the 2018 Data Science Bowl (DSB) challenge dataset and the multi-organ nucleus segmentation (MoNuSeg) dataset from MICCAI 2018. The Dice similarity coefficient and evaluation metrics such as F1 score, recall, precision, and average Jaccard index were used to assess the segmentation efficiency of these models. Overall, the proposed method achieves the best performance, with scores of 0.8429 on the DSB dataset and 0.7985 on MoNuSeg.
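The blur pooling mentioned in the abstract replaces naive strided subsampling with a low-pass filter followed by subsampling, which suppresses the aliasing noise a plain stride would introduce. A minimal NumPy sketch of this idea follows; the 3x3 binomial kernel, reflect padding, and stride of 2 are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def blur_pool(x, stride=2):
    """Anti-aliased downsampling: blur a 2D feature map with a
    normalized 3x3 binomial kernel, then subsample by `stride`."""
    k1 = np.array([1.0, 2.0, 1.0])
    kernel = np.outer(k1, k1)
    kernel /= kernel.sum()  # normalize so constant regions are preserved
    h, w = x.shape
    padded = np.pad(x, 1, mode="reflect")
    blurred = np.zeros_like(x, dtype=float)
    for i in range(h):
        for j in range(w):
            blurred[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return blurred[::stride, ::stride]

# A 4x4 feature map downsamples to 2x2, with high-frequency
# content attenuated by the blur before the stride is applied.
y = blur_pool(np.ones((4, 4)))
```

In a real network this would be a depthwise convolution applied per channel before each strided step, so the downsampling path keeps feature salience instead of aliasing it.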

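The overlap metrics the abstract reports can be computed directly on binary masks. A simplified per-image sketch follows; note that the aggregated Jaccard index used for nucleus segmentation benchmarks additionally matches predicted instances to ground-truth instances, which this binary-mask version omits:

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice = 2|P ∩ G| / (|P| + |G|) for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard_index(pred, gt):
    """IoU = |P ∩ G| / |P ∪ G| for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union

pred = np.array([[1, 1], [0, 0]])
gt = np.array([[1, 0], [0, 0]])
# intersection = 1, |P| = 2, |G| = 1, union = 2
# dice = 2/3, jaccard = 1/2
```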

Figures (PMC8878074):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/b089508cb901/sensors-22-01586-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/bcc45cb05e0d/sensors-22-01586-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/0f07149baf34/sensors-22-01586-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/eb4ecfaf32f9/sensors-22-01586-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/90671c49ff72/sensors-22-01586-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/3bf8ec459ade/sensors-22-01586-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/dd42d974479f/sensors-22-01586-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/4e9aef970cbd/sensors-22-01586-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/f7f545a84778/sensors-22-01586-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/be1bc7448b54/sensors-22-01586-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/b5bfdebf4825/sensors-22-01586-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/c1046f3d1f2b/sensors-22-01586-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/915c93eadfeb/sensors-22-01586-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8512/8878074/cea1cd2f93f9/sensors-22-01586-g0A1.jpg

Similar Articles

1. Convolutional Blur Attention Network for Cell Nuclei Segmentation.
   Sensors (Basel). 2022 Feb 18;22(4):1586. doi: 10.3390/s22041586.
2. High-resolution deep transferred ASPPU-Net for nuclei segmentation of histopathology images.
   Int J Comput Assist Radiol Surg. 2021 Dec;16(12):2159-2175. doi: 10.1007/s11548-021-02497-9. Epub 2021 Oct 7.
3. A multiple-channel and atrous convolution network for ultrasound image segmentation.
   Med Phys. 2020 Dec;47(12):6270-6285. doi: 10.1002/mp.14512. Epub 2020 Oct 18.
4. A novel convolutional neural network for kidney ultrasound images segmentation.
   Comput Methods Programs Biomed. 2022 May;218:106712. doi: 10.1016/j.cmpb.2022.106712. Epub 2022 Feb 26.
5. A Multi-Organ Nucleus Segmentation Challenge.
   IEEE Trans Med Imaging. 2020 May;39(5):1380-1391. doi: 10.1109/TMI.2019.2947628. Epub 2019 Oct 23.
6. GC-Net: Global context network for medical image segmentation.
   Comput Methods Programs Biomed. 2020 Jul;190:105121. doi: 10.1016/j.cmpb.2019.105121. Epub 2019 Oct 4.
7. An iterative multi-path fully convolutional neural network for automatic cardiac segmentation in cine MR images.
   Med Phys. 2019 Dec;46(12):5652-5665. doi: 10.1002/mp.13859. Epub 2019 Nov 1.
8. CLCU-Net: Cross-level connected U-shaped network with selective feature aggregation attention module for brain tumor segmentation.
   Comput Methods Programs Biomed. 2021 Aug;207:106154. doi: 10.1016/j.cmpb.2021.106154. Epub 2021 May 13.
9. AL-Net: Attention Learning Network Based on Multi-Task Learning for Cervical Nucleus Segmentation.
   IEEE J Biomed Health Inform. 2022 Jun;26(6):2693-2702. doi: 10.1109/JBHI.2021.3136568. Epub 2022 Jun 3.
10. HWA-SegNet: Multi-channel skin lesion image segmentation network with hierarchical analysis and weight adjustment.
   Comput Biol Med. 2023 Jan;152:106343. doi: 10.1016/j.compbiomed.2022.106343. Epub 2022 Nov 28.

Cited By

1. An Improved Nested U-Net Network for Fluorescence In Situ Hybridization Cell Image Segmentation.
   Sensors (Basel). 2024 Jan 31;24(3):928. doi: 10.3390/s24030928.
2. Complex-Phase Steel Microstructure Segmentation Using UNet: Analysis across Different Magnifications and Steel Types.
   Materials (Basel). 2023 Nov 21;16(23):7254. doi: 10.3390/ma16237254.
3. Pixel-level multimodal fusion deep networks for predicting subcellular organelle localization from label-free live-cell imaging.
   Front Genet. 2022 Oct 26;13:1002327. doi: 10.3389/fgene.2022.1002327. eCollection 2022.

References

1. nucleAIzer: A Parameter-free Deep Learning Framework for Nucleus Segmentation Using Image Style Transfer.
   Cell Syst. 2020 May 20;10(5):453-458.e6. doi: 10.1016/j.cels.2020.04.003. Epub 2020 May 7.
2. An automatic nuclei segmentation method based on deep convolutional neural networks for histopathology images.
   BMC Biomed Eng. 2019 Oct 17;1:24. doi: 10.1186/s42490-019-0026-8. eCollection 2019.
3. Microscopy cell nuclei segmentation with enhanced U-Net.
   BMC Bioinformatics. 2020 Jan 8;21(1):8. doi: 10.1186/s12859-019-3332-1.
4. A Multi-Organ Nucleus Segmentation Challenge.
   IEEE Trans Med Imaging. 2020 May;39(5):1380-1391. doi: 10.1109/TMI.2019.2947628. Epub 2019 Oct 23.
5. Nucleus segmentation across imaging experiments: the 2018 Data Science Bowl.
   Nat Methods. 2019 Dec;16(12):1247-1253. doi: 10.1038/s41592-019-0612-7. Epub 2019 Oct 21.
6. Towards pixel-to-pixel deep nucleus detection in microscopy images.
   BMC Bioinformatics. 2019 Sep 14;20(1):472. doi: 10.1186/s12859-019-3037-5.
7. Supervised classification enables rapid annotation of cell atlases.
   Nat Methods. 2019 Oct;16(10):983-986. doi: 10.1038/s41592-019-0535-3. Epub 2019 Sep 9.
8. Cell Nuclei Segmentation in Cytological Images Using Convolutional Neural Network and Seeded Watershed Algorithm.
   J Digit Imaging. 2020 Feb;33(1):231-242. doi: 10.1007/s10278-019-00200-8.
9. Convolutional neural network for cell classification using microscope images of intracellular actin networks.
   PLoS One. 2019 Mar 13;14(3):e0213626. doi: 10.1371/journal.pone.0213626. eCollection 2019.
10. Comparative Study on Automated Cell Nuclei Segmentation Methods for Cytology Pleural Effusion Images.
   J Healthc Eng. 2018 Sep 12;2018:9240389. doi: 10.1155/2018/9240389. eCollection 2018.