
An effective and open source interactive 3D medical image segmentation solution.

Authors

Gao Yi, Chen Xiaohui, Yang Qinzhu, Lasso Andras, Kolesov Ivan, Pieper Steve, Kikinis Ron, Tannenbaum Allen, Zhu Liangjia

Affiliations

School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, 518060, China.

Shenzhen Key Laboratory of Precision Medicine for Hematological Malignancies, Shenzhen, 518060, China.

Publication

Sci Rep. 2024 Dec 2;14(1):29878. doi: 10.1038/s41598-024-80206-7.

DOI: 10.1038/s41598-024-80206-7
PMID: 39622975
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11612195/
Abstract

3D medical image segmentation is a key step in numerous clinical applications. Even though many automatic segmentation solutions have been proposed, it is arguable that medical image segmentation is more of a preference than a reference, as inter- and intra-observer variability is widely observed in final segmentation output. Therefore, designing a user-oriented, open-source solution for interactive annotation is of great value to the community. In this paper, we present an effective interactive segmentation method that employs an adaptive dynamic programming approach to incorporate users' interactions efficiently. The method first initializes a segmentation through a feature-based geodesic computation. The segmentation is then refined by an efficient updating scheme that requires only local computations when new user inputs arrive, making the method applicable to high-resolution images and very complex structures. The proposed method is implemented as a user-oriented software module in 3D Slicer. Our approach demonstrates several strengths and contributions. First, we propose an efficient and effective 3D interactive algorithm based on the adaptive dynamic programming method. Second, the work delivers not just an algorithm but also software with a well-designed GUI for users. Third, its open-source nature allows users to make customized modifications according to their specific requirements.
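To make the abstract's "feature-based geodesic computation" concrete, here is a minimal sketch of seed-driven geodesic labeling on a 2D toy image: each pixel gets the class of the geodesically nearest user seed, where path cost grows with intensity differences so that label fronts slow down at edges. This is an illustration of the general idea only, not the paper's algorithm — the published method uses adaptive dynamic programming in 3D with local updates, and all function and parameter names below are invented for this sketch.

```python
import heapq
import numpy as np

def geodesic_label(image, seeds, beta=1.0):
    """Label each pixel with the class of the geodesically nearest seed.

    image : 2D float array of intensities.
    seeds : dict mapping (row, col) -> integer class label (user clicks).
    beta  : weight on intensity differences in the edge cost.
    """
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    label = np.full((h, w), -1, dtype=int)
    heap = []
    for (r, c), lab in seeds.items():
        dist[r, c] = 0.0
        label[r, c] = lab
        heapq.heappush(heap, (0.0, r, c))
    # Dijkstra over the 4-connected pixel grid; edge cost grows with the
    # intensity difference, so propagation stalls at strong image edges.
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + 1.0 + beta * abs(image[r, c] - image[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    label[nr, nc] = label[r, c]
                    heapq.heappush(heap, (nd, nr, nc))
    return label

# Toy 2-region image: left half dark, right half bright, one seed each.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
lab = geodesic_label(img, {(4, 1): 0, (4, 6): 1})
```

In the paper's interactive setting, each new user stroke would act as an additional seed, and the key contribution is that the update after a new seed touches only the locally affected region rather than recomputing all distances, which is what makes the approach scale to high-resolution 3D volumes.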


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/d012c5fef8b3/41598_2024_80206_Figa_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/f3f6bf4b067a/41598_2024_80206_Figb_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/c4fea6c38bc2/41598_2024_80206_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/cd3c769ce356/41598_2024_80206_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/d519d658f50d/41598_2024_80206_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/60f4585755b9/41598_2024_80206_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/e4e440bd9889/41598_2024_80206_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/4ade7dbbeac3/41598_2024_80206_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/62383bb2dd0f/41598_2024_80206_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/464172617dcd/41598_2024_80206_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/d7d85510dcb4/41598_2024_80206_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/9595d431e697/41598_2024_80206_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/dd68bfd3bbb5/41598_2024_80206_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/a062b9cc3d35/41598_2024_80206_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/20b9ffea237f/41598_2024_80206_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/91af/11612195/a204924d25bc/41598_2024_80206_Fig14_HTML.jpg

Similar Articles

1. An effective and open source interactive 3D medical image segmentation solution.
Sci Rep. 2024 Dec 2;14(1):29878. doi: 10.1038/s41598-024-80206-7.
2. Web-based interactive 2D/3D medical image processing and visualization software.
Comput Methods Programs Biomed. 2010 May;98(2):172-82. doi: 10.1016/j.cmpb.2009.11.012.
3. A 3D interactive multi-object segmentation tool using local robust statistics driven active contours.
Med Image Anal. 2012 Aug;16(6):1216-27. doi: 10.1016/j.media.2012.06.002. Epub 2012 Jul 6.
4. Polymorph segmentation representation for medical image computing.
Comput Methods Programs Biomed. 2019 Apr;171:19-26. doi: 10.1016/j.cmpb.2019.02.011. Epub 2019 Feb 21.
5. 3D Slicer as a tool for interactive brain tumor segmentation.
Annu Int Conf IEEE Eng Med Biol Soc. 2011;2011:6982-4. doi: 10.1109/IEMBS.2011.6091765.
6. Real-time 3D interactive segmentation of echocardiographic data through user-based deformation of B-spline explicit active surfaces.
Comput Med Imaging Graph. 2014 Jan;38(1):57-67. doi: 10.1016/j.compmedimag.2013.10.002. Epub 2013 Oct 22.
7. Semi-automatic stitching of filamentous structures in image stacks from serial-section electron tomography.
J Microsc. 2021 Oct;284(1):25-44. doi: 10.1111/jmi.13039. Epub 2021 Jul 9.
8. Minimally interactive segmentation of 4D dynamic upper airway MR images via fuzzy connectedness.
Med Phys. 2016 May;43(5):2323. doi: 10.1118/1.4945698.
9. A review on multiplatform evaluations of semi-automatic open-source based image segmentation for cranio-maxillofacial surgery.
Comput Methods Programs Biomed. 2019 Dec;182:105102. doi: 10.1016/j.cmpb.2019.105102. Epub 2019 Sep 30.
10. MIDeepSeg: Minimally interactive segmentation of unseen objects from medical images using deep learning.
Med Image Anal. 2021 Aug;72:102102. doi: 10.1016/j.media.2021.102102. Epub 2021 May 18.

References Cited in This Article

1. MONAI Label: A framework for AI-assisted interactive labeling of 3D medical images.
Med Image Anal. 2024 Jul;95:103207. doi: 10.1016/j.media.2024.103207. Epub 2024 May 15.
2. Segment anything in medical images.
Nat Commun. 2024 Jan 22;15(1):654. doi: 10.1038/s41467-024-44824-z.
3. Segment anything model for medical images?
Med Image Anal. 2024 Feb;92:103061. doi: 10.1016/j.media.2023.103061. Epub 2023 Dec 7.
4. Segment anything model for medical image analysis: An experimental study.
Med Image Anal. 2023 Oct;89:102918. doi: 10.1016/j.media.2023.102918. Epub 2023 Aug 2.
5. MISSFormer: An Effective Transformer for 2D Medical Image Segmentation.
IEEE Trans Med Imaging. 2023 May;42(5):1484-1494. doi: 10.1109/TMI.2022.3230943. Epub 2023 May 2.
6. An efficient interactive multi-label segmentation tool for 2D and 3D medical images using fully connected conditional random field.
Comput Methods Programs Biomed. 2022 Jan;213:106534. doi: 10.1016/j.cmpb.2021.106534. Epub 2021 Nov 14.
7. CHAOS Challenge - combined (CT-MR) healthy abdominal organ segmentation.
Med Image Anal. 2021 Apr;69:101950. doi: 10.1016/j.media.2020.101950. Epub 2020 Dec 25.
8. Polymorph segmentation representation for medical image computing.
Comput Methods Programs Biomed. 2019 Apr;171:19-26. doi: 10.1016/j.cmpb.2019.02.011. Epub 2019 Feb 21.
9. CE-Net: Context Encoder Network for 2D Medical Image Segmentation.
IEEE Trans Med Imaging. 2019 Oct;38(10):2281-2292. doi: 10.1109/TMI.2019.2903562. Epub 2019 Mar 7.
10. Interactive Medical Image Segmentation Using Deep Learning With Image-Specific Fine Tuning.
IEEE Trans Med Imaging. 2018 Jul;37(7):1562-1573. doi: 10.1109/TMI.2018.2791721.