


H-ProSeg: Hybrid ultrasound prostate segmentation based on explainability-guided mathematical model.

Affiliations

Department of Health Technology and Informatics, The Hong Kong Polytechnic University, Hong Kong, China.

Department of Medical Technology, Jiangsu Province Hospital, Nanjing, Jiangsu, China.

Publication Information

Comput Methods Programs Biomed. 2022 Jun;219:106752. doi: 10.1016/j.cmpb.2022.106752. Epub 2022 Mar 17.

DOI: 10.1016/j.cmpb.2022.106752
PMID: 35338887
Abstract

BACKGROUND AND OBJECTIVE

Accurate and robust prostate segmentation in transrectal ultrasound (TRUS) images is of great interest for image-guided prostate interventions and prostate cancer diagnosis. However, it remains a challenging task for various reasons, including a missing or ambiguous boundary between the prostate and surrounding tissues, the presence of shadow artifacts, intra-prostate intensity heterogeneity, and anatomical variations.

METHODS

Here, we present a hybrid method for prostate segmentation (H-ProSeg) in TRUS images that uses a small number of radiologist-defined seed points as prior points. The method consists of three subnetworks. The first subnetwork uses an improved principal-curve-based model to obtain data sequences consisting of the seed points and their corresponding projection indices. The second subnetwork trains an improved differential-evolution-based artificial neural network to reduce the model error. The third subnetwork uses the parameters of the artificial neural network to express a smooth mathematical description of the prostate contour. The performance of the H-ProSeg method was assessed in 55 brachytherapy patients using the Dice similarity coefficient (DSC), Jaccard similarity coefficient (Ω), and accuracy (ACC).
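The abstract does not give the authors' exact model, but the overall idea — fit a smooth, closed-form contour expression to seed points paired with projection indices, with the fitting error minimized by differential evolution — can be sketched minimally. In this hypothetical illustration a truncated Fourier series stands in for the ANN-based contour expression, and its coefficients play the role of the network weights; none of the function names or parameter choices below come from the paper.

```python
import math
import random

def contour(params, t):
    # Smooth closed-contour model: a first-order Fourier series in the
    # projection index t in [0, 1). A stand-in for the paper's ANN-based
    # expression; params plays the role of the trained network weights.
    ax, bx, ay, by, cx, cy = params
    x = cx + ax * math.cos(2 * math.pi * t) + bx * math.sin(2 * math.pi * t)
    y = cy + ay * math.cos(2 * math.pi * t) + by * math.sin(2 * math.pi * t)
    return x, y

def fitting_error(params, seeds):
    # Mean squared distance between the model and the seed points, each
    # seed paired with its projection index t (as the principal-curve
    # stage would supply).
    err = 0.0
    for t, (sx, sy) in seeds:
        x, y = contour(params, t)
        err += (x - sx) ** 2 + (y - sy) ** 2
    return err / len(seeds)

def differential_evolution(seeds, pop_size=30, gens=200, F=0.7, CR=0.9, dim=6):
    # Classic DE/rand/1/bin: mutate with a scaled difference of two
    # population members, crossover per coordinate, keep the trial if
    # it lowers the fitting error.
    rng = random.Random(0)
    pop = [[rng.uniform(-2.0, 2.0) for _ in range(dim)] for _ in range(pop_size)]
    cost = [fitting_error(ind, seeds) for ind in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if rng.random() < CR else pop[i][d]
                     for d in range(dim)]
            tc = fitting_error(trial, seeds)
            if tc < cost[i]:
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Toy seed points on a unit circle, with their projection indices.
seeds = [(k / 8, (math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8)))
         for k in range(8)]
params, err = differential_evolution(seeds)
```

After a few hundred generations the residual `err` should be near zero on this toy input; the fitted `params` then give a closed-form, differentiable contour, which is the sense in which such a model is "explainable" compared with a pixel-wise deep network.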

RESULTS

The H-ProSeg method achieved excellent segmentation accuracy, with DSC, Ω, and ACC values of 95.8%, 94.3%, and 95.4%, respectively. Even under heavy Gaussian noise (standard deviation of the Gaussian function, σ = 50), the DSC, Ω, and ACC values remained as high as 93.3%, 91.9%, and 93.0%, respectively. As σ increased from 10 to 50, the DSC, Ω, and ACC values fluctuated by at most approximately 2.5%, demonstrating the excellent robustness of our method.
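The three reported metrics have standard definitions over binary masks. As a minimal sketch (the coordinate-set representation and the toy 4×4 masks below are illustrative, not from the paper):

```python
def dice(pred, truth):
    # Dice similarity coefficient (DSC): 2|A∩B| / (|A| + |B|),
    # with masks given as sets of foreground pixel coordinates.
    inter = len(pred & truth)
    return 2 * inter / (len(pred) + len(truth))

def jaccard(pred, truth):
    # Jaccard similarity coefficient (Ω): intersection over union.
    return len(pred & truth) / len(pred | truth)

def accuracy(pred, truth, total_pixels):
    # Pixel accuracy (ACC): correctly classified pixels (foreground
    # and background) over all pixels in the image.
    tp = len(pred & truth)
    fp = len(pred - truth)
    fn = len(truth - pred)
    tn = total_pixels - tp - fp - fn
    return (tp + tn) / total_pixels

# Toy 4x4 image: prediction overlaps ground truth in 3 of 4 pixels.
truth = {(0, 0), (0, 1), (1, 0), (1, 1)}
pred = {(0, 0), (0, 1), (1, 0), (2, 2)}
print(dice(pred, truth))          # 0.75
print(jaccard(pred, truth))       # 0.6
print(accuracy(pred, truth, 16))  # 0.875
```

Note that DSC is always at least as large as Ω for the same pair of masks (DSC = 2Ω/(1+Ω)), which is consistent with the reported 95.8% vs. 94.3%.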

CONCLUSIONS

Here, we present a hybrid method for accurate and robust prostate ultrasound image segmentation. The H-ProSeg method achieved superior performance compared with current state-of-the-art techniques. Precise knowledge of the prostate boundary is crucial for sparing organs at risk. The proposed models have the potential to improve prostate cancer diagnosis and therapeutic outcomes.


Similar Articles

1. H-ProSeg: Hybrid ultrasound prostate segmentation based on explainability-guided mathematical model. Comput Methods Programs Biomed. 2022 Jun;219:106752. doi: 10.1016/j.cmpb.2022.106752. Epub 2022 Mar 17.
2. Boundary delineation in transrectal ultrasound images for region of interest of prostate. Phys Med Biol. 2023 Sep 20;68(19). doi: 10.1088/1361-6560/acf5c5.
3. Semi-Automatic Prostate Segmentation From Ultrasound Images Using Machine Learning and Principal Curve Based on Interpretable Mathematical Model Expression. Front Oncol. 2022 Jun 7;12:878104. doi: 10.3389/fonc.2022.878104. eCollection 2022.
4. Automatic prostate segmentation using deep learning on clinically diverse 3D transrectal ultrasound images. Med Phys. 2020 Jun;47(6):2413-2426. doi: 10.1002/mp.14134. Epub 2020 Apr 8.
5. Deep learning-based ultrasound auto-segmentation of the prostate with brachytherapy implanted needles. Med Phys. 2024 Apr;51(4):2665-2677. doi: 10.1002/mp.16811. Epub 2023 Oct 27.
6. Ultrasound Prostate Segmentation Using Adaptive Selection Principal Curve and Smooth Mathematical Model. J Digit Imaging. 2023 Jun;36(3):947-963. doi: 10.1007/s10278-023-00783-3. Epub 2023 Feb 2.
7. A deep learning method for real-time intraoperative US image segmentation in prostate brachytherapy. Int J Comput Assist Radiol Surg. 2020 Sep;15(9):1467-1476. doi: 10.1007/s11548-020-02231-x. Epub 2020 Jul 20.
8. Label-driven magnetic resonance imaging (MRI)-transrectal ultrasound (TRUS) registration using weakly supervised learning for MRI-guided prostate radiotherapy. Phys Med Biol. 2020 Jun 26;65(13):135002. doi: 10.1088/1361-6560/ab8cd6.
9. Ultrasound prostate segmentation based on multidirectional deeply supervised V-Net. Med Phys. 2019 Jul;46(7):3194-3206. doi: 10.1002/mp.13577. Epub 2019 May 29.
10. Intelligent contour extraction approach for accurate segmentation of medical ultrasound images. Front Physiol. 2023 Aug 22;14:1177351. doi: 10.3389/fphys.2023.1177351. eCollection 2023.

Cited By

1. Knowledge-Informed Machine Learning for Cancer Diagnosis and Prognosis: A Review. IEEE Trans Autom Sci Eng. 2025;22:10008-10028. doi: 10.1109/tase.2024.3515839. Epub 2024 Dec 18.
2. Mathematical modeling in radiotherapy for cancer: a comprehensive narrative review. Radiat Oncol. 2025 Apr 4;20(1):49. doi: 10.1186/s13014-025-02626-7.
3. A literature review of artificial intelligence (AI) for medical image segmentation: from AI and explainable AI to trustworthy AI. Quant Imaging Med Surg. 2024 Dec 5;14(12):9620-9652. doi: 10.21037/qims-24-723. Epub 2024 Nov 29.
4. Artificial intelligence in interventional radiotherapy (brachytherapy): Enhancing patient-centered care and addressing patients' needs. Clin Transl Radiat Oncol. 2024 Sep 22;49:100865. doi: 10.1016/j.ctro.2024.100865. eCollection 2024 Nov.
5. A bi-directional segmentation method for prostate ultrasound images under semantic constraints. Sci Rep. 2024 May 22;14(1):11701. doi: 10.1038/s41598-024-61238-5.
6. SAA-SDM: Neural Networks Faster Learned to Segment Organ Images. J Imaging Inform Med. 2024 Apr;37(2):547-562. doi: 10.1007/s10278-023-00947-1. Epub 2024 Jan 10.
7. Novel Solution for Using Neural Networks for Kidney Boundary Extraction in 2D Ultrasound Data. Biomolecules. 2023 Oct 19;13(10):1548. doi: 10.3390/biom13101548.
8. The use of deep learning in interventional radiotherapy (brachytherapy): A review with a focus on open source and open data. Z Med Phys. 2024 May;34(2):180-196. doi: 10.1016/j.zemedi.2022.10.005. Epub 2022 Nov 12.
9. Semi-Automatic Prostate Segmentation From Ultrasound Images Using Machine Learning and Principal Curve Based on Interpretable Mathematical Model Expression. Front Oncol. 2022 Jun 7;12:878104. doi: 10.3389/fonc.2022.878104. eCollection 2022.