Segmentation of the prostate, its zones, anterior fibromuscular stroma, and urethra on the MRIs and multimodality image fusion using U-Net model.

Author information

Rezaeijo Seyed Masoud, Jafarpoor Nesheli Shabnam, Fatan Serj Mehdi, Tahmasebi Birgani Mohammad Javad

Affiliations

Department of Medical Physics, Faculty of Medicine, Ahvaz Jundishapur University of Medical Sciences, Ahvaz, Iran.

Faculty of Engineering, University of Science and Culture, Tehran, Iran.

Publication information

Quant Imaging Med Surg. 2022 Oct;12(10):4786-4804. doi: 10.21037/qims-22-115.

Abstract

BACKGROUND

Due to the large variability of the prostate gland across patient groups, manual segmentation is time-consuming and subject to inter- and intra-reader variation. Hence, we propose a U-Net model to automatically segment the prostate, its zones [the peripheral zone (PZ) and transitional zone (TZ)], the anterior fibromuscular stroma (AFMS), and the urethra on MRI [T2-weighted (T2W), diffusion-weighted imaging (DWI), and apparent diffusion coefficient (ADC)] and on multimodality fused images.
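
A minimal sketch of a U-Net of the kind described, written in PyTorch. The abstract does not specify the framework, network depth, channel widths, or output head, so all of those are assumptions here, not the authors' implementation.

```python
import torch
import torch.nn as nn

def conv_block(c_in: int, c_out: int) -> nn.Sequential:
    """Two 3x3 conv + BN + ReLU layers: the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    """Minimal 2D U-Net: 4 pooling stages down, 4 upsampling stages back,
    with skip connections; expects input sizes divisible by 16."""
    def __init__(self, c_in: int = 1, n_classes: int = 6, base: int = 32):
        super().__init__()
        chans = [base * 2 ** i for i in range(5)]  # e.g., 32, 64, 128, 256, 512
        self.downs = nn.ModuleList()
        prev = c_in
        for c in chans:
            self.downs.append(conv_block(prev, c))
            prev = c
        self.pool = nn.MaxPool2d(2)
        self.ups, self.decs = nn.ModuleList(), nn.ModuleList()
        for c in reversed(chans[:-1]):
            self.ups.append(nn.ConvTranspose2d(prev, c, 2, stride=2))
            self.decs.append(conv_block(2 * c, c))  # skip + upsampled features
            prev = c
        self.head = nn.Conv2d(prev, n_classes, 1)  # per-pixel class logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        skips = []
        for i, block in enumerate(self.downs):
            x = block(x)
            if i < len(self.downs) - 1:  # last block is the bottleneck
                skips.append(x)
                x = self.pool(x)
        for up, dec, skip in zip(self.ups, self.decs, reversed(skips)):
            x = dec(torch.cat([skip, up(x)], dim=1))
        return self.head(x)

# e.g., logits = UNet(c_in=1, n_classes=6)(torch.randn(1, 1, 256, 256))
# n_classes=6 assumes background plus the five reported structures; the
# abstract does not state how overlapping labels (e.g., whole prostate
# vs. its zones) were encoded.
```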

METHODS

A total of 91 eligible patients were retrospectively identified; 50 were used for training in a 10-fold cross-validation fashion and 41 for external testing. First, the images were registered and then cropped with a bounding box. In addition to the individual T2W, DWI, and ADC images, fused images were used: three combinations (T2W + DWI, T2W + ADC, and DWI + ADC) were generated with the wavelet transform, as sketched below. The U-Net was applied to segment the prostate, its zones, the AFMS, and the urethra. Finally, the Dice score (DSC), intersection over union (IoU), precision, recall, and Hausdorff distance (HD) were used to evaluate the proposed model.
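
The abstract names the wavelet transform as the fusion method but not the fusion rule. Below is a minimal sketch using the PyWavelets library with one common rule: average the low-frequency approximation coefficients and keep the larger-magnitude detail coefficients. The rule and the `db1` wavelet are assumptions.

```python
import numpy as np
import pywt

def fuse_wavelet(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "db1") -> np.ndarray:
    """Fuse two co-registered, equally sized slices in the wavelet domain."""
    cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a, wavelet)
    cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b, wavelet)

    cA = (cA_a + cA_b) / 2.0  # low-frequency content: average
    details = tuple(
        np.where(np.abs(d_a) >= np.abs(d_b), d_a, d_b)  # high-frequency: max magnitude
        for d_a, d_b in ((cH_a, cH_b), (cV_a, cV_b), (cD_a, cD_b))
    )
    return pywt.idwt2((cA, details), wavelet)

# e.g., fused = fuse_wavelet(t2w_slice, dwi_slice)  # the T2W + DWI combination
```

This assumes the two inputs are already spatially aligned, consistent with the registration step the abstract describes.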

RESULTS

Using T2W images alone on the external test set, higher DSC, IoU, precision, and recall were achieved than with the individual DWI and ADC images: DSC of 95%, 94%, 98%, 94%, and 88%; IoU of 88%, 88.5%, 96%, 90%, and 79%; precision of 95.9%, 93.9%, 97.6%, 93.83%, and 87.82%; and recall of 94.2%, 94.2%, 98.3%, 94%, and 87.93% for the whole prostate, PZ, TZ, urethra, and AFMS, respectively. The results clearly show that the best segmentation was obtained when the model was trained on T2W + DWI images: DSC of 99.06%, 99.05%, 99.04%, 99.09%, and 98.08%; IoU of 97.09%, 97.02%, 98.12%, 98.13%, and 96%; precision of 99.24%, 98.22%, 98.91%, 99.23%, and 98.9%; and recall of 98.3%, 99.8%, 99.02%, 98.93%, and 97.51% for the whole prostate, PZ, TZ, urethra, and AFMS, respectively. Among the three combinations, the minimum HD on the test set was 0.29, obtained with T2W + ADC for the whole-prostate class.
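
For reference, a minimal sketch of how the five reported metrics can be computed from a pair of binary masks (NumPy/SciPy). Per-structure binarization is assumed, and the HD here is in pixel units unless the point coordinates are scaled by the voxel spacing; the abstract does not state its units.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    """DSC, IoU, precision, recall, and symmetric Hausdorff distance
    for two non-empty binary masks of the same shape."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()

    # Symmetric HD: max of the two directed Hausdorff distances
    # between the masks' foreground point sets.
    p_pts, g_pts = np.argwhere(pred), np.argwhere(gt)
    hd = max(directed_hausdorff(p_pts, g_pts)[0],
             directed_hausdorff(g_pts, p_pts)[0])

    return {
        "DSC": 2 * tp / (2 * tp + fp + fn),
        "IoU": tp / (tp + fp + fn),
        "precision": tp / (tp + fp),
        "recall": tp / (tp + fn),
        "HD": hd,
    }
```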

CONCLUSIONS

Better performance was achieved using T2W + DWI images than using T2W, DWI, or ADC alone, or the T2W + ADC and DWI + ADC combinations.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/22e0/9511435/02ee904c6d0e/qims-12-10-4786-f1.jpg
