Laboratory of Biomedical Diagnostics, Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands.
Eur Urol Focus. 2021 Jan;7(1):78-85. doi: 10.1016/j.euf.2019.04.009. Epub 2019 Apr 23.
Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI-transrectal ultrasound (TRUS) fusion prostate biopsies, these procedures are time consuming, laborious, and costly. Introduction of a deep-learning approach could improve prostate segmentation.
To exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners.
DESIGN, SETTING, AND PARTICIPANTS: Three datasets with TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men.
Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of the pixel-wise accuracy, Jaccard index, and Hausdorff distance.
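The three evaluation metrics named above can be sketched for a pair of binary segmentation masks. This is a minimal illustration, not the authors' evaluation code; the function name and the use of all foreground pixels (rather than extracted contours) for the Hausdorff distance are assumptions for brevity.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def segmentation_metrics(pred, gt, pixel_spacing_mm=1.0):
    """Pixel-wise accuracy, Jaccard index, and Hausdorff distance
    between two binary masks given as 2-D arrays.

    Note: a hypothetical helper, not the paper's implementation. The
    Hausdorff distance is computed over all foreground pixels; using
    boundary pixels only would be a common refinement.
    """
    pred = np.asarray(pred).astype(bool)
    gt = np.asarray(gt).astype(bool)

    # Fraction of pixels on which the two masks agree.
    accuracy = np.mean(pred == gt)

    # Jaccard index: |intersection| / |union| of the foreground regions.
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    jaccard = intersection / union if union else 1.0

    # Symmetric Hausdorff distance, scaled to millimetres.
    p_pts = np.argwhere(pred)
    g_pts = np.argwhere(gt)
    hausdorff = max(directed_hausdorff(p_pts, g_pts)[0],
                    directed_hausdorff(g_pts, p_pts)[0]) * pixel_spacing_mm

    return accuracy, jaccard, hausdorff
```

In practice the Hausdorff distance is the most sensitive of the three to isolated outlier pixels, which is why it is usually reported alongside a region-overlap measure such as the Jaccard index.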
The developed deep-learning approach was demonstrated to significantly improve prostate segmentation compared with a conventional automated technique, reaching a median pixel-wise accuracy of 98% (95% confidence interval 95-99%), a Jaccard index of 0.93 (0.80-0.96), and a Hausdorff distance of 3.0 (1.3-8.7) mm. Zonal segmentation yielded pixel-wise accuracies of 97% (95-99%) and 98% (96-99%) for the peripheral and transition zones, respectively. Supervised domain adaptation preserved this high performance when the method was applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p < 0.001), indicating that potentially incorrect segmentations can be identified swiftly.
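The self-assessment check reported above can be outlined as follows: for each image, correlate the network's own quality estimate with the Jaccard index measured against expert ground truth. The arrays below are hypothetical placeholder values, not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-image values: the network's predicted quality score for
# each segmentation, and the Jaccard index measured against ground truth.
predicted_quality = np.array([0.95, 0.88, 0.91, 0.60, 0.85, 0.93])
measured_jaccard = np.array([0.94, 0.86, 0.90, 0.55, 0.80, 0.92])

# Pearson's correlation between self-assessment and actual performance.
r, p_value = pearsonr(predicted_quality, measured_jaccard)

# A strong positive r with a small p-value indicates the self-assessment
# tracks actual performance, so low-confidence cases can be flagged for
# manual review before being used for biopsy guidance.
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
```

A reliable self-assessment of this kind is what allows incorrect segmentations to be caught without re-measuring every case against manual delineations.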
Fusion-guided prostate biopsies, in which suspicious lesions identified on MRI are targeted under TRUS guidance, are increasingly performed. The requirement for (semi)manual prostate delineation places a substantial burden on clinicians. Deep learning provides a means for fast and accurate (zonal) prostate segmentation of TRUS images that translates to different scanners.
Artificial intelligence for automatic delineation of the prostate on ultrasound was shown to be reliable and applicable to different scanners. This method can, for example, be applied to speed up, and possibly improve, guided prostate biopsies using magnetic resonance imaging-transrectal ultrasound fusion.