School of Physics, Mathematics and Computing, University of Western Australia, Perth, WA, Australia.
Medical School, University of Western Australia, Crawley, WA, Australia.
Eur J Nucl Med Mol Imaging. 2022 Dec;50(1):67-79. doi: 10.1007/s00259-022-05927-1. Epub 2022 Aug 17.
This study aimed to develop and assess an automated segmentation framework based on deep learning for metastatic prostate cancer (mPCa) lesions in whole-body [68Ga]Ga-PSMA-11 PET/CT images for the purpose of extracting patient-level prognostic biomarkers.
Three hundred thirty-seven [68Ga]Ga-PSMA-11 PET/CT images were retrieved from a cohort of biochemically recurrent PCa patients. A fully 3D convolutional neural network (CNN), based on the self-configuring nnU-Net framework, was trained on a subset of these scans, with an independent test set reserved for model evaluation. Voxel-level segmentation results were assessed using the Dice similarity coefficient (DSC), positive predictive value (PPV), and sensitivity. Sensitivity and PPV were calculated to assess lesion-level detection; patient-level classification results were assessed by accuracy, PPV, and sensitivity. The whole-body biomarkers total lesional volume (TLV) and total lesional uptake (TLU) were calculated from the automated segmentations, and Kaplan-Meier analysis was used to assess the relationship of these biomarkers with patient overall survival.
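The voxel-level metrics and whole-body biomarkers named above follow standard definitions; the sketch below illustrates them for binary segmentation masks. Array names, shapes, and the exact form of TLU (here SUV summed over segmented voxels times voxel volume) are illustrative assumptions, not the paper's code.

```python
# Illustrative sketch: voxel-level DSC/PPV/sensitivity and whole-body TLV/TLU.
import numpy as np

def voxel_metrics(pred: np.ndarray, ref: np.ndarray) -> dict:
    """Dice similarity coefficient, positive predictive value, and sensitivity
    for binary masks of identical shape."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    tp = np.logical_and(pred, ref).sum()
    fp = np.logical_and(pred, ~ref).sum()
    fn = np.logical_and(~pred, ref).sum()
    dsc = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else np.nan
    ppv = tp / (tp + fp) if (tp + fp) else np.nan
    sens = tp / (tp + fn) if (tp + fn) else np.nan
    return {"DSC": dsc, "PPV": ppv, "sensitivity": sens}

def whole_body_biomarkers(mask: np.ndarray, suv: np.ndarray,
                          voxel_volume_ml: float) -> dict:
    """Total lesional volume (TLV, ml) and total lesional uptake (TLU),
    here taken as the SUV summed over segmented voxels times voxel volume."""
    mask = mask.astype(bool)
    tlv = mask.sum() * voxel_volume_ml
    tlu = suv[mask].sum() * voxel_volume_ml
    return {"TLV_ml": tlv, "TLU": tlu}
```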
At the patient level, the accuracy, sensitivity, and PPV were all > 90%, with the best metric being the PPV (97.2%). PPV and sensitivity at the lesion level were 88.2% and 73.0%, respectively. DSC and PPV measured at the voxel level performed within measured inter-observer variability (DSC, median = 50.7% vs. second observer = 32%, p = 0.012; PPV, median = 64.9% vs. second observer = 25.7%, p < 0.005). Kaplan-Meier analysis of TLV and TLU showed they were significantly associated with patient overall survival (both p < 0.005).
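A minimal sketch of the survival analysis reported above, assuming one row per patient with follow-up time, an event indicator, and the automated TLV; the file name, column names, and the median-TLV split are hypothetical, and the lifelines package is used here simply as one common implementation of Kaplan-Meier curves and the log-rank test.

```python
# Hypothetical Kaplan-Meier stratification by automated whole-body TLV.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("patient_biomarkers.csv")        # hypothetical table
high = df["TLV_ml"] > df["TLV_ml"].median()       # illustrative median split

kmf = KaplanMeierFitter()
for label, group in [("high TLV", df[high]), ("low TLV", df[~high])]:
    kmf.fit(group["survival_months"], event_observed=group["death"], label=label)
    kmf.plot_survival_function()                  # overlay both survival curves

result = logrank_test(
    df.loc[high, "survival_months"], df.loc[~high, "survival_months"],
    event_observed_A=df.loc[high, "death"], event_observed_B=df.loc[~high, "death"])
print(f"log-rank p = {result.p_value:.4f}")
```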
The fully automated assessment of whole-body [68Ga]Ga-PSMA-11 PET/CT images using deep learning shows significant promise, yielding accurate scan classification, voxel-level segmentations within inter-observer variability, and potentially clinically useful prognostic biomarkers associated with patient overall survival.
This study was registered with the Australian New Zealand Clinical Trials Registry (ACTRN12615000608561) on 11 June 2015.