Liang Xiaokun, Bibault Jean-Emmanuel, Leroy Thomas, Escande Alexandre, Zhao Wei, Chen Yizheng, Buyyounouski Mark K, Hancock Steven L, Bagshaw Hilary, Xing Lei
Department of Radiation Oncology, Stanford University, Stanford, CA, 94305, USA.
Department of Radiation Oncology, Clinique des Dentellières, Valenciennes, France.
Med Phys. 2021 Apr;48(4):1764-1770. doi: 10.1002/mp.14755. Epub 2021 Mar 1.
To develop and evaluate a deep unsupervised learning (DUL) framework based on a regional deformable model for automated prostate contour propagation from planning computed tomography (pCT) to cone-beam CT (CBCT).
We introduce a DUL model to map the prostate contour from pCT to on-treatment CBCT. The DUL framework used a regional deformable model via narrow-band mapping to augment the conventional strategy. Two hundred and fifty-one anonymized CBCT images from prostate cancer patients were retrospectively selected and divided into three sets: 180 were used for training, 12 for validation, and 59 for testing. The testing dataset was divided into two groups. Group 1 contained 50 CBCT volumes, each with one physician-generated prostate contour on the CBCT image. Group 2 contained nine CBCT images, each including prostate contours delineated by four independent physicians and a consensus contour generated using the STAPLE method. The proposed DUL contours were compared with the physician-generated contours using Dice similarity coefficients (DSCs), Hausdorff distances, and center-of-mass distances.
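The three evaluation metrics named above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code; it assumes the contours are available as binary 3D masks, and the Hausdorff computation is a brute-force version suitable only for small masks.

```python
import numpy as np

def dice_coefficient(a, b):
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def center_of_mass_distance(a, b, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance (in mm, given voxel spacing) between the mask centroids."""
    ca = np.array(np.nonzero(a)).mean(axis=1) * np.asarray(spacing)
    cb = np.array(np.nonzero(b)).mean(axis=1) * np.asarray(spacing)
    return np.linalg.norm(ca - cb)

def hausdorff_distance(a, b, spacing=(1.0, 1.0, 1.0)):
    """Symmetric Hausdorff distance between the voxel sets (brute force)."""
    pa = np.array(np.nonzero(a)).T * np.asarray(spacing)
    pb = np.array(np.nonzero(b)).T * np.asarray(spacing)
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

In practice these metrics are usually computed with a medical-imaging toolkit that accounts for anisotropic voxel spacing and surface (rather than volume) distances; the `spacing` argument here is a simplified stand-in for that.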
The average DSCs between DUL-based prostate contours and reference contours for test data in group 1 and the group 2 consensus were 0.83 ± 0.04 and 0.85 ± 0.04, respectively. Correspondingly, the mean center-of-mass distances were 3.52 ± 1.15 mm and 2.98 ± 1.42 mm, respectively.
This novel DUL technique can automatically propagate the contour of the prostate from pCT to CBCT. The proposed method shows that highly accurate contour propagation for CBCT-guided adaptive radiotherapy is achievable with deep learning.