Department of Radiation Oncology, Medical Artificial Intelligence and Automation Laboratory, University of Texas Southwestern, Dallas, TX, United States of America. Co-first authors.
Phys Med Biol. 2018 Dec 14;63(24):245015. doi: 10.1088/1361-6560/aaf11c.
Accurate segmentation of the prostate and surrounding organs at risk is important for prostate cancer radiotherapy treatment planning. We present a fully automated workflow for male pelvic CT image segmentation using deep learning. The architecture consists of a 2D organ-volume localization network followed by a 3D segmentation network for volumetric segmentation of the prostate, bladder, rectum, and femoral heads. We used a multi-channel 2D U-Net followed by a 3D U-Net whose encoding arm is modified with aggregated residual networks (ResNeXt). The models were trained and tested on a pelvic CT image dataset comprising 136 patients. Test results show that 3D U-Net based segmentation achieves mean (±SD) Dice coefficient values of 90 (±2.0)%, 96 (±3.0)%, 95 (±1.3)%, 95 (±1.5)%, and 84 (±3.7)% for the prostate, left femoral head, right femoral head, bladder, and rectum, respectively, using the proposed fully automated segmentation method.
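The Dice coefficients reported above follow the standard definition DSC = 2|A∩B| / (|A| + |B|) for a predicted mask A and a reference mask B. As a minimal sketch (not the authors' evaluation code), this can be computed on binary 3D masks with NumPy; the toy masks below are illustrative stand-ins for a predicted and a reference organ volume:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / denom

# Toy 3D masks (hypothetical, for illustration only)
pred = np.zeros((4, 4, 4), dtype=bool)
truth = np.zeros((4, 4, 4), dtype=bool)
pred[1:3, 1:3, 1:3] = True   # 8 predicted voxels
truth[1:3, 1:3, :] = True    # 16 reference voxels; overlap is 8 voxels
print(dice_coefficient(pred, truth))  # 2*8 / (8 + 16) = 0.666...
```

In practice the metric would be computed per organ on the binarized network output against the clinician contour, which is how per-organ means and standard deviations such as those in the abstract are obtained.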