Kan Chi Nok Enoch, Gilat-Schmidt Taly, Ye Dong Hye
Department of Electrical and Computer Engineering, Marquette University, Milwaukee, USA.
Proc SPIE Int Soc Opt Eng. 2021 Feb;11596. doi: 10.1117/12.2582127. Epub 2021 Feb 15.
Accurately segmenting organs in abdominal computed tomography (CT) scans is crucial for clinical applications such as pre-operative planning and dose estimation. With the recent advent of deep learning algorithms, many robust frameworks have been proposed for organ segmentation in abdominal CT images. However, many of these frameworks require large amounts of training data to achieve high segmentation accuracy. Pediatric abdominal CT images containing reproductive organs are particularly hard to obtain because these organs are extremely sensitive to ionizing radiation, which makes it very challenging to train automatic segmentation algorithms for organs such as the uterus and the prostate. To address these issues, we propose a novel segmentation network with a built-in auxiliary classifier generative adversarial network (ACGAN) that conditionally generates additional features during training. The proposed CFG-SegNet (conditional feature generation segmentation network) is trained with a single loss function that combines adversarial, reconstruction, auxiliary classifier, and segmentation losses. 2.5D segmentation experiments are performed on a custom dataset of 24 female CT volumes containing the uterus and 40 male CT volumes containing the prostate. With 4-fold cross validation, CFG-SegNet achieves an average segmentation accuracy of 0.929 DSC (Dice Similarity Coefficient) on the prostate and 0.724 DSC on the uterus. The results show that our network achieves high segmentation accuracy and has the potential to precisely segment difficult organs when few training images are available.
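To illustrate the kind of combined objective described in the abstract, the sketch below shows one plausible way to assemble adversarial, reconstruction, auxiliary classifier, and segmentation terms into a single loss in PyTorch. This is a minimal, hedged example, not the authors' implementation: the specific loss forms (BCE adversarial loss, L1 reconstruction, cross-entropy auxiliary classification, soft Dice segmentation) and the weighting coefficients are assumptions introduced here for illustration.

```python
# Illustrative sketch of a CFG-SegNet-style combined training loss.
# Loss forms and lambda weights are assumptions, not the paper's exact choices.
import torch
import torch.nn.functional as F


def soft_dice_loss(seg_logits, seg_mask, eps=1e-6):
    """1 - soft Dice coefficient for a binary segmentation mask (N, 1, H, W)."""
    pred = torch.sigmoid(seg_logits)
    inter = (pred * seg_mask).sum(dim=(1, 2, 3))
    union = pred.sum(dim=(1, 2, 3)) + seg_mask.sum(dim=(1, 2, 3))
    return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()


def combined_loss(d_fake_logits,          # discriminator output on generated features
                  recon, target_img,      # reconstructed vs. real image
                  aux_logits, class_labels,  # auxiliary classifier head
                  seg_logits, seg_mask,   # segmentation head
                  lam_adv=1.0, lam_rec=10.0, lam_aux=1.0, lam_seg=1.0):
    # Adversarial (generator-side) term: encourage the discriminator to
    # label generated samples as real.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    # Reconstruction term between generated and real images.
    rec = F.l1_loss(recon, target_img)
    # Auxiliary classifier term (class-conditional label prediction).
    aux = F.cross_entropy(aux_logits, class_labels)
    # Segmentation term on the predicted organ mask (also the DSC metric's
    # differentiable counterpart).
    seg = soft_dice_loss(seg_logits, seg_mask)
    return lam_adv * adv + lam_rec * rec + lam_aux * aux + lam_seg * seg
```

In practice the reconstruction weight is often set higher than the other terms (as in the illustrative `lam_rec=10.0` above) so that adversarial training does not dominate the pixel-level fidelity of the generated features; the actual weighting used by CFG-SegNet is not stated in this abstract.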