An Yichen, Wang Zhimin, Ma Eric, Jiang Hao, Lu Weiguo
NeuralRad LLC, Madison, WI, USA.
Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, TX, USA.
Head Neck Tumor Segm MR Guid Appl (2024). 2025;15273:222-229. doi: 10.1007/978-3-031-83274-1_17. Epub 2025 Mar 3.
Auto-segmentation of gross tumor volumes (GTVs) in head and neck cancer (HNC) using MRI-guided radiotherapy (RT) images presents a significant challenge; solving it could greatly enhance clinical workflows in radiation oncology. In this study, we developed a novel deep learning model based on the nnUNetv2 framework, augmented with an autoencoder architecture. Our model introduces the original training images as an additional channel and incorporates an MSE loss function to improve segmentation accuracy. The model was trained on a dataset of 150 HNC patients, with a private evaluation of 50 test patients as part of the HNTS-MRG 2024 challenge. The aggregated Dice similarity coefficient (DSCagg) for metastatic lymph nodes (GTVn) reached 0.8516, while the primary tumor (GTVp) scored 0.7318, giving an average DSCagg of 0.7917 across both structures. By introducing an autoencoder output channel and combining Dice loss with mean squared error (MSE) loss, the enhanced nnUNet architecture effectively learned additional image features, improving segmentation accuracy. These findings suggest that deep learning models like our modified nnUNetv2 framework can significantly improve auto-segmentation accuracy in MRI-guided RT for HNC, contributing to more precise and efficient clinical workflows.
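The combined objective described above pairs a soft Dice loss on the segmentation channel with an MSE reconstruction loss on the autoencoder output channel. The sketch below illustrates one plausible form of such a loss in plain NumPy; the function names, the additive combination, and the `mse_weight` parameter are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    # Soft Dice loss over flattened probability maps:
    # 1 - 2|P∩T| / (|P| + |T|), with eps for numerical stability.
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def combined_loss(seg_pred, seg_target, recon_pred, image, mse_weight=1.0):
    # Dice loss on the segmentation channel plus MSE between the
    # autoencoder reconstruction channel and the original image.
    # The relative weight is a hypothetical hyperparameter.
    mse = np.mean((recon_pred - image) ** 2)
    return dice_loss(seg_pred, seg_target) + mse_weight * mse

# A perfect segmentation with a perfect reconstruction drives the loss to ~0,
# while reconstruction error adds a penalty that encourages the shared encoder
# to retain image features beyond those needed for the mask alone.
```

In a training loop, `seg_pred` and `recon_pred` would be the two decoder output channels, so gradients from both terms flow into the shared encoder.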