
Real-time CBCT imaging and motion tracking via a single arbitrarily-angled x-ray projection by a joint dynamic reconstruction and motion estimation (DREME) framework.

Author information

Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang

Affiliations

The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, United States of America.

Department of Imaging Physics, University of Texas MD Anderson Cancer Center, Houston, TX 77030, United States of America.

Publication information

Phys Med Biol. 2025 Jan 21;70(2):025026. doi: 10.1088/1361-6560/ada519.

Abstract

Real-time cone-beam computed tomography (CBCT) provides instantaneous visualization of patient anatomy for image guidance, motion tracking, and online treatment adaptation in radiotherapy. While many real-time imaging and motion tracking methods leverage patient-specific prior information to alleviate under-sampling challenges and meet the temporal constraint (<500 ms), the prior information can be outdated and introduce biases, thus compromising the imaging and motion tracking accuracy. To address this challenge, we developed a joint Dynamic REconstruction and Motion Estimation (DREME) framework for real-time CBCT imaging and motion estimation, without relying on patient-specific prior knowledge.

DREME incorporates a deep learning-based real-time CBCT imaging and motion estimation method into a dynamic CBCT reconstruction framework. The reconstruction framework reconstructs a dynamic sequence of CBCTs in a data-driven manner from a standard pre-treatment scan, without requiring patient-specific prior knowledge. Meanwhile, a convolutional neural network-based motion encoder is jointly trained during the reconstruction to learn motion-related features relevant for real-time motion estimation, based on a single arbitrarily-angled x-ray projection. DREME was tested on digital phantom simulations and real patient studies.

DREME accurately solved 3D respiration-induced anatomical motion in real time (∼1.5 ms inference time per x-ray projection). For the digital phantom studies, it achieved an average lung tumor center-of-mass localization error of 1.2 ± 0.9 mm (mean ± SD). For the patient studies, it achieved a real-time tumor localization accuracy of 1.6 ± 1.6 mm in the projection domain.

DREME achieves CBCT and volumetric motion estimation in real time from a single x-ray projection at arbitrary angles, paving the way for future clinical applications in intra-fractional motion management. In addition, when combined with real-time dose calculation, it can be used for dose tracking and treatment assessment.
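To make the real-time inference step concrete: frameworks of this kind commonly represent the time-varying anatomy as a reference CBCT deformed by a vector field that is a linear combination of a small set of learned motion basis components, so the per-projection encoder only has to predict a few coefficients. The sketch below is a hypothetical illustration of that composition step, not the authors' implementation; the array shapes, the `compose_dvf` helper, and the toy dimensions are all assumptions.

```python
import numpy as np

def compose_dvf(basis, coeffs):
    """Combine K motion basis components, shaped (K, 3, D, H, W), with
    per-frame coefficients, shaped (K,), into a single deformation vector
    field of shape (3, D, H, W). In a DREME-style pipeline the coefficients
    would come from the CNN motion encoder applied to one x-ray projection
    (a hypothetical interface, for illustration only)."""
    # tensordot contracts the K axis of coeffs with the K axis of basis
    return np.tensordot(coeffs, basis, axes=1)

# Toy example: 2 basis components on a tiny 4x4x4 voxel grid.
rng = np.random.default_rng(0)
basis = rng.normal(size=(2, 3, 4, 4, 4))   # learned during reconstruction
coeffs = np.array([0.7, -0.3])             # predicted per projection
dvf = compose_dvf(basis, coeffs)

assert dvf.shape == (3, 4, 4, 4)
assert np.allclose(dvf, 0.7 * basis[0] - 0.3 * basis[1])
```

Because the basis is fixed after training, each new projection only costs one encoder forward pass plus this weighted sum, which is what makes millisecond-scale inference plausible.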


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ed09/11747166/9d0bc945aff7/pmbada519f1_hr.jpg
