Fu Yabo, Lei Yang, Wang Tonghe, Patel Pretesh, Jani Ashesh B, Mao Hui, Curran Walter J, Liu Tian, Yang Xiaofeng
Department of Radiation Oncology, Emory University, 1365 Clifton Road NE, Atlanta, GA 30322, United States.
Department of Radiation Oncology, Emory University, 1365 Clifton Road NE, Atlanta, GA 30322, United States; Winship Cancer Institute, Emory University, Atlanta, GA 30322, United States.
Med Image Anal. 2021 Jan;67:101845. doi: 10.1016/j.media.2020.101845. Epub 2020 Oct 7.
A non-rigid MR-TRUS image registration framework is proposed for prostate interventions. The framework consists of a convolutional neural network (CNN) for MR prostate segmentation, a CNN for TRUS prostate segmentation, and a point-cloud-based network for rapid 3D point cloud matching. Volumetric prostate point clouds were generated from the segmented prostate masks using tetrahedral meshing. The point cloud matching network was trained on deformation fields generated by finite element analysis; the network therefore implicitly models the underlying biomechanical constraints when performing point cloud matching. A total of 50 patients' datasets were used for network training and testing. Alignment of prostate shapes after registration was evaluated using three metrics: Dice similarity coefficient (DSC), mean surface distance (MSD), and Hausdorff distance (HD). Internal point-to-point registration accuracy was assessed using target registration error (TRE). The Jacobian determinant and strain tensors of the predicted deformation field were calculated to analyze its physical fidelity. On average, the mean and standard deviation were 0.94±0.02 for DSC, 0.90±0.23 mm for MSD, 2.96±1.00 mm for HD, and 1.57±0.77 mm for TRE. Robustness of the method to point cloud noise was evaluated by adding different levels of noise to the query point clouds. The results demonstrate that the proposed method can rapidly perform MR-TRUS image registration with good registration accuracy and robustness.
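The three shape-alignment metrics named above (DSC, MSD, HD) have standard definitions that can be sketched compactly. The following is a minimal illustration, not the authors' evaluation code: the function names `dice` and `surface_distances`, the toy voxel grid, and the use of brute-force pairwise distances (rather than distance transforms on extracted surfaces) are assumptions made for brevity.

```python
import numpy as np
from scipy.spatial.distance import cdist

def dice(a, b):
    """Dice similarity coefficient between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def surface_distances(p, q):
    """Symmetric mean surface distance and Hausdorff distance between
    two point sets p (N, 3) and q (M, 3)."""
    d = cdist(p, q)           # all pairwise Euclidean distances
    d_pq = d.min(axis=1)      # each point of p -> nearest point of q
    d_qp = d.min(axis=0)      # each point of q -> nearest point of p
    msd = (d_pq.mean() + d_qp.mean()) / 2.0
    hd = max(d_pq.max(), d_qp.max())
    return msd, hd

# Toy example: two voxelised spheres of radius 8, centres offset by 1 voxel.
zz, yy, xx = np.mgrid[0:32, 0:32, 0:32]
m1 = (xx - 15) ** 2 + (yy - 15) ** 2 + (zz - 15) ** 2 <= 8 ** 2
m2 = (xx - 16) ** 2 + (yy - 15) ** 2 + (zz - 15) ** 2 <= 8 ** 2
print(round(dice(m1, m2), 3))

# Tiny point-set example with hand-checkable distances.
p = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
q = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
print(surface_distances(p, q))  # -> (0.25, 0.5)
```

In practice MSD and HD are computed on extracted surface vertices (here, the mesh surface of the prostate), and a percentile Hausdorff distance (e.g. HD95) is often reported to reduce sensitivity to outliers.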
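The physical-fidelity check described in the abstract rests on the Jacobian determinant of the transform x + u(x): values near 1 indicate volume preservation, and values ≤ 0 indicate folding. As a hedged sketch (not the authors' implementation), the determinant can be approximated with finite differences; the function name `jacobian_determinant` and the (3, D, H, W) displacement-field layout are assumptions.

```python
import numpy as np

def jacobian_determinant(disp, spacing=(1.0, 1.0, 1.0)):
    """Voxel-wise Jacobian determinant of the transform x + u(x).

    disp: displacement field u of shape (3, D, H, W).
    Returns an array of shape (D, H, W); values <= 0 flag folding,
    i.e. a physically implausible deformation."""
    # grads[i][j] = du_i / dx_j, each of shape (D, H, W)
    grads = [np.gradient(disp[i], *spacing) for i in range(3)]
    J = np.stack([np.stack(g, axis=0) for g in grads], axis=0)  # (3, 3, D, H, W)
    J = J + np.eye(3)[:, :, None, None, None]                   # I + grad(u)
    J = np.moveaxis(J, (0, 1), (-2, -1))                        # (D, H, W, 3, 3)
    return np.linalg.det(J)

# Identity transform (zero displacement): determinant is 1 everywhere.
u = np.zeros((3, 8, 8, 8))
print(np.allclose(jacobian_determinant(u), 1.0))  # True
```

A uniform 10% expansion u(x) = 0.1·x gives a determinant of 1.1³ = 1.331 at every voxel, which is a convenient sanity check for any implementation of this kind.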