IEEE Trans Pattern Anal Mach Intell. 2023 Jul;45(7):9206-9224. doi: 10.1109/TPAMI.2022.3225418. Epub 2023 Jun 5.
This article presents a generic probabilistic framework for estimating the statistical dependency and finding the anatomical correspondences among an arbitrary number of medical images. The method builds on a novel formulation of the N-dimensional joint intensity distribution by representing the common anatomy as latent variables and estimating the appearance model with nonparametric estimators. Through connection to maximum likelihood and the expectation-maximization algorithm, an information-theoretic metric called X-metric and a co-registration algorithm named X-CoReg are induced, allowing groupwise registration of the N observed images with a computational complexity of O(N). Moreover, the method naturally extends to a weakly-supervised scenario where anatomical labels of certain images are provided. This leads to a combined-computing framework implemented with deep learning, which performs registration and segmentation simultaneously and collaboratively in an end-to-end fashion. Extensive experiments were conducted to demonstrate the versatility and applicability of our model, including multimodal groupwise registration, motion correction for dynamic contrast-enhanced magnetic resonance images, and deep combined computing for multimodal medical images. Results show the superiority of our method in various applications in terms of both accuracy and efficiency, highlighting the advantage of the proposed representation of the imaging process. Code is available at https://zmiclab.github.io/projects.html.
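To make the latent-variable formulation concrete, the following is a minimal sketch, not the paper's X-CoReg implementation: it assumes N already-aligned, flattened intensity images, models the common anatomy as a K-class latent variable per pixel, replaces the paper's nonparametric appearance estimator with simple weighted intensity histograms, and runs plain EM. The function name `em_latent_anatomy` and all parameter choices are hypothetical illustrations; spatial transformations, regularization, and the deep-learning components of the actual method are omitted.

```python
import numpy as np

def em_latent_anatomy(images, K=4, bins=32, iters=20, seed=0):
    """Toy EM for a latent common-anatomy model over N aligned images.

    images : list of N 1-D float arrays (flattened intensities, equal length).
    Returns per-pixel class responsibilities and the sum of conditional
    entropies H(I_n | Z), an X-metric-like score (lower = more consistent).
    """
    rng = np.random.default_rng(seed)
    N, P = len(images), images[0].size

    # Discretize each image's intensities into `bins` levels (a crude
    # histogram stand-in for the paper's nonparametric estimator).
    idx = [np.clip((bins * (im - im.min()) / (np.ptp(im) + 1e-12)).astype(int),
                   0, bins - 1) for im in images]

    # Random soft initialization of responsibilities q(z = k | pixel).
    q = rng.dirichlet(np.ones(K), size=P)                 # (P, K)

    for _ in range(iters):
        # M-step: class priors and class-conditional intensity histograms.
        pi = q.mean(axis=0)                               # (K,)
        cond = []                                         # N arrays of (K, bins)
        for n in range(N):
            h = np.zeros((K, bins))
            np.add.at(h, (slice(None), idx[n]), q.T)      # weighted counts
            cond.append(h / (h.sum(axis=1, keepdims=True) + 1e-12))

        # E-step: posterior over the latent anatomy class at every pixel,
        # log q(k | x) ∝ log pi_k + sum_n log p_n(i_n(x) | k).
        log_q = np.tile(np.log(pi + 1e-12), (P, 1))       # (P, K)
        for n in range(N):
            log_q += np.log(cond[n][:, idx[n]].T + 1e-12)
        log_q -= log_q.max(axis=1, keepdims=True)
        q = np.exp(log_q)
        q /= q.sum(axis=1, keepdims=True)

    # Score: sum over images of H(I_n | Z); one histogram pass per
    # image, so the cost grows linearly in N.
    score = 0.0
    for n in range(N):
        h = cond[n]
        score -= (pi[:, None] * h * np.log(h + 1e-12)).sum()
    return q, score
```

The sketch also hints at where the O(N) complexity in the abstract comes from: conditioned on the latent anatomy Z, the images are treated as independent, so the appearance model and the metric decompose into one per-image term each, rather than requiring an N-dimensional joint histogram.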