An Efficient Preconditioner for Stochastic Gradient Descent Optimization of Image Registration.

Publication

IEEE Trans Med Imaging. 2019 Oct;38(10):2314-2325. doi: 10.1109/TMI.2019.2897943. Epub 2019 Feb 11.

Abstract

Stochastic gradient descent (SGD) is commonly used to solve (parametric) image registration problems. In the case of badly scaled problems, however, SGD exhibits only sublinear convergence. In this paper, we propose an efficient preconditioner estimation method to improve the convergence rate of SGD. Based on the observed distribution of voxel displacements in the registration, we estimate the diagonal entries of a preconditioning matrix, thus rescaling the optimization cost function. The preconditioner is efficient to compute and employ, and can be used for mono-modal as well as multi-modal cost functions, in combination with different transformation models, such as the rigid, the affine, and the B-spline model. Experiments on different clinical datasets show that the proposed method indeed improves the convergence rate compared with SGD, with speedups of roughly 2-5x in all tested settings, while retaining the same level of registration accuracy.
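To make the mechanism concrete, the following is a minimal sketch of gradient descent with a fixed diagonal preconditioner applied to a deliberately badly scaled toy cost function. This only illustrates the general idea the abstract describes: the function names are invented for this sketch, and the paper's actual preconditioner, estimated from the observed voxel-displacement distribution, is not reproduced here.

```python
import numpy as np

def preconditioned_sgd(grad_fn, theta0, precond_diag, lr=0.1, n_iters=100):
    """Descent with a fixed diagonal preconditioner P:
    theta <- theta - lr * P @ grad(theta), with diag(P) stored as a vector."""
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(n_iters):
        g = grad_fn(theta)
        theta -= lr * precond_diag * g  # elementwise rescaling by diag(P)
    return theta

# Badly scaled toy cost: f(x, y) = 0.5 * (100 * x**2 + y**2).
# Plain SGD with a step size small enough for the stiff x-direction
# crawls along y; preconditioning with diag(P) = [1/100, 1] equalizes
# the curvature so both coordinates contract at the same rate.
grad = lambda t: np.array([100.0 * t[0], t[1]])
theta = preconditioned_sgd(grad, [1.0, 1.0], np.array([0.01, 1.0]))
```

With the preconditioner, each coordinate shrinks by the same factor per step, which is the rescaling effect the abstract attributes to the estimated diagonal preconditioning matrix.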
