Dilin Wang, Ziyang Tang, Chandrajit Bajaj, Qiang Liu
Department of Computer Science, UT Austin.
Adv Neural Inf Process Syst. 2019 Dec;32:7834-7844.
Stein variational gradient descent (SVGD) is a particle-based inference algorithm that leverages gradient information for efficient approximate inference. In this work, we enhance SVGD with preconditioning matrices, such as the Hessian and Fisher information matrix, to incorporate geometric information into the SVGD updates. We achieve this by presenting a generalization of SVGD that replaces the scalar-valued kernels in vanilla SVGD with more general matrix-valued kernels. This yields a significant extension of SVGD and, more importantly, allows us to flexibly incorporate various preconditioning matrices to accelerate exploration of the probability landscape. Empirical results show that our method outperforms vanilla SVGD and a variety of baseline approaches over a range of real-world Bayesian inference tasks.
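To make the idea concrete, below is a minimal sketch of one update step for the constant-preconditioner special case, not the paper's full method: the scalar RBF kernel of vanilla SVGD is replaced by the matrix-valued kernel K(x, y) = k_M(x, y) M^{-1}, where k_M is an RBF kernel measured in the metric induced by a fixed positive-definite matrix M (e.g., an averaged Hessian or Fisher estimate). The function name `matrix_svgd_step` and the parameters `step` and `h` are illustrative choices, not from the paper.

```python
import numpy as np

def matrix_svgd_step(X, grad_logp, M, step=0.1, h=1.0):
    """One SVGD update with the matrix-valued kernel K(x, y) = k_M(x, y) M^{-1}.

    A hedged sketch under the constant-preconditioner assumption.
    X         : (n, d) array of particles
    grad_logp : maps an (n, d) array of particles to their (n, d) scores
    M         : (d, d) positive-definite preconditioner
    """
    n, _ = X.shape
    Minv = np.linalg.inv(M)
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d): x_i - x_j
    # squared distance in the M-metric: (x_i - x_j)^T M (x_i - x_j)
    sq = np.einsum('ijk,kl,ijl->ij', diff, M, diff)
    K = np.exp(-sq / (2.0 * h))                    # scalar factor k_M(x_i, x_j)
    scores = grad_logp(X)                          # (n, d)
    # driving term: (1/n) sum_j k_M(x_i, x_j) M^{-1} grad log p(x_j)
    drive = (K @ scores) @ Minv / n
    # repulsive term: for this kernel, grad_{x_j} K(x_j, x_i) reduces to
    # (1/h) k_M(x_i, x_j) (x_i - x_j), which keeps particles spread out
    repulse = np.einsum('ij,ijk->ik', K, diff) / (n * h)
    return X + step * (drive + repulse)

# Toy usage: approximate an anisotropic Gaussian N(0, Sigma), using the
# target precision matrix itself as a natural preconditioner.
rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.8], [0.8, 2.0]])
Prec = np.linalg.inv(Sigma)
grad_logp = lambda X: -X @ Prec                    # score of N(0, Sigma)
X = rng.normal(size=(100, 2))
for _ in range(500):
    X = matrix_svgd_step(X, grad_logp, M=Prec, step=0.05)
print(np.cov(X.T))                                 # should approach Sigma
```

Setting M to the identity recovers the vanilla SVGD update with an RBF kernel; a well-chosen M rescales both the driving and repulsive terms along the local geometry of the target, which is the acceleration effect the abstract describes.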