Frank Nielsen
Sony Computer Science Laboratories, Tokyo 141-0022, Japan.
Entropy (Basel). 2021 Oct 28;23(11):1417. doi: 10.3390/e23111417.
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback-Leibler divergence, broadly used in the information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed form, various techniques with different advantages and disadvantages have been proposed in the literature to estimate, approximate, or lower and upper bound this divergence. In this paper, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two univariate Gaussian mixtures with an arbitrary number of components. Our heuristic relies on converting the mixtures into pairs of dually parameterized probability densities belonging to an exponential-polynomial family. To measure with a closed-form formula the goodness of fit between a Gaussian mixture and an exponential-polynomial density approximating it, we generalize the Hyvärinen divergence to α-Hyvärinen divergences. In particular, the 2-Hyvärinen divergence allows us to perform model selection by choosing the order of the exponential-polynomial densities used to approximate the mixtures. We experimentally demonstrate that our heuristic for approximating the Jeffreys divergence between mixtures reduces the computational time of stochastic Monte Carlo estimation by several orders of magnitude while approximating the Jeffreys divergence reasonably well, especially when the mixtures have a very small number of modes.
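As context for the Monte Carlo baseline that the abstract compares against, the following is a minimal NumPy sketch (not the paper's heuristic) of the stochastic estimator for the Jeffreys divergence J(p, q) = KL(p:q) + KL(q:p) between two univariate Gaussian mixtures, where each oriented KL term KL(p:q) = E_p[log p(X) − log q(X)] is estimated by averaging over samples drawn from the corresponding mixture. The mixture representation as a (weights, means, stds) triple is an illustrative convention, not from the paper.

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a univariate Gaussian mixture at the points x.

    Uses the log-sum-exp trick over components for numerical stability.
    """
    x = np.asarray(x, dtype=float)[:, None]  # shape (n, 1) vs components (k,)
    log_comp = (np.log(weights) - np.log(stds) - 0.5 * np.log(2.0 * np.pi)
                - 0.5 * ((x - means) / stds) ** 2)
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.sum(np.exp(log_comp - m), axis=1, keepdims=True))).ravel()

def gmm_sample(rng, n, weights, means, stds):
    """Draw n i.i.d. samples: pick a component, then sample its Gaussian."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def jeffreys_mc(p, q, n=100_000, seed=0):
    """Monte Carlo estimate of J(p, q) = KL(p:q) + KL(q:p).

    p and q are (weights, means, stds) triples of NumPy arrays.
    """
    rng = np.random.default_rng(seed)
    xp = gmm_sample(rng, n, *p)  # samples from p for KL(p:q)
    xq = gmm_sample(rng, n, *q)  # samples from q for KL(q:p)
    kl_pq = np.mean(gmm_logpdf(xp, *p) - gmm_logpdf(xp, *q))
    kl_qp = np.mean(gmm_logpdf(xq, *q) - gmm_logpdf(xq, *p))
    return kl_pq + kl_qp
```

Each evaluation of `gmm_logpdf` costs O(n·k) for n samples and k components, and the estimator's standard error shrinks only as O(1/√n), which is why a closed-form surrogate (as proposed in the paper) can be several orders of magnitude faster at comparable accuracy.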