Nielsen, Frank
Sony Computer Science Laboratories, Takanawa Muse Bldg., 3-14-13, Higashigotanda, Shinagawa-ku, Tokyo 141-0022, Japan.
Entropy (Basel). 2019 May 11;21(5):485. doi: 10.3390/e21050485.
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
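To make the closed-form claim concrete, here is a minimal sketch of the geometric Jensen-Shannon divergence for the univariate Gaussian case (Gaussians form an exponential family, so the weighted geometric mixture of two Gaussian densities is again a Gaussian obtained by linear interpolation in the natural parameter space). The function names and the restriction to univariate Gaussians are illustrative choices, not part of the paper:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2))
    # between two univariate Gaussian densities.
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def geometric_mixture(mu1, s1, mu2, s2, alpha=0.5):
    # The alpha-weighted normalized geometric mixture of two Gaussians is
    # again a Gaussian: the geometric mean acts as an arithmetic mean on
    # the natural parameters (mu/sigma^2, -1/(2 sigma^2)).
    v1, v2 = s1**2, s2**2
    v = 1.0 / ((1 - alpha) / v1 + alpha / v2)             # mixture variance
    mu = v * ((1 - alpha) * mu1 / v1 + alpha * mu2 / v2)  # mixture mean
    return mu, math.sqrt(v)

def geometric_jsd(mu1, s1, mu2, s2, alpha=0.5):
    # Geometric JS divergence: weighted total KL divergence to the
    # normalized geometric mixture (instead of the arithmetic mixture,
    # which has no closed form for Gaussians).
    mu, s = geometric_mixture(mu1, s1, mu2, s2, alpha)
    return ((1 - alpha) * kl_gauss(mu1, s1, mu, s)
            + alpha * kl_gauss(mu2, s2, mu, s))
```

With alpha = 1/2 the geometric mixture is symmetric in its two arguments, so `geometric_jsd` is a symmetric divergence that vanishes if and only if the two Gaussians coincide.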