Frank Nielsen
Sony Computer Science Laboratories, Tokyo 141-0022, Japan.
Entropy (Basel). 2021 Apr 14;23(4):464. doi: 10.3390/e23040464.
We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to an arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to prescribed families of probability measures, we obtain relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances, which generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
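To make the idea of a Jensen-Shannon symmetrization with respect to a generic mean concrete, here is a minimal sketch for discrete distributions. It is not the paper's implementation: the function names (`kl`, `js_divergence`, `geometric_mean`) and the renormalization of the non-arithmetic mean are illustrative assumptions. With the arithmetic mean one recovers the classical Jensen-Shannon divergence; swapping in another mean (here, the geometric mean) gives a different symmetrization of the Kullback-Leibler divergence.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) between discrete distributions.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q, mean=lambda a, b: 0.5 * (a + b)):
    # Jensen-Shannon symmetrization of KL with a pluggable mean:
    #   JS_mean(p, q) = 1/2 KL(p || m) + 1/2 KL(q || m),  m = mean(p, q).
    # The default arithmetic mean yields the classical JSD.
    m = mean(p, q)
    m = m / m.sum()  # renormalize (needed when the mean is not arithmetic)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def geometric_mean(a, b):
    # A non-arithmetic mean, giving a different symmetrization (illustrative).
    return np.sqrt(a * b)

p = np.array([0.4, 0.6])
q = np.array([0.5, 0.5])
print(js_divergence(p, q))                       # classical JSD
print(js_divergence(p, q, mean=geometric_mean))  # geometric-mean variant
```

Both variants are symmetric in their arguments and vanish when the two distributions coincide; the classical version is additionally bounded above by log 2.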