Chodrow, Philip S.
Department of Computer Science, Middlebury College, Middlebury, VT 05753, USA.
Entropy (Basel). 2025 Jul 19;27(7):766. doi: 10.3390/e27070766.
Bregman divergences form a class of distance-like comparison functions that play fundamental roles in optimization, statistics, and information theory. One important property of Bregman divergences is that they generate agreement between two useful formulations of information content (in the sense of variability or non-uniformity) in weighted collections of vectors. The first of these is the Jensen gap information, which measures the difference between the mean value of a strictly convex function evaluated on a weighted set of vectors and the value of that function evaluated at the centroid of that collection. The second of these is the divergence information, which measures the mean divergence of the vectors in the collection from their centroid. In this brief note, we prove that the agreement between Jensen gap and divergence informations in fact characterizes the class of Bregman divergences; they are the only divergences that generate this agreement for arbitrary weighted sets of data vectors.
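The agreement described in the abstract can be checked numerically. The sketch below (an illustration, not code from the paper) uses the Bregman divergence generated by φ(x) = ‖x‖², which reduces to the squared Euclidean distance, and verifies that the Jensen gap information and the divergence information coincide for a random weighted set of vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    # Strictly convex generator phi(x) = ||x||^2 (an illustrative choice).
    return float(np.dot(x, x))

def bregman_div(x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>.
    # For phi(x) = ||x||^2, grad phi(y) = 2y, so D_phi(x, y) = ||x - y||^2.
    return phi(x) - phi(y) - 2.0 * float(np.dot(y, x - y))

X = rng.normal(size=(5, 3))   # five vectors in R^3
w = rng.random(5)
w /= w.sum()                  # normalized weights

centroid = w @ X

# Jensen gap information: mean of phi over the set minus phi at the centroid.
jensen_gap = sum(wi * phi(xi) for wi, xi in zip(w, X)) - phi(centroid)

# Divergence information: mean Bregman divergence from the centroid.
divergence_info = sum(wi * bregman_div(xi, centroid) for wi, xi in zip(w, X))

print(np.isclose(jensen_gap, divergence_info))  # the two informations agree
```

The characterization proved in the note says this agreement is special: for a comparison function that is not a Bregman divergence, the two quantities will in general differ for some weighted set.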