Ferenc Cole Thierrin, Fady Alajaji, Tamás Linder
Department of Mathematics and Statistics, Queen's University, Kingston, ON K7L 3N6, Canada.
Entropy (Basel). 2022 Oct 4;24(10):1417. doi: 10.3390/e24101417.
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, have recently been used as loss functions to improve the design of deep learning generative adversarial networks. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarize the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
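For reference, the measures named in the abstract are commonly defined as follows; this is a sketch under the assumption that the paper follows the standard conventions for order $\alpha > 0$, $\alpha \neq 1$, and the exact forms should be checked against the paper itself. For probability densities $p$ and $q$,

$$H_\alpha(p) = \frac{1}{1-\alpha}\log\int p(x)^\alpha\,dx, \qquad D_\alpha(p\|q) = \frac{1}{\alpha-1}\log\int p(x)^\alpha q(x)^{1-\alpha}\,dx,$$

$$H_\alpha(p;q) = \frac{1}{1-\alpha}\log\int p(x)\,q(x)^{\alpha-1}\,dx \quad \text{(Rényi cross-entropy, assumed form)},$$

$$\widetilde{H}_\alpha(p;q) = D_\alpha(p\|q) + H_\alpha(p) \quad \text{(Natural Rényi cross-entropy, assumed form)},$$

both of which recover the Shannon cross-entropy $-\int p(x)\log q(x)\,dx$ as $\alpha \to 1$.

The short Python sketch below is a numerical sanity check of the assumed Rényi cross-entropy form only (it is not taken from the paper, and the helper names are illustrative): it evaluates the order-$\alpha$ Rényi differential cross-entropy between two Gaussians by quadrature and compares it, as $\alpha \to 1$, with the known closed-form Shannon differential cross-entropy between Gaussians.

```python
# Numerical sanity check of the assumed Rényi cross-entropy definition
# between two univariate Gaussians; not the paper's derivation.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def renyi_cross_entropy(p, q, alpha, lo=-30.0, hi=30.0):
    """Order-alpha Rényi differential cross-entropy (assumed definition):
    (1 / (1 - alpha)) * log of the integral of p(x) * q(x)^(alpha - 1)."""
    integrand = lambda x: p.pdf(x) * q.pdf(x) ** (alpha - 1.0)
    integral, _ = quad(integrand, lo, hi)  # [lo, hi] covers essentially all the mass here
    return np.log(integral) / (1.0 - alpha)

def shannon_cross_entropy_gaussians(m1, s1, m2, s2):
    """Closed-form Shannon differential cross-entropy between N(m1, s1^2) and N(m2, s2^2)."""
    return 0.5 * np.log(2.0 * np.pi * s2**2) + (s1**2 + (m1 - m2) ** 2) / (2.0 * s2**2)

p = norm(loc=0.0, scale=1.0)  # N(0, 1)
q = norm(loc=1.0, scale=2.0)  # N(1, 4)

for alpha in (0.5, 0.9, 0.99, 0.999):
    print(f"alpha = {alpha:>5}: {renyi_cross_entropy(p, q, alpha):.4f}")
print(f"Shannon cross-entropy (alpha -> 1 limit): "
      f"{shannon_cross_entropy_gaussians(0.0, 1.0, 1.0, 2.0):.4f}")
```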