Nishiyama Tomohiro, Sason Igal
Independent Researcher, Tokyo 206-0003, Japan.
Faculty of Electrical Engineering, Technion-Israel Institute of Technology, Technion City, Haifa 3200003, Israel.
Entropy (Basel). 2020 May 18;22(5):563. doi: 10.3390/e22050563.
The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper studies integral relations between these two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. The applications studied here include lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a class of discrete-time Markov chains.
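As a minimal illustration of the two divergence measures named in the abstract, the sketch below computes the relative entropy (Kullback-Leibler divergence) and the chi-squared divergence for two discrete distributions, and checks the standard bound D(P||Q) ≤ log(1 + χ²(P||Q)), one of the known relations between them. The distributions and function names are illustrative, not taken from the paper.

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P||Q) = sum_i p_i * log(p_i / q_i), in nats.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi_squared_divergence(p, q):
    """Chi-squared divergence: chi^2(P||Q) = sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Example distributions (illustrative choice).
p = [0.5, 0.5]
q = [0.25, 0.75]

d = kl_divergence(p, q)          # = 0.5 * log(4/3)
chi2 = chi_squared_divergence(p, q)  # = 1/3

# A well-known relation between the two divergences:
# D(P||Q) <= log(1 + chi^2(P||Q)).
assert d <= math.log(1 + chi2)
```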