Minimax Rate-optimal Estimation of KL Divergence between Discrete Distributions.

Authors

Han Yanjun, Jiao Jiantao, Weissman Tsachy

Affiliation

Stanford University.

Publication

Int Symp Inf Theory Appl. 2016;2016:256-260.

Abstract

We refine the general methodology in [1] for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite-dimensional parameters, and elaborate on the case of discrete distributions with support size comparable with the number of observations. Specifically, we determine the "smooth" and "non-smooth" regimes based on the confidence set and the smoothness of the functional. In the "non-smooth" regime, we apply an unbiased estimator for a "suitable" polynomial approximation of the functional. In the "smooth" regime, we construct a bias-corrected version of the Maximum Likelihood Estimator (MLE) based on Taylor expansion. We apply the general methodology to the problem of estimating the KL divergence between two discrete distributions from empirical data. We construct a minimax rate-optimal estimator which is adaptive in the sense that it does not require knowledge of the support size nor an upper bound on the likelihood ratio. Moreover, the performance of the optimal estimator with n samples is essentially that of the MLE with n ln n samples, i.e., the effective sample size enlargement phenomenon holds.
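The minimax estimator described above is more involved (polynomial approximation in the non-smooth regime, Taylor-based bias correction in the smooth regime); as context, the following is a minimal sketch of the plug-in (MLE) baseline that the abstract compares against: substitute empirical frequencies for the true distributions in the KL formula. The function name `plugin_kl` is illustrative, not from the paper.

```python
import math
from collections import Counter

def plugin_kl(xs, ys):
    """Plug-in (MLE) estimate of KL(P || Q) from two i.i.d. samples.

    Replaces P and Q in sum_i p_i * ln(p_i / q_i) with empirical
    frequencies. If a symbol observed under P is unseen under Q, the
    plug-in estimate is infinite, which is one reason the plug-in
    approach degrades when the support size is comparable with the
    sample size.
    """
    n, m = len(xs), len(ys)
    p_counts = Counter(xs)
    q_counts = Counter(ys)
    kl = 0.0
    for sym, cnt in p_counts.items():
        p_hat = cnt / n
        q_hat = q_counts.get(sym, 0) / m
        if q_hat == 0:
            return math.inf  # empirical Q misses a symbol seen under P
        kl += p_hat * math.log(p_hat / q_hat)
    return kl

# Identical samples give an estimate of 0; differing samples give a
# positive value (natural-log units, i.e., nats).
print(plugin_kl(list("aabb"), list("aabb")))  # → 0.0
print(plugin_kl(list("aabb"), list("abbb")))  # 0.5 * ln(4/3) ≈ 0.1438
```

The bias of this plug-in estimator is the quantity the paper's bias-corrected MLE (smooth regime) and polynomial-approximation estimator (non-smooth regime) are designed to control.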

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f205/5812299/cb99704021bf/nihms910323f1.jpg
