Suppr 超能文献



Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence.

Authors

Noh Yung-Kyun, Sugiyama Masashi, Liu Song, Plessis Marthinus C du, Park Frank Chongwoo, Lee Daniel D

Affiliations

Seoul National University, Seoul 08826, Korea

RIKEN, Tokyo 103-0027, Japan, and University of Tokyo, Chiba 277-8561, Japan

Publication

Neural Comput. 2018 Jul;30(7):1930-1960. doi: 10.1162/neco_a_01092. Epub 2018 Jun 14.

DOI: 10.1162/neco_a_01092
PMID: 29902113
Abstract

Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
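For background on the estimator the abstract refers to, the following is a minimal sketch of the standard k-nearest-neighbor KL divergence estimator (Wang, Kulkarni, and Verdú style), not the authors' metric-learning refinement; the function name, the brute-force distance computation, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Estimate D_KL(p || q) from samples x ~ p (shape (n, d)) and y ~ q (shape (m, d)).

    Uses the classic k-NN construction: compare each point's k-th nearest-neighbor
    distance within its own sample (rho) to its k-th nearest-neighbor distance
    into the other sample (nu).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    def kth_nn_dist(points, queries, k, skip_self=False):
        # Brute-force pairwise Euclidean distances, then the k-th smallest per query.
        dists = np.linalg.norm(queries[:, None, :] - points[None, :, :], axis=2)
        dists.sort(axis=1)
        # When queries come from `points`, skip the zero distance to the point itself.
        return dists[:, k] if skip_self else dists[:, k - 1]

    rho = kth_nn_dist(x, x, k, skip_self=True)  # k-NN distance within x
    nu = kth_nn_dist(y, x, k)                   # k-NN distance from x into y
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

Under a learned Mahalanobis metric M, the Euclidean norm above would be replaced by sqrt((u - v)^T M (u - v)); the letter's contribution is choosing M locally, from approximate generative models of the densities, so as to reduce the finite-sample bias of this estimator.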


Similar Articles

1. Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence.
   Neural Comput. 2018 Jul;30(7):1930-1960. doi: 10.1162/neco_a_01092. Epub 2018 Jun 14.
2. Nonparametric estimation of Küllback-Leibler divergence.
   Neural Comput. 2014 Nov;26(11):2570-93. doi: 10.1162/NECO_a_00646. Epub 2014 Jul 24.
3. Generative Local Metric Learning for Nearest Neighbor Classification.
   IEEE Trans Pattern Anal Mach Intell. 2018 Jan;40(1):106-118. doi: 10.1109/TPAMI.2017.2666151. Epub 2017 Feb 8.
4. The AIC criterion and symmetrizing the Kullback-Leibler divergence.
   IEEE Trans Neural Netw. 2007 Jan;18(1):97-106. doi: 10.1109/TNN.2006.882813.
5. Learning Generative Models Using Denoising Density Estimators.
   IEEE Trans Neural Netw Learn Syst. 2024 Dec;35(12):17730-17741. doi: 10.1109/TNNLS.2023.3308191. Epub 2024 Dec 2.
6. Direct Density Derivative Estimation.
   Neural Comput. 2016 Jun;28(6):1101-40. doi: 10.1162/NECO_a_00835. Epub 2016 May 3.
7. Distance metric learning based on the class center and nearest neighbor relationship.
   Neural Netw. 2023 Jul;164:631-644. doi: 10.1016/j.neunet.2023.05.004. Epub 2023 May 10.
8. Optimistic reinforcement learning by forward Kullback-Leibler divergence optimization.
   Neural Netw. 2022 Aug;152:169-180. doi: 10.1016/j.neunet.2022.04.021. Epub 2022 Apr 21.
9. Bootstrap-adjusted quasi-likelihood information criteria for mixed model selection.
   J Appl Stat. 2022 Nov 7;51(4):621-645. doi: 10.1080/02664763.2022.2143484. eCollection 2024.
10. Nearest-Neighbor Estimation for ROC Analysis under Verification Bias.
    Int J Biostat. 2015 May;11(1):109-24. doi: 10.1515/ijb-2014-0014.

Cited By

1. Linear Wavelet-Based Estimators of Partial Derivatives of Multivariate Density Function for Stationary and Ergodic Continuous Time Processes.
   Entropy (Basel). 2025 Apr 6;27(4):389. doi: 10.3390/e27040389.
2. DEM Study of the Motion Characteristics of Rice Particles in the Indented Cylinder Separator.
   Sensors (Basel). 2022 Dec 27;23(1):285. doi: 10.3390/s23010285.
3. Transfer Extreme Learning Machine with Output Weight Alignment.
   Comput Intell Neurosci. 2021 Feb 11;2021:6627765. doi: 10.1155/2021/6627765. eCollection 2021.