

Insights Into the Robustness of Minimum Error Entropy Estimation.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 Mar;29(3):731-737. doi: 10.1109/TNNLS.2016.2636160. Epub 2016 Dec 22.

Abstract

The minimum error entropy (MEE) is an important and highly effective optimization criterion in information theoretic learning (ITL). For regression problems, MEE aims at minimizing the entropy of the prediction error so that the estimated model preserves as much information about the data-generating system as possible. In many real-world applications, the MEE estimator can significantly outperform the well-known minimum mean square error (MMSE) estimator and show strong robustness to noise, especially when the data are contaminated by non-Gaussian (multimodal, heavy-tailed, discrete-valued, and so on) noise. In this brief, we present some theoretical results on the robustness of MEE. For a one-parameter linear errors-in-variables (EIV) model and under certain conditions, we derive a region that contains the MEE solution, which suggests that the MEE estimate can be very close to the true value of the unknown parameter even in the presence of arbitrarily large outliers in both input and output variables. The theoretical prediction is verified by an illustrative example.
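As a rough illustration of the idea (this is a hypothetical sketch, not the paper's actual experiment or derivation), the following Python snippet estimates the slope of a one-parameter linear model by maximizing the quadratic information potential of the errors, which is equivalent to minimizing Rényi's quadratic entropy and is a common practical form of the MEE criterion in ITL. It compares the result with the least-squares (MMSE) estimate when a fraction of the samples carries gross outliers in both input and output, mimicking EIV-style contamination. All constants, the kernel width, and the contamination setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-parameter linear model y = w_true * x + noise (illustrative setup).
w_true = 2.0
n = 200
x = rng.normal(0.0, 1.0, n)
y = w_true * x + rng.normal(0.0, 0.1, n)

# Contaminate 10% of the samples with gross outliers in BOTH the input
# and the output variables, as in the EIV scenario the brief considers.
idx = rng.choice(n, n // 10, replace=False)
x[idx] = rng.normal(0.0, 10.0, idx.size)
y[idx] = rng.normal(0.0, 10.0, idx.size)

def information_potential(e, sigma=1.0):
    """Parzen (kernel) estimate of the quadratic information potential V(e).
    Maximizing V is equivalent to minimizing Renyi's quadratic entropy
    H2(e) = -log V(e) of the error distribution."""
    d = e[:, None] - e[None, :]          # all pairwise error differences
    return np.mean(np.exp(-d**2 / (2.0 * sigma**2)))

# MEE estimate: simple grid search over w (a stand-in for gradient ascent).
grid = np.linspace(0.0, 4.0, 401)
v = [information_potential(y - w * x, sigma=0.5) for w in grid]
w_mee = grid[int(np.argmax(v))]

# MMSE (least-squares) estimate for comparison.
w_mmse = float(np.dot(x, y) / np.dot(x, x))

print(f"w_mee  = {w_mee:.3f}")   # stays close to w_true despite the outliers
print(f"w_mmse = {w_mmse:.3f}")  # dragged away by the contaminated samples
```

Intuitively, the clean samples make the errors concentrate sharply near zero only when w is close to the true slope, which maximizes the information potential, while the outliers produce errors too large for the kernel to "see" at any w. Least squares, by contrast, weights the large outlier terms quadratically and is pulled away from the true value.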

