
Robustifying AdaBoost by adding the naive error rate.

Author Information

Takenouchi Takashi, Eguchi Shinto

Affiliations

Department of Statistical Science, Graduate University of Advanced Studies, Tokyo, Japan.

Publication Information

Neural Comput. 2004 Apr;16(4):767-87. doi: 10.1162/089976604322860695.

Abstract

AdaBoost can be derived by sequential minimization of the exponential loss function. It implements the learning process by exponentially reweighting examples according to classification results. However, the weights are often too sharply tuned, so that AdaBoost suffers from nonrobustness and overlearning. We propose a new boosting method that is a slight modification of AdaBoost. The loss function is defined by a mixture of the exponential loss and naive error loss functions. As a result, the proposed method incorporates the effect of forgetfulness into AdaBoost. The statistical significance of our method is discussed, and simulations are presented for confirmation.
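The abstract describes the idea at a high level: plain AdaBoost reweights example i by exp(-y_i F(x_i)), and the proposed modification mixes the exponential loss with a naive error loss, which tempers those weights. The sketch below is a minimal Python illustration of that idea, assuming the mixed loss amounts to adding a uniform component eta to the exponential weights, w_i ∝ (1 - eta) exp(-y_i F(x_i)) + eta. This reading, along with the names eta_boost and fit_stump and the choice of decision stumps as weak learners, is an assumption for illustration, not the paper's stated update rule; setting eta = 0 recovers ordinary AdaBoost.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively pick the decision stump (feature, threshold, sign)
    with the smallest weighted error under weights w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = (j, thr, sign), err
    return best, best_err

def stump_predict(stump, X):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)

def eta_boost(X, y, T=50, eta=0.1):
    """Boosting with an eta-mixed weight update (eta = 0 gives AdaBoost).

    The uniform term eta acts as a floor on every example's weight, so a
    few badly misclassified (possibly mislabeled) examples cannot take
    over the weight distribution -- one way to read the "forgetfulness"
    effect described in the abstract. Assumed form, for illustration only.
    """
    n = len(y)
    F = np.zeros(n)                          # combined score F(x_i) so far
    model = []                               # list of (alpha, stump) pairs
    for _ in range(T):
        w = (1 - eta) * np.exp(-y * F) + eta # mixed weights: exponential + uniform
        w /= w.sum()
        stump, err = fit_stump(X, y, w)
        if err >= 0.5:                       # weak learner no better than chance
            break
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * stump_predict(stump, X)
        model.append((alpha, stump))
    return model

def predict(model, X):
    return np.sign(sum(a * stump_predict(s, X) for a, s in model))

# Toy check: 10% label noise, where the flattened weights should help.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
y[rng.random(200) < 0.1] *= -1
model = eta_boost(X, y, T=30, eta=0.1)
print("training accuracy:", np.mean(predict(model, X) == y))
```

With eta = 0 the weight of a grossly misclassified example grows without bound, while any eta > 0 caps its relative influence, which is the intuition behind the robustness claim in the abstract.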

