Takenouchi Takashi, Eguchi Shinto
Department of Statistical Science, Graduate University of Advanced Studies, Tokyo, Japan.
Neural Comput. 2004 Apr;16(4):767-87. doi: 10.1162/089976604322860695.
AdaBoost can be derived by sequential minimization of the exponential loss function. It implements the learning process by exponentially reweighting examples according to classification results. However, the weights are often too sharply tuned, so AdaBoost suffers from nonrobustness and overlearning. We propose a new boosting method that is a slight modification of AdaBoost. The loss function is defined by a mixture of the exponential loss and naive error loss functions. As a result, the proposed method incorporates the effect of forgetfulness into AdaBoost. The statistical significance of our method is discussed, and simulations are presented for confirmation.
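The abstract's mechanism can be made concrete: standard AdaBoost weights example i by exp(-y_i F(x_i)), so a few misclassified (possibly mislabeled) points can come to dominate training; mixing in a bounded loss term damps those weights. The following is a minimal sketch of that idea, assuming an illustrative mixture weighting w_i ∝ (1-η) exp(-y_i F(x_i)) + η. The mixing coefficient eta, the uniform-weight mixture, and the boost/predict helpers are assumptions made for demonstration, not the authors' exact loss or update rule.

```python
import numpy as np

def boost(X, y, n_rounds=50, eta=0.0):
    """AdaBoost with decision stumps. With eta = 0 this is plain AdaBoost;
    eta > 0 blends the exponential weights with a uniform term, capping
    how much any single hard example can dominate (hypothetical rule)."""
    n, d = X.shape
    F = np.zeros(n)                      # running ensemble score F(x_i)
    model = []                           # (feature, threshold, sign, alpha)
    for _ in range(n_rounds):
        # Illustrative mixture weighting: (1 - eta) * exp(-y F) + eta
        w = (1.0 - eta) * np.exp(-y * F) + eta
        w /= w.sum()
        # Exhaustively pick the stump with smallest weighted error
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # usual AdaBoost step size
        F += alpha * s * np.where(X[:, j] > thr, 1, -1)
        model.append((j, thr, s, alpha))
    return model

def predict(model, X):
    F = np.zeros(len(X))
    for j, thr, s, alpha in model:
        F += alpha * s * np.where(X[:, j] > thr, 1, -1)
    return np.sign(F)

if __name__ == "__main__":
    # Toy comparison under 10% label noise (synthetic data, for illustration)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    y[rng.random(200) < 0.1] *= -1
    for eta in (0.0, 0.1):
        acc = (predict(boost(X, y, n_rounds=30, eta=eta), X) == y).mean()
        print(f"eta={eta}: train accuracy {acc:.3f}")
```

Because the added constant term bounds the ratio between the largest and smallest example weights, the sketch reproduces the qualitative effect the abstract describes: outlying or mislabeled points are partially "forgotten" rather than exponentially amplified.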