Shrestha D L, Solomatine D P
UNESCO-IHE Institute for Water Education, Westvest 7, Delft, The Netherlands.
Neural Comput. 2006 Jul;18(7):1678-710. doi: 10.1162/neco.2006.18.7.1678.
The application of the boosting technique to regression problems has received relatively little attention in contrast to research aimed at classification problems. This letter describes a new boosting algorithm, AdaBoost.RT, for regression problems. The idea is to filter out the examples whose relative estimation error is higher than a preset threshold value, and then to follow the AdaBoost procedure. Thus, it requires selecting a suboptimal value of the error threshold to demarcate examples as poorly or well predicted. Some experimental results using the M5 model tree as a weak learning machine on several benchmark data sets are reported. The results are compared to other boosting methods, bagging, artificial neural networks, and a single M5 model tree. The preliminary empirical comparisons show higher performance of AdaBoost.RT on most of the considered data sets.
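The filter-and-reweight loop the abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the weak learner here is a hypothetical one-split regression stump rather than an M5 model tree, the weight-update power `n_power` and the handling of degenerate error rates are assumptions, and targets are assumed nonzero so the relative error is defined.

```python
import math

def stump_fit(X, y, w):
    """Fit a one-feature, one-split stump minimizing weighted squared error.
    Stand-in weak learner; the paper uses M5 model trees."""
    best = None
    n = len(X)
    for s in sorted(set(X)):
        left = [i for i in range(n) if X[i] <= s]
        right = [i for i in range(n) if X[i] > s]
        if not left or not right:
            continue
        wl = sum(w[i] for i in left)
        wr = sum(w[i] for i in right)
        ml = sum(w[i] * y[i] for i in left) / wl
        mr = sum(w[i] * y[i] for i in right) / wr
        err = (sum(w[i] * (y[i] - ml) ** 2 for i in left)
               + sum(w[i] * (y[i] - mr) ** 2 for i in right))
        if best is None or err < best[0]:
            best = (err, s, ml, mr)
    _, s, ml, mr = best
    return lambda x: ml if x <= s else mr

def adaboost_rt(X, y, T=10, phi=0.05, n_power=2):
    """Sketch of AdaBoost.RT: examples with absolute relative error above the
    threshold phi count toward the error rate; well-predicted examples are
    down-weighted, as in AdaBoost."""
    m = len(X)
    w = [1.0 / m] * m
    models, betas = [], []
    for _ in range(T):
        f = stump_fit(X, y, w)
        # Absolute relative error of each example (assumes y[i] != 0).
        are = [abs(f(X[i]) - y[i]) / abs(y[i]) for i in range(m)]
        eps = sum(w[i] for i in range(m) if are[i] > phi)
        if eps == 0:
            # Every example within threshold: keep this model with a tiny
            # beta (large vote) and stop. Assumed handling, not from the paper.
            models.append(f)
            betas.append(1e-10)
            break
        if eps >= 1.0:
            # Degenerate: every example exceeds phi; stop early (assumption).
            break
        beta = eps ** n_power
        # Down-weight well-predicted examples, then renormalize.
        w = [w[i] * beta if are[i] <= phi else w[i] for i in range(m)]
        z = sum(w)
        w = [wi / z for wi in w]
        models.append(f)
        betas.append(beta)

    def predict(x):
        # Final hypothesis: average weighted by log(1/beta_t).
        den = sum(math.log(1.0 / b) for b in betas)
        num = sum(math.log(1.0 / b) * f(x) for f, b in zip(models, betas))
        return num / den

    return predict
```

A stump can fit a two-level target exactly, so on such data the loop terminates in one round; on harder data the threshold `phi` controls which examples are re-emphasized in later rounds.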