School of Computer Science, The University of Adelaide, Adelaide, SA 5005, Australia.
Neural Netw. 2013 Dec;48:44-58. doi: 10.1016/j.neunet.2013.07.006. Epub 2013 Jul 16.
We propose a general framework for analyzing and developing fully corrective boosting-based classifiers. The framework accepts any convex objective function, and allows any convex (for example, ℓp-norm, p ≥ 1) regularization term. By placing the wide variety of existing fully corrective boosting-based classifiers on a common footing, and considering the primal and dual problems together, the framework allows a direct comparison between apparently disparate methods. By solving the primal rather than the dual, the framework is capable of generating efficient fully corrective boosting algorithms without recourse to sophisticated convex optimization processes. We show that a range of additional boosting-based algorithms can be incorporated into the framework despite not being fully corrective. Finally, we provide an empirical analysis of the performance of a variety of the most significant boosting-based classifiers on a few machine learning benchmark datasets.
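To make the "fully corrective" idea concrete, the following is a minimal sketch, not the paper's algorithm: at each boosting round a decision stump most correlated with the negative functional gradient is added, and then all stump weights are re-optimized together on the primal, an ℓ2-regularized logistic loss, by plain gradient descent. All names (`best_stump`, `corrective_step`), the toy dataset, and the choice of logistic loss with ℓ2 regularization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D dataset: label is the sign of a linear function of the features.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)

def stump_predict(X, feat, thresh, sign):
    """Decision stump weak learner: sign * (+1 if x[feat] > thresh else -1)."""
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def best_stump(X, grad):
    """Pick the stump most correlated with the negative functional gradient."""
    best, best_score = None, -np.inf
    for feat in range(X.shape[1]):
        for thresh in np.quantile(X[:, feat], np.linspace(0.1, 0.9, 9)):
            for sign in (1.0, -1.0):
                h = stump_predict(X, feat, thresh, sign)
                score = -grad @ h
                if score > best_score:
                    best, best_score = (feat, thresh, sign), score
    return best

def corrective_step(H, y, lam=0.01, lr=0.5, iters=200):
    """Fully corrective step: re-fit ALL weights w by gradient descent
    on the l2-regularized logistic loss (the primal problem)."""
    w = np.zeros(H.shape[1])
    for _ in range(iters):
        margins = y * (H @ w)
        # Gradient of mean log(1 + exp(-margin)) with respect to w.
        g = -(H * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        w -= lr * (g + lam * w)
    return w

H = np.empty((len(y), 0))          # columns = outputs of selected stumps
stumps, w = [], np.zeros(0)
for t in range(10):
    F = H @ w if H.shape[1] else np.zeros(len(y))
    grad = -y / (1.0 + np.exp(y * F))   # functional gradient of logistic loss
    stumps.append(best_stump(X, grad))
    H = np.column_stack([H, stump_predict(X, *stumps[-1])])
    w = corrective_step(H, y)           # every weight updated, not just the new one

acc = np.mean(np.sign(H @ w) == y)
print(f"training accuracy after 10 rounds: {acc:.2f}")
```

The contrast with stagewise boosting (e.g., AdaBoost) is the `corrective_step`: a stagewise method would fix the earlier weights and solve only for the newest coefficient, whereas here the whole weight vector is re-optimized on the regularized primal after each weak learner is added.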