Islam Md Monirul, Yao Xin, Nirjon S M Shahriar, Islam Muhammad Asiful, Murase Kazuyuki
Bangladesh University of Engineering and Technology (BUET), Dhaka 1000, Bangladesh.
IEEE Trans Syst Man Cybern B Cybern. 2008 Jun;38(3):771-84. doi: 10.1109/TSMCB.2008.922055.
In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train different individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting algorithms are used in NegBagg and NegBoost, respectively, to create different training sets for different NNs in the ensemble. The idea behind using negative correlation learning in conjunction with the bagging/boosting algorithm is to facilitate interaction and cooperation among NNs during their training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for NNs. NegBoost also uses the constructive approach to automatically determine the number of NNs for the ensemble. The two algorithms have been tested on a number of benchmark problems in machine learning and NNs, including Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.
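The core idea of the abstract — training each ensemble member on its own bootstrap sample while a negative-correlation penalty couples the members' errors — can be illustrated with a minimal sketch. This is not the paper's implementation: the network size, penalty strength `lam`, learning rate, and toy regression target are all illustrative assumptions, and the constructive hidden-neuron growth described in the abstract is omitted for brevity.

```python
import numpy as np

# Minimal sketch of negative correlation learning (NCL) combined with
# bagging-style resampling, in the spirit of NegBagg. All hyperparameters
# here are illustrative assumptions, not the paper's settings.

rng = np.random.default_rng(0)

def init_net(n_in, n_hid):
    # One small single-hidden-layer regression network.
    return {"W1": rng.normal(0, 0.5, (n_in, n_hid)), "b1": np.zeros(n_hid),
            "W2": rng.normal(0, 0.5, (n_hid, 1)),    "b2": np.zeros(1)}

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h, (h @ net["W2"] + net["b2"]).ravel()

def train_negbagg(X, y, n_nets=3, n_hid=8, lam=0.5, lr=0.1, epochs=1000):
    nets = [init_net(X.shape[1], n_hid) for _ in range(n_nets)]
    # Bagging step: each network gets its own bootstrap sample indices.
    bags = [rng.integers(0, len(X), len(X)) for _ in range(n_nets)]
    for _ in range(epochs):
        # Current ensemble output on the full set, used by the NCL penalty.
        outs = np.stack([forward(n, X)[1] for n in nets])
        f_bar = outs.mean(axis=0)
        for i, net in enumerate(nets):
            idx = bags[i]
            h, f_i = forward(net, X[idx])
            # NCL-style gradient: squared-error term minus a penalty that
            # pushes each member's error to be negatively correlated with
            # the rest of the ensemble (via deviation from the mean output).
            delta = (f_i - y[idx]) - lam * (f_i - f_bar[idx])
            # Plain backprop through the small net.
            gW2 = h.T @ delta[:, None] / len(idx)
            gb2 = delta.mean(keepdims=True)
            dh = np.outer(delta, net["W2"].ravel()) * (1 - h**2)
            gW1 = X[idx].T @ dh / len(idx)
            gb1 = dh.mean(axis=0)
            net["W2"] -= lr * gW2; net["b2"] -= lr * gb2
            net["W1"] -= lr * gW1; net["b1"] -= lr * gb1
    return nets

def predict(nets, X):
    # Ensemble prediction is the simple average of member outputs.
    return np.mean([forward(n, X)[1] for n in nets], axis=0)

# Toy regression target (an assumption for demonstration only).
X = np.linspace(-1, 1, 200)[:, None]
y = np.tanh(2 * X).ravel()
nets = train_negbagg(X, y)
mse = np.mean((predict(nets, X) - y) ** 2)
print(f"ensemble MSE: {mse:.4f}")
```

Note the key interaction the abstract highlights: the penalty term `lam * (f_i - f_bar)` ties each member's update to the ensemble's current output, so the networks cooperate during training instead of being fit independently as in plain bagging.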