Zhang Chenguang, Liu Tian, Du Xuejiao
School of Mathematics and Statistics, Hainan University, Haikou 570100, China.
School of Information and Communication Engineering, Hainan University, Haikou 570100, China.
Entropy (Basel). 2023 Dec 20;26(1):0. doi: 10.3390/e26010007.
To address overfitting, which can degrade network generalization performance, this paper proposes a new regularization technique called the class-based decorrelation method (CDM). Specifically, this method views the neurons in a given hidden layer as base learners and aims to improve network generalization and model accuracy by minimizing the correlation among individual base learners while simultaneously maximizing their class-conditional correlation. Intuitively, CDM not only promotes diversity among the hidden neurons but also enhances their cohesiveness when processing samples from the same class. Comparative experiments conducted on various datasets using deep models demonstrate that CDM effectively reduces overfitting and improves classification performance.
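The abstract gives no formulas or code, but the stated objective — decorrelate hidden neurons overall while keeping them correlated within each class — can be illustrated with a minimal numpy sketch. The function name, the use of squared off-diagonal correlations, and the simple difference of the two terms are all illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def cdm_penalty(h, y):
    """Illustrative class-based decorrelation penalty (assumed form, not the paper's).

    h: (n_samples, n_neurons) hidden-layer activations
    y: (n_samples,) integer class labels

    Returns the mean squared off-diagonal correlation over all samples
    minus its class-conditional counterpart, so minimizing the penalty
    decorrelates neurons globally while encouraging correlation among
    them within each class.
    """
    def mean_offdiag_sq_corr(a):
        # Correlation matrix across neurons (columns of a).
        c = np.corrcoef(a, rowvar=False)
        d = c.shape[0]
        # Average squared correlation over off-diagonal entries only.
        return np.mean(c[~np.eye(d, dtype=bool)] ** 2)

    overall = mean_offdiag_sq_corr(h)
    per_class = [mean_offdiag_sq_corr(h[y == k]) for k in np.unique(y)]
    return overall - np.mean(per_class)
```

In training, a term like this would be scaled by a regularization coefficient and added to the classification loss; since each off-diagonal term lies in [0, 1], the penalty is bounded in [-1, 1].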