Department of Information and Mathematics Sciences, China Jiliang University, Hangzhou 310018, Zhejiang Province, PR China.
Neural Netw. 2015 Mar;63:10-7. doi: 10.1016/j.neunet.2014.10.008. Epub 2014 Nov 10.
When the number n of neural elements in a neural network exceeds the sample size m, the overfitting problem arises because there are more parameters than data points (more variables than constraints). To overcome this problem, we propose reducing the number of neural elements via a compressed projection A that need not satisfy the Restricted Isometry Property (RIP). By applying probability inequalities and the approximation properties of feedforward neural networks (FNNs), we prove that solving the FNN regression learning algorithm in the compressed domain rather than the original domain reduces the sample error at the price of an increased (but controlled) approximation error; covering number theory is used to estimate the excess error, and an upper bound on the excess error is given.
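The scheme described above can be illustrated with a minimal numerical sketch. All sizes (m, d, n, k), the tanh random-feature construction, and the plain Gaussian choice of A are illustrative assumptions, not details taken from the paper; the point is only that fitting k compressed coefficients instead of n > m original output weights turns an underdetermined least-squares problem into a well-posed one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: m samples with d inputs (hypothetical sizes)
m, d, n, k = 50, 5, 200, 20      # n > m invites overfitting; k < m is the compressed size
X = rng.normal(size=(m, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=m)

# Single-hidden-layer FNN features with random inner weights:
# H[i, j] = tanh(x_i . w_j + b_j), giving more columns (n) than samples (m)
W = rng.normal(size=(d, n))
b = rng.normal(size=n)
H = np.tanh(X @ W + b)           # shape (m, n)

# Compressed projection A (k x n): a plain Gaussian matrix,
# with no RIP condition imposed on it
A = rng.normal(size=(k, n)) / np.sqrt(k)

# Solve the regression in the compressed domain: fit only k coefficients z
Hc = H @ A.T                     # shape (m, k), now overdetermined
z, *_ = np.linalg.lstsq(Hc, y, rcond=None)

# Recover output weights in the original n-dimensional domain as A^T z
w_out = A.T @ z
pred = H @ w_out
print(w_out.shape, pred.shape)   # → (200,) (50,)
```

Because only k parameters are estimated from m > k samples, the sample (estimation) error shrinks, while restricting the weights to the range of A.T introduces the controlled approximation error discussed in the abstract.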