uc3m-Santander Big Data Institute, Universidad Carlos III de Madrid. Getafe (Madrid), Spain.
Neural Netw. 2021 Oct;142:57-72. doi: 10.1016/j.neunet.2021.04.036. Epub 2021 Apr 30.
Even though neural networks are widely used in a large number of applications, they are still considered black boxes and are difficult to dimension or to evaluate in terms of prediction error. This has led to increasing interest in the overlap between neural networks and more traditional statistical methods, which can help overcome these problems. In this article, a mathematical framework relating neural networks and polynomial regression is explored by building an explicit expression for the coefficients of a polynomial regression from the weights of a given neural network, using a Taylor expansion approach. This is achieved for single-hidden-layer neural networks in regression problems. The validity of the proposed method depends on factors such as the distribution of the synaptic potentials and the chosen activation function. The performance of the method is tested empirically by simulating synthetic data generated from polynomials and training neural networks with different structures and hyperparameters, showing that almost identical predictions can be obtained when certain conditions are met. Lastly, when learning from polynomial-generated data, the proposed method produces polynomials that correctly approximate the data locally.
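The core idea described in the abstract can be sketched in a few lines of code. The following is a minimal illustration (not the authors' implementation) for a scalar-input, single-hidden-layer tanh network with hypothetical random weights standing in for a trained model: each hidden unit tanh(w_j x + b_j) is Taylor-expanded around its bias b_j, and the resulting terms are collected into explicit polynomial coefficients, whose predictions are then compared with the network's output near x = 0. The weight scales, truncation degree K, and evaluation range are all illustrative assumptions.

```python
import numpy as np
from math import factorial
from numpy.polynomial import polynomial as P

rng = np.random.default_rng(0)

# A hypothetical single-hidden-layer network (random weights standing in
# for a trained model): f(x) = c0 + sum_j c_j * tanh(w_j*x + b_j)
H = 5
w = rng.normal(scale=0.3, size=H)   # small weights keep synaptic potentials
b = rng.normal(scale=0.3, size=H)   # in the region where the expansion holds
c = rng.normal(size=H)
c0 = rng.normal()

def net(x):
    return c0 + np.tanh(np.outer(x, w) + b) @ c

# Derivatives of tanh expressed as polynomials in t = tanh(z):
# if f(z) = Q(t), then f'(z) = Q'(t) * (1 - t^2).
def tanh_derivs(z, order):
    t = np.tanh(z)
    Q = np.array([0.0, 1.0])                 # Q(t) = t, coeffs low degree first
    out = [P.polyval(t, Q)]
    for _ in range(order):
        Q = P.polymul(P.polyder(Q), [1.0, 0.0, -1.0])   # chain rule: * (1 - t^2)
        out.append(P.polyval(t, Q))
    return np.array(out)                     # shape (order+1, len(z))

# Collect coefficients: tanh(w_j*x + b_j) = sum_k tanh^(k)(b_j)/k! * (w_j*x)^k
K = 7                                        # truncation degree (assumption)
D = tanh_derivs(b, K)                        # D[k, j] = tanh^(k)(b_j)
poly = np.array([(c * D[k] * w**k).sum() / factorial(k) for k in range(K + 1)])
poly[0] += c0

# Near x = 0 the polynomial and the network give almost identical predictions
x = np.linspace(-0.5, 0.5, 11)
approx = P.polyval(x, poly)
print(np.max(np.abs(net(x) - approx)))       # small on this local range
```

As the abstract notes, the quality of this local approximation depends on the size of the synaptic potentials w_j*x + b_j: the larger they are relative to the expansion point, the higher the degree K needed for the truncated polynomial to match the network.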