Department of Electrical and Electronics Engineering, Batman University, 72060 Batman, Turkey.
Neural Netw. 2018 Mar;99:148-157. doi: 10.1016/j.neunet.2018.01.007. Epub 2018 Jan 31.
Determining the optimal activation function in artificial neural networks is an important issue because it directly affects the success rates obtained. Unfortunately, there is no way to determine it analytically; the optimal activation function is generally found by trial or tuning. This paper presents a simpler and more effective approach to determining the optimal activation function. In this approach, which can be called a trained activation function, an activation function is trained for each particular neuron by linear regression. The training is performed on a training dataset consisting of the summed inputs of each neuron in the hidden layer and the desired outputs. In this way, a different activation function is generated for each neuron in the hidden layer. The approach was employed in random weight artificial neural networks (RWNs) and validated on 50 benchmark datasets. The success rates achieved by RWNs using trained activation functions were higher than those achieved by RWNs using traditional activation functions. The results show that the proposed approach is a successful, simple, and effective way to determine the optimal activation function, instead of trials or tuning, in both randomized single-layer and multilayer ANNs.
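The abstract's idea (per-neuron activation functions fitted by linear regression from each hidden neuron's net input to the desired output, inside a random weight network) can be illustrated with a minimal sketch. The choice of a polynomial basis for the regression, the toy data, and all variable names below are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (an assumption, not from the paper): learn y = sin(x).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])

n_hidden, degree = 10, 3

# Random, untrained input weights and biases (the "random weight" part of an RWN).
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
S = X @ W + b  # net input (sum of weighted inputs) of each hidden neuron

# "Train" one activation function per neuron: fit coefficients by least
# squares, mapping that neuron's net inputs to the desired outputs.
# A polynomial basis is one possible choice; the paper only specifies
# that the fit is done by linear regression.
H = np.empty_like(S)
for j in range(n_hidden):
    basis = np.vander(S[:, j], degree + 1)      # columns [s^3, s^2, s, 1]
    coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
    H[:, j] = basis @ coef                      # trained activation output

# Output-layer weights by least squares, as is standard for RWNs.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ beta
mse = np.mean((pred - y) ** 2)
```

Each hidden neuron thus ends up with its own fitted activation curve rather than a shared sigmoid or tanh, which is the core of the trained-activation-function idea described above.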