Department of Mathematics, Kyungpook National University, Daegu 41566, Republic of Korea.
Department of Mathematical Sciences, Ulsan National Institute of Science and Technology, Ulsan 44919, Republic of Korea.
Neural Netw. 2024 Aug;176:106362. doi: 10.1016/j.neunet.2024.106362. Epub 2024 May 3.
Appropriate weight initialization settings, along with the ReLU activation function, have become cornerstones of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The "dying ReLU" problem, in which ReLU neurons become inactive and yield zero output, poses a significant challenge when training deep neural networks with the ReLU activation function. Theoretical analyses and various methods have been introduced to address this problem. Nevertheless, training extremely deep and narrow feedforward networks with the ReLU activation function remains difficult. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and show how these properties enable signal vectors to propagate effectively. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the proposed initialization method.
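To make the dying-ReLU failure mode concrete, the following is a minimal sketch (not the paper's method) of how a signal vector can collapse to all zeros in a deep, narrow feedforward ReLU network under standard He initialization; the width, depth, and trial count are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sketch: propagate a random input through a deep, narrow
# feedforward ReLU network with He (Kaiming) initialization and count how
# often the signal "dies" (every activation becomes exactly zero), after
# which no gradient can flow and the network is untrainable from that input.

rng = np.random.default_rng(0)
width, depth, trials = 4, 100, 1000  # narrow width, large depth (assumed values)

dead = 0
for _ in range(trials):
    x = rng.standard_normal(width)
    for _ in range(depth):
        # He initialization: entries of W drawn from N(0, 2 / fan_in)
        W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
        x = np.maximum(W @ x, 0.0)  # ReLU activation
        if not x.any():             # all neurons output zero: the signal is dead
            dead += 1
            break

print(f"signal died in {dead}/{trials} forward passes")
```

With a width this small, each layer has a non-negligible chance that all pre-activations are negative, so over many layers the signal dies in nearly every trial; this is the propagation failure that a suitable initial weight matrix is meant to prevent.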