IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3094-3108. doi: 10.1109/TNNLS.2021.3050422. Epub 2022 Jul 6.
Nature has always inspired the human spirit, and scientists have frequently developed new methods based on observations from nature. Recent advances in imaging and sensing technology allow fascinating insights into biological neural processes. With the objective of finding new strategies to enhance the learning capabilities of neural networks, we focus on a phenomenon closely related to learning tasks and neural stability in biological neural networks: homeostatic plasticity. Among the theories developed to describe homeostatic plasticity, synaptic scaling has been found to be the most mature and applicable. We systematically discuss previous studies on synaptic scaling theory and how they could be applied to artificial neural networks. To this end, we use information theory to analytically evaluate how mutual information is affected by synaptic scaling. Based on these analytic findings, we propose two flavors in which synaptic scaling can be applied during the training of simple and complex, feedforward, and recurrent neural networks. We compare our approach with state-of-the-art regularization techniques on standard benchmarks. In our experiments, the proposed method yields the lowest error in both regression and classification tasks compared to previous regularization approaches, across a wide range of feedforward and recurrent network topologies and data sets.
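The abstract does not specify how the two proposed flavors of synaptic scaling are implemented; the sketch below is only a generic illustration of the underlying biological idea, in which each neuron multiplicatively rescales its incoming synaptic weights toward a homeostatic target. The function name `synaptic_scale`, the per-neuron L2-norm target, and the scaling rate are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def synaptic_scale(W, target=1.0, rate=0.1):
    """Generic synaptic-scaling step (illustrative assumption, not the
    paper's algorithm): multiplicatively nudge each neuron's incoming
    weight vector toward a target L2 norm, preserving relative weights.

    W      -- weight matrix, one row per post-synaptic neuron
    target -- homeostatic target norm for each neuron's incoming weights
    rate   -- fraction of the gap to the target closed per step
    """
    # Per-neuron norm of incoming weights (rows of W).
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    # Multiplicative factor moving each norm a fraction `rate` toward target.
    scale = 1.0 + rate * (target / np.maximum(norms, 1e-8) - 1.0)
    return W * scale
```

Such a step could, for example, be applied after each gradient update during training; because the rescaling is multiplicative, the relative strengths of a neuron's synapses are preserved, which is the defining property of synaptic scaling in the biological literature.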