Schmidgall Samuel, Hays Joe
U.S. Naval Research Laboratory, Spacecraft Engineering Department, Washington, DC, United States.
Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD, United States.
Front Neurosci. 2023 May 12;17:1183321. doi: 10.3389/fnins.2023.1183321. eCollection 2023.
We propose that in order to harness our understanding of neuroscience toward machine learning, we must first have powerful tools for training brain-like models of learning. Although substantial progress has been made toward understanding the dynamics of learning in the brain, models of learning have yet to demonstrate the same performance capabilities as methods in deep learning such as gradient descent. Inspired by the successes of machine learning using gradient descent, we introduce a bi-level optimization framework that seeks to both solve online learning tasks and improve the ability to learn online using models of plasticity from neuroscience. We demonstrate that models of three-factor learning with synaptic plasticity taken from the neuroscience literature can be trained in Spiking Neural Networks (SNNs) with gradient descent via a framework of learning-to-learn to address challenging learning problems. This framework opens a new path toward developing neuroscience inspired online learning algorithms.
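The bi-level scheme described above can be sketched in miniature: an inner loop applies a three-factor plasticity rule (pre/post eligibility trace gated by a modulatory error signal) to adapt a synapse online, while an outer loop does gradient descent on the plasticity parameter itself. This is a toy illustration, not the paper's method: the single-synapse task, the names (`meta_eta`, `tau_e`), and the finite-difference outer gradient (standing in for backpropagation through the inner loop in an SNN) are all assumptions made for the sketch.

```python
import numpy as np

def inner_episode(meta_eta, seed, tau_e=0.8, steps=50):
    """One online-learning episode for a single synapse.

    Three-factor update: dw = meta_eta * M * e, where
      e -- eligibility trace of pre/post spike coincidences,
      M -- global modulatory (error-like) third factor,
      meta_eta -- the meta-learned plasticity rate.
    Returns the final task loss (squared distance to a target weight).
    """
    rng = np.random.default_rng(seed)
    w, e = 0.0, 0.0
    w_target = 1.0  # toy task: drive the synapse toward a target weight
    for _ in range(steps):
        pre = rng.random() < 0.5            # Bernoulli presynaptic spike
        post = rng.random() < 0.5           # Bernoulli postsynaptic spike
        e = tau_e * e + float(pre and post)  # decaying eligibility trace
        M = w_target - w                     # third factor: error signal
        w += meta_eta * M * e                # three-factor weight update
    return (w_target - w) ** 2

def meta_loss(meta_eta, episodes=20):
    # Fixed seeds (common random numbers) make the outer-loop
    # finite-difference gradient deterministic.
    return float(np.mean([inner_episode(meta_eta, s) for s in range(episodes)]))

# Outer loop: gradient descent on the plasticity parameter via a
# finite-difference gradient.
eta, lr, eps = 0.01, 0.005, 1e-3
initial = meta_loss(eta)
for _ in range(30):
    g = (meta_loss(eta + eps) - meta_loss(eta - eps)) / (2 * eps)
    eta -= lr * g

print(f"meta-learned eta = {eta:.3f}; loss {initial:.3f} -> {meta_loss(eta):.4f}")
```

The outer loop improves the *learning rule* rather than the weights directly: after meta-optimization, a fresh episode using the learned plasticity rate solves the online task far better than the initial rule did.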