Smooth Exact Gradient Descent Learning in Spiking Neural Networks

Author Information

Klos Christian, Memmesheimer Raoul-Martin

Affiliations

University of Bonn, Neural Network Dynamics and Computation, Institute of Genetics, 53115 Bonn, Germany.

Publication Information

Phys Rev Lett. 2025 Jan 17;134(2):027301. doi: 10.1103/PhysRevLett.134.027301.

Abstract

Gradient descent prevails in artificial neural network training, but seems inept for spiking neural networks as small parameter changes can cause sudden, disruptive appearances and disappearances of spikes. Here, we demonstrate exact gradient descent based on continuously changing spiking dynamics. These are generated by neuron models whose spikes vanish and appear at the end of a trial, where they cannot influence subsequent dynamics. This also enables gradient-based spike addition and removal. We illustrate our scheme with various tasks and setups, including recurrent and deep, initially silent networks.
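As a rough illustration of why a spike time can depend smoothly on a parameter, and how a spike can slide to the trial end before vanishing there, the following is a minimal toy sketch in Python. It assumes a non-leaky integrate-and-fire neuron driven by a constant input current I over a trial [0, T], so the potential V(t) = I·t crosses the threshold θ at t_s = θ/I. This is only an assumed toy model chosen for illustration; it is not the specific neuron model or learning scheme of the paper.

```python
# Toy sketch (hypothetical, not the paper's neuron model): a non-leaky
# integrate-and-fire neuron with constant input current I over a trial [0, T].
# The potential V(t) = I * t reaches threshold theta at t_s = theta / I, so the
# spike time is a smooth function of I. As I decreases towards theta / T, the
# spike slides continuously to the trial end and vanishes there.

theta = 1.0      # spike threshold (arbitrary units)
T = 10.0         # trial duration
t_target = 4.0   # desired spike time
lr = 1e-3        # learning rate (kept small; the gradient scales like 1/I**2)
I = 0.15         # initial input current; spikes at t_s = theta / I ~ 6.7

for step in range(500):
    if I * T < theta:
        # The neuron stays silent during the trial. This toy version simply
        # stops here; keeping gradients informative in exactly this regime is
        # what the paper's trial-end (dis)appearance of spikes addresses.
        break
    t_s = theta / I                       # exact spike time
    loss = 0.5 * (t_s - t_target) ** 2    # squared spike-time error
    grad = (t_s - t_target) * (-theta / I ** 2)   # exact dL/dI via d t_s / dI
    I -= lr * grad                        # gradient descent step

print(f"learned I = {I:.4f}, spike time = {theta / I:.4f}, target = {t_target}")
```

With this parameterization the loss is differentiable wherever a spike exists, and lowering I moves the spike continuously toward T before it disappears at the trial end, which mirrors the qualitative behaviour the abstract describes.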
