Neuron signal attenuation activation mechanism for deep learning.

Author Information

Jiang Wentao, Yuan Heng, Liu Wanjun

Affiliations

Department of Artificial Intelligence, Liaoning Technical University, Huludao 125105, China.

Publication Information

Patterns (N Y). 2024 Dec 16;6(1):101117. doi: 10.1016/j.patter.2024.101117. eCollection 2025 Jan 10.

Abstract

Neuron signal activation is at the core of deep learning and broadly impacts science and engineering. Despite growing interest in stimulating neuron cells via amplitude currents, the activation mechanism of biological neurons has seen limited application in deep learning because no universal mathematical principle suited to artificial neural networks has been available. Here, we show how deep learning can surpass current learning performance through a newly proposed neuron signal activation mechanism. To achieve this, we report a cross-disciplinary method for neuron signal attenuation that uses the inference of differential equations within generalized linear systems to improve the efficiency of deep learning. We formulate the mathematical model of this efficient activation function, which we call Attenuation (Ant). Ant can represent higher-order derivatives and stabilize data distributions in deep-learning tasks. We demonstrate the effectiveness, stability, and generalization of Ant on many challenging tasks across various neural network architectures.
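
The abstract does not give Ant's closed-form definition, so the sketch below is purely illustrative: it shows how an attenuation-style activation, modeled here with the exponential decay that solves a first-order linear differential equation (dy/dx = -a·y), could be wrapped as a drop-in PyTorch module. The class name AttenuationActivation, the parameter alpha, and the functional form are assumptions made for illustration, not the paper's actual Ant function.

```python
import torch
import torch.nn as nn


class AttenuationActivation(nn.Module):
    """Hypothetical attenuation-style activation (illustrative; NOT the paper's Ant)."""

    def __init__(self, alpha: float = 1.0):
        super().__init__()
        # Learnable decay rate, analogous to the attenuation constant "a"
        # in the first-order linear system dy/dx = -a*y.
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pass non-negative signals unchanged; attenuate negative signals
        # with an exponential-decay factor so the output stays smooth and
        # bounded on the negative side.
        return torch.where(x >= 0, x, x * torch.exp(self.alpha * x))


# Usage sketch: drop the activation into an ordinary MLP.
model = nn.Sequential(
    nn.Linear(784, 256),
    AttenuationActivation(),
    nn.Linear(256, 10),
)
out = model(torch.randn(4, 784))
print(out.shape)  # torch.Size([4, 10])
```

Wrapping the activation as an nn.Module with a learnable decay rate lets the attenuation constant be tuned per layer during training, which is one plausible way such a mechanism could stabilize activation distributions; the paper itself should be consulted for Ant's actual formulation.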

Graphical abstract: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8ee9/11783890/35d7db30259b/fx1.jpg
