


Activation function dependence of the storage capacity of treelike neural networks.

Authors

Zavatone-Veth Jacob A, Pehlevan Cengiz

Affiliations

Department of Physics, Harvard University, Cambridge, Massachusetts 02138, USA.

John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, USA.

Publication Info

Phys Rev E. 2021 Feb;103(2):L020301. doi: 10.1103/PhysRevE.103.L020301.

DOI: 10.1103/PhysRevE.103.L020301
PMID: 33736039
Abstract

The expressive power of artificial neural networks crucially depends on the nonlinearity of their activation functions. Though a wide variety of nonlinear activation functions have been proposed for use in artificial neural networks, a detailed understanding of their role in determining the expressive power of a network has not emerged. Here, we study how activation functions affect the storage capacity of treelike two-layer networks. We relate the boundedness or divergence of the capacity in the infinite-width limit to the smoothness of the activation function, elucidating the relationship between previously studied special cases. Our results show that nonlinearity can both increase capacity and decrease the robustness of classification, and provide simple estimates for the capacity of networks with several commonly used activation functions. Furthermore, they generate a hypothesis for the functional benefit of dendritic spikes in branched neurons.
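The treelike two-layer architecture the abstract refers to can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the branch count, branch width, 1/sqrt(N) weight scaling, and sign readout are assumptions chosen to match the standard committee-machine setup.

```python
import numpy as np

def tree_network_output(x, W, activation):
    """Output of a treelike two-layer network.

    The N*K inputs are partitioned into K disjoint branches; branch k
    applies its own weight vector W[k] followed by the nonlinearity,
    and a fixed readout takes the sign of the summed branch outputs.
    """
    K, N = W.shape
    branches = x.reshape(K, N)
    # Branch pre-activations, with an assumed 1/sqrt(N) scaling.
    h = np.einsum("kn,kn->k", W, branches) / np.sqrt(N)
    # Fixed (untrained) readout: sign of the summed branch activations.
    return np.sign(activation(h).sum())

rng = np.random.default_rng(0)
K, N = 5, 100
W = rng.standard_normal((K, N))
x = rng.standard_normal(K * N)

# Swapping the branch nonlinearity changes the classification rule,
# which is why the storage capacity depends on the activation function.
relu = lambda h: np.maximum(h, 0.0)
y_sign = tree_network_output(x, W, np.sign)   # classic committee machine
y_relu = tree_network_output(x, W, relu)      # unbounded, non-smooth
y_tanh = tree_network_output(x, W, np.tanh)   # bounded, smooth
```

In this framing, the paper's question is how the maximal number of random input-label pairs such a network can realize (per weight) scales as the activation and the width N vary.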

Similar Articles

1. Activation function dependence of the storage capacity of treelike neural networks.
   Phys Rev E. 2021 Feb;103(2):L020301. doi: 10.1103/PhysRevE.103.L020301.
2. On Neural Network Kernels and the Storage Capacity Problem.
   Neural Comput. 2022 Apr 15;34(5):1136-1142. doi: 10.1162/neco_a_01494.
3. Network capacity analysis for latent attractor computation.
   Network. 2003 May;14(2):273-302.
4. Morphological associative memories.
   IEEE Trans Neural Netw. 1998;9(2):281-93. doi: 10.1109/72.661123.
5. A novel type of activation function in artificial neural networks: Trained activation function.
   Neural Netw. 2018 Mar;99:148-157. doi: 10.1016/j.neunet.2018.01.007. Epub 2018 Jan 31.
6. Dynamics of periodic delayed neural networks.
   Neural Netw. 2004 Jan;17(1):87-101. doi: 10.1016/S0893-6080(03)00208-9.
7. Implementing Signature Neural Networks with Spiking Neurons.
   Front Comput Neurosci. 2016 Dec 20;10:132. doi: 10.3389/fncom.2016.00132. eCollection 2016.
8. Large memory capacity in chaotic artificial neural networks: a view of the anti-integrable limit.
   IEEE Trans Neural Netw. 2009 Aug;20(8):1340-51. doi: 10.1109/TNN.2009.2024148. Epub 2009 Jul 17.
9. On decision regions of narrow deep neural networks.
   Neural Netw. 2021 Aug;140:121-129. doi: 10.1016/j.neunet.2021.02.024. Epub 2021 Mar 10.
10. Storage capacity and retrieval time of small-world neural networks.
    Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Sep;76(3 Pt 2):036114. doi: 10.1103/PhysRevE.76.036114. Epub 2007 Sep 26.

Cited By

1. Parallel synapses with transmission nonlinearities enhance neuronal classification capacity.
   PLoS Comput Biol. 2025 May 9;21(5):e1012285. doi: 10.1371/journal.pcbi.1012285. eCollection 2025 May.
2. Nonlinear classification of neural manifolds with contextual information.
   Phys Rev E. 2025 Mar;111(3-2):035302. doi: 10.1103/PhysRevE.111.035302.
3. Nonideality-Aware Training for Accurate and Robust Low-Power Memristive Neural Networks.
   Adv Sci (Weinh). 2022 Jun;9(17):e2105784. doi: 10.1002/advs.202105784. Epub 2022 May 4.