Generalized neuron: feedforward and recurrent architectures.

Author Information

Kulkarni Raghavendra V, Venayagamoorthy Ganesh K

Affiliations

Real-Time Power and Intelligent Systems Laboratory, Department of Electrical and Computer Engineering, Missouri University of Science and Technology, Rolla, MO, USA.

Publication Information

Neural Netw. 2009 Sep;22(7):1011-7. doi: 10.1016/j.neunet.2009.07.027. Epub 2009 Jul 25.

DOI: 10.1016/j.neunet.2009.07.027
PMID: 19660907
Abstract

Feedforward neural networks such as multilayer perceptrons (MLPs) and recurrent neural networks are widely used for pattern classification, nonlinear function approximation, density estimation and time series prediction. A large number of neurons are usually required to perform these tasks accurately, which makes MLPs less attractive for computational implementations on resource-constrained hardware platforms. This paper highlights the benefits of feedforward and recurrent forms of a compact neural architecture called the generalized neuron (GN). It demonstrates that the GN and the recurrent GN (RGN) can perform good classification, nonlinear function approximation, density estimation and chaotic time series prediction. Due to its two aggregation functions and two activation functions, the GN exhibits resilience to the nonlinearities of complex problems. Particle swarm optimization (PSO) is proposed as the training algorithm for the GN and RGN. Due to a small number of trainable parameters, the GN and RGN require less memory and computational resources. Thus, these structures are attractive choices for fast implementations on resource-constrained hardware platforms.
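
The abstract describes the GN only at this level: two aggregation functions and two activation functions in a single compact neuron. The sketch below assumes the formulation commonly used for generalized neurons: a summation (sigma) aggregation followed by a sigmoid, a product (pi) aggregation followed by a Gaussian, and a learned convex blend of the two partial outputs. The function name, parameter layout, and blending weight are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def generalized_neuron(x, w_sigma, b_sigma, w_pi, b_pi, blend):
    """Forward pass of a generalized neuron (GN) -- assumed layout.

    sigma part: weighted-sum aggregation -> sigmoid activation
    pi part:    weighted-product aggregation -> Gaussian activation
    output:     convex blend of the two partial outputs
    """
    # Summation aggregation with a sigmoid activation.
    s = np.dot(w_sigma, x) + b_sigma
    o_sigma = 1.0 / (1.0 + np.exp(-s))

    # Product aggregation with a Gaussian activation.
    p = np.prod(w_pi * x) * b_pi
    o_pi = np.exp(-p ** 2)

    # `blend` in [0, 1] weights the two parts; it is trainable too.
    return blend * o_sigma + (1.0 - blend) * o_pi
```

Under this assumed layout, an n-input GN carries only 2n + 3 trainable parameters, which is what makes the structure attractive for resource-constrained hardware; an RGN can be sketched the same way by appending the previous output to x as an extra input.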

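The abstract proposes PSO as the trainer but reports no hyperparameters, so the following is a minimal, generic PSO minimizer over a flat GN/RGN parameter vector. The swarm size, inertia weight, and acceleration constants are common textbook defaults, and `loss_fn` stands for any user-supplied objective (e.g., mean squared error of the GN over a training set); none of these values come from the paper.

```python
import numpy as np

def pso_train(loss_fn, dim, n_particles=20, iters=200,
              inertia=0.729, c1=1.494, c2=1.494, seed=0):
    """Minimize loss_fn over a `dim`-dimensional parameter vector with PSO."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))  # candidate parameter vectors
    vel = np.zeros((n_particles, dim))                # particle velocities
    pbest = pos.copy()                                # best position seen per particle
    pbest_val = np.array([loss_fn(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()        # best position seen by the swarm

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Standard velocity update: inertia + cognitive pull + social pull.
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss_fn(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

    return gbest, float(pbest_val.min())
```

Because the GN has so few parameters, each particle is a short vector and every PSO iteration costs only n_particles forward passes over the data, which matches the abstract's point about low memory and computational cost.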

Similar Articles

1. Generalized neuron: feedforward and recurrent architectures.
   Neural Netw. 2009 Sep;22(7):1011-7. doi: 10.1016/j.neunet.2009.07.027. Epub 2009 Jul 25.
2. Optimal exponential synchronization of general chaotic delayed neural networks: an LMI approach.
   Neural Netw. 2009 Sep;22(7):949-57. doi: 10.1016/j.neunet.2009.04.002. Epub 2009 Apr 22.
3. A complex-valued RTRL algorithm for recurrent neural networks.
   Neural Comput. 2004 Dec;16(12):2699-713. doi: 10.1162/0899766042321779.
4. Real-time computation at the edge of chaos in recurrent neural networks.
   Neural Comput. 2004 Jul;16(7):1413-36. doi: 10.1162/089976604323057443.
5. On Clifford neurons and Clifford multi-layer perceptrons.
   Neural Netw. 2008 Sep;21(7):925-35. doi: 10.1016/j.neunet.2008.03.004. Epub 2008 Jun 2.
6. A generalized feedforward neural network architecture for classification and regression.
   Neural Netw. 2003 Jun-Jul;16(5-6):561-8. doi: 10.1016/S0893-6080(03)00116-3.
7. Coding of temporally varying signals in networks of spiking neurons with global delayed feedback.
   Neural Comput. 2005 Oct;17(10):2139-75. doi: 10.1162/0899766054615680.
8. Spike-timing error backpropagation in theta neuron networks.
   Neural Comput. 2009 Jan;21(1):9-45. doi: 10.1162/neco.2008.09-07-610.
9. Approximation of state-space trajectories by locally recurrent globally feed-forward neural networks.
   Neural Netw. 2008 Jan;21(1):59-64. doi: 10.1016/j.neunet.2007.10.004. Epub 2007 Nov 22.
10. On the global output convergence of a class of recurrent neural networks with time-varying inputs.
   Neural Netw. 2005 Mar;18(2):171-8. doi: 10.1016/j.neunet.2004.10.005. Epub 2005 Jan 19.

Cited By

1. Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond.
   Front Artif Intell. 2022 Jun 2;5:770254. doi: 10.3389/frai.2022.770254. eCollection 2022.