

Learning algorithms based on linearization.

Authors

Hahnloser R

Affiliation

Institute for Theoretical Physics, ETHZ, Zürich, Switzerland.

Publication

Network. 1998 Aug;9(3):363-80.

PMID: 9861996
Abstract

The aim of this article is to investigate a mechanical description of learning. A framework for local and simple learning algorithms based on interpreting a neural network as a set of configuration constraints is proposed. For any architectural design and learning task, unsupervised and supervised algorithms can be derived, optionally using unconstrained and hidden neurons. Unlike algorithms based on the gradient in weight space, the proposed tangential correlation (TC) algorithms move along the gradient in state space. This results in optimal scaling properties and simple expressions for the weight updates. The number of synapses is much larger than the number of neurons. A constraint for neural states does not impose a unique constraint for synaptic weights. Which weights to assign credit to can be selected from a parametrization of all weight changes equivalently satisfying the state constraints. At the heart of the parametrization are minimal weight changes. Two supervised algorithms (differing by their parametrizations) operating on a three-layer perceptron are compared with standard backpropagation. The successful training of fixed points of recurrent networks is demonstrated. The unsupervised learning of oscillations with variable frequencies is performed on standard and more sophisticated recurrent networks. The results presented here can be useful both for the analysis and for the synthesis of learning algorithms.
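A central idea in the abstract is that a state constraint does not pin down a unique weight change, and that the parametrization is built around minimal weight changes. As a rough illustration only (not the paper's TC algorithm itself), the sketch below shows, for a single hypothetical linear neuron `y = w @ x`, the minimum-norm weight update that exactly satisfies a target-state constraint; all variable names are illustrative assumptions.

```python
import numpy as np

def minimal_weight_update(w, x, y_target):
    """Minimum-norm delta_w such that (w + delta_w) @ x == y_target.

    Of all weight changes satisfying the state constraint, the smallest
    (in Euclidean norm) lies along the input direction x:
        delta_w = (y_target - y) * x / ||x||^2
    This mirrors the 'minimal weight changes' at the heart of the
    parametrization described in the abstract (simplified sketch).
    """
    y = w @ x                              # current neuron state
    return (y_target - y) * x / (x @ x)   # minimum-norm correction

# Example: drive the neuron's state to 1.0 with the smallest weight change.
w = np.array([0.5, -0.3, 0.1])
x = np.array([1.0, 2.0, -1.0])
dw = minimal_weight_update(w, x, 1.0)
print((w + dw) @ x)  # constraint is satisfied exactly
```

Because the update is proportional to the presynaptic input `x`, it is local and simple to compute, which is in the spirit of the framework the abstract describes; the actual TC algorithms operate on full networks and state-space gradients.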


Similar Articles

1. Learning algorithms based on linearization.
Network. 1998 Aug;9(3):363-80.
2. Equivalence of backpropagation and contrastive Hebbian learning in a layered network.
Neural Comput. 2003 Feb;15(2):441-54. doi: 10.1162/089976603762552988.
3. A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
4. A new backpropagation learning algorithm for layered neural networks with nondifferentiable units.
Neural Comput. 2007 May;19(5):1422-35. doi: 10.1162/neco.2007.19.5.1422.
5. A new constructive algorithm for architectural and functional adaptation of artificial neural networks.
IEEE Trans Syst Man Cybern B Cybern. 2009 Dec;39(6):1590-605. doi: 10.1109/TSMCB.2009.2021849. Epub 2009 Jun 5.
6. Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures.
Neural Netw. 2007 Mar;20(2):236-44. doi: 10.1016/j.neunet.2006.01.020. Epub 2006 May 9.
7. Optimizing one-shot learning with binary synapses.
Neural Comput. 2008 Aug;20(8):1928-50. doi: 10.1162/neco.2008.10-07-618.
8. Echo state networks with filter neurons and a delay&sum readout.
Neural Netw. 2010 Mar;23(2):244-56. doi: 10.1016/j.neunet.2009.07.004. Epub 2009 Jul 16.
9. Integrating regression formulas and kernel functions into locally adaptive knowledge-based neural networks: a case study on renal function evaluation.
Artif Intell Med. 2006 Mar;36(3):235-44. doi: 10.1016/j.artmed.2005.07.007. Epub 2005 Oct 6.
10. A spiking neural network model of an actor-critic learning agent.
Neural Comput. 2009 Feb;21(2):301-39. doi: 10.1162/neco.2008.08-07-593.