
Foundations of implementing the competitive layer model by Lotka-Volterra recurrent neural networks.

Author Information

Yi Zhang

Affiliation

Machine Intelligence Laboratory, College of Computer Science, Sichuan University, Chengdu 610065, China.

Publication Information

IEEE Trans Neural Netw. 2010 Mar;21(3):494-507. doi: 10.1109/TNN.2009.2039758. Epub 2010 Feb 5.

DOI: 10.1109/TNN.2009.2039758
PMID: 20142165
Abstract

The competitive layer model (CLM) can be described by an optimization problem. The problem can be further formulated by an energy function, called the CLM energy function, in the subspace of nonnegative orthant. The set of minimum points of the CLM energy function forms the set of solutions of the CLM problem. Solving the CLM problem means to find out such solutions. Recurrent neural networks (RNNs) can be used to implement the CLM to solve the CLM problem. The key point is to make the set of minimum points of the CLM energy function just correspond to the set of stable attractors of the recurrent neural networks. This paper proposes to use Lotka-Volterra RNNs (LV RNNs) to implement the CLM. The contribution of this paper is to establish foundations of implementing the CLM by LV RNNs. The contribution mainly contains three parts. The first part is on the CLM energy function. Necessary and sufficient conditions for minimum points of the CLM energy function are established by detailed study. The second part is on the convergence of the proposed model of the LV RNNs. It is proven that interesting trajectories are convergent. The third part is the most important. It proves that the set of stable attractors of the proposed LV RNN just equals the set of minimum points of the CLM energy function in the nonnegative orthant. Thus, the LV RNNs can be used to solve the problem of the CLM. It is believed that by establishing such basic rigorous theories, more and interesting applications of the CLM can be found.
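The dynamics sketched in the abstract can be illustrated with a small simulation. The following is a minimal sketch, assuming the commonly cited CLM formulation (columns of neurons competing across layers, with lateral compatibilities f and vertical winner-take-all pressure J) and a standard multiplicative Lotka-Volterra update; all parameter values and the grouping structure are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

# Illustrative sketch of a Lotka-Volterra RNN implementing the
# competitive layer model (CLM). N feature columns compete across
# L layers; parameters (f, J, h) are assumed for demonstration.
N, L = 4, 2

# Lateral compatibility f_{rr'}: columns {0,1} and {2,3} form two
# mutually incompatible groups (+1 within a group, -1 across groups).
f = -np.ones((N, N))
f[:2, :2] = 1.0
f[2:, 2:] = 1.0

J = 3.0           # vertical (within-column) competition strength
h = np.ones(N)    # per-column input

# Nonnegative initial state with a small bias toward the expected grouping.
x = np.full((N, L), 0.1)
x[0, 0] += 0.05; x[1, 0] += 0.05
x[2, 1] += 0.05; x[3, 1] += 0.05

dt = 0.01
for _ in range(5000):
    lateral = f @ x                                          # sum_r' f_{rr'} x_{r'a}
    vertical = J * (x.sum(axis=1, keepdims=True) - h[:, None])
    # Multiplicative LV update: the factor x keeps trajectories
    # in the nonnegative orthant, as the abstract emphasizes.
    x += dt * x * (lateral - vertical)

winners = x.argmax(axis=1)   # layer assignment of each column
print(winners)
```

With these assumed parameters, within-column competition drives each column toward a single active layer (a winner-take-all column), and the negative cross-group compatibilities push the two groups into different layers, matching the picture of stable attractors corresponding to minimum points of the CLM energy function.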


Similar Articles

1. Foundations of implementing the competitive layer model by Lotka-Volterra recurrent neural networks.
IEEE Trans Neural Netw. 2010 Mar;21(3):494-507. doi: 10.1109/TNN.2009.2039758. Epub 2010 Feb 5.
2. Continuous attractors of Lotka-Volterra recurrent neural networks with infinite neurons.
IEEE Trans Neural Netw. 2010 Oct;21(10):1690-5. doi: 10.1109/TNN.2010.2067224. Epub 2010 Sep 2.
3. Selectable and unselectable sets of neurons in recurrent neural networks with saturated piecewise linear transfer function.
IEEE Trans Neural Netw. 2011 Jul;22(7):1021-31. doi: 10.1109/TNN.2011.2132762. Epub 2011 May 23.
4. Representations of continuous attractors of recurrent neural networks.
IEEE Trans Neural Netw. 2009 Feb;20(2):368-72. doi: 10.1109/TNN.2008.2010771. Epub 2009 Jan 13.
5. Competitive layer model of discrete-time recurrent neural networks with LT neurons.
Neural Comput. 2010 Aug;22(8):2137-60. doi: 10.1162/NECO_a_00004-Zhou.
6. A competitive-layer model for feature binding and sensory segmentation.
Neural Comput. 2001 Feb;13(2):357-87. doi: 10.1162/089976601300014574.
7. A competitive layer model for cellular neural networks.
Neural Netw. 2012 Sep;33:216-27. doi: 10.1016/j.neunet.2012.05.005. Epub 2012 Jun 1.
8. A new approach to knowledge-based design of recurrent neural networks.
IEEE Trans Neural Netw. 2008 Aug;19(8):1389-401. doi: 10.1109/TNN.2008.2000393.
9. Markovian architectural bias of recurrent neural networks.
IEEE Trans Neural Netw. 2004 Jan;15(1):6-15. doi: 10.1109/TNN.2003.820839.
10. Global exponential stability of generalized recurrent neural networks with discrete and distributed delays.
Neural Netw. 2006 Jun;19(5):667-75. doi: 10.1016/j.neunet.2005.03.015. Epub 2005 Jul 20.

Cited By

1. Deep learning-based EEG emotion recognition: Current trends and future perspectives.
Front Psychol. 2023 Feb 27;14:1126994. doi: 10.3389/fpsyg.2023.1126994. eCollection 2023.
2. A dual deep neural network for auto-delineation in cervical cancer radiotherapy with clinical validation.
Radiat Oncol. 2022 Nov 15;17(1):182. doi: 10.1186/s13014-022-02157-5.
3. Informational Structures and Informational Fields as a Prototype for the Description of Postulates of the Integrated Information Theory.
Entropy (Basel). 2019 May 14;21(5):493. doi: 10.3390/e21050493.