

Morphological associative memories.

Authors

Ritter G X, Sussner P, Diaz-de-Leon J L

Affiliation

University of Florida, Center for Computer Vision and Visualization, Gainesville, FL 32611, USA.

Publication

IEEE Trans Neural Netw. 1998;9(2):281-93. doi: 10.1109/72.661123.

DOI: 10.1109/72.661123
PMID: 18252452
Abstract

The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. A nonlinear activation function usually follows the linear operation in order to provide for nonlinearity of the network and set the next state of the neuron. In this paper we introduce a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before possible application of a nonlinear activation function. As a consequence, the properties of morphological neural networks are drastically different than those of traditional neural network models. The main emphasis of the research presented here is on morphological associative memories. We examine the computing and storage capabilities of morphological associative memories and discuss differences between morphological models and traditional semilinear models such as the Hopfield net.
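The max-plus computation described in the abstract can be sketched concretely. The snippet below (illustrative names, not code from the paper) builds a Ritter-style min-memory W with entries w_ij = min over stored pairs of (y_i − x_j), and recalls by taking the maximum of sums, y_i = max_j (w_ij + x_j), in place of the usual sum of products. In the autoassociative case this recall of uncorrupted stored patterns is exact.

```python
import numpy as np

def build_min_memory(X, Y):
    """Min-memory W with W[i, j] = min over stored pairs of (y_i - x_j).

    X: (k, n) array of k input patterns; Y: (k, m) array of outputs.
    Returns W of shape (m, n).
    """
    return np.min(Y[:, :, None] - X[:, None, :], axis=0)

def max_plus_recall(W, x):
    """Morphological recall: y_i = max_j (W[i, j] + x_j).

    Addition replaces multiplication and maximum replaces summation,
    so the computation is nonlinear even without an activation function.
    """
    return np.max(W + x[None, :], axis=1)

# Autoassociative example: store two patterns, then recall one exactly.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 0.0, 1.0]])
W = build_min_memory(X, X)
print(max_plus_recall(W, X[0]))  # prints [1. 2. 3.]
```

Note that, unlike a Hopfield net, recall here is a single feed-forward max-plus product with no iteration to convergence; the min-memory W is also the variant that tolerates erosive (downward) noise in the input.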


Similar Articles

1. Morphological associative memories.
   IEEE Trans Neural Netw. 1998;9(2):281-93. doi: 10.1109/72.661123.
2. Morphological bidirectional associative memories.
   Neural Netw. 1999 Jul;12(6):851-867. doi: 10.1016/s0893-6080(99)00033-7.
3. Associative morphological memories based on variations of the kernel and dual kernel methods.
   Neural Netw. 2003 Jun-Jul;16(5-6):625-32. doi: 10.1016/S0893-6080(03)00113-8.
4. Learning and Forgetting in Generalized Brain-state-in-a-box (BSB) Neural Associative Memories.
   Neural Netw. 1996 Jul;9(5):845-854. doi: 10.1016/0893-6080(95)00101-8.
5. Gray-scale morphological associative memories.
   IEEE Trans Neural Netw. 2006 May;17(3):559-70. doi: 10.1109/TNN.2006.873280.
6. Design and analysis of maximum Hopfield networks.
   IEEE Trans Neural Netw. 2001;12(2):329-39. doi: 10.1109/72.914527.
7. Extreme learning machine for a new hybrid morphological/linear perceptron.
   Neural Netw. 2020 Mar;123:288-298. doi: 10.1016/j.neunet.2019.12.003. Epub 2019 Dec 19.
8. A broad class of discrete-time hypercomplex-valued Hopfield neural networks.
   Neural Netw. 2020 Feb;122:54-67. doi: 10.1016/j.neunet.2019.09.040. Epub 2019 Oct 18.
9. On the complexity of computing and learning with multiplicative neural networks.
   Neural Comput. 2002 Feb;14(2):241-301. doi: 10.1162/08997660252741121.
10. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks.
    Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.

Cited By

1. Imagery in the entropic associative memory.
   Sci Rep. 2023 Jun 12;13(1):9553. doi: 10.1038/s41598-023-36761-6.
2. Weighted entropic associative memory and phonetic learning.
   Sci Rep. 2022 Oct 6;12(1):16703. doi: 10.1038/s41598-022-20798-0.
3. Entropic associative memory for manuscript symbols.
   PLoS One. 2022 Aug 4;17(8):e0272386. doi: 10.1371/journal.pone.0272386. eCollection 2022.
4. Facial Expression Recognition from Multi-Perspective Visual Inputs and Soft Voting.
   Sensors (Basel). 2022 May 31;22(11):4206. doi: 10.3390/s22114206.
5. An entropic associative memory.
   Sci Rep. 2021 Mar 25;11(1):6948. doi: 10.1038/s41598-021-86270-7.
6. Discrimination of schizophrenia auditory hallucinators by machine learning of resting-state functional MRI.
   Int J Neural Syst. 2015 May;25(3):1550007. doi: 10.1142/S0129065715500070. Epub 2015 Jan 19.
7. One-hot vector hybrid associative classifier for medical data classification.
   PLoS One. 2014 Apr 21;9(4):e95715. doi: 10.1371/journal.pone.0095715. eCollection 2014.