Suppr 超能文献


Multiplicative processing in the modeling of cognitive activities in large neural networks.

Authors

Valle-Lisboa Juan C, Pomi Andrés, Mizraji Eduardo

Affiliations

Group of Cognitive Systems Modeling, Biophysics and Systems Biology Section, Facultad de Ciencias, Universidad de la República, Iguá 4225, 11400 Montevideo, Uruguay.

Centro Interdisciplinario en Cognición para la Enseñanza y el Aprendizaje (CICEA), Universidad de la República, Espacio Interdisciplinario, 11200 Montevideo, Uruguay.

Publication

Biophys Rev. 2023 Jun 22;15(4):767-785. doi: 10.1007/s12551-023-01074-5. eCollection 2023 Aug.

DOI: 10.1007/s12551-023-01074-5
PMID: 37681105
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10480136/
Abstract

Explaining how the processing of information by neural systems gives rise to cognitive abilities has been part of biophysics since McCulloch and Pitts' pioneering work within the biophysics school of Chicago in the 1940s and the interdisciplinary cyberneticists' meetings of the 1950s, inseparable from the birth of computing and artificial intelligence. Since then, neural network models have traveled a long path in both the biophysical and the computational disciplines. The biological, neurocomputational aspect reached its representational maturity with the Distributed Associative Memory models developed in the early 1970s. In this framework, the inclusion of signal-signal multiplication within neural network models was presented as a necessity to provide matrix associative memories with adaptive, context-sensitive associations, while greatly enhancing their computational capabilities. In this review, we show that several of the most successful neural network models use a form of multiplication of signals. We present several classical models that included this kind of multiplication and the computational reasons for its inclusion. We then turn to the different proposals about the possible biophysical implementation that underlies these computational capacities. We pinpoint the important ideas put forth by different theoretical models using a tensor product representation and show that these models endow memories with the context-dependent adaptive capabilities necessary for evolutionary adaptation to changing and unpredictable environments. Finally, we show how the powerful abilities of contemporary deep-learning models, inspired by neural networks, also depend on multiplications, and we discuss some perspectives in view of the wide panorama unfolded. The computational relevance of multiplications calls for new avenues of research that uncover the mechanisms our nervous system uses to achieve multiplication.
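The abstract's central idea, signal-signal multiplication giving matrix associative memories context-sensitive associations via tensor (Kronecker) products, can be illustrated with a minimal NumPy sketch. This is not the authors' code; the dimensions and vector names are illustrative assumptions, in the spirit of the context-dependent memory models the review discusses.

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

n = 16                                       # dimension of keys, contexts, outputs
key = unit(rng.standard_normal(n))           # one input pattern
ctx_a, ctx_b = np.eye(n)[0], np.eye(n)[1]    # two orthonormal context vectors
out_a = unit(rng.standard_normal(n))         # desired response to key under context A
out_b = unit(rng.standard_normal(n))         # desired response to key under context B

# The memory superimposes key-context pairs: M = sum_i f_i (k_i ⊗ c_i)^T.
# The Kronecker product k ⊗ c is the "signal-signal multiplication":
# it lets the SAME key retrieve different outputs in different contexts.
M = np.outer(out_a, np.kron(key, ctx_a)) + np.outer(out_b, np.kron(key, ctx_b))

# Recall: present the key multiplied by the current context.
recall_a = M @ np.kron(key, ctx_a)
recall_b = M @ np.kron(key, ctx_b)

print(np.allclose(recall_a, out_a))   # context A retrieves out_a
print(np.allclose(recall_b, out_b))   # context B retrieves out_b
```

Because the contexts are orthonormal, the cross-terms vanish exactly ((k ⊗ c_b)·(k ⊗ c_a) = (k·k)(c_b·c_a) = 0), so recall is perfect here; with merely quasi-orthogonal random contexts, recall would be approximate but still context-selective.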


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6739/10480136/f110f96d9f98/12551_2023_1074_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6739/10480136/830979601c67/12551_2023_1074_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6739/10480136/875ca5fcaad2/12551_2023_1074_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6739/10480136/d70b26ff77e3/12551_2023_1074_Fig4_HTML.jpg

Similar articles

1. Multiplicative processing in the modeling of cognitive activities in large neural networks.
Biophys Rev. 2023 Jun 22;15(4):767-785. doi: 10.1007/s12551-023-01074-5. eCollection 2023 Aug.

2. Macromolecular crowding: chemistry and physics meet biology (Ascona, Switzerland, 10-14 June 2012).
Phys Biol. 2013 Aug;10(4):040301. doi: 10.1088/1478-3975/10/4/040301. Epub 2013 Aug 2.

3. On the complexity of computing and learning with multiplicative neural networks.
Neural Comput. 2002 Feb;14(2):241-301. doi: 10.1162/08997660252741121.

4. Morphological associative memories.
IEEE Trans Neural Netw. 1998;9(2):281-93. doi: 10.1109/72.661123.

5. Performance of a Computational Model of the Mammalian Olfactory System.

6. A neurocomputational model for the processing of conflicting information in context-dependent decision tasks.
J Biol Phys. 2022 Jun;48(2):195-213. doi: 10.1007/s10867-021-09601-9. Epub 2022 Mar 8.

7. Associative Memories via Predictive Coding.
Adv Neural Inf Process Syst. 2021 Dec 1;34:3874-3886.

8. Learning and evolution in bacterial taxis: an operational amplifier circuit modeling the computational dynamics of the prokaryotic 'two component system' protein network.
Biosystems. 2004 Apr-Jun;74(1-3):29-49. doi: 10.1016/j.biosystems.2004.01.003.

9. NeuroLISP: High-level symbolic programming with attractor neural networks.
Neural Netw. 2022 Feb;146:200-219. doi: 10.1016/j.neunet.2021.11.009. Epub 2021 Nov 18.

10. Context-sensitive autoassociative memories as expert systems in medical diagnosis.
BMC Med Inform Decis Mak. 2006 Nov 22;6:39. doi: 10.1186/1472-6947-6-39.

Cited by

1. Memory Gate Controlled by Contexts: Potential Key Structure That Could Link Small Associative Failures With Severe Cognitive Disorders.
Bioessays. 2025 Aug;47(8):e70032. doi: 10.1002/bies.70032. Epub 2025 Jun 16.

2. Biophysical Reviews (ISSUE 4 2023): LAFeBS-highlighting biophysics in Latin America.
Biophys Rev. 2023 Aug 29;15(4):419-423. doi: 10.1007/s12551-023-01117-x. eCollection 2023 Aug.

References

1. A biophysical account of multiplication by a single neuron.
Nature. 2022 Mar;603(7899):119-123. doi: 10.1038/s41586-022-04428-3. Epub 2022 Feb 23.

2. Brains and algorithms partially converge in natural language processing.
Commun Biol. 2022 Feb 16;5(1):134. doi: 10.1038/s42003-022-03036-1.

3. One model for the learning of language.
Proc Natl Acad Sci U S A. 2022 Feb 1;119(5). doi: 10.1073/pnas.2021865119.

4. The neural architecture of language: Integrative modeling converges on predictive processing.
Proc Natl Acad Sci U S A. 2021 Nov 9;118(45). doi: 10.1073/pnas.2105646118.

5. Backpropagation and the brain.
Nat Rev Neurosci. 2020 Jun;21(6):335-346. doi: 10.1038/s41583-020-0277-3. Epub 2020 Apr 17.

6. Tensor Representation of Topographically Organized Semantic Spaces.
Neural Comput. 2018 Dec;30(12):3259-3280. doi: 10.1162/neco_a_01132. Epub 2018 Sep 14.

7. Exploring the sources and mechanisms of cognitive errors in medical diagnosis with associative memory models.
Diagnosis (Berl). 2017 Nov 27;4(4):251-259. doi: 10.1515/dx-2017-0024.

8. Using goal-driven deep learning models to understand sensory cortex.
Nat Neurosci. 2016 Mar;19(3):356-65. doi: 10.1038/nn.4244.

9. Modeling spatial-temporal operations with context-dependent associative memories.
Cogn Neurodyn. 2015 Oct;9(5):523-34. doi: 10.1007/s11571-015-9343-3. Epub 2015 May 17.

10. Parallel Distributed Processing at 25: further explorations in the microstructure of cognition.
Cogn Sci. 2014 Aug;38(6):1024-77. doi: 10.1111/cogs.12148. Epub 2014 Aug 4.