


Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

Author Information

Testolin Alberto, Zorzi Marco

Affiliations

Department of General Psychology and Center for Cognitive Neuroscience, University of Padova, Padua, Italy.

Department of General Psychology and Center for Cognitive Neuroscience, University of Padova, Padua, Italy; IRCCS San Camillo Neurorehabilitation Hospital, Venice-Lido, Italy.

Publication Information

Front Comput Neurosci. 2016 Jul 13;10:73. doi: 10.3389/fncom.2016.00073. eCollection 2016.

DOI: 10.3389/fncom.2016.00073
PMID: 27468262
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC4943066/
Abstract

Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9a5d/4943066/f82f16c10e8f/fncom-10-00073-g0001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9a5d/4943066/adc6f98e2f2d/fncom-10-00073-g0002.jpg
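The abstract describes stochastic, generative neural networks trained with unsupervised learning in hierarchical architectures, with top-down generative (feedback) processing. As an illustrative sketch only — not the authors' implementation; the toy dataset and all names are invented for this example — the following NumPy code trains a minimal Restricted Boltzmann Machine, the canonical building block of the deep generative networks this literature reviews, with one-step contrastive divergence and then samples from it top-down:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Restricted Boltzmann Machine trained with 1-step contrastive divergence."""

    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases

    def sample_h(self, v):
        # bottom-up pass: P(h=1 | v), plus a stochastic binary sample
        p = sigmoid(v @ self.W + self.b_h)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # top-down (generative) pass: P(v=1 | h)
        p = sigmoid(h @ self.W.T + self.b_v)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0, lr=0.1):
        # positive phase on the data, negative phase after one Gibbs step
        ph0, h0 = self.sample_h(v0)
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        n = len(v0)
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

# Toy dataset: two binary prototypes corrupted by 5% bit flips.
protos = np.array([[1, 1, 1, 0, 0, 0],
                   [0, 0, 0, 1, 1, 1]], dtype=float)
data = protos[rng.integers(0, 2, size=200)]
data = np.abs(data - (rng.random(data.shape) < 0.05))

rbm = RBM(n_visible=6, n_hidden=4)
errs = [rbm.cd1_step(data) for _ in range(500)]
print(f"reconstruction error: {errs[0]:.3f} -> {errs[-1]:.3f}")

# Use the trained model generatively: start from noise and Gibbs-sample.
v = (rng.random((3, 6)) < 0.5).astype(float)
for _ in range(50):
    _, h = rbm.sample_h(v)
    pv, v = rbm.sample_v(h)
print(np.round(pv, 2))  # samples should resemble one of the two prototypes
```

Stacking several such RBMs and training them greedily layer by layer yields a deep belief network, one common realization of the hierarchical generative architectures the abstract refers to.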

Similar Articles

1. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.
   Front Comput Neurosci. 2016 Jul 13;10:73. doi: 10.3389/fncom.2016.00073. eCollection 2016.
2. Modeling language and cognition with deep unsupervised learning: a tutorial overview.
   Front Psychol. 2013 Aug 20;4:515. doi: 10.3389/fpsyg.2013.00515. eCollection 2013.
3. Learning Orthographic Structure With Sequential Generative Neural Networks.
   Cogn Sci. 2016 Apr;40(3):579-606. doi: 10.1111/cogs.12258. Epub 2015 Jun 14.
4. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.
   Front Psychol. 2013 Aug 20;4:503. doi: 10.3389/fpsyg.2013.00503. eCollection 2013.
5. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.
   Front Psychol. 2013 May 6;4:251. doi: 10.3389/fpsyg.2013.00251. eCollection 2013.
6. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding.
   Front Comput Neurosci. 2017 Mar 21;11:13. doi: 10.3389/fncom.2017.00013. eCollection 2017.
7. Probabilistic Models with Deep Neural Networks.
   Entropy (Basel). 2021 Jan 18;23(1):117. doi: 10.3390/e23010117.
8. [Connectionist modeling of higher-level cognitive processes].
   Shinrigaku Kenkyu. 2002 Feb;72(6):541-55. doi: 10.4992/jjpsy.72.541.
9. Macromolecular crowding: chemistry and physics meet biology (Ascona, Switzerland, 10-14 June 2012).
   Phys Biol. 2013 Aug;10(4):040301. doi: 10.1088/1478-3975/10/4/040301. Epub 2013 Aug 2.
10. A taxonomy for spatiotemporal connectionist networks revisited: the unsupervised case.
    Neural Comput. 2003 Jun;15(6):1255-320. doi: 10.1162/089976603321780281.

Cited By

1. Human shape perception spontaneously discovers the biological origin of novel, but natural, stimuli.
   J R Soc Interface. 2025 May;22(226):20240931. doi: 10.1098/rsif.2024.0931. Epub 2025 May 21.
2. Investigating the intrinsic top-down dynamics of deep generative models.
   Sci Rep. 2025 Jan 22;15(1):2875. doi: 10.1038/s41598-024-85055-y.
3. Learning Numerosity Representations with Transformers: Number Generation Tasks and Out-of-Distribution Generalization.
   Entropy (Basel). 2021 Jul 3;23(7):857. doi: 10.3390/e23070857.
4. Disease gene prediction with privileged information and heteroscedastic dropout.
   Bioinformatics. 2021 Jul 12;37(Suppl_1):i410-i417. doi: 10.1093/bioinformatics/btab310.
5. The secret life of predictive brains: what's spontaneous activity for?
   Trends Cogn Sci. 2021 Sep;25(9):730-743. doi: 10.1016/j.tics.2021.05.007. Epub 2021 Jun 16.
6. Unsupervised learning predicts human perception and misperception of gloss.
   Nat Hum Behav. 2021 Oct;5(10):1402-1417. doi: 10.1038/s41562-021-01097-6. Epub 2021 May 6.
7. The phase space of meaning model of psychopathology: A computer simulation modelling study.
   PLoS One. 2021 Apr 26;16(4):e0249320. doi: 10.1371/journal.pone.0249320. eCollection 2021.
8. Paradigm Shift Toward Digital Neuropsychology and High-Dimensional Neuropsychological Assessments: Review.
   J Med Internet Res. 2020 Dec 16;22(12):e23777. doi: 10.2196/23777.
9. Emergence of Network Motifs in Deep Neural Networks.
   Entropy (Basel). 2020 Feb 11;22(2):204. doi: 10.3390/e22020204.
10. Degeneracy and Redundancy in Active Inference.
    Cereb Cortex. 2020 Oct 1;30(11):5750-5766. doi: 10.1093/cercor/bhaa148.

References

1. Letter perception emerges from unsupervised deep learning and recycling of natural image features.
   Nat Hum Behav. 2017 Sep;1(9):657-664. doi: 10.1038/s41562-017-0186-2. Epub 2017 Aug 21.
2. Universal resilience patterns in complex networks.
   Nature. 2016 Feb 18;530(7590):307-12. doi: 10.1038/nature16948.
3. Mastering the game of Go with deep neural networks and tree search.
   Nature. 2016 Jan 28;529(7587):484-9. doi: 10.1038/nature16961.
4. Human-level concept learning through probabilistic program induction.
   Science. 2015 Dec 11;350(6266):1332-8. doi: 10.1126/science.aab3050.
5. The Neural Representation of Sequences: From Transition Probabilities to Algebraic Patterns and Linguistic Trees.
   Neuron. 2015 Oct 7;88(1):2-19. doi: 10.1016/j.neuron.2015.09.019.
6. Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream.
   J Neurosci. 2015 Jul 8;35(27):10005-14. doi: 10.1523/JNEUROSCI.5023-14.2015.
7. Learning Orthographic Structure With Sequential Generative Neural Networks.
   Cogn Sci. 2016 Apr;40(3):579-606. doi: 10.1111/cogs.12258. Epub 2015 Jun 14.
8. Probabilistic machine learning and artificial intelligence.
   Nature. 2015 May 28;521(7553):452-9. doi: 10.1038/nature14541.
9. Deep learning.
   Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
10. The restless brain: how intrinsic activity organizes brain function.
    Philos Trans R Soc Lond B Biol Sci. 2015 May 19;370(1668). doi: 10.1098/rstb.2014.0172.