

Going in circles is the way forward: the role of recurrence in visual inference.

Affiliations

Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, United States.

Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, United States; Department of Psychology, Columbia University, New York, NY, United States; Department of Neuroscience, Columbia University, New York, NY, United States; Affiliated member, Electrical Engineering, Columbia University, New York, NY, United States.

Publication Information

Curr Opin Neurobiol. 2020 Dec;65:176-193. doi: 10.1016/j.conb.2020.11.009. Epub 2020 Dec 3.

DOI: 10.1016/j.conb.2020.11.009
PMID: 33279795
Abstract

Biological visual systems exhibit abundant recurrent connectivity. State-of-the-art neural network models for visual recognition, by contrast, rely heavily or exclusively on feedforward computation. Any finite-time recurrent neural network (RNN) can be unrolled along time to yield an equivalent feedforward neural network (FNN). This important insight suggests that computational neuroscientists may not need to engage recurrent computation, and that computer-vision engineers may be limiting themselves to a special case of FNN if they build recurrent models. Here we argue, to the contrary, that FNNs are a special case of RNNs and that computational neuroscientists and engineers should engage recurrence to understand how brains and machines can (1) achieve greater and more flexible computational depth, (2) compress complex computations into limited hardware, (3) integrate priors and priorities into visual inference through expectation and attention, (4) exploit sequential dependencies in their data for better inference and prediction, and (5) leverage the power of iterative computation.
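The unrolling claim in the abstract can be made concrete: an RNN run for a fixed number of time steps computes exactly what a feedforward stack of that many layers computes, with the same weights reused at every depth. A minimal NumPy sketch of this equivalence (the weight matrices, sizes, and function names here are illustrative, not taken from the paper):

```python
import numpy as np

# Shared parameters: one input weight matrix and one recurrent
# weight matrix, reused at every time step / layer.
rng = np.random.default_rng(0)
T, d_in, d_h = 4, 3, 5
W_in = rng.standard_normal((d_h, d_in))   # input weights
W_rec = rng.standard_normal((d_h, d_h))   # recurrent weights

def rnn(xs):
    """Run the recurrent network for T time steps."""
    h = np.zeros(d_h)
    for x in xs:
        h = np.tanh(W_rec @ h + W_in @ x)
    return h

def unrolled_fnn(xs):
    """The same computation written as a feedforward stack of T
    layers whose weights are tied across depth."""
    h = np.zeros(d_h)
    for t in range(T):                    # layer t of the feedforward net
        h = np.tanh(W_rec @ h + W_in @ xs[t])
    return h

xs = rng.standard_normal((T, d_in))
print(np.allclose(rnn(xs), unrolled_fnn(xs)))  # True: identical outputs
```

The converse direction motivates the paper's argument: an FNN is the special case in which each "time step" gets its own untied weights, so recurrence (weight tying across depth) is the more general and more hardware-efficient family.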


Similar Articles

1. Going in circles is the way forward: the role of recurrence in visual inference.
Curr Opin Neurobiol. 2020 Dec;65:176-193. doi: 10.1016/j.conb.2020.11.009. Epub 2020 Dec 3.

2. Recurrent neural networks can explain flexible trading of speed and accuracy in biological vision.
PLoS Comput Biol. 2020 Oct 2;16(10):e1008215. doi: 10.1371/journal.pcbi.1008215. eCollection 2020 Oct.

3. Considerations in using recurrent neural networks to probe neural dynamics.
J Neurophysiol. 2019 Dec 1;122(6):2504-2521. doi: 10.1152/jn.00467.2018. Epub 2019 Oct 16.

4. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
Neuroimage. 2018 Sep;178:385-402. doi: 10.1016/j.neuroimage.2018.05.042. Epub 2018 May 18.

5. Recurrent neural networks with explicit representation of dynamic latent variables can mimic behavioral patterns in a physical inference task.
Nat Commun. 2022 Oct 4;13(1):5865. doi: 10.1038/s41467-022-33581-6.

6. Canonical circuit computations for computer vision.
Biol Cybern. 2023 Oct;117(4-5):299-329. doi: 10.1007/s00422-023-00966-9. Epub 2023 Jun 12.

7. Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks.
Biol Cybern. 2012 Jul;106(4-5):201-17. doi: 10.1007/s00422-012-0490-x. Epub 2012 May 12.

8. The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.
Neural Comput. 1996 Aug 15;8(6):1135-78. doi: 10.1162/neco.1996.8.6.1135.

9. Capsule networks as recurrent models of grouping and segmentation.
PLoS Comput Biol. 2020 Jul 21;16(7):e1008017. doi: 10.1371/journal.pcbi.1008017. eCollection 2020 Jul.

10. Recurrent neural network from adder's perspective: Carry-lookahead RNN.
Neural Netw. 2021 Dec;144:297-306. doi: 10.1016/j.neunet.2021.08.032. Epub 2021 Sep 6.

Cited By

1. Recurrence affects the geometry of visual representations across the ventral visual stream in the human brain.
PLoS Biol. 2025 Aug 25;23(8):e3003354. doi: 10.1371/journal.pbio.3003354. eCollection 2025 Aug.

2. Multiarea processing in body patches of the primate inferotemporal cortex implements inverse graphics.
Proc Natl Acad Sci U S A. 2025 Jul 15;122(28):e2420287122. doi: 10.1073/pnas.2420287122. Epub 2025 Jul 8.

3. Energy optimization induces predictive-coding properties in a multi-compartment spiking neural network model.
PLoS Comput Biol. 2025 Jun 10;21(6):e1013112. doi: 10.1371/journal.pcbi.1013112. eCollection 2025 Jun.

4. Brain-like variational inference.
ArXiv. 2025 May 16:arXiv:2410.19315v2.

5. A deep learning model of dorsal and ventral visual streams for DVSD.
Sci Rep. 2024 Nov 10;14(1):27464. doi: 10.1038/s41598-024-78304-7.

6. Spiking representation learning for associative memories.
Front Neurosci. 2024 Sep 19;18:1439414. doi: 10.3389/fnins.2024.1439414. eCollection 2024.

7. Maintenance and transformation of representational formats during working memory prioritization.
Nat Commun. 2024 Sep 19;15(1):8234. doi: 10.1038/s41467-024-52541-w.

8. The neural network RTNet exhibits the signatures of human perceptual decision-making.
Nat Hum Behav. 2024 Sep;8(9):1752-1770. doi: 10.1038/s41562-024-01914-8. Epub 2024 Jul 12.

9. Memorability shapes perceived time (and vice versa).
Nat Hum Behav. 2024 Jul;8(7):1296-1308. doi: 10.1038/s41562-024-01863-2. Epub 2024 Apr 22.

10. Perceptual reorganization from prior knowledge emerges late in childhood.
iScience. 2024 Jan 4;27(2):108787. doi: 10.1016/j.isci.2024.108787. eCollection 2024 Feb 16.