
Comparison of visual quantities in untrained neural networks.

Affiliations

Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology, Daejeon 34141, Republic of Korea.

Department of Brain and Cognitive Sciences, Korea Advanced Institute of Science and Technology, Daejeon 34141, Republic of Korea.

Publication information

Cell Rep. 2023 Aug 29;42(8):112900. doi: 10.1016/j.celrep.2023.112900. Epub 2023 Jul 29.

DOI: 10.1016/j.celrep.2023.112900
PMID: 37516959
Abstract

The ability to compare quantities of visual objects with two distinct measures, proportion and difference, is observed even in newborn animals. However, how this function originates in the brain, even before visual experience, remains unknown. Here, we propose a model in which neuronal tuning for quantity comparisons can arise spontaneously in completely untrained neural circuits. Using a biologically inspired model neural network, we find that single units selective to proportions and differences between visual quantities emerge in randomly initialized feedforward wirings and that they enable the network to perform quantity comparison tasks. Notably, we find that two distinct tunings to proportion and difference originate from a random summation of monotonic, nonlinear neural activities and that a slight difference in the nonlinear response function determines the type of measure. Our results suggest that visual quantity comparisons are primitive types of functions that can emerge spontaneously before learning in young brains.
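The core idea of the abstract — that randomly weighted sums of monotonic, nonlinear single-unit responses can yield units tuned to either the difference or the proportion of two quantities, depending on the shape of the nonlinearity — can be illustrated with a toy simulation. Everything below (the quantity ranges, the choice of a linear vs. a saturating `log1p` response function, the unit count) is an illustrative assumption for intuition only, not a reconstruction of the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two visual quantities per stimulus (e.g., dot counts; ranges are assumed).
a = rng.integers(1, 21, size=500).astype(float)
b = rng.integers(1, 21, size=500).astype(float)

def random_readout(f, n_units=200):
    """Randomly weighted feedforward sums of monotonic responses f(a), f(b)."""
    w = rng.standard_normal((n_units, 2))
    return np.outer(w[:, 0], f(a)) + np.outer(w[:, 1], f(b))

def best_abs_corr(units, target):
    """Largest |Pearson r| between any unit and a candidate comparison measure."""
    t = (target - target.mean()) / target.std()
    u = (units - units.mean(axis=1, keepdims=True)) / units.std(axis=1, keepdims=True)
    return float(np.abs(u @ t / t.size).max())

proportion = a / (a + b)
difference = a - b

units_linear = random_readout(lambda x: x)    # near-linear responses
units_compressive = random_readout(np.log1p)  # saturating (compressive) responses

# With linear responses, some random unit approximates w*(a - b),
# so it tracks the difference almost perfectly.
print(best_abs_corr(units_linear, difference))
# With compressive responses, log1p(a) - log1p(b) ~ log(a/b), which rises
# monotonically with the proportion a/(a+b).
print(best_abs_corr(units_compressive, proportion))
```

The only change between the two readouts is the single-unit response function, mirroring the abstract's claim that "a slight difference in the nonlinear response function determines the type of measure."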


Similar articles

1. Comparison of visual quantities in untrained neural networks. Cell Rep. 2023 Aug 29;42(8):112900. doi: 10.1016/j.celrep.2023.112900. Epub 2023 Jul 29.
2. Invariance of object detection in untrained deep neural networks. Front Comput Neurosci. 2022 Nov 3;16:1030707. doi: 10.3389/fncom.2022.1030707. eCollection 2022.
3. Face detection in untrained deep neural networks. Nat Commun. 2021 Dec 16;12(1):7328. doi: 10.1038/s41467-021-27606-9.
4. Visual number sense in untrained deep neural networks. Sci Adv. 2021 Jan 1;7(1). doi: 10.1126/sciadv.abd6127. Print 2021 Jan.
5. Learning probabilistic neural representations with randomly connected circuits. Proc Natl Acad Sci U S A. 2020 Oct 6;117(40):25066-25073. doi: 10.1073/pnas.1912804117. Epub 2020 Sep 18.
6. A brain-inspired network architecture for cost-efficient object recognition in shallow hierarchical neural networks. Neural Netw. 2021 Feb;134:76-85. doi: 10.1016/j.neunet.2020.11.013. Epub 2020 Nov 28.
7. Abstract representations emerge naturally in neural networks trained to perform multiple tasks. Nat Commun. 2023 Feb 23;14(1):1040. doi: 10.1038/s41467-023-36583-0.
8. Number detectors spontaneously emerge in a deep neural network designed for visual object recognition. Sci Adv. 2019 May 8;5(5):eaav7903. doi: 10.1126/sciadv.aav7903. eCollection 2019 May.
9. Biologically plausible single-layer networks for nonnegative independent component analysis. Biol Cybern. 2022 Dec;116(5-6):557-568. doi: 10.1007/s00422-022-00943-8. Epub 2022 Sep 7.
10. Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation. IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5939-5952. doi: 10.1109/TNNLS.2021.3071976. Epub 2022 Oct 5.

Cited by

1. Emergence of number sense through the integration of multimodal information: developmental learning insights from neural network models. Front Neurosci. 2024 Jan 17;18:1330512. doi: 10.3389/fnins.2024.1330512. eCollection 2024.