Deep Neural Networks for Modeling Visual Perceptual Learning.

Affiliations

Gatsby Computational Neuroscience Unit, University College London, London W1T 4JG, United Kingdom, and Department of Psychology, University of California-Riverside, Riverside, California 92521.

Publication Information

J Neurosci. 2018 Jul 4;38(27):6028-6044. doi: 10.1523/JNEUROSCI.1620-17.2018. Epub 2018 May 23.

Abstract

Understanding visual perceptual learning (VPL) has become increasingly challenging as new phenomena are discovered with novel stimuli and training paradigms. Although existing models aid our knowledge of critical aspects of VPL, the connections shown by these models between behavioral learning and plasticity across different brain areas are typically superficial. Most models explain VPL as readout from simple perceptual representations to decision areas and are not easily adaptable to explain new findings. Here, we show that a well-known instance of deep neural network (DNN), while not designed specifically for VPL, provides a computational model of VPL with enough complexity to be studied at many levels of analysis. After learning a Gabor orientation discrimination task, the DNN model reproduced key behavioral results, including increasing specificity with higher task precision, and also suggested that learning precise discriminations could transfer asymmetrically to coarse discriminations when the stimulus conditions varied. Consistent with the behavioral findings, the distribution of plasticity moved toward lower layers when task precision increased, and this distribution was also modulated by tasks with different stimulus types. Furthermore, learning in the network units demonstrated close resemblance to extant electrophysiological recordings in monkey visual areas. Altogether, the DNN fulfilled predictions of existing theories regarding specificity and plasticity and reproduced findings of tuning changes in neurons of the primate visual areas. Although the comparisons were mostly qualitative, the DNN provides a new method of studying VPL, can serve as a test bed for theories, and assists in generating predictions for physiological investigations.

SIGNIFICANCE STATEMENT Visual perceptual learning (VPL) has been found to cause changes at multiple stages of the visual hierarchy. We found that training a deep neural network (DNN) on an orientation discrimination task produced behavioral and physiological patterns similar to those found in human and monkey experiments. Unlike existing VPL models, the DNN was pre-trained on natural images to reach high performance in object recognition, but was not designed specifically for VPL; however, it fulfilled predictions of existing theories regarding specificity and plasticity and reproduced findings of tuning changes in neurons of the primate visual areas. When used with care, this unbiased and deep-hierarchical model can provide new ways of studying VPL from behavior to physiology.
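
For readers who want a concrete picture of the procedure the abstract describes, the sketch below fine-tunes an ImageNet-pretrained convolutional network on a fine Gabor orientation discrimination task and then compares each layer's weights before and after training as a simple proxy for the "distribution of plasticity" across layers. This is a minimal illustration, not the authors' code: the choice of VGG-16, the +/-1 degree orientation offset, the per-layer relative weight-change measure, and all hyperparameters are assumptions made for demonstration only.

```python
import numpy as np
import torch
import torch.nn as nn
from torchvision import models


def gabor(theta_deg, size=224, sigma=30.0, freq=0.06, phase=0.0):
    """Render a Gabor patch at orientation theta_deg as a 3-channel image tensor."""
    ax = np.arange(size) - size / 2.0
    x, y = np.meshgrid(ax, ax)
    th = np.deg2rad(theta_deg)
    xr = x * np.cos(th) + y * np.sin(th)
    patch = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * freq * xr + phase)
    img = np.repeat(patch[None, :, :], 3, axis=0).astype(np.float32)
    return torch.from_numpy(img)


def make_batch(reference_deg, offset_deg, batch_size=16):
    """Binary task: is the Gabor tilted clockwise (+offset) or counterclockwise (-offset) of the reference?"""
    labels = torch.randint(0, 2, (batch_size,))                      # 0 = CCW, 1 = CW
    thetas = reference_deg + offset_deg * (2.0 * labels.float() - 1.0)
    imgs = torch.stack([gabor(float(t)) for t in thetas])
    # (ImageNet normalization omitted for brevity; inputs are raw Gabor contrast values.)
    return imgs, labels


# ImageNet-pretrained backbone with a fresh 2-way readout (hypothetical choice of VGG-16).
net = models.vgg16(weights="IMAGENET1K_V1")
net.classifier[-1] = nn.Linear(net.classifier[-1].in_features, 2)

# Snapshot weights before training so per-layer plasticity can be measured afterwards.
before = {n: p.detach().clone() for n, p in net.named_parameters() if p.ndim > 1}

opt = torch.optim.SGD(net.parameters(), lr=1e-4, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

net.train()
for step in range(200):                                              # short illustrative run
    imgs, labels = make_batch(reference_deg=35.0, offset_deg=1.0)    # "precise" task: +/-1 degree
    loss = loss_fn(net(imgs), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Distribution of plasticity": relative weight change per layer, from input to readout.
for name, param in net.named_parameters():
    if name in before:
        rel = (param.detach() - before[name]).norm() / before[name].norm()
        print(f"{name:35s} relative weight change = {rel.item():.4f}")
```

Repeating the run with a larger offset (a coarser task) and comparing the resulting per-layer changes would mimic, at this sketch level, the task-precision manipulation described in the abstract.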

Similar Articles

1. Deep Neural Networks for Modeling Visual Perceptual Learning. J Neurosci. 2018 Jul 4;38(27):6028-6044. doi: 10.1523/JNEUROSCI.1620-17.2018. Epub 2018 May 23.
2. Feature reliability determines specificity and transfer of perceptual learning in orientation search. PLoS Comput Biol. 2017 Dec 14;13(12):e1005882. doi: 10.1371/journal.pcbi.1005882. eCollection 2017 Dec.
3. Real-Time Strategy Video Game Experience and Visual Perceptual Learning. J Neurosci. 2015 Jul 22;35(29):10485-92. doi: 10.1523/JNEUROSCI.3340-14.2015.
4. Neuroimaging Evidence for 2 Types of Plasticity in Association with Visual Perceptual Learning. Cereb Cortex. 2016 Sep;26(9):3681-9. doi: 10.1093/cercor/bhw176. Epub 2016 Jun 13.
5. Perceptual learning: toward a comprehensive theory. Annu Rev Psychol. 2015 Jan 3;66:197-221. doi: 10.1146/annurev-psych-010814-015214. Epub 2014 Sep 10.
6. Perceptual Learning at a Conceptual Level. J Neurosci. 2016 Feb 17;36(7):2238-46. doi: 10.1523/JNEUROSCI.2732-15.2016.
8. Decoding reveals plasticity in V3A as a result of motion perceptual learning. PLoS One. 2012;7(8):e44003. doi: 10.1371/journal.pone.0044003. Epub 2012 Aug 28.
10. Two-stage model in perceptual learning: toward a unified theory. Ann N Y Acad Sci. 2014 May;1316:18-28. doi: 10.1111/nyas.12419. Epub 2014 Apr 23.

Cited By

1. Perceptual learning improves discrimination but does not reduce distortions in appearance. PLoS Comput Biol. 2025 Apr 15;21(4):e1012980. doi: 10.1371/journal.pcbi.1012980. eCollection 2025 Apr.
2. A neural geometry approach comprehensively explains apparently conflicting models of visual perceptual learning. Nat Hum Behav. 2025 May;9(5):1023-1040. doi: 10.1038/s41562-025-02149-x. Epub 2025 Mar 31.
3. Dissociable components of visual perceptual learning characterized by non-invasive brain stimulation: Stage 1 Registered Report. Brain Commun. 2025 Jan 2;7(1):fcae468. doi: 10.1093/braincomms/fcae468. eCollection 2025.
4. A computational deep learning investigation of animacy perception in the human brain. Commun Biol. 2024 Dec 31;7(1):1718. doi: 10.1038/s42003-024-07415-8.
5. A hierarchical reinforcement learning model explains individual differences in attentional set shifting. Cogn Affect Behav Neurosci. 2024 Dec;24(6):1008-1022. doi: 10.3758/s13415-024-01223-7. Epub 2024 Sep 23.
6. Hearing temperatures: employing machine learning for elucidating the cross-modal perception of thermal properties through audition. Front Psychol. 2024 Aug 2;15:1353490. doi: 10.3389/fpsyg.2024.1353490. eCollection 2024.
8. Profiles of visual perceptual learning in feature space. iScience. 2024 Feb 6;27(3):109128. doi: 10.1016/j.isci.2024.109128. eCollection 2024 Mar 15.
9. Asymmetric stimulus representations bias visual perceptual learning. J Vis. 2024 Jan 2;24(1):10. doi: 10.1167/jov.24.1.10.
10. Subtle adversarial image manipulations influence both human and machine perception. Nat Commun. 2023 Aug 15;14(1):4933. doi: 10.1038/s41467-023-40499-0.

References

1. Towards a whole brain model of Perceptual Learning. Curr Opin Behav Sci. 2018 Apr;20:47-55. doi: 10.1016/j.cobeha.2017.10.004. Epub 2017 Dec 13.
2. Visual Perceptual Learning and Models. Annu Rev Vis Sci. 2017 Sep 15;3:343-363. doi: 10.1146/annurev-vision-102016-061249. Epub 2017 Jul 19.
3. Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing. Annu Rev Vis Sci. 2015 Nov 24;1:417-446. doi: 10.1146/annurev-vision-082114-035447.
4. Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation. Front Comput Neurosci. 2017 May 4;11:24. doi: 10.3389/fncom.2017.00024. eCollection 2017.
5. Perceptual Learning of Contrast Detection in the Human Lateral Geniculate Nucleus. Curr Biol. 2016 Dec 5;26(23):3176-3182. doi: 10.1016/j.cub.2016.09.034. Epub 2016 Nov 10.
6. Random synaptic feedback weights support error backpropagation for deep learning. Nat Commun. 2016 Nov 8;7:13276. doi: 10.1038/ncomms13276.
7. Seeing it all: Convolutional network layers map the function of the human visual system. Neuroimage. 2017 May 15;152:184-194. doi: 10.1016/j.neuroimage.2016.10.001. Epub 2016 Oct 21.
10. Using goal-driven deep learning models to understand sensory cortex. Nat Neurosci. 2016 Mar;19(3):356-65. doi: 10.1038/nn.4244.
