Expectation Learning for Stimulus Prediction Across Modalities Improves Unisensory Classification.

Authors

Barros Pablo, Eppe Manfred, Parisi German I, Liu Xun, Wermter Stefan

Affiliations

Knowledge Technology, Department of Informatics, University of Hamburg, Hamburg, Germany.

Department of Psychology, University of CAS, Beijing, China.

Publication

Front Robot AI. 2019 Dec 11;6:137. doi: 10.3389/frobt.2019.00137. eCollection 2019.

DOI: 10.3389/frobt.2019.00137
PMID: 33501152
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7806099/
Abstract

Expectation learning is an unsupervised learning process which uses multisensory bindings to enhance unisensory perception. For instance, as humans, we learn to associate a barking sound with the visual appearance of a dog, and we continuously fine-tune this association over time as we learn, e.g., to associate high-pitched barking with small dogs. In this work, we address the problem of developing a computational model that captures important properties of expectation learning, in particular the lack of explicit external supervision other than temporal co-occurrence. To this end, we present a novel hybrid neural model based on audio-visual autoencoders and a recurrent self-organizing network for multisensory bindings that facilitate stimulus reconstructions across different sensory modalities. We refer to this mechanism as stimulus prediction across modalities and demonstrate that the proposed model is capable of learning concept bindings by evaluating it on unisensory classification tasks for audio-visual stimuli, using the 43,500 YouTube videos from the animal subset of the AudioSet corpus.
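The abstract's core mechanism — predicting the stimulus in one modality from a co-occurring stimulus in the other, with temporal co-occurrence as the only supervision — can be illustrated with a deliberately simplified sketch. The snippet below is not the paper's implementation (which uses audio-visual autoencoders and a recurrent self-organizing network); it substitutes synthetic feature vectors and a linear least-squares map, and all names and dimensions are hypothetical, purely to show the cross-modal prediction idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for co-occurring audio/visual feature pairs: a shared 4-D
# latent "concept" generates both modalities, so pairing (co-occurrence)
# is the only link between them -- no class labels are used.
n, d_latent, d_audio, d_vis = 200, 4, 16, 24
Z = rng.normal(size=(n, d_latent))
A = Z @ rng.normal(size=(d_latent, d_audio)) + 0.05 * rng.normal(size=(n, d_audio))
V = Z @ rng.normal(size=(d_latent, d_vis)) + 0.05 * rng.normal(size=(n, d_vis))

# Cross-modal "expectation": predict the visual stimulus from the audio one.
# Least squares stands in for the paper's autoencoder + self-organizing binding.
W, *_ = np.linalg.lstsq(A, V, rcond=None)
V_pred = A @ W

# If the binding captured the shared concept, reconstruction error should be
# far below that of a mean-only baseline that ignores the audio stimulus.
err_model = float(np.mean((V - V_pred) ** 2))
err_base = float(np.mean((V - V.mean(axis=0)) ** 2))
print(f"cross-modal MSE: {err_model:.4f}  baseline MSE: {err_base:.4f}")
```

In the paper, an analogous reconstruction across modalities is what lets the learned bindings sharpen each unisensory classifier without external labels.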

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7a30/7806099/56f416f18fd6/frobt-06-00137-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7a30/7806099/ea3220696376/frobt-06-00137-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7a30/7806099/00da4d41a07a/frobt-06-00137-g0002.jpg

Similar articles

1
Expectation Learning for Stimulus Prediction Across Modalities Improves Unisensory Classification.
Front Robot AI. 2019 Dec 11;6:137. doi: 10.3389/frobt.2019.00137. eCollection 2019.
2
The role of multisensory interplay in enabling temporal expectations.
Cognition. 2018 Jan;170:130-146. doi: 10.1016/j.cognition.2017.09.015. Epub 2017 Oct 9.
3
A rational analysis of the acquisition of multisensory representations.
Cogn Sci. 2012 Mar;36(2):305-32. doi: 10.1111/j.1551-6709.2011.01216.x. Epub 2011 Dec 5.
4
Simultaneous and independent acquisition of multisensory and unisensory associations.
Perception. 2007;36(10):1445-53. doi: 10.1068/p5843.
5
Multisensory Integration Uses a Real-Time Unisensory-Multisensory Transform.
J Neurosci. 2017 May 17;37(20):5183-5194. doi: 10.1523/JNEUROSCI.2767-16.2017. Epub 2017 Apr 27.
6
Benefits of multisensory learning.
Trends Cogn Sci. 2008 Nov;12(11):411-7. doi: 10.1016/j.tics.2008.07.006.
7
From Near-Optimal Bayesian Integration to Neuromorphic Hardware: A Neural Network Model of Multisensory Integration.
Front Neurorobot. 2020 May 15;14:29. doi: 10.3389/fnbot.2020.00029. eCollection 2020.
8
Can multisensory training aid visual learning? A computational investigation.
J Vis. 2019 Sep 3;19(11):1. doi: 10.1167/19.11.1.
9
Multisensory perceptual learning is dependent upon task difficulty.
Exp Brain Res. 2016 Nov;234(11):3269-3277. doi: 10.1007/s00221-016-4724-3. Epub 2016 Jul 11.
10
Enhanced multisensory integration and motor reactivation after active motor learning of audiovisual associations.
J Cogn Neurosci. 2011 Nov;23(11):3515-28. doi: 10.1162/jocn_a_00015. Epub 2011 Mar 31.

Cited by

1
Olfactory-colour crossmodal correspondences in art, science, and design.
Cogn Res Princ Implic. 2020 Oct 28;5(1):52. doi: 10.1186/s41235-020-00246-1.

References

1
Continual lifelong learning with neural networks: A review.
Neural Netw. 2019 May;113:54-71. doi: 10.1016/j.neunet.2019.01.012. Epub 2019 Feb 6.
2
Born to learn: The inspiration, progress, and future of evolved plastic artificial neural networks.
Neural Netw. 2018 Dec;108:48-67. doi: 10.1016/j.neunet.2018.07.013. Epub 2018 Aug 7.
3
The role of feedback contingency in perceptual category learning.
J Exp Psychol Learn Mem Cogn. 2016 Nov;42(11):1731-1746. doi: 10.1037/xlm0000277. Epub 2016 May 5.
4
The Neurobiology Shaping Affective Touch: Expectation, Motivation, and Meaning in the Multisensory Context.
Front Psychol. 2016 Jan 6;6:1986. doi: 10.3389/fpsyg.2015.01986. eCollection 2015.
5
Multisensory causal inference in the brain.
PLoS Biol. 2015 Feb 24;13(2):e1002075. doi: 10.1371/journal.pbio.1002075. eCollection 2015 Feb.
6
The co-occurrence of multisensory facilitation and cross-modal conflict in the human brain.
J Neurophysiol. 2011 Dec;106(6):2896-909. doi: 10.1152/jn.00303.2011. Epub 2011 Aug 31.
7
The multifaceted interplay between attention and multisensory integration.
Trends Cogn Sci. 2010 Sep;14(9):400-10. doi: 10.1016/j.tics.2010.06.008. Epub 2010 Aug 2.
8
A model of the neural mechanisms underlying multisensory integration in the superior colliculus.
Perception. 2007;36(10):1431-43. doi: 10.1068/p5842.
9
Conscious access to the unisensory components of a cross-modal illusion.
Neuroreport. 2007 Mar 5;18(4):347-50. doi: 10.1097/WNR.0b013e32801776f9.
10
Early cross-modal interactions in auditory and visual cortex underlie a sound-induced visual illusion.
J Neurosci. 2007 Apr 11;27(15):4120-31. doi: 10.1523/JNEUROSCI.4912-06.2007.