

A model of audio-visual motion integration during active self-movement.

Authors

Gallagher Maria, Haynes Joshua D, Culling John F, Freeman Tom C A

Affiliations

School of Psychology, University of Kent, Canterbury, UK.

School of Psychology, Cardiff University, Cardiff, UK.

Publication

J Vis. 2025 Feb 3;25(2):8. doi: 10.1167/jov.25.2.8.

DOI: 10.1167/jov.25.2.8
PMID: 39969485
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11841688/
Abstract

Despite good evidence for optimal audio-visual integration in stationary observers, few studies have considered the impact of self-movement on this process. When the head and/or eyes move, the integration of vision and hearing is complicated, as the sensory measurements begin in different coordinate frames. To successfully integrate these signals, they must first be transformed into the same coordinate frame. We propose that audio and visual motion cues are separately transformed using self-movement signals, before being integrated as body-centered cues to audio-visual motion. We tested this hypothesis using a psychophysical audio-visual integration task in which participants made left/right judgments of audio, visual, or audio-visual targets during self-generated yaw head rotations. Estimates of precision and bias from the audio and visual conditions were used to predict performance in the audio-visual conditions. We found that audio-visual performance was predicted well by models that suggested the transformation of cues into common coordinates but could not be explained by a model that did not rely on coordinate transformation before integration. We also found that precision specifically was better predicted by a model that accounted for shared noise arising from signals encoding head movement. Taken together, our findings suggest that motion perception in active observers is based on the integration of partially correlated body-centered signals.
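The prediction step described in the abstract — using precision and bias from the unimodal audio and visual conditions to predict audio-visual performance, with a shared-noise variant fitting best — follows the standard maximum-likelihood cue-integration scheme (see the "Statistically Optimal Multisensory Cue Integration" tutorial in the reference list). Below is a minimal sketch using the textbook formulas for optimally fusing two unbiased cues whose noise may be correlated; the function name, parameter values, and the interpretation of the correlation as head-movement noise are illustrative assumptions, not details taken from the paper:

```python
import math

def integrate_cues(mu_a, sigma_a, mu_v, sigma_v, rho=0.0):
    """Optimal linear fusion of two unbiased cues with noise correlation rho.

    Returns the fused estimate and its standard deviation. With rho = 0 this
    reduces to classic maximum-likelihood (inverse-variance-weighted) fusion.
    """
    var_a, var_v = sigma_a ** 2, sigma_v ** 2
    cov = rho * sigma_a * sigma_v          # shared noise (e.g., from signals encoding head movement)
    denom = var_a + var_v - 2.0 * cov
    w_a = (var_v - cov) / denom            # weight on the auditory cue
    w_v = (var_a - cov) / denom            # weight on the visual cue
    fused_mu = w_a * mu_a + w_v * mu_v
    fused_var = var_a * var_v * (1.0 - rho ** 2) / denom
    return fused_mu, math.sqrt(fused_var)

# Independent cues: fused variance 4*1/(4+1) = 0.8, better than either cue alone.
print(integrate_cues(0.0, 2.0, 1.0, 1.0, rho=0.0))

# Partially shared noise (rho > 0) weakens the benefit of combining the cues,
# which is the signature the shared-noise model in the abstract captures.
print(integrate_cues(0.0, 2.0, 1.0, 1.0, rho=0.5))
```

With rho = 0 the fused weights are the usual reliability weights (w_a proportional to 1/var_a), so the sketch nests the uncorrelated model as a special case, mirroring the model comparison the authors report.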


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/9fc66134fb20/jovi-25-2-8-f001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/302afbba97ff/jovi-25-2-8-f002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/aae0453969ca/jovi-25-2-8-f003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/5d6b01243eff/jovi-25-2-8-f004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/2f67e5988e0c/jovi-25-2-8-f005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/461a68b8760b/jovi-25-2-8-f006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/033c868e0ad2/jovi-25-2-8-f007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/46c287eaaacf/jovi-25-2-8-f008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/583d55178512/jovi-25-2-8-f009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/01a38839458d/jovi-25-2-8-f010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/685c/11841688/94ff53baffdd/jovi-25-2-8-f011.jpg

Similar articles

1. A model of audio-visual motion integration during active self-movement.
J Vis. 2025 Feb 3;25(2):8. doi: 10.1167/jov.25.2.8.
2. The precision of signals encoding active self-movement.
J Neurophysiol. 2024 Aug 1;132(2):389-402. doi: 10.1152/jn.00370.2023. Epub 2024 Jun 12.
3. The effects of stereo disparity on the behavioural and electrophysiological correlates of perception of audio-visual motion in depth.
Neuropsychologia. 2015 Nov;78:51-62. doi: 10.1016/j.neuropsychologia.2015.09.023. Epub 2015 Sep 18.
4. Auditory motion affects visual biological motion processing.
Neuropsychologia. 2007 Feb 1;45(3):523-30. doi: 10.1016/j.neuropsychologia.2005.12.012. Epub 2006 Feb 28.
5. Cross-modal integration of auditory and visual motion signals.
Neuroreport. 2001 Aug 8;12(11):2557-60. doi: 10.1097/00001756-200108080-00053.
6. Change of temporal-order judgment of sounds during long-lasting exposure to large-field visual motion.
Perception. 2008;37(11):1649-66. doi: 10.1068/p5692.
7. Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex.
Front Neural Circuits. 2018 Oct 26;12:93. doi: 10.3389/fncir.2018.00093. eCollection 2018.
8. Selective integration of auditory-visual looming cues by humans.
Neuropsychologia. 2009 Mar;47(4):1045-52. doi: 10.1016/j.neuropsychologia.2008.11.003. Epub 2008 Nov 12.
9. Heading Tuning in Macaque Area V6.
J Neurosci. 2015 Dec 16;35(50):16303-14. doi: 10.1523/JNEUROSCI.2903-15.2015.
10. The influence of visual cues on temporal anticipation and movement synchronization with musical sequences.
Acta Psychol (Amst). 2018 Nov;191:190-200. doi: 10.1016/j.actpsy.2018.09.014. Epub 2018 Oct 24.

References cited in this article

1. The precision of signals encoding active self-movement.
J Neurophysiol. 2024 Aug 1;132(2):389-402. doi: 10.1152/jn.00370.2023. Epub 2024 Jun 12.
2. Motor Signals Mediate Stationarity Perception.
Multisens Res. 2023 Oct 13;36(7):703-724. doi: 10.1163/22134808-bja10111.
3. Vestibular processing during natural self-motion: implications for perception and action.
Nat Rev Neurosci. 2019 Jun;20(6):346-363. doi: 10.1038/s41583-019-0153-1.
4. Applying the Model-Comparison Approach to Test Specific Research Hypotheses in Psychophysical Research Using the Palamedes Toolbox.
Front Psychol. 2018 Jul 23;9:1250. doi: 10.3389/fpsyg.2018.01250. eCollection 2018.
5. A preference for visual speed during smooth pursuit eye movement.
J Exp Psychol Hum Percept Perform. 2018 Oct;44(10):1629-1636. doi: 10.1037/xhp0000551. Epub 2018 Jul 5.
6. Statistically Optimal Multisensory Cue Integration: A Practical Tutorial.
Multisens Res. 2016;29(4-5):279-317. doi: 10.1163/22134808-00002510.
7. Auditory compensation for head rotation is incomplete.
J Exp Psychol Hum Percept Perform. 2017 Feb;43(2):371-380. doi: 10.1037/xhp0000321. Epub 2016 Nov 14.
8. Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals.
J Neurophysiol. 2016 Aug 1;116(2):765-75. doi: 10.1152/jn.00052.2016. Epub 2016 May 11.
9. Distortion of auditory space during visually induced self-motion in depth.
Front Psychol. 2014 Aug 5;5:848. doi: 10.3389/fpsyg.2014.00848. eCollection 2014.
10. Discrimination contours for moving sounds reveal duration and distance cues dominate auditory speed perception.
PLoS One. 2014 Jul 30;9(7):e102864. doi: 10.1371/journal.pone.0102864. eCollection 2014.