
Head rotations follow those of a truncated Fick gimbal during an auditory-guided visual search task.

Authors

McLachlan Glen, Lladó Pedro, Peremans Herbert

Affiliations

Active Perception Lab, Department of Engineering Management, University of Antwerp, Belgium.

Acoustics Lab, Department of Information and Communication Engineering, Aalto University, Espoo, Finland.

Publication Information

J Neurophysiol. 2024 Dec 1;132(6):1857-1866. doi: 10.1152/jn.00298.2024. Epub 2024 Oct 30.

Abstract

Recent interest in dynamic sound localization models has created a need to better understand the head movements made by humans. Previous studies have shown that static head positions and small oscillations of the head obey Donders' law: for each facing direction there is one unique three-dimensional orientation. It is unclear whether this same constraint applies to audiovisual localization, where head movement is unrestricted and subjects may rotate their heads depending on the available auditory information. In an auditory-guided visual search task, human subjects were instructed to localize an audiovisual target within a field of visual distractors in the frontal hemisphere. During this task, head and torso movements were monitored with a motion capture system. Head rotations were found to follow Donders' law during search tasks. Individual differences were present in the amount of roll that subjects deployed, though there was no statistically significant improvement in model performance when including these individual differences in a gimbal model. The roll component of head rotation could therefore be predicted with a truncated Fick gimbal, which consists of a pitch axis nested within a yaw axis. This led to a reduction from three to two degrees of freedom when modeling head movement during localization tasks. Understanding how humans utilize head movements during sound localization is crucial for the advancement of auditory perception models and improvement of practical applications like hearing aids and virtual reality systems. By analyzing head motion data from an auditory-guided visual search task, we concluded that findings from earlier studies on head movement can be generalized to audiovisual localization and, from this, proposed a simple model for head rotation that reduced the number of degrees of freedom.
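The truncated Fick gimbal described in the abstract can be sketched in a few lines: a pitch rotation nested inside a yaw rotation, with the roll degree of freedom fixed at zero, so that each facing direction corresponds to exactly one three-dimensional orientation (Donders' law). The axis conventions below (x forward, y interaural, z vertical) and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def yaw(theta):
    """Rotation about the vertical (z) axis -- the outer gimbal axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pitch(phi):
    """Rotation about the interaural (y) axis -- nested inside the yaw axis."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[ c,  0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s,  0.0, c]])

def truncated_fick(theta, phi):
    """Head orientation with roll truncated to zero: R = R_yaw @ R_pitch.

    Each facing direction (theta, phi) maps to exactly one 3-D
    orientation, consistent with Donders' law; the model has two
    degrees of freedom instead of three.
    """
    return yaw(theta) @ pitch(phi)

# Example: facing direction for 20 deg leftward yaw, 10 deg downward pitch.
R = truncated_fick(np.deg2rad(20), np.deg2rad(10))
facing = R @ np.array([1.0, 0.0, 0.0])  # x is assumed to point forward
```

A full Fick gimbal would append a third (roll) rotation about the facing axis; truncating it is what reduces the model from three to two degrees of freedom.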


Similar Articles

1. Head rotations follow those of a truncated Fick gimbal during an auditory-guided visual search task. J Neurophysiol. 2024 Dec 1;132(6):1857-1866. doi: 10.1152/jn.00298.2024. Epub 2024 Oct 30.
2. Rotation axes of the head during positioning, head shaking, and locomotion. J Neurophysiol. 2007 Nov;98(5):3095-108. doi: 10.1152/jn.00764.2007. Epub 2007 Sep 26.
3. Task-dependent constraints in motor control: pinhole goggles make the head move like an eye. J Neurosci. 2000 Apr 1;20(7):2719-30. doi: 10.1523/JNEUROSCI.20-07-02719.2000.
4. Three-dimensional head and upper arm orientations during kinematically redundant movements and at rest. Exp Brain Res. 2002 Jan;142(2):181-92. doi: 10.1007/s00221-001-0897-4. Epub 2001 Nov 30.
5. The precision of signals encoding active self-movement. J Neurophysiol. 2024 Aug 1;132(2):389-402. doi: 10.1152/jn.00370.2023. Epub 2024 Jun 12.
6. Violations of Listing's law after large eye and head gaze shifts. J Neurophysiol. 1992 Jul;68(1):309-18. doi: 10.1152/jn.1992.68.1.309.
7. Kinematic strategies for upper arm-forearm coordination in three dimensions. J Neurophysiol. 2000 Nov;84(5):2302-16. doi: 10.1152/jn.2000.84.5.2302.
8. Modeling the Impact of Head-Body Rotations on Audio-Visual Spatial Perception for Virtual Reality Applications. IEEE Trans Vis Comput Graph. 2024 May;30(5):2624-2632. doi: 10.1109/TVCG.2024.3372112. Epub 2024 Apr 23.

