Kaya Emine Merve, Huang Nicolas, Elhilali Mounya
Laboratory for Computational Audio Perception, Department of Electrical and Computer Engineering Johns Hopkins University, Baltimore, MD, USA.
Neuroscience. 2020 Aug 1;440:1-14. doi: 10.1016/j.neuroscience.2020.05.018. Epub 2020 May 21.
As we listen to everyday sounds, auditory perception is heavily shaped by interactions between acoustic attributes such as pitch, timbre and intensity; yet it is not clear how such interactions affect judgments of acoustic salience in dynamic soundscapes. Salience perception is believed to rely on an internal brain model that tracks the evolution of acoustic characteristics of a scene and flags events that do not fit this model as salient. The current study explores how the interdependency between attributes of dynamic scenes affects the neural representation of this internal model and shapes the encoding of salient events. Specifically, the study examines how deviations along combinations of acoustic attributes interact to modulate brain responses, and subsequently guide perception of certain sound events as salient given their context. Human volunteers focus their attention on a visual task and ignore acoustic melodies playing in the background while their brain activity is recorded using electroencephalography. Ambient sounds consist of musical melodies with probabilistically varying acoustic attributes. Salient notes embedded in these scenes deviate from the melody's statistical distribution along pitch, timbre and/or intensity. Recordings of brain responses to salient notes reveal that neural power in response to the melodic rhythm, as well as cross-trial phase alignment in the theta band, is modulated by the degree of salience of the notes, estimated across all acoustic attributes given their probabilistic context. These nonlinear neural effects across attributes strongly parallel the nonlinear behavioral interactions observed in perceptual judgments of auditory salience using similar dynamic melodies, suggesting a neural underpinning of the nonlinear interactions that underlie salience perception.
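The cross-trial phase alignment measure referenced above is commonly quantified as inter-trial phase coherence (ITPC): the magnitude of the mean unit phase vector across trials, which approaches 1 when responses are phase-locked to the stimulus and 0 when phases are random. The sketch below is not the authors' analysis pipeline; it is a minimal illustration of theta-band ITPC on synthetic single-channel data, with an assumed sampling rate and a standard 4-8 Hz theta band.

```python
# Minimal sketch (not the authors' pipeline) of cross-trial phase alignment:
# inter-trial phase coherence (ITPC) in the theta band, on synthetic data.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 250            # sampling rate in Hz (assumed for illustration)
THETA = (4.0, 8.0)  # conventional theta band in Hz

def itpc_theta(trials, fs=FS, band=THETA):
    """trials: (n_trials, n_samples) array of single-channel EEG.
    Returns ITPC per time point, in [0, 1]."""
    # 4th-order Butterworth bandpass, applied forward-backward (zero phase)
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)
    phase = np.angle(hilbert(filtered, axis=1))  # instantaneous phase
    # ITPC: magnitude of the mean unit phase vector across trials
    return np.abs(np.mean(np.exp(1j * phase), axis=0))

# Synthetic demo: trials phase-locked to a 6 Hz rhythm vs. pure noise
rng = np.random.default_rng(0)
t = np.arange(0, 1.0, 1 / FS)
locked = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal((50, t.size))
random = rng.standard_normal((50, t.size))

print(itpc_theta(locked).mean() > itpc_theta(random).mean())  # True
```

Phase-locked trials yield a high mean ITPC, while unstructured noise stays near the chance level set by the trial count, which is why stimulus-locked modulation of this quantity indexes how reliably the brain tracks the melodic rhythm.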