Niels J. Verosky, Emily Morgan
Department of Psychology, Yale University, 100 College St., New Haven, CT 06510, United States.
Department of Linguistics, University of California, Davis, United States.
Cognition. 2025 Oct;263:106179. doi: 10.1016/j.cognition.2025.106179. Epub 2025 May 24.
Musical stimuli present listeners with complex temporal information and rich periodic structure. Periodic patterns in music typically involve multiple hierarchical levels: a basic-level repeating pulse known as the "beat," and a higher-order grouping of beats into the "meter." Previous work has found that a musical stimulus's meter is predicted by recurring temporal patterns of note event onsets, measured by profiles of autocorrelation over time lags. Traditionally, that work has emphasized periodic structure in the timing of event onsets (i.e., repeating rhythms). Here, we suggest that musical meter is in fact a more general perceptual phenomenon, instantiating complex profiles of temporal dependencies across both event onsets and multiple feature dimensions in the actual content of events. We use classification techniques to test whether profiles of temporal dependencies in event onsets and in multiple types of event content predict musical meter. Applying random forest models to three musical corpora, we reproduce findings that profiles of temporal dependencies in note event onsets contain information about meter, but we find that profiles of temporal dependencies in pitch height, interval size, and tonal expectancy also contain such information, with high redundancy among temporal dependencies in event onsets and event content as predictors of meter. Moreover, information about meter is distributed across temporal dependencies at multiple time lags, as indicated by the baseline performance of an unsupervised classifier that selects the single time lag with maximum autocorrelation. Redundant profiles of temporal dependencies across multiple stimulus features may provide strong constraints on musical structure that inform listeners' predictive processes.
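The autocorrelation profiles and the unsupervised baseline described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it computes a simple (biased) autocorrelation profile over time lags for a binary note-onset sequence, then applies the baseline strategy of picking the single lag with maximum autocorrelation. The function name, the toy onset pattern, and the choice of a biased estimator are all illustrative assumptions.

```python
import numpy as np

def autocorrelation_profile(onsets, max_lag):
    """Biased autocorrelation of a (binary) onset sequence at lags 1..max_lag.

    Each entry is the mean-centered dot product of the sequence with a
    lagged copy of itself, normalized by the zero-lag value, so a lag
    matching the sequence's period yields a value near 1.
    """
    x = np.asarray(onsets, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    if denom == 0.0:
        return np.zeros(max_lag)
    return np.array([np.dot(x[:-lag], x[lag:]) / denom
                     for lag in range(1, max_lag + 1)])

# Toy stimulus: an onset every third grid position (a triple grouping).
onsets = np.tile([1, 0, 0], 8)
profile = autocorrelation_profile(onsets, max_lag=6)

# Unsupervised baseline: choose the single lag with maximum autocorrelation.
best_lag = int(np.argmax(profile)) + 1  # lags are 1-indexed
```

Here `best_lag` recovers the period of 3 grid positions. A supervised classifier such as a random forest instead takes the whole profile (over onsets and, in the paper's framing, over event-content features like pitch height) as its input vector, which is how information distributed across multiple lags can be exploited.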