Mueller Calla, Durston Amie J, Itier Roxane J
University of Waterloo, Department of Psychology, 200 University Ave West, Waterloo, Ontario N2L 3G1, Canada.
Brain Res. 2025 Mar 15;1851:149481. doi: 10.1016/j.brainres.2025.149481. Epub 2025 Jan 29.
Neural decoding of others' facial expressions is critical in social interactions and has been investigated using scalp event-related potentials (ERPs). However, the impact of task and emotional context congruency on this neural decoding is unclear. Previous ERP studies employed classic statistical analyses that focused only on specific electrodes and time points, a practice that inflates type I and type II errors. The present study re-analyzed data from Aguado et al. (2019) using robust, data-driven Mass Univariate Statistics across every time point and electrode, and rejected trials with early reaction times to rule out motor-related contamination of the neural recordings. Participants viewed neutral faces paired with negative or positive situational sentences (e.g., "She catches her partner cheating on her with her best friend"), followed by the same individuals' faces expressing happiness or anger, such that the facial expressions were congruent or incongruent with the situation. Participants performed two tasks: an emotion discrimination task and a situation-expression congruency discrimination task. We found significant effects of expression that were largest during the N170-P2 interval, and effects of congruency and task around an LPP-like component. However, the effect of congruency was significant only in the congruency task, suggesting a limited and task-dependent influence of semantic context. Importantly, emotion did not interact with any factor neurally, suggesting facial expressions were decoded automatically within the first 400 ms of visual processing, regardless of context congruency or task demands. The results and their discrepancies with the original findings are discussed in the context of ERP statistics and the replication crisis.
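For illustration only, the sketch below shows the general logic of a data-driven mass univariate ERP analysis across every electrode and time point, using a cluster-based permutation test in MNE-Python on simulated group data. The abstract does not state the authors' analysis software, so the toolbox, montage, subject count, and data here are assumptions, not the study's actual pipeline.

    # Minimal sketch of a mass univariate ERP comparison (happy vs. angry faces)
    # over all electrodes and time points, with cluster-based permutation
    # correction. Simulated data and the MNE-Python toolbox are illustrative
    # assumptions, not the authors' method.
    import numpy as np
    import mne
    from mne.stats import spatio_temporal_cluster_test

    rng = np.random.default_rng(0)
    n_subjects, n_times = 24, 300  # hypothetical subject count and time samples

    # Standard 64-channel montage defines which electrodes count as neighbours.
    montage = mne.channels.make_standard_montage("biosemi64")
    info = mne.create_info(montage.ch_names, sfreq=500.0, ch_types="eeg")
    info.set_montage(montage)
    adjacency, _ = mne.channels.find_ch_adjacency(info, ch_type="eeg")

    # One subject-average per condition: shape (n_subjects, n_times, n_channels).
    happy = rng.normal(size=(n_subjects, n_times, len(montage.ch_names)))
    angry = rng.normal(size=(n_subjects, n_times, len(montage.ch_names)))

    # Permutation F-test over the full spatio-temporal data; cluster-level
    # inference controls the family-wise error rate without preselecting
    # electrodes or time windows.
    F_obs, clusters, cluster_pv, _ = spatio_temporal_cluster_test(
        [happy, angry], adjacency=adjacency, n_permutations=1000, tail=1, seed=0
    )
    print(f"{int(np.sum(cluster_pv < 0.05))} significant spatio-temporal cluster(s)")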