School of Psychology, University of East Anglia, Norwich, UK.
School of Psychological Sciences, Birkbeck College, University of London, London, UK.
Neuroimage. 2019 Jul 15;195:261-271. doi: 10.1016/j.neuroimage.2019.03.065. Epub 2019 Mar 30.
Faces transmit a wealth of important social signals. While previous studies have elucidated the network of cortical regions important for the perception of facial expression, and the associated temporal components such as the P100, N170 and EPN, it is still unclear how task constraints may shape the representation of facial expression (or other face categories) in these networks. In the present experiment, we used Multivariate Pattern Analysis (MVPA) of EEG to investigate the neural information available across time about two important face categories (expression and identity) when those categories are perceived under either explicit (e.g. decoding facial expression category from the EEG when the task is on expression) or incidental task contexts (e.g. decoding facial expression category from the EEG when the task is on identity). Decoding of both face categories, across both task contexts, peaked in time windows spanning 91-170 ms (across posterior electrodes). Peak decoding of expression, however, was not affected by task context, whereas peak decoding of identity was significantly reduced under incidental processing conditions. In addition, errors in EEG decoding correlated with errors in behavioral categorization under explicit processing for both expression and identity; under incidental conditions, however, only errors in EEG decoding of expression correlated with behavior. Furthermore, decoding time courses and the spatial pattern of informative electrodes showed consistently better decoding of identity under explicit conditions at later time periods, with weak evidence for similar effects for decoding of expression at isolated time windows.
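The time-resolved decoding approach described above, in which a classifier is trained at each time point with electrode values as features, can be sketched on synthetic data. This is an illustrative reconstruction only, not the authors' pipeline: the nearest-centroid classifier, the leave-one-out scheme, and all dimensions (trial counts, channel counts, the "signal" window) are assumptions made for the sketch.

```python
import random

def nearest_centroid_accuracy(trials, labels):
    """Leave-one-out nearest-centroid decoding accuracy.
    trials: list of feature vectors (one per trial); labels: class per trial."""
    correct = 0
    for i in range(len(trials)):
        # Build per-class centroids from all trials except the held-out one
        centroids, counts = {}, {}
        for j, (x, y) in enumerate(zip(trials, labels)):
            if j == i:
                continue
            if y not in centroids:
                centroids[y] = [0.0] * len(x)
                counts[y] = 0
            centroids[y] = [c + v for c, v in zip(centroids[y], x)]
            counts[y] += 1
        for y in centroids:
            centroids[y] = [c / counts[y] for c in centroids[y]]
        # Classify the held-out trial by squared Euclidean distance
        def dist(a, b):
            return sum((p - q) ** 2 for p, q in zip(a, b))
        pred = min(centroids, key=lambda y: dist(trials[i], centroids[y]))
        if pred == labels[i]:
            correct += 1
    return correct / len(trials)

# Synthetic two-class "EEG": trials x time points x channels, with a class
# difference injected only in a hypothetical mid-latency window.
random.seed(0)
n_trials, n_times, n_channels = 40, 20, 8
signal_window = range(8, 14)  # stand-in for an early informative window
data, labels = [], []
for t in range(n_trials):
    label = t % 2
    trial = [[random.gauss(0, 1) + (1.5 * label if ti in signal_window else 0)
              for _ in range(n_channels)] for ti in range(n_times)]
    data.append(trial)
    labels.append(label)

# Decode separately at each time point (channels are the features)
accuracy = [nearest_centroid_accuracy([trial[ti] for trial in data], labels)
            for ti in range(n_times)]
peak_time = max(range(n_times), key=lambda ti: accuracy[ti])
```

In this toy example, decoding accuracy hovers near chance outside the injected window and rises within it, so the peak of the accuracy time course localizes the informative latencies, mirroring how peak decoding windows are identified from real EEG.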
Taken together, these results reveal differences and commonalities in the processing of face categories under explicit vs. incidental task contexts and suggest that facial expressions are processed to a richer degree even under incidental processing conditions, consistent with prior work indicating the relative automaticity with which emotion is processed. Our work further demonstrates the utility of applying multivariate decoding analyses to EEG for revealing the dynamics of face perception.
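The link between decoding errors and behavioral errors reported above amounts to asking whether the two systems confuse the same category pairs. A minimal sketch of that idea is to correlate the off-diagonal (error) cells of an EEG-decoder confusion matrix with those of a behavioral confusion matrix; the confusion counts below are invented purely for illustration and do not come from the study.

```python
def offdiag(m):
    """Flatten the off-diagonal (error) cells of a confusion matrix."""
    return [m[i][j] for i in range(len(m)) for j in range(len(m)) if i != j]

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical 3-expression confusion matrices (rows: true, cols: predicted);
# counts are made up for this sketch.
eeg_confusions = [[30, 6, 4],
                  [7, 28, 5],
                  [3, 8, 29]]
behav_confusions = [[35, 4, 1],
                    [5, 33, 2],
                    [1, 6, 33]]

r = pearson(offdiag(eeg_confusions), offdiag(behav_confusions))
```

A high correlation over the error cells indicates that the pairs of categories the EEG decoder mixes up are the same pairs observers mix up behaviorally, which is the sense in which decoding errors "correlate with behavior" in the abstract.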