School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, Australia.
Hunter Medical Research Institute, Newcastle, Australia.
eLife. 2022 Aug 31;11:e79581. doi: 10.7554/eLife.79581.
Facial affect is expressed dynamically - a giggle, grimace, or an agitated frown. However, the characterisation of human affect has relied almost exclusively on static images. This approach cannot capture the nuances of human communication or support the naturalistic assessment of affective disorders. Using the latest in machine vision and systems modelling, we studied dynamic facial expressions of people viewing emotionally salient film clips. We found that the apparent complexity of dynamic facial expressions can be captured by a small number of simple spatiotemporal states - composites of distinct facial actions, each expressed with a unique spectral fingerprint. Sequential expression of these states is common across individuals viewing the same film stimuli but varies in those with the melancholic subtype of major depressive disorder. This approach provides a platform for translational research, capturing dynamic facial expressions under naturalistic conditions and enabling new quantitative tools for the study of affective disorders and related mental illnesses.
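As a rough illustration of the kind of analysis the abstract describes, the sketch below clusters short-time spectral features of facial action unit (AU) time series into a small number of candidate states. This is not the authors' pipeline: the AU signals are simulated (in practice they would come from a machine-vision toolkit such as OpenFace), and the window length, state count, and use of k-means rather than a more elaborate systems model are illustrative assumptions.

```python
# Minimal sketch (not the published method): recover a few "spatiotemporal
# states" from facial action unit (AU) time series by clustering short-time
# spectral fingerprints of the AU signals.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
fs = 30            # assumed video frame rate (Hz)
n_aus = 6          # number of facial action units tracked (assumed)
n_frames = 3000    # ~100 s of video

# Simulated AU intensities: AU-specific oscillations plus noise
t = np.arange(n_frames) / fs
aus = np.stack(
    [np.sin(2 * np.pi * (0.3 + 0.2 * k) * t)
     + 0.3 * rng.standard_normal(n_frames) for k in range(n_aus)],
    axis=1,
)                                                   # shape: (n_frames, n_aus)

# Short-time spectral "fingerprint" for each 2-second window (50% overlap)
win = 2 * fs
feats = []
for start in range(0, n_frames - win, win // 2):
    seg = aus[start:start + win]
    _, psd = welch(seg, fs=fs, nperseg=win, axis=0)  # power per AU and frequency
    feats.append(psd.ravel())
feats = np.array(feats)

# Cluster windows into a small number of candidate states and
# read off the resulting state sequence over time
n_states = 4
km = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(feats)
print("state sequence:", km.labels_)
```

A group-level comparison would then contrast the resulting state sequences across individuals viewing the same film stimuli, for example between control and melancholic groups.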