Oliveira Guilherme Camargo, Ngo Quoc Cuong, Passos Leandro Aparecido, Oliveira Leonardo Silva, Stylianou Stella, Papa João Paulo, Kumar Dinesh
School of Science, São Paulo State University, São Paulo, Brazil.
School of Engineering, Royal Melbourne Institute of Technology, Melbourne, VIC, Australia.
Digit Biomark. 2024 Aug 29;8(1):171-180. doi: 10.1159/000540547. eCollection 2024 Jan-Dec.
Weakened facial movements are an early-stage symptom of amyotrophic lateral sclerosis (ALS). ALS is generally detected from changes in facial expressions, but large differences between individuals can make the diagnosis subjective. We propose a computerized analysis of facial expression videos to detect ALS.
This study investigated action units obtained from facial expression videos to differentiate ALS patients from healthy individuals, identifying the specific action units and facial expressions that yield the best classification results. We used the Toronto NeuroFace Dataset, which includes nine facial expression tasks performed by healthy individuals and ALS patients.
The best classification accuracy, 0.91, was obtained for the pretend-to-smile-with-tight-lips expression.
This pilot study shows the potential of using computerized facial expression analysis based on action units to identify facial weakness symptoms in ALS.
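For readers who want a concrete starting point, the sketch below illustrates the general kind of pipeline the abstract describes: per-video action-unit (AU) summary features feeding a classifier evaluated with subject-wise cross-validation. It is not the authors' implementation. The file name au_features.csv, the column names (subject_id, task, label, AU statistics), the task label "BIGSMILE", and the choice of an SVM are all illustrative assumptions.

```python
# Minimal sketch, assuming AU intensity statistics have already been extracted
# per video (e.g., with a facial-analysis tool) into a CSV. Hypothetical columns:
# 'subject_id', 'task', 'label' (0 = healthy, 1 = ALS), and AU features such as
# 'AU12_r_mean'. This is not the paper's pipeline, only an illustration.
import pandas as pd
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# One row per video, AU summary statistics as feature columns (assumed layout).
df = pd.read_csv("au_features.csv")

# Restrict to a single expression task, e.g. the pretend-to-smile-with-tight-lips
# task; the task name "BIGSMILE" is an assumption, not the dataset's exact label.
df = df[df["task"] == "BIGSMILE"]

X = df.drop(columns=["subject_id", "task", "label"]).to_numpy()
y = df["label"].to_numpy()
groups = df["subject_id"].to_numpy()  # keep each subject's videos in one fold

# Simple baseline classifier; the study may use a different model.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Leave-one-subject-out cross-validation avoids subject leakage between folds.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Mean accuracy across held-out subjects: {scores.mean():.2f}")
```

Leave-one-subject-out evaluation is shown here because it keeps all videos from one person in the same fold, which is the usual way to avoid optimistic accuracy estimates in small clinical video datasets; whether the study used this exact scheme is not stated in the abstract.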