Medical Genomics Unit, Medical Genetics Branch, National Human Genome Research Institute, Bethesda, MD 20892, United States.
Institute of Computer Science, Augsburg University, Augsburg, Bavaria 86159, Germany.
Bioinformatics. 2024 Jun 28;40(Suppl 1):i110-i118. doi: 10.1093/bioinformatics/btae239.
Artificial intelligence (AI) is increasingly used in genomics research and practice, and generative AI has garnered significant recent attention. In clinical applications of generative AI, aspects of the underlying datasets can impact results, and confounders should be studied and mitigated. One example involves the facial expressions of people with genetic conditions. Williams syndrome (WS) and Angelman syndrome (AS) are stereotypically associated with a "happy" demeanor, including a smiling expression. Clinical geneticists may therefore be more likely to identify these conditions in images of smiling individuals. To study the impact of facial expression, we analyzed publicly available facial images of approximately 3500 individuals with genetic conditions. Using a deep learning (DL) image classifier, we found that WS and AS images with non-smiling expressions had significantly lower prediction probabilities for the correct syndrome labels than those with smiling expressions. This was not seen for 22q11.2 deletion and Noonan syndromes, which are not associated with a smiling expression. To further explore the effect of facial expressions, we computationally altered the facial expressions in these images. We trained HyperStyle, a GAN-inversion technique compatible with StyleGAN2, to determine the vector representations of our images. Then, following the concept of InterfaceGAN, we edited these vectors to recreate the original images in a phenotypically accurate way but with a different facial expression. Through online surveys and an eye-tracking experiment, we examined how altered facial expressions affect the performance of human experts. Overall, we found that the association between facial expression and diagnostic accuracy varies across genetic conditions.
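The InterfaceGAN-style editing step described above amounts to a linear walk in the GAN's latent space: a latent code is shifted along the normal of a learned linear boundary (here, for "smile") while other attributes are largely preserved. The following is a minimal sketch of that operation, not the authors' implementation; the 512-dimensional latent size, function names, and the random "boundary" direction are illustrative assumptions.

```python
import numpy as np

def edit_expression(w: np.ndarray, n: np.ndarray, alpha: float) -> np.ndarray:
    """Shift latent code w along the (normalized) boundary normal n.

    alpha controls edit strength and sign, e.g. positive to add a
    smile, negative to remove one (InterfaceGAN-style linear edit).
    """
    unit = n / np.linalg.norm(n)   # unit normal of the attribute hyperplane
    return w + alpha * unit        # linear walk in latent space

# Toy usage with a StyleGAN2-like 512-dim latent and a random
# stand-in for a trained "smile" boundary normal (both assumptions).
rng = np.random.default_rng(0)
w = rng.standard_normal(512)
n = rng.standard_normal(512)
w_smiling = edit_expression(w, n, alpha=3.0)
w_neutral = edit_expression(w, n, alpha=-3.0)
```

In practice the boundary normal is obtained by training a linear classifier (e.g. an SVM) on latent codes labeled for the attribute, and the edited code is passed back through the StyleGAN2 generator to synthesize the expression-altered image.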