
MAGiC: A Multimodal Framework for Analysing Gaze in Dyadic Communication.

Author Information

Ülkü Arslan Aydın, Sinan Kalkan, Cengiz Acartürk

Affiliations

Cognitive Science Program, Middle East Technical University, Ankara, Turkey.

Computer Science Department, Middle East Technical University, Ankara, Turkey.

Publication Information

J Eye Mov Res. 2018 Nov 12;11(6). doi: 10.16910/jemr.11.6.2.

Abstract

The analysis of dynamic scenes has been a challenging domain in eye tracking research. This study presents a framework, named MAGiC, for analyzing gaze contact and gaze aversion in face-to-face communication. MAGiC provides an environment that is able to detect and track the conversation partner's face automatically, overlay gaze data on top of the face video, and incorporate speech by means of speech-act annotation. Specifically, MAGiC integrates eye tracking data for gaze, audio data for speech segmentation, and video data for face tracking. MAGiC is an open source framework and its usage is demonstrated via publicly available video content and wiki pages. We explored the capabilities of MAGiC through a pilot study and showed that it facilitates the analysis of dynamic gaze data by reducing the annotation effort and the time spent for manual analysis of video data.
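MAGiC itself is open source, with usage documented on the project's wiki pages; as a rough, hypothetical illustration of the kind of per-frame analysis such a framework automates, the Python sketch below tests whether each gaze sample falls inside a detected face region and labels the frame as gaze contact or gaze aversion. This is a minimal sketch, not MAGiC's implementation: the gaze CSV layout (frame, x, y columns), the file paths, and the use of OpenCV's Haar-cascade face detector are all assumptions.

```python
# Hypothetical sketch, NOT MAGiC's actual code: label each video frame's
# gaze sample as "contact" (gaze on the partner's face), "aversion"
# (gaze elsewhere), or "no_face" (detector found no face in the frame).
import csv
import cv2  # pip install opencv-python

# Pretrained frontal-face detector shipped with OpenCV (an assumption here;
# MAGiC's own face tracker may differ).
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_gaze(video_path, gaze_csv_path):
    """Yield (frame_index, label) for every frame that has a gaze sample.

    Assumed CSV layout: one row per frame with columns frame,x,y giving
    the gaze point in video pixel coordinates.
    """
    with open(gaze_csv_path, newline="") as f:
        gaze = {int(row["frame"]): (float(row["x"]), float(row["y"]))
                for row in csv.DictReader(f)}

    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        if index in gaze:
            gx, gy = gaze[index]
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = FACE_CASCADE.detectMultiScale(gray, 1.1, 5)
            if len(faces) == 0:
                label = "no_face"
            elif any(x <= gx <= x + w and y <= gy <= y + h
                     for (x, y, w, h) in faces):
                label = "contact"
            else:
                label = "aversion"
            yield index, label
        index += 1
    cap.release()
```

Frame labels from classify_gaze could then be aggregated (for example with collections.Counter) into the contact/aversion proportions that would otherwise require manual video annotation.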


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/073f/7906569/a0b6d93ad54e/jemr-11-06-b-figure-01.jpg

Similar Articles

1. Gaze aversion to stuttered speech: a pilot study investigating differential visual attention to stuttered and fluent speech.
Int J Lang Commun Disord. 2010 Mar-Apr;45(2):133-44. doi: 10.3109/13682820902763951.
2. Speech Driven Gaze in a Face-to-Face Interaction.
Front Neurorobot. 2021 Mar 4;15:598895. doi: 10.3389/fnbot.2021.598895. eCollection 2021.
3. Selective Medial Prefrontal Cortex Responses During Live Mutual Gaze Interactions in Human Infants: An fNIRS Study.
Brain Topogr. 2015 Sep;28(5):691-701. doi: 10.1007/s10548-014-0414-2. Epub 2014 Nov 4.
4. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.
Int J Lang Commun Disord. 2011 Sep-Oct;46(5):535-49. doi: 10.1111/j.1460-6984.2011.00005.x. Epub 2011 Mar 7.
5. Timing of gazes in child dialogues: a time-course analysis of requests and back channelling in referential communication.
Int J Lang Commun Disord. 2012 Jul-Aug;47(4):373-83. doi: 10.1111/j.1460-6984.2012.00151.x. Epub 2012 Mar 5.
6. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction.
ACM Trans Interact Intell Syst. 2016 May;6(1). doi: 10.1145/2882970.
7. Using dual eye tracking to uncover personal gaze patterns during social interaction.
Sci Rep. 2018 Mar 9;8(1):4271. doi: 10.1038/s41598-018-22726-7.
8. Gaze and visual search strategies of children with Asperger syndrome/high functioning autism viewing a magic trick.
Dev Neurorehabil. 2016;19(2):95-102. doi: 10.3109/17518423.2014.913081. Epub 2014 May 27.

Cited By

1. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.
Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
2. Speech Driven Gaze in a Face-to-Face Interaction.
Front Neurorobot. 2021 Mar 4;15:598895. doi: 10.3389/fnbot.2021.598895. eCollection 2021.

