


GestureLens: Visual Analysis of Gestures in Presentation Videos.

Publication Info

IEEE Trans Vis Comput Graph. 2023 Aug;29(8):3685-3697. doi: 10.1109/TVCG.2022.3169175. Epub 2023 Jun 29.

DOI: 10.1109/TVCG.2022.3169175
PMID: 35446768
Abstract

Appropriate gestures can enhance message delivery and audience engagement in both daily communication and public presentations. In this article, we contribute a visual analytic approach that assists professional public speaking coaches in improving their practice of gesture training through analyzing presentation videos. Manually checking and exploring gesture usage in the presentation videos is often tedious and time-consuming. There lacks an efficient method to help users conduct gesture exploration, which is challenging due to the intrinsically temporal evolution of gestures and their complex correlation to speech content. In this article, we propose GestureLens, a visual analytics system to facilitate gesture-based and content-based exploration of gesture usage in presentation videos. Specifically, the exploration view enables users to obtain a quick overview of the spatial and temporal distributions of gestures. The dynamic hand movements are firstly aggregated through a heatmap in the gesture space for uncovering spatial patterns, and then decomposed into two mutually perpendicular timelines for revealing temporal patterns. The relation view allows users to explicitly explore the correlation between speech content and gestures by enabling linked analysis and intuitive glyph designs. The video view and dynamic view show the context and overall dynamic movement of the selected gestures, respectively. Two usage scenarios and expert interviews with professional presentation coaches demonstrate the effectiveness and usefulness of GestureLens in facilitating gesture exploration and analysis of presentation videos.


Similar Articles

1. GestureLens: Visual Analysis of Gestures in Presentation Videos.
   IEEE Trans Vis Comput Graph. 2023 Aug;29(8):3685-3697. doi: 10.1109/TVCG.2022.3169175. Epub 2023 Jun 29.

2. EmoCo: Visual Analysis of Emotion Coherence in Presentation Videos.
   IEEE Trans Vis Comput Graph. 2020 Jan;26(1):927-937. doi: 10.1109/TVCG.2019.2934656. Epub 2019 Aug 20.

3. Neural correlates of the processing of co-speech gestures.
   Neuroimage. 2008 Feb 15;39(4):2010-24. doi: 10.1016/j.neuroimage.2007.10.055. Epub 2007 Nov 13.

4. The connectivity signature of co-speech gesture integration: The superior temporal sulcus modulates connectivity between areas related to visual gesture and auditory speech processing.
   Neuroimage. 2018 Nov 1;181:539-549. doi: 10.1016/j.neuroimage.2018.07.037. Epub 2018 Jul 17.

5. Memory effects of speech and gesture binding: cortical and hippocampal activation in relation to subsequent memory performance.
   J Cogn Neurosci. 2009 Apr;21(4):821-36. doi: 10.1162/jocn.2009.21053.

6. The role of iconic gestures in speech disambiguation: ERP evidence.
   J Cogn Neurosci. 2007 Jul;19(7):1175-92. doi: 10.1162/jocn.2007.19.7.1175.

7. Production of co-speech gestures in the right hemisphere: Evidence from individuals with complete or anterior callosotomy.
   Neuropsychologia. 2023 Feb 10;180:108484. doi: 10.1016/j.neuropsychologia.2023.108484. Epub 2023 Jan 10.

8. Perception of co-speech gestures in aphasic patients: a visual exploration study during the observation of dyadic conversations.
   Cortex. 2015 Mar;64:157-68. doi: 10.1016/j.cortex.2014.10.013. Epub 2014 Nov 4.

9. Exploring Mid-Air Hand Interaction in Data Visualization.
   IEEE Trans Vis Comput Graph. 2024 Sep;30(9):6347-6364. doi: 10.1109/TVCG.2023.3332647. Epub 2024 Jul 31.

10. Multimodal Development in Children's Narrative Speech: Evidence for Tight Gesture-Speech Temporal Alignment Patterns as Early as 5 Years Old.
    J Speech Lang Hear Res. 2023 Mar 7;66(3):888-900. doi: 10.1044/2022_JSLHR-22-00451. Epub 2023 Feb 21.