
3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch

Author Information

Xu Shan, Xu Chang, McIntyre Sarah, Olausson Håkan, Gerling Gregory J

Affiliations

School of Engineering and Applied Science, University of Virginia, Charlottesville, VA, United States.

Center for Social and Affective Neuroscience (CSAN), Linköping University, Linköping, Sweden.

Publication Information

Front Physiol. 2022 Jun 9;13:841938. doi: 10.3389/fphys.2022.841938. eCollection 2022.

Abstract

Across a plethora of social situations, we touch others in natural and intuitive ways to share thoughts and emotions, such as tapping to get one's attention or caressing to soothe one's anxiety. A deeper understanding of these human-to-human interactions will require, in part, the precise measurement of skin-to-skin physical contact. Among prior efforts, each measurement approach exhibits certain constraints, e.g., motion trackers do not capture the precise shape of skin surfaces, while pressure sensors impede skin-to-skin contact. In contrast, this work develops an interference-free 3D visual tracking system using a depth camera to measure the contact attributes between the bare hand of a toucher and the forearm of a receiver. The toucher's hand is tracked as a posed and positioned mesh by fitting a hand model to detected 3D hand joints, whereas the receiver's forearm is extracted as a 3D surface updated upon repeated skin contact. Based on a contact model involving point clouds, the spatiotemporal changes of hand-to-forearm contact are decomposed into six high-resolution, time-series contact attributes, i.e., contact area, indentation depth, absolute velocity, and three orthogonal velocity components, together with contact duration. To examine the system's capabilities and limitations, two types of experiments were performed. First, to evaluate its ability to discern human touches, one person delivered cued social messages, e.g., happiness, anger, sympathy, to another person using their preferred gestures. The results indicated that messages and gestures, as well as the identities of the touchers, were readily discerned from their contact attributes. Second, the system's spatiotemporal accuracy was validated against measurements from independent devices, including an electromagnetic motion tracker, a sensorized pressure mat, and a laser displacement sensor. While validated here in the context of social communication, this system is extendable to human touch interactions such as maternal care of infants and massage therapy.
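To make the contact-attribute decomposition concrete, the following is a minimal, hypothetical sketch of how per-frame attributes could be derived from point clouds. It is not the authors' implementation: the paper fits a full posed hand mesh against a scanned forearm surface, whereas this sketch simplifies the forearm to the plane z = 0, treats the hand as a sparse point cloud, and assumes a fixed per-point patch area. Contact area, indentation depth, the three orthogonal velocity components, and absolute velocity are then simple per-frame quantities; contact duration follows by counting consecutive frames with nonzero contact.

```python
import math

# Assumed per-point patch area for a 2 mm sampling grid (illustrative only).
CELL_AREA_MM2 = 4.0

def contact_attributes(hand_pts, prev_centroid=None, dt=1 / 30):
    """Estimate contact attributes for one depth-camera frame.

    hand_pts: list of (x, y, z) points on the hand surface, in mm;
              the forearm skin is simplified to the plane z = 0.
    prev_centroid: contact centroid from the previous frame, for velocity.
    Returns (attributes dict, centroid for use in the next frame).
    """
    # Points at or below the skin plane are considered in contact.
    contacting = [p for p in hand_pts if p[2] <= 0.0]
    area = len(contacting) * CELL_AREA_MM2                 # contact area, mm^2
    depth = max((-p[2] for p in contacting), default=0.0)  # deepest indentation, mm

    vx = vy = vz = speed = 0.0
    centroid = prev_centroid
    if contacting:
        n = len(contacting)
        centroid = (sum(p[0] for p in contacting) / n,
                    sum(p[1] for p in contacting) / n,
                    sum(p[2] for p in contacting) / n)
        if prev_centroid is not None:
            # Three orthogonal velocity components from centroid displacement.
            vx = (centroid[0] - prev_centroid[0]) / dt
            vy = (centroid[1] - prev_centroid[1]) / dt
            vz = (centroid[2] - prev_centroid[2]) / dt
            speed = math.sqrt(vx * vx + vy * vy + vz * vz)  # absolute velocity

    return {"area": area, "depth": depth,
            "v": (vx, vy, vz), "speed": speed}, centroid
```

Run over a frame sequence, this yields the six time-series attributes named in the abstract; a real system would replace the plane with the reconstructed forearm surface and compute signed distances from the fitted hand mesh to it.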


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dccf/9219726/905996cc8285/fphys-13-841938-g001.jpg
