Tactile, Audio, and Visual Dataset During Bare Finger Interaction with Textured Surfaces.

Author Information

Devillard Alexis W M, Ramasamy Aruna, Cheng Xiaoxiao, Faux Damien, Burdet Etienne

Affiliations

Imperial College of Science, Technology and Medicine, London, W12 0BZ, UK.

École Normale Supérieure, CNRS, Laboratoire des Systèmes Perceptifs, Paris, 75005, France.

Publication Information

Sci Data. 2025 Mar 23;12(1):484. doi: 10.1038/s41597-025-04670-0.

Abstract

This paper presents a comprehensive multi-modal dataset capturing concurrent haptic, audio, and visual signals recorded from ten participants as they interacted with ten different textured surfaces using their bare fingers. The dataset includes stereoscopic images of the textures, and fingertip position, speed, applied load, emitted sound, and friction-induced vibrations, providing an unprecedented insight into the complex dynamics underlying human tactile perception. Our approach utilizes a human finger (while most previous studies relied on rigid sensorized probes), enabling the naturalistic acquisition of haptic data and addressing a significant gap in resources for studies of human tactile exploration, perceptual mechanisms, and artificial tactile perception. Additionally, fifteen participants completed a questionnaire to evaluate their subjective perception of the surfaces. Through carefully designed data collection protocols, encompassing both controlled and free exploration scenarios, this dataset offers a rich resource for studying human multi-sensory integration and supports the development of algorithms for texture recognition based on multi-modal inputs. A preliminary analysis demonstrates the dataset's potential, as classifiers trained on different combinations of data modalities show promising accuracy in surface identification, highlighting its value for advancing research in multi-sensory perception and the development of human-machine interfaces.
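To make the multi-modal classification idea concrete, below is a minimal sketch of how a texture classifier might be trained on combinations of recorded signal modalities, such as friction-induced vibration and emitted sound. The sampling rate, trial counts, signal shapes, and feature choices are illustrative assumptions, and synthetic stand-ins replace the dataset's actual recordings; this is not the authors' analysis pipeline.

```python
# Minimal sketch (assumed format, synthetic stand-in signals): combine spectral
# features from two modalities and train a surface-identification classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FS = 10_000              # assumed sampling rate (Hz), not taken from the paper
N_SURFACES = 10          # ten textured surfaces, as described in the abstract
TRIALS_PER_SURFACE = 20  # hypothetical number of exploration trials per surface

def spectral_features(signal, n_bands=16):
    """Mean log power in linearly spaced frequency bands of one trial's signal."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(power, n_bands)
    return np.log([b.mean() + 1e-12 for b in bands])

X, y = [], []
for surface in range(N_SURFACES):
    for _ in range(TRIALS_PER_SURFACE):
        # Synthetic stand-ins for one second of vibration and audio per trial;
        # with the real dataset these arrays would be loaded from the recordings.
        vibration = rng.normal(size=FS) * (1 + 0.10 * surface)
        audio = rng.normal(size=FS) * (1 + 0.05 * surface)
        X.append(np.concatenate([spectral_features(vibration),
                                 spectral_features(audio)]))
        y.append(surface)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"surface identification accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```

Dropping one of the two feature blocks before training gives a single-modality baseline, which is one way to compare modality combinations as the abstract describes.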


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8ae0/11930942/d04d55322ccd/41597_2025_4670_Fig1_HTML.jpg
