
Detecting and Classifying Human Touches in a Social Robot Through Acoustic Sensing and Machine Learning.

Affiliation

Robotics Laboratory, Universidad Carlos III de Madrid, Av. de la Universidad 30, Leganés, 28911 Madrid, Spain.

Publication information

Sensors (Basel). 2017 May 16;17(5):1138. doi: 10.3390/s17051138.

Abstract

An important aspect of Human-Robot Interaction is responding to different kinds of touch stimuli. To date, several technologies have been explored to determine how a touch is perceived by a social robot, usually by placing a large number of sensors throughout the robot's shell. In this work, we introduce a novel approach in which the audio acquired from contact microphones located in the robot's shell is processed using machine learning techniques to distinguish between different types of touches. The system is able to determine when the robot is touched (touch detection), and to ascertain the kind of touch performed among a set of possibilities: tap, slap, push, and pull (touch classification). This proposal is cost-effective, since just a few microphones can cover the robot's whole shell: a single microphone is enough to cover each solid part of the robot. Besides, it is easy to install and configure, as it just requires a contact surface to attach the microphone to the robot's shell and a plug into the robot's computer. Results show high accuracy scores in touch gesture recognition. The testing phase revealed that Logistic Model Trees achieved the best performance, with an F-score of 0.81. The dataset was built with information from 25 participants performing a total of 1981 touch gestures.
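The pipeline the abstract describes — frame the contact-microphone signal, extract acoustic features, threshold energy for touch detection, then classify the touch type — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the two toy features (RMS energy and zero-crossing rate), the nearest-centroid classifier standing in for Logistic Model Trees, and the synthetic "tap"/"slap" signals are all stand-ins introduced here for the example.

```python
# Hedged sketch of acoustic touch detection/classification.
# Assumptions (not from the paper): RMS energy + zero-crossing rate as
# features, an energy threshold for detection, and a nearest-centroid
# rule in place of the Logistic Model Trees the paper actually uses.
import numpy as np

def features(frame):
    """Two toy acoustic features: RMS energy and zero-crossing rate."""
    rms = np.sqrt(np.mean(frame ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
    return np.array([rms, zcr])

def detect_touch(frame, energy_threshold=0.05):
    """Touch detection: a frame counts as a touch if its RMS energy
    exceeds a fixed threshold (threshold value is illustrative)."""
    return np.sqrt(np.mean(frame ** 2)) > energy_threshold

def classify(frame, centroids):
    """Touch classification: assign the label of the nearest class
    centroid in feature space."""
    f = features(frame)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

rng = np.random.default_rng(0)
# Synthetic stand-in signals: a 'tap' is a short tonal burst,
# a 'slap' is a loud broadband noise burst, silence is near-zero.
tap = np.zeros(1000)
tap[100:150] = 0.5 * np.sin(np.linspace(0, 20 * np.pi, 50))
slap = 0.8 * rng.standard_normal(1000)
silence = 0.001 * rng.standard_normal(1000)

# "Training" here is just storing one feature centroid per class.
centroids = {"tap": features(tap), "slap": features(slap)}

print(bool(detect_touch(silence)))   # silence falls below the threshold
print(bool(detect_touch(tap)))       # the burst exceeds it
print(classify(slap, centroids))
```

In the paper's setup the classifier is trained on labelled gesture recordings from many participants; the centroid lookup above merely shows where a real model (such as an LMT from Weka) would slot into the detect-then-classify flow.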

