Nazarbayev University, School of Science and Technology, Astana Z05H0P9, Kazakhstan.
Sci Data. 2018 May 29;5:180101. doi: 10.1038/sdata.2018.101.
This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is the use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. A total of 3,826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped into 35 different grasp types according to a hierarchical taxonomy. The database contains information related to each grasp together with the associated sensor data acquired from the three modalities. We also provide our data annotation software, written in Matlab, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big-data and machine-learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.
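To illustrate how an annotated grasp might be linked to the three sensor modalities, the Matlab sketch below shows one possible record layout. All field names, file paths, and values are hypothetical assumptions for illustration; they do not reflect the dataset's actual schema or directory structure.

```matlab
% Hypothetical sketch of a single grasp annotation record.
% Field names and paths are illustrative assumptions, not the dataset's schema.
grasp = struct( ...
    'subject_id',   1, ...                                  % which human subject
    'grasp_type',   'medium wrap', ...                      % one of the 35 taxonomy classes
    'start_time_s', 12.40, ...                              % grasp onset, seconds from recording start
    'end_time_s',   15.75, ...                              % grasp release
    'rgb_frame',    'subject01/cam/frame_0310.png', ...     % head-mounted action camera image
    'depth_frame',  'subject01/depth/frame_0310.bin', ...   % arm-mounted depth sensor sample
    'mocap_file',   'subject01/mocap/session1.mat');        % inertial motion capture suit data

fprintf('Grasp %s by subject %d, %.2f s long\n', ...
    grasp.grasp_type, grasp.subject_id, grasp.end_time_s - grasp.start_time_s);
```

A flat array of such records could then be filtered by grasp type or subject when building training sets for machine-learning experiments.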