Suppr 超能文献


HANDS: an RGB-D dataset of static hand-gestures for human-robot interaction.

Authors

Nuzzi Cristina, Pasinetti Simone, Pagani Roberto, Coffetti Gabriele, Sansoni Giovanna

Affiliation

Department of Mechanical and Industrial Engineering (DIMI), University of Brescia, Brescia, Italy.

Publication

Data Brief. 2021 Jan 30;35:106791. doi: 10.1016/j.dib.2021.106791. eCollection 2021 Apr.

DOI: 10.1016/j.dib.2021.106791
PMID: 33604423
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7873347/
Abstract

The HANDS dataset has been created for human-robot interaction research, and it is composed of spatially and temporally aligned RGB and Depth frames. It contains 12 static single-hand gestures performed with both the right-hand and the left-hand, and 3 static two-hands gestures for a total of 29 unique classes. Five actors (two females and three males) have been acquired performing the gestures, each of them adopting a different background and light conditions. For each actor, 150 RGB frames and their corresponding 150 Depth frames per gesture have been collected, for a total of 2400 RGB frames and 2400 Depth frames per actor. Data has been collected using a Kinect v2 camera intrinsically calibrated to spatially align RGB data to Depth data. The temporal alignment has been performed offline using MATLAB, aligning frames with a maximum temporal distance of 66 ms. This dataset has been used in [1] and it is freely available at http://dx.doi.org/10.17632/ndrczc35bt.1.
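The offline temporal alignment described above amounts to pairing each depth frame with the closest-in-time RGB frame and discarding pairs more than 66 ms apart. A minimal Python sketch of that idea, assuming per-frame timestamps in milliseconds (the authors used MATLAB; function and variable names here are illustrative, not from the released dataset):

```python
def align_frames(rgb_ts, depth_ts, max_gap_ms=66):
    """Pair each depth timestamp with the nearest RGB timestamp.

    rgb_ts, depth_ts: sorted lists of timestamps in milliseconds.
    Returns a list of (rgb_index, depth_index) pairs whose timestamps
    differ by at most max_gap_ms.
    """
    pairs = []
    i = 0
    for j, dt in enumerate(depth_ts):
        # Advance the RGB pointer while the next RGB frame is at least as close.
        while i + 1 < len(rgb_ts) and abs(rgb_ts[i + 1] - dt) <= abs(rgb_ts[i] - dt):
            i += 1
        if abs(rgb_ts[i] - dt) <= max_gap_ms:
            pairs.append((i, j))
    return pairs

# Example: a depth stream slightly offset from a 30 fps RGB stream.
rgb = [0, 33, 66, 99, 132]
depth = [10, 45, 200]
print(align_frames(rgb, depth))  # [(0, 0), (1, 1)]; the 200 ms depth frame has no RGB frame within 66 ms
```

Both streams are assumed sorted, so a single forward pointer suffices and the alignment runs in linear time over the two frame lists.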


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3626/7873347/0894716db2a2/gr1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3626/7873347/0761a95fdd6d/gr2.jpg

Similar Articles

1
HANDS: an RGB-D dataset of static hand-gestures for human-robot interaction.
Data Brief. 2021 Jan 30;35:106791. doi: 10.1016/j.dib.2021.106791. eCollection 2021 Apr.
2
Depth camera based dataset of hand gestures.
Data Brief. 2022 Oct 10;45:108659. doi: 10.1016/j.dib.2022.108659. eCollection 2022 Dec.
3
CNN Deep Learning with Wavelet Image Fusion of CCD RGB-IR and Depth-Grayscale Sensor Data for Hand Gesture Intention Recognition.
Sensors (Basel). 2022 Jan 21;22(3):803. doi: 10.3390/s22030803.
4
putEMG-A Surface Electromyography Hand Gesture Recognition Dataset.
Sensors (Basel). 2019 Aug 14;19(16):3548. doi: 10.3390/s19163548.
5
A Deep Learning Framework for Recognizing Both Static and Dynamic Gestures.
Sensors (Basel). 2021 Mar 23;21(6):2227. doi: 10.3390/s21062227.
6
Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study.
Sensors (Basel). 2023 Apr 23;23(9):4219. doi: 10.3390/s23094219.
7
Development of Real-Time Hand Gesture Recognition for Tabletop Holographic Display Interaction Using Azure Kinect.
Sensors (Basel). 2020 Aug 14;20(16):4566. doi: 10.3390/s20164566.
8
Improving Real-Time Hand Gesture Recognition with Semantic Segmentation.
Sensors (Basel). 2021 Jan 7;21(2):356. doi: 10.3390/s21020356.
9
Dynamic Hand Gesture Recognition Using 3DCNN and LSTM with FSM Context-Aware Model.
Sensors (Basel). 2019 Dec 9;19(24):5429. doi: 10.3390/s19245429.
10
Attentive 3D-Ghost Module for Dynamic Hand Gesture Recognition with Positive Knowledge Transfer.
Comput Intell Neurosci. 2021 Nov 18;2021:5044916. doi: 10.1155/2021/5044916. eCollection 2021.

Cited By

1
KuSL2023: A standard for Kurdish sign language detection and classification using hand tracking and machine learning.
MethodsX. 2025 May 16;14:103374. doi: 10.1016/j.mex.2025.103374. eCollection 2025 Jun.
2
NSL23 dataset for alphabets of Nepali sign language.
Data Brief. 2024 Jan 23;53:110080. doi: 10.1016/j.dib.2024.110080. eCollection 2024 Apr.
3
Dataset for multi-channel surface electromyography (sEMG) signals of hand gestures.
Data Brief. 2022 Feb 4;41:107921. doi: 10.1016/j.dib.2022.107921. eCollection 2022 Apr.

References

1
HGM-4: A new multi-cameras dataset for hand gesture recognition.
Data Brief. 2020 May 8;30:105676. doi: 10.1016/j.dib.2020.105676. eCollection 2020 Jun.