

Static hand gesture recognition based on hierarchical decision and classification of finger features.

Affiliations

School of Mechatronics Engineering, Henan University of Science and Technology, China.

Collaborative Innovation Center of Machinery Equipment Advanced Manufacturing of Henan Province, China.

Publication Information

Sci Prog. 2022 Jan-Mar;105(1):368504221086362. doi: 10.1177/00368504221086362.

DOI: 10.1177/00368504221086362
PMID: 35296188
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10358564/
Abstract

Considering that the distinctions among static hand gestures lie in the differences between the fingers sticking out, a method of grouping and classifying hand gestures step by step using the quantity, direction, position and shape of the outstretched fingers was proposed in this paper. Firstly, the gesture region was segmented using the skin color information of the hand, and the gesture direction was normalized using the direction information of the gesture contour lines. Secondly, the fingers were segmented one by one using convex decomposition of the hand gesture image, based on the convex characteristics of the gesture shape. Thirdly, the quantity, direction, position and shape features of the segmented fingers were extracted. Lastly, a hierarchical decision classifier embedded with deep sparse autoencoders was constructed: the quantity of fingers was first used to divide the gesture images into groups, and the direction, position and shape features of the fingers were then used to subdivide and recognize the gestures within each group. The experimental results show that the proposed method is robust to lighting, direction and scale changes, and significantly superior to the traditional method in both recognition rate and recognition stability.
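
As an illustration of the kind of pipeline the abstract describes, the sketch below uses OpenCV for skin-color segmentation and convexity-defect finger counting, followed by a count-first, refine-within-group decision. This is a minimal sketch under stated assumptions, not the authors' convex-decomposition and deep-sparse-autoencoder method: the YCrCb skin range, the defect thresholds, and the per_group_classifiers mapping are hypothetical stand-ins.

```python
# Minimal illustrative sketch (assumption), NOT the paper's method.
import cv2
import numpy as np

def segment_hand(bgr):
    """Skin-color segmentation: binary mask of skin-colored pixels in YCrCb space."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    # Widely used Cr/Cb skin range; an assumption to be tuned per camera/dataset.
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def count_extended_fingers(mask):
    """Rough count of outstretched fingers from convexity defects of the hand contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)   # assume the hand is the largest skin blob
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    gaps = 0
    for start, end, far, depth in defects[:, 0]:
        p1, p2, valley = hand[start][0], hand[end][0], hand[far][0]
        a = np.linalg.norm(p1 - valley)
        b = np.linalg.norm(p2 - valley)
        c = np.linalg.norm(p1 - p2)
        angle = np.arccos(np.clip((a * a + b * b - c * c) / (2 * a * b + 1e-6), -1.0, 1.0))
        # A deep, narrow valley between two hull points usually separates two fingers.
        if angle < np.pi / 2 and depth / 256.0 > 20:
            gaps += 1
    return gaps + 1 if gaps > 0 else 0

def recognize(bgr, per_group_classifiers):
    """Hierarchical decision: group by finger count first, then refine within the group."""
    mask = segment_hand(bgr)
    n = count_extended_fingers(mask)
    # per_group_classifiers is a hypothetical dict {finger_count: classifier}; each
    # classifier stands in for the per-group stage that uses finger direction,
    # position and shape features (e.g. a sparse-autoencoder-based model).
    clf = per_group_classifiers.get(n)
    return clf.predict(mask) if clf is not None else None
```

Counting the outstretched fingers first narrows each gesture to a small candidate group, which mirrors the divide-then-refine idea behind the hierarchical decision classifier described above.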


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/99b4b31c98df/10.1177_00368504221086362-fig1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/9a60dc4f1517/10.1177_00368504221086362-fig2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/633a2b70de73/10.1177_00368504221086362-fig3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/f7cf516c1210/10.1177_00368504221086362-fig4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/470d94fe09de/10.1177_00368504221086362-fig5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/f5130135e08e/10.1177_00368504221086362-fig6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/9599b0675a17/10.1177_00368504221086362-fig7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/de161d67dc6f/10.1177_00368504221086362-fig8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/726a832a9609/10.1177_00368504221086362-fig9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/cb3b90f8cc19/10.1177_00368504221086362-fig10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/fd055318cdd7/10.1177_00368504221086362-fig11.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/b298012c78eb/10.1177_00368504221086362-fig12.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/9d23be8df6c0/10.1177_00368504221086362-fig13.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/df2efd4438f3/10.1177_00368504221086362-fig14.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/a3aee3f96dcf/10.1177_00368504221086362-fig15.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/1aec29ab3a6e/10.1177_00368504221086362-fig16.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/600eaea73be3/10.1177_00368504221086362-fig17.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c8e4/10358564/e8530d4cb48a/10.1177_00368504221086362-fig18.jpg

Similar Articles

1. Static hand gesture recognition based on hierarchical decision and classification of finger features.
Sci Prog. 2022 Jan-Mar;105(1):368504221086362. doi: 10.1177/00368504221086362.
2. FGFF Descriptor and Modified Hu Moment-Based Hand Gesture Recognition.
Sensors (Basel). 2021 Sep 29;21(19):6525. doi: 10.3390/s21196525.
3. Real-time hand gesture recognition using finger segmentation.
ScientificWorldJournal. 2014;2014:267872. doi: 10.1155/2014/267872. Epub 2014 Jun 25.
4. Method for user interface of large displays using arm pointing and finger counting gesture recognition.
ScientificWorldJournal. 2014;2014:683045. doi: 10.1155/2014/683045. Epub 2014 Sep 1.
5. Static Hand Gesture Recognition Using Capacitive Sensing and Machine Learning.
Sensors (Basel). 2023 Mar 24;23(7):3419. doi: 10.3390/s23073419.
6. From Signal to Image: Enabling Fine-Grained Gesture Recognition with Commercial Wi-Fi Devices.
Sensors (Basel). 2018 Sep 18;18(9):3142. doi: 10.3390/s18093142.
7. Exploiting domain transformation and deep learning for hand gesture recognition using a low-cost dataglove.
Sci Rep. 2022 Dec 12;12(1):21446. doi: 10.1038/s41598-022-25108-2.
8. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps.
Sensors (Basel). 2015 Nov 12;15(11):28646-64. doi: 10.3390/s151128646.
9. Finger Gesture Spotting from Long Sequences Based on Multi-Stream Recurrent Neural Networks.
Sensors (Basel). 2020 Jan 18;20(2):528. doi: 10.3390/s20020528.
10. Continuous Finger Gesture Recognition Based on Flex Sensors.
Sensors (Basel). 2019 Sep 15;19(18):3986. doi: 10.3390/s19183986.

References Cited in This Article

1. Hand Pose Recognition Using Parallel Multi Stream CNN.
Sensors (Basel). 2021 Dec 18;21(24):8469. doi: 10.3390/s21248469.
2. Gesture-Based Human Machine Interaction Using RCNNs in Limited Computation Power Devices.
Sensors (Basel). 2021 Dec 8;21(24):8202. doi: 10.3390/s21248202.
3. FGFF Descriptor and Modified Hu Moment-Based Hand Gesture Recognition.
Sensors (Basel). 2021 Sep 29;21(19):6525. doi: 10.3390/s21196525.
4. A comparison of Arabic sign language dynamic gesture recognition models.
Heliyon. 2020 Mar 14;6(3):e03554. doi: 10.1016/j.heliyon.2020.e03554. eCollection 2020 Mar.
5. Enhancement of surgical hand gesture recognition using a capsule network for a contactless interface in the operating room.
Comput Methods Programs Biomed. 2020 Jul;190:105385. doi: 10.1016/j.cmpb.2020.105385. Epub 2020 Feb 6.
6. A Gesture Recognition Algorithm for Hand-Assisted Laparoscopic Surgery.
Sensors (Basel). 2019 Nov 26;19(23):5182. doi: 10.3390/s19235182.
7. Deep learning in neural networks: an overview.
Neural Netw. 2015 Jan;61:85-117. doi: 10.1016/j.neunet.2014.09.003. Epub 2014 Oct 13.