

Aided Evaluation of Motion Action Based on Attitude Recognition.

Affiliations

Institute of Mechanical and Dynamic Engineering, East China University of Science and Technology, Shanghai 200237, China.

School of Design and Art, Shanghai Dianji University, Shanghai 200240, China.

Publication

J Healthc Eng. 2022 Mar 9;2022:8388325. doi: 10.1155/2022/8388325. eCollection 2022.

DOI: 10.1155/2022/8388325
PMID: 35310175
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8926528/
Abstract

Athletes who are eager to improve often cannot obtain their own movement data because of limited field equipment, human error, and other factors, and therefore cannot receive professional movement guidance and posture correction from coaches. To address this problem, this paper combines recent deep-learning research with human posture recognition techniques, using convolutional neural networks and video processing to build an auxiliary evaluation system for sports movements. The system captures accurate movement data and supports interaction, helping athletes better understand their body posture. The research results show that: (1) Using the OpenPose open-source library for pose recognition, joint-angle data can be derived from joint coordinates, and the key points of human posture in video can be identified and computed for analysis. (2) The movements of the human body in the video are evaluated to judge whether the detected target's range of motion conforms to the standard action data. (3) Based on the standard motion database created in this paper, a formal motion auxiliary evaluation system is established; the smaller the Euclidean distance from the standard action, the more standard the movement. The action with a Euclidean distance of 4.79583 was the tested subject's best action. (4) Traditional methods are inefficient; a method based on a BP neural network achieves a correct recognition rate of up to 96.4%, while the posture recognition method in this paper reaches 98.7%, 2.3 percentage points higher, giving the proposed method a clear advantage.
The sports-action auxiliary evaluation system developed in this paper performs well and effectively addresses a problem that troubles athletes; follow-up system testing and operation require further optimization and research.
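The two computations the abstract describes — deriving a joint angle from three keypoint coordinates, and scoring a pose against a standard action by Euclidean distance — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function names and the (x, y) keypoint format are assumptions, loosely following OpenPose-style 2D keypoint output.

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by keypoints a-b-c, each an (x, y) pair.
    E.g. elbow angle from shoulder (a), elbow (b), and wrist (c) coordinates."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

def pose_distance(angles, standard_angles):
    """Euclidean distance between a measured joint-angle vector and a standard
    one; a smaller distance means the action is closer to the standard."""
    return math.sqrt(sum((p - s) ** 2 for p, s in zip(angles, standard_angles)))

# A right-angle elbow: shoulder at origin, elbow at (1, 0), wrist at (1, 1)
elbow = joint_angle((0.0, 0.0), (1.0, 0.0), (1.0, 1.0))  # 90.0 degrees
score = pose_distance([elbow, 120.0], [90.0, 120.0])     # 0.0 (matches standard)
```

Under this scheme, the "best action" reported in the abstract is simply the frame (or repetition) whose angle vector minimizes `pose_distance` against the standard motion database entry.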


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/512b6ea9d294/JHE2022-8388325.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/4aa2eb70e917/JHE2022-8388325.002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/e97265ef5457/JHE2022-8388325.003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/5c471aaa15c7/JHE2022-8388325.004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/c0419c918bd9/JHE2022-8388325.005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/5b83dd5a9217/JHE2022-8388325.006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/2b6b8cb4cc39/JHE2022-8388325.007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/3290c7361767/JHE2022-8388325.008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/74d93b58bce9/JHE2022-8388325.009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/8bfc5c0ef121/JHE2022-8388325.010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/f76cd53d74e4/JHE2022-8388325.011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/ce4c3769f57c/JHE2022-8388325.012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/d789e8c375f1/JHE2022-8388325.013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/7ce146a3e002/JHE2022-8388325.014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/d1461f86a37c/JHE2022-8388325.015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/b5ce31860a7b/JHE2022-8388325.016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/18dec401c4ae/JHE2022-8388325.017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f9a/8926528/195a4076967f/JHE2022-8388325.018.jpg

Similar articles

1. Aided Evaluation of Motion Action Based on Attitude Recognition.
J Healthc Eng. 2022 Mar 9;2022:8388325. doi: 10.1155/2022/8388325. eCollection 2022.
2. Application of video image processing in sports action recognition based on particle swarm optimization algorithm.
Prev Med. 2023 Aug;173:107592. doi: 10.1016/j.ypmed.2023.107592. Epub 2023 Jun 26.
3. A Deep Learning and Clustering Extraction Mechanism for Recognizing the Actions of Athletes in Sports.
Comput Intell Neurosci. 2022 Mar 24;2022:2663834. doi: 10.1155/2022/2663834. eCollection 2022.
4. Motion Recognition Based on Deep Learning and Human Joint Points.
Comput Intell Neurosci. 2022 May 10;2022:1826951. doi: 10.1155/2022/1826951. eCollection 2022.
5. Yoga Posture Recognition and Quantitative Evaluation with Wearable Sensors Based on Two-Stage Classifier and Prior Bayesian Network.
Sensors (Basel). 2019 Nov 23;19(23):5129. doi: 10.3390/s19235129.
6. Application of Human Posture Recognition Based on the Convolutional Neural Network in Physical Training Guidance.
Comput Intell Neurosci. 2022 Jun 28;2022:5277157. doi: 10.1155/2022/5277157. eCollection 2022.
7. Research on Athlete Behavior Recognition Technology in Sports Teaching Video Based on Deep Neural Network.
Comput Intell Neurosci. 2022 Jan 5;2022:7260894. doi: 10.1155/2022/7260894. eCollection 2022.
8. Research on Multiplayer Posture Estimation Technology of Sports Competition Video Based on Graph Neural Network Algorithm.
Comput Intell Neurosci. 2022 Apr 1;2022:4727375. doi: 10.1155/2022/4727375. eCollection 2022.
9. Sports Action Recognition Based on Deep Learning and Clustering Extraction Algorithm.
Comput Intell Neurosci. 2022 Mar 19;2022:4887470. doi: 10.1155/2022/4887470. eCollection 2022.
10. A Deep Learning Method for Intelligent Analysis of Sports Training Postures.
Comput Intell Neurosci. 2022 Jul 31;2022:2442606. doi: 10.1155/2022/2442606. eCollection 2022.

Cited by

1. Retracted: Aided Evaluation of Motion Action Based on Attitude Recognition.
J Healthc Eng. 2023 Nov 29;2023:9801367. doi: 10.1155/2023/9801367. eCollection 2023.

References

1. DeepID-Net: Deformable Deep Convolutional Neural Networks for Object Detection.
IEEE Trans Pattern Anal Mach Intell. 2017 Jul;39(7):1320-1334. doi: 10.1109/TPAMI.2016.2587642. Epub 2016 Jul 7.