

Automated gap-filling for marker-based biomechanical motion capture data.

Affiliations

George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA, USA.

Department of Computing and Mathematical Sciences, California Institute of Technology, Pasadena, CA, USA.

Publication

Comput Methods Biomech Biomed Engin. 2020 Nov;23(15):1180-1189. doi: 10.1080/10255842.2020.1789971. Epub 2020 Jul 11.

DOI: 10.1080/10255842.2020.1789971
PMID: 32654510
Abstract

Marker-based motion capture presents the problem of gaps, which are traditionally processed using motion capture software, requiring intensive manual input. We propose and study an automated method of gap-filling that uses inverse kinematics (IK) to close the loop of an iterative process to minimize error, while nearly eliminating user input. Comparing our method to manual gap-filling, we observe a 21% reduction in the worst-case gap-filling error (p < 0.05), and an 80% reduction in completion time (p < 0.01). Our contribution encompasses the release of an open-source repository of the method and interaction with OpenSim.

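The abstract describes an iterative loop: seed the gap by interpolation, fit a kinematic model with IK, regenerate the missing markers from the fitted model, and repeat to drive down the error. The sketch below is a minimal toy of that loop structure only, assuming a planar one-link model with a single marker; the function names and the model are invented for illustration and this is not the authors' OpenSim pipeline.

```python
import numpy as np

def fk(theta, length=1.0):
    # Forward kinematics of a planar one-link model:
    # marker position at the end of the link for joint angle theta.
    return np.stack([length * np.cos(theta), length * np.sin(theta)], axis=-1)

def ik(markers):
    # Inverse kinematics: recover the joint angle that best explains
    # each marker frame (closed form for this one-link model).
    return np.arctan2(markers[:, 1], markers[:, 0])

def fill_gap(markers, gap, n_iter=3):
    # Iterative gap-filling: seed the gap with linear interpolation,
    # then repeatedly project the trajectory onto the model (IK then FK).
    filled = markers.copy()
    lo, hi = gap
    frames = np.arange(lo, hi)
    for d in range(filled.shape[1]):
        filled[lo:hi, d] = np.interp(frames, [lo - 1, hi],
                                     [markers[lo - 1, d], markers[hi, d]])
    for _ in range(n_iter):
        theta = ik(filled)                # fit the model to all frames
        filled[lo:hi] = fk(theta[lo:hi])  # model-consistent fill
    return filled

# Synthetic trajectory: a marker sweeping a 90-degree arc, with a dropout.
t = np.linspace(0.0, np.pi / 2, 50)
truth = fk(t)
observed = truth.copy()
observed[20:30] = np.nan                  # frames 20-29 are missing
filled = fill_gap(observed, (20, 30))
err = float(np.abs(filled[20:30] - truth[20:30]).max())
```

In this toy the IK/FK projection enforces the rigid-link constraint, so the fill lands on the arc rather than the interpolation chord; in the paper the same role is played by a full OpenSim body model.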

Similar articles

1. Automated gap-filling for marker-based biomechanical motion capture data.
   Comput Methods Biomech Biomed Engin. 2020 Nov;23(15):1180-1189. doi: 10.1080/10255842.2020.1789971. Epub 2020 Jul 11.
2. Open-source software library for real-time inertial measurement unit data-based inverse kinematics using OpenSim.
   PeerJ. 2023 Apr 5;11:e15097. doi: 10.7717/peerj.15097. eCollection 2023.
3. A model-based motion capture marker location refinement approach using inverse kinematics from dynamic trials.
   Int J Numer Method Biomed Eng. 2020 Jan;36(1):e3283. doi: 10.1002/cnm.3283. Epub 2019 Nov 19.
4. The development and evaluation of a fully automated markerless motion capture workflow.
   J Biomech. 2022 Nov;144:111338. doi: 10.1016/j.jbiomech.2022.111338. Epub 2022 Oct 2.
5. AddBiomechanics: Automating model scaling, inverse kinematics, and inverse dynamics from human motion data through sequential optimization.
   PLoS One. 2023 Nov 30;18(11):e0295152. doi: 10.1371/journal.pone.0295152. eCollection 2023.
6. Assessment of a markerless motion analysis system for manual wheelchair application.
   J Neuroeng Rehabil. 2018 Nov 6;15(1):96. doi: 10.1186/s12984-018-0444-1.
7. Method for Using IMU-Based Experimental Motion Data in BVH Format for Musculoskeletal Simulations via OpenSim.
   Sensors (Basel). 2023 Jun 8;23(12):5423. doi: 10.3390/s23125423.
8. Gap Reconstruction in Optical Motion Capture Sequences Using Neural Networks.
   Sensors (Basel). 2021 Sep 12;21(18):6115. doi: 10.3390/s21186115.
9. Bayesian inverse kinematics vs. least-squares inverse kinematics in estimates of planar postures and rotations in the absence of soft tissue artifact.
   J Biomech. 2019 Jan 3;82:324-329. doi: 10.1016/j.jbiomech.2018.11.007. Epub 2018 Nov 15.
10. Optimizing Trajectories and Inverse Kinematics for Biomechanical Analysis of Markerless Motion Capture Data.
   IEEE Int Conf Rehabil Robot. 2023 Sep;2023:1-6. doi: 10.1109/ICORR58425.2023.10304683.

Cited by

1. Sensor Fusion for Enhancing Motion Capture: Integrating Optical and Inertial Motion Capture Systems.
   Sensors (Basel). 2025 Jul 29;25(15):4680. doi: 10.3390/s25154680.
2. SmartDetector: Automatic and vision-based approach to point-light display generation for human action perception.
   Behav Res Methods. 2024 Dec;56(8):8349-8361. doi: 10.3758/s13428-024-02478-1. Epub 2024 Aug 13.
3. Deep-Learning-Based Recovery of Missing Optical Marker Trajectories in 3D Motion Capture Systems.
   Bioengineering (Basel). 2024 Jun 1;11(6):560. doi: 10.3390/bioengineering11060560.
4. A human lower-limb biomechanics and wearable sensors dataset during cyclic and non-cyclic activities.
   Sci Data. 2023 Dec 21;10(1):924. doi: 10.1038/s41597-023-02840-6.
5. Linking whole-body angular momentum and step placement during perturbed human walking.
   J Exp Biol. 2023 Mar 15;226(6). doi: 10.1242/jeb.244760. Epub 2023 Mar 29.
6. Impaired foot placement strategy during walking in people with incomplete spinal cord injury.
   J Neuroeng Rehabil. 2022 Dec 5;19(1):134. doi: 10.1186/s12984-022-01117-0.
7. Evaluating the integration of eye-tracking and motion capture technologies: Quantifying the accuracy and precision of gaze measures.
   Iperception. 2022 Sep 26;13(5):20416695221116652. doi: 10.1177/20416695221116652. eCollection 2022 Sep-Oct.
8. Detection and Classification of Artifact Distortions in Optical Motion Capture Sequences.
   Sensors (Basel). 2022 May 27;22(11):4076. doi: 10.3390/s22114076.
9. A Guide to Inverse Kinematic Marker-Guided Rotoscoping Using IK Solvers.
   Integr Org Biol. 2022 Jan 27;4(1):obac002. doi: 10.1093/iob/obac002. eCollection 2022.
10. A Deep Learning Approach for Foot Trajectory Estimation in Gait Analysis Using Inertial Sensors.
   Sensors (Basel). 2021 Nov 12;21(22):7517. doi: 10.3390/s21227517.