

Anti-drift pose tracker (ADPT), a transformer-based network for robust animal pose estimation cross-species.

Author information

Tang Guoling, Han Yaning, Sun Xing, Zhang Ruonan, Han Ming-Hu, Liu Quanying, Wei Pengfei

Affiliations

University of Chinese Academy of Sciences, Shenzhen, China.

University of Chinese Academy of Sciences, Beijing, China.

Publication information

eLife. 2025 May 6;13:RP95709. doi: 10.7554/eLife.95709.

DOI:10.7554/eLife.95709
PMID:40326557
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12055000/
Abstract

Deep learning-based methods have advanced animal pose estimation, enhancing accuracy and efficiency in quantifying animal behavior. However, these methods frequently experience tracking drift, where noise-induced jumps in body point estimates compromise reliability. Here, we present the anti-drift pose tracker (ADPT), a transformer-based tool that mitigates tracking drift in behavioral analysis. Extensive experiments across cross-species datasets, including proprietary mouse and monkey recordings and public macaque datasets, demonstrate that ADPT significantly reduces drift and surpasses existing models like DeepLabCut and SLEAP in accuracy. Moreover, ADPT achieved 93.16% identification accuracy for 10 unmarked mice and 90.36% accuracy for freely interacting unmarked mice, which can be further refined to 99.72%, enhancing both anti-drift performance and pose estimation accuracy in social interactions. With its end-to-end design, ADPT is computationally efficient and suitable for real-time analysis, offering a robust solution for reproducible animal behavior studies. The ADPT code is available at https://github.com/tangguoling/ADPT.
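The central problem the abstract names, tracking drift (noise-induced jumps in body-point estimates across frames), can be pictured with a toy example. The sketch below is not ADPT's transformer-based method; the function names, the 5-pixel threshold, and the trajectory values are invented purely to illustrate what a drift event looks like in a keypoint track and why it harms downstream behavioral analysis.

```python
# Toy illustration of "tracking drift": a smooth keypoint trajectory with a
# noise-induced jump, flagged by a simple frame-to-frame displacement check.
# This is NOT ADPT's approach -- just a minimal picture of the problem.

def detect_drift_frames(track, max_step=5.0):
    """Flag frames whose keypoint moved farther than max_step pixels
    from the previous frame (a hand-set threshold, assumed here)."""
    flagged = []
    for t in range(1, len(track)):
        (x0, y0), (x1, y1) = track[t - 1], track[t]
        step = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if step > max_step:
            flagged.append(t)
    return flagged

def interpolate_flagged(track, flagged):
    """Repair interior flagged frames by averaging their neighbors."""
    repaired = list(track)
    for t in flagged:
        if 0 < t < len(track) - 1:
            (xa, ya), (xb, yb) = repaired[t - 1], track[t + 1]
            repaired[t] = ((xa + xb) / 2.0, (ya + yb) / 2.0)
    return repaired

# A smooth nose trajectory with one noise-induced jump at frame 3.
track = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0), (40.0, 38.0), (4.0, 2.0)]
flagged = detect_drift_frames(track, max_step=5.0)
print(flagged)  # -> [3, 4]: the jump away and the jump back both exceed the threshold
print(interpolate_flagged(track, flagged)[3])  # -> (3.0, 1.5)
```

Such post-hoc filtering is fragile (the threshold must be tuned per setup, and fast genuine movements get flagged too), which is why an end-to-end model that avoids drift in the first place, as the abstract claims for ADPT, is attractive.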


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/e578352a6159/elife-95709-fig1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/0b6cd5a35165/elife-95709-fig2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/23218ab9a453/elife-95709-fig3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/8c02b06f37cc/elife-95709-fig4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/61ea0e750b23/elife-95709-fig5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/6324c08440a5/elife-95709-fig5-figsupp1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/1878674a0af0/elife-95709-fig6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/50435e07849e/elife-95709-fig7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f7f/12055000/2424992140c5/elife-95709-fig8.jpg

Similar articles

1
Anti-drift pose tracker (ADPT), a transformer-based network for robust animal pose estimation cross-species.
eLife. 2025 May 6;13:RP95709. doi: 10.7554/eLife.95709.
2
SLEAP: A deep learning system for multi-animal pose tracking.
Nat Methods. 2022 Apr;19(4):486-495. doi: 10.1038/s41592-022-01426-1. Epub 2022 Apr 4.
3
DeepLabCut: markerless pose estimation of user-defined body parts with deep learning.
Nat Neurosci. 2018 Sep;21(9):1281-1289. doi: 10.1038/s41593-018-0209-y. Epub 2018 Aug 20.
4
DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning.
eLife. 2019 Oct 1;8:e47994. doi: 10.7554/eLife.47994.
5
PMotion: an advanced markerless pose estimation approach based on novel deep learning framework used to reveal neurobehavior.
J Neural Eng. 2023 Jul 6;20(4). doi: 10.1088/1741-2552/acd603.
6
STPoseNet: A real-time spatiotemporal network model for robust mouse pose estimation.
iScience. 2024 Apr 18;27(5):109772. doi: 10.1016/j.isci.2024.109772. eCollection 2024 May 17.
7
Anipose: A toolkit for robust markerless 3D pose estimation.
Cell Rep. 2021 Sep 28;36(13):109730. doi: 10.1016/j.celrep.2021.109730.
8
Fast and Efficient Root Phenotyping via Pose Estimation.
Plant Phenomics. 2024 Apr 12;6:0175. doi: 10.34133/plantphenomics.0175. eCollection 2024.
9
Fast animal pose estimation using deep neural networks.
Nat Methods. 2019 Jan;16(1):117-125. doi: 10.1038/s41592-018-0234-5. Epub 2018 Dec 20.
10
Fast and efficient root phenotyping via pose estimation.
bioRxiv. 2023 Nov 21:2023.11.20.567949. doi: 10.1101/2023.11.20.567949.

Cited by

1
A Deep Learning Framework for Multi-Object Tracking in Space Animal Behavior Studies.
Animals (Basel). 2025 Aug 20;15(16):2448. doi: 10.3390/ani15162448.
2
A Study of Spontaneous Self-Injurious Behavior and Neuroimaging in Rhesus Macaques.
Research (Wash D C). 2025 Jul 31;8:0782. doi: 10.34133/research.0782. eCollection 2025.

References

1
Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics.
Nat Methods. 2024 Jul;21(7):1329-1339. doi: 10.1038/s41592-024-02318-2. Epub 2024 Jul 12.
2
Dissecting neural computations in the human auditory pathway using deep neural networks for speech.
Nat Neurosci. 2023 Dec;26(12):2213-2225. doi: 10.1038/s41593-023-01468-4. Epub 2023 Oct 30.
3
A high-performance neuroprosthesis for speech decoding and avatar control.
Nature. 2023 Aug;620(7976):1037-1046. doi: 10.1038/s41586-023-06443-4. Epub 2023 Aug 23.
4
Mapping the neuroethological signatures of pain, analgesia, and recovery in mice.
Neuron. 2023 Sep 20;111(18):2811-2830.e8. doi: 10.1016/j.neuron.2023.06.008. Epub 2023 Jul 12.
5
Non-human primate models and systems for gait and neurophysiological analysis.
Front Neurosci. 2023 Apr 28;17:1141567. doi: 10.3389/fnins.2023.1141567. eCollection 2023.
6
Learnable latent embeddings for joint behavioural and neural analysis.
Nature. 2023 May;617(7960):360-368. doi: 10.1038/s41586-023-06031-6. Epub 2023 May 3.
7
Hidden behavioral fingerprints in epilepsy.
Neuron. 2023 May 3;111(9):1440-1452.e5. doi: 10.1016/j.neuron.2023.02.003. Epub 2023 Feb 24.
8
Identifying behavioral structure from deep variational embeddings of animal motion.
Commun Biol. 2022 Nov 18;5(1):1267. doi: 10.1038/s42003-022-04080-7.
9
Automatic extraction of upper-limb kinematic activity using deep learning-based markerless tracking during deep brain stimulation implantation for Parkinson's disease: A proof of concept study.
PLoS One. 2022 Oct 20;17(10):e0275490. doi: 10.1371/journal.pone.0275490. eCollection 2022.
10
Estimation of skeletal kinematics in freely moving rodents.
Nat Methods. 2022 Nov;19(11):1500-1509. doi: 10.1038/s41592-022-01634-9. Epub 2022 Oct 17.