


Wide-angle, monocular head tracking using passive markers.

Affiliations

Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA.

Mind/Brain Institute, Johns Hopkins University, Baltimore, USA; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, USA; Mechanical Engineering Department, Johns Hopkins University, Baltimore, MD, USA.

Publication information

J Neurosci Methods. 2022 Feb 15;368:109453. doi: 10.1016/j.jneumeth.2021.109453. Epub 2021 Dec 27.

DOI: 10.1016/j.jneumeth.2021.109453
PMID: 34968626
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8857048/
Abstract

BACKGROUND

Camera images can encode large amounts of visual information of an animal and its environment, enabling high fidelity 3D reconstruction of the animal and its environment using computer vision methods. Most systems, both markerless (e.g. deep learning based) and marker-based, require multiple cameras to track features across multiple points of view to enable such 3D reconstruction. However, such systems can be expensive and are challenging to set up in small animal research apparatuses.

NEW METHODS

We present an open-source, marker-based system for tracking the head of a rodent for behavioral research that requires only a single camera with a potentially wide field of view. The system features a lightweight visual target and computer vision algorithms that together enable high-accuracy tracking of the six-degree-of-freedom position and orientation of the animal's head. The system, which only requires a single camera positioned above the behavioral arena, robustly reconstructs the pose over a wide range of head angles (360° in yaw, and approximately ± 120° in roll and pitch).
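The abstract does not spell out the pose-recovery algorithm, but the geometry it relies on can be sketched: with a calibrated overhead camera and a head target whose 3D marker layout is known, each detected marker pixel constrains the six pose parameters, and four or more 2D-3D correspondences suffice to solve for them (a perspective-n-point problem). Below is a minimal pure-Python sketch of the forward model only, with hypothetical marker coordinates, camera intrinsics (`f`, `cx`, `cy`), and pose values that are illustrative, not taken from the paper:

```python
import math

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix from yaw (Z), pitch (Y), roll (X), in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def project(point, R, t, f=600.0, cx=320.0, cy=240.0):
    """Pinhole projection of one marker point under head pose (R, t)."""
    X = sum(R[0][j] * point[j] for j in range(3)) + t[0]
    Y = sum(R[1][j] * point[j] for j in range(3)) + t[1]
    Z = sum(R[2][j] * point[j] for j in range(3)) + t[2]
    return (f * X / Z + cx, f * Y / Z + cy)

# Hypothetical target: four coplanar fiducial corners (mm) on the head target.
marker = [(-5, -5, 0), (5, -5, 0), (5, 5, 0), (-5, 5, 0)]
R = rot_zyx(math.radians(30), math.radians(10), math.radians(-5))
t = (10.0, -20.0, 300.0)  # camera roughly 300 mm above the arena
pixels = [project(p, R, t) for p in marker]
```

Inverting this model — recovering (R, t) from observed `pixels` — is what a PnP solver does; in practice one would use an existing implementation (e.g. OpenCV's `cv2.solvePnP`) rather than solving it by hand.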

RESULTS

Experiments with live animals demonstrate that the system can reliably identify rat head position and orientation. Evaluations using a commercial optical tracker device show that the system achieves accuracy that rivals commercial multi-camera systems.
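Comparing against a commercial optical tracker requires a pose-error metric. A common choice (an assumption here, not necessarily the metric the authors used) is the geodesic angle between the estimated and reference rotations, computed from the trace of their relative rotation:

```python
import math

def rot_z(a):
    """Rotation about the Z (yaw) axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def angular_error(R_est, R_ref):
    """Geodesic distance between two rotation matrices, in degrees.

    Uses trace(R_est^T R_ref) = sum of elementwise products, then
    theta = arccos((trace - 1) / 2).
    """
    tr = sum(R_est[i][j] * R_ref[i][j] for i in range(3) for j in range(3))
    c = max(-1.0, min(1.0, (tr - 1.0) / 2.0))
    return math.degrees(math.acos(c))

# e.g. a 2-degree yaw discrepancy between tracker estimate and reference
err = angular_error(rot_z(math.radians(30)), rot_z(math.radians(32)))
```

The clamp on `c` guards against floating-point values slightly outside [-1, 1] before `acos`.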

COMPARISON WITH EXISTING METHODS

Our solution significantly improves upon existing monocular marker-based tracking methods, both in accuracy and in allowable range of motion.

CONCLUSIONS

The proposed system enables the study of complex behaviors by providing robust, fine-scale measurements of rodent head motions in a wide range of orientations.


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/d9873d074e11/nihms-1768584-f0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/9878b68532cf/nihms-1768584-f0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/bca9449f3da4/nihms-1768584-f0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/2e8e06f41c89/nihms-1768584-f0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/b090324ccdde/nihms-1768584-f0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/f78ca6276f38/nihms-1768584-f0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/0d3cd740a9ec/nihms-1768584-f0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/79901efe2f8e/nihms-1768584-f0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/c7e9ffb3690d/nihms-1768584-f0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/13549764462e/nihms-1768584-f0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/be6ad46746b6/nihms-1768584-f0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b224/8857048/18f88947ed32/nihms-1768584-f0012.jpg

Similar articles

1. Spatial and rotational quality assurance of 6DOF patient tracking systems.
Med Phys. 2016 Jun;43(6):2785-2793. doi: 10.1118/1.4948506.
2. Fast, robust, and accurate monocular peer-to-peer tracking for surgical navigation.
Int J Comput Assist Radiol Surg. 2020 Mar;15(3):479-489. doi: 10.1007/s11548-019-02111-z. Epub 2020 Jan 16.
3. Markerless motion tracking of awake animals in positron emission tomography.
IEEE Trans Med Imaging. 2014 Nov;33(11):2180-90. doi: 10.1109/TMI.2014.2332821. Epub 2014 Jun 26.
4. SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.
Comput Methods Programs Biomed. 2018 May;158:135-146. doi: 10.1016/j.cmpb.2018.02.006. Epub 2018 Feb 8.
5. Unsupervised markerless 3-DOF motion tracking in real time using a single low-budget camera.
Int J Neural Syst. 2012 Oct;22(5):1250019. doi: 10.1142/S0129065712500190. Epub 2012 Aug 23.
6. A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents.
J Neurophysiol. 2019 Dec 1;122(6):2220-2242. doi: 10.1152/jn.00301.2019. Epub 2019 Sep 25.
7. Social Grouping for Multi-Target Tracking and Head Pose Estimation in Video.
IEEE Trans Pattern Anal Mach Intell. 2016 Oct;38(10):2082-95. doi: 10.1109/TPAMI.2015.2505292. Epub 2015 Dec 3.
8. Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System.
eNeuro. 2020 May 12;7(3). doi: 10.1523/ENEURO.0485-19.2020. Print 2020 May/Jun.
9. Visually Impaired Users can Locate and Grasp Objects Under the Guidance of Computer Vision and Non-Visual Feedback.
Annu Int Conf IEEE Eng Med Biol Soc. 2018 Jul;2018:1-4. doi: 10.1109/EMBC.2018.8512918.

Cited by

1. A Real-Time Approach for Assessing Rodent Engagement in a Nose-Poking Go/No-Go Behavioral Task Using ArUco Markers.
Bio Protoc. 2024 Nov 5;14(21):e5098. doi: 10.21769/BioProtoc.5098.
2. Control and recalibration of path integration in place cells using optic flow.
Nat Neurosci. 2024 Aug;27(8):1599-1608. doi: 10.1038/s41593-024-01681-9. Epub 2024 Jun 27.
3. Real-Time Assessment of Rodent Engagement Using ArUco Markers: A Scalable and Accessible Approach for Scoring Behavior in a Nose-Poking Go/No-Go Task.
eNeuro. 2024 Mar 5;11(3). doi: 10.1523/ENEURO.0500-23.2024. Print 2024 Mar.
4. Naturalistic neuroscience and virtual reality.
Front Syst Neurosci. 2022 Nov 17;16:896251. doi: 10.3389/fnsys.2022.896251. eCollection 2022.

References

1. The Dome: A virtual reality apparatus for freely locomoting rodents.
J Neurosci Methods. 2022 Feb 15;368:109336. doi: 10.1016/j.jneumeth.2021.109336. Epub 2021 Aug 26.
2. Real-Time Closed-Loop Feedback in Behavioral Time Scales Using DeepLabCut.
eNeuro. 2021 Apr 16;8(2). doi: 10.1523/ENEURO.0415-20.2021. Print 2021 Mar-Apr.
3. A Manufacturing-Oriented Intelligent Vision System Based on Deep Neural Network for Object Recognition and 6D Pose Estimation.
Front Neurorobot. 2021 Jan 7;14:616775. doi: 10.3389/fnbot.2020.616775. eCollection 2020.
4. Real-time, low-latency closed-loop feedback using markerless posture tracking.
Elife. 2020 Dec 8;9:e61909. doi: 10.7554/eLife.61909.
5. Real-Time Selective Markerless Tracking of Forepaws of Head Fixed Mice Using Deep Neural Networks.
eNeuro. 2020 Jun 15;7(3). doi: 10.1523/ENEURO.0096-20.2020. Print 2020 May/Jun.
6. Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System.
eNeuro. 2020 May 12;7(3). doi: 10.1523/ENEURO.0485-19.2020. Print 2020 May/Jun.
7. Tracking activity patterns of a multispecies community of gymnotiform weakly electric fish in their neotropical habitat without tagging.
J Exp Biol. 2020 Feb 10;223(Pt 3):jeb206342. doi: 10.1242/jeb.206342.
8. DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila.
Elife. 2019 Oct 4;8:e48571. doi: 10.7554/eLife.48571.
9. A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents.
J Neurophysiol. 2019 Dec 1;122(6):2220-2242. doi: 10.1152/jn.00301.2019. Epub 2019 Sep 25.
10. Using DeepLabCut for 3D markerless pose estimation across species and behaviors.
Nat Protoc. 2019 Jul;14(7):2152-2176. doi: 10.1038/s41596-019-0176-0. Epub 2019 Jun 21.