Neuroscience Graduate Program, University of Washington, Seattle, WA, USA.
Department of Physiology and Biophysics, University of Washington, Seattle, WA, USA.
Cell Rep. 2021 Sep 28;36(13):109730. doi: 10.1016/j.celrep.2021.109730.
Quantifying movement is critical for understanding animal behavior. Advances in computer vision now enable markerless tracking from 2D video, but most animals move in 3D. Here, we introduce Anipose, an open-source toolkit for robust markerless 3D pose estimation. Anipose is built on the 2D tracking method DeepLabCut, so users can expand their existing experimental setups to obtain accurate 3D tracking. It consists of four components: (1) a 3D calibration module, (2) filters to resolve 2D tracking errors, (3) a triangulation module that integrates temporal and spatial regularization, and (4) a pipeline to structure processing of large numbers of videos. We evaluate Anipose on a calibration board as well as mice, flies, and humans. By analyzing 3D leg kinematics tracked with Anipose, we identify a key role for joint rotation in motor control of fly walking. To help users get started with 3D tracking, we provide tutorials and documentation at http://anipose.org/.
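The core operation described in component (3) is multi-view triangulation: combining 2D keypoint detections from calibrated cameras into a 3D position. The sketch below is not Anipose's API; it is a minimal illustration of the underlying direct linear transform (DLT) triangulation using OpenCV, with hypothetical camera matrices and keypoint coordinates standing in for real calibration and tracking output.

```python
# Minimal sketch of two-camera triangulation, the operation at the heart of
# markerless 3D pose estimation. Not Anipose's implementation; all values
# below are hypothetical placeholders for illustration.
import numpy as np
import cv2

# Hypothetical shared intrinsics (focal length and principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical extrinsics: camera 1 at the origin, camera 2 offset along x.
R = np.eye(3)
t1 = np.zeros((3, 1))
t2 = np.array([[-0.1], [0.0], [0.0]])

# 3x4 projection matrices P = K [R | t], normally produced by calibration.
P1 = K @ np.hstack([R, t1])
P2 = K @ np.hstack([R, t2])

# Hypothetical 2D detections of one keypoint (e.g., from DeepLabCut),
# shaped (2, N) as cv2.triangulatePoints expects.
pts1 = np.array([[330.0], [250.0]])
pts2 = np.array([[310.0], [250.0]])

# DLT triangulation returns homogeneous 4D points; divide by w to get 3D.
points_4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
points_3d = (points_4d[:3] / points_4d[3]).T
print(points_3d)  # one 3D coordinate per tracked keypoint
```

In practice, the same triangulation is applied per keypoint and per frame across more than two cameras, and the abstract notes that Anipose additionally applies 2D filtering and spatiotemporal regularization, which this bare-bones sketch omits.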