Computer Vision Laboratory, School of Computer and Communication Sciences, EPFL, Lausanne, Switzerland.
Neuroengineering Laboratory, Brain Mind Institute & Interfaculty Institute of Bioengineering, School of Life Sciences, EPFL, Lausanne, Switzerland.
eLife. 2019 Oct 4;8:e48571. doi: 10.7554/eLife.48571.
Studying how neural circuits orchestrate limbed behaviors requires the precise measurement of the positions of each appendage in three-dimensional (3D) space. Deep neural networks can estimate two-dimensional (2D) pose in freely behaving and tethered animals. However, the unique challenges associated with transforming these 2D measurements into reliable and precise 3D poses have not been addressed for small animals, including the fly, Drosophila melanogaster. Here, we present DeepFly3D, software that infers the 3D pose of tethered, adult Drosophila using multiple camera images. DeepFly3D does not require manual calibration, uses pictorial structures to automatically detect and correct pose estimation errors, and uses active learning to iteratively improve performance. We demonstrate more accurate unsupervised behavioral embedding using 3D joint angles rather than the commonly used 2D pose data. Thus, DeepFly3D enables the automated acquisition of behavioral measurements at an unprecedented level of detail for a variety of biological applications.
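The full DeepFly3D pipeline (network-based 2D detection, automatic calibration, pictorial-structure error correction, and active learning) is beyond the scope of a short example, but the core geometric step the abstract describes, lifting corresponding 2D detections from several calibrated cameras to a 3D joint position, can be illustrated with standard direct linear transform (DLT) triangulation. The sketch below is purely illustrative and is not DeepFly3D code; the camera matrices, point values, and function names are hypothetical assumptions.

```python
# Minimal sketch: triangulating one 3D joint from multiple calibrated views
# via the direct linear transform (DLT). Illustrative only; not DeepFly3D code.
import numpy as np

def triangulate_dlt(proj_matrices, points_2d):
    """Triangulate one 3D point from N views.

    proj_matrices: list of N (3, 4) camera projection matrices P = K [R | t].
    points_2d:     list of N (x, y) pixel coordinates of the same joint.
    Returns the (3,) point minimizing the algebraic DLT error.
    """
    rows = []
    for P, (x, y) in zip(proj_matrices, points_2d):
        # Each view contributes two linear constraints on the homogeneous X.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # Solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Hypothetical example: three cameras (identity intrinsics) observing one joint.
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])             # reference camera
P1 = np.hstack([np.eye(3), np.array([[-0.1, 0, 0]]).T])   # shifted along x
P2 = np.hstack([np.eye(3), np.array([[0, -0.1, 0]]).T])   # shifted along y
X_true = np.array([0.02, 0.03, 1.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

pts = [project(P, X_true) for P in (P0, P1, P2)]
print(triangulate_dlt([P0, P1, P2], pts))  # ~ [0.02, 0.03, 1.0]
```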