School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China.
Tsinghua-Peking Center for Life Sciences, Beijing, China.
Elife. 2022 Jun 16;11:e76218. doi: 10.7554/eLife.76218.
Quickly and accurately characterizing animal behaviors is crucial for neuroscience research. Deep learning models are widely used in laboratories for behavior analysis. However, no end-to-end unsupervised neural network has yet been used to extract comprehensive and discriminative features directly from social behavior video frames for annotation and analysis. Here, we report Selfee, a self-supervised feature-extraction convolutional neural network with multiple downstream applications that processes video frames of animal behavior in an end-to-end way. Visualization and classification of the extracted features (Meta-representations) confirm that Selfee processes animal behaviors in a way similar to human perception. We demonstrate that Meta-representations can be used efficiently to detect anomalous behaviors that are indiscernible to human observation and to guide in-depth analysis. Furthermore, time-series analyses of Meta-representations reveal the temporal dynamics of animal behaviors. In conclusion, we present a self-supervised learning approach that extracts comprehensive and discriminative features directly from raw video recordings of animal behaviors, and we demonstrate its potential for various downstream applications.
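The abstract mentions using Meta-representations to flag anomalous behaviors. As a minimal sketch of how such a downstream step could look (this snippet is not from the paper; the function name `knn_anomaly_scores`, the embedding dimensions, and the k-nearest-neighbor distance criterion are illustrative assumptions), one can score each frame embedding by its distance to the most similar reference embeddings:

```python
import numpy as np

def knn_anomaly_scores(reference, queries, k=5):
    """Score each query embedding by the mean Euclidean distance to its
    k nearest neighbors in the reference set; larger scores suggest
    frames less similar to typical (e.g., wild-type) behavior."""
    # pairwise distances, shape (n_queries, n_reference)
    d = np.linalg.norm(queries[:, None, :] - reference[None, :, :], axis=-1)
    # keep the k smallest distances per query and average them
    knn = np.sort(d, axis=1)[:, :k]
    return knn.mean(axis=1)

# Synthetic stand-ins for frame embeddings (hypothetical 32-dim vectors)
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(200, 32))   # "typical" behavior pool
typical_q = rng.normal(0.0, 1.0, size=(5, 32))     # queries from same cluster
shifted_q = rng.normal(4.0, 1.0, size=(5, 32))     # displaced cluster = "anomalous"
scores = knn_anomaly_scores(reference, np.vstack([typical_q, shifted_q]))
```

Here the displaced queries receive clearly higher scores than the typical ones, so thresholding the score separates the two groups; the actual study may use a different distance or classifier.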