Li Shengyang, Liu Kang, Wang Han, Yang Rong, Li Xuzhi, Sun Yeqing, Zhong Runtao, Wang Wei, Li Yan, Sun Yuanjie, Wang Gaohong
Technology and Engineering Center for Space Utilization, Chinese Academy of Sciences, Beijing, 100094, China.
Key Laboratory of Space Utilization, Chinese Academy of Sciences, Beijing, 100094, China.
Sci Data. 2025 May 10;12(1):766. doi: 10.1038/s41597-025-05111-8.
Non-contact behavioral study through intelligent image analysis is becoming increasingly vital in animal neuroscience and ethology. The shift from traditional "black box" methods to more open and intelligent approaches is driven by advances in deep learning-based pose estimation and tracking. These technologies enable the extraction of key points and their temporal relationships from image sequences. Such an approach is particularly crucial for investigating animal behavior in outer space, which is characterized by microgravity, high radiation, and a hypomagnetic field. However, the limited image data of space animals and the lack of publicly accessible datasets with ground-truth annotations have hindered the development of effective evaluation tools and methods. To address this challenge, we present the SpaceAnimal Dataset, the first multi-task, expert-validated dataset for multi-animal behavior analysis in complex scenarios, covering model organisms such as Caenorhabditis elegans, Drosophila, and zebrafish. Additionally, this paper provides evaluation code for deep learning models, establishing benchmarks to guide future research. This dataset will advance AI technology innovation in this field, contributing to the discovery of new behavioral patterns in space animals.
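The abstract mentions evaluation code and benchmarks for keypoint-based pose estimation, but the actual code is not reproduced here. As a minimal sketch of how such an evaluation is commonly scored, the snippet below implements a COCO-style object keypoint similarity (OKS) metric; the function name, the per-keypoint sigmas, and the use of bounding-box area as the instance scale are illustrative assumptions, not the dataset's released implementation.

```python
import numpy as np

def object_keypoint_similarity(pred, gt, visibility, sigmas, area):
    """COCO-style OKS between one predicted and one ground-truth pose.

    pred, gt:   (K, 2) arrays of keypoint (x, y) coordinates.
    visibility: (K,) array, > 0 where the ground-truth keypoint is labeled.
    sigmas:     (K,) per-keypoint falloff constants (species-specific in practice).
    area:       ground-truth instance scale, e.g. bounding-box area in pixels.
    """
    d2 = np.sum((pred - gt) ** 2, axis=-1)            # squared pixel distance per keypoint
    k = 2.0 * sigmas                                    # COCO convention: k_i = 2 * sigma_i
    denom = 2.0 * (k ** 2) * (area + np.spacing(1))     # variance term scaled by instance area
    ks = np.exp(-d2 / denom)                            # per-keypoint similarity in (0, 1]
    mask = visibility > 0
    return float(ks[mask].mean()) if mask.any() else 0.0

# Hypothetical usage on a 3-keypoint animal pose (values are made up):
pred = np.array([[10.0, 12.0], [20.5, 18.0], [31.0, 25.0]])
gt = np.array([[10.0, 11.0], [21.0, 18.0], [30.0, 26.0]])
vis = np.array([2, 2, 1])
sigmas = np.full(3, 0.05)
print(object_keypoint_similarity(pred, gt, vis, sigmas, area=400.0))
```

Averaging such per-instance scores over matched predictions, and thresholding them to compute average precision, is the standard way pose-estimation benchmarks report accuracy; tracking quality would typically be scored separately with identity-aware metrics.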