
DeepWild: Application of the pose estimation tool DeepLabCut for behaviour tracking in wild chimpanzees and bonobos.

Affiliations

Wild Minds Lab, School of Psychology and Neuroscience, University of St Andrews, St Andrews, UK.

Department of Pedagogy, Chubu Gakuin University, Gifu, Japan.

Publication

J Anim Ecol. 2023 Aug;92(8):1560-1574. doi: 10.1111/1365-2656.13932. Epub 2023 May 10.

Abstract

Studying animal behaviour allows us to understand how different species and individuals navigate their physical and social worlds. Video coding of behaviour is considered a gold standard: it allows researchers to extract rich, nuanced behavioural datasets, validate their reliability, and have their research replicated. However, in practice, videos are only useful if data can be efficiently extracted. Manually locating relevant footage in 10,000s of hours is extremely time-consuming, as is the manual coding of animal behaviour, which requires extensive training to achieve reliability. Machine learning approaches are used to automate the recognition of patterns within data, considerably reducing the time taken to extract data and improving reliability. However, tracking visual information to recognise nuanced behaviour is a challenging problem and, to date, the tracking and pose-estimation tools used to detect behaviour are typically applied where the visual environment is highly controlled. Animal behaviour researchers are interested in applying these tools to the study of wild animals, but it is not clear to what extent doing so is currently possible, or which tools are most suited to particular problems. To address this gap in knowledge, we describe the new tools available in this rapidly evolving landscape, suggest guidance for tool selection, provide a worked demonstration of the use of machine learning to track movement in video data of wild apes, and make our base models available for use. We use a pose-estimation tool, DeepLabCut, to demonstrate successful training of two pilot models on an extremely challenging pose-estimation and tracking problem: multi-animal tracking of wild forest-living chimpanzees and bonobos across behavioural contexts from hand-held video footage. With DeepWild we show that, without requiring specific expertise in machine learning, pose estimation and movement tracking of free-living wild primates in visually complex environments is an attainable goal for behavioural researchers.
