School of Computing, Mathematics and Digital Technology, Manchester Metropolitan University, Manchester, UK; School of Biology & Conservation Ecology, Manchester Metropolitan University, Manchester, UK.
School of Computing, Mathematics and Digital Technology, Manchester Metropolitan University, Manchester, UK.
J Neurosci Methods. 2018 Apr 15;300:147-156. doi: 10.1016/j.jneumeth.2017.04.006. Epub 2017 Apr 13.
Generating quantitative metrics of rodent locomotion and general behaviours from video footage is important in behavioural neuroscience studies. However, there is not yet a free software system that can process large amounts of video data with minimal user intervention.
Here we propose a new, automated rodent tracker (ART) that uses a simple rule-based system to quickly and robustly track rodent nose and body points, with minimal user input. Tracked points can then be used to identify behaviours, approximate body size and provide locomotion metrics, such as speed and distance.
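As a minimal illustration of how tracked points yield the locomotion metrics mentioned above, the sketch below computes total distance and mean speed from a sequence of (x, y) body-point coordinates. The function name, units, and frame rate are assumptions for illustration, not part of the ART implementation.

```python
import math

def locomotion_metrics(points, fps):
    """Compute total distance (mm) and mean speed (mm/s) from tracked
    (x, y) body-point coordinates in mm, sampled at `fps` frames/second.
    Hypothetical helper for illustration; not taken from the ART codebase."""
    # Euclidean displacement between each pair of consecutive frames
    steps = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    total_distance = sum(steps)                     # mm
    duration = (len(points) - 1) / fps              # seconds
    mean_speed = total_distance / duration if duration > 0 else 0.0
    return total_distance, mean_speed

# Example: a point moving 1 mm per frame at 25 fps for 1 second
pts = [(float(i), 0.0) for i in range(26)]
dist_mm, speed = locomotion_metrics(pts, fps=25)   # 25.0 mm, 25.0 mm/s
```

Per-frame displacements could also be smoothed or thresholded before summing to suppress tracking jitter, at the cost of slightly underestimating true path length.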
ART was demonstrated here on video recordings of a SOD1 mouse model of amyotrophic lateral sclerosis, aged 30, 60, 90 and 120 days. Results showed a robust decline in locomotion speeds, as well as a reduction in object exploration and forward movement, with an increase in time spent still. Body size approximations (centroid width) showed a significant decrease from P30.
COMPARISON WITH EXISTING METHOD(S): ART performed with accuracy very similar to manual tracking and Ethovision (a commercially available alternative), with average differences in coordinate points of 0.6 and 0.8 mm, respectively. However, it required far less user intervention than Ethovision (6 as opposed to 30 mouse clicks) and worked robustly over more videos.
ART provides an open-source option for behavioural analysis of rodents, performing to the same standard as commercially available software. It can be considered a validated and accessible alternative for researchers for whom non-invasive quantification of natural rodent behaviour is desirable.