Stanley O R, Swaminathan A, Wojahn E, Ahmed Z M, Cullen K E
Dept. Biomedical Engineering; Johns Hopkins University.
Depts. Otorhinolaryngology-Head & Neck Surgery, Biochemistry & Molecular Biology, Ophthalmology; University of Maryland School of Medicine.
bioRxiv. 2023 May 30:2023.05.30.540066. doi: 10.1101/2023.05.30.540066.
Quantifying behavior and relating it to underlying biological states is of paramount importance in many life science fields. Although barriers to recording postural data have been reduced by progress in deep-learning-based computer vision tools for keypoint tracking, extracting specific behaviors from these data remains challenging. Manual behavior coding, the present gold standard, is labor-intensive and subject to intra- and inter-observer variability. Automatic methods are stymied by the difficulty of explicitly defining complex behaviors, even ones that appear obvious to the human eye. Here, we demonstrate an effective technique for detecting one such behavior, a form of locomotion characterized by stereotyped spinning, termed 'circling'. Though circling has an extensive history as a behavioral marker, at present there exists no standard automated detection method. Accordingly, we developed a technique that identifies instances of the behavior by applying simple postprocessing to markerless keypoint data from videos of freely exploring () mutant mice, a strain we previously found to exhibit circling. Our technique agrees with human consensus at the same level as individual observers do, and it achieves >90% accuracy in discriminating videos of wild-type mice from videos of mutants. Because using this technique requires no experience writing or modifying code, it also provides a convenient, noninvasive, quantitative tool for analyzing circling mouse models. Additionally, as our approach was agnostic to the underlying behavior, these results support the feasibility of algorithmically detecting specific, research-relevant behaviors using readily interpretable parameters tuned on the basis of human consensus.
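The abstract does not specify the postprocessing steps applied to the keypoint data; as a minimal sketch of the general idea, the Python snippet below flags candidate circling bouts by accumulating the body-axis heading angle computed from two tracked keypoints. The keypoint choice (nose and tail base), frame rate, window length, and turn threshold are illustrative assumptions, not the parameters tuned against human consensus in the study.

```python
# Hypothetical sketch: flag candidate "circling" bouts from 2-D keypoint tracks.
# Assumes two tracked keypoints per frame (e.g., nose and tail base) exported by
# a markerless tracker; all names and thresholds below are illustrative only.
import numpy as np

def detect_circling(nose_xy, tail_xy, fps=30.0,
                    min_turn_deg=360.0, max_bout_s=2.0):
    """Return (start, end) frame indices of candidate circling bouts.

    A bout is flagged when the body-axis heading accumulates at least
    `min_turn_deg` of rotation in one direction within `max_bout_s` seconds.
    """
    # Heading angle of the body axis in each frame (radians).
    heading = np.arctan2(nose_xy[:, 1] - tail_xy[:, 1],
                         nose_xy[:, 0] - tail_xy[:, 0])
    # Unwrap so consecutive turns accumulate instead of wrapping at +/- pi.
    heading = np.unwrap(heading)

    window = int(max_bout_s * fps)
    min_turn = np.deg2rad(min_turn_deg)
    bouts, i = [], 0
    while i + window <= len(heading):
        # Net signed rotation over the sliding window.
        turn = heading[i + window - 1] - heading[i]
        if abs(turn) >= min_turn:
            bouts.append((i, i + window - 1))
            i += window          # skip past the detected bout
        else:
            i += 1
    return bouts

# Example with synthetic data: 3 s of steady spinning at ~1 turn/s.
if __name__ == "__main__":
    fps, n = 30, 90
    angles = np.linspace(0, 3 * 2 * np.pi, n)
    nose = np.c_[np.cos(angles), np.sin(angles)]
    tail = np.zeros_like(nose)
    print(detect_circling(nose, tail, fps=fps))  # e.g., [(0, 59)]
```

In practice, such a detector would be preceded by the usual cleanup of tracker output (confidence filtering, interpolation of dropped keypoints) and its thresholds tuned against manually coded videos, consistent with the consensus-based parameter tuning the abstract describes.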