Department of Molecular, Cellular, and Developmental Biology, UC Santa Barbara, Santa Barbara, CA, USA.
Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA.
J Neurosci Methods. 2019 Oct 1;326:108352. doi: 10.1016/j.jneumeth.2019.108352. Epub 2019 Aug 12.
Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong, and, as in our experiments, fly grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify the organizational principles and temporal structure of such behaviors. To cope with large amounts of data, and to minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and therefore perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as the eyes or legs, and thus present challenges to existing behavior classification software. Human observers use the speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted toward temporal dynamics and invariant to the animal's position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect the two time scales at which the behavior is structured. As a proof of principle, we show results from the quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller set of human-annotated ethograms.
While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
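To illustrate the idea of alignment-invariant spatiotemporal features described in the abstract, the sketch below is a minimal, hypothetical example (not the published ABRS implementation): temporal dynamics are captured by frame differencing, the 2D Fourier magnitude spectrum discards absolute position, and averaging that spectrum over radial frequency bins discards orientation. Function name, bin count, and clip shape are all illustrative assumptions.

```python
import numpy as np

def spatiotemporal_features(frames, n_bins=16):
    """Sketch of position- and orientation-invariant features from a clip.

    frames: (T, H, W) grayscale array.
    - Frame differencing captures motion (temporal dynamics).
    - The FFT magnitude spectrum is invariant to (circular) translation.
    - Radial binning of the spectrum yields rotation-invariant summaries.
    """
    # Motion energy: absolute frame-to-frame differences, averaged over time.
    diffs = np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=0)
    # Translation-invariant representation: centered magnitude spectrum.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(diffs)))
    # Rotation invariance: average the spectrum over annular frequency bins.
    h, w = spec.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)
    edges = np.linspace(0, r.max() + 1e-9, n_bins + 1)
    return np.array([spec[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

# Toy usage: a random 5-frame clip; a circularly shifted copy of the clip
# yields (numerically) the same feature vector, showing position invariance.
clip = np.random.rand(5, 32, 32)
feats = spatiotemporal_features(clip)
feats_shifted = spatiotemporal_features(np.roll(clip, 7, axis=2))
```

Feature vectors of this kind could then be fed to an ordinary supervised classifier at each of the two time scales mentioned in the abstract; the choice of classifier is independent of the feature construction.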