DeepLabStream enables closed-loop behavioral experiments using deep learning-based markerless, real-time posture detection.

Affiliations

Functional Neuroconnectomics Group, Institute of Experimental Epileptology and Cognition Research, Medical Faculty, University of Bonn, Bonn, Germany.

Institute of Experimental Epileptology and Cognition Research, Medical Faculty, University of Bonn, Bonn, Germany.

Publication Information

Commun Biol. 2021 Jan 29;4(1):130. doi: 10.1038/s42003-021-01654-9.

Abstract

In general, animal behavior can be described as a neuronally driven sequence of reoccurring postures through time. Most currently available technologies focus on offline pose estimation with high spatiotemporal resolution. However, to correlate behavior with neuronal activity it is often necessary to detect and react to behavioral expressions online. Here we present DeepLabStream, a versatile closed-loop tool providing real-time pose estimation to deliver posture-dependent stimulations. DeepLabStream has a temporal resolution in the millisecond range, can utilize different input as well as output devices, and can be tailored to multiple experimental designs. We employ DeepLabStream to semi-autonomously run a second-order olfactory conditioning task with freely moving mice and to optogenetically label neuronal ensembles active during specific head directions.
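The head-direction-triggered labeling described in the abstract reduces to a simple closed-loop rule: estimate a posture feature from the tracked keypoints on each frame and fire an output device whenever that feature enters a target window. The sketch below illustrates that logic in plain Python under assumed inputs; the frame source, the `stimulate` callback, and all parameter names are hypothetical stand-ins for illustration, not DeepLabStream's actual API.

```python
import math
import time
from typing import Callable, Iterable, Tuple

Point = Tuple[float, float]

def head_direction_deg(neck: Point, nose: Point) -> float:
    """Head direction as the angle (degrees) of the neck->nose vector."""
    return math.degrees(math.atan2(nose[1] - neck[1], nose[0] - neck[0]))

def closed_loop(frames: Iterable[Tuple[Point, Point]],
                target_deg: float, tolerance_deg: float,
                stimulate: Callable[[], None]) -> None:
    """Trigger `stimulate` whenever the head direction computed from the
    per-frame (neck, nose) keypoints falls within target_deg +/- tolerance_deg."""
    for neck, nose in frames:
        angle = head_direction_deg(neck, nose)
        # wrap the angular difference into [-180, 180) before thresholding
        diff = (angle - target_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= tolerance_deg:
            t0 = time.perf_counter()
            stimulate()  # e.g. pulse a TTL line driving the laser (placeholder here)
            latency_ms = 1e3 * (time.perf_counter() - t0)
            print(f"trigger at {angle:.1f} deg, stimulation latency {latency_ms:.2f} ms")

if __name__ == "__main__":
    # toy frames: the animal slowly turns its head from 0 to 90 degrees
    toy_frames = [((0.0, 0.0),
                   (math.cos(math.radians(a)), math.sin(math.radians(a))))
                  for a in range(0, 91, 10)]
    closed_loop(toy_frames, target_deg=45.0, tolerance_deg=10.0,
                stimulate=lambda: None)  # placeholder output device
```

In a real closed-loop setup, `frames` would be fed by the real-time pose estimator and `stimulate` would address actual hardware, so the per-frame latency of this check (well under a millisecond) is dominated by camera acquisition and network inference time rather than by the trigger logic itself.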

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bc09/7846585/eccbf30b22ba/42003_2021_1654_Fig1_HTML.jpg
