

Automatic Classification for the Type of Multiple Synapse Based on Deep Learning

Authors

Luo Jie, Hong Bei, Jiang Yi, Li Linlin, Xie Qiwei, Han Hua

Publication

Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:40-43. doi: 10.1109/EMBC.2019.8856509.

DOI: 10.1109/EMBC.2019.8856509
PMID: 31945840
Abstract

Recent studies have shown that the synaptic plasticity induced by development and learning can promote the formation of multiple synapse. With the rapid development of electron microscopy (EM) technology, we can closely observe the multiple synapse structure with high resolution. Although the multiple synapse has been widely researched by recent researchers, the classification accuracy for the type of multiple synapse has not been documented. In this paper, we propose an effective automatic classification method for the type of multiple synapse. The main steps are summarized as three parts: synaptic cleft segmentation, vesicle band segmentation, multiple synapse classification. The experiments on four datasets demonstrate that the proposed method can reach an average accuracy about 97%.
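The abstract's three-stage pipeline (synaptic cleft segmentation, vesicle band segmentation, multiple synapse classification) can be illustrated with a minimal sketch. Note the paper performs the two segmentation stages with deep networks; here the binary masks are assumed to be given, and only the downstream counting/classification step is shown. The function names and the labeling rule (count cleft components facing a vesicle band) are illustrative assumptions, not the authors' method.

```python
from collections import deque

def count_components(mask):
    """Count 4-connected foreground components in a binary grid (list of lists).

    Stand-in for the post-processing that would follow a learned segmenter.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                count += 1
                queue = deque([(i, j)])
                seen[i][j] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

def classify_synapse_type(cleft_mask, vesicle_mask):
    """Toy classification rule (an assumption, not the paper's CNN output):
    one presynaptic vesicle band facing several distinct clefts is labeled a
    multiple synapse; a single cleft is a single synapse."""
    n_clefts = count_components(cleft_mask)
    n_bands = count_components(vesicle_mask)
    if n_clefts <= 1:
        return "single"
    return "multiple" if n_bands == 1 else "separate"
```

In the actual method the two masks would come from segmentation networks run on the EM image; this sketch only shows how the per-structure counts could be combined into a type label.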


Similar Articles

1. Automatic Classification for the Type of Multiple Synapse Based on Deep Learning.
   Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:40-43. doi: 10.1109/EMBC.2019.8856509.
2. Effective automated pipeline for 3D reconstruction of synapses based on deep learning.
   BMC Bioinformatics. 2018 Jul 13;19(1):263. doi: 10.1186/s12859-018-2232-0.
3. Deep learning-based synapse counting and synaptic ultrastructure analysis of electron microscopy images.
   J Neurosci Methods. 2023 Jan 15;384:109750. doi: 10.1016/j.jneumeth.2022.109750. Epub 2022 Nov 19.
4. Fear memory-associated synaptic and mitochondrial changes revealed by deep learning-based processing of electron microscopy data.
   Cell Rep. 2022 Aug 2;40(5):111151. doi: 10.1016/j.celrep.2022.111151.
5. Sleep Deprivation by Exposure to Novel Objects Increases Synapse Density and Axon-Spine Interface in the Hippocampal CA1 Region of Adolescent Mice.
   J Neurosci. 2019 Aug 21;39(34):6613-6625. doi: 10.1523/JNEUROSCI.0380-19.2019. Epub 2019 Jul 1.
6. CleftNet: Augmented Deep Learning for Synaptic Cleft Detection From Brain Electron Microscopy.
   IEEE Trans Med Imaging. 2021 Dec;40(12):3507-3518. doi: 10.1109/TMI.2021.3089547. Epub 2021 Nov 30.
7. Specific plasticity of parallel fiber/Purkinje cell spine synapses by motor skill learning.
   Neuroreport. 2002 Sep 16;13(13):1607-10. doi: 10.1097/00001756-200209160-00007.
8. Fully-Automatic Synapse Prediction and Validation on a Large Data Set.
   Front Neural Circuits. 2018 Oct 29;12:87. doi: 10.3389/fncir.2018.00087. eCollection 2018.
9. Automatic classification and neurotransmitter prediction of synapses in electron microscopy.
   Biol Imaging. 2022 Jul 29;2:e6. doi: 10.1017/S2633903X2200006X. eCollection 2022.
10. Learning context cues for synapse segmentation.
    IEEE Trans Med Imaging. 2013 Oct;32(10):1864-77. doi: 10.1109/TMI.2013.2267747. Epub 2013 Jun 11.

Cited By

1. SynapseCLR: Uncovering features of synapses in primary visual cortex through contrastive representation learning.
   Patterns (N Y). 2023 Mar 7;4(4):100693. doi: 10.1016/j.patter.2023.100693. eCollection 2023 Apr 14.