
Multi-view Teacher-Student Network.

Affiliations

School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, China; Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing 100190, China; Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing 100190, China.

School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China.

Publication Information

Neural Netw. 2022 Feb;146:69-84. doi: 10.1016/j.neunet.2021.11.002. Epub 2021 Nov 15.

DOI: 10.1016/j.neunet.2021.11.002
PMID: 34839092
Abstract

Multi-view learning aims to fully exploit the view-consistency and view-discrepancy for performance improvement. Knowledge Distillation (KD), characterized by the so-called "Teacher-Student" (T-S) learning framework, can transfer information learned from one model to another. Inspired by knowledge distillation, we propose a Multi-view Teacher-Student Network (MTS-Net), which combines knowledge distillation and multi-view learning into a unified framework. We first redefine the teacher and student for the multi-view case. Then the MTS-Net is built by optimizing both the view classification loss and the knowledge distillation loss in an end-to-end training manner. We further extend MTS-Net to image recognition tasks and present a multi-view Teacher-Student framework with convolutional neural networks called MTSCNN. To the best of our knowledge, MTS-Net and MTSCNN bring a new insight to extend the Teacher-Student framework to tackle the multi-view learning problem. We theoretically verify the mechanism of MTS-Net and MTSCNN and comprehensive experiments demonstrate the effectiveness of the proposed methods.
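The abstract states that MTS-Net is trained end to end by jointly optimizing a view classification loss and a knowledge distillation loss. The paper defines the exact per-view formulation; as a rough illustration of the generic soft-target distillation objective such frameworks build on (in the style of Hinton et al.), here is a minimal pure-Python sketch. The function names, the temperature `T`, and the `alpha` weighting are illustrative assumptions, not the authors' implementation.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(probs, label):
    # Hard-label classification loss.
    return -math.log(probs[label])

def kl_div(p, q):
    # KL(p || q): divergence of the student's soft output q from the teacher's p.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.5):
    # Combined objective: hard-label cross-entropy plus the temperature-scaled
    # KL term to the teacher's soft targets (scaled by T^2 so its gradient
    # magnitude stays comparable across temperatures).
    hard = cross_entropy(softmax(student_logits), label)
    soft = kl_div(softmax(teacher_logits, T), softmax(student_logits, T))
    return alpha * hard + (1 - alpha) * (T ** 2) * soft
```

In a multi-view setting, one would compute such a loss per view, with each view's model alternately acting as teacher or student; that multi-view weighting is precisely what the paper contributes beyond this generic two-model form.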

Similar Articles

1
Multi-view Teacher-Student Network.
Neural Netw. 2022 Feb;146:69-84. doi: 10.1016/j.neunet.2021.11.002. Epub 2021 Nov 15.
2
Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
Comput Biol Med. 2024 Jan;168:107764. doi: 10.1016/j.compbiomed.2023.107764. Epub 2023 Nov 30.
3
Learning Student Networks via Feature Embedding.
IEEE Trans Neural Netw Learn Syst. 2021 Jan;32(1):25-35. doi: 10.1109/TNNLS.2020.2970494. Epub 2021 Jan 4.
4
Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
IEEE Trans Med Imaging. 2022 Mar;41(3):621-632. doi: 10.1109/TMI.2021.3119385. Epub 2022 Mar 2.
5
MSKD: Structured knowledge distillation for efficient medical image segmentation.
Comput Biol Med. 2023 Sep;164:107284. doi: 10.1016/j.compbiomed.2023.107284. Epub 2023 Aug 2.
6
A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
Front Bioeng Biotechnol. 2024 Jan 15;11:1326706. doi: 10.3389/fbioe.2023.1326706. eCollection 2023.
7
Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks.
IEEE Trans Pattern Anal Mach Intell. 2022 Jun;44(6):3048-3068. doi: 10.1109/TPAMI.2021.3055564. Epub 2022 May 5.
8
Restructuring the Teacher and Student in Self-Distillation.
IEEE Trans Image Process. 2024;33:5551-5563. doi: 10.1109/TIP.2024.3463421. Epub 2024 Oct 4.
9
Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
IEEE Trans Pattern Anal Mach Intell. 2024 Jun;46(6):4188-4205. doi: 10.1109/TPAMI.2024.3354928. Epub 2024 May 7.
10
Self-knowledge distillation for surgical phase recognition.
Int J Comput Assist Radiol Surg. 2024 Jan;19(1):61-68. doi: 10.1007/s11548-023-02970-7. Epub 2023 Jun 20.

Cited By

1
SMILE: Semi-supervised multi-view classification based on dynamical fusion.
PLoS One. 2025 May 20;20(5):e0320831. doi: 10.1371/journal.pone.0320831. eCollection 2025.