

Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation

Author Information

Jeong Yongseop, Park Jinsun, Cho Donghyeon, Hwang Yoonjin, Choi Seibum B, Kweon In So

Affiliations

The Robotics Program, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea.

School of Computer Science and Engineering, Pusan National University, 2 Busandaehak-ro 63beon-gil, Geumjeong-gu, Busan 46241, Korea.

Publication Information

Sensors (Basel). 2022 Sep 28;22(19):7388. doi: 10.3390/s22197388.

Abstract

Depth perception is one of the essential requirements for various autonomous driving platforms. However, accurate depth estimation in real-world settings remains challenging due to high computational costs. In this paper, we propose a lightweight depth completion network for depth perception in real-world environments. To effectively transfer a teacher's knowledge useful for depth completion, we introduce local similarity-preserving knowledge distillation (LSPKD), which allows similarities between local neighbors to be transferred during the distillation. With our LSPKD, a lightweight student network is precisely guided by a heavy teacher network, regardless of the density of the ground-truth data. Experimental results demonstrate that our method is effective in reducing computational costs during both training and inference stages while achieving superior performance over other lightweight networks.
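The core idea of the abstract can be illustrated with a minimal sketch: for each feature-map location, compute cosine similarities with its local neighbors, then penalize the mismatch between the teacher's and the student's similarity maps. This is an illustrative NumPy sketch, not the authors' implementation; the neighbor offsets, the wrap-around shift for border handling, and the L2 matching loss are all assumptions for demonstration.

```python
import numpy as np

def local_similarity_map(feat, offsets):
    """Cosine similarity between each location's feature vector and its
    shifted (local-neighbor) counterpart. feat has shape (C, H, W).
    Borders wrap around via np.roll, a simplification for this sketch."""
    norm = feat / (np.linalg.norm(feat, axis=0, keepdims=True) + 1e-8)
    sims = []
    for dy, dx in offsets:
        shifted = np.roll(norm, shift=(dy, dx), axis=(1, 2))
        sims.append((norm * shifted).sum(axis=0))  # (H, W) cosine map
    return np.stack(sims)  # (num_offsets, H, W)

def lspkd_loss(teacher_feat, student_feat,
               offsets=((0, 1), (1, 0), (0, -1), (-1, 0))):
    """Hypothetical similarity-preserving distillation loss: mean squared
    difference between teacher and student local-similarity maps."""
    t = local_similarity_map(teacher_feat, offsets)
    s = local_similarity_map(student_feat, offsets)
    return float(np.mean((t - s) ** 2))
```

Because the loss is defined on pairwise similarities rather than raw activations, it is insensitive to feature dimensionality, so a narrow student can be supervised by a wide teacher after projecting both to comparable spatial resolutions.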


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/29f0/9573132/e2d9dbcf43a9/sensors-22-07388-g001.jpg
