
Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation

Authors

Song Zichen, Zhang Xiaoliang, Shi Zhaofeng

Affiliation

School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China.

Publication

Sensors (Basel). 2023 Sep 12;23(18):7820. doi: 10.3390/s23187820.

Abstract

Continual semantic segmentation (CSS) aims to learn new tasks sequentially, extracting objects and stuff represented by pixel-level maps of new categories while preserving the original segmentation capabilities even when the old-class data are absent. Current CSS methods typically preserve the capacity to segment old classes via knowledge distillation, which suffers from two limitations: insufficient utilization of semantic knowledge, i.e., distilling only the last layer of the feature encoder, and the semantic shift of the background caused by directly distilling the entire feature map of the decoder. In this paper, we propose a novel CSS method based on scale-hybrid distillation and knowledge disentangling to address these limitations. First, we propose a scale-hybrid group semantic distillation (SGD) method for the encoder, which transfers multi-scale knowledge from the old model's feature encoder with group pooling refinement to improve the stability of the new model. Then, a knowledge disentangling distillation (KDD) method for the decoder is proposed to distill feature maps under the guidance of the old-class regions and reduce incorrect guidance from the old model, towards better plasticity. Extensive experiments are conducted on the Pascal VOC and ADE20K datasets. Competitive performance compared with other state-of-the-art methods demonstrates the effectiveness of our proposed method.
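The two distillation terms described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`group_pool`, `sgd_loss`, `kdd_loss`), the choice of plain average pooling, the squared-error objective, and the set of pooling scales are all illustrative assumptions; features are modeled as simple H×W grids of floats rather than multi-channel tensors.

```python
# Hedged sketch of the two distillation losses from the abstract.
# All names and design details are illustrative assumptions.

def group_pool(feat, g):
    """Average-pool an HxW grid of floats into a g x g grid of group means
    (a stand-in for the paper's group pooling refinement)."""
    h, w = len(feat), len(feat[0])
    pooled = []
    for gi in range(g):
        row = []
        for gj in range(g):
            r0, r1 = gi * h // g, (gi + 1) * h // g
            c0, c1 = gj * w // g, (gj + 1) * w // g
            vals = [feat[r][c] for r in range(r0, r1) for c in range(c0, c1)]
            row.append(sum(vals) / len(vals))
        pooled.append(row)
    return pooled

def sgd_loss(old_feat, new_feat, scales=(1, 2, 4)):
    """Scale-hybrid group distillation: sum of mean-squared differences
    between group-pooled old/new encoder features over several scales,
    so both coarse and fine semantics of the old model are transferred."""
    total = 0.0
    for g in scales:
        po, pn = group_pool(old_feat, g), group_pool(new_feat, g)
        diffs = [(po[i][j] - pn[i][j]) ** 2 for i in range(g) for j in range(g)]
        total += sum(diffs) / len(diffs)
    return total

def kdd_loss(old_dec, new_dec, old_mask):
    """Knowledge-disentangling distillation: distill decoder features only
    at pixels the old model assigns to old classes (True in old_mask),
    leaving background/new-class pixels free for plasticity."""
    diffs = [
        (old_dec[r][c] - new_dec[r][c]) ** 2
        for r in range(len(old_mask))
        for c in range(len(old_mask[0]))
        if old_mask[r][c]
    ]
    return sum(diffs) / max(len(diffs), 1)
```

In a real system the total objective would combine the task loss on new classes with weighted `sgd_loss` and `kdd_loss` terms; the mask would come from the frozen old model's predictions, which is what disentangles old-class regions from the shifting background.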


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0c47/10537153/aedf5bd37fee/sensors-23-07820-g001.jpg
