StaRS: Learning a Stable Representation Space for Continual Relation Classification.

Authors

Pang Ning, Zhao Xiang, Zeng Weixin, Tan Zhen, Xiao Weidong

Publication

IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9670-9683. doi: 10.1109/TNNLS.2024.3442236. Epub 2025 May 2.

Abstract

Relation classification (RC) aims to detect the semantic relation between two annotated entities within a sentence, and serves as an essential task in automatic knowledge graph construction. Because new relations continually emerge, there is a recent trend toward training RC models in continual settings. To overcome the catastrophic forgetting problem in continual learning, existing research adopts a two-stage training paradigm: fast adaptation to novel relations, followed by memory replay over all historical relations. These memory-replay-based methods explore different techniques to mitigate the forgetting problem of continual RC (CRC) models during the memory replay stage. However, we find that the representation space is distorted by the arrival of new relations during the fast adaptation phase. To address this issue, we propose a knowledge distillation strategy and design a margin loss, aiming to maintain the stability of the RC model while it adapts to new relations. In addition, in the second stage, where only a limited number of typical memory instances are available, we introduce a self-contrastive learning objective to facilitate learning a balanced decision boundary for RC. Through this two-stage training, our objective is to acquire a stable representation space for encoding instances in CRC. We experimentally demonstrate the superiority of our model over competing methods in various settings, and the results suggest that our tailored designs achieve better performance in CRC.
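The abstract mentions two ingredients of the first (fast adaptation) stage: a knowledge distillation term that keeps the adapted model close to the previous model's predictions, and a margin loss over relation scores. The paper's exact formulations are not given in the abstract; the sketch below is a generic NumPy illustration of these two loss types, with all function names, temperatures, and margins chosen for illustration only.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between the softened teacher and student
    distributions; a common form of knowledge distillation that
    penalizes the new (student) model for drifting away from the
    old (teacher) model's predictions on historical relations."""
    p = softmax(teacher_logits, temperature)  # teacher (old model)
    q = softmax(student_logits, temperature)  # student (adapting model)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def margin_loss(score_pos, score_neg, margin=1.0):
    """Hinge-style margin loss: push the score of the correct
    relation above a competing relation's score by at least `margin`."""
    return max(0.0, margin - (score_pos - score_neg))
```

For example, `distillation_loss(x, x)` is zero (no drift), and `margin_loss(2.0, 0.5)` is zero once the correct relation already leads by more than the margin; both terms only activate when stability or separation is violated.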
