Pang Ning, Zhao Xiang, Zeng Weixin, Tan Zhen, Xiao Weidong
IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9670-9683. doi: 10.1109/TNNLS.2024.3442236. Epub 2025 May 2.
Relation classification (RC) aims to detect the semantic relation between two annotated entities in a sentence, and serves as an essential task in automatic knowledge graph construction. Owing to the emergence of new relations, there is a recent trend toward training RC models in continual settings. To overcome the catastrophic forgetting problem in continual learning, existing research adopts a two-stage training paradigm: fast adaptation to novel relations, followed by memory replay over all historical relations. These memory-replay-based methods explore different techniques to mitigate forgetting in continual RC (CRC) models during the memory replay stage. However, we find that the representation space is distorted by the arrival of new relations during the fast adaptation phase. To address this issue, we propose a knowledge distillation strategy together with a margin loss, aiming to keep the RC model stable while it adapts to new relations. In the second stage, where only a limited number of typical memory instances are available, we introduce a self-contrastive learning objective to facilitate learning a balanced decision boundary for RC. Through these two training stages, our objective is to acquire a stable representation space for encoding instances in CRC. We experimentally demonstrate that our model outperforms competing methods across various settings, and the results suggest that our tailored designs achieve better performance in CRC.
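The abstract does not spell out the training objectives. As a rough illustration only, the stage-1 loss could combine standard cross-entropy with (i) a distillation term tying the adapted model's predictions to a frozen copy of the model from before the new relations arrived and (ii) a margin term separating the gold relation's score from its closest competitor. The sketch below is a minimal PyTorch rendering under those assumptions; the function name and the `alpha`, `temperature`, and `margin` parameters are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def adaptation_loss(student_logits, teacher_logits, labels,
                    temperature=2.0, margin=0.2, alpha=0.5):
    """Hypothetical stage-1 loss: cross-entropy on new-relation data,
    plus a distillation term against a frozen pre-adaptation teacher
    and a margin term separating the gold relation from the runner-up."""
    # Standard cross-entropy for fast adaptation to the new relations.
    ce = F.cross_entropy(student_logits, labels)

    # Knowledge distillation: match softened student and teacher
    # distributions to limit drift of the representation space.
    t = temperature
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits.detach() / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)

    # Margin loss: gold score should exceed the best competing
    # relation's score by at least `margin`.
    gold = student_logits.gather(1, labels.unsqueeze(1)).squeeze(1)
    masked = student_logits.scatter(1, labels.unsqueeze(1), float("-inf"))
    runner_up = masked.max(dim=1).values
    mg = F.relu(margin - (gold - runner_up)).mean()

    return ce + alpha * kd + mg
```

Freezing the teacher and distilling through softened logits is a standard way to penalize representation drift during adaptation, which matches the stability goal the abstract describes.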
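For the replay stage, one plausible reading of the "self-contrastive learning objective" is a supervised contrastive loss over the small set of replayed memory instances: embeddings of same-relation instances attract while all others repel, which helps balance the decision boundary when each relation is represented by only a few exemplars. The following sketch is an assumption about the loss's form, not the paper's exact objective; `temperature` is a hypothetical hyperparameter.

```python
import torch
import torch.nn.functional as F

def contrastive_replay_loss(embeddings, labels, temperature=0.1):
    """Hypothetical stage-2 objective: supervised contrastive loss over
    replayed memory instances; same-relation pairs act as positives."""
    z = F.normalize(embeddings, dim=-1)              # unit-norm features
    sim = z @ z.t() / temperature                    # pairwise similarity
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))  # drop self-pairs

    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(self_mask, 0.0)  # avoid -inf * 0 = NaN

    # Mean log-likelihood of positives per anchor; skip anchors whose
    # relation occurs only once in the replayed batch.
    pos_counts = pos.sum(dim=1)
    valid = pos_counts > 0
    loss = -(log_prob * pos).sum(dim=1)[valid] / pos_counts[valid]
    return loss.mean()
```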