Personalized multi-head self-attention network for news recommendation.

Author Information

Zheng Cong, Song Yixuan

Affiliations

Global Energy Interconnection Group Co., Ltd, NO.8 Xuanwumennei Street, Xicheng District, Beijing, PR China.

Baidu Online Network Technology (Beijing) Co., Ltd, Baidu Campus, No. 10 Shangdi 10th Street, Haidian District, Beijing, PR China.

Publication Information

Neural Netw. 2025 Jan;181:106824. doi: 10.1016/j.neunet.2024.106824. Epub 2024 Oct 22.

Abstract

With the rapid growth of online news and the user population, personalized news recommender systems have proved to be an effective way of alleviating information overload by suggesting content that matches users' tastes. Exploring the relationships among words and news is critical for structurally modeling users' latent tastes, including the domains they are interested in, while selecting informative words and news directly reflects users' explicit interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit interest points. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a common practice. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and among news items, exploring latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news, which enhances the interpretability of personalization. Comprehensive experiments on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy, without a complicated structure design or exhaustive consumption of external resources. Furthermore, a visualized case study validates that the attention mechanism indeed improves interpretability.
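The abstract describes two building blocks: multi-head self-attention that models interactions among words in a title (and, analogously, among the news items a user has clicked), and a personalized attention pooling whose query is derived from the user's ID embedding so that informative words and news are weighted per user. The PyTorch sketch below illustrates one way these pieces could be wired together at the word level; the class names, dimensions, and wiring are illustrative assumptions, not the authors' released PMSN implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PersonalizedAttentionPooling(nn.Module):
    """Pools a sequence into one vector, weighting items with a user-specific query."""

    def __init__(self, user_dim: int, item_dim: int, query_dim: int = 200):
        super().__init__()
        self.user_proj = nn.Linear(user_dim, query_dim)  # user ID embedding -> query
        self.item_proj = nn.Linear(item_dim, query_dim)  # word/news vector -> key

    def forward(self, items: torch.Tensor, user_emb: torch.Tensor) -> torch.Tensor:
        # items: (batch, seq_len, item_dim); user_emb: (batch, user_dim)
        query = torch.tanh(self.user_proj(user_emb)).unsqueeze(2)   # (B, Q, 1)
        keys = torch.tanh(self.item_proj(items))                    # (B, L, Q)
        scores = torch.bmm(keys, query).squeeze(2)                  # (B, L)
        weights = F.softmax(scores, dim=1).unsqueeze(1)             # (B, 1, L)
        return torch.bmm(weights, items).squeeze(1)                 # (B, item_dim)


class NewsEncoder(nn.Module):
    """Word level: self-attention over word embeddings, then personalized pooling."""

    def __init__(self, word_dim: int = 300, user_dim: int = 64, heads: int = 10):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(word_dim, heads, batch_first=True)
        self.pool = PersonalizedAttentionPooling(user_dim, word_dim)

    def forward(self, word_embs: torch.Tensor, user_emb: torch.Tensor) -> torch.Tensor:
        # word_embs: (batch, num_words, word_dim)
        ctx, _ = self.self_attn(word_embs, word_embs, word_embs)  # interactions among words
        return self.pool(ctx, user_emb)                           # one vector per news title


# Tiny smoke test with random tensors standing in for pretrained embeddings.
encoder = NewsEncoder()
titles = torch.randn(2, 20, 300)     # 2 titles, 20 words each
users = torch.randn(2, 64)           # 2 user ID embeddings
print(encoder(titles, users).shape)  # torch.Size([2, 300])
```

Following the abstract, a user encoder would repeat the same pattern one level up: self-attention over the vectors of the user's clicked news, followed by the same personalized pooling, before scoring candidate news against the resulting user vector.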
