
Session interest model for CTR prediction based on self-attention mechanism.

Affiliations

Shandong Women's University, Jinan, China.

Shandong Provincial Key Laboratory of Network Based Intelligent Computing, Jinan, China.

Publication Information

Sci Rep. 2022 Jan 7;12(1):252. doi: 10.1038/s41598-021-03871-y.

Abstract

Click-through rate (CTR) prediction, which aims to predict the probability that a user will click on an item, is critical to online advertising. How to capture the user's evolving interests from the user behavior sequence is an important issue in CTR prediction. However, most existing models ignore the fact that the sequence is composed of sessions: user behavior can be divided into different sessions according to when it occurs, and behaviors are highly correlated within each session but largely unrelated across sessions. We propose an effective model for CTR prediction, named the Session Interest Model via Self-Attention (SISA). First, we divide the user's sequential behavior into sessions. A self-attention mechanism with bias encoding is used to model each session. Since different session interests may be related to each other or follow a sequential pattern, we then use a gated recurrent unit (GRU) in the session interest extractor module to capture the interaction and evolution of the user's historical session interests. Finally, in the session interest interacting module, we use local activation and a GRU to aggregate the session interests with the target ad and form the final representation of the behavior sequence. Experimental results show that the SISA model performs better than other models.
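
As a concrete illustration of the pipeline the abstract describes, the sketch below shows a minimal SISA-style forward pass: intra-session self-attention with a bias term, a GRU that models the evolution of session interests, and a target-ad-conditioned aggregation. This is not the authors' implementation; the use of PyTorch, the layer sizes, the learned bias-encoding tensor, the mean-pooling of each session, and the softmax form of the local activation are all assumptions made for the example.

```python
# Minimal sketch of a SISA-style session interest pipeline (PyTorch).
# Reconstructed from the abstract only; layer sizes and the exact forms of
# bias encoding, session pooling, and local activation are assumptions.
import torch
import torch.nn as nn


class SessionInterestSketch(nn.Module):
    def __init__(self, embed_dim=32, num_heads=2, max_sessions=8, session_len=10):
        super().__init__()
        # Bias encoding: a learned offset per (session index, position) pair,
        # added to the item embeddings before self-attention (assumption).
        self.bias_encoding = nn.Parameter(
            torch.zeros(max_sessions, session_len, embed_dim))
        # Self-attention over the behaviors inside each session.
        self.session_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Session interest extractor: GRU over the per-session interest vectors.
        self.extractor_gru = nn.GRU(embed_dim, embed_dim, batch_first=True)
        # Session interest interacting module: local activation w.r.t. the target ad,
        # followed by another GRU (simplified here).
        self.interact_gru = nn.GRU(embed_dim, embed_dim, batch_first=True)

    def forward(self, sessions, target_ad):
        # sessions: (batch, num_sessions, session_len, embed_dim)
        # target_ad: (batch, embed_dim)
        b, s, l, d = sessions.shape
        x = sessions + self.bias_encoding[:s, :l]            # add bias encoding
        x = x.reshape(b * s, l, d)
        attn_out, _ = self.session_attn(x, x, x)              # intra-session self-attention
        session_interest = attn_out.mean(dim=1).reshape(b, s, d)  # pool each session
        # Capture interaction and evolution across historical session interests.
        evolved, _ = self.extractor_gru(session_interest)
        # Local activation: weight each session interest by similarity to the target ad.
        scores = torch.softmax((evolved * target_ad.unsqueeze(1)).sum(-1), dim=1)
        activated = evolved * scores.unsqueeze(-1)
        _, final_state = self.interact_gru(activated)
        # Final representation of the behavior sequence, conditioned on the target ad.
        return final_state.squeeze(0)


# Usage with random tensors; all shapes are illustrative.
model = SessionInterestSketch()
sessions = torch.randn(4, 8, 10, 32)   # 4 users, 8 sessions of 10 behaviors each
target_ad = torch.randn(4, 32)
print(model(sessions, target_ad).shape)  # torch.Size([4, 32])
```

The point of the sketch is the division of labor the abstract names: self-attention only ever sees behaviors from one session at a time, while the GRUs operate on one vector per session, so within-session correlation and cross-session evolution are modeled by separate components.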

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9730/8741903/586eda572668/41598_2021_3871_Fig1_HTML.jpg

Similar Articles

A CTR prediction model based on session interest.
PLoS One. 2022 Aug 17;17(8):e0273048. doi: 10.1371/journal.pone.0273048. eCollection 2022.

Retrieval-Based Factorization Machines for Human Click Behavior Prediction.
Comput Intell Neurosci. 2022 Nov 18;2022:1105048. doi: 10.1155/2022/1105048. eCollection 2022.

TMH: Two-Tower Multi-Head Attention neural network for CTR prediction.
PLoS One. 2024 Mar 15;19(3):e0295440. doi: 10.1371/journal.pone.0295440. eCollection 2024.

