
A CTR prediction model based on session interest.

Affiliations

Shandong Women's University, Jinan, China.

Shandong Normal University, Jinan, China.

Publication

PLoS One. 2022 Aug 17;17(8):e0273048. doi: 10.1371/journal.pone.0273048. eCollection 2022.

DOI:10.1371/journal.pone.0273048
PMID:35976962
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9385038/
Abstract

Click-through rate (CTR) prediction has become a hot research direction in the field of advertising, and building an effective CTR prediction model is important. However, most existing models ignore the fact that a user's behavior sequence is composed of sessions: user behaviors are highly correlated within each session but only weakly related across sessions. In this paper, we focus on users' multiple session interests and propose a hierarchical model based on session interest (SIHM) for CTR prediction. First, we divide the user's sequential behavior into sessions. Then, we employ a self-attention network to obtain an accurate interest representation for each session. Since different session interests may be related to each other or follow a sequential pattern, we next utilize a bidirectional long short-term memory network (BLSTM) to capture the interactions among session interests. Finally, an attention-based LSTM (A-LSTM) aggregates the session interests with respect to the target ad to find the influence of each session interest. Experimental results show that the model performs better than other models.
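The pipeline the abstract describes (session division, per-session self-attention, target-aware aggregation) can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the dimensions, fixed session length, mean-pooling, and the single target-attention step that stands in for the BLSTM and A-LSTM stages are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8              # embedding dimension (assumed)
session_len = 5    # fixed session size used to split the sequence (assumed)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def session_interest(S):
    """Scaled dot-product self-attention over one session (T, d),
    mean-pooled into a single session-interest vector (d,)."""
    scores = softmax(S @ S.T / np.sqrt(d), axis=-1)   # (T, T)
    return (scores @ S).mean(axis=0)                  # (d,)

def aggregate(interests, ad):
    """Weight session interests by similarity to the target ad and
    sum them into one user representation (a stand-in for the
    BLSTM + A-LSTM stages of the full model)."""
    w = softmax(interests @ ad)                       # (num_sessions,)
    return w @ interests                              # (d,)

# 1) Session division: split a behavior sequence into fixed-size sessions.
behaviors = rng.normal(size=(20, d))                  # 20 behavior embeddings
sessions = behaviors.reshape(-1, session_len, d)      # 4 sessions of 5

# 2) Session interest extraction via self-attention.
interests = np.stack([session_interest(s) for s in sessions])  # (4, d)

# 3) Target-aware aggregation and a sigmoid click score.
ad = rng.normal(size=d)
user_vec = aggregate(interests, ad)
ctr = 1.0 / (1.0 + np.exp(-(user_vec @ ad)))
print(sessions.shape, interests.shape, float(ctr))
```

In the full SIHM, step 3 is replaced by a Bi-LSTM over the session-interest sequence followed by an attention-based LSTM conditioned on the target ad; the sketch only shows why a per-session representation plus target-aware weighting can separate interests that a flat sequence model would mix.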


Figures (g001–g006):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/34cb32d27dca/pone.0273048.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/39e9ffe9ee1b/pone.0273048.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/a06714877826/pone.0273048.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/0ceddaa21ee3/pone.0273048.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/9e0843ffce02/pone.0273048.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/99e5/9385038/625329557a82/pone.0273048.g006.jpg

Similar Articles

1. A CTR prediction model based on session interest.
PLoS One. 2022 Aug 17;17(8):e0273048. doi: 10.1371/journal.pone.0273048. eCollection 2022.
2. Session interest model for CTR prediction based on self-attention mechanism.
Sci Rep. 2022 Jan 7;12(1):252. doi: 10.1038/s41598-021-03871-y.
3. LSTM-TCN: dissolved oxygen prediction in aquaculture, based on combined model of long short-term memory network and temporal convolutional network.
Environ Sci Pollut Res Int. 2022 Jun;29(26):39545-39556. doi: 10.1007/s11356-022-18914-8. Epub 2022 Feb 1.
4. TMH: Two-Tower Multi-Head Attention neural network for CTR prediction.
PLoS One. 2024 Mar 15;19(3):e0295440. doi: 10.1371/journal.pone.0295440. eCollection 2024.
5. An improved advertising CTR prediction approach based on the fuzzy deep neural network.
PLoS One. 2018 May 4;13(5):e0190831. doi: 10.1371/journal.pone.0190831. eCollection 2018.
6. Disentangled self-attention neural network based on information sharing for click-through rate prediction.
PeerJ Comput Sci. 2024 Jan 2;10:e1764. doi: 10.7717/peerj-cs.1764. eCollection 2024.
7. Session Recommendation Model Based on Context-Aware and Gated Graph Neural Networks.
Comput Intell Neurosci. 2021 Oct 13;2021:7266960. doi: 10.1155/2021/7266960. eCollection 2021.
8. FEBDNN: fusion embedding-based deep neural network for user retweeting behavior prediction on social networks.
Neural Comput Appl. 2022;34(16):13219-13235. doi: 10.1007/s00521-022-07174-9. Epub 2022 Apr 6.
9. Personal Interest Attention Graph Neural Networks for Session-Based Recommendation.
Entropy (Basel). 2021 Nov 12;23(11):1500. doi: 10.3390/e23111500.
10. Prediction model of sparse autoencoder-based bidirectional LSTM for wastewater flow rate.
J Supercomput. 2023;79(4):4412-4435. doi: 10.1007/s11227-022-04827-3. Epub 2022 Sep 26.

Cited By

1. Advertisement design in dynamic interactive scenarios using DeepFM and long short-term memory (LSTM).
PeerJ Comput Sci. 2024 Mar 27;10:e1937. doi: 10.7717/peerj-cs.1937. eCollection 2024.

References

1. Personal Interest Attention Graph Neural Networks for Session-Based Recommendation.
Entropy (Basel). 2021 Nov 12;23(11):1500. doi: 10.3390/e23111500.
2. Meta-Wrapper: Differentiable Wrapping Operator for User Interest Selection in CTR Prediction.
IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):8449-8464. doi: 10.1109/TPAMI.2021.3103741. Epub 2022 Oct 4.
3. Attention in Natural Language Processing.
IEEE Trans Neural Netw Learn Syst. 2021 Oct;32(10):4291-4308. doi: 10.1109/TNNLS.2020.3019893. Epub 2021 Oct 5.
4. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
Neural Comput. 2019 Jul;31(7):1235-1270. doi: 10.1162/neco_a_01199. Epub 2019 May 21.
5. A New Approach for Advertising CTR Prediction Based on Deep Neural Network via Attention Mechanism.
Comput Math Methods Med. 2018 Sep 13;2018:8056541. doi: 10.1155/2018/8056541. eCollection 2018.
6. Describing Video With Attention-Based Bidirectional LSTM.
IEEE Trans Cybern. 2019 Jul;49(7):2631-2641. doi: 10.1109/TCYB.2018.2831447. Epub 2018 May 25.
7. Deep Learning for Computer Vision: A Brief Review.
Comput Intell Neurosci. 2018 Feb 1;2018:7068349. doi: 10.1155/2018/7068349. eCollection 2018.
8. A New Approach for Mobile Advertising Click-Through Rate Estimation Based on Deep Belief Nets.
Comput Intell Neurosci. 2017;2017:7259762. doi: 10.1155/2017/7259762. Epub 2017 Oct 25.
9. Long short-term memory.
Neural Comput. 1997 Nov 15;9(8):1735-80. doi: 10.1162/neco.1997.9.8.1735.