

Comparative Convolutional Dynamic Multi-Attention Recommendation Model.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3510-3521. doi: 10.1109/TNNLS.2021.3053245. Epub 2022 Aug 3.

DOI: 10.1109/TNNLS.2021.3053245
PMID: 33556019
Abstract

Recently, an attention mechanism has been used to help recommender systems grasp user interests more accurately. It focuses on their pivotal interests from a psychology perspective. However, most current studies based on it only focus on part of user interests; they have not mined user preferences thoroughly. To address the above problem, we propose a novel recommendation model: comparative convolutional dynamic multi-attention (CCDMA). This model provides a more accurate approach to represent user and item features and uses multi-attention-based convolutional neural networks to extract user and item latent feature vectors dynamically. The multi-attention mechanism considers both self-attention and cross-attention. Self-attention refers to the internal attention within users and items; cross-attention is the mutual attention between users and items. Moreover, we propose an optimized comparative learning framework that can mine the ternary relationships between one user and a pair of items, focusing on their relative relationship and the internal link between a pair of items. Extensive experiments on several real-world data sets show that the CCDMA model significantly outperforms state-of-the-art baselines in terms of different evaluation metrics.
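The two ingredients the abstract names — multi-attention (self-attention within users/items plus cross-attention between them) and a comparative loss over the ternary (user, item, item) relationship — can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' CCDMA implementation: the fusion by simple addition, the dot-product scoring, and the BPR-style logistic form of the comparative loss are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Internal attention within one feature matrix (users or items).
    # X: (n, d) feature rows; returns attention-weighted features, shape (n, d).
    scores = X @ X.T / np.sqrt(X.shape[1])
    return softmax(scores) @ X

def cross_attention(U, V):
    # Mutual attention between user features U (n, d) and item features V (m, d):
    # each user vector is re-expressed as a weighted mix of item vectors.
    scores = U @ V.T / np.sqrt(U.shape[1])
    return softmax(scores) @ V

def pairwise_comparative_loss(u, v_pos, v_neg):
    # BPR-style loss over the ternary (user, preferred item, less-preferred item):
    # minimized when score(u, v_pos) exceeds score(u, v_neg).
    diff = u @ v_pos - u @ v_neg
    return -np.log(1.0 / (1.0 + np.exp(-diff)))

rng = np.random.default_rng(0)
U = rng.normal(size=(4, 8))   # 4 users, 8-dim latent features (toy data)
V = rng.normal(size=(6, 8))   # 6 items

# Fuse the two attention views (additive fusion is an assumption here).
U_att = self_attention(U) + cross_attention(U, V)
loss = pairwise_comparative_loss(U_att[0], V[1], V[2])
print(U_att.shape, float(loss))
```

In the paper the latent vectors come from multi-attention-based convolutional networks rather than raw features, and the comparative framework also models the internal link between the item pair; the sketch only shows the relative-preference term.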


Similar Articles

1
Comparative Convolutional Dynamic Multi-Attention Recommendation Model.
IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3510-3521. doi: 10.1109/TNNLS.2021.3053245. Epub 2022 Aug 3.
2
FIRE: knowledge-enhanced recommendation with feature interaction and intent-aware attention networks.
Appl Intell (Dordr). 2022 Dec 7:1-21. doi: 10.1007/s10489-022-04300-x.
3
Neural Time-Aware Sequential Recommendation by Jointly Modeling Preference Dynamics and Explicit Feature Couplings.
IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5125-5137. doi: 10.1109/TNNLS.2021.3069058. Epub 2022 Oct 5.
4
An Autoencoder Framework With Attention Mechanism for Cross-Domain Recommendation.
IEEE Trans Cybern. 2022 Jun;52(6):5229-5241. doi: 10.1109/TCYB.2020.3029002. Epub 2022 Jun 16.
5
Multi-Aspect enhanced Graph Neural Networks for recommendation.
Neural Netw. 2023 Jan;157:90-102. doi: 10.1016/j.neunet.2022.10.001. Epub 2022 Oct 14.
6
A dynamic graph Hawkes process based on linear complexity self-attention for dynamic recommender systems.
PeerJ Comput Sci. 2023 May 9;9:e1368. doi: 10.7717/peerj-cs.1368. eCollection 2023.
7
Dynamic and Static Features-Aware Recommendation with Graph Neural Networks.
Comput Intell Neurosci. 2022 Apr 21;2022:5484119. doi: 10.1155/2022/5484119. eCollection 2022.
8
Adaptive Deep Modeling of Users and Items Using Side Information for Recommendation.
IEEE Trans Neural Netw Learn Syst. 2020 Mar;31(3):737-748. doi: 10.1109/TNNLS.2019.2909432. Epub 2019 Jun 12.
9
TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism.
Comput Intell Neurosci. 2022 Aug 29;2022:1775496. doi: 10.1155/2022/1775496. eCollection 2022.
10
NeuO: Exploiting the sentimental bias between ratings and reviews with neural networks.
Neural Netw. 2019 Mar;111:77-88. doi: 10.1016/j.neunet.2018.12.011. Epub 2019 Jan 8.