
GRASS: Learning Spatial-Temporal Properties From Chainlike Cascade Data for Microscopic Diffusion Prediction.

Author Information

Li Huacheng, Xia Chunhe, Wang Tianbo, Wang Zhao, Cui Peng, Li Xiaojian

Publication Information

IEEE Trans Neural Netw Learn Syst. 2023 Jul 19;PP. doi: 10.1109/TNNLS.2023.3293689.

Abstract

Information diffusion prediction captures diffusion dynamics of online messages in social networks. Thus, it is the basis of many essential tasks such as popularity prediction and viral marketing. However, there are two thorny problems caused by the loss of spatial-temporal properties of cascade data: "position-hopping" and "branch-independency." The former means no exact propagation relationship between any two consecutive infected users. The latter indicates that not all previously infected users contribute to the prediction of the next infected user. This article proposes the GRU-like Attention Unit and Structural Spreading (GRASS) model for microscopic cascade prediction to overcome the above two problems. First, we introduce the attention mechanism into the gated recurrent unit (GRU) component to expand the restricted receptive field of the recurrent neural network (RNN)-type module, thus addressing the "position-hopping" problem. Second, the structural spreading (SS) mechanism leverages structural features to filter out related users and controls the generation of cascade hidden states, thereby solving the "branch-independency" problem. Experiments on multiple real-world datasets show that our model significantly outperforms state-of-the-art baseline models on both hits@κ and map@κ metrics. Furthermore, the visualization of latent representations by t-distributed stochastic neighbor embedding (t-SNE) indicates that our model makes different cascades more discriminative during the encoding process.
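To make the two mechanisms described in the abstract concrete, below is a minimal sketch (in PyTorch) of a GRU-style recurrent step whose input is built by attending over all previously infected users, combined with a structural mask that keeps only the users adjacent to the current one in the social graph. This is an illustrative reading of the abstract, not the authors' implementation: the class name GRASSSketch, the adjacency-matrix input, and all dimensions are assumptions introduced here for clarity.

```python
# Illustrative sketch only (not the published GRASS code): a GRU-like update
# whose input is an attention-weighted summary of the infection history,
# restricted by a structural mask over the social graph.

import torch
import torch.nn as nn


class GRASSSketch(nn.Module):
    def __init__(self, num_users: int, emb_dim: int = 64, hidden_dim: int = 64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, emb_dim)
        self.gru_cell = nn.GRUCell(emb_dim, hidden_dim)
        self.query = nn.Linear(hidden_dim, emb_dim)
        self.out = nn.Linear(hidden_dim, num_users)

    def forward(self, cascade: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        """cascade: (seq_len,) user ids in infection order.
        adjacency: (num_users, num_users) 0/1 social graph (assumed input).
        Returns logits over users for the next infection at every step."""
        hidden = torch.zeros(1, self.gru_cell.hidden_size)
        embs = self.user_emb(cascade)                      # (seq_len, emb_dim)
        logits = []
        for t in range(cascade.size(0)):
            # Structural filtering: among users infected so far, keep only
            # those adjacent to the current user (plus the current user itself),
            # mirroring the "branch-independency" idea.
            mask = adjacency[cascade[t], cascade[: t + 1]].clone()
            mask[-1] = 1.0
            # Attention over the masked history instead of only the previous
            # step, so the gated update can "hop" to the true propagation
            # source rather than trusting the chain order ("position-hopping").
            q = self.query(hidden)                         # (1, emb_dim)
            scores = embs[: t + 1] @ q.squeeze(0)          # (t+1,)
            scores = scores.masked_fill(mask == 0, float("-inf"))
            weights = torch.softmax(scores, dim=0)
            context = (weights.unsqueeze(1) * embs[: t + 1]).sum(dim=0, keepdim=True)
            hidden = self.gru_cell(context, hidden)        # GRU-like gated update
            logits.append(self.out(hidden))
        return torch.cat(logits, dim=0)                    # (seq_len, num_users)
```

Under these assumptions, attending over the whole masked history is what lets the recurrent update skip non-consecutive positions in the chain, while the adjacency mask plays the role the abstract assigns to the structural spreading mechanism: filtering out unrelated users before the cascade hidden state is updated.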

