

DC-CNN: Dual-channel Convolutional Neural Networks with attention-pooling for fake news detection.

Authors

Ma Kun, Tang Changhao, Zhang Weijuan, Cui Benkuan, Ji Ke, Chen Zhenxiang, Abraham Ajith

Affiliations

Shandong Provincial Key Laboratory of Network Based Intelligent Computing, University of Jinan, Jinan, 250022 China.

Department of Computer and Software Engineering, Shandong College of Electronic Technology, Jinan, 250200 China.

Publication information

Appl Intell (Dordr). 2023;53(7):8354-8369. doi: 10.1007/s10489-022-03910-9. Epub 2022 Aug 1.

Abstract

Fake news detection mainly relies on extracting article content features with neural networks. However, reducing noisy data and redundant features and learning long-distance dependencies remain challenging. To address these problems, Dual-channel Convolutional Neural Networks with Attention-pooling for Fake News Detection (abbreviated as DC-CNN) is proposed. The model benefits from Skip-Gram and Fasttext embeddings, which effectively reduce noisy data and improve the model's ability to learn non-derived words. A parallel dual-channel pooling layer is proposed to replace the traditional CNN pooling layer in DC-CNN. The max-pooling layer, as one channel, retains its advantage in learning local information between adjacent words. The attention-pooling layer with a multi-head attention mechanism serves as the other pooling channel, enhancing the learning of contextual semantics and global dependencies. By combining the strengths of both channels, the model addresses the tendency of pooling layers to lose local-global feature correlations. The model is tested on two different COVID-19 fake news datasets, and the experimental results show that it performs best in handling noisy data and in balancing the correlation between local and global features.
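As a rough illustration of the parallel dual-channel pooling described in the abstract, the PyTorch sketch below feeds convolutional features into a max-pooling channel (local n-gram information) and a multi-head attention-pooling channel (global dependencies), then concatenates the two for classification. The layer sizes, the single convolutional branch, the mean-pooled attention output, and the classifier head are illustrative assumptions, not the authors' published DC-CNN configuration.

```python
# Minimal sketch of a dual-channel pooling block, assuming pre-trained word
# embeddings (e.g. Skip-Gram / Fasttext vectors) as input. Hyperparameters are
# placeholders, not the values reported in the paper.
import torch
import torch.nn as nn


class DualChannelPooling(nn.Module):
    def __init__(self, embed_dim=128, num_filters=128, kernel_size=3,
                 num_heads=4, num_classes=2):
        super().__init__()
        # 1D convolution over the token dimension
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # Channel 1: max-pooling keeps salient local features between adjacent words
        self.max_pool = nn.AdaptiveMaxPool1d(1)
        # Channel 2: multi-head self-attention captures context and global dependencies
        self.attn = nn.MultiheadAttention(num_filters, num_heads, batch_first=True)
        # Concatenated channels feed a linear classifier (real vs. fake)
        self.fc = nn.Linear(2 * num_filters, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) word embeddings
        h = torch.relu(self.conv(x.transpose(1, 2)))   # (batch, num_filters, seq_len)
        local = self.max_pool(h).squeeze(-1)           # (batch, num_filters)
        h_seq = h.transpose(1, 2)                      # (batch, seq_len, num_filters)
        attn_out, _ = self.attn(h_seq, h_seq, h_seq)   # self-attention over tokens
        global_feat = attn_out.mean(dim=1)             # (batch, num_filters)
        return self.fc(torch.cat([local, global_feat], dim=-1))


# Usage: a batch of 8 articles, each padded/truncated to 100 tokens
model = DualChannelPooling()
logits = model(torch.randn(8, 100, 128))
print(logits.shape)  # torch.Size([8, 2])
```

Concatenating the two pooled vectors, rather than choosing one, is what lets the classifier weigh local and global evidence jointly, which is the correlation the abstract says a single pooling layer tends to lose.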


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a11b/9340725/f77da59f1eef/10489_2022_3910_Fig1_HTML.jpg
