

Pretrained Quantum-Inspired Deep Neural Network for Natural Language Processing.

Authors

Shi Jinjing, Chen Tian, Lai Wei, Zhang Shichao, Li Xuelong

Publication

IEEE Trans Cybern. 2024 Oct;54(10):5973-5985. doi: 10.1109/TCYB.2024.3398692. Epub 2024 Oct 9.

DOI: 10.1109/TCYB.2024.3398692
PMID: 38809747
Abstract

Natural language processing (NLP) models may suffer from the inexplicable "black-box" problem of parameters and from unreasonable modeling that fails to embed certain characteristics of natural language, while quantum-inspired models based on quantum theory may provide a potential solution. However, essential prior knowledge and pretrained text features were often ignored at the early stage of the development of quantum-inspired models. To address these challenges, a pretrained quantum-inspired deep neural network is proposed in this work, constructed on the basis of quantum theory to achieve strong performance and good interpretability in related NLP fields. Concretely, a quantum-inspired pretrained feature embedding (QPFE) method is first developed to model superposition states for words so as to embed more textual features. Then, a QPFE-ERNIE model is designed by merging the semantic features learned by the prevalent pretrained model ERNIE, and it is verified on two NLP downstream tasks: 1) sentiment classification and 2) word sense disambiguation (WSD). In addition, schematic quantum circuit diagrams are provided, which offer potential impetus for the future realization of quantum NLP on quantum devices. Finally, the experimental results demonstrate that QPFE-ERNIE is significantly better for sentiment classification than gated recurrent unit (GRU), BiLSTM, and TextCNN on five datasets in all metrics, and achieves better results than ERNIE in accuracy, F1-score, and precision on two datasets (CR and SST). It also has an advantage for WSD over classical models, including BERT (improving the F1-score by 5.2 on average) and ERNIE (improving the F1-score by 4.2 on average), and improves the F1-score by 8.7 on average compared with a previous quantum-inspired model, QWSD. QPFE-ERNIE provides a novel pretrained quantum-inspired model for solving NLP problems and lays a foundation for exploring more quantum-inspired models in the future.
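The abstract describes modeling a word as a quantum-style superposition state over textual features. As a minimal illustrative sketch (not the paper's actual QPFE construction; the feature basis and weights here are hypothetical), a word can be mapped to a unit-norm amplitude vector whose squared amplitudes give measurement probabilities over the feature basis, following the Born rule:

```python
import numpy as np

def superposition_embedding(feature_weights):
    """Map nonnegative feature weights to a normalized superposition
    state |w> = sum_i a_i |f_i> with sum_i |a_i|^2 = 1.
    The feature basis and weights are illustrative placeholders,
    not the QPFE method from the paper."""
    amps = np.sqrt(np.asarray(feature_weights, dtype=float))
    return amps / np.linalg.norm(amps)

def measurement_probs(state):
    # Born rule: probability of observing basis feature i is |a_i|^2.
    return np.abs(state) ** 2

# Hypothetical word with three feature weights (e.g. sense scores).
state = superposition_embedding([0.5, 0.3, 0.2])
probs = measurement_probs(state)
```

Because the weights already sum to 1, the recovered probabilities equal the original weights; the point of the normalization step is that arbitrary nonnegative feature scores can be encoded the same way.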


Similar Articles

1
Few-Shot Learning for Clinical Natural Language Processing Using Siamese Neural Networks: Algorithm Development and Validation Study.
JMIR AI. 2023 May 4;2:e44293. doi: 10.2196/44293.
2
Investigating the Efficient Use of Word Embedding with Neural-Topic Models for Interpretable Topics from Short Texts.
Sensors (Basel). 2022 Jan 23;22(3):852. doi: 10.3390/s22030852.
3
When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
BMC Med Inform Decis Mak. 2022 Apr 5;21(Suppl 9):377. doi: 10.1186/s12911-022-01829-2.
4
Recognition of Unknown Entities in Specific Financial Field Based on ERNIE-Doc-BiLSTM-CRF.
Comput Intell Neurosci. 2022 May 14;2022:3139898. doi: 10.1155/2022/3139898. eCollection 2022.
5
Research on sentiment classification for netizens based on the BERT-BiLSTM-TextCNN model.
PeerJ Comput Sci. 2022 Jun 8;8:e1005. doi: 10.7717/peerj-cs.1005. eCollection 2022.
6
Character gated recurrent neural networks for Arabic sentiment analysis.
Sci Rep. 2022 Jun 13;12(1):9779. doi: 10.1038/s41598-022-13153-w.
7
Sentiment Classification of Chinese Tourism Reviews Based on ERNIE-Gram+GCN.
Int J Environ Res Public Health. 2022 Oct 19;19(20):13520. doi: 10.3390/ijerph192013520.
8
Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.
JMIR Med Inform. 2022 Apr 21;10(4):e35606. doi: 10.2196/35606.
9
BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis.
IEEE Trans Cybern. 2022 Jul;52(7):6232-6243. doi: 10.1109/TCYB.2021.3050508. Epub 2022 Jul 4.

Cited By

1
A repetitive amplitude encoding method for enhancing the mapping ability of quantum neural networks.
Sci Rep. 2025 Sep 1;15(1):32111. doi: 10.1038/s41598-025-17651-5.