

Enhancing Chinese Character Representation With Lattice-Aligned Attention.

Author Information

Zhao Shan, Hu Minghao, Cai Zhiping, Zhang Zhanjun, Zhou Tongqing, Liu Fang

Publication Information

IEEE Trans Neural Netw Learn Syst. 2023 Jul;34(7):3727-3736. doi: 10.1109/TNNLS.2021.3114378. Epub 2023 Jul 6.

DOI: 10.1109/TNNLS.2021.3114378
PMID: 34609945
Abstract

Word-character lattice models have proven effective for several Chinese natural language processing (NLP) tasks, in which word boundary information is fused into character sequences. However, due to their inherently unidirectional sequential nature, prior approaches learn only sequential interactions of character-word instances and fail to capture fine-grained correlations in word-character spaces. In this article, we propose a lattice-aligned attention network (LAN) that models dense interactions over the word-character lattice structure to enhance character representations. By carefully combining a cross-lattice module, a gated word-character semantic fusion unit, and a self-lattice attention module, the network can explicitly capture fine-grained correlations across different spaces (e.g., word-to-character and character-to-character), thus significantly improving model performance. Experimental results on three Chinese NLP benchmark tasks demonstrate that LAN obtains state-of-the-art results compared to several competitive approaches.
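The gated word-character semantic fusion unit mentioned in the abstract can be illustrated with a minimal sketch: each character embedding is blended with an aligned word embedding through an element-wise sigmoid gate, so the model learns how much word-level versus character-level information to keep per dimension. This is a simplified, scalar-weight illustration under assumed names (`gated_fusion`, `w_c`, `w_w`, `b`), not the paper's actual parameterization, which uses learned weight matrices.

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def gated_fusion(char_vec, word_vec, w_c=1.0, w_w=1.0, b=0.0):
    """Fuse a character embedding with an aligned word embedding.

    For each dimension, a gate g = sigmoid(w_c*c + w_w*w + b) decides
    how much of the character value to keep versus the word value:
    fused = g*c + (1 - g)*w. Because g lies in (0, 1), each fused
    value is a convex combination of the two inputs.
    """
    fused = []
    for c, w in zip(char_vec, word_vec):
        g = sigmoid(w_c * c + w_w * w + b)
        fused.append(g * c + (1.0 - g) * w)
    return fused

# Example: fuse a 2-d character embedding with an aligned word embedding.
out = gated_fusion([1.0, -1.0], [0.0, 2.0])
```

In the full model the scalars `w_c`, `w_w`, `b` would be trained weight matrices and the gate would be computed jointly over the whole vector, but the convex-combination behavior is the same.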


Similar Articles

1. Enhancing Chinese Character Representation With Lattice-Aligned Attention.
   IEEE Trans Neural Netw Learn Syst. 2023 Jul;34(7):3727-3736. doi: 10.1109/TNNLS.2021.3114378. Epub 2023 Jul 6.
2. Character-Level Neural Language Modelling in the Clinical Domain.
   Stud Health Technol Inform. 2020 Jun 16;270:83-87. doi: 10.3233/SHTI200127.
3. Intelligent diagnosis with Chinese electronic medical records based on convolutional neural networks.
   BMC Bioinformatics. 2019 Feb 1;20(1):62. doi: 10.1186/s12859-019-2617-8.
4. Character gated recurrent neural networks for Arabic sentiment analysis.
   Sci Rep. 2022 Jun 13;12(1):9779. doi: 10.1038/s41598-022-13153-w.
5. Bio-SimVerb and Bio-SimLex: wide-coverage evaluation sets of word similarity in biomedicine.
   BMC Bioinformatics. 2018 Feb 5;19(1):33. doi: 10.1186/s12859-018-2039-z.
6. Confusion2Vec 2.0: Enriching ambiguous spoken language representations with subwords.
   PLoS One. 2022 Mar 4;17(3):e0264488. doi: 10.1371/journal.pone.0264488. eCollection 2022.
7. Chinese clinical named entity recognition via multi-head self-attention based BiLSTM-CRF.
   Artif Intell Med. 2022 May;127:102282. doi: 10.1016/j.artmed.2022.102282. Epub 2022 Mar 18.
8. Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations.
   PLoS One. 2021 Sep 21;16(9):e0257230. doi: 10.1371/journal.pone.0257230. eCollection 2021.
9. Multi-level semantic fusion network for Chinese medical named entity recognition.
   J Biomed Inform. 2022 Sep;133:104144. doi: 10.1016/j.jbi.2022.104144. Epub 2022 Jul 22.
10. Hybrid Attention Network for Language-Based Person Search.
    Sensors (Basel). 2020 Sep 15;20(18):5279. doi: 10.3390/s20185279.

Cited By

1. Collaborative Filtering Algorithm-Based Destination Recommendation and Marketing Model for Tourism Scenic Spots.
   Comput Intell Neurosci. 2022 Apr 28;2022:7115627. doi: 10.1155/2022/7115627. eCollection 2022.