
An Improved Double Channel Long Short-Term Memory Model for Medical Text Classification.

Affiliations

School of Software, Henan University, Kaifeng, China.

Institute of Data Science, City University of Macau, Taipa, Macau, China.

Publication Information

J Healthc Eng. 2021 Feb 23;2021:6664893. doi: 10.1155/2021/6664893. eCollection 2021.


DOI: 10.1155/2021/6664893
PMID: 33688423
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7925031/
Abstract

There are a large number of symptom consultation texts in medical and healthcare Internet communities, and Chinese word segmentation in the health domain is comparatively complex, which leads to low accuracy in existing medical text classification algorithms. Deep learning models have advantages in effectively extracting abstract features of text. However, for large samples of complex text data, especially for words with ambiguous meanings in the field of Chinese medical diagnosis, a word-level neural network model alone is insufficient. Therefore, to support patient triage and precise treatment, we present an improved Double Channel (DC) mechanism as a significant enhancement to Long Short-Term Memory (LSTM). In this DC mechanism, two channels receive word-level and char-level embeddings, respectively, at the same time. A hybrid attention mechanism is proposed that combines the output at the current timestep with the current unit state and then uses attention to calculate the weights. The weight scores are obtained by computing a probability distribution over the inputs at each timestep, and a weighted sum is then taken. Finally, the input at each timestep is subjected to trade-off learning to improve the generalization ability of the model. Moreover, we conduct an extensive performance evaluation on two different datasets: cMedQA and Sentiment140. The experimental results show that the DC-LSTM model proposed in this paper achieves significantly better accuracy and ROC than the baseline CNN-LSTM model.
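To make the dual-channel and hybrid-attention description above concrete, here is a minimal sketch in PyTorch. It is an illustration under stated assumptions, not the authors' released implementation: the embedding and hidden sizes, the use of the final LSTM cell state as the "unit state", and fusing the two channels by concatenation before the classifier are all assumptions chosen for readability.

```python
# Minimal sketch of a dual-channel LSTM with hybrid attention (illustrative only).
import torch
import torch.nn as nn


class DCLSTM(nn.Module):
    def __init__(self, word_vocab, char_vocab, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        # Channel 1: word-level embedding + LSTM
        self.word_embed = nn.Embedding(word_vocab, embed_dim)
        self.word_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Channel 2: character-level embedding + LSTM
        self.char_embed = nn.Embedding(char_vocab, embed_dim)
        self.char_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Attention scoring layer applied to [timestep output; unit state]
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def _attend(self, outputs, state):
        # outputs: (B, T, H) per-timestep hidden states; state: (B, H) final cell ("unit") state.
        # Hybrid attention: score each timestep output together with the unit state,
        # turn the scores into a probability distribution, then take the weighted sum.
        T = outputs.size(1)
        state_exp = state.unsqueeze(1).expand(-1, T, -1)              # (B, T, H)
        scores = self.attn(torch.cat([outputs, state_exp], dim=-1))   # (B, T, 1)
        weights = torch.softmax(scores, dim=1)                        # probability over timesteps
        return (weights * outputs).sum(dim=1)                         # (B, H) context vector

    def forward(self, word_ids, char_ids):
        w_out, (_, w_c) = self.word_lstm(self.word_embed(word_ids))
        c_out, (_, c_c) = self.char_lstm(self.char_embed(char_ids))
        w_vec = self._attend(w_out, w_c[-1])   # word-channel context vector
        c_vec = self._attend(c_out, c_c[-1])   # char-channel context vector
        # Fusion by concatenation is an assumption made here for simplicity.
        return self.fc(torch.cat([w_vec, c_vec], dim=-1))


# Usage: classify a batch of 4 texts, padded to 50 word tokens and 80 char tokens.
model = DCLSTM(word_vocab=30000, char_vocab=5000)
logits = model(torch.randint(0, 30000, (4, 50)), torch.randint(0, 5000, (4, 80)))
print(logits.shape)  # torch.Size([4, 2])
```

Each channel computes its attention weights as a softmax over per-timestep scores and takes the weighted sum of its hidden states; the two resulting context vectors are concatenated and passed to the classifier.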


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/04ef1897f8a6/JHE2021-6664893.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/3b144513f08b/JHE2021-6664893.002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/ddd81782d19c/JHE2021-6664893.003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/08644c086a4a/JHE2021-6664893.004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/2f1fad12a0bd/JHE2021-6664893.005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1558/7925031/8c4019b0fa99/JHE2021-6664893.006.jpg

Similar Articles

[1]
An Improved Double Channel Long Short-Term Memory Model for Medical Text Classification.

J Healthc Eng. 2021

[2]
A comparative study on deep learning models for text classification of unstructured medical notes with various levels of class imbalance.

BMC Med Res Methodol. 2022-7-2

[3]
Entity recognition in Chinese clinical text using attention-based CNN-LSTM-CRF.

BMC Med Inform Decis Mak. 2019-4-4

[4]
DCCL: Dual-channel hybrid neural network combined with self-attention for text classification.

Math Biosci Eng. 2023-1

[5]
Application of Dual-Channel Convolutional Neural Network Algorithm in Semantic Feature Analysis of English Text Big Data.

Comput Intell Neurosci. 2021

[6]
Attention-Based DSC-ConvLSTM for Multiclass Motor Imagery Classification.

Comput Intell Neurosci. 2022

[7]
Breast cancer classification based on hybrid CNN with LSTM model.

Sci Rep. 2025-2-5

[8]
Emotion Analysis Model of Microblog Comment Text Based on CNN-BiLSTM.

Comput Intell Neurosci. 2022

[9]
Attention-Emotion-Enhanced Convolutional LSTM for Sentiment Analysis.

IEEE Trans Neural Netw Learn Syst. 2022-9

[10]
Language Processing Model Construction and Simulation Based on Hybrid CNN and LSTM.

Comput Intell Neurosci. 2021

Cited By

[1]
A model of integrating convolution and BiGRU dual-channel mechanism for Chinese medical text classifications.

PLoS One. 2023

References

[1]
Fusing Visual Attention CNN and Bag of Visual Words for Cross-Corpus Speech Emotion Recognition.

Sensors (Basel). 2020-9-28

[2]
Rationale-Augmented Convolutional Neural Networks for Text Classification.

Proc Conf Empir Methods Nat Lang Process. 2016-11

[3]
Framewise phoneme classification with bidirectional LSTM and other neural network architectures.

Neural Netw. 2005
