

Relation classification via BERT with piecewise convolution and focal loss.

Affiliations

School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing, China.

School of Electrical Engineering, Tsinghua University, Beijing, China.

Publication

PLoS One. 2021 Sep 10;16(9):e0257092. doi: 10.1371/journal.pone.0257092. eCollection 2021.

DOI: 10.1371/journal.pone.0257092
PMID: 34506554
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8432804/
Abstract

Relation extraction architectures have evolved from shallow neural networks, such as convolutional and recurrent neural networks, to pre-trained language models such as BERT. However, these methods do not consider the internal semantic information of the sequence or the long-distance dependence problem, even though that internal semantic information may contain knowledge useful for relation classification. To address these problems, this paper proposes a BERT-based relation classification method. Compared with existing BERT-based architectures, the proposed model captures the internal semantic information between the entity pair and better handles long-distance semantic dependence. A pre-trained BERT model, after fine-tuning, is used to produce the semantic representation of the sequence, and piecewise convolution is then applied to extract the semantic information that influences the classification result. Because it exploits this internal semantic information, the proposed method achieves better accuracy on the relation extraction task than existing methods. Generalization, however, remains a problem that cannot be ignored, since the number of instances differs greatly across relation categories. The focal loss function is therefore adopted to address this imbalance by assigning heavier weights to rare or hard-to-classify categories. On the SemEval-2010 Task 8 dataset, the F1 score of the proposed method reaches 89.95%, outperforming existing methods.
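The focal loss mentioned in the abstract down-weights well-classified examples and can optionally up-weight rare classes. A minimal NumPy sketch of the multi-class form (the function name, toy probabilities, and `alpha` handling are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=None):
    """Multi-class focal loss: FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t).

    probs:   (N, C) softmax probabilities
    targets: (N,)   integer class labels
    gamma:   focusing parameter; gamma = 0 recovers plain cross-entropy
    alpha:   optional (C,) per-class weights for rare categories
    """
    p_t = probs[np.arange(len(targets)), targets]     # probability of the true class
    weight = (1.0 - p_t) ** gamma                     # easy examples get tiny weight
    if alpha is not None:
        weight = weight * np.asarray(alpha)[targets]  # heavier weight for rare classes
    return float(np.mean(-weight * np.log(p_t + 1e-12)))

# An easy example (p_t = 0.9) contributes far less than a hard one (p_t = 0.3)
probs = np.array([[0.9, 0.1],
                  [0.3, 0.7]])
targets = np.array([0, 0])
print(focal_loss(probs, targets, gamma=2.0))
```

With `gamma=0` the function reduces to ordinary cross-entropy, so the focusing effect can be checked by comparing the two values on the same batch.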

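The piecewise convolution step pools features separately over the three segments delimited by the entity pair, instead of max-pooling the whole sequence at once. A minimal sketch of piecewise max pooling over a per-token feature map (function name, segment boundary convention, and toy values are illustrative assumptions):

```python
import numpy as np

def piecewise_max_pool(features, e1_pos, e2_pos):
    """Piecewise max pooling over a (seq_len, dim) feature map.

    The sequence is split into three segments by the two entity positions
    (up to e1, between e1 and e2, after e2), and each segment is max-pooled
    separately, preserving coarse positional structure that a single global
    max-pool would discard. Returns a (3 * dim,) vector.
    """
    a, b = sorted((e1_pos, e2_pos))
    segments = [features[: a + 1], features[a + 1 : b + 1], features[b + 1 :]]
    pooled = [seg.max(axis=0) if len(seg) else np.zeros(features.shape[1])
              for seg in segments]
    return np.concatenate(pooled)

feats = np.arange(12, dtype=float).reshape(6, 2)  # toy (seq_len=6, dim=2) features
print(piecewise_max_pool(feats, e1_pos=1, e2_pos=3))
```

Concatenating the three segment maxima yields a fixed-size representation whose parts correspond to "before", "between", and "after" the entity pair, which is what lets the classifier see where salient features occur relative to the entities.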

Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0cd/8432804/4e70948e7c57/pone.0257092.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0cd/8432804/a79b9bd7cba3/pone.0257092.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0cd/8432804/2c12f04351aa/pone.0257092.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0cd/8432804/a07c72139846/pone.0257092.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0cd/8432804/21957995c65e/pone.0257092.g007.jpg

Similar articles

1
Relation classification via BERT with piecewise convolution and focal loss.
PLoS One. 2021 Sep 10;16(9):e0257092. doi: 10.1371/journal.pone.0257092. eCollection 2021.
2
Extracting comprehensive clinical information for breast cancer using deep learning methods.
Int J Med Inform. 2019 Dec;132:103985. doi: 10.1016/j.ijmedinf.2019.103985. Epub 2019 Oct 2.
3
A Fine-Tuned Bidirectional Encoder Representations From Transformers Model for Food Named-Entity Recognition: Algorithm Development and Validation.
J Med Internet Res. 2021 Aug 9;23(8):e28229. doi: 10.2196/28229.
4
BertSRC: transformer-based semantic relation classification.
BMC Med Inform Decis Mak. 2022 Sep 6;22(1):234. doi: 10.1186/s12911-022-01977-5.
5
Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks.
Comput Intell Neurosci. 2022 Jun 20;2022:2315802. doi: 10.1155/2022/2315802. eCollection 2022.
6
Fine-tuning BERT for automatic ADME semantic labeling in FDA drug labeling to enhance product-specific guidance assessment.
J Biomed Inform. 2023 Feb;138:104285. doi: 10.1016/j.jbi.2023.104285. Epub 2023 Jan 9.
7
DocR-BERT: Document-Level R-BERT for Chemical-Induced Disease Relation Extraction via Gaussian Probability Distribution.
IEEE J Biomed Health Inform. 2022 Mar;26(3):1341-1352. doi: 10.1109/JBHI.2021.3116769. Epub 2022 Mar 7.
8
A transformer architecture based on BERT and 2D convolutional neural network to identify DNA enhancers from sequence information.
Brief Bioinform. 2021 Sep 2;22(5). doi: 10.1093/bib/bbab005.
9
An efficient method for disaster tweets classification using gradient-based optimized convolutional neural networks with BERT embeddings.
MethodsX. 2024 Jul 3;13:102843. doi: 10.1016/j.mex.2024.102843. eCollection 2024 Dec.
10
Bridging auditory perception and natural language processing with semantically informed deep neural networks.
Sci Rep. 2024 Sep 9;14(1):20994. doi: 10.1038/s41598-024-71693-9.

Cited by

1
Mining drug-target interactions from biomedical literature using chemical and gene descriptions-based ensemble transformer model.
Bioinform Adv. 2024 Jul 22;4(1):vbae106. doi: 10.1093/bioadv/vbae106. eCollection 2024.
2
Knowledge-enhanced prototypical network with class cluster loss for few-shot relation classification.
PLoS One. 2023 Jun 8;18(6):e0286915. doi: 10.1371/journal.pone.0286915. eCollection 2023.
3
3D deep learning versus the current methods for predicting tumor invasiveness of lung adenocarcinoma based on high-resolution computed tomography images.
Front Oncol. 2022 Oct 21;12:995870. doi: 10.3389/fonc.2022.995870. eCollection 2022.

References

1
Focal Loss for Dense Object Detection.
IEEE Trans Pattern Anal Mach Intell. 2020 Feb;42(2):318-327. doi: 10.1109/TPAMI.2018.2858826. Epub 2018 Jul 23.
4
BertSRC: transformer-based semantic relation classification.
BMC Med Inform Decis Mak. 2022 Sep 6;22(1):234. doi: 10.1186/s12911-022-01977-5.