Causality Extraction Based on BERT
JIANG Bo, ZUO Wanli, WANG Ying. Causality Extraction Based on BERT[J]. Journal of Jilin University (Science Edition), 2021, 59(6): 1439-1444. DOI: 10.13413/j.cnki.jdxblxb.2020352
Authors: JIANG Bo, ZUO Wanli, WANG Ying
Affiliation: 1. College of Computer Science and Technology, Jilin University, Changchun 130012, China; 2. Key Laboratory of Symbol Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
Received: 2020-11-16
Abstract: Traditional relation extraction models rely on machine-learning techniques such as feature engineering, which suffer from low accuracy and cumbersome hand-crafted rules. To address these problems, we propose a BERT+BiLSTM+CRF method. First, BERT (bidirectional encoder representations from transformers) is used to pre-train on the corpus. Then, exploiting BERT's ability to generate word vectors dynamically from contextual features, the resulting word vectors are encoded by a bidirectional long short-term memory network (BiLSTM). Finally, the encoded sequence is fed into a conditional random field (CRF) layer to extract causal relations. Experimental results show that on the SemEval-CE dataset the model's accuracy is 0.054 1 higher than that of the BiLSTM+CRF+self-ATT model, improving the performance of deep learning methods on the causality extraction task.
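As a concrete illustration of the pipeline the abstract describes, below is a minimal sketch of a BERT+BiLSTM+CRF sequence tagger. It assumes PyTorch, the Hugging Face transformers BertModel, and the third-party pytorch-crf package; the model name, hidden size, and tag count are illustrative choices, not the authors' implementation.

```python
# Minimal BERT + BiLSTM + CRF tagger sketch (assumptions: PyTorch,
# transformers, and pytorch-crf installed; hyperparameters illustrative).
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF

class BertBiLstmCrf(nn.Module):
    def __init__(self, num_tags, lstm_hidden=256, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)   # contextual word vectors
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)  # BiLSTM encoder
        self.fc = nn.Linear(2 * lstm_hidden, num_tags)     # per-token emission scores
        self.crf = CRF(num_tags, batch_first=True)         # CRF transition layer

    def forward(self, input_ids, attention_mask, tags=None):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        hidden, _ = self.lstm(hidden)
        emissions = self.fc(hidden)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi-decoded best tag sequence for each sentence
        return self.crf.decode(emissions, mask=mask)
```

The CRF layer scores whole tag sequences rather than individual tokens, so constraints between adjacent labels (e.g., an inside tag must follow a begin tag) can be learned explicitly instead of being left to independent per-token softmax decisions.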
Keywords: causality extraction   sequence labeling   bidirectional long short-term memory (BiLSTM)   bidirectional encoder representations from transformers (BERT) model
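The "sequence labeling" keyword reflects how the task is framed: each token is tagged as belonging to a cause span, an effect span, or neither. A common way to encode such spans is BIO tagging; the tag inventory and example below are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical BIO tag set for cause/effect spans (illustrative only).
TAGS = ["O", "B-Cause", "I-Cause", "B-Effect", "I-Effect"]

# Example labeling for the sentence "heavy rain caused severe flooding":
tokens = ["heavy", "rain", "caused", "severe", "flooding"]
labels = ["B-Cause", "I-Cause", "O", "B-Effect", "I-Effect"]
```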
  