A Context-Fusion Method for Entity Extraction Based on Residual Gated Convolution Neural Network

Citation: SU Fenglong, SUN Chengzhe, JING Ning. A Context-Fusion Method for Entity Extraction Based on Residual Gated Convolution Neural Network[J]. Acta Scientiarum Naturalium Universitatis Pekinensis, 2022, 58(1): 69-76. DOI: 10.13209/j.0479-8023.2021.102
Authors: SU Fenglong, SUN Chengzhe, JING Ning
Affiliation: School of Electronic Science, National University of Defense Technology, Changsha 410073
Received: 2021-06-12

Abstract: Entity extraction methods built on conventional convolutional architectures are constrained by the size of the convolutional receptive field: the current word relates only weakly to its wider context, the semantics of entity words within the whole sentence are under-considered, and recognition quality suffers. To address this problem, an entity recognition method based on a Residual Gated Convolution Neural Network (RGCNN) is proposed. It uses dilated convolutions and residual gated linear units to model semantic associations between words over multiple temporal ranges simultaneously; the gating units regulate how much information flows to the next layer of neurons, which alleviates the vanishing gradient in cross-layer propagation. On top of the last convolutional layer, RGCNN applies an attention mechanism to compute word-to-word semantics. Experimental results show that RGCNN is competitive in both speed and accuracy, reflecting the superiority and robustness of the algorithm.
Keywords: entity extraction; residual gated convolution; vanishing gradient; attention mechanism
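
The abstract names three ingredients: dilated convolutions, residual gated linear units, and an attention layer over the outputs of the last convolutional layer. The PyTorch sketch below shows one way these pieces can fit together for sequence tagging. It is a minimal sketch, not the authors' implementation: the class names (ResidualGatedConv, RGCNNTagger), the dilation schedule, and all hyperparameters are illustrative assumptions, since the record gives no exact configuration.

```python
# Illustrative sketch based on the abstract's description; not the authors' code.
import torch
import torch.nn as nn

class ResidualGatedConv(nn.Module):
    """One residual gated (GLU-style) dilated convolution block.

    The convolution emits 2*d channels: half are features, half are gates.
    The sigmoid gate regulates how much new information is added to the
    residual stream, so the identity path keeps gradients flowing across
    layers even when the gate saturates.
    """
    def __init__(self, d_model: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        pad = (kernel_size - 1) // 2 * dilation  # keep sequence length fixed
        self.conv = nn.Conv1d(d_model, 2 * d_model, kernel_size,
                              padding=pad, dilation=dilation)

    def forward(self, x):                     # x: (batch, seq, d_model)
        h = self.conv(x.transpose(1, 2))      # (batch, 2*d_model, seq)
        a, g = h.chunk(2, dim=1)              # split into features and gates
        out = a * torch.sigmoid(g)            # gated linear unit
        return x + out.transpose(1, 2)        # residual connection

class RGCNNTagger(nn.Module):
    """Stack of dilated residual gated conv blocks, then attention, then tagger."""
    def __init__(self, vocab_size: int, num_tags: int, d_model: int = 128,
                 dilations=(1, 2, 4, 8), n_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # growing dilations widen the receptive field layer by layer
        self.blocks = nn.ModuleList(
            ResidualGatedConv(d_model, dilation=d) for d in dilations)
        # attention over the last conv layer's outputs (word-to-word semantics)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.out = nn.Linear(d_model, num_tags)

    def forward(self, token_ids):             # token_ids: (batch, seq)
        x = self.embed(token_ids)
        for block in self.blocks:
            x = block(x)
        x, _ = self.attn(x, x, x)
        return self.out(x)                    # per-token tag logits

# Usage on random token ids:
model = RGCNNTagger(vocab_size=5000, num_tags=9)
logits = model(torch.randint(0, 5000, (2, 20)))   # shape (2, 20, 9)
```

The stacked dilations (1, 2, 4, 8) are one common way to realize "multiple temporal ranges simultaneously": each block sees progressively wider context while the residual gating keeps cross-layer propagation stable, matching the two mechanisms the abstract credits for the model's accuracy and trainability.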