Research Review of Recurrent Neural Networks
Citation: WANG Yuyan, LIAO Bolin, PENG Chen, LI Jun, YIN Yumin. Research Review of Recurrent Neural Networks [J]. Journal of Jishou University (Natural Science Edition), 2021, 42(1): 41-48.
Authors: WANG Yuyan  LIAO Bolin  PENG Chen  LI Jun  YIN Yumin
Institution: (1. College of Mathematics and Statistics, Jishou University, Jishou 416000, Hunan, China; 2. College of Information Science and Engineering, Jishou University, Jishou 416000, Hunan, China)
Funding: National Natural Science Foundation of China (62066015, 62006095); Natural Science Foundation of Hunan Province (2020JJ4511); University-Level Research Project of Jishou University (JDY20063); Outstanding Youth Project of Jishou University (20B470)
Abstract: A recurrent neural network (RNN) is a neural network with feedback connections in each layer. Because of this built-in memory, an RNN can process sequence data in which later inputs depend on earlier ones, and it is widely used for text, audio, and video. When the gap between relevant inputs is large, however, an RNN suffers from short-term memory and cannot handle very long input sequences, whereas long short-term memory (LSTM) networks handle such long-term dependencies well. Since LSTM was proposed, almost all of the exciting results obtained with RNNs have been achieved with LSTM, so LSTM has become a focus of deep learning research. This review first introduces the basic working principle and characteristics of RNNs, then presents the principle and characteristics of LSTM and its variants together with the applications of RNNs and LSTM in various fields, and finally discusses future research directions for RNNs.

Keywords: recurrent neural network  long short-term memory  sequential data  natural language processing
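The review's central contrast is between the short-term memory of a plain RNN and the gated cell state of LSTM. As a generic illustration only (not code from the paper), the sketch below implements one step of a standard LSTM cell in NumPy; the function name lstm_step, the stacked weight matrix W, and the toy dimensions are assumptions chosen for brevity.

# Minimal, self-contained sketch of one LSTM time step (illustrative only).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. x_t: input; h_prev, c_prev: previous hidden and cell states.
    W, b: stacked parameters for the forget, input, candidate, and output gates."""
    z = W @ np.concatenate([h_prev, x_t]) + b   # joint affine transform for all gates
    H = h_prev.size
    f = sigmoid(z[0*H:1*H])                     # forget gate: what to erase from the cell state
    i = sigmoid(z[1*H:2*H])                     # input gate: what new information to write
    g = np.tanh(z[2*H:3*H])                     # candidate cell update
    o = sigmoid(z[3*H:4*H])                     # output gate: what to expose as the hidden state
    c_t = f * c_prev + i * g                    # cell state carries long-term memory
    h_t = o * np.tanh(c_t)                      # hidden state (short-term output)
    return h_t, c_t

# Toy usage: run a short random sequence through the cell.
rng = np.random.default_rng(0)
H, D, T = 4, 3, 5                               # hidden size, input size, sequence length
W = rng.standard_normal((4 * H, H + D)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(T):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
print("final hidden state:", h)

The additive update of the cell state (f * c_prev + i * g) is what lets information survive across many time steps, which is the long-term dependence handling the abstract attributes to LSTM.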
