Identification of Switching Operation Based on LSTM and MoE
Cite this article: Xiaoqing Zhang, Wanfang Xiao, Yingjie Guo, Bowen Liu, Xuesen Han, Jingwei Ma, Gao Gao, He Huang, Shihong Xia. Identification of Switching Operation Based on LSTM and MoE[J]. Journal of System Simulation, 2022, 34(8): 1899-1907.
Authors: Xiaoqing Zhang, Wanfang Xiao, Yingjie Guo, Bowen Liu, Xuesen Han, Jingwei Ma, Gao Gao, He Huang, Shihong Xia
Institution: 1. Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China; 2. College of Computer, Beijing University of Posts and Telecommunications, Beijing 100876, China; 3. State Grid Beijing Urban Power Supply Company, Beijing 110102, China
Funding: National Key R&D Program of China (2020YFF0304701); Science and Technology Project of Beijing Electric Power Company (202021900T7)
Abstract: To address the individual differences between operators performing the same operation, and the variation in a single operator's execution of the same operation at different times, a switching operation recognition method, MoE-LSTM, based on a mixture of experts (MoE) and long short-term memory (LSTM) networks is proposed. LSTMs are ensembled under the MoE framework to learn the feature distributions of data from different sources. Acceleration data are collected to build a switching operation dataset, and the action sequences are segmented with a sliding window. The windowed sequences are fed into MoE-LSTM, where each LSTM independently learns the temporal dependencies of different actions, and a gating network selects the output of the LSTM that classifies the current input best as the recognition result. Simulation results show that, for action data from different times and spaces, each LSTM outperforms the others within a certain region of the feature space, and experiments on the switching operation dataset demonstrate superior performance over existing action recognition algorithms.

Keywords: switching operation  long short-term memory network (LSTM)  mixture of experts (MoE)  neural network
Received: 2021-04-03
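As a rough illustration of the mechanism described in the abstract, the sketch below wires several LSTM classifiers together under a gating network, with a small helper for sliding-window segmentation of acceleration sequences. This is a minimal PyTorch sketch under assumed hyperparameters (three-axis input, four experts, ten classes, a 128-sample window), not the authors' implementation; the paper's gate selects the best-performing expert, which would correspond to taking an argmax over the gate scores instead of the soft weighting shown here.

# Illustrative sketch only: a generic mixture-of-experts over LSTM classifiers
# for windowed accelerometer sequences. Window length, number of experts and
# number of action classes are assumptions, not values from the paper.
import torch
import torch.nn as nn

def sliding_windows(seq, win_len=128, stride=64):
    """Split a (T, C) acceleration sequence into overlapping (win_len, C) windows."""
    windows = [seq[s:s + win_len] for s in range(0, len(seq) - win_len + 1, stride)]
    return torch.stack(windows) if windows else seq.new_zeros(0, win_len, seq.shape[1])

class MoELSTM(nn.Module):
    """Gated ensemble of LSTM experts; the gate weights each expert's class scores."""

    def __init__(self, in_dim=3, hidden=64, n_experts=4, n_classes=10):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.LSTM(in_dim, hidden, batch_first=True) for _ in range(n_experts)
        )
        self.heads = nn.ModuleList(
            nn.Linear(hidden, n_classes) for _ in range(n_experts)
        )
        # Gating network: scores each expert from the window's mean acceleration.
        self.gate = nn.Linear(in_dim, n_experts)

    def forward(self, x):                                # x: (B, win_len, in_dim)
        gate_w = torch.softmax(self.gate(x.mean(dim=1)), dim=-1)   # (B, n_experts)
        logits = []
        for lstm, head in zip(self.experts, self.heads):
            out, _ = lstm(x)                             # (B, win_len, hidden)
            logits.append(head(out[:, -1]))              # last step -> (B, n_classes)
        logits = torch.stack(logits, dim=1)              # (B, n_experts, n_classes)
        # Soft combination of expert outputs; a hard argmax over gate_w would
        # instead pick the single expert the gate trusts most for this input.
        return (gate_w.unsqueeze(-1) * logits).sum(dim=1)

# Usage: classify one 3-axis acceleration recording of 1000 samples.
model = MoELSTM()
windows = sliding_windows(torch.randn(1000, 3))
pred = model(windows).argmax(dim=-1)                     # one label per window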
