An algorithm with a regularizer for multilayer feedforward neural networks
Cite this article: SHEN Yan-jun, WANG Bing-wen, HU Xiao-ya. An algorithm with a regularizer for multilayer feedforward neural networks[J]. Systems Engineering and Electronics, 2004, 26(9): 1312-1314
Authors: SHEN Yan-jun  WANG Bing-wen  HU Xiao-ya
Affiliation: Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei 430074, China
Abstract: The true weight decay recursive least squares (TWDRLS) algorithm incurs a large computational complexity and storage requirement at every iteration. Based on the local linearized least squares (LLLS) algorithm and a regularizer, an LLLS algorithm with a regularizer for multilayer feedforward neural networks is presented, which greatly reduces the per-iteration computational complexity and storage of TWDRLS. Experiments show that the modified algorithm improves the robustness and generalization ability of the original LLLS algorithm, and its performance is close to that of TWDRLS.

Keywords: regularization  recursive least squares algorithm  generalization ability  local linearized least squares algorithm
Article ID: 1001-506X(2004)09-1312-03
Revised: 2003-06-10

Regularizer for LLLS algorithm in feedforward multilayered neural networks
SHEN Yan-jun, WANG Bing-wen, HU Xiao-ya. Regularizer for LLLS algorithm in feedforward multilayered neural networks[J]. Systems Engineering and Electronics, 2004, 26(9): 1312-1314
Authors:SHEN Yan-jun  WANG Bing-wen  HU Xiao-ya
Abstract: The true weight decay RLS (TWDRLS) algorithm achieves good performance at the expense of much greater computational complexity and storage requirements. A local linearized least squares (LLLS) algorithm together with a regularizer is used to train multilayer feedforward neural networks, which greatly decreases the computational complexity and storage requirements. Simulations show that the modified algorithm improves the robustness and generalization ability of LLLS, and its performance is close to that of TWDRLS.
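The paper's TWDRLS and regularized-LLLS training procedures are not reproduced here; as a minimal sketch of the underlying idea, the following illustrates how a recursive least squares (RLS) update can encode an L2 (weight-decay) regularizer through its initialization: with forgetting factor 1 and inverse-covariance initialization P0 = (kappa*I)^(-1), RLS exactly solves the ridge problem min ||Xw - d||^2 + kappa*||w||^2 for a linear model. All names (`ridge_rls`, `kappa`) and the linear test setup are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def ridge_rls(X, d, kappa):
    """Recursive least squares with weight-decay regularization.

    Initializing P0 = (kappa * I)^{-1} with w0 = 0 makes the recursion,
    after processing all samples, return exactly the ridge solution
    argmin_w ||X w - d||^2 + kappa * ||w||^2.
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = np.eye(n) / kappa                  # P0 encodes the regularizer
    for x, t in zip(X, d):
        k = P @ x / (1.0 + x @ P @ x)      # gain vector
        w = w + k * (t - x @ w)            # weight update on new sample
        P = P - np.outer(k, x @ P)         # inverse-covariance update
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
d = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)
kappa = 0.1

w_rls = ridge_rls(X, d, kappa)
# Closed-form ridge regression for comparison: (X^T X + kappa I)^{-1} X^T d
w_batch = np.linalg.solve(X.T @ X + kappa * np.eye(3), X.T @ d)
```

The per-sample recursion avoids re-solving the full normal equations at each step, which is the same motivation behind reducing TWDRLS's per-iteration cost; the nonlinear network case in the paper additionally requires local linearization of each neuron.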
Keywords:regularization  recursive least squares algorithm  generalization ability  local linearized least squares algorithm
This article is indexed in CNKI, Wanfang Data, and other databases.