A BP Algorithm Based on the Least Squares Method
Cite this article: WANG Yun-song, LIU Qin-long, GAO Wei-zhong. A BP algorithm based on the least squares method[J]. Journal of Shandong University of Technology, 2004, 18(6): 24-30.
Authors: WANG Yun-song (王赟松)  LIU Qin-long (刘钦龙)  GAO Wei-zhong (高卫中)
Affiliation: School of Transportation and Vehicle Engineering, Shandong University of Technology, Zibo 255049, Shandong, China
Funding: Shandong Natural Science Foundation (Grant No. Y2002F17)
Abstract: The slow convergence of the standard BP neural network algorithm is the main obstacle to its wide application. Taking the standard BP algorithm as a basis and applying least squares theory, this paper proposes a fast-converging BP algorithm, the NLMSBP algorithm. Simulation results show that, compared with the standard BP algorithm and its other improved forms, the NLMSBP algorithm converges much faster without any loss of stability, which provides an algorithmic basis for applying BP neural networks in situations with strict real-time requirements. The drawback of the algorithm is its large computational cost and memory requirement, which makes it unsuitable for training large networks.

Keywords: BP algorithm  large-scale networks  memory  BP neural network algorithm  simulation results  real-time performance  computer  convergence speed  least squares method  computational cost

A BP algorithm for training neural networks based on solutions for a nonlinear least mean square problem
WANG Yun-song, LIU Qin-long, GAO Wei-zhong. A BP algorithm for training neural networks based on solutions for a nonlinear least mean square problem[J]. Journal of Shandong University of Technology (Science and Technology), 2004, 18(6): 24-30.
Authors:WANG Yun-song  LIU Qin-long  GAO Wei-zhong
Abstract: The slow convergence of the standard backpropagation (BP) algorithm for training neural networks is the main reason why it is not widely used in practical applications. Therefore, a new BP algorithm, called the NLMSBP algorithm for short, is put forward in this paper by using solutions of a nonlinear least mean square problem. Simulation results show that, compared with the standard BP algorithm and its other modified forms, the algorithm converges much faster while retaining good stability. It is suitable for training networks with up to a few thousand weights and biases when high training precision is demanded. When sufficient computer memory is available, its superiority over the other algorithms is very notable, and it is well worth wider use.
Keywords:neural network  backpropagation algorithm  nonlinear least mean square problem
This article has been indexed by CNKI, VIP (Weipu), Wanfang Data and other databases.
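The abstract does not spell out the NLMSBP update rule, so the following is only a minimal sketch, assuming a generic formulation of the idea it describes: treating network training as a nonlinear least mean square problem and solving it with a damped Gauss-Newton (Levenberg-Marquardt-style) step instead of the plain gradient step of standard BP. The network size, the toy data, and the helper names (`residuals`, `numerical_jacobian`, the damping factor `mu`) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch (not the paper's exact NLMSBP algorithm): train a tiny
# one-hidden-layer network by solving a nonlinear least-squares problem with a
# damped Gauss-Newton step, rather than the plain gradient step of standard BP.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
Y = np.sin(X)

n_in, n_hid, n_out = 1, 8, 1
n_params = n_in * n_hid + n_hid + n_hid * n_out + n_out

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def residuals(w):
    """Per-sample errors e = f(x; w) - y, stacked into one vector."""
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)      # hidden layer
    out = H @ W2 + b2             # linear output layer
    return (out - Y).ravel()

def numerical_jacobian(w, eps=1e-6):
    """Finite-difference Jacobian of the residual vector w.r.t. the weights."""
    r0 = residuals(w)
    J = np.zeros((r0.size, w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (residuals(wp) - r0) / eps
    return J

w = rng.normal(scale=0.5, size=n_params)
mu = 1e-2                          # damping factor

for epoch in range(200):
    r = residuals(w)
    J = numerical_jacobian(w)
    # Damped Gauss-Newton step: solve (J^T J + mu I) dw = -J^T r
    A = J.T @ J + mu * np.eye(n_params)
    dw = np.linalg.solve(A, -J.T @ r)
    if np.sum(residuals(w + dw) ** 2) < np.sum(r ** 2):
        w += dw
        mu *= 0.7                  # step accepted: reduce damping
    else:
        mu *= 2.0                  # step rejected: increase damping

print("final SSE:", 0.5 * np.sum(residuals(w) ** 2))
```

Note that the normal-equation matrix `J.T @ J` grows with the square of the number of weights, so the per-iteration cost and memory of such a least-squares step rise quickly with network size, which is consistent with the drawback noted in the abstract.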