A Backpropagation Algorithm for Feedback Neural Networks
Cite this article: Wu Xiaohong. A Backpropagation Algorithm for Feedback Neural Networks[J]. Systems Engineering and Electronics, 1999, 21(9): I1
Author: Wu Xiaohong
Affiliation: Department of Electronics, Zhongshan College, Zhongshan, Guangdong 528403, China
Abstract: The use of linear least squares to solve neural-network optimization problems has been proposed before, for example in solving BP and Hopfield networks. In many practical applications, however, the classical backpropagation method is slow. The new algorithm for feedback neural networks presented here is based on linear algebra: linear least-squares techniques obtain the weighted sum feeding each neuron's activation function, and the weights are adjusted layer by layer according to the "sub-error" between the desired and actual weighted sums. In applications such as describing time series, it is several orders of magnitude faster than classical backpropagation.

Keywords: neural networks; least squares method; algorithm; activation function; neuron
Revised: 1998-05-10

An Algorithm of Dynamic Linear Least Squares Backpropagation for Feedback Neural Networks
Wu Xiaohong. An Algorithm of Dynamic Linear Least Squares Backpropagation for Feedback Neural Networks[J]. Systems Engineering and Electronics, 1999, 21(9): I1
Authors:Wu Xiaohong
Abstract: The use of linear least squares to solve neural-network optimization problems has been proposed before, for example in solving BP and Hopfield networks. In many applications, however, the classical backpropagation training method is slow. This paper presents a new training algorithm for recurrent (feedback) neural networks based on linear algebra: linear least-squares techniques obtain the weighted sum feeding each neuron's activation function, and the weights are adjusted layer by layer according to the sub-error between the desired and actual weighted sums. In applications such as describing time series, it is several orders of magnitude faster than classical backpropagation.
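The core idea in the abstract, solving for a layer's weights by linear least squares rather than gradient descent, can be sketched as follows. This is a minimal toy illustration, not the paper's full layer-by-layer recurrent algorithm: it assumes an invertible tanh activation, inverts it to recover the desired weighted sums, and then solves the resulting linear system in one shot. All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer problem: 3 inputs -> 2 neurons, tanh activation.
X = rng.normal(size=(50, 3))          # 50 input samples
W_true = 0.5 * rng.normal(size=(3, 2))
Y = np.tanh(X @ W_true)               # desired neuron activations

# Step 1: invert the activation function to get the desired
# weighted sum feeding each neuron (arctanh is tanh's inverse).
S_desired = np.arctanh(Y)

# Step 2: solve the linear system X @ W ~= S_desired by least squares,
# instead of iterating gradient-descent weight updates.
W_ls, *_ = np.linalg.lstsq(X, S_desired, rcond=None)

# The "sub-error" between desired and actual weighted sums;
# near zero here because the toy data is noiseless.
sub_error = np.linalg.norm(X @ W_ls - S_desired)
```

In the paper's setting this least-squares step would be applied per layer, propagating the sub-error backward, but the one-shot linear solve above is what replaces the slow iterative updates of classical backpropagation.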
Keywords: Feedback neural networks; Activation function; Artificial neural network
Indexed by: CNKI, Wanfang Data