A Bottom-Up Algorithm for Accelerating Neural Network Learning
Cite this article: YANG Zhong-Jin. A bottom-up algorithm for accelerating neural network learning[J]. Journal of Natural Science of Hunan Normal University, 2006, 29(3): 39-44.
Author: YANG Zhong-Jin
Affiliation: School of Information, Guangdong University of Business Studies, Guangzhou 510320, China
Abstract: An improved algorithm for accelerating neural network learning is presented. It combines a fast bottom-up (cascade-correlation) network-construction algorithm with a dynamic learning-parameter optimization algorithm. First, the fast bottom-up construction algorithm automatically builds an optimized network architecture; then, the dynamic optimization algorithm adjusts and selects optimized learning parameters during training. Experimental results show that the improved algorithm constructs an optimized network structure automatically and effectively and, compared with other algorithms, achieves better classification performance, a more compact network structure, and faster learning.

Keywords: neural network; cascade-correlation; bottom-up; learning parameter optimization; classification; backpropagation algorithm
Article ID: 1000-2537(2006)03-0039-05
Received: 2005

An Accelerated Learning Algorithm for Neural Network with the Cascade-Correlation
YANG Zhong-Jin. An Accelerated Learning Algorithm for Neural Network with the Cascade-Correlation[J]. Journal of Natural Science of Hunan Normal University, 2006, 29(3): 39-44.
Authors: YANG Zhong-Jin
Institution: School of Information Science and Technology, Guangdong University of Business Studies, Guangzhou 510320, China
Abstract: An improved algorithm for speeding up the learning of neural networks is proposed. The improved algorithm combines fast cascade-correlation with dynamic optimization of the learning parameters. The fast cascade-correlation algorithm constructs the optimal architecture of the neural network automatically, while the dynamic optimization adjusts and selects optimal learning parameters during training. Simulation results show that, in comparison with other algorithms, the improved algorithm automatically designs an optimal neural network with good classification performance, a simple network architecture, and fast learning.
Keywords: neural network; cascade-correlation; bottom-up; learning parameter optimization; classification; backpropagation
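
The record does not include the paper's code, but the bottom-up (cascade-correlation) construction that the abstract describes can be illustrated with a short sketch. The Python example below is a minimal, generic cascade-correlation loop in the style of Fahlman and Lebiere, not the paper's fast variant and without its dynamic learning-parameter schedule; the toy XOR data, sigmoid units, learning rates, epoch counts, and stopping threshold are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_output_weights(H, y, lr=0.5, epochs=2000):
    # Train the output weights by gradient descent on mean squared error,
    # with the hidden-unit activations in H held fixed (cascade-correlation
    # never retrains a hidden unit once it has been installed).
    w = rng.normal(scale=0.1, size=H.shape[1])
    for _ in range(epochs):
        out = sigmoid(H @ w)
        grad = H.T @ ((out - y) * out * (1.0 - out)) / len(y)
        w -= lr * grad
    return w

def train_candidate_unit(H, residual, lr=0.5, epochs=2000):
    # Train one candidate hidden unit to maximize the magnitude of the
    # covariance between its activation and the current residual error.
    v = rng.normal(scale=0.1, size=H.shape[1])
    for _ in range(epochs):
        a = sigmoid(H @ v)
        e = residual - residual.mean()
        sign = np.sign((a - a.mean()) @ e)
        grad = H.T @ (sign * e * a * (1.0 - a)) / len(residual)
        v += lr * grad          # gradient ascent on the correlation measure
    return v

# Toy data (XOR), purely illustrative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

# Start bottom-up: no hidden units, only the inputs plus a bias column.
H = np.hstack([X, np.ones((len(X), 1))])
for _ in range(3):                          # grow at most 3 hidden units
    w = train_output_weights(H, y)
    residual = y - sigmoid(H @ w)
    if np.mean(residual ** 2) < 1e-3:       # good enough, stop growing
        break
    v = train_candidate_unit(H, residual)   # new unit sees inputs and all earlier units
    H = np.hstack([H, sigmoid(H @ v)[:, None]])  # install (freeze) the new unit

w = train_output_weights(H, y)
print("hidden units added:", H.shape[1] - X.shape[1] - 1)
print("predictions:", np.round(sigmoid(H @ w), 2))

The bottom-up character lies in the growth loop: the network starts with no hidden units, each new unit is trained against the current residual error and then frozen, and only the output weights are ever retrained, which is what makes this family of constructive algorithms fast compared with retraining a fixed architecture end to end.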