AN INFORMATION THEORETICAL APPROACH TO NEURAL NETWORKS
Citation: Caro Lucas. AN INFORMATION THEORETICAL APPROACH TO NEURAL NETWORKS[J]. Journal of Systems Science and Complexity, 1993, 0(4)
Author: Caro Lucas
Affiliation: Electrical Engineering Department, Tehran University, P.O. Box 14155/6181, Tehran, Iran
Funding: This work was supported in part by Tehran University grant number 708.
Abstract: The purpose of this paper is to present a unified theory of several different neural networks that have been proposed for solving various computation, pattern recognition, imaging, optimization, and other problems. The functioning of these networks is characterized by Lyapunov energy functions. The relationship between the deterministic and stochastic neural networks is examined. The simulated annealing methods for finding the global optimum of an objective function, as well as their generalization by injecting noise into deterministic neural networks, are discussed. A statistical interpretation of the dynamic evolution of the different neural networks is presented. The problem of training different neural networks is investigated in this general framework. It is shown how this approach can be used not only for analyzing various neural networks, but also for the choice of the proper neural network for solving any given problem and the design of a training algorithm for the particular neural network.
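The two central objects in the abstract, a Lyapunov energy function that characterizes the network dynamics and a stochastic update whose injected noise is annealed away so that it reduces to the deterministic rule, can be illustrated with a small Hopfield-style network. This is a minimal sketch, not the paper's own formulation: the weight construction, temperature schedule, and Boltzmann update rule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, s):
    """Lyapunov energy E(s) = -1/2 * s^T W s for symmetric weights W."""
    return -0.5 * s @ W @ s

def anneal(W, s, T0=2.0, cooling=0.95, sweeps=100):
    """Simulated annealing: stochastic (Boltzmann) unit updates at a
    temperature T that is gradually lowered; as T -> 0 the update
    approaches the deterministic sign rule s_i = sign(W[i] @ s)."""
    T = T0
    for _ in range(sweeps):
        for i in range(len(s)):
            h = W[i] @ s                            # local field at unit i
            p = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # P(s_i = +1)
            s[i] = 1 if rng.random() < p else -1
        T *= cooling                                # cool toward deterministic limit
    return s

# Store one bipolar pattern via a Hebbian outer product (illustrative choice).
pattern = np.array([1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Corrupt one bit, then let the annealed stochastic dynamics relax the state.
noisy = pattern.copy()
noisy[0] *= -1
s = anneal(W, noisy.copy())
print(energy(W, noisy), energy(W, s))  # relaxed state has lower (or equal) energy
```

At high temperature the updates are nearly random, which lets the state escape poor local minima of the energy; as the temperature falls the dynamics become deterministic descent on the same Lyapunov function, which is the deterministic/stochastic correspondence the abstract refers to.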


Keywords: Neural networks, stochastic systems, information, energy entropy
This article is indexed in CNKI and other databases.
