Modified PRP Methods with Sufficient Descent Property and Their Convergence Properties
Citation: YU Gao-hang, GUAN Lü-tai. Modified PRP Methods with Sufficient Descent Property and Their Convergence Properties[J]. Acta Scientiarum Naturalium Universitatis Sunyatseni (Journal of Sun Yat-sen University, Natural Science Edition), 2006, 45(4): 11-14, 18.
Authors: YU Gao-hang, GUAN Lü-tai
Institution: Department of Scientific Computation and Computer Applications, Sun Yat-sen University, Guangzhou 510275, China
Funding: National Natural Science Foundation of China; Sun Yat-sen University scientific research and teaching reform project
Abstract: The conjugate gradient (CG) method is well suited to large-scale nonlinear optimization because of the simplicity of its iteration and its very low memory requirements. Among CG methods, the PRP method is generally regarded as one of the best numerical performers. However, for general nonconvex functions the PRP method is not guaranteed to converge globally, even with an exact line search. Based on a modified PRP formula, this paper proposes a class of conjugate gradient methods for unconstrained optimization that possess the sufficient descent property without requiring any line search. Under suitable conditions, global convergence results are established for these methods. Preliminary numerical results show that the modification is effective.
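For context, the classical PRP scheme and the sufficient descent condition referred to in the abstract are the standard ones sketched below; the paper's specific modified PRP formula is not reproduced here. With $g_k = \nabla f(x_k)$, nonlinear CG iterates
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
and the PRP choice of the parameter is
\[
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\mathsf T}\,(g_{k+1} - g_k)}{\|g_k\|^2}.
\]
Sufficient descent means there is a constant $c > 0$ such that $g_k^{\mathsf T} d_k \le -c\,\|g_k\|^2$ for all $k$.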

Keywords: unconstrained optimization; large-scale optimization; conjugate gradient method; global convergence
Article ID: 0529-6579(2006)04-0011-05
Received: 2005-07-04
Revised: 2005-07-04
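As a further illustration of the iteration described in the abstract, below is a minimal sketch of a generic PRP conjugate gradient loop. It uses the classical (unmodified) PRP parameter and a fixed step size in place of the paper's line-search-free rule, so it shows the family of methods rather than the authors' algorithm; the name prp_cg and its parameters are hypothetical choices for this sketch.

import numpy as np

def prp_cg(grad, x0, alpha=0.1, tol=1e-6, max_iter=10000):
    # Generic nonlinear conjugate gradient loop with the classical PRP beta.
    # Illustrative only: the fixed step size `alpha` stands in for the
    # paper's line-search-free step rule; this is NOT the authors' method.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:     # stop when the gradient is small
            break
        x = x + alpha * d               # fixed-length step along d
        g_new = grad(x)
        # classical PRP parameter: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d           # new search direction
        g = g_new
    return x

# Usage on a simple strictly convex quadratic: minimize 0.5 * ||x||^2
x_star = prp_cg(lambda x: x, np.array([3.0, -4.0]))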
