A new descent memory gradient method and its global convergence
Authors: Min Sun, Qingguo Bai
Institution: 1. Department of Mathematics and Information Science, Zaozhuang University, Zaozhuang, 277160, China
2. School of Management, Qufu Normal University, Rizhao, 276826, China
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has two attractive properties: (1) the search direction is a sufficient descent direction at every iteration, regardless of the line search used; (2) the search direction always satisfies the angle property, independently of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory gradient method is efficient on the given test problems.
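To make the two properties above concrete, the following is a minimal sketch of a generic memory gradient iteration (direction d_k = -g_k + beta_k * d_{k-1}). The rule for beta_k, the Armijo backtracking step, the parameter sigma, the function name memory_gradient, and the Rosenbrock test problem are all illustrative assumptions, not the authors' exact formulas; the beta_k shown is merely one choice that yields a sufficient descent condition and a uniform angle bound independently of the line search, in the spirit of the abstract.

```python
# Illustrative memory gradient sketch (NOT the authors' method).
# The choice of beta_k below satisfies beta_k * |g_k^T d_{k-1}| <= sigma * ||g_k||^2,
# hence g_k^T d_k <= -(1 - sigma) * ||g_k||^2 (sufficient descent) and a uniform
# angle bound, independently of the line search that is used afterwards.
import numpy as np

def memory_gradient(f, grad, x0, sigma=0.5, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first iteration: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search; any standard line search works,
        # since the descent property of d does not depend on it.
        t, c1 = 1.0, 1e-4
        while f(x + t * d) > f(x) + c1 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        # Illustrative memory parameter (assumed, for the sketch only).
        denom = abs(g_new.dot(d)) + np.linalg.norm(g_new) * np.linalg.norm(d) + 1e-16
        beta = sigma * g_new.dot(g_new) / denom
        d = -g_new + beta * d                 # memory gradient direction
        g = g_new
    return x, k

if __name__ == "__main__":
    # Rosenbrock function as a small unconstrained test problem.
    f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                               200.0 * (x[1] - x[0]**2)])
    x_star, iters = memory_gradient(f, grad, [-1.2, 1.0])
    print(x_star, iters)
```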
Keywords:
This article is indexed in SpringerLink and other databases.