Abstract
In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has two attractive properties: 1) the search direction is a sufficient descent direction at every iteration, regardless of the line search used; 2) the search direction always satisfies the angle property, independently of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory method is efficient on the given test problems.
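The paper's exact update formulas are not reproduced in this abstract, but the key property it claims, a direction that is sufficiently descent independently of the line search, can be illustrated with a related construction from the literature: the three-term PRP-type direction of Zhang, Zhou, and Li, which guarantees g_k^T d_k = -||g_k||^2 by design. The sketch below is a hypothetical illustration of that mechanism, not the authors' method; the function names and the Armijo parameters are assumptions.

```python
import numpy as np

def three_term_descent_direction(g, g_prev, d_prev):
    """Three-term PRP-type direction (Zhang-Zhou-Li style, not the
    paper's own formula): satisfies g^T d = -||g||^2 by construction,
    independently of the line search."""
    y = g - g_prev
    denom = np.dot(g_prev, g_prev)
    beta = np.dot(g, y) / denom          # PRP-type memory coefficient
    theta = np.dot(g, d_prev) / denom    # correction term cancelling g^T d_prev
    return -g + beta * d_prev - theta * y

def armijo_step(f, x, d, g, rho=0.5, c=1e-4, t=1.0):
    """Simple backtracking Armijo line search (illustrative parameters)."""
    gd = np.dot(g, d)
    while f(x + t * d) > f(x) + c * t * gd:
        t *= rho
    return t

def minimize(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # steepest descent on the first iteration
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # sufficient descent holds at every iterate: g^T d = -||g||^2
        assert np.isclose(np.dot(g, d), -np.dot(g, g))
        t = armijo_step(f, x, d, g)
        x_new = x + t * d
        g_new = grad(x_new)
        d = three_term_descent_direction(g_new, g, d)
        x, g = x_new, g_new
    return x
```

Because g_k^T d_k = -||g_k||^2 holds identically, the descent property needs no restart and no convexity assumption on f, which is the flavor of guarantee the abstract describes.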
This research is supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.
This paper was recommended for publication by Editor Shouyang WANG.
Sun, M., Bai, Q. A new descent memory gradient method and its global convergence. J Syst Sci Complex 24, 784–794 (2011). https://doi.org/10.1007/s11424-011-8150-0