Abstract
We study the use of the BFGS and DFP algorithms with step lengths of one for minimizing quadratic functions of only two variables. The updating formulae in this case imply nonlinear three-term recurrence relations between the eigenvalues of consecutive second derivative approximations, which are analysed in order to explain some gross inefficiencies that can occur. Specifically, the BFGS algorithm may require more than 10 iterations to achieve the first decimal place of accuracy, while the performance of the DFP method is far worse. The results help to explain why the DFP method is often less suitable than the BFGS algorithm for general unconstrained optimization calculations, and they show that quadratic functions provide much information about efficiency when the current vector of variables is too far from the solution for an asymptotic convergence analysis to apply.
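The setting the abstract describes can be sketched in code. The following is a minimal illustration (not taken from the paper): the standard BFGS updating formula applied with unit step lengths to a convex quadratic f(x) = ½xᵀAx in two variables, so that the second derivative approximation B is revised at every iteration. All function and variable names here are assumptions chosen for the sketch.

```python
import numpy as np

def bfgs_unit_steps(A, x0, B0, iterations):
    """BFGS with step length one on f(x) = 0.5 x^T A x.

    Returns the list of iterates and the final second derivative
    approximation B.  A is assumed symmetric positive definite.
    """
    x = np.asarray(x0, dtype=float)
    B = np.asarray(B0, dtype=float)
    history = [x.copy()]
    for _ in range(iterations):
        g = A @ x                       # gradient of the quadratic
        if np.linalg.norm(g) < 1e-12:   # stop once effectively converged
            break
        d = -np.linalg.solve(B, g)      # quasi-Newton search direction
        x_new = x + d                   # step length of one, no line search
        s = x_new - x                   # step  (equals d here)
        y = A @ s                       # gradient difference on a quadratic
        # Standard BFGS update of the approximation B
        B = (B
             - np.outer(B @ s, B @ s) / (s @ B @ s)
             + np.outer(y, y) / (y @ s))
        x = x_new
        history.append(x.copy())
    return history, B
```

On a strictly convex quadratic the curvature condition yᵀs = sᵀAs > 0 always holds, so each update preserves positive definiteness of B, and the update satisfies the secant condition Bₖ₊₁s = y exactly. Running this sketch with an ill-conditioned 2×2 matrix A (and comparing against the analogous DFP formula) is one way to observe numerically the slow eigenvalue behaviour that the paper analyses.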
Powell, M.J.D. How bad are the BFGS and DFP methods when the objective function is quadratic? Mathematical Programming 34, 34–47 (1986). https://doi.org/10.1007/BF01582161