Abstract
In this paper, we study large scale nonlinear systems of equations and nonlinear least squares problems. We present subspace methods for solving these two special optimization problems. The subspace methods share the characteristic of forcing the next iterate to lie in a low dimensional subspace. The main technique is to construct subproblems in low dimensions, so that the computational cost of each iteration is reduced compared with standard approaches. The subspace approach offers a practical way to handle large scale optimization problems, which are attracting increasing attention. Indeed, quite a few well-known techniques can be viewed as subspace methods, such as the conjugate gradient method, the limited memory quasi-Newton method, the projected gradient method, and the null space method.
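The core idea described above can be illustrated with a small sketch: restrict each Gauss–Newton step for min ‖r(x)‖² to a low dimensional subspace, so that only a small least squares problem is solved per iteration. The particular subspace chosen here (spanned by the negative gradient and the previous step) and all function names are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np

def subspace_gauss_newton(r, J, x0, iters=50):
    """Sketch of a subspace Gauss-Newton method for min 0.5*||r(x)||^2.

    Each step is restricted to a (at most) 2-D subspace spanned by the
    negative gradient and the previous step -- a hypothetical choice in
    the spirit of two-dimensional subspace methods, not the paper's
    exact construction.
    """
    x = np.asarray(x0, dtype=float)
    prev_step = None
    for _ in range(iters):
        rk, Jk = r(x), J(x)
        g = Jk.T @ rk                          # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < 1e-12:
            break
        cols = [-g]
        if prev_step is not None and np.linalg.norm(prev_step) > 1e-14:
            cols.append(prev_step)
        # Orthonormal basis S of the low dimensional subspace.
        S, _ = np.linalg.qr(np.column_stack(cols))
        # Small least squares subproblem in subspace coordinates:
        #   min_z ||r(x) + (J S) z||,  with z of dimension <= 2.
        z, *_ = np.linalg.lstsq(Jk @ S, -rk, rcond=None)
        step = S @ z
        x = x + step
        prev_step = step
    return x
```

Because the subproblem has at most two unknowns regardless of the dimension of x, its cost per iteration stays small even when the full problem is very large.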
This work is partially supported by Chinese NSF grants 10231060, 10831006 and by CAS grant kjcx-yw-s7.
Cite this article
Yuan, YX. Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim Eng 10, 207–218 (2009). https://doi.org/10.1007/s11081-008-9064-0