Abstract
This paper describes two optimal subgradient algorithms for solving structured large-scale convex constrained optimization problems. More specifically, the first algorithm is optimal both for smooth problems with Lipschitz continuous gradients and for Lipschitz continuous nonsmooth problems, while the second algorithm is optimal for Lipschitz continuous nonsmooth problems. In addition, we consider two classes of problems: (i) a convex objective over a simple closed convex domain, where the orthogonal projection onto the feasible domain is efficiently available; and (ii) a convex objective with a simple convex functional constraint. If we equip our algorithms with an appropriate prox-function, the associated subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for several applications to show the efficiency of the proposed schemes.
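To illustrate the setting of class (i), the following is a minimal sketch (not the paper's algorithms) of a generic projected subgradient method on a simple domain: the box [0, 1]^n, whose orthogonal projection is available in closed form as a componentwise clip. The objective ||Ax - b||_1, the diminishing step size, and all function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    """Closed-form orthogonal projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def projected_subgradient(A, b, iters=500):
    """Generic projected subgradient sketch for min ||Ax - b||_1 over [0, 1]^n.

    Uses a normalized subgradient with a diminishing 1/sqrt(k) step size
    and returns the best iterate found; this is the classical scheme, not
    the optimal subgradient algorithms of the paper.
    """
    n = A.shape[1]
    x = np.zeros(n)
    best_x, best_f = x.copy(), np.abs(A @ x - b).sum()
    for k in range(1, iters + 1):
        g = A.T @ np.sign(A @ x - b)          # a subgradient of ||Ax - b||_1
        step = 1.0 / (np.sqrt(k) * max(np.linalg.norm(g), 1e-12))
        x = project_box(x - step * g)         # cheap closed-form projection
        f = np.abs(A @ x - b).sum()
        if f < best_f:
            best_x, best_f = x.copy(), f
    return best_x, best_f
```

The point of the sketch is structural: because the projection is a closed-form clip, each iteration costs only matrix-vector products, which is what makes such schemes viable for large-scale problems.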
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Ahookhosh, M., Neumaier, A. Optimal subgradient algorithms for large-scale convex optimization in simple domains. Numer Algor 76, 1071–1097 (2017). https://doi.org/10.1007/s11075-017-0297-x
Keywords
- Structured convex optimization
- Nonsmooth optimization
- Optimal complexity
- First-order black-box information
- Subgradient method
- High-dimensional data