Abstract.
We present a unified convergence rate analysis of iterative methods for solving the variational inequality problem. Our results are based on certain error bounds; they subsume and extend the linear and sublinear rates of convergence established in several previous studies. We also derive a new error bound for $\gamma$-strictly monotone variational inequalities. The class of algorithms covered by our analysis is fairly broad. It includes some classical methods for variational inequalities, e.g., the extragradient, matrix splitting, and proximal point methods. For these methods, our analysis gives estimates not only of linear convergence (which has been studied extensively), but also of sublinear convergence, depending on the properties of the solution. In addition, our framework includes a number of algorithms to which previous studies are not applicable, such as infeasible projection methods, a separation-projection method, (inexact) hybrid proximal point methods, and some splitting techniques. Finally, our analysis covers certain feasible descent methods of optimization, for which similar convergence rate estimates have recently been obtained by Luo [14].
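To make the setting concrete, the following is a minimal sketch (not code from the paper) of one of the classical methods the abstract names, the extragradient method of Korpelevich, applied to an illustrative monotone affine variational inequality. The map $F(x) = Mx + q$, the box constraint set, and all parameter values are assumptions chosen for the example.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n.
    return np.clip(x, lo, hi)

def extragradient(F, project, x0, tau=0.1, iters=500):
    """Extragradient method for VI(F, C):
    find x* in C with <F(x*), y - x*> >= 0 for all y in C.
    Each iteration takes a predictor and a corrector projected step,
    both with the same stepsize tau."""
    x = x0.copy()
    for _ in range(iters):
        y = project(x - tau * F(x))   # predictor step at x
        x = project(x - tau * F(y))   # corrector step uses F at y
    return x

# Illustrative (strongly) monotone affine example: F(x) = M x + q,
# M positive definite, C = [0, 1]^2. The unconstrained root of F,
# x = [1/3, 1/3], lies inside C, so it solves the VI.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q
sol = extragradient(F, lambda z: project_box(z, 0.0, 1.0), np.zeros(2))
# sol is approximately [1/3, 1/3]
```

The stepsize tau must be small relative to the Lipschitz constant of F (here $\|M\| = 3$, so tau = 0.1 is safe); since this example is strongly monotone, the iterates converge linearly, consistent with the linear-rate regime discussed in the abstract.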
Additional information
Received: April 17, 2001 / Accepted: December 10, 2002 Published online: April 10, 2003
Research of the author is partially supported by CNPq Grant 200734/95–6, by PRONEX-Optimization, and by FAPERJ.
Key Words. Variational inequality – error bound – rate of convergence
Mathematics Subject Classification (2000): 90C30, 90C33, 65K05
Cite this article
Solodov, M. Convergence rate analysis of iterative algorithms for solving variational inequality problems. Math. Program., Ser. A 96, 513–528 (2003). https://doi.org/10.1007/s10107-002-0369-z