Abstract
The main results for the weak convergence of stochastic approximation algorithms are given in this chapter, using the ideas of Sections 7.3 and 7.4. The proofs are relatively simple, and the required conditions are weaker than those of the probability one method. Section 1 lists the main conditions used for both the martingale difference noise case and the correlated "exogenous" noise case, when the step size does not change with time. For a single process with fixed step size ε_n = ε, there can only be convergence in distribution as n → ∞. With arbitrarily high probability, for small enough ε the limit process is concentrated in an arbitrarily small neighborhood of some limit set of the limit mean ODE. This is a special case of the limit results in the theorem.
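The fixed-step-size behavior described above can be illustrated numerically. The sketch below (my own illustration, not from the chapter) runs a constant-step stochastic approximation iterate θ_{n+1} = θ_n + ε(g(θ_n) + ξ_n) with g(θ) = −θ, so the mean ODE θ̇ = −θ has the limit point 0; the function name and the choice of g are illustrative assumptions.

```python
import random

def sa_fixed_step(eps, n_steps, theta0=5.0, seed=0):
    """Constant-step stochastic approximation with martingale difference noise.

    Iterates theta_{n+1} = theta_n + eps * (g(theta_n) + noise_n),
    where g(theta) = -theta, so the mean ODE d(theta)/dt = -theta
    has the unique limit point 0.
    """
    rng = random.Random(seed)
    theta = theta0
    for _ in range(n_steps):
        noise = rng.gauss(0.0, 1.0)  # i.i.d. noise is a martingale difference sequence
        theta += eps * (-theta + noise)
    return theta

# For small eps, the iterate settles into a small random neighborhood of 0:
# it converges in distribution, not with probability one, and the stationary
# spread shrinks as eps decreases.
theta_small = sa_fixed_step(eps=0.01, n_steps=10_000)
theta_large = sa_fixed_step(eps=0.5, n_steps=10_000)
```

With ε = 0.01 the iterate ends up very close to the ODE's limit point 0, while a larger ε leaves it fluctuating in a wider neighborhood, consistent with the "arbitrarily small neighborhood for small enough ε" statement in the abstract.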
© 1997 Springer Science+Business Media New York
Cite this chapter
Kushner, H.J., Yin, G.G. (1997). Weak Convergence Methods for General Algorithms. In: Stochastic Approximation Algorithms and Applications. Applications of Mathematics, vol 35. Springer, New York, NY. https://doi.org/10.1007/978-1-4899-2696-8_8
Print ISBN: 978-1-4899-2698-2
Online ISBN: 978-1-4899-2696-8