Abstract
Much of the classical work in stochastic approximation dealt with the situation where the “noise” in each observation Y_n is a martingale difference, that is, where there is a function g_n(·) of θ such that E[Y_n | Y_i, i < n, θ_0] = g_n(θ_n) [17, 40, 45, 47, 56, 79, 86, 132, 154, 159, 169, 181]. Then we can write Y_n = g_n(θ_n) + δM_n, where δM_n is a martingale difference. This “martingale difference noise” model is still of considerable importance. It arises, for example, where Y_n has the form Y_n = F_n(θ_n, ψ_n), where the ψ_n are mutually independent. The convergence theory is relatively easy in this case, because the noise terms can be dealt with by well-known and relatively simple probability inequalities for martingale sequences. This chapter is devoted to this martingale difference noise case. Nevertheless, the ODE, compactness, and stability techniques to be introduced are of basic importance for stochastic approximation, and will be used in subsequent chapters.
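The model described in the abstract can be illustrated by a minimal Robbins–Monro-style iteration. This is only a sketch under assumed concrete choices (it is not from the chapter itself): the mean dynamics g(θ) = 2 − θ, so the sought root is θ* = 2; the martingale difference noise δM_n is taken to be i.i.d. Gaussian (independent noise is a special case of a martingale difference sequence); and the step sizes are ε_n = 1/(n+1), satisfying Σ ε_n = ∞ and Σ ε_n² < ∞.

```python
# Sketch of stochastic approximation with martingale difference noise:
# theta_{n+1} = theta_n + eps_n * Y_n, where Y_n = g(theta_n) + dM_n.
# All specific choices (g, sigma, step sizes) are illustrative assumptions.
import random

def robbins_monro(theta0=0.0, n_steps=20000, sigma=0.5, seed=1):
    random.seed(seed)
    theta = theta0
    for n in range(n_steps):
        g = 2.0 - theta                # mean dynamics g(theta_n); root at 2
        dM = random.gauss(0.0, sigma)  # i.i.d. noise: a martingale difference
        Y = g + dM                     # observation Y_n = g(theta_n) + dM_n
        eps = 1.0 / (n + 1)            # decreasing step sizes
        theta += eps * Y
    return theta

print(robbins_monro())  # drifts toward the root theta* = 2
```

Under the stated step-size conditions, θ_n converges to θ* with probability one; the chapter's ODE method makes this precise by relating the iterates to the flow of dθ/dt = g(θ).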
Copyright information
© 1997 Springer Science+Business Media New York
Cite this chapter
Kushner, H.J., Yin, G.G. (1997). Convergence with Probability One: Martingale Difference Noise. In: Stochastic Approximation Algorithms and Applications. Applications of Mathematics, vol 35. Springer, New York, NY. https://doi.org/10.1007/978-1-4899-2696-8_5
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4899-2698-2
Online ISBN: 978-1-4899-2696-8