Summary
In this chapter, we establish a framework for formal comparison of several leading optimization algorithms, providing guidance to practitioners on when to use, or not use, a particular method. The focus is on five general algorithm forms: random search, simultaneous perturbation stochastic approximation (SPSA), simulated annealing, evolution strategies, and genetic algorithms. We summarize the available theoretical results on rates of convergence for the five algorithm forms and then use these results to draw some preliminary conclusions about their relative efficiency. Our aim is to sort out some of the competing claims of efficiency and to suggest a structure for comparison that is more general and transferable than the usual problem-specific numerical studies.
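To fix ideas for one of the five algorithm forms, the following is a minimal sketch of basic SPSA in Python. It is not the chapter's own implementation; the gain-sequence constants (`a`, `c`, `alpha`, `gamma`) and the quadratic test loss are illustrative assumptions, though the decay exponents follow common SPSA practice.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=1000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimize `loss` by simultaneous perturbation stochastic approximation.

    Uses decaying gain sequences a_k = a/(k+1)^alpha and c_k = c/(k+1)^gamma;
    all constants here are illustrative, not tuned values from the chapter.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        a_k = a / (k + 1) ** alpha
        c_k = c / (k + 1) ** gamma
        # Rademacher (+/-1) perturbation of all coordinates simultaneously
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # Two loss evaluations yield a full gradient estimate,
        # regardless of the problem dimension
        g_hat = (loss(theta + c_k * delta) -
                 loss(theta - c_k * delta)) / (2.0 * c_k) / delta
        theta = theta - a_k * g_hat
    return theta

# Illustrative use: minimize a simple quadratic with minimizer (3, 3)
theta_star = spsa_minimize(lambda t: np.sum((t - 3.0) ** 2),
                           theta0=[0.0, 0.0])
```

The key contrast with finite-difference methods is the per-iteration cost: only two loss evaluations are needed per gradient estimate, independent of dimension, which is central to the efficiency comparisons the chapter develops.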
Keywords
- Genetic Algorithm
- Loss Function
- Random Search
- Evolution Strategy
- Simultaneous Perturbation Stochastic Approximation
Copyright information
© 2006 Springer-Verlag London Limited
Cite this chapter
Spall, J.C., Hill, S.D., Stark, D.R. (2006). Theoretical Framework for Comparing Several Stochastic Optimization Approaches. In: Calafiore, G., Dabbene, F. (eds) Probabilistic and Randomized Methods for Design under Uncertainty. Springer, London. https://doi.org/10.1007/1-84628-095-8_3
DOI: https://doi.org/10.1007/1-84628-095-8_3
Publisher Name: Springer, London
Print ISBN: 978-1-84628-094-8
Online ISBN: 978-1-84628-095-5
eBook Packages: Engineering (R0)