Abstract
Many well-structured problems across many areas can be formulated as the optimization of a multivariate energy function E(x1, x2, …, xn). In most cases, such problems are computationally hard: the only previously known way to guarantee finding the global optimum is an exponential-time algorithm, which is too expensive in practice. Other traditional methods, such as local search, simulated annealing, tabu search, and evolutionary algorithms, have no general conditions for identifying the global optimum and stopping the search. This paper presents a new cooperative optimization algorithm that is capable of finding the global optimum in polynomial time, though without guarantee. It has both sufficient conditions and necessary conditions for identifying global optima and for trimming the search space. It is guaranteed to converge linearly to a solution, which must be the global optimum if it is a consensus solution. The convergence is also insensitive to disturbances of its initial or intermediate solutions. The algorithm also provides a lower bound on the energy function that is guaranteed to improve after each iteration. Its power is demonstrated on nonlinear optimization problems arising from an early vision problem, shape from shading of a polyhedron. Most importantly, to make the algorithm complete, i.e., always capable of finding the global optimum, we propose a general framework, based on the lattice concept from abstract algebra, for constructing cooperative global optimization algorithms more powerful than the primary one.
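The flavor of the cooperative scheme the abstract describes can be sketched on a toy instance. The sketch below is an illustrative assumption, not the chapter's exact formulation: it assumes the energy decomposes into per-variable sub-energies E_i (unary terms plus half of each incident pairwise term), and has each variable maintain an "assignment function" psi_i(x_i) that mixes its own sub-energy with its neighbours' current assignment functions via a cooperation weight `lam`. The function names, the uniform neighbour weights, and the chain/binary restriction are all choices made here for brevity.

```python
def cooperative_optimize(unary, pair, lam=0.5, iters=50):
    """Toy cooperative-optimization sketch on a chain of binary variables.

    Energy: E(x) = sum_i unary[i][x_i] + sum_i pair[i][x_i][x_{i+1}].
    Decomposition: E_i = unary[i] plus half of each incident pair term.
    Each variable i keeps an assignment function psi[i][x_i]; one iteration
    re-minimizes its sub-energy over the neighbouring variables while
    averaging in the neighbours' previous assignment functions.
    """
    n = len(unary)
    psi = [[0.0, 0.0] for _ in range(n)]
    for _ in range(iters):
        new = []
        for i in range(n):
            row = []
            for xi in (0, 1):
                best = float("inf")
                # minimize jointly over the (at most two) neighbouring values
                for xl in (0, 1):
                    for xr in (0, 1):
                        e = unary[i][xi]
                        if i > 0:
                            e += 0.5 * pair[i - 1][xl][xi]
                        if i < n - 1:
                            e += 0.5 * pair[i][xi][xr]
                        spread, cnt = 0.0, 0
                        if i > 0:
                            spread += psi[i - 1][xl]; cnt += 1
                        if i < n - 1:
                            spread += psi[i + 1][xr]; cnt += 1
                        if cnt:
                            spread /= cnt
                        best = min(best, (1 - lam) * e + lam * spread)
                row.append(best)
            new.append(row)
        psi = new
    # candidate solution: each variable's individually best value;
    # in the chapter's terminology, agreement across variables is a consensus
    return [min((0, 1), key=lambda v: psi[i][v]) for i in range(n)]
```

On a separable instance (all pairwise terms zero) the cooperative terms become constants in x_i, so the iteration reproduces the elementwise optimum, matching the intuition that consensus is easy when the sub-problems do not conflict.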
Copyright information
© 2004 Kluwer Academic Publishers
Cite this paper
Huang, X. (2004). A General Framework for Constructing Cooperative Global Optimization Algorithms. In: Floudas, C.A., Pardalos, P. (eds) Frontiers in Global Optimization. Nonconvex Optimization and Its Applications, vol 74. Springer, Boston, MA. https://doi.org/10.1007/978-1-4613-0251-3_11
DOI: https://doi.org/10.1007/978-1-4613-0251-3_11
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4613-7961-4
Online ISBN: 978-1-4613-0251-3