Artificial neural networks (ANNs) and evolutionary algorithms (EAs) are both abstractions of natural processes. In the mid 1990s, they were combined into a single computational model in order to exploit the learning power of ANNs and the adaptive capabilities of EAs. Evolutionary ANNs (EANNs) are the outcome of such a model. They refer to a special class of ANNs in which evolution is another fundamental form of adaptation in addition to learning [52–57]. The essence of EANNs is their adaptability to a dynamic environment. The two forms of adaptation in EANNs – namely evolution and learning – make their adaptation to a dynamic environment much more effective and efficient. In a broader sense, EANNs can be regarded as a general framework for adaptive systems – in other words, systems that can change their architectures and learning rules appropriately without human intervention.
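As an illustration only (not the method developed in this chapter), the interplay of the two adaptation loops can be sketched in a few lines: an outer evolutionary loop selects and mutates network weight vectors, while an inner "lifetime learning" step refines each individual before selection. All names are hypothetical, and stochastic hill climbing stands in for a proper gradient-based learner to keep the sketch self-contained:

```python
import math
import random

random.seed(0)

# Toy task: XOR, learned by a 2-2-1 tanh network (9 weights in total)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_W = 9

def forward(w, x):
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def mse(w):
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def learn(w, steps=20, step_size=0.1):
    # "Learning" (inner loop): stochastic hill climbing on the weights,
    # a stand-in for backpropagation or any other lifetime learner.
    best = list(w)
    for _ in range(steps):
        cand = [wi + random.gauss(0, step_size) for wi in best]
        if mse(cand) < mse(best):
            best = cand
    return best

def evolve(pop_size=20, generations=30):
    # "Evolution" (outer loop): truncation selection plus Gaussian mutation
    # over a population of weight vectors.
    pop = [[random.uniform(-1, 1) for _ in range(N_W)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [learn(w) for w in pop]   # each individual learns during its lifetime
        pop.sort(key=mse)               # fitness = error on the task
        parents = pop[:pop_size // 2]
        children = [[wi + random.gauss(0, 0.3) for wi in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=mse)

best = evolve()
print("final MSE: %.4f" % mse(best))
```

Note the Lamarckian design choice in this sketch: weights improved by learning are passed directly to the next generation. Evolving architectures or learning rules, as the EANN framework allows, would replace the fixed 2-2-1 topology above with an evolvable genotype.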
References
Abbass HA, Sarker R, Newton C 2001 PDE: A Pareto-frontier differential evolution approach for multi-objective optimization problems. In: Kim J-H (ed.) Proc. IEEE Conf. Evolutionary Computation (CEC2001), 27-30 May, Seoul, South Korea. IEEE Press, Piscataway, NJ: 971-978.
Abbass HA 2002 The self-adaptive Pareto differential evolution algorithm. In: Fogel DB, El-Sharkawi MA, Yao X, Greenwood G, Iba H, Marrow P, Shackleton M (eds.) Proc. IEEE Conf. Evolutionary Computation (CEC2002), 12-17 May, Honolulu, HI. IEEE Press, Piscataway, NJ: 831-836.
Abbass HA 2003 Speeding up backpropagation using multiobjective evolutionary algorithms. Neural Computation, 15(11): 2705-2726.
Abbass HA 2003 Pareto neuro-evolution: constructing ensemble of neural networks using multi-objective optimization. In: Sarker R, Reynolds R, Abbass H, Tan KC, McKay B, Essam D, Gedeon T (eds.) Proc. IEEE Conf. Evolutionary Computation (CEC2003), 8-12 December, Canberra, Australia. IEEE Press, Piscataway, NJ: 2074-2080.
Abbass HA 2003 Pareto neuro-ensemble. In: Gedeon TD, Chun L, Fung C (eds.) Proc. 16th Australian Joint Conf. Artificial Intelligence, 3-5 December, Perth, Australia. Springer-Verlag, Berlin: 554-566.
Angeline PJ, Saunders GM, Pollack JB 1994 An evolutionary algorithm that constructs recurrent neural networks. IEEE Trans. Neural Networks, 5(1): 54-65.
Baldi PF, Hornik K 1995 Learning in linear neural networks: a survey. IEEE Trans. Neural Networks, 6(4): 837-858.
Blake C, Merz C UCI repository of machine learning databases. (available online at http://www.ics.uci.edu/~mlearn/MLRepository.html - last accessed September 2007).
Belew RK, McInerney J, Schraudolph NN 1991 Evolving networks: using genetic algorithm with connectionist learning. Technical Report CS90-174 (revised), Computer Science and Engineering Department (C-014), University of California, San Diego, February.
Bollé D, Dominguez DRC, Amari S 2000 Mutual information of sparsely coded associative memory with self-control and ternary neurons. Neural Networks, 1: 452-462.
Brown G, Wyatt JL 2003 Negative correlation learning and the ambiguity family of ensemble methods. In: Windeatt T, Roli F (eds.) Proc. Intl. Workshop Multiple Classifier Systems, 11-13 June, Guildford, UK. Springer-Verlag, Berlin: 266-275.
Brown G, Wyatt JL, Harris R, Yao X 2005 Diversity creation methods: a survey and categorisation. J. Information Fusion, 6: 5-20.
Chandra A, Yao X 2006 Ensemble learning using multi-objective evolutionary algorithms. J. Mathematical Modeling and Algorithms, 5(4): 417-445.
Darwen PJ, Yao X 1996 Every niching method has its niche: fitness sharing and implicit sharing compared. In: Ebeling W, Rechenberg I, Schwefel H-P, Voight H-M (eds.) Parallel Problem Solving from Nature (PPSN) IV, 22-26 September, Berlin, Germany. Lecture Notes in Computer Science 1141. Springer-Verlag, Berlin: 398-407.
Darwen PJ, Yao X 1997 Speciation as automatic categorical modularization. IEEE Trans. Evolutionary Computation, 1: 101-108.
Dietterich TG 1998 Machine-learning research: four current directions. AI Magazine, 18(4): 97-136.
Finnoff W, Hergert F, Zimmermann HG 1993 Improving model selection by nonconvergent methods. Neural Networks, 6: 771-783.
Fogel LJ, Owens AJ, Walsh MJ 1966 Artificial Intelligence Through Simulated Evolution. Wiley, New York, NY.
Fogel GB, Fogel DB 1995 Continuous evolutionary programming: analysis and experiments. Cybernetics and Systems, 26: 79-90.
Fogel DB 1995 Evolutionary Computation: Toward a New Philosophy of Machine Intelligence. IEEE Press, Piscataway, NJ.
Goldberg DE 1989 Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA.
Hancock PJB 1992 Genetic algorithms and permutation problems: a comparison of recombination operators for neural net structure specification. In: Whitley D, Schaffer JD (eds.) Proc. Intl. Workshop Combinations Genetic Algorithms Neural Networks (COGANN-92), 6 June, Maryland. IEEE Computer Society Press, Los Alamitos, CA: 108-122.
Hansen LK, Salamon P 1990 Neural network ensembles. IEEE Trans. Pattern Analysis and Machine Intelligence, 12(10): 993-1001.
Hashem S 1993 Optimal linear combinations of neural networks. PhD dissertation. School of Industrial Engineering, Purdue University, West Lafayette, IN, December.
Deb K 2001 Multi-Objective Optimization Using Evolutionary Algorithms. Wiley, Chichester, UK.
Khare V, Yao X, Sendhoff B 2006 Multi-network evolutionary systems and automatic problem decomposition. Intl. J. General Systems, 35(3): 259-274.
Krogh A, Vedelsby J 1995 Neural network ensembles, cross validation, and active learning. Neural Information Processing Systems, 7: 231-238.
Krogh A, Sollich P 1997 Statistical mechanics of ensemble learning. Physical Review E, 55: 811-825.
Kwok TY, Yeung DY 1997 Constructive algorithms for structure learning in feedforward neural networks for regression problems. IEEE Trans. Neural Networks, 8: 630-645.
Kwok TY, Yeung DY 1997 Objective functions for training new hidden units in constructive neural networks. IEEE Trans. Neural Networks, 8: 1131-1148.
Lehtokangas M 1999 Modeling with constructive backpropagation. Neural Networks, 12: 707-716.
Lee CY, Yao X 2004 Evolutionary programming using the mutations based on the Lévy probability distribution. IEEE Trans. Evolutionary Computation, 8(1): 1-13.
Liu Y, Yao X 1999 Ensemble learning via negative correlation. Neural Networks, 12: 1399-1404.
Liu Y, Yao X, Higuchi T 2000 Evolutionary ensembles with negative correlation learning. IEEE Trans. Evolutionary Computation, 4(4): 380-387.
MacQueen J 1967 Some methods for classification and analysis of multivariate observations. In: Proc. 5th Berkeley Symp. Mathematical Statistics and Probability, Berkeley, CA. University of California Press, 1: 281-297.
Mahfoud SW 1995 Niching methods for genetic algorithms. PhD Thesis, Department of General Engineering, University of Illinois, Urbana-Champaign, IL.
Islam MM, Yao X, Murase K 2003 A constructive algorithm for training cooperative neural network ensembles. IEEE Trans. Neural Networks, 14: 820-834.
Mulgrew B, Cowan CFN 1988 Adaptive Filters and Equalizers. Kluwer, Boston, MA.
Odri SV, Petrovacki DP, Krstonosic GA 1993 Evolutional development of a multilevel neural network. Neural Networks, 6(4): 583-595.
Opitz DW, Shavlik JW 1996 Generating accurate and diverse members of a neural-network ensemble. Neural Information Processing Systems, 8: 535-541.
Opitz D, Maclin R 1999 Popular ensemble methods: an empirical study. J. Artificial Intelligence Research, 11: 169-198.
Perrone MP 1993 Improving regression estimation: averaging methods for variance reduction with extensions to general convex measure optimization. PhD Dissertation, Department of Physics, Brown University, Providence, RI, May.
Prechelt L 1994 Proben1 - a set of neural network benchmark problems and benchmarking rules. Technical Report 21/94, Fakultät für Informatik, University of Karlsruhe, Germany, September.
Prechelt L 1995 Some notes on neural learning algorithm benchmarking. Neurocomputing, 9(3): 343-347.
Rissanen J 1978 Modeling by shortest data description. Automatica, 14: 465-471.
Rumelhart DE, Hinton GE, Williams RJ 1986 Learning internal representations by error propagation. In: Rumelhart DE, McClelland JL (eds.) Parallel Distributed Processing: Explorations in the Microstructures of Cognition, I. MIT Press, Cambridge, MA: 318-362.
Sharkey AJC 1996 On combining artificial neural nets. Connection Science, 8(3/4): 299-313.
Schaffer JD, Whitley D, Eshelman LJ 1992 Combinations of genetic algorithms and neural networks: a survey of the state of the art. In: Whitley D, Schaffer JD (eds.) Proc. Intl. Workshop Combinations Genetic Algorithms Neural Networks (COGANN-92), 6 June, Maryland. IEEE Computer Society Press, Los Alamitos, CA: 1-37.
Srinivas N, Deb K 1994 Multi-objective function optimization using non-dominated sorting genetic algorithms. Evolutionary Computation, 2(3): 221-248.
Storn R, Price K 1996 Minimizing the real functions of the ICEC’96 contest by differential evolution. In: Fukuda T, Furuhashi T, Back T, Kitano H, Michalewicz (eds.) Proc. IEEE Intl. Conf. Evolutionary Computation, 20-22 May, Nagoya, Japan. IEEE Computer Society Press, Los Alamitos, CA: 842-844.
Syswerda G 1991 A study of reproduction in generational and steady state genetic algorithms. In: Rawlins GJE (ed.) Foundations of Genetic Algorithms. Morgan Kaufmann, San Mateo, CA: 94-101.
Yao X 1991 Evolution of connectionist networks. In: Proc. Intl. Symp. AI, Reasoning & Creativity, Griffith University, Queensland, Australia, 49-52.
Yao X 1993 An empirical study of genetic operators in genetic algorithms. Microprocessors and Microprogramming, 38: 707-714.
Yao X 1993 A review of evolutionary artificial neural networks. Int. J. Intelligent Systems, 8(4): 539-567.
Yao X 1993 Evolutionary artificial neural networks. Int. J. Neural Systems, 4(3): 203-222.
Yao X 1994 The evolution of connectionist networks. In: Dartnall T. (ed.) Artificial Intelligence and Creativity. Kluwer, Dordrecht, The Netherlands: 233-243.
Yao X 1995 Evolutionary artificial neural networks. In: Kent A, Williams JG (eds.) Encyclopedia of Computer Science and Technology 33, Marcel Dekker, New York, NY: 137-170.
Yao X, Shi Y 1995 A preliminary study on designing artificial neural networks using co-evolution. In: Toumodge S, Lee TH, Sundarajan N (eds.) Proc. IEEE Intl. Conf. Intelligent Control Instrumentation, 2-8 July, Singapore. IEEE Computer Society Press, Los Alamitos, CA: 149-154.
Yao X 1999 Evolving artificial neural networks. Proc. IEEE, 87: 1423-1447.
Yao X, Liu Y 1996 Ensemble structure of evolutionary artificial neural networks. In: Fukuda T, Furuhashi T, Back T, Kitano H, Michalewicz (eds.) Proc. 1996 IEEE Intl. Conf. Evolutionary Computation (ICEC96), 20-22 May, Nagoya, Japan. IEEE Computer Society Press, Los Alamitos, CA: 659-664.
Yao X, Liu Y 1997 A new evolutionary system for evolving artificial neural networks. IEEE Trans. Neural Networks, 8(3): 694-713.
Yao X, Liu Y 1998 Making use of population information in evolutionary artificial neural networks. IEEE Trans. Systems, Man, and Cybernetics B, 28(3): 417-425.
Yao X, Liu Y, Darwen P 1996 How to make best use of evolutionary learning. In: Stocker R, Jelinek H, Durnota B (eds.) Complex Systems: From Local Interactions to Global Phenomena. IOS Press, Amsterdam, The Netherlands: 229-242.
Yao X, Liu Y, Lin G 1999 Evolutionary programming made faster. IEEE Trans. Evolutionary Computation, 3(2): 82-102.
Yao X, Islam MM 2008 Evolving artificial neural network ensembles. IEEE Computational Intelligence Magazine, 3(1) (in press).
© 2008 Springer-Verlag Berlin Heidelberg
Islam, M.M., Yao, X. (2008). Evolving Artificial Neural Network Ensembles. In: Fulcher, J., Jain, L.C. (eds) Computational Intelligence: A Compendium. Studies in Computational Intelligence, vol 115. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-78293-3_20
Print ISBN: 978-3-540-78292-6
Online ISBN: 978-3-540-78293-3