Abstract
As the literature shows, training an ensemble of networks is an effective way to improve performance over a single network. However, there are several methods for constructing the ensemble, and no comprehensive results show which one is the most appropriate. In this paper we present a comparison of eleven different methods. We have trained ensembles of 3, 9, 20 and 40 networks to cover a wide spectrum of ensemble sizes. The results show that the improvement obtained beyond 9 networks in the ensemble depends on the method but is usually marginal. The best-performing method, called “Decorrelated”, adds a penalty term to the usual Backpropagation error function to decorrelate the outputs of the networks in the ensemble.
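The “Decorrelated” idea described above (following Rosen, cited in the references) can be sketched as an error function that combines the usual squared error of the network being trained with a penalty on the correlation between its errors and those of previously trained ensemble members. The function name, the weighting factor `lam`, and the exact form shown here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def decorrelated_loss(y_true, y_new, y_prev, lam=0.5):
    """Squared error of the new network plus a decorrelation penalty.

    y_true : target outputs.
    y_new  : outputs of the network currently being trained.
    y_prev : list of output arrays from previously trained networks.
    lam    : penalty weight (hypothetical value; a tunable hyperparameter).
    """
    # Ordinary mean squared error of the new network.
    mse = np.mean((y_true - y_new) ** 2)

    # Penalty: correlation between the new network's errors and
    # the errors of each previously trained network. Minimizing it
    # pushes the networks toward making uncorrelated errors.
    penalty = 0.0
    for y_p in y_prev:
        penalty += np.mean((y_true - y_p) * (y_true - y_new))

    return mse + lam * penalty
```

With no previously trained networks the penalty vanishes and the loss reduces to plain MSE; each additional member whose errors correlate with the new network's errors increases the loss, which is what drives the decorrelation during training.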
References
Tumer, K., Ghosh, J.: Error correlation and error reduction in ensemble classifiers. Connection Science 8(3&4), 385–404 (1996)
Raviv, Y., Intrator, N.: Bootstrapping with Noise: An Effective Regularization Technique. Connection Science 8(3&4), 355–372 (1996)
Drucker, H., Cortes, C., Jackel, D., et al.: Boosting and Other Ensemble Methods. Neural Computation 6, 1289–1301 (1994)
Verikas, A., Lipnickas, A., et al.: Soft combination of neural classifiers: A comparative study. Pattern Recognition Letters 20, 429–444 (1999)
Breiman, L.: Bagging Predictors. Machine Learning 24, 123–140 (1996)
Freund, Y., Schapire, R.: Experiments with a New Boosting Algorithm. In: Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156 (1996)
Rosen, B.: Ensemble Learning Using Decorrelated Neural Networks. Connection Science 8(3&4), 373–383 (1996)
Auda, G., Kamel, M.: EVOL: Ensembles Voting On-Line. In: Proc. of the World Congress on Computational Intelligence, pp. 1356–1360 (1998)
Liu, Y., Yao, X.: A Cooperative Ensemble Learning System. In: Proc. of the World Congress on Computational Intelligence, pp. 2202–2207 (1998)
Jang, M., Cho, S.: Ensemble Learning Using Observational Learning Theory. In: Proceedings of the International Joint Conference on Neural Networks, vol. 2, pp. 1281–1286 (1999)
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Fernández-Redondo, M., Hernández-Espinosa, C., Torres-Sospedra, J. (2004). Multilayer Feedforward Ensembles for Classification Problems. In: Pal, N.R., Kasabov, N., Mudi, R.K., Pal, S., Parui, S.K. (eds) Neural Information Processing. ICONIP 2004. Lecture Notes in Computer Science, vol 3316. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30499-9_114
DOI: https://doi.org/10.1007/978-3-540-30499-9_114
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23931-4
Online ISBN: 978-3-540-30499-9