Abstract
A crucial issue in the design of combinational classifier systems is maintaining diversity among the outputs of the base classifiers, since more diverse outputs generally lead to a more suitable final decision. This paper proposes a new approach for generating diversity during the creation of an ensemble, together with a new combining classifier system. The main idea of the proposed system is the heuristic retraining of some of the base classifiers. First, an initial base classifier is trained; then, with regard to its weaknesses, additional base classifiers are retrained heuristically, each looking at the data from its own perspective. The retrained classifiers concentrate on the error-prone data and therefore tend to cast different votes on sample points that lie close to the decision boundaries and are likely to be misclassified. Like other ensemble learning approaches, the proposed ensemble meta-learner can be built on top of any base classifier. The main contribution is to preserve some of the advantages of the base classifiers while resolving some of their drawbacks, and consequently to improve classification performance. The study investigates how the performance of any base classifier can be reinforced by focusing on a small set of crucial data points, and it also shows that duplicating all "difficult" data points, as boosting does, does not always yield a better training set. Experiments show significant improvements in the accuracy of consensus classification, and the proposed algorithm outperforms some of the best methods in the literature. Finally, based on the experimental results, the authors claim that both forcing crucial data points into the training set and eliminating them from it can, under the right conditions, lead to more accurate results.
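The following is a minimal sketch of the idea described above, not the authors' exact algorithm: train one base classifier, locate the "difficult" (misclassified, near-boundary) training points, and build retrained variants that either force those points back into the training set or eliminate them, before combining all members by majority vote. The function name `heuristic_retrained_ensemble`, the decision-tree base learner, and the assumption of integer-encoded class labels are illustrative choices, not part of the paper.

```python
# Sketch only: illustrates heuristic retraining around error-prone points,
# not the authors' published procedure.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def heuristic_retrained_ensemble(X, y, n_variants=4, random_state=0):
    # Step 1: train the initial base classifier on the full training set.
    base = DecisionTreeClassifier(random_state=random_state).fit(X, y)

    # Step 2: mark the error-prone points (here: those the base model misclassifies).
    hard = base.predict(X) != y

    ensemble = [base]
    for k in range(n_variants):
        if k % 2 == 0:
            # Variant A: force the difficult points into the training set again
            # (oversample them), so this member focuses on the boundary region.
            X_k = np.vstack([X, X[hard]])
            y_k = np.concatenate([y, y[hard]])
        else:
            # Variant B: eliminate the difficult points, so this member models
            # the "clean" part of the data instead.
            X_k, y_k = X[~hard], y[~hard]
        clf = DecisionTreeClassifier(random_state=random_state + k + 1)
        ensemble.append(clf.fit(X_k, y_k))

    def predict(X_new):
        # Step 3: combine the members by simple majority vote
        # (assumes non-negative integer class labels).
        votes = np.stack([m.predict(X_new) for m in ensemble]).astype(int)
        return np.apply_along_axis(
            lambda col: np.bincount(col).argmax(), 0, votes
        )

    return predict
```

In this sketch the two variants correspond to the abstract's two scenarios: forcing crucial points into the training set versus removing them, with each retrained member seeing the data from a different perspective before the ensemble votes.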