Abstract
This paper presents a new hybrid genetic algorithm (HGA) for feature selection (FS), called HGAFS. HGAFS embeds a new local search operation into the HGA to fine-tune the search during feature selection. The local search operates on the distinctness and informativeness of the input features, measured through their correlation information; the aim is to steer the search toward selecting less correlated (more distinct) features. This reduces information redundancy in the generated subset of salient features. We tested our method on several real-world datasets and compared its performance with the results of other existing algorithms. HGAFS was found to produce consistently better performance.
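The correlation-guided idea behind the local search can be illustrated with a minimal sketch. This is not the authors' algorithm: the function name, the greedy strategy, and the use of absolute Pearson correlation as the redundancy measure are all assumptions made here purely to show how preferring less correlated features reduces redundancy in a selected subset.

```python
import numpy as np

def correlation_guided_selection(X, n_select):
    """Illustrative greedy sketch (not the paper's HGAFS): repeatedly pick
    the feature least correlated, on average, with those already selected,
    so that near-duplicate features are unlikely to be chosen together."""
    n_features = X.shape[1]
    # |Pearson correlation| between every pair of feature columns
    corr = np.abs(np.corrcoef(X, rowvar=False))
    # Seed with the feature whose mean correlation to all features is lowest
    selected = [int(np.argmin(corr.mean(axis=0)))]
    while len(selected) < n_select:
        remaining = [f for f in range(n_features) if f not in selected]
        # Average correlation of each candidate to the already-selected set
        scores = [corr[f, selected].mean() for f in remaining]
        selected.append(remaining[int(np.argmin(scores))])
    return sorted(selected)
```

Given two near-duplicate columns and two independent ones, such a rule selects at most one of the duplicates, which is the redundancy-reduction effect the abstract describes.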
© 2009 Springer-Verlag Berlin Heidelberg
Kabir, M.M., Shahjahan, M., Murase, K. (2009). Involving New Local Search in Hybrid Genetic Algorithm for Feature Selection. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5864. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10684-2_17
Print ISBN: 978-3-642-10682-8
Online ISBN: 978-3-642-10684-2