Abstract
In this paper, an evolutionary algorithm for the global induction of decision trees is presented. In contrast to greedy, top-down approaches, it searches for the whole tree at once. Specialised genetic operators are proposed that allow modifying both the tests used in non-terminal nodes and the structure of the tree. The proposed approach was validated on both artificial and real-life datasets. Experimental results show that the algorithm finds classifiers that are competitive in terms of accuracy and, especially, complexity.
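The global approach summarised above can be pictured as an evolutionary loop over whole trees: a fitness function scores each candidate tree on the full training set with a size penalty, and variation operators edit node tests or tree structure directly. The following is a minimal illustration only, not the paper's implementation: the toy dataset, the mutation-only scheme (no crossover), the operator probabilities, and the penalty weight `alpha` are all assumptions made here for the sketch.

```python
import random

random.seed(0)  # deterministic toy run

# Hypothetical toy dataset: two numeric features, class 1 iff x + y > 1.
# (The paper itself uses artificial and UCI datasets.)
DATA = [((i / 10, j / 10), int(i / 10 + j / 10 > 1.0))
        for i in range(11) for j in range(11)]

class Node:
    """Decision-tree node: either a univariate test (feature, threshold)
    with two children, or a leaf carrying a class label."""
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, label=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.label = left, right, label

    def is_leaf(self):
        return self.label is not None

    def predict(self, x):
        if self.is_leaf():
            return self.label
        child = self.left if x[self.feature] <= self.threshold else self.right
        return child.predict(x)

    def size(self):
        return 1 if self.is_leaf() else 1 + self.left.size() + self.right.size()

def random_tree(depth=2):
    if depth == 0 or random.random() < 0.3:
        return Node(label=random.randint(0, 1))
    return Node(feature=random.randint(0, 1), threshold=random.random(),
                left=random_tree(depth - 1), right=random_tree(depth - 1))

def copy_tree(t):
    if t.is_leaf():
        return Node(label=t.label)
    return Node(feature=t.feature, threshold=t.threshold,
                left=copy_tree(t.left), right=copy_tree(t.right))

def collect(t):
    out = [t]
    if not t.is_leaf():
        out += collect(t.left) + collect(t.right)
    return out

def fitness(tree, alpha=0.01):
    # Global criterion: reward accuracy on the whole dataset and
    # penalise tree size, trading accuracy against complexity.
    acc = sum(tree.predict(x) == y for x, y in DATA) / len(DATA)
    return acc - alpha * tree.size()

def mutate(tree):
    """Structure-aware operators: relabel or expand a leaf,
    re-draw a test's threshold, or prune a subtree back to a leaf."""
    node = random.choice(collect(tree))
    if node.is_leaf():
        if random.random() < 0.5:
            node.label = 1 - node.label           # relabel the leaf
        else:                                     # expand leaf into a test
            node.label = None
            node.feature = random.randint(0, 1)
            node.threshold = random.random()
            node.left = Node(label=random.randint(0, 1))
            node.right = Node(label=random.randint(0, 1))
    elif random.random() < 0.5:
        node.threshold = random.random()          # modify the node's test
    else:                                         # prune subtree to a leaf
        node.feature = node.threshold = node.left = node.right = None
        node.label = random.randint(0, 1)
    return tree

def evolve(pop_size=30, generations=60):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # elitist selection
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(copy_tree(random.choice(survivors)))
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)
```

The key design point the sketch tries to convey is that fitness is evaluated on the whole tree at once, so a split is kept only if it pays for its added complexity globally, rather than being chosen by a local, greedy criterion at each node.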
This work was supported by the grant W/WI/1/02 from Białystok Technical University.
Copyright information
© 2005 Springer Science+Business Media, Inc.
Cite this paper
Krętowski, M., Grześ, M. (2005). Global learning of decision trees by an evolutionary algorithm. In: Saeed, K., Pejaś, J. (eds) Information Processing and Security Systems. Springer, Boston, MA. https://doi.org/10.1007/0-387-26325-X_36
Publisher Name: Springer, Boston, MA
Print ISBN: 978-0-387-25091-5
Online ISBN: 978-0-387-26325-0
eBook Packages: Computer Science (R0)