Abstract
C4.5 is one of the best-known inductive learning algorithms. It can build decision trees as well as production rules, a common formalism for representing and applying knowledge in many real-world domains. C4.5 generates its production rules from raw (unpruned) trees, and it has been shown that the resulting ruleset is usually both simpler and more accurate than the decision tree from which it was formed. This research shows that generating production rules from pruned trees instead usually produces significantly simpler rulesets than generating them from raw trees. This reduction in complexity is achieved without reducing prediction accuracy, and the new approach also requires significantly less induction time. Experiments in a wide variety of natural domains illustrate these points. They further show that, as the training set grows, the new method scales up better than the old one in terms of ruleset size, number of rules, and learning time, an important characteristic for learning algorithms used in data mining.
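The tree-to-rules step the abstract refers to can be sketched as follows. This is a minimal illustration, not Quinlan's actual implementation: the dict-based tree encoding, the attribute names, and the hand-pruned example tree are all assumptions made for the sketch. The point it demonstrates is structural: each root-to-leaf path becomes one initial production rule, so replacing a subtree with a single leaf during pruning directly removes rules before rule simplification even begins.

```python
# Hypothetical sketch of converting a decision tree into production rules,
# one rule per leaf. Tree format (an assumption of this sketch): internal
# nodes are {"test": (attribute, threshold), "le": subtree, "gt": subtree};
# leaves are {"class": label}.

def tree_to_rules(node, conditions=()):
    """Return a list of (conditions, class) rules, one per leaf of the tree."""
    if "class" in node:                      # leaf: emit the accumulated path
        return [(conditions, node["class"])]
    attr, thr = node["test"]
    return (tree_to_rules(node["le"], conditions + ((attr, "<=", thr),))
            + tree_to_rules(node["gt"], conditions + ((attr, ">", thr),)))

# A raw (unpruned) toy tree with four leaves.
raw_tree = {
    "test": ("petal_len", 2.5),
    "le": {"class": "setosa"},
    "gt": {
        "test": ("petal_wid", 1.8),
        "le": {
            "test": ("petal_len", 5.0),      # low-support split a pruner might cut
            "le": {"class": "versicolor"},
            "gt": {"class": "virginica"},
        },
        "gt": {"class": "virginica"},
    },
}

# The same tree after pruning: the low-support subtree is collapsed into a
# majority-class leaf, so the tree, and hence the initial ruleset, shrinks.
pruned_tree = {
    "test": ("petal_len", 2.5),
    "le": {"class": "setosa"},
    "gt": {
        "test": ("petal_wid", 1.8),
        "le": {"class": "versicolor"},
        "gt": {"class": "virginica"},
    },
}

raw_rules = tree_to_rules(raw_tree)
pruned_rules = tree_to_rules(pruned_tree)
print(len(raw_rules), len(pruned_rules))  # fewer rules from the pruned tree
```

Because every leaf yields exactly one rule, the raw tree here produces four rules and the pruned tree three; on real trees, where pruning removes many low-support subtrees, the gap the paper measures is correspondingly larger.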
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Zheng, Z. (1998). Scaling up the rule generation of C4.5. In: Wu, X., Kotagiri, R., Korb, K.B. (eds) Research and Development in Knowledge Discovery and Data Mining. PAKDD 1998. Lecture Notes in Computer Science, vol 1394. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-64383-4_29
Print ISBN: 978-3-540-64383-8
Online ISBN: 978-3-540-69768-8