Abstract
The alternating decision tree (ADTree) is a successful classification technique that combines decision trees with the predictive accuracy of boosting into a set of interpretable classification rules. The original formulation of the tree induction algorithm restricted attention to binary classification problems. This paper empirically evaluates several wrapper methods for extending the algorithm to the multiclass case by splitting the problem into several two-class problems. Seeking a more natural solution, we then adapt the multiclass LogitBoost and AdaBoost.MH procedures to induce alternating decision trees directly. Experimental results confirm that these procedures are comparable in accuracy with wrapper methods based on the original ADTree formulation, while inducing much smaller trees.
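To make the wrapper idea concrete, the following is a minimal one-vs-rest sketch in Python. It assumes a hypothetical binary learner (such as a two-class ADTree) exposed through a scikit-learn-style interface with a `fit(X, y)` method and a real-valued `decision_function(X)`; the class name and interface are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class OneVsRestWrapper:
    """Handles a K-class problem by training one binary model per class
    (the one-vs-rest wrapper strategy described in the abstract)."""

    def __init__(self, make_binary_learner):
        # make_binary_learner() must return an object with fit(X, y) and
        # decision_function(X) -> real-valued margins (assumed interface).
        self.make_binary_learner = make_binary_learner
        self.classes_ = None
        self.models_ = []

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            # Relabel: +1 for the current class, -1 for all other classes.
            y_bin = np.where(y == c, 1, -1)
            model = self.make_binary_learner()
            model.fit(X, y_bin)
            self.models_.append(model)
        return self

    def predict(self, X):
        # Predict the class whose binary model outputs the largest margin.
        margins = np.column_stack([m.decision_function(X) for m in self.models_])
        return self.classes_[np.argmax(margins, axis=1)]
```

Note that each of the K binary problems grows its own tree, so total model size and prediction cost scale with the number of classes; the direct multiclass procedures evaluated in the paper grow a single tree instead, which is why they induce much smaller models.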
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Holmes, G., Pfahringer, B., Kirkby, R., Frank, E., Hall, M. (2002). Multiclass Alternating Decision Trees. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. ECML 2002. Lecture Notes in Computer Science, vol 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44036-9
Online ISBN: 978-3-540-36755-0
eBook Packages: Springer Book Archive