Abstract
Regression trees are tree-based models for prediction problems in which the response variable is numeric. They differ from the better-known classification (decision) trees only in that a numeric value, rather than a class label, is associated with each leaf. Model trees extend regression trees by associating leaves with multivariate linear models. In this paper a method for the data-driven construction of model trees is presented, namely the Stepwise Model Tree Induction (SMOTI) method. Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and splitting nodes, which partition the sample space. In this way, the multivariate linear model associated with each leaf is built efficiently and stepwise. SMOTI has been evaluated in an empirical study and compared with other model tree induction systems.
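The abstract's distinction between splitting nodes (which partition the sample space) and regression nodes (which perform straight-line regression) can be illustrated with a minimal sketch. This is not SMOTI itself — the method's actual stepwise procedure interleaves both node types during induction — but a toy model tree with one splitting node and two regression leaves, on hypothetical piecewise-linear data with an assumed split threshold of 5.0:

```python
import numpy as np

def fit_line(x, y):
    """One straight-line regression step: least-squares fit of y = a*x + b."""
    a, b = np.polyfit(x, y, 1)
    return a, b

def predict_line(coef, x):
    a, b = coef
    return a * x + b

# Toy data: an exactly piecewise-linear response with a break at x = 5
# (hypothetical; chosen so each partition is a single straight line).
x = np.linspace(0.0, 10.0, 200)
y = np.where(x < 5.0, 2.0 * x + 1.0, -x + 16.0)

# Splitting node: partition the sample space at the threshold.
threshold = 5.0
left_mask = x < threshold
right_mask = ~left_mask

# Regression nodes: one straight-line regression per partition.
left_model = fit_line(x[left_mask], y[left_mask])
right_model = fit_line(x[right_mask], y[right_mask])

def model_tree_predict(xq):
    """Route a query through the splitting node, then apply the leaf's line."""
    coef = left_model if xq < threshold else right_model
    return predict_line(coef, xq)

print(model_tree_predict(2.0))  # left leaf: 2*2 + 1 = 5
print(model_tree_predict(8.0))  # right leaf: -8 + 16 = 8
```

A single global line would fit this data poorly; splitting first and regressing per partition recovers both segments exactly, which is the intuition behind combining the two node types.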
© 2001 Springer-Verlag Berlin Heidelberg
Malerba, D., Appice, A., Bellino, A., Ceci, M., Pallotta, D. (2001). Stepwise Induction of Model Trees. In: Esposito, F. (eds) AI*IA 2001: Advances in Artificial Intelligence. AI*IA 2001. Lecture Notes in Computer Science(), vol 2175. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45411-X_3
Print ISBN: 978-3-540-42601-1
Online ISBN: 978-3-540-45411-3