Abstract
Model trees are an extension of regression trees that associate multiple regression models with their leaves. This paper presents a method for the top-down induction of model trees, the Stepwise Model Tree Induction (SMOTI) method. Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and split nodes, which partition the sample space. The multiple linear model associated with each leaf is then obtained by combining the straight-line regressions found along the path from the root to the leaf. In this way, internal regression nodes contribute to the definition of multiple models and have a "global" effect, while straight-line regressions at leaves have only "local" effects. This property of SMOTI has been evaluated in an empirical study involving both real and artificial data.
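To make the two node types concrete, the following is a minimal sketch (not the authors' implementation) of how a prediction could be composed in a SMOTI-style tree: regression nodes add a straight-line term that is shared by every leaf below them ("global" effect), split nodes only route the example, and the leaf adds a final straight-line term ("local" effect). The names (Regress, Split, Leaf, predict) and the example coefficients are hypothetical, and the sketch deliberately omits SMOTI's residualisation of the data after each regression node.

```python
# Toy illustration of composing a leaf model from straight-line regressions
# met along the root-to-leaf path. Hypothetical structure, not SMOTI itself.
from dataclasses import dataclass
from typing import Union


@dataclass
class Regress:          # regression node: straight-line term b0 + b1 * x[var]
    var: str
    intercept: float
    slope: float
    child: "Node"


@dataclass
class Split:            # split node: partitions the sample space on x[var]
    var: str
    threshold: float
    left: "Node"
    right: "Node"


@dataclass
class Leaf:             # leaf: final straight-line term with a "local" effect
    var: str
    intercept: float
    slope: float


Node = Union[Regress, Split, Leaf]


def predict(node: Node, x: dict, acc: float = 0.0) -> float:
    """Combine the straight-line regressions found along the path."""
    if isinstance(node, Regress):       # "global" effect, shared by the subtree
        acc += node.intercept + node.slope * x[node.var]
        return predict(node.child, x, acc)
    if isinstance(node, Split):         # routes the example, adds no term
        child = node.left if x[node.var] <= node.threshold else node.right
        return predict(child, x, acc)
    return acc + node.intercept + node.slope * x[node.var]


# Example: x1 contributes to every leaf model, x2 and x3 only locally.
tree = Regress("x1", 1.0, 0.5,
               Split("x2", 10.0,
                     Leaf("x2", 0.0, 2.0),
                     Leaf("x3", -1.0, 0.3)))
print(predict(tree, {"x1": 4.0, "x2": 7.0, "x3": 2.0}))  # 1 + 0.5*4 + 2*7 = 17.0
```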
Cite this paper
Malerba, D., Appice, A., Ceci, M., Monopoli, M. (2002). Trading-Off Local versus Global Effects of Regression Nodes in Model Trees. In: Hacid, M.-S., Raś, Z.W., Zighed, D.A., Kodratoff, Y. (eds) Foundations of Intelligent Systems. ISMIS 2002. Lecture Notes in Computer Science, vol 2366. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48050-1_43