Abstract
In the framework of supervised classification and prediction modeling, this paper introduces a methodology based on a general formulation of combined model integration, aimed at improving the fit to the data. Unlike Generalized Additive Models (GAM), our approach combines not only, and not necessarily, estimates derived from smoothing functions, but also those provided by parametric or nonparametric models. Because of this multiple classifier combination, we have named this general class of models Generalized Additive Multi-Models (GAM-M). The estimation procedure iterates an inner algorithm, a variant of the backfitting algorithm, and an outer algorithm, a standard local scoring algorithm, until convergence. The performance of the GAM-M approach relative to alternative approaches is shown in applications on real data sets. The stability of the model estimates is evaluated by means of bootstrap and cross-validation. As a result, our methodology improves the goodness of fit of the model to the data while also providing stable estimates.
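To make the inner loop concrete, the following is a minimal sketch of backfitting for an additive model in the Gaussian (identity-link) case, where the outer local scoring loop is trivial. It is not the authors' GAM-M implementation: the function names (`backfit`, `linear_fit`, `running_mean`) and the two component fitters are illustrative choices, picked only to show how a parametric fit and a nonparametric smoother can be mixed across predictors, in the spirit of combining heterogeneous models.

```python
import numpy as np

def backfit(y, xs, fitters, tol=1e-6, max_iter=100):
    """Backfitting for y ~ alpha + sum_j f_j(x_j), identity link.

    xs: list of p predictor arrays; fitters: list of p callables
    (x, partial_residual) -> fitted values, one per predictor.
    """
    n, p = len(y), len(xs)
    alpha = y.mean()
    f = [np.zeros(n) for _ in range(p)]
    for _ in range(max_iter):
        delta = 0.0
        for j in range(p):
            # partial residual: remove intercept and all other components
            r = y - alpha - sum(f[k] for k in range(p) if k != j)
            new_fj = fitters[j](xs[j], r)
            new_fj -= new_fj.mean()  # center each component for identifiability
            delta = max(delta, np.abs(new_fj - f[j]).max())
            f[j] = new_fj
        if delta < tol:
            break
    return alpha, f

# Two heterogeneous component fitters: a parametric (linear) fit and a
# crude nonparametric running-mean smoother.
def linear_fit(x, r):
    return np.polyval(np.polyfit(x, r, 1), x)

def running_mean(x, r, k=15):
    order = np.argsort(x)
    out = np.empty_like(r)
    out[order] = np.convolve(r[order], np.ones(k) / k, mode="same")
    return out

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 300)
x2 = rng.uniform(-1, 1, 300)
y = 2.0 + 3.0 * x1 + np.sin(3 * x2) + rng.normal(0, 0.1, 300)
alpha, (f1, f2) = backfit(y, [x1, x2], [linear_fit, running_mean])
resid = y - alpha - f1 - f2
print(float(np.mean(resid**2)))
```

For a non-Gaussian response, the outer local scoring step would wrap this loop: form the adjusted dependent variable and weights from the current linear predictor, then backfit on those instead of on `y` directly.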
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Conversano, C., Siciliano, R., Mola, F. (2000). Supervised Classifier Combination through Generalized Additive Multi-model. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_16
Print ISBN: 978-3-540-67704-8
Online ISBN: 978-3-540-45014-6