Abstract
Boosting algorithms can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function which makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm compares favorably with its competitors on highly noisy data.
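To illustrate why truncating the loss helps, consider AdaBoost's exponential loss of the margin. The sketch below caps the loss below a margin threshold so that extreme outliers receive zero gradient weight; this is a minimal illustration of the truncation idea, not the paper's exact construction, and the threshold `tau` is an assumed parameter.

```python
import numpy as np

# AdaBoost minimizes the exponential loss phi(m) = exp(-m) of the margin
# m = y * f(x). Because |phi'(m)| is unbounded as m -> -inf, a single extreme
# outlier receives an exponentially large weight and can dominate training.
def exp_loss(margin):
    return np.exp(-np.asarray(margin, dtype=float))

# Illustrative truncation (an assumption, not the paper's exact formula):
# cap the loss at exp(-tau) for margins below tau. The truncated loss is
# constant there, so the gradient -- and hence the boosting weight -- of an
# extreme outlier vanishes.
def truncated_exp_loss(margin, tau=-2.0):
    return np.minimum(np.exp(-np.asarray(margin, dtype=float)), np.exp(-tau))

margins = np.array([-10.0, -1.0, 0.0, 2.0])
w_ada = exp_loss(margins)            # the outlier at m = -10 gets loss e^10
w_rob = truncated_exp_loss(margins)  # its loss is capped at e^2
```

Correctly classified points (positive margins) are penalized identically under both losses; only the tail behavior for large negative margins changes.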
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Kanamori, T., Takenouchi, T., Eguchi, S., Murata, N. (2004). The Most Robust Loss Function for Boosting. In: Pal, N.R., Kasabov, N., Mudi, R.K., Pal, S., Parui, S.K. (eds) Neural Information Processing. ICONIP 2004. Lecture Notes in Computer Science, vol 3316. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30499-9_76
Print ISBN: 978-3-540-23931-4
Online ISBN: 978-3-540-30499-9