The Most Robust Loss Function for Boosting

Conference paper
Neural Information Processing (ICONIP 2004)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3316)

Abstract

A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function that makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm compares favorably with competing methods on highly noisy data.
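The abstract gives the idea only at a high level, so the following is a minimal sketch of the gradient-descent view of boosting with a truncated loss. The capped exponential loss min(exp(-z), c), the cap parameter c, the learning rate, and the decision-stump learner are all illustrative assumptions, not the paper's exact positive-part truncation; the point is only that zeroing the loss gradient for examples with very negative margins bounds each example's influence on the ensemble.

```python
import numpy as np

def truncated_exp_grad(margin, c=5.0):
    """Gradient of an illustrative capped exponential loss.

    loss(z) = min(exp(-z), c): identical to AdaBoost's exponential
    loss for margins z > -log(c), constant (gradient zero) below it.
    NOTE: an assumption for illustration, not the paper's exact
    positive-part truncation.
    """
    g = -np.exp(-margin)            # gradient of the plain exponential loss
    g[margin < -np.log(c)] = 0.0    # truncation: extreme outliers stop pulling
    return g

def fit_stump(X, y, w):
    """Exhaustive weighted decision stump; returns (feature, threshold, sign)."""
    best, best_edge = None, -np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1.0, -1.0):
                pred = s * np.where(X[:, j] > t, 1.0, -1.0)
                edge = np.sum(w * y * pred)   # weighted correlation with labels
                if edge > best_edge:
                    best, best_edge = (j, t, s), edge
    return best

def stump_predict(X, stump):
    j, t, s = stump
    return s * np.where(X[:, j] > t, 1.0, -1.0)

def robust_boost(X, y, rounds=50, c=5.0, lr=0.5):
    """Functional-gradient boosting with the truncated loss (sketch).

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Each round fits a stump to the negative loss gradient, so examples
    whose margin has fallen below -log(c) receive weight zero.
    """
    F = np.zeros(len(y))
    ensemble = []
    for _ in range(rounds):
        w = -truncated_exp_grad(y * F, c)   # per-example weights >= 0
        if w.sum() <= 0:                    # every example truncated or fit
            break
        stump = fit_stump(X, y, w / w.sum())
        F += lr * stump_predict(X, stump)
        ensemble.append(stump)
    return ensemble
```

The contrast with AdaBoost sits in the weight line: the plain exponential loss assigns an example the weight exp(-y_i F(x_i)), which grows without bound as an outlier is misclassified ever more confidently, whereas the capped loss freezes that example's contribution once its margin drops below -log(c). This bounded-influence behavior is the robust-statistics concept the abstract invokes.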

Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kanamori, T., Takenouchi, T., Eguchi, S., Murata, N. (2004). The Most Robust Loss Function for Boosting. In: Pal, N.R., Kasabov, N., Mudi, R.K., Pal, S., Parui, S.K. (eds) Neural Information Processing. ICONIP 2004. Lecture Notes in Computer Science, vol 3316. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30499-9_76

  • DOI: https://doi.org/10.1007/978-3-540-30499-9_76

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-23931-4

  • Online ISBN: 978-3-540-30499-9
