Abstract
Building a single decision tree provides a simple model of the world, but it is often too simple or too specific. Over many years of experience in data mining, it has become clear that many models working together are better than one model doing it all. We have now become familiar with the idea of combining multiple models (like decision trees) into a single ensemble of models (to build a forest of trees).
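The chapter itself builds forests with R and Rattle; as a language-neutral sketch of the ensemble idea described above, the following uses Python's scikit-learn instead (the dataset and all parameter values here are illustrative, not taken from the book). Each tree in the forest is trained on a bootstrap sample of the data with a random subset of features considered at each split, and the forest's prediction aggregates the votes of the individual trees.

```python
# A minimal sketch of a random forest: many decision trees combined
# into one ensemble. Uses scikit-learn, not the book's R/Rattle tooling.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative dataset; any classification data would do.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 500 trees, each fit to a bootstrap sample with feature subsampling;
# the forest predicts by majority vote over the trees.
forest = RandomForestClassifier(n_estimators=500, random_state=42)
forest.fit(X_train, y_train)

accuracy = forest.score(X_test, y_test)
```

A single decision tree grown on the same data would typically be easier to read but less stable; the forest trades that interpretability for accuracy and robustness.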
Copyright information
© 2011 Springer Science+Business Media, LLC
About this chapter
Cite this chapter
Williams, G. (2011). Random Forests. In: Data Mining with Rattle and R. Use R. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9890-3_12
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-9889-7
Online ISBN: 978-1-4419-9890-3
eBook Packages: Mathematics and Statistics (R0)