Abstract
The performance of a single weak classifier can be improved by combining techniques such as bagging, boosting and the random subspace method. When applied to linear discriminant analysis, these techniques turn out to be useful in different situations: their performance is strongly affected by the choice of the base classifier and by the training sample size, and their usefulness also depends on the data distribution. In this paper, using the pseudo Fisher linear classifier as an example, we study how redundancy in the data feature set affects the performance of the random subspace method and bagging.
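As a rough illustration of the setting the abstract describes, the sketch below combines a pseudo Fisher linear classifier (Fisher discriminant computed with the Moore-Penrose pseudo-inverse, so it remains defined when the pooled covariance is singular) with the random subspace method. This is a minimal numpy-only reconstruction under stated assumptions (majority-vote combining, equal-size feature subsets), not the authors' exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_fisher_fit(X, y):
    # Fisher linear discriminant using the Moore-Penrose pseudo-inverse,
    # so the weights stay defined even for a singular pooled covariance
    # (the small-sample regime the paper is concerned with).
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    S = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    w = np.linalg.pinv(S) @ (m1 - m0)
    b = -w @ (m0 + m1) / 2          # threshold at the midpoint of the means
    return w, b

def predict(w, b, X):
    return (X @ w + b > 0).astype(int)

def random_subspace_ensemble(X, y, n_classifiers=25, subspace_dim=5):
    # Random subspace method: train each base classifier on a random
    # subset of the features rather than on a bootstrap of the samples.
    models = []
    for _ in range(n_classifiers):
        idx = rng.choice(X.shape[1], subspace_dim, replace=False)
        w, b = pseudo_fisher_fit(X[:, idx], y)
        models.append((idx, w, b))
    return models

def ensemble_predict(models, X):
    # Combine the base classifiers by majority vote (one of several
    # possible combining rules).
    votes = np.stack([predict(w, b, X[:, idx]) for idx, w, b in models])
    return (votes.mean(axis=0) > 0.5).astype(int)
```

Bagging would differ only in the resampling axis: each base classifier would see a bootstrap sample of the training objects with all features, instead of all objects in a random feature subspace.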
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Skurichina, M., Duin, R.P.W. (2001). Bagging and the Random Subspace Method for Redundant Feature Spaces. In: Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2001. Lecture Notes in Computer Science, vol 2096. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48219-9_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42284-6
Online ISBN: 978-3-540-48219-2