Abstract
Simple classifiers, including the linear discriminant classifier (LDC), have often been praised for their robustness and accuracy. Here we consider an online version of LDC applied to streaming data with concept drift. The classifier is trained on a moving window containing the latest N observations. Current approaches to determining the window size are mostly heuristic. The talk presents a framework within which a theoretical relationship can be sought between the window size and the classification error. The framework is based upon two ideas. First, past literature offers formal relationships between the size of the training set and the asymptotic accuracy for several classifier models. Such a relationship can be used as a guide to determining the moving window size. Second, after a sudden change, the “old” classifier may be better than an undertrained “new” classifier that uses only the data coming after the change. We state the optimal window size for the case of LDC applied to two Gaussian classes and a sudden change in the form of rotation and translation of the feature space. Simulation results are included in order to investigate the sensitivity of the theoretical results to violation of the underlying assumptions.
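A minimal sketch of the moving-window idea the abstract describes, assuming two Gaussian classes and a pooled-covariance LDC. This is a hypothetical illustration, not the authors' implementation: the classifier keeps only the latest N observations and refits the linear discriminant on that window, so older data are forgotten as the stream drifts.

```python
import numpy as np
from collections import deque

class WindowedLDC:
    """Linear discriminant classifier retrained on a moving window of
    the latest N observations (illustrative sketch; window_size is the
    parameter the cited framework seeks to choose theoretically)."""

    def __init__(self, window_size):
        # deque with maxlen implements the moving window: appending the
        # (N+1)-th observation silently discards the oldest one
        self.window = deque(maxlen=window_size)

    def update(self, x, y):
        """Add a labelled observation (feature vector x, label y in {0, 1})."""
        self.window.append((np.asarray(x, dtype=float), int(y)))

    def _fit(self):
        # Fit LDC on the current window: class means, priors, and a
        # pooled (shared) covariance estimate, as the Gaussian
        # equal-covariance model assumes
        X = np.array([x for x, _ in self.window])
        y = np.array([label for _, label in self.window])
        means, priors = {}, {}
        pooled = np.zeros((X.shape[1], X.shape[1]))
        for c in (0, 1):
            Xc = X[y == c]
            means[c] = Xc.mean(axis=0)
            priors[c] = len(Xc) / len(X)
            pooled += (Xc - means[c]).T @ (Xc - means[c])
        pooled /= max(len(X) - 2, 1)
        return means, priors, np.linalg.pinv(pooled)

    def predict(self, x):
        """Classify x with the linear discriminant functions
        g_c(x) = x' S^{-1} m_c - 0.5 m_c' S^{-1} m_c + log P(c)."""
        means, priors, inv = self._fit()
        x = np.asarray(x, dtype=float)
        scores = {
            c: x @ inv @ means[c]
               - 0.5 * means[c] @ inv @ means[c]
               + np.log(priors[c])
            for c in (0, 1)
        }
        return max(scores, key=scores.get)
```

A sudden change such as the rotation and translation of the feature space studied in the paper would enter this sketch through the stream fed to `update`; the window size then trades off reacting quickly to the change against estimating the means and covariance from too few post-change samples.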
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Kuncheva, L.I., Zliobaite, I. (2008). Linear Discriminant Classifier (LDC) for Streaming Data with Concept Drift. In: da Vitoria Lobo, N., et al. Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2008. Lecture Notes in Computer Science, vol 5342. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-89689-0_4
DOI: https://doi.org/10.1007/978-3-540-89689-0_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-89688-3
Online ISBN: 978-3-540-89689-0