Abstract
The problem of learning in non-stationary situations has rarely been a subject of study, even in the parametric case. Historically, the first papers on learning in non-stationary environments were published occasionally in the sixties and seventies. The proper tool for solving this type of problem seemed to be the dynamic stochastic approximation technique [1, 2], an extension of the Robbins-Monro procedure [3] to the non-stationary case. The traditional stochastic approximation procedure was also used [4, 5], with good effect, for tracking the root of a changing regression function.
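The Robbins-Monro recursion mentioned above, and the idea of keeping it able to track a drifting root, can be illustrated with a minimal sketch. The drifting-root model, the step-size floor, and all function names here are hypothetical choices made only for illustration, not the procedure analyzed in the chapter:

```python
import random

def noisy_m(x, t):
    """Noisy observation of a regression function M_t(x) = x - theta_t,
    whose root theta_t drifts slowly in time (hypothetical model)."""
    root = 0.5 + 0.001 * t          # slowly moving root theta_t
    return (x - root) + random.gauss(0.0, 0.1)

def robbins_monro_tracking(steps=5000, a0=0.5, seed=0):
    """Robbins-Monro recursion x_{n+1} = x_n - a_n * Y_n for the root of M.

    With the classical gains a_n = a0/n the estimate would freeze and lose
    a moving target, so this sketch floors the step size -- one simple way,
    in the spirit of dynamic stochastic approximation, to retain tracking.
    """
    random.seed(seed)
    x = 0.0
    for n in range(1, steps + 1):
        a_n = max(a0 / n, 0.01)     # bounded-below gain keeps the procedure alive
        x = x - a_n * noisy_m(x, n)
    return x
```

With these illustrative parameters the estimate follows the moving root with a small systematic lag proportional to the drift rate divided by the step-size floor.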
References
Dupač, V.: A dynamic stochastic approximation method. Ann. Math. Stat. 36, 1695–1702 (1965)
Dupač, V.: Stochastic approximations in the presence of trend. Neural Netw. 5, 283–288 (1966)
Robbins, H., Monro, S.: A stochastic approximation method. Ann. Math. Stat. 22, 400–407 (1951)
Watanabe, M.: On Robbins-Monro stochastic approximation method with time varying observations. Bull. Math. Statist. 16, 73–91 (1974)
Young, T., Westerberg, R.: Stochastic approximation with a non-stationary regression function. IEEE Trans. Inform. Theory 18, 518–519 (1972)
Fu, K.: Sequential Methods in Pattern Recognition and Machine Learning. Academic, New York (1968)
Tzypkin, J.: Learning algorithms of pattern recognition in non-stationary conditions. In: Watanabe, S. (ed.) Frontiers of Pattern Recognition, pp. 527–542. Academic Press, New York (1972)
Nishida, K., Yamauchi, K.: Learning, detecting, understanding, and predicting concept changes. In: International Joint Conference on Neural Networks. IJCNN 2009, pp. 2280–2287. IEEE (2009)
Minku, L.L., Yao, X.: DDD: a new ensemble approach for dealing with concept drift. IEEE Trans. Knowl. Data Eng. 24(4), 619–633 (2012)
Mahdi, O.A., Pardede, E., Cao, J.: Combination of information entropy and ensemble classification for detecting concept drift in data stream. In: Proceedings of the Australasian Computer Science Week Multiconference, p. 13. ACM (2018)
Liu, A., Zhang, G., Lu, J.: Fuzzy time windowing for gradual concept drift adaptation. In: IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), pp. 1–6. IEEE (2017)
Li, P., Wu, X., Hu, X., Wang, H.: Learning concept-drifting data streams with random ensemble decision trees. Neurocomputing 166, 68–83 (2015)
Elwell, R., Polikar, R.: Incremental learning of concept drift in nonstationary environments. IEEE Trans. Neural Netw. 22(10), 1517–1531 (2011)
Alippi, C., Boracchi, G., Roveri, M.: Just-in-time classifiers for recurrent concepts. IEEE Trans. Neural Netw. Learn. Syst. 24(4), 620–634 (2013)
Zliobaite, I., Bifet, A., Pfahringer, B., Holmes, G.: Active learning with drifting streaming data. IEEE Trans. Neural Netw. Learn. Syst. 25(1), 27–39 (2014)
Zhang, T., Zhang, Q., Wang, Q.: Model detection for functional polynomial regression. Comput. Stat. Data Anal. 70, 183–197 (2014)
Yun, U., Lee, G.: Sliding window based weighted erasable stream pattern mining for stream data applications. Futur. Gener. Comput. Syst. 59, 1–20 (2016)
Yin, X., Huang, K., Hao, H.: DE2: dynamic ensemble of ensembles for learning nonstationary data. Neurocomputing 165, 14–22 (2015)
Ye, Y., Squartini, S., Piazza, F.: Online sequential extreme learning machine in nonstationary environments. Neurocomputing 116, 94–101 (2013)
Souto Maior Barros, R., Carvalho Santos, S.G.T.: A large-scale comparison of concept drift detectors. Inf. Sci. 451–452, 348–370 (2018)
Escovedo, T., Koshiyama, A., Abs da Cruz, A., Vellasco, M.: Detecta: abrupt concept drift detection in non-stationary environments. Appl. Soft Comput. 62, 119–133 (2018)
Webb, G.I., Kuan Lee, L., Petitjean, F., Goethals, B.: Understanding concept drift. CoRR (2017). arXiv:1704.00362
Zambon, D., Alippi, C., Livi, L.: Concept drift and anomaly detection in graph streams. IEEE Trans. Neural Netw. Learn. Syst. 1–14 (2018)
Gama, J., Žliobaitė, I., Bifet, A., Pechenizkiy, M., Bouchachia, A.: A survey on concept drift adaptation. ACM Comput. Surv. (CSUR) 46(4), 44:1–44:37 (2014)
Ditzler, G., Roveri, M., Alippi, C., Polikar, R.: Learning in nonstationary environments: a survey. IEEE Comput. Intell. Mag. 10(4), 12–25 (2015)
Aizerman, M., Braverman, E., Rozonoer, L.: Theoretical foundations of the potential function method in pattern recognition learning. Autom. Remote Control 25, 821–837 (1964)
Révész, P.: How to apply the method of stochastic approximation in the nonparametric estimation of a regression function. Mathematische Operationsforschung und Statistik Series Statistics 8, 119–126 (1977)
Braverman, E., Rozonoer, L.: Convergence of random processes in machine learning theory. Autom. Remote Control 30, 44–64 (1969)
Sorour, E.: On the convergence of the dynamic stochastic approximation method for stochastic non-linear multidimensional dynamic systems. Cybernetica 14, 28–37 (1978)
Uosaki, K.: Application of stochastic approximation to the tracking of stochastic non-linear dynamic systems. Int. J. Control 18, 1233–1247 (1973)
Uosaki, K.: Some generalizations of dynamic stochastic approximation process. Ann. Stat. 2, 1042–1048 (1974)
Chung, K.: On a stochastic approximation method. Ann. Math. Stat. 25, 463–483 (1954)
Watanabe, M.: On convergence of asymptotically optimal discriminant functions for pattern classification problem. Bull. Math. Statist. 16, 23–34 (1974)
Tucker, H.: A Graduate Course in Probability. Academic, New York (1967)
Efromovich, S.: Nonparametric Curve Estimation. Methods, Theory and Applications. Springer, New York (1999)
Duda, P., Jaworski, M., Rutkowski, L.: On ensemble components selection in data streams scenario with reoccurring concept-drift. In: 2017 IEEE Symposium Series on Computational Intelligence (SSCI), pp. 1–7 (2017)
Duda, P., Jaworski, M., Rutkowski, L.: Convergent time-varying regression models for data streams: tracking concept drift by the recursive Parzen-based generalized regression neural networks. Int. J. Neural Syst. 28(02), 1750048 (2018)
Rutkowski, L.: Adaptive probabilistic neural-networks for pattern classification in time-varying environment. IEEE Trans. Neural Netw. 15, 811–827 (2004)
Rutkowski, L.: Generalized regression neural networks in time-varying environment. IEEE Trans. Neural Netw. 15 (2004)
Jaworski, M.: Regression function and noise variance tracking methods for data streams with concept drift. Int. J. Appl. Math. Comput. Sci. 28(3), 559–567 (2018)
Jaworski, M., Duda, P., Rutkowski, L., Najgebauer, P., Pawlak, M.: Heuristic regression function estimation methods for data streams with concept drift. Lecture Notes in Computer Science 10246, 726–737 (2017)
Pietruczuk, L., Rutkowski, L., Maciej, J., Duda, P.: The Parzen kernel approach to learning in non-stationary environment. In: 2014 International Joint Conference on Neural Networks (IJCNN), pp. 3319–3323 (2014)
Duda, P., Pietruczuk, L., Jaworski, M., Krzyzak, A.: On the Cesaro-means-based orthogonal series approach to learning time-varying regression functions. In: Lecture Notes in Artificial Intelligence, pp. 37–48. Springer, Berlin (2016)
Duda, P., Jaworski, M., Rutkowski, L.: Knowledge discovery in data streams with the orthogonal series-based generalized regression neural networks. Inf. Sci. 460–461, 497–518 (2018)
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Rutkowski, L., Jaworski, M., Duda, P. (2020). General Non-parametric Learning Procedure for Tracking Concept Drift. In: Stream Data Mining: Algorithms and Their Probabilistic Properties. Studies in Big Data, vol 56. Springer, Cham. https://doi.org/10.1007/978-3-030-13962-9_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-13961-2
Online ISBN: 978-3-030-13962-9