Abstract
According to the classical Karush-Kuhn-Tucker (KKT) conditions, at every step of incremental support vector machine (SVM) learning a newly added sample that violates the KKT conditions becomes a new support vector (SV) and may cause old samples to migrate between the SV set and the non-support-vector (NSV) set, after which the learning model must be updated on the basis of the SVs. However, it is not clear in advance which of the old samples will migrate between the SV and NSV sets. Moreover, updating the model at every step is often unnecessary: it does little to improve accuracy while substantially slowing training. Consequently, how candidate SVs are chosen from the old sets during the incremental stages, and when the incremental steps are performed, strongly influence both the accuracy and the efficiency of incremental SVM learning. In this work, a new algorithm is proposed that selects candidate SVs and, at the same time, uses wrongly predicted samples to trigger the incremental processing. Experimental results show that the proposed algorithm achieves good performance, combining high efficiency and speed with good accuracy.
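The trigger idea described above can be illustrated with a minimal sketch: retrain only when a new sample is wrongly predicted by the current model, and keep only the current SVs (plus the triggering sample) as the retained training pool. This is an assumed simplification for illustration, not the paper's exact algorithm; the function name `incremental_svm` and the use of scikit-learn's `SVC` are the author of this sketch's choices.

```python
# Hedged sketch of a wrongly-predicted-sample trigger for incremental
# SVM learning. Assumptions: linear kernel, scikit-learn SVC, and a
# pool reduced to the current support vectors after each update.
import numpy as np
from sklearn.svm import SVC

def incremental_svm(X_init, y_init, stream):
    """Train on X_init/y_init, then process (x, y) pairs from stream."""
    model = SVC(kernel="linear").fit(X_init, y_init)
    # Retained pool: only the current support vectors.
    X_pool = X_init[model.support_]
    y_pool = y_init[model.support_]
    updates = 0
    for x, y in stream:
        x = np.asarray(x).reshape(1, -1)
        if model.predict(x)[0] != y:
            # A wrongly predicted sample triggers one incremental step:
            # add it to the pool, retrain, and shrink the pool to the
            # new support vectors.
            X_pool = np.vstack([X_pool, x])
            y_pool = np.append(y_pool, y)
            model = SVC(kernel="linear").fit(X_pool, y_pool)
            X_pool = X_pool[model.support_]
            y_pool = y_pool[model.support_]
            updates += 1
        # Correctly predicted samples are skipped, avoiding the
        # unnecessary model updates the abstract warns about.
    return model, updates
```

Because correctly classified samples never trigger retraining, the number of full optimizations is bounded by the number of prediction errors rather than by the stream length, which is the source of the speedup claimed in the abstract.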
Additional information
This work has been supported by the National Natural Science Foundation of China (Nos.U1509207 and 61325019).
Cite this article
Tang, Tl., Guan, Q. & Wu, Yr. Support vector machine incremental learning triggered by wrongly predicted samples. Optoelectron. Lett. 14, 232–235 (2018). https://doi.org/10.1007/s11801-018-7254-3