Abstract
Big data is one of the foremost technical challenges confronting researchers and companies today. The main difficulty lies in the fact that big data sources are usually produced as continuous data streams. Consequently, much previous research has presented incremental data mining approaches that adapt traditional machine learning algorithms to the challenges of data streams. The Artificial Neural Network (ANN) is a common technique in this field, and the key problem is how to optimize the neural network parameters to cope with huge volumes of data arriving over time. These parameters, which are vital to a neural network's performance, are called hyperparameters. Most earlier optimization approaches have dealt with static big data repositories rather than big data streams, or have handled data streams only at considerable time cost. This paper proposes an incremental learning process for ANN hyperparameter optimization over data streams by utilizing the Grasshopper Optimization Algorithm (GOA) as a swarm intelligence technique. GOA balances exploration and exploitation to find the best set of ANN hyperparameters for the data stream. The experimental results illustrate that the proposed optimization model yields better accuracy with reasonable CPU time.
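The exploration/exploitation balance the abstract attributes to GOA comes from its position-update rule: each grasshopper moves under a social attraction/repulsion force toward or away from its peers, scaled by a coefficient c that shrinks over iterations, plus a pull toward the best solution found so far. The sketch below illustrates this update applied to a toy hyperparameter search. It is not the authors' implementation; the two-dimensional surrogate objective (standing in for validation error over, e.g., learning rate and hidden-layer size) and all function names are illustrative assumptions.

```python
import math
import random

def s_func(r, f=0.5, l=1.5):
    # Social force function from the GOA formulation: positive values
    # attract, negative values repel, depending on the distance r.
    return f * math.exp(-r / l) - math.exp(-r)

def goa_minimize(objective, lb, ub, n_agents=20, n_iter=50, seed=0):
    """Minimize `objective` over the box [lb, ub] with a basic
    Grasshopper Optimization Algorithm. Returns (best_position, best_value)."""
    rng = random.Random(seed)
    dim = len(lb)
    agents = [[rng.uniform(lb[d], ub[d]) for d in range(dim)]
              for _ in range(n_agents)]
    best = min(agents, key=objective)[:]
    best_val = objective(best)
    c_max, c_min = 1.0, 1e-4
    for t in range(n_iter):
        # c decreases linearly: wide exploration early, fine exploitation late.
        c = c_max - t * (c_max - c_min) / n_iter
        new_agents = []
        for i, xi in enumerate(agents):
            step = [0.0] * dim
            for j, xj in enumerate(agents):
                if i == j:
                    continue
                dist = math.dist(xi, xj)
                if dist < 1e-12:
                    continue
                for d in range(dim):
                    # Social interaction, scaled into the search range.
                    step[d] += (c * (ub[d] - lb[d]) / 2.0
                                * s_func(dist) * (xj[d] - xi[d]) / dist)
            # Move around the best-so-far target, clamped to the bounds.
            new_agents.append([min(ub[d], max(lb[d], c * step[d] + best[d]))
                               for d in range(dim)])
        agents = new_agents
        for x in agents:
            v = objective(x)
            if v < best_val:
                best, best_val = x[:], v
    return best, best_val

# Toy surrogate for "validation error as a function of two hyperparameters",
# minimized at learning rate 0.1 and hidden size 32.
err = lambda h: (h[0] - 0.1) ** 2 + ((h[1] - 32.0) / 32.0) ** 2
best_h, best_e = goa_minimize(err, lb=[0.001, 4.0], ub=[1.0, 128.0])
```

In an incremental setting, one would rerun (or warm-start) such a search on each arriving data chunk, re-evaluating the objective against the newest stream window rather than a fixed training set.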
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Darwish, S.M., Saber, A.I. (2020). Self-adaptive Parameters Optimization for Incremental Classification in Big Data Using Swarm Intelligence. In: Hassanien, AE., Azar, A., Gaber, T., Oliva, D., Tolba, F. (eds) Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020). AICV 2020. Advances in Intelligent Systems and Computing, vol 1153. Springer, Cham. https://doi.org/10.1007/978-3-030-44289-7_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-44288-0
Online ISBN: 978-3-030-44289-7
eBook Packages: Intelligent Technologies and Robotics