
Self-adaptive Parameters Optimization for Incremental Classification in Big Data Using Swarm Intelligence

  • Conference paper
  • First Online:
Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020) (AICV 2020)

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 1153))

Abstract

Nowadays, big data poses one of the main technical challenges confronting researchers and companies. The core difficulty lies in the fact that big data sources are usually produced as continuous data streams. Consequently, many previous studies have presented incremental data mining approaches that adapt traditional machine learning algorithms to the challenges of data streams. The Artificial Neural Network (ANN) is a common technique in this field, and the main challenge is how to optimize the neural network parameters to cope with huge volumes of data arriving over time. These parameters, which are vital for the performance of a neural network, are called hyperparameters. Most earlier optimization approaches have dealt with static big data containers rather than big data streams, or have handled big data streams at a high cost in processing time. This paper proposes an incremental learning process for ANN hyperparameter optimization over a data stream by utilizing the Grasshopper Optimization Algorithm (GOA), a swarm intelligence technique. GOA is used to strike a balance between exploration and exploitation in order to find the best set of ANN hyperparameters for the data stream. The experimental results illustrate that the proposed optimization model yields better accuracy with reasonable CPU time.
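As a rough illustration of the kind of search the abstract describes, the Python sketch below tunes two MLP hyperparameters (hidden-layer size and learning rate) with a simplified grasshopper-style update, evaluated chunk by chunk on a data stream. The chunking scheme, the particular hyperparameters, the fitness measure, and the use of scikit-learn's MLPClassifier are assumptions made for this example; they are not taken from the paper.

# Illustrative sketch only (not the authors' code): a simplified grasshopper-style
# search for two MLP hyperparameters -- hidden-layer size and learning rate --
# evaluated chunk by chunk on a data stream.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

F_SOC, L_SOC = 0.5, 1.5                          # social-force constants from the GOA paper (Saremi et al.)

def s(r):
    # Attraction/repulsion between two grasshoppers at distance r.
    return F_SOC * np.exp(-r / L_SOC) - np.exp(-r)

def fitness(pos, X_tr, y_tr, X_val, y_val):
    # pos[0]: number of hidden units, pos[1]: log10 of the learning rate (both assumed).
    hidden, lr = int(round(pos[0])), 10.0 ** pos[1]
    clf = MLPClassifier(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                        max_iter=50).fit(X_tr, y_tr)
    return accuracy_score(y_val, clf.predict(X_val))

def goa_search(X_tr, y_tr, X_val, y_val, n_agents=8, iters=15, seed=0):
    rng = np.random.default_rng(seed)
    lb = np.array([5.0, -4.0])                   # lower bounds: 5 hidden units, lr = 1e-4
    ub = np.array([100.0, -1.0])                 # upper bounds: 100 hidden units, lr = 1e-1
    pos = rng.uniform(lb, ub, size=(n_agents, 2))
    fit = np.array([fitness(p, X_tr, y_tr, X_val, y_val) for p in pos])
    best, best_fit = pos[fit.argmax()].copy(), fit.max()
    c_max, c_min = 1.0, 1e-4
    for t in range(iters):
        c = c_max - t * (c_max - c_min) / iters  # shrinking coefficient: exploration -> exploitation
        new_pos = np.empty_like(pos)
        for i in range(n_agents):
            social = np.zeros(2)
            for j in range(n_agents):
                if j == i:
                    continue
                d = np.linalg.norm(pos[j] - pos[i]) + 1e-12
                social += c * (ub - lb) / 2.0 * s(d) * (pos[j] - pos[i]) / d
            new_pos[i] = np.clip(c * social + best, lb, ub)   # move relative to the best agent found
        pos = new_pos
        fit = np.array([fitness(p, X_tr, y_tr, X_val, y_val) for p in pos])
        if fit.max() > best_fit:
            best_fit, best = fit.max(), pos[fit.argmax()].copy()
    return best, best_fit

def stream_loop(chunks):
    # Per-chunk tuning: split each arriving chunk into a training and a validation part,
    # re-run the search, and report the hyperparameters chosen for that part of the stream.
    for X_chunk, y_chunk in chunks:
        half = len(X_chunk) // 2
        best, acc = goa_search(X_chunk[:half], y_chunk[:half],
                               X_chunk[half:], y_chunk[half:])
        print(f"hidden={int(round(best[0]))}, lr={10.0 ** best[1]:.4f}, val_acc={acc:.3f}")

The paper's actual fitness function, stream handling, and parameter ranges differ and should be taken from the original text; this sketch only shows the general shape of a GOA-driven hyperparameter search.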


Notes

  1. http://sourceforge.net/projects/moaatastream/files/Datasets/Classification/elecNormNew.arff.zip/download.
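The linked archive holds the electricity demand benchmark (elecNormNew) in ARFF format, as distributed for the MOA framework. As a hedged illustration only, one way to read the extracted file in Python is shown below; the local file name and the name of the class attribute are assumptions.

# Illustrative only: load the extracted elecNormNew.arff with SciPy and pandas.
from scipy.io import arff
import pandas as pd

data, meta = arff.loadarff("elecNormNew.arff")   # path after unzipping (assumed)
df = pd.DataFrame(data)
# ARFF nominal attributes are read as bytes; decode the label column ("class" assumed).
df["class"] = df["class"].str.decode("utf-8")
print(df.shape, meta.names())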


Author information

Corresponding author

Correspondence to Akmal I. Saber.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Darwish, S.M., Saber, A.I. (2020). Self-adaptive Parameters Optimization for Incremental Classification in Big Data Using Swarm Intelligence. In: Hassanien, AE., Azar, A., Gaber, T., Oliva, D., Tolba, F. (eds) Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020). AICV 2020. Advances in Intelligent Systems and Computing, vol 1153. Springer, Cham. https://doi.org/10.1007/978-3-030-44289-7_20

