Abstract
Optimization problems are among the greatest challenges in solving real-world problems. Because of their importance across many fields, more and more intelligent search algorithms have been proposed. Each algorithm has both advantages and disadvantages, which makes it difficult to choose a suitable algorithm for a given real problem. This article compares four robust algorithms: two classical algorithms, particle swarm optimization (PSO) and artificial bee colony (ABC), and two recently presented algorithms, cuckoo search (CS) and gray wolf optimizer (GWO). To diversify the comparison, ten benchmark test functions were chosen to compare properties such as the convergence rate and accuracy of the algorithms. Moreover, a high-dimensional search space (n = 30) is examined to evaluate the effectiveness of the algorithms on variable-dimension problems. Based on the results obtained, an algorithm that achieves a balance between convergence rate and accuracy is recommended.
1 Introduction
In the last two decades, optimization algorithms have become more and more attractive. New algorithms have been presented, such as particle swarm optimization (PSO) [1], artificial bee colony (ABC) [2], the firefly algorithm (FA) [3], the genetic algorithm (GA) [4], differential evolution (DE) [5], evolution strategy (ES) [6], ant colony optimization (ACO) [7], cuckoo search (CS) [8], and the gray wolf optimizer (GWO) [9]. A common feature of these algorithms is that they are all inspired by how animals search for food in nature, a process they attempt to simulate with mathematical formulas. These algorithms share attractive features such as simple implementation, fast convergence, and accuracy with acceptable error, and they appear in almost every field, including economic systems and engineering problems. This motivates the search for an algorithm that can solve such problems with suitably high reliability. To that end, this paper compares the reliability of four algorithms: PSO, ABC, CS, and GWO. PSO and ABC are classical algorithms whose strengths are widely acknowledged, while CS and GWO are recently proposed algorithms whose main characteristic is improved accuracy. To evaluate the advantages and disadvantages of these algorithms comprehensively, the paper investigates a search space with 30 dimensions and compares the performance of all four algorithms in two respects: convergence rate and accuracy.
2 Swarm Intelligence Algorithms
Particle Swarm Optimization (PSO)
Particle swarm optimization (PSO) was proposed by Eberhart and Kennedy [1]. The algorithm balances the global best (Gbest) and the local best (Lbest,i) during the velocity update. Thus, each candidate solution \(X_{i} (i = 1,2, \ldots ,n)\) moves at step (t + 1) to a new position that is closer to the local best or global best position. During their movement, candidate solutions also explore new regions of the search space through two learning factors, c1 and c2. The velocity and position updates of the PSO algorithm are described in Eqs. 1 and 2.

$$V_{i}^{t + 1} = wV_{i}^{t} + c_{1} r_{1} \left( {Lbest_{i} - X_{i}^{t} } \right) + c_{2} r_{2} \left( {Gbest - X_{i}^{t} } \right)$$(1)

$$X_{i}^{t + 1} = X_{i}^{t} + V_{i}^{t + 1}$$(2)

where r1 and r2 are random values in the range [0, 1] and w denotes an inertia weight that controls the velocity.
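The velocity and position updates of Eqs. 1 and 2 can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the bounds, inertia weight, and learning-factor values are illustrative assumptions:

```python
import numpy as np

def pso(f, dim=30, n_particles=30, iters=1000, w=0.7, c1=1.5, c2=1.5,
        lo=-5.12, hi=5.12, seed=0):
    """Minimal PSO sketch: each velocity blends inertia, a pull toward the
    particle's local best (Lbest) and a pull toward the global best (Gbest)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    lbest = x.copy()
    lbest_f = np.apply_along_axis(f, 1, x)
    gbest = lbest[lbest_f.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (lbest - x) + c2 * r2 * (gbest - x)  # Eq. 1
        x = x + v                                                  # Eq. 2
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < lbest_f
        lbest[improved], lbest_f[improved] = x[improved], fx[improved]
        gbest = lbest[lbest_f.argmin()].copy()
    return gbest, float(lbest_f.min())
```

For example, on a low-dimensional sphere function the returned fitness drops toward zero within a few hundred iterations.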
Artificial Bee Colony (ABC)
The artificial bee colony (ABC) algorithm [2] is a swarm-based meta-heuristic introduced by Karaboga in 2005. Its search process has three major steps:
- Send the employed bees to a food source and estimate its nectar quality following Eq. 3;
$$v_{i,j} = x_{i,j} + \phi_{i,j} \left( {x_{i,j} - x_{k,j} } \right)$$(3)where \(k \in \left\{ {1,2, \ldots ,{\text{SN}}} \right\}\) is a randomly chosen index that must differ from i, and \(\phi_{i,j}\) is a random value in the range [−1, 1].
- Onlooker bees select food sources based on the information collected from the employed bees and estimate their nectar quality as expressed in Eq. 4;
$$p_{i} = \frac{{{\text{fit}}_{i} }}{{\sum\nolimits_{j = 1}^{\text{SN}} {{\text{fit}}_{j} } }}$$(4)where \({\text{fit}}_{i}\) is the fitness value of the ith solution.
- Determine the scout bees and send them to search for possible new food sources.
The general structure of the algorithm is introduced as follows.
A food source whose nectar has been exhausted is abandoned and replaced with a new food source discovered by a scout bee (Eq. 7) when its position cannot be improved further. The control parameter limit determines abandonment: a food source that is not improved within the predetermined number of cycles is discarded.
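The three steps above can be sketched in Python. This is a minimal illustration under assumed parameter values, not the authors' implementation; it follows Eq. 3 for the neighbour search, Eq. 4 for the onlooker selection probabilities, and the limit rule for scouts:

```python
import numpy as np

def abc(f, dim=30, sn=30, iters=1000, limit=100, lo=-5.12, hi=5.12, seed=0):
    """Minimal ABC sketch: employed-bee neighbour search (Eq. 3), onlooker
    selection proportional to fitness (Eq. 4), and scout re-initialisation
    once a source exceeds `limit` failed improvement trials."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (sn, dim))
    fx = np.apply_along_axis(f, 1, x)
    trials = np.zeros(sn, dtype=int)

    def try_neighbour(i):
        # Perturb one dimension toward/away from a random partner k != i.
        k = rng.choice([j for j in range(sn) if j != i])
        j = rng.integers(dim)
        v = x[i].copy()
        v[j] = x[i, j] + rng.uniform(-1.0, 1.0) * (x[i, j] - x[k, j])  # Eq. 3
        fv = f(v)
        if fv < fx[i]:
            x[i], fx[i] = v, fv
            trials[i] = 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(sn):                 # employed bees
            try_neighbour(i)
        fit = 1.0 / (1.0 + fx)              # fitness for a non-negative objective
        p = fit / fit.sum()                 # Eq. 4
        for i in rng.choice(sn, size=sn, p=p):  # onlooker bees
            try_neighbour(i)
        worst = trials.argmax()             # scout bee replaces a stale source
        if trials[worst] > limit:
            x[worst] = rng.uniform(lo, hi, dim)
            fx[worst] = f(x[worst])
            trials[worst] = 0
    return x[fx.argmin()], float(fx.min())
```

The fitness transform `1/(1 + fx)` assumes a non-negative minimisation objective, as in the benchmarks considered here.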
Cuckoo search Algorithm (CS)
The cuckoo search algorithm (CS), introduced by Yang and Deb [8], is a nature-inspired algorithm based on the breeding behavior of cuckoo birds in nature. The algorithm relies on Lévy flights with step length s to orient the new search direction. An outstanding advantage of Lévy flights is that they produce a suitable heavy-tailed distribution of steps, whose values can be positive or negative. Thus, the position at step (t + 1) follows Eq. 5:

$$X_{i}^{t + 1} = X_{i}^{t} + \alpha \oplus L\left( {s,\lambda } \right)$$(5)
where

- \(\alpha\): scaling factor;
- \(X_{i}^{t+1}\) and \(X_{i}^{t}\): the new and current positions of the cuckoo;
- \(L\left( {s,\lambda } \right)\): the Lévy distribution, used to define the step size of the random walk.
Gray Wolf Optimizer (GWO)
The gray wolf optimizer (GWO) was proposed by Mirjalili et al. [9]. It is based on the hunting behavior of gray wolves, with a division of tasks among the wolves in the pack. The leader, called alpha (\(\alpha\)), is regarded as the best solution; the second- and third-best solutions are named beta (\(\beta\)) and delta (\(\delta\)). The process of hunting and moving toward the prey can be expressed in mathematical form as:

$$\vec{X}\left( {t + 1} \right) = \vec{X}_{p} \left( t \right) - \vec{A} \cdot \vec{D},\quad \vec{D} = \left| {\vec{C} \cdot \vec{X}_{p} \left( t \right) - \vec{X}\left( t \right)} \right|$$(8)

$$\vec{A} = 2\vec{a} \cdot \vec{r}_{1} - \vec{a} ,\quad \vec{C} = 2\vec{r}_{2}$$(9)

where \(\vec{X}_{p}\) is the position of the prey and \(\vec{X}\) is the position of a wolf. In Eq. 9, the vector \(\vec{a}\) decreases linearly from 2 to 0 over the iterations, and \(\vec{r}_{1}\) and \(\vec{r}_{2}\) are random vectors with components in the range [0, 1].
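The hunting mechanism can be sketched in Python: each wolf moves to the average of three positions dictated by the alpha, beta, and delta leaders, with the coefficient a shrinking linearly from 2 to 0 as referenced in Eq. 9. Parameter values and bounds are illustrative assumptions:

```python
import numpy as np

def gwo(f, dim=30, n=30, iters=1000, lo=-5.12, hi=5.12, seed=0):
    """Minimal GWO sketch: wolves are guided by the three best solutions
    (alpha, beta, delta); exploration fades as a decreases from 2 to 0."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    for t in range(iters):
        fx = np.apply_along_axis(f, 1, x)
        order = fx.argsort()
        alpha, beta, delta = x[order[0]], x[order[1]], x[order[2]]
        a = 2.0 * (1.0 - t / iters)        # linear decrease from 2 to 0
        new_x = np.empty_like(x)
        for i in range(n):
            pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a       # Eq. 9
                C = 2.0 * r2
                D = np.abs(C * leader - x[i])
                pos += leader - A * D
            new_x[i] = np.clip(pos / 3.0, lo, hi)
        x = new_x
    fx = np.apply_along_axis(f, 1, x)
    return x[fx.argmin()], float(fx.min())
```

Because |A| can exceed 1 early on (exploration) but shrinks below 1 later (exploitation), the pack first scatters around the leaders and then converges toward them.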
3 Numerical Examples
To compare the efficiency of the PSO, ABC, CS, and GWO algorithms, ten benchmark test functions, given in Table 1, were selected. All functions use the same search space with n = 30 dimensions, and the results are presented in two respects: the convergence rate and the accuracy for each function.

For a fair comparison, all algorithms use the same number of initial solution candidates, N = 30, and a total of 1000 iterations. The convergence trends for all functions are shown in Fig. 1. The results of 30 independent random runs of each algorithm in the 30-dimensional case are given in Table 2, where the best fitness, mean, and standard deviation over the 30 runs are reported.
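The experimental protocol above can be sketched as a small harness that runs an optimizer repeatedly and collects the statistics reported in Table 2. The sphere and Rastrigin functions below are stand-ins for the paper's benchmarks, since Table 1 itself is not reproduced here:

```python
import numpy as np

def sphere(z):     # unimodal benchmark; global minimum 0 at the origin
    return float(np.sum(z ** 2))

def rastrigin(z):  # multimodal benchmark; global minimum 0 at the origin
    return float(np.sum(z ** 2 - 10.0 * np.cos(2.0 * np.pi * z) + 10.0))

def benchmark(algorithms, f, runs=30, **kw):
    """Run each optimiser `runs` times with different seeds and report the
    best, mean, and standard deviation of the final fitness values."""
    stats = {}
    for name, algo in algorithms.items():
        finals = [algo(f, seed=s, **kw)[1] for s in range(runs)]
        stats[name] = (min(finals), float(np.mean(finals)), float(np.std(finals)))
    return stats
```

Each optimizer is assumed to accept the objective and a `seed` keyword and to return a `(position, fitness)` pair, matching the sketches in Sect. 2.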
Based on the convergence trends shown in Fig. 1, the CS and GWO algorithms exhibit the better convergence rates on functions F1–F5 and F9–F10. In particular, on function F9, the CS and GWO algorithms reach their best values after around 200 and 400 iterations, respectively. Meanwhile, the ABC algorithm shows a good convergence rate on function F8, and PSO records the worst convergence rate among the four algorithms.

Based on the statistics given in Table 2, the CS and GWO algorithms deliver the best accuracy on almost all of the investigated functions. On functions F1–F5 in particular, the CS algorithm shows clear superiority, with a best fitness value far ahead of the other algorithms. Meanwhile, PSO and ABC record worse accuracy than CS and GWO, although on function F6 the ABC algorithm achieves the most accurate result.
4 Conclusion
In this paper, the PSO, ABC, CS, and GWO algorithms were compared on optimization problems. Ten benchmark functions were used as numerical examples to investigate the convergence rate and accuracy of the algorithms. The results suggest that the CS algorithm is a stable and highly reliable algorithm for solving real-world problems, while the PSO and ABC algorithms show instability on some particular problems.
References
Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of ICNN'95 International Conference on Neural Networks. IEEE (1995)
Karaboga, D., Basturk, B.: On the performance of artificial bee colony (ABC) algorithm. Appl. Soft Comput. 8(1), 687–697 (2008)
Yang, X.-S.: Firefly algorithm, stochastic test functions and design optimisation. Int. J. Bio-inspired Comput. 2(2), 78–84 (2010)
Mühlenbein, H.: Genetic algorithms (1997)
Storn, R., Price, K.: Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 11(4), 341–359 (1997)
Rechenberg, I.: Evolution strategy: nature's way of optimization. In: Optimization: Methods and Applications, Possibilities and Limitations, pp. 106–126. Springer (1989)
Dorigo, M., Gambardella, L.M.: Ant colony system: a cooperative learning approach to the traveling salesman problem. IEEE Trans. Evol. Comput. 1(1), 53–66 (1997)
Yang, X.-S., Deb, S.: Cuckoo search via Lévy flights. In: 2009 World Congress on Nature and Biologically Inspired Computing (NaBIC). IEEE (2009)
Mirjalili, S., Mirjalili, S.M., Lewis, A.: Grey wolf optimizer. Adv. Eng. Softw. 69, 46–61 (2014)
Ā© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Minh, HL., Luong, V.H., Thanh, CL. (2021). Comparison of Swarm Intelligence Algorithms for Optimization Problems. In: Bui, T.Q., Cuong, L.T., Khatir, S. (eds) Structural Health Monitoring and Engineering Structures. Lecture Notes in Civil Engineering, vol 148. Springer, Singapore. https://doi.org/10.1007/978-981-16-0945-9_11