Abstract
In this paper, a hybrid bio-inspired metaheuristic optimization approach, the emperor penguin and salp swarm algorithm (ESA), is proposed. The algorithm combines the huddling behavior of the emperor penguin optimizer with the swarming behavior of the salp swarm algorithm. The efficiency of the proposed ESA is evaluated using scalability, convergence, sensitivity, and ANOVA test analyses on 53 benchmark test functions, including classical and IEEE CEC-2017 functions. The effectiveness of ESA is compared with well-known metaheuristics in terms of the optimal solutions obtained. The proposed ESA is also applied to six constrained and one unconstrained engineering problems to evaluate its robustness. The results reveal that ESA offers better solutions than the other competitor algorithms.
1 Introduction
During the last few decades, various algorithms have been proposed to solve a variety of engineering optimization problems [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21]. These optimization problems are complex in nature because they have more than one local optimum. They can be categorized as constrained or unconstrained, discrete or continuous, static or dynamic, and single- or multi-objective.
To solve these problems more efficiently and accurately [22,23,24,25,26], researchers have increasingly relied on metaheuristic algorithms [27,28,29]. Metaheuristics have become popular in various fields because they do not require gradient information and can bypass the local optima problem.
Metaheuristics are classified into two main categories: single-solution and multiple-solution. In single-solution-based algorithms, the searching process starts with one candidate solution, whereas in multiple-solution-based algorithms, the optimization is performed using a set of solutions (i.e., a population). Multiple-solution (population-based) metaheuristics have the following advantages over single-solution-based metaheuristics:
-
The searching process starts with a randomly generated population, i.e., a set of multiple solutions.
-
The multiple solutions can share information with each other about the search space, which helps to avoid local optima.
-
The exploration capability of multiple-solution (population-based) metaheuristics is better than that of single-solution-based metaheuristics.
The key phases of metaheuristic algorithms are exploration and exploitation. The exploration phase ensures that the algorithm investigates the different promising regions in a given search space, whereas exploitation ensures the search for optimal solutions around the promising regions. However, it is difficult to balance these phases due to the stochastic nature of metaheuristics. Therefore, fine-tuning of these two phases is required to achieve near-optimal solutions.
In recent years, a large number of metaheuristic algorithms have been developed. However, no single algorithm exists that can solve all types of optimization problems; some algorithms provide better results than others on particular problems. Therefore, developing a new metaheuristic algorithm remains an open problem. This fact motivates us to develop a novel metaheuristic algorithm for solving optimization problems.
This paper presents a hybrid bio-inspired metaheuristic algorithm named the emperor penguin and salp swarm algorithm (ESA). It is inspired by the huddling behavior of the emperor penguin optimizer (EPO) [30] and the swarming behavior of the salp swarm algorithm (SSA) [31]. The main contributions of this work are as follows:
-
A hybrid bio-inspired swarm algorithm (ESA) is proposed.
-
The proposed ESA is implemented and tested on 53 benchmark test functions (i.e., classical and CEC-2017).
-
The performance of ESA is compared with well-known metaheuristics using sensitivity analysis, convergence analysis, and ANOVA test analysis.
-
The robustness of the proposed ESA and other metaheuristics is examined by solving engineering problems.
The rest of this paper is structured as follows: Sect. 2 presents the background and related works of optimization problems. The proposed ESA algorithm is discussed in Sect. 3. The experimental results and discussion are presented in Sect. 4. Section 5 focuses on the applications of ESA to engineering problems. Finally, the conclusion and some future research directions are given in Sect. 6.
2 Background and related works
This section firstly describes the recently developed EPO and SSA algorithms followed by related works in the field of optimization.
2.1 Emperor penguin optimizer (EPO)
Emperor penguins are social animals that perform various activities for living, such as hunting and foraging in groups. They huddle during extreme Antarctic winters to survive, and each penguin contributes equally while huddling, depicting the sense of collectiveness and unity in their social behavior [32]. The huddling behavior can be summarized as follows [30]:
-
Create and discover huddling boundary.
-
Compute the temperature around the huddle.
-
Calculate the distance between each penguin.
-
Effective mover is relocated.
2.1.1 Mathematical modeling
The main objective of modeling is to identify the effective mover. An L-shaped polygon plane is considered as the shape of the huddle. After the effective mover is identified, the boundary of the huddle is recomputed.
2.1.1.1 Generate and determine the huddle boundary
To model the huddling behavior of emperor penguins, the first thing to consider is their polygon-shaped grid boundary. Every penguin is surrounded by at least two penguins while huddling. The huddling boundary is decided by the direction and speed of the wind flow, which is generally faster than the penguins' movement. Mathematically, let \(\eta\) represent the wind velocity and \(\chi\) the gradient of \(\eta\):
Vector \(\alpha\) is integrated with \(\eta\) to obtain complex potential:
where i represents the imaginary constant and G defines the polygon plane function.
2.1.1.2 Temperature profile around the huddle
Emperor penguins huddle to conserve their energy and maximize the huddle temperature. The base temperature is \(T=0\) if \({X}> 0.5\) and \(T=1\) if \({X}< 0.5\), where X is the polygon radius. This temperature measure helps emperor penguins to perform the exploration and exploitation tasks. The temperature profile is computed as:
where y represents the current iteration, \(\text {Max}_{\text {itr}}\) represents the maximum number of iterations, X is the radius, and T is the time required to identify the best optimal solution.
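Since the temperature equation is not reproduced above, a minimal sketch is given below, assuming the definition from the original EPO paper, \(T' = T - \text{Max}_{\text{itr}}/(y - \text{Max}_{\text{itr}})\); this is an illustration, not the authors' exact code.

```python
def temperature_profile(y, max_itr, radius):
    # Base temperature depends on the polygon radius X (per the EPO paper):
    # T = 0 when X > 0.5, T = 1 when X < 0.5.
    T = 0.0 if radius > 0.5 else 1.0
    # Assumed standard EPO profile: T' = T - Max_itr / (y - Max_itr),
    # where y is the current iteration.
    return T - max_itr / (y - max_itr)
```

As y approaches \(\text{Max}_{\text{itr}}\), the second term grows, sweeping the profile from exploration toward exploitation.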
2.1.1.3 Distance between emperor penguins
After the huddling boundary is computed, the distance between the emperor penguins is calculated. The current optimal solution is the solution with a better fitness value than the previous optimum. The search agents update their positions with respect to the current optimal solution. The position update can be mathematically represented as:
where \(\vec {M_{\text {ep}}}\) denotes the distance between the emperor penguin and the fittest search agent (i.e., the one with the lowest fitness value), and x represents the ongoing iteration. \(\vec {X}\) and \(\vec {A}\) help to avoid collisions among penguins. \(\vec {Q}\) represents the best optimal solution (i.e., the fittest emperor penguin), and \(\vec {Q_{\text {ep}}}\) represents the position vector of an emperor penguin. N() denotes the social forces that help to identify the best optimal solution. The vectors \(\vec {X}\) and \(\vec {A}\) are calculated as follows:
where M is the movement parameter that maintains a gap between search agents for collision avoidance. The value of parameter M is set to 2. \(T'\) is the temperature profile around the huddle, \(P_{\text {grid}}(\text {Accuracy})\) defines the polygon grid accuracy by comparing the difference between emperor penguins, and Rand() is a random function whose value lies in the range [0, 1].
The function N() is calculated as follows:
where e denotes the exponential function, and f and l are control parameters for better exploration and exploitation. The values of f and l lie in the ranges [2, 3] and [1.5, 2], respectively; the EPO algorithm has been observed to provide better results within these ranges.
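As the social-force equation is not reproduced here, the sketch below assumes the form from the original EPO paper, \(N(x) = \big(\sqrt{f\,e^{-x/l} - e^{-x}}\big)^2\), with default f and l chosen inside the recommended ranges.

```python
import math

def social_force(x, f=2.5, l=1.8):
    # Assumed EPO social force: N(x) = (sqrt(f * e^(-x/l) - e^(-x)))^2,
    # with f in [2, 3] and l in [1.5, 2] controlling how quickly the
    # attraction toward the mover decays over iterations x.
    return math.sqrt(f * math.exp(-x / l) - math.exp(-x)) ** 2
```

For f >= 1 and l >= 1 the expression under the square root is non-negative for all x >= 0, so the function is well defined over the recommended parameter ranges.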
2.1.1.4 Relocate the mover
The best obtained optimal solution (mover) is used to update the positions of the emperor penguins. The selected mover leads the movement of the other search agents in the search space. To find the next position of an emperor penguin, the following equations are used:
where \(\vec {Q_{\text {ep}}}(x+1)\) denotes the updated position of emperor penguin.
2.2 Salp swarm algorithm (SSA)
The salp swarm algorithm is a bio-inspired metaheuristic optimization algorithm developed by Mirjalili et al. [31]. It is based on the swarming behavior of salps when navigating and foraging in the deep sea. This swarming behavior is mathematically modeled as a salp chain, which is divided into two groups: the leader and the followers. The leader leads the whole chain from the front while the followers follow each other. The updated position of the leader in an n-dimensional search environment is described as follows:
where \(x_i^1\) represents the position of the first salp (the leader) in the \(i{\text {th}}\) dimension, \(F_i\) is the position of the food source, and \(ub_i\) and \(lb_i\) are the upper and lower bounds of the \(i{\text {th}}\) dimension, respectively. \(c_1, c_2,\) and \(c_3\) are random numbers.
The coefficient \(c_1\) is responsible for better exploration and exploitation which is defined as follows:
where l represents the current iteration and L is the maximum number of iterations, whereas the parameters \(c_2\) and \(c_3\) are random numbers in the range [0, 1].
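Since the leader equation is not reproduced above, the sketch below assumes the forms from the SSA paper, \(c_1 = 2e^{-(4l/L)^2}\) and \(x_i^1 = F_i \pm c_1\big((ub_i - lb_i)c_2 + lb_i\big)\); the 0.5 threshold on \(c_3\) follows common implementations.

```python
import math
import random

def coeff_c1(l, L):
    # SSA exploration/exploitation coefficient: c1 = 2 * e^(-(4l/L)^2),
    # decaying from ~2 toward 0 over iterations l = 1..L.
    return 2.0 * math.exp(-((4.0 * l / L) ** 2))

def leader_position(food, lb, ub, l, L):
    # Leader update (assumed SSA form): x1 = F +/- c1 * ((ub - lb) * c2 + lb),
    # with the sign chosen by c3; c2, c3 are uniform random in [0, 1].
    c2, c3 = random.random(), random.random()
    step = coeff_c1(l, L) * ((ub - lb) * c2 + lb)
    return food + step if c3 >= 0.5 else food - step
```

Because \(c_1\) shrinks with the iteration count, the leader explores widely at first and then exploits the neighborhood of the food source.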
The positions of the followers are updated as follows:
where \(x_i^j\) shows the position of the \(j{\text {th}}\) follower in the \(i{\text {th}}\) dimension, T represents the time, and \(V_0\) represents the initial speed. The parameter A is calculated as follows:
Considering \(V_0=0\), the following equation can be expressed as:
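The simplified equation is not reproduced here; the sketch below assumes the well-known SSA result that, with \(V_0 = 0\), a follower moves to the midpoint between itself and its predecessor in the chain.

```python
def follower_position(current, previous):
    # With V0 = 0, the SSA follower update reduces to the midpoint
    # between a salp and its predecessor: x_j = (x_j + x_{j-1}) / 2.
    return 0.5 * (current + previous)
```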
The SSA algorithm is able to solve high-dimensional problems with low computational effort.
2.3 Related works
Multiple-solution-based metaheuristic algorithms are further classified into three categories: evolutionary-based, physics-based, and swarm-based algorithms (see Fig. 1). The first category comprises generic population-based metaheuristics inspired by biological evolution, i.e., mutation, recombination, and selection; these make no assumptions about the fitness landscape. The most popular evolutionary algorithm is the genetic algorithm (GA) [33]. The evolution starts with randomly generated individuals from the given population, and the fitness of each individual is computed in each generation. Crossover and mutation operators are applied to individuals to create a new population, and the best individuals propagate to new populations over the course of iterations. Compared to other stochastic methods, the genetic algorithm has the advantage that it can be parallelized with little effort and does not necessarily remain trapped in a sub-optimal local maximum or minimum of the target function. However, GA may provide local minima that steer the search in the wrong direction for some optimization problems. Differential evolution (DE) [34] is another evolutionary-based metaheuristic algorithm; it optimizes a problem by maintaining a population of candidate solutions and creating new candidates by combining existing ones, keeping the candidate solution with the best fitness value. DE can handle non-differentiable and non-linear cost functions, and only a few parameters steer the minimization. Parameter tuning is the main challenge in DE because the same parameters may not guarantee the global optimum solution. Apart from these, some other popular evolutionary-based algorithms are genetic programming (GP) [35], evolution strategy (ES) [36], and the biogeography-based optimizer (BBO) [37].
The second category is physics-based algorithms, in which each search agent moves throughout the search space according to physics rules such as gravitational force, electromagnetic force, and inertia force. The well-known physics-based metaheuristic algorithms are simulated annealing (SA) [38] and the gravitational search algorithm (GSA) [39]. Simulated annealing is inspired by annealing in metallurgy, which involves the heating and controlled cooling of a material; these attributes depend on its thermodynamic free energy. SA is advantageous in dealing with non-linear models and noisy data, and its main advantage over other search methods is its ability to search for the global optimal solution. However, it suffers from high computational time, especially if the fitness function is complex and non-linear in nature. The gravitational search algorithm is based on the law of gravity and mass interactions. The candidate solutions interact with each other through gravitational force, and their performance is measured by their masses. GSA requires only two input parameters to adjust, i.e., mass and velocity, and it is easy to implement. Its ability to find near-global-optimum solutions distinguishes GSA from other optimization algorithms. However, it suffers from high computational time and convergence problems if the initial population is not generated well. Some other popular algorithms are: big-bang big-crunch (BBBC) [40], charged system search (CSS) [41], the black hole (BH) algorithm [42], central force optimization (CFO) [43], the small-world optimization algorithm (SWOA) [44], the artificial chemical reaction optimization algorithm (ACROA) [45], the ray optimization (RO) algorithm [46], the galaxy-based search algorithm (GbSA) [47], and curved space optimization (CSO) [48].
The third category is swarm-based algorithms, which are inspired by the collective behavior of social creatures. This collective intelligence is based on the interaction of swarm members with each other. Swarm-based algorithms are easier to implement than evolutionary-based algorithms due to their smaller number of operators (they avoid selection, crossover, and mutation).
The most popular swarm-based algorithm is particle swarm optimization (PSO), which was proposed by Kennedy and Eberhart [49]. In PSO, particles move around the search space using a combination of the best solutions [50]. The whole process is repeated until the termination criterion is satisfied. The main advantage of PSO is that it involves no overlapping or mutation computation, and during simulation the most optimistic particle can transmit information to the other particles. However, PSO suffers from the stagnation problem.
Ant colony optimization (ACO) is another popular swarm intelligence algorithm, proposed by Dorigo [51]. The main inspiration behind this algorithm is the social behavior of ants in an ant colony, whose collective intelligence finds the shortest path between a food source and the nest. ACO can solve the travelling salesman problem and similar problems efficiently, which is an advantage of ACO over other approaches. However, the theoretical analysis of a problem is very difficult using ACO, and the computational cost during convergence is high.
The bat-inspired algorithm (BA) [52] mimics the echolocation behavior of bats. Another well-known swarm-based metaheuristic is the artificial bee colony (ABC) algorithm [53], which is inspired by the collective behavior of bees in finding food sources. The spotted hyena optimizer (SHO) [16] is a bio-inspired metaheuristic algorithm that mimics the searching, hunting, and attacking behaviors of spotted hyenas in nature; the main concept behind this technique is the social relationship and collective hunting behavior of spotted hyenas. Cuckoo search (CS) [54] is inspired by the obligate brood parasitism of cuckoo species, which lay their eggs in the nests of other species. Each egg in a nest represents a solution, and a cuckoo egg represents a new solution.
Emperor penguin optimizer (EPO) [30] is a recently developed bio-inspired metaheuristic algorithm that mimics the huddling behaviors of emperor penguins. The main steps of EPO are to generate huddle boundary, compute temperature around the huddle, calculate the distance, and find the effective mover.
The grey wolf optimizer (GWO) [55] is a very popular bio-inspired algorithm for solving real-life constrained problems. It mimics the leadership hierarchy and hunting mechanisms of grey wolves, employing four types of wolves, namely alpha, beta, delta, and omega, and implementing the hunting, searching, encircling, and attacking mechanisms. The performance of GWO was investigated on well-known test functions and classical engineering design problems.
The multi-verse optimizer (MVO) is a promising optimization algorithm proposed by Mirjalili et al. [56]. It is inspired by the theory of the multi-verse in physics, which consists of three main concepts: the white hole, the black hole, and the wormhole. The white hole and black hole concepts are appropriate for exploration, while the wormhole helps in the exploitation of the given search spaces.
The sine cosine algorithm (SCA) was proposed by Mirjalili [57] for solving numerical optimization problems. SCA generates multiple random solutions and fluctuates them toward the best optimal solution using mathematical models based on sine and cosine functions. The convergence speed of SCA is very high, which is helpful for local optima avoidance.
The other well-known metaheuristic algorithms are the fireworks algorithm (FWA) [58,59,60,61], monkey search [62], the bacterial foraging optimization algorithm [63], the firefly algorithm (FA) [64], the fruit fly optimization algorithm (FOA) [65], the golden section line search algorithm [66], the Fibonacci search method [67], the bird mating optimizer (BMO) [68], krill herd (KH) [69], the artificial fish-swarm algorithm (AFSA) [70], dolphin partner optimization (DPO) [71], the bee collecting pollen algorithm (BCPA) [72], and hunting search (HS) [73].
3 Proposed algorithm
In this section, the motivation and brief justification of the proposed algorithm are described in detail.
3.1 Motivation
Nature is a rich source of inspiration, and most new algorithms are nature-inspired. The majority of nature-inspired algorithms are based on some characteristics of biological systems; therefore, the largest fraction of nature-inspired algorithms are biology-based, or bio-inspired. Among bio-inspired algorithms, a special class has been developed by drawing inspiration from swarm intelligence. It has been observed from the literature that multiple-solution (population-based) swarm intelligence algorithms are able to solve real-life optimization problems: they can explore throughout the search space and exploit the global optimum. Moreover, population-based techniques are more reliable than single-solution-based techniques because they perform more function evaluations.
According to the no free lunch theorem [74], no optimization algorithm is able to solve all optimization problems. This fact attracts researchers from different fields to propose new optimization algorithms, and it motivates us to propose a new population-based metaheuristic algorithm.
Researchers have pointed out convergence and diversity difficulties for real-life problems. Hence, there is a need to develop an algorithm that maintains both convergence and diversity. In this paper, the navigation and foraging behaviors of the SSA algorithm are used to maintain diversity. The reasons to select these behaviors over others are:
1.
SSA algorithm eliminates the problem of missing selection individuals.
2.
The values of these behaviors are directly optimized, without any need for niching, which helps to maintain the diversity.
3.
SSA ensures that any approximation set that has high-quality value for a particular problem contains all optimal solutions.
However, the calculation of SSA parameters requires high computational effort, and SSA suffers from the overhead of maintaining the necessary information. To resolve these problems, the huddling behavior of the EPO algorithm is employed to maintain the information. Therefore, a novel hybrid algorithm is proposed that utilizes the features of both EPO and SSA.
3.2 Hybrid emperor penguin and salp swarm algorithm (ESA)
The first step is to initialize the population and the initial parameters of the ESA algorithm, as listed in Table 1. After initialization, the objective value of each search agent is calculated using the FITNESS function, as defined in line 4 of Algorithm 1, and the best search agent found so far in the search space is stored. Further, the huddling behavior is applied using Eq. (9) until a suitable result is found for each search agent. In line 6 of Algorithm 1, the position of each search agent is updated. Then, the leader and follower selection approaches are applied to update the positions of the search agents using Eq. (14).
Again, the objective value of each search agent is calculated to find the optimal solution. If any search agent goes beyond the boundary of the search space, it is adjusted back into it. The updated objective value of each search agent is calculated, and the parameters are updated whenever a better solution than the previous one is found. The algorithm stops when the stopping criterion is satisfied; this criterion, defined by the user, is the maximum number of iterations. Finally, the optimal solution is returned (see Fig. 2).
The pseudo-code of ESA algorithm is shown in Algorithm 1. There are some interesting points about the proposed ESA algorithm which are given below:
-
N(), A, and V assist the candidate solutions to behave more randomly in the search space and are responsible for avoiding collisions between search agents.
-
In common optimization algorithms, exploitation tends to increase the speed of convergence, while exploration tends to decrease it. Better exploration and exploitation are therefore achieved by adjusting the values of N(), A, and V.
-
The huddling and swarming behaviors of ESA in a search region define its effective collective behavior.
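The steps above can be sketched as a single loop. The code below is only a high-level illustration of the hybrid structure (initialize, huddling step, leader/follower step, boundary check, accept improvements); the inner update rules are simplified stand-ins, NOT the authors' exact Eqs. (9) and (14).

```python
import math
import random

def esa(fitness, dim, lb, ub, n_agents=30, max_itr=200):
    # Randomly generated initial population (one list of dim values per agent).
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n_agents)]
    best = min(pop, key=fitness)[:]
    for itr in range(1, max_itr + 1):
        c1 = 2.0 * math.exp(-((4.0 * itr / max_itr) ** 2))  # SSA coefficient
        for i in range(n_agents):
            new = []
            for d in range(dim):
                # EPO-flavoured huddling move toward the mover (stand-in for Eq. (9)):
                dist = abs(random.random() * best[d] - pop[i][d])
                x = best[d] - random.uniform(-1.0, 1.0) * dist
                # SSA-flavoured leader step blended in (stand-in for Eq. (14)):
                sign = 1.0 if random.random() >= 0.5 else -1.0
                x = 0.5 * (x + best[d] + sign * c1 * ((ub - lb) * random.random() + lb))
                new.append(min(max(x, lb), ub))  # boundary adjustment
            if fitness(new) < fitness(pop[i]):  # keep the better position
                pop[i] = new
        cand = min(pop, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand[:]
    return best  # stopping criterion: fixed maximum number of iterations
```

Running this sketch on a sphere function, for example `esa(lambda v: sum(x * x for x in v), 2, -5.0, 5.0)`, drives the population toward the origin while the decaying coefficient shifts the balance from exploration to exploitation.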
3.3 Computational complexity
In this subsection, the computational complexity of proposed ESA algorithm is discussed. Both the time and space complexities of the proposed algorithm are given below.
3.3.1 Time complexity
1.
Population initialization process requires \(\mathcal {O}(n \times d)\) time, where n indicates the population size and d indicates the dimension of a given problem.
2.
The fitness of each agent requires \(\mathcal {O}(\text {Max}_{\text {itr}} \times n \times d)\) time, where \(\text {Max}_{\text {itr}}\) is the maximum number of iterations to simulate the proposed algorithm.
3.
It requires \(\mathcal {O}(N)\) time, where N accounts for the huddling and swarm behaviors of EPO and SSA used for better exploration and exploitation.
Hence, the total time complexity of ESA algorithm is \(\mathcal {O}(\text {Max}_{\text {itr}}\times n \times d \times N)\).
3.3.2 Space complexity
The space complexity of the ESA algorithm is the maximum amount of space used at any one time, which is determined by its initialization process. Thus, the total space complexity of the ESA algorithm is \(\mathcal {O}(n \times d)\).
4 Experimental results and discussion
This section describes the experimentation on 53 standard benchmark test functions to evaluate the performance of the proposed algorithm. A detailed description of these benchmarks is presented below. Further, the results are compared with well-known metaheuristic algorithms.
4.1 Benchmark test functions
The 53 benchmark test functions are applied to the proposed algorithm to demonstrate its applicability and efficiency. These functions are divided into four main categories: unimodal [75], multimodal [64], fixed-dimension multimodal [64, 75], and IEEE CEC-2017 [76] test functions. The descriptions of these test functions are given in the “Appendix”, where Dim and Range indicate the dimension of the function and the boundary of the search space, respectively, and \(f_{\min }\) denotes the minimum value of the function.
The “Appendix” shows the characteristics of the unimodal, multimodal, fixed-dimension multimodal, and CEC-2017 benchmark test functions. Seven test functions (\(F_1\)–\(F_7\)) form the first category of unimodal test functions; these functions have only one global optimum. The second category consists of six test functions (\(F_8\)–\(F_{13}\)) and the third category includes ten test functions (\(F_{14}\)–\(F_{23}\)); these categories contain multiple local solutions, which are useful for examining the local optima problem. The fourth category consists of 30 CEC-2017 benchmark test functions (C1–C30).
4.2 Experimental setup
The proposed ESA is compared with well-known algorithms namely spotted hyena optimizer (SHO) [16], grey wolf optimizer (GWO) [55], particle swarm optimization (PSO) [49], multi-verse optimizer (MVO) [56], sine cosine algorithm (SCA) [57], gravitational search algorithm (GSA) [39], salp swarm algorithm (SSA) [31], emperor penguin optimizer (EPO) [30], and jSO [77]. The parameter values of these algorithms are set as they are recommended in their original papers. Table 1 shows the parameter settings of competitor algorithms. The experimentation has been done on Matlab R2014a (8.3.0.532) version using 64-bit Core i7 processor with 3.20 GHz and 8 GB main memory (Tables 2, 3).
4.3 Performance comparison
In order to demonstrate the effectiveness of the proposed algorithm, it is compared with well-known optimization algorithms on the unimodal, multimodal, fixed-dimension multimodal, and CEC-2017 benchmark test functions. The average and standard deviation of the best optimal solution are reported in the tables. For each benchmark test function, the ESA algorithm performs 30 independent runs, each of 1000 iterations.
4.3.1 Evaluation of test functions \(F_1\)–\(F_7\)
The unimodal test functions (\(F_1\)–\(F_7\)) are used to assess the exploitation capability of a metaheuristic algorithm. Table 4 shows the mean and standard deviation of the best optimal solution obtained by the above-mentioned algorithms on the unimodal test functions. For the \(F_1, F_2,\) and \(F_3\) test functions, SHO is the best optimizer, whereas ESA is the second best in terms of mean and standard deviation. ESA provides better results for the \(F_4, F_5, F_6,\) and \(F_7\) benchmark test functions. The results show that ESA is very competitive with the other algorithms and has better exploitation capability to find the best optimal solution very efficiently.
4.3.2 Evaluation of test functions \(F_8\)–\(F_{23}\)
Multimodal test functions have an ability to evaluate the exploration of an optimization algorithm. Tables 5 and 6 depict the performance of above-mentioned algorithms on multimodal test functions (\(F_8\)–\(F_{13}\)) and fixed-dimension multimodal test functions (\(F_{14}\)–\(F_{23}\)), respectively. From these tables, it can be seen that ESA is able to find optimal solution for nine test problems (i.e., \(F_8, F_{10}, F_{13}, F_{14}, F_{15}, F_{17}, F_{18}, F_{19},\) and \(F_{22}\)). For \(F_9, F_{11},\) and \(F_{16}\) test functions, SHO provides better optimal results than ESA. ESA is the second best algorithm on these test functions. PSO attains best optimal solution for \(F_{12}\) and \(F_{20}\) test functions. For \(F_{21}\) test function, GWO and ESA are the first and second best optimization algorithms, respectively. The results reveal that ESA obtains competitive results in majority of the test problems and has better exploration capability.
4.3.3 Evaluation of IEEE CEC-2017 test functions (\(C_1\)–\(C_{30}\))
The IEEE CEC-2017 special session is devoted to algorithms and techniques for solving real-parameter single-objective optimization without making use of the exact equations of the test functions. Table 7 shows the performance of the proposed ESA algorithm and other competitive approaches on the IEEE CEC-2017 test functions. The results reveal that ESA achieves the best optimal solution for most of the CEC-2017 benchmark test functions (i.e., C-4, C-5, C-8, C-10, C-11, C-12, C-13, C-20, C-22, C-25, C-26, C-29, C-30).
4.4 Convergence analysis
Figure 3 shows the convergence curves of the proposed ESA and the other competitor algorithms. ESA is very competitive over the benchmark test functions and exhibits three different convergence behaviors. In the first, ESA converges quickly in the initial iterations due to its adaptive mechanism. In the second, ESA converges gradually toward the optimum as the final iteration approaches. In the third, rapid convergence is observed from the initial iterations onward. The results reveal that the ESA algorithm maintains a proper balance between exploration and exploitation to find the global optimum.
4.5 Sensitivity analysis
The proposed ESA algorithm uses two key parameters, namely the maximum number of iterations and the number of search agents. The sensitivity of these parameters has been investigated by varying their values while keeping the other parameters fixed.
1.
Maximum number of iterations: the ESA algorithm was run for different numbers of iterations: 100, 500, 800, and 1000. Table 2 shows the effect of the number of iterations on the benchmark test functions. The results reveal that ESA converges toward the optimum as the number of iterations increases.
2.
Number of search agents: the ESA algorithm was run for different numbers of search agents (i.e., 30, 50, 80, 100). Table 3 shows the effect of the number of search agents on the benchmark test functions. It is observed that the objective value decreases as the number of search agents increases.
4.6 Scalability study
This subsection describes the effect of scalability on various test functions using the proposed ESA and other competitive algorithms. The dimensionality of the test functions is varied over different ranges. Figure 4 shows the performance of ESA and the other algorithms on the scalable benchmark test functions. The performance of ESA degrades only marginally, and it is the least affected, as the dimensionality of the search space increases, owing to its better capability of balancing exploration and exploitation.
4.7 Statistical testing
Apart from standard statistical measures such as the mean and standard deviation, an ANOVA test has been conducted to determine whether the results obtained from the proposed algorithm differ from those of the other competitor algorithms in a statistically significant way. The sample size for the ANOVA test is 30, with a 95% confidence interval. If the p value of a given comparison is less than 0.05, the difference is statistically significant. Table 8 shows the ANOVA test results on the benchmark test functions: the p values obtained for ESA are much smaller than 0.05 for all benchmark test functions. Therefore, the proposed ESA is statistically different from the other competitor algorithms.
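To make the test concrete, the sketch below computes the one-way ANOVA F-statistic over per-algorithm samples (e.g., the 30 run results per algorithm); the p value is then read from the F-distribution with \((k-1, n-k)\) degrees of freedom. This is a generic illustration, not the authors' analysis script.

```python
def anova_f(*groups):
    # One-way ANOVA F-statistic: ratio of between-group to within-group
    # mean squares. The p value follows from the F-distribution with
    # (k - 1, n - k) degrees of freedom; p < 0.05 indicates a statistically
    # significant difference between the algorithms' samples.
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

In practice one would call a library routine such as `scipy.stats.f_oneway`, which returns both the F-statistic and the p value directly.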
5 ESA for engineering problems
The proposed ESA algorithm has been tested and validated on six constrained and one unconstrained engineering problems: the pressure vessel, speed reducer, welded beam, tension/compression spring, 25-bar truss, rolling element bearing, and displacement of loaded structure design problems [78, 79]. These design problems involve constraints of different natures. Various penalty functions can be used to handle such constraints, including static, dynamic, annealing, adaptive, co-evolutionary, and death penalties [80].
The death penalty function handles solutions that violate the constraints by assigning them a fitness value of zero, thereby discarding infeasible solutions during optimization; it does not exploit any information about infeasible solutions. Owing to its low computational complexity and simplicity, the ESA algorithm is equipped with the death penalty function to handle both the constrained and unconstrained engineering design problems.
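A minimal sketch of death-penalty handling for a minimization problem; instead of the zero fitness mentioned above (which presumes fitness is maximized), an infeasible solution is simply assigned the worst possible objective value:

```python
import math

def death_penalty(objective, constraints):
    # Wrap an objective so that any solution violating a constraint g(z) <= 0
    # is discarded outright; no information about infeasible points is kept.
    def penalized(z):
        if any(g(z) > 0 for g in constraints):
            return math.inf   # infeasible: worst value for minimization
        return objective(z)
    return penalized
```

For example, minimizing \(z_1^2\) subject to \(z_1 \ge 1\) would wrap the objective with the single constraint function `lambda z: 1 - z[0]`.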
5.1 Constrained engineering problems
This subsection describes six constrained engineering problems and compares the results of ESA with other competitor approaches. A statistical analysis of these problems is also performed to validate the efficiency and effectiveness of the proposed algorithm.
5.1.1 Pressure vessel design problem
This problem was first proposed by Kannan and Kramer [81] to minimize the total cost of the material, forming, and welding of a cylindrical vessel. The schematic view of the pressure vessel, capped at both ends by hemispherical heads, is shown in Fig. 5. There are four design variables:
- \(T_s\) (\(z_1\), thickness of the shell)
- \(T_h\) (\(z_2\), thickness of the head)
- R (\(z_3\), inner radius)
- L (\(z_4\), length of the cylindrical section, not considering the head)
Among these four design variables, R and L are continuous, whereas \(T_s\) and \(T_h\) are restricted to integer multiples of 0.0625 in. The mathematical formulation of this problem is given below:
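The formulation appears as an equation in the original article. As a reference sketch, the cost function and constraint set widely used for this problem in the structural-optimization literature can be written as follows (the coefficients are assumptions carried over from [81], not reproduced from the paper's figure):

```python
import math

def pressure_vessel_cost(z):
    # z = (Ts, Th, R, L): shell thickness, head thickness, inner radius, length.
    z1, z2, z3, z4 = z
    return (0.6224 * z1 * z3 * z4 + 1.7781 * z2 * z3 ** 2
            + 3.1661 * z1 ** 2 * z4 + 19.84 * z1 ** 2 * z3)

def pressure_vessel_constraints(z):
    # Each g(z) must satisfy g(z) <= 0 for a feasible design.
    z1, z2, z3, z4 = z
    return [
        -z1 + 0.0193 * z3,                                               # minimum shell thickness
        -z2 + 0.00954 * z3,                                              # minimum head thickness
        -math.pi * z3 ** 2 * z4 - (4.0 / 3.0) * math.pi * z3 ** 3 + 1296000.0,  # minimum volume
        z4 - 240.0,                                                      # maximum length
    ]
```

Evaluating this cost at ESA's reported design variables gives approximately 5880, consistent with the fitness value reported for ESA to within rounding.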
Table 9 compares the best solutions obtained by ESA and other competitor algorithms, namely EPO, SHO, GWO, PSO, MVO, SCA, GSA, and SSA. The proposed ESA attains the optimal solution at \(z_{1-4} =(0.778092, 0.383236, 40.315052, 200.00000)\) with a corresponding fitness value of \(f(z_{1-4}) = 5879.9558\). The table shows that the ESA algorithm is able to find the best optimal design with minimum cost.
The statistical results for the pressure vessel design problem are tabulated in Table 10. It can be seen from Table 10 that ESA surpasses the other algorithms, providing the best solution in terms of the best, mean, and median values. The convergence behavior of the proposed ESA towards the best optimal design is shown in Fig. 6.
5.1.2 Speed reducer design problem
The speed reducer design problem, shown in Fig. 7, is a challenging benchmark because it has seven design variables [82]. The objective is to minimize the weight of the speed reducer subject to constraints on [83]:
- bending stress of the gear teeth;
- surface stress;
- transverse deflections of the shafts;
- stresses in the shafts.
The seven design variables (\(z_1\)–\(z_7\)) are the face width (b), module of teeth (m), number of teeth in the pinion (p), length of the first shaft between bearings (\(l_1\)), length of the second shaft between bearings (\(l_2\)), diameter of the first shaft (\(d_1\)), and diameter of the second shaft (\(d_2\)). The problem is formulated mathematically as follows:
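The full formulation appears as an equation in the original article. As a reference sketch, the weight objective commonly used for this problem in the literature is given below (the coefficients are assumptions carried over from [82, 83], and the eleven inequality constraints of the full problem are omitted):

```python
def speed_reducer_weight(z):
    # z = (b, m, p, l1, l2, d1, d2), i.e. the variables z1..z7 described above.
    z1, z2, z3, z4, z5, z6, z7 = z
    return (0.7854 * z1 * z2 ** 2 * (3.3333 * z3 ** 2 + 14.9334 * z3 - 43.0934)
            - 1.508 * z1 * (z6 ** 2 + z7 ** 2)      # shaft material removed from gearbox
            + 7.4777 * (z6 ** 3 + z7 ** 3)          # shaft weight
            + 0.7854 * (z4 * z6 ** 2 + z5 * z7 ** 2))
```

Under this formulation the weight at solutions near the reported optimum lies in the vicinity of 3000, the range quoted for this benchmark in the literature.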
Table 11 shows the comparison of the best obtained optimal solution with various optimization algorithms. The proposed ESA algorithm provides optimal solution at \(z_{1-7} =(3.50120, 0.7, 17, 7.3, 7.8, 3.33415, 5.26531)\) with corresponding fitness value as \(f(z_{1-7}) = 2993.9584\). The statistical results of ESA and competitor optimization algorithms are given in Table 12.
The results show that ESA outperforms the other metaheuristic optimization algorithms. Figure 8 shows the convergence behavior of ESA on the speed reducer design problem.
5.1.3 Welded beam design problem
The main objective of this design problem is to minimize the fabrication cost of the welded beam shown in Fig. 9. The optimization constraints are the shear stress (\(\tau\)), the bending stress (\(\theta\)) in the beam, the buckling load (\(P_c\)) on the bar, and the end deflection (\(\delta\)) of the beam. The problem has four design variables (\(z_1\)–\(z_4\)):
- h (\(z_1\), thickness of the weld)
- l (\(z_2\), length of the clamped bar)
- t (\(z_3\), height of the bar)
- b (\(z_4\), thickness of the bar)
The mathematical formulation is described as follows:
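The complete formulation appears as an equation in the original article. The fabrication-cost objective commonly used in the welded beam literature can be sketched as follows (the coefficients are assumptions from that literature; the constraint expressions for \(\tau\), \(\theta\), \(P_c\), and \(\delta\) are lengthy and omitted here):

```python
def welded_beam_cost(z):
    # z = (h, l, t, b): weld thickness, clamped-bar length, bar height, bar thickness.
    z1, z2, z3, z4 = z
    # First term: cost of the weld material; second term: cost of the bar stock.
    return 1.10471 * z1 ** 2 * z2 + 0.04811 * z3 * z4 * (14.0 + z2)
```

Evaluating this objective at designs near the optima reported in the literature gives costs of roughly 1.72, the same order as the fitness values in Table 13.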
The obtained best comparison between proposed ESA and other metaheuristics is presented in Table 13. Among other algorithms, the proposed ESA provides optimal solution at \(z_{1-4} =(0.203296, 3.471148, 9.035107, 0.201150)\) with corresponding fitness value equal to \(f(z_{1-4})=1.721026\). Table 14 shows the statistical comparison of the proposed algorithm and other competitor algorithms. ESA shows superiority to other algorithms in terms of best, mean, and median.
Figure 10 shows the convergence analysis of best optimal solution obtained from ESA for welded beam design problem.
5.1.4 Tension/compression spring design problem
The objective of this design problem is to minimize the weight of the tension/compression spring shown in Fig. 11. The optimization constraints are:
- shear stress;
- surge frequency;
- minimum deflection.
There are three design variables: the wire diameter (d), the mean coil diameter (D), and the number of active coils (P). The mathematical formulation of this problem is given below:
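As a reference sketch, the weight objective commonly used for this problem is shown below (treat it as an assumption from the spring design literature rather than the paper's exact equation; the constraint expressions are omitted):

```python
def spring_weight(z):
    # z = (d, D, P): wire diameter, mean coil diameter, number of active coils.
    d, D, P = z
    # Weight is proportional to total wire volume: (P + 2) coils of diameter D
    # made of wire with cross-section ~ d^2.
    return (P + 2) * D * d ** 2
```

At ESA's reported design variables this evaluates to roughly 0.0126, matching the order of the objective values in Table 15.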
Table 15 shows the comparison for the best solution obtained from the proposed ESA and other competitor algorithms in terms of design variables and objective values. ESA obtained best solution at design variables \(z_{1-3}=(0.051080, 0.342895, 12.0895)\) with an objective function value of \(f(z_{1-3})=0.012655526\). The results reveal that ESA performs better than the other competitor algorithms. The statistical results of tension/compression spring design problem for the reported algorithms are compared and tabulated in Table 16. It can be seen from Table 16 that ESA provides better statistical results than the other optimization algorithms in terms of best, mean, and median.
Figure 12 shows the convergence behavior of best optimal solution obtained from proposed ESA.
5.1.5 25-bar truss design problem
The truss design problem is a popular optimization problem [84, 85] (see Fig. 14). The structure has 10 nodes and 25 cross-sectional bar members, which are grouped into eight categories:
- Group 1: \(A_1\)
- Group 2: \(A_2, A_3, A_4, A_5\)
- Group 3: \(A_6, A_7, A_8, A_9\)
- Group 4: \(A_{10}, A_{11}\)
- Group 5: \(A_{12}, A_{13}\)
- Group 6: \(A_{14}, A_{15}, A_{17}\)
- Group 7: \(A_{18}, A_{19}, A_{20}, A_{21}\)
- Group 8: \(A_{22}, A_{23}, A_{24}, A_{25}\)
The other parameters that affect this problem are as follows:
- p (material density) = 0.0272 N/cm\(^3\) (0.1 lb/in.\(^3\))
- E (modulus of elasticity) = 68,947 MPa (10,000 ksi)
- Displacement limitation = 0.35 in.
- Maximum displacement = 0.3504 in.
- Design variable set = \(\{0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0, 2.1, 2.2, 2.3, 2.4, 2.6, 2.8, 3.0, 3.2, 3.4 \}\)
Table 17 shows the member stress limitations for this problem, and the loading conditions for the 25-bar truss are presented in Table 18. The comparison of the best obtained solutions among several algorithms is tabulated in Table 19. The proposed ESA performs better than the other algorithms in terms of best, average, and standard deviation, and converges efficiently towards the optimal solution, as shown in Fig. 13.
5.1.6 Rolling element bearing design problem
The main objective of this problem is to maximize the dynamic load-carrying capacity of a rolling element bearing, as depicted in Fig. 15. There are ten decision variables: pitch diameter (\(D_m\)), ball diameter (\(D_b\)), number of balls (Z), inner (\(f_i\)) and outer (\(f_o\)) raceway curvature coefficients, \(K_{\text {Dmin}}\), \(K_{\text {Dmax}}\), \(\varepsilon\), e, and \(\zeta\). The mathematical representation of this problem is given below:
Table 20 shows the comparison of the best obtained optimal solutions. The proposed ESA attains the optimal solution at \(z_{1-10} =(125, 21.41750, 10.94109, 0.510, 0.515, 0.4, 0.7, 0.3, 0.02, 0.6)\) with a corresponding fitness value of \(f(z_{1-10}) = 85070.085\). The statistical results for the rolling element bearing design problem are compared in Table 21. The results reveal that the proposed ESA gives the best solution with considerable improvement.
Figure 16 shows the convergence analysis of ESA algorithm and reveals that ESA is able to achieve best optimal solution.
5.2 Unconstrained engineering problem
This subsection describes the displacement of loaded structure design problem to minimize the potential energy.
5.2.1 Displacement of loaded structure design problem
A displacement is a vector defining the shortest distance between the initial and final positions of a given point. The objective of this problem is to minimize the potential energy in order to reduce the excess load on the structure. The loaded structure whose potential energy (\(f(\vec {z})\)) is to be minimized is shown in Fig. 17. The problem can be stated as follows:
Table 22 compares the best optimal solutions obtained by ESA and other metaheuristics, including EPO, SHO, GWO, PSO, MVO, SCA, GSA, and SSA. The proposed ESA attains the best optimum cost at \(\pi =167.2635\), showing that ESA is able to minimize the potential energy for the loaded structure problem.
The statistical results for the reported algorithms are tabulated in Table 23. From Table 23, it is noticed that the results obtained from ESA are far better than the other competitor algorithms in terms of best, mean, and median. Figure 18 shows the convergence analysis of best solution obtained from proposed ESA algorithm (Tables 24, 25, 26, 27).
In summary, ESA is an effective optimizer for solving both constrained and unconstrained engineering design problems with low computational cost and fast convergence speed.
6 Conclusion and future works
This paper presents a hybrid swarm-based bio-inspired metaheuristic algorithm called emperor penguin and salp swarm algorithm (ESA). The fundamental concepts behind this algorithm are the huddling and swarm behaviors of EPO and SSA algorithms, respectively. The proposed ESA algorithm has been tested on fifty-three benchmark test functions. It is observed from statistical analysis that ESA attains global optimal solution with better convergence as compared to other competitive algorithms.
For the CEC-2017 benchmark test functions, the performance of ESA is found to be accurate and consistent. The effect of scalability on the performance of ESA has also been investigated; the results reveal that ESA is less susceptible to increasing dimensionality than the other algorithms. A sensitivity analysis of ESA has also been carried out.
Moreover, ESA was applied to six constrained and one unconstrained engineering design problems to show its effectiveness and efficacy. On the basis of these results, it can be concluded that the proposed ESA is applicable to engineering design problems. In the future, ESA may be extended to multi-objective optimization problems. Binary and many-objective versions of ESA would also be valuable contributions, as would extensions to online large-scale optimization and further engineering applications.
References
Kaveh A, Shahrouzi M (2007) A hybrid ant strategy and genetic algorithm to tune the population size for efficient structural optimization. Eng Comput 24(3):237–254
Kaveh A, Shahrouzi M (2008) Dynamic selective pressure using hybrid evolutionary and ant system strategies for structural optimization. Int J Numer Methods Eng 73(4):544–563
Singh P, Rabadiya K, Dhiman G (2018) A four-way decision-making system for the Indian summer monsoon rainfall. Mod Phys Lett B 32(25):1850304
Singh P, Dhiman G (2018) A hybrid fuzzy time series forecasting model based on granular computing and bio-inspired optimization approaches. J Comput Sci 27:370–385 [Online]. http://www.sciencedirect.com/science/article/pii/S1877750317300923
Singh P, Dhiman G, Kaur A (2018) A quantum approach for time series data based on graph and Schrödinger equations methods. Mod Phys Lett A 33(35):1850208
Kaur A, Kaur S, Dhiman G (2018) A quantum method for dynamic nonlinear programming technique using Schrödinger equation and Monte Carlo approach. Mod Phys Lett B 1850374
Dhiman G, Kaur A (2019) A hybrid algorithm based on particle swarm and spotted hyena optimizer for global optimization. Soft computing for problem solving. Springer, Berlin, pp 599–615
Dhiman G, Kumar V (2019) Spotted hyena optimizer for solving complex and non-linear constrained engineering problems. Harmony search and nature inspired optimization algorithms. Springer, Berlin, pp 857–867
Kaur A, Dhiman G (2019) A review on search-based tools and techniques to identify bad code smells in object-oriented systems. Harmony search and nature inspired optimization algorithms. Springer, Berlin, pp 909–921
Dhiman G, Kaur A (2017) Spotted hyena optimizer for solving engineering design problems. In: Machine learning and data science (MLDS), 2017 international conference on IEEE, pp 114–119
Dhiman G, Kumar V (2018) Multi-objective spotted hyena optimizer: a multi-objective optimization algorithm for engineering problems. Knowl Based Syst 150:175–197 [Online]. http://www.sciencedirect.com/science/article/pii/S0950705118301357
Dhiman G, Kaur A (2018) Optimizing the design of airfoil and optical buffer problems using spotted hyena optimizer. Designs 2(3):28
Dhiman G, Kumar V (2018) Knrvea: a hybrid evolutionary algorithm based on knee points and reference vector adaptation strategies for many-objective optimization. Appl Intell 1–27
Dhiman G, Guo S, Kaur S (2018) Ed-sho: a framework for solving nonlinear economic load power dispatch problem using spotted hyena optimizer. Mod Phys Lett A 33(40):1850239
Dhiman G, Kumar V (2018) Emperor penguin optimizer: a bio-inspired algorithm for engineering problems. Knowl Based Syst 159:20–50
Dhiman G, Kumar V (2017) Spotted hyena optimizer: a novel bio-inspired based metaheuristic technique for engineering applications. Adv Eng Softw 114:48–70
Verma S, Kaur S, Dhiman G, Kaur A (2019) Design of a novel energy efficient routing framework for wireless nanosensor networks. In: 2018 first international conference on secure cyber computing and communication (ICSCCC). IEEE, pp 532–536
Dhiman G, Singh P, Kaur H, Maini R (2019) DHIMAN: a novel algorithm for economic dispatch problem based on optimization method using Monte Carlo simulation and astrophysics concepts. Mod Phys Lett A 34(04):1950032
Dhiman G, Kaur A (2019) STOA: a bio-inspired based optimization algorithm for industrial engineering problems. Eng Appl Artif Intell 82:148–174
Singh P, Dhiman G, Guo S, Maini R, Kaur H, Kaur A, Kaur H, Singh J, Singh N (2019) A hybrid fuzzy quantum time series and linear programming model: special application on TAIEX index dataset. Mod Phys Lett A 1950201
Dhiman G (2019) MOSHEPO: a hybrid multi-objective approach to solve economic load dispatch and micro grid problems. Appl Intell
Chandrawat RK, Kumar R, Garg B, Dhiman G, Kumar S (2017) An analysis of modeling and optimization production cost through fuzzy linear programming problem with symmetric and right angle triangular fuzzy number. In: Proceedings of sixth international conference on soft computing for problem solving. Springer, pp 197–211
Singh P, Dhiman G (2017) A fuzzy-LP approach in time series forecasting. In: International conference on pattern recognition and machine intelligence, Springer, pp 243–253
Dhiman G, Kumar V (2018) Astrophysics inspired multi-objective approach for automatic clustering and feature selection in real-life environment. Mod Phys Lett B 32(31):1850385
Dhiman G, Kumar V (2019) Seagull optimization algorithm: theory and its applications for large-scale industrial engineering problems. Knowl Based Syst 165:169–196
Singh P, Dhiman G (2018) Uncertainty representation using fuzzy-entropy approach: special application in remotely sensed high-resolution satellite images (RSHRSIs). Appl Soft Comput 72:121–139 [Online]. http://www.sciencedirect.com/science/article/pii/S1568494618304265
Kaveh A, Rad SM (2010) Hybrid genetic algorithm and particle swarm optimization for the force method-based simultaneous analysis and design. Iran J Sci Technol 34(B1):15
Kaveh A, Zolghadr A (2012) Truss optimization with natural frequency constraints using a hybridized CSS-BBBC algorithm with trap recognition capability. Comput Struct 102:14–27
Kaveh A, Javadi SM (2014) An efficient hybrid particle swarm strategy, ray optimizer, and harmony search algorithm for optimal design of truss structures. Period Polytech Civ Eng 58(2):155–171
Dhiman G, Kumar V (2018) Emperor penguin optimizer: a bio-inspired algorithm for engineering problems. Knowl Based Syst 159:20–50 [Online]. http://www.sciencedirect.com/science/article/pii/S095070511830296X
Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191
Waters A, Blanchette F, Kim AD (2012) Modeling huddling penguins. PLoS One 7(11):e50277
Holland JH (1992) Genetic algorithms. Sci Am 267(1):66–72
Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359. https://doi.org/10.1023/A:1008202821328 [Online]
Koza JR (1992) Genetic programming: on the programming of computers by means of natural selection. MIT Press, New York
Beyer H-G, Schwefel H-P (2002) Evolution strategies—a comprehensive introduction. Nat Comput 1(1):3–52. https://doi.org/10.1023/A:1015059928466 [Online]
Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713
Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248. http://www.sciencedirect.com/science/article/pii/S0020025509001200 [Online]
Erol OK, Eksin I (2006) A new optimization method: Big bang-big crunch. Adv Eng Softw 37(2):106–111. http://www.sciencedirect.com/science/article/pii/S0965997805000827 [Online]
Kaveh A, Talatahari S (2010) A novel heuristic optimization method: charged system search. Acta Mech 213(3–4):267–289
Hatamlou A (2013) Black hole: a new heuristic optimization approach for data clustering. Inf Sci 222:175–184. http://www.sciencedirect.com/science/article/pii/S0020025512005762 [Online]
Formato RA (2009) Central force optimization: a new deterministic gradient-like optimization metaheuristic. Opsearch 46(1):25–51. https://doi.org/10.1007/s12597-009-0003-4 [Online]
Du H, Wu X, Zhuang J (2006) Small-world optimization algorithm for function optimization. Springer, Berlin, pp 264–273
Alatas B (2011) ACROA: artificial chemical reaction optimization algorithm for global optimization. Expert Syst Appl 38(10):13170–13180. http://www.sciencedirect.com/science/article/pii/S0957417411006531 [Online]
Kaveh A, Khayatazad M (2012) A new meta-heuristic method: ray optimization. Comput Struct 112:283–294
Shah Hosseini H (2011) Principal components analysis by the galaxy-based search algorithm: a novel metaheuristic for continuous optimisation. Int J Comput Sci Eng 6:132–140
Moghaddam FF, Moghaddam RF, Cheriet M (2012) Curved space optimization: a random search based on general relativity theory. Neural Evol Comput
Kennedy J, Eberhart RC (1995) Particle swarm optimization. In: Proceedings of IEEE international conference on neural networks, pp 1942–1948
Slowik A, Kwasnicka H (2017) Nature inspired methods and their industry applications–swarm intelligence algorithms. IEEE Trans Ind Inf 99:1–1
Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization—artificial ants as a computational intelligence technique. IEEE Comput Intell Mag 1:28–39
Yang X-S (2010) A new metaheuristic bat-inspired algorithm. Springer, Berlin, pp 65–74
Karaboga D, Basturk B (2007) Artificial Bee Colony (ABC) optimization algorithm for solving constrained optimization problems. Springer, Berlin, pp 789–798
Yang XS, Deb S (2009) Cuckoo search via levy flights. In: World congress on nature biologically inspired computing, pp 210–214
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61. http://www.sciencedirect.com/science/article/pii/S0965997813001853 [Online]
Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 27(2):495–513. https://doi.org/10.1007/s00521-015-1870-7 [Online]
Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl Based Syst 96:120–133. http://www.sciencedirect.com/science/article/pii/S0950705115005043 [Online]
Tan Y, Zhu Y (2010) Fireworks algorithm for optimization. In: International conference in swarm intelligence. Springer, pp 355–364
Zheng S, Janecek A, Tan Y (2013) Enhanced fireworks algorithm. In: Evolutionary computation (CEC), 2013 IEEE congress on IEEE, pp 2069–2077
Ding K, Zheng S, Tan Y (2013) A GPU-based parallel fireworks algorithm for optimization. In: Proceedings of the 15th annual conference on genetic and evolutionary computation. ACM, pp 9–16
Zheng S, Janecek A, Li J, Tan Y (2014) Dynamic search in fireworks algorithm. In: Evolutionary computation (CEC), 2014 IEEE congress on IEEE, pp 3222–3229
Mucherino A, Seref O (2007) Monkey search: a novel metaheuristic search for global optimization. AIP conference proceedings 953(1)
Das S, Biswas A, Dasgupta S, Abraham A (2009) Bacterial foraging optimization algorithm: theoretical foundations, analysis, and applications. Springer, Berlin, pp 23–55
Yang X-S (2010) Firefly algorithm, stochastic test functions and design optimisation. Int J Bio-Inspired Comput 2(2):78–84. https://doi.org/10.1504/IJBIC.2010.032124
Pan W-T (2012) A new fruit fly optimization algorithm: taking the financial distress model as an example. Knowl Based Syst 26:69–74
Wang Y, Wu S, Li D, Mehrabi S, Liu H (2016) A part-of-speech term weighting scheme for biomedical information retrieval. J Biomed Inf 63:379–389. http://www.sciencedirect.com/science/article/pii/S1532046416301125 [Online]
Orozco-Henao C, Bretas A, Chouhy-Leborgne R, Herrera-Orozco A, Marin-Quintero J (2017) Active distribution network fault location methodology: a minimum fault reactance and fibonacci search approach. Int J Electr Power Energy Syst 84:232–241. http://www.sciencedirect.com/science/article/pii/S0142061516302307 [Online]
Askarzadeh A (2014) Bird mating optimizer: an optimization algorithm inspired by bird mating strategies. Commun Nonlinear Sci Numer Simul 19(4):1213–1228
Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun Nonlinear Sci Numer Simul 17(12):4831–4845
Neshat M, Sepidnam G, Sargolzaei M, Toosi AN (2014) Artificial fish swarm algorithm: a survey of the state-of-the-art, hybridization, combinatorial and indicative applications. Artif Intell Rev 42(4):965–997
Shiqin Y, Jianjun J, Guangxing Y (2009) A dolphin partner optimization. In: Proceedings of the WRI global congress on intelligent systems, pp 124–128
Lu X, Zhou Y (2008) A novel global convergence algorithm: bee collecting pollen algorithm. In: 4th international conference on intelligent computing, Springer, pp 518–525
Oftadeh R, Mahjoob M, Shariatpanahi M (2010) A novel meta-heuristic optimization algorithm inspired by group hunting of animals: hunting search. Comput Math Appl 60(7):2087–2098. http://www.sciencedirect.com/science/article/pii/S0898122110005419 [Online]
Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
Digalakis J, Margaritis K (2001) On benchmarking functions for genetic algorithms. Int J Comput Math 77(4):481–506
Awad N, Ali M, Liang J, Qu B, Suganthan P (2016) Problem definitions and evaluation criteria for the CEC 2017 special session and competition on single objective bound constrained real-parameter numerical optimization. In: Technical report, Nanyang Technological University Singapore
Brest J, Maučec MS, Bošković B (2017) Single objective real-parameter optimization: algorithm JSO. In: Evolutionary computation (CEC), 2017 IEEE congress on IEEE, pp 1311–1318
Kaveh A (2014) Advances in metaheuristic algorithms for optimal design of structures. Springer, Berlin
Kaveh A, Ghazaan MI (2018) Meta-heuristic algorithms for optimal design of real-size structures. Springer, Berlin
Coello CAC (2002) Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput Methods Appl Mech Eng 191(11–12):1245–1287. http://www.sciencedirect.com/science/article/pii/S0045782501003231 [Online]
Kannan B, Kramer SN (1994) An augmented lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des 116(2):405–411
Gandomi AH, Yang X-S (2011) Benchmark problems in structural optimization. Springer, Berlin, pp 259–281
Mezura-Montes E, Coello CAC (2005) Useful infeasible solutions in engineering optimization with evolutionary algorithms. Springer, Berlin, pp 652–662
Kaveh A, Talatahari S (2009) A particle swarm ant colony optimization for truss structures with discrete variables. J Construct Steel Res 65(8–9):1558–1568
Kaveh A, Talatahari S (2009) Particle swarm optimizer, ant colony strategy and harmony search scheme hybridized for optimization of truss structures. Comput Struct 87(5–6):267–283
Bichon CVCBJ (2004) Design of space trusses using ant colony optimization. J Struct Eng 130(5):741–751
Schutte J, Groenwold A (2003) Sizing design of truss structures using particle swarms. Struct Multidiscip Optim 25(4):261–269. https://doi.org/10.1007/s00158-003-0316-5 [Online]
Kaveh A, Talatahari S (2010) Optimal design of skeletal structures via the charged system search algorithm. Struct Multidiscip Optim 41(6):893–911
Kaveh A, Talatahari S (2009) Size optimization of space trusses using big bang-big crunch algorithm. Comput Struct 87(17–18):1129–1140
Ethics declarations
Conflict of interest
The author declares that he has no conflict of interest.
Appendix: Unimodal, multimodal, and fixed-dimension multimodal benchmark test functions
1.1 Unimodal benchmark test functions
1.1.1 Sphere model
1.1.2 Schwefel’s problem 2.22
1.1.3 Schwefel’s problem 1.2
1.1.4 Schwefel’s problem 2.21
1.1.5 Generalized Rosenbrock’s function
1.1.6 Step function
1.1.7 Quartic function
1.2 Multimodal benchmark test functions
1.2.1 Generalized Schwefel’s problem 2.26
1.2.2 Generalized Rastrigin’s function
1.2.3 Ackley’s function
1.2.4 Generalized Griewank function
1.2.5 Generalized penalized functions
-
$$\begin{aligned}&F_{12}(z)= \frac{\pi }{30}\left\{ 10\sin ^2(\pi x_1) + \sum _{i=1}^{29}(x_i - 1)^2 [1 + 10\sin ^2(\pi x_{i+1})] + (x_{30} - 1)^2 \right\} \\&\quad + \sum _{i=1}^{30}u(z_i, 10, 100, 4), \quad -50 \le z_i \le 50 , \quad f_{\min } = 0 , \quad \text {Dim} = 30 \end{aligned}$$
-
$$\begin{aligned}&F_{13}(z)= 0.1\left\{ \sin ^2(3\pi z_1) + \sum _{i=1}^{29}(z_i - 1)^2 [1 + \sin ^2(3\pi z_i + 1)] + (z_{30} - 1)^2 [1 + \sin ^2(2\pi z_{30})] \right\} \\&\quad + \sum _{i=1}^{30}u(z_i, 5, 100, 4), \quad -50 \le z_i \le 50 , \quad f_{\min } = 0 , \quad \text {Dim} = 30, \end{aligned}$$
where \(x_i = 1 + \dfrac{z_i+1}{4}\) and
$$\begin{aligned} u(z_i, a, k, m) = {\left\{ \begin{array}{ll} k(z_i - a)^m &{} z_i > a\\ 0 &{} -a \le z_i \le a\\ k(-z_i - a)^m &{} z_i < -a \end{array}\right. } \end{aligned}$$
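The piecewise penalty \(u\) translates directly to code; a small sketch:

```python
def u(z, a, k, m):
    # Penalty term used by F12 and F13: zero inside [-a, a], polynomial growth outside.
    if z > a:
        return k * (z - a) ** m
    if z < -a:
        return k * (-z - a) ** m
    return 0.0
```

For instance, with a = 5, k = 100, m = 4, a point one unit outside the band is penalized by 100, and points inside the band contribute nothing.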
1.3 Fixed-dimension multimodal benchmark test functions
1.3.1 Shekel’s Foxholes function
1.3.2 Kowalik’s function
1.3.3 Six-hump camel-back function
1.3.4 Branin function
1.3.5 Goldstein–Price function
1.3.6 Hartman’s family
-
$$\begin{aligned}&F_{19}(z)= -\sum _{i=1}^4c_i \text {exp}\left( -\sum _{j=1}^3 a_{ij}(z_j - p_{ij})^2\right) \\&\quad 0 \le z_j \le 1 , \quad f_{\min } = -3.86 , \quad \text {Dim} = 3 \\ \end{aligned}$$
-
$$\begin{aligned}&F_{20}(z)= -\sum _{i=1}^4c_i \text {exp}\left( -\sum _{j=1}^6 a_{ij}(z_j - p_{ij})^2\right) \\&\quad 0 \le z_j \le 1 , \quad f_{\min } = -3.32 , \quad \text {Dim} = 6 \end{aligned}$$
1.3.7 Shekel’s Foxholes function
-
$$\begin{aligned}&F_{21}(z)= -\sum _{i=1}^5[(X - a_i)(X - a_i)^T + c_i]^{-1}\\&\quad 0 \le z_i \le 10 , \quad f_{\min } = -10.1532 , \quad \text {Dim} = 4 \\ \end{aligned}$$
-
$$\begin{aligned}&F_{22}(z)= -\sum _{i=1}^7[(X - a_i)(X - a_i)^T + c_i]^{-1} \\&\quad 0 \le z_i \le 10 , \quad f_{\min } = -10.4028 , \quad \text {Dim} = 4 \\ \end{aligned}$$
-
$$\begin{aligned}&F_{23}(z)= -\sum _{i=1}^{10}[(X - a_i)(X - a_i)^T + c_i]^{-1} \\&\quad 0 \le z_i \le 10 , \quad f_{\min } = -10.536 , \quad \text {Dim} = 4 \end{aligned}$$
1.4 CEC-2017 benchmark test functions
The detailed descriptions of the well-known CEC-2017 benchmark test functions (C1–C30) are given in Table 28.
Dhiman, G. ESA: a hybrid bio-inspired metaheuristic optimization approach for engineering problems. Engineering with Computers 37, 323–353 (2021). https://doi.org/10.1007/s00366-019-00826-w