Abstract
The whale optimization algorithm (WOA) is a popular meta-heuristic algorithm. When optimizing complex multi-dimensional problems, however, it suffers from poor convergence behavior and low exploration efficiency. To improve both, we propose a novel enhanced global exploration whale optimization algorithm (EGE-WOA). First, since Lévy flights strengthen global space search, the EGE-WOA introduces them, for both unconstrained and constrained optimization problems, to enhance its global exploration efficiency. Second, the EGE-WOA improves its convergence behavior by introducing new convergent dual adaptive weights. Finally, inspired by the way sperm whales hunt by emitting high-frequency ultrasound, the EGE-WOA introduces a new mechanism for judging the predation state of a whale: the three predation states are distinguished by the difference between the fitness value of the optimal whale individual and that of any whale individual. This judgment mechanism effectively improves the global exploration efficiency of the WOA. For unconstrained and constrained optimization problems, the EGE-WOA combines the Lévy flights and the judgment mechanism in different ways to achieve efficient exploration and better convergence behavior. Experimental results on 33 unconstrained benchmark functions and 6 constrained real cases show that the mean and standard deviation of the EGE-WOA are better than those of the other algorithms.
1 Introduction
Optimization problems are common in all aspects of society and life, and are a research hotspot in many fields such as image compression [1], path planning [2], structural optimization [3, 4], parameter estimation [5], skeletal structure size optimization [6], distribution system optimization [7], and resource scheduling [8]. In many practical applications, optimization problems often exhibit dynamic, nonlinear, uncertain and high-dimensional characteristics [9]. The complexity of engineering optimization problems leads to more complex algorithms, which in turn tend to be less stable and to incur significantly higher computation costs.
To obtain more effective optimization methods, many authors have proposed good algorithms, which fall into two categories: sequential algorithms and random (stochastic) algorithms. Sequential algorithms mainly include the hill climbing method [10], the Newton iteration method [11], the simplex method [12], and the least squares method [13, 14]. These algorithms have good convergence behavior, but for complex optimization problems they easily fall into local optima. Random algorithms contain a random term, which provides exploration and search capability. Under the same initial conditions, random algorithms inevitably produce different solutions, so their repeatability is poor [15].
Most stochastic algorithms can be regarded as meta-heuristics, which are inspired by the activity of animals in nature: for example, the bat algorithm (BA) [16, 17], the firefly algorithm (FA) [4, 18], the grey wolf optimizer (GWO) [19,20,21], the moth-flame optimization (MFO) [22], the grasshopper optimization algorithm (GOA) [23], the bacterial foraging optimization (BFO) [24], the ant lion optimization (ALO) [25], the Harris hawks optimization (HHO) [26], and the barnacles mating optimizer (BMO) [27]. These emerging meta-heuristic algorithms [28,29,30] can suppress local optima to a certain extent when solving complex nonlinear problems, and are therefore usually applied to practical engineering problems [31].
In 2016, Seyedali Mirjalili proposed the whale optimization algorithm (WOA) [32]. After a humpback whale has found its prey, it first dives beneath the prey and then releases a distinctive bubble net along a circular path. The WOA works in three parts: spiral hunting, encircling hunting and searching for prey. Khaled et al. [33] utilized the WOA to optimize the scheduling problem of a power system and realized optimal reactive power dispatch. Yu et al. [34] applied the WOA to controller parameter optimization, and the optimized parameters make the control system more robust.
However, the WOA still suffers from low exploration efficiency, poor convergence behavior, and the risk of falling into a local optimum. To strengthen its optimization performance, many variants of the WOA have been proposed. When dealing with complex optimization scenarios, Huiling Chen et al. [35] argued that the traditional WOA easily falls into local optima; to solve the problem, they proposed a balanced variant called BWOA, which is more suitable than the WOA for optimizing complex scenes. Mohammad Tubishat et al. [36] encountered the same problem when optimizing a large number of data sets and proposed an improved algorithm (IWOA); compared with other meta-heuristic algorithms, the IWOA achieves the best accuracy in sentiment analysis classification.
In 2017, Ying Ling et al. proposed a new whale optimization algorithm based on Lévy flights (LWOA) [37]. In 2018, Yongquan Zhou used the LWOA to solve engineering optimization problems [38]. Huiling Chen et al. [39] argued that the WOA has poor convergence behavior, low exploration efficiency and easily falls into local optima when optimizing complex unconstrained continuous problems. To overcome these shortcomings, they proposed an enhanced variant called RDWOA. Experimental data show that the RDWOA is a promising variant of the WOA, with better exploration efficiency than other state-of-the-art algorithms.
Although the WOA, IWOA, BWOA and RDWOA have unique search mechanisms for global optimization problems, they are not entirely suitable for solving complex multi-dimensional problems. To strengthen the global exploration efficiency of the WOA and improve its convergence behavior, we study it further. We believe that the switching mechanism of the WOA's position update formulas and the two strategies of the RDWOA have the following shortcomings.
1. The switching mechanism of the WOA's three position update formulas relies on uniformly distributed random parameters, which makes the switching random, uncertain and blind.
2. The RDWOA's random spare strategy makes each individual approach the optimal individual with a certain probability. This does achieve better convergence behavior, but it can also cause the RDWOA to converge prematurely, which reduces its overall exploration efficiency.
3. The dual adaptive weight strategy proposed by the RDWOA does improve the exploitation accuracy and global exploration capability of the algorithm. However, as the iterative curves of these two weight parameters show, during the iterations of the RDWOA the adaptive weights gradually tend to diverge rather than converge. This not only makes the RDWOA deviate from the local optimum, but also harms its convergence behavior.
To address the above three shortcomings, while strengthening the WOA's global exploration capability and improving its convergence behavior, this paper proposes the EGE-WOA.
The novelties of the paper are as follows.
1. A new judgment mechanism for the whale position update: the original random switching value is replaced by the difference between the fitness of any individual and that of the best individual.
2. For constrained and unconstrained optimization problems, the EGE-WOA introduces Lévy flights in different ways.
3. For constrained and unconstrained optimization problems, the EGE-WOA introduces different convergent adaptive weights.
The paper is organized as follows: Sect. 2 reviews related research on whales, the WOA and the RDWOA. Section 3 presents the proposed EGE-WOA. Section 4 records the performance comparison of the algorithms, the comparison data, and the simulation data in low- and high-dimensional spaces. Section 5 demonstrates the efficiency of the EGE-WOA through real-world application cases. Finally, Sect. 6 gives the conclusions.
2 Related research
The whale is the largest animal in the world: adult whales can grow to 33 m long and weigh 181 tons [40]. Whales have infrasound/ultrasound hearing and rely on unique echolocation to search for prey or transmit information to each other. Within the whale family, humpback whales are huge baleen whales. Lacking chewing teeth, they prey mainly on groups of small fish and shrimp, and have therefore evolved a special foraging behavior called bubble-net feeding. The hunting process of humpback whales includes three stages: encircling prey, spiral search, and random search.
The sperm whale is a large whale with an extraordinary voice: its loudest sound reaches 234 decibels [41], the loudest in the animal kingdom, and could deafen a human standing beside it. The rumbling sound waves act like a bright light in the dark deep sea, enabling the sperm whale to detect a giant squid within 500 m, as shown in sub-figures (a, b) of Fig. 1. The squid's hearing system cannot detect the high-frequency ultrasound emitted by the sperm whale, so it cannot perceive the danger approaching quickly, as shown in sub-figures (b), (c) and (d) of Fig. 1. This enables sperm whales to capture their prey quickly, as shown in sub-figure (e).
2.1 A brief of WOA
2.1.1 Surround prey
Whales can identify the location of their prey through echolocation and surround the prey. In the process of surrounding the prey, the whale closest to the food represents the current local optimal solution. The other whales then gradually approach this whale, completing the encirclement of the prey.
The location update formula of each individual is as follows:
where \(X_{{{\text{local}}}} (t)\) represents the spatial position coordinates of the whale in the tth iteration, and \(X_{{{\text{best}}}} (t)\) represents the spatial position coordinates of the optimal whale in the tth iteration. A and B are the coefficient matrices:
where rand obeys [0 1] uniform random distribution. tmax is the maximum number of iterations. h represents the iteration variable, which decreases linearly from 2 to 0.
2.1.2 Spiral search prey
The whale gradually narrows the encircling circle while spiraling upward to obtain the food. The algorithm therefore includes two mechanisms, shrinking encirclement and spiral position update, and assumes that each mechanism is selected with probability 0.5:
where b stands for constant, and l obeys [− 1 1] uniform random distribution.
2.1.3 Random search prey
When the coefficient vector satisfies |A| ≥ 1, the whale is swimming outside the shrinking circle. At this time, individual whales conduct a random search based on each other's positions, and the mathematical model is as follows:
where \(X_{{{\text{rand}}}} (t)\) is the location of a random whale.
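The three hunting phases above can be sketched as a single position update routine. Below is a minimal Python sketch of the standard WOA update for one whale; the function and variable names are ours, and the spiral constant b = 1 is an assumed default:

```python
import math
import random

def woa_update(X, X_best, X_rand, h, b=1.0):
    """One WOA position update for a single whale (a hedged sketch).

    X, X_best, X_rand: lists of floats (current, best, and a random whale).
    h decreases linearly from 2 to 0 over the iterations.
    """
    A = 2.0 * h * random.random() - h   # coefficient A
    B = 2.0 * random.random()           # coefficient B
    p = random.random()                 # 50/50 switch between mechanisms
    if p < 0.5:
        if abs(A) < 1:
            # shrinking encirclement around the best whale
            return [xb - A * abs(B * xb - x) for x, xb in zip(X, X_best)]
        # |A| >= 1: random search around a randomly chosen whale
        return [xr - A * abs(B * xr - x) for x, xr in zip(X, X_rand)]
    # spiral update toward the best whale
    l = random.uniform(-1, 1)
    return [abs(xb - x) * math.exp(b * l) * math.cos(2 * math.pi * l) + xb
            for x, xb in zip(X, X_best)]
```

A full optimizer would call this update for every whale each iteration, re-evaluate fitness, and refresh X_best.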
2.2 Brief introduction of RDWOA
2.2.1 Random spare method
The method replaces some dimensions of the current individual's position vector with the best individual's corresponding values when a certain condition holds. The condition is given by the following equation:
where rand obeys [0 1] uniform random distribution. iter represents the current iteration value. Max_iter represents the maximum iteration value.
When inequality (6) is satisfied, the random spare mechanism starts. Although the method improves convergence behavior and exploration ability, it may also cause the RDWOA to converge prematurely, reducing its exploration efficiency and convergence behavior.
2.2.2 Dual adaptive weight
When an individual's position is not updated, s automatically increases by 1.
w1 and w2 have the same curve characteristics. We take w1 as the research object.
In the whole iterative process of the algorithm, when s takes different values, the curve characteristics change greatly.
Figure 2 shows the curve characteristics of w1 when s takes different constant values. As the local static graph shows, the curve of w1 changes greatly with s. When s = 1, w1 converges linearly to 0. When s = 200, w1 converges to 0 nonlinearly. When s = 500, the curve of w1 is divergent and does not converge.
The first half of the RDWOA is as follows, when FEs/MaxFEs ≤ 0.5:
The second half of the RDWOA is as follows, when FEs/MaxFEs > 0.5:
For unconstrained optimization problems, it is demonstrated that compared with the FA, BA and IWOA, the RDWOA has better global exploration performance.
3 Proposed EGE-WOA
Optimization problems can be divided into constrained and unconstrained optimization problems [42]. A constrained optimization problem is a nonlinear programming problem with constraints, such as the real cases in Sect. 5.
An unconstrained optimization problem selects the optimal solution, according to a certain index, from all possible alternatives of a problem, such as the 33 benchmark functions in Table 1.
3.1 The Lévy flights’ method
Lévy flights have been applied to optimization and optimal search, and the results show that they have strong global search capability [43]. Since the Lévy flights' method can improve an algorithm's ability to explore the global space, the EGE-WOA introduces it, as shown in the following formula:
where v and µ obey the standard normal distribution, and \(\varphi\) is given by the following formula:
where τ is the standard Gamma function.
In principle, the Lévy flights' method is needed only when the spatial positions of all individuals stagnate. In this paper, the method is applied when iter/Max_iter < 0.5:
In the formula, the spatial dimension of each individual X(t) is D; XD(t) is the D-dimensional position vector, and Lévy(D) is a D-dimensional vector of Lévy-distributed random numbers.
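A common way to draw such Lévy-distributed steps is Mantegna's algorithm, which matches the formulas above (µ and v standard normal, φ built from Gamma functions). The sketch below assumes the usual stability index β = 1.5, which is not stated explicitly in the text:

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-flight step length via Mantegna's algorithm.

    beta = 1.5 is an assumed stability index; mu and v are standard
    normal draws, and phi is the scale factor from the paper's formula.
    """
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    phi = (num / den) ** (1 / beta)
    mu = random.gauss(0, 1) * phi      # mu ~ N(0, phi^2)
    v = random.gauss(0, 1)             # v ~ N(0, 1)
    return mu / abs(v) ** (1 / beta)   # heavy-tailed step length

def levy_vector(dim, beta=1.5):
    """Levy(D): one independent Lévy step per dimension."""
    return [levy_step(beta) for _ in range(dim)]
```

The occasional very large steps produced by the heavy tail are what give Lévy flights their global-exploration character.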
3.2 The new random spare method
The method replaces some dimensions of the current individual's position vector with the best individual's corresponding values when a certain condition holds. The condition is given by the following equation:
3.3 The convergent adaptive weight
It can be seen from Fig. 2 that the weight curves of the RDWOA can be divergent and unstable, which weakens the global exploration efficiency.
The paper proposes a new nonlinear, convergent adaptive weight:
Figure 3 shows the curve characteristic of the new adaptive weight. Compared with Fig. 2, the curve characteristic in Fig. 3 is convergent and nonlinear.
3.4 The judgment mechanism of the whale’s position update formula
Based on this hunting habit of whales, we propose a new mechanism that mimics whale ultrasound by using the fitness distance between individuals as the switching criterion for the whale position update formulas, replacing the WOA's original random switching mechanism. We design different judgment mechanisms according to the type of continuous optimization problem (unconstrained or constrained).
According to the difference between the fitness value of the current best individual and any individual, a new judgment value of the switching position update formula is proposed:
where f(x*(t)) is the fitness value of the best individual, f(xrand(t)) is the fitness value of any individual, and d is the difference between the fitness values of the current optimal individual and any individual.
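The idea of classifying a whale into one of the three predation states from the fitness gap d can be sketched as follows. The thresholds eps_close and eps_near are hypothetical placeholders; the paper's concrete cut-offs are given by its own equations:

```python
def predation_state(f_best, f_rand, eps_close=1e-3, eps_near=1e-1):
    """Classify a whale from the fitness gap d = |f(x*) - f(x_rand)|.

    A sketch only: the two thresholds are our assumptions, standing in
    for the paper's switching conditions.
    """
    d = abs(f_best - f_rand)
    if d <= eps_close:
        return "spiral"      # very close to prey: spiral exploitation
    elif d <= eps_near:
        return "encircle"    # near prey: shrinking encirclement
    return "search"          # far from prey: global random search
```

The returned label would then select which of the three position update formulas to apply, in place of the WOA's uniform random switch.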
3.4.1 For continuous unconstrained optimization problems
The EGE-WOA proposed in the paper consists of two parts.
The first half of the EGE-WOA is as follows, when FEs/MaxFEs ≤ 0.2:
The second half of the EGE-WOA is as follows, when FEs/MaxFEs > 0.2:
3.4.2 For continuous constrained optimization problem
The EGE-WOA proposed in the paper consists of two parts:
The first half of the EGE-WOA is as follows, when FEs/MaxFEs ≤ 0.2:
The second half of the EGE-WOA is as follows, when FEs/MaxFEs > 0.2:
3.5 Opposition-based learning method (OBL)
OBL was first proposed by Tizhoosh [44] in 2005. The detailed description of OBL is as follows:
1. Suppose x is any real number in the interval [lb, ub]. The opposite number xop of x is defined as
$$x_{op} = lb + ub - x, \quad (34)$$
where lb ≤ ub, and lb and ub are real numbers. The definition extends similarly to the multi-dimensional case.
2. Suppose IP(x1, x2, …, xn) is a point in an n-dimensional space, where each xi lies in the interval [lb(i), ub(i)].
The opposite number OP of IP is defined as
where xiop is the coordinate of OP.
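Applying Eq. (34) component-wise gives the opposite point directly; a minimal sketch (function name ours):

```python
def opposite_point(x, lb, ub):
    """Opposition-based learning: component-wise x_op = lb + ub - x.

    x, lb, ub are equal-length lists giving the point and the per-
    dimension lower/upper bounds.
    """
    return [l + u - xi for xi, l, u in zip(x, lb, ub)]
```

Evaluating both a candidate and its opposite, and keeping the fitter of the two, is the usual way OBL is exploited in population initialization and updates.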
3.6 Random distribution method
After each iteration, the position of each individual whale is randomly redistributed within an area of radius k. The purpose of the method is to enhance the global search capability of the WOA:
where \(k = |ub - lb|/(2s)\), and s is the population size.
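The redistribution can be sketched per dimension as a uniform perturbation of radius k, clipped back into the search box. We read the radius as k = |ub − lb| / (2s); that parenthesization, the uniform perturbation, and the clipping are our assumptions:

```python
import random

def redistribute(X, lb, ub, pop_size):
    """Randomly re-place one whale within radius k of its position.

    Assumes k = |ub - lb| / (2 * pop_size) per dimension, a uniform
    perturbation, and clipping to the bounds (all hedged readings).
    """
    out = []
    for xi, l, u in zip(X, lb, ub):
        k = abs(u - l) / (2 * pop_size)
        xi_new = xi + random.uniform(-k, k)
        out.append(min(max(xi_new, l), u))  # keep inside the search box
    return out
```

Note that a larger population yields a smaller perturbation radius, so the redistribution stays local while still injecting diversity.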
4 Numerical simulations
4.1 Benchmark function
We choose 33 benchmark functions [45, 46] in Table 1. The 33 benchmark functions belong to unconstrained optimization problems (Fig. 4; Table 2).
4.2 Comparison with other meta-heuristic algorithms
The dimension of the search space, D, is 30. Each algorithm runs independently 20 times, and the mean and standard deviation are recorded.
As can be seen in Table 3, the mean and Std of the EGE-WOA and BMO are the smallest, reflecting that in the process of optimizing the benchmark functions, the EGE-WOA and BMO have the highest exploration efficiency. For F5, F16 and F27, the mean and standard deviation of the EGE-WOA are the smallest.
Compared with the GWO, FA, BA, MFO, BFO, FPA, GOA, ALO and HHO, the EGE-WOA has very obvious advantages: its mean and standard deviation are the smallest. Therefore, the EGE-WOA has strong global exploration efficiency.
In Fig. 5, the convergence behaviors of the EGE-WOA and BMO are the best among the GWO, FA, BA, MFO, BFO, FPA, GOA, ALO and HHO, and the EGE-WOA is significantly better than the BMO. For F4, F6 and F7, the HHO converges better than the GWO, FA, BA, MFO, BFO, FPA, GOA and ALO; the BMO converges significantly better than the HHO; and the EGE-WOA converges best of all. Therefore, in general, the global exploration efficiency and convergence behavior of the EGE-WOA are the best.
4.3 Comparison with other WOA variants
To objectively verify the global optimization performance of the EGE-WOA on the 33 unconstrained benchmark functions, we compare it with five representative whale algorithms: WOA, LWOA, BWOA, IWOA, and RDWOA.
In the experiment, for F1–F11 the number of search agents was set to 15. Each algorithm runs 50 times independently, and the maximum number of iterations of all algorithms is 1000. The experimental data are recorded in Table 4.
The data in Table 4 show that in the process of optimizing the unconstrained benchmark functions, the mean and standard deviation of the EGE-WOA are the smallest, for example on F2, F3, F4, F5, F6, F12, F14, F15, F16, F17, F24, F25, F26, F27 and F28. This shows that the EGE-WOA has the best global exploration efficiency.
The BWOA, RDWOA and EGE-WOA have the same mean and variance such as F1, F7, F8, F9, F10, F12, F18, F19, F20, F23, F29, F30 and F31. At this time, they successfully avoided falling into the local optimum and obtained the global optimum. It shows that the EGE-WOA, BWOA and RDWOA have the best global exploration efficiency.
The EGE-WOA and BWOA have the smallest mean and variance, such as F11, F22, F32, and F33.
To verify the global exploration efficiency of the EGE-WOA, the paper selects F1, F2, F3, F4, F5, F11, F12, F14, F15, F16, F22, F25, F26 and F27 to display the convergence curves of the six algorithms in Fig. 6.
From the convergence curves of the six algorithms in Fig. 6, it can be seen that the convergence behavior of the EGE-WOA is the best. The EGE-WOA can effectively enhance the global optimization efficiency of the WOA and improve its convergence behavior. It can effectively avoid the risk of falling into a local optimum.
For example, for F2, F3 and F4, the convergence curves of the WOA, IWOA, BWOA and RDWOA show that, being trapped in local optima, they obtain only different local optimal solutions rather than the global optimum.
For example, for F11, F12, F14 and F15, the EGE-WOA has the best convergence behavior. It has strong global exploration capabilities and can effectively avoid falling into local optimum. The convergence behaviors of the BWOA and RDWOA are worse than that of the EGE-WOA. The convergence behaviors of the WOA, LWOA and IWOA are the worst. They are unable to obtain the global optimal solution because they fall into the trap of local optimality.
Although the LWOA introduced Lévy flights, it did not employ them effectively. The simulation data show that for multi-dimensional unconstrained optimization problems, the LWOA cannot prevent the WOA from falling into local optima.
In summary, the EGE-WOA has the best global exploration capability and convergence behavior. In contrast, the optimization efficiency of the RDWOA and BWOA for unconstrained optimization problems is significantly better than that of the IWOA and WOA (Table 5).
4.4 The execution time of different algorithms
The execution time of different algorithms is tested on the same computer in the same environment. The experimental results are recorded in Table 6. Each algorithm runs independently 20 times.
Although Lévy flights can enhance the exploration efficiency of the EGE-WOA in the search space, they also increase the execution time of the algorithm.
It can be seen from Table 6 that for F1–F11, the WOA has the shortest execution time and the EGE-WOA the longest. For the four real cases, the WOA and RDWOA have the shortest execution times. Since the EGE-WOA introduces Lévy flights, its execution time is the longest.
5 Case studies of real-world applications
The purpose of this section is to verify the optimization performance of the six whale algorithms on constrained real engineering cases. The WOA, LWOA, IWOA, BWOA, RDWOA and EGE-WOA are evaluated on five classical engineering applications: cantilever beam design [47], pressure vessel design [47], speed reducer design [47], three-bar truss design [47], and welded beam design [42] (Fig. 7).
5.1 Cantilever beam
where \(0.01 \le x_{1} ,x_{2} ,x_{3} ,x_{4} ,x_{5} \le 100.\)
The abscissa of Fig. 8 is the number of iterations of each algorithm; the ordinate is the mean of the best objective value obtained at each iteration. Figure 8 shows that when the whale optimization algorithms optimize this constrained engineering case, their convergence curves differ clearly from those for unconstrained optimization problems.
The f(x) column in Table 7 records the optimal mean values of the six curves in Fig. 8. In Fig. 8 and Table 7, the optimal mean f(x) of the LWOA is the largest; those of the WOA and RDWOA are smaller than the LWOA's; the BWOA's is smaller than those of the WOA and RDWOA; the IWOA's is smaller than the BWOA's; and the EGE-WOA's is the smallest. The convergence behavior of the EGE-WOA in Fig. 8 is the best (Fig. 9).
5.2 Pressure vessel design (PVD)
where \(1.5 \times 0.0625 \le T_{s} ,T_{h} \le 99 \times 0.0625\), and \(10 \le R,L \le 200\).
For this constrained practical engineering problem, comparing the mean iteration curves of the six whale optimization algorithms in Fig. 10 shows that the LWOA has the worst convergence behavior. The WOA converges better than the RDWOA, the BWOA slightly better than the WOA, and the EGE-WOA best of all.
From the f(x) values of the six algorithms in Table 8, the f(x) of the LWOA is 15,331.3268, the largest, while that of the EGE-WOA is 5653.7587, the smallest. When optimizing this constrained real case, the optimization efficiency of the EGE-WOA is the best.
5.3 Speed reducer design (SRD)
The purpose of structural optimization is to minimize the total weight of the reducer (Fig. 11). The mathematical formula for this case is as follows:
where \(2.6 \le b \le 3.6,\quad 0.7 \le m \le 0.8,\quad 17 \le z \le 28,\quad 7.3 \le l_{1} ,l_{2} \le 8.3,\quad 2.9 \le d_{1} \le 3.9,\quad 5.0 \le d_{2} \le 5.5.\)
In Fig. 12, the convergence behaviors of the WOA and LWOA are the worst. The convergence curve of the BWOA is significantly better than that of the RDWOA, and the convergence behavior of the EGE-WOA is the best. From the f(x) in Table 9, the f(x) of the EGE-WOA is 2616.6264, the smallest, while the f(x) of the WOA and LWOA are both 2695.7386, the largest. The comparison shows that when optimizing the speed reducer design case, the exploration efficiency of the EGE-WOA is the best (Fig. 13).
5.4 A three-bar truss design
with \(l = 100\,{\text{cm}},\; P = 2\,{\text{kN/cm}}^{2},\; {\text{and }}\sigma = 2\,{\text{kN/cm}}^{2}\;(0 \le A_{1} ,A_{2} \le 1).\)
The three-bar truss design belongs to constrained problem optimization. Figure 14 shows that, except for the LWOA and RDWOA, the convergence behaviors of the other four algorithms differ little, with the EGE-WOA slightly better than the WOA, IWOA and BWOA. Table 10 shows that the f(x) of the EGE-WOA is 2.8284, slightly smaller than those of the WOA, LWOA, IWOA, BWOA and RDWOA. The optimal mean of the LWOA is 2.8693, the largest (Fig. 15).
5.5 Welded beam design
Consider: \(x = [h,l,t,b] = [x_{1} ,x_{2} ,x_{3} ,x_{4} ],\)
subject to
It can be seen from Fig. 16 that the convergence curve of the LWOA is the worst. The convergence curve of the EGE-WOA is slightly better than that of the WOA, IWOA and BWOA. It can be seen from Table 11 that the f(x) of the EGE-WOA is 1.8433, which is the smallest.
The five cases show that the LWOA has poor convergence behavior and exploration efficiency, indicating that the Lévy flights introduced by the LWOA do not play a positive role and cannot solve the problem of the algorithm easily falling into local optima.
5.6 The simulation optimal design of section parameters of hydraulic support top beam
The structural parts of the hydraulic support mainly include the top beam, the cover beam, the front and rear connecting rods, and the base. They are all box-shaped multi-cavity structures welded from steel plates, and their weight accounts for more than 70% of the total weight of the support. The following carries out the simulation optimization design of the structural parameters of the MTZ7200-20/32 hydraulic support top beam. The paper optimizes the design of the most dangerous section of the top beam under the concentrated load at the middle end, shown as section D–D in Fig. 17. The simulation optimization aims at the lightest weight.
The simulation optimization problem of the top beam section parameters is a constrained minimization problem. The general form of its mathematical model is as follows:
The objective function: \(\min f(x).\)
The nonlinear constraints: \(g_{i} (x) \le 0,\; i = 1,2,3, \ldots\).
Linear constraints: \(A(x) = B.\)
A represents the coefficient matrix of linear constraints. x is the design variable. B is a column vector.
5.6.1 The objective function and design variables
The external dimensions of the top beam of the hydraulic support are generally determined in the overall design. Therefore, the lightest weight is taken as the ultimate goal of the simulation optimization design, that is, minimizing the actual material area of the top beam section. According to Fig. 18, the objective function is as follows:
In the roof beam structure, the thickness of the upper and lower cover plates and the layout and size of the ribs should meet the requirements of the strength and stiffness of the roof beam. The reasonable selection of these section parameters directly determines the weight, reliability and structural stress distribution of the top beam. Therefore, it is necessary to select the section parameters of the top beam as the design variables of its structural simulation and optimization design.
Since the width of roof beam has been standardized, t1, t2, t3 and t4 are taken as design variables:
5.6.2 The constraint condition
The constraint conditions of hydraulic support vary with the frame type, external load condition and basic shape of section. In addition to the strength conditions, geometric constraints should also be met.
5.6.2.1 The strength condition
1. The bending strength condition
The maximum bending stress is used for checking at the section, and the bending strength condition is as follows:
$$\frac{{\sigma_{{\text{s}}} }}{\sigma } \ge n_{{\text{s}}} ,$$
$$\sigma = \frac{{3M(2t_{1} + t_{4} )}}{{t_{4}^{3} (t_{2} + t_{3} ) + Ct_{1}^{3} + 3Ct_{1} (t_{1} + t_{4} )^{2} }},$$
where \(\sigma_{s}\) is the yield limit of the material, MPa; \(n_{s}\) is the allowable safety factor; \(\sigma\) is the maximum bending stress of the calculated section, MPa; and \(M\) is the maximum bending moment of the calculated section, N mm.
Then, \(g_{1} (x) = n_{{\text{s}}} - \frac{{\sigma_{{\text{s}}} }}{\sigma (x)} \le 0.\)
2. The shear strength condition
When the shear stress of this section is the maximum, the shear strength needs to be checked:
$$\frac{[\tau ]}{\tau } \ge n_{\tau } ,$$
$$\tau = \frac{{Q\left[ {t_{1} C\frac{{t_{1} + t_{4} }}{2} + \frac{{t_{4}^{2} }}{2}(t_{2} + t_{3} )} \right]}}{{\left[ {\frac{{t_{4}^{3} (t_{2} + t_{3} )}}{3} + \frac{{Ct_{1}^{3} }}{3} + (t_{1} + t_{4} )^{2} t_{1} C} \right](t_{2} + t_{3} )}},$$
where \(n_{\tau }\) is the allowable safety factor; \([\tau ]\) is the allowable shear stress, MPa; \(\tau\) is the maximum shear stress of the calculated section, MPa; and \(Q\) is the maximum shear force, N.
Then, \(g_{2} (x) = n_{\tau } - \frac{[\tau ]}{{\tau (x)}} \le 0.\)
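The two strength constraints can be evaluated numerically from the formulas above. The sketch below assumes the parameter values substituted later in Sect. 5.6.2.3 (our reading: ns = nτ = 1.38, σs = 330 MPa, [τ] = 165 MPa, M = 3041 × 10^6 N mm, C = 1430 mm, Q = 3.566 MN); a feasible design satisfies g(x) ≤ 0:

```python
# Constants as read from the paper's mathematical model (Sect. 5.6.2.3);
# the safety-factor reading n_s = n_tau = 1.38 is our assumption.
M = 3041e6            # max bending moment, N*mm
C = 1430.0            # section width parameter, mm
N_S = N_TAU = 1.38    # allowable safety factors
SIGMA_S = 330.0       # yield limit, MPa
TAU_ALLOW = 165.0     # allowable shear stress, MPa
Q = 3.566e6           # max shear force, N

def g1(t1, t2, t3, t4):
    """Bending strength constraint: feasible when g1 <= 0."""
    sigma = 3 * M * (2 * t1 + t4) / (t4**3 * (t2 + t3) + C * t1**3
                                     + 3 * C * t1 * (t1 + t4)**2)
    return N_S - SIGMA_S / sigma

def g2(t1, t2, t3, t4):
    """Shear strength constraint: feasible when g2 <= 0."""
    tau = (Q * (t1 * C * (t1 + t4) / 2 + t4**2 / 2 * (t2 + t3))
           / ((t4**3 * (t2 + t3) / 3 + C * t1**3 / 3
               + (t1 + t4)**2 * t1 * C) * (t2 + t3)))
    return N_TAU - TAU_ALLOW / tau
```

In a penalty-based whale optimizer, candidate sections with g1 > 0 or g2 > 0 would be penalized so the search is driven toward feasible designs.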
5.6.2.2 The geometric constraints
1. The limit of top beam thickness
The thicker the top beam is, the smaller the stress. However, considering the ventilation section of the support, gas emission, pedestrian passing and other factors, a limit thickness Tmax is usually given in the design:
$$2t_{1} + t_{4} \le T_{\max } ,$$
where Tmax is the ultimate thickness of the top beam, mm.
2. The limitation of total web thickness
From the point of view of meeting the conditions of bending and shear strength, the thinner the web, the less material is used. However, considering that the top beam of the support should have a certain stiffness, a minimum thickness Cmin should be limited in the design:
$$- 2(t_{2} + t_{3} ) \le - C_{\min } ,$$
where Cmin is the lower bound of the total thickness of the web, mm.
3. The boundary conditions
The design variables are not only limited by the specifications of each plate, but also limited by the global or local stiffness and deformation, so their values cannot be too small:
$$- t_{1} \le - 10, - t_{2} \le - 10, - t_{3} \le - 10, - t_{4} \le - 50.$$
5.6.2.3 The mathematical model
By substituting the known parameters \(M = 3041 \times 10^{6}\) N mm, \(C = 1430\) mm, \(n_{s} = n_{\tau } = 1.38\), \(\sigma_{s} = 330\) MPa, \([\tau ] = 165\) MPa, and \(Q = 3.566\) MN into the objective function and constraints, the mathematical model is as follows:
The coefficient matrix A and column vector B of the linear constraints are:
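One consistent reconstruction from the constraints of Sect. 5.6.2.2 (an assumption on our part: the two nonlinear strength constraints are excluded, the four bound constraints are stacked below the two geometric ones, and \(Ax \le B\) with \(x = (t_{1} ,t_{2} ,t_{3} ,t_{4} )^{T}\)) is
$$A = \left[ {\begin{array}{*{20}c} 2 & 0 & 0 & 1 \\ 0 & { - 2} & { - 2} & 0 \\ { - 1} & 0 & 0 & 0 \\ 0 & { - 1} & 0 & 0 \\ 0 & 0 & { - 1} & 0 \\ 0 & 0 & 0 & { - 1} \\ \end{array} } \right],\qquad B = \left[ {\begin{array}{*{20}c} {T_{\max } } \\ { - C_{\min } } \\ { - 10} \\ { - 10} \\ { - 10} \\ { - 50} \\ \end{array} } \right].$$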
According to the different values of Tmax and Cmin, the minimization problem of the mathematical model is solved.
Considering the effect of different Tmax and Cmin values on the cross-sectional area, we choose eight different combinations of Tmax and Cmin for simulation optimization (Tables 12, 13).
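For readers who want to reproduce such runs, the linear geometric constraints of Sect. 5.6.2.2 can be folded into a static penalty term before handing the objective to any of the whale optimizers. The sketch below is one reasonable setup, not the authors' implementation: `make_penalized_objective` and `rho` are illustrative names, and the cross-sectional area function f itself (defined earlier in the paper) is passed in as a parameter:

```python
import numpy as np


def make_penalized_objective(f, t_max, c_min, rho=1e6):
    """Wrap an objective f(x), x = (t1, t2, t3, t4), with a static quadratic
    penalty for the linear geometric constraints A x <= b of Sect. 5.6.2.2.
    The nonlinear strength constraints g1, g2 could be appended the same way."""
    A = np.array([[ 2,  0,  0,  1],   # 2*t1 + t4 <= Tmax
                  [ 0, -2, -2,  0],   # -2*(t2 + t3) <= -Cmin
                  [-1,  0,  0,  0],   # t1 >= 10
                  [ 0, -1,  0,  0],   # t2 >= 10
                  [ 0,  0, -1,  0],   # t3 >= 10
                  [ 0,  0,  0, -1]],  # t4 >= 50
                 dtype=float)
    b = np.array([t_max, -c_min, -10, -10, -10, -50], dtype=float)

    def penalized(x):
        # Only positive residuals of A x - b are violations.
        violation = np.maximum(A @ np.asarray(x, dtype=float) - b, 0.0)
        return f(x) + rho * np.sum(violation**2)

    return penalized
```

A feasible design incurs no penalty, so inside the feasible region the optimizer sees the unmodified area f(x); the penalty weight `rho` only needs to be large enough that infeasible points never outperform feasible ones.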
The first case: Tmax = 440 mm and Cmin = 80 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 19).
The second case: Tmax = 440 mm and Cmin = 120 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 20).
The third case: Tmax = 440 mm and Cmin = 160 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 21).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 24.40032 | 44.70096 | 44.90829 | 300.0924 | 104,895.2481 |
| LWOA | 46.67345 | 47.00782 | 82.80762 | 185.2508 | 147,086.8826 |
| IWOA | 20.15372 | 48.28946 | 32.12598 | 343.5569 | 97,294.6936 |
| BWOA | 22.49702 | 66.63666 | 14.20119 | 326.811 | 99,690.9320 |
| RDWOA | 29.26249 | 26.44685 | 73.99081 | 261.4931 | 114,215.2917 |
| EGE-WOA | 17.61107 | 27.77492 | 13.308 | 323.8953 | 97,100.2436 |
The fourth case: Tmax = 440 mm and Cmin = 240 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 22).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 30.88702 | 90.23891 | 44.67687 | 241.2454 | 128,540.2534 |
| LWOA | 44.85422 | 63.15818 | 129.6596 | 173.4766 | 163,692.6104 |
| IWOA | 20.35711 | 73.59813 | 47.5478 | 317.4275 | 119,354.0369 |
| BWOA | 20.27847 | 50.32282 | 70.06599 | 318.5292 | 119,014.8264 |
| RDWOA | 30.91130 | 27.1493 | 110.7774 | 255.1978 | 132,821.1814 |
| EGE-WOA | 17.59403 | 10.08118 | 10.4889 | 266.9102 | 118,791.0842 |
The fifth case: Tmax = 400 mm and Cmin = 160 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 23).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 30.26314 | 39.09114 | 49.06834 | 271.2596 | 111,059.3476 |
| LWOA | 36.90048 | 79.22142 | 49.25809 | 216.5744 | 134,363.2622 |
| IWOA | 22.72781 | 51.88702 | 28.51705 | 321.9593 | 99,184.5120 |
| BWOA | 23.78628 | 32.7138 | 48.68922 | 307.8309 | 99,727.9174 |
| RDWOA | 26.5803 | 40.33903 | 64.66502 | 280.8413 | 114,842.3755 |
| EGE-WOA | 18.97744 | 25.59272 | 15.29235 | 317.9446 | 97,152.6332 |
The sixth case: Tmax = 360 mm and Cmin = 160 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 24).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 37.65714 | 44.42424 | 48.54884 | 214.1215 | 117,723.1160 |
| LWOA | 38.50779 | 161.3716 | 103.8331 | 195.9281 | 184,941.4289 |
| IWOA | 24.86709 | 33.70215 | 47.81462 | 299.0856 | 100,649.1497 |
| BWOA | 28.68981 | 47.134 | 34.06963 | 268.7986 | 103,480.5820 |
| RDWOA | 31.32396 | 35.98665 | 55.68149 | 251.4981 | 110,704.1973 |
| EGE-WOA | 23.37292 | 18.00547 | 12.62175 | 297.5980 | 99,315.4843 |
The seventh case: Tmax = 320 mm and Cmin = 160 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 25).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 33.83567 | 73.77938 | 33.84293 | 228.454 | 119,554.4854 |
| LWOA | 53.5757 | 120.2834 | 170.555 | 142.1108 | 191,718.6022 |
| IWOA | 32.56013 | 31.65445 | 49.404 | 242.3646 | 107,234.3334 |
| BWOA | 32.93235 | 43.19313 | 38.09523 | 239.6881 | 107,679.7659 |
| RDWOA | 51.21735 | 104.5579 | 51.75729 | 168.4327 | 153,268.0284 |
| EGE-WOA | 29.75091 | 37.49873 | 28.57456 | 255.7566 | 104,284.7457 |
The eighth case: Tmax = 300 mm and Cmin = 160 mm. The optimization curves and data of the six whale optimization algorithms for f(x) are as follows (Fig. 26).

Case: hydraulic support top beam

| Optimizer | \(t_1\) | \(t_2\) | \(t_3\) | \(t_4\) | f(x) |
| --- | --- | --- | --- | --- | --- |
| WOA | 33.59803 | 33.59092 | 120.5783 | 219.466 | 143,773.5264 |
| LWOA | 47.45665 | 171.0472 | 116.2638 | 156.2057 | 202,026.0851 |
| IWOA | 37.12945 | 47.78879 | 38.6945 | 215.4439 | 114,859.8591 |
| BWOA | 39.69094 | 56.74338 | 24.04799 | 204.0625 | 115,766.3774 |
| RDWOA | 60.89707 | 101.6072 | 69.36508 | 128.3987 | 166,081.0218 |
| EGE-WOA | 34.44039 | 10.00754 | 10.40055 | 227.2008 | 109,277.5338 |
The convergence curves and optimization results of the above eight cases show that the EGE-WOA has the best convergence behavior and the highest optimization efficiency among the six algorithms.
6 Conclusions
For global continuous optimization, the WOA suffers from poor exploration efficiency and weak convergence behavior. To improve the global exploration efficiency of the WOA, the IWOA and BWOA were proposed. Although these two algorithms improve the global exploration efficiency and convergence behavior of the WOA to a certain extent, they do not effectively avoid the risk of falling into a local optimum. The exploration efficiency of the RDWOA for unconstrained continuous optimization problems is significantly higher than that of the IWOA, BWOA and WOA. However, the experimental data in Sect. 5 show that, for constrained continuous optimization problems, the RDWOA has very poor exploration efficiency compared with the WOA, IWOA and BWOA, and its convergence behavior is also significantly worse than that of the other variants.
To enhance the exploration efficiency and convergence behavior of the WOA in unconstrained and constrained continuous optimization problems, we propose a novel whale optimization algorithm (EGE-WOA).
For the unconstrained global continuous optimization problem, it can be seen from the experimental results of 33 benchmark functions that compared with the WOA, IWOA, BWOA and RDWOA, the global exploration efficiency and convergence behavior of the EGE-WOA have been significantly improved. The EGE-WOA can effectively avoid the risk of the algorithm falling into a local optimum.
For the constrained global continuous optimization problem, the experimental results of six real engineering application cases show that, compared with the WOA, IWOA, BWOA and RDWOA, the EGE-WOA still has strong global exploration efficiency and a better convergence curve.
In summary, the EGE-WOA can strengthen the global exploration efficiency and convergence behavior of the WOA.
In the future, we will study and apply the whale optimization algorithm in several directions, for example multi-objective optimization. In practical applications, the WOA can be used to provide optimal parameters for image segmentation and machine-learning models.
References
Chavan PP, Rani BS, Murugan M, Chavan P (2020) A novel image compression model by adaptive vector quantization: modified rider optimization algorithm. Sadhana Acad Proc Eng Sci 45(1):1–15
Montiel O, Sepúlveda R, Orozco-Rosas U (2015) Optimal path planning generation for mobile robots using parallel evolutionary artificial potential field. J Intell Rob Syst 79(2):237–257
Mortazavi A (2021) Solving structural optimization problems with discrete variables using interactive fuzzy search algorithm. Struct Eng Mech 79(2):247–265
Yang X-S (2010) Firefly algorithm, stochastic test functions and design optimization. Int J Bioinspir Comput 2(2):78–84
Oliva D, Aziz MAE, Hassanien AE (2017) Parameter estimation of photovoltaic cells using an improved chaotic whale optimization algorithm. Appl Energy 200:141–154
Kaveh A, Ghazzan MI (2017) Enhanced Whale optimization algorithm for sizing optimization of skeletal structures. Mech Based Des Struct Mach 45(3):345–362
Prakash DB, Lakshminarayana C (2016) Optimal siting of capacitors in radial distribution network using whale optimization algorithm. Alex Eng J 56(4):499–509
Reddy PDP, Reddy VCV, Manohar TG (2017) Whale optimization algorithm for optimal sizing of renewable resources for loss reduction in distribution systems. Renew Wind Water Solar 4(1):3
Maeda K, Fukano Y, Yamamichi S, Nitta D, Kurata H (2011) An integrative and practical evolutionary optimization for a complex, dynamic model of biological networks. Bioprocess Biosyst Eng 34(4):433–446
Goldfeld SM, Quandt RE, Trotter HF (1966) Maximization by quadratic hill-climbing. Econometrica 34(3):541–551
Abbasbandy S (2003) Improving Newton–Raphson method for nonlinear equations by modified adomian decomposition method. Appl Math Comput 145(2-3):887–893
Nelder JA, Mead R (1965) A simplex method for function minimization. Comput J 7(4):308–313
Birdi J, Muraleedharan A, D’hooge J, Bertrand A (2021) Fast linear least-squares method for ultrasound attenuation and backscatter estimation. Ultrasonics 116:106503
Liu J, Wang F, Zhao H, Han G (2017) Filtering algorithm and application of fuze echo signal based on LMS principle. J Proj Rock Missiles Guid 37(06):45–47
Yang X-S (2009) Firefly algorithms for multimodal optimization. In: International symposium on stochastic algorithms, pp 169–178
Yang X-S (2010) A new metaheuristic bat-inspired algorithm: nature inspired cooperative strategies for optimization. Springer, Berlin, pp 65–74
Yu H, Zhao N, Wang P, Chen H, Li C (2019) Chaos-enhanced synchronized bat optimizer. Appl Math Model. https://doi.org/10.1016/j.apm.2019.09.029
Jianxun L, Jinfei S, Fei H, Min D, Xiaoya Z (2021) A novel enhanced exploration firefly algorithm for global continuous optimization problems. Eng Comput. https://doi.org/10.1007/s00366-021-01477-6
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69(7):46–61
Heidari AA, Pahlavani P (2017) An efficient modified grey wolf optimizer with Lévy flight for optimization tasks. Appl Soft Comput J 60:115–134
Cai Z, Gu J, Zhang Q, Chen H, Pan Z, Li Y, Li C (2019) Evolving an optimal kernel extreme learning machine by using an enhanced grey wolf optimization strategy. Expert Syst Appl. https://doi.org/10.1016/j.eswa.2019.07.031
Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 27:495–513
Saremi S, Mirjalili S, Lewis A (2017) Grasshopper optimisation algorithm: theory and application. Adv Eng Softw 105:30–47
Passino KM (2002) Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Syst Mag 22:52–67
Mirjalili S (2015) The ant lion optimizer. Adv Eng Softw 83(Sup 1):80–98
Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization: algorithm and applications. Future Gener Comput Syst 97:849–872
Sulaiman MH et al (2020) Barnacles mating optimizer: a new bio-inspired algorithm for solving engineering optimization problems. Eng Appl Artif Intell 87:103330. https://doi.org/10.1016/j.engappai.2019.103330
Chetty S, Adewumi AO (2014) Comparison study of swarm intelligence techniques for the annual crop planning problem. IEEE Trans Evol Comput 18(2):258–268
Fister I, Yang XS, Brest J et al (2015) Analysis of randomisation methods in swarm intelligence. Int J Bioinspir Comput 7(1):36–49
Lalwani S, Kumar R, Deep K (2017) Multi-objective two level swarm intelligence approach for multiple RNA sequence structure alignment. Swarm Evol Comput 34:130–144
Gandomi AH, Alavi AH (2011) Multi-stage genetic programming: a new strategy to nonlinear system modeling. Inf Sci 181(23):5227–5239
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95(5):51–67
Khaled M, Samir S, Abdelghani B (2018) Whale optimization algorithm based optimal reactive power dispatch: a case study of the Algerian power system. Electr Power Syst Res 163(10):696–750
Yu Y, Wang H, Li N et al (2017) Automatic carrier landing system based on active disturbance rejection control with a novel parameters optimizer. Aerosp Sci Technol 69(10):149–160
Huiling C, Yueting X, Mingjing W, Xuehua Z (2019) A balanced whale optimization algorithm for constrained engineering design problems. Appl Math Model 71:45–59
Mohammad T, Mohammad AM, Abushariah NI, Ibrahim A (2019) Improved whale optimization algorithm for feature selection in Arabic sentiment analysis. Appl Intell 49:1688–1707
Ling Y, Zhou Y, Luo Q (2017) Lévy flight trajectory-based whale optimization algorithm for global optimization. IEEE Access 5:6168–6186
Zhou Y, Ling Y, Luo Q (2018) Lévy flight trajectory-based whale optimization algorithm for engineering optimization. Eng Comput. https://doi.org/10.1108/EC-07-2017-0264
Huiling C, Chenjun Y, Ali AH, XueHua Z (2019) An efficient double adaptive random spare reinforced whale optimization algorithm. Expert Syst Appl. https://doi.org/10.1016/j.eswa.2019.113018
The largest whale in the world is 33 meters long and weighs 181 tons. http://www.qnong.com.cn/news/tupian/6174.html. Accessed 25 Sept 2021
Man and nature: how a whale uses ultrasound to become a super hunter. https://haokan.baidu.com/v?pd=wisenatural&vid=12508710492429248227. Accessed 25 Sept 2021
Kang SL, Zong WG (2005) A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput Methods Appl Mech Eng 194:3902–3933
Reynolds AM, Frye MA (2007) Free-flight odor tracking in Drosophila is consistent with an optimal intermittent scale-free search. PLoS One 2:e354
Tizhoosh HR (2005) Opposition-based learning: a new scheme for machine intelligence. International conference on computational intelligence for modelling, Vienna, Austria, pp 695–701.
Digalakis JG, Margaritis KG (2001) On benchmarking functions for genetic algorithms. Int J Comput Math 77:481–506
Yelghi A, Köse C (2018) A modified firefly algorithm for global minimum optimization. Appl Soft Comput 62:29–44
Gandomi AH, Yang X-S, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29(1):17–35
Acknowledgements
The authors thank the Southeast University Library for the academic resources provided. The computing platform was supported by the laboratory of the Nanjing Institute of Technology. This work was supported by the National Natural Science Foundation of China (Grant No. 51705238).
Author information
Contributions
Mr JL: conceptualization, investigation, methodology, validation, software, writing, revision, review and editing, visualization, and formal analysis; Prof JS: supervision, project administration, and funding acquisition; Prof FH: methodology, formal analysis, writing, and revision; Prof MD: software, revision, writing, review, and formal analysis.
Ethics declarations
Conflict of interest
All the authors declare no conflict of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Liu, J., Shi, J., Hao, F. et al. A novel enhanced global exploration whale optimization algorithm based on Lévy flights and judgment mechanism for global continuous optimization problems. Engineering with Computers 39, 2433–2461 (2023). https://doi.org/10.1007/s00366-022-01638-1