Abstract
The Moth-Flame Optimization (MFO) algorithm is a recent population-based meta-heuristic for solving global optimization problems. Flame generation and spiral search are two key components that affect the performance of MFO. To improve the diversity of flames and the searching ability of moths, an improved Moth-Flame Optimization (IMFO) algorithm is proposed. The main features of the IMFO are that the flames are generated by orthogonal opposition-based learning (OOBL), and that the position updating mechanism of moths is modified with linear search and a mutation operator. To evaluate the performance of IMFO, the IMFO algorithm is compared with 20 other algorithms on 23 benchmark functions and the IEEE (Institute of Electrical and Electronics Engineers) CEC (Congress on Evolutionary Computation) 2014 benchmark test set. The comparative results show that the IMFO is effective and performs well in terms of jumping out of local optima and balancing exploitation and exploration. Moreover, the IMFO is also used to solve three engineering optimization problems, where it is compared with other well-known algorithms. The comparison results show that the IMFO algorithm improves the global search ability of MFO and effectively solves practical engineering optimization problems.
1 Introduction
Global optimization is important in various scientific problems and engineering applications. In the literature, traditional mathematical methods have been used to solve global optimization problems in various fields [1,2,3]. However, they face great challenges on complex optimization problems, such as multimodal, discontinuous and non-convex problems. Therefore, meta-heuristic algorithms, which imitate biological evolution and insect/bird behaviors, have been proposed and applied to complex optimization problems.
In recent decades, population-based meta-heuristic algorithms, which are simple and easy to apply in various fields, have received wide attention from scholars. Many such algorithms have been proposed, for example, Particle Swarm Optimization (PSO) [4], the Bees Algorithm (BA) [5], the Artificial Bee Colony (ABC) algorithm [6], Differential Evolution (DE) [7], Krill Herd (KH) [8], the Coyote Optimization Algorithm (COA) [9], the Bat Algorithm (BA) [10], the Bacterial Foraging Optimization (BFO) algorithm [11], the Fruit Fly Optimization Algorithm (FOA) [12], the Sine Cosine Algorithm (SCA) [13], Grey Wolf Optimization (GWO) [14], the Flower Pollination Algorithm (FPA) [15], the Spotted Hyena Optimizer (SHO) [16], the Barnacles Mating Optimizer (BMO) [17], the Poor and Rich Optimization algorithm (PRO) [18], the Multi-Verse Optimizer (MVO) [19], the Whale Optimization Algorithm (WOA) [20] and the Moth-Flame Optimization (MFO) algorithm [21].
Among the above algorithms, the Moth-Flame Optimization algorithm, proposed by Seyedali Mirjalili based on the flight behavior of moths in nature [21], has received special attention because of its good search ability. The MFO algorithm works by updating the positions of moths and generating flames. MFO has been widely used because it is simple and easy to apply. For example, in [22], MFO was used to optimize the parameters of a least squares support vector machine (LSSVM), and the MFO-LSSVM forecasting model was established. In [23], MFO was utilized to determine the control parameters of blade pitch controllers. In [24], MFO was used to solve the vehicular ad hoc network clustering problem. In [25], MFO was used to design antenna arrays. In [26], MFO was utilized to realize the optimal link cost in a wireless router network. To improve the control ability of a multiarea hybrid interconnected power system, MFO was used to optimize its controller [27]. Puja Singh and Shashi Prakash have used MFO for the placement of multiple optical network units [28]. In [29], MFO was utilized to address the optimal reactive power dispatch (ORPD) problem.
To solve specific problems, researchers have developed several improved MFO algorithms. For example, in [30], to improve the global searching ability and convergence speed of MFO, differential evolution was used to generate flames and the mechanism by which flames guide moths was modified. To increase population diversity, the Lévy flight strategy was integrated into MFO [31]. In [32], an Enhanced Moth-Flame Optimization was proposed to improve the balance between exploitation and exploration. In [33], Opposition-based Moth Flame Optimization (OMFO) was proposed to solve global optimization problems. In [34], MFO was hybridized with a mutation strategy to solve complex optimization tasks. In [35], a chaotic local search and Gaussian mutation were introduced into MFO to optimize a kernel extreme learning machine (KELM). In [36], an improved Moth Flame Optimization (IMFO) algorithm was proposed to solve real-world optimization problems. In [37], two spiral search mechanisms were proposed to improve the search ability of MFO.
Although the improved MFO algorithms proposed in the above literature enhance the global convergence of MFO to a certain extent, their reported results show that they still risk falling into local optima. To further alleviate this problem and improve MFO's performance, an improved MFO algorithm (IMFO) is proposed in this paper. The main contributions of this paper are summarized as follows:
-
(1)
The OOBL strategy is used to update the best and worst flames in the iterative process, which generates effective flames to guide the moths and thus enhances the exploration ability of the MFO.
-
(2)
A modified position updating mechanism of moths is proposed by introducing linear search and a mutation operator. Linear search and spiral search are integrated to improve convergence efficiency. Meanwhile, the Euclidean distance is used to select between the search strategy and the mutation operator during the iterations. This mechanism preserves population diversity while accelerating convergence.
-
(3)
The performance of the IMFO algorithm is evaluated on 23 benchmark functions and the IEEE CEC 2014 benchmark test set, and is compared with MFO, LMFO, MFO3, CMFO, IMFO2020, COA, CMAES, SSA, SCA, GSA, ABC, PSO, GWO, WOA, PSOGSA, HCLPSO, MVO, HSCA, EWOA, IDA and SHO. Meanwhile, the IMFO is also used to solve three engineering optimization problems, and the results are compared with other well-known algorithms (such as SCA, MMA and GCA).
The rest of the paper is organized as follows: Section 2 introduces the Moth-Flame Optimization algorithm, Opposition-Based Learning (OBL) and Orthogonal Experiment Design (OED). In Section 3, the improved Moth-Flame Optimization algorithm is discussed in detail. Section 4 uses the classical benchmark functions and IEEE CEC 2014 to evaluate the performance of IMFO and compares it with other algorithms. In Section 5, IMFO is used to solve three engineering optimization problems. Section 6 presents the conclusions of this study.
2 Related works
2.1 Moth-flame optimization algorithm
MFO is a population-based optimization algorithm proposed by Seyedali Mirjalili based on the flight behavior of moths in nature [21]. In the search mechanism of MFO, flames and moths exchange information quickly, balancing exploitation and exploration. The following is a brief review of MFO.
The moths population is expressed as:
where n is the number of moths/flames and d is the number of variables (the dimension) of each moth individual.
At the same time, the fitness values of the moths are expressed as:
where OMi denotes the fitness function value of the corresponding moth \(M_{i}=\left [m_{i1}, m_{i2}, \cdots , m_{id}\right ], i = 1,2, \cdots , n\); the fitness function is determined by the problem at hand. MFO initializes the population as follows:
where j = 1,2,⋯ ,d; uj is a random number between 0 and 1; ubj and lbj indicate the upper and lower bounds of the j-th variable, as follows:
where the values of ub and lb are determined by the actual situation.
Correspondingly, the flames and their fitness values are expressed as
The logarithmic spiral is the main form of moths position update in MFO, which is defined as follows:
where l represents the current iteration number; Mi(l + 1) denotes the i-th moth at iteration l + 1; b is a constant, set to 1 here; the range of t is [r, 1], in which r = − 1 + l ∗ (− 1/L), so the moth can converge to any point around the flame by varying t; L is the maximum number of iterations; Di represents the distance between the i-th moth and its corresponding flame. Di can be calculated as follows:
where fno denotes the number of adaptive flames and is calculated:
where round denotes rounding to the nearest integer. This adaptive flame-number mechanism, which reduces the number of flames during the iterations, balances the exploration and exploitation abilities of the MFO. The pseudo code of the MFO is given in Algorithm 1.
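As an illustration, the spiral update and the adaptive flame count can be sketched in a few lines (a minimal Python sketch, not the authors' MATLAB implementation; function names are ours):

```python
import numpy as np

def flame_number(l, L, n):
    # Adaptive flame count: decreases from n towards 1 as iterations proceed.
    return int(round(n - l * (n - 1) / L))

def spiral_move(moth, flame, l, L, b=1.0, rng=np.random):
    # Logarithmic spiral flight towards a flame; t is drawn from [r, 1],
    # where r decreases linearly from -1 to -2 over the run.
    r = -1.0 + l * (-1.0 / L)
    t = (1.0 - r) * rng.random(moth.shape) + r
    D = np.abs(flame - moth)  # distance between the moth and its flame
    return D * np.exp(b * t) * np.cos(2.0 * np.pi * t) + flame
```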
2.2 Opposition-based learning
OBL evaluates the current point and its opposite point simultaneously and keeps the better of the two. See Definition 1 for the basic definition of OBL.
Definition 1 (Opposite number)
[38] Let x ∈ [a, b] be a real number. Its opposite, \( \breve {x}\), is defined as follows: \( \breve{x} = a + b - x \).
Definition 1 can be extended to high-dimensional space.
Definition 2 (Opposite point in the d space)
[38] Let \(x=\left (x_{1}, \dots , x_{d}\right )\) be a point in d-dimensional space and \(x_{i} \in \left [a_{i}, b_{i}\right ], i=1,2, \dots , d\). The opposite of x is defined by \(\breve {x}=\left (\breve {x}_{1}, \dots , \breve {x}_{d}\right )\) as follows: \( \breve{x}_{i} = a_{i} + b_{i} - x_{i}, \; i = 1, 2, \dots, d \).
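Both definitions reduce to a per-coordinate reflection inside the bounding box; a one-function sketch (names are ours):

```python
import numpy as np

def opposite_point(x, a, b):
    # Opposition-based learning: reflect each coordinate x_i inside [a_i, b_i].
    x, a, b = map(np.asarray, (x, a, b))
    return a + b - x
```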
2.3 Orthogonal experiment design
OED selects a small set of representative points from the full factorial combinations for testing. Because of its uniform dispersion and strong comparability, it is widely used in optimization algorithms to improve their performance, for example in GA [39].
An orthogonal table \(L_M(Q^K)\) arranges K factors at Q levels in M combinations [39]. Table 1 shows the example \(L_8(2^7)\).
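A two-level orthogonal table such as \(L_8(2^7)\) can be built from three base columns and their XOR interactions; a sketch of one standard construction (our own code, not taken from [39]):

```python
import itertools
import numpy as np

def l8_2_7():
    # Build L8(2^7): 8 rows are all settings of three base two-level factors;
    # the 7 columns are all non-empty XOR combinations of those factors.
    base = np.array(list(itertools.product([0, 1], repeat=3)))  # 8 x 3
    cols = []
    for mask in range(1, 8):  # non-empty subsets of the three base factors
        picks = [base[:, k] for k in range(3) if mask >> k & 1]
        cols.append(np.bitwise_xor.reduce(picks))
    return np.column_stack(cols)  # 8 rows x 7 columns
```

Every pair of columns contains each level combination equally often, which is exactly the uniform-dispersion property mentioned above.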
3 Improved moth-flame optimization algorithm
In this section, the improved Moth-Flame Optimization (IMFO) algorithm is described. The detailed introduction and discussion of the IMFO are as follows:
3.1 Motivation of improving MFO algorithm
Although the MFO is effective in solving problems with unknown constrained search spaces, it sometimes lacks diversity. In MFO, the flames of the next iteration are generated by selecting the best individuals from the current moths and flames. This realizes rapid information exchange between flames and moths, but it may also reduce the diversity of flames. If the flames fall into a local optimum far from the global optimum, it is difficult for the moths to escape. At the same time, the adaptive flame-guided position updating mechanism reduces the number of flames over the iterations and improves local search ability to a certain extent. However, the exploration and exploitation abilities are not well balanced in the early and late stages of the iterative process.
Therefore, in order to improve the above problems in MFO algorithm, IMFO algorithm is proposed. The strategies in the IMFO are indicated in the following two sections.
3.2 Flames generation by OOBL strategy
To overcome the dimension degradation of the opposite solution, OOBL, which combines OBL and OED, is proposed. Assuming that Table 1 is used for the orthogonal design, the d-dimensional space defines two levels for a flame individual \(F_{i}=\left [f_{i1}, f_{i2}, \cdots , f_{id}\right ]\) and its opposite individual \(\breve F_{i}=\left [\breve f_{i1}, \breve f_{i2}, \cdots , \breve f_{id}\right ]\). Since the individual dimension d is generally larger than the number of factors K, the orthogonal table cannot be used directly. Following [39], the individual dimensions are divided into K subvectors.
where 1 ≤ r1 < r2 < ⋯ < rK − 1 ≤ d.
To improve the diversity of the population and enhance the exploration ability of the MFO algorithm, the optimal flame and the worst flame are selected to generate the OOBL flames. The best flame is selected to improve local search ability, and the worst flame is selected to improve the ability to jump out of local optima. The OOBL flames FM are generated as follows:
where Fbest and Fworst represent the optimal flame and the worst flame; Fbestnew and Fworstnew represent OOBL solutions of the optimal flame and the worst flame; ones(1,d) represents a vector of 1 × d and its elements are all 1; rand is a random number in [0,1], which is chosen to increase randomness and the possibility of jumping out of local optimum; ubbest, lbbest, ubworst and lbworst represent the upper and lower bounds of the optimal flame and the worst flame, respectively.
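The orthogonal mixing step at the heart of OOBL can be sketched as follows. This shows only how a flame and its opposite are combined dimension-group-wise according to the rows of the orthogonal table; the random scaling in (15)–(17) is omitted, and the function names and the passing of the table as an argument are our own:

```python
import numpy as np

def oobl_candidates(flame, lb, ub, oa, k_splits):
    # Mix a flame and its OBL opposite group-wise: one candidate per row of
    # the two-level orthogonal array `oa` (level 0 -> original coordinates,
    # level 1 -> opposite coordinates). The best candidate would then be kept.
    opposite = lb + ub - flame
    groups = np.array_split(np.arange(flame.size), k_splits)
    cands = []
    for row in oa:
        c = flame.copy()
        for level, idx in zip(row, groups):
            if level == 1:
                c[idx] = opposite[idx]
        cands.append(c)
    return np.asarray(cands)
```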
3.3 Modified position updating mechanism of moths
To further improve the convergence speed and the global search ability of the MFO algorithm, a modified position updating mechanism of moths based on a hybrid search strategy and a mutation operator is proposed. The mechanism, introduced into IMFO, is divided into two parts through the Euclidean distance. When a moth is near the best flame, the spiral and linear search mechanism is used to enhance local search; when a moth is far from the best flame, the mutation operator is used to enhance global search. The maximum Euclidean distance in the search space is as follows:
The Euclidean distance between i-th moth and optimal flame in current iteration is as follows:
The modified position updating mechanism of moths is elaborated as follows:
-
(1)
In the iterative search process, when di ≤ w ⋅ DE, linear search mechanism is introduced. The position of moths is updated as follows:
$$ M_{i}(l+1)=\begin{cases} F_{i}-A \cdot D_{i}^{\prime}, & i \leq f_{no} \\ D_{i} \cdot e^{b t} \cdot \cos (2 \pi t)+F_{f_{no}}(l), & i>f_{no} \end{cases} \tag{20} $$

$$ D_{i}^{\prime}=\left|C \cdot F_{i}-M_{i}\right|, \qquad D_{i}=\left|F_{f_{no}}-M_{i}\right| \tag{21} $$

where w is the weight coefficient, set to 0.1 in this paper; A = 2a ⋅ R − a; C = 2 ⋅ R; a = − 1 + l ∗ (− 1/L); R is a random constant in [0,1]. Different places around the flame can be reached from the current position by adjusting the values of A and C.
-
(2)
In the iterative search process, when di > w ⋅ DE, mutation operator is used to update moth position to improve the diversity of the population. The mutation operator is defined as [40]
$$ M_{i}(l+1)=M_{g_{1}}(l)+R_{m} \otimes\left( M_{g_{2}}(l)-M_{g_{3}}(l)\right) \tag{22} $$

where \(M_{g_{1}}\), \(M_{g_{2}}\) and \(M_{g_{3}}\) are randomly selected from M; g1, g2 and g3 are random indices in [1, n], distinct from i; \(R_{m}=\left [q_{1}, q_{2}, \cdots , q_{d}\right ]\) is a random vector whose components qj (j = 1,2,⋯ ,d) are uniformly distributed in [0,1]; ⊗ denotes the Hadamard product.
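A single-moth view of the distance-gated update, switching between (20)–(21) and the mutation operator (22), might look as follows (a simplified sketch; the function and variable names are ours):

```python
import numpy as np

def update_moth(i, M, F, f_best, l, L, fno, lb, ub, w=0.1, b=1.0, rng=np.random):
    # Distance-gated update: near the best flame -> linear/spiral search
    # (Eqs. 20-21); far from it -> differential-style mutation (Eq. 22).
    d_max = np.linalg.norm(ub - lb)          # maximum possible distance D_E
    d_i = np.linalg.norm(M[i] - f_best)
    n, dim = M.shape
    if d_i <= w * d_max:
        a = -1.0 + l * (-1.0 / L)
        R = rng.random(dim)
        A, C = 2.0 * a * R - a, 2.0 * R
        if i < fno:                          # linear search towards own flame
            return F[i] - A * np.abs(C * F[i] - M[i])
        t = (1.0 - a) * rng.random(dim) + a  # spiral around the last flame
        D = np.abs(F[fno - 1] - M[i])
        return D * np.exp(b * t) * np.cos(2.0 * np.pi * t) + F[fno - 1]
    # mutation: three distinct donors, none equal to i (requires n >= 4)
    g1, g2, g3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
    return M[g1] + rng.random(dim) * (M[g2] - M[g3])
```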
Based on the above explanation, the pseudo code of IMFO algorithm is shown in Algorithm 2 and the steps of IMFO are illustrated as follows:
-
Step 1 Parameter initialization
Initialize the number of moths/flames n, the maximum number of iterations L, the number of variables (dimension) per moth d, and the parameters a, t, A, C, w.
-
Step 2 Initialize the positions of moths using (3).
-
Step 3 Calculate the number of flames using (10) and the fitness function values of moths.
-
Step 4 Update flames
The moths of the current iteration, the flames of the previous iteration, and the flames generated by orthogonal opposition-based learning are sorted, and the best n individuals are selected to form the flame population of the current iteration.
-
Step 5 Update moths
Update the positions of moths using (20)–(22).
-
Step 6 Orthogonal opposition-based learning
In the search process, the best flame and the worst flame of each iteration are chosen as the base flame, and the orthogonal reverse flames are formed using (15)–(17).
-
Step 7 Stop or continue the iterative process.
Repeat step 3 to step 6 until the iteration counter l reaches L or the desired fitness function value is reached.
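Assuming a fitness function that evaluates an n × d population matrix row-wise, Steps 1–7 can be condensed into the following loop skeleton, with the strategy details of Steps 5 and 6 injected as callbacks (schematic only, not the authors' implementation; all names are ours):

```python
import numpy as np

def imfo_skeleton(fitness, lb, ub, n, L, update_moths, oobl_flames, rng=np.random):
    # Schematic IMFO main loop following Steps 1-7. `update_moths` stands for
    # Step 5 (Eqs. 20-22) and `oobl_flames` for Step 6 (Eqs. 15-17, built in
    # the paper from the best and worst flames); both are injected so this
    # sketch stays independent of their exact equations.
    dim = lb.size
    M = lb + rng.random((n, dim)) * (ub - lb)      # Step 2: init moth positions
    F, OF = M.copy(), fitness(M)                   # first flames = sorted moths
    order = np.argsort(OF)
    F, OF = F[order], OF[order]
    for l in range(1, L + 1):
        fno = int(round(n - l * (n - 1) / L))      # Step 3: adaptive flame count
        pool = np.vstack([M, F, oobl_flames(F)])   # Steps 4+6: moths, flames, OOBL
        pf = fitness(pool)
        best = np.argsort(pf)[:n]                  # keep the n best as new flames
        F, OF = pool[best], pf[best]
        M = np.clip(update_moths(M, F, fno, l, L), lb, ub)  # Step 5
    return F[0], OF[0]
```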
3.4 Computational complexity of IMFO
The computational complexity of an algorithm represents the resources it consumes. The computational complexity of the MFO algorithm depends on population initialization, position updating, position evaluation and the sorting mechanism. The complexity of population initialization is O(n × D); the complexity of position updating is O(L × n × D); the complexity of position evaluation is O(L × n × D); and the sorting mechanism uses quicksort, whose worst-case complexity is O(L × n²). The total computational complexity of MFO is therefore O(n × D) + O(L × (n² + 2n × D)). In IMFO, the orthogonal opposition-based learning strategy additionally updates the best and worst flames, so the complexity of position updating is O(L × (n + 14) × D), and the complexity of position evaluation is O(L × (n + 14) × D). The final computational complexity of IMFO is thus O(n × D) + O(L × (n² + 2(n + 14) × D)). IMFO therefore has the same order of complexity as MFO; both are dominated by the O(L × n²) sorting term.
4 Simulation study and discussion
Compared with the MFO algorithm, the IMFO algorithm proposed in this paper is more effective in exploring for global optimal solutions and exploiting promising regions of the search space. In this section, to verify the performance and advantages of the IMFO algorithm, the 23 benchmark functions and the CEC 2014 benchmark set are used. All simulation experiments are conducted on a Windows 10 Professional system with an Intel(R) Core(TM) i3-6100 CPU (3.70 GHz) and 4.0 GB of RAM, using MATLAB R2014a.
4.1 Experimental comparisons on classical benchmark set
The 23 classical benchmark problems, which are used to test the performance of the IMFO, are divided into three groups: unimodal problems (F1-F7), multi-modal problems (F8-F13), and fixed-dimension multi-modal problems (F14-F23). The details of the classical benchmark set are given in Table 2. In addition, the performance of the IMFO algorithm is compared with other algorithms: MFO [21], LMFO [31], MFO3 [37], CMFO [41], IMFO2020 [42], COA [9], CMAES [43], SSA [40], SCA [13], GSA [44], ABC [6], PSO [4], GWO [14], PSOGSA [45], HCLPSO [46], WOA [20], HSCA [47], EWOA [48], IDA [49], SHO [16]. The parameter settings of the 20 optimization algorithms are given in Table 3, and each algorithm runs 30 times independently on each test problem. The results on the 23 benchmark problems are given in Tables 4, 5, 6, 7, 8, 9, 10 and 11.
4.1.1 The influence of improving mechanism on MFO
In this section, the different mechanisms of the proposed method are simulated and analyzed. Among the variants, MLMFO improves MFO with the mutation operator and the linear update mechanism; OOBLMFO improves MFO with orthogonal opposition-based learning; LMFO improves MFO with the linear update mechanism; MMFO improves MFO with the mutation operator; OOBLLMFO improves MFO with orthogonal opposition-based learning and the linear update mechanism; OOBLMMFO improves MFO with orthogonal opposition-based learning and the mutation operator. In the simulations of this section, the population size, dimension and number of iterations are 30, 30 and 500, respectively, and each algorithm runs 30 times independently on each test function. The simulation results are given in Table 4.
As can be seen from Table 4, MFO variants are superior to MFO algorithm on most test functions in Mean and Std indexes. In order to intuitively analyze the convergence performance of the algorithm, Fig. 1 shows the convergence graphs of eight algorithms on six test functions. It can be seen from the figure that the IMFO algorithm has advantages in convergence accuracy and convergence speed. In conclusion, the IMFO selected in this paper is the best of the above variant algorithms.
To further analyze the performance of IMFO, the Wilcoxon signed-rank test [50] is used to compare IMFO with the other variants. The test results are given in Table 5 and are indicated by '1/0/-1'. It can be seen from Table 5 that, on the 23 test functions, IMFO is superior to MFO on 19 functions and has the same performance as MFO on 4 functions. Accordingly, IMFO achieves good results compared with the other variants. At the same time, the Friedman test [34] is used to rank the above eight algorithms. It can be seen in Table 5 that the average ranking (Ave) of IMFO is the best.
To sum up, IMFO shows the best performance among all variants.
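The statistical machinery used in this comparison (the Wilcoxon signed-rank test on paired run results and the Friedman test across algorithms) is available in SciPy; a sketch on purely illustrative synthetic data (not the paper's results):

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

# Illustrative data only: pretend best-fitness values from 30 independent runs.
rng = np.random.default_rng(1)
imfo_runs = rng.normal(0.0, 0.1, 30)
mfo_runs = rng.normal(0.5, 0.1, 30)
other_runs = rng.normal(1.0, 0.1, 30)

# Pairwise comparison: Wilcoxon signed-rank test on the paired run results.
stat, p = wilcoxon(imfo_runs, mfo_runs)
# '1' in the 1/0/-1 notation: significantly better; '-1' would be significantly worse.
score = 1 if (p < 0.05 and imfo_runs.mean() < mfo_runs.mean()) else 0

# Overall ranking across three or more algorithms: Friedman test.
fstat, fp = friedmanchisquare(imfo_runs, mfo_runs, other_runs)
```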
4.1.2 Analysis on classical benchmark set
To verify the performance of IMFO, the IMFO and MFO algorithms are simulated and analyzed on the 23 classical test problems in this subsection. Tables 6–9 give the Best, Worst, Mean and Std values of each algorithm over 30 independent runs. On F1-F13, three dimensions (10, 30 and 50) are selected to evaluate the performance of IMFO and MFO. The other simulation parameters are as shown in Table 3.
In the unimodal problems (F1-F7), there is only one global optimal solution and no local optima. Consequently, F1-F7 are used to estimate the convergence rate and exploitation ability of the IMFO. The simulation results of the IMFO and MFO algorithms are given in Tables 6–8, where it can be observed that the IMFO algorithm is better than the MFO algorithm in Best, Worst, Mean and Std. Therefore, the results on F1 to F7 show that the exploitation ability and convergence rate of IMFO in the search space are effective.
The multi-modal problems (F8-F13) have many local optima and are used to estimate the exploration ability of the IMFO. The comparison results for F8 to F13 are given in Tables 6–8. For F9 and F11, IMFO can find the best solution with good stability. For F8, F10, F12 and F13, the results of IMFO are better than those of MFO in the Mean and Std indexes. Therefore, the results show that the exploration ability of IMFO is better than that of MFO.
Fixed-dimension multi-modal problems (F14-F23) have fewer dimensions and fewer local optima. They are used to test the balance between the exploration and exploitation abilities of the IMFO. In Table 9, for F14, F15, F21, F22 and F23, IMFO is superior to MFO. For F20, the Mean value of IMFO is not as good as that of MFO, but the Std of IMFO is smaller. For F16, F17, F18 and F19, IMFO and MFO have similar accuracy.
Therefore, by analyzing the comparison results in Tables 6–9, it can be concluded that IMFO algorithm is effective in searching global optimal solution.
To better illustrate the convergence performance of the IMFO algorithm, the convergence curves in dimension 30 are drawn in Fig. 2, covering unimodal, multi-modal and fixed-dimension multi-modal problems. In these plots, the ordinate is the mean objective function value over 30 independent runs, and the abscissa is the iteration number. From Fig. 2, it can be seen that the convergence speed and accuracy of IMFO are better than those of MFO on most test functions.
4.1.3 Comparison with other algorithms
To further evaluate the performance of the IMFO, 20 optimization algorithms have been selected and compared with IMFO. The parameter settings of these algorithms used in this paper are shown in Table 3. Table 10 gives the comparative results between IMFO algorithm and MFO, LMFO, MFO3, CMFO, IMFO2020, COA, CMAES, SSA, SCA, GSA, ABC, PSO, GWO, PSOGSA, HCLPSO, WOA, HSCA, EWOA, IDA algorithms. In the table, each algorithm still runs 30 times independently in each classical test problem.
For F1, F2, F4, F9 and F11, the IMFO obtains the best solution. For F3, the IMFO obtains better results than the other algorithms. On the other test problems, IMFO achieves better simulation results in most cases. At the same time, Fig. 3 shows the convergence curves of the 20 algorithms on six test problems. It can be seen from this figure that IMFO has better convergence accuracy and speed on these six test problems than the other algorithms.
Meanwhile, the Wilcoxon signed-rank test and the Friedman test are used to analyze the proposed algorithm; the results are given in Table 11. They show that IMFO is better than the other 19 algorithms on most test problems, and that the average ranking (Ave) of IMFO is the best.
Therefore, compared with other algorithms, IMFO algorithm has good competitiveness and better performance in convergence speed, search accuracy, robustness and jumping out of local optimum.
4.2 Experimental comparisons on CEC 2014
To further test the performance of the IMFO algorithm, the CEC 2014 test set [51] is used as a second set of test problems. The CEC 2014 test problems are more difficult than the classical ones. The set consists of four groups: (1) unimodal (f1-f3); (2) multimodal (f4-f16); (3) hybrid (f17-f22); (4) composite (f23-f30) problems. In this experiment, the individual dimension is 30, the population size is 30 and the maximum number of iterations is 10000. The results on these test problems are given in Tables 12, 13 and 14.
4.2.1 Analysis on CEC 2014
f1-f3 are the unimodal problems. Table 12 shows that IMFO can effectively exploit the search space: the performance of IMFO is better than that of MFO on all four indicators. Therefore, the modified position updating strategy and the orthogonal opposite flame generation are effective at exploiting the search space.
f4 to f16 are multimodal problems. As can be seen in Table 12, IMFO is superior to MFO on all indicators except the Std of f16. For f16, although the Std of IMFO is larger than that of MFO, the Mean, Best and Worst indices of IMFO are better than those of MFO. This shows that IMFO has better exploration ability than MFO. The results indicate that modifying the moth position update mechanism with the mutation operator and the Euclidean distance helps improve the exploration ability of the MFO. Therefore, IMFO is more effective than MFO in exploration capability.
f17-f22 and f23-f30 are the hybrid and composite test problems of the CEC 2014 test set, respectively. As can be seen from Table 12, IMFO performs better than MFO on test problems f17 to f30 and has better global search ability.
4.2.2 Comparison with other algorithms
In this subsection, the IMFO algorithm is compared with the MFO, LMFO, MFO3, CMFO, IMFO2020, SSA, SCA, PSO, WOA, HSCA, EWOA, IDA, MVO and SHO algorithms on the CEC 2014 test problems. Table 13 shows the results of the compared algorithms on the test functions, with 30 dimensions and 10000 iterations. From Table 13, it can be observed that the IMFO is strongly competitive with the other algorithms in the Mean and Std indexes. The results show that the IMFO can obtain competitive solutions by introducing OOBL and the modified moth position update strategy.
To confirm the effectiveness of the IMFO improvements, the Wilcoxon signed-rank test and the Friedman test are again applied, with the parameter settings consistent with those for the classical benchmark set. The statistical conclusions are shown in Table 14, with the same symbols as described in Section 4.1.1. In Table 14, the results of IMFO are better than those of the other algorithms on most test problems, and the average ranking (Ave) of IMFO is the best.
5 Engineering optimization problems
In the above part, the performance of IMFO is simulated on classical test set and standard CEC2014 test set. In this section, to verify the ability of the IMFO to solve practical problems, the IMFO is used to optimize three practical engineering optimization problems.
5.1 Pressure vessel design problem
This problem is illustrated in Fig. 4 [21]. Its purpose is to minimize the fabrication cost of a cylindrical pressure vessel. There are four parameters to be optimized in Fig. 4: the thickness of the head Th, the thickness of the shell Ts, the length of the cylindrical shell L and the inner radius R. The mathematical expressions of this problem are as follows [21, 52, 53]:
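For reference, the standard literature formulation of this problem (cost function and four constraints over x = (Ts, Th, R, L), as commonly stated in e.g. [21]) can be coded directly; a sketch, with feasibility meaning every g ≤ 0:

```python
import math

def pressure_vessel_cost(x):
    # Welded-vessel fabrication cost; x = (Ts, Th, R, L) in inches.
    ts, th, r, l = x
    return (0.6224 * ts * r * l + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l + 19.84 * ts ** 2 * r)

def pressure_vessel_constraints(x):
    # Thickness code limits, minimum volume of 1,296,000 in^3, and L <= 240 in.
    ts, th, r, l = x
    return [
        -ts + 0.0193 * r,
        -th + 0.00954 * r,
        -math.pi * r ** 2 * l - (4.0 / 3.0) * math.pi * r ** 3 + 1296000.0,
        l - 240.0,
    ]
```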
Table 15 shows the comparison results of the IMFO algorithm and other heuristic optimization algorithms for this problem, including the values of each variable and the corresponding results. From Table 15, the results show that IMFO obtains a lower cost than the other optimization algorithms.
5.2 Cantilever beam design problem
This problem is shown in Fig. 5 [21], and its purpose is to get a set of values to minimize the weight of the cantilever. There are five parameters to be optimized: the side length xi (\(i=1, 2, \dots , 5\)) of cross section (square) of the five different beams. The mathematical expressions of this problem are as follows [21, 47, 52]:
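The standard formulation of this problem minimizes a linear weight subject to a single displacement constraint; a sketch (coefficients follow the usual literature formulation):

```python
def cantilever_weight(x):
    # Weight of five hollow square beam sections (standard formulation, cf. [21]).
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    # Vertical-displacement constraint; feasible iff g <= 0.
    c = (61.0, 37.0, 19.0, 7.0, 1.0)
    return sum(ci / xi ** 3 for ci, xi in zip(c, x)) - 1.0
```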
Table 16 shows the comparison results of the IMFO algorithm and other heuristic optimization algorithms for this problem, including the values of each variable and the corresponding results. From Table 16, the results show that IMFO obtains a lighter design than the other optimization algorithms.
5.3 Tension/compression spring design problem
This optimization problem is given in Fig. 6 [14]. Its purpose is to minimize the weight of the spring and hence its manufacturing cost. There are three parameters to be optimized in Fig. 6: d indicates the wire diameter; D indicates the mean coil diameter; N indicates the number of active coils. In addition, the mathematical expressions of this problem are as follows [21, 55, 59, 60]:
where \(0.05 \leqslant x_{1} \leqslant 2.00\), \(0.25 \leqslant x_{2} \leqslant 1.30\), \(2.00 \leqslant x_{3} \leqslant 15.0\).
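With x = (d, D, N) and the bounds above, the standard formulation can be sketched as follows (constants follow the usual literature formulation; feasibility means every g ≤ 0):

```python
def spring_weight(x):
    # Spring weight; x = (wire diameter d, mean coil diameter D, active coils N).
    d, D, N = x
    return (N + 2.0) * D * d ** 2

def spring_constraints(x):
    # Deflection, shear stress, surge frequency and geometry limits.
    d, D, N = x
    return [
        1.0 - D ** 3 * N / (71785.0 * d ** 4),
        (4.0 * D ** 2 - d * D) / (12566.0 * (D ** 3 * d - d ** 4))
            + 1.0 / (5108.0 * d ** 2) - 1.0,
        1.0 - 140.45 * d / (D ** 2 * N),
        (d + D) / 1.5 - 1.0,
    ]
```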
Table 17 shows the comparison results of the IMFO algorithm and other heuristic optimization algorithms for this problem, including the values of each variable and the corresponding results. From Table 17, the results show that IMFO obtains a lighter design than the other optimization algorithms.
6 Conclusions
In this paper, an improved Moth-Flame Optimization algorithm is proposed to solve global optimization problems. The IMFO is mainly realized by a flame generation strategy and a modified position update mechanism of moths. The flame generation strategy uses OOBL to generate effective flames to guide the moths, which improves the ability of the MFO algorithm to jump out of local optima and enhances its exploration ability. The modified position updating mechanism of moths is based on linear/spiral search and a mutation operator, which helps to increase the convergence speed of the IMFO. To verify the effectiveness of IMFO, it has been compared with MFO, LMFO, MFO3, CMFO, IMFO2020, SSA, SCA, GSA, ABC, PSO, GWO, WOA, PSOGSA, HCLPSO, MVO, HSCA, EWOA, IDA and SHO on the 23 benchmark functions and the CEC 2014 benchmark test functions. The comparative results show that the IMFO algorithm is effective, accurate and stable. In addition, the IMFO has also been compared with other well-known algorithms (such as SCA, MMA and GCA) on three practical engineering optimization problems, where it obtains competitive solutions. Therefore, the comparative analysis in this paper shows that the IMFO algorithm overcomes some shortcomings of the MFO algorithm and can be used for both classical test problems and practical engineering optimization.
References
Boubezoul A, Paris S (2012) Application of global optimization methods to model and feature selection. Pattern Recogn 45(10):3676–3686
Sra S, Nowozin S, Wright SJ (2011) Optimization for machine learning. The MIT Press, Cambridge
Pasandideh SHR, Niaki STA, Gharaei A (2015) Optimization of a multiproduct economic production quantity problem with stochastic constraints using sequential quadratic programming. Knowl-Based Syst 84:98–107
Eberhart RC, Shi Y (2002) Particle swarm optimization: developments, applications and resources. In: Congress on evolutionary computation, vol 1, pp 81–86
Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Nature inspired cooperative strategies for optimization (NICSO 2010). Studies in Computational Intelligence, vol 284. Springer, pp 65–74
Karaboga D, Basturk B (2007) A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. J Glob Optim 39(3):459–471
Das S, Suganthan P (2011) Differential evolution: a survey of the state-of-the-art. IEEE Trans Evol Comput 15(1):4–31
Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun Nonlinear Sci Numer Simul 17(12):4831–4845
Pierezan J, Coelho LDS (2018) Coyote optimization algorithm: a new metaheuristic for global optimization problems. In: 2018 IEEE Congress on evolutionary computation (CEC), pp 1–8
Yang X, Gandomi AH (2012) Bat algorithm: a novel approach for global engineering optimization. Eng Comput 29(5):464–483
Sharma V, Pattnaik SS, Garg T (2012) A review of bacterial foraging optimization and its applications. Procedia - Social and Behavioral Sciences 48(1):1294–1303
Pan WT (2012) A new fruit fly optimization algorithm: taking the financial distress model as an example. Knowl-Based Syst 26(2):69–74
Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl-Based Syst 96:120–133
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69(3):46–61
Yang XS (2012) Flower pollination algorithm for global optimization. In: Unconventional computation and natural computation. Springer, pp 240–249
Dhiman G, Kumar V (2017) Spotted hyena optimizer: a novel bio-inspired based metaheuristic technique for engineering applications. Adv Eng Softw 114:48–70
Sulaiman MH, Mustaffa Z, Saari MM, Daniyal H (2020) Barnacles mating optimizer: a new bio-inspired algorithm for solving engineering optimization problems. Eng Appl Artif Intel 87:103330
Moosavi SHS, Bardsiri VK (2019) Poor and rich optimization algorithm: a new human-based and multi populations algorithm. Eng Appl Artif Intel 86:165–181
Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput and Applic 27(2):495–513
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
Mirjalili S (2015) Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm. Knowl-Based Syst 89:228–249
Li C, Li S, Liu Y (2016) A least squares support vector machine model optimized by moth-flame optimization algorithm for annual power load forecasting. Appl Intell 45(4):1166–1178
Ebrahim MA, Becherif M, Abdelaziz AY (2018) Dynamic performance enhancement for wind energy conversion system using moth-flame optimization based blade pitch controller. Sustain Energy Technol Assess 27:206–212
Ishtiaq A, Ahmed S, Khan MF, Aadil F, Khan S (2019) Intelligent clustering using moth flame optimizer for vehicular ad hoc networks. Int J Distrib Sensor Netw 15(1):1–13
Das A, Mandal D, Ghoshal SP, Kar R (2018) Moth flame optimization based design of linear and circular antenna array for side lobe reduction. Int J Numer Model Electron Netw Devices Fields 32(7):1–15
Mittal N (2018) Moth flame optimization based energy efficient stable clustered routing approach for wireless sensor networks. Wirel Pers Commun 1:1–18
Mohanty B, Acharyulu BVS, Hota PK (2018) Moth-flame optimization algorithm optimized dual-mode controller for multiarea hybrid sources agc system. Optim Control Applic Methods 39(4):720–734
Singh P, Prakash S (2017) Optical network unit placement in Fiber-Wireless (FiWi) access network by Moth-Flame optimization algorithm. Opt Fiber Technol 36:403–411
Mei RNS, Sulaiman MH, Mustaffa Z, Daniyal H (2017) Optimal reactive power dispatch solution by loss minimization using moth-flame optimization technique. Appl Soft Comput 59:210–222
Li C, Niu Z, Song Z, Li B, Fan J, Liu PX (2018) A double evolutionary learning moth-flame optimization for real-parameter global optimization problems. IEEE Access 6:76700–76727
Li Z, Zhou Y, Zhang S, Song J (2016) Lévy-flight moth-flame algorithm for function optimization and engineering design problems. Math Probl Eng 2016:1–22
Xu L, Li Y, Li K, Beng GH, Jiang Z, Wang C, Liu N (2018) Enhanced moth-flame optimization based on cultural learning and gaussian mutation. J Bionic Eng 15(4):751–763
Sapre S, Mini S (2019) Opposition-based moth flame optimization with Cauchy mutation and evolutionary boundary constraint handling for global optimization. Soft Comput 23(15): 6023–6041
Xu Y, Chen H, Luo J, Zhang Q, Jiao S, Zhang X (2019) Enhanced moth-flame optimizer with mutation strategy for global optimization. Inform Sci 492:181–203
Xu Y, Chen H, Heidari AA, Luo J, Zhang Q, Zhao X, Li C (2019) An efficient chaotic mutative moth-flame-inspired optimizer for global optimization tasks. Expert Syst Applic 129:135–155
Taher MA, Kamel S, Jurado F, Ebeed M (2019) An improved moth-flame optimization algorithm for solving optimal power flow problem. Int Trans Electric Energy Syst 29(3):1–28
Tolan G, Khorshid MMH, Abou-El-Enien T (2016) Modified moth-flame optimization algorithms for terrorism prediction. Int J Appl Innov Eng Manag (IJAIEM) 5:47–58
Mahdavi S, Rahnamayan S, Deb K (2018) Opposition based learning: a literature review. Swarm Evol Comput 39:1–23
Zhang Q, Leung YW (1999) An orthogonal genetic algorithm for multimedia multicast routing. IEEE Trans Evol Comput 3(1):53–62
Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191
Wang M, Chen H, Yang B, Zhao X, Hu L, Cai Z, Huang H, Tong C (2017) Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing 267(6):69–84
Lin GQ, Li LL, Tseng ML, Liu HM, Tan RR (2020) An improved moth-flame optimization algorithm for support vector machine prediction of photovoltaic power generation. J Clean Prod 253:119966
Hansen N, Ostermeier A (2001) Completely derandomized self-adaptation in evolution strategies. Evol Comput 9(2):159–195
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inform Sci 179:2232–2248
Mirjalili S, Hashim SZM (2010) A new hybrid PSOGSA algorithm for function optimization. In: 2010 International conference on computer and information application (ICCIA), pp 374–377
Lynn N, Suganthan PN (2015) Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm Evol Comput 24:11–24
Gupta S, Deep K (2019) A novel hybrid sine cosine algorithm for global optimization and its application to train multilayer perceptrons. Appl Intell, 1–34
Qais MH, Hasanien HM, Alghuwainem S (2020) Enhanced whale optimization algorithm for maximum power point tracking of variable-speed wind generators. Appl Soft Comput J 86:1–14
Li LL, Zhao X, Tseng ML, Tan RR (2020) Short-term wind power forecasting based on support vector machine with improved dragonfly algorithm. J Clean Prod 242:1–12
Derrac J, Garcia S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol Comput 1(1):3–18
Liang J, Qu B, Suganthan P (2013) Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization. Technical Report, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China, and Nanyang Technological University, Singapore
Gupta S, Deep K (2018) Improved sine cosine algorithm with crossover scheme for global optimization. Knowl-Based Syst 165:374–406
Kaveh A, Talatahari S (2010) An improved ant colony optimization for constrained engineering design problems. Eng Comput 27(1):155–182
Mezura-Montes E, Coello CAC (2008) An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int J Gen Syst 37(4):443–473
He Q, Ling W (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intel 20(1):89–99
Sandgren E (1990) Nonlinear integer and discrete programming in mechanical design. J Mech Des 112(2):223–229
Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29(1):17–35
Chickermane H, Gea H (1996) Structural optimization using a new local approximation method. Int J Numer Methods Eng 39:829–846
Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41(2):113–127
Li LJ, Huang ZB, Liu F, Wu QH (2007) A heuristic particle swarm optimizer for optimization of pin connected structures. Comput Struct 85(7):340–349
Acknowledgments
Project supported by the National Natural Science Foundation of China (Grant Numbers 61873226 and 61803327) and the Natural Science Foundation of Hebei Province (Grant Numbers F2017203304, F2019203090 and F2020203018).
Zhao, X., Fang, Y., Liu, L. et al. An improved moth-flame optimization algorithm with orthogonal opposition-based learning and modified position updating mechanism of moths for global optimization problems. Appl Intell 50, 4434–4458 (2020). https://doi.org/10.1007/s10489-020-01793-2