Abstract
To address the tendency of the differential evolution algorithm to fall into local optima and converge prematurely, a new differential evolution algorithm with an adaptive population size reduction strategy (APRDE) is proposed. First, to balance the local exploitation and global exploration capabilities of the algorithm in the mutation and crossover operations, a parameter-adaptive tuning scheme based on the hyperbolic tangent function and the Cauchy distribution is proposed to adjust the parameter factors. Second, an ordered mutation strategy is adopted to guide the direction of mutation and enrich the diversity of the population. Last, after each evolutionary iteration, the population size is adaptively reduced according to the error between the fitness values of individuals and the current optimum. The proposed algorithm is compared with 5 other optimization algorithms on 8 typical benchmark functions. The results show that the algorithm achieves substantial improvements in solution accuracy, stability, and convergence speed.
Keywords
- Adaptive differential evolution algorithm
- Parameter tuning scheme
- Ordered mutation strategy
- Population size reduction strategy
1 Introduction
Coal has long been the main energy source in China, accounting for more than 60% of energy consumption, and it will remain the main energy source in China until 2050. Intelligent coal mining is a new stage in the development of comprehensive coal mining technology, and an inevitable requirement of the technological revolution and upgrading of the coal industry [1]. Three-dimensional modeling of coal seams at the fully mechanized mining face is an important foundation for coal enterprises to realize "intelligent management and transparent mining".
Researchers often use kriging interpolation to interpolate unknown regions in space when building 3D models. To address the tendency of kriging interpolation to overfit or underfit when fitting the variogram, this paper proposes a Differential Evolution algorithm with Adaptive Population size Reduction (APRDE) to optimize the kriging interpolation algorithm. Experimental verification shows that the proposed adaptive differential evolution algorithm has higher solution accuracy, faster convergence speed, and better stability.
2 The APRDE Algorithm
The Differential Evolution (DE) algorithm [2] is an effective heuristic search algorithm for solving parameter optimization problems. This paper improves the DE algorithm through a modified mutation strategy, a parameter adaptive adjustment mechanism, and a population reduction strategy.
2.1 Variation Strategy
The mutation and crossover processes are the core of the differential evolution algorithm. The classical mutation formula is:

\(v_{i} = x_{i} + F \cdot (x_{j} - x_{r})\)

where \(v_{i}\) is the mutated individual and \(x_{i}, x_{j}, x_{r}\) are randomly selected individuals from the current population with \(i \ne j \ne r\). To enrich the diversity of the population and improve the local exploitation ability of the algorithm, this paper adopts an ordered mutation strategy [3] in the mutation process:

\(v_{i} = x_{i} + F \cdot (x_{best} - x_{middle}) + F \cdot (x_{middle} - x_{worst})\)

Three individuals randomly selected from the current population are sorted by fitness value to obtain \(x_{best}, x_{middle}, x_{worst}\). Taking the current vector as the base vector helps keep the algorithm from falling into a local optimum or stagnating, while combining the base vector with two ordered difference vectors enriches the diversity of the population and steers the direction of mutation gradually toward the optimal solution.
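The ordered mutation step can be sketched as follows. This is a minimal illustration, not the paper's code: the function name, the population representation, and the exact combination of the base vector with the two ordered difference vectors are our assumptions.

```python
import numpy as np

def ordered_mutation(pop, i, F, rng):
    """Sketch of ordered mutation: pick three distinct random individuals
    (all different from i), sort them by fitness into best/middle/worst,
    and perturb the current vector x_i with two ordered differences."""
    # pop is a list of (vector, fitness) pairs; lower fitness is better
    idx = rng.choice([k for k in range(len(pop)) if k != i], size=3, replace=False)
    trio = sorted((pop[k] for k in idx), key=lambda p: p[1])
    x_best, x_middle, x_worst = (p[0] for p in trio)
    x_i = pop[i][0]
    # base vector plus two ordered difference vectors
    return x_i + F * (x_best - x_middle) + F * (x_middle - x_worst)

rng = np.random.default_rng(0)
pop = [(rng.standard_normal(2), float(k)) for k in range(6)]
v = ordered_mutation(pop, 0, 0.5, rng)
```

Because the difference vectors point from worse toward better individuals, the perturbation tends to move the base vector in an improving direction while still being randomized by the selection of the trio.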
2.2 Parameter Adaptive Adjustment
In the mutation process, the mutation factor F controls the magnitude of the perturbation applied to the base vector. To balance the global exploration and local exploitation abilities of the algorithm, the hyperbolic tangent curve over [−4, 4] [4] is used to control the variation of the mutation factor:

\(F = F_{\min} + (F_{\max} - F_{\min}) \cdot \frac{1 - \tanh (8G/G_{\max} - 4)}{2}\)

where \(F_{\min}\) and \(F_{\max}\) are the minimum and maximum values of the mutation factor, \(G_{\max}\) is the maximum evolutionary generation, and G is the current generation. The hyperbolic tangent curve changes very little at the beginning and the end and approximately linearly in between, so the mutation factor moves smoothly between its maximum and minimum values, striking a balance between global exploration and local exploitation. In addition, the scheduled factor is perturbed for each individual with a Cauchy-distributed random term to enhance the diversity of the mutation vectors and help the search jump out of local optima.
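The schedule above can be sketched as follows. This is a hedged reconstruction: the direction of the schedule (decreasing from \(F_{\max}\) to \(F_{\min}\), i.e. exploration first), the perturbation distribution, and the jitter scale of 0.1 are assumptions, not values stated in the paper.

```python
import math
import random

def mutation_factor(G, G_max, F_min=0.1, F_max=0.9):
    """Scheduled mutation factor driven by tanh over [-4, 4]."""
    # sweep the tanh argument linearly across [-4, 4] as G goes 0 -> G_max
    t = math.tanh(8.0 * G / G_max - 4.0)
    # decreasing schedule: F ~ F_max early (exploration), F ~ F_min late
    return F_min + (F_max - F_min) * (1.0 - t) / 2.0

def perturbed_factor(F_base, scale=0.1):
    """Per-individual jitter around the scheduled value (assumed Cauchy)."""
    # inverse-CDF sampling of a standard Cauchy variate
    F = F_base + scale * math.tan(math.pi * (random.random() - 0.5))
    # clip into a sane range so the difference vectors stay bounded
    return min(max(F, 0.0), 1.0)
```

The flat tails of tanh keep F near its extremes at the very start and end of the run, while the heavy tails of the Cauchy term occasionally produce large jumps that help escape local optima.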
The crossover process of the differential evolution algorithm is:

\(u_{i,j} = \begin{cases} v_{i,j}, & \text{if } rand(0,1) \le CR \text{ or } j = j_{rand} \\ x_{i,j}, & \text{otherwise} \end{cases}\)

The crossover operates on each variable of an individual, where D denotes the number of variables and \(j = 1, \ldots, D\). Each variable of the mutated individual is exchanged with the corresponding variable of the parent individual under the above condition to obtain the trial individual. To match the evolutionary process, this paper decreases the crossover factor linearly over the run:

\(CR = CR_{\max} - (CR_{\max} - CR_{\min}) \cdot G/G_{\max}\)

where the crossover factor ranges over \([CR_{\min}, CR_{\max}]\). The proposed adaptive differential evolution algorithm thus adapts both the mutation and crossover factors during evolution, balancing global exploration and local exploitation in the search process.
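A sketch of the crossover step: the binomial crossover below is the standard DE formulation, while the direction of the linear CR schedule (decreasing over the run) and the default bounds are our assumptions.

```python
import random

def crossover_rate(G, G_max, CR_min=0.1, CR_max=0.9):
    """Linearly decreasing crossover factor (assumed direction)."""
    return CR_max - (CR_max - CR_min) * G / G_max

def binomial_crossover(x, v, CR, rng=random):
    """Standard DE binomial crossover: each variable of the trial vector
    comes from the mutant v with probability CR, and one randomly chosen
    dimension j_rand is always taken from v so the trial differs from x."""
    D = len(x)
    j_rand = rng.randrange(D)
    return [v[j] if (rng.random() < CR or j == j_rand) else x[j]
            for j in range(D)]
```

With CR = 0 exactly one component (the forced j_rand) comes from the mutant, and with CR = 1 the whole mutant is taken, so the schedule smoothly shifts trials from mutant-dominated early on to parent-dominated later.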
2.3 Population Reduction Strategy
Reducing the population during the evolution of the differential evolution algorithm can retain useful individual information, avoid unnecessary computation, and improve convergence speed. This paper proposes a nonlinear population reduction strategy that shrinks the population size along the hyperbolic tangent curve over [−2.5, 4]. Given a predetermined maximum evolutionary generation \(G_{\max}\), the reduction function takes the current evolutionary generation as its independent variable.

In the early stage of evolution, the population is large, which preserves population diversity and improves the global search ability of the algorithm. As evolution proceeds, the best individuals found move closer and closer to the optimal solution; to save computational resources and speed up convergence, individuals far from the optimal solution are removed. In the later stage, the population iterates around the optimal solution, and the population size is kept small so that a careful local search can be performed to locate the optimum within that region.
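One plausible reading of this reduction schedule is sketched below: the tanh argument is swept across [−2.5, 4] and the population shrinks from its initial size to a floor. The exact equation, the initial size, and the minimum size are assumptions for illustration.

```python
import math

def population_size(G, G_max, NP_init=100, NP_min=20):
    """Sketch of a nonlinear population reduction driven by the tanh
    curve over [-2.5, 4]; bounds and normalization are assumptions."""
    # sweep the tanh argument from -2.5 to 4 as G goes 0 -> G_max
    t = math.tanh(-2.5 + 6.5 * G / G_max)
    t0, t1 = math.tanh(-2.5), math.tanh(4.0)
    frac = (t - t0) / (t1 - t0)          # normalized: 0 -> 1 over the run
    return round(NP_init - (NP_init - NP_min) * frac)

sizes = [population_size(g, 100) for g in range(101)]
```

Because the curve starts at −2.5 rather than −4, the flat early plateau is shorter than the late one: the population begins shrinking relatively early but settles gently onto the minimum size for the final local-search phase.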
3 Numerical Experiment and Analysis
The experiments in this paper use a 64-bit Windows 10 operating system with an Intel(R) Core(TM) i5-5200U CPU @ 2.20 GHz and an Intel(R) HD Graphics 5500 GPU. The experimental code is written in Python 3.5.2 and run in PyCharm.
3.1 Experiments Setup
In this paper, we choose five comparison algorithms, namely DE [2], LSHADE (Linear Success-History based Adaptive DE) [5], AGDE (Adaptive Guided Differential Evolution) [6], AMODE (a DE algorithm based on an Adaptive Mutation Operator) [7] and ASVDE (a modified DE algorithm based on Adaptive Secondary Variation) [8]. The comparative analysis is performed on eight typical benchmark functions from CEC2014 [9]: unimodal f1 and f2, simple multimodal f6 and f12, hybrid f17 and f22, and composite functions f24 and f27, as shown in Table 1, where D is the dimensionality of the problem. The search space of these benchmark functions is [−100, 100]; they have many local optima, and their optimal values are greater than 0. The fitness function is therefore defined as f = f(x) − f(x*), where f(x) is the function value computed by the algorithm and f(x*) is the known optimal value of the function. The closer f is to zero, the closer the computed value is to the global optimum. The parameter settings for the comparison experiments are shown in Table 2.
3.2 Comparison of Solution Accuracy and Stability
With the variable settings in Table 2, the six algorithms were each run 21 times on the benchmark functions with dimension D = 30. The average (avg) and standard deviation (std) of the fitness function values were calculated and recorded in Table 3, with the optimal values shown in bold.
As shown in Table 3, the mean values obtained by the APRDE algorithm on the eight functions are 5.51E+02, 0.00E+00, 2.65E−01, 6.00E−01, 2.24E+02, 2.84E+01, 2.22E+02 and 3.20E+02, which are closer to the optimal solutions than those of the other algorithms, indicating that APRDE also performs better on high-dimensional optimization problems. The APRDE algorithm reaches the solution closest to the global optimum on 88% of the functions. The standard deviations obtained by APRDE range from 0.00E+00 to 3.85E+05, the smallest among the compared algorithms, so the algorithm has the best stability.
3.3 Comparison of Convergence Speed
When D = 10, the convergence curves of the six compared algorithms on each function are shown in Fig. 1. As Figs. 1-a and 1-b show, all algorithms exhibit a monotonically decreasing trend on the unimodal functions, and their results lie near the optimal solution. Compared with the other algorithms, APRDE converges fastest, finds the global optimal solution first, and has the highest solution accuracy. On the simple multimodal functions f3 and f4, the fitness values obtained by APRDE keep decreasing over the iterations: on f3 the algorithm converges faster and reaches the global optimal solution first, and on f4 the solution found is closer to the global optimum. Across these four functions, compared with the ASVDE, LSHADE and AGDE algorithms, APRDE not only finds the global optimal solution but also converges faster and more efficiently. On the hybrid and composite functions f5–f8, which have many local optima with large local optimal values, the comparison shows that the convergence curve of APRDE keeps decreasing and its results are smaller and closer to the global optimum.
4 Conclusion
In solving optimization problems, the differential evolution algorithm tends to converge prematurely and fall into a local optimum or stagnation. This paper proposes a Differential Evolution algorithm with an Adaptive Population size Reduction (APRDE) strategy to remedy these shortcomings of the traditional differential evolution algorithm. APRDE introduces a parameter adaptive adjustment mechanism to balance global exploration and local exploitation during the search, and adopts an ordered mutation strategy to enrich population diversity and improve the convergence speed and solution accuracy of the algorithm. In the evolutionary process, a nonlinear population reduction strategy saves computational cost while improving the quality of the results. Compared with the five other optimization algorithms, APRDE converges quickly, obtains the best solution on 88% of the tested functions, and exhibits high stability.
References
Wang, G.F., Zhang, D.S.: Innovation practice and development prospect of intelligent fully mechanized technology for coal mining. J. China Univ. Min. Technol. 47(3), 459–467 (2018)
Storn, R., Price, K.: Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 11(4), 341–359 (1997)
Mohamed, A.W., Hadi, A.A., Jambi, K.M.: Novel mutation strategy for enhancing SHADE and LSHADE algorithms for global numerical optimization. Swarm Evol. Comput. 50, 100455 (2019)
Yan, Q.M., Ma, R.Q., Ma, Y.X.: Adaptive simulated annealing particle swarm optimization algorithm. J. Xidian Univ. 48(4), 120–127 (2021)
Mohamed, A.W., Mohamed, A.K.: Adaptive guided differential evolution algorithm with novel mutation for numerical optimization. Int. J. Mach. Learn. Cybern. 10(2), 253–277 (2017)
Tanabe, R., Fukunaga, A.S.: Improving the search performance of SHADE using linear population size reduction. In: 2014 IEEE congress on evolutionary computation (CEC), pp. 1658–1665. IEEE (2014)
Liao, X., Li, J., Luo, Y.: Differential evolution algorithm based on adaptive mutation operator. Comput. Eng. Appl. 54(6), 128–134 (2018)
Hu, F., Dong, Q., Lv, L.: Modified differential evolution algorithm based on adaptive secondary variation and its application. Comput. Eng. Appl. 38(7), 271–280 (2021)
Liang, J.J., Qu, B.Y., Suganthan, P.N.: Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization. Technical Report, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore (2013)
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Zhang, X., Duan, Z., Liu, Q. (2022). A Differential Evolution Algorithm with Adaptive Population Size Reduction Strategy. In: Yang, S., Lu, H. (eds) Artificial Intelligence and Robotics. ISAIR 2022. Communications in Computer and Information Science, vol 1701. Springer, Singapore. https://doi.org/10.1007/978-981-19-7943-9_15