Abstract
We introduce two strategies into the guided fireworks algorithm (GFWA) to further improve its performance by generating one or more weight-based guiding spark individuals for each firework individual. The first strategy assigns different weights to the spark individuals of each firework individual according to their fitness and then calculates one or more guiding vectors to guide the firework individual toward promising directions. The second strategy decides the number of weight-based guiding spark individuals dynamically based on the evolution of a firework individual: if a firework individual survives into the next generation without evolving, the strategy reduces the number of spark individuals generated around it and generates the same number of additional weight-based guiding spark individuals. We design a controlled experiment to evaluate our proposal using the CEC 2013 benchmark functions in five different dimensions. The experimental results confirm that the proposed strategies provide effective guidance information and improve GFWA performance significantly, with a more pronounced acceleration effect on higher-dimensional tasks.
This work was supported in part by Grant-in-Aid for Scientific Research (17H06197, 18K11470, 19J11792).
1 Introduction
The fireworks algorithm (FWA) [1] is a relatively recent member of the evolutionary computation family; it repeatedly simulates the explosion process of real fireworks to find the global optimum. Many powerful variants of FWA have emerged by incorporating various effective search mechanisms, such as the enhanced FWA (EFWA) [2], dynamic FWA (dynFWA) [3], adaptive FWA (AFWA) [4], guided FWA (GFWA) [5] and others [6,7,8,9]. Thanks to their favorable characteristics, these variants have also successfully solved many complex real-world applications, including multilevel image thresholding [10], RFID network planning [11] and privacy preserving [12]. Although they have achieved gratifying results, there is still plenty of room to further improve FWA performance.
The primary objective of this paper is to propose a new type of weight-based guiding spark individual to accelerate the convergence of FWA. The first strategy gives different weights to spark individuals to generate the proposed weight-based guiding spark individuals, and the second strategy decides the number of guiding spark individuals dynamically, whereas GFWA always uses only one. The secondary objective is to analyze the effect and applicability of our proposal and to point out some open topics for discussion.
Following this introductory section, we briefly summarize the optimization principles of FWA and give a short introduction to GFWA in Sect. 2. The two proposed strategies are described in detail in Sect. 3. We evaluate the performance of our proposal using 28 benchmark functions in 5 different dimensions in Sect. 4. Finally, we discuss several topics arising from the evaluation results in Sect. 5 and conclude our work in Sect. 6.
2 Optimization Mechanisms of Fireworks Algorithm
A real firework launched into the sky produces many sparks around it, which can be viewed as a local search pattern around a specific point. Inspired by this explosion process, FWA assigns a different explosion amplitude and number of generated spark individuals to each firework individual to balance exploitation and exploration. The explosion process is repeated until a termination condition is satisfied. Fig. 1 shows the general framework of FWA, which consists of three major operations: explosion, mutation and selection.
GFWA is one of the most powerful variants of FWA. Its core idea is to divide spark individuals into two groups according to their fitness, determine a guiding vector from the centroid of the poorer group to that of the better group, and evolve a firework individual in the guiding direction (Fig. 2(a)). However, an incorrect guiding vector may hinder the convergence of a firework individual, as shown in Fig. 2(b). Since GFWA itself is not our focus, we refer the reader to [5] for the detailed implementation. We propose two strategies that avoid poor guidance by generating multiple high-precision guiding vectors.
3 Two Proposed Strategies for GFWA
We propose two strategies to further improve the GFWA performance by introducing the concept of weights and generating multiple potential guiding vectors. The first strategy, the weight-based guiding strategy, assigns different weights to generated spark individuals according to their fitness, which is expected to find a more effective guiding direction. The second strategy, the quantitative increase strategy, may increase the number of weight-based guiding spark individuals, based on previous searches, to avoid stagnating in a local area.
3.1 Weight-Based Guiding Strategy
The top \(\sigma \) fraction of the spark individuals generated by a firework individual \(\varvec{x}^i\), ranked by fitness, are copied into a pool, and \(\hat{N}\) spark individuals are randomly selected from the pool to calculate a guiding vector; i.e. we obtain \(\hat{N}\) vectors from the firework individual to the selected spark individuals.
The next problem is how to assign the weights. Although many weighting methods are possible, in this paper we simply use the fitness difference between the firework individual and each selected spark individual, so that the more promising a direction is, the more weight it receives. The i-th guiding spark individual \(\varvec{g}^i\) is then calculated by weighting these vectors using Eq. (1), and the i-th guiding vector is defined as the vector from the i-th firework individual \(\varvec{x}^i\) to the guiding spark individual \(\varvec{g}^i\).
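Eq. (1) did not survive extraction here; assuming minimization and weights normalized over the selected sparks, one form consistent with the fitness-difference weighting just described is

\( \varvec{g}^i = \varvec{x}^i + \sum _{j=1}^{\hat{N}} w^i_j \left( \varvec{s}^i_j - \varvec{x}^i \right) , \quad w^i_j = \frac{\max \left( f(\varvec{x}^i) - f(\varvec{s}^i_j),\, 0 \right) }{\sum _{k=1}^{\hat{N}} \max \left( f(\varvec{x}^i) - f(\varvec{s}^i_k),\, 0 \right) } \qquad (1) \)

where the \(\max (\cdot , 0)\) term sets the weight of a spark individual worse than the firework individual to 0, matching the notes below.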
where \(\varvec{s}^{i}_{j}\) is the j-th spark individual generated by the i-th firework individual \(\varvec{x}^i\) (\(1 \le j \le \hat{N}\)), and f() is a fitness function.
Note that
-
if the i-th firework individual does not survive into the next generation, the pool is cleared. Otherwise, the pool is kept, and newly generated better spark individuals are added to it until the upper limit is reached. Once the pool is full, newcomers replace the poorest individuals in the pool.
-
\(\hat{N}\) is less than the pool size.
-
if the j-th spark individual \(\varvec{s}^i_j\) is worse than the i-th firework individual \(\varvec{x}^i\), the weight of the corresponding vector \(\varvec{s}^i_j - \varvec{x}^i\) is set to 0.
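The strategy above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the function and parameter names are ours, minimization is assumed, and the weights are taken to be fitness improvements clipped at zero and normalized, matching the zero-weight rule in the notes.

```python
import random

def weight_based_guiding_spark(firework, f_firework, pool, n_hat, fitness):
    """Compute one weight-based guiding spark (minimization assumed).

    firework   : position of the firework individual x^i (list of floats)
    f_firework : fitness of the firework individual
    pool       : positions of top-sigma spark individuals kept so far
    n_hat      : number of sparks sampled from the pool
    fitness    : callable returning the fitness of a position
    """
    selected = random.sample(pool, n_hat)
    # Weight of each firework->spark vector: fitness improvement, clipped at 0
    # so sparks worse than the firework contribute nothing (weight 0).
    diffs = [max(f_firework - fitness(s), 0.0) for s in selected]
    total = sum(diffs)
    if total == 0.0:                 # no selected spark improves on the firework
        return list(firework)
    weights = [d / total for d in diffs]
    # g^i = x^i + sum_j w_j * (s_j - x^i), dimension by dimension
    return [x + sum(w * (s[d] - x) for w, s in zip(weights, selected))
            for d, x in enumerate(firework)]

# Toy usage on the sphere function: with a single improving spark in the pool,
# the guiding spark coincides with that spark.
sphere = lambda p: sum(v * v for v in p)
fw = [2.0, 2.0]
g = weight_based_guiding_spark(fw, sphere(fw), [[1.0, 1.0]], n_hat=1,
                               fitness=sphere)   # g == [1.0, 1.0]
```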
3.2 Quantitative Increase Strategy
The second strategy is applied only when a firework individual has survived into the next generation without evolving. Multiple guiding sparks are then generated using the first strategy to help the firework individual evolve. Suppose the total number of spark individuals to be generated by the firework individual in the coming explosion operation is M. When the above case occurs, we reduce the number of spark individuals generated by the explosion operation to \(\alpha \times M\) and make up the difference by generating \((1 - \alpha ) \times M\) guiding spark individuals. We set \(\alpha \) to 0.9 in our experimental evaluations.
The next key problem is how to generate multiple guiding vectors. Since the pool provides a variety of spark individuals, we randomly select half of the spark individuals in the pool to calculate one guiding vector and repeat this operation \((1-\alpha ) \times M\) times to obtain multiple different guiding sparks.
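The budget split and half-pool sampling described above can be sketched as follows; the helper names are illustrative, not from the paper, and the paper's settings (M = 200, \(\alpha = 0.9\)) are used in the example.

```python
import random

def split_spark_budget(m_total, alpha=0.9):
    """When a firework survives a generation without improving, shrink its
    explosion budget to alpha*M sparks and spend the remaining (1-alpha)*M
    on weight-based guiding sparks."""
    n_guiding = int(round((1.0 - alpha) * m_total))
    return m_total - n_guiding, n_guiding

def sample_guiding_subsets(pool, n_guiding, rng=random):
    """Draw a random half of the spark pool once per guiding spark,
    so each subset yields a different guiding vector."""
    half = max(1, len(pool) // 2)
    return [rng.sample(pool, half) for _ in range(n_guiding)]

# With M = 200 and alpha = 0.9: 180 explosion sparks, 20 guiding sparks.
n_explosion, n_guiding = split_spark_budget(200, alpha=0.9)
```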
Algorithm 1 outlines the flow of FWA combined with our proposed strategies.
4 Experimental Evaluations
To evaluate the performance of our proposed strategies, we combine the original guiding strategy of [5] and our proposal with three different FWA variants, EFWA [2], dynFWA [3] and AFWA [4], respectively. Each benchmark function from the CEC 2013 benchmark test suite [13] is run 51 times independently in 5 dimension settings of D = 10, 30, 50, 70 and 100.
These functions are designed for real-parameter single-objective optimization, and their landscape characteristics include shifted, rotated, global optimum on bounds, unimodal and multimodal. The parameter settings used in our experimental evaluations are as follows: the number of firework individuals is set to 1, and the total number of spark individuals is set to 200. The proportion \(\sigma \) used for selecting the top spark individuals is set to 0.2. The explosion amplitude used in EFWA is set to 80. All other parameter settings of EFWA, dynFWA and AFWA are exactly the same as in the original references [2, 3] and [4], respectively. The dimension selection mechanism is not used in these evaluations, to increase population diversity.
We use the number of fitness evaluations rather than generations to evaluate convergence fairly. The maximum number of evaluations, i.e. the termination condition, of each run is 10,000\(\times D\). We apply the Wilcoxon signed-rank test and Holm's multiple comparison test to the fitness values at the termination condition to check for significant differences between the original guiding strategy of [5] and our proposed strategies. Tables 1, 2 and 3 show the results of these statistical tests.
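The paired comparison above can be illustrated with a minimal, self-contained Wilcoxon signed-rank test using the normal approximation; in practice an exact-distribution implementation such as `scipy.stats.wilcoxon` would be used for the 51 paired runs. The function name and the toy data below are ours.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Paired Wilcoxon signed-rank test, two-sided normal approximation.
    a, b: paired samples (e.g. final fitness of two strategies per run).
    Zero differences are dropped; tied |differences| get average ranks."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    # Rank |d| in ascending order, averaging the ranks of ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1            # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return w_plus, p

# Toy example: strategy A beats strategy B on every one of 20 paired runs
# (minimization), so the p-value is far below 0.05.
better = [1.0 * i for i in range(20)]
worse = [v + 1.0 for v in better]
w, p = wilcoxon_signed_rank(better, worse)
```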
5 Discussions
5.1 Discussion on the Proposed Strategies
We begin our discussion with an explanation of the superiority of our proposal. The first strategy, the weight-based guiding strategy, uses only spark individuals whose fitness is better than that of the firework individual to construct multiple vectors from the firework individual to the selected spark individuals. Different weights based on their fitness differences are given to these potential vectors to calculate a guiding vector, which raises the probability of obtaining a better guiding spark individual.
Because it aggregates multiple potential directions, a guiding spark individual has an anti-noise property that avoids over-preference for any single direction. Although the first strategy increases the computing cost with its weight-processing operation, it adds only one additional fitness evaluation, which is acceptable. From a cost-performance view, it is a low-cost, high-return strategy.
The second strategy, the quantitative increase strategy, reduces the number of spark individuals generated by an explosion operation and generates the same number of guiding spark individuals to speed up unevolved firework individuals. Since the guiding operation favors potential directions rather than random search, multiple guiding vectors can help a firework individual jump out of the current local area.
To solve the key problem of generating diverse guiding vectors, a spark pool is adopted to efficiently reuse information by storing many excellent spark individuals generated in the past. This strategy needs no additional fitness evaluations; it simply redistributes the proportions of the two types of spark individuals. It is therefore a low-risk, easy-to-use strategy.
5.2 Discussion on Experimental Result
The next discussion concerns the effectiveness and applicability of our proposal. To evaluate its performance, we compare it with the original guiding strategy of [5], applying both to three different baseline algorithms, EFWA, AFWA and dynFWA, respectively. We apply the Wilcoxon signed-rank test and Holm's multiple comparison test to the average fitness of the 51 trial runs at the termination condition to check for significant differences between the two guiding methods. From the results of these statistical tests, we found that our proposed strategies perform better on both unimodal and multimodal tasks in all 5 dimensions, presumably because our proposal provides multiple, more precise guiding directions that accelerate the convergence of FWA. The results show that our proposal can be applied successfully to various FWA variants, which implies a wide range of applicability.
Finally, we discuss several potential approaches to further improve the performance of our proposed strategies. As a next step, we may use fitness gradient information instead of fitness differences to determine the weights. How to further improve the accuracy of guiding spark individuals and how to use them to accelerate FWA also remain future work.
6 Conclusion
We proposed two effective strategies that improve the guiding information of the original GFWA and further increase its optimization ability. The first strategy uses current and historical information to construct guiding vectors more reasonably, and the second strategy dynamically increases the number of guiding spark individuals to provide multiple potential guiding directions. The experiments confirmed that our proposal improves the performance of GFWA significantly.
In future work, we will continue to explore and exploit hidden search information to accelerate convergence and propose new methods to handle the weights reasonably. We will also apply the proposed strategies to practical problems.
References
Tan, Y., Zhu, Y.: Fireworks algorithm for optimization. In: Tan, Y., Shi, Y., Tan, K.C. (eds.) ICSI 2010. LNCS, vol. 6145, pp. 355–364. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-13495-1_44
Zheng, S., Janecek, A., Tan, Y.: Enhanced fireworks algorithm. In: 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, pp. 2069–2077 (2013)
Zheng, S., Janecek, A., Li, J., Tan, Y.: Dynamic search in fireworks algorithm. In: 2014 IEEE Congress on Evolutionary Computation, Beijing, China, pp. 3222–3229 (2014)
Li, J., Zheng, S., Tan, Y.: Adaptive fireworks algorithm. In: 2014 IEEE Congress on Evolutionary Computation, Beijing, China, pp. 3214–3221 (2014)
Li, J., Zheng, S., Tan, Y.: The effect of information utilization: introducing a novel guiding spark in the fireworks algorithm. IEEE Trans. Evol. Comput. 21(1), 153–166 (2017)
Yu, J., Takagi, H.: Acceleration for fireworks algorithm based on amplitude reduction strategy and local optima-based selection strategy. In: Tan, Y., Takagi, H., Shi, Y. (eds.) ICSI 2017. LNCS, vol. 10385, pp. 477–484. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61824-1_52
Yu, J., Takagi, H., Tan, Y.: Accelerating the fireworks algorithm with an estimated convergence point. In: Tan, Y., Shi, Y., Tang, Q. (eds.) ICSI 2018. LNCS, vol. 10941, pp. 263–272. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-93815-8_26
Yu, J., Tan, Y., Takagi, H.: Scouting strategy for biasing fireworks algorithm search to promising directions. In: The Genetic and Evolutionary Computation Conference Companion, Kyoto, Japan, pp. 99–100 (2018)
Yu, J., Takagi, H., Tan, Y.: Multi-layer explosion-based fireworks algorithm. Int. J. Swarm Intell. Evol. Comput. 7(3) (2018). https://doi.org/10.4172/2090-4908.1000173
Tuba, M., Bacanin, N., Alihodzic, A.: Multilevel image thresholding by fireworks algorithm. In: The 25th International Conference Radioelektronika, Pardubice, Czech Republic, pp. 326–330 (2015)
Tuba, M., Bacanin, N., Beko, M.: Fireworks algorithm for RFID network planning problem. In: The 25th International Conference Radioelektronika, Pardubice, Czech Republic, pp. 440–444 (2015)
Rahmani, A., Amine, A., Hamou, R.M., Rahmani, M.E., Bouarara, H.A.: Privacy preserving through fireworks algorithm based model for image perturbation in big data. Int. J. Swarm Intell. Res. 6(3), 41–58 (2016)
Liang, J., Qu, B., Suganthan, P.N., Alfredo, G.H.: Problem definitions and evaluation criteria for the CEC 2013 special session on real-parameter optimization (2013). http://al-roomi.org/multimedia/CEC_Database/CEC2013/RealParameterOptimization/CEC2013_RealParameterOptimization_TechnicalReport.pdf
© 2019 Springer Nature Switzerland AG
Li, Y., Yu, J., Takagi, H., Tan, Y. (2019). Accelerating Fireworks Algorithm with Weight-Based Guiding Sparks. In: Tan, Y., Shi, Y., Niu, B. (eds) Advances in Swarm Intelligence. ICSI 2019. Lecture Notes in Computer Science(), vol 11655. Springer, Cham. https://doi.org/10.1007/978-3-030-26369-0_24