
1 Introduction

Numerous real-life problems can be represented as optimization problems in which the global optimum (minimum or maximum) of an objective function needs to be found. In most cases these are hard optimization problems, often highly nonlinear. Standard deterministic algorithms are incapable of finding solutions to such problems due to computational complexity and numerous local optima, hence different approaches are needed. In the past decades various stochastic optimization algorithms have been proposed. One group of such algorithms are nature-inspired algorithms, where the idea is to mimic processes from nature.

Nature-inspired algorithms can be divided into two main categories: evolutionary and swarm intelligence algorithms. Evolutionary algorithms are inspired by biological evolution processes such as reproduction, recombination, mutation and selection. One of the oldest and most famous members of this group is the genetic algorithm. Swarm intelligence algorithms are inspired by the collective behavior of different agents in nature. Each individual follows simple rules and interacts with other members of the swarm. These simple agents collectively exhibit remarkable intelligence that can be used for solving optimization problems.

Swarm intelligence algorithms are inspired by natural processes such as ant colonies, animal herding, food harvesting, nesting and others. When mimicking these processes, two main parts of the algorithm have to be implemented: exploration and exploitation. Exploitation represents a local search around promising solutions that have been found, while exploration is a global search in which better solutions are sought in different areas of the search space in order to prevent the algorithm from getting stuck in a local optimum.

Among the oldest swarm intelligence algorithms are particle swarm optimization (PSO) and ant colony optimization. Subsequently, numerous different swarm intelligence algorithms have been proposed, such as the firefly algorithm [10, 18, 20], fireworks algorithm [13], krill herd algorithm [6, 11], and others. Swarm intelligence algorithms have been applied to solving different problems such as the traveling salesman problem [19], multilevel thresholding [14], support vector machine optimization [15, 16], image registration [17], etc.

Optimization algorithms are constantly being improved through different modifications and hybridizations. In recent years, advances in the theory and applications of chaos have been exploited in numerous fields, one of them being optimization algorithms. Chaotic maps such as the circle, Gauss/mouse, logistic, piecewise, sine and sinusoidal maps have been introduced into swarm intelligence algorithms in place of some random parameters [4]. For example, parameters of the bat algorithm were replaced by different chaotic maps and the results were compared to the original bat algorithm [5]. The results showed that some chaotic bat algorithms can outperform the version with random numbers. The Chebyshev map was introduced into the fruit fly optimization algorithm, and the modified chaotic version of the algorithm showed superior and more reliable behavior compared to the original one in [8]. In [7], ten different one-dimensional chaotic maps were used to improve the fireworks algorithm; the best performance was obtained when the circle map was used.

In this paper we introduce chaos into the recent brain storm optimization algorithm [9] and propose a chaos-based BSO (CBSO). Since different chaotic maps can lead to different behavior of the optimization algorithm, we considered two chaotic maps. The proposed algorithm was tested on standard benchmark functions proposed at CEC 2013, and it compared favorably to the original BSO and the standard PSO.

The rest of the paper is organized as follows. Our proposed brain storm optimization algorithm with chaotic maps is presented in Sect. 2. Simulation results, along with a comparison with other algorithms, are given in Sect. 3. Finally, conclusions and propositions for further work are presented in Sect. 4.

2 Chaos Based Brain Storm Optimization Algorithm

The brain storm optimization algorithm (BSO) was proposed by Yuhui Shi in 2011 [9]. The algorithm has been applied to numerous problems [2, 12], and during the last few years different improved and modified BSO versions have been proposed [1, 3]. The inspiration for the algorithm was the human idea generation process, or brainstorming. Brainstorming was summarized in several steps which were transformed into the brain storm optimization algorithm.

The brainstorming process consists of generating initial ideas, followed by choosing the more promising ones and generating new ideas, both on the basis of better solutions and as completely new ones regardless of the previous ideas. It is expected that after several iterations a good solution will be obtained. The brain storm optimization algorithm is presented in Algorithm 1.

In Algorithm 1 several parameters of the BSO are mentioned. The first one is n, the number of solutions or individuals in each generation of ideas, and the second one is the parameter m, which represents the number of clusters. Besides these two parameters, four probabilities need to be set, \(p_{5a},~ p_{6b},~ p_{6bi}\) and \(p_{6c}\), which determine how a new solution is created according to Algorithm 1.

Algorithm 1. Brain storm optimization algorithm
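The steps described above can be sketched roughly in Python. This is a simplified illustration, not the exact Algorithm 1: dealing the sorted population into m groups stands in for the k-means clustering of the original BSO, the elitist old-vs-new replacement and the value k = 20 are assumptions of this sketch.

```python
import math
import random

def logsig(x):
    """Logarithmic sigmoid transfer function (Matlab's logsig)."""
    return 1.0 / (1.0 + math.exp(-x))

def bso(objective, dim, bounds, n=100, m=5,
        p5a=0.2, p6b=0.8, p6bi=0.4, p6c=0.5, max_iter=200, k=20.0):
    """Simplified sketch of the BSO loop; minimizes `objective` over a box."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    for it in range(max_iter):
        pop.sort(key=objective)
        # Simplified clustering: deal the sorted population into m groups so
        # that each group's first element plays the role of a cluster center.
        clusters = [pop[i::m] for i in range(m)]
        if random.random() < p5a:
            # Replace a randomly chosen cluster center with a random idea.
            clusters[random.randrange(m)][0] = [random.uniform(lo, hi)
                                                for _ in range(dim)]
        new_pop = []
        for _ in range(n):
            if random.random() < p6b:
                # Base the new idea on one cluster: its center with
                # probability p6bi, otherwise a random member.
                c = random.choice(clusters)
                base = c[0] if random.random() < p6bi else random.choice(c)
            else:
                # Combine two clusters (centers or random members, per p6c).
                c1, c2 = random.sample(clusters, 2)
                if random.random() < p6c:
                    a, b = c1[0], c2[0]
                else:
                    a, b = random.choice(c1), random.choice(c2)
                w = random.random()
                base = [w * u + (1.0 - w) * v for u, v in zip(a, b)]
            # Eqs. (1)-(2): perturb the selected idea with scaled Gaussian noise.
            zeta = logsig((0.5 * max_iter - it) / k) * random.random()
            new_pop.append([min(hi, max(lo, x + zeta * random.gauss(0.0, 1.0)))
                            for x in base])
        # Keep the better of each old/new pair (an elitist simplification).
        pop = [a if objective(a) <= objective(b) else b
               for a, b in zip(pop, new_pop)]
    return min(pop, key=objective)
```

For example, `bso(lambda x: sum(v * v for v in x), dim=2, bounds=(-5.0, 5.0))` drives the sphere function toward its minimum at the origin.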

New solutions are generated by the following equation:

$$\begin{aligned} x_{new}=x_{selected}+\zeta *n(\mu ,\sigma ) \end{aligned}$$
(1)

where \(x_{new}\) is a new solution in the d-dimensional space, \(x_{selected}\) represents solution selected to be potentially changed, \(n(\mu ,\sigma )\) is a random number generated from Gaussian distribution with mean \(\mu \) and variance \(\sigma \), while \(\zeta \) is the coefficient that controls the influence of the Gaussian random value. Parameter \(\zeta \) is calculated in each generation by the following expression:

$$\begin{aligned} \zeta =\text {logsig}((0.5*maxIteration-currentIteration)/k)*rand() \end{aligned}$$
(2)

where maxIteration and currentIteration represent the maximal number of iterations and the number of the current iteration, respectively. The parameter k changes the slope of the logsig() function, where logsig is a logarithmic sigmoid transfer function. Finally, rand() represents a random value from the uniform distribution within [0, 1]. In this paper we propose using chaotic maps instead of these random values.
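Equations (1) and (2) can be expressed compactly as follows; the default value k = 20 is an assumption here, since the paper does not state the value used, and the `rnd` argument is a hook where a chaotic map can later be substituted for the uniform random generator.

```python
import math
import random

def logsig(x):
    """Logarithmic sigmoid transfer function, as in Matlab's logsig()."""
    return 1.0 / (1.0 + math.exp(-x))

def step_size(current_iteration, max_iteration, k=20.0, rnd=random.random):
    """Coefficient zeta from Eq. (2); k = 20 is an assumed value, and rnd
    may be replaced by a chaotic map generator."""
    return logsig((0.5 * max_iteration - current_iteration) / k) * rnd()

def new_solution(x_selected, zeta, mu=0.0, sigma=1.0):
    """Eq. (1): perturb the selected solution with scaled Gaussian noise,
    one independent sample per dimension."""
    return [x + zeta * random.gauss(mu, sigma) for x in x_selected]
```

Since logsig maps into (0, 1) and rand() is in [0, 1), the step size ζ shrinks as the iterations progress, shifting the search from exploration to exploitation.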

2.1 Chaotic Maps

Chaotic optimization algorithms are optimization algorithms that use chaotic variables rather than random values. Characteristics of chaotic maps such as non-repetition and ergodicity may improve the search in optimization algorithms [21]. In this paper two different one-dimensional maps were considered: the circle map and the sinusoidal map. The circle map is defined by the following equation:

$$\begin{aligned} x_{k+1}=\Big [x_k+b-\frac{a}{2\pi }\sin (2\pi x_k)\Big ] \mod 1 \end{aligned}$$
(3)

where for \(a=0.5\) and \(b=0.2\) the generated chaotic sequence is within (0, 1).

The sinusoidal map is defined as:

$$\begin{aligned} x_{k+1}=ax_k^2\sin (\pi x_k) \end{aligned}$$
(4)

where for \(a=2.3\) and \(x_0=0.7\) the following simplified form can be used:

$$\begin{aligned} x_{k+1}=\sin (\pi x_k) \end{aligned}$$
(5)

The proposed chaotic maps were used to generate chaotic sequences of numbers that replace the random value rand() in Eq. 2.
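The two maps and the iteration that produces a chaotic sequence can be sketched as:

```python
import math

def circle_map(x, a=0.5, b=0.2):
    """Circle map, Eq. (3); with a=0.5 and b=0.2 the sequence stays in (0, 1)."""
    return (x + b - (a / (2.0 * math.pi)) * math.sin(2.0 * math.pi * x)) % 1.0

def sinusoidal_map(x):
    """Simplified sinusoidal map, Eq. (5), valid for a=2.3 and x0=0.7."""
    return math.sin(math.pi * x)

def chaotic_sequence(map_fn, x0, n):
    """Iterate a one-dimensional map from seed x0 to produce n chaotic values."""
    seq, x = [], x0
    for _ in range(n):
        x = map_fn(x)
        seq.append(x)
    return seq
```

In the chaotic BSO variants, each call to rand() in Eq. 2 would draw the next value of such a sequence instead of a uniform random number.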

3 Simulation Results

Our proposed method was implemented in Matlab R2016a, and simulations were performed on a platform with an Intel® Core™ i7-3770K CPU at 4 GHz, 8 GB of RAM, and the Windows 10 Professional OS.

The proposed algorithms with chaotic maps were tested on 15 well-known benchmark functions proposed at CEC 2013. We tested on 5 unimodal and 10 multimodal 10-dimensional functions; details about these functions are presented in Table 1. Parameters for the brain storm algorithm were set empirically. The number of individuals was set to 100 and the number of clusters to 5. The probabilities \(p_{5a},~ p_{6b},~ p_{6bi}\) and \(p_{6c}\) were set to 0.2, 0.8, 0.4 and 0.5, respectively. The maximal number of iterations was 5000. All tests were run 30 times.

Table 1. Benchmark function details
Table 2. Comparison of PSO, BSO, CBSO-C and CBSO-S

We compared the original brain storm optimization algorithm with the two versions in which chaotic maps were introduced. The first version used the circle map; we named this chaotic brain storm optimization algorithm CBSO-C. The second version implemented the sinusoidal map and was named CBSO-S. We also compared the results with those of the standard PSO algorithm [22]. In [22] the maximal number of objective function evaluations was set to 100,000, while in our proposed algorithm it was 50,000, since a larger number of evaluations did not improve the results. Each algorithm was executed 30 times for each function, and the median, standard deviation, and best and worst solutions were calculated as in [22]. The obtained results are presented in Table 2.

For benchmark function \(f_{1}\), the sphere, all algorithms successfully found the global minimum in all runs, since the standard deviation was 0. Similarly, all algorithms found the global optimum for function \(f_5\) in almost all runs; the smallest standard deviation was achieved by the sinusoidal chaotic BSO. From the results presented in Table 2 it can be seen that PSO matched the best solutions in the case of benchmark functions \(f_3\), \(f_{11}\) and \(f_{12}\). In all other cases the standard PSO was outperformed by BSO or our proposed chaos-based BSO.

For function \(f_2\) the original BSO achieved the best results, however with a large error; it can be said that all algorithms failed to find the global optimum. Similarly to \(f_2\), for \(f_3\) no algorithm found solutions close to the optimum, but BSO managed to find the closest ones. For functions \(f_4\) and \(f_7\) the best results were achieved when BSO with the sinusoidal map was used. For \(f_4\), where the optimal solution is −1100, the best solution found was −995.3, which is significantly different, but the improvement over the original BSO is noticeable. The original BSO had a median of −744.3 over 30 runs, which is worse than the results obtained by both chaos-based BSO versions. Chaos-based BSO also improved the performance of the original BSO for function \(f_7\), from −734.9 to −764.0 by CBSO-C and −788.5 by CBSO-S.

For functions \(f_8\) to \(f_{15}\) the best solutions were obtained by CBSO-C, the chaotic BSO with the circle map. The improvements are in some cases small, such as for functions \(f_8\) and \(f_{10}\). In the case of \(f_8\), the median with the original BSO was −679.7, while this solution was improved by CBSO-S to −679.8 and, finally, CBSO-C obtained a median of −680.0. Function \(f_{10}\) was successfully solved by all three BSO versions, but the most robust one was CBSO-C, since it had the smallest standard deviation. In all other cases chaos-based brain storm optimization with the circle map improved the results of the original BSO significantly and also outperformed PSO. For functions \(f_{9}\)–\(f_{15}\), except for function \(f_{10}\), BSO with the circle map obtained better results than both the original BSO and BSO with the sinusoidal map. This shows that different maps are more suitable for different functions.

4 Conclusion

In this paper a chaos-based brain storm optimization algorithm was proposed. Two different one-dimensional chaotic maps were implemented in the original BSO: the circle and sinusoidal maps. The proposed algorithms were tested on 15 benchmark functions from CEC 2013, and the results were compared to the original BSO and the standard PSO. The best results in most cases were obtained by the BSO algorithm with the circle map; in the other cases the best performance was achieved by the original BSO or the BSO with the sinusoidal map. BSO, as well as its two modifications, outperformed the standard PSO in all cases. In further work, different chaotic maps can be used and compared against other chaotic optimization algorithms.