
1 Introduction

Nature-inspired optimization algorithms are derived from the simulation of the collective behaviors of biological groups in the natural world. Owing to their simple and parallel implementation, strong robustness, and good optimization results, nature-inspired optimization algorithms have become a focus of study. In recent years, several novel swarm intelligence algorithms have been proposed, such as Artificial Bee Colony Optimization (ABC) [1, 2], Shuffled Frog Leaping Algorithm (SFLA) [3], Artificial Fish Swarm Algorithm (AFSA) [4], Cuckoo Search (CS) [5], Monkey Algorithm (MA) [6], Firefly Algorithm (FA) [7], Glowworm Swarm Optimization algorithm (GSO) [8], Flower Pollination Algorithm (FPA) [9], Wind Driven Optimization (WDO) [10], and Charged System Search (CSS) [11].

First proposed by Yang [12] in 2010, the Bat Algorithm (BA) originated from the simulation of the echolocation behavior of bats. Bats use a type of sonar called echolocation to detect prey and avoid obstacles in the dark. When searching for prey, bats emit ultrasonic pulses. As a bat flies toward its prey, the loudness decreases while the pulse emission rate gradually increases, which allows the bat to locate the prey more accurately. Applications of the BA span constrained optimization tasks [13], global engineering optimization [14], multi-objective optimization [15], structural optimization [16], and discrete size optimization of steel frames [17].

In this paper, we propose a novel median-oriented bat algorithm (MBA) for the function optimization problem. The proposed algorithm adopts the median and worst bat individuals [18] to avoid premature convergence. As a result, the global search ability of MBA is improved and the algorithm avoids being trapped in local optima, so it can obtain more accurate solutions to optimization problems. In addition, the mutation operation of DE is added to the bat algorithm to accelerate the global convergence speed. Simulation results demonstrate the effectiveness and robustness of the proposed algorithm.

The remainder of this paper is organized as follows. Section 2 introduces the basic bat algorithm. In Sect. 3, the median-oriented bat algorithm is introduced. The experimental results and comparison results are given in Sect. 4. Finally, some relevant conclusions are presented in Sect. 5.

2 The Basic Bat Algorithm

2.1 The Update of Velocity and Position

The bat population is initialized randomly. Suppose the dimension of the search space is \( n \), the position of bat \( i \) at time \( t \) is \( x_{i}^{t} \), and its velocity is \( v_{i}^{t} \). The position \( x_{i}^{t + 1} \) and velocity \( v_{i}^{t + 1} \) at time \( t + 1 \) are updated by the following formulas:

$$ f_{i}^{t} = f_{\hbox{min} } + (f_{\hbox{max} } - f_{\hbox{min} } )\beta $$
(1)
$$ v_{i}^{t + 1} = v_{i}^{t} + (x_{i}^{t} - Gbest)f_{i}^{t} $$
(2)
$$ x_{i}^{t + 1} = x_{i}^{t} + v_{i}^{t + 1} $$
(3)

where \( f_{i} \) denotes the pulse frequency emitted by bat \( i \) at the current moment, \( f_{\hbox{max} } \) and \( f_{\hbox{min} } \) denote the maximum and minimum values of the pulse frequency respectively, \( \beta \) is a random number in \( [0,1] \), and \( Gbest \) denotes the current global optimal solution.
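As a rough sketch, the update of Eqs. (1)-(3) for a single bat can be written as follows; the dimension, bounds and frequency range here are illustrative assumptions (the range \( [-1, 1] \) matches the setting used later in Sect. 4.3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimension and frequency range (assumptions for this sketch).
n, f_min, f_max = 5, -1.0, 1.0

x = rng.uniform(-5, 5, n)      # position x_i^t of bat i
v = np.zeros(n)                # velocity v_i^t
gbest = rng.uniform(-5, 5, n)  # current global best solution (placeholder)

beta = rng.random()                   # beta ~ U[0, 1]
f = f_min + (f_max - f_min) * beta    # Eq. (1): pulse frequency
v = v + (x - gbest) * f               # Eq. (2): velocity update
x = x + v                             # Eq. (3): position update
```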

A bat is selected from the population at random, and its position is updated according to Eq. (4). This random walk can be understood as a local search process, which produces a new solution from the chosen solution:

$$ x_{new} (i) = x_{old} + \varepsilon A^{t} $$
(4)

where \( x_{old} \) denotes a solution selected at random from the current best solutions, \( A^{t} \) is the loudness, and \( \varepsilon \) is a random vector whose entries are random values in \( [ - 1,\;1] \).
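The local random walk of Eq. (4) amounts to a small perturbation scaled by the loudness. A minimal sketch, in which the chosen solution and the loudness value are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

dim = 5
x_old = rng.uniform(-5, 5, dim)  # a solution chosen among the current best ones
A_t = 0.9                        # current loudness (illustrative value)
eps = rng.uniform(-1, 1, dim)    # random vector with entries in [-1, 1]
x_new = x_old + eps * A_t        # Eq. (4): new solution near the chosen one
```

Because each entry of \( \varepsilon \) lies in \( [-1, 1] \), the new solution stays within distance \( A^{t} \) of the old one in every coordinate, so the walk shrinks as the loudness decays.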

2.2 Loudness and Pulse Emission

Usually, at the beginning of the search, the loudness is high and the pulse emission rate is low. Once a bat has found its prey, the loudness decreases while the pulse emission rate gradually increases. The loudness \( A(i) \) and pulse emission rate \( r(i) \) are updated according to Eqs. (5) and (6):

$$ r^{t + 1} (i) = r^{0} (i) \times [1 - \exp ( - \gamma t)] $$
(5)
$$ A^{t + 1} (i) = \alpha A^{t} (i) $$
(6)

where both \( 0 < \alpha < 1 \) and \( \gamma > 0 \) are constants. \( A(i) = 0 \) means that a bat has just found its prey and temporarily stops emitting any sound. It is easy to see that as \( t \to \infty \), \( A^{t} (i) \to 0 \) and \( r^{t} (i) \to r^{0} (i) \).
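The limiting behavior of Eqs. (5)-(6) can be checked numerically. A short sketch using the parameter values given later in Sect. 4.3 (\( \alpha = 0.95 \), \( \gamma = 0.05 \), \( A^{0} = 0.9 \), \( r^{0} = 0.5 \)):

```python
import math

alpha, gamma = 0.95, 0.05
A0, r0 = 0.9, 0.5

A, r = A0, 0.0
for t in range(1, 2001):
    r = r0 * (1 - math.exp(-gamma * t))  # Eq. (5): emission rate rises toward r0
    A = alpha * A                        # Eq. (6): loudness decays toward 0

# After many iterations: A is essentially 0 and r is essentially r0.
```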

2.3 The Implementation Steps of Bat Algorithm

Step 1: Initialize the basic parameters: the attenuation coefficient of loudness \( \alpha \), the increasing coefficient of pulse emission \( \gamma \), the maximum loudness \( A^{0} \), the maximum pulse emission \( r^{0} \), and the maximum number of iterations \( Maxgen \);

Step 2: Define pulse frequency \( f_{i} \in [f_{\hbox{min} } ,\;f_{\hbox{max} } ] \);

Step 3: Initialize the bat population \( x \) and \( v \);

Step 4: Enter the main loop. If \( rand < r_{i} \), update the velocity and current position of the bat according to Eqs. (2) and (3); otherwise, apply a random disturbance to the position of the bat according to Eq. (4). Then go to Step 5;

Step 5: If \( rand < A_{i} \) and \( f(x_{i} ) < f(x^{*} ) \), accept the new solutions, and fly to the new position;

Step 6: If the new fitness is better than the current best fitness, replace the best bat, and adjust the loudness and pulse emission according to Eqs. (5) and (6);

Step 7: Evaluate the bat population, and find out the best bat and its position;

Step 8: If the termination condition is met (i.e., the maximum number of iterations is reached or the search accuracy is satisfied), go to Step 9; otherwise, go to Step 4 and execute the next search;

Step 9: Output the best fitness values and global optimal solution.

where \( rand \) is a random number drawn from a uniform distribution in \( [0,1] \).
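Steps 1-9 above can be sketched as a complete loop for a minimization problem. This is a minimal interpretation, not a definitive implementation: the search bounds, the use of the global best with the mean loudness in the Eq. (4) walk, and the default parameters (taken from Sect. 4.3) are all assumptions.

```python
import numpy as np

def bat_algorithm(obj, dim=10, popsize=20, maxgen=500,
                  f_min=-1.0, f_max=1.0, A0=0.9, r0=0.5,
                  alpha=0.95, gamma=0.05, lb=-5.0, ub=5.0, seed=0):
    """Sketch of Steps 1-9 for minimizing obj. Bounds lb/ub are assumptions."""
    rng = np.random.default_rng(seed)

    # Steps 1-3: parameters, frequency range, and random population.
    x = rng.uniform(lb, ub, (popsize, dim))
    v = np.zeros((popsize, dim))
    A = np.full(popsize, A0)   # loudness A(i)
    r = np.zeros(popsize)      # pulse emission r(i), grows toward r0 by Eq. (5)
    fit = np.apply_along_axis(obj, 1, x)
    best = int(np.argmin(fit))
    gbest, gbest_fit = x[best].copy(), float(fit[best])

    for t in range(1, maxgen + 1):
        for i in range(popsize):
            # Step 4: global move by Eqs. (1)-(3), or a local random walk.
            if rng.random() < r[i]:
                freq = f_min + (f_max - f_min) * rng.random()  # Eq. (1)
                v[i] = v[i] + (x[i] - gbest) * freq            # Eq. (2)
                x_new = x[i] + v[i]                            # Eq. (3)
            else:
                eps = rng.uniform(-1, 1, dim)                  # Eq. (4)
                x_new = gbest + eps * A.mean()
            x_new = np.clip(x_new, lb, ub)
            f_new = float(obj(x_new))
            # Step 5: accept the new solution probabilistically.
            if rng.random() < A[i] and f_new < fit[i]:
                x[i], fit[i] = x_new, f_new
            # Step 6: update the best bat, loudness, and pulse emission.
            if f_new < gbest_fit:
                gbest, gbest_fit = x_new.copy(), f_new
                r[i] = r0 * (1 - np.exp(-gamma * t))           # Eq. (5)
                A[i] = alpha * A[i]                            # Eq. (6)
        # Steps 7-8: the best bat is tracked in gbest; loop until maxgen.
    # Step 9: output the best fitness value and global optimal solution.
    return gbest_fit, gbest

# Usage: minimize the sphere function in 10 dimensions.
best_fit, best_x = bat_algorithm(lambda z: float(np.sum(z * z)))
```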

3 Median-Oriented Bat Algorithm

In this section, a novel median-oriented bat algorithm (MBA) is presented to enhance the performance of the basic bat algorithm [19–22]. In BA, each bat moves toward good solutions guided only by the best solution. MBA additionally exploits the median and worst bat individuals, yielding a global search algorithm:

$$ step_{now} = \frac{(iterMax - iter)^{3} (step_{ini} - step_{final} )}{(iterMax)^{3}} + step_{final} $$
(7)
$$ v_{i}^{t + 1} = v_{i}^{t} + f_{i}^{t} (x_{i}^{t} - Gbest) + step_{now} \cdot rand \cdot (x_{i}^{t} - Gmedian - Gworst) $$
(8)
$$ x_{i}^{t + 1} = x_{i}^{t} + v_{i}^{t + 1} $$
(9)

Because the proposed algorithm considers the best, median and worst bat individuals, it effectively adopts a compromise among them. This coordination among individuals helps the population cover a wider range of the search space and increases the diversity of the bat population.
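The MBA update of Eqs. (7)-(9) can be sketched for a single bat as follows. The population, the iteration counter, and the fitness function are illustrative assumptions; the step-length bounds follow Sect. 4.3 (\( step_{ini} = 5 \), \( step_{final} = 0 \)):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup (assumptions for this sketch).
dim, popsize, iterMax, it = 5, 20, 2000, 100
step_ini, step_final = 5.0, 0.0
f_min, f_max = -1.0, 1.0

pop = rng.uniform(-5, 5, (popsize, dim))
fit = np.sum(pop**2, axis=1)          # e.g. sphere fitness (minimization)
order = np.argsort(fit)
gbest = pop[order[0]]                 # best bat
gmedian = pop[order[popsize // 2]]    # median bat
gworst = pop[order[-1]]               # worst bat

# Eq. (7): cubically decreasing step length, from step_ini down to step_final.
stepnow = (iterMax - it)**3 * (step_ini - step_final) / iterMax**3 + step_final

x, v = pop[0], np.zeros(dim)
freq = f_min + (f_max - f_min) * rng.random()  # Eq. (1)
# Eq. (8): velocity pulled by the best, median and worst bats.
v = v + freq * (x - gbest) + stepnow * rng.random() * (x - gmedian - gworst)
x = x + v                                      # Eq. (9)
```

The cubic schedule in Eq. (7) keeps the median/worst term strong early on, favoring exploration, and shrinks it toward \( step_{final} \) as iterations proceed.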

4 Simulation Experiments and Discussion

4.1 Simulation Platform

All the algorithms are implemented in Matlab R2012a. The test environment is a computer with an AMD Athlon II X4 640 processor at 3.00 GHz and 4 GB RAM, running Windows 7.

4.2 Benchmark Functions

In order to verify the effectiveness of the proposed algorithm, we select the 10 standard benchmark functions [23] listed in Table 1 to test its search capability. The proposed algorithm (MBA) is compared with PSO, CLSPSO and BA.

Table 1. Benchmark functions

4.3 Parameter Setting

In PSO, we use a linearly decreasing inertia weight with \( \omega_{\hbox{max} } = 0.9 \) and \( \omega_{\hbox{min} } = 0.4 \), and learning factors \( C_{1} = C_{2} = 1.4962 \). In CLSPSO, the inertia weight and learning factors are the same as in PSO, and the maximum number of chaotic searches is \( MaxC = 10 \). In BA, the parameters are set as follows: the pulse frequency range is \( f_{i} \in [ - 1,\;1] \), the maximum loudness is \( A^{0} = 0.9 \), the maximum pulse emission is \( r^{0} = 0.5 \), the attenuation coefficient of loudness is \( \alpha = 0.95 \), and the increasing coefficient of pulse emission is \( \gamma = 0.05 \). In MBA, the basic parameters are the same as in BA, with \( stepLen\_ini = 5 \) and \( stepLen\_final = 0 \).

4.4 Experimental Results

In order to evaluate the performance of MBA, the ten benchmark functions above are adopted. In this section, the population size is \( popsize = 50 \), the maximum number of iterations is \( iterMax = 2000 \), and MBA is compared with PSO, CLSPSO and BA. The test results are obtained from 50 independent runs.
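The statistics reported below (best, mean, worst, std) are computed over the final fitness values of the independent runs. A sketch of this protocol, where `one_run` is a hypothetical stand-in for a full optimizer run:

```python
import numpy as np

# Hypothetical stand-in for one optimizer run returning its best fitness;
# in the experiments this would be a full MBA/BA/PSO run.
def one_run(seed):
    r = np.random.default_rng(seed)
    return float(np.min(r.uniform(0, 1, 100)))

results = np.array([one_run(s) for s in range(50)])  # 50 independent runs
best, mean = results.min(), results.mean()
worst, std = results.max(), results.std()
```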

4.4.1 Experimental Results of Low-Dimension Case

The comparison results of all the algorithms on all the functions are recorded in Table 2, where best, mean, worst and std denote the optimal value, mean value, worst value and standard deviation, respectively. We can see that the performance of MBA is significantly better than that of the other algorithms. For the functions \( F_{1} \), \( F_{2} \), \( F_{4} \), \( F_{7} \), \( F_{8} \), \( F_{9} \) and \( F_{10} \), the MBA algorithm obtains the theoretical optimal values in all runs. For the function \( F_{3} \), in terms of the mean value, MBA is better than PSO, CLSPSO and BA by at least 5, 5, and 3 orders of magnitude, respectively. For function \( F_{5} \), the best value and the mean value of MBA are both better than those of the other algorithms. For function \( F_{6} \), in terms of the mean value, MBA is better than PSO, CLSPSO and BA by at least 16, 15, and 17 orders of magnitude, respectively. In terms of the standard deviation, MBA is also better than PSO, CLSPSO and BA, which shows that the MBA algorithm has strong robustness. Moreover, for functions \( F_{1} \), \( F_{2} \), \( F_{4} \), \( F_{6} \), \( F_{7} \), \( F_{8} \), \( F_{9} \) and \( F_{10} \), the standard deviations are 0; that is, the MBA algorithm obtains the same global optimal value in all runs. For the basic BA algorithm, the best values are inferior to those of MBA, and even to the worst values of MBA, on all functions. The results of CLSPSO are a little better than those of PSO, but obviously worse than those of MBA. Figure 1 shows the mean fitness of the four algorithms on functions \( F_{1} \) to \( F_{10} \): when the mean value is not 0, MBA has a longer downward column, and when the mean value is 0, no bar appears for the MBA algorithm.

Table 2. Experimental results for function from \( F_{1} \) to \( F_{10} \) (D = 30)
Fig. 1.
figure 1

Mean fitness of four algorithms for \( F_{1} \) to \( F_{10} (D = 30) \). (Color figure online)

4.4.2 Experimental Results of High-Dimension Case

In order to further validate the performance of the proposed algorithm, we repeat the experiment in 100 dimensions for all the algorithms, keeping the parameters unchanged.

The comparison results for the high-dimension case are shown in Table 3. As seen from the results, the optimization performance of MBA is the best. For the functions \( F_{1} \), \( F_{2} \), \( F_{4} \), \( F_{7} \), \( F_{8} \), \( F_{9} \) and \( F_{10} \), the MBA algorithm still obtains the theoretical optimal values in all runs; only the precision of the best value decreases when solving the \( F_{5} \) function. The standard deviations of MBA are better than those of the other algorithms on all functions. This demonstrates that MBA has strong robustness and good global search ability, and that MBA does not lose solution accuracy as the dimension increases. In contrast, the precision of the solutions of the other three algorithms, including the best value, the mean value, the worst value, the median value and the standard deviation, decreases as the dimension increases. Figures 2 and 3 show the results of the ANOVA tests for all algorithms on \( F_{1} \) and \( F_{2} \). PSO and BA have long tails and more outliers on the functions \( F_{1} \) and \( F_{2} \); that is, these methods are not robust and are not acceptable for function optimization. The precision of BA is inferior to that of PSO and CLSPSO, but BA has better stability and robustness.

Table 3. Experimental results for function from \( F_{1} \) to \( F_{10} \) (D = 100)
Fig. 2.
figure 2

ANOVA tests for \( F_{1} (D = 100) \). (Color figure online)

Fig. 3.
figure 3

ANOVA tests for \( F_{2} (D = 100) \). (Color figure online)

5 Conclusions

This paper presented a novel median-oriented bat algorithm (MBA) for the function optimization problem. The proposed algorithm adopts the median and worst bat individuals to avoid premature convergence. MBA has an excellent global search ability owing to the diversity introduced by its probabilistic representation. The simulation experiments show that the proposed algorithm is a feasible and effective approach to function optimization. The optimization ability of MBA does not decline significantly as the dimension increases, so the proposed algorithm is suitable for both low-dimensional and high-dimensional cases.