
1 Introduction

Swarm intelligence algorithms have shown advantages in handling complex and difficult optimization problems [1,2,3], and they have therefore received growing attention in the optimization community. At present, the main swarm intelligence algorithms include ant colony optimization (ACO), particle swarm optimization (PSO), and the artificial bee colony (ABC) algorithm. Compared with PSO and ACO, ABC has fewer control parameters, and several studies have shown that ABC offers certain advantages [3,4,5,6,7,8].

However, ABC also has certain disadvantages. Several studies have shown that its neighborhood selection is too random [4,5,6,7,8], so ABC exhibits good exploration but poor exploitation. To overcome this shortcoming, various improved search strategies have been proposed [9,10,11,12,13,14,15,16,17]. For example, an enhanced ABC guided by the global best solution was proposed in [9]. In [10], Wang et al. utilized an external archive to guide ABC. In [11], different guidance information was used to improve solutions. Banharnsakun et al. [12] used a best-so-far ABC to effectively solve the job-shop scheduling problem (JSSP). In [13], ABC was parallelized. Tuba et al. [14] introduced adaptive steering adjustments in ABC. In [15], inertia weights and acceleration factors were used to strengthen the search equation. Bi and Wang [16] designed a fast mutation-based ABC. In [17], differential evolution (DE) was utilized to strengthen the search equation. As this literature shows, most improvements on ABC aim at modifying its search equation.

To further strengthen the performance of ABC, an improved algorithm based on an external archive, called NSSABC, is presented. The new method is motivated by GABC [9] and JADE [18]. In NSSABC, an external archive [10] replaces the single global best solution in the search equation of GABC, and the best solutions stored in the archive are updated as the iterations proceed. Simulation experiments show that NSSABC is superior to the standard ABC, GABC, and IABC on most test functions.

The remainder of this paper is organized as follows. Sect. 2 briefly describes the ABC algorithm. The proposed NSSABC is described in Sect. 3. Simulation experiments are reported in Sect. 4. Finally, this work is concluded in Sect. 5.

2 Artificial Bee Colony

ABC consists of four main phases: initialization, the employed bee phase, the onlooker bee phase, and the scout bee phase. At the initialization stage, ABC randomly generates SN solutions (food sources) by the following formula.

$$ x_{i,j} = x_{j,\min} + rand(0,1) \cdot (x_{j,\max} - x_{j,\min}) $$
(1)

where \( i = 1,2,\ldots,SN \), \( j = 1,2,\ldots,D \), SN is the population size, D is the dimension size, rand(0, 1) is a random number uniformly generated in [0, 1], and [xj,min, xj,max] is the boundary of the search range for dimension j.
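For illustration only, Eq. (1) can be sketched in Python as below; the function and variable names (initialize_population, sn, d, x_min, x_max) are our own and not part of the original paper.

```python
import numpy as np

def initialize_population(sn, d, x_min, x_max, rng=None):
    """Randomly generate SN food sources within [x_min, x_max], as in Eq. (1)."""
    rng = rng or np.random.default_rng()
    # x_{i,j} = x_{j,min} + rand(0,1) * (x_{j,max} - x_{j,min})
    return x_min + rng.random((sn, d)) * (x_max - x_min)
```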

In the employed bee phase, each employed bee is associated with one food source and searches for a new food source in the neighborhood of its own by the following formula.

$$ v_{i,j} = x_{i,j} + \phi_{i,j} (x_{i,j} - x_{k,j} ) $$
(2)

where i = 1, 2…, SN, vi is a new solution, xk is a randomly chosen solution (k ≠ i), and ϕi,j ∈ [−1, 1] is a random weight.
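As a hedged sketch (not taken from the paper), the employed bee update of Eq. (2) may be implemented as follows; following the common ABC convention, only one randomly chosen dimension j is perturbed, and the helper name employed_bee_candidate is ours.

```python
def employed_bee_candidate(X, i, rng=None):
    """Generate a candidate v_i around x_i according to Eq. (2)."""
    rng = rng or np.random.default_rng()
    sn, d = X.shape
    j = rng.integers(d)                               # randomly chosen dimension
    k = rng.choice([s for s in range(sn) if s != i])  # random partner, k != i
    phi = rng.uniform(-1.0, 1.0)                      # phi_{i,j} in [-1, 1]
    v = X[i].copy()
    v[j] = X[i, j] + phi * (X[i, j] - X[k, j])
    return v
```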

In the onlooker bee phase, each onlooker bee selects a food source according to the following probability.

$$ p_{i} = \frac{{f_{i} }}{{\sum\nolimits_{i = 1}^{SN} {f_{i} } }} $$
(3)

where pi is the selection probability and fi is the fitness value. Apparently, the probability and the fitness value are positively correlated, which means that a better solution has a larger selection probability.
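A minimal sketch of Eq. (3), assuming a precomputed fitness array; the helper name selection_probabilities is ours. An onlooker bee may then pick a food source i with probability pi, for example via rng.choice(len(fitness), p=selection_probabilities(fitness)).

```python
def selection_probabilities(fitness):
    """Compute p_i = f_i / sum(f_i) as in Eq. (3); larger fitness gives a larger probability."""
    fitness = np.asarray(fitness, dtype=float)
    return fitness / fitness.sum()
```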

In the scout bee phase, if a food source cannot be further improved within a preset number of iterations (the limit), it is abandoned. Assuming that the abandoned solution is xi, the scout bee randomly produces a new solution by formula (1) to replace xi.

3 ABC Based on New Search Strategy

Global exploration capability and local exploitation capability are two very important aspects for assessing the optimization ability of a swarm intelligence algorithm. Balancing the two capabilities therefore influences the performance of ABC.

For example, in GABC, the global best solution is used as a directive solution. Compared with the standard ABC, GABC further enhances the exploitation capability. However, the gbest may lead the entire swarm in a wrong search direction. It is pointed out in [7] that the search formula in GABC may point in the opposite direction, which easily leads to oscillation during the search.

In [10], an external archive was used in IABC to lead the search; the archive concept makes full use of the global best solution. However, IABC drives all solutions to converge toward the global best solution at different stages, which may result in a loss of swarm diversity.

To solve the above problems, a new ABC based on an external archive is designed, which combines IABC and GABC. In NSSABC, we add the external archive of IABC to the GABC search strategy to replace the guidance of the gbest in GABC. The external archive focuses on preserving different global best solutions, and it is updated as the iterations proceed. Compared with GABC, NSSABC uses several good solutions (not only the best one) to lead the search, which may correct a wrong search direction and avoid local minima. Unlike IABC, NSSABC does not use the current gbest to lead the search; this reduces the degree to which all solutions converge to the gbest and increases swarm diversity. In summary, NSSABC can maintain strong exploration and exploitation capabilities. The new search strategy is defined as follows.

$$ v_{i,j} = x_{i,j} + \phi_{i,j} (x_{i,j} - x_{k,j} ) + \psi_{i,j} \left( {x_{Arc,j} - x_{i,j} } \right) $$
(4)

where xArc is randomly chosen from the external archive, and xk is randomly selected from the swarm (k ≠ i). ψi,j ∈ [0, 1.5] and ϕi,j ∈ [−1, 1] are two random numbers.
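For concreteness only, the NSSABC update of Eq. (4) can be sketched as follows; as before, a single random dimension is perturbed, the archive is assumed to be a list of previously recorded global best solutions, and the name nssabc_candidate is hypothetical.

```python
def nssabc_candidate(X, archive, i, rng=None):
    """Generate a candidate v_i guided by an archived best solution, as in Eq. (4)."""
    rng = rng or np.random.default_rng()
    sn, d = X.shape
    j = rng.integers(d)                               # randomly chosen dimension
    k = rng.choice([s for s in range(sn) if s != i])  # random partner, k != i
    x_arc = archive[rng.integers(len(archive))]       # random archive member
    phi = rng.uniform(-1.0, 1.0)                      # phi_{i,j} in [-1, 1]
    psi = rng.uniform(0.0, 1.5)                       # psi_{i,j} in [0, 1.5]
    v = X[i].copy()
    v[j] = X[i, j] + phi * (X[i, j] - X[k, j]) + psi * (x_arc[j] - X[i, j])
    return v
```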

Initially, an external archive A is created, and m is the threshold size of A. After each generation, A is updated: if the gbest is improved, the new gbest is preserved in A and the archive size increases by 1 [10]. If the current archive size exceeds m, a solution is randomly deleted from A.
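A sketch of this archive maintenance, under the assumption that the archive is a Python list and gbest is the current global best vector; the names update_archive and gbest_improved are hypothetical.

```python
def update_archive(archive, gbest, gbest_improved, m, rng=None):
    """Store an improved gbest and keep the archive size at most m."""
    rng = rng or np.random.default_rng()
    if gbest_improved:
        archive.append(gbest.copy())              # preserve the improved gbest
    if len(archive) > m:
        archive.pop(rng.integers(len(archive)))   # randomly delete one solution
    return archive
```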

Compared with the standard ABC, NSSABC only modifies the search strategy. Therefore, they have the same time complexity.

4 Experimental Study

To assess the optimization ability of NSSABC, several well-known benchmark problems are used in the simulation tests. Table 1 gives brief descriptions of these benchmark functions.

Table 1. Benchmark functions

In the experiments, SN and the control parameter limit are both set to 100. The threshold m is set to 5. For D = 30 and D = 100, the maximum number of function evaluations (MaxFEs) is 1.5E+05 and 5.0E+05, respectively.

Table 2 gives the comparison results of the four ABC algorithms for D = 30. From the results, NSSABC surpasses the other three ABCs on all test cases except f3. NSSABC finds the global optimal solution on the three test functions f6, f9, and f11. For f1 and f2, NSSABC greatly improves the quality of solutions compared with IABC and GABC. However, GABC outperforms NSSABC on f3. NSSABC, GABC, and IABC achieve the same result on both f6 and f9. Overall, NSSABC provides better solutions than GABC, ABC, and IABC.

Table 2. Results achieved by ABC, IABC, GABC and NSSABC when D = 30

Table 3 displays the results of the four ABCs for D = 100. As with D = 30, the four algorithms do not achieve good results on f3, f4, and f5. When D = 100, ABC falls into local minima, whereas GABC, IABC, and NSSABC can still find the global optimal solution. In general, for D = 100 the results of the four ABC algorithms are worse than those for D = 30. For f9 and f11, NSSABC can still find the global optimal solution, while IABC and GABC fail to do so.

Table 3. Results achieved by ABC, IABC, GABC and NSSABC when D = 100

Figure 1 compares the convergence performance of NSSABC with the other ABCs. NSSABC converges considerably faster than the other three algorithms except on f5, f6, and f9. The main reason is that the new search strategy, which combines GABC and IABC, speeds up convergence on most of the test functions.

Fig. 1. The convergence characteristics of ABC, GABC, IABC and NSSABC on some functions for D = 30.

5 Conclusions

In this paper, we propose an improved ABC algorithm called NSSABC, which uses the information of global best solutions to guide the search. An external archive preserves the updated global best solutions, which avoids oscillating searches to some extent. NSSABC maintains good exploration and exploitation capabilities. Experimental results show that NSSABC is superior to the ABC, IABC, and GABC algorithms.