1 Introduction

We consider the unconstrained optimization problem

$$\min f(x),\quad x \in R^n$$
(1)

where \(f:R^n \rightarrow R\) is the objective function and x is the vector of decision variables. Moreover, \(x \in S \subset R^n\), where S denotes the search space, which is n-dimensional and bounded by the following parametric constraints:

$$l_i\le x_i\le u_i\quad i=1,2, \ldots , n$$
(2)

where \(l_i\) and \(u_i\) are the lower and upper bounds of the decision variables \(x_i\), respectively.
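As a concrete illustration of the bounded search space in (2), candidate points that leave the box can be projected (clipped) back into it. The sphere objective, dimensions, and bound values below are illustrative choices, not taken from the paper:

```python
import numpy as np

# A hypothetical 2-D objective (the sphere function) with box bounds,
# illustrating the search space S of Eq. (2). The bounds l, u are
# illustrative values only.
def f(x):
    return float(np.sum(x ** 2))

l = np.array([-5.0, -5.0])   # lower bounds l_i
u = np.array([5.0, 5.0])     # upper bounds u_i

# Project a candidate back into S when an update step leaves the box.
def project(x):
    return np.clip(x, l, u)

x_feasible = project(np.array([7.0, -1.0]))   # -> [5.0, -1.0]
```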

Many researchers have solved the problem in (1) via deterministic algorithms such as the steepest descent, Newton, and conjugate gradient methods, see, e.g., [37]. These methods require the differentiability of the objective function, yet computing the gradient (and, for Newton-type methods, the Hessian) can be difficult and expensive, and the objective function might be nonsmooth. This has motivated many researchers to develop stochastic global optimization algorithms such as swarm intelligence (SI) algorithms, which take inspiration from the collective behavior of groups of social organisms. These algorithms have been applied to global optimization problems and their applications; examples include Ant Colony Optimization (ACO) [8], Artificial Bee Colony [12], Particle Swarm Optimization (PSO) [13], bacterial foraging [18], the bat algorithm [35], Bee Colony Optimization (BCO) [28], wolf search [25], cat swarm [6], cuckoo search [34], the firefly algorithm [33], and fish swarm/school search [15]. Older metaheuristics such as genetic algorithms (GA) suffer from some drawbacks when dealing with multimodal optimization problems.

Recently, Yang [35] proposed a novel metaheuristic search algorithm, called the bat algorithm (BA). Preliminary studies show that it is very promising and can outperform existing algorithms. Bats are captivating animals: they are the only mammals with wings and have an advanced capability for echolocation. Microbats use a type of sonar, called echolocation, to detect prey, avoid obstacles, and locate their roosting crevices in the dark.

BA has a good capability to balance global exploration and local exploitation during the search process. Since BA performs well, is easy to implement, and converges fast, it has attracted many researchers, who have applied it to various applications. For example, Yang [36] applied BA to multi-objective optimization and benchmark engineering problems. Zhang and Wang [38] improved the diversity of solutions by adding mutation to the bat algorithm for image matching. Komarasamy and Wahi [14] combined K-means and the bat algorithm (KMBA) for efficient clustering. Lin et al. [16] carried out parameter estimation in dynamic biological systems using a chaotic bat algorithm combining Levy flights and chaotic maps. Nakamura et al. [17] developed a discrete version of the bat algorithm to solve classification and feature selection problems. Wang and Guo [31] combined the bat algorithm with harmony search, producing a hybrid bat algorithm for numerical optimization of benchmark functions. In addition, Xie et al. [32] presented a variant of the bat algorithm combining a differential operator and Levy flights to solve function optimization problems.

The main objective of this paper is to produce a new hybrid of the bat algorithm with the multi-directional search algorithm in order to solve unconstrained global optimization problems. The proposed algorithm is called the multi-directional bat algorithm (MDBAT). In order to overcome the slow convergence of the bat algorithm, we invoke the multi-directional search algorithm, which can accelerate the search instead of letting the algorithm run for more iterations without any improvement. The bat algorithm runs for a number of iterations, then the multi-directional search starts from the best solution obtained by the bat algorithm.

The rest of this paper is organized as follows. In Sect. 2, we overview the multi-directional search algorithm. In Sect. 3, we present the standard bat algorithm and describe the main concepts of the proposed MDBAT algorithm. In Sect. 4, we present our numerical experimental results. Finally, in Sect. 5 we end with our conclusion and future work.

2 Multi-directional search algorithm

The multi-directional search algorithm was proposed by Dennis and Torczon [30] as a step towards a general-purpose optimization algorithm with promising properties for parallel computation. The multi-directional search algorithm is a direct search algorithm.

According to Dennis and Torczon [7], the Nelder-Mead Simplex algorithm can converge to non-minimizers when the dimension of the problem becomes large enough.

One of the advantages of the multi-directional search is that, unlike the Nelder-Mead simplex algorithm, it is backed by convergence theorems, and numerical testing indicates that these guarantees are borne out in practice.

Since the multi-directional search is backed by convergence theorems, it shows promising behavior when applied to high-dimensional problems.

In Algorithm 1 and Fig. 1, we present the main steps of the multi-directional search algorithm.

Fig. 1 Multi-directional search

We can summarize the main steps of the multi-directional search as follows.

  • Step 1 The algorithm starts by setting the initial values of the expansion factor \(\mu\), contraction factor \(\theta\) and the maximum number of iterations parameter \(Max _{itr1}\). (Line 1)

  • Step 2 The multi-directional search algorithm begins with a simplex S with vertices \(x_i^0\), where \(i=0,1, \ldots , n\). (Line 2)

  • Step 3 The vertices are sorted in ascending order where \(f(x_0^0)\le f(x_1^0)\le \cdots \le f(x_n^0)\). (Line 3)

  • Step 4 The iteration counter is initialized and the main loop of the algorithm begins. (Lines 4–5)

  • Step 5 The reflection step starts by reflecting the vertices \(x_1, x_2, \ldots , x_n\) through the best vertex \(x_0\), and the new reflected vertices are evaluated. (Lines 6–9)

  • Step 6 If the reflection succeeds, i.e., some reflected vertex is better than the current best vertex, the algorithm starts the expansion step. (Line 10)

  • Step 7 The expansion step expands each reflected edge by the expansion factor \(\mu\), where \(\mu =2\), to create new expanded vertices. The new expanded vertices are evaluated in order to check whether the expansion step succeeds. (Lines 10–14)

  • Step 8 If the best expanded vertex is better than all the reflected vertices, the new simplex will be the expanded simplex. (Lines 15–16)

  • Step 9 If both the expansion and reflection steps fail, the contraction step changes the step size via the contraction factor \(\theta\), which halves the reflected simplex in the next iteration. (Lines 18–21)

  • Step 10 The new vertices are evaluated, the vertices are sorted by their objective function values, and the new simplex is constructed. (Lines 24–27)

  • Step 11 The iteration counter is increased and the overall process is repeated until the termination criterion is satisfied, which is by default reaching the maximum number of iterations \(Max _{itr1}\). Finally, the best solution is returned. (Lines 28–30)
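The steps above can be sketched in code. The following is a minimal Python rendering of Algorithm 1 under our reading of Steps 1–11; details such as tie-breaking in the sort and the simplex representation are our implementation choices, not prescribed by the paper:

```python
import numpy as np

def multidirectional_search(f, simplex, mu=2.0, theta=0.5, max_itr=100):
    """Sketch of the multi-directional search of Dennis and Torczon.

    simplex: array of shape (n+1, n) holding the vertices x_0..x_n.
    mu, theta: expansion and contraction factors (mu = 2, theta = 0.5
    as in the text).
    """
    S = np.asarray(simplex, dtype=float)
    # Step 3: sort vertices so that f(x_0) <= ... <= f(x_n)
    S = S[np.argsort([f(v) for v in S])]
    for _ in range(max_itr):
        best = S[0]
        # Steps 5-6: reflect x_1..x_n through the best vertex x_0
        R = 2.0 * best - S[1:]
        fR = [f(v) for v in R]
        if min(fR) < f(best):
            # Steps 7-8: reflection succeeded, try expanding by mu
            E = best + mu * (R - best)
            new = E if min(f(v) for v in E) < min(fR) else R
        else:
            # Step 9: reflection failed, contract toward the best vertex
            new = best + theta * (S[1:] - best)
        # Step 10: rebuild and re-sort the simplex for the next iteration
        S = np.vstack([best[None, :], new])
        S = S[np.argsort([f(v) for v in S])]
    return S[0]
```

On a convex test function such as the sphere, the sketch drives the best vertex toward the minimizer within the default iteration budget.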

3 The basic bat algorithm

In this section, we give an overview of the main concepts and structure of the bat algorithm.

3.1 Main concepts

The bat algorithm (BA) is a novel metaheuristic optimization algorithm proposed by Yang [35]. The bat algorithm is simple to understand, has few adjustable parameters, is easy to implement, and converges fast in the early stage of the search by switching from exploration to exploitation. However, switching from exploration to exploitation too quickly may lead to stagnation after this initial stage.

Bats use a type of sonar called echolocation to detect prey and to avoid obstacles. Microbats can recognize the positions of objects by emitting short, loud sound pulses and analyzing the signals reflected back from surrounding objects.

Xin-She Yang idealized the following rules to model the bat algorithm:

  • All bats use echolocation to sense distance, and they also know the difference between food/prey and background barriers in some magical way.

  • Each bat moves randomly with velocity \(v_i\) at position \(x_i\) with a fixed frequency \(f_{min}\), varying wavelength, loudness \(A_0\), and pulse emission rate r.

  • Although the loudness can vary in many ways, Xin-She Yang assumes that the loudness varies from a large value \(A_0\) to a minimum value \(A_{min}\).

The bat algorithm is a population-based method, where the population consists of bats (solutions). Each bat (solution) in the population moves randomly with velocity \(v_i\) at a location \(x_i\). Also, each bat is randomly assigned a frequency drawn uniformly from \([f_{min},f_{max}]\). The position of each bat in the population is updated as shown in the following equations.

$$f_i= f_{min}+(f_{max}-f_{min})\beta$$
(3)
$$v^t_i= v^{t-1}_i+\left( x^{t-1}_i - x^*\right) f_i,$$
(4)
$$x^t_i= x^{t-1}_i+v^t_i,$$
(5)

where \(\beta \in [0,1]\) is a random vector drawn from a uniform distribution.
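A single bat update following Eqs. (3)-(5) can be written as below; the frequency range matches the setting reported later in Sect. 4.1, while the position, velocity, and best-solution vectors are illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

# One bat update following Eqs. (3)-(5). f_min, f_max follow Sect. 4.1;
# the vectors x, v, x_star below are illustrative values.
f_min, f_max = 0.0, 2.0
x = np.array([1.0, -0.5])       # x_i^{t-1}
v = np.zeros(2)                 # v_i^{t-1}
x_star = np.array([0.0, 0.0])   # current global best x^*

beta = rng.random(2)                      # beta ~ U(0, 1)
freq = f_min + (f_max - f_min) * beta     # Eq. (3): f_i
v = v + (x - x_star) * freq               # Eq. (4): velocity update
x = x + v                                 # Eq. (5): position update
```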

The loudness \(A_i\) and the pulse emission rate \(r_i\) are very important to let the algorithm switch between the exploration and exploitation processes. When a bat has found its prey, the loudness decreases and the rate of pulse emission increases. The bat algorithm starts with an initial loudness \(A_0\) and pulse emission rate \(r_0\); these values are then updated as shown in the following equations.

$$A_i^{(t+1)}= \alpha A_i^{(t)}$$
(6)
$$r_i^{(t)}= r_i^{(0)} [1-\exp (-\gamma t)]$$
(7)

where \(\alpha\) and \(\gamma\) are constants with \(\alpha \in [0,1]\) and \(\gamma > 0\); the \(\alpha\) parameter plays a role similar to the cooling factor in the simulated annealing algorithm.
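The schedules in Eqs. (6)-(7) can be checked numerically; the values \(\alpha = \gamma = 0.9\), \(A_0 = 1\), and \(r_0 = 0.9\) below follow the settings quoted later in Sect. 4.1:

```python
import math

# Numerical check of the loudness and pulse-rate schedules of Eqs. (6)-(7),
# using alpha = gamma = 0.9, A_0 = 1, r_0 = 0.9 as in Sect. 4.1.
alpha, gamma = 0.9, 0.9
A, r0 = 1.0, 0.9

for t in range(1, 6):
    A = alpha * A                           # Eq. (6): loudness decays geometrically
    r = r0 * (1.0 - math.exp(-gamma * t))   # Eq. (7): pulse rate rises toward r_0

# After five iterations A = 0.9^5 ~ 0.59 and r is close to (but below) r_0.
```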

For the local search part, a solution is selected among the current best solutions; then, for each bat, a new solution is generated locally using a local random walk as in (8):

$$x_{new}=x_{old}+ \lambda A^t$$
(8)

where \(\lambda \in [-1,1]\) is a random number and \(A^t\) is the average loudness of all the bats at this iteration.
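The local random walk of Eq. (8) is a one-line update; in the sketch below we draw \(\lambda\) per coordinate, which is a common implementation choice, though the text only says "a random number", and the loudness value is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Local random walk of Eq. (8) around a selected solution x_old.
# A_mean plays the role of A^t, the average loudness over all bats.
def local_walk(x_old, A_mean, rng):
    lam = rng.uniform(-1.0, 1.0, size=x_old.shape)  # lambda in [-1, 1]
    return x_old + lam * A_mean

x_new = local_walk(np.array([0.5, 0.5]), A_mean=0.8, rng=rng)
```

Because \(\lambda \in [-1,1]\), the new point always lies within \(\pm A^t\) of the old one in each coordinate, so the walk shrinks as the loudness decays.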

3.2 Bat algorithm

In Algorithm 2, we show the main steps of the standard bat algorithm, which can be summarized as follows.

  • Step 1 The algorithm starts by setting the initial values of its parameters and the main iteration counter is set to zero. (Lines 1–2)

  • Step 2 The initial population is randomly generated by creating the initial position \(x^0\) and initial velocity \(v^0\) for each bat (solution) in the population, and an initial frequency \(f_i\), randomly drawn from \([f_{min}, f_{max}]\), is assigned to each solution. The initial population is evaluated by computing the objective function \(f(x^0_i)\) for each solution. The pulse rate \(r_i\) and loudness \(A_i\) are initialized, where \(r_i \in [0,1]\) and \(A_i\) varies from a large value \(A_0\) to \(A_{min}\). (Lines 3–9)

  • Step 3 The new population is generated by adjusting the position \(x_i\) and velocity \(v_i\) of each solution in the population as given in (3), (4), and (5). (Lines 12–13)

  • Step 4 The new population is evaluated by computing the objective function for each solution, and the best solution \(x^*\) is selected from the population. (Lines 14–15)

  • Step 5 The local search method is applied in order to refine the best found solution at each iteration. (Lines 16–19)

  • Step 6 A new solution is accepted with a probability depending on the loudness \(A_i\); the rate of pulse emission increases and the loudness decreases. The values of \(A_i\) and \(r_i\) are updated as shown in (6) and (7).

  • Step 7 The new population is evaluated and the best solution is selected from the population. These operations are repeated until the termination criteria are satisfied, and the overall best solution is returned. (Lines 25–28)
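The steps above can be assembled into a compact sketch of the standard bat algorithm. Parameter values follow Sect. 4.1; details such as keeping a per-bat loudness array and clipping candidates to the bounds are our implementation choices, not prescribed by the text:

```python
import numpy as np

def bat_algorithm(f, lb, ub, n_bats=20, max_itr=100,
                  f_min=0.0, f_max=2.0, A0=1.0, r0=0.9,
                  alpha=0.9, gamma=0.9, seed=0):
    """Sketch of the standard bat algorithm (Algorithm 2)."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    x = rng.uniform(lb, ub, size=(n_bats, dim))   # Step 2: initial positions
    v = np.zeros((n_bats, dim))                   # initial velocities
    A = np.full(n_bats, A0)                       # loudness A_i
    r = np.zeros(n_bats)                          # pulse rate r_i
    fit = np.array([f(xi) for xi in x])
    best = x[np.argmin(fit)].copy()
    for t in range(1, max_itr + 1):
        for i in range(n_bats):
            beta = rng.random(dim)
            freq = f_min + (f_max - f_min) * beta        # Eq. (3)
            v[i] = v[i] + (x[i] - best) * freq           # Eq. (4)
            cand = np.clip(x[i] + v[i], lb, ub)          # Eq. (5)
            if rng.random() > r[i]:                      # Step 5: local walk, Eq. (8)
                cand = np.clip(best + rng.uniform(-1, 1, dim) * A.mean(), lb, ub)
            f_cand = f(cand)
            if f_cand <= fit[i] and rng.random() < A[i]: # Step 6: acceptance
                x[i], fit[i] = cand, f_cand
                A[i] *= alpha                            # Eq. (6)
                r[i] = r0 * (1 - np.exp(-gamma * t))     # Eq. (7)
        best = x[np.argmin(fit)].copy()                  # Step 7
    return best, float(fit.min())
```

Since a bat's position is only replaced when the candidate is at least as good, the best objective value is non-increasing over iterations.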

3.3 The proposed MDBAT algorithm

In Algorithm 3, we present the proposed MDBAT algorithm. The proposed algorithm applies the same steps as the standard bat algorithm shown in Algorithm 2; the best bat solution found is then refined by applying Algorithm 1 for a number of iterations \(Max _{itr2}\) until the termination criteria are satisfied.
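The glue logic of the hybrid can be sketched as below. The two phases are passed in as callables standing for Algorithms 2 and 1; the way the initial simplex is built around the best bat solution (perturbing each coordinate by a small `step`) is our assumption, as the paper does not specify it:

```python
import numpy as np

# MDBAT glue (Algorithm 3): run the bat phase, then refine its best
# solution with multi-directional search (N_elit = 1 in Sect. 4.1).
# `step`, the initial simplex edge length, is a hypothetical choice.
def mdbat(f, bat_phase, mds_phase, step=0.1):
    x_best = np.asarray(bat_phase(f))           # Algorithm 2, Max_itr1 = 10
    simplex = np.vstack([x_best,
                         x_best + step * np.eye(len(x_best))])
    return mds_phase(f, simplex)                # Algorithm 1, Max_itr2 = 100
```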


4 Numerical experiments

In this section, we evaluate the efficiency of the MDBAT algorithm by presenting its general performance on various benchmark functions and comparing its results against those of various algorithms. In the following subsections, we present the parameter settings of the proposed algorithm and the properties of the test functions used. We also present a performance analysis of the proposed algorithm along with comparative results against the other algorithms.

4.1 Parameter setting

We summarize the parameters of the MDBAT algorithm and their assigned values in Table 1. We select these values based on common settings in the literature or on our preliminary numerical experiments.

  • Population size P The experimental tests show that the best population size is P = 20; increasing this number increases the number of function evaluations without any improvement in the obtained results.

  • Frequency parameter f Bat movement is based on the value of the frequency parameter f. In the MDBAT algorithm, it turns out that the quality of the solution is related to the value of f. The experimental tests show that the best maximum value of f is \(f_{max}=2\) and the best minimum value is \(f_{min} =0\).

  • Loudness parameters A and \(\alpha\) The loudness A is one of the most important parameters in the bat algorithm, since the acceptance of newly generated solutions depends on its value. The \(\alpha\) parameter plays a role similar to the cooling factor in the simulated annealing algorithm. We set the initial value of A to A = 1 and set \(\alpha =0.9\).

  • Pulse emission rate r The pulse emission rate r is very important for applying the local search method in the algorithm. The experimental tests show that the best value of r is 0.9, and the pulse emission constant is \(\gamma =0.9\).

  • Step size for random walk \(\lambda\) The experimental results show that the best step size \(\lambda\) in the random walk is equal to 1. Figure 2 shows the general performance of the standard bat algorithm with step sizes \(\lambda = 0.1, 0.01, 0.001, 1\) for the randomly picked functions f 1, f 2, f 4, f 5, f 6, f 8, f 9, and f 16, confirming that the best value of \(\lambda\) is 1.

  • Multi-directional search parameters There are two main parameters in the multi-directional search algorithm: the expansion factor \(\mu\) and the contraction factor \(\theta\). We take the values of \(\mu\) and \(\theta\) from the literature, where \(\mu =2\) and \(\theta =0.5\).

  • Stopping condition parameters MDBAT terminates the search when the number of iterations reaches the desired maximum number of iterations, or when another termination condition is used for comparison with other algorithms. In our experiments, we set the maximum number of iterations for the bat algorithm to \(Max _{itr1}= 10\) before starting the multi-directional search algorithm, and the maximum number of iterations for the multi-directional search algorithm to \(Max _{itr2}= 100\).

  • Final intensification We collect the best solutions obtained by the bat algorithm in a list in order to apply the multi-directional search algorithm to them; the number of solutions in this list is called \(N_{elit}\). We set \(N_{elit} =1\) in order to avoid increasing the number of function evaluations.

Fig. 2 The general performance of the standard bat algorithm with different values of step size \(\lambda\)

Table 1 Parameter setting

4.2 Test unconstrained optimization problems

We test the MDBAT algorithm on 16 unconstrained optimization functions with various properties (multimodal and unimodal). We list these functions in Table 2 and report their properties in Table 3.

Table 2 Unconstrained benchmark functions
Table 3 Properties of classical functions

4.3 The efficiency of applying the multi-directional search in MDBAT algorithm

In order to investigate the efficiency of invoking the multi-directional search algorithm, we apply both the standard bat algorithm (without the multi-directional search) and the proposed algorithm to six randomly picked functions. In Table 4, we report the mean and the standard deviation over 50 runs. A run is considered successful if the algorithm reaches the global minimum within an error of \(10^{-4}\) using fewer than 20,000 function evaluations. If an algorithm fails to obtain the desired function value, we report the best obtained function value in parentheses. The results in Table 4 show that combining the standard bat algorithm with the multi-directional search accelerates the search and reaches the desired function values for all test functions faster than the standard bat algorithm, which needs more iterations to reach them.
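The success criterion and summary statistics used throughout this section can be sketched as below; the `summarize` helper and the per-run data are hypothetical, serving only to make the criterion precise:

```python
import numpy as np

# Success criterion as used in Sect. 4: a run succeeds if the best function
# value is within 1e-4 of the known global minimum using fewer than 20,000
# function evaluations. `runs` holds hypothetical (best value, evals) pairs.
def summarize(runs, f_global, tol=1e-4, budget=20000):
    successes = [b for b, evals in runs
                 if abs(b - f_global) <= tol and evals < budget]
    vals = [b for b, _ in runs]
    return {"mean": float(np.mean(vals)),
            "sd": float(np.std(vals)),
            "success_rate": 100.0 * len(successes) / len(runs)}

stats = summarize([(1e-5, 9000), (2e-5, 15000), (0.3, 20000)], f_global=0.0)
# Two of the three hypothetical runs succeed, so the rate is 66.7%.
```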

Table 4 The efficiency of applying the multi-directional search in MDBAT algorithm

4.4 MDBAT and other algorithms

We conduct the first test in order to investigate the performance of the proposed MDBAT algorithm by comparing it with four benchmark algorithms (particle swarm optimization and its variants). Before discussing the comparison results, we give a brief description of the four comparative algorithms [21] as follows.

  • RWMPSOg RWMPSOg is Random Walk Memetic Particle Swarm Optimization (with global variant), which combines the particle swarm optimization with random walk as a direction exploitation.

  • RWMPSOl RWMPSOl is Random Walk Memetic Particle Swarm Optimization (with local variant), which combines the particle swarm optimization with random walk as a direction exploitation.

  • PSOg PSOg is standard particle swarm optimization with global variant without local search method.

  • PSOl PSOl is standard particle swarm optimization with local variant without local search method.

4.4.1 Comparison between RWMPSOg, RWMPSOl, PSOg, PSOl and MDBAT

We present the comparison results between our MDBAT algorithm and the other particle swarm optimization algorithms. We test the five comparative algorithms on the 9 benchmark functions \(f_1{-}f_9\) listed in Table 2. We take the results of the other comparative algorithms from their original paper [21]. In Tables 5, 6 and 7, we report the minimum (min), maximum (max), average (Mean), standard deviation (SD), and success rate (%Suc) of the number of function evaluations over 50 runs with population sizes (SS) of 15, 30 and 60, respectively. A run is considered successful if the algorithm reaches the global minimum within an error of \(10^{-4}\) using fewer than 20,000 function evaluations. We mark the best results among the comparative algorithms in boldface. The results in Tables 5, 6 and 7 show that the proposed MDBAT algorithm succeeds and obtains the desired objective value of each function faster than the other algorithms in most cases.

Table 5 Comparison results of MDBAT and other PSO-based algorithms for problems \(f_1{-}f_{9}\) at SS = 15
Table 6 Comparison results of MDBAT and other PSO-based algorithms for problems \(f_1{-}f_{9}\) at SS = 30
Table 7 Comparison results of MDBAT and other PSO-based algorithms for problems \(f_1{-}f_{9}\) at SS = 60

4.4.2 Comparison results between GA-PSO, DE-PSO, AMPSO and MDBAT

We conduct the second comparison test by comparing the proposed MDBAT with four other hybrid PSO algorithms. The first is a hybrid of a genetic algorithm with particle swarm optimization, called GA-PSO. The second is a hybrid of the differential evolution algorithm with PSO, called DE-PSO. The third and fourth are modified versions of PSO including an EP-based adaptive mutation operator, AMPSO1 and AMPSO2 [9]. Before presenting the comparative results between the proposed MDBAT algorithm and the four hybrid PSO algorithms, we give a brief description of each algorithm as follows.

  • GA-PSO The GA-PSO algorithm was proposed by Kao and Zahara [11]. In GA-PSO, in order to solve a D-dimensional problem, the hybrid approach takes 4D individuals which are randomly generated. These individuals may be regarded as chromosomes in the case of GA, or as particles in the case of PSO. The 4D individuals are evaluated and sorted by their fitness, and the top 2D individuals are fed into the real-coded GA to create 2D new individuals by crossover and mutation operations. The authors proposed a random mutation operator for the real-coded GA that modifies an individual with a random number in the problem's domain with a 20% probability. The 2D new individuals created by the real-coded GA are used to adjust the remaining 2D particles by the PSO method.

  • DE-PSO The DE-PSO algorithm was proposed by Pant et al. [19]. It starts as a usual DE algorithm until the trial vector is generated. If the trial vector is better than the corresponding target vector, it is included in the population; otherwise the algorithm enters the PSO phase and generates a new candidate solution using the PSO velocity and position update equations. The method is repeated iteratively until the optimum value is reached.

  • AMPSO The AMPSO algorithm was proposed by Pant et al. [20]. The algorithm is called adaptive mutation operator particle swarm optimization. Two versions of AMPSO called AMPSO1 and AMPSO2 are proposed. In AMPSO1, the personal best particle is mutated, while in AMPSO2 the global best particle is mutated.

In Table 8, we report the comparative results between the proposed MDBAT algorithm and GA-PSO, DE-PSO, AMPSO1 and AMPSO2. We take the results of GA-PSO, DE-PSO, AMPSO1 and AMPSO2 from [29]. In order to make a fair comparison, we use the same parameter values as the other four algorithms. We report the average function evaluation results and the success rate over 100 runs, marking the best values in boldface.

Table 8 Mean error of the function values of GA-PSO, DE-PSO, AMPSO1, AMPSO2 and MDBAT algorithms

The results in Table 8 show that the proposed MDBAT is better in 9 out of 11 test cases. We can conclude from these results that the proposed algorithm is promising and faster than the other algorithms.

4.5 Wilcoxon signed-ranks test

Wilcoxon's test is a nonparametric procedure employed in hypothesis testing situations involving a design with two samples [10, 24, 39]. It is a pairwise test that aims to detect significant differences between the behavior of two algorithms. Let \(\rho\) denote the probability of the null hypothesis being true: a result of \(\rho <0.05\) indicates rejection of the null hypothesis, while \(\rho >0.05\) indicates a failure to reject it. \(R^+\) is the sum of positive ranks, while \(R^-\) is the sum of negative ranks.
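The pairwise test described above can be run with `scipy.stats.wilcoxon`; the per-function mean errors of the two hypothetical algorithms below are invented for illustration only:

```python
from scipy.stats import wilcoxon

# Pairwise Wilcoxon signed-ranks test on hypothetical per-function mean
# errors of two algorithms; p < 0.05 rejects the null hypothesis that the
# two algorithms behave the same.
alg_a = [0.01, 0.20, 0.03, 0.15, 0.02, 0.30, 0.05, 0.12, 0.04]
alg_b = [0.05, 0.25, 0.09, 0.30, 0.08, 0.50, 0.10, 0.20, 0.09]

stat, p = wilcoxon(alg_a, alg_b)
reject = p < 0.05   # here every paired difference favors alg_a
```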

We apply the Wilcoxon signed-ranks test to the results in Tables 5, 6 and 7. In Tables 9, 10 and 11, we report the comparison by this test between the MDBAT algorithm and RWMPSOg, RWMPSOl, PSOg and PSOl. We also apply the Wilcoxon signed-ranks test to the results in Table 8 and report the comparison between the MDBAT algorithm and GA-PSO, DE-PSO, AMPSO1 and AMPSO2 in Table 12.

Table 9 Wilcoxon test for comparison results in Table 5
Table 10 Wilcoxon test for comparison results in Table 6
Table 11 Wilcoxon test for comparison results in Table 7
Table 12 Wilcoxon test for comparison results in Table 8

The statistical results in Tables 9, 10 and 11 show that the proposed algorithm is promising and outperforms the other algorithms.

5 Conclusion and future work

We propose a novel hybrid of the bat algorithm and the multi-directional search algorithm in order to solve unconstrained global optimization problems. We call the proposed algorithm the multi-directional bat algorithm (MDBAT). In MDBAT, the standard bat algorithm searches for a number of iterations, then the multi-directional search starts from the best solution obtained by the bat algorithm. This hybridization accelerates convergence instead of letting the algorithm run for more iterations without any improvement. We apply both the standard bat algorithm (without the multi-directional search) and the proposed algorithm to six randomly selected functions, reporting the mean and the standard deviation over 50 runs. We consider a run successful if the algorithm reaches the global minimum within an error of \(10^{-4}\) using fewer than 20,000 function evaluations; if an algorithm fails to obtain the desired function value, we report the best obtained value in parentheses. The results show that combining the standard bat algorithm with the multi-directional search accelerates the search and reaches the desired function values for all test functions faster than the standard bat algorithm, which needs more iterations to reach them.

We verify the robustness and effectiveness of the proposed algorithm by applying it to 16 unconstrained global optimization problems and comparing it against 8 particle swarm optimization algorithms, namely, random walk memetic particle swarm optimization (global and local variants), standard particle swarm optimization (global and local variants), adaptive mutation operator particle swarm optimization (AMPSO1 and AMPSO2), a genetic algorithm hybridized with PSO, and differential evolution hybridized with PSO.

Finally, the experimental results show that the proposed algorithm is promising and has a powerful ability to solve unconstrained optimization problems faster than the other algorithms in most cases.

Our future work is concentrated on the following directions:

  • Apply the proposed algorithms on solving constrained optimization and engineering problems [1] such as design of a tension/compression spring [5], design of a welded beam [22], design of a gear train [23], and design of a pressure vessel [23].

  • Modify our proposed algorithm in order to solve other combinatorial problems, large scale integer programming and minimax problems [2, 3, 26].

  • Modify our proposed algorithm to solve large scale unconstrained global optimization problems and molecular potential energy function as done by the authors of this paper in [4, 27].