1 Introduction

Nature is a rich source of inspiration for solving hard and complex problems. Nature-inspired algorithms (NIAs) draw on natural processes to tackle difficult real-world engineering problems [6]. Swarm intelligence algorithms [1, 2] are inspired by the collective behaviour of individuals in nature. The gravitational search algorithm (GSA) [6] is a swarm intelligence algorithm inspired by Newton's law of gravitation and the resulting motion of bodies. Individuals attract each other through gravitational force and accelerate according to the force applied to them. Individuals with heavier masses have greater attraction power than lighter ones and therefore move more slowly. GSA is an optimization algorithm that balances exploitation and exploration: heavier individuals are responsible for exploitation, while lighter individuals are responsible for exploration of the search space. When the search starts, lighter individuals (those far from the optimum) move with large step sizes (exploration); as individuals converge towards the optimum and grow heavier, they move with comparatively small step sizes (exploitation). Researchers have progressively proposed new techniques to refine the performance of the algorithm [3, 7, 8, 11].

In this paper, a new variant of GSA, the Exploitative Gravitational Search Algorithm (EGSA), is designed to improve the searching capability and the diversification and intensification proficiency of GSA. In EGSA, the set of best individuals Kbest and the gravitational constant G are modified so that the search efficiency increases: solutions explore the search space with large step sizes in early iterations and exploit the identified region with small step sizes in later iterations. To balance the number of individuals that apply force to the others as iterations proceed, the Kbest value is decreased exponentially. The gravitational constant, which governs the step size of individuals, is also decreased over iterations.

The remainder of the paper is organised as follows. In Sect. 2, a brief overview of GSA is given. The exploitative GSA (EGSA) is proposed in Sect. 3. In Sect. 4, the performance of EGSA is tested on several numerical benchmark functions. Finally, Sect. 5 summarizes and concludes the work.

2 Gravitational Search Algorithm

E. Rashedi et al. developed the gravitational search algorithm (GSA) in 2009 [6]. GSA is a population-based stochastic search algorithm inspired by Newton's law of gravitation and the movement of bodies in the universe under gravitational force. According to Newton's law, "every particle in the universe attracts every other particle with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them [6]". This force accelerates individuals away from their current positions. The performance of an individual is measured by its mass: agents with higher mass are better than agents with lighter mass.

The GSA algorithm is described as follows. Each individual \(X_{i}\) in a search space with I individuals is represented as:

$$\begin{aligned} X_{i}=(x_{i}^{1},.....,x_{i}^{d},.....,x_{i}^{n}) \; \; \; \; \; \; for \, \, i=1,2,.....,I, \end{aligned}$$
(1)

here \(x_{i}^{d}\) denotes the position of the \(i^{th}\) individual in the \(d^{th}\) dimension of the n-dimensional search space.

The mass of an individual is based on its fitness. The fitness of all individuals is calculated, and the worst and best fitness values are identified to compute the masses.

  • For minimization problems best and worst fitness are:

    $$\begin{aligned} best(g) = min fit_{j}(g) \; \; \; \; \; \; j \in {1,\cdot \cdot \cdot ,I} \end{aligned}$$
    (2)
    $$\begin{aligned} worst(g) = max fit_{j}(g) \; \; \; \; \; \; j \in {1,\cdot \cdot \cdot ,I} \end{aligned}$$
    (3)
  • For maximization problems best and worst fitness are:

    $$\begin{aligned} best(g) = max fit_{j}(g) \; \; \; \; \; \; j \in {1,\cdot \cdot \cdot ,I} \end{aligned}$$
    (4)
    $$\begin{aligned} worst(g) = min fit_{j}(g) \; \; \; \; \; \; j \in {1,\cdot \cdot \cdot ,I} \end{aligned}$$
    (5)

\(max fit_{j}(g)\) and \(min fit_{j}(g)\) denote the maximum and minimum fitness values over all individuals \(j\) at iteration g.

In GSA, the inertial, active, and passive gravitational masses are taken to be equal. Individuals with heavier masses are more efficient: they have higher attraction power and move more slowly. Masses in GSA depend on the fitness values of individuals and are calculated as follows:

$$\begin{aligned} M_{aj} = M_{pi} = M_{ii} = M_{i}, \; \; \; \; \; \; i = 1, 2, \ldots ,I. \end{aligned}$$
(6)
$$\begin{aligned} m_{i}(g) = \frac{fit_{i}(g) - worst(g)}{best(g) - worst(g)} \end{aligned}$$
(7)
$$\begin{aligned} M_{i} = \frac{m_{i}(g)}{\sum _{j=1}^{I}m_{j}(g)} \end{aligned}$$
(8)

here \(M_{ii}\) and \(M_{pi}\) are the inertial and passive gravitational masses of the \(i^{th}\) individual, respectively, and \(M_{aj}\) is the active gravitational mass of the \(j^{th}\) individual. \(fit_{i}\) is the fitness value of the \(i^{th}\) individual.
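For concreteness, the mass computation of Eqs. 2-8 can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' code; the function name `compute_masses` and the uniform-mass fallback for the degenerate case best = worst are our own choices:

```python
import numpy as np

def compute_masses(fitness, minimize=True):
    """Normalize raw fitness values into masses per Eqs. (2)-(8).

    fitness: 1-D array of fitness values for all I individuals.
    Returns the normalized masses M, which sum to 1.
    """
    if minimize:
        best, worst = fitness.min(), fitness.max()   # Eqs. (2)-(3)
    else:
        best, worst = fitness.max(), fitness.min()   # Eqs. (4)-(5)
    if best == worst:                                # all equally fit (our fallback)
        return np.full_like(fitness, 1.0 / len(fitness), dtype=float)
    m = (fitness - worst) / (best - worst)           # Eq. (7)
    return m / m.sum()                               # Eq. (8)
```

Note that the worst individual always receives mass 0 under Eq. 7, so it exerts no force on others.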

G(g) is the gravitational constant, computed as in Eq. 9.

$$\begin{aligned} G(g) = G_{0} e^{( - \alpha g/MaxIt)} \end{aligned}$$
(9)

Here, \(G_{0}\) and \(\alpha \) are constants initialized at the start. The value of G(g) is reduced exponentially at each iteration to control search accuracy. MaxIt is the total number of iterations. The acceleration of an individual is the ratio of the force acting on it to its mass [5] and is calculated as follows:

$$\begin{aligned} a_{i}^{d}(g) = F_{i}^{d}(g)/M_{ii}(g) \end{aligned}$$
(10)

\(F_{i}^{d}(g)\) is the overall force acting on the \(i^{th}\) individual, computed as:

$$\begin{aligned} F_{i}^{d}(g) = \sum _{j \in Kbest, j \ne i}rand_{j} F_{ij}^{d}(g) \end{aligned}$$
(11)

Kbest is computed as follows:

$$\begin{aligned} Kbest = finalper + (1 - \frac{g}{MaxIt}) \times (N-finalper) \end{aligned}$$
(12)
$$\begin{aligned} Kbest = round(N \times \frac{Kbest}{N}) \end{aligned}$$
(13)

Here, finalper is a constant and N is the total number of individuals in the search space. Kbest is the set of individuals with the best fitness values and highest masses; initially it contains all N individuals. Kbest is reduced linearly at each iteration, so that by the end only the few best (finalper) individuals apply force to the others.
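As an illustration (not the authors' code), the linear Kbest schedule of Eqs. 12-13 can be written as follows; the function name and default values (N = 50, finalper = 2, matching the experimental settings in Sect. 4) are chosen here for demonstration:

```python
def kbest_linear(g, N=50, finalper=2, max_it=1000):
    """Eqs. (12)-(13): Kbest shrinks linearly from N down to finalper."""
    kbest = finalper + (1 - g / max_it) * (N - finalper)  # Eq. (12)
    return round(kbest)                                   # Eq. (13), rounded to a count
```

For example, with these defaults the schedule starts at 50 individuals, passes through 26 at the halfway point, and ends at 2.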

The force on the \(i^{th}\) individual from the \(j^{th}\) individual's mass at iteration g is computed using Eq. 14:

$$\begin{aligned} F_{ij}^{d}(g) = G(g)\frac{M_{pi}(g) \times M_{aj}(g)}{R_{ij}(g) + \varepsilon }(x_{j}^{d}(g) - x_{i}^{d}(g)) \end{aligned}$$
(14)

Here, \(R_{ij}(g)\) is the Euclidean distance between individuals i and j at iteration g. The gravitational constant G(g) is calculated using Eq. 9, and \(\varepsilon \) is a small constant. The velocity update equation for individuals is defined as:

$$\begin{aligned} v_{i}^{d}(g+1) = rand_{i} \times v_{i}^{d}(g) + a_{i}^{d}(g) \end{aligned}$$
(15)

here, \(rand_{i}\) is a uniform random number in the interval [0, 1]. \(v_{i}^{d}(g)\) and \(v_{i}^{d}(g+1)\) are the velocities of the \(i^{th}\) individual at iterations g and \(g+1\), respectively.

The position update equation for individuals is defined as:

$$\begin{aligned} x_{i}^{d}(g+1) = x_{i}^{d}(g) + v_{i}^{d}(g+1) \end{aligned}$$
(16)

here, \(x_{i}^{d}(g)\) and \(x_{i}^{d}(g+1)\) are the positions of the \(i^{th}\) individual at iterations g and \(g+1\), respectively. The velocity of each individual is updated at every iteration, and each individual updates its position according to its new velocity.

This procedure is carried on until the termination criterion is met or the iteration count reaches its maximum limit.
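Putting Eqs. 10-16 together, one GSA iteration can be sketched as below. This is a minimal illustrative implementation, not the authors' code; the names are our own, and \(\varepsilon \) also guards the mass denominator in Eq. 10 to avoid division by zero for the worst (zero-mass) individual:

```python
import numpy as np

def gsa_step(X, V, M, kbest, G, rng, eps=1e-9):
    """One GSA iteration, Eqs. (10)-(16).

    X: (I, n) positions, V: (I, n) velocities, M: (I,) normalized masses,
    kbest: number of heaviest individuals that exert force, G: value of Eq. (9),
    rng: numpy random Generator.
    """
    I, n = X.shape
    best_idx = np.argsort(M)[::-1][:kbest]         # Kbest heaviest individuals
    A = np.zeros_like(X)
    for i in range(I):
        F = np.zeros(n)
        for j in best_idx:
            if j == i:
                continue
            R = np.linalg.norm(X[j] - X[i])        # Euclidean distance R_ij
            # Eq. (14), weighted by a random number as in Eq. (11)
            F += rng.random() * G * M[i] * M[j] / (R + eps) * (X[j] - X[i])
        A[i] = F / (M[i] + eps)                    # Eq. (10)
    V = rng.random((I, 1)) * V + A                 # Eq. (15), rand_i per individual
    X = X + V                                      # Eq. (16)
    return X, V
```

In practice this step is repeated, with the masses and G recomputed at each iteration, until the stopping criterion is met.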

3 Exploitative Gravitational Search Algorithm

In population-based algorithms, the behaviour of agents is characterized by their exploitation and exploration capabilities in the search space. Exploration means surveying the entire search space, while exploitation means finding the optimum solution among the previously visited good solutions. During the early iterations of the algorithm, GSA visits the entire search space to locate promising solutions. As iterations pass, GSA exploits the search space by revisiting points near previously found ones. For good performance, any population-based algorithm must maintain a proper balance between exploitation and exploration. Initially, when the individuals have not converged, exploration with a large step size is needed to find good solutions across the whole search space. As iterations pass and the individuals converge, they need to exploit the search space with a comparatively small step size to find the optimal solution. In GSA, the gravitational constant G affects the step size of individuals: as shown in Eq. 14, force is directly proportional to G, and by Eq. 10, acceleration depends on this force. Acceleration plays a vital role in determining the step size of the individuals. Therefore, in this paper the gravitational constant G is modified as follows:

$$\begin{aligned} G(g) = G_{0} e^{(- \alpha g/MaxIt)}(1-\frac{g}{MaxIt}) \end{aligned}$$
(17)

From Eq. 17 it is clear that the gravitational constant is high during the initial iterations and is reduced iteratively. Therefore, the acceleration and step size of the individuals decrease as the number of iterations increases.
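For comparison, the two schedules for the gravitational constant (Eq. 9 for GSA, Eq. 17 for EGSA) can be sketched side by side; the function names are illustrative, and the default parameter values follow the experimental settings in Sect. 4:

```python
import math

def gravitational_constant_gsa(g, G0=100.0, alpha=20.0, max_it=1000):
    """Eq. (9): exponentially decaying gravitational constant of standard GSA."""
    return G0 * math.exp(-alpha * g / max_it)

def gravitational_constant_egsa(g, G0=100.0, alpha=20.0, max_it=1000):
    """Eq. (17): Eq. (9) scaled by an additional linear decay factor."""
    return gravitational_constant_gsa(g, G0, alpha, max_it) * (1 - g / max_it)
```

Both start at \(G_{0}\), but the EGSA schedule is always at most the GSA one and reaches exactly zero at the final iteration, shrinking the step size more aggressively.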

Kbest in Eq. 13 controls the number of individuals that apply force to the others in the search space. A large Kbest means that many individuals interact with each other and the movement among individuals is high; as a result, the convergence speed is lower. From Eq. 13 it is clear that Kbest decreases linearly, so the change in Kbest per iteration is very small. Consequently, the movement and interaction among individuals reduce only slightly, and the effect on convergence speed is limited. Therefore, in this paper Kbest is modified as follows:

$$\begin{aligned} Kbest = round(N \times exp(- \beta g/MaxIt)) \end{aligned}$$
(18)

Here, N is the total number of individuals and \(\beta \) is a constant. From Eq. 18 it is clear that Kbest is reduced exponentially at each iteration. In the initial iterations Kbest is large, so the movement and interaction among individuals are high, reflecting exploration of the search space. As the number of iterations increases, Kbest shrinks, so the movement and interaction among individuals become comparatively lower, reflecting exploitation.
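A minimal sketch of the exponential Kbest schedule of Eq. 18; the function name is illustrative, and the defaults (N = 50, \(\beta \) = 5) follow the experimental settings in Sect. 4:

```python
import math

def kbest_egsa(g, N=50, beta=5.0, max_it=1000):
    """Eq. (18): Kbest shrinks exponentially from N as iterations proceed."""
    return round(N * math.exp(-beta * g / max_it))
```

With these defaults the schedule drops quickly at first (50 at g = 0, about 18 at g = 200) and flattens out near zero in the final iterations, unlike the steady linear decrease of Eqs. 12-13.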

Fig. 1. Effect of Kbest on the number of individuals

The behaviour of Kbest over iterations is shown in Fig. 1. It is clear from this figure that in EGSA Kbest decreases exponentially over iterations, whereas in GSA it decreases linearly. Thus, initially a large number of individuals apply force, and as the number of iterations increases, comparatively fewer individuals do so. In this way EGSA regulates a proper balance between diversification and intensification proficiency and improves the searching ability as iterations proceed. The pseudo-code of EGSA is shown in Algorithm 1.

Algorithm 1. Pseudo-code of EGSA

4 Results and Discussions

To examine the performance of EGSA, 12 different benchmark functions (\(f_1\) to \(f_{12}\)) are selected, as shown in Table 1.

To verify the performance of the proposed EGSA, a comparative analysis is carried out among EGSA, standard GSA, FBGSA [4], and BBO [10]. The experimental settings used to evaluate the considered algorithms on the test problems are given below:

  • Number of simulations/runs = 30,

  • Population size (N) = 50,

  • \(G_{0} = 100\), \(\alpha = 20\), \(\beta = 5\) and \(finalper = 2\),

  • Parameter settings for GSA, FBGSA [4] and BBO [10] are taken from their original papers.

Table 2 displays the experimental results of the examined algorithms: the standard deviation (SD), mean error (ME), average number of function evaluations (AFE), and success rate (SR). The results in Table 2 show that in most cases EGSA outperforms GSA, FBGSA, and BBO in terms of reliability, efficiency, and accuracy.

Table 1. Test problems, D: Dimension, AE: Acceptable Error
Table 2. Comparison of the results of test functions, TP: Test Problem

Further, the Mann-Whitney U rank sum test [9] is performed at a \(5\%\) significance level \((\alpha = 0.05)\) between EGSA - GSA, EGSA - FBGSA, and EGSA - BBO. Table 3 displays the comparison of mean function evaluations and the Mann-Whitney test results for 30 simulations. The Mann-Whitney test checks for a significant difference between two data sets. If no significant difference is found, the = symbol appears; when a significant difference is observed, the comparison is made in terms of AFEs, with + indicating that EGSA is superior to the examined algorithm and - indicating that it is inferior. The last row of Table 3 confirms the superiority of EGSA over GSA, FBGSA, and BBO.

Table 3. Comparison based on Mann-Whitney U rank test at significance level \(\alpha =0.05\) and mean function evaluations

Moreover, a boxplot analysis [2] of the AFEs is carried out to compare the consolidated performance of the examined algorithms. Boxplots efficiently describe the empirical distribution of data graphically. The boxplots for EGSA, GSA, FBGSA, and BBO are depicted in Fig. 2. The results clearly show that the interquartile range and median of EGSA are relatively low.

Fig. 2. Boxplots of the average number of function evaluations

To measure the convergence speed of the modified algorithm, we use the acceleration rate (AR) [9], defined as follows:

$$\begin{aligned} AR=\frac{AFE_{compareAlgo}}{AFE_{EGSA}} \end{aligned}$$
(19)

here, \(compareAlgo \in \) {GSA, FBGSA, BBO}, and \(AR > 1\) means that EGSA is faster than the compared algorithm. To investigate the AR of the modified algorithm against standard GSA, FBGSA, and BBO, the results of Table 2 are analyzed and the value of AR is calculated using Eq. 19. Table 4 makes clear that the convergence speed of EGSA is faster than that of the other examined algorithms.
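As a sketch, Eq. 19 can be evaluated for several compared algorithms at once; the function name and the AFE numbers in the example below are illustrative, not values from Table 2:

```python
def acceleration_rates(afe_egsa, afe_others):
    """Eq. (19): AR = AFE_compareAlgo / AFE_EGSA for each compared algorithm.

    AR > 1 means EGSA needed fewer function evaluations, i.e. converged faster.
    """
    return {name: afe / afe_egsa for name, afe in afe_others.items()}
```

For instance, with hypothetical AFEs of 1000 for EGSA and 1500 for GSA, the AR would be 1.5, indicating that EGSA is 1.5 times faster on that problem.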

Table 4. TP: Test Problem; acceleration rate (AR) of EGSA compared to standard GSA, FBGSA and BBO

5 Conclusion

This paper presents a variant of the GSA algorithm, known as the Exploitative Gravitational Search Algorithm (EGSA). In the modified version, two control parameters, the gravitational constant G and Kbest, are modified. Kbest, the number of best individuals, decreases exponentially as the number of iterations increases, which increases the searching speed of the algorithm. The gravitational constant G is also decreased as iterations proceed, which enhances the convergence speed. This methodology is reliable and efficient and maintains a proper balance between the exploitation and exploration proficiency of the algorithm. The proposed algorithm is compared with GSA, FBGSA, and BBO over different benchmark functions. The obtained results show that EGSA is a competitive variant of GSA and a good choice for solving continuous optimization problems. In future, the newly developed algorithm may be applied to various real-world optimization problems of a continuous nature.