
1 Introduction

The travelling salesman problem (TSP) [1] is a typical NP-hard problem in combinatorial optimization: a salesman wishes to visit a certain number of cities, beginning and ending the tour in the same city and visiting every other city exactly once. The goal is to select the best tour, i.e. the one that minimizes the total travelled distance.

Solving this problem is important because of its applications in various fields such as combinatorial data analysis, computer wiring, machine sequencing, vehicle routing and scheduling, planning, and logistics.

Over the past decades, several researchers have applied heuristic and metaheuristic methods to this problem: Local Search [2], Simulated Annealing [3], Tabu Search [4], Genetic Algorithms [5, 6], Ant Colony System [7], Particle Swarm Optimization [8], Bee Colony Optimization [9], and the Penguins Search Optimization Algorithm [10], as well as hybrid methods such as a novel hybrid penguins search optimization algorithm [11], a hybrid genetic algorithm [12], Particle Swarm Optimization combined with Simulated Annealing [13], and a Simulated Annealing Algorithm with Greedy Search [14].

In this paper, we add GA operators (selection, crossover and mutation) to the PeSOA algorithm to obtain a new hybrid approach, HPeSOA, for solving the travelling salesman problem. The computational results show that the proposed HPeSOA algorithm converges faster, reaches higher solution accuracy, and solves the TSP more efficiently.

The remainder of this paper is organized as follows: Sect. 2 presents the travelling salesman problem; Sect. 3 describes the metaheuristics used; Sect. 4 details the adaptation of the hybrid genetic algorithm and Penguins Search Optimization Algorithm to the travelling salesman problem; Sect. 5 reports the results of tests on TSPLIB instances; finally, conclusions are summarized in Sect. 6.

2 The Travelling Salesman Problem

The travelling salesman problem (TSP) is one of the oldest and most widely studied problems in combinatorial optimization. Given N cities to visit, the salesman wishes to build a tour that passes exactly once through each city and returns to its starting point at the lowest cost, i.e. traversing the smallest possible distance.

The distances between the cities are known, and the task is to find the route that minimizes the total travelled distance. The problem can be modelled as an integer linear program by associating a binary variable xij with each pair of cities, equal to 1 if city i is visited immediately before city j and 0 otherwise. The cost of travelling from i to j is denoted cij. We then obtain the following model:

$$ \text{Min} \quad \sum\limits_{i,j} c_{ij} x_{ij} $$
(1)

The constraints are:

$$ \sum\limits_{j \ne i} x_{ij} = 1 \quad \forall i $$
(2)
$$ \sum\limits_{i \ne j} x_{ij} = 1 \quad \forall j $$
(3)
$$ 0 \le x_{ij} \le 1 \quad \forall i,\,\forall j $$
(4)

The objective function (1) is expressed as a scalar product between the real cost vector c and the vector of binary variables x. Constraints (2) and (3) ensure that each city is entered and left exactly once. Constraint (4) bounds each variable between 0 and 1; together with the integrality of the variables, this makes them binary. To exclude solutions made up of several disjoint sub-tours, the model is generally completed with sub-tour elimination constraints.
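For illustration, a candidate tour can be stored as a permutation of the N cities and evaluated by summing the costs of its consecutive arcs plus the closing arc back to the start. A minimal C sketch of this evaluation follows; the row-major distance matrix dist, with dist[i*n + j] = cij, is an assumption of this example:

/* Length of the tour given by the permutation tour[0..n-1], using a
 * row-major distance matrix dist where dist[i*n + j] = c_ij.         */
double tour_cost(int n, const double *dist, const int *tour)
{
    double cost = 0.0;
    for (int k = 0; k < n - 1; ++k)
        cost += dist[tour[k] * n + tour[k + 1]];
    cost += dist[tour[n - 1] * n + tour[0]];   /* return to the starting city */
    return cost;
}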

3 Description of Metaheuristics

3.1 The Genetic Algorithm

The genetic algorithm [15], introduced by John Holland, is a search algorithm based on the mechanisms of natural selection and genetics.

The genetic algorithm first generates an initial population at random, consisting of a set of individual solutions to the problem. The best individuals of the population are those that have a better chance to reproduce and to transmit part of their genetic heritage to the next generation. A new generation is then created by combining the genes of the parents. We expect that some individuals of the new generation will inherit the best characteristics of both parents and therefore constitute better solutions to the problem. The new generation is then subjected to the same selection criteria and, in turn, produces its own offspring. This process is repeated several times, until the individuals converge towards the same genetic heritage. The members of this last generation, which are usually very different from their ancestors, carry genetic information corresponding to the best solution found for the problem. The basic genetic algorithm relies on two simple operations, no more complicated than algebraic operations: crossover and mutation.

The pseudo-code of the genetic algorithm is as follows:

figure a
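As a complement to the pseudo-code of figure a, the GA loop described above can be sketched in C roughly as follows. This is an illustrative skeleton only, not the authors' exact algorithm: each individual is a permutation of the n cities, and the helpers select_parent, crossover, mutate and rand01 are hypothetical and merely declared here.

/* Illustrative GA skeleton. pop and next are two pre-allocated
 * populations of pop_size tours (permutations of the n cities). */
int   *select_parent(int **pop, int pop_size);          /* e.g. tournament   */
void   crossover(const int *p1, const int *p2, int *child, int n);
void   mutate(int *tour, int n);                        /* swap two cities   */
double rand01(void);                                    /* uniform in [0, 1] */

void genetic_algorithm(int **pop, int **next, int pop_size, int n,
                       int max_gen, double pc, double pm)
{
    for (int gen = 0; gen < max_gen; ++gen) {
        for (int i = 0; i < pop_size; ++i) {
            const int *p1 = select_parent(pop, pop_size);
            const int *p2 = select_parent(pop, pop_size);
            if (rand01() < pc)
                crossover(p1, p2, next[i], n);           /* recombine parents */
            else
                for (int k = 0; k < n; ++k) next[i][k] = p1[k];
            if (rand01() < pm)
                mutate(next[i], n);                      /* small random change */
        }
        int **tmp = pop; pop = next; next = tmp;         /* next generation */
        /* evaluate the new population and keep track of the best tour here */
    }
}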

3.2 Penguins Search Optimization Algorithm

The Penguins Search Optimization Algorithm [16] is a metaheuristic proposed in 2013 by Gheraibia and Moussaoui. This algorithm is inspired by the hunting strategy of penguins, which is based on collaboration between penguins so that the whole group benefits from each dive. The algorithm builds a population that is divided into groups, and each group starts looking for food at a random position. Once their oxygen reserve is exhausted, the groups return to land, the group that consumed the largest quantity of fish is chosen as the best group, and the best individuals of that group migrate to the other groups to guide them towards rich positions. This process is repeated several times in order to find one of the best positions. Each penguin position in the search space is a solution to the associated problem, and the penguins cooperate to determine the best position (solution) in a given hole and level (solution space). The movement of penguins between groups is governed by the following equation:

$$ D_{new} = D_{last} + Rand\, \otimes |X_{best} - X_{id} | $$
(5)

where Rand is a random number drawn from a uniform distribution, Dlast is the current solution, Xbest is the best local solution, Xid is the last solution, and Dnew is the new solution.
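Read component-wise on a continuous search space, Eq. (5) can be sketched in C as follows. This is an illustration only; drawing the random factor independently for each dimension is an assumption of this sketch.

#include <stdlib.h>
#include <math.h>

/* Component-wise form of Eq. (5): D_new = D_last + Rand * |X_best - X_id|. */
void pesoa_move(int dim, const double *d_last, const double *x_best,
                const double *x_id, double *d_new)
{
    for (int k = 0; k < dim; ++k) {
        double r = (double)rand() / RAND_MAX;            /* uniform in [0, 1] */
        d_new[k] = d_last[k] + r * fabs(x_best[k] - x_id[k]);
    }
}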

The pseudo-code of the Penguins Search Optimization Algorithm (PeSOA) is as follows:

figure b

3.3 HPeSOA Algorithm

The hybrid HPeSOA method combines the two metaheuristics PeSOA and GA. Its objective is to improve the search behaviour of PeSOA by integrating techniques from the genetic algorithm in order to find solutions of good quality.

The genetic operators crossover and mutation are used to create a population of child solutions from the current solutions and the best solution of the previous population. The PeSOA mechanism then improves this population by local search techniques in order to find the best solution.

The pseudo code for this HPeSOA algorithm is as follows:

figure c

4 Adaptation of HPeSOA to Solve TSP

This section presents the adaptation of the HPeSOA method to the TSP.

The adaptation of HPeSOA consists in redefining the algebraic operators of the genetic algorithm (crossover and mutation) and of the penguins search optimization algorithm, as well as the structure and steps of the HPeSOA algorithm.

4.1 The GA Operators (Crossover - Mutation)

Suppose that the two randomly selected positions are 4 and 7; the two-point crossover then produces the children q1 and q2, but q1 and q2 are not legitimate tours. We consider the duplicated cities in q1 as superfluous cities (2 and 10 in Fig. 1), and the duplicated cities in q2 as the cities missing from q1 (3 and 9). These cities are then exchanged between the two children, which yields the new legal tours q1' and q2' (a C sketch of this operator is given after Fig. 1).

Fig. 1.
figure 1

An example of the GA operators (crossover – mutation).
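A standard way to implement this kind of two-point crossover with repair is the partially mapped crossover (PMX), in which duplicated cities outside the exchanged segment are resolved through the mapping defined by the segment itself. The C sketch below illustrates the idea; it is not necessarily the authors' exact repair rule.

/* Follow the mapping induced by the exchanged segment until the value
 * no longer occurs in seg_owner[p1..p2].                               */
static int map_value(int value, const int *seg_owner, const int *seg_other,
                     int p1, int p2)
{
    for (int m = p1; m <= p2; ++m) {
        if (seg_owner[m] == value) {
            value = seg_other[m];
            m = p1 - 1;                  /* restart the scan with the new value */
        }
    }
    return value;
}

/* Two-point crossover with PMX-style repair on city permutations. */
void pmx_crossover(const int *parent1, const int *parent2,
                   int *child1, int *child2, int n, int p1, int p2)
{
    for (int k = 0; k < n; ++k) { child1[k] = parent1[k]; child2[k] = parent2[k]; }
    for (int k = p1; k <= p2; ++k) { child1[k] = parent2[k]; child2[k] = parent1[k]; }
    for (int k = 0; k < n; ++k) {
        if (k >= p1 && k <= p2) continue;                /* segment stays as is */
        child1[k] = map_value(child1[k], child1, child2, p1, p2);
        child2[k] = map_value(child2[k], child2, child1, p1, p2);
    }
}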

The mutation operator creates random changes in the order of the cities: a solution and two positions are selected at random, and the city at the first position is exchanged with the city at the second one. Fig. 1 shows an example in which mutating the solution q1' at positions 4 and 8 gives the solution q1''.
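The corresponding swap mutation is immediate; a minimal C sketch is given below (the two positions are drawn uniformly at random and may coincide, in which case the tour is left unchanged).

#include <stdlib.h>

/* Exchange the cities at two randomly chosen positions of the tour,
 * as in the example q1' -> q1'' of Fig. 1.                           */
void swap_mutation(int *tour, int n)
{
    int i = rand() % n;
    int j = rand() % n;
    int tmp = tour[i];
    tour[i] = tour[j];
    tour[j] = tmp;
}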

4.2 The PeSOA Operators (Subtraction – Multiplication – Addition)

The new solution Xnew is calculated using Eq. (5), which is based on the operators subtraction, multiplication and addition. The subtraction operation (−) is performed between two solutions Xbest and Xid; its result is a set of permutations Q that allows passing from one solution to the other.

The multiplication operation (*) is performed between a real number k ∈ [0, 1] and a set of permutations Q; the result Q' is the part of the set Q retained according to the value of k.

The addition operation (+) is performed between the solution Xid and the set of permutations Q'; the result is a new solution Xnew. This operation consists of applying the permutations of Q' to Xid to obtain the new solution Xnew (Fig. 2; an illustrative sketch follows the figure).

Fig. 2.
figure 2

A simple example of the PeSOA operators.
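Under this interpretation, the three operators can be sketched in C on city permutations as follows. This is an illustrative implementation only: representing Q as a list of position swaps, and computing the swaps that transform Xid towards Xbest, are modelling assumptions of the sketch.

#include <stdlib.h>

typedef struct { int i, j; } swap_t;

/* Subtraction Xbest - Xid: the list of position swaps that transforms the
 * tour x_id into x_best. Writes at most n-1 swaps into q and returns
 * their number.                                                          */
int tour_subtract(const int *x_best, const int *x_id, int n, swap_t *q)
{
    int *work = malloc(n * sizeof *work);   /* working copy of x_id */
    int count = 0;
    for (int k = 0; k < n; ++k) work[k] = x_id[k];
    for (int i = 0; i < n; ++i) {
        if (work[i] == x_best[i]) continue;
        int j = i + 1;
        while (work[j] != x_best[i]) ++j;   /* locate the wanted city */
        q[count].i = i; q[count].j = j; ++count;
        int tmp = work[i]; work[i] = work[j]; work[j] = tmp;
    }
    free(work);
    return count;
}

/* Multiplication k * Q: keep only the first part of the swap list. */
int tour_multiply(double k, int q_len)
{
    return (int)(k * q_len);
}

/* Addition Xid + Q': apply the retained swaps to x_id to obtain x_new. */
void tour_add(const int *x_id, int n, const swap_t *q, int q_len, int *x_new)
{
    for (int k = 0; k < n; ++k) x_new[k] = x_id[k];
    for (int s = 0; s < q_len; ++s) {
        int tmp = x_new[q[s].i];
        x_new[q[s].i] = x_new[q[s].j];
        x_new[q[s].j] = tmp;
    }
}

A new solution is then obtained as Xnew = Xid + k * (Xbest − Xid), i.e. tour_add(x_id, n, q, tour_multiply(k, tour_subtract(x_best, x_id, n, q)), x_new).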

4.3 The Flowchart of the HPeSOA Algorithm

The flowchart of the HPeSOA algorithm is shown in Fig. 3. The structures of both the PeSOA and GA algorithms are present in the HPeSOA method for solving the travelling salesman problem.

Fig. 3.
figure 3

Flowchart of the HPeSOA Algorithm

  • 1st step: Initialization.

The initialization of the parameters.

Initialize a population of penguins with random positions in the D-dimensional search space; each position represents a solution to our problem.

  • 2nd step: Fitness.

Calculate the objective function of each solution in the population (the fitness of each penguin): {f(x1), f(x2), f(x3), …, f(xM)}.

  • 3rd step: Selection.

Select the best solution (Xbest) and the best group (gbest) of the population.

  • 4th step: Crossover Operator.

Select solutions from the population with probability PC and cross them with the best solution gbest; the best child of each crossover is then kept as a child solution for the second population.

  • 5th step: Shift Operator (mutation).

Select solutions from the population with probability PM, and then generate new solutions by exchanging the cities at two randomly selected positions within each selected solution.

  • 6th step: Update population/gbest.

Update the population with the child solutions generated by the genetic operators, and update the best group gbest of this population.

  • 7th step: Update population using PeSOA.

Update the population using Eq. (5) of the PeSOA algorithm.

  • Last step: Stopping criterion.

Check the stopping condition: if it is satisfied, leave the loop and report the best solution found; otherwise, return to step 2 (an illustrative C sketch of this loop follows).
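Putting the steps together, the HPeSOA loop of Fig. 3 can be sketched in C as follows. This is an illustrative skeleton, not the authors' implementation: tour_cost, pmx_crossover and swap_mutation are the sketches given earlier, pesoa_update is a hypothetical wrapper applying Eq. (5) with the subtraction/multiplication/addition operators of Sect. 4.2, and a fixed iteration count stands in for the stopping criterion.

#include <stdlib.h>
#include <string.h>

double tour_cost(int n, const double *dist, const int *tour);
void   pmx_crossover(const int *p1, const int *p2, int *c1, int *c2,
                     int n, int a, int b);
void   swap_mutation(int *tour, int n);
void   pesoa_update(int **pop, int pop_size, int n, const int *best); /* Eq. (5) */

static double rand01(void) { return (double)rand() / RAND_MAX; }

/* best must hold a valid initial tour on entry; it returns the best tour found. */
void hpesoa(int **pop, int pop_size, int n, const double *dist,
            int max_iter, double pc, double pm, int *best)
{
    int *c1 = malloc(n * sizeof *c1);
    int *c2 = malloc(n * sizeof *c2);
    for (int it = 0; it < max_iter; ++it) {
        /* Steps 2-3: fitness of each penguin and selection of the best tour */
        for (int i = 0; i < pop_size; ++i)
            if (tour_cost(n, dist, pop[i]) < tour_cost(n, dist, best))
                memcpy(best, pop[i], n * sizeof *best);

        /* Steps 4-6: crossover with the best tour, then swap mutation */
        for (int i = 0; i < pop_size; ++i) {
            if (rand01() < pc) {
                int a = rand() % n, b = rand() % n;
                if (a > b) { int t = a; a = b; b = t; }
                pmx_crossover(pop[i], best, c1, c2, n, a, b);
                const int *keep =
                    tour_cost(n, dist, c1) <= tour_cost(n, dist, c2) ? c1 : c2;
                memcpy(pop[i], keep, n * sizeof **pop);  /* keep the better child */
            }
            if (rand01() < pm)
                swap_mutation(pop[i], n);
        }

        /* Step 7: PeSOA move of the population guided by the best tour */
        pesoa_update(pop, pop_size, n, best);
    }
    free(c1);
    free(c2);
}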

5 Experimental Results

We applied the HPeSOA method to different instances from TSPLIB. The experiment was implemented in the C language on a PC with an Intel(R) Core(TM) 2 Duo M370 processor at 2.40 GHz and 4.00 GB of RAM. Each instance was run 10 times. The parameters of HPeSOA are presented in Table 1.

Table 1. Values of the HPeSOA parameters

The numerical results of this adaptation, shown in Table 2, are presented as follows: the first column contains the names of the instances (Instance); the second the number of cities (N); the third the best known result (Optimum) reported in TSPLIB; the fourth and fifth columns the best (Best) and worst (Worst) results obtained by the HPeSOA method; the sixth the percentage error (Err), computed by Eq. (6); and the last column the average execution time (Tps).

Table 2. Numerical results obtained by HPeSOA applied to some TSP instances of TSPLIB
$$ Err = \frac{{\frac{Best + Worst}{2} - Optimum}}{Optimum} \times 100\% $$
(6)
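For instance, with purely illustrative values Best = 7700, Worst = 7900 and Optimum = 7542, Eq. (6) gives Err = ((7700 + 7900)/2 - 7542)/7542 × 100 ≈ 3.42 %.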

Table 3 compares the HPeSOA method with the Penguins Search Optimization Algorithm (PeSOA) and the genetic algorithm (GA). The comparison is based on three criteria: the best solution value (Best), the percentage error (Err) between the optimal solution value and the average solution value over 10 runs, and the average execution time (Tps) in seconds (Fig. 4).

Table 3. Comparison of the performance of PeSOA, GA and HPeSOA
Fig. 4.
figure 4

Error percentage of the PeSOA, GA and HPeSOA methods for the different instances.

This figure shows that the error percentage of the HPeSOA method is negligible compared with GA and PeSOA (Fig. 5).

Fig. 5.
figure 5

Overall execution time of the PeSOA, GA and HPeSOA methods on the different instances, except Pr107.

This figure shows that the HPeSOA method solves the set of instances with a smaller average execution time than the PeSOA and GA methods.

6 Conclusion

In this paper, we have developed a hybrid algorithm, HPeSOA, that combines the advantages of the genetic algorithm and the Penguins Search Optimization Algorithm. This approach improves on both algorithms: on the genetic algorithm by avoiding entrapment in local optima, and on the PeSOA algorithm by reducing the time spent searching for the best solutions.

The results of HPeSOA show its effectiveness in solving several TSP instances compared with PeSOA and GA. In future work, we plan to add more parameters to the proposed algorithm to make it more robust and flexible for solving more difficult combinatorial optimization problems.